Does AI abundance actually devalue knowledge itself?
If AI generates vastly more claims than humans can evaluate, does the sheer volume undermine the social processes that normally establish what counts as reliable knowledge? And what would that erosion look like?
Inflation alone describes the case where quantity rises and per-unit value falls. Stagflation names a sharper problem: quantity rises, per-unit value falls, and the real economy underneath stagnates or contracts at the same time. The epistemic version has the same structure. AI generates more claims than the discursive economy has ever had to metabolize, while the conversational, institutional, and expert processes that would normally convert claims into reliable knowledge are themselves decaying.
This matters because the standard inflationary critique — "there is too much AI content" — underdescribes the damage. The volume is not the problem in isolation. The problem is that AI writing escapes the conversations that normally govern knowledge, so the mechanisms that produce real knowledge-value contract while the mechanisms that produce nominal knowledge-tokens expand. Quantity and quality move in opposite directions, and the productive base erodes.
The stagflation framing forecasts several observable effects: diminishing returns to search (more results, less findable signal), compression of expert compensation (the signal experts provide is drowned out by undifferentiated AI claims), and a preference shift toward social-proof shortcuts (trust in the person replaces trust in the argument). None of these are predictions of scarcity — they are predictions of structural devaluation under abundance.
Counterargument: markets eventually price in unbacked tokens and discount them. But currency-like circulation of knowledge claims depends on a social consensus about what counts as backing. That consensus is precisely what AI erodes.
Source: Epistemic Inflation
Related concepts in this collection

- How does AI writing escape the conversations that govern knowledge? If knowledge claims normally get filtered and refined through social discourse, what happens when AI generates claims outside that governing process? Why does scale matter here? (The mechanism behind the stagflation dynamic.)
- Does AI separate intellectual form from the thinking behind it? Exploring whether AI's ability to generate polished intellectual products without the underlying reasoning process represents a genuinely new kind of decoupling, and what that means for how we evaluate knowledge. (The form/backing gap that makes devaluation possible.)
- Does AI reshape expert work into knowledge management? As AI generates knowledge at scale, does expert work shift from creating new understanding to curating and validating machine outputs? This matters because curation and creation demand different cognitive skills. (One downstream consequence of the productive base eroding.)
Original note title: AI produces epistemic stagflation — quantity of knowledge rises while value and reliability fall