Does AI abundance actually devalue knowledge itself?

If AI generates vastly more claims than humans can evaluate, does the sheer volume undermine the social processes that normally establish what counts as reliable knowledge? And what would that erosion look like?

Note · 2026-04-14

Inflation alone describes the case where quantity rises and per-unit value falls. Stagflation names a sharper problem: quantity rises, per-unit value falls, and the real economy underneath stagnates or contracts at the same time. The epistemic version has the same structure. AI generates more claims than the discursive economy has ever had to metabolize, while the conversational, institutional, and expert processes that would normally convert claims into reliable knowledge are themselves decaying.
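The distinction between the two regimes can be made concrete with a toy calculation. All numbers here are hypothetical, chosen only to illustrate the structure of the analogy: "tokens" stand for circulating claims, "real value" for the stock of reliably vetted knowledge, and per-token value is the ratio of the two.

```python
def per_token_value(real_value: float, tokens: float) -> float:
    """Value backing each circulating claim (hypothetical units)."""
    return real_value / tokens

# Baseline: 100 units of vetted knowledge backing 100 claims.
baseline = per_token_value(real_value=100.0, tokens=100.0)       # 1.0

# Inflation: claims multiply tenfold, the vetted base holds steady.
inflation = per_token_value(real_value=100.0, tokens=1000.0)     # 0.1

# Stagflation: claims multiply AND the vetting base contracts.
stagflation = per_token_value(real_value=80.0, tokens=1000.0)    # 0.08

assert stagflation < inflation < baseline
```

The point of the sketch is only that stagflation is strictly worse than inflation at the same volume: dilution and erosion of the base compound rather than cancel.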

This matters because the standard inflationary critique — "there is too much AI content" — underdescribes the damage. The volume is not the problem in isolation. The problem is that AI writing escapes the conversations that normally govern knowledge, so the mechanisms that produce real knowledge-value contract while the mechanisms that produce nominal knowledge-tokens expand. Quantity and quality move in opposite directions, and the productive base erodes.

The stagflation framing forecasts several observable effects: diminishing returns to search (more results, less findable signal), compression of expert compensation (the signal experts provide is drowned out by undifferentiated AI claims), and a preference shift toward social-proof shortcuts (trust in the person replaces trust in the argument). None of these are predictions of scarcity — they are predictions of structural devaluation under abundance.
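The first of these effects, diminishing returns to search, can be sketched as a signal-dilution curve. This is an illustrative toy model, not a claim about any real search engine: the count of genuinely informative items is held fixed while undifferentiated generated content multiplies, and precision is simply the signal fraction of what is retrieved.

```python
def search_precision(signal_items: int, noise_items: int) -> float:
    """Fraction of retrieved items that carry real signal (toy model)."""
    return signal_items / (signal_items + noise_items)

# Hold signal fixed at 50 items while generated noise multiplies.
precisions = [search_precision(50, noise) for noise in (100, 1_000, 10_000)]

# Precision falls monotonically as noise volume grows:
# roughly 0.33 -> 0.05 -> 0.005.
assert precisions[0] > precisions[1] > precisions[2]
```

More results are returned at every step, yet the findable signal per result shrinks, which is exactly the "abundance without value" shape the framing predicts.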

Counterargument: markets eventually price in unbacked tokens and discount them. But the currency-like circulation of knowledge claims depends on a shared social consensus about what counts as backing, and that consensus is precisely what AI erodes. A market cannot discount unbacked claims if its participants no longer agree on what backing is.


Source: Epistemic Inflation


AI produces epistemic stagflation — quantity of knowledge rises while value and reliability fall