Language Understanding and Pragmatics · Psychology and Social Cognition

Does AI actually commodify expertise or tokenize it?

The standard framing treats AI output as mass-produced commodities, but does AI's contextual, mutable nature fit better with token economics than with commodity theory?

Note · 2026-04-14

The reflex framing for what AI does to expertise is commodification — Marx's category for what happens when previously bespoke things get standardized, mass-produced, and sold as identical interchangeable units. On this framing, AI is the latest stage of an old process: standardization of cognitive labor.

The framing fails on the central feature. Commodities are objects: identical in form, fixed once produced, possessable, stockpileable. AI output has none of these properties. It varies per prompt, per audience, per context. It is not stored as units; it is generated on demand and disappears unless captured. It is not interchangeable with itself, because the same prompt produces different output across runs and across contexts. The category-fit between "commodity" and "AI output" is poor.

A better category is the token — borrowed from monetary theory and from Giddens' symbolic-tokens analysis. Tokens are mediums of exchange whose value is in circulation, not in possession. They are mutable in form because their function is conversion: they convert intent into something the receiver can use. Money tokenizes labor; AI tokenizes intelligence. The structural analogy is exact: a fluid, contextual, infinitely reproducible exchange medium that converts user intent into expert-seeming output, valued not by what it IS but by what it DOES for the receiver.

This reframe matters because the diagnoses and predictions diverge. The commodity frame predicts standardization and price collapse. The token frame predicts inflationary dynamics, the emergence of currency validators, and a shift in the locus of value from the artifact to the act of receiving. The published Hyperinflation post and the Knowledge Custodian series both rest on the token frame; making the frame explicit lets the rest of the series cohere theoretically.

The strongest counterargument: tokenization is just commodification at a finer grain. The reply is that no commodity has the property of being generated anew per use — that property breaks the object/stock model commodities require.


Source: Tokenization of Intelligence


AI tokenizes intelligence rather than commodifying it: flows replace stocks; contextual mutability replaces identical mass-production.