Is AI fundamentally changing how value gets produced?
Rather than automating commodity production, does AI represent a shift from making identical stockpiled objects to generating contextual tokens on demand? And what makes this genuinely new?
The commodity age — the long stretch of industrial modernity Marx analyzed — was organized around objects: things produced at scale, identical in form, possessable, stockpileable, exchangeable through their material substance. Production was material, distribution was logistical, value lived in the object. Even when knowledge entered this economy (textbooks, encyclopedias, recorded music) it was knowledge-as-object: stamped, identical, stocked.
The token age does not abolish the commodity age but layers a different productive logic over it. Production is contextual: each output is generated for the immediate use, not stamped to a template. Distribution is non-logistical: the token does not need to be moved because it is generated where it is consumed. Value lives in the relation between token and receiver, not in the token itself. The form of the productive thing is linguistic rather than material — strings of text, audio, image, code, generated at point of use.
This periodization helps explain why intuitions formed in the commodity age misfire. Commodification predictions (price collapse, standardization, deskilling) describe what happens when previously bespoke objects become mass-produced. Tokenization produces different effects: not price collapse but inflationary devaluation of the token-class as a whole; not standardization but extreme contextual variation that simulates customization; not deskilling but a transformation of skill from production to validation. "Does AI abundance actually devalue knowledge itself?" is a token-age question with no clean commodity-age analog.
The strongest counterargument: this is just digital production, not a new age. But digital production prior to AI still produced storeable artifacts (files, programs, recordings). Tokens generated on demand are not storeable in the same sense — even when captured, the captured artifact is a snapshot of a process, not a productive object. The category is genuinely new.
Source: Tokenization of Intelligence
Related concepts in this collection
- Does AI actually commodify expertise or tokenize it?
  The standard framing treats AI output like mass-produced commodities, but does AI's contextual, mutable nature fit better with token economics than commodity theory?
  (the structural reframe that this note periodizes)
- Does AI abundance actually devalue knowledge itself?
  If AI generates vastly more claims than humans can evaluate, does the sheer volume undermine the social processes that normally establish what counts as reliable knowledge? And what would that erosion look like?
  (an effect that follows the token-age logic)
- Does AI reshape expert work into knowledge management?
  As AI generates knowledge at scale, does expert work shift from creating new understanding to curating and validating machine outputs? This matters because curation and creation demand different cognitive skills.
  (the role-shift that the periodization explains)
Original note title: AI marks the transition from the age of the commodity to the age of the token: stocks to flows, identical to mutable, material to linguistic