Language Understanding and Pragmatics

Does AI text generation unfold through temporal reflection?

Explores whether the sequential ordering of tokens in LLM generation constitutes genuine temporal thought or merely probabilistic computation without reflective duration.

Note · 2026-04-14
What kind of thing is an LLM really?

Human writing is temporal in a specific sense. A writer reflects in time, and the sentence that follows emerges from the time spent thinking about the sentence before it. The order of one thought after another is a temporal order: the later thought is later because something happened in the interval — consideration, revision, reaction. Time is constitutive of what the next thought becomes.

LLM generation also produces one token after another, but the ordering principle is different. The next token is selected by probability conditional on the prior sequence. Nothing happens in the interval between tokens except the computation of the next distribution. There is no reflection, no revision, no duration in which the claim is tested against what has come before. The order is strictly sequential, but it is not temporal in the reflective sense: it is computed ordering, not lived ordering.
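The generative loop described above can be made concrete with a minimal sketch. This is not a real LLM: the "model" is a hypothetical bigram table standing in for a learned conditional distribution, and real models condition on the whole context rather than the last token. But the loop structure is the point — between tokens, the only thing that happens is computing and sampling from the next distribution.

```python
import random

# Hypothetical bigram table standing in for a learned model.
# Each entry maps a token to a probability distribution over next tokens.
BIGRAM = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 1.0},
    "dog": {"sat": 1.0},
    "sat": {"</s>": 1.0},
}

def generate(seed=0, max_len=10):
    rng = random.Random(seed)
    tokens = ["<s>"]
    while tokens[-1] != "</s>" and len(tokens) < max_len:
        # Distribution conditioned on the prior sequence (here, just the
        # last token). Nothing occurs between tokens except this
        # computation and a sample from it -- no interval of reflection.
        dist = BIGRAM[tokens[-1]]
        choices, weights = zip(*dist.items())
        tokens.append(rng.choices(choices, weights=weights)[0])
    # Strip the boundary markers before returning.
    return [t for t in tokens if t not in ("<s>", "</s>")]
```

Every run traces the same strictly sequential structure: determiner, then noun, then verb. The ordering is fully determined by conditional probability at each step, which is the "computed ordering" the note contrasts with lived ordering.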

This matters for how AI-generated text relates to discourse. Human discourse is temporal because it is made of moves that respond to prior moves, anticipate future moves, and take time to make. AI text has the surface form of such a move but lacks the temporal structure that would give it its meaning. The text appears, in a sense, all at once — even though it was produced sequentially — because the production time is not the time of anyone's thinking.

This is adjacent to but distinct from the note "Does LLM generation explore competing claims while producing text?". Smoothness describes the absence of turbulent counter-exploration; atemporality describes the absence of duration-in-reflection. Both properties follow from the same generative process but bear on different dimensions of what makes discourse discursive.


Source: Epistemic Inflation

Original note title: AI knowledge is atemporal — probabilistic token ordering is sequence not temporal flow