Conversational AI Systems · Language Understanding and Pragmatics · Psychology and Social Cognition

Why do users drift away from their original information need?

When users know their knowledge is incomplete but cannot articulate what's missing, do they unintentionally shift topics? And can real-time systems detect this drift?

Note · 2026-02-22 · sourced from Question Answer Search
Why do AI conversations reliably break down after multiple turns? · What kind of thing is an LLM really? · How should researchers navigate LLM reasoning research?

Information science identified a specific cognitive condition decades before conversational AI made it a practical design problem. Belkin & Vickery (1985) describe it as an "anomalous state of knowledge" (ASK): users know their knowledge is incomplete but cannot articulate what is missing. They need something but cannot specify what.

This matters because it produces a specific observable behavior: unintentional topic drift. Users in an ASK state begin pursuing one information need, then gradually deviate into sub-topics without noticing. They do not decide to change topic; they drift, because each intermediate result partially addresses their need while also exposing adjacent gaps, pulling their attention sideways.

The Topic Shift Detection paper demonstrates that this drift is detectable. Its model predicts with 84% precision which utterances belong to the major topic and which deviate from it, without a predetermined topic set. That constraint matters: open-domain systems cannot predefine all possible topics, so detection must work from conversational dynamics alone.
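
A minimal sketch of how such detection could work from conversational dynamics alone (my own construction, not the paper's model; the encoder, the threshold, and the centroid scheme are all placeholder assumptions): score each turn against a running centroid of the turns assigned so far to the major topic, and flag turns that fall below a similarity threshold.

```python
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Stand-in encoder: hashed bag-of-words. Swap in any real sentence encoder."""
    v = np.zeros(dim)
    for token in text.lower().split():
        v[hash(token) % dim] += 1.0
    norm = np.linalg.norm(v)
    return v / norm if norm else v

def detect_drift(utterances: list[str], threshold: float = 0.3) -> list[bool]:
    """Flag utterances that deviate from the running major-topic centroid.

    No predetermined topic set: the "major topic" is simply the centroid of
    the turns not yet flagged as drift, so detection relies on conversational
    dynamics alone. The threshold is an assumption, not the paper's value.
    """
    flags: list[bool] = []
    centroid = None
    for text in utterances:
        v = embed(text)
        if centroid is None:
            centroid = v
            flags.append(False)           # the first turn seeds the major topic
            continue
        similarity = float(centroid @ v)  # cosine similarity (both unit-norm)
        is_drift = similarity < threshold
        flags.append(is_drift)
        if not is_drift:                  # only on-topic turns update the centroid
            centroid = centroid + v
            centroid = centroid / np.linalg.norm(centroid)
    return flags
```

Updating the centroid only with on-topic turns is the key design choice in this sketch: it keeps the notion of "major topic" anchored, so a run of drifting turns cannot quietly redefine what the conversation is about.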

This complements the gulf of envisioning on the user side. Per How do users actually form intent when prompting AI systems?, the gulf describes the intent-formation challenge. ASK names a specific upstream cause: the user's knowledge state is anomalous in a way that prevents intent articulation. And it predicts a specific downstream effect: topic drift.

The two phenomena create a feedback loop: anomalous knowledge → vague query → partial results → exposed new gaps → drift into sub-topic → further from original need → more anomalous knowledge. Without active intervention, the user spirals away from their actual information need.
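
A toy model makes the spiral concrete (purely illustrative assumptions, not from the source): if each partial result pulls attention one step toward an adjacent gap, and drifting leaves the knowledge state more anomalous so the next query is vaguer, the distance from the original need compounds turn over turn.

```python
def simulate_drift(turns: int, step: float = 0.1, compounding: float = 1.3) -> list[float]:
    """Distance from the original information need after each turn.

    Assumptions (illustrative only): every partial result pulls focus one
    step toward an adjacent gap, and the step grows because drifting leaves
    the knowledge state more anomalous, producing vaguer follow-up queries.
    """
    distance, history = 0.0, []
    for turn in range(turns):
        distance += step * compounding ** turn  # the spiral compounds
        history.append(round(distance, 2))
    return history

print(simulate_drift(6))  # distances accelerate turn over turn; without intervention, no recovery
```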

This also connects to the AI-side problem. Per Why do language models engage with conversational distractors?, the drift is bilateral: the user drifts because of ASK, and the AI follows because it lacks topic-following discipline. Neither party maintains the thread. The paper argues for "context-dependent user guidance without presupposing a strict hierarchy of plans and task goals": guidance that adapts to where the user is rather than where a predetermined dialogue tree expects them to be.
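
One way a system could operationalize that kind of guidance (a sketch under my own assumptions; the paper prescribes no implementation): pair a drift detector like the one above with a reply wrapper that answers the drifted question but makes the departure explicit and offers a way back, rather than imposing a fixed dialogue plan.

```python
def guided_reply(answer: str, on_topic: bool, original_need: str) -> str:
    """Wrap an assistant answer with re-anchoring guidance when drift is flagged.

    `on_topic` would come from a detector like detect_drift above; the nudge
    wording and this whole interface are assumptions, not the paper's design.
    """
    if on_topic:
        return answer
    # Adapt to where the user is: answer the drifted question rather than
    # refusing it, but surface the original need instead of silently following.
    return (
        f"{answer}\n\n"
        f"(This has moved away from your original question about "
        f"{original_need}. Want to return to it, or keep exploring here?)"
    )

print(guided_reply("Training typically needs ...", on_topic=False,
                   original_need="tuning learning rates"))
```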


Source: Question Answer Search

anomalous state of knowledge is a distinct cognitive condition where users cannot articulate incomplete knowledge, leading to unintentional topic drift that is detectable in real time