Language Understanding and Pragmatics · Conversational AI Systems · Psychology and Social Cognition

What makes explanations work in real conversation?

Does explanation quality depend on how dialogue partners interact—testing understanding, adjusting based on feedback, and coordinating their communicative moves—rather than on information content alone?

Note · 2026-02-22 · sourced from Conversation Topics Dialog
Where exactly does language competence break down in LLMs?
Why do AI conversations reliably break down after multiple turns?
How should researchers navigate LLM reasoning research?

Explanation in conversation is not a one-way delivery of information from explainer to explainee. It is a co-construction in which both participants shape the quality of understanding achieved. The Wachsmuth corpus formalizes this through three interacting dimensions of each dialogue turn:

Topic relation — how each turn's content relates to the main topic (e.g., main topic, subtopic, or related topic).

Dialogue act — the communicative function of the turn, drawn from a 10-category scheme (e.g., informing statement, question).

Explanation move — the pedagogical function of the turn, drawn from a 10-category scheme (e.g., providing explanation, testing understanding, testing prior knowledge); a minimal data-structure sketch follows below.
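
To make the scheme concrete, here is a minimal sketch of one annotated turn as a data structure. The label sets shown are only the subsets this note mentions, not the corpus's full 10-category schemes, and `AnnotatedTurn`, its field names, and the example turn are illustrative assumptions rather than the corpus's actual format.

```python
from dataclasses import dataclass

# Illustrative label subsets drawn from this note; the corpus's full
# 10-category schemes for acts and moves are not reproduced here.
TOPIC_RELATIONS = {"main topic", "subtopic", "related topic"}
DIALOGUE_ACTS = {"informing statement", "question"}
EXPLANATION_MOVES = {"providing explanation", "testing understanding",
                     "testing prior knowledge"}

@dataclass(frozen=True)
class AnnotatedTurn:
    """One dialogue turn labeled on all three interacting dimensions."""
    speaker: str           # "explainer" or "explainee"
    text: str
    topic_relation: str    # how the turn's content relates to the main topic
    dialogue_act: str      # the turn's communicative function
    explanation_move: str  # the turn's pedagogical function

# A hypothetical annotated turn: one explanation move, realized as an
# informing statement, on a subtopic of the main topic.
turn = AnnotatedTurn(
    speaker="explainer",
    text="Think of spacetime as a fabric that mass bends.",
    topic_relation="subtopic",
    dialogue_act="informing statement",
    explanation_move="providing explanation",
)
```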

The critical insight is that these three dimensions interact to determine explanation success. A turn that provides explanation (move) through an informing statement (act) on a subtopic (topic) has different predictive value than the same explanation move delivered via a question on a related topic. The combinatorial space is what matters — not any single dimension.
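
One way to operationalize the point that the combinatorial space matters is to featurize each turn as the joint triple rather than as three independent labels. This is a hedged sketch, not the corpus authors' method: `joint_features` is hypothetical and assumes the `AnnotatedTurn` objects from the sketch above.

```python
from collections import Counter

def joint_features(turns):
    """Count (topic_relation, dialogue_act, explanation_move) triples.

    The joint triple is the feature, not any single dimension: the same
    explanation move contributes different evidence for success depending
    on which act and topic relation it co-occurs with.
    """
    return Counter(
        (t.topic_relation, t.dialogue_act, t.explanation_move)
        for t in turns
    )

# A downstream success classifier would consume the joint counts, e.g.:
#   features = joint_features(dialogue_turns)
#   score = model.predict(vectorize(features))  # model, vectorize: hypothetical
```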

This directly challenges how LLMs approach explanation: they typically generate monological explanations without checking understanding, testing prior knowledge, or adjusting based on feedback. Building on the related note "What three layers must discourse systems actually track?", the explanation corpus adds that explanation itself has three irreducible components, and current models handle at most one (providing information) while ignoring the dialogical dimensions.
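
For contrast, a dialogical explanation policy would have to interleave exactly the moves listed above. The loop below is an illustrative sketch only: `generate` and `ask_user` are assumed text-in/text-out callables, not any specific model API.

```python
def explain_dialogically(generate, ask_user, topic):
    """Illustrative control loop that interleaves the dialogical moves
    monological generation skips: testing prior knowledge, providing
    explanation in small steps, testing understanding, and adjusting
    based on feedback."""
    # Test prior knowledge before explaining anything.
    known = ask_user(generate(
        f"Ask one short question probing prior knowledge of {topic}."))

    done = False
    while not done:
        # Provide one piece of explanation, conditioned on what the
        # explainee has said so far (adjusting based on feedback).
        chunk = generate(
            f"Explain the next step of {topic}, given they said: {known}")
        # Test understanding instead of assuming it.
        known = ask_user(chunk + " Can you restate that in your own words?")
        verdict = generate(
            f"Given this restatement, answer 'continue' or 'done': {known}")
        done = verdict.strip().lower().startswith("done")
```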

The methodology extends Rohlfing et al.'s (2021) characterization of explaining as "an intrinsically dialogical process in which participants co-construct an explanation." This is not an abstract position: the corpus provides empirical evidence that interaction patterns, not just content quality, predict whether the explainee actually understands.


Source: Conversation Topics Dialog

dialogical explanation quality depends on three interacting dimensions — topic relation, dialogue act, and explanation move — that jointly predict success