Conversational AI Systems · Recommender Systems

Does conversation order matter for recommending items in dialogue?

Conversational recommendation systems typically ignore the sequence in which items are mentioned, treating dialogue as a bag of entities. But does the order itself carry predictive signal about what to recommend next?

Note · 2026-05-03 · sourced from Recommenders Conversational

CRS dialogues mention items and entities in order: people discuss Fast & Furious 1 before recommending Fast & Furious 4, and they mention a director before items by that director. The order is informative — recommending the next sequel makes sense once the prior installment is the topic of discussion; recommending a film by a freshly-mentioned director makes sense once that director is in context. Yet most prior CRS work treated the conversation as a bag of mentioned entities, discarding this sequential structure.
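A toy illustration of what the bag representation loses (the entity IDs here are invented for the example): two dialogues that mention the same entities are indistinguishable as bags, even though their mention orders — and hence the sensible next recommendation — differ.

```python
# Toy sketch, not from the paper: same entity multiset, different order.
from collections import Counter

# Hypothetical entity IDs for two dialogues in a movie CRS.
dialogue_a = ["james_wan", "fast_furious_1", "fast_furious_2"]
dialogue_b = ["fast_furious_2", "fast_furious_1", "james_wan"]

# Bag-of-entities: the two representations collapse into one.
bag_a, bag_b = Counter(dialogue_a), Counter(dialogue_b)
print(bag_a == bag_b)  # True -- order information is gone

# A sequence representation keeps the signal: the most recent mention
# differs, so a sequential model can condition on it.
print(dialogue_a[-1], dialogue_b[-1])  # fast_furious_2 vs james_wan
```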

TSCR brings transformer-based sequential modeling into CRS. The conversation is represented as a sequence of items and entities in mention-order, and a transformer learns the dependencies between adjacent and non-adjacent items in the sequence. User preferences are inferred not just from "what was mentioned" but from "what was mentioned in what order". This captures sequential dependencies that knowledge-graph and entity-linking approaches miss.
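The order-sensitivity can be sketched with a minimal single-head causal self-attention over the mention sequence. This is a simplification under invented assumptions (random embeddings, a tiny catalogue, a dot-product scoring head) — not TSCR's actual architecture or training objective — but it shows how a transformer's next-item scores depend on mention order, not just the mentioned set.

```python
# Toy sketch: single-head causal self-attention over a mention sequence.
# Embedding table, dimensions, and scoring head are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_items, d = 6, 8                      # toy catalogue size, embedding dim
E = rng.normal(size=(n_items, d))      # item/entity embedding table

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def next_item_scores(seq):
    """Score every catalogue item as the next mention, given the order."""
    X = E[seq]                                   # (T, d) mention embeddings
    logits = X @ X.T / np.sqrt(d)                # scaled dot-product attention
    mask = np.triu(np.ones_like(logits), k=1)    # causal mask: no peeking ahead
    logits = np.where(mask == 1, -1e9, logits)
    H = softmax(logits) @ X                      # contextualised states
    return H[-1] @ E.T                           # last state scores all items

# Same entity set, different mention order -> different next-item scores.
s1 = next_item_scores([0, 1, 2])
s2 = next_item_scores([2, 1, 0])
print(np.allclose(s1, s2))  # False: order changes the prediction
```

A bag-of-entities model would, by construction, give identical scores for the two inputs; the sequential model distinguishes them because the final query token differs.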

The architectural move is small but structurally important: it imports sequence modeling techniques from sequential recommendation (where user purchase histories form sequences) into a domain (CRS) that had been treating conversations as static feature bags. The result is improved recommendation accuracy on standard CRS benchmarks. The general lesson: when a domain throws away order, ask why — and check whether the order carries information that the bag-of-features representation can't access.



CRS items mentioned in conversation form sequences with prequel-sequel dependencies — Transformer sequential modeling improves recommendation