OpenDialKG: Explainable Conversational Reasoning with Attention-based Walks over Knowledge Graphs
We study a conversational reasoning model that strategically traverses a large-scale common fact knowledge graph (KG) to introduce engaging and contextually diverse entities and attributes. For this study, we collect a new Open-ended Dialog ↔ KG parallel corpus called OpenDialKG, where each utterance from 15K human-to-human role-playing dialogs is manually annotated with ground-truth reference to corresponding entities and paths from a large-scale KG with 1M+ facts. We then propose the DialKG Walker model that learns the symbolic transitions of dialog contexts as structured traversals over a KG, and predicts natural entities to introduce given previous dialog contexts via a novel domain-agnostic, attention-based graph path decoder. Automatic and human evaluations show that our model can retrieve more natural and human-like responses than state-of-the-art baselines or rule-based models, in both in-domain and cross-domain tasks.
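To make the idea of an attention-based walk over a KG concrete, the following is a minimal, hypothetical sketch of a single walk step: a dialog-context vector attends over the outgoing (relation, entity) edges of the current node, and the highest-weighted edge is the candidate next hop. All names, embeddings, and the dot-product scoring function here are illustrative stand-ins, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Toy KG fragment: outgoing edges of the current entity, as (relation, target).
edges = [("directed_by", "Spielberg"), ("genre", "Sci-Fi"), ("starred_in", "Neeson")]

# Random vectors standing in for learned relation and entity embeddings.
rel_emb = {r: rng.normal(size=DIM) for r, _ in edges}
ent_emb = {e: rng.normal(size=DIM) for _, e in edges}

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def walk_step(context, edges):
    """Attend over outgoing edges; return them ranked by attention weight."""
    # Score each edge by the dot product between the dialog context and a
    # simple additive edge representation (relation + target entity).
    scores = np.array([context @ (rel_emb[r] + ent_emb[e]) for r, e in edges])
    weights = softmax(scores)
    order = np.argsort(-weights)
    return [(edges[i], float(weights[i])) for i in order]

context = rng.normal(size=DIM)  # stand-in for an encoded dialog history
ranked = walk_step(context, edges)
print(ranked[0])  # most probable next (relation, entity) hop under this toy scorer
```

In the full model, repeating this step yields a multi-hop KG path conditioned on the conversation; here a single step only illustrates the attention-over-edges mechanism.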