Think-on-Graph: Deep and Responsible Reasoning of Large Language Model with Knowledge Graph
Large language models (LLMs) have made significant strides across a wide range of tasks, yet they often struggle with complex reasoning and perform poorly in scenarios where knowledge traceability, timeliness, and accuracy are crucial. To address these limitations, we present Think-on-Graph (ToG), a novel framework that leverages knowledge graphs to enhance LLMs' ability to perform deep and responsible reasoning. ToG first identifies entities relevant to a given question and then conducts exploration and reasoning to retrieve related triples from an external knowledge database. This iterative procedure generates multiple reasoning pathways consisting of sequentially connected triples until sufficient information is gathered to answer the question or the maximum search depth is reached. Through experiments on complex multi-hop question-answering tasks, we demonstrate that ToG outperforms existing methods, effectively addressing the aforementioned limitations of LLMs without incurring additional training costs.
Building upon the need for deep and responsible reasoning, we introduce the Think-on-Graph (ToG) framework, which leverages factual knowledge to drive the step-by-step thinking of LLMs. Unlike previous approaches that decompose the input question into sub-questions Li et al. (2023), ToG first identifies topic entities in the input question. It then iteratively retrieves relevant triples through exploration and reasoning over external knowledge bases using the LLM. This iterative process generates multiple high-probability reasoning pathways consisting of sequentially connected triples, and continues until sufficient information is acquired to answer the question or the maximum depth is reached. Following this procedure, we can significantly enhance the factual reasoning capabilities of LLMs, effectively mitigating hallucination and improving response precision.
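The iterative procedure described above can be sketched as a beam search over knowledge-graph triples. The following is a minimal illustration, not the paper's implementation: the toy graph `KG`, the keyword-overlap `score_triple`, and the path-length `sufficient` check are placeholder assumptions standing in for the actual knowledge base and the LLM's pruning and "enough information?" judgments.

```python
# Toy knowledge graph (assumption for illustration):
# entity -> list of (relation, tail_entity) pairs.
KG = {
    "Canberra": [("capital_of", "Australia")],
    "Australia": [("continent", "Oceania"), ("currency", "Australian dollar")],
}

def score_triple(question, triple):
    # Placeholder for the LLM's relevance score; here, crude keyword overlap.
    words = set(question.lower().split())
    return sum(1 for part in triple for w in str(part).lower().split() if w in words)

def sufficient(question, paths):
    # Placeholder for the LLM deciding whether the gathered triples
    # already answer the question; here, a fixed path-length heuristic.
    return any(len(p) >= 3 for p in paths)

def think_on_graph(question, topic_entities, max_depth=3, beam_width=2):
    """Iteratively extend reasoning pathways of sequentially connected triples."""
    # Each path starts from a topic entity identified in the question.
    paths = [[("<topic>", "is", e)] for e in topic_entities]
    for _ in range(max_depth):
        candidates = []
        for path in paths:
            frontier = path[-1][-1]  # tail entity of the last triple
            for rel, tail in KG.get(frontier, []):
                candidates.append(path + [(frontier, rel, tail)])
        if not candidates:
            break  # no further triples to explore
        # Keep only the most promising pathways (LLM-guided pruning in ToG).
        candidates.sort(key=lambda p: score_triple(question, p[-1]), reverse=True)
        paths = candidates[:beam_width]
        if sufficient(question, paths):
            break  # enough information gathered to answer
    return paths

paths = think_on_graph("What continent is Canberra, the capital, located on?",
                       ["Canberra"])
```

In this sketch the top-ranked pathway reaches `("Australia", "continent", "Oceania")` after two expansion steps, mirroring how ToG chains triples until the question can be answered or the depth limit is hit.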