arXiv:2405.18357

Faithful Logical Reasoning via Symbolic Chain-of-Thought

Published on May 28, 2024
Authors:

Abstract

While the recent Chain-of-Thought (CoT) technique enhances the reasoning ability of large language models (LLMs) with the theory of mind, it may still struggle with logical reasoning that relies heavily on symbolic expressions and rigid deduction rules. To strengthen the logical reasoning capability of LLMs, we propose a novel Symbolic Chain-of-Thought, namely SymbCoT, a fully LLM-based framework that integrates symbolic expressions and logic rules with CoT prompting. Technically, building upon an LLM, SymbCoT 1) first translates the natural-language context into a symbolic format, 2) then derives a step-by-step plan to solve the problem with symbolic logical rules, and 3) finally employs a verifier to check both the translation and the reasoning chain. Through thorough evaluations on 5 standard datasets covering both First-Order Logic and Constraint Optimization symbolic expressions, SymbCoT consistently shows striking improvements over the CoT method while setting new state-of-the-art performance. We further demonstrate that our system delivers more faithful, flexible, and explainable logical reasoning. To our knowledge, this is the first work to combine symbolic expressions and rules with CoT for logical reasoning with LLMs. Code is available at https://github.com/Aiden0526/SymbCoT.
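For intuition, below is a minimal Python sketch of the three-stage pipeline the abstract describes (Translator, Planner/Solver, Verifier), all driven by the same LLM. It is an illustrative assumption only: the prompts, function names, and the call_llm hook are ours, not the authors'; the actual implementation is in the linked repository.

from typing import Callable

# Minimal sketch of a SymbCoT-style pipeline. Illustrative only: prompts,
# names, and the `call_llm` hook are assumptions, not the authors' code
# (see https://github.com/Aiden0526/SymbCoT for the real implementation).
LLM = Callable[[str], str]  # any function mapping a prompt to a completion


def translate(call_llm: LLM, context: str, question: str) -> str:
    """Step 1: translate the natural-language premises and question
    into a symbolic format such as First-Order Logic."""
    prompt = (
        "Translate the following premises and question into First-Order Logic.\n"
        f"Premises:\n{context}\n\nQuestion:\n{question}"
    )
    return call_llm(prompt)


def plan_and_solve(call_llm: LLM, symbolic: str) -> str:
    """Step 2: derive a step-by-step plan and apply symbolic inference
    rules (e.g., modus ponens) to reach an answer."""
    prompt = (
        "Given the First-Order Logic premises and question below, write a "
        "step-by-step proof plan, apply logical inference rules, and conclude "
        "whether the statement is True, False, or Unknown.\n\n" + symbolic
    )
    return call_llm(prompt)


def verify(call_llm: LLM, symbolic: str, reasoning: str) -> str:
    """Step 3: re-check the translation and every step of the reasoning
    chain, then return the (possibly corrected) final answer."""
    prompt = (
        "Check that the symbolic translation is faithful to the original text "
        "and that each reasoning step follows from valid inference rules. "
        "Fix any mistakes and state the final answer.\n\n"
        f"Translation:\n{symbolic}\n\nReasoning:\n{reasoning}"
    )
    return call_llm(prompt)


def symbcot(call_llm: LLM, context: str, question: str) -> str:
    """Run Translator -> Planner/Solver -> Verifier with a single LLM."""
    symbolic = translate(call_llm, context, question)
    reasoning = plan_and_solve(call_llm, symbolic)
    return verify(call_llm, symbolic, reasoning)

Usage would look like symbcot(my_llm, premises, question), where my_llm wraps any chat-completion endpoint; passing the LLM client as a parameter keeps the sketch self-contained and independent of any particular API.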


Models citing this paper: 2

Datasets citing this paper: 0


Spaces citing this paper: 6

Collections including this paper: 0
