Chain-of-Thought (CoT)
Also known as: Chain-of-Thought / CoT / 思考の連鎖 / チェーンオブソート
A prompting technique that elicits step-by-step reasoning from an LLM before it gives its final answer, substantially improving accuracy on multi-step math, logic, and planning tasks.
Overview
Introduced by Wei et al. in 2022, CoT asks the LLM to produce intermediate reasoning steps rather than a direct answer. It substantially improves accuracy on multi-step math, logic puzzles, and planning tasks. Even a simple 'Let's think step by step' cue in the prompt activates CoT behavior.
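As a minimal sketch of the zero-shot cue described above: the helper below simply appends the 'Let's think step by step' phrase to a question. The function name `build_cot_prompt` is illustrative, and the actual model call is omitted; in practice you would send the resulting prompt to your LLM provider's API.

```python
def build_cot_prompt(question: str) -> str:
    """Append the zero-shot CoT cue so the model emits reasoning steps."""
    return f"{question}\nLet's think step by step."

# The cue turns a direct-answer prompt into a reasoning prompt.
prompt = build_cot_prompt(
    "A cafeteria had 23 apples. They used 20 and bought 6 more. "
    "How many apples do they have?"
)
print(prompt)
```

Few-shot CoT works the same way, except the cue is replaced by worked examples whose answers include explicit intermediate steps.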
Extensions
Self-Consistency generates multiple independent reasoning chains and takes a majority vote. ReAct interleaves reasoning and tool use. Tree-of-Thought explores a branching solution space. All build on the core CoT insight that explicit intermediate reasoning improves final answers.
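The Self-Consistency aggregation step can be sketched in a few lines: sample several chains independently (e.g. with a nonzero temperature), extract each chain's final answer, and return the most common one. The chain-sampling and answer-extraction steps are assumed to happen upstream; only the majority vote is shown.

```python
from collections import Counter

def self_consistency_vote(final_answers: list[str]) -> str:
    """Majority vote over final answers from independent reasoning chains."""
    return Counter(final_answers).most_common(1)[0][0]

# Suppose five independently sampled chains ended with these answers:
winner = self_consistency_vote(["9", "9", "8", "9", "11"])
print(winner)  # -> "9"
```

The vote discards the reasoning text itself and compares only the extracted answers, which is why Self-Consistency helps most when individual chains are error-prone but errors are uncorrelated.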