Grounding
Also known as: Grounding / グラウンディング (grounding) / 事実根拠付け (factual grounding)
The practice of anchoring LLM outputs to verifiable external sources (documents, databases, search results) to reduce hallucination. Retrieval-Augmented Generation (RAG) is the dominant technical approach to grounding.
Overview
Grounding anchors LLM responses to specific external sources — retrieved documents, web search results, or database query outputs — reducing hallucination and improving verifiability. Citing sources in the response lets users check the evidence behind an answer.
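A minimal sketch of how a grounded-generation pipeline can be assembled: retrieve relevant sources, number them in the prompt, and instruct the model to answer only from those sources and cite them. The corpus, the keyword retriever, and the call_llm stub below are illustrative assumptions, not a specific vendor's API.

```python
# Sketch of grounded generation: retrieve sources, pass them to the model
# with a citation instruction, and tie the answer back to source IDs.
# The corpus, retriever, and call_llm stub are illustrative placeholders.

CORPUS = [
    {"id": "doc-1", "text": "The 2024 policy caps reimbursement at 80% of cost."},
    {"id": "doc-2", "text": "Claims must be filed within 90 days of treatment."},
]

def retrieve(query: str, corpus: list[dict], k: int = 2) -> list[dict]:
    """Naive keyword-overlap retrieval; real systems use vector or hybrid search."""
    def score(doc: dict) -> int:
        q = set(query.lower().split())
        d = set(doc["text"].lower().split())
        return len(q & d)
    return sorted(corpus, key=score, reverse=True)[:k]

def build_grounded_prompt(query: str, sources: list[dict]) -> str:
    """Number each source so the model can cite it as [1], [2], ..."""
    numbered = "\n".join(
        f"[{i + 1}] ({s['id']}) {s['text']}" for i, s in enumerate(sources)
    )
    return (
        "Answer using ONLY the sources below. Cite sources as [n]. "
        "If the sources do not contain the answer, say so.\n\n"
        f"Sources:\n{numbered}\n\nQuestion: {query}\nAnswer:"
    )

def call_llm(prompt: str) -> str:
    """Placeholder for a real model call (e.g. an enterprise LLM API client)."""
    raise NotImplementedError

if __name__ == "__main__":
    question = "What is the reimbursement cap?"
    sources = retrieve(question, CORPUS)
    print(build_grounded_prompt(question, sources))
```

The numbered-source format is one common convention; what matters is that each claim in the response can be traced to a specific retrieved source the user can inspect.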
Enterprise importance
Regulated industries (medical, legal, financial) require evidence-backed AI responses, making grounding a non-negotiable design requirement. Anthropic, Google, and others treat grounded generation as a core safety feature of their enterprise offerings.