---
title: AI hallucination
slug: nccf0505
canonical_url: https://docs.coveo.com/en/nccf0505/
collection: glossary
source_format: adoc
---

# AI hallucination

{doctitle} is a phenomenon where a generative large language model (LLM) produces text that's nonsensical or factually inaccurate. [Relevance Generative Answering (RGA)](https://docs.coveo.com/en/nbtb6010/) uses [grounding](https://docs.coveo.com/en/nccf0415/) to reduce the chances of AI hallucinations.
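
Conceptually, grounding reduces hallucination by retrieving passages from indexed content and constraining the model to answer from them rather than from its parametric memory alone. The sketch below illustrates this general pattern only; the retrieval logic, function names, and prompt template are hypothetical stand-ins, not Coveo's actual RGA implementation.

```python
import re
from typing import List

def tokenize(text: str) -> set:
    """Lowercase word tokens with punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve_passages(query: str, index: List[str], k: int = 3) -> List[str]:
    """Toy lexical retrieval: rank indexed passages by query-term overlap."""
    terms = tokenize(query)
    ranked = sorted(index, key=lambda p: len(terms & tokenize(p)), reverse=True)
    return ranked[:k]

def build_grounded_prompt(query: str, passages: List[str]) -> str:
    """Constrain the model to answer only from the retrieved passages."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer the question using ONLY the context below. If the context "
        "doesn't contain the answer, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

if __name__ == "__main__":
    # Hypothetical indexed content used only for this illustration.
    index = [
        "Grounding restricts a generative model to retrieved source documents.",
        "An AI hallucination is generated text that's nonsensical or inaccurate.",
        "Relevance Generative Answering produces answers from indexed content.",
    ]
    query = "What is grounding?"
    print(build_grounded_prompt(query, retrieve_passages(query, index, k=2)))
```

Because the prompt instructs the model to decline when the retrieved context lacks an answer, the model is steered away from fabricating one, which is the core idea behind grounding.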