Glossary

Hallucination (LLM)

The phenomenon in which a large language model generates fluent, confident-sounding text that is factually incorrect, posing particular risks in high-stakes compliance contexts.
