
The hallucination problem

The hallucination problem, in which LLMs generate fluent, confident text that is factually incorrect, is a fundamental limitation for political communications use. LLMs are fluency machines, not accuracy machines. Robust human review and fact-checking infrastructure is therefore an ethical requirement for any LLM deployment in political contexts.
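A minimal sketch of what such a review gate might look like in code, assuming a simple pipeline where no LLM output can publish without explicit human approval. All names here (Draft, submit_for_review, human_review, publish) are hypothetical illustrations, not any real library's API:

```python
# Hypothetical human-review gate for LLM-generated copy (illustrative only).
from dataclasses import dataclass, field
from enum import Enum


class Status(Enum):
    PENDING = "pending"      # awaiting human fact-check
    APPROVED = "approved"    # verified and cleared for publication
    REJECTED = "rejected"    # contained unverifiable or false claims


@dataclass
class Draft:
    text: str
    status: Status = Status.PENDING
    notes: list[str] = field(default_factory=list)


def submit_for_review(text: str) -> Draft:
    """Every LLM output enters the queue as PENDING; nothing publishes directly."""
    return Draft(text=text)


def human_review(draft: Draft, approved: bool, notes: list[str]) -> Draft:
    """A human fact-checker is the only actor who can change a draft's status."""
    draft.status = Status.APPROVED if approved else Status.REJECTED
    draft.notes.extend(notes)
    return draft


def publish(draft: Draft) -> str:
    """Refuse to publish anything a human has not explicitly approved."""
    if draft.status is not Status.APPROVED:
        raise PermissionError(f"Draft is {draft.status.value}; human approval required.")
    return draft.text


# Example: a fluent but unsourced claim is held, reviewed, and rejected.
draft = submit_for_review("Turnout in the district rose 40% last cycle.")
draft = human_review(draft, approved=False, notes=["No source for the 40% figure."])
try:
    publish(draft)
except PermissionError as err:
    print(err)
```

The design choice worth noting is that approval is a separate, human-only step rather than a flag the generator can set on its own output; fluency never substitutes for verification.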
