Hallucination

When a model generates something that sounds right but isn't. Not a bug — it's the default behavior. The model is always predicting the next token, never retrieving facts from a database.

Back pressure is the main defense: don't trust output, verify it. Run the code, check the docs, read the diff. RAG helps too — grounding generation in retrieved source material reduces the room for invention.
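A minimal sketch of that verification loop, assuming generated Python and a caller-supplied test (the `verify_generated_code` helper and the sample snippet are hypothetical, for illustration only):

```python
import subprocess
import sys
import tempfile

def verify_generated_code(code: str, test: str) -> bool:
    """Execute model-generated code plus a test in a subprocess;
    trust the output only if the test passes (hypothetical helper)."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code + "\n" + test + "\n")
        path = f.name
    result = subprocess.run([sys.executable, path], capture_output=True)
    return result.returncode == 0

# A plausible-looking but wrong snippet a model might emit:
generated = "def mean(xs): return sum(xs) / (len(xs) - 1)"
test = "assert mean([2, 4]) == 3"
print(verify_generated_code(generated, test))  # False: running it catches the bug
```

The point is not the harness itself but the stance: the model's output is a draft, and only an external check (an interpreter, a test, a human reading the diff) promotes it to something trusted.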

© 2026 siever.ing