Hallucination
When a model generates something that sounds right but isn't. Not a bug but the default behavior: the model is always predicting the next token, never retrieving facts from a database.
Back pressure is the main defense: don't trust output, verify it. Run the code, check the docs, read the diff. RAG (retrieval-augmented generation) helps too: grounding generation in retrieved source material shrinks the room for invention.
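The "don't trust, verify" loop can be sketched as running generated code against a known test before accepting it. This is a minimal illustration, not a prescribed workflow; `verify_generated_code` and the sample snippets are hypothetical stand-ins for real model output.

```python
import subprocess
import sys
import tempfile

def verify_generated_code(code: str, test: str) -> bool:
    """Run model output plus a test in a subprocess; trust it only if the test passes."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code + "\n" + test + "\n")
        path = f.name
    result = subprocess.run([sys.executable, path], capture_output=True, text=True)
    return result.returncode == 0

# Hypothetical stand-ins for model output: one correct, one subtly wrong.
good = "def double(x):\n    return x * 2"
bad = "def double(x):\n    return x + 2"  # hallucinated off-by-one that still "sounds right"
test = "assert double(3) == 6"

print(verify_generated_code(good, test))  # True
print(verify_generated_code(bad, test))   # False
```

The point is that the check is external to the model: a subprocess exit code, not the model's own confidence, decides whether the output is trusted.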