
AI code hallucination

An AI code hallucination is code generated by an LLM that appears valid but contains fabricated logic, APIs, or assumptions that fail at runtime.
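A minimal illustration of the failure mode, using Python's standard library: LLMs sometimes borrow JavaScript's `JSON.parse` and emit `json.parse(...)`, a function that does not exist in Python's `json` module. The code looks plausible, imports cleanly, and only fails when the line actually runs.

```python
import json

payload = '{"ok": true}'

# Hallucinated API: json.parse does not exist in Python's json module,
# so this plausible-looking call raises AttributeError at runtime.
try:
    json.parse(payload)
except AttributeError as exc:
    print("hallucinated API failed:", exc)

# The real API is json.loads.
data = json.loads(payload)
print(data["ok"])
```

Because the error surfaces only on execution, static plausibility checks (does it import? does it look idiomatic?) are not enough to catch it.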

Why it's dangerous

Practical detection signals

Prevention

Add a pre-merge gate that verifies reality boundaries: that every referenced endpoint is actually wired up, that auth coverage extends to each route, that required environment variables are defined, and that no mock or stub imports reach production builds.

Some teams run such a guardrail automatically in CI, so hallucinated wiring is caught before merge rather than in production.