The expression "AI hallucination" is well known to anyone who has experienced ChatGPT, Gemini, or Perplexity spouting obvious falsehoods, which is pretty much anyone who has ever used an AI chatbot. Only, it's an expression that's incorrect. The proper term for when a large language model or other generative AI program asserts falsehoods is not a hallucination but a confabulation. AI doesn't hallucinate, it confabulates.