The term “hallucination” in reference to generative AI has never struck me as quite right, and I just came across an article on Bloomberg that does a good job of explaining why. Calling incorrect facts generated by AI “hallucinations” wrongly personifies LLMs, which have no mind and no ability to “hallucinate” in a …