![](https://i0.wp.com/techgriot.co/wp-content/uploads/2024/03/ag04032024.png?resize=800%2C400&ssl=1)
What is an artificial intelligence hallucination? 🤖
Following the recent ChatGPT and Gemini slip-ups, this week’s #AskGriots looks at hallucinations in artificial intelligence.
AI can go crazy
An artificial intelligence is said to hallucinate when it generates false or misleading information and presents it as fact. AI hallucinations can take many forms: fabricated images, erroneous text, or otherwise unexpected results.
An object recognition algorithm might, for example, identify a lion as a cat. Or an image-creation AI might draw unicorns when asked for a horse race.
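The lion-vs-cat mix-up above can be sketched in a few lines. The logits below are made-up numbers standing in for a real classifier's raw scores; the point is only to show how a model ends up confidently wrong:

```python
import math

# Hypothetical class scores (logits) from an image classifier that was
# shown a photo of a lion -- made-up numbers, purely for illustration.
logits = {"cat": 4.1, "lion": 2.3, "dog": 0.5}

# Softmax turns the raw scores into probabilities.
total = sum(math.exp(v) for v in logits.values())
probs = {label: math.exp(v) / total for label, v in logits.items()}

# The model reports its highest-probability label, right or wrong.
prediction = max(probs, key=probs.get)
print(prediction, round(probs[prediction], 2))  # → cat 0.84
```

The model is not "lying": it simply outputs whichever label its internal scores favor, and it attaches high confidence to that answer even when the answer is wrong.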
AIs are not 100% reliable
Hallucinations are inherent to the way these systems are built: a model learns statistical patterns from its training data and generates the most plausible output, not a verified one. Several factors can trigger hallucinations, including insufficient or biased training data, ambiguous user queries, and inherent limitations of the language models themselves.
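The "biased training data" factor can be illustrated with a toy text predictor. The mini-corpus below is invented for the example; it deliberately repeats a common misconception so that a purely frequency-based "model" reproduces it:

```python
from collections import Counter

# Toy training corpus -- invented for illustration. The factual error
# appears more often than the correct statement, as misconceptions
# often do in real web-scraped data.
corpus = [
    "the capital of australia is sydney",    # wrong, but repeated
    "the capital of australia is sydney",    # wrong, but repeated
    "the capital of australia is canberra",  # correct, but rarer here
]

# Count which word follows "is" across the corpus.
continuations = Counter(line.split("is ")[1] for line in corpus)

# The "model" answers with the most frequent continuation it saw,
# with no notion of whether that continuation is true.
answer = continuations.most_common(1)[0][0]
print(answer)  # → sydney
```

A real language model is vastly more sophisticated, but the failure mode is the same in spirit: it generates what is statistically plausible given its training data, not what is verified to be true.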
That’s why it’s important to be aware of the limitations of AI systems, and to check the information they provide against reliable sources before trusting it.