Ask the Griots

What is an artificial intelligence hallucination?


Following the recent ChatGPT and Gemini slip-ups, this week’s #AskGriots will look at the hallucinations of Artificial Intelligences.

AI can go crazy

Artificial Intelligence is said to hallucinate when it generates false or misleading information that does not correspond to reality. AI hallucinations can take many forms, such as creating false images, generating erroneous texts or producing unexpected results.

An object-recognition algorithm might, for example, identify a lion as a cat, or an image-generation AI might draw unicorns when asked for a horse race.

AIs are not 100% reliable

Hallucinations are inherent to AI systems because of the way they are built and trained. Several factors can cause them, including insufficient or biased training data, ambiguous user queries, or limitations in the language models themselves.

That’s why it’s important to be aware of the limitations of AI systems, and to verify information obtained from them against reliable sources before trusting it.
