Ask the Griots

What is an artificial intelligence hallucination? 🤖


Following the recent ChatGPT and Gemini slip-ups, this week’s #AskGriots looks at Artificial Intelligence hallucinations.

AI can go crazy

An Artificial Intelligence is said to hallucinate when it generates false or misleading information that does not correspond to reality. These hallucinations can take many forms, such as creating false images, generating erroneous text or producing unexpected results.

An object recognition algorithm might, for example, identify a lion as a cat. Or an image-creation AI might draw unicorns when asked for a horse race.
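To make the lion-as-a-cat example concrete, here is a minimal, hypothetical Python sketch (not the code of any real product): a toy classifier with randomly chosen weights that only knows the labels cat, dog, horse and bird. Whatever input it receives, it must answer with one of those labels and attach some confidence, which is one reason such misidentifications happen.

```python
# Toy sketch of why a classifier can be confidently wrong: it must map
# every input to one of the labels it knows, even inputs it never saw.
import numpy as np

rng = np.random.default_rng(0)

LABELS = ["cat", "dog", "horse", "bird"]   # note: "lion" is not a known label
# Hypothetical, randomly initialised "model": a single linear layer.
weights = rng.normal(size=(len(LABELS), 64))

def classify(features: np.ndarray) -> tuple[str, float]:
    """Return the most probable known label and its softmax confidence."""
    logits = weights @ features
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    best = int(np.argmax(probs))
    return LABELS[best], float(probs[best])

# Stand-in features for a "lion" photo the model never trained on.
lion_features = rng.normal(size=64)
label, confidence = classify(lion_features)
print(f"The model says: {label} ({confidence:.0%} confident)")
# It always answers with one of its four labels -- perhaps "cat" --
# because "I don't know" is not an output it can produce.
```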

AIs are not 100% reliable

Hallucinations are inherent to AIs because of the way they are built. Several factors can cause them, including insufficient or biased training data, ambiguous user queries and inherent limitations of the language models themselves.

That’s why it’s important to be aware of the limitations of AI systems, and to verify the information they produce against reliable sources before trusting it.
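As a rough illustration of that habit, here is a minimal Python sketch; the check_against_trusted_source helper and the tiny set of facts inside it are hypothetical stand-ins for a real check against an encyclopedia, official documentation or a human expert.

```python
# Minimal sketch of the habit recommended above: never take an AI answer
# at face value -- check it against a trusted source first.

def check_against_trusted_source(claim: str) -> bool:
    """Hypothetical placeholder for a manual or automated fact check."""
    trusted_facts = {"Paris is the capital of France"}
    return claim in trusted_facts

ai_answer = "Paris is the capital of France"   # example output from a chatbot
if check_against_trusted_source(ai_answer):
    print("Verified -- safe to use.")
else:
    print("Could not verify -- treat as a possible hallucination.")
```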
