
The neurological term for it is "Confabulation", which is a lot better than "Hallucination" as used in AI.

Confabulation is the unintended generation of false memories.

Hallucination is false perception.

Clearly, the phenomenon that LLM researchers call Hallucination better fits Confabulation.



Sometimes it helps when the audience gets the meaning of a word. Confabulation is not really popular among non-native English speakers, I am sure.


It's also not popular among native English speakers, I can assure you.


I don't actually think either term is more precise than the other when we're talking about LLMs, which aren't human brains. An LLM doesn't have either memory or perception in the way that we do.


I think the horse has left the barn on this one.


“Confidently presented bullshit” is probably much more accurate. Added benefit: no new vocabulary terms :-)



