
Not the poster you’re replying to, but -

I took his point to be that "hallucinate" is an inaccurate verb for the phenomenon of an AI producing fake data, because "hallucination" implies a perception that is separate from the "real world."

That's not how LLMs work, so the term isn't an accurate label. To an LLM there is no distinction between "real" and "imagined" data - it's all just data - which makes the metaphor misleading.


