
> I’ll give you that. And I think what they say, interestingly, is how much of our language is very much rote, R-O-T-E, rather than generated directly, because it can be collapsed down to this set of parameters. But in that “Seven Deadly Sins” article, I said that one of the deadly sins was how we humans mistake performance for competence.

On this, I think he might be wrong. To me, the ability to hallucinate shows that the generation of language can be rote: the embedding of ideas is something rote, learnable in a billions-to-trillions-parameter space. But that is not the entirety of language. Logic and truth seem to be concepts separate from the propensity to generate fluent text.

Note: I am still learning the mathematics driving LLMs, and my opinions might change in the future.



Hallucination right now is just the exponential divergence that LeCun talks about; that phrase is a good search key if you are interested in reading more about it.
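
If it helps, here is the argument as I understand it: an autoregressive model emits tokens one at a time, and if each token independently has some small probability e of leaving the set of acceptable continuations, there is no mechanism for getting back on track, so the probability that a length-n answer stays correct is (1 - e)^n. A quick Python sketch (the values of e are illustrative, not measured):

```python
# Sketch of the exponential-divergence argument as I read it.
# Assumption: each token independently "goes wrong" with probability e,
# and a single wrong token takes the whole answer off track.

def p_stays_on_track(e: float, n: int) -> float:
    """Probability that an n-token generation never steps off track."""
    return (1.0 - e) ** n

for n in (10, 100, 1000):
    for e in (0.01, 0.001):
        print(f"e={e}, n={n}: P(on track) = {p_stays_on_track(e, n):.3g}")
```

At e = 0.01 a 1000-token answer stays on track with probability around 4e-5, which is the sense in which errors diverge exponentially with length.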

LLMs probably need something like generative diffusion, but they would still lack the fundamentals needed to reason, plan, and evaluate.


I found his Twitter thread discussing it. Very informative; thanks for the search key.


Logic is inconsistent and/or incomplete (in the Gödelian sense: any sufficiently expressive formal system must be one or the other).

Truth is both consistent and complete.
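
Assuming this alludes to Gödel (the comment doesn't say), the precise version of the first claim is:

```latex
\text{If } T \text{ is consistent, effectively axiomatized, and interprets arithmetic, then}
\exists\, G_T \ \text{such that}\ T \nvdash G_T \ \text{and}\ T \nvdash \lnot G_T .
```

The second claim then has a standard counterpart too: the set of all true arithmetic sentences is a consistent and complete theory, but by Tarski's undefinability theorem it is not definable within arithmetic itself, so truth in this sense outruns any one formal system.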



