
If it walks like a duck?


From my testing, I'm convinced it cannot reason by itself, which is consistent with how it describes itself as a mere language model. It can only reproduce reasoning that already exists in its training data, or stochastically "hallucinate" reasoning that sounds plausible but has no actual reasoning behind it.
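To make the "stochastically" part concrete: language models pick each next token by sampling from a probability distribution over their vocabulary, so plausible-sounding but unsupported continuations can win the draw. Here's a minimal toy sketch of temperature sampling, not tied to any particular model; the logits and four-token vocabulary are made up for illustration:

    import math
    import random

    def sample_next_token(logits, temperature=1.0):
        """Sample one token index from raw logits using temperature scaling.

        temperature near 0 approaches greedy (argmax) decoding; higher
        temperatures flatten the distribution, so less likely (but still
        plausible-sounding) continuations get picked more often.
        """
        scaled = [l / temperature for l in logits]
        # Softmax with max-subtraction for numerical stability.
        m = max(scaled)
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        # Draw a token index in proportion to its probability.
        return random.choices(range(len(probs)), weights=probs, k=1)[0]

    # Toy example: logits for four hypothetical next tokens.
    logits = [2.0, 1.0, 0.5, -1.0]
    print([sample_next_token(logits, temperature=0.7) for _ in range(10)])

Run it a few times and the output sequence changes each time, which is the point: the same prompt can yield different "reasoning" purely by chance in the sampling.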


It doesn't reason like a duck, though. Play around with it a bit.



