
I have worked a bit with transformers, the architecture underlying GPT. They absolutely learn to copy training data, and that's perfectly normal.
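You can see the memorization for yourself with the small public GPT-2 checkpoint via the Hugging Face transformers library. A rough sketch; the prompt is just an example of a passage that occurs all over web text, and the exact completion depends on the checkpoint:

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tok = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Prefixes that appear often in the training corpus tend to come
    # back more or less verbatim; try the opening of any famous passage.
    ids = tok.encode("We hold these truths to be self-evident, that",
                     return_tensors="pt")
    out = model.generate(ids, max_new_tokens=20, do_sample=False)
    print(tok.decode(out[0]))

That's not a bug in the training, it's what fitting a sequence model to a corpus does.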

What's happening here is that we're running into exactly what modern ML is NOT capable of: deductive reasoning. The model does not think "I need to query the Twitter API for some posts, then filter them. Right, the API works like this…" No. It doesn't think at all. It is a regression machine: "this sequence looks like something I've seen before; here's the corresponding output, modulo adaptations."
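To make that concrete, here is roughly the loop every GPT-style model runs at inference time. A minimal sketch, again using the public GPT-2 checkpoint, with greedy decoding for simplicity (production systems sample, but the principle is the same):

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tok = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    ids = tok.encode("def fizzbuzz(n):", return_tensors="pt")
    with torch.no_grad():
        for _ in range(40):
            logits = model(ids).logits          # a score for every vocab token
            next_id = logits[0, -1].argmax()    # greedy: the most likely next token
            ids = torch.cat([ids, next_id.view(1, 1)], dim=1)
    print(tok.decode(ids[0]))

    # There is no plan, no API lookup, no self-check anywhere in this
    # loop -- just "which token most plausibly comes next", repeated.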

ML does not self-reflect, question its motives, or analyse causes. It's simply dishonest to suggest otherwise, and calling this "pair programming" is a joke. It's a lot like Tesla calling its glorified lane-keeping "Autopilot".


