
It sounds good, except that there are no results for “screechings” or “only cure” in the books, and Ron doesn’t encounter Voldemort in Goblet of Fire; only Harry and Cedric do.

Again, it has no semantic comprehension. It doesn’t know that certain strings of words are valid in certain contexts because they refer to actual events, objects, or concepts; it’s just putting letters together.

Probably some of its training material included Harry Potter fan fiction or other non-canon stuff, and that’s why it said what it said.
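
Edit: if anyone wants to reproduce that kind of check, it's just a substring search over the corpus. A rough sketch in Python, assuming you have the book text as local plain-text files (the "books/" path here is hypothetical):

    # Check whether a quoted phrase appears anywhere in the corpus.
    # Assumes the books are plain-text files under a hypothetical books/ dir.
    from pathlib import Path

    phrases = ["screechings", "only cure"]
    for phrase in phrases:
        hits = [p.name for p in Path("books").glob("*.txt")
                if phrase.lower() in p.read_text(encoding="utf-8").lower()]
        print(f"{phrase!r}: {hits or 'no matches'}")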



That's fair; its references for where chocolate is used are hallucinated (although it is the case that chocolate is used as a remedy in the books; I did check that independently!).

> it has no semantic comprehension

I'm not sure about this. Either that's not true, or we're going to need to start defining "semantics" in a circular way: as, by definition, something we conjecture humans can do that machines can't.

Now, I'm used to being the guy telling everyone ML models don't have semantic comprehension, but I don't think that's a sustainable position anymore.

I'd say it has at least some degree of semantic comprehension.

It's got to have, in order to return the answer about Ron perhaps not getting sick.

It has to 'understand' that my query about getting sick refers to eating too much chocolate, and it has to understand enough to be uncertain whether Ron will get sick, because it thinks chocolate is a remedy for wizards.

That's quite a semantic thing to do. There's no obvious way in which it's just doing some kind of syntactic similarity matching.

Alternatively, if we say that's just a super-complex syntactic process, then I think that's like arguing there's no such thing as semantics, a bit like someone arguing there's no such thing as biology because it's all just physics lower down.
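
To make that concrete, here's a rough sketch of the two notions of similarity (this assumes the sentence-transformers package; the model name is just one common choice, and exact scores will vary by model). A syntactic measure scores string overlap; an embedding measure scores something closer to meaning:

    # Syntactic similarity: raw character overlap between two strings.
    # Semantic similarity: closeness of the strings in an embedding space.
    from difflib import SequenceMatcher
    from sentence_transformers import SentenceTransformer, util

    a = "Will Ron get sick from eating too much chocolate?"
    b = "Chocolate is a remedy after a Dementor attack."

    # Surface overlap: low, since the sentences share few substrings.
    print(SequenceMatcher(None, a, b).ratio())

    # Embedding similarity: typically much higher, since the model maps
    # related meanings to nearby vectors despite different wording.
    model = SentenceTransformer("all-MiniLM-L6-v2")
    ea, eb = model.encode([a, b])
    print(util.cos_sim(ea, eb).item())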

Yes, the model then goes too far and hallucinates a bunch of things, and that's a good example for never taking anything it says as authoritative.

But I think these examples are certainly beyond what I can identify as 'just' syntactic manipulation.


Chocolate is used in the books as a remedy after being attacked by Dementors: basically, the Dementors sap one’s will to live and chocolate acts as an antidepressant.

https://harrypotter.fandom.com/wiki/Chocolate


Yep, those two examples point to events that don’t really happen in the books. It’s actually a good illustration of the kind of specific false facts that tend to crop up.


Seems it's fantastic at BS'ing. I was reading something where a professor had it solve problems and had to double- and triple-check whether its answers were correct, because they sounded so convincing.

Which means it could be a watershed moment, just maybe not for information but rather disinformation.


Sounds like most of the better students in my classes at university.


Lol I mean, part of me is thinking it'll put me out of a job :-)



