
LLMs are pretty good at providing names and search terms for very vague prompts.

That's also often an invitation for hallucinations, though, so you have to be even more careful than usual.



I was just going to say the same: LLMs are great for giving a name to a described concept, architecture, or phenomenon. And I would argue that hallucinations don't actually matter much for this usage, since you're going to turn around and google the name anyway once you've been told it.
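
A minimal sketch of that workflow, for the curious. The OpenAI Python client and the duckduckgo_search package are my assumptions for illustration (nobody in the thread named specific tools), and the model name is likewise assumed:

    # Sketch: ask an LLM to name a vaguely described concept,
    # then run the candidate name through a web search so a human
    # can confirm it's a real, established term and not a hallucination.
    from openai import OpenAI
    from duckduckgo_search import DDGS  # assumed search library, not from the thread

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def name_concept(description: str) -> str:
        """Ask the LLM for the most common name for a described concept."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name
            messages=[
                {"role": "system",
                 "content": "Reply with only the most common name for the "
                            "concept the user describes."},
                {"role": "user", "content": description},
            ],
        )
        return response.choices[0].message.content.strip()

    def verify_name(name: str, num_results: int = 5) -> list[dict]:
        """The 'google it afterwards' step: return top hits for the name."""
        with DDGS() as ddgs:
            return list(ddgs.text(name, max_results=num_results))

    candidate = name_concept(
        "the pattern where a client retries a failing dependency with "
        "increasing delays, a cap, and random jitter"
    )
    print("LLM suggests:", candidate)  # e.g. "exponential backoff with jitter"
    for hit in verify_name(candidate):
        print(hit["title"], "-", hit["href"])

The point of the second function is exactly what the comment argues: even if the model invents a name, the search step surfaces that immediately, so the hallucination costs you almost nothing.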



