
And the problem is that often we are not talking about random poor people, but rather specific populations of poor people whose demographics and other traits do not match the highest-level aggregate average. So the LLM is entirely wrong.


I don't think the LLM can be wrong. It's just giving you a random reflection of the world. Keep generating pictures until one matches your use case. The people using inappropriate pictures are the only ones here who are wrong, or even can be.


> Keep generating pictures until one matches your use case.

You mean until it’s not wrong?


No.



