
I still don't understand what you're trying to say.

What I was saying is that if you ask an LLM to generate an image of a poor person, it makes sense that they'd be brown or black, because if you were to randomly pick actual poor people from Earth, the chances are very high it'd be a brown or black person. In that case, it's just accurate representation.



And the problem is that oftentimes we are not talking about random poor people, but rather about specific populations of poor people whose demographics and other traits do not match the global average. In those cases, the LLM is entirely wrong.


I don't think the LLM can be wrong. It's just giving you a random reflection of the world. Keep generating pictures until one matches your use case. The people using inappropriate pictures are the only ones here who are wrong, or who even can be.


> Keep generating pictures until it matches your use case.

You mean until it’s not wrong?


No.



