And the problem is that often we are not talking about random poor people, but rather about specific populations of poor people whose demographics and other traits do not match the top-level average. So the LLM is simply wrong.
I don't think the LLM can be wrong. It's just giving you a random reflection of the world. Keep generating pictures until one matches your use case. The people using inappropriate pictures are the only ones here who are wrong, or even can be.