Since European expansion into the Americas, white people have demonized Black people and portrayed them as undesirable, violent and hypersexual. Originally, the intent of this demonization was to legitimize the conquest and sale of African people.