American slavery wasn’t just a white man’s business – new research shows how white women profited, too


This post was originally published on The Conversation. By Trevon Logan.

As the United States continues to confront the realities and legacy of slavery, Americans continue to challenge myths about the country’s history. One enduring myth is that slavery was a largely male endeavor — that, for the most part, the buying, selling, trading and profiting from enslavement were carried out by white men alone.

While white women certainly interacted with enslaved people in household management and day-to-day tasks, historians once argued that they weren’t active owners and had very limited involvement in transactions. This was once widely believed to be a reason why Southern white women supported the institution – they were assumed to be blind to its darker side.

As an expert in the economic history of slavery, I know the story is far more complex. In fact, slavery was unique in economically empowering women. It was, in essence, an early feminist institution – but exclusively for white women.

The myth that women didn’t profit from slavery has endured for several reasons. First, before the American Civil War, married women generally owned nothing of their own. The legal institution of coverture made the property a woman brought into...
