It offends my sensibilities even to think about this, but it has been interesting to learn about the importance of slavery not just to the Southern colonies but to all Americans during colonial times. I think I was always taught that slavery was simply a race thing, an institution in which the white race asserted its authority over the black race. There certainly was plenty of that going on, but at root it was an economic enterprise. Southern plantations had been doing well with European indentured servants. This situation was kind of like today’s third-world sweatshops: conditions weren’t great, and there was a lot of abuse, but the servants won their freedom and often an endowment of land or cash after a contracted period. Everyone won.
But then the European economy improved and the wages would-be servants demanded increased, which made African captives look quite a bit more appealing than they had before. Slavery brought a lot of costs with it, not to mention the risks of revolt and runaways. The incentives didn’t align the way they had with indentured servants, but slavery suddenly became the next best option for Southern farmers trying to keep their costs relatively low. Slave labor was the primary reason the South was the richest region of Colonial America, despite having no urban areas of any size or importance.
The really interesting part of all of this for me is how important slavery was to the colonies eventually asserting their independence from England. Independence never would have been possible without being able to pay for the fight, and paying for the fight wouldn’t have been possible without the sky-high profits earned by slave-worked Southern plantations. Plus, northerners had plenty of incentive to tolerate an institution many of them hated. It kept the prices of tobacco, rice, and many other goods low, leaving them with more money to put toward the fight. The African slaves bought our freedom. How ironic.