Friday, January 11, 2013

How did the Civil War change the South?

The Civil War did not significantly change the culture of the South or its views toward African Americans. It did, however, significantly alter both the economy and the political power of the southern states. The war itself caused widespread destruction across the South: railroads were torn up, farms were scavenged for food, and cities were shelled and burned. This destruction forced the southern states to rely on federal aid to recover. The material losses and the freeing of the slaves also sharply reduced the wealth and power of many of the old plantation families. Before the war, the large plantation-owning families held significant power within their states because of their economic holdings; with those holdings largely gone, the former plantation owners lost much of their political power as well. Emancipation and Reconstruction also temporarily broke the power of southern politicians in Congress, because candidates from the southern states now had to appeal to freed slaves to gain office, which generally required some alignment with the Republican Party.
While the effects of the economic changes would linger in the South, the political changes were short-lived and quickly reversed at the end of Reconstruction. Voter suppression and racial gerrymandering are among the legacies of the pre-Civil War South's culture that endure into the current century.

