War is a terrible thing, but sometimes good comes out of the horror and death it produces. Many would argue that the Civil War ultimately had a good impact on American society: it ended slavery once and for all. Not only was slavery a demonstrably cruel and corrupting institution, its existence was an open wound tearing the country apart. It polarized the citizenry. For more than a decade, most government business was subordinated to this one concern. Westward expansion became a bloody testing ground over whether new states would be free or slave. Members of Congress physically attacked each other over slavery. Finally there came a point when compromise was impossible. War, and the decisive Union victory, gave the country a fresh start.
Likewise, though unspeakably horrible, destructive, and genocidal, World War II settled certain questions for the better. The world, shocked and horrified by the carnage, made concerted efforts to build peace. Shaken countries worked together to create the United Nations. Europe and the United States reckoned with their complicity in anti-Semitism, finally working to eradicate that ancient hatred. The creation of the state of Israel, however much a blunder with respect to the Palestinians already living there, was a goodhearted attempt to atone for past evils. Germany itself, utterly devastated by bombing and invading armies and shocked by its own barbarity, finally gave up its extreme militarism and embraced peace and modern democracy.