How Did The War Change American Society

This war was the beginning of it all: it created the United States and unified us as a country. Before the American Revolution, we were under strict British control. Yes, we had rights, but they had to be granted to us. That would all change after the war. The war also inspired others, changed the lives of women, and gave some power to ordinary people, not just the elites, reshaping the social fabric of society. The war created the institutions of our government and infused into our culture what we believe today. Prior to the war, we were living in a monarchical society; we were merely subjects of the crown, with only the rights granted to us by the king. The American Revolution was so important because it strengthened people's way of thinking.
What we believe as Americans came from this war. The principles of the Revolution continue to influence us as a society. We reference it for inspiration and enlightenment, and we quote it all the time in political cases or to prove one point or another. During the war, many slaves became free by running away. The northern states abolished slavery after we won the war. Slaves also began to fight against slavery themselves; they wrote petitions and made compelling arguments. People were beginning to open their eyes and question slavery. This did not extend to the southern states, but it was a start, and it freed many. Finally, the American Revolution brought out some of our greatest leaders, such as Thomas Paine, Thomas Jefferson, Benjamin Franklin, Alexander Hamilton, and Patrick Henry. With all of these points in mind, I believe this major event impacted history in such a way that it was the most important event of its time. From a small wedge of women's rights to the creation of institutions of government, the war changed almost everything. The creation of the United States will always be important because, without it, we would not be where we are today.