How Did the Vietnam War Change American Society

The Vietnam War drastically changed how Americans viewed their country. The US emerged from World War II as a world superpower and as a country where patriotism meant serving one's country and following authority's orders. In the 1960s, however, the discontent of many minority groups who believed that the "American Dream" was obtainable only by a select few led to many social changes in the US. This discontent also fueled the many individuals who questioned what the US was doing fighting communism on the other side of the world. The Vietnam War divided American society at home over national pride, police protection and justice, and trust in the US government, and it also changed Americans' view of their country's nobility.