The 1960s: Gender Roles And Race Relations

The 1960s brought important and beneficial changes to America, especially in gender roles and race relations. Even amid the lingering aftermath of World War II, Cold War tensions with the Soviet Union, and the escalating war in Vietnam, American culture was changing faster than ever before. During the 1960s, gender roles changed for the better and race relations improved significantly.

The role of women in the 1960s shifted after centuries of limited freedom. Women had gained new independence during World War II, and a sense of equality between the genders grew throughout the second half of the twentieth century. Before the 1960s, few women attended college because the public believed a woman did not need an education to care for a family. But during the