Imperialism is what made the America we know possible. American imperialism refers to the economic, military, and cultural influence of the United States internationally. The question of whether the United States should have been involved in overseas expansion has both a yes and a no answer. Imperialism also played a major role in the United States' involvement in World War I.
The Age of Imperialism was a period when major world powers rapidly expanded their territorial possessions. One of the most notable instances of American imperialism was the annexation of Hawaii in 1898, which gave the United States access to all of Hawaii's ports, buildings, harbors, and military equipment. During this time, industrialization drove American businessmen to seek international trade.

Leading up to World War I, the United States followed a policy of imperialism, extending its military, economic, and political control over Cuba, Guam, Hawaii, Puerto Rico, Samoa, Alaska, the Philippines, the Panama Canal Zone, and the Midway and Wake Islands. Even though imperialism is widely viewed as a bad thing today, it was once seen as a good thing by a majority of United States citizens. Many of its supporters were businessmen who wanted international trade among countries, and it was also favored because it brought more raw materials into the United States. Imperialism was accepted as well because there was a rising demand for military strength, because the United States had a strong feeling of superiority, and because Americans wanted to spread Christianity and their democratic form of government.

The American Anti-Imperialist League was a group that opposed the imperialism of the Philippines. The anti-imperialists opposed expansion because they believed imperialism violated the principles of republicanism. In the end, the anti-imperialists were defeated both in public opinion and in Congress, which supported expansion.