Isolationism and U.S. Foreign Policy After World War I
Beginning with George Washington’s presidency, the United States pursued a policy of isolationism and neutrality with regard to the internal affairs of other nations. Early American political leaders argued that, with the exception of free trade, self-defense and humanitarian emergencies, the U.S. would do best to avoid permanent alliances that did not serve American interests but instead deflected attention from domestic issues. When World War I broke out in July 1914, the United States maintained a stance of neutrality, and President Woodrow Wilson encouraged the U.S. as a whole to avoid becoming emotionally or ideologically involved in the conflict. Americans were largely content to stay out of the war, and Wilson won a second presidential term in 1916 by running on a platform of non-interference; the phrase “he kept us out of war” became a popular slogan among Wilson’s supporters.
Upon re-election, Woodrow Wilson remained resolute in staying out of the war, even as a significant movement within the American government advocated for preparedness in the face of events that signaled growing German aggression, such as the 1915 sinking of the British ocean liner Lusitania by a German submarine, which claimed the lives of 128 Americans. After observing these and similar acts of German aggression, Wilson, a political scientist by profession, began to change his viewpoint, seeing that the devastating war in Europe threatened to spill across the Atlantic Ocean. With massive loss of life came a moral imperative that could no longer be ignored: the United States would have to take a leadership role in maintaining and promoting freedom, sovereignty and self-determination for all nations. Wilson began making public statements that framed the war as a means to right the wrongs of the world rather than simple military posturing. Thus, the United States’ intervention in the First World War, or “Great War,” helped shape the nation’s status as a self-proclaimed defender of freedom and democracy worldwide and radically altered U.S. foreign policy.
Seeds of Isolationism
On April 2, 1917, President Wilson asked Congress to declare war and make the world “safe for democracy.” By April 6, the resolution was approved and the U.S. officially declared war on Germany. While the United States never joined the Allies in an official capacity, fighting instead as an “associated power,” it fought alongside the British and French against Germany and the other Central Powers, such as Austria-Hungary and the Ottoman Empire. As the war continued and news of wanton devastation made headlines, U.S. public support for the war began to wane. While Europe suffered far more casualties than the United States (tens of millions of Europeans lost their lives, compared to more than 116,000 Americans), Americans reeled from the emotional and financial costs of war and began to feel that joining the war effort had been a mistake. In 1918, Wilson articulated his Fourteen Points to help end the war and establish a basis for postwar cooperation; these included freedom of the seas, open economic trade, the evacuation of occupied territories, the liberation of non-Turkish peoples in the Ottoman Empire, and a general association of nation-states that would offer members territorial integrity and political independence, setting the stage for what would become the League of Nations.
What ensued was a radical shift in U.S. foreign policy toward a stance of isolationism that would last until World War II. Warren Harding won the 1920 presidential election on a promise of staying out of global affairs, arguing that the United States needed a “return to normalcy” and a focus on internal problems. Thus, U.S. foreign policy during the 1920s was characterized by isolationist policies; for instance, the U.S. opted not to join the burgeoning League of Nations, even though it had been the first nation to propose such international cooperation. Instead, the United States focused on building the domestic economy by supporting business growth, encouraging industrial expansion, imposing tariffs on imported products and limiting immigration.
The League of Nations
In 1916, Wilson first articulated his vision for the League of Nations as an international organization designed to facilitate cooperation, and it was backed by many Americans eager to see an end to the devastating war. The League of Nations was intended to help ensure a global “permanent peace” in which nations, small and large, would be protected and could take any actions necessary to safeguard that peace. The League would also provide mechanisms for promoting negotiation and mediating disputes. While the idea of the League of Nations was popular at the time, and the consequences of war showcased the need for such an organization, some members of Congress, such as Henry Cabot Lodge, opposed it, believing it would be an expensive distraction from the United States’ own interests.
As more Americans lamented the consequences of war and voiced their desire to avoid future intervention in foreign affairs at all costs, public opposition to the League of Nations grew. It was perhaps isolationist Warren Harding’s election to the presidency that offered the greatest repudiation of the League of Nations and Wilson’s interventionism. While Wilson had helped create the League of Nations at the 1919 Paris Peace Conference (the organization would be headquartered in Geneva, Switzerland), Harding never allowed the United States to become a member. Many historians and political theorists attribute the relative inability of the League of Nations to prevent World War II to U.S. isolationism and the country’s lack of participation and leadership in the organization.
Following President Harding’s victory in 1920, he and his vice president, Calvin Coolidge, decided to focus more on domestic problems facing the United States. In contrast to Wilson’s progressive agenda—which allowed government to regulate big business—the new administration sought to empower businesses, decrease regulation and cut taxes to enable them to grow and contribute more to domestic production. This was not surprising given Harding’s campaign promise: “less government in business and more business in government.” Business became a symbol of American prosperity, and the 1920s saw the United States come out of a post-war recession with high economic growth. During that time, industry flourished, the stock market rose, technology rapidly evolved, and the commercialization and expansion of aviation and the automobile radically altered the American lifestyle.
The economic boom was facilitated by tariffs that were enacted to restrict the influx of imported goods, thereby increasing domestic production. American farmers had experienced dramatic growth during World War I, following a significant increase in demand stemming from their European counterparts’ inability to keep up with agricultural needs while fighting the war. As European farmers began to recover after the war, U.S. agriculture suffered from overproduction, and demand and prices plummeted. American farmers, who had gone heavily into debt to finance their wartime expansion, struggled to repay their loans and asked the government for assistance, hoping tariffs would raise prices; ultimately, this only helped prices for specific products, such as sugar and wool. Upon taking office, President Harding implemented the Emergency Tariff of 1921, which imposed duties on over two dozen food imports and agricultural products.
In 1922, President Harding signed the Fordney-McCumber Act into law, which raised tariffs by about 25 percent and authorized the president to adjust rates without further congressional approval. In 1930, Congress also passed the Smoot-Hawley Act, which significantly raised duties and tariffs on over 20,000 foreign products in all sectors of the economy. Many debate whether these tariffs were effective, but in the long term, they may have done more harm than good, as other countries retaliated by increasing their own tariffs. In turn, this created trade barriers that ultimately hurt American producers and jobs by decreasing the presence of U.S. products in foreign markets. It also impeded Europeans from generating enough revenue to pay back their wartime debts to the United States. Many economists argue that these tariffs, combined with other factors, worsened the economic collapse that began in 1929.
With roughly 1.3 million immigrants arriving in the United States in 1907 alone, most of them passing through Ellis Island, New York, Congress had grown concerned that continuing to accept immigrants at an unfettered rate would become costly to the U.S. economy. Efforts to restrict immigration began during World War I with the 1917 Immigration Act, which imposed literacy tests and taxes on immigrants, and banned immigrants deemed “undesirables,” essentially any sick, disabled or criminal members of society. The label of undesirable soon came to include specific races and ethnicities: all immigrants from a region termed the Asiatic Barred Zone (modern-day India, Afghanistan, Iran, Saudi Arabia, parts of Russia, Southeast Asia and the Asia-Pacific islands) were blocked, supposedly to protect national interests and security.
The U.S. government sought to enact greater limits on immigration after World War I because of strong anti-European sentiment, exacerbated by the “Red Scare” that convinced many Americans that communism, anarchism and a Bolshevik-style revolution would soon sweep the United States. Attempts were made to Americanize immigrants through new naturalization processes that included higher taxes on immigrant adults and new literacy initiatives. After the war, the U.S. economy slowly declined into recession while anti-immigration sentiment continued to rise. The cost of living had become unmanageable for many Americans and unemployment rates had spiked, with many citizens believing that their misfortune was due to a disproportionate number of recent immigrants claiming jobs that would otherwise have gone to native-born U.S. citizens.
In 1921, Congress passed the Emergency Quota Act, which set numerical quotas on immigration to the United States. Under the act, the U.S. referred to the 1910 census to determine the number of foreign-born residents from each country, then admitted immigrants on a country-by-country basis, accepting no more per year than three percent of each country’s existing U.S.-based population. Immigrants from Southern and Eastern Europe were disproportionately affected by the new limits; professionals, such as actors, scientists, ministers and other specialists from all countries, were exempt from the quotas.
In 1924, Congress enacted the Johnson-Reed Act to replace the Emergency Quota Act. The Johnson-Reed Act further reduced the percentage of immigrants eligible for admission to the U.S.: it allowed a number of immigrants equal to two percent of a country’s population already present in the United States. Because this act used data from the 1890 census, taken before the great wave of immigration from Southern and Eastern Europe, the lower baseline populations practically ensured that far fewer immigrants from those regions would be accepted into the U.S. The act was also notable for introducing consular control and border protection, establishing the Border Patrol to fight smuggling and requiring visas to be obtained abroad before entry into the U.S.
During Woodrow Wilson’s presidency, the United States briefly shed its isolationist foreign policy in order to defend democracy on a global scale. However, the effects of World War I led the United States to retreat from global affairs and adopt isolationist policies intended to foster internal growth and development, with decidedly mixed results. By completing a Master of International Relations degree program, students can enhance their understanding of how isolationist policies have impacted other international governments and juxtapose that knowledge with what they have learned about U.S. foreign policy to develop a strong grasp of the ways in which diverse theories of international relations can impact the global community.
As the nation’s oldest private military college, Norwich University has been a leader in innovative education since 1819. Through its online programs, Norwich delivers relevant and applicable curricula that allow its students to make a positive impact on their places of work and their communities.
Our online Master of Arts in International Relations program offers a curriculum that evolves with current events to help you face the future of international affairs. Norwich University’s master’s degree in international relations covers many subjects to give you a look at the internal workings of international players, examine the role of state and non-state actors on the global stage, and explore different schools of thought. You can further strengthen your knowledge by choosing one of five concentrations in International Security, National Security, International Development, Cyber Diplomacy, or Regions of the World.
November 2, 2017