1930s American History Summary

The Wall Street Crash of 1929 dramatically closed the curtain on the prosperity of the twenties and precipitated the greatest economic decline in US history. In 1929 only 3% of Americans were without a job; by 1933, the unemployment rate had risen to 25%. The Wall Street Crash led to a worldwide depression that was partly responsible for the rise of aggressive dictatorships in Europe.

Natural calamities added to the nation’s miseries. Drought in America’s heartland turned the once rich soil to dust. Winds whipped the loose soil into gigantic dust storms that ravaged the country from South Dakota to Texas. Thousands were forced to abandon their farms, clogging the highways as they headed west in hopes of finding a better life.

To combat the growing depression, President Herbert Hoover asked the U.S. Congress to pass a $150 million public works project to increase employment and economic activity. With money scarce during the Depression, people did what they could to make their lives happy.

Movies were popular, as were parlor games and board games. People gathered around radios to listen to the Yankees. Young people danced to the big bands. Franklin Roosevelt reassured Americans with his Fireside Chats. The golden age of the mystery novel continued as people escaped into books, reading writers like Agatha Christie, Dashiell Hammett, and Raymond Chandler.

Some of the novels of this period explored what was happening in America during the Great Depression. One standout, John Steinbeck’s “The Grapes of Wrath”, chronicled the life of a displaced Oklahoma family that had lost its farm to the drought of the Dust Bowl.

Economics dominated politics in the 1930s. The decade began with the construction of shanty towns called “Hoovervilles” (named after a president who felt that relief should be left to the private sector) and ended with a series of federally funded programs and an assortment of commissions set up to regulate Wall Street, the banking industry, and other business enterprises.

Finally, in 1939, after a year of European attempts to appease Hitler and the aims of expansionist Nazi Germany, Germany invaded Poland, beginning World War II. The United States declared its neutrality in the European war, and that neutrality ended after the attack on Pearl Harbor in December 1941. (Photo below: U.S. troops land on the beach at Normandy, France, in 1944.)