Penguin1502
08.02.2021 • History

What effect did World War I have on the United States?

The country's credibility wavered because of the war's outcome in Europe.
The wartime economy kindled an increase in racial unrest across the country.
The nation became a world leader as a member of the League of Nations.
The country experienced an economic recession due to wartime production.
