yay47
14.03.2020 • History

True or false: Hitler believed that Germany would have won World War I if the military hadn't been "stabbed in the back" or betrayed by the civilian leadership ("November Criminals") that took power from the Kaiser in late 1918.
