What did Adolf Hitler and the Nazis think about the United States?
Adolf Hitler and the Nazi regime held a complex and often contradictory view of the United States. Initially, Hitler admired certain aspects of American society, above all its industrial power and its capacity to mobilize for war. In the 1920s and early 1930s, he regarded the U.S. as a potential ally because of its growing weight in the global economy and its apparent detachment from European entanglements. He also believed that the United States, like Nazi Germany, possessed a strong sense of nationalism and racial purity, which he thought could be aligned with his own ideals. These early perceptions, however, would shift quickly as Nazi ideology became more entrenched.
Credit to: Der Kommandant English
