06-10-2014, 05:09 PM
Liv Benson to You, Bitch
Join Date: Aug 2007
Location: Maryland and WV
Posts: 6,060
Quote:
Originally Posted by Rochard
On one hand I agree with this. Here in the US we are taught that the US won the war, and most of us believe this. I believed it for a long time myself, until I learned otherwise. No one fought more or lost more than Russia.
However, we tend to think of WWII as being fought in Europe against Germany, and for most countries it was exactly that. But for the United States, the war in Europe against Germany was only half of the war. The other half was fought across the Pacific Ocean against Japan. Everyone concentrated on "Germany first," but no one really fought Japan except the US, with some assistance from the UK. Russia, for example, although it bordered Japan, didn't declare war on it until August 1945, shortly after Hiroshima and Nagasaki were bombed, and Japan surrendered days later.
And of course Hollywood has given the world the impression that the US did win the war.
FYI:
http://www.historynet.com/world-war-...alkhin-gol.htm