Old 04-16-2007, 12:23 PM  
rotowa85
America won the war

Can the people out there who go around saying that America won WW2 please stop, because the simple fact is that the Allies won the war. Don't get me wrong, I do believe the US had a big part to play in how it turned out.
It's correct that without Roosevelt's Lend-Lease program the Brits would have been starved into submission by 1942.

It's correct that we would NEVER have been able to invade Western Europe without the millions of US servicemen and women who came to help following Pearl Harbour, not to mention all the equipment such as landing craft (even if Winston Churchill designed some of them, we had no materials to build them!).

It's true to say that D-Day was about 50% an American invasion, but it still annoys me when Americans make out that they won the war single-handedly, because to me it just pisses all over the memories of the Canadian, Australian, French, Russian and British soldiers who gave up their lives to protect the world at large.