Why Do Americans Think They Won WW2


Several European and other nations were involved in winning World War II. Nevertheless, Americans have often claimed that they won the war for Europe and saved it from Nazi rule. This claim rests largely on the supplies America sent to Britain to fight the war. Many people believe those arms and ammunition were not supplied for free, and that America greatly exploited Britain in return for the favor. In fact, Britain was economically devastated by the end of the war, having exhausted its industrial and economic resources to pay for the supplies it received from America.
