The ‘Good War’ Myth of World War Two

World War II was not only the greatest military conflict in history; it was also America's most important twentieth-century war. It brought profound and permanent social, governmental, and cultural changes to the United States, and it has had a great impact on how Americans regard themselves and their country's place in the world.

19 December 2014 | Mark Weber, director of the Institute for Historical Review