Americans, correct me if I'm wrong
As a Finn, I've always had the impression that Americans are mostly taught in school how great America is. I once got my hands on an American history book and read about the Vietnam War, where America was portrayed as the hero trying to "liberate" the country, while in reality America also committed a lot of war crimes, etc.