It does seem astonishing if the history of the unprovoked Japanese attack on Pearl Harbour isn't taught in U.S. schools, and it would be interesting if anyone could explain why, if that is true. I believe the history of the Holocaust is now taught in British schools, which is good. However, in recent years I have noticed a few middle-aged British people about who don't seem to know that Germany was about to invade the U.K. in 1940, and/or who think that the U.K. would have been a perfectly pleasant place to live if the Nazis had taken control.