To me, "The American Way" is that of over self-indulgence, over flamboyance, blindly following disgusting mass-market consumer religion (Christianity included, because let's face it, these days Christianity IS a product!). "The American Way" is self-praising when no praise is warranted and believing all the time that anything American is right or superior, democratic or correct, the American way to resolve conflict is to muscle in and assume control. I have heard stories that Americans overall know little to nothing of foreign affairs! Is this true? Can such a ludicrous thought actually be true? If so, is that not self indulgence on sterioids? Gluttony of epic proportions?