Hi! I'm writing a research paper about Western movies (most notably those starring John Wayne). I have a chapter called "...Westerns seen by the Americans themselves", and this is where you can help. Please answer two questions: 1. What do John Wayne and the characters he played mean to you as an American? In what way does he relate to American culture? 2. What do Western films mean to you from a cultural standpoint? Write as much as you want! I need answers from Americans only, as I will quote your answers and credit the author's username. Maybe we can start a discussion or something! Thanks!