Sunday, December 2, 2012

Entertainment or Propaganda?

This past weekend I watched "Act of Valor", a movie released a while ago. It was a very inspirational film about Navy SEALs and their heroic deeds.

Despite being amazed by some of the things these men did, this movie also got me thinking about the American movie industry. Is there ever a time when a movie depicts the wrongdoings of Americans? There is an obvious bias, since most producers are American, but it seems like most movies are made specifically to cast our country as "the good guys". Do you believe there is a hidden motivation for movie producers to make our country look great? And if so, why do you think that is?

1 comment:

  1. I don't believe a movie made in America will ever show Americans doing anything you could describe as "wrongdoing" - it simply wouldn't sell. There have been a number of films made about mistakes Americans have made, however, from poor tactical and strategic decisions on the part of generals to political idiocies on the part of Presidents (the dozen-odd Watergate documentaries spring to mind), but the general idea seems to be to teach a lesson so it doesn't happen again.
