
question of the day: What do you think Hollywood movies tell the rest of the world about America?

(Image: Hollywood sign)

In the wake of the Newtown shooting, there have once again been rumblings about how violent movies (and violent video games) are perhaps partly to blame for the epidemic of mass shootings in the United States. This is nonsense, of course. As I noted yesterday:

The entire planet watches violent Hollywood movies. And plays Silicon Valley’s violent games. Yet the vast majority of mass shootings happen in America.

And it’s been interesting for me to note, from my perch here in London, that no one in the U.K. — at least as far as I’ve seen in the media or in my own conversations with people — seems to believe that violent movies and games could be the problem… perhaps because their own firsthand experience suggests there’s no connection. (The shooting has been top news here and the subject of much debate and consternation.)

Yet it has me wondering: if violent movies don’t lead the rest of the world to see them as the cause of America’s violent culture, then:

What do you think Hollywood movies tell the rest of the world about America?

I don’t mean just in relation to violence but in all aspects. Idiotic romantic comedies. Alien invasion movies. Everything Adam Sandler. What do such movies say, if anything?

I’m curious to hear, of course, from both my American readers and those outside the United States.

(If you have a suggestion for a QOTD, feel free to email me. Responses to this QOTD sent by email will be ignored; please post your responses here.)
