While I was talking with my friend who lived in Pakistan for several years back in the '90s, she also told me that since people in Muslim countries understand the United States to be a Christian nation, they think the things that come out of the United States reflect Christian values.
She was talking about what comes out of Hollywood--American movies and TV shows. They see the sex and violence and filthy language and think that's what Christianity is, she said.
Hollywood so often seems to hold Christians and Christianity in disdain. Movies and TV shows frequently portray Christians as idiots, hypocrites, and perpetrators of the worst crimes. And people in other parts of the world think what comes out of Hollywood is Christian and is put out by Christians.
Oy.
I wonder how the non-Christians in Hollywood, especially those who buck against anything truly Christian, would feel if they knew that what they produce represents "Christianity" to others around the world.
And after all that, many people in the United States, in Hollywood and elsewhere, turn around and blame Christians for the problems in the world...and the trouble that has come to our own country.
Could this get any more ironic?
Again, a Christian is something a person chooses to become, making a conscious choice based on an understanding that Jesus is the Christ, the Messiah God promised to send to save that person, individually, from his or her sin.
A Christian is not something a person automatically is because he or she was born in the United States of America.
1 comment:
I never in a million years would have thought people in other countries might think what came out of Hollywood represented "Christian." No wonder they see Christians as hypocritical!