by Josh Welch (RC Member & Student at Ohio University)
In American pop culture, Christians who share their faith (or even admit to believing in Christ) are viewed with disdain and perceived as threatening. Of course, a few Christians do act like jerks, but that is not the real source of the hostility. It is evangelism itself that upsets people. I was once discussing this topic with someone when, in a burst of passion, she finally admitted what bothered her about evangelism: "You Christians think you're right!"
The fact that Christians claim to have the truth is offensive. It flies in the face of the widely held idea that questions of morality and religion are open to individual interpretation. On this view, nobody has the right to tell another person that what he is doing is wrong. Who do these Christians think they are? As one author put it, "The freedom of our day is the freedom to devote ourselves to any values we please, on the mere condition that we do not believe them to be true." Most people in our culture hold this view, though many have never consciously realized it.