Should Christians Be Fighting A Culture War?
Contributed by Justin Steckbauer on Mar 27, 2021 (message contributor)
Summary: Have you ever heard the phrase "the culture war?" It's a blanket phrase that describes the various cultural "conflicts" between perhaps more conservative and more liberal beliefs in real world society.
Have you ever heard the phrase "the culture war?" It's a blanket phrase that describes the various cultural "conflicts" between perhaps more conservative and more liberal beliefs in real-world society. Images come to mind of the battle over Christmas, yard signs that say "Keep Christ in Christmas," and of course topics like Columbus Day, Thanksgiving changing to Turkey Day, the debate over gay marriage, religious liberty, NFL players taking a knee for the national anthem, Hollywood, "morality in the public square," and other cultural topics with certain political undertones.
So what is a culture war? Does it really even exist? If so, who is fighting it? And should Christians be involved?
Usually when someone brings up "the culture war," it's in the context of talking about the mean, bad "Christian Right," those terrible Christian supremacists (as Tim Keller called them) who won't stop fighting, being mean, and driving people away from Christ.
But is this characterization accurate? Christians, especially young Christians, don't just dislike this "Christian right"; they downright hate them. Why? Well, it's hard to say. They're angry about what they focus on, I suppose. They've intermingled the Christian message with politics, those evil bastards! Is this correct? Let's talk about this culture war and the people who fight it back and forth.
So does this culture war actually exist? The phrase "culture war" wasn't actually coined until 1991. But if one is referring to the struggle to define America waged by leaders on the left and on the right of the political spectrum, emerging in issues like "taking a knee" for the national anthem, the debates about calling it Christmas or the winter season, and the discussions about gay marriage, religious liberty, and conscientious objection, then yes, a sort of struggle certainly does exist in American society and does emerge in political and social debate in the media. Anyone who fails to recognize this apparent conflict is simply not paying attention to modern American society. It's something nearly all of us can recognize, and something many of us are loath to acknowledge.
Is there a cultural struggle occurring? Yes. But it's not a literal "war" with guns, bullets, and death. The phrase "culture war" seems to describe a conflict of ideas, of ideologies and philosophies of life, not any sort of violence.
So if a cultural conflict exists, and we know it does, what caused it? How did it come about?
One could say that the culture war has been going on to some extent since the 1920s. Before then, there had been a broadly shared vision of how Americans saw the country. American society generally saw itself as Christian, moral, ethical, patriotic, and embracing of certain values like liberty, freedom, justice, and equality. But a new movement was beginning that would call all of these values into question and ask whether America was really good, free, or equal.
There were many leaders and sub-movements one could point to in how the movement began: Fabian socialists, the results of World Wars I and II, massive immigration to the United States, cultural Marxism, Saul Alinsky, Herbert Marcuse, the Frankfurt School's teachings on Critical Theory, the "New Deal" under Franklin D. Roosevelt, the policies of President Woodrow Wilson, and the Great Depression, to name a few. But that's not the main topic of our discussion today.
The culture war entered new territory in the 1960s with the emergence of the New Left, prayer and the Bible being driven from public schools, and evolutionary theory being enshrined in the public school systems. The United States had been largely a Christian society, struggling with issues like segregation, eugenics, the debates over Watergate, the Vietnam War, and many other issues. But ever since the 1960s, American society has been undergoing a major change in ethics and values. This change has been pushed by leaders in the business world, the public schools, Hollywood, the news media, the sciences, and most prominently the university systems of the United States. That's just a bit of the history of the culture war.
So who started the culture war? This may come as a surprise to you, since I know the news media almost universally lays the blame for the culture war at the feet of traditionalists, conservatives, and Christians. But the history is unequivocal: the culture war has been perpetrated by those on the other side of the spectrum, the progressive movement, materialists, naturalists, atheists, and the New Left. Now, it's quite true to say that both sides have made many mistakes. Many traditionalists, conservatives, and "evangelicals" have made poor decisions in the culture war. I'm not trying to absolve anyone of shameful decisions. I'm just looking at the history here. But it's quite obvious from the history that the opening shots were fired by the progressives, and indeed the progressive side is the one pushing for radical changes to an American society that has traditionally been founded on a moral and ethical framework rooted in Christianity, and centered on values like liberty, freedom, absolute truths, and equality under the law.