Culture Wars
What does the term culture wars mean? Simply put, it means that there is a fight to change the underlying culture of this country. However, with all the turmoil in the world, I’m pretty sure the term applies to most cultures, and it seems especially applicable to the United States and Europe. Why is that so? There are a couple of explanations for it.
A Brief History
During the past 1,500 years or so, western culture was shaped and molded by the rise of Christianity, because the influence the followers of Jesus had on the human experience was overwhelming. Medicine, science, art, literature, civil government, and much more conformed to the ideas and concepts espoused by the Bible. Some of the greatest achievements in human history can be attributed to those who claimed to be Christians. In that sense, the growing acceptance and inclusion of Christian thought was evidence of the culture wars the church waged within the prevailing societies of the time.
In the United States, the general acceptance of and adherence to Christian principles in the national culture was manifested in the colonial charters as well as in local laws and governance. This influence continued through the formation and adoption of governing documents such as the Declaration of Independence and the US Constitution, and every single colony attributed its rights and blessings to the God of creation. While some people claim that’s not the case, history shows it to be true.
The Clouds of War
During the past 100 years or so, a shift has taken place in how we view and interact with society, for a few different reasons. The influx of non-Christian and/or non-assimilating immigrants, the exclusive teaching of evolution in the public school system, and other forces have led to a drastic decline in the number of people who adhere to the teachings of Jesus. The militancy of agnostics and atheists added fuel to the fire by demanding a ‘separation of church and state,’ a phrase that is not contained in the founding documents of our country. The result is that the culture wars have arrived!
Even within the church, growing numbers of people deem the Bible outmoded. The focus of the church has shifted from personal responsibility to God to a general sense of ‘the common good,’ subverting the authority of the Word in everyday life. Parents increasingly rely on the church to instill a personal relationship between their children and the Lord. The result of these negative forces on western society is painfully obvious.
The Solution
What can be done about the culture wars we see raging all around us? Will it take political involvement? Should we increase government spending on social services? Do we need to employ more police officers? These are important considerations, but they will not influence the culture wars as much as we think. The solution is to return to what made western civilization what it once was. It will take the followers of Jesus becoming the salt and light we are supposed to be.
Only the power of the Holy Spirit operating through the followers of Christ will positively affect the culture wars.
Share what you think about this!