Christianity and culture: Change it or be changed by it?

Gerald R. Baron
Oct 28, 2022 · 8 min read
Photo by Stefan Katrandjiski on Unsplash

Should the content of the Christian faith be altered by the culture where Christians find themselves? Or should those Christians upholding universal Christian beliefs and values change the culture in which they live?

The question is central to discussions about the reformation of Christianity, the topic of this lengthy series involving Graham Pemberton and Prudence Louise.

There was a time, of course, in Western cultural history when Christianity and culture were closely intertwined. This was the medieval period, the height of Christendom, with the Church personified by the Pope, providing the spiritual and temporal leadership once held by the Roman consuls and then the emperors.

Some would argue that the United States was born as a Christian nation, guided initially by the culture and faith of the Puritans, developed through the Great Awakening, and flourishing in a largely Protestant form in both white and black cultures through perhaps the post-war period and maybe even into the post-protest period. Few would argue today that the dominant culture as presented by major media, education, entertainment, and politics is Christian. Whether it is anti-Christian to the degree that many Christians believe is disputed by a growing number of other believers. Sadly, the culture wars with very…
