Cassius
YaBB God
Posts: 4,632
« on: December 29, 2013, 05:55:01 PM »
Well, the west, as it exists now, is largely a product of Judeo-Christian values. Setting aside for a moment the political connotations of the term, Christianity (and the Jewish influences embedded within it, not least the Old Testament) has been one of the most important cultural influences in shaping the west as we know it today. That is not to discount other important influences (Greek philosophy, for example), but the fact remains that Christianity has been the dominant faith in the western world for over 1500 years. The vast majority of scholarship carried out in Europe (the cultural precursor to America) was conducted by Christian clergymen up until the 1400s or so. The vast majority of people in the west would have considered themselves, at least nominally, Christians, right into the 20th century. We can see the influence of Christianity and Judaism all around us: in our names (John, Matthew, Paul, Mark, David, James and Peter, to name but a few of the more common ones), in our politics (which particular Christian denomination you belong to can, even today, be an important marker of where you stand politically), in our culture, and even in the very buildings that you see every day. 'Judeo-Christian' traditions are so deeply embedded in our society that occasionally we're in danger of forgetting that they're even of religious origin. So the phrase itself is perfectly sound, in the same way that Islamic traditions have shaped the Islamic world, or Hinduism has shaped the subcontinent (just look at the caste system). It's a very real concept.
Politically, I also have a favourable opinion of the phrase.