So, I was thinking about a conversation I had yesterday regarding East versus West. As I stood in the shower this morning, I thought: The West has no religion of its own (save consumerism-capitalism-militarism-absolutism).
Hinduism, Buddhism, Judaism, Islam, and yes, Christianity all have their roots in the East. Near Eastern and Far Eastern religions define this world's religious landscape. I don't mean to exclude smaller religions or sects created in the West; I just mean that the vast majority of the world's religious people belong to faiths born in the East.
Yet many think of Christianity as a Western religion, rooted in Western understandings of God [read: Platonic and Socratic notions of dualism, the soul, etc.]: perhaps born in the East, but refined and defined by the West.
Have we gone astray in these thoughts? Or is this just how history works? Hmm.