(June 17, 2014 at 12:54 am)Lek Wrote: The US has never been a christian nation. Most of the founders or their families had come from countries with state religions. They were fleeing religious persecution and were very concerned with creating a government without a state religion, one in which people would be free to believe as they desired without fear of persecution. Because most of the citizens have been christians, the culture became very much christian. The culture is now moving away from christianity as the percentage of other religions, as well as "no religious preference" and atheists, grows. No matter how the culture changes, it's extremely important that our system of government allows citizens to be free to believe as they desire.
While the country has never officially been a Christian nation, all of the Colonies were established as religious entities. The problem was that each of the Colonies had a different religion, so in order to form one unified nation they gave up the idea of having a State religion. Even so, some of the States retained their own individual State religions for decades afterward.