Just a thought: if you reject religion, and the only claims that God exists come from religion, then why still believe? It's like with the deists: once they realized religion was fake, they just pushed God back a step, but why? The original basis for their belief (I'm talking about the first deists, the ones from the Enlightenment era) was religion; without it, the god claim is held up by nothing (not that religion can defend it either).
John Adams wrote: "The Government of the United States of America is not, in any sense, founded on the Christian religion."