(August 31, 2019 at 12:07 pm)wyzas Wrote: Christian belief in America has more to do with conforming to family traditions and local society and has little to do with any rational thought. For the majority I interact with IRL, the extent of their hypothesis is "because I'm told I am".
That has not been my experience; perhaps it differs by location and denomination. My guess is you will find such attitudes among the older establishments, like Catholics, Lutherans, and other European denominations that have migrated to America.
Most of my interactions are with Jehovah's Witnesses, Latter-Day Saints, and Seventh-Day Adventists. These are relatively new denominations, born in the States, so they lack the kind of culture and tradition that Catholics, for example, have. I lived in Tennessee for a few years, and my impression of the culture there was that most people who were religious were sincere in their beliefs.