Can I ask, as a European, why does religion have such a great impact on American society? I'm asking this because European countries are mostly secular, and religion does not usually interfere with public life. I believe I have read your amendment on religious freedom, and as a law student I must say your state pretty much said, indirectly, 'you should believe in God or else you're fucked'. What are the reasons for this? Why does someone in the US need to believe in God to become a senator?
Whoever fights monsters should see to it that in the process he does not become a monster. And if you gaze long enough into an abyss, the abyss will gaze back into you.