Why do Christians feel it is their "duty" to tell others how to live their lives?
When I was a Christian, I didn't thrust my views on others or condemn them if they didn't believe as I did.
So, enlighten me OP. :-)