I'm sure this question has been asked before, but it was sparked by another thread, and I've asked myself it multiple times.
Christianity is a misogynistic, violent religion: there are numerous passages in the Bible telling you how much it sucks to be female. Do women just ignore that?
(By Christianity I mean all the major denominations: LDS, every kind of Baptist, Catholicism, etc.)
Judaism isn't any better. It takes things all the way to gender separation in the synagogue.
Islam is just fucking nuts: women rank only slightly above property. Why the hell would any woman want to be Muslim?
I sincerely do not understand.