Africa is so fucked. It's amazing to me how Christians (and Muslims) always point to missionaries in Africa as proof that their religion does more good than harm. Really? When you were in Africa handing out Bibles, did you not look around at a single thing happening? Religion has done more damage to that continent than to any other.
![Image: dcep7c.jpg](https://i46.tinypic.com/dcep7c.jpg)