(February 1, 2014 at 3:34 am)futilethewinds Wrote: Yeah. They think that slavery was better for both white people and black people. Because that's what they teach Southern kids.
While I'm sure there are a few sick individuals who think that, living in the dregs of trailer-home communities in the South, I don't personally know of anyone who teaches that slavery was better for anyone, or anyone who has been taught such.
Everything I needed to know about life I learned on Dagobah.