When I mention sins, I'm talking about the fact that hundreds of years ago the people who came to this country slaughtered the people who were here first and took their land. Later they enslaved Black people, treated women like property, and didn't even allow them to vote until 1920. They sent Japanese Americans into internment camps after the attack on Pearl Harbor. And I'm sure the list just goes on and on.
Is there any way for America to truly confront its past and change for the better, or is it just too great a wound to heal?
Discuss.