(November 18, 2019 at 12:59 pm)The Valkyrie Wrote: Unfortunately this kind of shit happened all over.
Had to educate the “savages” of the lands we invaded.
But, but, but...it was Manifest Destiny! God made us do it...
Manifest Destiny, a phrase coined in 1845 by the journalist John L. O'Sullivan, is the idea that the United States was destined, by God in the view of its advocates, to expand its dominion and spread democracy and capitalism across the entire North American continent.
Disappointing theists since 1968!