(August 18, 2021 at 8:16 am)Spongebob Wrote: So, I'm in my 50's and most of my life has been very well adjusted and stable. I grew up with an image that the United States was an altogether great country and only getting better. The last few years have been a total disaster in many ways and they have made it clear that despite the nice coat of red, white and blue patriotism, America is mostly made up of greedy, ignorant white people who hate just about everyone else and hate the very idea of diversity of race, thought or creed. It has made me sick and I have become disgusted with it. Just about every single moral concept I was taught as a Christian is complete bullshit. Americans don't value anyone and don't value life at all. Americans value money and their own personal power and absolutely nothing else. Take away the nice infrastructure and cheap food supply and this country would descend into chaos overnight. Some of this I already knew, but the depths of American depravity is so much worse than I ever thought possible.
I think Americans have always been very self-absorbed, but the average American is very friendly and believes they have as much charity and as strong moral values as anyone else.
But I agree that something is happening in the political realm that is dangerous and self-destructive.
The current Western system of democracy, global trade, and global rules was largely put together by the U.S., as the dominant power at the end of WWII. Since then, democracy spread around the world, until now, when I believe it is in retreat - especially in the U.S. On the rise are autocrats, oligarchs, single-party dictatorships, and strongmen who say "only I can fix it".