(July 18, 2017 at 2:09 am)It_Was_me Wrote:
(July 18, 2017 at 2:06 am)Interaktive Wrote: The colonization of the world by the West after the Cold War helped spread liberal democracy.
Is that the end of the story? Has the West won forever?
What are you even talking about?
After the Cold War came neocolonialism. Russia became a colony of the United States and the West.