It's hard, but we don't know that it's impossible. And as Bostrom explains, even if the chance of it happening is something like 0.00000001, it's still worrisome. When you think of the potential trillions who may never be born into a life (of wealth, at that) because of our negligence, you start looking at existential risks a little differently. And this is one that almost no one else talks about in these terms: people either go with Kurzweil's happy singularity scenario or dismiss it as too far in the future to worry about.
Can we build AI without losing control over it? | Sam Harris