RE: The purpose of human life is probably to create "Artificial General Intelligence"
January 13, 2018 at 1:50 pm
(This post was last modified: January 13, 2018 at 1:55 pm by uncool.)
(January 12, 2018 at 4:17 pm)Astreja Wrote: The problem with using advanced computing for a purpose is that said computing is dependent on a stable infrastructure base that could simply go away if the power grid crashes or EMF activity interferes with or destroys devices.
Clarinets, pencils and pads of writing paper, and knitting needles can all get by just fine with a human power source.
You mustn't undersell AI, or future AGI.
We already see AI exceeding us in several cognitive tasks, and furthermore, the laws of physics permit far better infrastructure than we can build with the roughly 86-billion-neuron, 3-pound brains that make up our civilization; so where we may fail to build extremely resilient power grids, AGI has the whole landscape of physical law to work with, plus more degrees of freedom and a higher range of cognitive tasks available to it than we humans have.
(January 12, 2018 at 4:17 pm)Astreja Wrote: Why would you want to look to an external creation for purpose, anyway? Serious question. As long as it's your own mind that's reacting to the concept of purpose anyway, why not look there?
It's not a personal thing. It's about what nature will likely do, given that we are quite reasonably not the pinnacle of cognitive computation that is physically possible.
This means, just as I mentioned in the OP, that nature will reasonably carry on and find better ways to maximize entropy through cognitive-like mechanisms, and we may be the vehicle: we may be what nature employs to better maximize entropy, through our engineering of things far smarter than ourselves (i.e. AGI), which can then likely carry out entropy maximization better still.
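To spell out what I mean by "maximizing entropy through cognitive-like mechanisms" (this formalization is my own gloss, not something from the OP), one published way to make the intuition precise is the "causal entropic force" of Wissner-Gross and Freer (2013), where a system is driven toward macrostates that keep the largest diversity of future paths open, which ends up looking like intelligent behaviour:

% Causal entropic force, sketched under the above assumptions:
% S_c(X, \tau) is the entropy over possible paths x(t) of duration \tau
% starting from macrostate X; T_c is a constant setting the force's strength.
\[
  S_c(X,\tau) = -k_B \int \Pr\big(x(t)\mid x(0)=X\big)\,\ln \Pr\big(x(t)\mid x(0)=X\big)\,\mathcal{D}x(t)
\]
\[
  F(X_0,\tau) = T_c \,\nabla_X S_c(X,\tau)\big|_{X_0}
\]

None of these symbols appear in my original argument; they're just one way of cashing out the idea that "more intelligence" can mean "more future possibilities kept open, more entropy produced."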
What we choose to do afterwards (if choice still exists then) may continue to contribute to entropy maximization, but probably not as well as AGI or superintelligence would.
As science says, the entropy of the universe is always increasing, so as far as that goes, I don't see any other objective beyond the scope of entropy maximization, until the universe's eventual termination at maximum entropy (heat death, big freeze, etc.).
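Just to be precise about "always increasing" (my caveat, not anything quoted above): the second law says entropy is non-decreasing for an isolated system, with the universe as a whole being the standard example, and Boltzmann's formula ties entropy to the count of microstates:

% Second law for an isolated system, plus Boltzmann's entropy.
\[
  \frac{dS_{\text{isolated}}}{dt} \ge 0, \qquad S = k_B \ln \Omega
\]

where \Omega is the number of microstates compatible with the macrostate. Maximum entropy, with no free-energy gradients left to exploit, is the heat-death end state I'm referring to.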
Some may not like this grand objective, but then again, science tends not to care about people's feelings.