I often wonder whether my actions and thoughts are the product of my own volition or of uncontrollable chemical reactions, whether my actions are ones I'm responsible for or merely the result of the laws of nature. It's an interesting question that I can't wrap my head around. Perhaps I just have a strong illusion of being in control of my actions. I'm not sure it really makes a difference, though. Either way, effort is required, and consequences for actions are a justifiable way of maintaining order.

You wouldn't punish a computer for making bad decisions, but you would punish a human. I wonder how a machine could ever feel emotions. If machines ever did feel emotions, then it would be justifiable to punish them. I think at that point you might as well consider a machine a living being, with the right to exist as well as liability to the same punishments as humans. I kind of think of myself as a programmed machine sometimes, though. I wonder how someone could ever build a machine complex enough to experience sentience and the full range of human senses.

Plenty of movies have already tackled this concept (Blade Runner, I, Robot, Terminator). I really want to go watch Blade Runner again. I have a feeling a lot of the movie's ideas went over my head when I was 16.
Supervenience, Transcendence, and Mind