(October 23, 2013 at 2:04 am)FallentoReason Wrote: It's a tricky thing to define for sure. I'd say the entity needs to show emotions/feelings/instincts.
Given that an entity can be conscious without having sentience - why do you consider the existence of emotions/feelings/instincts necessary for assigning meaning?
(October 23, 2013 at 2:04 am)FallentoReason Wrote: "The only entities conscious of them were the next program".
I love the word choice here. It's clear as day that you're begging the question.
Am I? Or am I using it in such a manner that it doesn't beg the question?
The word "conscious" is normally used to describe either biological or obviously self-aware entities - but the limits of the word are not defined. The simplest explanation of consciousness - without any dualistic baggage - would be "X is conscious of Y when some information from Y is received and processed by X". The behavior of the programs fits this description.
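To make that concrete, here's a minimal sketch in Python (the programs and values are hypothetical, purely for illustration): program B receives and processes program A's output, so on the definition above B is "conscious of" that output even though the user never sees it.

```python
# Hypothetical illustration of the working definition:
# "X is conscious of Y when some information from Y is received and processed by X."

def program_a(raw_value: float) -> float:
    # Program A processes the user's input and produces an intermediate output.
    return raw_value * 2.0

def program_b(intermediate: float) -> str:
    # Program B receives and processes A's output - on the definition above,
    # B is "conscious of" that output, even though the user never sees it.
    return "high" if intermediate > 10.0 else "low"

user_input = 7.0
intermediate = program_a(user_input)   # only program B ever "sees" this value
final_output = program_b(intermediate)
print(final_output)                    # the user only sees "high"
```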
(October 23, 2013 at 2:04 am)FallentoReason Wrote: Anyways, to the above I shrug my shoulders. Where one program ends and the other starts is an arbitrary boundary. What if we had a Mega Program that contained the algorithm of all three? Your non-issue would dissolve and we would be at square one.
Actually, in that case the issue would be compounded. If we have a mega-program whose sections exchange information with each other, then I could argue that such a program has a degree of self-awareness.
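To illustrate (again a hypothetical sketch, not a claim about any real system): once the sections of the mega-program also report their own states to a monitoring section, part of the program is receiving and processing information about the program itself - self-awareness in the same minimal sense of "conscious" used above.

```python
# Hypothetical mega-program: sections pass their outputs along AND log their
# states, and a monitor section processes that log - i.e. the program
# processes information about its own workings.

class MegaProgram:
    def __init__(self):
        self.internal_log = []  # information the program keeps about itself

    def section_one(self, x):
        result = x + 1
        self.internal_log.append(("section_one", result))
        return result

    def section_two(self, x):
        result = x * 3
        self.internal_log.append(("section_two", result))
        return result

    def monitor(self):
        # Receives and processes information about the other sections' behavior:
        # the program is "conscious of" its own intermediate states.
        return dict(self.internal_log)

    def run(self, x):
        out = self.section_two(self.section_one(x))
        return out, self.monitor()

print(MegaProgram().run(4))  # (15, {'section_one': 5, 'section_two': 15})
```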
(October 23, 2013 at 2:04 am)FallentoReason Wrote: Bottom line here is that it doesn't matter how many programs or how long the algorithms are, you still have a whole bunch of physical causal relations that take something in and spit something out without ever having to give it meaning.
But they do have to give it meaning - that was my point. The user is not aware of the intermediate outputs, but there is a set of abstract categories that can assign meaning to them, and that assignment is done by the programs.
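A hypothetical sketch of what that assignment amounts to: the intermediate output is just a raw value the user never sees, but the receiving program maps it onto its own set of abstract categories, and on the view I'm defending that is all "assigning meaning" needs to be.

```python
# Hypothetical sketch of meaning-assignment: the receiving program interprets
# a raw intermediate value against its own set of abstract categories.

CATEGORIES = [(0, 50, "cold"), (50, 80, "warm"), (80, float("inf"), "hot")]

def assign_meaning(intermediate_value: float) -> str:
    # Map the raw value onto one of the program's categories.
    for low, high, label in CATEGORIES:
        if low <= intermediate_value < high:
            return label
    return "unknown"  # values below 0 fall outside every category

print(assign_meaning(42.0))   # -> "cold"
print(assign_meaning(95.5))   # -> "hot"
```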