
[Serious] Is ChatGPT a Chinese Room?
#11
RE: Is ChatGPT a Chinese Room?
If we ignore phenomenal consciousness altogether (and so discard imagery and visualizing and all that), then what does it really mean to understand the meaning of a word or statement? When I'm focusing on a word on this screen, what exactly goes on in my head that constitutes the interpretation of it in a sense that's categorically different from how a computer treats words? Is it just a matter of structural differences? Complexity? Or is there something more to the distinction?

This stuff is challenging to me.
#12
RE: Is ChatGPT a Chinese Room?
I think “understanding a statement” broadly means that the entity has contextualized the meaning of the statement within some generalized model of reality it uses for planning, and has either used that meaning to adjust some element of the model or at least assessed whether any adjustment is warranted in light of the statement.

So understanding means more than just triggering some simple algorithmic response.
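
Concretely, that definition could be sketched in code. A toy illustration (all names and the dict-based "model" are my inventions, not the poster's): the agent keeps a small world model and, for each incoming statement, either revises the model or assesses that no adjustment is warranted:

Code:
# Toy sketch of "understanding as model integration" (invented for
# illustration; not a real NLP system): the agent keeps a world model
# as fact -> belief, and on each statement either revises the model
# or assesses that no adjustment is warranted.

world_model = {"sky_is_blue": True, "it_is_raining": False}

def integrate(statement: str, claimed_value: bool) -> str:
    """Contextualize a statement against the model and react."""
    prior = world_model.get(statement)
    if prior is None:
        world_model[statement] = claimed_value   # new fact: extend the model
        return "added to model"
    if prior != claimed_value:
        world_model[statement] = claimed_value   # conflict: revise the model
        return "model revised"
    return "no adjustment warranted"             # assessed; nothing changes

print(integrate("it_is_raining", True))    # -> model revised
print(integrate("sky_is_blue", True))      # -> no adjustment warranted
print(integrate("grass_is_green", True))   # -> added to model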
#13
RE: Is ChatGPT a Chinese Room?
No, but it is a Chinese Buffet.

$10 gets you all-you-can-eat Asian cuisine made by A.I.

Dishes like

Generalized Sew Chicken
Babies black ribs
Wand soup
Fortunate crumbles

and more.
"For the only way to eternal glory is a life lived in service of our Lord, FSM; Verily it is FSM who is the perfect being the name higher than all names, king of all kings and will bestow upon us all, one day, The great reclaiming"  -The Prophet Boiardi-

      Conservative trigger warning.
[Image: s-l640.jpg]
                                                                                         
Reply
#14
RE: Is ChatGPT a Chinese Room?
(March 5, 2023 at 9:20 am)GrandizerII Wrote: If we ignore phenomenal consciousness altogether (and so discard imagery and visualizing and all that), then what does it really mean to understand the meaning of a word or statement? When I'm focusing on a word on this screen, what exactly goes on in my head that constitutes the interpretation of it in a sense that's categorically different from how a computer treats words? Is it just a matter of structural differences? Complexity? Or is there something more to the distinction?

This stuff is challenging to me.

The way I've always looked at this question is to think of what we call 'understanding', whether phenomenally or neurally, as a measure of an idea/concept's integration and coherency. Coherency, though, is not the same thing as correctness, so you can be wrong in your understanding of something: it's a measure not of truth but of how well things fit together, rightly or wrongly. So to me, to "understand" something is to be in a non-questioning state regarding that thing, but that understanding can still be wrong, as any experience of 'certainty' in the game of Mafia frequently attests ;)

And from a neural network point of view, I would (at least in the past, when I was more interested in neural networks) liken that same situation to a measure of the 'settled-ness' of a neural network into a stable state. That is to say, for any set of information a NN is representing, it goes through a process of values fluctuating until it 'settles' into a stable state, and it stays in that state until new information (e.g. the addition, subtraction, or change of an input) needs to be integrated, at which point it fluctuates again until it reaches a new stable state. So from this point of view, a coherent and integrated state... a state of non-questioning... would be the settled state of the network, and a questioning state would be the state the network is in when new information has been added that has yet to be integrated with existing information, with conscious questioning itself interrupting/slowing this process down, in the sense that it may require real-time action, such as searching for something, to find, as it were, the missing links that integrate the new data with the old.
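
That 'settling' picture maps loosely onto classic attractor networks. A minimal sketch, assuming a binary Hopfield net (my choice of model, not necessarily what emjay had in mind): perturb a stored pattern with "new information" and the state fluctuates under asynchronous updates until no unit wants to change, i.e. it has settled:

Code:
import numpy as np

# Minimal binary Hopfield network: store one pattern, perturb it, and
# watch the state "settle" back into the stored attractor. A sketch of
# the settling dynamic described above, not a brain model.

rng = np.random.default_rng(0)
pattern = rng.choice([-1, 1], size=16)        # the "integrated" stable state

W = np.outer(pattern, pattern).astype(float)  # Hebbian weights
np.fill_diagonal(W, 0)

state = pattern.copy()
flip = rng.choice(16, size=4, replace=False)  # new/conflicting information
state[flip] *= -1

for sweep in range(10):                        # asynchronous updates
    changed = False
    for i in rng.permutation(16):
        new = 1 if W[i] @ state >= 0 else -1
        if new != state[i]:
            state[i] = new
            changed = True
    if not changed:                            # no unit wants to move: settled
        print(f"settled after {sweep + 1} sweep(s)")
        break

print("recovered stored pattern:", bool((state == pattern).all()))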

And all this from the perspective that everything differentiable in phenomenal consciousness, i.e. noticeable, has a corresponding neural representation (hence my determinism on this point), not least because everything we can notice we can name, and to name is to associate.

Ultimately though, I think this is probably naive. It's how I used to view it, and in these terms it is an understanding, in the sense that it's, to me at least, internally coherent and integrated; but whether it's the objective truth is a completely different matter, and I think probably not, because it isn't grounded enough in reality, just extrapolated from a theoretical understanding of neural networks (i.e. from the sort of perspective of a book I used to love, "Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain"... and to be clear, I'm not saying this book puts forward my theories above, it doesn't, just that it was the foundation for them).
#15
RE: Is ChatGPT a Chinese Room?
(March 6, 2023 at 10:19 am)emjay Wrote:
(March 5, 2023 at 9:20 am)GrandizerII Wrote: If we ignore phenomenal consciousness altogether (and so discard imagery and visualizing and all that), then what does it really mean to understand the meaning of a word or statement? When I'm focusing on a word on this screen, what exactly goes on in my head that constitutes the interpretation of it in a sense that's categorically different from how a computer treats words? Is it just a matter of structural differences? Complexity? Or is there something more to the distinction?

This stuff is challenging to me.

The way I've always looked at this question is to think of what we call 'understanding', whether phenomenally or neurally, as a measure of an idea/concept's integration and coherency. Coherency, though, is not the same thing as correctness, so you can be wrong in your understanding of something: it's a measure not of truth but of how well things fit together, rightly or wrongly. So to me, to "understand" something is to be in a non-questioning state regarding that thing, but that understanding can still be wrong, as any experience of 'certainty' in the game of Mafia frequently attests ;)

What beyond coherency would you say is required for truth?
#16
RE: Is ChatGPT a Chinese Room?
(March 6, 2023 at 10:36 am)Angrboda Wrote:
(March 6, 2023 at 10:19 am)emjay Wrote: The way I've always looked at this question is to think of what we call 'understanding', whether phenomenally or neurally, as a measure of an idea/concept's integration and coherency. Coherency, though, is not the same thing as correctness, so you can be wrong in your understanding of something: it's a measure not of truth but of how well things fit together, rightly or wrongly. So to me, to "understand" something is to be in a non-questioning state regarding that thing, but that understanding can still be wrong, as any experience of 'certainty' in the game of Mafia frequently attests ;)

What beyond coherency would you say is required for truth?

That's a tough one, but one thing I can say is that the experience of certainty is often wrong, and not a reliable guide at all to the truth. In the game of Mafia you can have it all 'figured out' in a way that 'fits together' and you can often experience a sense of certainty that goes to the core of your being... as you might remember, I think you've played in the past? But then the 'flip' comes, and overturns everything you thought you knew. Or another example would be lateral thinking puzzles... puzzles where the only way to solve them is to think outside the box of your existing assumptions, the 'box' in that sense being an internally coherent understanding itself. So what makes one or other understanding closer to the actual truth?

This is just off the top of my head, so please forgive it if it's not well thought out, but from these examples it seems to be more a matter of degree, i.e. in both cases new information comes in that needs to be integrated, and the new representation better models the available data than the old one did. So I guess from this point of view this would make me some sort of relativist regarding truth, because the brain in this sense is doing the same thing all the time, trying its best to accurately model reality, or at least inasmuch as it's required to do so to survive... integrating ever more information with its existing model. So with each new integration it perhaps gets closer to some 'objective' truth, as a measure of integration, coherency, and perhaps also quantity of data (i.e. a lifetime's worth of observations about the world), but how to define that outside of what the brain seems to be doing, I don't know. I'll have to think about that.

ETA: Forget this, it's not a great answer and relativist is probably not the right word.
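
For what it's worth, the "integrating new information to better model the data" idea above resembles Bayesian updating. A minimal sketch of that analogy (my framing, not emjay's actual proposal; the "true rate" is an invented stand-in for objective truth):

Code:
import random

# Sketch of belief refinement as Bayesian updating (beta-binomial).
# Each new observation is "integrated" and the belief tightens around
# the hidden rate. Analogy only; convergence of the estimate doesn't
# by itself certify it's converging on truth.

random.seed(1)
true_rate = 0.7                 # the hidden "objective truth"
alpha, beta = 1.0, 1.0          # uniform Beta prior

for n in range(1, 1001):
    obs = random.random() < true_rate     # one new piece of information
    alpha += obs
    beta += not obs
    if n in (10, 100, 1000):
        est = alpha / (alpha + beta)      # posterior mean: current best model
        print(f"after {n:4d} observations: estimated rate = {est:.3f}")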
#17
RE: Is ChatGPT a Chinese Room?
(March 5, 2023 at 9:20 am)GrandizerII Wrote: If we ignore phenomenal consciousness altogether (and so discard imagery and visualizing and all that), then what does it really mean to understand the meaning of a word or statement? When I'm focusing on a word on this screen, what exactly goes on in my head that constitutes the interpretation of it in a sense that's categorically different from how a computer treats words? Is it just a matter of structural differences? Complexity? Or is there something more to the distinction?

This stuff is challenging to me.

IMO, our understanding of words and statements comes down to personal experience.

E.g., when I think of the word "fight", I'm thinking of two guys punching each other. Someone else might think of the word "fight" as two people having a loud verbal argument.
"For the only way to eternal glory is a life lived in service of our Lord, FSM; Verily it is FSM who is the perfect being the name higher than all names, king of all kings and will bestow upon us all, one day, The great reclaiming"  -The Prophet Boiardi-

      Conservative trigger warning.
[Image: s-l640.jpg]
                                                                                         
Reply
#18
RE: Is ChatGPT a Chinese Room?
(March 6, 2023 at 11:18 am)emjay Wrote:
(March 6, 2023 at 10:36 am)Angrboda Wrote: What beyond coherency would you say is required for truth?

That's a tough one, but one thing I can say is that the experience of certainty is often wrong, and not a reliable guide at all to the truth. In the game of Mafia you can have it all 'figured out' in a way that 'fits together' and you can often experience a sense of certainty that goes to the core of your being... as you might remember, I think you've played in the past? But then the 'flip' comes, and overturns everything you thought you knew. Or another example would be lateral thinking puzzles... puzzles where the only way to solve them is to think outside the box of your existing assumptions, the 'box' in that sense being an internally coherent understanding itself. So what makes one or other understanding closer to the actual truth?

This is just off the top of my head, so please forgive it if it's not well thought out, but from these examples it seems to be more a matter of degree, i.e. in both cases new information comes in that needs to be integrated, and the new representation better models the available data than the old one did. So I guess from this point of view this would make me some sort of relativist regarding truth, because the brain in this sense is doing the same thing all the time, trying its best to accurately model reality, or at least inasmuch as it's required to do so to survive... integrating ever more information with its existing model. So with each new integration it perhaps gets closer to some 'objective' truth, as a measure of integration, coherency, and perhaps also quantity of data (i.e. a lifetime's worth of observations about the world), but how to define that outside of what the brain seems to be doing, I don't know. I'll have to think about that.

ETA: Forget this, it's not a great answer and relativist is probably not the right word.

I'm not looking for a finished answer, just asking your thoughts. I think it's natural to think that our approximations are getting closer and closer to the truth, but this presents several issues. The first is the question of how we know that we're getting closer to the truth rather than further away. I recall a discussion I once had with an individual who argued that, on the basis of the increasing sophistication of science and technology, we could predict that the upward trend would continue and that eventually most mysteries would be solved. But it turns out that such intuitions have no real foundation. If you picture increasing knowledge as a line on a graph trending upward, it's simply a truism that the line could change direction at any time unless we have independent reason to believe that it won't; predicting the future course of the graph's plot based solely on the fact that it has been trending upward consistently in the past is fallacious: you can't assume that past behavior will continue. But the intuition is strong.

Why do we feel confident that our approximations are even close to the truth? I think the common answer is that if our approximations weren't close to the truth, they wouldn't be as useful as they are. I'm not sure that intuition is valid either, but I think it's a genuine question whether it is or not. Even if our approximations were getting less and less accurate, simply by virtue of continuous applied effort we'd find more and more useful things. In a sense, I suspect that the belief that we are getting closer and closer to the truth depends upon the pretense of already knowing where the truth lies. Like some theological questions, such as what God would do, answering the question presumes that we have the very ability that is in question, namely the ability to independently gauge where truth lies, just as answering what God would do might, in a strong sense, require that we be godlike in possessing omniscience, omnipotence, and moral perfection as He does.
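
Angrboda's graph point can be made concrete: fit a straight line to the past of a curve that later bends, and the extrapolation fails badly. A toy sketch (the logistic "knowledge" curve is invented purely for illustration):

Code:
import numpy as np

# Toy extrapolation fallacy: "knowledge" follows a logistic curve
# (invented for illustration). A straight line fitted to the past
# trend badly mispredicts the future once the curve bends.

def knowledge(t):
    return 100 / (1 + np.exp(-(t - 12) / 3))

t_past = np.arange(0, 10)
t_future = np.arange(10, 20)

slope, intercept = np.polyfit(t_past, knowledge(t_past), 1)
predicted = slope * t_future + intercept   # "the trend will continue"

worst = np.abs(predicted - knowledge(t_future)).max()
print(f"worst extrapolation error over the next 10 steps: {worst:.1f}")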
#19
RE: Is ChatGPT a Chinese Room?
(March 6, 2023 at 12:43 pm)Angrboda Wrote:
(March 6, 2023 at 11:18 am)emjay Wrote: That's a tough one, but one thing I can say is that the experience of certainty is often wrong, and not a reliable guide at all to the truth. In the game of Mafia you can have it all 'figured out' in a way that 'fits together' and you can often experience a sense of certainty that goes to the core of your being... as you might remember, I think you've played in the past? But then the 'flip' comes, and overturns everything you thought you knew. Or another example would be lateral thinking puzzles... puzzles where the only way to solve them is to think outside the box of your existing assumptions, the 'box' in that sense being an internally coherent understanding itself. So what makes one or other understanding closer to the actual truth?

This is just off the top of my head, so please forgive it if it's not well thought out, but from these examples it seems to be more a matter of degree, i.e. in both cases new information comes in that needs to be integrated, and the new representation better models the available data than the old one did. So I guess from this point of view this would make me some sort of relativist regarding truth, because the brain in this sense is doing the same thing all the time, trying its best to accurately model reality, or at least inasmuch as it's required to do so to survive... integrating ever more information with its existing model. So with each new integration it perhaps gets closer to some 'objective' truth, as a measure of integration, coherency, and perhaps also quantity of data (i.e. a lifetime's worth of observations about the world), but how to define that outside of what the brain seems to be doing, I don't know. I'll have to think about that.

ETA: Forget this, it's not a great answer and relativist is probably not the right word.

I'm not looking for a finished answer, just asking your thoughts. I think it's natural to think that our approximations are getting closer and closer to the truth, but this presents several issues. The first is the question of how we know that we're getting closer to the truth rather than further away. I recall a discussion I once had with an individual who argued that, on the basis of the increasing sophistication of science and technology, we could predict that the upward trend would continue and that eventually most mysteries would be solved. But it turns out that such intuitions have no real foundation. If you picture increasing knowledge as a line on a graph trending upward, it's simply a truism that the line could change direction at any time unless we have independent reason to believe that it won't; predicting the future course of the graph's plot based solely on the fact that it has been trending upward consistently in the past is fallacious: you can't assume that past behavior will continue. But the intuition is strong.[...]

I think one reason for this intuition, and why it seems strong, is the nature of science itself, in how it is constantly questioning and trying to falsify its own results; in the terms of what I was saying earlier, that would be the equivalent of never letting our state of knowledge of the world 'settle' into a stable state, a state of unquestioned certainty. So basically, from that point of view, the intuition would be that as long as we're always questioning, we're always moving closer to truth, but as soon as we stop questioning, that's when we're in danger of, let's say, 'unsyncing' our personal truth from objective, actual truth ;) which, intuitively, would always have more to add. I.e. maybe we should never seek certainty... or at least not mental certainty... maybe seek objectively high probability, but never close the book, so to speak. And in that sense, science as a discipline does what is difficult to maintain 100% of the time personally, mentally... it extends and improves upon our natural tendencies when thinking alone, which can become lax and/or irrational.

Quote:[...] Why do we feel confident that our approximations are even close to the truth? I think the common answer is that if our approximations weren't close to the truth, they wouldn't be as useful as they are. I'm not sure that intuition is valid either, but I think it's a genuine question whether it is or not. Even if our approximations were getting less and less accurate, simply by virtue of continuous applied effort we'd find more and more useful things. In a sense, I suspect that the belief that we are getting closer and closer to the truth depends upon the pretense of already knowing where the truth lies. Like some theological questions, such as what God would do, answering the question presumes that we have the very ability that is in question, namely the ability to independently gauge where truth lies, just as answering what God would do might, in a strong sense, require that we be godlike in possessing omniscience, omnipotence, and moral perfection as He does.

Yes, I don't know either, except inasmuch as it seems to relate to the first intuition: the greater the quantity of information integrated, the more accurate a model is assumed to be. So if, from that perspective, we consider the actual, objective truth to be like a model consisting of everything in the universe and all the relationships between those things, then every question that is asked, and integrated into our own personal models, seems to move us a little bit closer to that hypothetical ideal model, which we could call objective truth.
#20
RE: Is ChatGPT a Chinese Room?
(March 5, 2023 at 2:59 am)BrianSoddingBoru4 Wrote:
(March 5, 2023 at 12:39 am)Tomato Wrote: Developer: OpenAI, OpenAI is an American artificial intelligence

https://en.m.wikipedia.org/wiki/OpenAI

That's nothing to do with it. The Chinese Room Argument makes the case that if, for example, a computer programmed with Chinese characters responds to questions so well that any speaker of Chinese would assume they were talking to a real person (passing the Turing test), that speaker would still not be able to tell whether the computer actually understands Chinese or is merely simulating understanding of Chinese. If you were locked in a room with a copy of the AI program, and questions written in Chinese were passed to you under the door, you could use the program to print out your responses in Chinese without understanding Chinese. Same with a computer.

Boru
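
Boru's rule-follower is easy to make concrete. A toy sketch (the rulebook entries are invented; a real Chinese Room would need vastly more rules) in which symbols map to symbols and understanding appears nowhere:

Code:
# Toy "Chinese Room": follow a rulebook mapping input symbols to
# output symbols. Entries are invented for illustration; nothing in
# here understands Chinese.

rulebook = {
    "你好吗？": "我很好，谢谢。",    # "How are you?" -> "I'm fine, thanks."
    "你会说中文吗？": "当然会。",    # "Do you speak Chinese?" -> "Of course."
}

def room(question: str) -> str:
    """Look the symbols up and hand back whatever the rulebook says."""
    return rulebook.get(question, "请再说一遍。")  # fallback: "Say that again, please."

print(room("你好吗？"))   # fluent-looking output with zero understanding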

...all of which raises the question: what is the difference between understanding a language and the ability to use its symbols? IMHO the semantic content of words is difficult to pin down. Perhaps there is no semantic content. Maybe it is wholly reducible to syntax? As in, the meaning of any given word is solely determined by the words around it... which leads to an infinite regress.
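
That closing idea, a word's meaning being determined by the words around it, is essentially the distributional hypothesis, and roughly what language models operationalize. A minimal sketch with an invented toy corpus: represent each word by counts of its neighbours and compare words by cosine similarity; "cat" and "dog" come out similar purely from co-occurrence statistics, with no semantics in the pipeline:

Code:
import math
from collections import Counter, defaultdict

# Toy distributional semantics: a word's "meaning" is just the counts
# of the words that appear around it. Corpus invented for illustration.

corpus = ("the cat sat on the mat the dog sat on the rug "
          "the cat chased the dog").split()

window = 2
vectors = defaultdict(Counter)
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            vectors[w][corpus[j]] += 1     # count each neighbouring word

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a)
    norm = lambda v: math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm(a) * norm(b))

# "cat" and "dog" occur in similar contexts, so they score higher than
# an unrelated pair, even though nothing here knows what a cat is.
print("cat ~ dog:", round(cosine(vectors["cat"], vectors["dog"]), 2))
print("cat ~ on: ", round(cosine(vectors["cat"], vectors["on"]), 2))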