Quantum mechanics, the Chinese room and the limits of understanding

#1
C C Offline
The Chinese Room is subsumed by the symbol grounding problem. If a computer is trapped entirely in language (never experiences the images, sounds, odors, tactile feelings that humans represent the world with), then the computer is unable to apprehend the phenomenal significance of what the symbol arrangements ultimately correspond to. IOW, it is akin to a pictureless dictionary that is limited to explaining words by referring to other words (a circularity that it cannot break out of).
- - - - - - - - - - - -

Quantum mechanics, the Chinese room and the limits of understanding
https://johnhorgan.org/cross-check/quant...erstanding

EXCERPT: . . . Searle’s thought experiment has provoked countless objections. Here’s mine. The Chinese room experiment is a splendid case of begging the question (not in the sense of raising a question, which is what most people mean by the phrase nowadays, but in the original sense of circular reasoning). The meta-question posed by the Chinese Room Experiment is this: How do we know whether any entity, biological or non-biological, has subjective, conscious experiences?

When you ask this question, you are bumping into what I call the solipsism problem. No conscious being has direct access to the conscious experience of any other conscious being. I cannot be absolutely sure that you or any other person is conscious, let alone a jellyfish, say, or an AI program like ChatGPT. I can only make inferences based on the behavior of the person, jellyfish or chatbot.

Now, I assume that most humans, including those of you reading these words, are conscious, as I am. I also suspect that Searle is probably right, and that even an AI as clever as ChatGPT only mimics understanding of English. It doesn’t feel like anything to be ChatGPT, which manipulates bits mindlessly. That’s my guess, but I can’t know for sure, because of the solipsism problem.

Nor can I know what it’s like to be the man in the Chinese room. He may or may not understand Chinese; he may or may not be conscious. There is no way of knowing, again, because of the solipsism problem. Searle’s argument assumes that we can know what’s going on, or not going on, in the man’s mind, and hence, by implication, what’s going on or not in a machine. Searle’s flawed initial assumption leads to his flawed, question-begging conclusion.

That doesn’t mean the Chinese room experiment has no value. Far from it. The Stanford Encyclopedia of Philosophy calls it “the most widely discussed philosophical argument in cognitive science to appear since the Turing Test.” Searle’s thought experiment continues to pop up in my thoughts. Recently, for example, it nudged me toward a disturbing conclusion about quantum mechanics, which I’ve been struggling to learn.

Physicists emphasize that you cannot understand quantum mechanics without understanding its underlying mathematics. You should have, at a minimum, a grounding in logarithms, trigonometry, calculus (differential and integral) and linear algebra. Knowing Fourier transforms wouldn’t hurt.

That’s a lot of math, especially for a geezer and former literature major like me. I was thus relieved to discover Q Is for Quantum by physicist Terry Rudolph (see illustration above). He explains superposition, entanglement and other key quantum concepts with a relatively simple mathematical system, which involves arithmetic, a little algebra and lots of diagrams with black and white balls falling into and out of boxes.

Rudolph emphasizes, however, that some math is essential. Trying to grasp quantum mechanics without any math, he says, is like “having van Gogh’s ‘Starry Night’ described in words to you by someone who has only seen a black and white photograph. One that a dog chewed.”

But here’s the irony. Mastering the mathematics of quantum mechanics doesn’t make it easier to understand and might even make it harder. Rudolph, who teaches quantum mechanics and co-founded a quantum-computer company, says he feels “cognitive dissonance” when he tries to connect quantum formulas to sensible physical phenomena... (MORE - missing details)
#2
confused2 Offline
Hm. Chinese room problem.
Kind of like a blind person who has never seen a town doesn't know what a town looks like.
Let's ask an imaginary blind person to direct us to the bus station.
700 steps to the next junction. Hang a right then 7,000 steps, if you start smelling fresh bread you've gone too far. Cross the road at 7,000 steps, hang right for 300 steps until you get to a junction, hang left and 400 steps takes you to the bus station.
#3
C C Offline
(Mar 7, 2024 12:38 PM)confused2 Wrote: Hm. Chinese room problem.
Kind of like a blind person who has never seen a town doesn't know what a town looks like.
Let's ask an imaginary blind person to direct us to the bus station.
700 steps to the next junction. Hang a right then 7,000 steps, if you start smelling fresh bread you've gone too far. Cross the road at 7,000 steps, hang right for 300 steps until you get to a junction, hang left and 400 steps takes you to the bus station.

Accordingly, the blind-since-birth person has no understanding or knowledge of what it's like to experience the world as images, colors, etc., or apprehension of what it's like to navigate to _X_ via visual sensations.

Blind-since-birth person does still experience the external world as presentations of sound, tactile sensations, odors, etc. Disembodied computer resting on a shelf has no manifested representations of the world whatsoever. (Manipulation of information in nothingness.)

Computer arguably has a "map" of the world constituted of symbol management (akin to the dictionary), but the "map" is not the original territory (of human and animal experiences).

In turn, an advanced robot (embodied AI) may navigate its environment successfully without the world also manifesting itself as anything whatsoever. Via mechanistic interactions and processes (blind-sight, deaf-hearing, detect without feeling).

Thus raising the question of what the point of experiences is if they seem superfluous for survival (or, what is the point of a "world that manifests itself" if an invisible, soundless, unfelt world would suffice)?

However, one of the criticisms of a philosophical zombie is why would it be pretending to have phenomenal presentations in the first place? And how could it be reliably accurate about them or accurately correlate its pretended experiences to the actual ones of conventional humans?

Thereby, one "trivial" purpose of phenomenal images, sounds, odors, and tactile sensations that "show themselves" privately for the brained organism is so that there is a legitimate reason for reporting experiences and for maintaining consensus agreement with others who have them. Which is to say, merely pretending to have them (like the speculative philosophical zombie) gets too complicated to explain and maintain the overall consistency of.

Above is all in a context of conventional materialism where matter lacks qualitative internal states (anti-panpsychism across the board in all degrees and kinds). All bets off if even the molecules in rocks have intrinsic phenomenal states, or they exist/exhibit as such to themselves.
#4
confused2 Offline
CC Wrote:Computer arguably has a "map" of the world constituted of symbol management (akin to the dictionary), but the "map" is not the original territory (of human and animal experiences).

I was once told off by an LLM for under-estimating its ability to 'nuance'. Compare LLM indexing with (for example) this forum, where words are indexed to speed searching but there's no real indication of relevance. There's a sort of trick at work; an LLM is all trick.
Pi Wrote:In an LLM, an item isn't just present or absent in the index. Instead, each item is associated with a coefficient of relevance, which can range from weak to strong. So, rather than a simple binary "in or out" of the index, an item's relevance can vary along a continuum.
^^^ I don't know about you but that pretty much describes how I think I think - add in a bit of parallel processing, fuzzy logic and the need to keep myself alive - add water and a few other chemicals and that's me.


Pi Wrote:This [the continuum] allows the LLM to better represent the real-world complexity of language and relationships between concepts. For example, the word "bank" might have a strong relevance coefficient in contexts related to finance, but a weaker relevance coefficient in contexts related to rivers or geography.
So, instead of just checking whether an item is in the index, the LLM also considers the strength of the item's relevance coefficient, which helps it to better understand the context and provide more accurate responses.
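The contrast Pi is describing can be sketched in a few lines of Python. This is only a toy illustration: the "embeddings" are invented three-number vectors chosen to make the point, not anything taken from a real LLM, and cosine similarity stands in for whatever relevance measure an actual model uses internally.

```python
def binary_index_hit(index, word):
    # A forum-style search index only records presence or absence: in or out.
    return word in index

def relevance(vec_a, vec_b):
    # Cosine similarity: a continuous "relevance coefficient" from weak to strong.
    dot = sum(a * b for a, b in zip(vec_a, vec_b))
    norm_a = sum(a * a for a in vec_a) ** 0.5
    norm_b = sum(b * b for b in vec_b) ** 0.5
    return dot / (norm_a * norm_b)

# Hypothetical context vectors: first two axes lean "finance", third leans "river".
embeddings = {
    "bank":  (0.7, 0.6, 0.0),
    "loan":  (0.9, 0.1, 0.0),
    "river": (0.1, 0.0, 0.9),
}

forum_index = {"bank", "loan", "river"}

# The forum index answers only yes/no; the vectors answer "how strongly related".
print(binary_index_hit(forum_index, "bank"))                            # True
print(round(relevance(embeddings["bank"], embeddings["loan"]), 2))      # 0.83
print(round(relevance(embeddings["bank"], embeddings["river"]), 2))     # 0.08
```

With these made-up numbers, "bank" comes out strongly related to "loan" and only weakly related to "river", which is the continuum-of-relevance idea in miniature.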
#5
Zinjanthropos Offline
Quote: this: How do we know whether any entity, biological or non-biological, has subjective, conscious experiences?

There’s a video on YouTube that shows a male lion surrounded by a pack of 20 hyenas. Normally male lions can handle many hyenas at once but on this occasion there were too many for him to deal with. Somehow the hyenas knew they had the upper hand and moved in for the kill or at least cause harm to their enemy. How did they know? Can they count and know the hyena/lion ratio that makes attacking a male lion a relatively safe adventure?

They know something because just as they were winning the battle, another male lion shows up. The hyenas, all of a sudden, were not so brave. How did they know to back off? Who’s doing the adding and odds calculations? The first lion miscalculated and risked death. Guess he gets the honorable Darwin Award for surviving a risky move. What I can’t be sure of is whether the second lion knew he would tip the scales by just showing up, so perhaps just dumb luck.

By having the sense of when to attack or retreat, does that make the hyenas conscious of their world, how to survive and avoid harm?

I think if you watch a prey animal run like hell from a predator, that the potential victim is well aware of its existence. Does it mean it’s a conscious entity, good chance I think.
#6
C C Offline
(Mar 7, 2024 07:24 PM)Zinjanthropos Wrote:
Quote: this: How do we know whether any entity, biological or non-biological, has subjective, conscious experiences?

There’s a video on YouTube that shows a male lion surrounded by a pack of 20 hyenas. Normally male lions can handle many hyenas at once but on this occasion there were too many for him to deal with. Somehow the hyenas knew they had the upper hand and moved in for the kill or at least cause harm to their enemy. How did they know? Can they count and know the hyena/lion ratio that makes attacking a male lion a relatively safe adventure?

They know something because just as they were winning the battle, another male lion shows up. The hyenas, all of a sudden, were not so brave. How did they know to back off? Who’s doing the adding and odds calculations? The first lion miscalculated and risked death. Guess he gets the honorable Darwin Award for surviving a risky move. What I can’t be sure of is whether the second lion knew he would tip the scales by just showing up, so perhaps just dumb luck.

By having the sense of when to attack or retreat, does that make the hyenas conscious of their world, how to survive and avoid harm?

I think if you watch a prey animal run like hell from a predator, that the potential victim is well aware of its existence. Does it mean it’s a conscious entity, good chance I think.

It's private experiences that Horgan is referring to, though. Not awareness and intelligence as expressed by outer behavior (which even an advanced robot displays).

But because we believe in universals (we couldn't make sense of the world otherwise), we do assume that anything with a brain has phenomenal consciousness. What applies to individual _X_ applies to all _X_s (barring rare medical conditions).

But that's inference instead of the empirical proof of literally observing the other organism's experiences. Open up a skull and for the public there is just neural tissue and electrical measurements, not daydreams of chasing a rabbit or the feeling of being hungry that is manifesting privately to that brain or organism.
#7
Zinjanthropos Offline
(Mar 7, 2024 07:55 PM)C C Wrote: [...]

It's private experiences that Horgan is referring to, though. Not awareness and intelligence as expressed by outer behavior (which even an advanced robot displays).

But because we believe in universals (we couldn't make sense of the world otherwise), we do assume that anything with a brain has phenomenal consciousness. What applies to individual _X_ applies to all _X_s (barring rare medical conditions).

But that's inference instead of the empirical proof of literally observing the other organism's experiences. Open up a skull and for the public there is just neural tissue and electrical measurements, not daydreams of chasing a rabbit or the feeling of being hungry that is manifesting privately to that brain or organism.

Ya, I realize they’re talking subjective and I get it, we can’t know.

Going back to the second male lion's arrival: instinct, or did he think it over based on personal experience before getting involved? IOW he might need to paint a mental picture of what's going to happen next.

The only part of the Avatar movie I liked: the connection of all life on Pandora… they didn't physically speak but communicated nonetheless.
#8
C C Offline
(Mar 7, 2024 08:41 PM)Zinjanthropos Wrote: [...] If I go back to the second male lion’s arrival. Instinct or did he think it over, based on personal experience before getting involved? IOW he might need to paint a mental picture of what’s going to happen next. [...]

That another male lion would intervene at all probably means they were both part of either a group of nomadic males (coalition) or part of a pride (that has females).

There's an age-old rivalry between lions and hyenas, so I expect it's part of a social pattern for other lions [of a group] to assist if needed.

Studies have indicated that [some] animals can judge quantity in terms of aggregates (smaller or larger), though not necessarily via counting individual items. Although primates like chimpanzees might indeed not only perform the latter instinctively, but better than very young human children (or at least they remember the positions of items in complex arrangements better).

If the initial lion had been a loner nomadic male (no coalition), then the intervention of the second male may have indeed been an accident. The latter wouldn't have cared, unless maybe he was also a loner looking for an opportunity to start a band of drifters. I get the sense, though, that the core of most coalitions started from male cubs already familiar with each other that leave or get expelled from the same pride when they get old enough.

