Author Topic: The "hard problem of consciousness"  (Read 48659 times)


Offline Scotty

  • 1.21 gigawatts!
  • 211
  • Guns, guns, guns.
Re: The "hard problem of consciousness"
Luis, would it make more sense and/or make you feel better if you instead thought of the teleporter as killing you, and then, an almost immeasurable amount of time later, bringing you back to life in another location? I feel like there's a perceived interruption of the continuity of thought here, when no such interruption actually takes place.

 
Re: The "hard problem of consciousness"
On a slightly different note, anyone played SOMA? I loved it. It's sometimes billed as a game about consciousness, and teleporter stuff comes up a lot.

 

Offline Bobboau

  • Just a MODern kinda guy
    Just MODerately cool
    And MODest too
  • 213
Re: The "hard problem of consciousness"
well, for some of the modes of operation, in a way it's more like the teleporter is reviving you somewhere else before killing you.

but I guess if these sorts of things bug you, then you've got what the intergalactic call a very planetary mindset.

 
Re: The "hard problem of consciousness"
On a slightly different note, anyone played SOMA? I loved it. It's sometimes billed as a game about consciousness, and teleporter stuff comes up a lot.
It's on my to play list, esp. after reading this. Spoilers though!

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
Luis, would it make more sense and/or make you feel better if you instead thought of the teleporter as killing you, and then, an almost immeasurable amount of time later, bringing you back to life in another location? I feel like there's a perceived interruption of the continuity of thought here, when no such interruption actually takes place.

That's not the alternative scenario we are dealing with here; that's the scenario I'm describing.

Of course I would not have a problem with it if it worked exactly like waking up from sleep, as a complete continuation of my consciousness. My problem is that I have no way to decide beforehand whether this is the case or not.

The cloning problem only makes this clearer. Scotty, imagine the following scenario, which is permitted by the kind of philosophical teleportation device we've been dealing with here:

Person A is scanned and killed. All required information goes to a place 1000 km away, and the person is rebuilt there. Person A wakes up sensing a total continuation of his Consciousness. Everything happened as you described in your question.

Now let's add a variable. Person A is scanned and killed. All required information goes to 1000 places, 1000 km apart. One thousand people "A" wake up sensing a total continuation of their Consciousnesses.

So my question is: which one of them is you? Did your Consciousness really transfer itself to another place without significant interruption, or was it merely copied?

Clearly, it was copied. And a thousand new Consciousnesses were "born". But if they were copied, they were not "transferred". Which means that in the first scenario, what happened was a copy mechanism, not a transfer mechanism.

In Battuta's philosophy, these are one and the same: the two words describe the same phenomenon; they are de facto synonyms. And he insists that everything we know attests to this fact. But I submit that we don't, in fact, know this. We can speculate, we can guess, we can write interesting scenarios to test all these ideas. We can bring up the Ship of Theseus and all sorts of stuff. This is all legitimate.

What I do not find legitimate is to claim that we know Consciousness works exactly as it is described here, which is what this teleportation device would require in order to work.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
How else would it work? What causal mechanism could support another account? There don't seem to be any grounds for deviation.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
The burden of proof for the claim that you can kill people and rebuild them, and that it will all work out with "continuity" for the original person, is not on me, Battuta. I'm sure you are really, really sure of your ideas, but the stakes are a tad too high IMHO.

Regardless, I haven't read it yet (my eyes are very tired today), but this paper by Dennett seems to go over the same issues we've been dealing with; it might help disentangle some of the concepts, ideas, or semantics we may be misusing. It's a short lecture, and it sounds like a good one: http://www.lehigh.edu/~mhb0/Dennett-WhereAmI.pdf

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
I don't actually think much of continuity - it's illusory and retrospective.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
Continuity seems to me to be the core basis of every one of our concerns for our own future being. If "continuity" is false, why should my consciousness "care" about my future self at all? Why should I sacrifice one second of my life, or even plan for a future state of being, if there is no such thing as "Continuity"? Perhaps we are using different meanings for the same words.

 

Offline Bobboau

  • Just a MODern kinda guy
    Just MODerately cool
    And MODest too
  • 213
Re: The "hard problem of consciousness"
saying it's illusory is meaningless if you consider that consciousness is in large part the perception itself.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
Let's be careful in the discussion. What is being said to be illusory is *not* Consciousness, but the "Continuity" of Consciousness. Consciousness is a brute fact; here I wholeheartedly agree with GhylTarvoke (the Cotard delusion question notwithstanding, which does indicate I should ponder the technicalities of this issue a bit more). It is obviously ridiculous to say that the entire subjective experience is an "illusion", for that already assumes that "someone" is being "deluded" - it's a contradiction.

When people say that (say) the Self, or the "Continuity" of the Self, or whatever other attribute of Consciousness, is an illusion, what they mean is that while the first-order experience is not an illusion, some or all of these attributes are: constructs built by the brain, and so on.

 

Offline AtomicClucker

  • 28
  • Runnin' from Trebs
Re: The "hard problem of consciousness"
Consciousness is physical; the concept of self isn't. The self is a byproduct of consciousness and a form of a priori knowledge, but I'd argue it's both illusory and metaphysical, and at the same time "synthetic" once one realizes the self is "true."

Even if you peg personhood to purely physical, systemic definitions, the knowledge of self is still a metaphysical transfer of knowledge from one clone to the next.

Since I'm treating the self as a form of knowledge, even if your "clone" at the starting point is killed and rebuilt at the other end, the "self" continues to persist from clone to clone - perhaps coming into jeopardy if more than one clone were living, or if one knew that another was alive. However, I maintain that the "self" is illusory but, as a form of knowledge, isn't bound by physical restraints.

Case in point: 1 + 1 = 2 regardless of time and place. It's math, and it's metaphysical. In my piss-poor argument, as long as the clone persists in a singular fashion, that transfer of knowledge of the self persists, held as a self-evident truth.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
We had already clarified the algebraic nature of that type of definition of Self.

 
Re: The "hard problem of consciousness"
I'd like to revive this discussion! For the past month, I've been turning an argument over and over in my head, and I think I'm now ready to present it.

I format the argument as a monologue, a la Descartes; if you like, you can imagine yourself saying the same things. I occasionally make "notes to the reader", like so:  [Here's a note.]

Part 1: ConsciousnessMe


Ideally, I'd begin by defining consciousness - but as I've seen, this is tricky. I instead begin with some simpler definitions, given from my perspective.

----------
Definition: We say that something certainly exists (in actuality, not merely as an abstract notion) if it's impossible for it not to exist.
----------

For example, the light I'm seeing right now certainly exists, because it's a fundamental part of my current experience. Even if I'm living in a digital simulation, the light certainly exists within the simulation. One could say that it doesn't even matter if I'm living in a simulation; for me, the light still exists in every way that matters. Similarly, if I'm dreaming, the light still exists as part of my dream.

On the other hand, Pluto does not certainly exist, because I've never seen it. Maybe everyone's been lying to me about its existence, or maybe an even greater deception is going on.

----------
Definition: ConsciousnessMe (i.e. "my consciousness") is the collection C that satisfies the following criteria:
  • For any object O that certainly exists, O is a constituent of C.
  • C certainly exists.
----------

There are three possible concerns with this definition:
  • There might not be any collection that satisfies the criteria. If so, I'm not actually defining anything.
  • There might be more than one collection that satisfies the criteria. If so, referring to "the collection C" is incorrect.
  • Even if my definition makes sense, it might not mean what I want it to mean.
The first two concerns are relatively easy to address. First of all, something certainly exists.  [If you truly believe it's possible that nothing exists, then I can't help you.]  Let C be the collection of all objects that certainly exist. Then by construction, C satisfies both criteria (which addresses the first concern), and C is unique (which addresses the second).
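
If it helps, the construction can be sketched symbolically. This is only a restatement of the paragraph above in LaTeX notation, treating "certainly exists" as a predicate E over objects (E is just shorthand I'm introducing here, not something defined earlier):

----------
% A minimal sketch, assuming "certainly exists" behaves as a predicate E.
% C is the collection of all objects that certainly exist:
\[ C \;:=\; \{\, O \;\mid\; E(O) \,\} \]
% Criterion 1 holds by construction: every certainly-existing O is a constituent of C.
\[ \forall O \,\bigl( E(O) \Rightarrow O \in C \bigr) \]
% Criterion 2 is the separate claim that C itself certainly exists:
\[ E(C) \]
----------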

Finally, I think ConsciousnessMe is exactly what I mean by "my consciousness": it's the accumulation of everything in my awareness. Furthermore, it agrees with almost every quasi-definition that I've seen - e.g. "my subjective experience" or "what-it-is-like to be me" - because almost everybody who talks about consciousness agrees that its existence is a brute fact.  [This includes General Battuta, but excludes people who deny the phenomenon entirely.]  Hence my definition is sound.

I now observe that scientific theories play the same role in ConsciousnessMe that they play in the "real world" (which, unlike ConsciousnessMe, might not exist). ConsciousnessMe appears to obey certain laws. For example, if "I" "let go of" "an apple", "the apple" "falls" (where scare quotes indicate that everything takes place within ConsciousnessMe). Science thus allows me to make predictions with great accuracy. Incidentally, it also predicts the existence of Pluto.

One last definition:

----------
Definition: The constituents of ConsciousnessMe are QualiaMe.
----------

Two rough examples of QualiaMe: (my perception of) "the quality of deep blue", and (my perception of) "the sensation of middle C". My computer, as a combination of many different QualiaMe, is more complex.

I'll be back with the punchline soon.

 
Re: The "hard problem of consciousness"

Part 2: ConsciousnessBob


When I try to generalize my definition of ConsciousnessMe, I encounter some semantic difficulties. There are certain combinations of QualiaMe that I refer to as "people". By definition, they certainly exist - but I can't be sure that they exist (or anything exists) in a "real" sense, i.e. in a sense that's independent of ConsciousnessMe. The entire universe might actually be ConsciousnessMe. Now, this would be a bizarre and inelegant state of affairs, and I'd be supremely egocentric to take the possibility seriously - but nothing precludes it. With that in mind, I'll refer to people freely, without using quotation marks.

One important property of QualiaMe is that they appear to have a "locus", or point of view; they appear to center on a particular person.  [In my case, the person happens to be a twenty-year-old student.]  This property allows me to hypothesize the existence of QualiaMe-like objects. Conveniently, there's a person standing in my room right now. His name is Bob. I define QualiaBob by way of analogy:

----------
Definition: QualiaBob are the objects that resemble QualiaMe, but have Bob as their locus instead of me.
----------

This definition is completely natural; to go from QualiaMe to QualiaBob, I simply change the subscript. But it's not clear that I've actually defined anything, for the following reason. Since QualiaMe and QualiaBob have different loci, they can't be the same things. Thus, by my definition of QualiaMe, I can't be sure that QualiaBob exist.

Nevertheless, the abstract notion of QualiaBob isn't hard to grasp. I just "put myself in Bob's shoes".  [Empathetic human beings do it all the time.]  More than that, I firmly believe QualiaBob exist, and most people appear to share this belief. As before, the alternative is bizarre, inelegant, and egocentric.

Finally, my definition of ConsciousnessBob should be obvious:

----------
Definition: ConsciousnessBob is the collection of QualiaBob.
----------

Remark: At this point, I can define Consciousness (without a subscript) to be the collection of all QualiaL, where L is a locus, and I can define Qualia to be the constituents of Consciousness. But I won't be needing these definitions for my argument.
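
In symbols, and only as a restatement of the remark above, this amounts to something like the following (with L ranging over loci):

----------
% Consciousness (no subscript) collects the QualiaL of every locus L.
\[ \mathrm{Consciousness} \;:=\; \bigcup_{L} \mathrm{Qualia}_L \]
----------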

In the third and final part, I'll take another look at Bob, who has agreed to stay in my room.

 
Re: The "hard problem of consciousness"

Part 3: The Argument


With the basic definitions taken care of, there are at least two possible models of reality. The first one is more natural - in fact, it's usually taken for granted - but both models completely account for my observations. Additionally, both models incorporate science in its entirety.

First Model: I'm not special. QualiaBob exist; Bob exists in the same way that I do, and is conscious in the same way that I am. Bob and I both live in an external universe. Scientific theories explain not only the behavior of QualiaMe, but also the behavior of the external objects that QualiaMe represent.

Second Model: I'm special. QualiaBob don't exist; Bob only exists as a combination of QualiaMe. The universe is ConsciousnessMe. Scientific theories merely explain the behavior of QualiaMe.

Okay. So what?

Here's the point. Although the models have differing perspectives, they both incorporate science in its entirety: for every scientific argument that goes through in one model, the same argument also goes through in the other model. (To reiterate my example from Part 1, science predicts the existence of Pluto in both models; the models only disagree on the metaphysical issue of Pluto's "true nature".) Furthermore, both models are sound: they account for all of my observations, and lead to no logical contradictions.  [Unless our current understanding of science is self-contradictory.]  But ConsciousnessBob exists in the first model, whereas ConsciousnessBob does not exist in the second model. Thus, ConsciousnessBob (unlike virtually everything else, including Pluto) is logically independent of science, in the sense that science says nothing about its existence. Even if I assume the inviolate truth of science, I can't conclude anything about the existence of ConsciousnessBob.

This is not the same as saying that science and ConsciousnessBob are unrelated! Based on my own experience, there's a strict correspondence between ConsciousnessMe and (if it exists) the external universe.  [As General Battuta writes: wherever the object, the subject.]  It's reasonable to hypothesize a similar correspondence between ConsciousnessBob (if it exists) and the external universe (if it exists). My argument is that science and ConsciousnessBob are logically independent, which undermines every scientific attempt to "explain" consciousness.
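
For what it's worth, the independence claim can be written in standard logical notation. This is only a restatement of the paragraph above, with S standing for the whole body of scientific theory and B for the proposition "ConsciousnessBob exists" (both letters are shorthand I'm introducing here):

----------
% Science neither proves nor refutes B; that is the sense in which
% ConsciousnessBob is logically independent of science.
\[ S \nvdash B \qquad\text{and}\qquad S \nvdash \neg B \]
----------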


 
Re: The "hard problem of consciousness"
Addendum: Science is the branch of knowledge dealing with testable predictions, and only observations are testable. So when I say (in either model) that science predicts Pluto's existence, I really mean that science predicts observations of Pluto's existence. The assertion that Pluto "really exists" isn't testable.

 

Offline watsisname

Re: The "hard problem of consciousness"
My argument is that science and ConsciousnessBob are logically independent, which undermines every scientific attempt to "explain" consciousness.

I can make a model of stars wherein they are lights shone through holes in a black tarp by very clever demons.  This undermines every scientific attempt to explain stars.

 
Re: The "hard problem of consciousness"
Stars may only exist as qualia, or as part of a simulated reality, or as illusions created by very clever demons. Nevertheless, they exist - and as far as we know, science can explain every facet of their behavior.

 

Offline watsisname

Re: The "hard problem of consciousness"
And is the clever tarp demon model potentially falsifiable with current abilities?  What is its motivation?  How does it fit into the framework of the rest of our understanding of nature?

This is what I and a few others have been trying to help you to understand.  You have repeatedly said that dualism and physicalism are both models of reality and they make the same set of predictions.  But I really don't think you understand what the predictive power of a model means, or why we should want to favor one model over another even if they both are consistent with observations.