Author Topic: The "hard problem of consciousness"  (Read 48450 times)

0 Members and 1 Guest are viewing this topic.

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
I will try to explain my Absurdio ad Equus case later. I think I'll have to be more formal in that one.

Now. You see, when you proclaim that every "Fork" of Consciousness is just as valid as your own, you are necessarily objectifying yourself. You're saying that your own Subjective experience is meaningless, or at least irrelevant; what matters is that an equivalent one exists "anywhere" in the Universe. From an algebraic point of view, that is true. From a moral "consequentialist" point of view, that might even be a good thing. But the question was not posed in this "uncaring" "objective" manner. We might be "objects" from a scientific point of view, but that's not necessarily true in a complete sense; it's only true by assumption: Science only deals with objects; science can only deal with objects. To then proclaim we are "just as much objects as anything else" is not really a "scientific conclusion", but something that springs from its very own design. And if you only have a hammer, then everything looks like a nail.

My problem is compounded here by the fact that you're not just "treating the brain as a physical system", you are treating it as a computer, and consciousness as a software program. There are two ways to tackle this opinion.

First, the philosophical / metaphysical one. Basically, your viewpoint necessarily requires that we adopt a worldview wherein our consciousness is actually an illusion, that this "permanence of being" is a lie that our brain tells us, and that the only thing that exists is a very sudden "Ourselves" in the "Very Present" at all times; and that if you die and something else sufficiently similar to you appears elsewhere, then that's fine, because that person will exist by "HimSelf" in the "Very Present" at all times. We are continuously dying and being born at every second. Metaphorically speaking.

You need something like this in order to believe that it's ok to kill yourself and be "cloned" at the far end of the planet. Let's bear in mind that this looks like redefining the word "Death" into something so vulgar and present in everyday life that we come to allow ourselves to Die in the teleportation process (newspeak? Just putting that out there). Now, regardless, we could accept this. You can believe that, but you cannot confuse this metaphysics with science. This is a definition of what human beings are and what consciousness is in absolutist terms, down to details that you cannot possibly know, as far as we can tell.

And that brings me to my second tackle. My problem here is that you are reducing the problem of consciousness to arbitrary conceptual building blocks that might have nothing to do with how a comprehensive brain science model, still to be made, will describe it. Not just "might", but almost "assuredly so". I say this, and you retort that it seems to be "you're wrong because you aren't right"; it's a lot deeper and simpler than that:

Science is conservative. This means that reality is inherently difficult to parse and all ideas we have about it are most assuredly wrong. Some wronger than others, but it's fairly certain that many paradigms are yet to be found, and many new "metaphors", much more productive, efficient and explanatory, are yet to be written into textbooks. But until that time comes, the immense number of possible metaphors, and the ways we use them, are wrong. Why do I say this? Because there is no engineering level of rigor in testing any of these criteria: how each lever works, what kind of influence patterns have, what it means to transfer a bit or a byte of consciousness, etc. This utter void means that all these metaphors are almost certainly wrong.

Now, we could say, "well, ok Luis, you're conservative, but this is the best I have, so it's what I'll go with, OK?" Fine. But here's the problem: you're going well ahead with your ideas and drawing far-reaching conclusions without any known method of testing them. And not only that, you're so certain of them that you're even willing to die for them.

I find that over the top. You are indeed an atheist, but one filled with a lot of faith.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
We are all filled with faith that we won't erupt in fractal jets of flesh that bind us together into a screaming world-meat, because we have a model of the world that tells us this is cosmically unlikely.

So too with subjectivity. Wherever you have the object, you have the subject: all other explanations so far fall into the category of 'not even in contention.'

This is metaphysics only as far as it accepts physics as a complete explanation for the universe.

Your talk of existing only in the moment is correct, but you're 180 degrees off on death: we do not die every moment, we live. Our brain state propagates forward in time. Death is the failure of this feed-forward and the irretrievable loss of information, which is not a computer term but a basic tenet of physics since Shannon.

As our ability to retrieve information improves, fewer and fewer states qualify as 'dead'.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
Very well, I think we are not progressing here into a consensus, but I think I'm at least learning what the differences between our viewpoints are.

Ok, let me make another scenario, let's see if this expresses better than all the others I made.

Imagine that you are in a room. There's a scanning machine and a person in there. The person tells you that he's going to scan your entire body. You let him. After doing so, he tells you that a new "You" has been born in a room identical to this one, but on the other side of a huge wall, atomically identical to you. You find this acceptable, but still curious.

Then he informs you that you are about to be killed in a non-painful way. You seem confused, and so he explains to you that this is just a means of getting you through this thick wall, which was what you wanted in the first place.

"Wait a minute", you say, "but I don't see how this is making me go through this Wall. The only thing that is happening here is that a copy of me was made beyond the wall, but I'm still here, and apparently, I'm not going anywhere, I'm merely about to die!"

"No, you got it all wrong", he replies, "because what makes you "You" is just patterns. And we have transferred these patterns to the other side of the Wall. And since "you" = "patterns", then we have sucessfully transferred you to the other side of the wall. To complete the contract however, we must kill this instance of "You", for then, you see, mathematically speaking we would have had only kept half of our compromise".

"Wait, what? No, you don't have to k..." ZAP.


This, basically, is what you find acceptable in teleportation.

 

Offline watsisname

Re: The "hard problem of consciousness"
This is not an acceptable fork, because there is divergence between the forks after the scan but before one is terminated.  One fork is given information that the other is not.  This is not equivalent to scenarios 1-3 given earlier.
In my world of sleepers, everything will be erased.
I'll be your religion, your only endless ideal.
Slowly we crawl in the dark.
Swallowed by the seductive night.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Yep, watsisname is correct. You need to examine this scenario rigorously, from all the 'yous' involved:

From the pre-teleport person's perspective, they only know that their body's going to be scanned. If they knew the actual terms of the deal, they would say 'wait, wait, if we do this, one of my causal descendants is going to die! They'll diverge and then terminate irretrievably! Sure, the other fork will survive, but I don't want my child subjectivity to experience that!'

From Fork A's perspective, on the far side of the wall, they've suddenly jumped into an identical room but without the presence of the scan operator. Weird!

From Fork B's perspective, they have been given a body scan, and now suddenly they're going to be murdered! They are causally divergent from Fork A, and their brainstate is about to be eradicated. It will not feed forward through ordinary causality or through a teleporter. It's just done, gone, leaving.
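The fork accounting above can be put in a toy sketch. Everything here (the `Fork` class, the `scan` function, the memory lists) is my own illustration of the argument, not anything proposed in the thread:

```python
from dataclasses import dataclass

@dataclass
class Fork:
    """Toy model of one causal descendant of the scanned person."""
    memories: list

def scan(original):
    # The scan yields two atomically identical forks: same state, copied.
    return (Fork(list(original.memories)), Fork(list(original.memories)))

person = Fork(["walked into the room", "agreed to the scan"])
fork_a, fork_b = scan(person)
assert fork_a.memories == fork_b.memories  # identical at the instant of the scan

# Divergence: only Fork B learns the real terms of the deal.
fork_b.memories.append("was told: 'you are about to be killed'")
assert fork_a.memories != fork_b.memories  # now two distinct subjects
```

The point of the sketch is the last assertion: once the forks receive different information, terminating one is not equivalent to terminating neither.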

 
Re: The "hard problem of consciousness"
Just 'well subjectivity might work differently than the entire rest of the cosmos, for reasons we cannot begin to guess and which fall outside physicalism in some unanticipated way.'

Consciousness is epistemologically primary. It's the only thing we can be certain of; "everything else" may only exist as a constituent of consciousness. This decisively sets consciousness apart from "everything else", at the deepest possible level.
No invocation of computers or engineering required. All we need to do is to point to the constraints that circumscribe all knowledge of the universe: the brain is physical, so consciousness must be physical. We only have one viable, coherent model with predictive power, placing consciousness not at an epistemologically primary stage but far down at the end of the train. Our own subjectivity is explained by this model.

None of this refutes the fact that consciousness is unique. If your objection is that dualist models of consciousness have no additional predictive power, then I agree - but this objection also applies to all of metaphysics and philosophy. There's nothing wrong with restricting oneself to topics of practical value, but it's the scientist's mindset, not the philosopher's mindset.

 
Re: The "hard problem of consciousness"
Some materialists believe that with science, there is no more need for philosophy. I believe that they are very naive, and I don't see anyone expressing that opinion here.
What Battuta is saying for example is certainly half scientific, half philosophical: scientific in the description of consciousness, and philosophical in its valuation.


Also, @Luis, if you don't mind, I'll restate my previous question: you seem to be fine with the entire human body being perfectly copyable - brain included - but consciousness is the one thing you're skeptical about. I think this exceptionalism is what people aren't understanding here.
The lyf so short, the craft so long to lerne.

 
Re: The "hard problem of consciousness"
Quote
Some materialists believe that with science, there is no more need for philosophy. I believe that they are very naive, and I don't see anyone expressing that opinion here.

Considering that science itself derives from philosophy (it was called natural philosophy in the past for good reason), this would be a hard thing to accomplish anyhow.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
I am indeed examining all of this very rigorously, "from all the yous involved", but I'm about to go out, so I will develop the idea later. I'll just leave this hint if you want to guess where I'm going with it: a charge of murder does not require you to prove that the victim was aware they would die, or that they suffered in any way, or that they had "new memories". Killing people in their sleep is still murder, for example. But I'll be more specific, and, as you correctly put it, "very rigorous" later on.

@Meneldil, nowhere in anything I wrote did I say that Consciousness is not copyable. That idea is completely orthogonal to my concerns.

There is a joke about how Consciousness is indeed something different from other "body parts", although it does not clarify the discussion I was having one bit (it's still funny though): we are all perfectly fine with having several of our organs transplanted, substituted by new ones. I will guess that you would mind having a brain transplant.
« Last Edit: October 17, 2015, 09:58:45 am by Luis Dias »

 
Re: The "hard problem of consciousness"
Ah, I think I see your point. Suppose Bob1 walks into the transporter, and as he's being scanned, Bob2 is constructed elsewhere.

Now consider the moment immediately after the scan. Bob1 and Bob2 are distinct individuals who happen to be physically identical. If Bob1 had the time to think about his situation, then (as Battuta says) he would want to live; the existence of Bob2 would be cold comfort. Thus, giving Bob1 the opportunity to fear death would be unethical.

But annihilating Bob1 would also be unethical, since it would amount to killing Bob1 without his consent, regardless of whether or not he felt fear. So the "annihilation transporter" is unethical.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
The annihilation transporter is indistinguishable from everyday life.

 
Re: The "hard problem of consciousness"
That may be true when you compare the initial stage and the final stage, but the transporter murders someone at an intermediate stage.

Would it be ethical to blink someone into existence, then blink them out of existence? In the end, nothing changes, but I would say that creating life and then immediately terminating it is still unethical.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
The transporter is indistinguishable from everyday life at all stages. We are constantly spawning 'intermediate stages' and exterminating them an instant later.

Death is loss of information. If no information is lost, no death.

 
Re: The "hard problem of consciousness"
"Loss of information" seems a very weak and vague definition of death. Information is lost all the time. Yesterday, I took apart my little brother's Lego construction.

... I feel like I'm grappling with an alien mindset. (And I mean that respectfully.)

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
The Lego construct died: it cannot feed its state forward. What other sensible definition of death exists? You're trying to apply a human word to a world that doesn't recognize objects, only particles and forces.

Here, maybe this will help clarify. Would you volunteer for a machine that could roll your whole brain back one second? Would you be afraid of it? Would it kill you?

 
Re: The "hard problem of consciousness"
OK, saying that the Lego construct died is reasonable. But in ethics, you have to apply human concepts at some point. Do you just take "human lives are worth more than Lego lives" as an axiom?

I wouldn't volunteer for any machine that changes my brain. From a philosophical standpoint, however, I doubt it would kill me.

 
Re: The "hard problem of consciousness"
Lemme see if I've got a grip on this.

You exist in 4 dimensions: Time, X, Y, Z.

Simply by breathing or by moving around you move along those coördinates. You can never have two instances of yourself in the same spot, but thanks to time moving forward such collisions don't happen - the instance of you that was sitting in the chair at T-1 does not exist at T - otherwise you'd be very uncomfortable right now. You are constantly being moved forward without perceiving it as such, previous instances of you disappearing into the void to make room for you. An instance of you that moves to the toilet and back will not encounter itself sitting in the chair, because that instance of you is from T-100 and thus no longer exists in your time dimension. Since humans cannot travel along time except at a rate of 1 to 0 (depending on how closely you approach the speed of light), the previous and future instances of you effectively cease to exist (although it's fairer to say that they never existed in the first place; you simply copied yourself into a new area).

When you look at yourself from a (T, X, Y, Z) perspective, there is no difference between you walking around (T+1, X+1, Y+1, Z) and you being teleported (T+1, X+1000, Y+1000, Z+10000). Therefore, Occam's razor says that all the other things are also the same, unless there is evidence to the contrary.
« Last Edit: October 17, 2015, 03:08:33 pm by -Joshua- »
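Joshua's argument can be sketched directly: walking and teleporting are both translations of the same pattern through (T, X, Y, Z). The tuple representation below is my own illustrative choice, not anything from the thread:

```python
def translate(instance, dt, dx, dy, dz):
    """Move an instance along (T, X, Y, Z) without touching its pattern."""
    pattern, (t, x, y, z) = instance
    return (pattern, (t + dt, x + dx, y + dy, z + dz))

you = ("brain-pattern", (0, 0, 0, 0))
walked = translate(you, 1, 1, 1, 0)                # ordinary movement
teleported = translate(you, 1, 1000, 1000, 10000)  # teleportation

# In both cases the pattern is identical; only the coordinates differ.
assert walked[0] == you[0] and teleported[0] == you[0]
assert walked[1] != teleported[1]
```

On this picture, the only thing distinguishing the two cases is the size of the coordinate jump, which is exactly the Occam's-razor point being made.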

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Quote
OK, saying that the Lego construct died is reasonable. But in ethics, you have to apply human concepts at some point. Do you just take "human lives are worth more than Lego lives" as an axiom?

I wouldn't volunteer for any machine that changes my brain. From a philosophical standpoint, however, I doubt it would kill me.

Yet you allow your brain to change every day. You go to sleep or get anesthetized and trust that the matter of your brain will recreate your subjectivity. What's the difference?

If you got amnesia and lost a day, would you die and be replaced by a clone?

If your brain is injured and the lost areas are rebuilt exactly as they'd been, do you die?

Joshua: I think you are basically getting at the fact that there is no nuclear, single, immutable 'I'. We just slap a label on a loosely coherent and causally bound system and say it's us until it loses the ability to copy itself forward on its own power.

 
Re: The "hard problem of consciousness"
Quote
Yet you allow your brain to change every day. You go to sleep or get anesthetized and trust that the matter of your brain will recreate your subjectivity. What's the difference?

As Luis said, one difference is that I have no choice about day-to-day living. I do have a choice about using transporters or brain-altering machines. Not knowing exactly how consciousness is preserved or destroyed, I wouldn't touch them with a ten-foot pole.

For example, consciousness may be a product of continuity through time and space. This matches my intuition that "I" am not merely the configuration of my atoms at this particular moment, but the accumulation of all my previous configurations. Teleportation or brain reconstruction would disrupt the continuity and instantiate a new consciousness.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
That argument was debunked in the last teleportation thread. Continuity is basically a red herring - once you look at it you realize you're actually saying 'what matters is that my past mindstates influence my current mindstate,' which is exactly what the teleporter preserves.

Remember, in physicalism, saying 'I am the sum of my past configurations' is exactly the same as saying 'there is a causal connection between my past brainstates and my current one.' All that matters is Shannon information.

The 'choice' argument is flimsy. You think day-to-day life is as dangerous as teleporters, yet you choose not to use the teleporter because you're helplessly resigned to constantly undergoing the same process?

Let me restate how important it is to get past the continuity fallacy. Unless you are a dualist, the past has no influence on the present except through the information transmitted forward by causal rules. That info is all stored in the brain at any given moment, in the physical meat.
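The 'death is loss of information' claim amounts to the difference between a reversible and an irreversible map: a state survives if the present still determines the past. A toy sketch, with `zlib` compression standing in for any lossless causal step (the analogy is mine, not anything stated in the thread):

```python
import zlib

def lossless_step(state: bytes) -> bytes:
    # A reversible step: the past is fully recoverable from the present.
    return zlib.compress(state)

def lossy_step(state: bytes) -> bytes:
    # An irreversible step: many distinct pasts collapse to one present.
    return bytes([sum(state) % 256])

mind = b"the accumulation of all my previous configurations"

survived = lossless_step(mind)
assert zlib.decompress(survived) == mind  # no information lost: no 'death'

destroyed = lossy_step(mind)
assert len(destroyed) == 1  # pigeonhole: the original cannot be recovered
```

The last line is the point: once many possible pasts map to the same present, no amount of cleverness retrieves the original state, which is what the irretrievable-loss definition of death is meant to capture.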