Author Topic: The "hard problem of consciousness"  (Read 48468 times)


Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
The 'acceptable delta' (in answer to both above posts) is the preservation of information, namely retaining the ability of a brainstate to propagate itself forward by causal rules.

Sure, it has to survive the process, but that is a small, consensual detail. What is more important is that this criterion fails to explain why an output like a "horse" would be unacceptable. If the only criterion is survival of the end product, we could even substitute "you" with a copy of anyone else.

Why would that be a problem at all?

The problem here is that we instinctively understand this is wrong because it's "not us, it's someone else". But according to you this should not be a problem: my brain states are always changing anyway, and since you define a self as being that "brain state", I'm constantly ceasing to be myself anyway, so it's not really different at all.

There are crude paradoxes here that are easy to recognize when you realise they all stem from how science regards everything as an object and how that starts to crack when it deals with subjectivity itself.





Quote
Vaporizing your brain introduces entropy into the system, destroying it: the brainstate cannot copy itself forward without becoming very lossy. Teleporting your brain may vaporize it, but no information is lost: the brainstate propagates through the teleporter.

Ok, no "information" is lost, but were you killed or not? Will the you-you, your own experience of yourself, live in that body or not?

This question seems of the utmost importance but it also seems unanswerable.

 

Offline Droid803

  • Trusted poster of legit stuff
  • 213
  • /人 ◕ ‿‿ ◕ 人\ Do you want to be a Magical Girl?
Re: The "hard problem of consciousness"
If no "information" is lost does it matter? All evidence indicates that we consist of nothing more than this "information", after all. Isn't that all there is to this?
You may have been killed, but you did not die.
(´・ω・`)

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
The 'acceptable delta' (in answer to both above posts) is the preservation of information, namely retaining the ability of a brainstate to propagate itself forward by causal rules.

Sure, it has to survive the process, but that is a small, consensual detail. What is more important is that this criterion fails to explain why an output like a "horse" would be unacceptable. If the only criterion is survival of the end product, we could even substitute "you" with a copy of anyone else.

By what sensible causal rule could a person's brain spontaneously become a horse's? You're trying to argue that a catastrophic failure is somehow equivalent.

The teleporter is safe because it allows your brain state to propagate forward by its ordinary causal rules. It does not change or distort the information present. It is wholly unlike the examples you're floating.

You become some other you by the introduction of stimuli which are processed according to the brain's logic.

Death is trivially defined: irrecoverable loss of information. If the brain state isn't lost and can keep firing itself forward on its own power, you're not dead.

 

Offline Scotty

  • 1.21 gigawatts!
  • 211
  • Guns, guns, guns.
Re: The "hard problem of consciousness"
Luis, it's important to realize that the end product Battuta refers to isn't "the consciousness of a human being".  It's "You".  You yourself.  An arbitrary human being that undergoes the process and remains the same arbitrary human being, not another human being.  If the end result were not identical in all ways, including thought and memory, it would not be a real teleportation.  And if it is identical in all ways, including thought and memory, then "You" are still alive, and arguably safer than at any single other instance of your entire life.

 

Offline watsisname

Re: The "hard problem of consciousness"
In other words, from the Comic in the OP:


Can you elaborate on what you do not understand?

Oh yeah, that's clarified now. I had misinterpreted "descendant fork" in Batt's post to refer to the fork constructed from the transmitted data.  Which of course then led to confusion, since why would that fork experience agony from the other fork being killed by the transmission process, after the data was scanned?  The correct interpretation is probably pretty obvious, but I was very tired when I was reading.

So yeah, I'd be fine with #1-3, but find #4 to be unacceptable.

I apologize for harping on this, but I'm still confused. How perfect does the copy need to be to allow your brainstate to propagate, and how do teleportation imperfections differ from day-to-day stimuli?

The human brain can suffer some pretty severe traumas without breaking the continuity of the "self", (though the trauma may dramatically change the character of the person).  Or, we might say that the continuity of the self is an illusion, produced by the pattern of information propagating forward in time.  This pattern changes with every stimulus.  As long as the pattern maintains some essential history of your world line, your "memory" and "thoughts", you are still "you".

The hypothetical teleporter here is assumed to scan and reconstruct the particles of your body (whether by using the very same particles or not) flawlessly.  This is why some are arguing that the transporter is safer than ordinary day-to-day life.
In my world of sleepers, everything will be erased.
I'll be your religion, your only endless ideal.
Slowly we crawl in the dark.
Swallowed by the seductive night.

 
Re: The "hard problem of consciousness"
If the end result were not identical in all ways, including thought and memory, it would not be a real teleportation.  And if it is identical in all ways, including thought and memory, then "You" are still alive, and arguably safer than at any single other instance of your entire life.
I apologize for harping on this, but I'm still confused. How perfect does the copy need to be to allow your brainstate to propagate, and how do teleportation imperfections differ from day-to-day stimuli?

The human brain can suffer some pretty severe traumas without breaking the continuity of the "self", (though the trauma may dramatically change the character of the person).  Or, we might say that the continuity of the self is an illusion, produced by the pattern of information propagating forward in time.  This pattern changes with every stimulus.  As long as the pattern maintains some essential history of your world line, your "memory" and "thoughts", you are still "you".

The hypothetical teleporter here is assumed to scan and reconstruct the particles of your body (whether by using the very same particles or not) flawlessly.  This is why some are arguing that the transporter is safer than ordinary day-to-day life.

Okay, but what if the reconstruction isn't flawless? What if the copy is different from the original - but only by a single atom?

In this case (assuming that the flawless teleporter doesn't kill you), saying that the flawed teleporter kills you is even more absurd than saying that you die from moment to moment; the teleporter's imperfection is even more harmless than the effects of day-to-day stimuli. But now we reach a contradiction, because proceeding by induction, the teleporter can change every atom in your body without killing you, which is also absurd. The conclusion that our original assumption (the flawless teleporter doesn't kill you) was wrong seems inescapable.

 
Re: The "hard problem of consciousness"
Whether one grain of sand makes a heap is a question we should leave with the Ancient Greeks, i.e. dead.
Induction works by taking a binary predicate and a couple of ~true~ assumptions. This simplistic logic is how we do math, but it's useless for describing complex real-life phenomena. For example, I don't think that "this is me" in the sense of this discussion is a binary predicate, nor that "changing one atom doesn't change me" is a binary truth, and that's why I find it wrong to go from "I breathe in, I breathe out" to "I'm a horse", and especially to claim that it's an inevitable conclusion.
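A toy sketch of why that inductive step fails for a vague predicate (everything here — the one-dimensional "state", the tolerance, the drift per copy — is invented purely for illustration): if "same person" is modeled as a tolerance relation, each single change is "harmless" by the predicate's own lights, yet the relation is not transitive, so a chain of harmless changes still ends outside the tolerance.

```python
EPSILON = 1.0    # tolerance: a change smaller than this is "harmless"
DELTA = 0.01     # drift introduced by one imperfect copy

def same_person(a: float, b: float) -> bool:
    """Binary predicate induced by the tolerance: 'a and b are the same person'."""
    return abs(a - b) < EPSILON

original = 0.0
state = original
steps = 0
# Each individual copy is harmless by the predicate's own lights...
while same_person(state, state + DELTA) and same_person(original, state):
    state += DELTA
    steps += 1

# ...but the tolerance relation is not transitive, so the chain of
# individually harmless steps eventually leaves the tolerance entirely.
print(steps)                          # 100 individually harmless steps
print(same_person(original, state))   # False: endpoints are no longer "the same"
```

Which is just the sorites point in code: "each step preserves me-ness" plus induction proves too much, because "preserves me-ness" was never a transitive, binary relation to begin with.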

I agree it's unsettling that we know so little about consciousness (especially since we value it so much!), but is this not also to some extent true about any other part of human body? What's the difference between consciousness and a liver?
The lyf so short, the craft so long to lerne.

 
Re: The "hard problem of consciousness"
double post, sorry
The lyf so short, the craft so long to lerne.

 
Re: The "hard problem of consciousness"
Good point. My impulse is to respond, "me-ness is clearly defined, but heap-ness is not", but that would be begging the question.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
By what sensible causal rule could a person's brain spontaneously become a horse's? You're trying to argue that a catastrophic failure is somehow equivalent.

That's an irrelevant point. From the get-go, we have been taking engineering liberties. Indulge me, if you will, in this "liberty" as well, for philosophically these difficulties are equivalent. What matters is that there is a delta, and you have accepted a delta. If deltas are acceptable, and no reason other than "it seems it must be so" is given for some arbitrary "red line" of similitude to the original, then it follows that there is no "red line" and we can posit the weirdest things imaginable.

Quote
The teleporter is safe because it allows your brain state to propagate forward by its ordinary causal rules. It does not change or distort the information present. It is wholly unlike the examples you're floating.

This is bad reasoning, and I see many other commenters making, sorry to say, the same mistake. The problem here is that you are presuming that Consciousness = Brain State, or C = "Information Present". But my point is precisely that you have failed to prove this. It is just a hypothesis, a guess, your metaphor for what might be going on. A guess made by someone who knows his neuroscience ****, no doubt, but this is still 2015 and I don't think anyone has yet sussed out what C is, and since you're just a mammal like me, I will infer that God hasn't spoken to you either about the "True Nature of The Universe".

Quote
You become some other you by the introduction of stimuli which are processed according to the brain's logic.

My brain seizes up at these magical uses of the word "You", wherein the "Me" Me "becomes" someone "Other [Me]" by the introduction of etc. Clearly we are dealing with a confusing semantic problem: a mess of Aristotelian ways of using words like "identity" and "essences", Platonic ideas like "it's just information that matters", (anti-)Cartesian notions, and so on. IOW, the sentences are a complete mess; they might sound good, but ultimately they are meaningless.

Quote
Death is trivially defined: irrecoverable loss of information. If the brain state isn't lost and can keep firing itself forward on its own power, you're not dead.

This is what I mean by "edgelording": redefining words to mean something entirely different. When I lose a notebook, I never say it "died" or that the information within "died". That's absurd. Death is not confinable to such definitions; it is related to "Life", which is more than information. I do see that this "Brains = Computers" metaphor is so ingrained in your mind that it is gruesomely hard to get you to see how limited in scope and meaning it is. It's like saying Beethoven's Ninth is "contained" or "bound" by the knowledge (or idea) that it's "all just soundwaves". Yeah, you won't get very far with that approach.


Luis, it's important to realize that the end product Battuta refers to isn't "the consciousness of a human being".  It's "You".  You yourself.  An arbitrary human being that undergoes the process and remains the same arbitrary human being, not another human being.  If the end result were not identical in all ways, including thought and memory, it would not be a real teleportation.  And if it is identical in all ways, including thought and memory, then "You" are still alive, and arguably safer than at any single other instance of your entire life.

The redefinition of the word "YOU" into an object that is interchangeable in algebraic terms, just like any other scientific object, may well be what is meant, but it's an incorrect take on the word, given the question posed. The question is directed at the SUBJECTIVE YOU, not this "OBJECTIVE YOU-ness" that we can scientifically determine to be just about equal to any other instance of itself.

If I were to use the latter, then yes, of course, the "Me" that would be alive after teleportation would be alive, and in "some way" I'd be alive through "Him". Everyone else would regard "him" as "me", and from their point of view we are one and the same. And yet, from my point of view, it might just as well happen that my life ends right there and some other Consciousness is suddenly born with my memories and continues "my life". From the point of view of the universe, nothing really changed. A guy named Luis was at spot 1, then he teleported his information to spot 2 and a guy named Luis appeared at that spot. Everyone else acted normally, as if it's the same Luis. For all purposes and forever, it is the same.

What we cannot ever possibly postulate is whether this new person is the continuation of your own subjective experience, or whether your life ended at that point, period.

Now you can be just like zookeeper and say "That's irrelevant, everything else keeps working, who cares if I die if I'm substituted by a perfect replica?", to which I'll just open my eyes in terror. It's like watching people go willingly to gas chambers because they want their replicas to go to Tokyo faster.




Lastly, I just want to reiterate that I know some people here believe that Consciousness is "just a pattern". I could be facile here and merely ask "Oh yeah, your proof?" and wait the next hundred years for it. Instead, I'll just point out that irrespective of your beliefs, you should acknowledge that those are merely beliefs, that the metaphors you are using are most probably unable to capture what is actually going on in C, and that perhaps you shouldn't risk your life and your consciousness on a pre-22nd-century analysis of what it's all really about.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
I can't follow you at all. I feel like my (phone, alas) response would just be a series of emptyquotes of past statements in the thread.

Your stance reminds me of arguments for theism or quantum karma - we don't know exactly how it works, ergo magic! The delta argument in particular I find confounding, since you seem to be attacking yourself. If the teleporter introduces less drift than day to day life, how is saying 'what if it introduced MORE drift? All drift is the same!' in any way logical or useful to your stance?

Calling death what it is is not 'edgelording'. Death is only death when it leads to irretrievable loss of the brainstate's ability to copy forward under its own power. That's the only sensible idea of death we have.

These are not just 'beliefs', they are the only coherent hypotheses given the evidence we have. The Subjective You is physical because there is nothing but the physical. Wherever the Object You arises, so does the Subject. No other hypothesis has any support at all.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Is anyone following the "all delta is the same" argument and able to crumple it down for my greasy brain?

 

Offline zookeeper

  • *knock knock* Who's there? Poe. Poe who?
  • 210
Re: The "hard problem of consciousness"
I think Battuta might be a P-zombie. :eek:

 
Re: The "hard problem of consciousness"
Is anyone following the "all delta is the same" argument and able to crumple it down for my greasy brain?

I understood Luis to be making one of the following arguments:

I think Luis is saying that, since we're always changing, information is never preserved. So it's no big deal if the transporter reassembles us differently.
Okay, but what if the reconstruction isn't flawless? What if the copy is different from the original - but only by a single atom?

In this case (assuming that the flawless teleporter doesn't kill you), saying that the flawed teleporter kills you is even more absurd than saying that you die from moment to moment; the teleporter's imperfection is even more harmless than the effects of day-to-day stimuli. But now we reach a contradiction, because proceeding by induction, the teleporter can change every atom in your body without killing you, which is also absurd. The conclusion that our original assumption (the flawless teleporter doesn't kill you) was wrong seems inescapable.

Side note. In these discussions, I often get the impression that certain participants are zimbos. It's nonsense (I hope), but there seems to be an impenetrable language barrier.

EDIT: Ninja'd.  ;)

 

Offline AtomicClucker

  • 28
  • Runnin' from Trebs
Re: The "hard problem of consciousness"
Don't have much to say, but the discussion of what "consciousness" is devolves into a semantic mess of vagueness and of interpretations caught between mechanistic and deterministic responses. I will say that the vague nature of speech and its illogical function only makes the matter worse. To engage with consciousness is to delve into the vague, imprecise and illogical. These "vague" concepts upend logic, create paradoxes, and quickly crush logical attempts at rationalizing the vague. Ergo, it's a giant black hole.

To put it simply, mechanistic systems collapse when confronted with a vague, undefinable set of circumstances. YMMV on approaching consciousness, but it's important to keep in mind that dealing with discussions of the self means confronting that ugly elephant in the room we call the problem of meaning.
Blame Blue Planet for my Freespace2 addiction.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Is anyone following the "all delta is the same" argument and able to crumple it down for my greasy brain?

I understood Luis to be making one of the following arguments:

But these arguments are nonsense, and they have been answered pretty decisively. Not all changes are remotely equivalent! Your brain receiving stimuli from the optic nerve, encoding those stimuli using evolved pathways, parsing out factual information from the stimulus, activating semantic relationships, priming motor responses, moving concepts into working memory, and finally recording the image in long-term memory is a change that obeys the internal logic of the brain. The brain-state is feeding itself forward to the next Planck instant or whatever according to its own logic.

The brain suddenly being transmuted into a horse's brain obeys no such logic! The brainstate is not determining itself forward. Causal pathways are severed. Information is lost.

Remember, I'm the one arguing that the teleporter is strictly safer than day to day existence, and any argument which renders the teleporter unsafe also renders day to day existence unsafe! I still haven't seen (or maybe I have seen many times but am not recalling) any substantive refutation of this point. Just 'well subjectivity might work differently than the entire rest of the cosmos, for reasons we cannot begin to guess and which fall outside physicalism in some unanticipated way.'

The 'language barrier' from my perspective seems to be the reluctance to accept that subjectivity is simply a product of objective structure. If the object's there, so is the subject.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
I can't follow you at all. I feel like my (phone, alas) response would just be a series of emptyquotes of past statements in the thread.

Your stance reminds me of arguments for theism or quantum karma - we don't know exactly how it works, ergo magic! The delta argument in particular I find confounding, since you seem to be attacking yourself. If the teleporter introduces less drift than day to day life, how is saying 'what if it introduced MORE drift? All drift is the same!' in any way logical or useful to your stance?

Wrong on all accounts. It's a language barrier between us. Just because you think you can explain something with a scientific-sounding metaphor doesn't mean you are right, nor does it mean that any skepticism towards the metaphors you are using is based upon magical thinking. IOW, you don't need to believe in magic in order to be skeptical about your ideas. IOW, there are more possibilities than either "Spirits!" or "Consciousness is just a pattern of information". I can't make it simpler than this, so you have to work this through on your end, I'm sorry.

Quote
Calling death what it is is not 'edgelording'. Death is only death when it leads to irretrievable loss of the brainstate's ability to copy forward under its own power. That's the only sensible idea of death we have.

Those sentences already embed a lot of models and beliefs that I find a tad overreaching, but nevertheless you are saying something different now. You equate a brain with a computer, and thus you analyse life and death through those lenses and words. The problem I have with this is that it gives the statements a veneer of engineering-like quality, but without the actual reliable engineering effort underneath that might justify them. IOW, you're borrowing the respectability of other types of analysis and insight for a field where these things have yet to produce any reliable work regarding Consciousness.

TL;DR: to say that the brain is dead because it can't "copy" forward is as silly as saying that muscles are overheated because they need to vent some "steam". It's a kind of metaphor that is chronocentric to the fads of our age and does not reflect the true realities of our brains and consciousnesses.

Quote
These are not just 'beliefs', they are the only coherent hypotheses given the evidence we have. The Subjective You is physical because there is nothing but the physical. Wherever the Object You arises, so does the Subject. No other hypothesis has any support at all.

When someone does not know about a subject, the best course of action is to say "I don't know", rather than to come up with explanations and proclaim they "must be right" by fiat because no one else can really explain them. Of course the universe is physical and so on, but many things that seemingly follow from those simple premises turn into bizarre absurdities if you think philosophically about them.

For instance, imagine that you can copy yourself into a thousand Battutas. Do you believe that your own consciousness will be "transferred" to any one of those? Or will it remain inside of yourself? And if you are still "inside of yourself", the original, do you see yourself as "interchangeable" between all those Battutas, or will you still prefer your life over all of those other Battutas?

For instance, take this scenario: Imagine 9 Battutas are immediately copied, resulting in 10 Battutas. However, because of a strange sequence of events, only one Battuta will survive and you are to choose who will. This choice is not to be discussed between you and other Battutas. You know they are exactly like you, but you alone are to choose (and are free to do so) if *you* are to survive or any other Battuta is to survive. Once you decide, 9 Battutas are immediately vaporized through a process that takes exactly 2 Planck-o-seconds.

Now, according to your thesis, it does not matter *which* Battuta survives. So what will you do here? Notice how you feel about your decision. Beware of sensations of "generosity" or "fairness" or "altruisticness", for they are already subtly implying a "sacrifice". But there is no sacrifice involved. This decision shouldn't be difficult at all and should be totally random: it's like breathing after all and aren't you doing *that* every single second?


Your last comment shows you failed to understand my philosophical argument. You're very smart, so I take it it's my lack of expressive ability. I'll try better next time.

 
Re: The "hard problem of consciousness"
Just 'well subjectivity might work differently than the entire rest of the cosmos, for reasons we cannot begin to guess and which fall outside physicalism in some unanticipated way.'

Consciousness is epistemologically primary. It's the only thing we can be certain of; "everything else" may only exist as a constituent of consciousness. This decisively sets consciousness apart from "everything else", at the deepest possible level.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
No invocation of computers or engineering required. All we need to do is point to the constraints that circumscribe all knowledge of the universe: the brain is physical, so consciousness must be physical. We only have one viable, coherent model with predictive power, and it places consciousness not at an epistemologically primary stage but far down at the end of the train. Our own subjectivity is explained by this model.

Luis' argument about namechecking 'hard' disciplines is flawed, because we don't need to namecheck them at all! We just have to treat the brain as a physical system.

Quote
IOW, there are more possibilities than either "Spirits!" or "Consciousness is just a pattern of information". I can't make it more simple than this, so you have to work this through in your end, I'm sorry.

No work required. What are the other possibilities?

Quote
TL DR, to say that the brain is dead because it can't "copy" forward is as silly as saying that muscles are overheaten because they need to vent some "steam". It's a kind of metaphor that is chronocentric to the fads of our age and do not reflect the true realities of our brains and consciousnesses.

What does this mean? In terms of what you're actually saying it seems to be 'you're wrong because you aren't right.'

Quote
For instance, imagine that you can copy yourself into a thousand Battutas. Do you believe that your own consciousness will be "transferred" to any one of those? Or will it remain inside of yourself? And if you are still "inside of yourself", the original, do you see yourself as "interchangeable" between all those Battutas, or will you still prefer your life over all of those other Battutas?

We dissected this at great length. Any brain scan/copy process produces valid causal forks. All forks will feel that they have remained 'inside themselves.' Subjectivity is copied alongside everything else. Pre-fork Battuta doesn't care about which fork lives, since they are all valid causal descendants. Once the fork has occurred, the forks are causally unrelated and only care about themselves.

Quote
For instance, take this scenario: Imagine 9 Battutas are immediately copied, resulting in 10 Battutas. However, because of a strange sequence of events, only one Battuta will survive and you are to choose who will. This choice is not to be discussed between you and other Battutas. You know they are exactly like you, but you alone are to choose (and are free to do so) if *you* are to survive or any other Battuta is to survive. Once you decide, 9 Battutas are immediately vaporized through a process that takes exactly 2 Planck-o-seconds.

Now, according to your thesis, it does not matter *which* Battuta survives. So what will you do here? Notice how you feel about your decision. Beware of sensations of "generosity" or "fairness" or "altruisticness", for they are already subtly implying a "sacrifice". But there is no sacrifice involved. This decision shouldn't be difficult at all and should be totally random: it's like breathing after all and aren't you doing *that* every single second?

This is trivial if you're rigorous about defining 'you.' It's one of the reasons I'm so confident in the simple materialist model, because it resolves situations like this.

Battuta before the copying process doesn't care which of the 10 causal descendants survive. All will be valid causal forks. Remember, there is no 'original' and no 'copy'. Pick at random, it doesn't matter: this will feel no different than going about day to day life.

After the copying process, all 10 Battutas will be diverging, and none of the others will ever be valid causal descendants of them. All would choose themselves to survive. But the 9 who are vaporized will have no more than 2 Planck-o-seconds to diverge, and I'm perfectly happy (lol, this reminds me a bit of arguments about where life begins in pregnancy) to say that no information will be lost.

Pre-fork Battuta lives no matter what. 9 post-fork Battutas die. 1 post-fork Battuta lives. Forking is a great way to continue living, but not a good way to avoid death.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Whatever credentials of subjective me-ness exist, they must be physical. They will be copied by any physically faithful copying process.

Subjectivity can be forked.