Author Topic: The "hard problem of consciousness"


Offline zookeeper

Re: The "hard problem of consciousness"
So if you are going to die but you won't know it anyway, it's no biggie. I mean, if this isn't the endgame of misanthropy and nihilism, I don't know what is.

Of course it's no biggie. And what would be closer to the endgame of misanthropy and nihilism would be to consider #4 to be no biggie, and no one's doing that.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
"Of course it's no biggie", well if you wanted to convince me you actually don't care for your life, you surely did a tremendous job.

e: "Hey people it's amazing, we got this new teleport machine working! We are 100% sure it will copy you into another place altogether, and we are mostly positive that you yourself won't really, really die because we have this crazy untestable metaphysics, and anyways, even if you die, you won't notice it! Isn't this the best product of the world?


Sign me the **** out. Immediately.

 

Offline zookeeper

Re: The "hard problem of consciousness"
"Of course it's no biggie", well if you wanted to convince me you actually don't care for your life, you surely did a tremendous job.

It's not what I specifically wanted to do, but sure, of course I don't care for my life. That'd be somewhat silly and paradoxical, after all.

 

Offline Luis Dias

Re: The "hard problem of consciousness"
Not sure if serious...

 

Offline General Battuta

Re: The "hard problem of consciousness"
The only crazy untestable metaphysics here is the one suggesting that teleporters are dangerous at all! What we know of physics and neuroscience suggests NOTHING except a monist, physical mind. There's not even a loose thread to tug on to head towards 'teleporters might kill you.' It's purely the product of intuitive misunderstandings of the mind and qualia.

If your argument is that 'well, they are less dangerous to consciousness than the passage of a day, but we can't choose to stop time,' would you freeze yourself in stasis given the option? To save yourself from a process that will inevitably destroy more of your current brain state than a philosophical teleporter?

You'd be like a reverse Barclay, terrified of leaving the pattern buffer.

 

Offline General Battuta

Re: The "hard problem of consciousness"
I guess your argument is 'even if there is a vanishing chance a philosophical teleporter would kill you, it's not worth the risk.' My reply is 'any framework in which the teleporter kills also makes day to day life tremendously more risky and fatal, which is a nonsensical outcome.'

 
Re: The "hard problem of consciousness"
For those of you who would use #3, but not #4: would you be okay with #4 if you took a gun into the transporter, then blew your brains out after the scan?

 
Re: The "hard problem of consciousness"
One thing I do want to point out (I'm woefully outclassed in these kinds of discussions) is that Luis is using terms like "Edgelordyness" and "Misanthropy", both of which are terms used to describe a lack of empathy towards other human beings in the person the term is applied to. However, I would never suspect either zookeeper or Battuta of lacking such empathy (and I'd certainly argue the opposite in favour of Empattuta). Not believing in the concept of a soul does not mean you don't have a moral code or that you don't care about other human beings, in the same way that saying that Freespace 2 consists of a lot of lines ending in semicolons does not mean that you don't love it.

 

Offline Luis Dias

Re: The "hard problem of consciousness"
I guess your argument is 'even if there is a vanishing chance a philosophical teleporter would kill you, it's not worth the risk.' My reply is 'any framework in which the teleporter kills also makes day to day life tremendously more risky and fatal, which is a nonsensical outcome.'

That's what you keep claiming, and yet you have no way to test this... I was going to say "hypothesis", but that would imply some kind of testability. I'm sorry, but you can't even calculate the risk itself, for there are no Bayesian parameters you can hold on to (and frequentist analysis is just laughing at you here).

Of course, what you are saying might be totally true and flatten my concerns entirely, *if*, for instance, what you are stating about brain states is the entire story of consciousness. First, you cannot claim this without testing. Saying "we don't know of any other possibility" is no remedy, since consciousness hasn't been successfully sussed out yet. It's like claiming that thunder must come from God because "what else could it be?". It's a failure of the proponent to grasp his own limitations regarding what we don't know about the universe and consciousness. In a word, it's called "hubris".

Second, even if we "accept" this story, you could then say "well, see, the problem here is solved", but it could well be solved in a very unsatisfying way, because if you bring Heisenbergian notions into play here, you'd realise that an actual complete and precise copy of your brain state is quantum mechanically impossible. And so the only way to make your crazy scenario possible would be to violate physics itself. IOW, this could be the universe's way of "preventing" copies of consciousness (don't take me literally here; I'm making an analogy with how singularities are, conjecturally, always hidden behind event horizons, so that "naked" ones never appear).
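For reference, the standard no-cloning argument I'm gesturing at runs roughly as follows (whether it says anything about warm, messy brain states is, admittedly, my own assumption rather than established physics). Suppose a single unitary $U$ could clone two distinct unknown states:

$U(|\psi\rangle \otimes |0\rangle) = |\psi\rangle \otimes |\psi\rangle$
$U(|\phi\rangle \otimes |0\rangle) = |\phi\rangle \otimes |\phi\rangle$

Unitarity preserves inner products, so comparing the two equations gives $\langle\phi|\psi\rangle = \langle\phi|\psi\rangle^{2}$, which forces $\langle\phi|\psi\rangle \in \{0, 1\}$: only identical or mutually orthogonal states can be cloned by a single machine, never an arbitrary unknown one.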

In that case, a further horror could emerge: someone claiming they had achieved a perfect copy, but only down to that barrier. But hey, good merchants that they would be, they would totally convince a lot of people that this is "good enough". But would it really be? Well, perhaps it would. Perhaps it would make sufficiently similar copies to fool us. Would these not-equal, merely "alike" copies harbor our consciousness? Well, we can see how this reasoning starts to become irrational, for there is no apparent barrier to how similar a brain state must be for me to "care" about it as much as I care about myself.

Think: does my wife harbor my consciousness? Does your SO harbor YOUR consciousness? Would you be happy to die as long as she's alive, so... who cares? Why not?

From the scientific point of view, every consciousness is interchangeable. It's just behavior on top of behavior, atoms doing their thing, nothing remotely special about any of them, and any system that is equal to another is just another "instance" of it. And I do believe many people "believe" in this, at least intellectually. Of course, if they actually internalized this view subjectively, they would completely fall apart and despair into depression and suicide. It's simply *not true*. And the only thing you need to "falsify" this view is to have the subjective experience of yourself. Which is not to say that the theory is wrong... no, the problem is that it is incomplete.

And by "incomplete", I don't mean to say "yeah we need spirits". I mean to say, this is not accounted for. There's a whole LOT that is unnacounted for. And for this reason, to trust these insane engineers that would have invented this machine would be akin to believe in doctor Frankenstein and think "This is all going to be alright now".

No, it isn't.

...Luis is using terms like "Edgelordyness" and "Misanthropy", both of which are terms used to describe a lack of empathy towards other human beings in the person the term is applied to.

Joshua, I won't be tone-moderated by you, so kindly drop it. The fact that you misinterpreted those two words the way you did is proof enough that you are incapable of performing that duty anyway. I will hereby explain what I actually meant. By "edgelording" I mean that I do believe Battuta is trying too hard to be "edgy" in his philosophical conclusions. That is, I sensed (decreasingly so, I must add) that he was just placing his most controversial statements out there and keeping his more moderate caveats to himself. The misanthropic note is a purely philosophical critique. The idea that you shouldn't care if you are about to die or not is necessarily predicated on a disregard for human subjective experience, which is all we have. Do I believe zookeeper is a misanthrope? No, I don't know enough about him to say that, but his idea clearly was.

 

Offline General Battuta

Re: The "hard problem of consciousness"
Our understanding of consciousness is not complete, but it is bounded. We know the explanation falls within a certain territory. The territory has parameters: it is monist, it is physical, it is causally closed. These parameters speak to the risk of teleportation. It's not like claiming that thunder must come from God, because what else could it be; it's like claiming that thunder must be a physical event, because we have no evidence for nonphysical events.

The objection to the engineering parameters of the teleporter is valid, which is why I've been careful to specify a philosophical teleporter: arbitrarily precise reconstruction of the physical body.

Internalizing the world-view that consciousnesses are 'interchangeable and material' is hardly a formula for suicide. It's not a disturbing attitude! It loses you nothing. You are a subprocess of the universe, computing yourself forward. Your own behavior is contingent on your past experiences, your knowledge, and beliefs. If anything it's humanizing.

The fact that my statements seem controversial or misanthropic is, I think, evidence of how deeply our society is still predicated on illusions about what we are. The notion that we are only a pattern of information, stored in meat and endlessly mutable, is somehow radical and depressing. Yet it requires only that we give up things we never actually had at all!

For those of you who would use #3, but not #4: would you be okay with #4 if you took a gun into the transporter, then blew your brains out after the scan?

Blowing your brains out after the scan implies serious loss of information! One fork would diverge and then die. The reason dropping dead immediately after the scan is acceptable to me is that it doesn't require any fork to subjectively experience suffering or trauma. Yet from the perspective of the fork who remains at the teleporter origin, they are now simply committing suicide after the teleporter looks at them. The other fork's diverging existence is cold comfort.

A pre-teleport instance might be okay with the gun scenario, since they know one of their causal descendants will survive. But they might prefer that all their causal descendants avoid death, in which case they'd turn down this variant on #4 (yet be perfectly happy with instantaneous vaporization, since it doesn't leave any fork to subjectively experience suffering and death).

The post-teleport instance would certainly not want to shoot himself in the head.

You have to remember that teleportation/mind forking is a great way to ensure life, but not a great way to avoid death. You also have to remember to abandon chauvinistic notions of a 'real self' — you are ONLY a pattern of information, bleeding into the world around you, held together as a loosely continuous object by the ability to copy information forward in time. The philosophical teleporter is no different than what happens to us moment to moment.

We are okay with annihilating ourselves as we were yesterday by becoming the person we are today, as long as that past person isn't around today to suffer. We would not be okay with peeling our future self off the past self and leaving the past self to drown.

 
Re: The "hard problem of consciousness"
I meant "blowing your brains out" to be an instantaneous death, so there would be no physical pain. If you mean the psychological pain of committing suicide after the scan, what if (before entering the transporter) you rig a gun to blow your brains out after the scan?

 

Offline zookeeper

Re: The "hard problem of consciousness"
The misanthropic note is a purely philosophical critique. The idea that you shouldn't care if you are about to die or not is necessarily predicated on a disregard for human subjective experience, which is all we have. Do I believe zookeeper is a misanthrope? No, I don't know enough about him to say that, but his idea clearly was.

Nonono. The idea that you shouldn't care if you live or not has nothing to do with how one values the contents of subjective experience; it's precisely because only subjective experience matters that the concern becomes irrelevant.

The question of whether you live or not is a question of whether your subjective experience exists at all. I hold that it would be paradoxical to care about it ending, because the only way you can make a value/preference judgement is from within that subjective viewpoint. To prefer existence over non-existence would require being able to compare them from some kind of objective viewpoint, which you cannot do, making "I prefer to exist" something of an oxymoron.

So, there is a disregard for the existence of subjective experience, but not for what subjective experience is like. You can call that misanthropic (although the word is obviously unnecessarily anthropocentric) if you want, but it seems like a simplistic and misleading characterization.

 

Offline General Battuta

Re: The "hard problem of consciousness"
I meant "blowing your brains out" to be an instantaneous death, so there would be no physical pain. If you mean the psychological pain of committing suicide after the scan, what if (before entering the transporter) you rig a gun to blow your brains out after the scan?

A gun is a pretty crude way to clean up a fork. You'll have massive chunks of the brain continuing to fire for milliseconds or whole seconds, you'll have the chance of survival, you'll have the rest of your body continuing to do its thing for a little while... it doesn't offer the immediate and total annihilation of scan-and-vaporize/scan-by-vaporize.

Again, when thinking about forking you do want to think about the welfare of both forks. 'One fork will be instantly vaporized, experiencing nothing' is pretty safe — a physical instance of your mind is closed down, as with any other form of death, but its brainstate is propagated forward safely.

Even a tiny delay before a traumatic injury by gunshot means you're not safely propagating forward the brainstate that occurs between the instant of teleportation and the impact of the bullet. You're putting a fork in a pretty ****ty position.

 

Offline Luis Dias

Re: The "hard problem of consciousness"
Our understanding of consciousness is not complete, but it is bounded. We know the explanation falls within a certain territory. The territory has parameters: it is monist, it is physical, it is causally closed. These parameters speak to the risk of teleportation. It's not like claiming that thunder must come from God, because what else could it be; it's like claiming that thunder must be a physical event, because we have no evidence for nonphysical events.

I like debating this stuff, but still if you don't mind, I'll retort yet again.

My point about the "thunder" is precisely to say that we cannot say exactly what "bounds" consciousness until we actually know how it behaves. Someone who couldn't even imagine what those "physical" things might be would obviously run to his most parsimonious explanations. The correct explanation was just beyond his scope of understanding. Knowledge is inherently unbounded at all times, which is not to say that it is impossible. We do believe we understand thunder because we have modeled it and correctly identified all the parameters that enable it, etc. But consciousness? I'd say we are at a new level here. We have no idea whether other people even have consciousness (we just assume they do, it's parsimonious, etc.). But if we really wanted to know, there would be no means (and by means, I even include philosophical means) to correctly detect whether others have it or are plainly zimbos.

The monist attitude is fine (and I do have opinions on how to tackle this problem, but that's for another time), but the problem is at the very core of experience and how it is fundamentally undetectable from the outside (indistinguishable from "zimbos").

Quote
The objection to the engineering parameters of the teleporter is valid, which is why I've been careful to specify a philosophical teleporter: arbitrarily precise reconstruction of the physical body.

Except that the engineering objection destroys your philosophical one. You yourself claimed that what makes "me" "me" is the correct localization of all my brain's atoms (and more), so that if we correctly copied all of this, then that would be "me". Then you said something apparently contradictory: that there is no such thing as a constant mental state, that we are always different. My brain state a nanosecond ago differs from my brain state now. That should mean that the "me" me is just inherently different from the "me" me of a nanosecond ago. We are two different people. This is true even colloquially, but it nevertheless demolishes the first point altogether.

Furthermore, if it is indeed true that these small differences do not matter because "what I am now is different from what I was before", then those engineering differences shouldn't matter either, at least philosophically speaking. The end result is just someone with a different mental state than mine, but that was going to be a given anyway, since I change every second, so we're still good.

But, philosophically, this escalates quickly into a reductio ad absurdum, because you'd be forced to recognize that all mental states are therefore "equivalent" to you. What if the teleporter killed you and substituted someone completely different for you? That would be just as philosophically valid as your own proposal, yet it is clearly absurd and goes against your premise.

So something went awfully wrong. And what I am claiming is that this paradox points to an incompleteness in how you are framing consciousness itself. Your claim that you are able to draw a boundary around consciousness is overly optimistic.

Quote
Internalizing the world-view that consciousnesses are 'interchangeable and material' is hardly a formula for suicide. It's not a disturbing attitude! It loses you nothing. You are a subprocess of the universe, computing yourself forward. Your own behavior is contingent on your past experiences, your knowledge, and beliefs. If anything it's humanizing.

Yeah, that might be true; people might be at peace with that. I'd submit that this comes with a different understanding of what those ideas imply, but at that point I'd also have to admit that I am equally in unknown territory, so I might be totally wrong.

Quote
The fact that my statements seem controversial or misanthropic is, I think, evidence of how deeply our society is still predicated on illusions about what we are. The notion that we are only a pattern of information, stored in meat and endlessly mutable, is somehow radical and depressing. Yet it requires only that we give up things we never actually had at all!

It's not that it's "radical and depressing", it's that it's incomplete. People in the 19th century equated people with steam engines. "They are just like steam engines!", and... sigh, it's like, "Oh look, people are just atoms". Well, that's true in some sense, but it's an incredibly ignorant sense. It ignores everything else that might be at play here and that is perhaps orders of magnitude more important than "atoms". Likewise, I feel you are merely happy to equate consciousness with a few metaphors: "patterns of information", "a thing that can be stored in meat", "endlessly mutable", etc., while the grander truth is that you actually don't know what consciousness really is! Engineeringly speaking.

It's akin to hearing someone in the 17th century say that emotions are "bounded" in the region of the heart. "We know this to be a fact", he would say. Well, of course he would.

Which is not to say that I know. I just feel that I at least have more respect (?) for the big gap between what is actually true and what we actually know.


The misanthropic note is a purely philosophical critique. The idea that you shouldn't care if you are about to die or not is necessarily predicated on a disregard for human subjective experience, which is all we have. Do I believe zookeeper is a misanthrope? No, I don't know enough about him to say that, but his idea clearly was.

Nonono. The idea that you shouldn't care if you live or not has nothing to do with how one values the contents of subjective experience; it's precisely because only subjective experience matters that the concern becomes irrelevant.

The question of whether you live or not is a question of whether your subjective experience exists at all. I hold that it would be paradoxical to care about it ending, because the only way you can make a value/preference judgement is from within that subjective viewpoint. To prefer existence over non-existence would require being able to compare them from some kind of objective viewpoint, which you cannot do, making "I prefer to exist" something of an oxymoron.

So, there is a disregard for the existence of subjective experience, but not for what subjective experience is like. You can call that misanthropic (although the word is obviously unnecessarily anthropocentric) if you want, but it seems like a simplistic and misleading characterization.

I do see the problem here better; your answer clarifies a lot. You see, you're analysing this from an "objective" standpoint, while the original question was posed to your own subjectivity. This is your mistake. The question was, and I quote, "How willingly would you use each transporter?" Now, we can be scientists all day and discuss this very objectively, but at the end of the day the decision is not made from the "universe's" point of view, but from yours. You are about to enter the teleportation device. Will you say "energise" or not? You even complain that these words are too "anthropocentric", but I fail to see how this is a problem for the bees or the horses. This is a dilemma for a human, and a question directly posed to one, not as a scientist, but as a human being with a will.

So yes, you do deny that your analysis is misanthropic, but it ends up being apathetic. Again, that's fine. You can do Hamlet all day and decide, as some people do, that there is no real difference between being alive or dead (is there any objective difference anyway?), but that is not responding to the question at hand.

I am a human being and I stand for my initial answer: I would NOT enter those teleporters, not in a million years.


 

Offline General Battuta

Re: The "hard problem of consciousness"
I am a human being and I stand for my initial answer: I would NOT enter those teleporters, not in a million years.

I have yet to read anything you've posted that seems to get at this objection. You seem willing to concede that a teleporter cannot be more hazardous than day-to-day existence: after all, your consciousness is already bound by the engineering constraints of the brain, which must propagate you forward in wetware. Nor is there a risk of reductio ad absurdum, because it's trivial to note that NOT all mental states are equivalent — what's vital is the preservation of information, a logical causal pathway. The teleporter cannot kill you and substitute someone completely different. That would be a critical loss of information. The teleporter must be safer, in terms of information loss, than day-to-day existence!

You didn't answer my super clever stasis question either.

I don't understand how you are constructing a teleporter such that it's more risky than going to sleep. Nor am I convinced by the idea that the boundaries of physicalism and monism will somehow be punctured by the study of consciousness, when those boundaries have remained inviolate since the beginning of human investigation of everything.


 
Re: The "hard problem of consciousness"
Joshua, I won't be tone-moderated by you, so kindly drop it. The fact that you misinterpreted those two words the way you did is proof enough that you are incapable of performing that duty anyway. I will hereby explain what I actually meant. By "edgelording" I mean that I do believe Battuta is trying too hard to be "edgy" in his philosophical conclusions. That is, I sensed (decreasingly so, I must add) that he was just placing his most controversial statements out there and keeping his more moderate caveats to himself. The misanthropic note is a purely philosophical critique. The idea that you shouldn't care if you are about to die or not is necessarily predicated on a disregard for human subjective experience, which is all we have. Do I believe zookeeper is a misanthrope? No, I don't know enough about him to say that, but his idea clearly was.

For the purposes of clarification: I am not tone-moderating you, I am pointing out that the words you use make no sense in the context of this discussion (or that you do not understand the words you use). Misanthropy indicates an active dislike for human society, whilst edgelordyness is more related to 4chan culture, which is arguably the primary reason misanthropy exists today.

The notion that humans are nothing more than chemical machines is not misanthropic (nor edgy) in itself, nor is it controversial (a game like The Witcher can mention it without anyone batting an eyelid). I would argue that it's egalitarian: any notion of a higher purpose or whatnot is eradicated. The only thing that remains is your unique arrangement of molecules. It only really becomes misanthropy when you start referring to human beings as meatbags and constantly calculate the best way to assassinate any number of them at any given moment.

Also: WOLFENSTEIN! (spoilers)
« Last Edit: October 15, 2015, 02:46:14 pm by -Joshua- »

 
Re: The "hard problem of consciousness"
Regarding the gun thing: fair enough.

Nor is there a risk of reductio ad absurdum, because it's trivial to note that NOT all mental states are equivalent — what's vital is the preservation of information, a logical causal pathway. The teleporter cannot kill you and substitute someone completely different. That would be a critical loss of information.

I think Luis is saying that, since we're always changing, information is never preserved. So it's no big deal if the transporter reassembles us differently.

 

Offline Luis Dias

Re: The "hard problem of consciousness"
I think to say that it is "trivial to note that NOT all mental states are equivalent" without noting the philosophical elephant in the room is wanting. I see I did not express my reductio properly.

I will answer your super clever stasis question down at the bottom. I will also say, before continuing, that I didn't say physicalism and monism make an exception for consciousness. I even hinted that I might have some ideas on how to engineeringly tackle that problem. What I did talk about is how certain things appear to be, right now, untestable. And since they are untestable, there is no way of distinguishing whether the teleported person is "equivalent" to the person who just had some sleep, who in turn is just "equivalent" to a person who has been continuously awake for a few minutes.

These are just untestable things, not least because these terminologies are absolutely incomplete. Take your sentence: "Not all mental states are equivalent". That is a really vague statement. Equivalent with respect to what? You speak of information, but what are we measuring here? We have already established that our brains are continuously changing the information in them, so some changes must be allowed. Objectively speaking, they are surely not equivalent. What is the acceptable delta here, objectively speaking? How can you even define a criterion to decide?

Look, you basically answer this with "[my criterion is] day-to-day loss", without any rational reason why this should be the case. Sure, everyone would be (in a common-sense way) subjectively more satisfied with that answer, because the dude going out of the teleporter would be very similar to the dude going in. But there's no philosophical reason why this should be preferable. If any delta in mental states is "enough" to carry your own subjective experience with you (that is, you will keep living just as you are now living from this moment to... this moment to... this moment... etc.), then there is no obvious philosophical barrier to change. You could be turned into a horse. It would "still be you", according to your own metaphysics.

And the reason this is absurd is that the underlying metaphysics is absurd, not that the "horse" example is. It follows from the premises.

IOW, it's an error born out of our lack of an engineering-precise grammar and jargon for what consciousness really is, which makes us write arguments that nevertheless accept the correctness of the terminologies we are using, terminologies which are most assuredly wrong. For someone in "the far future", it's like watching Deepak Chopra talk about quantum mechanics. The words might flow in a soothing manner, and it might even make some... ahhh... sense if you don't actually know anything technical about quantum mechanics, but once you do, you open your eyes in terror at the gruesome logic, reasoning, arguments and, gasp, conclusions some people reach.

Like, perhaps, saying that transmitting an equal mental state to another "container" is all we need to transfer "yourself".

Regarding your stasis thing: look, you're basically telling me to either let half of me die or take a chance that your metaphysics is correct. It's a terrible choice either way. Would I take the bet or not? I have no idea; I'd rather stay here.

 

Offline General Battuta

Re: The "hard problem of consciousness"
The 'acceptable delta' (in answer to both above posts) is the preservation of information, namely retaining the ability of a brainstate to propagate itself forward by causal rules. Vaporizing your brain introduces entropy into the system, destroying it: the brainstate cannot copy itself forward without becoming very lossy. Teleporting your brain may vaporize it, but no information is lost: the brainstate propagates through the teleporter.

Day-to-day life brings in stimuli, which couple with the brain and alter the way the brainstate propagates forward. This alters our brainstates over time.
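A toy sketch of what I mean by "propagate forward by causal rules" (purely illustrative; the update rule below is an arbitrary stand-in I made up, not a claim about how brains compute): a state that is copied losslessly evolves under the same rule exactly as the original would, while a state destroyed without a copy has nothing left to evolve.

Code:
# Toy model only: a deterministic "brainstate" pushed forward by a fixed causal rule.
# The rule (SHA-256) is an arbitrary stand-in; the point is lossless copying.
import hashlib

def step(state: bytes) -> bytes:
    # One causal update: the next state is fully determined by the current one.
    return hashlib.sha256(state).digest()

original = b"brainstate at time t"
teleported = bytes(original)   # philosophical teleporter: bit-for-bit copy, no information lost

assert step(original) == step(teleported)  # both forks propagate forward identically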

 
Re: The "hard problem of consciousness"
I apologize for harping on this, but I'm still confused. How perfect does the copy need to be to allow your brainstate to propagate, and how do teleportation imperfections differ from day-to-day stimuli?