Author Topic: The "hard problem of consciousness"


Re: The "hard problem of consciousness"
I'd say that the idealist model I described produces exactly the same predictions as the physicalist model. Practically speaking, there's no difference between the two. But then you could argue that the only difference is terminology, and we're really in agreement.

Anyway, I really like your post, and I think it's as good a place as any to leave the debate.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
Before you go away, I'd like to reference Terrence Deacon's work on this attempt to bridge what appears to be a difficult gap (which Battuta dismisses as an easy problem; well, good for him).

His idea is that the Self is an emergent property that stems from symbiotic relationships between certain pattern structures in the physical world.

I'm on my phone, so I have trouble linking you to things easily. But do Google it. There's a good interview on YouTube with him talking about his thesis for over half an hour.

 

Offline Turambar

  • Determined to inflict his entire social circle on us
  • 210
  • You can't spell Manslaughter without laughter
Re: The "hard problem of consciousness"
ITT: arrangements of molecules with the ability to feel special

 
Re: The "hard problem of consciousness"
Before you go away, I'd like to reference Terrence Deacon's work on this attempt to bridge what appears to be a difficult gap (which Battuta dismisses as an easy problem; well, good for him).

His idea is that the Self is an emergent property that stems from symbiotic relationships between certain pattern structures in the physical world.

I'm on my phone, so I have trouble linking you to things easily. But do Google it. There's a good interview on YouTube with him talking about his thesis for over half an hour.

Thanks! I saw this video and wasn't sure where he was going at first, but it came together nicely at the end.

 
Re: The "hard problem of consciousness"
ITT: arrangements of molecules with the ability to feel special

Doesn't this sum up the entire human race?

 

Offline Mikes

  • 29
Re: The "hard problem of consciousness"
ITT: arrangements of molecules with the ability to feel special

Doesn't this sum up the entire human race?

And cats?

:p

 
Re: The "hard problem of consciousness"
ITT: arrangements of molecules with the ability to feel special

Doesn't this sum up the entire human race?

And cats?

:p

YES :D

 
Re: The "hard problem of consciousness"
Sorry about the necro - I have a question, and I'm very curious about how you'll respond. It's a sharpened version of the transporter question.

Transporter #1 works by splitting you into your component atoms, then reconstructing you at another location with the same atoms.
Transporter #2 works by scanning you, then reconstructing you at another location with different atoms. The scanning process annihilates the "original you".
Transporter #3 works by scanning you, then reconstructing you at another location with different atoms. The scanning process causes the "original you" to drop dead.
Transporter #4 works by scanning you, then reconstructing you at another location with different atoms. The data transmission causes the "original you" to die in agony.

Assume that all four transporters work exactly as advertised. How willingly would you use each transporter?
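
(To make the four machines easier to compare at a glance, here's a rough, purely illustrative sketch; the type and field names are invented for this post, nothing more.)

Code:
# Purely illustrative encoding of the four transporters above.
# The field names are my own invention, not part of the question.
from dataclasses import dataclass

@dataclass(frozen=True)
class Transporter:
    name: str
    same_atoms: bool        # reconstruction uses your original atoms?
    body_left_behind: bool  # a dead "original" remains at the source?
    painful: bool           # some copy of you suffers in the process?

TRANSPORTERS = [
    Transporter("#1", same_atoms=True,  body_left_behind=False, painful=False),
    Transporter("#2", same_atoms=False, body_left_behind=False, painful=False),
    Transporter("#3", same_atoms=False, body_left_behind=True,  painful=False),
    Transporter("#4", same_atoms=False, body_left_behind=True,  painful=True),
]

for t in TRANSPORTERS:
    print(t)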

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
#1-3 each produce a valid fork and seem safe, since no fork experiences suffering or aversive events. #4 causes a causal descendant fork to experience agony, and is unsafe from the perspective of the pre-fork mind.

 

Offline zookeeper

  • *knock knock* Who's there? Poe. Poe who?
  • 210
Re: The "hard problem of consciousness"
Obviously I wouldn't use #4.

I might end up using #3 or #2 if I didn't mind dying (it would happen without the usual downsides of dying, after all).

I'd probably use #1, although only after observing others do it.

The differences between how I'd approach 1-3 are of course largely just psychological and not particularly rational.

 

Offline watsisname

Re: The "hard problem of consciousness"
Why does #4 cause the descendant fork to experience agony?  (Honest question; I don't understand it).

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
I wouldn't use any of those. And I think anyone who does either hasn't thought this through sufficiently, or is insane.

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
Re: The "hard problem of consciousness"
Why does #4 cause the descendant fork to experience agony?  (Honest question; I don't understand it).

For reasons. I think the specific mechanism by which one fork experiences an agonizing death doesn't really matter; the philosophical question of whether or not you're willing to use a piece of tech that causes an exact copy of you pain and death is the interesting part here.

 

Offline watsisname

Re: The "hard problem of consciousness"
@E, I do understand the purpose of the question (and my initial thoughts closely mirror Battuta's), but I don't understand his claim about the effects of #4. I'm interested in discussing that, because I feel it might improve my understanding of the principles and affect the conclusions I draw from the exercise.

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
Re: The "hard problem of consciousness"
Can you elaborate on what you do not understand?

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
I think the confusion might be in my answer? Specifically, that I'm saying it causes a fork to die in agony.

My perspective on the whole teleporter problem is that we have to give up on the illusion of 'an original'. Think of yourself as a brain-state constantly copying itself forward, one instant at a time: the you of tomorrow will be a teleporter duplicate, produced by the 'teleportation' of simple causality. Teleporters just produce two valid descendants instead of one, two 'forks'.

So in my analysis of #4, I'm calling the "original you" just another valid causal fork. And for the pre-fork consciousness, it's important to understand that both forks that result from the teleporter are going to be The Real You. There's no distinction between 'original' and 'duplicate'.

#1-3 are okay because they involve the same amount of risk as day-to-day life: your current brain state will 'die', but it will propagate forward in time.

#4 is not okay because it requires one of your future selves to go through agony.
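
(A toy sketch of the forking picture, if it helps; entirely illustrative, and all the names are made up:)

Code:
# Toy model: identity as a chain of causal successor states.
# Ordinary persistence yields one successor per instant; a teleporter
# is just a step that yields two equally valid successors ('forks').
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class BrainState:
    label: str
    parent: Optional["BrainState"] = None

def tick(state: BrainState) -> BrainState:
    """Moment-to-moment persistence: one valid causal descendant."""
    return BrainState(state.label + "'", parent=state)

def teleport(state: BrainState) -> Tuple[BrainState, BrainState]:
    """A forking teleporter: two valid causal descendants.
    Neither fork is privileged as 'the original'."""
    return (BrainState(state.label + "/here", parent=state),
            BrainState(state.label + "/there", parent=state))

me = BrainState("me@t0")
tomorrow = tick(me)                  # tomorrow-me: a 'duplicate' made by causality
here, there = teleport(tomorrow)     # both forks descend from the same pre-fork mind
assert here.parent is there.parent   # symmetric claim to being The Real You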

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
I find the hypothesis that the "original" is *just* equal to a past "me", and therefore irrelevant (that my current "brain state" is for all purposes exactly as if it had just been teleported from my previous "brain state" a Planck second ago, or something), a very intellectually interesting one.

To call the skepticism of this idea an "illusion" is mind-boggling to me. As far as I am concerned, even if such technology were possible, there would be no way to reliably test that idea. It's unfalsifiable. If it is true, the teleported person will act as if it were true; if it is false, the teleported person will still act as if it were true, the difference being that it's not really you anymore (you died, sorry about that).

It's the untestability of this idea that gives rise to the horror of it all. Star Trek might well be filled with creatures that constantly kill themselves and are eternally replaced by exact clones without ever noticing it, without anyone ever being aware of it or even knowing it's possible (except the brightest paranoid out there, who was on to something). You see Captain Janeway destroyed by a beam, only for an indistinguishable copy to emerge right beside you, and you end up believing you'll be okay. So you beam yourself too. But those are just the last few nanoseconds of your life.

This idea that you are just your current brain state and nothing else is interesting. Many difficulties appear once you start questioning it (how far back in time must I go before it's not "me" anymore?), and its weakness comes from taking a purely mechanical approach to consciousness ("consciousness is just the result of a program running in the brain"), an analogy that may be as faulty as the one made a hundred years ago, that the human body was akin to a steam engine and thus should let off some "steam" once in a while. Again, it is an interesting idea and one that might well be true (I really don't think so). I have no idea if it is true.

But to blindly accept it as such, and to declare its denial delusional, is just intellectual overreach for the sake of philosophical edgelording, IMO.

And even if you believe it, just the possibility that it is false (and the impossibility of ever knowing otherwise) should make your decision as clear as day: never teleport. EVER.

 

Offline zookeeper

  • *knock knock* Who's there? Poe. Poe who?
  • 210
Re: The "hard problem of consciousness"
It's the untestability of this idea that gives rise to the horror of it all.

Or, alternatively, dispels the horror. Something that no one will ever know about or experience the consequences of can't be horrible, and blinking out of existence in a teleporter isn't any different.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
An identical copy is you. All plausible credentials for you-ness are material: they are duplicated as well.

Our faith that we'll still be ourselves tomorrow is MORE alarming than our faith in a perfect teleporter. A day gives a lot of time for drift!

And testability isn't a problem - the real problem is the testability of the alternative! If we're afraid of winking out and being replaced by a clone with new qualia, we need some logic or premise to support the fear. Yet the fork contains all plausible substrates for personality and consciousness. Fear of teleporters leads to the fear of dying every instant of every day.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
You're asserting things out of faith: it's an untestable assertion, and yet you claim it is true. It's an irrational position. That one may feel inclined to believe this might well be the state of things is one thing; I accept that, and that you live by those beliefs is quite uncontroversial to me. That you are willing to kill yourself over that question (over and over) is not.

Your inference that we should fear "moment to moment" qualia states as much as (or more than! I mean, can you cut the edgelordiness here for a sec?) teleportation seems sensible only if you don't stop to realise that you have no choice in the "moment to moment" qualia matter. It is irrational to fear that problem because you cannot do anything about it. You cannot choose to remain in your previous state. However, you can choose whether or not to teleport yourself. The fact that you state there's no difference isn't proof that there isn't. It's mere belief.

Beliefs are fine. They drive things forward. I just have this knack for not betting my life on someone's whims about how exactly consciousness works and whether "it doesn't really matter if your original brain dies". And I think badly of anyone who does. Not too badly, though. Usually only intelligent, literate, and well-informed people reach these outrageous positions.

It's the untestability of this idea that gives rise to the horror of it all.

Or, alternatively, dispels the horror. Something that no one will ever know about or experience the consequences of can't be horrible, and blinking out of existence in a teleporter isn't any different.

So if you are going to die but won't know it anyway, it's no biggie. I mean, if this isn't the endgame of misanthropy and nihilism, I don't know what is.