Author Topic: The "hard problem of consciousness"  (Read 48638 times)


Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
You think that physicalism doesn't address consciousness. I think that physicalism addresses everything, and that nothing is nonphysical.

 
Re: The "hard problem of consciousness"
Thank you for making that clear.

Now for the rest of my paragraph: if you agree that existence and consciousness are brute facts, and you agree that these are the only brute facts, our definitions must coincide (though mine is more precise). Hence the brain cannot be consciousness, because the brain violates our definition: it is not a brute fact. This is simple logic.

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
Re: The "hard problem of consciousness"
Treating consciousness as "an emanation of the luminiferous aether" (which isn't an accurate description of dualism, but never mind) is better than not addressing consciousness at all. A god of the gaps argument would claim that physicalism doesn't currently explain consciousness, and hence consciousness is special. The actual situation is much worse: physicalism doesn't even address consciousness.

We haven't "introduced an acausal mechanism into the universe". The phenomenon is inescapable.

But you did! Dualism postulates that consciousness is something that cannot be completely described in terms of physical interactions. But since consciousness has undeniable physical side effects, consciousness has to appear acausal from the point of view of a purely physical examiner.
If I'm just aching this can't go on
I came from chasing dreams to feel alone
There must be changes, miss to feel strong
I really need life to touch me
--Evergrey, Where August Mourns

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
Luis, I'm curious whether you think a set of really verbose quantum field equations describing a human body would be conscious (if worked out, say, by hand in an arbitrarily large colony of scriveners).

I'll take your silly Chinese Room bait after you answer my points regarding murdering Fork B, kthnks.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
I agree that we can begin at 'I exist' as a reasonable starting point. But I don't think that's simple or useful logic. Let me quote a post from upthread...

Quote
I guess I don't see a way to get around this fundamental disconnect right now:

As far as I am concerned, there's no sense talking about anything that doesn't contribute to a single, coherent, unified explanation of everything. 'The only thing we can be sure of is that we're conscious' is something I can agree with — but what do we do with that?

We look around, observe the universe, search out causal logic, and if we eventually arrive at a causal model that begins with nothing and ends up explaining us, including our consciousness, we say 'this model is useful and predictive, and unlike any other, it seems to provide an account that explains everything we see. We thus consider it to be a model of the universe, of which we are a subsystem.'

You seem to say, 'we can do that, but when we're done we shrug and say, well, we might also have ghostly dualist voodoo which has no detectable effect, is unnecessary to explain anything, and is not suggested by anything except our own cultural traditions and desire to believe we're special...but we can't disprove that consciousness is special somehow...'

Those of you who would argue that physicalism will never explain qualia must contend with the fact that it already has. Physicalism says that we each have our own subjectivity for the same reason cameras take pictures from their own perspectives: that's what the machine does. It monitors itself, models itself and others, manipulates symbols in a workspace, applies global states like 'emotion' to modify function in response to adaptive challenges, and generally does a lot of stuff which requires it to have a 'this is me' concept. The brain needs to be able to model itself from someone else's perspective, and to integrate conflicting motor responses, and to do all kinds of **** which, it turns out, we experience as subjectivity. How else would we experience those things? Like the man inside the Chinese Room, blindly manipulating symbols? We're not the man. We're the room.

A brain is a meat machine. You build the machine, you build everything in the brain. We are in our brains. We are meat.

We have a consciousness. We can do a lot of stuff with it. One thing we can do is poke the environment around us and see what happens.

As we do this, we begin to detect causal logic in how the environment behaves. This leads us to the choice between solipsism and objective reality. Solipsism is not useful: it undercuts itself.

Once we have chosen objective reality, we must begin trying to build models of how it works. And when we build a model that, in the end, explains ourselves, when it contains only the necessary and sufficient factors to explain our own consciousness, then we have come full circle. We know what we are, where we come from, and what our minds do. We have demoted consciousness from brute fact to a mundane subsystem of a universe built out of quantum fields, and we know that our own illusory certainty that consciousness comes first is only that: an illusion.

We are the laser told to search for the darkness. Wherever we look, we see qualia, so we assume that qualia have primacy. But consciousness comes last.

This is physicalism. It is the only account of consciousness with any value. It tells us why we have qualia.

Consciousness is a calculation conducted by the human brain. Consciousness is the brain. This is the only logic.

Luis, I'm curious whether you think a set of really verbose quantum field equations describing a human body would be conscious (if worked out, say, by hand in an arbitrarily large colony of scriveners).

I'll take your silly Chinese Room bait after you answer my points regarding murdering Fork B, kthnks.

It's not bait, I'm just curious. I know my answer for sure. Didn't we answer all the fork questions pages back?

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
I don't think I've ever got a sufficiently interesting response to this comment: http://www.hard-light.net/forums/index.php?topic=90258.msg1799328#msg1799328

e: Regarding the Chinese Room thing, I have to reiterate my attitude regarding these things, which is to say that I have no idea how it works. Now, don't confuse stuff. To say I don't know exactly how the "Chinese Room" can produce consciousness is not the same as saying that I don't know if you can simply transfer this Chinese Room to another office and still call it the "same" consciousness, without dying. IOW, even if it all seems to fit engineering-wise, from the self point of view, I have no way to know whether the pre-teleported person is just going to die, full stop, or not.

These are different questions. For the sake of the discussion I'm willing to accept we are all "Chinese Rooms", but with a design which utterly escapes us. Just yesterday, for instance, I learnt that a well-connected neuron can have fifteen thousand synapses. *One* neuron. IDK, the scales are somewhat incredible.
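Just to put a rough number on those scales, here's a back-of-the-envelope multiplication. Both inputs are assumptions: ~86 billion neurons is the commonly cited estimate for a human brain, and 15,000 synapses describes a well-connected neuron rather than an average one, so this is an upper-end sketch, not a measurement:

```python
# Rough upper bound on total synapse count in a human brain.
# Assumed figures: ~86 billion neurons (commonly cited estimate),
# 15,000 synapses per neuron (a well-connected one, so this
# overestimates the true total).
neurons = 86e9
synapses_per_neuron = 15_000

total_synapses = neurons * synapses_per_neuron
print(f"{total_synapses:.2e}")  # prints 1.29e+15
```

Even as an overestimate, that lands on the order of a quadrillion connections, which is the sense in which the scales are incredible.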
« Last Edit: October 19, 2015, 09:53:02 am by Luis Dias »

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Yeah, that's good ****. Okay —

Quote
A) Can the destruction of "Fork B" be called "murder"? It seems that it can. If this person is not exterminated, he can continue existing in the world; if he is exterminated, the Cosmos is accounting for one less Consciousness in it. There's blood on the ground, there's a killer, there is energy discharged to destroy this fork. I don't see how it is not murder;

 B) Is the murder of "Fork B" dependent on the pre-knowledge of Fork B that he is going to die? That is, does the determination that what happens to him is murder depend on the words the scan operator tells you? If the scan operator silently kills you, does that stop being murder? Clearly, that is ridiculous. People don't get out of jail sentences for silently killing people.

 C) Is the murder of "Fork B" dependent on the speed of his death? If he is immediately killed after the scan, can we say that the operator is therefore innocent of his actions? This is absurd: if he does not kill the Fork, the Fork lives as the laws of physics allow him to. It is the action by the operator that causes the extermination of the Fork. The speed at which he does this is irrelevant: he could wait an hour, he could wait a minute, he could wait a second, he could wait a microsecond. The murder is murder nevertheless; a Consciousness *has* been eradicated nevertheless.

 D) Something has been hinted about the "suffering" of Fork B. Oh, the humanity, so concerned with the "suffering". It's a total strawman. You can easily depict a scenario where this person was given a drug before being scanned that prevented his psychological suffering. The administration of this drug does not absolve anyone of the crime of murdering him.

A) Sure, I guess you could call it murder, but I think that's because our concepts of 'death' and 'murder' aren't adapted to the realities of what we are — not single continuous entities with persistent existence, but a mind-state that changes every instant. I'd call this eradication of Fork B the same as day-to-day life (my favorite hobby horse). Your mind state yesterday could've gone on living in any number of ways. But it was overwritten, it was lost, and only one way survived. What about the other ways it could have gone? They were lost.

I don't think you'll find this satisfactory: Fork B has no causal descendants if vaporized after Fork A diverges. If you find that distasteful, it's pretty simple to ensure that Fork A only comes into existence after Fork B is gone. But to be honest, I simply don't think that the loss of a couple milliseconds (or even seconds) of existence is a major loss, or constitutes subjectivity death. I would use that 'roll your brain back ten seconds' machine with only a very little thrill of worry.

In my mind, our qualia and subjectivity emerge from the physical brain. Consciousness follows mechanism. If we wind the mechanism back, if we get drunk, if we take a head blow or get viral encephalitis, even if we get vaporized and then rebuilt from a couple seconds ago, we change. But we don't die.

B and C) This is a good question, and really interesting. I think it probes at the inadequacies of current ethics. If we do not tell the scan target he's about to be scanned and duplicated, no distress will occur, even if we wait ten seconds to vaporize one fork. The pre-scan individual has survived. One of the post-scan forks has died. I'll get into whether this is murder after I handle speed.

As for speed: it's all a matter of what you think about divergence. I'm not worried about the fact that the fork could continue to exist, because your mind state yesterday might have continued to exist in any number of ways but it only got one. But I am, like you, uncomfortable with the idea of letting a fork diverge and then killing it.

Here is my cold-blooded fork answer: the person who steps into the teleporter can be absolutely assured that they will live. If one fork persists for long enough between the scan and the vaporization to experience subjectivity, then any divergence they undergo will be irretrievably lost. I'd probably be okay with this, but I think many wouldn't, and for that reason I think the teleporter should be built to avoid it. How long is too long? I don't know. You'd argue that any time is too long. You might be right.

D) Why not be concerned with suffering? If I am tranquilized and rendered unconscious before I go into the teleporter, I'm honestly...pretty okay with that, even if one of my surviving forks is vaporized hours later (as long as I don't wake up). All I'm scared of is the subjectivity of knowing that I'm about to die and leave no causal descendants.

Imagine that I fall asleep one night and develop an alternate personality in my dreams. I live a hundred years in dreamland. However, none of these experiences make it out of my short-term memory buffer, and when I wake up they have no causal effect on me. I never know I had this other life. From the post-dream fork's perspective, what was lost? Nothing. From the dream fork's perspective, what was lost? They lived, they died, but they never thought 'oh no, I am about to be murdered.'

What we are afraid of, all of us, is not the actual fact of stopping, failing to propagate our brainstate forward. Whatever. That's not inherently frightening. What we are afraid of is creating forks who have to live with this knowledge, right? Or even creating forks who have experiences that will be lost, even if they don't know they're about to die. We don't want to be that person because we will definitely be that person, even as we are also definitely going to be the other fork.

If I'm not conscious when I'm teleported/scanned/whatever, I can be sure there are no qualia getting lost. When my qualia reboot, they'll be rebooting from only my surviving fork. I'm cool with that.

Is that too freaky and posthuman to be sympathetic? I can understand how it'd be freaky and posthuman and disturbing. But I think it's an unintuitive but inevitable consequence of looking at this rigorously.
« Last Edit: October 19, 2015, 10:19:06 am by General Battuta »

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Some of that post might not be totally nailed down to rigor; I might have to cogitate on it a bit more. Those were great questions, Luis, thank you.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
I really don't expect anyone to agree with me on 'it's cool to be tele-copied and then vaporized, as long as you're asleep', and you are probably right to disagree; I gotta chew on that one. It violates a lot of my own firmly held monist principles.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
So, just to clarify two things:

1. My issue with "suffering" as not mattering is not to say that I don't care about suffering as a thing, rather that I don't care about it as an argument.

2. My thing about the whole "brain creates consciousness, so it's fine if we build a new brain with the exact same brain state elsewhere", is not that I'm "worried" about forks or what happens in the meanwhile or whatever. What worries me is that I just end. That is, there is nothing in Physicalism that guarantees that my own, personal experiencing of living will continue in a very similar brain elsewhere. I don't give a rat's ass if that brain is exactly like mine, behaving like I do, having a consciousness equal to mine.

My problem is simpler: is it me doing the travelling, or is it me giving birth to a clone at the place I bought a ticket to?

Physicalism - as you seemingly define it - treats both stories as one and the same, while I think the difference is between experiencing death and blankness forever (while giving birth to a clone) and actually travelling to the other side of the wall.


Physicalism tells me there's no real difference between my own conscious experience and anyone else's. Except that I'm stuck in mine. Everyone's stuck in their own. And this stuckness is ill-defined as yet. We still do not understand it very well. We might say "Oh, but you see, it's all due to what is connected to your synapses and so on", ok, sure, it's still very unclear.

So my point is, if I'm stuck in my Consciousness, and I'm the Person who is being scanned and killed, then it logically follows that my Consciousness is stuck to die. The only technological miracle independent of this basic murder story is that, somehow, through a process, a new Consciousness will be born out of an identical Chinese Room in another room.

From the point of view of that guy who is just 2 seconds old, it's all good. It's been fun! He just travelled thousands of whatever, closer to his goals. He will even gladly pay what he owes and shake the hands of the operators. And when he comes back again in a week, his life span will have been merely a week.

 
Re: The "hard problem of consciousness"
Treating consciousness as "an emanation of the luminiferous aether" (which isn't an accurate description of dualism, but never mind) is better than not addressing consciousness at all. A god of the gaps argument would claim that physicalism doesn't currently explain consciousness, and hence consciousness is special. The actual situation is much worse: physicalism doesn't even address consciousness.

We haven't "introduced an acausal mechanism into the universe". The phenomenon is inescapable.

But you did! Dualism postulates that consciousness is something that cannot be completely described in terms of physical interactions. But since consciousness has undeniable physical side effects, consciousness has to appear acausal from the point of view of a purely physical examiner.

Yes, dualism postulates that consciousness cannot be completely described in terms of physical interactions. (In fact, I would argue that this is a logical necessity.) It does not, however, "postulate" the existence of consciousness: this is a brute fact. I think our disagreement is purely terminological.
----------
@Battuta: You're still claiming that consciousness is the brain, ignoring my counterargument.

Now for the rest of my paragraph: if you agree that existence and consciousness are brute facts, and you agree that these are the only brute facts, our definitions must coincide (though mine is more precise). Hence the brain cannot be consciousness, because the brain violates our definition: it is not a brute fact. This is simple logic.

Here's what I think is going on. You observe that changes in consciousness are always accompanied by changes in the brain. You then jump to the conclusion that consciousness is the brain (which is certainly tempting, because it "explains away" the hard problem). But this you cannot do, because the brain violates our agreed-upon definition of consciousness.

In short, you're trying to fit a square peg into a round hole. This is exactly what I mean by first-level and third-level knowledge.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
No, I've explained how we get from consciousness as brute fact to consciousness as physical.

Quote
We have a consciousness. We can do a lot of stuff with it. One thing we can do is poke the environment around us and see what happens.

As we do this, we begin to detect causal logic in how the environment behaves. This leads us to the choice between solipsism and objective reality. Solipsism is not useful: it undercuts itself.

Once we have chosen objective reality, we must begin trying to build models of how it works. And when we build a model that, in the end, explains ourselves, when it contains only the necessary and sufficient factors to explain our own consciousness, then we have come full circle. We know what we are, where we come from, and what our minds do. We have demoted consciousness from brute fact to a mundane subsystem of a universe built out of quantum fields, and we know that our own illusory certainty that consciousness comes first is only that: an illusion.

We are the laser told to search for the darkness. Wherever we look, we see qualia, so we assume that qualia have primacy. But consciousness comes last.

This is physicalism. It is the only account of consciousness with any value. It tells us why we have qualia.

Consciousness is a calculation conducted by the human brain. Consciousness is the brain. This is the only logic.

This is not jumping to a conclusion, unless you call the entire development of human science 'jumping to conclusions'. Rather, it is using our initial 'brute fact' definition to explore the possibility space for more information, locating one thread that works, and using it to update our definition of consciousness.

We have discovered that our peg is, in fact, square. It was never round at all.

Quote
2. My thing about the whole "brain creates consciousness, so it's fine if we build a new brain with the exact same brain state elsewhere", is not that I'm "worried" about forks or what happens in the meanwhile or whatever. What worries me is that I just end. That is, there is nothing in Physicalism that guarantees that my own, personal experiencing of living will continue in a very similar brain elsewhere. I don't give a rat's ass if that brain is exactly like mine, behaving like I do, having a consciousness equal to mine.

My problem is simpler: Is it me doing the travelling or is it me giving birth to a clone to the place I bought a ticket into?

Physicalism - as you seemingly define it - treats both stories as one and the same, while I think the difference is between experiencing death and blankness forever (while giving birth to a clone) and actually travelling to the other side of the wall.

Physicalism tells me there's no real difference between my own conscious experience and anyone else's. Except that I'm stuck in mine. Everyone's stuck in their own. And this stuckness is ill-defined as yet. We still do not understand it very well. We might say "Oh, but you see, it's all due to what is connected to your synapses and so on", ok, sure, it's still very unclear.

So my point is, if I'm stuck in my Consciousness, and I'm the Person who is being scanned and killed, then it logically follows that my Consciousness is stuck to die. The only technological miracle independent of this basic murder story is that, somehow, through a process, a new Consciousness will be born out of an identical Chinese Room in another room.

From the point of view of that guy who is just 2 seconds old, it's all good. It's been fun! He just travelled thousands of whatever, closer to his goals. He will even gladly pay what he owes and shake the hands of the operators. And when he comes back again in a week, his life span will have been merely a week.

Yeah, I get you. But I am making an end run around this whole problem. I think that our 'stuckness' is simply a story we tell ourselves because we are never forced to confront what we really are: the epiphenomenal experience of being a material brain, one Planck instant at a time. We actually aren't stuck. We are endlessly teleporting: giving birth to a clone who travels forward one tick.

I believe that everything we are, including our own subjective experience of being me, having been me since I was born — in short, our credentials, our qualia — is physical. If we rebuild the physical we rebuild that subjective experience.

This is why it is important to remember that we can only claim continuous identity retrospectively. We can say 'tomorrow I will', but we cannot remember ourselves doing it. We are only planning. Our credentials haven't been established yet.

To your worry about the man who takes a teleporter trip and lives only a week, I would say that he lives far longer than the man who sleeps, and lives only a day. He lives longer than the man who gets blackout drunk, and exists as a drunken and transient bubble of joy and vomit for only half an hour before he passes out.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
We are not one core identity moving forward through time. We're more like a reel of film. Pull out one instantaneous frame and let it pop, like a hologram, and from it emerge all the qualia and sense of me-ness that existed at that moment. In that frame is consciousness. But the consciousness is within the frame, not vice versa: it emerges only from the physical states of the brain.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
Yeah, I get you. But I am making an end run around this whole problem. I think that our 'stuckness' is simply a story we tell ourselves because we are never forced to confront what we really are: the epiphenomenal experience of being a material brain, one Planck instant at a time. We actually aren't stuck. We are endlessly teleporting: giving birth to a clone who travels forward one tick.

Well, that's fine. That's a good story. I never said it wasn't a coherent story. I'd gladly read that novel and have a real blast with it (perhaps I've done so in all the Star Trek series so far). What I said was, you have no way to test whether that story is true. It is, therefore, not physics. It's not science. At most, it's metaphysics, and a tad in need of rigorous checks. It does seem to run into basic Aristotelian roadblocks of essences and forms, for instance.

The testability is fundamental here. Look at your next paragraphs:

Quote
I believe that everything we are, including our own subjective experience of being me, having been me since I was born — in short, our credentials, our qualia — is physical. If we rebuild the physical we rebuild that subjective experience.

This is why it is important to remember that we can only claim continuous identity retrospectively. We can say 'tomorrow I will', but we cannot remember ourselves doing it. We are only planning. Our credentials haven't been established yet.

To your worry about the man who takes a teleporter trip and lives only a week, I would say that he lives far longer than the man who sleeps, and lives only a day. He lives longer than the man who gets blackout drunk, and exists as a drunken and transient bubble of joy and vomit for only half an hour before he passes out.

I like how they are phrased now. They espouse your beliefs. But consider this: when you speak of how such a person "will live longer than the man who gets blackout drunk" and so on, I have the sensation that I'm not reading anything really rational or scientific now, merely poetical. As an analogy, it's like reading someone saying "I'm going to live forever through my sons and daughters; I'll live forever through my work". At the end of the day, if I'm to decide whether to teleport myself or not, I will ponder those things poetically only after considering the true existence of my stuck consciousness in that world, not all of these poetical things. Like Woody Allen said:

“I don't want to achieve immortality through my work; I want to achieve immortality through not dying. I don't want to live on in the hearts of my countrymen; I want to live on in my apartment.”

Substitute "clones" for "work", and "teleported copies of my own" for "hearts of my countrymen".

 

Offline Scotty

  • 1.21 gigawatts!
  • 211
  • Guns, guns, guns.
Re: The "hard problem of consciousness"
GhylTarvoke: In the metaphor of the button and the circuitry, the brain is not the button and consciousness is not the circuitry.  Qualia, experiences, are the button.  The brain is the circuitry.  Consciousness is the outcome.

I've noticed that this is something you've continually misapplied as an argument in favor of there being something else.

 
Re: The "hard problem of consciousness"
That's all very nice, and completely dodges the question. I'm not sure how to say it more plainly, but I'll try.
----------
Assumptions
1. "Existence", "consciousness", and "brain" are meaningful words.
2. Whatever they are, existence and (the existence of) consciousness are brute facts.
3. The existence of brains is not a brute fact.
4. "Bruteness" is a property.
5. If two things do not have the same properties, then they are not the same.

If you're denying 1 or 2 (and I'm pretty sure you're not), we have reached an impasse.
If you're using the standard definition of "brain", 3 is clear. We may be digital simulations, and hence have no brains.
You've referred to brute facts repeatedly, so I'm pretty sure you're not denying 4.
If you're denying 5, you must be using an extremely nonstandard definition of "sameness".

Claim: With these assumptions, consciousness cannot be the brain.
Proof:
By 1, the words "existence", "consciousness", and "brain" are meaningful, and we can use them.
By 2, consciousness is a brute fact. By 3, the existence of brains is not.
By 4, bruteness is a property. Hence consciousness has a property (bruteness) that the brain does not.
By 5, this implies that consciousness and the brain are not the same.
----------
The proof is logically valid, so if you disagree with the conclusion, you must disagree with one of the premises. I can't figure out which one you disagree with.
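For what it's worth, the skeleton of the proof is just the contrapositive of Leibniz's law (the indiscernibility of identicals). A one-line sketch in Lean 4 notation, taking assumptions 4 and 5 at face value by treating "bruteness" as an ordinary predicate:

```lean
-- If consciousness c has a property (Brute) that the brain b lacks,
-- then c and b cannot be identical. `h ▸ hc` rewrites `Brute c`
-- along the hypothesized equality `h : c = b`, contradicting `hb`.
example {α : Type} (Brute : α → Prop) (c b : α)
    (hc : Brute c) (hb : ¬ Brute b) : c ≠ b :=
  fun h : c = b => hb (h ▸ hc)
```

The heavy lifting is all in the premises: this only goes through if bruteness really is a property of the thing itself rather than of our knowledge of it, which is exactly what's in dispute.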

EDIT: This is in response to Battuta.

 

Offline Luis Dias

  • 211
Re: The "hard problem of consciousness"
Battuta is not equating Consciousness to Brains prima facie. He's concluding that Consciousness = Brains through several findings, both a priori and empirical.

I don't agree with the sentence "Consciousness = Brain", but I think he was just sacrificing rigor for brevity there. The point he's making is that Consciousness is a product of the Brain; it's a physical process that happens within the Brain.

 

Offline Scotty

  • 1.21 gigawatts!
  • 211
  • Guns, guns, guns.
Re: The "hard problem of consciousness"
Indeed.  A brain is not a consciousness.  A brain is a mechanism for interpreting experiences and producing consciousness from that data.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Well, that's fine. That's a good story. I never said it wasn't a coherent story. I'd gladly read that novel and have a real blast with it (perhaps I've done so in all the Star Trek series so far). What I said was, you have no way to test whether that story is true. It is, therefore, not physics. It's not science. At most, it's metaphysics, and a tad in need of rigorous checks. It does seem to run into basic Aristotelian roadblocks of essences and forms, for instance.

The testability is fundamental here. Look at your next paragraphs:

I like how they are phrased now. They espouse your beliefs. But consider this: when you speak of how such a person "will live longer than the man who gets blackout drunk" and so on, I have the sensation that I'm not reading anything really rational or scientific now, merely poetical. As an analogy, it's like reading someone saying "I'm going to live forever through my sons and daughters; I'll live forever through my work". At the end of the day, if I'm to decide whether to teleport myself or not, I will ponder those things poetically only after considering the true existence of my stuck consciousness in that world, not all of these poetical things. Like Woody Allen said:

“I don't want to achieve immortality through my work; I want to achieve immortality through not dying. I don't want to live on in the hearts of my countrymen; I want to live on in my apartment.”

Substitute "clones" for "work", and "teleported copies of my own" for "hearts of my countrymen".

I think these things are as testable as whether we'll still be ourselves tomorrow, which is the only criterion we need. If I sound poetical, it's because our ideas of 'self', 'dying', 'consciousness' and so on are just poetry — dressed-up terms to disguise the fact that we're reels of film.

I do not share your fear that living on through teleporters is like living on through your work. If you rebuild the object, you rebuild the subject. This is not just an untestable article of faith but the default conclusion of every single piece of evidence we have about the universe. I do not see any risk to the philosophical teleporter.

That's all very nice, and completely dodges the question. I'm not sure how to say it more plainly, but I'll try.
----------
Assumptions
1. "Existence", "consciousness", and "brain" are meaningful words.
2. Whatever they are, existence and (the existence of) consciousness are brute facts.
3. The existence of brains is not a brute fact.
4. "Bruteness" is a property.
5. If two things do not have the same properties, then they are not the same.

If you're denying 1 or 2 (and I'm pretty sure you're not), we have reached an impasse.
If you're using the standard definition of "brain", 3 is clear. We may be digital simulations, and hence have no brains.
You've referred to brute facts repeatedly, so I'm pretty sure you're not denying 4.
If you're denying 5, you must be using an extremely nonstandard definition of "sameness".

Claim: With these assumptions, consciousness cannot be the brain.
Proof:
By 1, the words "existence", "consciousness", and "brain" are meaningful, and we can use them.
By 2, consciousness is a brute fact. By 3, the existence of brains is not.
By 4, bruteness is a property. Hence consciousness has a property (bruteness) that the brain does not.
By 5, this implies that consciousness and the brain are not the same.
----------
The proof is logically valid, so if you disagree with the conclusion, you must disagree with one of the premises. I can't figure out which one you disagree with.

EDIT: This is in response to Battuta.

It answers the question head on. We begin with 1, proceed to 2; between 2 and 3 we conduct a search for systems of logic that use 1 and 2 to explain our perceptions. We stumble on mathematics, physics, and all their consequences: the belief in an objective reality that obeys causal logic. We reach 3 knowing that brains are consciousness: the existence of brains is the same as the existence of consciousness. We are our brains, and whatever we are is material. Any brute facts of our existence are material. The entire universe and all its rules are physical. Failing to accept this sends us back to the search between 2 and 3, which we repeat, finding no better (necessary and sufficient) model to explain our own existence.

qed~

e: seeing Luis and Scotty's posts I will happily say that consciousness is a material process within the brain, a function. It's true that the whole brain isn't devoted to consciousness. All consciousnesses are in brains.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: The "hard problem of consciousness"
Even if we're digital simulations, we still have brains: the simulation is computing us as little blobs of flesh. Brains are as brutally factual (:megadeth:) as consciousness.