Which of these two narratives is what really happens from your point of view?
That depends on which of the two me's I was. I can only be one entity.
That's right (as is everything in that post). Nothing you said in those two recent posts you made was wrong. You're just arguing a point that's been accepted and explored for pages now.
After the fork, you are two separate entities, both of them equally you. If one dies, he's dead.
No, it doesn't matter which one is the original. What matters is that each one has its own sense of self - and that sense of self dies with him. Which makes all of YOUR points moot.
I've already explained to you why this happens. Go back and reread the earlier posts.
Using this forking method, you can now extend retrospective existence indefinitely.
You're right. We've been telling you that you're right. But you're also equally wrong. I think you're finally beginning to ken what we're saying, though.
Copying creates more yous. The fact that you've now admitted that you can't tell copy from original, but that the copies themselves can, is precisely the point. Would you be worried about your own death if your identical twin died? No. So long as any given fork lives, you can think of it as your twins dying while the original lives. And I don't think that you, TrashMan, would be freaking out over your copies dying (you might be sad, but it's not you dying).
Thus, you are immortal.
Your problem is that you want an immortality strategy that minimizes deaths of all possible forks of you. Such a strategy is a trivial reduction of the forking strategy. One example would be moving your brain to a new, young body every so often.
But if you accept this as an immortality strategy that does not involve self-death (which you will), then you must also accept that the forking strategy is logically identical, only with the addition of some duplicates of yourself as 'waste products'. The catch is that any given duplicate will see all the others as those waste products.
You are multiplying yourself. And yes, if any one of those selves dies, it is dead. Nobody has contested that.
You can indeed die in this system; immortality by forking does not prevent you from dying. But it nonetheless satisfies these constraints:
A system whereby my consciousness can continue indefinitely, without fear of permanent extinction, and without any interruption more significant than normal sleep or unconsciousness, and in which all my memories, both implicit and explicit (thereby including skills, cognitive structures) are preserved, along with my neural structure and embodied cognitive elements.
What you're hung up on is divergence after the fork. As Kara has demonstrated, you have to explicitly state which point of view you are working from, just as in relativity you must specify a reference frame for anything to make sense. And as you've stated, you are two different people after the fork.
The issue is that the system appears to be time-symmetric. If you follow your point of view forward through a fork, you see no divergence: you sit down in a chair, get scanned, get up, leave, die.
But if you follow your point of view backwards from either of the two forks, you also meet no divergence. As the artificial fork, you remember sitting down in the chair and then waking up in a white room.
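The symmetry described above can be sketched as a toy model (the names and structure here are purely illustrative, not anything from the thread): a fork is just a copy of the current state, so both threads share every pre-fork memory, and each thread's history looks continuous and divergence-free when traced backward.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BrainState:
    """A toy stand-in for a mind: nothing but an ordered memory trace."""
    memories: List[str] = field(default_factory=list)

    def experience(self, event: str) -> None:
        self.memories.append(event)

def fork(state: BrainState) -> BrainState:
    # A fork copies the current state wholesale: both threads
    # share every pre-fork memory.
    return BrainState(memories=list(state.memories))

original = BrainState()
original.experience("sat down in the chair")
original.experience("was scanned")

copy = fork(original)

# Divergence only exists for an outside observer; from inside,
# each thread just keeps accumulating one continuous history.
original.experience("got up and left")
copy.experience("woke up in a white room")

print(original.memories)
print(copy.memories)
```

Both printed traces share the same pre-fork prefix, which is the whole point: looking backward, neither fork can find a seam where "someone else's" history begins.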
Making the leap to understanding this requires a tremendous effort and no little insight, as Ford Prefect said. It requires you to realize that you are not the same person you were five years ago in any physical sense, and that as long as two brain states are directly causally coupled, they must be meaningfully considered the same person (how else are we to say that we continue?).
Again, Trash, ponder this thought experiment: You have two brains in your head, held perfectly in synchrony by a link. Which one of them do you live in? What happens if one of them is killed? Do you notice anything?
On the broader level, consider: how is death different from simple unconsciousness? In what way are they meaningfully different? If the mind can be rebooted from pure wetware after a state of zero neural activity, then what is the mind aside from meat? (Nothing!)
All in all, I find it ironic that you're getting so worked up about this, when I imagine you would be completely fine with a live-update backup system with no divergence (a la Cylon resurrection). Would you be okay with resurrecting Cylon-style?
Bizarre, huh?
Interesting, but it doesn't totally answer my question. How do you know this stuff? I'm curious about your sources so I can learn more.
Uh, honestly, I've mostly just thought this out from first principles. Most of it (including the mind-boggling paradoxes that are tripping up Trash and even myself) can be constructed from physicalism and a purely physicalist answer to the mind/body problem. However, Kara might've read something good on the topic. I think Daniel Dennett is a physicalist philosopher.