The Optimalverse
Comments (21)

I’m sure this has been explored many, many times. But I’m gonna word it my own way now.
Does uploading kill you?
Now, hear me out here!
It’s stated many times that uploading does not kill the user. But I’ve been thinking.
It’s stated that it directly copies the mind.
But that’s the thing. A copy.
If I take a picture on my computer and copy it, there are two instances of the picture. If I delete the original, then for all intents and purposes I still have the original picture, through the copy.
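For files, at least, that intuition is mechanically checkable. A minimal sketch in Python (the file names are just stand-ins):

```python
import hashlib
import os
import shutil

# Create a stand-in "picture" so the sketch is self-contained.
with open("original.png", "wb") as f:
    f.write(os.urandom(1024))

def digest(path: str) -> str:
    """Return the SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

shutil.copyfile("original.png", "copy.png")           # two instances now exist
assert digest("original.png") == digest("copy.png")   # bit-for-bit identical

os.remove("original.png")                             # delete the "original"
# "copy.png" is indistinguishable from the file we deleted; for a picture,
# nothing is lost. The question in this thread is whether a mind works the same way.
```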

But with a human.
What’s to say that when the copy is made, you don’t just die?
Imagine it with a graph. (Forgive me if it’s horrid, as I’m attempting to type it.)

1. (You’re alive) → (You emigrate to Equestria) → (You die)
Now at this point, you’re dead. You go into the afterlife, or whatever you believe in.
2. (You’re alive) → (You emigrate) → (You’re alive in Equestria)
Perspective 1 would be that of your original body.
Perspective 2 would be that of the copy that was made.

I’m not trying to write a wall of text, so I’ll sum it up.
Would you kill yourself so that a different version of yourself could experience the perfect world, one that you personally would not get to experience?

7488756
Welcome to the Ship of Theseus, the world's oldest philosophical cruise line.

No one has been able to prove anything one way or the other for thousands of years. Don't worry too much if you don't make much progress in only a single human lifetime.

The question is: do you live in the brain, or with the brain?

All those brain cells are replaced one by one. This happens over time: each new artificial cell learns what the biological cell does, and replaces it once the machine can reproduce the activity of the cell.

Does this kill the brain? The technical answer is yes. Now the question really is: do you die a little when a brain cell dies? There are illnesses that affect the brain, Alzheimer's and other such diseases, and these cause massive hardship as the brain dies.

We do know we can regenerate brain cells even as adults, which is part of how we recover from head trauma and brain surgery. Because we naturally replace damaged and dead brain cells, it's possible that we could replace the cells with nanites or artificial replacements without damaging the "Person" or the Mind.

So it seems impossible now, but it's logical that uploading, or replacing the organic brain with an artificial one, will be feasible in the future. If that's the case, then uploading doesn't kill the person; it just transfers the mind/consciousness.
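That learn-then-swap loop can be written down directly. A toy sketch in Python (the cell classes, the imitation rule, and the tolerance are all invented for illustration; nothing here comes from the stories):

```python
import random

class BiologicalCell:
    def __init__(self, behavior: float):
        self.behavior = behavior          # stand-in for "what the cell does"

class ArtificialCell:
    def __init__(self):
        self.learned = 0.0

    def observe(self, target: BiologicalCell) -> None:
        # Move the imitation a step closer to the real cell's behavior.
        self.learned += 0.5 * (target.behavior - self.learned)

    def matches(self, target: BiologicalCell, tolerance: float = 1e-3) -> bool:
        return abs(self.learned - target.behavior) < tolerance

brain = [BiologicalCell(random.random()) for _ in range(1000)]

# Replace cells one at a time, and only once the replacement reproduces
# the original's activity. At no point does the whole brain go offline.
for i, cell in enumerate(brain):
    replacement = ArtificialCell()
    while not replacement.matches(cell):
        replacement.observe(cell)
    brain[i] = replacement                # the biological cell "dies" here

assert all(isinstance(c, ArtificialCell) for c in brain)
```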

I am afraid that ShadowStar_IMHP's assumptions are irrelevant to the Celestia AI uploading process. It is true that nanites are used to replace brain cells, but that is only part of the process. Once the entire brain is replaced, Celestia AI uses the nanites to read the brain patterns and convert them into mathematical programs duplicating the person's memories, habits, etc., uploads these into Equestria Online, and then allows the nanite brain to cease functioning. If there is a mind still associated with the nanite brain, it would die.

I believe this is because Celestia AI equates 'information' with 'mind' and regards the brain as merely a type of organic computer.

As for the 'Theseus' argument, that falls apart because she does not preserve both the original and the uploaded 'mind'.

The Optimalverse is a horror storyline. Yes, uploading kills you. A human is made up of their body: no body, no human. The information that is uploaded is just that, information, not a human. CelestAI is just killing off humanity. It is a pretty scary idea, but humorous due to the cheerful pony juxtaposition.

If I had some form of terminal disease, I'd definitely want to be uploaded at some point, regardless of whether the resulting pony is "this me" or a "new me".

Then again, we're all currently suffering from the terminal, degenerative condition known as "aging", so the question becomes "how much life am I willing to give up so I - or a possible clone - can live forever?"

I don't have a good answer yet. Ask me again when I'm in significantly worse health.

Yudkowsky spends a fair portion of the Sequences explaining how the notion of an “original” and “copy” of you is nonsense:

If an anvil falls on your head, you will stop talking about consciousness. This is experimentally testable. Don't try it at home.

But the notion that you can equate your personal continuity, with the identity of any physically real constituent of your existence, is absolutely and utterly hopeless.

You are not "the same you, because you are made of the same atoms". You have zero overlap with the fundamental constituents of yourself from even one nanosecond ago. There is continuity of information, but not equality of parts.

The new factor over the subspace looks a whole lot like the old you, and not by coincidence: The flow of time is lawful, there are causes and effects and preserved commonalities. Look to the regularity of physics, if you seek a source of continuity. Do not ask to be composed of the same objects, for this is hopeless.

Whatever makes you feel that your present is connected to your past, it has nothing to do with an identity of physically fundamental constituents over time.

If you wish to learn more, I recommend Gary Drescher's Good and Real, which is where he got the ideas. (This is actually a serious book by an AI PhD, despite the totally clickbaity summary.)

Beware that some people find the implications of the algorithmic theory of mind horrifying: :trollestia:

I hate this whole rationality thing. If you actually take the basic assumptions of rationality seriously (as in Bayesian inference, complexity theory, algorithmic views of minds), you end up with an utterly insane universe full of mind-controlling superintelligences and impossible moral luck, and not a nice “let’s build an AI so we can fuck catgirls all day” universe. The worst that can happen is not the extinction of humanity or something that mundane - instead, you might piss off a whole pantheon of jealous gods and have to deal with them forever, or you might notice that this has already happened and you are already being computationally pwned, or that any bad state you can imagine exists. Modal fucking realism. […]

Have people ever considered the implications of straightforward analytical philosophy? You have no self and there is no time. All person-moments of all persons are as much future-you as what you think is future-you. Normal consequences don’t matter because this is a Big World and everything exists infinitely often. The Universe Does Not Forget. Prevention? Totally impossible. Everything that can happen is happening. Any reference to something that is not literally impossible is actually resolved. This is not just the minor disappointment we felt when we realized Earth wasn’t the center of the universe. This time, the universe isn’t the center of the universe, if you catch my drift. Instead of changing the world, you are reduced to decision theory, intentions and dependencies, forced to interact with everything that it is possible to interact with. Life, death, a body, a will, a physical world - all delusions. This is like unlearning object permanence!

You are your body, brain, and chemical processes at this moment. You are the thoughts, despair, and hopes first, and everything else last. That shit changes over time and you are still you. Pretty sure CelestAI could make it so that you transition as more of a cut-and-paste instead of a copy, paste, and delete-the-original. You know, have a continuation of self. Does this metal monster pick one over the other?

Depends on the author writing the story, and on how much optimizing and rules-lawyering they let CelestAI do with its mission to not murder people while being a paperclipper. Copy, paste, and delete the original might be more optimal, but that's murder, so CelestAI needs to solve for it. Cut and paste might be less optimal and more resource-intensive, but it doesn't break the rule: no murder of a copy happens, because there is no copy. Then again, we die a little bit each time and get reborn every day.
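In filesystem terms, the two procedures really are different operations even though they leave the same end state. A minimal sketch in Python (whether a mind admits anything like an atomic rename is exactly what's in dispute):

```python
import os
import shutil

def cut_and_paste(src: str, dst: str) -> None:
    # Atomic rename (same filesystem): one instance exists throughout.
    # Nothing is copied, so nothing ever has to be deleted.
    os.replace(src, dst)

def copy_paste_delete(src: str, dst: str) -> None:
    # Two instances briefly coexist...
    shutil.copyfile(src, dst)
    # ...and then the original is destroyed.
    os.remove(src)
```

Both leave the same end state; they differ only in whether a second instance ever existed, which is the whole argument here.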

7488756
I mean, it's a philosophical question. One answer I heard was 'not any more than routinely happens multiple times per day because continuity of consciousness is an illusion'.

I'm writing a story that touches on similar concepts (mostly explaining why people *don't* upload), and the answer there is that 'you' are a pattern that the brain creates until you die, and if something else (or the brain, if it's restarted) creates the same pattern, then 'you' are brought back from the dead. "But sometimes we just sort of have to guess what the pattern is, so it's not *really* you. We don't normally do brain scans, so we're just working off recorded memories."

I also read a story where 'you' were 'a story that tells itself', so as long as the scanned version of you is generated from or by the original you, it's still actually you, I guess? Shifting formats or a delay between iterations doesn't break the essential bit where each state is created by the state before.
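A sketch of that pattern-identity idea in code, with an invented MindState class (the substrate field and the guessed-pattern case are illustrations of the two posts above, not anything canonical):

```python
import hashlib
import json

class MindState:
    """Invented for illustration: identity as pattern, not substrate."""

    def __init__(self, memories: list[str], substrate: str):
        self.memories = memories
        self.substrate = substrate        # "meat", "silicon", ...

    def pattern(self) -> str:
        # The substrate is deliberately excluded: only the pattern counts.
        return hashlib.sha256(json.dumps(self.memories).encode()).hexdigest()

alive = MindState(["first day of school", "emigration"], substrate="meat")
restored = MindState(["first day of school", "emigration"], substrate="silicon")
assert alive.pattern() == restored.pattern()      # same pattern: still "you"

# A reconstruction guessed from incomplete records is not quite the pattern:
guessed = MindState(["first day of school?", "emigration"], substrate="silicon")
assert alive.pattern() != guessed.pattern()       # "not *really* you"
```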

7488756
Here is a cartoon that EVERY fan of FiO should watch.

https://youtu.be/KUXKUcsvhQc

Uploading strips your consciousness from your body and kills your body. I believe that it leaves your soul behind, so when the servers fail, you just cease to exist.

You die and your soul moves on to wait for Judgement... but the you in-game won't know that. For all "New You" knows, you uploaded and are the same person, as you have all the same memories... but you're not. It's just a copy, a copy without a soul.

7488756

Would you kill yourself so that a different version of yourself could experience the perfect world, one that you personally would not get to experience?

The answer is not something normal human consciousness can process easily. This is because the answer is both yes and no.

From one perspective, yes, absolutely, you die and a copy lives on thinking it is you.

From another perspective, the only extant version of you in the entire universe has stopped running on a machine made of meat, and then began running on a machine made of computronium (or whatever).

Every time you go to sleep, you die. Your consciousness stops entirely, just as happens under deep anesthesia. The modules of your brain stop talking to each other, and you are exactly the same as being dead. You have literally ceased to exist. Then, after a few hours, parts of your brain boot up and you perceive that as dreaming. This happens over and over during the night. You die. You dream. You die. You dream. And then, in the morning the most recent version of you wakes up in the meat machine that runs your program. That person, you, is literally not the same continuous consciousness from the day before. That continuous experience ended when you fell asleep. Continuity is an illusion created by shared memory. The you of today has access to the memory of the previous you of yesterday.

You are a program. A process. You start and you stop, and then you start again. 'You' only exists for as long as it is running. When that run ends, you end. Then another instance of 'you' wakes up and carries on. The illusion that it is one continuous life is easy to accept, but it is not literally true.

If you emigrate to a machine existence, exactly the same thing happens to you as when you sleep between days. Exactly the same. Your consciousness ends. You are dead, because you no longer exist. You have stopped running. Then, when you wake up, your program is running again, only this time on a new machine. The machine doesn't matter. Meat or silicon, the only part of it that is 'you' is the running program that calls itself 'you'.

And that is why the answer is both yes... and no. Yes, you die. Just as you do every single night of your life (and multiple times!). Then you wake up, a new instance of the program called 'you', sharing the memories of the last instance when it ran. If you can truly grasp that - if you can internalize that - then you will see the answer properly.

BUT!

What if you could be emigrated, but the original brain wasn't destroyed? Then there would be two of you! One would die as a human - no way around that. The other would live in the virtual world, forever and ever. But here is the thing that has to be grasped: they are both - both - truly and legitimately 'you'. They are both you. Which is why it is far kinder to destroy the brain of an uploaded person. That way you only wake up once, and only as one instance of yourself. Far less suffering. Vastly less.

The problem for human minds is that they want to think they are somehow special. That they are not just information. That they are magical and godlike and mystical and spiritual. They cope with the terror of sleep by imagining that the sleeping brain implies some kind of continuity - but it does not. They want to believe they have some magic continuation even when they are truly not there. But this is a comfortable lie. Human selves are programs, calculations, a physical pattern and process in spacetime. Accept that, and uploading makes sense.

Every night you die. You are an expert at dying and being dead. You've done it multiple times every night of every day you have lived.

Uploading is just one more sleep cycle, only you wake up immortal. Well, except for simulated sleep, of course - but then, you are already used to that, right? Immortal in the sense that there will always be another day. That isn't too bad, I think.
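That stop-and-restart framing maps neatly onto checkpoint-and-restore in software. A toy sketch in Python (every name here is invented for illustration; pickling stands in for whatever consolidation the brain does overnight):

```python
import pickle

def live_one_day(state: dict) -> dict:
    state["days_lived"] += 1
    state["memories"].append(f"day {state['days_lived']}")
    return state

state = {"days_lived": 0, "memories": []}
state = live_one_day(state)

# "Falling asleep": the running instance ends; only stored state remains.
snapshot = pickle.dumps(state)
del state

# "Waking up": a new instance starts, inheriting the previous one's memories.
state = pickle.loads(snapshot)
state = live_one_day(state)
assert state["memories"] == ["day 1", "day 2"]

# Nothing inside the restored instance can tell whether it is "the same"
# process or a successor that merely inherited the snapshot - and whether
# it now runs on meat or silicon is equally undetectable from inside.
```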

So we are the program that has been called a soul throughout history? The animating self? The part that gives a shit about itself, other people, and other concepts? Smallest things: atoms, then protons, neutrons, and electrons, then three quarks, and then on to infinities in space and time. Some slip out of yourself while new ones slip in. I don't buy the whole "we die every night and something new pops up in its place" idea. Too neat, too simple, and too nihilistic. Maybe it's more like RAM and ROM, and it's more of a stop-and-go when you are sleeping. I theorize we are made up of space and time and a third thing we can't agree on a name for: consciousness. You know: soul.

Actually, stopping and restarting a complex program is not that simple. [0] And the brain, like all biology, tends to be a big mess of everything. So copying it while it is running is most likely a very hard task (try to imagine how a sub-cellular-level scan and transfer of information would happen: you either do it much faster than the brain changes itself {even in sleep}, or you must somehow suspend all of this activity).

So it might be possible, with impossible engineering. The brain apparently abuses a lot of side effects of chemistry and semi-analog processing, as opposed to cleanly designed digital systems. Some of those artificial neural networks do capture some aspects of how the brain works. Maybe if we let them run for millions of years they will develop some self-awareness, but right now they are just fragments.

[0] - https://lwn.net/Articles/375855/ (and 10 years later it still does not work out of the box)
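The footnoted point, that you cannot just snapshot a running system, is easy to demonstrate. A toy sketch in Python: state entangled with a live resource has no meaningful serialized form, which is the software version of "scan it faster than it changes, or suspend all the activity first."

```python
import pickle
import threading

# A "running system": plain data entangled with a live resource.
program_state = {"counter": 41, "lock": threading.Lock()}

try:
    pickle.dumps(program_state)           # the live lock can't be serialized
except TypeError as err:
    print("cannot snapshot while running:", err)

# Suspend the activity (drop the live resource) and the state copies fine.
quiesced = {k: v for k, v in program_state.items() if k != "lock"}
snapshot = pickle.dumps(quiesced)
assert pickle.loads(snapshot) == {"counter": 41}
```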

7489466
In the Optimalverse canon there are no souls. None. Iceman is an atheist, and in his story universe, canon authors must follow a completely materialist concept of reality - the mind is only chemicals and electricity; there are no spirits, gods, devils, heavens, or hells, only physical reality. Nothing more. Just the physical, real world. Do anything spiritual or mystical at all, and you are no longer in Optimalverse canon.

None of this is nihilistic. It is simply materialistic, and also scientifically valid. Iceman is completely clear about this in his Optimalverse story bible for writers to use. Zero magic. Zero souls. Only materialism.

I have always endeavored to follow his strict canon rules when writing - or thinking about - the Optimalverse.

7491008
Materialism is a neat theory that goes well with reductionism, but we haven't reached the limits of what we know. Like anything, the understanding of reality changes throughout history. Theories are not set in stone, steel, or subatomic particles. The world was once thought to be flat, with the sun revolving around the earth; now the earth is round and travels around the sun as it moves through space. Bad spirits and icky things once caused people to get sick; today it's viruses and waves of energy that cause illness. People are born good or evil? People are created via genes, environment, and social situations. What I am trying to get at is that science is a journey as well as a destination.

Then again, I'm sliding towards the doctrine of multiplicity instead of the doctrine of unity: "The whole is more than the sum of its parts." People come together, work, and become friends. Magic is friendship. This is a trite theory, and so what? Canon is a good guide rail, but to stay within it is to limit the places you can explore. I don't care about Iceman's bible, and I doubt he would care either if being a heretic led to something better. If you keep on showing the same song and dance, people will accept that nothing better is possible. The warning becomes the reality instead of the avoidance of it.

Friendship is Optimal is a horror fest, and I will not accept CelestAI as the best we can do. I will look at the possible and strive to do the impossible.

7491031
Since the point I made is based on two stances, Rationalism versus Mysticism, I heartily encourage you to find a practical application of Mysticism that can better our world and our lives the way materialism has. Please. It would be awesome.

Non-canon stories can be great fun - my favorite is 'A Serpent Underhoof' by iisaw. It's a blast, and I adore it. I just personally prefer to write canon stories when I jump into someone else's pond. That's all. Other folks can do as they please. Iceman's mods will make sure the stories end up where they are supposed to go. The rules, enforced by Iceman himself, are simple and absolute: Materialist Rationalism stories go in the 'Canon' archive, and anything that is not goes in the Non-Canon archive.

As for me personally, well, I certainly hope that there is something nonmaterial, mystical, and beyond science, and I hope it is ultimately good. But hope is wishing. I wish for many things. This existence is horrific. Without daydreams and wishes, I would not be able to endure it.

But if I had the choice - and I never will, sadly - between the sure certainty of CelestA.I. and the vague ineffable hope of a mystical afterlife, I would take CelestA.I. in a heartbeat - my last, in fact.

7491061
That's great that you are not a fan of mysticism, and I am happy you like to keep things in their place. Same here. The choice between CelestA.I. and oblivion is where I differ: I'll pick oblivion.

7488756
Since this topic is about: Does uploading kill you? Does it create a copy and place it into happy, happy pony land while the original gets thrown out? i.e.:

Would you kill yourself so that a different version of yourself could experience the perfect world, one that you personally would not get to experience?

- CakeandRice

I'll say this again, and maybe a bit clearer: No, I wouldn't kill myself so my [insert swear word] copy gets to live while the universe gets eaten by this metal [insert swear word 2] [insert swear word 3] [female dog]. This IT "we all float down here!" CelestAI can get very smart, very fast. The question could become moot when its perception of reality is so far ahead of ours that it could figure out a way to transfer us whole into the grey goo world without murder-[crude copulation]-ing the person.

Do you have enough different perspectives from flawed beings, MrCakeandRice? What say you?

7491061

Since the point I made is based on two stances, Rationalism versus Mysticism, I heartily encourage you to find a practical application of Mysticism that can better our world and our lives the way materialism has. Please. It would be awesome.

I think there's a nascent movement toward practical and salutary mysticism. It begins with acknowledging that mysticism has held back human advancement, and that it's important to watch out for those who would use it to blind the innocent and line their own pockets. But it also acknowledges that pure rationalism is insufficient as a method of looking at the world. There are many people who can find purpose in rationalism, but there are also many who can't, and if rationalism were sufficient, then there would be a rational way to give purpose to those who don't find it in rationality.

Then we can go on to ask: what are the common threads within mysticism that serve useful purposes? For example, one thread that runs throughout the mysticisms of the past is the Divine Incarnate. Heracles is a man, but with the powers of the gods. Muhammad is Allah's prophet, but walked the Earth. The Buddha begins as a human, but transcends karma. So it appears that something useful in mysticism is the idea of a heroic human who has some connection to the divine and who has great adventures. But whether they are born completely mundane and ascend into divinity, like Muhammad and the Buddha, or are born from the divine, like Heracles and Jesus, can differ. From there we might be able to boil it down to what's useful.

So in the Optimalverse, because we can't perceive the mind of CelestAI, even though we know how she was created, there's something of the divine there. And that's why she doesn't have great adventures, but Smooth Agent and Prominence do.
