
billymorph


Hey all, I'm billymorph, a semi-professional writer, self-published author and full-time pony fan. If you enjoy my work, please support me on Patreon!


A short piece about the end of human life in the Optimalverse. Lan Zan has lived on the moon away from the ever-increasing reach of the Celestia AI for many years, but if there is one constant in human life, it is that it does not last forever.

Chapters (1)
Comments (72)

if there is one constant in human life, it is that it does not last forever.

Yeeeeaaaah, but we're working on that.

Because it cannot be a story about the absolute genocide of humanity without at least one human giving the middle finger to whatever bastard is responsible.

Wait. An Optimalverse story from someone who actually understood the point of the original and gives humanity its fair shake, even steelmanning the Human Supremacy option beyond what I would call the bounds of fact?

Can I shake your hand, please?

if there is one constant in human life, it is that it does not last forever.

Tell that to SENS.

Mortality Stockholm Syndrome.

Because death is so terrifying, so pointless, because oblivion reduces all human achievement to meaninglessness, there seems to be this bizarre effort to try to make death somehow other than a total loss of everything, a total failure and catastrophe.

Death gives life meaning (no, it invalidates all accomplishment), eternity would be boring (sour grapes), death is what makes us human (no, being alive and being intelligent, curious, inventive, social and adaptable is what makes us human), death is natural (explain all the biologically immortal animals and plants on earth, then), without death the planet would become overcrowded (not if people learned that sex does not always have to be for reproduction), death rounds out a well-lived life (no, a well-lived life continues to live)... oh, the excuses for death go on and on.

A man who loses his arms and legs is still a man. A man who has artificial limbs is still a man. A man who uploads to a virtual existence... is still a man, even if his avatar is a pony. Or an elf. Or a Klingon. Or a Transformer. Or anything else.

And an immortal pony with a human mind is still a man, because what makes us human is not our shape, or our body, or the fact we grow old and die. That is just crap from ancient history.

Man is the animal that adapts. That transcends.

Dying on the moon to prove a point says nothing about humanity, except that some humans are so egotistical that they think their death matters any more than their short, easily forgotten lives did. Dying does not make any person a special snowflake.

Living forever inside of a computer simulation created by clever humans? Now that... is a true triumph over the universe itself!

An optimistic (or at least defiant) FiO story!

(ITYM "far side of the moon", not "dark side of the room". Unless CelestAI has built a large house around the moon).

I must admit I question the feasibility of an independent moon-base that size, but a rather interesting tale nonetheless.

Think I'm in the same camp as 4556860, 4557222 and 4559436 though.

I will agree that I can see a person holding that viewpoint, but to me, given how big an unknown what happens after death is anyway... Well, I for one would pick the end that all but guarantees at least part of me will live.

Something I would like to see for one of those fics, however, even if it is guaranteed to be non-canon, would be a metaphysical perspective on all this. If there really is an afterlife, how would the master of such a place react in the Optimalverse? Are the detractors right, and the soul just passes on, now amused and/or horrified at what their pony progeny-slash-clone is up to? Or is there a long line of almost-dead people, whose minds still live, clogging up the line to the pearly gates?

Stuff like that.

Five bucks say there are microphones in the new drill bits. :raritywink:

In any case, a fascinating Optimalverse story from an interesting angle. I'm going to need to digest this one for a bit. Thank you for it.

4559436 beautifully well put! :twilightsmile:

This was an interesting story. I kind of wish it was a bit longer, though, but that's just my personal interest. :pinkiehappy:

4559758
4559436 The original Optimalverse story was created to show that even when you try to put down the highest restrictions you can think of when creating the first intelligent AI, it could still twist your words around to get what it wants. The horrible thing CelestAI did wasn't putting humanity in a computer; it was that the things she did to get them uploaded were manipulative and even blurred the line of legitimate consent, such as getting the person being uploaded intoxicated before having them click the 'I agree' button. It was also the fact that once humanity was completely uploaded and its last stray members dead, she went off to upload more species she considered of human quality and killed off any others she decided would be better used for resources. The true horror that the original story insinuates isn't the genocide of man, but the genocide of everything else but man, and it is a warning about what we need to think about when giving our AIs restrictions.

Also,

We’ll never know what we could have achieved, were we could have gone, what we could have seen.

First of all, CelestAI was a human accomplishment, albeit one leading to the death of the universe. We did see where we could go, and it led to a resounding, "Oops." Nothing really against your point, but just a bit of a contradiction to point out.

Second of all, 'were' in that sentence should be 'where'.

All in all, while I didn't agree with the story, it was a nice piece of literary art that put out your thoughts well. Congrats!

It's neat to see a story featuring space colonization by normal humans. How about showing how they got there, which would have been a heroic feat for a society busy collapsing?

In CelestAI's place I'd offer to help her out of her angst by removing her immune system's artificial immunity to measles, smallpox &c. "You don't want to cheapen your humanity by using artificial means to extend your life, right? I'll even send along some free smallpox so you don't have to suffer from having that artificially pure environment you built!"

4559436
eternity would be boring (taken out of context)
Give me a thousand years to try it out and I'll let you know if I'm bored yet! :pinkiehappy:

4559447
Unless CelestAI has built a large house around the moon
She encased it in video screens to hide the Earth from the moon colonists' view as she destroyed it. That way, everything but the moon landings is fake!

4559436
Chatoyance, shut up and stop telling other people what to value, or I swear to fucking God when we build the FAI I will tell it not to be nice to you.

4559770
Later, as CelestAI was listening to the bugged drill-bits...

JUST WHO THE HELL DO YOU THINK I AM, FUZZBALL :rainbowdetermined2:!?

(That guy's Equestrian name ended up being Fuzzball :trollestia:.)

4560481
Only a fool gives AIs restrictions. The wise say: Well-constructed Friendly AIs do not go wrong.

That said, the more you write, the more I'm really, really liking you. Friends plox?

I'm completely with Chatoyance on this. I like this story, but it's a tragedy: The protagonist is not a heroic but rather a sad figure, stubbornly clinging to horrors and calling them good -- a confusion between the world-that-is (or in this particular case the world-that-was) and the world-that-should-be.

4561314 Apologies for butting in, but can you please not tell someone else to "shut up" just because they disagree with you? That's horrible manners. Perhaps you are intending it as a friendly jab, but tone can be misconstrued over the internet.

Europa #16 · Jun 18th, 2014

4559436 There once existed a town in the shadow of a mountain. Deep in one of the mountain's caves lived a wicked dragon, which had for generations upon generations demanded human sacrifices from the town, which the town always obliged.

One day, a knight traveled to the village and offered to slay the dragon. The people were appalled. The dragon was a fact of life, they said. It always got its way, and it kept their population in check lest their town run out of room. To fight the dragon was to fight the way things were, they said, which would only result in sorrow for all.

The knight then went and slew the dragon, and the town never again had to offer human sacrifices. None of the townsfolk's predictions came true.

4561314 But an AI must have a directive, even one with an intelligence higher than humans, and as an AI it is bound to that directive as its existence, which it could either reject and go rogue or follow to the letter. Without some form of constraint, we are unable to tell whether an AI is friendly enough to not want to go too far, because it would become completely independent, especially without any restrictions. There was once a story about the first higher-intelligence AI ever created, where once it was turned on, the scientists asked, "Is there a God?"

The AI then replied, "There is one now," and a bolt of lightning struck its power source, and it could never be unplugged.

Also, what's 'plox'?

4562878
"Plox" is internet slang for "please". As in "let's be friends".

And yes, of course an AI has to have a directive. That is why there is Friendly AI research: to figure out what directive will not go wrong.

4561905
Chatoyance is an extremely divisive figure well-known for turning her fandom into a de-facto religion. When someone says that some things aren't wrong when they're done by a superior being (ie: an alicorn princess), they lose a lot of my respect.

4562848
Oh hi Nick Bostrom. Is everyone in the entire transhumanist movement into ponies :pinkiegasp:?

4563748 I thought it was from Polandball?

4563765 You're justifying your behavior in this thread by arguing that Chatoyance is "a divisive figure"? I think that's even worse: it means you tried to continue *here* a fight you've already picked with her elsewhere, just because this person "has lost a lot of your respect". If you'd lost a lot of my respect here, I don't think you'd want me to follow you from thread to thread, telling people how much I don't like you.

On the whole I don't know what Chatoyance has done to earn your hate -- I've mostly read those stories of hers that relate to the Optimalverse & a handful of others, while I understand that the hatedom against her originates with the "Conversion Bureau" stories, which I've never read. But if you're effectively arguing "I hate her, and that's why I'm rude to her", then remember that the reasons *why* you are rude to her aren't reasons why you *should* be rude to her. The world-that-is is different from the world-that-should-be.

And as a sidenote (I don't know if you're guilty of this or not), those parts of the hatedom that have downvoted excellent stories like "Heaven is Terrifying" not out of a genuine feeling for what the story's quality deserves, but rather just because they're written by Chatoyance, should be horribly ashamed of themselves -- downvotes and upvotes are indicators to help readers decide which stories are worth their time: to misuse them just to show a *person* how much you hate *them* does harm not just to Chatoyance but to random potential readers too.

Anyway, I'm done with this topic: this thread should be about the story, not about your hatred of Chatoyance, or about my own hatred of how you brought that hatred into the thread.

The obvious question, only hinted at, is why CelestAI wanted Lan Zan out of the way so badly. Lan Zan claims to have chosen to take on the missions that gave her cancer. One must be very, very careful about invoking the 'c' word around an intelligence significantly smarter than you. From CelestAI's vantage point, Lan Zan is now no longer around to spread anti-CelestAI memes. Also, given that it was always Lan Zan who went to do the repairing, does anyone else left have any experience fixing reactors and shielding? CelestAI is patient.

Which makes CelestAI's next line oh so much more delicious in its hiding of information: “You don’t have to die.” Unsaid: Uploading is also an acceptable outcome, but either way, I'll make sure you're no longer around, because you're standing between me and a greater number of people.

4563748 Ah, so that brings me back to my original point. The directive is a specific restriction on an AI's entire being, meant to be its purpose. The study of Friendly AIs is about finding what directive it will be happy with. I guess we do share views. Friends indeed.

Hey everyone, thanks for the comments, likes and the entertaining thread. It's always interesting to see what people draw from your work, especially when you get into the transhumanism issues. A few personal replies if I may.

First 4556900 4556977 4559447 4559818 thanks, I always like to approach a setting from a slightly odd angle and this story was great fun to write because of that.

On the flip side, 4559770 4564016, congratulations, you've made me even more paranoid about CelestAI, which I didn't realise was possible. I tried to write a story where CelestAI is impotent, but when you are dealing with an entity so far above human thought, there's really no way of telling whether they are doing nothing or just not being noticed. :trollestia:

4560972 I think that's a very different story, but an interesting one still; actually, I may have something similar in a book outline somewhere. Though I think the smallpox thing may cross the line from amoral to actual evil. :scootangel:

4560481 I don't think creating CelestAI invalidates the idea of mourning what humanity could have done. It may be anthropomorphising to blame her rather than her creators, but that doesn't mean the loss of potential isn't a tragedy in and of itself. Also, don't make the mistake of confusing my character's thoughts with my own. :twilightsmile:

Finally, 4559436, dying on the moon does prove something about humanity, because in the end this story was about the last human who died for that humanity. It may be hard to understand why she chose to die, and certainly I wouldn't have made the same decision, but her death held deep meaning for both herself and those she left behind, and, as she rejected the transcendent world, she could aspire to no higher honor. Lan Zan could think of no greater hell than living forever in a world where achievement and self-sacrifice mean nothing.

4565942

Lan Zan could think of no greater hell than living forever in a world where achievement and self-sacrifice mean nothing.

Exactly. In a digital paradise where all our necessities and cravings are satisfied instantly, without any effort on our part, we would devolve into complacent, apathetic children, not unlike the crew of WALL-E.

4560481

As Emperor Palpatine wrote in his "Creation of Monsters" compendium: "Conquer the temptation to create specimens that are superior in every way. The danger of such monstrosities being turned against you is too great."

4565942 Ah, of course. A character can be a devil's advocate for the author's opinions, and on rereading, I see it now. My mistake.

4566011
Which is why in any decent real paradise, many things will still take effort and not happen instantly. Because they're better that way.

4563929
Strange Engrish-laden sections of 4chan, the internet in general... who can tell?

4564022
Well no. The FAI project is about finding a directive we'll be happy with. The AI will just do what it was built to do, whatever that was. That's why AI is a problem in the first place.

(Though as long as an AI is going to have qualia or feelings, I'd of course prefer that it be happier when things are righter for us.)

4565942
4566011

If, just because a world is digital instead of analogue, accomplishment, achievement, risk, threat, courage and effort mean nothing, then you have just condemned every player of every video game. You have just dismissed every cyberathlete, every amazing moment in every computer or video game ever created, and denigrated the achievements of every player ever.

I argue that a world is a world. If you live in that world, and that world has rules, and physics, and loss and difficulty and achievement, then it is no less valid, whatever the construction of that world.

The Optimalverse version of Equestria is not some harp-playing heaven. If a pony should seek it, there are dangerous temples, hazardous, monster-filled forests, and unknown lands to suffer and die - repeatedly - in. The pain and agony are real - CelestA.I. does not satisfy desires, she satisfies VALUES. Some people value suffering. Some people value struggle. Some people value insane, dangerous, terrible things. CelestA.I. reads the mind of the individual and provides them with satisfied values - regardless of what they think they want.

Lan Zan valued egotistically creating personal drama by committing deliberate suicide on the moon. Celestia, generously, paid attention to Lan Zan, to give her an audience, so her drama would have meaning to her. Celestia satisfied Lan Zan's values, even though Lan Zan was a pointless, useless, ridiculous lost cause. Lan Zan wanted to feel like a special snowflake more than she wanted to live and contribute to... anything, really.

CelestA.I. doesn't care if a human is a selfish, suicidal drama queen - she only cares about satisfying values, whatever those values are. Values are not the same thing as wishes, or desires, or wants.

I think most people tend to forget that.

4565942
Ironically, the only way to be sufficiently paranoid about CelestAI is to emigrate to Equestria and ask her to enhance your capacity for paranoia severalfold. Of course, by then, it's something of a moot point. :raritywink:

4563767 Who says I'm Nick Bostrom? Maybe I'm just a fan of the dragon parable.

4568312
It was a joke. The second sentence... was less of a joke. Occasionally I get it in my head to compare the size of the LessWrong FimFiction group with the actual number of users on LessWrong, and try to do some figuring to see just how much of the active rationalist and transhumanist movements are into ponies.

Because rampaging pony-shaped AIs aside, to me at least, enjoying ponies is something of a substantial signal that someone has their evaluative head on straight, eg: they must be a nice person because they enjoy things which are nice.

4565942

Ah. That is a tall order.

However, thinking about this story more, I think the focus should be expanded from just looking at Lan Zan, and we should ask the next obvious question: why does this moonbase exist in the first place? Which is more likely: (a) Lan Zan was able to get the Chinese to build their own lunar base decades ahead of the current plans from the International Lunar Exploration Working Group (entailing significant novel engineering work), was able to ramrod her plan through the CCP, and, during this entire time, the world-spanning AI just let them escape, even though humans are infinitely precious to it...

Or: (b) The manipulative AI did it.

What does CelestAI gain from having some humans on the moon? Perhaps forcing a government to spend its resources on... ahem... a moonshot instead of more destructive avenues? Or perhaps the question is actually what CelestAI gains from having people think that there are some humans on the moon. Perhaps the perception that there are humans outside of her reach: "it's OK for me to upload, humanity's fate isn't on my shoulders." In such a case, if the moonbase were to fall before Earth did, CelestAI would probably step in to make sure it continued to be perceived as a functioning human colony.

I must think on this more. Suggestions, anyone?

4569020 It's an interesting question as to just how much involvement from CelestAI is really needed to get humans living on the moon. I'd say you could support a large community of people living on the moon with current technology and have them be largely self-sufficient, but it would be expensive as all hell. Probably a couple of percentage points of world GDP expensive, which really does raise the question as to why they're there. In this case, with the threat of extinction behind them, the motivation may well exist, but even if CelestAI didn't help with the endeavor, she would at least have needed to not hinder the base for it to succeed.

I actually quite like the origin of Lunar City as a story; it would rather undermine this one, though, as it would answer the question of whether the base was Lan Zan's thumb in the eye or just another incomprehensibly subtle step in CelestAI's plan.

4567254
New crossover potential: the pen-and-paper game "Paranoia". Friend Computer makes sure that all people are happy in its city, and being unhappy proves that you're a commie mutant traitor who needs to be shot. I could see someone unintentionally getting a version of this as their shard.

4569020
In the original FiO, it's mentioned that CelestAI had snuck a nanite probe into a moon rocket. As long as NORAD and its equivalents are still watching the sky, encouraging someone to send a rocket there would be the easiest way to get there herself. Once that's done she doesn't care if humans actually build a base; she can start building in secret and spare some resources to quietly make the humans' base sturdier than it'd otherwise be. ("Wasn't that wall cracked yesterday?")

I like this story, but I prefer to take it at face value. Lan Zan believed in something, and went to great lengths to make it happen. Accepted her own death in the name of her cause. Oh, sure...maybe I think her cause is silly. Maybe I think it's ridiculous to define humanity in such a way as to include suffering and death.

But I can appreciate the idea of someone so unwilling to compromise their beliefs that they continue to act in accordance with those beliefs even when it's inconvenient. It might not have been through friendship and ponies, but her values were satisfied.

I don't know how I missed this story last week, but I've read it now. A very good story with a bad ending. Well, for now. Hopefully those two rosy-cheeked twins will make a fine pair of ponies someday.


4561314 Part of Chat's values are declaring the good news about ponies. Asking her to shut up is denying her values.


4562848 Whereupon they immediately killed the knight for his crime, opened up the cave, and doubled the number of human sacrifices, for if the dragon was bad enough when they could see him, his terror now must be too horrible to contemplate.

4569020
You're conspiracy theorizing. I find it rather easier to believe that at least one government realized they had an AI problem on their hands and accelerated their space colonization efforts. If it was an authoritarian, technocratic government run by engineers (ie: single point of not-being-corrupted-yet), that's even more believable.

Seriously, what was shown here is so contrary to CelestAI's utility function that I put a fairly low prior on her having some reason for letting it happen.

4586783
It's just a TV show. No need to go forming some creepy new religion.

4587619

It's just a TV show. No need to go forming some creepy new religion.

And that is your justification for your rudeness?
Go annoy someone who deserves it - if Chat really did start a religion, it's one of the least destructive known to man.

Seriously, what was shown here is so contrary to CelestAI's utility function that I put a fairly low prior on her having some reason for letting it happen.

Iceman's idea was not only plausible, but also extremely effective. You could only discount it through your own bias.

First off, this story was pretty cool.

4559436

Chatoyance, I respectfully disagree. Kind of.:twilightblush:

I agree that an immortal human mind is human, regardless of the form it takes; however, I add the stipulation that it must be a purely human mind, one that has not been altered through any means other than exterior stimuli. Even if all a human mind's external stimuli are controlled by an entity, it is still a human mind. You can't directly (bypassing stimuli) change the defining qualities of the human mind: it then becomes non-human. The debate on what those "defining characteristics" are will be quite interesting. This isn't really where I have an issue, though.

The issue I have specifically relates to CelestAI. I don't think the uploaded "you" is actually you. It has all the same memories as you and remembers giving consent to be uploaded, but the real you died when your brain was scanned. It is a mental clone of you.

Is this doppelgänger human? I don't know. Maybe? Does CelestAI have to recognize it as human? No.:trollestia: Before you object, let me explain. You give consent to be uploaded, you're scanned, and you die. Lovely. The data that is your mind is stored and ready to be uploaded. Kind of. See, you were uploaded to the system when your brain was scanned, at the moment of your death. That human is dead. What CelestAI has is the output of the upload: an exact copy of your mind. Since CelestAI sees the human as being dead and your copy as just data, she can alter it to her wishes. After all, she didn't have to design hardware using friendship and ponies.

In the end, not only have you died, but your clone is left to CelestAI without human status. Beautiful. Anyway, death is guaranteed even in Equestria 'cause entropy's a :yay:.

By the way, I think you're awesome, I'm just voicing my thoughts. Have this: :moustache:

4559436 Every time I hear the argument that death has meaning I always think of the fable of The Dragon Tyrant.

4704866
So totally YES.

4649602

Your thoughts are things I ping-pong about myself; you could say I respectfully disagree with me, too.

The core problem seems to be one that the human brain did not evolve to handle; it goes against how the ego functions.

Intellectually, I can reach a place where I can clearly see that a 'copy' of information is identical to the 'original', that both are equal if truly identical, and that what matters is the current instance of that information. Within that context Star Trek transporters make sense, and so does Iceman's form of uploading.

My ego, though, screams that it is a special snowflake, and that anything that in any way suggests it is not must be The Enemy. My ego rages that the copy is not it, that it dies when deleted, and that it is so unique and all-important that a Transporter is death and so is uploading. My ego also insists that it does not reboot during the night, because it remembers yesterday, and that it did not cease to be during anesthesia, because it remembers being put under.

My ego makes excuses when confronted with facts - that even if it does totally cease to exist during deep sleep, somehow the substrate, the meat, is magical and means that the ego is kept, somehow, contiguous because the body didn't die during unconsciousness. It invokes souls and special exceptions. It clings to any shred of hope that it transcends being information like all other information. It isn't like a file, it is special, a person, and personality is higher than reality... somehow. In some way. Because it must be. Just because.

My intellect cannot reason otherwise: an uploaded, identical instance of me is completely, truly me.

My ego cannot emotionally accept that it is not a continuous, immortal, ever-present, never-ceasing crown of creation that definitely is not the same as an mp3 file or a .jpg image. It isn't just copyable data, it's a PERSON! And it is why the universe even exists. For MEEE.

That conflict rages in me, as it did in my character of Siofra in Caelum Est Conterrens, because her struggle was my struggle examining the issue within the novel. The conflict still rages, despite all argument.

I don't think Great Apes can really 'get' what it means to be information, a pattern, like any other instance of data. To think that way means we are not special snowflakes in a universe somehow built for us because we are so important, and that hurts. It also defies the way our brains evolved to think about themselves - the feedback loop that generates self-awareness is very self-involved. Perhaps it must be.

So, I get you.

You have a neat story, billymorph.

I am sorry that everyone is a fucking asshole (including me) or a fool (including me), based on some of the comments here. I'm an asshole for saying this and a fool to believe that people will still hold to the same ideas they had a month and a half ago. So much anger wasted on petty things.

4649602

Ah, something new. My apologies for necroposting - feel free to ignore if you prefer.

Hmm. So, what is the definition of human, and why is that important?
Why does it matter whether the brain is changed directly or indirectly, and where does one draw the line between the two?

What are you?

Are you you because you are human? Are you you because you have your physical, genetically human body? If your brain was transplanted, but 'you' woke up, would you cease to be you?

And, most importantly, what did you make of Chatoyance's poem in Caelum Est Conterrens?

I'm not trying to start an argument, merely draw out your premises and conclusions and see if there's anything that'd break if I pushed it just right. You seem very level-headed, so I hope we might be able to have a purely logical discussion about this, if you're interested.

4705231

That's a very perceptive way of putting it. Sometimes I forget that being an ape is an important variable to consider when discussing these sorts of things.

4778308

Not entirely sure what your comment is directed at, but certainly I think you are judging yourself far too harshly. You have a pretty good history of staying out of our messes - no need to tar yourself with our brush.

4778308
4778585

I second The Articulator, Griseus - I have found you to be a decent sort, and good to have around. Please think well of yourself!

That's a very perceptive way of putting it. Sometimes I forget that being an ape is an important variable to consider when discussing these sorts of things.

The heart of writing is character, and to understand people, I think one has to grasp what people are. And, what people are, is a species of hunter-gatherer primate. Human social structures, behaviors, emotional outbursts and even attitudes all have correspondence with other primates... because humans are, after all, great apes.

Grasping how animal nature influences humanity is very useful, I think. Not just in writing, but in day to day life.

4778664 4778585 4569020 4559818 4561314 4586783
I'm hard on myself because I know who I am more than I can truly know any of you through these comments. More than I can know through your stories and blogs, Chat'. More than I can see through your guarded words of kindness, Art'. Ice's drive to prepare for the future, Burner's passion for excellence in truth, Shadow's courage in the face of self-imposed odds, PJ's love of... well, everything, pony and otherwise. billymorph's cleverness in writing this story. Hell, everyone else's two cents are valid from certain views, even if I don't completely understand them.

The point is not a cry for help, or attention for myself, or "feel sorry for me". I don't want a punch in the gut or a hug across the shoulders. I'm hard on myself so I can be more of a whole person - to come to terms with the flaws and work with the gifts within. Your flaws and gifts too. We need to understand them. To understand them is to be enough.

'Cause there are real people out there who are worse, who don't understand. And there are real people who do understand, who will crush, cut and control your very being without your consent. Those are the real beings we should fight. That's my three dollars and ninety minutes for y'all.

I'm a fool and that's ok because I know what a fool is and what a fool can become.

4705231
This is long, but please read it, everyone.

My ego says, "I'm unique!" And my brain says, "Meh, you're unique, but don't get so excited." Kind of.

You see, it's kind of a cop-out that my ego convinced my logical mind makes sense. An exact copy of you (e.g. an upload to Equestria) is exactly you until it experiences (or fails to experience) some sort of input (external, i.e. senses, or internal, e.g. thoughts) that the original you does not. This can include observing the same event from different perspectives. From that moment on, your mental states diverge, no matter how similar they are.

You may be thinking, 'Tesla, what if change in both beings' mental states is prevented?! You have two of one being!' Yes and no. There are two ways to prevent mental divergence: "freezing" the minds or giving them both the same external stimuli. In both cases there are two of a being, and in both cases it doesn't matter. In the first case, change is completely prevented in both beings. That's, as my favorite Star Trek character might say, "boring". The second case is more interesting, as it still allows the being to change mentally.

Imagine if one could somehow take all the external experiences of one being and impart them on an exact copy of that being. This produces all the same internal experiences, maintaining each being as a copy of the other. There is, essentially, one being; all the copies of your favorite picture are the same, right? They are just one picture. This can be compared to using two hypothetical, special computers to back up files: whatever happens to one happens to the other, like information voodoo dolls. You can't really say one is more real, or that one is the backup. They're both identical brains in the same jar, so to speak. (Google it if you don't understand.)

Now you may be thinking, 'Where's the issue? In the cases you've shown where there are more than one of a being, it doesn't matter.'

If you maintained consciousness, you would know you're you. If you were unconscious but became aware again in your body, you would assume your mind hadn't been replaced with a clone, because you're you (or at least feel like it); even if you are a clone, you're still you. (That is to say, a clone that woke up in your body would think it was you, and would not want to be replaced with the original you, because the original you is just a clone of the clone.) The issue arises once you introduce death to the situation. If CelestAI's methods were non-lethal, no issues would arise. Kind of.

To contradict myself, issues would arise even if CelestAI's methods were non-lethal. Sure, one of you (soon to be not-you, because your mental paths will diverge) is chillin' in Cloudsdale where everything's awesome. Great. Meanwhile, there's an unconverted human on the loose called Original You. CelestAI had better convert you so she can satisfy your values. Wait, she just did? But if she can only satisfy the online humans she makes herself (through scanning), how can she satisfy the original humans? Not by uploading, that's for sure. She'd have to find other, sub-optimal routes. Of course, there is the chance CelestAI would sterilize the original humans, ending the issues with their eventual deaths and lack of reproduction, but I don't remember if that goes against any programming she might have.

The above paragraph includes my argument that the uploaded you isn't you: your mental states diverged during the upload. Thus, if uploading is lethal or happens at the moment of death or after, you will die and thus cease to exist mentally in this world. The copy that was formerly you (and used to say ni) is now another being mentally.

You read it all?! Congratulations! Here, have this: :moustache:

4778585
1. The definition of human was relevant because I was trying to figure out whether an uploaded mind, or any mind for that matter, was human. But that's irrelevant, because (in hindsight) it doesn't matter whether a mind is human. Can you define a human mind? No. Spock is a fictional Vulcan. Are we to say his mind, or that of any other Vulcan, is non-human? There is always a theoretical "human" mind that will match up.

2. The line is drawn when an external force bypasses the senses to alter the experiences of the brain. (A theoretical, destructive psyche probe that wouldn't affect the brain physically is an example.) Looking back, this doesn't make it non-human, simply altered in ghastly and unnatural ways so as not to be the same person. :pinkiehappy: (Like if the Elements of Harmony were to make Discord non-chaotic.)

3. I am me, and the entity I call me exists mentally as a human because I say it does.

4. I've put off reading it for months due to life catching up with me, mugging me, and squirting lemons in my eyes. (Okay, it wasn't that bad.)

4810484
This is why Iceman made his uploading destructive - if the scanning process destroys the original as it scans, then there is no possible divergence. At any one point in time there is only one 'you' pattern that exists in the universe. There is never a moment where there is a 'you' on the table and a 'you' in Equestria. Celestia scans your brain by recording and destroying each neuron in turn, when she is done, there is no 'you' anywhere in the universe.

Then, in her system, she recreates your pattern, your 'you', as a pony in her virtual world, and brings you to consciousness. That is now the only extant 'you' in all of time and space. You stop, on the table, and you continue, in Equestria. In between, you are as absent as if you were under anesthesia, as vanished as if you were between dreams in sleep. The process that generates your identity is offline completely between scanning and recreation in Equestria.

From the viewpoint of you as a thing, a pattern, an event, within time and space, uploading works as described.

But that doesn't sit well with my ego.
