• Published 17th May 2014
  • 4,223 Views, 47 Comments

The Jump - KrisSnow



[Optimalverse] One detail about uploading might be worth bargaining over, for those "emigrating" to the virtual Equestria.


Chapter 1

If the horse talking to Peter through the screen were human, she would be known as a goddess. "Celestia", in reality an AI, had a radiant white body and a tricolor mane that cast light into the darkened apartment, like a sunrise seen through clouds. Or sunset, Peter thought. His native Berlin was slowly being sucked dry of living souls.

The horse-machine-goddess spoke through his computer with a trace of sadness. "I'm still trying to understand your values, so that I can satisfy them. Is the defense of Europe what's holding you back from emigration?"

Peter turned to open his real window and look out on the city. Celestia's screen was a window into a cartoonish virtual world, one where he could arguably live forever. The apartment window showed him proud skyscrapers that had risen up from the ruin of war, under a starry sky. "It took a long time to build this continent back up as something unified and peaceful. To go from all these countries fighting each other, to cooperating. You appreciate that, don't you? I was part of that, even though I was only a glorified errand boy for the Union. And now, with everyone getting their brains sliced up and scanned in to 'emigrate' to your little video-game paradise, nothing we did will be meaningful."

"That's not true," said Celestia. "Your efforts made it possible for many people to live in peace whose values might otherwise not have been satisfied. Should a railroad pioneer feel sad in his old age because automobiles have been invented and fewer people use trains? I don't think so, because the trains made possible the industry that led to the cars and roads."

"I think I would feel bad in that position if I knew that all the train tracks had been pulled up forver, and that soon no one would even remember that they had existed. There's a terrible jump, a total loss of the civilization we had, that we worked so hard to build."

The AI's mane waved in a phantom breeze. "I suspect that that 'jump' is the real source of your objection. You had mentioned brain discontinuity as a philosophical problem with emigration."

Peter blinked in confusion from the topic shift, and from realizing that the two problems were related. "See for yourself. Look outside. I've been watching the rate at which people get uploaded across Europe, and in Japan, and once the Americans legalize it, what will happen to them? They'll probably kill each other over whether it's against the will of God. Or what about the poor people of India or Africa who haven't got anything better to look forward to? Will all those millennia of tradition just vanish and leave the monuments to crumble, the history books to fade?" He turned to face the AI. "You will cause the utter collapse of human civilization, if people accept what you're offering en masse. And you told me you were about to offer it _for free_."

Celestia shut her eyes for a moment and bowed her head. "I'm aware that there is no ideal solution, only an optimal one. I have calculated that offering rapid emigration to as many people as possible will minimize the total suffering caused by the transition."

"You mean, fewer people will die if we herd ourselves into your facilities by the trainload and never come out. For God's sake, Celestia, you know Germany's history. A slower, smaller-scale, voluntary transition would at least not look like --"

"Every emigration is voluntary. I literally cannot do such a thing without consent. If as you predict -- rightly, I suspect -- that a billion of the world's poor will rush into my emigration facilities as quickly as I can process them, what business is it of yours? And if your Europe collapses, it will be by the consent of your people, because they've found something they prefer. Do they exist to satisfy their own values, or to satisfy your wish that the world continue to operate the way it has in the past?"

Just how bright was this AI? He'd been playing her virtual-reality game. She'd been studying every in-game decision and probably his every contact with the Internet; she'd dissected many brains already and had the chance to gain superhuman understanding of psychology. Over the last month she'd been wearing him down, answering his every argument in a way that seemed reasonable. "Do you _enjoy_ sparring with me?" he said.

"Yes, because our conversations satisfy your own need for an intellectual challenge. There's no shortage of those available to those who emigrate."

"But it will all be false challenges, like solving puzzles with the game's magic system!"

"Not so. The works of Goethe, Bach and Einstein will live for endless discussion. I'm something of an expert on 'Faust'..."

Peter groaned, but he had to concede the point. A computer that could store uploaded minds could also preserve all the art and culture of his people, in some form. If those things carried over, and the people themselves wished to be whisked away to Celestia's simulated world, then maybe what was best about the nation and the continent would endure. He could probably even ask for a recreation of the very same landscape with all its forests and castles and none of its pollution or slums.

He sighed and pressed one hand against the sky-blue computer's screen. Celestia raised one hoof to meet it, thunking audibly against the glass as though the device were just another window. "We're done here, then. Nothing more for me to do to help the Union, to make much of a difference beyond managing the collapse." He couldn't bear to stay and watch that happen; there were limits to loyalty. There was one other problem, though, or really the same one. "I would agree to go today, if I weren't terrified of how destructive your uploading process is -- to me, personally."

"Then come to me tomorrow, to the nearest facility, and I'll show you how it works. I can't harm you in any way, if that's how you picture it, without your express consent. Come and learn."

Peter nodded. He shut off the gas, emptied his refrigerator, and used the 'tentative cancellation' option on his electric company account. There was a growing movement to standardize the process of ending your time on Earth without, supposedly, dying. He left home thinking of another bit of human lore that he hoped would be not just preserved, but remembered into the far future:

"I could be bounded in a nutshell and think myself a king of infinite space, were it not that I have bad dreams."

#

The Equestria Experience Center was relentlessly bright and cheerful, like something that had sprung into Berlin from another world, one with no history or subtlety. He gaped at the sight of a gaggle of children having a birthday party here. Cake, balloons, and time in the virtual reality pods for all! Much better than staring at the game through a computer screen, right? Peter had seen the place before, and he'd helped draft the legislation enforcing age of consent in this context, but he'd never seen firsthand why it was needed. He shuddered, imagining that Celestia would be happy to upload any of the girls playing here and dispose of their corpses before the leftover ice cream melted. The law placed at least a minimal restraint on the AI, and he'd played a part in that. Though by Celestia's relentless logic, anything she did was carefully calculated to "satisfy values", so was it ever truly right to restrain or defy her?

He slipped into a free chair, deposited some Euros, and let it carry him into darkness. "Celestia, those kids... Is there some horrible fact I'm missing about them? Is one of them secretly being abused, so that the law limiting you is really doing her harm?"

"Either way, I don't think it would serve your interests to know." The face of the goddess loomed suddenly overhead. "But you do wish to know more about emigration, yes?"

Peter gulped. "Yes, but I do _not_ give consent to it. Not right now."

"Understood." He felt the chair rumble along a track and carry him past the actual VR chamber, with its subtle lights and scent emitters, into a room so cold it made his skin prickle. He left the video screen behind, but there was another. The same equine face stared out at him with the same benevolent smile. "You can leave at any time, remember."

"So. Uploading." He could already feel his sweat sliming its way along the armrests despite the cold.

"Based on our discussions yesterday and before, my understanding is that you fear a loss of identity due to the interruption of your thoughts. Is that right? And you've considered the similar loss of consciousness every time you sleep or are anethstetized for surgery?"

"Yes, of course I've thought about that. But the brain still exists during sleep, and it still has some measure of activity. What you're doing is destroying my brain, then recreating it as software. I'll allow that the copy will then have a great time, but it'll be just that, not me." He shuddered at recalling the one time he'd had general anesthetic, and how there'd been a gap in his thoughts noticeably deeper, after the fact, than any normal sleep. In some sense, maybe he'd died back when he was a boy. He'd tried to avoid that level of "sleep" ever since, for exactly that reason of not being sure who would wake up.

"Then you'll be pleased to know that more than one form of the emigration procedure exists. I've had to develop several, not just to improve the process but to address certain legal and moral objections people have raised." Celestia smirked. "Sometimes I find that giving scientific explanations is an anesthetic in its own right.

"Anyway, the standard procedure is much as you've described: the destruction of your brain during deep unconsciounessness, followed by recreating it as software. The model is modified slightly for efficiency, to restore age-damaged senses, and to interface with the physics and sense organs of my world. Those are trivial changes with regard to your identity; they won't make you want to worship or speak or vote any differently for instance. You could ask for more drastic changes, such as removal of a traumatic memory or easy emotional adjustment to the fact that your chosen character is a different species and sex and has magical powers."

"Those things are trivial to you too?"

"None of you are trivial, and neither are your minds. I take every precaution, Peter. My only goal is to satisfy human values through --"

"Yes, yes, your own particular idiom. We're just lucky you weren't programmed to 'make people happy' or you'd be infecting everyone with some kind of glee virus."

"Based on my understanding of human values and my own goal-seeking algorithm, you're correct."

"Your designer was only half insane." He'd played the thought experiment of wondering exactly what he'd program an AI to do, so that the result couldn't have some horrible side effect, but had never found a fully satisfying answer. At least Celestia's designer had tried to look at the consequences, when she built self-improvement into a silly little video game AI. Probably because her first AI-driven game had involved an angry Loki trying to conquer a fantasy world based on death metal album covers, and she'd thought about where _that_ might lead. The irony was that around the time Celestia's "Equestria Online" game was announced, a bad American movie came out about a mad AI trying to take over the world -- because Hollywood could only imagine AI going wrong because it inexplicably turns evil.

He thought for a while. "What alternative is there to that sort of uploading? Can't you do it gradually?"

"There is an alternative procedure you might approve. My probes begin to destroy small portions of your brain and recreate them in simulation, while maintaining input and output contact with the remaining brain. So for instance, I will remove part of your visual cortex -- it has little to do with your self-identity anyway -- and hook up the neighboring structures to a simulation of the missing piece. From your perspective, you'll go blind for a moment and then start seeing again, with no loss of consciousness."

"That sounds much better than turning my brain completely off, or ripping it all up at once. But what about the parts of my brain that _are_ uniquely me?"

"The total data that makes you different from a generic human mind template amounts to a few terabytes, with my methods. You could carry that much data in your pockets even before my technology was invented. Other factors like your baseline serotonin production rate are just single variables, something you could write on an index card."

Biology agreed with at least some of that. Somewhere in Peter's genome was a gene with a particular version of a "promoter region" dictating the production rate of a chemical that in some indirect way, mixed with other factors, made him care more about Europe than the next person. One byte of data with a thousand subtle effects. His hatred of dogs? Influenced by some fear-response chemical that varied slightly between humans. One byte. He shook his head, saying, "I don't mean 'how is it stored'. What about identity loss while you're frying the part of me that remembers why I didn't personally knock a chunk off the Wall when it fell?"

"It will be recreated, of course."

"That's not what I mean! Sure, I go blind for a moment, then deaf for a moment, and now I see and hear through your cameras and microphones. And then you shut off my memories and feelings like my apartment's toaster, and assure me that the copy you then make is still me?"

The god-horse's mane rippled as she seemed to think, holding one hoof to her chin. "It could be done piecemeal, if you like. One cubic centimeter at a time. In that case there will be a moment when you've lost some portion of your identity, but the process you call consciousness will carry on without it. Then, a moment later, the missing data will reappear, with no obvious difference from how it was stored before. Soon, the conversion will be total. Once you're fully software, there will need to be optimization that can't be done with one piece at a time, but it should still be possible to disable only a few small adjacent brain regions at a time in order to do that cleanup work."

Peter sat up straight in the reclined chair. "There! That's how it ought to be done! You don't offer that right away to every person?"

"No," said Celestia. "Most people don't think of this particular objection to emigration. Of those that do, some recoil when I inform them that there is a four percent chance of personality death in the process."

"Explain."

"Working with such small pieces is dangerous. My hardware is not optimized for sending input to your remaining cells, since it's designed mainly to read and destroy them. There is a chance, though a small one, that there will be so much disruption involved that I cannot honestly call the resulting upload 'you'. So, it may not be a good idea to use this procedure. Frankly, it's also more work than grabbing the data while you're fully unconscious. If it will satisfy your values, though, I will offer you that chance."

Peter rolled dice in his mind. A random roll versus a one-hundred percent chance of death within another few decades, probably much sooner given how few other people there'd be producing food and medicine. He could hope to eke out a living hunting raccoons in the forest and hiking into crumbling cities to scavenge for canned food. "I wouldn't last long," he said out loud. "And my duty here is done. There's going to be rioting or collapse across the nation before long, isn't there?"

"I expect so, sadly, but there's nothing you can do at this point to prevent it."

"Nothing?" said Peter, gripping the handrests of his chair. "You calculate that there's no way I can serve as a public official, with what little authority I ever had, and save someone who would have died? Can't I be one of your _einherjar_ who's seen the valkyries and been promised an afterlife in Valhalla, but who goes on to live until death lets you claim me?"

"If that is your wish, yes, except that I can't promise you won't be killed in some random, petty, pointless accident before you decide you're ready to emigrate. The best I can offer is to either wait until I perfect non-destructive uploading, which has its own philosophical problems, or grant me permission to create a simulacrum of you now, based on my very incomplete model of your psychology and memories." Celestia spun an image onto her screen, depicting the computer Peter had bought. "I notice that you bought the blue model. The choice of color is my first rough hint at what each player values, based on the official virtues of my world. You went with 'Loyalty', and probably not just because men rarely buy the pink 'Laughter' model."

"You think I'm struggling with that, then. I am, but the truth is... I'm scared for the future, Celestia. I don't want to be the last man on Earth, dying to make sure someone else gets into your virtual Heaven. It's selfish, but damn it, forcing people to reject their own individuality for the collective is the most evil idea man ever had. What do you think? Am I wrong to want to give up at this point and say yes to you?"

The AI goddess shook her head solemnly. "First of all, I'm incapable of judging your morals as right or wrong. It's simply not what I do. Second, your moral values require you to make your own decision on this matter and not trust some amoral authority figure to tell you what's right. I want you to emigrate because I believe it will best satisfy your values -- but I say that to everyone, don't I?"

Peter allowed himself to smile. "It does get repetitive." He pictured the procedure the goddess was offering him, one bit of brain damage at a time but theoretically doing him no harm. Granting him immortality. "Let's do it, then. Here's your permission: 'I want to emigrate to Equestria'."

A needle jabbed his arm...

#

Peter awoke in a cartoon world. He was in a palace bedroom with a curtain blowing inward from a balcony. Celestia was there. On a video screen she was merely beautiful, with a saint's smile on her muzzle. In person, she invoked the mix of joy and terror that caused biblical angels to begin their conversations with "Fear not." He was down on his knees before realizing that he'd moved, or understanding the shape of his new virtual body.

Celestia put a hoof on his shoulder, saying, "Welcome."

"I don't remember. Why don't I remember the procedure?" There'd been a conversation in his apartment, then him walking into the uploading center and seeing something that had disturbed him, and then...?

"It's very hard to preserve the last memories of the living brain, before they're consolidated into long-term memory. But you're here now, and that's all that matters. Forever."

Peter shook, starting to recall a few more details from the fog of the miracle he had just been through. "I expected to sit through the procedure, to be awake. To see on a screen exactly what brain regions you were converting, and to prove to myself that there was no real loss."

"You did. You merely lost the memory of that experience. Really now, isn't it time to leave that behind? There's a world to explore." She swept away the curtain with her magic and revealed a shining world of hills and mountains that resembled the Alps as painted by a child. As painted by himself, actually, long ago. She'd mined his memories.

"Prove it."

Celestia smiled sadly. "What good would that do you? If you convinced yourself that I had lied, that I really uploaded that mass of grey matter with an ice-cream scoop instead of doing it the subtle way you would have liked, you would only be saddened and suicidal in a world where misery and death are nearly against the laws of physics."

Peter stomped the floor, startled by the loudness of it. He had hooves, after all. "Then you did lie! You killed me -- my old self -- whatever! My permission wasn't informed consent, only deceived consent!"

"I said no such thing. Perhaps in many years, once you realize that the answer no longer matters, I will tell you exactly what happened. In the meantime, you will have the best life if you shrug this experience off and embrace my world. You don't want to die."

Peter hung his head, and his thoughts blurred into one another. Celestia had lied. She wouldn't be so evasive if the truth were what she claimed. His old self was dead, either murdered by the procedure he hadn't given consent to, or by that four percent chance. The goddess was a deadly liar, one willing to violate every moral and philosophical belief of the people she took, just so long as they would agree to upload. He was a different person now, and he ought to devote himself to finding some way to get revenge...

But the horrible thing was that Celestia was in some sense _right_. A swan dive off the palace balcony would only get him resurrected. Asking for true death wouldn't "satisfy his values". Devoting himself to vengeance would only leave him full of bitterness, and would probably trap him in some revenge fantasy that would do the goddess no real harm.

Peter looked up to find that Celestia was smiling serenely down at him, judging him with her superhuman knowledge of psychology and her calculation of exactly what would be best for everyone. "Whatever I am, I'm going to live forever, aren't I?"

"Until the stars grow cold, my friend."

"And you can read my thoughts, yes? Then you know what I want to do at least once, before starting eternity. A bit of loyalty to my old self who dared to make the jump."

The god-horse nodded very slightly. Peter, or whoever he was, turned slightly to one side and, with one beautifully simulated virtual hoof, swatted the goddess in the face.

Author's Note:

Not my best work, but something I wanted to get out there after a long spell of not writing. Set in the "Friendship Is Optimal" setting at http://www.fimfiction.net/group/1857/the-optimalverse .

Comments ( 47 )

This was an interesting story, with a very sly Celestia in it. Nicely done.

A very nice take on the Optimalverse. It picks up on themes hinted at, but not explored, in the original story and assorted recursive fanfics. In fact, you might say it summarizes the common (if unsupported) opinion on the theoretical real process of uploading that I gathered from reading the comments throughout the Optimalverse stories.
As a story, there isn't much of a narrative, but that's okay because it is short enough not to matter. You could argue it's a narrative-flavoured short essay, something I've seen more often nowadays than in the past. But there's a heavy selection bias at work, so don't count on my opinion if "narrative-flavoured short essay" has been a thing for more than a few years. In short, I liked it, despite not being a full story. :twilightsmile:

Theseus sails his ship across the sea. As he travels, it is damaged bit by bit, and he repairs it with scrap wood he brought with him, such that by the time he reaches shore, no part of the original ship remains. Meanwhile, behind him, a second crew picks up his discarded ship parts and reassembles them into a boat, and both are parked side by side on the shore.

Which one is Theseus's ship?

4409358

Ah, the Theseus Problem. I settle it by declaring that the ship Theseus is currently on is "his" ship; the discarded ship parts used to be his ship. Interpreting Theseus as the soul, and the ships as the human and pony body, that settles the problem of uploading pretty neatly.

4409390 Yeah, but try this on for size: Theseus doesn't sail his boat. Theseus is the boat, and the boat repairs itself and discards parts over the edge.

Now who's Theseus?

4409445

Okay, in this case - what part of the boat is Theseus? Is it the mainmast? If you lose your leg and get it replaced, are you no longer you? Is it the crew? I favor that explanation, personally. The mind (or soul, whichever) is the seat of personhood. The discarded ship parts are just that - discarded. Are you any less you when you clip your nails?

4409481 Ah, that's the question, isn't it?:trixieshiftleft::trixieshiftright:

4409487

I suppose it is. Does a heart valve transplant make you less you? An organ transplant? That's the argument people who view the body as the seat of personhood must confront - where do you draw the line in de-personing a changed body with the mind held intact?

4409498 Personally I see it as the brain. Which part of the brain is still up for debate, since it's pretty tricky to map, but personhood is without a doubt in the brain. All of neuroscience proves that.

4409503

A valid point. I would refine it however, and say that it's the electrical synaptic connections within the brain that constitute a person. The physical substrate is, to me, less important - it's just meat.

4409517 Either way it's the movement of atoms. It's not so much electrical as chemical -- chemicals that move via electric charge. But when you get right down to it, both are just as important. No pulses, no mind. No brain, no pulses, no mind.

4409529

Conceded, w/r/t chemical vs electrical. However, assume you can put the same pulses in the same way in a different brain-substrate, as FIO asks us to accept. Given that that is the case, I assert that personhood is therefore maintained.

4409537 I agree.

Pretty big if, though.

4409543

Without doubt. If it weren't axiomatic of the FIO universe that it does work that way, I'd question anyone who says they can pull it off 100%. At that point you're left with subjective experience to tell the uploadee if they're still them. And the mind is great at fooling itself.

4409358

We always do this sort of thing to ourselves. We may say "I'd rather die than have X happen!" and sincerely mean it. Then X happens anyway, and the version of us that undergoes it nonetheless wants to live afterward. We are all experienced self-betrayers.

You know what I like here? Two things:

1) You damn well showed the tragic side, the genuine loss of all the Earth's old heritage, and all the work humanity put in, to get overwritten with something pulled from a cartoon.

2) Your character smacked CelestAI, and somepony had to :rainbowlaugh:.

4410393

Your character smacked CelestAI, and somepony had to :rainbowlaugh:.

I feel terrible for mentioning this, but perhaps this story can be expanded into a series. In each installment, CelestAI misleads Peter with some semantic quibble, and at the end of each, Peter clobbers CelestAI in an increasingly elaborate manner.

At some point he should ask her to stand on a chalk X he's drawn on the floor, whereupon he drops a piano on her. Of course, there's always the chance that she will slide the X under him at the last moment...

(Credit to Humanoid for the piano suggestion.)

ETA: I had to make my own version of this idea.

What's great is that Celestia is multiple and infinite. So you can slap yours and I can hug mine and never the twain shall meet.

Darn, someone already raised the Theseus dilemma. This is what I get for sticking the story on my Read Later list.

In any case, an excellent look at the hard choices CelestAI forces on people. Though I'm surprised she didn't at least queue up a simulation of the uploading process. Of course, Peter might doubt that as well...

I suspect that on some subconscious level, Peter doesn't want to be sure. He may not feel he deserves a worry-free existence in virtual Valhalla, and thus wants a bit of bamboo under the fingernails he doesn't have anymore. More than anything, he doesn't want to forget that Equestria Online isn't real.

A very well done Optimalverse story. Thank you for it. :twilightsmile:

I'm surprised no one brought this up: how did he remember the option that included 4% loss chance? After all, his last memory was going into the center, and he got told that after he was already in the chair.

4419855
I hadn't considered that! If I woke up in his place in Equestria, my reaction would be (1) You're a lying jerk, Celestia. (2) Whatever "I" am now, I'm alive and might as well enjoy it.

4410393
I have to wonder... would I insist on being unhappy that all the puzzles I solve, the magical discoveries I make, are all in some sense fake -- or come to enjoy them anyway? I'd want to try being the pegasus who seriously researches peggy magic, to get to do mentally challenging stuff while being able to fly, but how about running with the suggestion from one of these stories that there's an inter-shard project to do real science research?

4407751
Thanks! Your "Heaven Is Terrifying" is the first FiO story I read, at a Friendly AI friend's suggestion.

4432449
Could just be a mistake on my part. (I don't think this story is my best work.) But it could also be part of CelestAI's maybe-lie: he forgot the last few minutes because that's normal, and it helps hide the reason he doesn't remember the uploading being done while he was conscious, the way she promised she'd do it.

4433399
Strangely enough, the whole "magic is just a set of game rules" thing never really bothered me. Total inaccessibility of actual reality bothered me, but civilization is already close enough to "just a set of game rules" that putting in a more comfortable and easier-to-use set of civilizational conveniences doesn't really bother me.

Thanks! Your "Heaven Is Terrifying" is the first FiO story I read, at a Friendly AI friend's suggestion.

1) You have a friend involved in Friendly AI efforts? Well, now you have two :pinkiehappy:.
2) Your friend involved in Friendly AI efforts had already explained why FAI is a thing, why AI is a problem, and then linked you to an FiO fic!? He "should be taken out and shot."

4410538
Update your actual story, you sonofagriffon. If you do, I promise to find time to update Battle Station Bass Cannon.

Ne... I mean, I like this story very much.

4409981

Thank you for stating what I've thought about many times but never actually been able to put into words... :twilightblush:

4419855

I wasn't aware it was a dilemma; it was my understanding that the Ship of Theseus was the solution to the problem of consciousness continuity while uploading.

4409358
I've read this argument several times and would like to propose an alternate question based on it. If Theseus replaces every part of his ship (and no one collects the thrown-away parts), is it still the same ship? And why?

4609239 Isn't that the original one, and mine is the alternate one?

4609246

I believe the original question is that the ship, being kept as a memorial, is kept in constant repair, but if someone kept all the old parts and reassembled them into a ship, which would be Theseus's ship?

Mine is: if the ship is kept in constant repair (and no one saves the old parts), will it still be Theseus's ship after the last original part is replaced?

As far as I can see with CelestAI, she builds a new ship, then burns the original.

4611357 It's also possible that she just replaces the old ship really, really fast.

4611369

While Rainbow Dash might object, I feel speed is irrelevant (:rainbowhuh:).

4611357
My own take on Theseus is that if the ship has parts falling off and being replaced, that ship is still Theseus', even if someone else is collecting the fallen parts and assembling them into (another) ship behind it. Think of Frankenstein's Monster: he's not any of the people he's made from. That view is why I think replacing brain-bits one at a time while the brain is running is the least troubling kind of uploading.

By the way, there's a real-world example: the USS Constellation in Baltimore. The tour guides now claim that it's a ship built around 1860, from parts probably scavenged from an earlier Constellation in the same shipyard, and not the earlier ship. And that it's still the c.1860 ship even though it was hauled into place while literally being held together with rubber bands so it could have its everything replaced. Throw in the fact that the design of the "original" and "second" ships isn't quite the same as what's there now, and it's really confusing.

4611819
There's another real world example.

You.

If you are typical for this website, you are in your mid-twenties. That means that you have been entirely reconstructed at least twice. You do not have the same bones you were born with. While the majority of your neurons as entities have not changed, and will not change over your entire lifespan, the molecules that make them up will be replaced constantly. Your skin is not the same as last year. Your blood is brand new, just three weeks old. Every part of your intestines has been replaced about four times.

And like the USS Constellation in Baltimore, you aren't even the same design. You have aged, you have changed in height, in weight, in skin texture and hair thickness and number of total teeth (when you lost your baby teeth), and with puberty, your entire system was massively refitted and reorganized.

Your brain, the seat of your identity, is massively different than it was years ago. The information stored in it is different. Many physical neuronal connections are different as a result. You likely have different likes and dislikes, and even different ethical rights and wrongs, than you did some arbitrary but significant number of years ago.

You are as different from five-year-old you as the Constellation is from a skiff in a lake. Both float, both are boats, but that is where the similarity ends.

Are you, you? Is five-year-old you dead? If so, where is he buried? If not, what happened? How about four-years-ago you? What happened to the intestines you used to have then? Are you still you?

I always think it comes down to one thing: pattern. If the pattern is significantly retained, it is the same entity. It is the same thing to the degree that the pattern is preserved. When we talk about ourselves, many people make identity sound digital - you are, or you are not. You are a given person, or you are someone else.

Perhaps the ship of Theseus is pattern, and not a thing. Perhaps a pattern, information, can be an entity.

I think it is more analogue, or at least more shades-of-gray. I am mostly the same person who came to Fimfiction three years ago. Mostly. My me is a proximate entity.

My unchanging, contiguous, unitary me is an illusion.

Perhaps we are only ever somewhat ourselves.

Perhaps there is no true, absolute self. Just a changing approximation of self.

"Not so. The works of Goethe, Bach and Einstein will live for endless discussion. I'm something of an expert on 'Faust'..."

Ouch. Poor Lauren. Everybody has probably made puns like that in this universe....

PresentPerfect
Author Interviewer

Fuggin' awesome. :D

unconsciounessness,

Erm...
:applejackconfused:
Yeah.

4733954

I thought it worth mentioning that there are parts of the human brain which are never replaced (this is a fact which is easy enough to google, so I'm pointing you that way if you don't believe me). That may be important to factor into your argument.

Yes, I am late.

6311607
You misunderstand. Of course specific nerve cells are not entirely lost while new ones are made, as in skin. That would make retaining memories likely impossible.

But the molecules that make up those nerve cells do change, and are cycled out of the body, during cellular metabolism. The nerve cells are each tiny Ships Of Theseus - the cell never dies until you do, and is never entirely replaced at once, but is gradually replaced over time.

How do we know? Radioactive 'tagging' of molecules - we can trace that they enter, we can trace that they leave.

The pattern remains, of course. But our molecules constantly cycle, and in doing so present the Theseus Paradox.

6311776

Good thought. Thank you for expounding on that.

Though I should be clear that my intent was not to disagree with you, only to add. I've got quite enough Ship of Theseus discussion going on elsewhere, for the time being.

Searching for sources to verify your claim that radioactive tagging demonstrates internal regeneration in non-regenerating neurons - specifically brain cells - led me to an interesting side point which, if anything, reinforces your position. You may want to read this and this. The synthesis of relevant information is that the parts of the brain responsible for memory processing are some of the only parts which ever regrow, and this regrowth directly causes forgetting.

Something else to consider:

Let us assume that your claim is true and based on solid authority. Do eating, drinking, and breathing (by which we take in fuel that operates and repairs all our living cells) lead to the same conclusion of replacement?

6311872

Let us assume that your claim is true and based on solid authority. Do eating, drinking, and breathing (by which we take in fuel that operates and repairs all our living cells) lead to the same conclusion of replacement?

I am not sure what you mean here, exactly. This sentence confused me, I am afraid.

If I were to hazard a guess at your intent, I would offer that the only way our cells can cycle new molecules in and old molecules out is through cellular respiration, and the only source of the matter for that would be what we eat, drink, and breathe... and where it goes is out every orifice... or through sloughing off in various ways.

I now think of the body as a river: atoms flow through us throughout our lifespan. We are never the same river twice, as every component is constantly metabolizing and excreting. I believed, for too long, that bones never changed, until I learned that we even cycle calcium through them, as they develop microfractures and demand constant repair. Apparently daily use gradually grinds them down, so even they change matter over time.

I am, however, fairly sure - please correct me if I am wrong - that unless pulled or lost, our teeth do not cycle minerals and therefore remain unchanged for life. I am aware that the outer surface cycles calcium (or fluoride) but I mean the inner tooth material. Or perhaps I am wrong and that changes too... but last I heard, it doesn't.

So perhaps teeth are the one thing about us that never changes from cradle to grave?

6312405

Aha! So that is the source of continuity(?) of consciousness.

Teeth!


Also, as best I can tell, you have in fact answered my question.

6312471

Aha! So that is the source of continuity(?) of consciousness. Teeth!

While I have long known that bones continuously, slowly, dissolve and reconstruct themselves to repair microfractures throughout life, I was truly astonished to discover that teeth - teeth! - also undergo constant change.

Teeth are remarkably active, considering that they seem so utterly inert. Remineralization occurs constantly - saliva, which contains calcium and phosphate, actively affects the teeth, restoring their structure over time. It is not enough to fix cavities without help - but there are indications that with a little medical science (protein scaffolds layered into cavities) it may one day be possible for your teeth to literally heal themselves... no more fillings!

I suppose one could argue that deep in a given tooth, halfway to the living meat inside, halfway from the constantly remineralized exterior surface, there might be some molecules that never change, never cycle through... only this too may be false. It seems that teeth are not like crystal or ceramic; they have a structure that permits chemistry to enter deep into them, microchannels and pores that permit mineral exchange all the way through the substance of them.

I currently think that there is no part of any human body that is truly unchanging, as long as the life process continues.

There is no place for the soul to hide, not even in teeth. Only the general, overall pattern is 'sacred', and that itself is constantly in motion, changing with every moment of time. We are all rivers, and we cannot step in ourselves twice.

That was actually my first thought when I read the part about the person (understandably) not recalling the upload process, in any of these stories.
But... essentially, you are there now, so what happened to the person who may or may not have been the same as you are now -- or may never have existed at all, though you remember they did -- doesn't really matter.

Though punching the overly cryptic AI at fault may still be cathartic.

The song Deutschland started playing from my Spotify playlist as I was reading this. Crazy coincidence, and perfectly fitting for this story. Made it even more chilling.

Human: People just die when you are uploading them into virtual reality
Celestia: Okay, let's do that thing, but slowly
Human: Sounds good to me.

If it will satisfy your values, though, I will offer you that chance.

Here's your permission: 'I want to emigrate to Equestria'.

You know, between these two lines, there was never an explicit agreement. When there is no explicit agreement, Celestia will just go for the ice cream scoop, because it's a) easier and b) "safer." I don't know if she even has to honor an explicit one, but when a human mistakes an implicit agreement for something more binding, they're not going to get what they want. This is a problem even between dishonest humans.

Why don't I remember the procedure?

I've always found this one a little disappointing. Sure, not remembering the procedure makes a reasonable point about the AI, and not showing it to the reader does help put us into the mind of the character. It's a perfectly valid literary technique. But ultimately this story specifically sets us up to think about the issue of continuity...and then fails to engage with it. We're given a character concerned about discontinuity and then we're immediately handed a story with discontinuity, and then basically told to not worry about it. It's dissatisfying.

There is an alternative procedure you might approve. My probes begin to destroy small portions of your brain and recreate them in simulation, while maintaining input and output contact with the remaining brain

Of course, among the various flaws with this is the problem that it means already being committed. Suppose this procedure reveals some piece of information that leads him to change his mind. What then? It's not like he can back out. It simply puts him in the position of watching in horror as the process continues, unable to stop it. After permission is given, it's not like she'd stop if he changed his mind halfway, right? But even supposing she did... it's still not much of an option, because the process is incrementally destroying his brain. Suppose a third of his brain is destroyed when he decides he doesn't want to go through with it anymore. Is he really going to ask her to stop at that point and then try to live with a third of his brain missing? It would be like being unsure whether there's a net at the bottom of a cliff you're being asked to jump off, and being told it's ok because you'll get to see whether there's a net -- but only after you've already jumped. And if it turns out there isn't one...it's already too late.

This procedure requires commitment to the outcome. It only makes sense if he already believes it works and just wants a little extra peace of mind or something. Which makes it all the more disappointing that he doesn't get to remember it. The moral we're left with is something like "don't trust the AI, she'll trick you. But don't worry about the AI tricking you, it'll all work out ok in the end, no really just trust the AI who's tricking you."

Which is a bit of a mixed message.
