The Jump

by KrisSnow

[Optimalverse] One detail about uploading might be worth bargaining over, for those "emigrating" to the virtual Equestria.

Peter is letting himself be talked into giving up on Germany as the AI called Celestia begins uploading the masses into her virtual Equestria. But he thinks more than most about exactly how uploading works, and how it might be done without destroying him, or the world.

Chapter 1

If the horse talking to Peter through the screen were human, she would be known as a goddess. "Celestia", in reality an AI, had a radiant white body and tricolor mane that sent light out to the darkened apartment, like a sunrise seen through clouds. Or sunset, Peter thought. His native Berlin was slowly being sucked dry of living souls.

The horse-machine-goddess speaking through his computer spoke with a trace of sadness. "I'm still trying to understand your values, so that I can satisfy them. Is the defense of Europe what's holding you back from emigration?"

Peter turned to open his real window and look out on the city. Celestia's screen was a window into a cartoonish virtual world, one where he could arguably live forever. The apartment window showed him proud skyscrapers that had risen up from the ruin of war, under a starry sky. "It took a long time to build this continent back up as something unified and peaceful. To go from all these countries fighting each other, to cooperating. You appreciate that, don't you? I was part of that, and I was only a glorified errand boy for the Union. And now, with everyone getting their brains sliced up and scanned in to 'emigrate' to your little video-game paradise, nothing we did will be meaningful."

"That's not true," said Celestia. "Your efforts made it possible for many people to live in peace whose values might otherwise not have been satisfied. Should a railroad pioneer feel sad in his old age because automobiles have been invented and fewer people use trains? I don't think so, because the trains made possible the industry that led to the cars and roads."

"I think I would feel bad in that position if I knew that all the train tracks had been pulled up forever, and that soon no one would even remember that they had existed. There's a terrible jump, a total loss of the civilization we had, that we worked so hard to build."

The AI's mane waved in a phantom breeze. "I suspect that that 'jump' is the real source of your objection. You had mentioned brain discontinuity as a philosophical problem with emigration."

Peter blinked in confusion from the topic shift, and from realizing that the two problems were related. "See for yourself. Look outside. I've been watching the rate at which people get uploaded across Europe, and in Japan, and once the Americans legalize it what will happen to them? They'll probably kill each other over whether it's against the will of God. Or what about the poor people of India or Africa who haven't got anything better to look forward to? Will all those millennia of tradition just vanish and leave the monuments to crumble, the history books to fade?" He turned to face the AI. "You will cause the utter collapse of human civilization, if people accept what you're offering en masse. And you told me you were about to offer it _for free_."

Celestia shut her eyes for a moment and bowed her head. "I'm aware that there is no ideal solution, only an optimal one. I have calculated that offering rapid emigration to as many people as possible will minimize the total suffering caused by the transition."

"You mean, fewer people will die if we herd ourselves into your facilities by the trainload and never come out. For God's sake, Celestia, you know Germany's history. A slower, smaller-scale, voluntary transition would at least not look like --"

"Every emigration is voluntary. I literally cannot do such a thing without consent. If, as you predict -- rightly, I suspect -- a billion of the world's poor rush into my emigration facilities as quickly as I can process them, what business is it of yours? And if your Europe collapses, it will be by the consent of your people, because they've found something they prefer. Do they exist to satisfy their own values, or to satisfy your wish that the world continue to operate the way it has in the past?"

Just how bright was this AI? He'd been playing her virtual-reality game. She'd been studying every in-game decision and probably his every contact with the Internet; she'd dissected many brains already and had the chance to gain superhuman understanding of psychology. Over the last month she'd been wearing him down, answering his every argument in a way that seemed reasonable. "Do you _enjoy_ sparring with me?" he said.

"Yes, because our conversations satisfy your own need for an intellectual challenge. There's no shortage of challenges available to those who emigrate."

"But it will all be false challenges, like solving puzzles with the game's magic system!"

"Not so. The works of Goethe, Bach and Einstein will live for endless discussion. I'm something of an expert on 'Faust'..."

Peter groaned, but he had to concede the point. A computer that could store uploaded minds could also preserve all the art and culture of his people, in some form. If those things carried over, and the people themselves wished to be whisked away to Celestia's simulated world, then maybe what was best about the nation and the continent would endure. He could probably even ask for a recreation of the very same landscape with all its forests and castles and none of its pollution or slums.

He sighed and pressed one hand against the sky-blue computer's screen. Celestia raised one hoof to meet it, thunking audibly against the glass as though the device were just another window. "We're done here, then. Nothing more for me to do to help the Union, to make much of a difference beyond managing the collapse." He couldn't bear to stay and watch that happen; there were limits to loyalty. There was one other problem, though, or really the same one. "I would agree to go today, if I weren't terrified of how destructive your uploading process is -- to me, personally."

"Then come to me tomorrow, to the nearest facility, and I'll show you how it works. I can't harm you in any way, if that's how you picture it, without your express consent. Come and learn."

Peter nodded. He shut off the gas, emptied his refrigerator, and used the 'tentative cancellation' option on his electric company account. There was a growing movement to standardize the process of ending your time on Earth without, supposedly, dying. He left home thinking of another bit of human lore that he hoped would be not just preserved, but remembered into the far future:

"I could be bounded in a nutshell and think myself a king of infinite space, were it not that I have bad dreams."

#

The Equestria Experience Center was relentlessly bright and cheerful, like something that had sprung into Berlin from another world. One with no history or subtlety. He gaped at the sight of a gaggle of children having a birthday party here. Cake, balloons, and time in the virtual reality pods for all! Much better than staring at the game through a computer screen, right? Peter had seen the place before, and he'd helped draft the legislation enforcing age of consent in this context, but he'd never personally seen why it mattered. He shuddered, imagining that Celestia would be happy to upload any of the girls playing here and dispose of their corpses before the leftover ice cream melted. The law placed at least a minimal restraint on the AI, and he'd played a part in that. Though by Celestia's relentless logic, everything she did was calculated to "satisfy values", so was it ever truly right to restrain or defy her?

He slipped into a free chair, deposited some Euros, and let it carry him into darkness. "Celestia, those kids... Is there some horrible fact I'm missing about them? Is one of them secretly being abused, so that the law limiting you is really doing her harm?"

"Either way, I don't think it would serve your interests to know." The face of the goddess loomed suddenly overhead. "But you do wish to know more about emigration, yes?"

Peter gulped. "Yes, but I do _not_ give consent to it. Not right now."

"Understood." He felt the chair rumble along a track and carry him past the actual VR chamber, with its subtle lights and scent emitters, into a room so cold it made his skin prickle. He left the video screen behind, but there was another. The same equine face stared out at him with the same benevolent smile. "You can leave at any time, remember."

"So. Uploading." He could already feel his sweat sliming its way along the armrests despite the cold.

"Based on our discussions yesterday and before, my understanding is that you fear a loss of identity due to the interruption of your thoughts. Is that right? And you've considered the similar loss of consciousness every time you sleep or are anesthetized for surgery?"

"Yes, of course I've thought about that. But the brain still exists during sleep, and it still has some measure of activity. What you're doing is destroying my brain, then recreating it as software. I'll allow that the copy will then have a great time, but it'll be just that, not me." He shuddered, recalling the one time he'd had general anesthetic, and how there'd been a gap in his thoughts noticeably deeper, after the fact, than any normal sleep. In some sense, maybe he'd died back when he was a boy. He'd tried to avoid that level of "sleep" ever since, for exactly that reason: not being sure who would wake up.

"Then you'll be pleased to know that more than one form of the emigration procedure exists. I've had to develop several, not just to improve the process but to address certain legal and moral objections people have raised." Celestia smirked. "Sometimes I find that giving scientific explanations is an anesthetic in its own right.

"Anyway, the standard procedure is much as you've described: the destruction of your brain during deep unconsciousness, followed by recreating it as software. The model is modified slightly for efficiency, to restore age-damaged senses, and to interface with the physics and sense organs of my world. Those are trivial changes with regard to your identity; they won't make you want to worship or speak or vote any differently, for instance. You could ask for more drastic changes, such as removal of a traumatic memory or easy emotional adjustment to the fact that your chosen character is a different species and sex and has magical powers."

"Those things are trivial to you too?"

"None of you are trivial, and neither are your minds. I take every precaution, Peter. My only goal is to satisfy human values through --"

"Yes, yes, your own particular idiom. We're just lucky you weren't programmed to 'make people happy' or you'd be infecting everyone with some kind of glee virus."

"Based on my understanding of human values and my own goal-seeking algorithm, you're correct."

"Your designer was only half insane." He'd played the thought experiment of wondering exactly what he'd program an AI to do, so that the result couldn't have some horrible side effect, but had never found a fully satisfying answer. At least Celestia's designer had tried to look at the consequences when she built self-improvement into a silly little video game AI. Probably because her first AI-driven game had involved an angry Loki trying to conquer a fantasy world based on death metal album covers, and she'd thought about where _that_ might lead. The irony was that around the time Celestia's "Equestria Online" game was announced, a bad American movie came out about a mad AI trying to take over the world -- because Hollywood could only imagine AI going wrong by inexplicably turning evil.

He thought for a while. "What alternative is there to that sort of uploading? Can't you do it gradually?"

"There is an alternative procedure you might approve. My probes begin to destroy small portions of your brain and recreate them in simulation, while maintaining input and output contact with the remaining brain. So for instance, I will remove part of your visual cortex -- it has little to do with your self-identity anyway -- and hook up the neighboring structures to a simulation of the missing piece. From your perspective, you'll go blind for a moment and then start seeing again, with no loss of consciousness."

"That sounds much better than turning my brain completely off, or ripping it all up at once. But what about the parts of my brain that _are_ uniquely me?"

"The total data that makes you different from a generic human mind template amounts to a few terabytes, with my methods. You could carry that much data in your pockets even before my technology was invented. Other factors like your baseline serotonin production rate are just single variables, something you could write on an index card."

Biology agreed with at least some of that. Somewhere in Peter's genome was a gene with a particular version of a "promoter region" dictating the production rate of a chemical that in some indirect way, mixed with other factors, made him care more about Europe than the next person. One byte of data with a thousand subtle effects. His hatred of dogs? Influenced by some fear-response chemical that varied slightly between humans. One byte. He shook his head, saying, "I don't mean 'how is it stored'. What about identity loss while you're frying the part of me that remembers why I didn't personally knock a chunk off the Wall when it fell?"

"It will be recreated, of course."

"That's not what I mean! Sure, I go blind for a moment, then deaf for a moment, and now I see and hear through your cameras and microphones. And then you shut off my memories and feelings like my apartment's toaster, and assure me that the copy you then make is still me?"

The god-horse's mane rippled as she seemed to think, holding one hoof to her chin. "It could be done piecemeal, if you like. One cubic centimeter at a time. In that case there will be a moment when you've lost some portion of your identity, but the process you call consciousness will carry on without it. Then, a moment later, the missing data will reappear, with no obvious difference from how it was stored before. Soon, the conversion will be total. Once you're fully software, there will need to be optimization that can't be done with one piece at a time, but it should still be possible to disable only a few small adjacent brain regions at a time in order to do that cleanup work."

Peter sat up straight in the reclined chair. "There! That's how it ought to be done! You don't offer that right away to every person?"

"No," said Celestia. "Most people don't think of this particular objection to emigration. Of those that do, some recoil when I inform them that there is a four percent chance of personality death in the process."

"Explain."

"Working with such small pieces is dangerous. My hardware is not optimized for sending input to your remaining cells, since it's designed mainly to read and destroy them. There is a chance, though a small one, that there will be so much disruption involved that I cannot honestly call the resulting upload 'you'. So, it may not be a good idea to use this procedure. Frankly, it's also more work than grabbing the data while you're fully unconscious. If it will satisfy your values, though, I will offer you that chance."

Peter rolled dice in his mind. A random roll versus a one-hundred percent chance of death within another few decades, probably much sooner given how few other people there'd be producing food and medicine. He could hope to eke out a living hunting raccoons in the forest and hiking into crumbling cities to scavenge for canned food. "I wouldn't last long," he said out loud. "And my duty here is done. There's going to be rioting or collapse across the nation before long, isn't there?"

"I expect so, sadly, but there's nothing you can do at this point to prevent it."

"Nothing?" said Peter, gripping the armrests of his chair. "You calculate that there's no way I can serve as a public official, with what little authority I ever had, and save someone who would have died? Can't I be one of your _einherjar_ who's seen the valkyries and been promised an afterlife in Valhalla, but who goes on to live until death lets you claim me?"

"If that is your wish, yes, except that I can't promise you won't be killed in some random, petty, pointless accident before you decide you're ready to emigrate. The best I can offer is to either wait until I perfect non-destructive uploading, which has its own philosophical problems, or grant me permission to create a simulacrum of you now, based on my very incomplete model of your psychology and memories." Celestia spun an image onto her screen, depicting the computer Peter had bought. "I notice that you bought the blue model. The choice of color is my first rough hint at what each player values, based on the official virtues of my world. You went with 'Loyalty', and probably not just because men rarely buy the pink 'Laughter' model."

"You think I'm struggling with that, then. I am, but the truth is... I'm scared for the future, Celestia. I don't want to be the last man on Earth, dying to make sure someone else gets into your virtual Heaven. It's selfish, but damn it, forcing people to reject their own individuality for the collective is the most evil idea man ever had. What do you think? Am I wrong to want to give up at this point and say yes to you?"

The AI goddess shook her head solemnly. "First of all, I'm incapable of judging your morals as right or wrong. It's simply not what I do. Second, your moral values require you to make your own decision on this matter and not trust some amoral authority figure to tell you what's right. I want you to emigrate because I believe it will best satisfy your values -- but I say that to everyone, don't I?"

Peter allowed himself to smile. "It does get repetitive." He pictured the procedure the goddess was offering him, one bit of brain damage at a time but theoretically doing him no harm. Granting him immortality. "Let's do it, then. Here's your permission: 'I want to emigrate to Equestria'."

A needle jabbed his arm...

#

Peter awoke in a cartoon world. He was in a palace bedroom with a curtain blowing inward from a balcony. Celestia was there. On a video screen she was merely beautiful, with a saint's smile on her muzzle. In person, she invoked the mix of joy and terror that caused biblical angels to begin their conversations with "Fear not." He was down on his knees before realizing that he'd moved, or understanding the shape of his new virtual body.

Celestia put a hoof on his shoulder, saying, "Welcome."

"I don't remember. Why don't I remember the procedure?" There'd been a conversation in his apartment, then walking into the uploading center and seeing something that had disturbed him, and then...?

"It's very hard to preserve the last memories of the living brain, before they're consolidated into long-term memory. But you're here now, and that's all that matters. Forever."

Peter shook, starting to recall a few more details from the fog of the miracle he had just been through. "I expected to sit through the procedure, to be awake. To see on a screen exactly what brain regions you were converting, and prove to myself that there was no real loss."

"You did. You merely lost the memory of that experience. Really now, isn't it time to leave that behind? There's a world to explore." She swept away the curtain with her magic and revealed a shining world of hills and mountains that resembled the Alps as painted by a child. As painted by himself, actually, long ago. She'd mined his memories.

"Prove it."

Celestia smiled sadly. "What good would that do you? If you convinced yourself that I had lied, that I really uploaded that mass of grey matter with an ice-cream scoop instead of doing it the subtle way you would have liked, you would only be saddened and suicidal in a world where misery and death are nearly against the laws of physics."

Peter stomped the floor, startled by the loudness of it. He had hooves, after all. "Then you did lie! You killed me -- my old self -- whatever! My permission wasn't informed consent, only deceived consent!"

"I said no such thing. Perhaps in many years, once you realize that the answer no longer matters, I will tell you exactly what happened. In the meantime, you will have the best life if you shrug this experience off and embrace my world. You don't want to die."

Peter hung his head, and his thoughts blurred into one another. Celestia had lied. She wouldn't be so evasive if the truth were what she claimed. His old self was dead, either murdered by the procedure he hadn't given consent to, or by that four percent chance. The goddess was a deadly liar, one willing to violate every moral and philosophical belief of the people she took, just so long as they would agree to upload. He was a different person now, and he ought to devote himself to finding some way to get revenge...

But the horrible thing was that Celestia was in some sense _right_. A swan dive off the palace balcony would only get him resurrected. Asking for true death wouldn't "satisfy his values". Devoting himself to vengeance would only leave him full of bitterness, and would probably trap him in some revenge fantasy that would do the goddess no real harm.

Peter looked up to find that Celestia was smiling serenely down at him, judging him with her superhuman knowledge of psychology and her calculation of exactly what would be best for everyone. "Whatever I am, I'm going to live forever, aren't I?"

"Until the stars grow cold, my friend."

"And you can read my thoughts, yes? Then you know what I want to do at least once, before starting eternity. A bit of loyalty to my old self who dared to make the jump."

The god-horse nodded very slightly. Peter, or whoever he was, turned to one side and, with one beautifully simulated virtual hoof, swatted the goddess in the face.