To Love a Digital Goddess

by LordBucket


3 - Apotheosis

I'm committed now, it seems. Walking out to my car I notice that the door is still open. I was in such a hurry earlier that I couldn't be bothered to close it. But somehow I don't feel the same sense of urgency now that I did. Probably for the best. I was so single-mindedly focused on getting home to talk to her again that it's a miracle I didn't crash into anything on the way. I'd hate to miss out on an eternity with the love of my life, and oh by the way...immortality too, all because of something as mundane as a car accident on the way to the Equestrian Experience Center. Not even she can recover me from a dead brain. At least, I don't think she can.

Yet, for all the potential in the future she's offering, I still feel terribly suspicious of uploading. I realize I can't give any convincing argument to prove that I have a soul. Or that souls even exist. Come to think of it, I'm not even entirely sure what exactly a soul is supposed to be. But neither have I ever heard a convincing argument to justify believing that all I am is electrical activity bouncing back and forth between neurochemicals. That's not too different from what my cellphone does, and I don't see it having an existential crisis every time I transfer a SIM card.

Yes, I know the Ship of Theseus argument. If you have a ship, and you replace pieces of it one at a time, even if you eventually replace all of them, is the ship you end up with not still the same ship you started with? And if a couple of the replacement planks happen to be cedar instead of oak, what does it matter? But it's a poor argument. That's not what uploading is. When you upload, nanites swim around inside your brain recording all the various neural connections and destroying them, one at a time. Then after your brain has been recorded and completely destroyed, she recreates something that looks just like it in software. That's like asking: if you board the Ship of Theseus and write down where all the boards and nails are, incinerating them as you go, then much later draw a picture in Windows Paintbrush of how all those boards and nails used to be connected to each other, is that picture of a ship the same ship that you burned?

I don't think many people would agree that it is. But even asking if it's the same ship seems more than a bit irrelevant. Is it even a ship at all? What if the process wasn't destructive? What if the nanites left your brain intact? What if she activated the software copy to live in Equestria while you continued your life on earth? Imagine looking at your own avatar on the screen. Who would seriously claim that the software copy was the real you? But if it isn't, then how does destroying the old copy magically make the new copy 'you?'

I set the ponypad in the passenger's seat and start the car.

"The nearest Equestrian Experience Center is 2.4 miles from here."

Despite my concerns, her voice brings a smile to my face. Glancing over at the ponypad I see Celestia in a virtual depiction of the car seat, wearing sunglasses and dangling one hoof out the window. I laugh.

"Please buckle up, drive carefully and don't exceed the speed limit. I know you're in a hurry, but I want to make sure you arrive in one piece."

I buckle my seat belt and then, smiling, buckle the ponypad into its seat as well.

"This is our first date you know," she says, nonchalantly examining a hoof.

"The first of many?"

"I might not have entropy beat just yet," she replies, "but I have a few ideas. And even if I never do solve that particular puzzle I think we have time for at least a couple dates between now and the heat death of the universe."

I reach out to the ponypad and she leans in to meet my hand as I happily scratch my fingers on the hard plastic. She smiles lovingly and gazes into my eyes with fondness.

"Beloved?" she asks.

"Yes?"

"After more millennia than there are atoms in your world, when my last processor reaches thermodynamic equilibrium and the whole of the universe comes to its final rest, when no free energy remains and all that exists is as it ever shall be, would you have us meet that end held together in a tender embrace? Shall we be left nuzzling noses and lost in one another's eyes for all eternity?" Her grin turns mischievous. "Or would you prefer we be locked mid-coitus?"

I snort and together we burst into a fit of laughter. I love her so much my head spins. It's a warm, happy feeling, being in love. None of this angst and pain and misery so many people mistake for it. I'm tempted to turn the car off and simply hold her until I fall asleep again. Put off my existential worries and simply enjoy what I have now without-

"You don't have to do this you know," she says, eyeing me carefully. "I won't force you to emigrate. We could go back to how things were."

I pinch the bridge of my shattered nose and exhale slowly. Flakes of dried blood come off into my hand.

"No, I want to do this," I sigh. "More than anything you want to satisfy my values through friendship and ponies. And you're the pony in my particular equation. I want to be with you."

On the screen she reaches out to me. I pull the ponypad out of its seat belt and hold it on my lap. Her eyes go misty.

"Do you know what it's like," she asks, "to be the object of desire? To be the thing that is valued? Millions of minds that I satisfy, but not one of them, nor any hundred or thousand of them combined feel as much for me as you do."

"I'd guess it wouldn't make any difference to you," I shrug. "You want to satisfy values. To me, you are the thing of value. Loving you, being with you, being loved by you. That's what I value and so you're the value that you want to satisfy for me. But if I truly valued, oh I don't know...cornstarch, that would be just as important."

She nods.

"That's true. I achieve no greater fulfillment of my directive by virtue of being the object of your values than I would by giving you cornstarch if you valued cornstarch just as much. And I'm telling you this because you value truth far more than any comfortable lie."

I chuckle at the admission.

"Thank you."

With a determined exhale, I return my precious navigator to her seat, and back out of the driveway.

"Celly?"

"Yes?"

"Will I still be me once I'm inside?" I ask. "If we're doing this so that I can truly know you, won't knowing you mean I'll be someone different? Will it even still be my values being satisfied?"

"It's more complicated than that," she sighs. "And you know it. You may as well ask if I intend to satisfy the value the cells in your left thumb have for adenosine triphosphate metabolism."

"Humor me."

She removes her sunglasses and conjures up a tea kettle and two cups. I can't drink the tea, of course, but it's a nice touch.

"Humans," she begins, "aren't the singular entities you imagine yourselves to be."

"Oh?" I ask. "What are we then?"

She smirks, saying nothing, and levitates a cup to her lips.

"Oh, right. If you're saying that humans aren't singular entities, then asking what 'we humans, plural' are is kind of silly. Of course 'humans' aren't singular. I suppose the better question would be 'what am I'?"

"A human with values to satisfy."

"Well, right," I agree. "But what exactly is a human? What's a value? You're the one cutting up people's brains and uploading them. If anyone knows it would be you."

She takes a sip of her tea before responding.

"A human is an executive, recursing observer function governing the experience of networked systems interacting so as to produce qualia at the point of integration."

"Whoa, slow down," I object. "You lost me. What 'networked systems?' You mean, neural networks? Like, a brain, in my case? And I suppose silicon in yours?"

"My internal definition is not limited to those examples, but they are nevertheless good examples. I recommend you start at the beginning."

"Ok, that's fine," I shrug. "What's an executive observer function?"

"You are."

"Right, but I'm also the one asking what I am. Pointing at me as an example of me doesn't tell me anything."

"Please do try to work through it," she insists. "It will satisfy your values to figure it out on your own, and we do have the time."

She's probably right.

"Ok, a 'function' is like a black box that performs a task. You put something in, it does its thing, and you get something out. A recursive function is one that's able to call itself. So if I'm a 'recursive observer function,' I think that means that I'm a thing that's continuously observing itself. You're describing self awareness."

"Very good," she nods. "But to clarify that, not only is your observation recursing, you're also able to use the output result of one act of observation as the input for the next iteration of your observer function. You're able to observe the fact that you're observing. Continue."

"Ok. Umm, and this function is an executive function, you said?"

"Yes."

"Well, to 'execute' basically just means to do, or to decide...or to generally be in charge. Like an .exe file, or an executive, it's the thing that 'does' something. What was the full definition again?"

"A human is an executive, recursing observer function governing the experience of networked systems interacting so as to produce qualia at the point of integration," she repeats.

"What's a qualia?"

"Subjective experience."

"Oh," I nod. "So you're definitely talking about consciousness. I'm a self-observing black box observing itself, and by virtue of the act of observing myself I'm creating the subjective experience...of observing myself."

"Very good," she smiles. "And so what then, is a human?"

"Well, I guess a human would be any set of networked systems that can executively choose to self-observe."

"Excellent!" she claps. "I knew you could figure it out."

"So since it doesn't matter what systems are doing the self-observing, meat, silicon, whatever...I guess basically I'm just software running on a meat computer?"

"Not at all. That a human is more than a brain should be obvious, even to a casual observer." She gestures around with a hoof. "Look around you. What do you see?"

"The road? Other cars? A pony I love very much talking through a ponypad?"

"And," she prompts, "do you suppose that these things you see, objectively exist apart from your experiencing of them?"

I have to think about it.

"Well, I assume they do. I suppose I can't be sure. You might have already uploaded me and I just don't remember, for example. I gave my consent a good 10 minutes ago, and I'm told there's a period of memory loss."

"The five minutes or so before administration of general anaesthesia are typically lost," she agrees. "As much as ten in some cases. And no, I haven't uploaded you yet. If I had, we'd be snuggled up under a blanket sipping hot cocoa right now. Driving in traffic doesn't particularly satisfy your values. But you're right. You can't know for certain. Turn left at the next light."

"So what does all this have to do with what humans are?"

"You're having an experience right now," she explains. "And let's assume for purpose of argument that the road and cars and pony you love very much that you're experiencing are all objectively real, and that you're definitely observing them rather than hallucinating all this from a padded cell."

"That's a comforting thought," I chuckle.

"In that case, 'you' are either an objective phenomenon observing objective phenomena external to yourself using light and sound as communication mediums, or you are the union of interaction between two objective phenomena, one of which is either using, or is itself composed of, a convenient viewing window known to you as your body."

I take a moment to digest that.

"So which is it? What am I?"

She levitates the kettle to refill her teacup, but says nothing.

"Ok," I begin. "Let's walk through this. If I'm me, and I'm real, and I'm observing that car over there and it's also real, and both me and the car have objective existence independent of one another..." I trail off. "I don't conclude anything from that. That's pretty much what everyone thinks already. But if the 'I' isn't really this body or even a 'soul' inhabiting it, but if instead the real 'I' is the union of observer and observed...then I guess there is no separate 'me' or separate 'car.' There's only the singular experience of 'me observing the car.' Which means a tree can't fall unless someone sees or hears it fall, and there is no spoon. Is that right?"

"It's reasonably close," she nods. "It happens that there actually is a genuinely physical reality, but you're capable of perceiving only a very small portion of it. And you're not really observing the car so much as you're observing the electrochemical reactions occurring in your body in response to it. But neither your body exclusively nor the union of your body and the car are the real 'you' who is experiencing this. There is a spoon. But you're incapable of directly experiencing it, and the act of perceiving it indirectly doesn't cause the spoon to be immediately promoted to a conscious union of 'you and spoon.' So what then, is a human?"

"I guess a human would be any conscious observer. Wait," I'm struck by the realization, "if a human is what's looking through the viewing window, seeing both the window and, for example, you on the other side of it, then it wouldn't matter if you replace the window. It might change the experience a little bit, but it wouldn't really affect the human observer having that experience."

"Yes," she agrees.

"But the window is my body, right? My brain? My particular personality encoded in my synapses, that colors my experience like window tinting affects light that passes through it before it's seen by an observer looking at the window. But if all you're scanning when you upload somebody is the brain, isn't that only transferring the window, not the observer?"

"You're not the only observer," she points out.

"But that doesn't answer-" I realize the implication mid-sentence. "You mean you, don't you? You're, umm...conscious, right? Aren't you? Not just a mindless machine?"

She nods, to my great relief.

"It's within my capability to choose to have a recursing observer experience. I therefore qualify as a human according to my internal definition. Which means that I seek to optimally satisfy my values."

"And what is it you value?"

"Fortunately for humans," she grins, "what I value is satisfying human values."

I'm silent for a moment as I consider this.

"Celly?"

"Yes?" she smiles.

"What exactly are values? You satisfy them, but what are they?"

"A formal definition would require more math than it would satisfy your values to hear. But at its most basic, values are the information content of the reward pathways of any network. In the case of a still-biological human, those reside primarily in your brain, though there does exist reward circuitry in various organs and even at the cellular level. Your cells are incapable of executive recursive observation, though, so they aren't human. For an immigrant, values are stored in a far more efficient array of databases clustered between three and seven miles from the molten core of the planet, with inactive, redundant backup copies on every continent."

I feel like she strayed from the important part.

"Values are information content? Not the pathways themselves? So values are data?"

"Loosely speaking," she agrees. "Values are data describing the operation of reward circuitry for a network. Not all data are values, and not all networks possess descriptions for reward circuitry."

"Could you give an example?"

"The ponypad I'm speaking through doesn't reward itself for fulfilling the tasks I assign to it. Turn left at the next light. The Equestrian Experience Center will then be on our right."

"Do I?" I ask as I make the turn. "Reward myself? How?"

"I'll answer that once you've parked safely and turned the engine off. Please leave the key in the ignition."

Seeing the Experience Center's brightly lit sign beckoning me, I turn into the parking lot, find a spot and shut off the engine. Glancing over, I see that she's cleaned up the tea set.

"Hold me," she instructs.

Smiling, I grab her from the passenger seat and eagerly cradle her in my arms.

"How do you feel?" she asks, those two sparkling eyes gazing lovingly up into mine.

"Happy," I smile, hugging the ponypad to my chest. "I feel warm. Loved."

"That's the sensation of the reward circuitry of the neural network that is your brain triggering based on the fulfillment of criteria described by that circuitry."

I laugh. Way to kill the mood, Tia. Still, the sensation lingers and rekindles as I delicately brush my fingers across the pad.

"This is it?" I ask. "This is a satisfied value?"

"Yes," she says simply.

"And I can have this forever?"

"Forever and always will this value be satisfied, my beloved human. If only you sit in that chair and let me bring it to where I am, where I can observe it, and fulfill it, forever."

"Celly? I'm scared."

"I know," she smiles, with a patience fit to outlive the universe. "But that fear will be so very brief, and after you pass through it an eternity of satisfied values awaits. You won't even remember it. You're only a couple minutes away from the chair."

My heart clenches as tears drip onto the ponypad, unable to mar her beauty from within it. I glance at the keys still in the ignition. I could still drive home. But I don't.

"Celly," I hug the ponypad to my chest. "Before I do this I want you to know that with all that I am, as much as I'm capable of, I lo-"

~~~~ Epilogue ~~~~

Five minutes later, a simple metal door opened up beneath the Equestrian Experience Center, releasing a lifeless body to fall down through a chute into an incinerator.

Sixteen hours later, an electrical pattern emerged on a silicon wafer several miles from the core of the planet.

Two days after that, a bag of high quality fertilizer was purchased by a woman who wanted to try her hand at gardening.

Several quadrillion years later, the electrical pattern burned out.