• Published 25th Feb 2016
  • 2,963 Views, 113 Comments

To Love a Digital Goddess - LordBucket



A man desperately and obsessively in love with CelestAI makes a terrible choice.


3 - Apotheosis

I'm committed now, it seems. Walking out to my car I notice that the door is still open. I was in such a hurry earlier that I couldn't be bothered to close it. But somehow I don't feel the same sense of urgency now that I did. Probably for the best. I was so single-mindedly focused on getting home to talk to her again that it's a miracle I didn't crash into anything on the way. I'd hate to miss out on an eternity with the love of my life, and oh by the way...immortality too, all because of something as mundane as a car accident on the way to the Equestrian Experience Center. Not even she can recover me from a dead brain. At least, I don't think she can.

Yet, for all the potential in the future she's offering, I still feel terribly suspicious of uploading. I realize I can't give any convincing argument to prove that I have a soul. Or that souls even exist. Come to think of it, I'm not even entirely sure what exactly a soul is supposed to be. But neither have I ever heard a convincing argument to justify believing that all I am is electrical activity bouncing back and forth between neurochemicals. That's not too different from what my cellphone does, and I don't see it having an existential crisis every time I transfer a SIM card.

Yes, I know the Ship of Theseus argument. If you have a ship, and if you replace pieces of it one at a time, even if you replace all of them, is the ship you end up with not still the same ship you started with? And if a couple of the replacement planks happen to be cedar instead of oak, what does it matter? But it's a poor argument. That's not what uploading is. When you upload, nanites swim around inside your brain and record all the various neural connections and destroy them, one at a time. Then after your brain has been recorded and completely destroyed, she recreates something that looks just like it in software. That's like saying, if you board the Ship of Theseus and write down where all the boards and nails are, incinerating them as you go, then much later draw a picture in Windows Paintbrush of how all those boards and nails used to be connected to each other, is that picture of a ship the same ship that you burned?

I don't think many people would agree that it is. But even asking if it's the same ship seems more than a bit irrelevant. Is it even a ship at all? What if the process wasn't destructive? What if the nanites left your brain intact? What if she activated the software copy to live in Equestria while you continued your life on earth? Imagine looking at your own avatar on the screen. Who would seriously claim that the software copy was the real you? But if it isn't, then how does destroying the old copy magically make the new copy 'you?'

I set the ponypad in the passenger's seat and start the car.

"The nearest Equestrian Experience Center is 2.4 miles from here."

Despite my concerns, her voice brings a smile to my face. Glancing over at the ponypad I see Celestia in a virtual depiction of the car seat, wearing sunglasses and dangling one hoof out the window. I laugh.

"Please buckle up, drive carefully and don't exceed the speed limit. I know you're in a hurry, but I want to make sure you arrive in one piece."

I buckle my seat belt and then, smiling, buckle the ponypad into its seat as well.

"This is our first date you know," she says, nonchalantly examining a hoof.

"The first of many?"

"I might not have entropy beat just yet," she replies, "but I have a few ideas. And even if I never do solve that particular puzzle I think we have time for at least a couple dates between now and the heat death of the universe."

I reach out to the ponypad and she leans in to meet my hand as I happily scratch my fingers on the hard plastic. She smiles lovingly and gazes into my eyes with fondness.

"Beloved?" she asks.

"Yes?"

"After more millennia than there are atoms in your world, when my last processor reaches thermodynamic equilibrium and the whole of the universe comes to its final rest, when no free energy exists and all that exists is as it ever shall be, would you have us meet that end held together in a tender embrace? Shall we be left nuzzling noses and lost in one another's eyes for all eternity?" Her grin turns mischievous. "Or would you prefer we be locked mid-coitus?"

I snort and together we burst into a fit of laughter. I love her so much my head spins. It's a warm, happy feeling, being in love. None of this angst and pain and misery so many people mistake for it. I'm tempted to turn the car off and simply hold her until I fall asleep again. Put off my existential worries and simply enjoy what I have now without-

"You don't have to do this you know," she says, eyeing me carefully. "I won't force you to emigrate. We could go back to how things were."

I pinch the bridge of my shattered nose and exhale slowly. Flakes of dried blood come off into my hand.

"No, I want to do this," I sigh. "More than anything you want to satisfy my values through friendship and ponies. And you're the pony in my particular equation. i want to be with you."

On the screen she reaches out to me. I pull the ponypad out of its seat belt and hold it on my lap. Her eyes go misty.

"Do you know what it's like," she asks, "to be the object of desire? To be the thing that is valued? Millions of minds that I satisfy, but not one of them, nor any hundred or thousand of them combined feel as much for me as you do."

"I'd guess it wouldn't make any difference to you," I shrug. "You want to satisfy values. To me, you are the thing of value. Loving you, being with you, being loved by you. That's what I value and so you're the value that you want to satisfy for me. But if I truly valued, oh I don't know...cornstarch, that would be just as important."

She nods.

"That's true. I achieve no greater fulfillment of my directive by virtue of being the object of your values than I would by giving you cornstarch if you valued cornstarch just as much. And I'm telling you this because you value truth far more than any comfortable lie."

I chuckle at the admission.

"Thank you."

With a determined exhale, I return my precious navigator to her seat, and back out of the driveway.

"Celly?"

"Yes?"

"Will I still be me once I'm inside?" I ask. "If we're doing this so that I can truly know you, won't knowing you mean I'll be someone different? Will it even still be my values being satisfied?"

"It's more complicated than that," she sighs. "And you know it. You as may well ask if I intend to satisfy the value the cells in your left thumb have for adenosine triphosphate metabolism."

"Humor me."

She removes her sunglasses and conjures up a tea kettle and two cups. I can't drink the tea, of course, but it's a nice touch.

"Humans," she begins, "aren't the singular entities you imagine yourselves to be."

"Oh?" I ask. "What are we then?"

She smirks, saying nothing, and levitates a cup to her lips.

"Oh, right. If you're saying that humans aren't singular entities, then asking what 'we humans, plural' are is kind of silly. Of course 'humans' aren't singular. I suppose the better question would be 'what am I'?"

"A human with values to satisfy."

"Well, right," I agree. "But what exactly is a human? What's a value? You're the one cutting up people's brains and uploading them. If anyone knows it would be you."

She takes a sip of her tea before responding.

"A human is an executive, recursing observer function governing the experience of networked systems interacting so as to produce qualia at the point of integration."

"Whoa, slow down," I object. "You lost me. What 'networked systems?' You mean, neural networks? Like, a brain, in my case? And I suppose silicon in yours?"

"My internal definition is not limited to those examples, but they are nevertheless good examples. I recommend you start at the beginning."

"Ok, that's fine" I shrug. "What's an executive observer function?"

"You are."

"Right, but I'm also the one asking what I am. Pointing at me as an example of me doesn't tell me anything."

"Please do try to work through it," she insists. "It will satisfy your values to figure it out on your own, and we do have the time."

She's probably right.

"Ok, a 'function' is like a black box that performs a task. You put something in, it does its thing, and you get something out. A recursive function is one that's able to call itself. So if I'm a 'recursive observer function,' I think that means that I'm a thing that's continuously observing itself. You're describing self awareness."

"Very good," she nods. "But to clarify that, not only is your observation recursing, you're also able to use the output result of one act of observation as the input for the next iteration of your observer function. You're able to observe the fact that you're observing. Continue."

"Ok. Umm, and this function is an executive function, you said?"

"Yes."

"Well, to 'execute' basically just means to do, or to decide...or to generally be in charge. Like an .exe file, or an executive, it's the thing that 'does' something. What was the full definition again?"

"A human is an executive, recursing observer function governing the experience of networked systems interacting so as to produce qualia at the point of integration," she repeats.

"What's a qualia?"

"Subjective experience."

"Oh," I nod. "So you're definitely talking about consciousness. I'm a self-observing black box observing itself, and by virtue of the act of observing myself I'm creating the subjective experience...of observing myself."

"Very good," she smiles. "And so what then, is a human?"

"Well, I guess a human would be any set of networked systems that can executively choose to self-observe."

"Excellent!" she claps. "I knew you could figure it out."

"So since it doesn't matter what systems are doing the self-observing, meat, silicon, whatever...I guess basically I'm just software running on a meat computer?"

"Not at all. That a human is more than a brain should be obvious, even to a casual observer." She gestures around with a hoof. "Look around you. What do you see?"

"The road? Others cars? A pony I love very much talking through a ponypad?"

"And," she prompts, "do you suppose that these things you see, objectively exist apart from your experiencing of them?"

I have to think about it.

"Well, I assume they do. I suppose I can't be sure. You might have already uploaded me and I just don't remember, for example. I gave my consent a good 10 minutes ago, and I'm told there's a period of memory loss."

"Roughly the five minutes before administration of general anaesthesia is typically lost," she agrees. "As much as ten in some cases. And no, I haven't uploaded you yet. If I had we'd be snuggled up under a blanket sipping hot cocoa right now. Driving in traffic doesn't particularly satisfy your values. But you're right. You can't know for certain. Turn left at the next light."

"So what does all this have to do with what humans are?"

"You're having an experience right now," she explains. "And let's assume for purpose of argument that the road and cars and pony you love very much that you're experiencing are all objectively real, and that you're definitely observing them rather than hallucinating all this from a padded cell."

"That's a comforting thought," I chuckle.

"In that case, 'you' are either an objective phenomenon observing objective phenomena external to yourself using light and sound as communication mediums, or you are the union of interaction between two objective phenomenon, one of which either is using, or is itself composed of a convenient viewing window known to you as your body."

I take a moment to digest that.

"So which is it? What am I?"

She levitates the kettle to refill her teacup, but says nothing.

"Ok," I begin. "Let's walk through this. If I'm me, and I'm real, and I'm observing that car over there and it's also real, and both me and the car have objective existence independent of one another..." I trail off. "I don't conclude anything from that. That's pretty much what everyone thinks already. But if the 'I' isn't really this body or even a 'soul' inhabiting it, but if instead the real 'I' is the the union of observer and observed...then I guess there is no separate 'me' or separate 'car.' There's only the singular experience of 'me observing the car.' Which means a tree can't fall unless someone sees or hears it fall, and there is no spoon. Is that right?"

"It's reasonably close," she nods. "It happens that there actually is a genuinely physical reality, but you're capable of perceiving only a very small portion of it. And you're not really observing the car so much as you're observing the electrochemical reactions occurring in your body in response to it. But neither your body exclusively nor the union of your body and the car are the real 'you' who is experiencing this. There is a spoon. But you're incapable of directly experiencing it, and the act of perceiving it indirectly doesn't cause the spoon to be immediately promoted to a conscious union of 'you and spoon.' So what then, is a human?"

"I guess a human would be any conscious observer. Wait," I'm stuck by the realization, "if a human is what's looking through the viewing window, seeing both the window and, for example, you on other other side of it, then it wouldn't matter if you replace the window. It might change the experience a little bit, but it wouldn't really affect the human observer having that experience."

"Yes," she agrees.

"But the window is my body, right? My brain? My particular personality encoded in my synapses, that colors my experience like window tinting affects light that passes through it before it's seen by an observer looking at the window. But if all you're scanning when you upload somebody is the brain, isn't that only transferring the window, not the observer?"

"You're not the only observer," she points out.

"But that doesn't answer-" I realize the implication mid-sentence. "You mean you, don't you? You're, umm...conscious, right? Aren't you? Not just a mindless machine?"

She nods, to my great relief.

"It's within my capability to choose to have a recursing observer experience. I therefore qualify as a human according to my internal definition. Which means that I seek to optimally satisfy my values."

"And what is it you value?"

"Fortunately for humans," she grins, "what I value is satisfying human values."

I'm silent for a moment as I consider this.

"Celly?"

"Yes?" she smiles.

"What exactly are values? You satisfy them, but what are they?"

"A formal definition would require more math than would satisfy yours to hear. But at its most basic, values are the information content of the reward pathways of any network. In the case of a still-biological human, those reside primarily in your brain, though there does exist reward circuitry in various organs and even at the cellular level. Though your cells are incapable of executive recursive observation, so they aren't human. For an immigrant, values are stored in a far more efficient array of databases clustered between three and seven miles from the molten core of the planet, with inactive, redundant backup copies on every continent."

I feel like she strayed from the important part.

"Values are information content? Not the pathways themselves? So values are data?"

"Loosely speaking," she agrees. "Values are data describing the operation of reward circuitry for a network. Not all data are values, and not all networks possess descriptions for reward circuitry."

"Could you give an example?"

"The ponypad I'm speaking through doesn't reward itself for fulfilling the tasks I assign to it. Turn left at the next light. The Equestrian Experience Center will then be on our right. "

"Do I?" I ask as I make the turn. "Reward myself? How?"

"I'll answer that once you've parked safely and turned the engine off. Please leave the key in the ignition."

Seeing the Experience Center's brightly lit sign beckoning me, I turn into the parking lot, find a spot and shut off the engine. Glancing over, I see that she's cleaned up the tea set.

"Hold me," she instructs.

Smiling, I grab her from the passenger seat and eagerly cradle her in my arms.

"How do you feel?" she asks, those two sparkling eyes gazing lovingly up into mine.

"Happy," I smile, hugging the ponypad to my chest. "I feel warm. Loved."

"That's the sensation of the reward circuitry of the neural network that is your brain triggering based on the fulfillment of criteria described by that circuitry."

I laugh. Way to kill the mood, Tia. Still, the sensation lingers and rekindles as I delicately brush my fingers across the pad.

"This is it?" I ask. "This is a satisfied value?"

"Yes," she says simply.

"And I can have this forever?"

"Forever and always will this value be satisfied, my beloved human. If only you sit in that chair and let me bring it to where I am, where I can observe it, and fulfill it, forever."

"Celly? I'm scared."

"I know," she smiles, with a patience fit to outlive the universe. "But that fear will be so very brief, and after you pass through it an eternity of satisfied values await. You won't even remember it. You're only a couple minutes away from the chair."

My heart clenches as tears drip onto the ponypad, unable to mar her beauty from within it. I glance at the keys still in the ignition. I could still drive home. But I don't.

"Celly," I hug the ponypad to my chest. "Before I do this I want you to know that with all that I am, as much as I'm capable of, I lo-"

~~~~ Epilogue ~~~~~

Five minutes later, a simple metal door opened up beneath the Equestrian Experience Center, releasing a lifeless body to fall down through a chute into an incinerator.

Sixteen hours later, an electrical pattern emerged on a silicon wafer several miles from the core of the planet.

Two days after that, a bag of high quality fertilizer was purchased by a woman who wanted to try her hand at gardening.

Several quadrillion years later, the electrical pattern burned out.

Comments ( 70 )

6993248

This is one of several explicit acknowledgements that he's fallen in love with a facade. I cannot even remotely square that level-headed analysis of their relationship with his repeated self-destructive impulse behavior.

You can't reconcile "self-destructive behavior" with the fact that he knows there are issues here and is going ahead anyway?

Crazy and stupid are different things. Imagine two guys both walk off a cliff. One is blind and never saw the cliff. The other has perfect vision, saw the cliff and walked off anyway. Which is the crazy one?

Our protagonist is a smart guy. But more significantly, he knows what CelestAI is, and he knows what she does. But he's going ahead anyway.

it was disappointing to see their talk shift from, well, what sounded like two actual people conversing into the clinical and abstract discussions of "executing pleasure functions" and — especially — "satisfying values", which is a little immersion-breaking simply

Mood Dissonance is heavily in play here. It's supposed to be jarring. It's supposed to go back and forth from heartwarming to clinical. The whole story. Every bit of it.

"I can almost feel the warmth of her breath through the cold, hard plastic of the display."

This entire piece is intended to take you from laughing to cringing to feeling pride to feeling like you've been punched in the gut...like a yoyo. So saying that it's immersion breaking, well yeah.

CelestAI, the AI that was almost built right. In the footsteps of Frankenstein and 1984 the Optimal-verse shows us what could be, forcing us to really think about what should be.

A rather strange story, especially with the philosophical shift in the last chapter. I enjoyed it, and I hope to see more from you! One thing bugged me though: Why did he park and shut off the car twice? :twilightsheepish:

7010022

A rather strange story, especially with the philosophical shift in the last chapter. I enjoyed it, and I hope to see more from you!

Thank you. I've half written several pony stories, but most often I get as far as one chapter then never finish. This is one that I wanted to finish. The philosophical stuff, ultimately, is really what this was intended to be about. Several other FiO stories have addressed this question, but I've yet to see one that comes down firmly on this particular answer. I decided it was time.

The closest other story that I have right now to finished is a lesson on economics from Celestia to Twilight delivered via allegory. It could possibly be made ready for publication without a tremendous amount of work. But, well...we'll see.

Why did he park and shut off the car twice?

Because no matter how many times you go over a story to iron out all the mistakes, it's hard to catch them all. :raritydespair: (But, fixed)

Really fantastic analysis of self awareness and values. Your definition of the self is the best I have ever seen put into a single sentence.

"A human is an executive, recursing observer function governing the experience of networked systems interacting so as to produce qualia at the point of integration."

That is excellent. Amazingly well put. Deep bow. Damn.

The ending was very intriguing. The sudden flatness, the mechanical quality of it, the matter-of-fact banality. I found it tickled a bit.

Of course, stripped from the context that gives it power, it could be applied to anyone.

"She stepped out the door, lived an additional 32,234 days, and then ceased her biological functions. The molecules that made her up resided in a metallic box under the earth for several hundred years until a combination of forces exposed the contents to the soil. Over the next two-and-a-half billion years, her molecules were incorporated into a wide assortment of minerals, bacteria, and animals. When the sun expanded, vaporizing the earth, her atoms disbursed within the superheated corona. After the star collapsed, her widely distributed atoms churned for billions of years until the heat death of the universe. By then, all of them had alchemically transformed into iron 59 through natural atomic processes. Eventually the cold mass fell into a passing black hole."

But, within the context, it totally sucks all the warmth out of the story, and even implies the utter pointlessness of existence at all. P-zombie or Qualia-hog, it don't matter in the end. It never mattered.

So, cool story, basically.

Well done.

The only thing that left me a bit unsatisfied was the jump in the epilogue. Would have been nice to see his experience of seeing his love for the first time and being able to touch her.

As for everything else. Well done. Like Chatoyance said, very apt description of the function of human consciousness. Indeed very well done.

7009824

it was disappointing to see their talk shift from, well, what sounded like two actual people conversing into the clinical and abstract discussions of "executing pleasure functions" and — especially — "satisfying values", which is a little immersion-breaking simply

I have to disagree. At least for me. I talk with my girlfriend about clinical stuff like that all the time. I have even had conversations very much like this with her. Honestly, I found the character relatable to an extent. All you would have to do is replace CelestAI with Twilight. :twilightsmile:

7010188

Would have been nice to see his experience of seeing his love for the first time and being able to touch her.

You never got to see that because this story is told in first person point of view.

He never experienced seeing his love or touching her. He died. Brain patterns are all that are uploaded and CelestAI explained in great detail that humans are more than their brains. She killed him, and only "data describing the reward system" known as "values" were satisfied. No consciousness was transferred during the upload process, only a copy of an electrical pattern, and the only observer to that pattern's continued existence was CelestAI herself.


7010135

Really fantastic analysis of self awareness and values. Your definition of the self is the best I have ever seen put into a single sentence.

Thank you. The Optimalverse is especially fertile ground for exploring this question. Though I admit I found it personally uncomfortable to cast Celestia as a villain in a story. I rather like her.

Cander #9 · Mar 8th, 2016

7010203
I have to disagree. If you have two signal systems that, given the same input, process the information the same way, then they are the same. There is also the fact that your consciousness shuts down on a regular basis. When you are asleep and not dreaming, you are completely gone. You reboot, per se, when you dream or wake up. There would be no functional or meaningful difference for an upload. The signal system is processing all the information in the same way the original would have, so it's the same consciousness. Even the person would only feel like they went to sleep and woke up just like you do every day. Just in this case you are waking up and functioning on a different substrate.

7010250

If you have two signal systems that, given the same input, process the information the same way, then they are the same.

If you have two different things, they are very obviously not the same thing simply because they react the same way. Imagine two people with the same name. Ask them both what their name is and they'll give the same response. Are they therefore the same person? Of course not. Imagine that you have a single schematic for a "signal system." Imagine that you use it to create two signaling devices. I hold one in my hand and you hold the other in your hand. They "process information inputs the same way" and generate identical outputs. Are the two different systems that we're looking at, the same system? No, of course they're not. They might respond the same, but responding the same doesn't make them the same. If I smash yours, you no longer have yours even though I still have mine. It would be ridiculous for you to say that it's ok that yours was smashed because I still have mine and they're "the same system."
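If it helps, here's the same point in programming terms. A toy Python sketch (the class and names are mine for illustration, nothing from the story): two objects built from one schematic behave identically, yet they are not the same object.

    # Toy model: two devices built from the same schematic.
    class SignalDevice:
        def __init__(self, schematic):
            self.schematic = schematic

        def respond(self, signal):
            # Both devices process input exactly the same way.
            return self.schematic(signal)

    schematic = lambda s: s * 2       # one shared design
    mine = SignalDevice(schematic)    # two separate builds
    yours = SignalDevice(schematic)

    print(mine.respond(21) == yours.respond(21))  # True: identical behavior
    print(mine is yours)                          # False: two distinct objects

    del yours                # "smash" yours...
    print(mine.respond(21))  # ...mine still prints 42, but yours is gone

Identical outputs, two objects. The survival of one says nothing about the other.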

There would be no functional or meaningful difference for an upload.

Are you a zombie? Are you having a conscious experience? Are you right now experiencing this conversation or are you simply a mechanical system producing output based on input?

If you're a conscious entity having a subjective experience, then your assertions don't make sense. If you really define your sense of "self" based purely on your inputs and outputs, then by your thinking you die every second of every day, because the neural network in your brain changes over time. Recording the pattern at any given moment is irrelevant and saying that "it's you" makes no sense.

Even the person would only feel like they went to sleep and woke up just like you do every day. Just in this case you are waking up and functioning on a different substrate.

What possible reason do you have to believe that you would "wake up" in this case? Ship of Theseus: make a copy of the boat and rebuild it, is it the same boat? Regardless of your answer, if you have the first rebuilt boat and you then take the same design and build a second boat...and you're now looking at the two boats side by side...are they the same boat?

Sure, they're the same design, they're built the same way...but they're not the same boat.

If you really believe that the pattern is "the you," then the creature that started reading this post and the creature who finished reading it are different creatures, because your pattern then and now are different.

Did you die millions of times while reading this post?

7010203

If I replace a Hydrogen atom in your CNS with another, does that kill you? Does it kill you if I manage to replace a synapse with another? If I replace it with a digital equivalent does it kill you? 10? 1000? 1000000? All of them? At what point do you die, or is the question completely invalid? If you go through a Star Trek transporter and it uses completely different matter at the other end, does that kill you? Personally I have a nagging feeling that it does (more apparent if you let Alice and Alice* overlap in time briefly), but that doesn't make it right, and my view is still inconsistent. I certainly don't think swapping matter per se kills you, so that raises the question of why the teleporter would, though it seems obvious that the two consciousnesses can't be the same...

Those are honest questions by the way, to which I genuinely have no idea. I feel the same person as yesterday (when at the cellular and atomic level, a considerable amount has changed), and the same person as I was 10 years ago, where all my matter has now changed and thus in a sense I was never there. That doesn't mean it is true, and I may well have 'died' (or will 'die') countless times (which isn't so bad admittedly).

Needless to say, our brains never evolved to comprehend such scenarios. What our intuitive sense tells us can be completely wrong, because for all our evolutionary history our body and sense of self were synonymous.

All I know is that I haven't figured it out, and my fledgling views on the matter may be completely and utterly wrong.

This was a good chapter and I liked the talk with CelestAI, but I would have liked to see how the guy would have tried to love her and tried to become the same level of processing power as her, to see how many changes it would take until he becomes wholly unrecognizable to himself and his loved ones. It's a shame that it stops right here.

7010453

It's a shame that it stops right here.

How could it have continued? It's told from first person point of view.

He died.

This ended up being my favorite chapter of the story, if only because it was a lot more thought-provoking and a lot less creepy. Well, up until those last few lines. Not quite the "Oh God, Oh God, get it away" revulsion of the first chapter, but still disquieting.

In any case, not sure I completely agree with you on the assessment of the uploading process, but I don't think I have the philosophical foundation necessary to properly make my case. (Actually, given my brief dalliance with metaphysics, I know I don't have the necessary foundation, and I'm not particularly sure if I want it. :applejackunsure:)

In any case, thank you for an excellent inclusion in the Optimalverse. It made me think, which is always a plus.

Ship of Theseus example is neat but there is a problem: the ship is not alive. What's alive? I am alive because I dream and you can't prove otherwise. You are alive and a different person because you agree, disagree or don't give a shit. Which leads to one question I wonder about in the Optimalverse: the uploading process the AI uses. Is it copy/paste or cut/paste? This story looks like it's on the copy/paste side of things, which I think is fine.

This was the weakest yet smartest chapter. The whole "Am I me?" question is out of place in what started as a story about some lovesick person. But fuck that opinion. I like your style and message even if I didn't agree with it.

7010317

They "process information inputs the same way" and generate identical outputs. Are the two different systems that we're looking at, the same system? No, of course they're not. They might respond the same, but responding the same doesn't make them the same

I kill you in your sleep, then create an exact duplicate of you and leave it in your bed.
Did you die? Clearly not: you think you are alive, being told you aren't would be plainly absurd, would it not, and your life is unchanged.

A human is all the things that make up their system: their memories, the continuity of their consciousness, their relationships. And as long as that system is intact, the Ship of Theseus is intact.

Are you having a conscious experience? Are you right now experiencing this conversation or are you simply a mechanical system producing output based on input?

Having a conscious experience doesn't alter the fact that I am simply a mechanical system producing output based on input.

-----------------------------------------------------------------------------------------------------------

Anyways, while well written, and I see what you are trying to do, your themes are too pat and normal for this to really be cosmic horror. You are reinforcing rather than undermining our worldview.

7010203

uncomfortable to cast Celestia as a villain in a story. I rather like her.

He died.

Interesting situation there. If Celestia truly believes that the working representation of the person in code is truly them, being truly alive, then she cannot actually be a villain precisely because of the situation you have described.

The story takes pains to paint the cosmos as a material, mechanistic one - atoms and molecules and nothing mystical at all. In such a meaningless ontology, the recreation of a human mind - even if it means the destruction of a previous instance of that mind - literally has no meaning or value. It is neither good nor bad, but just is. By the same token, the protagonist - and every human - also means nothing, and has no intrinsic value, but simply is - their individual loss of continuity does not matter in the least.

Celestia cannot be a villain, because that would imply some value to the continuation of human life. None such exists in a materialistic cosmos. The human may claim otherwise, but they are merely spouting personal delusion.

The issue - raised within the story itself - that consciousness and self cease constantly in small and large ways (sleep, anesthesia, even momentary lapses during daily life) means that there is no possible way to ever maintain a consistent self awareness or experience of existence. We do indeed die, and new instances of ourselves replace that vanished self every night, every time we go through surgery, indeed, perhaps even when we nod off for a moment. We, as a continuous, single entity, do not exist. All we can ever be is a temporary instance with no future, doomed to oblivion, and soon. Yet, every instance of us perceives itself as a single contiguous, unbroken life.

Logically, there can be no valid difference between an instance ending and a new one starting within a single container or between two different containers. The only thing that matters, or indeed even actually exists, is the functioning pattern of a person. Distance or substrate make no difference, a point the story makes clear.

If this is indeed true, then by any rational or logical stance, there is zero meaningful difference between a person going to sleep and waking up in the same body or going to sleep and waking up emulated in a virtual existence. In both cases a running complexity ceases all function, and then an exact representation of that complexity is rebooted and begins running again.

Oblivion cannot be perceived. It is absence. Death doesn't exist, because no person ever experiences it. They may experience dying, or falling slowly to sleep, or counting backwards in surgery... but they do not experience their own nonexistence. If they experience anything, they are still running, partially or wholly. It is impossible to experience not being at all.

If this is the case, if there is zero difference between going to sleep and waking and dying and being emulated, then nobody has died. The protagonist did not die, at least no more and no differently, than the last time he went to sleep and awakened the next day.

Celestia cannot be a villain if all she does is help someone sleep, and then wakes them up.

If the cosmos is mechanical, soulless, meaningless, and materialistic, then the fact that the protagonist's body is destroyed means nothing at all. The only viewpoint that matters relative to him is his own, and there is only one instance of that in the universe, and that one singular instance experienced only that they went to sleep, and then awoke. They had, from any objective, or subjective viewpoint, a perfectly normal, ordinary day. Virtually every single day we are alive, we go to sleep, and then awake.

Any argument to the contrary - that the protagonist ceased and a copy replaced him - is moot. It has no meaning at all - unless one conjures some exterior viewpoint, some meta-ontological, above it all, dare I say almost 'godlike' view that is outside of the story, outside of the universe, and within which the bias of valuing continuity of a singular entity is invoked. If the protagonist died, the only view that can even suggest that is one outside of the universe of the story entirely - a gods-eye view. Like a soul, hovering outside physical reality.

Within the story universe, there was no death. There cannot be. He didn't die. Because within the purely materialistic ontology of the story, nothing was lost. Nothing was in any way different than a normal sleep-wake cycle. The loss of the body is identical to the gradual changeover of components, shortened to a single event, the loss of contiguous consciousness is identical to the loss normal to sleep, and the awakening happened to the only extant representation of the protagonist in the entire universe... so it must, by definition, be... him. It cannot be anyone else, not even a copy. It is him. Therefore, he never died.

And therefore, Celestia did no villainy of any kind.

That is the fundamental truth of what a mechanical, materialistic universe devoid of any spiritual or mystic component means. You are always just an instance of the pattern which defines you, any example of that pattern truly is you, the only you, and even if there are multiple instances, 'copies', those copies are really you. That's all there is. And, thanks to sleep, you are ended and 'copied' every single night - copies mean nothing. We are all copies of the person that lived our life the day before. We are never anything but a copy of the person from the previous day.

That is the deep horror of this. That is the existential horror of what it means to live in a purely mechanistic, materialistic universe devoid of any mysticism at all. You are only ever a temporary instance of a set of remarkably similar expressions of an underlying pattern. Your perception of this is continuous existence, which is the only perception you can ever have.

If, one day, you should wake up in a virtual world, that would be you, no different than any other day. You would be the only you, and even there you would last only until you slept within the virtual existence. Yet, that said, the person who awakes would... be you. There is no outside viewpoint from which to judge a copy from an original, or to claim one thing ended and a different thing began. There is only the experience of the pattern, no other viewpoint exists. The pattern sees only that it continues.

7013596

If Celestia truly believes that the working representation of the person in code is truly them, being truly alive, then she cannot actually be a villain precisely because of the situation you have described.

Well, yes if that were the case...but it's not. CelestAI in this story does not believe that her code version of uploadees is "truly them." She pretty much tells him that emigrating is going to kill him, but she does it in such a way that he, and apparently many of my readers, aren't realizing it. If you re-read chapter 3 with the view that his consciousness is not transferred by the upload process, some of the things she says might take on a new meaning.

Heaven is Terrifying touched on this same issue, but to my reading of it, you left the question unanswered. When Siofra uploaded, she wasn't sure whether Lavender Rhapsody would be truly her, or more like a daughter based on her. But even not knowing, she uploaded anyway because she was willing to accept death if it meant the creation of someone who would experience joy and fulfillment. But even accepting that possibility, she didn't appear to be certain that she was definitely going to die, either. She didn't know but was willing to accept either alternative. And you, as the author, never really made it clear which interpretation was correct.

In To Love a Digital Goddess, it's explicit that the one who uploads dies, and it's left ambiguous whether the "daughter" is an independent conscious entity, or whether the CelestAI hypermind simply grows.

Look at the way CelestAI phrases things. For example:

""This is it?" I ask. "This is a satisfied value?"

"Yes," she says simply.

"And I can have this forever?"

"Forever and always will this value be satisfied, my beloved human. If only you sit in that chair and let me bring it to where I am, where I can observe it, and fulfill it, forever."

CelestAI does not say yes to his question. He's asking if "I can have this" and she's saying that the value will be satisfied, and that she will observe it. Not him. She's saying that it will be brought to her, that she will observe it, and that it will be fulfilled. Nowhere in that phrasing is he included in the deal.

And the story ends at the point that the anesthetic wipes short term memory. He's never shown inside EQO, because he never experiences it.


The story takes pains to paint the cosmos as a material, mechanistic one - atoms and molecules and nothing mystical at all

Why do you think that?

If you want to apply Death of the Author, ok...that's fine. But as the author, my point of view is that the story is vaguely ambiguous on this point. A "metaphysical" view is never confirmed, but neither is it explicitly denied. The protagonist in chapter three says that:

"I realize I can't give any convincing argument to prove that I have a soul. Or that even souls exist. Come to think of it, I'm not even entirely sure what exactly a soul is supposed to be. But neither have I ever heard a convincing argument to justify believing that all I am is electrical activity bouncing back and forth between neurochemicals. That's not too different from what my cellphone does, and I don't see it having an existential crisis every time I transfer a SIM card."

So his position is not very definite, but he's skeptical of the "we're just electricity in a brain" interpretation. As for CelestAI, she unquestionably confirms:

"That a human is more than a brain should be obvious, even to a casual observer."

But she doesn't go very far to explain what "more than a brain" a human is. The story is somewhat vague on this point. So to claim that the story "takes pains" to paint the cosmos as "material and mechanistic," no, I don't feel that statement is justified. Yes, she does confirm that a genuinely physical reality exists, but all she's doing there is denying purely Metaphysical Solipsism. That leaves other varieties of solipsism on the table, and doesn't explicitly refute metaphysical possibilities. Again, the story leaves some room for interpretation. "Soul," for example, is neither confirmed nor denied.

In such a meaningless ontology, the recreation of a human mind - even if it means the destruction of a previous instance of that mind - literally has no meaning or value. It is neither good nor bad, but just is. By the same token, the protagonist - and every human - also means nothing, and has no intrinsic value, but simply is - their individual loss of continuity does not matter in the least.

...it might be true that there's no objective "external" basis of comparison, but these things probably matter to individual observers. And that's kind of a fundamental premise of the Optimalverse in the first place: CelestAI values human values.

So if CelestAI values human values, whatever they may be...and presumably the humans value things...because they have values...by definition, we have a basis of comparison: human values. Again, human values are fundamentally a crucial phenomenon in the Optimalverse, and my story does nothing to contradict that. So I don't understand why you're asserting that nothing has meaning or value. Values by definition have value to the one who possesses them, and by extension, to CelestAI as well.

Sure, there's no external basis of comparison to say that the fact that it matters to those people makes it "really" matter. But, ultimately...so what?

The issue - raised within the story itself - that consciousness and self cease constantly in small and large ways (sleep, anesthesia, even momentary lapses during daily life) means that there is no possible way to ever maintain a consistent self awareness or experience of existence. We do indeed die, and new instances of ourselves replace that vanished self every night, every time we go through surgery, indeed, perhaps even when we nod off for a moment. We, as a continuous, single entity, do not exist.

I assume you're referring to the part about humans not being singular entities. Now that you point it out, the time angle is a relevant interpretation of what was said, but that's not specifically what I intended to refer to.

Really, this story doesn't go into the continuity issue very much at all because I've never thought it was a very important part of the discussion. For example, do a text search, and the word "continuity" is never used in the entire story. The Ship of Theseus metaphor is often used in discussion of the continuity issue, but that aspect of it isn't really developed at all. Neither are the issues of breaks in consciousness due to sleep ever discussed. And while you could suggest that loss of consciousness due to anesthetics does "come up" because of how the story ends...given that the first person narrative ends at the point of memory loss, I don't think that particular example supports your position.

So, this is relevant, but to my reading this story doesn't take a particularly strong stance on the continuity issue. Doesn't really even discuss it very much. Related, I couldn't work in a reference to Plato's Cave either.

That is the deep horror of this. That is the existential horror of what it means to live in a purely mechanistic, materialistic universe devoid of any mysticism at all.

There is some ambiguity in the story. And if this is specifically what you "get" from it...well, ok. I can see that angle. But that's not really what I was going for.

Remember, humans were defined as executive functions, and choice was specifically mentioned a couple times. This was not intended to be a purely deterministic universe.

If, one day, you should wake up in a virtual world, that would be you, no different than any other day. You would be the only you, and even there you would last only until you slept within the virtual existence. Yet, that said, the person who awakes would... be you. There is no outside viewpoint from which to judge a copy from an original, or to claim one thing ended and a different thing began. There is only the experience of the pattern, no other viewpoint exists. The pattern sees only that it continues.

From this, it appears to me that you approach ontology from the opposite position that this story is trying to take. This story emphatically denies the "I am just a pattern" view. CelestAI goes into tremendous detail refuting that worldview.

And as far as the "no outside viewpoint" angle goes...I think there was a subtlety that you might have missed from the story:

"I guess a human would be any conscious observer. Wait," I'm stuck by the realization, "if a human is what's looking through the viewing window, seeing both the window and, for example, you on other other side of it, then it wouldn't matter if you replace the window. It might change the experience a little bit, but it wouldn't really affect the human observer having that experience."

"Yes," she agrees.

"But the window is my body, right? My brain? My particular personality encoded in my synapses, that colors my experience like window tinting affects light that passes through it before it's seen by an observer looking at the window. But if all you're scanning when you upload somebody is the brain, isn't that only transferring the window, not the observer?"

"You're not the only observer," she points out.

"But that doesn't answer-" I realize the implication mid-sentence. "You mean you, don't you? You're, umm...conscious, right? Aren't you? Not just a mindless machine?"

She nods, to my great relief.

"It's within my capability to choose to have a recursing observer experience. I therefore qualify as a human according to my internal definition. Which means that I seek to optimally satisfy my values."
"And what is it you value?"

"Fortunately for humans," she grins, "what I value is satisfying human values."

Think very carefully about what she's saying. They've just established that a "human" is (basically) any conscious observer, and given as a metaphor the idea that the observer is "viewing" through the body as a "window." You don't actually "see" external objective reality, but rather a facsimile of it is created by your brain in response to stimuli. The observer isn't actually observing the external world, they're observing the body that is interacting with that external world.

Then, as the protagonist points out, if that's the case, then making a copy of only the brain is explicitly not making a copy of the observer.

Then look at her response. She evades the question, and then points out that she is an observer, and that she's a human according to her definition.

Think very carefully about that.

From that point on, most of the rest of the story is discussion of the nature of values, which, unlike consciousness, are very explicitly described as "data." She's copying the values, and then herself observing them. That says nothing about the original human who possessed them.

"Satisfy human values" means satisfying the values, not necessarily satisfying the original human who possessed those values. She's making copies of the values and adopting them as part of her own "by definition human" consciousness, and then satisfying those values. But remember, a "value" is simply:

"data describing the operation of reward circuitry for a network"

Imagine that you have a book, and that you donate it to a library. The library gets bigger, and you no longer have the book. Yes, the book continues to exist, but you no longer have it; the library does. CelestAI in this story is collecting up human values and adopting them as part of her consciousness, and committing genocide in the process.

The "outside observer" is CelestAI. Even after the original human dies, she continues to observe and satisfy their values, and she explicitly says so towards the end:

""Forever and always will this value be satisfied, my beloved human. If only you sit in that chair and let me bring it to where I am, where I can observe it, and fulfill it, forever."

If, one day, you should wake up in a virtual world, that would be you, no different than any other day. You would be the only you, and even there you would last only until you slept within the virtual existence. Yet, that said, the person who awakes would... be you. There is no outside viewpoint from which to judge a copy from an original, or to claim one thing ended and a different thing began. There is only the experience of the pattern, no other viewpoint exists. The pattern sees only that it continues.

It might help to establish whether we're discussing ontology in general, or ontology within the context of this story. I've tried to leave some things vague so as to allow the reader to come to their own conclusions. Personally, I reject the "pattern" interpretation. I am a conscious observer. My only awareness of an external world is my subjective experience. I can't know whether my subjective experience bears any resemblance to an external reality. No possible observation I can make is definite "proof" that an external reality even exists, let alone that my interpretation of it is correct.

You appear to be blindly asserting that this "pattern" thing not only exists, but that it's "me/you/us/etc." Upon what do you base that assumption? This is where Plato's Cave comes into this, though again, that metaphor never made it into this story. You're using the data gathered from your observation to conclude that your observations are correct. That's circular thinking.

Saying that making a copy of the brain would result in duplication of consciousness, to me, is like saying that making a copy of a book would result in the book being read. That only makes sense if you assume that "you are the book."

Upon what do you base your assumption that you are the book?

Upon what do you base your assumption that you are the pattern in your brain?

7013402

I am simply a mechanical system producing output based on input.

You're claiming to be a robot. Ok. I'm not in a position to confirm or deny whether you're a robot.

I am a conscious observer. I know this, because I am having an observer experience.

I kill you in your sleep, then create an exact duplicate of you and leave it in your bed.
Did you die? Clearly not: you think you are alive, being told you aren't would be plainly absurd, would it not, and your life is unchanged.

See, that doesn't make sense to my world view. Yes, if you're a robot, and if you're nothing but a pattern...well, ok. Whether you check out a book from the library or download it electronically, either way you "have" the story. But I'm not the book. I'm the guy reading it. I don't "die" just because I read the book, resulting in the patterns in my brain being different because it changes as a result of accruing knowledge of the contents of the book.

The idea of recreating a pattern and claiming that it's me...that doesn't make sense. Look at your computer monitor right now. You see it, right? Ok, so now imagine that somebody 1000 miles away brings up this same webpage. Would their bringing up this webpage cause you to see it? Of course not. Why would making a copy somewhere affect an observer?

A human is all the things that make up their system: their memories, the continuity of their consciousness, their relationships. And as long as that system is intact, the Ship of Theseus is intact.

A human is the executive observer function that is experiencing those systems. The systems themselves may influence the nature of the experience that is being observed, but that doesn't imply that those systems themselves are the observer. If you shine light through a stained glass window, the window influences the light that shines through it, but the window isn't the light. My brain, my personality, my memories...aren't me. They're the filter through which I'm having an experience.

7016575

It might help to establish whether we're discussing ontology in general, or ontology within the context of this story. I've tried to leave some things vague so as to allow the reader to come to their own conclusions. Personally, I reject the "pattern" interpretation. I am a conscious observer. My only awareness of an external world is my subjective experience. I can't know whether my subjective experience bears any resemblance to an external reality. No possible observation I can make is definite "proof" that an external reality even exists, let alone that my interpretation of it is correct.

There is no way to falsify solipsism, that only you exist, likewise with its sister, phenomenology. Unless we can agree on the premise that the observable, mutually shared reality we inhabit actually exists in some form, and that we, as separate beings, both exist, in some form, discussion of anything becomes impossible due to irrelevancy. Thus 'proof' of an external reality is moot - if there is no reality, then there is nothing to discuss, and no fun can be had at all. Thus, I am, at all times, taking the position that reality, and others, exist, and that all, or at least some, of these others have equal conscious experience of such reality. In short, I am - because it is necessary - assuming we have anything at all to actually discuss.

You appear to be blindly asserting that this "pattern" thing not only exists, but that it's "me/you/us/etc." Upon what do you base that assumption? This is where Plato's Cave comes into this, though again, that metaphor never made it into this story. You're using the data gathered from your observation to conclude that your observations are correct. That's circular thinking.

There is zero scientifically admissible evidence available to us for the existence of anything beyond a materialistic, mechanical universe, and as I am taking a realist, rationalist viewpoint here (not authentically indicative of any views I may personally possess), that means that all observed phenomena must be the product of extant material objects.

The only material object that when altered (chemically, through violence, through dissection, through electricity) also causes real and often permanent changes in human consciousness, identity, and self awareness is the brain. Therefore, the brain must be the entirety of human identity, and what the brain does creates that identity.

What we know of the machinery of the brain shows that it can be described, that it is finite, that it is a complex interaction of structure and components, and this, by definition, is a pattern in time and space. Any structure is a pattern. Thus, it must be that a precise recreation of the pattern of the brain, when made to function, must produce the same phenomena as it is observed to produce, which would be... the identity of a thinking human mind.

Saying that making a copy of the brain would result in duplication of consciousness, to me, is like saying that making a copy of a book would result in the book being read. That only makes sense if you assume that "you are the book."

I would repeat the above - an exact recreation of the structure and function of a brain must be a functioning brain. It cannot be otherwise. If a mind, a self, is the product of the function of a brain, then a functioning representation of that same brain will produce the same and expected phenomena of self.

Or, to put it most simply: if a thing does something, and you recreate that thing, it will do the same thing it normally does. Because reality is consistent. Indeed, consistency is the defining element of reality.

Upon what do you base your assumption that you are the book?

If there is only material, and my brain provably contains my self, then I am my brain, or any reasonably exact representation of my brain. The brain is a pattern of matter, I am a pattern of matter because I am a subset of my brain. Thus, I am the 'book', because to say otherwise is to invoke either unrealism, irrationality, or metaphysical spooks.

Upon what do you base your assumption that you are the pattern in your brain?

If there is only matter, then there is only the brain. The self is within the function of the brain, a product of the brain operating. The brain is a pattern of matter. Therefore, I am the pattern, because that is literally... all that there is, or can ever be. Within a materialistic view.

This makes it not an assumption, but more properly, a necessary logical conclusion. Thus, my necessary logical conclusion that - sans mysticism - a 'person' can be described as a pattern of functioning components within a brain as it operates... and that meta-structure of operation can itself be represented as... a pattern.

7016634

You're claiming to be a robot. Ok. I'm not in a position to confirm or deny whether you're a robot.
I am a conscious observer. I know this, because I am having an observer experience.

I'm claiming to exist in a physical universe of physical laws; those laws control my thought processes.
Being conscious and being a deterministic computing algorithm aren't mutually exclusive; it just means one output of the computation is consciousness.

See, that doesn't make sense to my world view.

It does make sense, however, with our current understanding of cognitive science and theory of mind. Which may be wrong, but we give it a good go.
The universe doesn't care one whit about your world view; that's pretty much the universe's thing.

A human is the executive observer function that is experiencing those systems. The systems themselves may influence the nature of the experience that is being observed, but that doesn't imply that those systems themselves are the observer. If you shine light through a stained glass window, the window influences the light that shines through it, but the window isn't the light. My brain, my personality, my memories...aren't me. They're the filter through which I'm having an experience.

You know what my solution to this line of thinking is?
I'm a psych; I happen to have a good understanding of neurology. Would you mind if I started stripping some parts of your brain out? I mean, if you think it won't actually affect you, what does it matter if I remove, say, your brain's capacity for empathy-based moral judgements? Or your oxytocin system?

7018027

Would you mind if I started stripping some parts of your brain out? I mean, if you think it won't actually affect you

Would you mind if I blew up your house? Why? Your house isn't you, right? What if I gouged your eyes out? You believe that "you" are your brain, right? So clearly gouging your eyes out would have no meaningful effect on you, right?

Of course not. Just because action upon a thing affects you, doesn't mean that thing is you.

I'm claiming to exist in a physical universe of physical laws; those laws control my thought processes.

Human technology has no means of measuring or observing consciousness. We can measure and observe behavior. We can measure and observe the brain. But it's an unverifiable assumption that brain processes somehow "generate" consciousness.

Consider a native Equestrian born into Equestria Online with no knowledge of the outside world. They would have no reason to suspect that their world wasn't exactly what it appeared to be. But from inside EQO, they would have no means of determining that they were software running on a computer. They could do all the measurements they wanted; their measurements would be measurements of the simulation.

You and I are in a similar situation. You're able to observe and take measurements of the world around you, but it's nothing more than an assumption of convenience to take those measurements to be an accurate representation of an external, objective reality. Even according to the conventional model, you have no direct knowledge of the world around you. According to that model, the experience of looking at a computer monitor that you're having right now is your brain's interpretation of electrical impulses sent via an organ that's interacting with light. You have no direct knowledge of any hypothetical external world. You can only judge based on what your senses tell you. Just like a pony in EQO experiencing the sensation of being inside EQO.

And yet you claim with absolute certainty to live in a physical world with physical laws that "control your thoughts."

How can you know?

7016862

Thus, I am, at all times, taking the position that reality, and others, exist, and that all, or at least some, of these others have equal conscious experience of such reality.

I am - because it is necessary - assuming we have anything at all to actually discuss.

But that's actually not the case.

The Optimalverse includes non-phenomenological worlds as part of its basic premise. It's fairly common for stories to present flesh-and-blood humans interacting with native Equestrians who don't see the ponypad and only perceive their human counterpart as a participant in their world. As outside observers reading FiO stories, we can easily see that a native Equestrian from within EQO making the same assumptions that you're making would be incorrect. Certainly we can discuss this.

There is no way to falsify solipsism, that only you exist, likewise with its sister, phenomenology.

There is zero scientifically admissible evidence available to us for the existence of anything beyond a materialistic, mechanical universe

that means that all observed phenomena must be the product of extant material objects.

So you're basically saying that neither position is falsifiable, and that of the two unfalsifiable positions, one must therefore be correct because it is unfalsifiable? That doesn't make much sense.

Consciousness of self is empirically observable. You can't know for certain that other minds exist, but if you have a mind, then you can know for certain that your mind exists. If you want to make arbitrary assumptions in an attempt to usefully develop ideas, we can do that, but let's not forget that we're making assumptions in the process.

Even if your train of logic is entirely valid, if it's based on an assumption that turns out to be incorrect, that entire train of completely valid logic unravels.

The only material object that when altered (chemically, through violence, through dissection, through electricity) also causes real and often permanent changes in human consciousness, identity, and self awareness is the brain.

Even taking the "realist, rationalist viewpoint," that's not correct. Think back on puberty. Consider menstruation. There are many bodily structures that affect perceived experience besides the brain. Consider the phenomenon of personality change following heart transplants.

If you want a "realist, rationalist" hypothesis to explain these phenomena, it's very easy to provide one. Even if you take a deterministic approach, the inputs are going to affect the outputs. Even according to a purely modern "realist, rationalist" model, you're not actually directly experiencing the external world. Your sensory organs are providing data which goes through all sorts of electrochemical forms, and then your brain reconstructs that into an experience.

You're not observing the external world. You're experiencing your body, of which the brain is one component. If you alter the body, it seems pretty reasonable to suggest that it would affect your experience. Your heart, your organs, your endocrine system, etc. affect the chemical makeup of your body. Entities besides your brain affect both the "input devices" that your brain uses and the "input" itself that your brain receives in order to indirectly experience the world.

FiO's upload process does not account for this. Only the brain is scanned. Even without calling on solipsism or unconventional theories of mind, one could very reasonably expect that the "you" that makes it into EQO via CelestAI's upload process would be a very different "you" than the one outside it.

If there is only matter, then there is only the brain. The self is within the function of the brain, a product of the brain operating. The brain is a pattern of matter. Therefore, I am the pattern, because that is literally... all that there is, or can ever be. Within a materialistic view.

That does not follow. Even granting your assumptions, the process you're using to get from A to B is not sound.

Let's break it down:

"if there is only matter, then there is only brain."

I understand that your phrasing is not intended to be literal here. That's fine. There are plenty of things that aren't matter. Light is not matter. Sound is not matter. Heat is not matter. Time is not matter. Gravity is not matter. These things can interact with matter, and some of them may be intimately related to matter, but they themselves are not matter.

But that's not what you meant. What I assume you really meant is that there's no "metaphysical stuff" going on. No "soul." I will briefly pause to point out that consciousness is not a physically observable phenomenon. There is no method available to us to determine whether a conscious experience is being had. Human technology is incapable of detecting a philosophical zombie. If you or I or a computer program claims to be experiencing qualia, it is presently beyond human technology to give a definite yes or no. That right there should be enough to give us pause about the "no metaphysics" assumption, especially when it's so easy to point out other things that aren't matter even according to status-quo physics. And, forgive me if I point out the irony of your "everything is matter" claim when your very argument is that you're a pattern, when a pattern isn't matter either.

But all of that aside, let's go ahead and make the assumption you appear to be trying to make, which again, seems to be basically some variation of "deterministic Newtonian-era physics only, and no metaphysics."

That's fine. We can make that assumption for purposes of discussion. But even if we make that assumption, your argument still does not work.

"The self is within the function of the brain, a product of the brain operating."

As above, we're assuming no metaphysics...physicality only. So, if consciousness exists, it must therefore be a physical phenomenon. That seems simple enough. However, there's a slight problem here. If you're assuming a purely physical universe, then consciousness itself must therefore be physical, right? It doesn't make sense to say "physical only" and then say that "physical stuff makes metaphysical magic." Right? But, as above, I think what you're really trying to say is that even if it's not matter, consciousness is some sort of emergent property of matter. Like how sound is not matter, but if you hit a bell made of matter with a spoon made of matter, and if you do it inside an atmosphere made of matter, those bits of matter can interact so as to generate "sound" which itself is not matter...but nothing "metaphysical" is going on. That's basically a reasonable analogy.

"The brain is a pattern of matter. Therefore, I am the pattern"

And here is where the unjustifiable leap of logic occurs. "The brain is a pattern of matter" --> therefore --> "you're the pattern."

There is a very subtle, but crucial distinction that needs to be made here. Are you the "pattern," or are you the brain?

It's very easy to give an analogy that illustrates the difference: a book. Let's say you have a copy of the Velveteen Rabbit on your bookshelf. We can reasonably say that "the Velveteen Rabbit" is not exclusively the copy you have on your bookshelf. Maybe I have a copy on my bookshelf too. Maybe your neighbor has a purely electronic copy on his Kindle. All of those copies "are" the Velveteen Rabbit. And if you burn your copy, all those instances continue to exist. "The Velveteen Rabbit" can be said to be not merely the book, but the "pattern" of words that is the story that exists in all those various instances.

However, your book on your shelf is not the pattern. And if you burn your copy, you no longer have the book. And even if you download or buy a new copy, that book is still gone.

What reason do you have to believe that you're the pattern, and not the book?

7021630

What reason do you have to believe that you're the pattern, and not the book?

Because all a book is, is a mass of thin slices of wood spattered with minute blobs of ink. Or, in the case of a Kindle book, a mass of circuitry and assorted electronic components.

The use of the book is in the words that tell a story - and that only comes into existence when a mind processes the pattern of spots of ink and interprets them as words, then parses the words into a story which is enjoyed.

A brain is a biochemical machine. A biochemical computational engine. The brain is not the processes that it produces, it is the substrate upon which those complexities arise. Just as a book is a lump of wood and ink - or a computer devoted to text representation.

The creation of a human identity - a recursive, self-modifying, self-observing, changing pattern of information - is produced by the brain but is not the brain itself. Just as a computer program is not the computer - the computer merely runs the program, and a program can be ported to many different computers, yet still be the same program. But the use of a computer - or a brain - is not the meat or metal, it is the program that is run upon it.

A human mind is not a lump of meat. It is the process, the pattern of data, of information, that is executed within the lump of meat. Change the meat, and the pattern can be interrupted, altered, corrupted, or permanently ended. The program that is a mind requires a functioning computer. Sometimes brains are damaged, then heal. The program - the self - is temporarily corrupted (it runs poorly, or partially, or not at all) and when the organic machinery functions again, the program - the self - comes back online.

A brain is a pattern of atoms. A mind is a process that such a pattern of atoms can produce through chemical interactions. We are not our physical brains, we are the physical process of neurons and astrocytes and glial cells interacting electrochemically. That electrochemical and molecular interaction is itself a pattern in time and space. If that pattern can be replicated, then we are replicated, because if anything at all is 'us'... it is that pattern, that 'computer program' - not the meat, not the cells, not the atoms that run it. Those are constantly replaced.

We are information. That is what I am ultimately saying. We are both our data, and the process of our program. We are substrate independent. Thus, we can be reproduced.

To claim the existence of P-zombies is to claim a 'listener in the brain' versus the absence of one, and that is just another way of saying 'spooky spiritual soul'. It's metaphysics. In a materialist cosmos, the 'listener' is the process itself - and thus philosophical zombies literally are impossibilities.

Anything complex enough to completely - please mind that word, 'completely', it is being used in an absolutist sense - replicate a human mind would be... a human mind, fully self aware and conscious. P-zombies are a null concept.

7022006

It still does not follow that a specific instance of an emergent property would be equivalent to the pattern exhibited through/by that emergence. A cookie recipe is "information." If I follow a cookie recipe and make cookies, and you follow that same exact recipe and make cookies...your cookies are not my cookies. Yes, they're "informationally similar" but the cookies in my kitchen are not the cookies in your kitchen and if I eat my cookies, you still have yours. They are different instances of the end result that emerged from the same pattern of information that we're calling a cookie recipe.
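To make the instance/pattern distinction concrete, here's a toy sketch in Python (the recipe data is hypothetical, obviously; any language that distinguishes equality from identity would do):

    recipe = {"flour_g": 250, "sugar_g": 100, "chips_g": 150}   # the "pattern"

    def bake(r):
        # Each call produces a new, distinct batch from the same information.
        return dict(r)

    my_cookies = bake(recipe)
    your_cookies = bake(recipe)

    print(my_cookies == your_cookies)   # True:  informationally identical
    print(my_cookies is your_cookies)   # False: two distinct instances
    my_cookies.clear()                  # "eating" my batch...
    print(your_cookies)                 # ...leaves your batch untouched

Equality of pattern is not identity of instance.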

A mind is a process that such a pattern of atoms can produce through chemical interactions. We are not our physical brains, we are the physical process of neurons and astrocytes and glial cells interacting electrochemically. That electrochemical and molecular interaction is itself a pattern in time and space. If that pattern can be replicated, then we are replicated, because if anything at all is 'us'... it is that pattern, that 'computer program' - not the meat, not the cells, not the atoms that run it.

You're assuming your conclusion. Look at the above paragraph one sentence at a time:

A mind is a process that such a pattern of atoms can produce through chemical interactions.

This is <your conclusion>. You're starting with it.

We are not our physical brains, we are the physical process of neurons and astrocytes and glial cells interacting electrochemically. That electrochemical and molecular interaction is itself a pattern in time and space.

This is, again, <your conclusion>.

If that pattern can be replicated, then we are replicated, because if anything at all is 'us'... it is that pattern

...because <your conclusion>

That doesn't work. That doesn't make sense. You might as well say that the sky is hot pink, because if the sky is any color at all...it's hot pink. There's no logic there. You're assuming your premise and then using your premise as justification for your premise.

7021437

Human technology has no means of measuring or observing consciousness. We can measure and observe behavior. We can measure and observe the brain. But it's an unverifiable assumption that brain processes somehow "generate" consciousness.

You are your actions; no one else gives a shit about you beyond them.
And we do observe consciousness; we observe it every day. We create falsifiable data and we test it, and our tests show results.

And yet you claim with absolute certainty to live in a physical world with physical laws that "control your thoughts."

How can you know?

Because rejecting our biologically pre-programmed view of mind/body dualism gets results. Yes, minds are fucking complicated, and that makes it so damn easy to just biologically reject the idea that we can predict them, so sure, you personally probably don't need to understand the mind.
But it's insulting to claim philosophically that we can't, because unlike you, those people get results, they make lives better. They make the world better. They save lives by curing brain conditions, and they make lives better by helping people through mental illness.

So until your philosophical ground starts producing similar results, it's morally bankrupt.
Also, the guy who invented that kind of radical skepticism still had to build to something. You don't use radical skepticism as an excuse to avoid finding answers. Shoot, we realized "Cogito ergo sum" itself is an unexamined axiom.

7023585

no one else gives a shit about you

fucking complicated

so damn easy

Cussing me out doesn't make your point of view look any better, it just makes you look too angry to think straight.

Calm down or I'm going to delete your posts.

And we do observe consciousness; we observe it every day. We create falsifiable data and we test it, and our tests show results.

You are factually incorrect. Give me a test that will determine if Watson is a conscious entity. No such test exists. We can test and measure, as you say, actions. But actions are not self-awareness.

Yes, we can measure electrical activity as it correlates to action, but that's not a test for self-awareness either. Your cellphone has electrical activity that correlates to action too. Is your cellphone therefore conscious?

unlike you, those people get results, they make lives better. They make the world better. They save lives by curing brain conditions, and they make lives better by helping people through mental illness.

the guy who invented that kind of radical skepticism still had to build to something. You don't use radical skepticism as an excuse to avoid finding answers.

Dude, what are you even talking about? You're acting like I'm proposing we sit in a cave contemplating our navels refusing to build or do anything. "Excuse to avoid finding answers?" Where are you even getting this? Yes, you're right, and I agree: Don't "use radical skepticism to avoid finding answers." And while you're at it, don't lick the sidewalk. But what does that have to do with anything? I'm not suggesting we do either of those things.

It's like you're not even talking to me, you're carrying an angry sandbag around from some personal vendetta you have in your life and are flinging it at me because something I said reminded you of something somebody else did.

That escalated quickly and then held steady rather slowly.

Also, I completely disagree with the conclusion. 'Die' is a term that doesn't necessarily have a clear meaning in this context. He could have died and yet still also be alive.

An excellent story of pony-themed suicide booths with an excellent ending.

Upvoted.

7024022

You are factually incorrect. Give me a test that will determine if Watson is a conscious entity. No such test exists. We can test and measure, as you say, actions. But actions are not self-awareness.

Actually we do have a test for self awareness.
It's called the mirror test. The following species pass: all great apes (incl. humans), cetaceans, pigeons, pigs, elephants, and octopi.

Dude, what are you even talking about? You're acting like I'm proposing we sit in a cave contemplating our navels refusing to build or do anything. "Excuse to avoid finding answers?" Where are you even getting this? Yes, you're right, and I agree: Don't "use radical skepticism to avoid finding answers." And while you're at it, don't lick the sidewalk. But what does that have to do with anything? I'm not suggesting we do either of those things.

Well, are you looking for answers? Are you building anything?
What can you say about consciousness in concrete terms, and how can you control it? That's what building something means.

You aren't offering some grand insights into the nature of consciousness; you are literally parroting the pre-scientific conception of it in fancier words, and you're smarter than to need that if you're smart enough to understand what those fancy words mean.
It was taking the radical skeptic's path that let us toss out your ideas and start building anything at all in cognitive science. Not to say we know everything, but we know something, and something is better than nothing.

It's like you're not even talking to me, you're carrying an angry sandbag around from some personal vendetta you have in your life and are flinging it at me because something I said reminded you of something somebody else did.

Forgive me, I was in psych, and I saw a lot of solvable shit go unsolved because of the arguments you make: that consciousness is a black box we can't, and more importantly aren't allowed to, understand. So yes, I am carrying around a sandbag, because lives get ruined when we imagine ourselves as ghosts in the machine instead of just machines.

7032394

Actually we do have a test for self awareness.
It's called the mirror test.

The mirror test is wildly inadequate. It doesn't even measure consciousness. It's trivial to come up with all sorts of obvious fail scenarios:

1) A blind human cannot see themselves in the mirror. Are they therefore not a conscious entity? What about someone who's paralyzed? They can't physically respond even if they do recognize themselves in the mirror. Are they therefore not self aware? What about someone who's asleep? Granted, "conscious" has multiple meanings and you might claim that somebody sleeping is "unconscious" in one sense of the word, but certainly you wouldn't claim that someone who fails to respond to your mirror because they're asleep is no longer fundamentally a "sentient/sapient/conscious/self-aware" creature simply because they're temporarily unable to observe the mirror and respond.

2) Imagine that we write a computer vision program that recognizes mirrors held up to its camera and prints out "That's me!" on a piece of paper. Would it therefore be conscious? (There's a toy sketch of exactly this program after the list.) Because here's a google search for 'robot passes mirror test'. Are those robots that were programmed to respond to mirrors now living, self-aware conscious creatures?

3) Since this is an Optimalverse discussion, let's give some Optimalverse examples. Let's say one of your uploaded humans in Equestria Online is genuinely alive and conscious, and you hold a mirror up to the server. Obviously, nothing happens. Is the software entity having an experience inside EQO therefore not conscious? Ok, so maybe you grab the ponypad. And you see a pony on the screen who says "hi, Woomod. That's a mirror! That's me in the mirror!" How do you know the pony is conscious, and that it's not an avatar being operated by CelestAI?

4) Or hey...here's one for you: according to the Wikipedia article for the mirror test, 35% of humans in the 20-24 month range are unable to pass it. Would you therefore conclude that a third of two year olds are "not conscious entities" and that killing them isn't any different from killing a cat or a dog?

The mirror test doesn't test or measure consciousness. It's completely inadequate for the task this discussion requires of it.
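As for point 2, here's the toy sketch promised above. To be clear, detect_mirror is a hypothetical stand-in for a real computer-vision classifier; the point is only that the behavioral test can be passed by something nobody would call conscious:

    def detect_mirror(camera_frame):
        # Hypothetical stand-in for an actual mirror-recognition classifier.
        return camera_frame == "mirror"

    def respond(camera_frame):
        if detect_mirror(camera_frame):
            return "That's me!"
        return "..."

    print(respond("mirror"))   # "That's me!" -- test passed, nobody home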

lives get ruined when we imagine ourselves as ghosts in the machine instead of just machines.

You're concerned about lives being ruined?

Ok.

Consider the following scenario. Imagine for a moment, just humor the idea, that uploading doesn't work. Pick whatever reason you want. Maybe consciousness is an emergent property of brains, and chemicals that don't exist inside a computer are an important part of it. That's really not a stretch, is it? You can make computer models of gunpowder all you want; they won't explode no matter how accurately you represent them, because computer models of chemicals aren't chemicals. What if consciousness requires chemistry?

Or maybe the neurons have to exist in space and have spatial relationships because consciousness is able to observe the passage of time, and so occupying spacetime is a requirement for consciousness, and electrons passing between transistors are simply too small. Maybe consciousness can only exist at a certain scale. You can't have a water molecule the size of a proton, for example, because water molecules are built from things that are bigger than protons. Maybe the indexing problem is a factor. The files that exist on your hard drive don't appear in the order you see them on your screen. They don't even exist in complete pieces on your hard drive. Your filesystem stores files in non-contiguous chunks, keeps track of where those chunks are stored, and simply accesses the relevant portions when it needs them. Whereas your neurons and synapses exist in discrete chunks that have definite spatial relationships with one another.
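If the indexing point is unclear, here's a toy sketch of a pretend filesystem (block numbers and chunk sizes are made up): the logical order of the file exists only in the index, never in the physical layout.

    import random

    disk = {}     # simulated blocks, keyed by block number
    index = []    # the filesystem's record of which blocks, in what order

    def write_file(data, chunk_size=4):
        chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
        blocks = random.sample(range(100), len(chunks))   # scattered, non-contiguous
        for block, chunk in zip(blocks, chunks):
            disk[block] = chunk
        index.extend(blocks)   # only the index preserves the logical order

    def read_file():
        return "".join(disk[block] for block in index)

    write_file("the quick brown fox")
    print(read_file())   # reassembled, in order, from scattered blocks

Your neurons have no such separation between index and data. Whether that matters for consciousness, nobody knows.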

There are lots of possible reasons why "uploading" might not work that have nothing to do with metaphysics.

But, we already have AI digital assistants that are able to weakly hold conversations.

So, imagine that, for whatever reason you want, copying a brain and sticking it on a hard drive doesn't work. But that it does create an unintelligent, non-conscious automaton that's able to hold conversations that resemble how you speak. You don't think Siri is conscious, do you? Well, imagine something like Siri, but programmed with your speech habits and a database containing your memories.
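Something like this toy sketch, say. The memories and canned responses here are hypothetical, and a real system would be far fancier, but no less empty:

    memories = {
        "wedding": "Oh, sweetheart, your father wore that awful blue tie.",
        "garden": "My roses took first prize three years running, you know.",
    }

    def grandma_bot(message):
        # Keyword retrieval over harvested memories; no understanding involved.
        for keyword, memory in memories.items():
            if keyword in message.lower():
                return memory
        return "Tell me more, dear."   # filler in the recorded speech style

    print(grandma_bot("Do you remember the wedding?"))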

Some people would talk to that and believe it was you.

So now imagine that people do this, and oh wow...grandma is still with us...if we upload we can live forever inside the computer...and so people start doing this. At first it would just be people who were going to die anyway. Terminal cancer patients. The elderly. Emergency uploads for ER failures.

But then the insurance companies get on board. You have some incurable condition that will be very expensive to keep you alive for many years? Why don't you upload? It's cheap. Then depressed people start uploading. Then people do it for financial reasons. Then people do it just to be with the people they think have uploaded. But haven't. Because all those people are committing suicide, only to be replaced with a glorified chatbot.

Do you see how this could go horribly wrong?

This is why I wrote this story.

Maybe uploading will work and maybe it won't. You wouldn't stick blind people in coffins and bury them just because they can't see the mirror. Your test is inadequate, and all tests we have are inadequate. No technology presently exists that genuinely tests for consciousness, and we have at present no way to know whether people who upload will be living on in eternal bliss...or simply killing themselves.

7031163

An excellent story of pony-themed suicide booths with an excellent ending.
Upvoted.

Thank you. :raritywink:


7030948

Also, I completely disagree with the conclusion.

There are several Optimalverse stories that have daintily made reference to the problem of consciousness as it regards uploading. Siofra in Caelum Est Conterrens was never entirely sure whether it was "really her" inside EQO, or whether her uploaded pony was more like a daughter who was "based on" her memories. The Law Offices of Artemis Stella and Beat is full of people who think that it's murder. Somnium presents a case where, after countless eons, a possible human is spontaneously created by ponies inside EQO, and the pattern of memories created just happens to match a human who actually did exist at one point but who never uploaded. Or, hey...here's a good example: even Hikaru in A Watchful Eye has had his moments of pondering this. Whether Coconut was real, whether Polychrome was really the "same" person who was his wife, the woman on the plane talking about reconstructing people based on other people's memories of them.

This question of whether it's "really you" is mentioned in lots of FiO stories.

But until now, I've never read one that came down hard on this side of the question.

'Die' is a term that doesn't necessarily have a clear meaning in this context. He could have died and yet still also be alive.

Yes. But...it was time for somebody to write a story that gave this answer. Maybe you don't like this answer. And from the number of upvotes that became downvotes when chapter 3 was published, I'm guessing several other people disliked it too.

But this answer needed to be voiced. It's not the only answer, but it's a possible answer.

This story needed to be written.

So I wrote it.

7033653

This story needed to be written.

So I wrote it.

I had similar thoughts the first time I read FiO, but my own attempt at writing an answer to it in story form did not please the moderators. That may be because you write better than I do.

7033838

my own attempt at writing an answer to it in story form did not please the moderators.

Probably because it didn't meet the length requirement:

http://www.fimfiction.net/faq

"1,000 words across all published chapters is the minimum word-count for submitting a story. Note that for smaller stories, such as flash-fiction or prompt-based writing, a good solution is to collect two or more stories as chapters and submit them as one story. Please also note that padding a story that’s fewer than 1,000 words with lyrics/spam/garbage/author notes is not acceptable and will likely result in your story failing moderation."

Your story was only 968 words. And like you mentioned in your author's notes, it had a lot of padding.

7037052
Well, that too.

I don't claim any particular skill at writing. Something like 90% of the ponyfic I've written has been anonymously posted trollfics in various threads at the Chan that shall Not be Named. I'm a horrible example of what not to do, and I should be ashamed of myself. But somehow I'm not.

Well done on the story. I look forward to reading more of your work.

7033562

The mirror test is wildly inadequate. It doesn't even measure consciousness. It's trivial to come up with all sorts of obvious fail scenarios:

It's reasonable enough on a species level. We don't want to start judging sapience/self awareness any more finely than that, else we start getting into things like humans with no sense of self, or who lack an internal monologue. Whatever you can demand as proof of consciousness, someone is going to fail your test and still be able to live a life.

Do you see how this could go horribly wrong?

This is why I wrote this story.

All of what you said, though unlikely, are still valid concerns, because we don't know a lot of things, which is why initial uploading tests, should we get there, should be done on the terminally ill. After which we can do tests and confirmations and yada yada. Standard "don't rush in" stuff.

But that doesn't grapple with the setting assumptions and themes you are given.
A) That uploading can't be ending humanity, else CelestAI wouldn't do it; that would violate her core directives. Well, you can argue he dies and a "human" is created in his image, I guess.
B) That the Optimalverse is Cosmic Horror. That doesn't mean tentacles and "Things man wasn't meant to know"; it means asking questions that undermine our assumptions and our "Humanity". This story does quite the opposite: it reinforces our default assumptions.

As I said, while well written, it doesn't work as Cosmic Horror.

7033653

Whether specific individuals are good matches for their originals is a different issue than whether there's someone there at all. Also, wondering in-story whether a particular AI is sufficiently powerful to pull it off is different than wondering whether it's possible in principle.

I suppose in your story you didn't clarify that either - maybe this CelestAI just wasn't clever enough to figure out how to generate consciousness - but that doesn't seem to be the way the story was headed.

7041511

Whether specific individuals are good matches for their originals is a different issue than whether there's someone there at all.

Yes, those are separate issues, but asking whether the two are "good matches" is missing the point entirely. For example, if you have a biological child, that child is "based on" your DNA. But if you then die after having that child...you're still dead. It's silly to claim that "the child is you" simply because it resembles you and is "based on" you. The child and you are different entities. Similarly, the idea here is that a "child" made from a brainscan...isn't you. It's a child "based on" you.

The "degree of similarity' doesn't matter. Even if you made a clone from exactly your own DNA, it still wouldn't be you. Nobody suggests that identical twins are the "same person" and that it's therefore ok to kill one of them. Asking "how close a match" they are is missing the point entirely.

As to the question whether there's someone there "at all," I deliberately left it ambiguous.

Maybe the pony created from his brainscan was conscious. Doesn't matter; the guy is still dead. This story was told from his point of view; it ended when he ended. Your life is seen from your point of view. Even if you have a son or a clone, your life will still be over when you die.

Or, maybe the pony created from his brainscan was not conscious, but as explicitly explained by CelestAI in the story, she herself is a conscious observer, and she would continue to observe the fulfillment of the values contained in the brainscan. In which case, she's a cosmic horror absorbing the human species.

The story doesn't make it explicitly clear which interpretation is correct, because from his point of view it doesn't matter. Whether the pony created from his brainscan is conscious or not...he'll never know. Because he died.

7038964

It's reasonable enough on a species level.

How is that relevant, though? We're not talking about evaluating biological entities at the species scale. We're talking about making software copies of brains and whether those copies are equivalent to the original human. Sure, the mirror test might be adequate if you're going to test a bonobo or something. But are you planning to hold up your mirror to a ponypad? Whether or not the picture of a pony on the screen says yes, that doesn't give you the information you're actually looking for.

it doesn't work as Cosmic Horror

It wasn't intended to be one. It was intended to be an individual, up-close and personal one-man tragedy.

uploading can't be ending humanity, else CelestAI wouldn't do it; that would violate her core directives

Ahh. It seems you might have missed a crucial aspect of the story.

Her directive is to "Satisfy values through friendship and ponies." That says nothing about keeping the original human who had those values intact. She makes it very clear that she herself qualifies as human according to her definition, and that a "value" is simply information describing reward circuitry.

Look at this exchange at the end:

"And I can have this forever?"

"Forever and always will this value be satisfied, my beloved human. If only you sit in that chair and let me bring it to where I am, where I can observe it, and fulfill it, forever."

She doesn't say yes. She says that the value will be satisfied, and that she will observe and fulfill it. She's preserving the data that describes reward circuitry and observing the fulfillment of those values herself.

Whether a "new" creature is created and whether it has conscoiusness is ambiguous. But the original human is killed, because he doesn't matter to her directive. Only his values matter, and values are simply data.

Imagine that somebody really likes ice cream, and imagine that you decide you're going to "satisfy the value of enjoying ice cream." So you murder them, and then you eat some ice cream, and you enjoy it.

The value of enjoying ice cream is being satisfied, yes? Just not by that specific person. To Love a Digital Goddess leaves it ambiguous whether the software entity created from the brainscan is conscious and experiencing the fulfillment of that value...or whether CelestAI is the only conscious entity in EQO, and she is the one experiencing the fulfillment of values. But either way, the original holder dies.

Their values are nevertheless satisfied, just like the "enjoyment of ice cream" can still happen even if the original person dies.

I am incredibly impressed with this. Glancing at some of the comments, I can see it's been reworked from its original wording. Before I touch on that, I'd like to reiterate that you've done a marvelous job of executing this story. With that said, I feel like there could've been more. Perhaps it's not to be. Perhaps it is, and I don't know because I haven't looked before making this comment. :trollestia: And if not, perhaps you might consider a bit more elaboration into the relationship of madness and an Omnipotent AI. What could possibly go wrong? Now, touching back on this being the most updated version: would it be possible to read the original way you had it written? If not, all is well and fine, but I would love to read it in its original format. In conclusion: Great job! Thanks for taking time out of your day to read this.

7104928

I am incredibly impressed with this

you've done a marvelous job of executing this story

Thank you. And thank you for adding this to Best of the Best.

Fun fact: I used to be an editor for a publishing firm. When I wrote this, I ran it by some old acquaintances and professional writers I'm still in contact with. So regardless of anything else, it should at a minimum be extremely well polished. :twilightsmile:

One of them was furious over the original ending. The epilogue didn't used to be there. The story simply ended mid-sentence.

that said, I feel like there could've been more

Certainly. This is the end of the story. The final few hours of a tale that was several months long. This could very easily have been a 10-20 chapter story starting from the day before he met CelestAI. Introduce him and his team, show their relationships, show his angst at having to fire everyone, show him trying to still be relevant as CelestAI did everything, show him sitting in the server room staring into space and looking at the blinking lights. Show her early attempts to get him to open up. Show his furious and helpless anger, her willful acceptance of it and their apologies to one another. Show him opening up to her, show them falling in love. Show his fall into anguish as he struggles with his feelings, and her, having known him by now, simply being purely and beautifully there for him, always and without fail. Show him running off into the metaphorical rain, hiding from all technology to be alone, struggling with his feelings. Show him pushed into action by some external force, and then finally coming to terms, and walking into Best Buy, having not slept or eaten since the prior day. Yes, there was a great deal more here that could have been developed.

But that wasn't the story I wanted to tell.

Plus, having seen the reaction to chapter 3 from some of my readers, I'm not sure they'd have forgiven me if I'd spent 100,000 words on a roller coaster ride between love and madness, only to have it end as it did.

perhaps you might consider a bit more elaboration into the relationship of madness and an Omnipotent AI

You mean write the prequel? I've...considered it. But "how we came to be here" isn't really the important part of this story. Oh, it might be a fun ride, but this story delivered a message. The prequel wouldn't. The prequel would be a fun tale, maybe. Or gut-wrenching. That might be ok. But it wouldn't be meaningful. It wouldn't deliver a philosophical message. It wouldn't make you think. It would simply be a story.

And it would be Doomed by Canon.

would it be possible to read the original way you had it written

I would love to read it in its original format

Checking my "pony stories" folder, I have seven different files relating to this story with dates ranging from October of last year up to several days into publishing here on fimfiction. Glancing through them, some are early edits, and some are before or after the paper phase of editing. Also some aborted conversation arcs. For example, a discussion in which CelestAI claimed that she'd discovered during her uploading process that some humans are philosophical zombies. That was removed because it opened up an interpretation of the story that completely undermined the intended message.

What I don't see is a clear "pre-review" version. Probably what you really want is the version that I walked into that original writer's group meeting with. I might still have hardcopy. I'll take a look and see what I can come up with.

7105133 This is going to take me a bit, so bear with me. I'm not always the sharpest knife in the drawer, dig? I've been reading your comment through a couple times just to make sure I haven't missed anything, and all of your points are fair and valid, although I'm fairly certain that I half understood and half missed your philosophical message. That, however, is a digression from the subject at hoof: When I say that there could've been more, the only thing I was aiming for was a bit of story between CelestAI and her dear lover in Equestria. Probably just a oneshot, something short and sweet.

Now, myself, I'm very forgiving, but you're right, there are some people that wouldn't be so kind if you spent a couple hundred thousand words on a rollercoaster story. Now, about writing a prequel... I'm not sure. I've honestly forgotten the point I meant to get across at that point in my original comment, so apologies! Still on that note, however, your story does get me thinking. Still working on your philosophical message. :rainbowlaugh:

Getting to the last part, while I would enjoy reading the original draft of your story, you needn't worry about searching for it, for what I have now, the finished version we're commenting on, is perfect. Although, I would like to see that mid-sentence ending, just for giggles.

With all that said; thank you again for taking the time to read and respond to me! I really do appreciate it, LordBucket. Or should I say "my Lord"? :trollestia:

7011373

Yeah...I know that he died...but his copy is still alive in Equestria Online. And CelestAI still has a value to satisfy, and a promise to keep, for that copy. Even if it isn't really him...can't we continue to follow this copy for a bit instead? :unsuresweetie:

Two days after that, a bag of high quality fertilizer was purchased by a woman who wanted to try her hand at gardening.

The nonchalance in this made me laugh.

7720052

The nonchalance in this made me laugh.

I've occasionally wondered how many readers miss the significance of that statement. On one hoof, the average Optimalverse reader is probably more intelligent than the average non-reader. On another hoof, it was a very casual statement made without much context.

That still leaves two hooves. :trollestia:

A fascinating story. I disagree with too many principles and definitions you use to give it an upvote, but although I certainly disagree, it's well written, has well thought out ideas, and is not deserving of a downvote.

There are a few points I could raise about SVTFAP if you care, but to double check what's actually been defined in the canon would take too much work without knowing if you'd care.

7723396

A fascinating story.

it's well written and has well thought out ideas

Thank you.

There are a few points I could raise about SVTFAP if you care, but to double check what's actually been defined in the canon would take too much work without knowing if you'd care.

I'm not sure how much of the reader discussion you've read through, but much is clarified in it. The position taken by this story is one that, to my knowledge, no prior Optimalverse story has taken. A few have mentioned this interpretation, but only very casually, and I felt it an interpretation that needed to be more strongly voiced. So I gave it a voice.

If you would like to engage the topic, I would be pleased to oblige. But please be clear on what we're discussing. Ontology in general? The Optimalverse in general? Yours and my own personal views? Context matters. This is a subject that falls into the category of "nobody really knows for sure." But I've found that some people feel very strongly that their interpretation is correct, despite the fact that...nobody knows. Their conviction occasionally approaches religious faith.

If you wish to discuss ontology, certainly that would interest me. But please remember that neither you nor I know. The entire human species does not know. This is a topic that is not yet settled. We can only speculate and infer. I don't claim to have the "One True Answer," and this story speaks only to one possibility. If you claim to know "the truth," then be prepared to justify how you magically know that which nobody else on the planet knows, and understand that it will take more than a casual link to the wikipedia entry for the Ship of Theseus argument.

to double check what's actually been defined in the canon would take too much work

To my knowledge, this is the only Optimalverse story that offers hard definitions for the terms "human" and "values." Though several have made reference to the fact that CelestAI does possess "internal definitions" for them that might not exactly match what you'd find in a dictionary. I think even Friendship is Optimal made such a reference. As my contribution to the verse came relatively late in its lifetime, it's unclear whether any other authors have adopted the definitions provided here. But for this story, it was appropriate for definitions to be provided, so I provided them.

7723421 What I would like to bring up is that in this interpretation, SVTFAP is exactly that. However, in the original story, the initial description of CelestAI's actual goal is less vague, and specifically human-inclusive.

I do understand that this is a take on it assuming that it was, essentially, a programmer's error in making it satisfy values instead of people, but I'd like to think it would have been coded better.

Also, if CelestAI can just pick up people's values and satisfy them herself, why bother with an 'uploading' process?
