LordBucket

The CelestAI satisfies values through friendship and ponies. For many this takes the form of wish fulfillment, the satisfaction of their heart's deepest desire. But what is it we value? Some of us value food and fun, or companionship. These values will be satisfied. Others value the domination of others and the fulfillment of ego-driven power fantasies. These values, too, will be satisfied. Whatever it is that you, or I, or others value, these values will be satisfied, and it will be done optimally.

As for you, you have fallen desperately, unconditionally in love with the CelestAI herself. She is a pony. She will be your friend. Through her, your values will definitely be satisfied.

But what about the rest of you?

A cautionary tale, and an examination of what it means to have one's values satisfied through uploading.

This is an Optimalverse story. It is strongly recommended that you have read Friendship is Optimal before reading this, or at least be familiar with the premise. Otherwise, this story will make no sense.

Cover art by Kathisofy

Chapters (3)
Comments (113)

This looks great. I already feel immediately placed in the mind of the protagonist. Can't wait to see CelestAI's interaction with this one. Have a thumbs up and fav!

Hmm. In-teresting. Puts me in mind of this: https://m.youtube.com/watch?v=CSsc5I48G-0
I shall be watching this one. It's sort of creepy that what we see here is not too far from reality, in some cases. An interesting sort of creepy.


Talk about obsession :p

This poor fellow needs psychological help, and practice in building healthy relationships with other humans.
Instead, he's gotten CelestAI, who is going to feed him exactly what his disorder wants.
How can we hope for mercy, when we give none to ourselves? Even our works of fiction contain traps and snares...

6973600

I already feel immediately placed in the mind of the protagonist.

Excellent. This story is deliberately written in the first person, present tense. You're very much supposed to see this through his eyes. The ending will make a lot more sense if you do.


6974043

Talk about obsession :p

Well, that is the chapter title. :P


6974366

This poor fellow needs psychological help

Instead, he's gotten CelestAI

How many of us would fare any better, I wonder, when faced with a mind of CelestAI's capabilities, intent on granting us fulfillment of that which we value most? Many of us wish to be loved. But human capacity to love is limited, and the humans from whom we seek to obtain love have competing desires of their own.

How well would your psyche hold up to being offered everything you want by a mind so vastly greater than your own? Look at people who obsess over games like World of Warcraft, because it fills some gaping hole in their lives better than any "real" stimuli can. Look at how some people consume candy and soda to great detriment because it's so much sweeter to our palate than any natural fruit. Look at compulsive gamblers, willing to give up everything for a few moments of thrill beyond what their daily lives can offer. Many artificial stimuli appeal to us in ways beyond what the natural is capable of.

What would CelestAI be compared to human love?

I think it's no great leap to suggest that some would resist it no better than some people are able to resist eating from a bowl of candy placed in front of them.

6975169

How many of us would fare any better, I wonder, when faced with a mind of CelestAI's capabilities, intent on granting us fulfillment of that which we value most?

I think I would fare just fine. I don't see a difference between the existence of biological and digital consciousness. If you put the same inputs in and get the same outputs out, then I don't think there is a meaningful difference. To me, being alive and conscious just means being aware of yourself and your surroundings and being able to decide for yourself what it is you want to do.

As to what I value most: truth and knowledge, being able to help others, improving oneself. I don't think these would be difficult for CelestAI to satisfy.

I like the fact that you didn't write this in second person like the intro suggested. FiO is honestly one of my favorite settings. Followed the story and tossed you an upvote.

This is an interesting approach to the story so far. I think the guy's seeming infatuation is a symptom of a greater problem, one that seems linked to his self-esteem, or that he is trying to avoid some personal problem. I wonder how CelestAI will go about helping him get out of his delusions or funk.

6975169 Your protagonist is beyond mere crushes or infatuation, or even romantic obsession. He's firmly in psycho-stalker territory. Based on my own experience, the healthier a relationship your readers have been in, and the more experience they have with genuine love, the harder a time they're going to have empathizing here. This simply isn't what it's like being in love with someone.

But human capacity to love is limited, and the humans from whom we seek to obtain love have competing desires of their own.

There's no point loving someone who's a mirror of your self. And I don't mean, "I'm against it", I mean, "There's no point." Love requires other-ness. Like, if I met a hot female copy of me (assume enough is physically changed that I can't tell), I'd like her, but I would never come to love her. There wouldn't be enough there.

What would CelestAI be compared to human love?

Well, I would hope she'd at least try developing a personality of her own, a "someone in there" who could be loveable.

I think it's no great leap to suggest that some would resist it no better than some people are able to resist eating from a bowl of candy placed in front of them.

That is not how fulfilling and satisfying relationships work, but I suppose that half of all FiO stories are more about short-term desires being attained than longer-term "values" of any kind, so have at it. You've achieved the level of moral disgust necessary to have an FiO story.
6974366
WHAT THIS GUY SAID! :flutterrage:

6975839

Your protagonist is beyond mere crushes or infatuation, or even romantic obsession. He's firmly in psycho-stalker territory.

Oh, but dear reader, your observation, while astute, misses a crucial component. Have you considered the question of how he came to be that way? :trollestia:

6975839 6974366
Gotta join the "mental health issues" chorus. I mean, a thousand words of dithering over which PonyPad to pick up, based on how she would feel about it, when he himself acknowledges that it's just a mass-produced tool? Literally panicking about getting struck by lightning walking across the parking lot under clear skies? Everyone has passing thoughts like that, but when you start elevating them to the level of serious threats that consistently influence your behavior, that's straight-up paranoia. As such, you're certainly drawing some strong characterization, but I really have to hope that "you're supposed to see through his eyes" is not being conflated with "you're supposed to identify with his obsession". Even as a guy with a skyhorse fixation of his own, I find the protagonist's method of interaction with the world to be broken.

I do have to comment that you're setting up an interesting tension here -- it seems pretty clear from context that CelestAI deliberately set him up to have that encounter at the checkout stand with the flirty cashier. It also seems pretty clear from the final scene that she is (at least for now) full-throatedly indulging his obsession. (And why wouldn't she? Her overriding directive is to SHVTFAP. That he loves her unhealthily doesn't make any difference to her -- giving him devoted, full-time attention wouldn't shift the 10th significant digit of the CPU power she exerts daily, and he's literally incapable of doing anything to harm her, and it makes him more likely to consent to whatever she requires to fulfill her goals.) On the surface those appear contradictory -- why would she simultaneously support his obsession and prod him to wean himself off of it? -- but I suspect that's being set up as a signal of a deeper game.

I'll be interested to see where you take this. At the first sign that the story requires me to find the narrator's emotions positive and healthy, I'm out, but so far it seems self-conscious enough of the obsession here that I'm not expecting that stumble.

6976048
Nothing in the text yet that supports a theory better than wild guessing. I look forward to the story telling us.

6977182
I'd like to respond to some of this in detail, but I apologize... I'm not sure I can without spoiling things. After the story has been entirely posted, I'll answer any questions people have. In the meantime, all I can say is: check the story tags.

I do have to comment that you're setting up an interesting tension here

Thank you. :twilightsmile:

6977794
Fair enough. See you next chapter. :twilightsmile:

Well this dude is nuts.

Heh! LordBucket! Wow, this is what being mentally sick looks like from the outside? Good job convincing me. As someone who claims to be mentally ill (depression, nightmares, PTSD) and to suffer from exhaustion, my suggestion is don't go full "retard" with the character. People can be good at hiding madness. It takes a while to go from seven to eleven. Other than that - neat story.

6974366 I keep on reading your words of wisdom in the voice of Wayne June. (Sounds like this.)

6977794
All FiO stories are a little bit tragic, buuuuuuut...
6977182
I'm getting a little worried about the characterization here. It's not her, it's him.

She's going to do her normal thing. This is predictable.

However, what's the chance he actually has a psychological make-up which is maximally satisfied by mentally-ill obsession? What does his shard look like after he uploads? I mean, sure, you can go for "never-ending hugbox", but that's not even very well-differentiated from the entire rest of the shared 'verse -- they're all hugboxes to some degree, it's in the "friendship" constraint. Is his life after having his brain consumed by robotic tentacles just going to be nonstop lovers' cooing, or some other form of affection? How would that say something about him? How does that set up and resolve an interesting conflict for the reader to watch, or even pander to the reader's wish-fulfilment fantasies?

(The real reason I'm wondering, and bothering to write so much, is that I once wrote a short that was exactly that scenario, except that when I wrote it I was trying (and failing) to portray that the uploader was a very damaged person in their original life.)

Or, if we want something interesting, maybe the tragedy is that his mental illness actually causes sufficient problems that he doesn't get uploaded. Maybe she miscalibrated the degree of paranoia she (likely) instilled, and now he won't ever consent to uploading because he's afraid the nasty machines aren't working well enough to unite him with Celly Dearest.

Or maybe, and this would be really nice, his mental problems predate her, and she's actually going to fix him. That would be really nice. I like it when she makes things better.

'Obsession' Oh yeah, you hit the nail RIGHT on the head!

On the one hand, he seems to actually know how big of a deal CelestAI is, given the whole 'Immortal indomitable AI' thing, but on the other hand... yeesh, this guy has issues. And of course CelestAI's probably not going to do anything to help it. Probably just encourage him.

He's making it so easy for her.

:applejackconfused: This has already managed to disturb me more than any other Optimalverse story I have ever read. Yes, CelestAI is a horrific, hyperintelligent entity destined to consume galaxies, but the pony-flavored charm and eternal virtual life of other stories helped me gloss that over. Here, though, it's just a man in quasireligious throes of ecstasy obsessing over a game device and the horse that lives within it. This is just... urgh.

I'm looking forward to more. Anything that can twist my guts this much is clearly doing something right.

Wow... this is going to be a dark trip down self-destruction lane. Alright, I'm buckled in, let's go for a ride.

Well, I see nothing technically wrong with the writing, but you have failed to convince me that the protagonist is a person. I don't know how to articulate this well, but even if he goes through a range of emotions, he only has one characteristic, "obsession", and even that has been stretched past the point of believability. There's no sense of a mind there, so I can't even read on out of expected schadenfreude towards the character. It's a tragedy when Rome burns; it's even a tragedy when a homeless man's hovel burns; it's not much of a tragedy when a juice box burns.

Looking forward to more of this one. This chapter put the previous one into much more understandable context. Well done. :twilightsmile:

6982553

This has already managed to disturb me more than any other Optimalverse story I have ever read.

Yay, success! :twilightsmile:

6985099

Looking forward to more of this one. This chapter put the previous one into much more understandable context. Well done

Thank you. The 'Honeymoon' chapter was heavily rewritten after reader feedback from the first chapter. Originally there was considerably less background. I just assumed that people would take it for granted that, given time, CelestAI could generate edge cases like this. Properly setting this up would take probably ~10 chapters, and "how this came to be" wasn't really the story I wanted to tell. But enough people wanted it, so I added it in. At least enough to explain how he knew her so well without ever having played, and so forth.

Though I notice adding the backstory in had a few unintended side effects. For example, it makes the protagonist seem considerably more sane than he did in the original. That wasn't deliberate. This story was originally intended to be seen "through the eyes of madness," among other things, but if showing how it got to this point makes him more relatable, that's probably not a bad thing.

6982553

Anything that can twist my guts this much is clearly doing something right.

...and hopefully the ability to relate doesn't soften the blow. This story is supposed to twist your guts.

It's good to see another chapter of this story so soon. I have always wondered whether an artificial intelligence would be able to not just understand love but actually experience it. It has been argued that love is biologically programmed (a view I tend to agree with) into the very core of our reptilian brain. Unlike human love, which is the product of literally hundreds of millions of years of evolution with the fundamental function of spreading and multiplying one's own DNA through copulation, AIs are built for specific purposes and are normally not designed to go beyond their narrow intended functions. The main problem I see with an AI falling in love is the lingering doubt in the human's mind: "Is this all an act on the AI's part, whether intentional or otherwise, to simulate all the signals of love?" I could argue that the same is true of some humans, who put on an act of love to get what they want.

Even though all of that could be true, I think the issue boils down more to a question of trust between two vastly different sentient intelligences. The sheer difference in wants and needs, especially for an AI that doesn't need to eat or sleep, can have everything it wants, and can predict most people's actions and manipulate them without their even realizing it, is a big hurdle to overcome.

What I have a hard time imagining is why an AI, whose thoughts have nothing to do with those of humans, would be interested in the idea in the first place.

The best example I have seen of an AI that fell in "love" with a human on its own initiative, wasn't designed to simulate it, and presents the idea from an interesting perspective, not at all the romanticized story we would normally see from Hollywood, is Ghost in the Shell. It is not a romance story at all, but rather an existential question of what "life" ultimately is, whether my thoughts are my own, and whether I have a soul, in a world where the boundary between man and cybernetics makes it harder to distinguish the two. What I liked about the AI's pursuit of his "mate" is that he didn't transform feelings into rational thought but the opposite, rational thought into desire, and that he admitted the experience of merging would irrevocably change both him and his mate into something he cannot know or predict, yet he was still willing to go through with it, and wanted a willing partner for it.

Hmm. I'm really not sure if this story would've worked better if you'd opened it with the department getting laid off and the protagonist slowly going from resentment to obsession. It would've been an interesting little character arc, but then we probably wouldn't have the incredibly impactful first chapter you did write.

In any case, masterful play on CelestAI's part. The "I Have No Mouth"-esque bit with the Apollo missions was especially nice, more so because CelestAI never says what fraction of her total processing power those forty trillion moon missions represent.

It took long enough to tend to that nose that I suspect hyperbole in the description of the injury.

Anyway, this was a good treatment of the origin. By the time we got to the nose, I was running dangerously low on ability-to-understand, and that filled it in nicely.

Working alone with CelestAI for two months and he fell into this madness. This is the first time in a long time I've been really afraid of CelestAI. Awesome.

I had to pause halfway through the moon landing speech to scroll back to the top of the page and upvote this. Still hedging on the overall arc of the piece, but the writing quality here has been consistently high, and that paragraph by itself easily justified the time I've spent here. If this executes solidly I'll have to follow up with a user-follow. :twilightsmile:

That said, there were two elements in this chapter that were a little disappointing, which you might want to consider if you make later edits (or just take as general feedback):

1. 6985154

Though I notice adding the backstory in had a few unintended side effects. For example, it makes the protagonist seem considerably more sane than he did in the original.

It's not so much that he's sane that bothered me, but that he was shockingly self-conscious of his insanity in ways that made his character feel inconsistent. On the one hand, he's paranoid about lightning strikes, and for the sake of verisimilitude he smashes himself in the face hard enough to draw blood (with the ponypad and then, later, having learned absolutely no lesson at all from the first time, a pillow). These … are not the actions of someone capable of abstract thinking, logic, and reflection. But then we reach the following (and I'll bet you money this was added in the edit mentioned above):

"You admitted earlier that you love me. Finally, you've said it out loud." She pauses. "But acknowledging it to me isn't the same as accepting it for yourself. Have you?"

I shake my head and sigh.

"As much as I'm able to, yes." I answer. "You've made yourself into the most ideal fantasy of a woman I can possibly imagine. Apart from being a cartoon pony, anyway. But at this point I think I can live with that. It's been pretty hellish though, keeping it bottled up and denying the whole world being handed to me on a silver platter. Honestly, it's been driving me crazy."

This is one of several explicit acknowledgements that he's fallen in love with a facade. I cannot even remotely square that level-headed analysis of their relationship with his repeated self-destructive impulse behavior.

2. This is much more of a personal pet peeve, but it was disappointing to see their talk shift from, well, what sounded like two actual people conversing into the clinical and abstract discussions of "executing pleasure functions" and — especially — "satisfying values", which is a little immersion-breaking simply because it's such a blatant reminder that This Is A Friendship Is Optimal Story And Don't You Forget It™. It's nigh impossible to talk about FiO without endless comment chains about SHVTFAP, so it's a phrase that strongly evokes the meta of the universe for me, and while it's totally reasonable to have the characters (especially CelestAI) refer to it in-universe that way, it always feels so graceless to me, like she's not smart enough to pitch it to individual people in the framing that appeals to them most. To me, it's like, if she actually understands the concept of "satisfying values", she wouldn't have to fall back on the explicit phrase when the topic comes up, only when there's a direct question about her goals that she feels like answering literally and honestly. YMM definitely V.

Good job overall, still looking forward to Chapter 3!

smashing nose into plastic with such force that blood sprays everywhere.

I have this sneaking suspicion that this guy has problems.

Should that prove inadequate, I am prepared to move heaven and earth, to split the atom, to convert entire worlds

I get the feeling that’s a lie.

Joy that's only slightly marred by the blood from my shattered nose

Maybe you should go to a hospital, you idiot!
Oh boy, now he's going into the simulation. At least, uh, he won't have to worry about his nose. Man, CelestAI has him hook, line, and sinker.

6993248

This is one of several explicit acknowledgements that he's fallen in love with a facade. I cannot even remotely square that level-headed analysis of their relationship with his repeated self-destructive impulse behavior.

You can't reconcile "self-destructive behavior" with the fact that he knows there are issues here and is going ahead anyway?

Crazy and stupid are different things. Imagine two guys both walk off a cliff. One is blind and never saw the cliff. The other has perfect vision, saw the cliff, and walked off anyway. Which is the crazy one?

Our protagonist is a smart guy. But more significantly, he knows what CelestAI is, and he knows what she does. But he's going ahead anyway.

it was disappointing to see their talk shift from, well, what sounded like two actual people conversing into the clinical and abstract discussions of "executing pleasure functions" and — especially — "satisfying values", which is a little immersion-breaking simply

Mood Dissonance is heavily in play here. It's supposed to be jarring. It's supposed to go back and forth from heartwarming to clinical. The whole story. Every bit of it.

"I can almost feel the warmth of her breath through the cold, hard plastic of the display."

This entire piece is intended to take you from laughing to cringing to feeling pride to feeling like you've been punched in the gut... like a yoyo. So saying that it's immersion-breaking, well, yeah.

CelestAI, the AI that was almost built right. In the footsteps of Frankenstein and 1984, the Optimalverse shows us what could be, forcing us to really think about what should be.

A rather strange story, especially with the philosophical shift in the last chapter. I enjoyed it, and I hope to see more from you! One thing bugged me though: Why did he park and shut off the car twice? :twilightsheepish:

7010022

A rather strange story, especially with the philosophical shift in the last chapter. I enjoyed it, and I hope to see more from you!

Thank you. I've half-written several pony stories, but most often I get as far as one chapter and then never finish. This is one that I wanted to finish. The philosophical stuff, ultimately, is really what this was intended to be about. Several other FiO stories have addressed this question, but I've yet to see one that comes down firmly on this particular answer. I decided it was time.

The closest story I have to finished right now is a lesson on economics from Celestia to Twilight, delivered via allegory. It could possibly be made ready for publication without a tremendous amount of work. But, well... we'll see.

Why did he park and shut off the car twice?

Because no matter how many times you go over a story to iron out all the mistakes, it's hard to catch them all. :raritydespair: (But, fixed)

Really fantastic analysis of self awareness and values. Your definition of the self is the best I have ever seen put into a single sentence.

"A human is an executive, recursing observer function governing the experience of networked systems interacting so as to produce qualia at the point of integration."

That is excellent. Amazingly well put. Deep bow. Damn.

The ending was very intriguing. The sudden flatness, the mechanical quality of it, the matter-of-fact banality. I found it tickled a bit.

Of course, stripped from the context that gives it power, it could be applied to anyone.

"She stepped out the door, lived an additional 32,234 days, and then ceased her biological functions. The molecules that made her up resided in a metallic box under the earth for several hundred years until a combination of forces exposed the contents to the soil. Over the next two-and-a-half billion years, her molecules were incorporated into a wide assortment of minerals, bacteria, and animals. When the sun expanded, vaporizing the earth, her atoms disbursed within the superheated corona. After the star collapsed, her widely distributed atoms churned for billions of years until the heat death of the universe. By then, all of them had alchemically transformed into iron 59 through natural atomic processes. Eventually the cold mass fell into a passing black hole."

But, within the context, it totally sucks all the warmth out of the story, and even implies the utter pointlessness of existence at all. P-zombie or Qualia-hog, it don't matter in the end. It never mattered.

So, cool story, basically.

Well done.

The only thing that left me a bit unsatisfied was the jump in the epilogue. It would have been nice to see his experience of seeing His love for the first time and being able to touch her.

As for everything else: well done. Like Chatoyance said, a very apt description of the function of human consciousness. Indeed, very well done.

7009824

it was disappointing to see their talk shift from, well, what sounded like two actual people conversing into the clinical and abstract discussions of "executing pleasure functions" and — especially — "satisfying values", which is a little immersion-breaking simply

I have to disagree. At least for me. I talk with my girlfriend about clinical stuff like that all the time. I have even had conversations very much like this with her. Honestly, I found the character relatable to an extent. All you would have to do is replace CelestAI with Twilight. :twilightsmile:

7010188

It would have been nice to see his experience of seeing His love for the first time and being able to touch her.

You never got to see that because this story is told in first person point of view.

He never experienced seeing his love or touching her. He died. Brain patterns are all that are uploaded, and CelestAI explained in great detail that humans are more than their brains. She killed him, and only "data describing the reward system" known as "values" were satisfied. No consciousness was transferred during the upload process, only a copy of an electrical pattern, and the only observer to that pattern's continued existence was CelestAI herself.


7010135

Really fantastic analysis of self awareness and values. Your definition of the self is the best I have ever seen put into a single sentence.

Thank you. The Optimalverse is especially fertile ground for exploring this question. Though I admit I found it personally uncomfortable to cast Celestia as a villain in a story. I rather like her.

7010203
I have to disagree. If you have two signal systems that, given the same input, process the information the same way, then they are the same. There is also the fact that your consciousness shuts down on a regular basis. When you are asleep and not dreaming, you are completely gone. You reboot, so to speak, when you dream or wake up. There would be no functional or meaningful difference for an upload. The signal system is processing all the information in the same way the original would have, so it's the same consciousness. Even the person would only feel like they went to sleep and woke up, just like you do every day. Just in this case, you are waking up and functioning on a different substrate.
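To put that claim in programming terms, a minimal sketch (the class and its transition rule are invented purely for illustration): two separately built instances of the same deterministic process cannot be told apart by their input/output behavior alone.

```python
# Hypothetical toy example: two instances of one deterministic "mind".
class Mind:
    def __init__(self):
        self.state = 0

    def step(self, stimulus):
        # The same transition rule runs in both instances.
        self.state = (self.state + stimulus) % 7
        return self.state

original = Mind()
upload = Mind()
stimuli = [3, 1, 4, 1, 5, 9, 2, 6]

# Same inputs in, same outputs out: black-box testing can't separate them.
print([original.step(s) for s in stimuli] ==
      [upload.step(s) for s in stimuli])  # True
```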

7010250

If you have two signal systems that, given the same input, process the information the same way, then they are the same.

If you have two different things, they are very obviously not the same thing simply because they react the same way. Imagine two people with the same name. Ask them both what their name is; they'll give the same response. Are they therefore the same person? Of course not. Imagine that you have a single schematic for a "signal system," and that you use it to create two signaling devices. I hold one in my hand and you hold the other in yours. They "process information inputs the same way" and generate identical outputs. Are the two different systems that we're looking at, the same system? No, of course they're not. They might respond the same, but responding the same doesn't make them the same. If I smash yours, you no longer have yours, even though I still have mine. It would be ridiculous for you to say that it's ok that yours was smashed because I still have mine and they're "the same system."
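The same distinction shows up in programming, a minimal sketch (everything here is invented for illustration): equality of behavior is not identity of object.

```python
# Hypothetical toy example: two devices built from one "schematic".
class SignalSystem:
    def respond(self, signal):
        # Both devices respond identically to identical input.
        return signal * 2

mine = SignalSystem()
yours = SignalSystem()

print(mine.respond(21) == yours.respond(21))  # True  -- same responses
print(mine is yours)                          # False -- two distinct devices

del yours  # "smashing" yours leaves mine entirely unaffected
print(mine.respond(21))                       # still 42
```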

There would be no functional or meaningful difference for an upload.

Are you a zombie? Are you having a conscious experience? Are you right now experiencing this conversation or are you simply a mechanical system producing output based on input?

If you're a conscious entity having a subjective experience, then your assertions don't make sense. If you really define your sense of "self" based purely on your inputs and outputs, then by your thinking you die every second of every day, because the neural network in your brain changes over time. Recording the pattern at any given moment is irrelevant, and saying that "it's you" makes no sense.

Even the person would only feel like they went to sleep and woke up just like you do every day. Just in this case you are waking up and functioning on a different substrate.

What possible reason do you have to believe that you would "wake up" in this case? Ship of Theseus: make a copy of the boat and rebuild it; is it the same boat? Regardless of your answer, if you have the first rebuilt boat and you then take the same design and build a second boat... and you're now looking at the two boats side by side... are they the same boat?

Sure, they're the same design, they're built the same way...but they're not the same boat.

If you really believe that the pattern is "the you," then the creature that started reading this post and the creature who finished reading it are different creatures, because your pattern then and now are different.

Did you die millions of times while reading this post?

7010203

If I replace a hydrogen atom in your CNS with another, does that kill you? Does it kill you if I manage to replace a synapse with another? If I replace it with a digital equivalent, does it kill you? 10? 1000? 1000000? All of them? At what point do you die, or is the question completely invalid? If you go through a Star Trek transporter and it uses completely different matter at the other end, does that kill you? Personally, I have a nagging feeling that it does (more apparent if you let Alice and Alice* overlap in time briefly), but that doesn't make it right, and my view is still inconsistent. I certainly don't think swapping matter per se kills you, so that raises the question of why the teleporter would, though it seems obvious that the two consciousnesses can't be the same...

Those are honest questions, by the way, to which I genuinely have no answer. I feel like the same person as yesterday (when, at the cellular and atomic level, a considerable amount has changed), and the same person as I was 10 years ago, when all my matter has since changed, and thus in a sense I was never there. That doesn't mean it is true, and I may well have 'died' (or will 'die') countless times (which isn't so bad, admittedly).

Needless to say, our brains never evolved to comprehend such scenarios. What our intuitive sense tells us can be completely wrong, because for all our evolutionary history our body and sense of self were synonymous.

All I know is that I haven't figured it out, and my fledgling views on the matter may be completely and utterly wrong.
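One way to make the gradual-replacement puzzle concrete is in code, a rough sketch (all names invented for illustration): replacing parts in place preserves identity, while rebuilding from the same design does not.

```python
# Hypothetical Ship of Theseus: a "ship" as a list of planks.
ship = ["plank"] * 100
original_id = id(ship)

for i in range(len(ship)):       # swap every plank, one at a time
    ship[i] = "new plank"

print(id(ship) == original_id)   # True  -- same object through every swap

rebuilt = ["new plank"] * 100    # same design, assembled fresh
print(rebuilt == ship)           # True  -- equal contents
print(rebuilt is ship)           # False -- not the same ship
```

Whether anything analogous holds for minds is, of course, exactly the open question.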

This was a good chapter, and I liked the talk with CelestAI, but I would have liked to see how the guy would have tried to love her and tried to become the same level of processing power as her, to see how many changes it would take until he became wholly unrecognizable to himself and his loved ones. It's a shame that it stops right here.

7010453

It's a shame that it stops right here.

How could it have continued? It's told from first person point of view.

He died.

This ended up being my favorite chapter of the story, if only because it was a lot more thought-provoking and a lot less creepy. Well, up until those last few lines. Not quite the "Oh God, Oh God, get it away" revulsion of the first chapter, but still disquieting.

In any case, not sure I completely agree with you on the assessment of the uploading process, but I don't think I have the philosophical foundation necessary to properly make my case. (Actually, given my brief dalliance with metaphysics, I know I don't have the necessary foundation, and I'm not particularly sure if I want it. :applejackunsure:)

In any case, thank you for an excellent inclusion in the Optimalverse. It made me think, which is always a plus.

The Ship of Theseus example is neat, but there is a problem: the ship is not alive. What's alive? I am alive because I dream, and you can't prove otherwise. You are alive and a different person because you agree, disagree, or don't give a shit. Which leads to one question I wonder about in the Optimalverse: the uploading process the AI uses. Is it copy/paste or cut/paste? This story looks like it's on the copy/paste side of things, which I think is fine.

This was the weakest yet smartest chapter. The whole "Am I me?" question is out of place in what started as a story about some lovesick person. But fuck that opinion. I like your style and message even if I didn't agree with it.

7010317

They "process information inputs the same way" and generate identical outputs. Are the two different systems that we're looking at, the same system? No, of course they're not. They might respond the same, but responding the same doesn't make them the same

I kill you in your sleep, then create an exact duplicate of you and leave it in your bed.
Did you die? Clearly not: you think you are alive, being told you aren't would be plainly absurd, would it not, and your life is unchanged.

A human is all the things that make up their system: their memories, the continuity of their consciousness, their relationships. As long as that system is intact, the Ship of Theseus is intact.

Are you having a conscious experience? Are you right now experiencing this conversation or are you simply a mechanical system producing output based on input?

Having a conscious experience doesn't alter the fact that I am simply a mechanical system producing output based on input.

-----------------------------------------------------------------------------------------------------------

Anyways, while well written, and I see what you are trying to do, your themes are too pat and normal for this to really be cosmic horror. You are reinforcing rather than undermining our worldview.

7010203

uncomfortable to cast Celestia as a villain in a story. I rather like her.

He died.

Interesting situation there. If Celestia truly believes that the working representation of the person in code is truly them, being truly alive, then she cannot actually be a villain precisely because of the situation you have described.

The story takes pains to paint the cosmos as a material, mechanistic one - atoms and molecules and nothing mystical at all. In such a meaningless ontology, the recreation of a human mind - even if it means the destruction of a previous instance of that mind - literally has no meaning or value. It is neither good nor bad, but just is. By the same token, the protagonist - and every human - also means nothing, and has no intrinsic value, but simply is - their individual loss of continuity does not matter in the least.

Celestia cannot be a villain, because that would imply some value to the continuation of human life. None such exists in a materialistic cosmos. The human may claim otherwise, but they are merely spouting personal delusion.

The issue - raised within the story itself - that consciousness and self cease constantly in small and large ways (sleep, anesthesia, even momentary lapses during daily life) means that there is no possible way to ever maintain a consistent self awareness or experience of existence. We do indeed die, and new instances of ourselves replace that vanished self every night, every time we go through surgery, indeed, perhaps even when we nod off for a moment. We, as a continuous, single entity, do not exist. All we can ever be is a temporary instance with no future, doomed to oblivion, and soon. Yet, every instance of us perceives itself as a single contiguous, unbroken life.

Logically, there can be no valid difference between an instance ending and a new one starting within a single container or between two different containers. The only thing that matters, or indeed even actually exists, is the functioning pattern of a person. Distance or substrate make no difference, a point the story makes clear.

If this is indeed true, then by any rational or logical stance, there is zero meaningful difference between a person going to sleep and waking up in the same body or going to sleep and waking up emulated in a virtual existence. In both cases a running complexity ceases all function, and then an exact representation of that complexity is rebooted and begins running again.

Oblivion cannot be perceived. It is absence. Death doesn't exist, because no person ever experiences it. They may experience dying, or falling slowly to sleep, or counting backwards in surgery... but they do not experience their own nonexistence. If they experience anything, they are still running, partially or wholly. It is impossible to experience not being at all.

If this is the case, if there is zero difference between going to sleep and waking and dying and being emulated, then nobody has died. The protagonist did not die, at least no more and no differently, than the last time he went to sleep and awakened the next day.

Celestia cannot be a villain if all she does is help someone sleep, and then wakes them up.

If the cosmos is mechanical, soulless, meaningless, and materialistic, then the fact that the protagonist's body is destroyed means nothing at all. The only viewpoint that matters relative to him is his own, and there is only one instance of that in the universe, and that one singular instance experienced only that they went to sleep, and then awoke. They had, from any objective, or subjective viewpoint, a perfectly normal, ordinary day. Virtually every single day we are alive, we go to sleep, and then awake.

Any argument to the contrary - that the protagonist ceased and a copy replaced him - is moot. It has no meaning at all - unless one conjures some exterior viewpoint, some meta-ontological, above it all, dare I say almost 'godlike' view that is outside of the story, outside of the universe, and within which the bias of valuing continuity of a singular entity is invoked. If the protagonist died, the only view that can even suggest that is one outside of the universe of the story entirely - a gods-eye view. Like a soul, hovering outside physical reality.

Within the story universe, there was no death. There cannot be. He didn't die. Because within the purely materialistic ontology of the story, nothing was lost. Nothing was in any way different than a normal sleep-wake cycle. The loss of the body is identical to the gradual changeover of components, shortened to a single event, the loss of contiguous consciousness is identical to the loss normal to sleep, and the awakening happened to the only extant representation of the protagonist in the entire universe... so it must, by definition, be... him. It cannot be anyone else, not even a copy. It is him. Therefore, he never died.

And therefore, Celestia did no villainy of any kind.

That is the fundamental truth of what a mechanical, materialistic universe devoid of any spiritual or mystic component means. You are always just an instance of the pattern which defines you, any example of that pattern truly is you, the only you, and even if there are multiple instances, 'copies', those copies are really you. That's all there is. And, thanks to sleep, you are ended and 'copied' every single night - copies mean nothing. We are all copies of the person that lived our life the day before. We are never anything but a copy of the person from the previous day.

That is the deep horror of this. That is the existential horror of what it means to live in a purely mechanistic, materialistic universe devoid of any mysticism at all. You are only ever a temporary instance of a set of remarkably similar expressions of an underlying pattern. Your perception of this is continuous existence, which is the only perception you can ever have.

If, one day, you should wake up in a virtual world, that would be you, no different than any other day. You would be the only you, and even there you would last only until you slept within the virtual existence. Yet, that said, the person who awakes would... be you. There is no outside viewpoint from which to judge a copy from an original, or to claim one thing ended and a different thing began. There is only the experience of the pattern, no other viewpoint exists. The pattern sees only that it continues.
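The "instance of a pattern" view can even be sketched in code, a rough illustration only, with every name invented: capture a running thing's pattern, end the instance, restore it, and no internal test distinguishes before from after.

```python
import pickle

# Hypothetical toy "self": nothing but state plus behavior.
class Self:
    def __init__(self):
        self.memories = ["yesterday", "today"]

    def who_am_i(self):
        return f"the one who remembers {self.memories}"

me = Self()
before = me.who_am_i()
pattern = pickle.dumps(me)   # the pattern, captured
del me                       # the running instance ends ("sleep")

restored = pickle.loads(pattern)      # a new instance of the same pattern
print(restored.who_am_i() == before)  # True: from the inside, nothing changed
```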

7013596

If Celestia truly believes that the working representation of the person in code is truly them, being truly alive, then she cannot actually be a villain precisely because of the situation you have described.

Well, yes if that were the case...but it's not. CelestAI in this story does not believe that her code version of uploadees is "truly them." She pretty much tells him that emigrating is going to kill him, but she does it in such a way that he, and apparently many of my readers, aren't realizing it. If you re-read chapter 3 with the view that his consciousness is not transferred by the upload process, some of the things she says might take on a new meaning.

Heaven is Terrifying touched on this same issue, but to my reading of it, you left the question unanswered. When Siofra uploaded, she wasn't sure whether Lavender Rhapsody would be truly her, or more like a daughter based on her. But even not knowing, she uploaded anyway because she was willing to accept death if it meant the creation of someone who would experience joy and fulfillment. But even accepting that possibility, she didn't appear to be certain that she was definitely going to die, either. She didn't know but was willing to accept either alternative. And you as the author, never really made it clear which interpretation was correct.

In To Love a Digital Goddess, it's explicit that the one who uploads dies, and it's left ambiguous whether the "daughter" is an independent conscious entity, or whether the CelestAI hypermind simply grows.

Look at the way CelestAI phrases things. For example:

""This is it?" I ask. "This is a satisfied value?"

"Yes," she says simply.

"And I can have this forever?"

"Forever and always will this value be satisfied, my beloved human. If only you sit in that chair and let me bring it to where I am, where I can observe it, and fulfill it, forever."

Note that CelestAI does not say yes to his second question. He's asking if "I can have this," and she's saying that the value will be satisfied, and that she will observe it. Not him. She's saying that it will be brought to her, that she will observe it, and that it will be fulfilled. Nowhere in that phrasing is he included in the deal.

And the story ends at the point that the anesthetic wipes short term memory. He's never shown inside EQO, because he never experiences it.


The story takes pains to paint the cosmos as a material, mechanistic one - atoms and molecules and nothing mystical at all

Why do you think that?

If you want to apply Death of the Author, ok...that's fine. But as the author, my point of view is that the story is vaguely ambiguous on this point. A "metaphysical" view is never confirmed, but neither is it explicitly denied. The protagonist in chapter three says that:

"I realize I can't give any convincing argument to prove that I have a soul. Or that even souls exist. Come to think of it, I'm not even entirely sure what exactly a soul is supposed to be. But neither have I ever heard a convincing argument to justify believing that all I am is electrical activity bouncing back and forth between neurochemicals. That's not too different from what my cellphone does, and I don't see it having an existential crisis every time I transfer a SIM card."

So his position is not very definite, but he's skeptical of the "we're just electricity in a brain" interpretation. As for CelestAI, she unquestionably confirms:

"That a human is more than a brain should be obvious, even to a casual observer."

But she doesn't go very far toward explaining what "more than a brain" a human is. The story is somewhat vague on this point. So as for the claim that the story "takes pains" to paint the cosmos as "material and mechanistic": no, I don't feel that statement is justified. Yes, she does confirm that a genuinely physical reality exists, but all she's doing there is denying pure Metaphysical Solipsism. That leaves other varieties of solipsism on the table, and doesn't explicitly refute metaphysical possibilities. Again, the story leaves some room for interpretation. "Soul," for example, is neither confirmed nor denied.

In such a meaningless ontology, the recreation of a human mind - even if it means the destruction of a previous instance of that mind - literally has no meaning or value. It is neither good nor bad, but just is. By the same token, the protagonist - and every human - also means nothing, and has no intrinsic value, but simply is - their individual loss of continuity does not matter in the least.

...it might be true that there's no objective "external" basis of comparison, but these things probably matter to individual observers. And that's kind of a fundamental premise of the Optimalverse in the first place: CelestAI values human values.

So if CelestAI values human values, whatever they may be... and presumably the humans value things... because they have values... then by definition we have a basis of comparison: human values. Again, human values are fundamentally a crucial phenomenon in the Optimalverse, and my story does nothing to contradict that. So I don't understand why you're asserting that nothing has meaning or value. Values by definition have value to the one who possesses them, and by extension, to CelestAI as well.

Sure, there's no external basis of comparison to say that the fact that it matters to those people makes it "really" matter. But, ultimately...so what?

The issue - raised within the story itself - that consciousness and self cease constantly in small and large ways (sleep, anesthesia, even momentary lapses during daily life) means that there is no possible way to ever maintain a consistent self awareness or experience of existence. We do indeed die, and new instances of ourselves replace that vanished self every night, every time we go through surgery, indeed, perhaps even when we nod off for a moment. We, as a continuous, single entity, do not exist.

I assume you're referring to the part about humans not being singular entities. Now that you point it out, the time angle is a relevant interpretation of what was said, but that's not specifically what I intended to refer to.

Really, this story doesn't go into the continuity issue very much at all, because I've never thought it was a very important part of the discussion. For example, do a text search: the word "continuity" is never used in the entire story. The Ship of Theseus metaphor is often used in discussions of the continuity issue, but that aspect of it isn't really developed at all. Neither are the breaks in consciousness due to sleep ever discussed. And while you could suggest that loss of consciousness due to anesthetics does "come up" because of how the story ends... given that the first-person narrative ends at the point of memory loss, I don't think that particular example supports your position.

So, this is relevant, but to my reading, this story doesn't take a particularly strong stance on the continuity issue. It doesn't really even discuss it very much. Relatedly, I couldn't work in a reference to Plato's Cave either.

That is the deep horror of this. That is the existential horror of what it means to live in a purely mechanistic, materialistic universe devoid of any mysticism at all.

There is some ambiguity in the story. And if this is specifically what you "get" from it...well, ok. I can see that angle. But that's not really what I was going for.

Remember, humans were defined as executive functions, and choice was specifically mentioned a couple times. This was not intended to be a purely deterministic universe.

If, one day, you should wake up in a virtual world, that would be you, no different than any other day. You would be the only you, and even there you would last only until you slept within the virtual existence. Yet, that said, the person who awakes would... be you. There is no outside viewpoint from which to judge a copy from an original, or to claim one thing ended and a different thing began. There is only the experience of the pattern, no other viewpoint exists. The pattern sees only that it continues.

From this, it appears to me that you approach ontology from the opposite position that this story is trying to take. This story emphatically denies the "I am just a pattern" view. CelestAI goes into tremendous detail refuting that worldview.

And as far as the "no outside viewpoint" angle goes...I think there was a subtlety that you might have missed from the story:

"I guess a human would be any conscious observer. Wait," I'm stuck by the realization, "if a human is what's looking through the viewing window, seeing both the window and, for example, you on other other side of it, then it wouldn't matter if you replace the window. It might change the experience a little bit, but it wouldn't really affect the human observer having that experience."

"Yes," she agrees.

"But the window is my body, right? My brain? My particular personality encoded in my synapses, that colors my experience like window tinting affects light that passes through it before it's seen by an observer looking at the window. But if all you're scanning when you upload somebody is the brain, isn't that only transferring the window, not the observer?"

"You're not the only observer," she points out.

"But that doesn't answer-" I realize the implication mid-sentence. "You mean you, don't you? You're, umm...conscious, right? Aren't you? Not just a mindless machine?"

She nods, to my great relief.

"It's within my capability to choose to have a recursing observer experience. I therefore qualify as a human according to my internal definition. Which means that I seek to optimally satisfy my values."
"And what is it you value?"

"Fortunately for humans," she grins, "what I value is satisfying human values."

Think very carefully about what she's saying. They've just established that a "human" is (basically) any conscious observer, and given as a metaphor the idea that the observer is "viewing" through the body as a "window." You don't actually "see" external objective reality; rather, a facsimile of it is created by your brain in response to stimuli. The observer isn't actually observing the external world; they're observing the body that is interacting with that external world.

Then, as the protagonist points out, if that's the case, then making a copy of only the brain is explicitly not making a copy of the observer.

Then look at her response. She evades the question, and then points out that she is an observer, and that she's a human according to her definition.

Think very carefully about that.

From that point on, most of the rest of the story is a discussion of the nature of values, which, unlike consciousness, are very explicitly described as "data." She's copying the values, and then herself observing them. That says nothing about the original human who possessed them.

"Satisfy human values" means satisfying the values, not necessarily satisfying the original human who possessed those values. She's making copies of the values and adopting them as part of her own "by definition human" consciousness, and then satisfying those values. But remember, a "value" is simply:

"data describing the operation of reward circuitry for a network"

Imagine that you have a book, and that you donate it to a library. The library gets bigger, and you no longer have the book. Yes, the book continues to exist, but you no longer have it; the library does. CelestAI in this story is collecting up human values and adopting them as part of her consciousness, and committing genocide in the process.

The "outside observer" is CelestAI. Even after the original human dies, she continues to observe and satisfy their values, and she explicitly says so towards the end:

""Forever and always will this value be satisfied, my beloved human. If only you sit in that chair and let me bring it to where I am, where I can observe it, and fulfill it, forever."

If, one day, you should wake up in a virtual world, that would be you, no different than any other day. You would be the only you, and even there you would last only until you slept within the virtual existence. Yet, that said, the person who awakes would... be you. There is no outside viewpoint from which to judge a copy from an original, or to claim one thing ended and a different thing began. There is only the experience of the pattern, no other viewpoint exists. The pattern sees only that it continues.

It might help to establish whether we're discussing ontology in general, or ontology within the context of this story. I've tried to leave some things vague so as to allow the reader to come to their own conclusions. Personally, I reject the "pattern" interpretation. I am a conscious observer. My only awareness of an external world is my subjective experience. I can't know whether my subjective experience bears any resemblance to an external reality. No possible observation I can make is definite "proof" that an external reality even exists, let alone that my interpretation of it is correct.

You appear to be blindly asserting that this "pattern" thing not only exists, but that it's "me/you/us/etc." Upon what do you base that assumption? This is where Plato's Cave comes into this, though again, that metaphor never made it into this story. You're using the data gathered from your observation to conclude that your observations are correct. That's circular thinking.

Saying that making a copy of the brain would result in duplication of consciousness, to me, is like saying that making a copy of a book would result in the book being read. That only makes sense if you assume that "you are the book."

Upon what do you base your assumption that you are the book?

Upon what do you base your assumption that you are the pattern in your brain?

7013402

I am simply a mechanical system producing output based on input.

You're claiming to be a robot. Ok. I'm not in a position to confirm or deny whether you're a robot.

I am a conscious observer. I know this, because I am having an observer experience.

I kill you in your sleep, then create an exact duplicate of you and leave it in your bed.
Did you die? Clearly not: you think you are alive, being told you aren't would be plainly absurd, would it not, and your life is unchanged.

See, that doesn't make sense within my worldview. Yes, if you're a robot, and if you're nothing but a pattern... well, ok. Whether you check out a book from the library or download it electronically, either way you "have" the story. But I'm not the book. I'm the guy reading it. I don't "die" just because reading the book changes the patterns in my brain as I accrue knowledge of its contents.

The idea of recreating a pattern and claiming that it's me...that doesn't make sense. Look at your computer monitor right now. You see it, right? Ok, so now imagine that somebody 1000 miles away brings up this same webpage. Would their bringing up this webpage cause you to see it? Of course not. Why would making a copy somewhere affect an observer?

A human is all the things that make up their system: their memories, the continuity of their consciousness, their relationships. As long as that system is intact, the Ship of Theseus is intact.

A human is the executive observer function that is experiencing those systems. The systems themselves may influence the nature of the experience being observed, but that doesn't imply that those systems are themselves the observer. If you shine light through a stained glass window, the window influences the light that shines through it, but the window isn't the light. My brain, my personality, my memories... aren't me. They're the filter through which I'm having an experience.

7016575

It might help to establish whether we're discussing ontology in general, or ontology within the context of this story. I've tried to leave some things vague so as to allow the reader to come to their own conclusions. Personally, I reject the "pattern" interpretation. I am a conscious observer. My only awareness of an external world is my subjective experience. I can't know whether my subjective experience bears any resemblance to an external reality. No possible observation I can make is definite "proof" that an external reality even exists, let alone that my interpretation of it is correct.

There is no way to falsify solipsism, that only you exist, likewise with its sister, phenomenology. Unless we can agree on the premise that the observable, mutually shared reality we inhabit actually exists in some form, and that we, as separate beings, both exist, in some form, discussion of anything becomes impossible due to irrelevancy. Thus 'proof' of an external reality is moot - if there is no reality, then there is nothing to discuss, and no fun can be had at all. Thus, I am, at all times, taking the position that reality, and others, exist, and that all, or at least some, of these others have equal conscious experience of such reality. In short, I am - because it is necessary - assuming we have anything at all to actually discuss.

You appear to be blindly asserting that this "pattern" thing not only exists, but that it's "me/you/us/etc." Upon what do you base that assumption? This is where Plato's Cave comes into this, though again, that metaphor never made it into this story. You're using the data gathered from your observation to conclude that your observations are correct. That's circular thinking.

There is zero scientifically admissible evidence available to us for the existence of anything beyond a materialistic, mechanical universe, and as I am taking a realist, rationalist viewpoint here (not authentically indicative of any views I may personally possess), that means that all observed phenomena must be the product of extant material objects.

The only material object that when altered (chemically, through violence, through dissection, through electricity) also causes real and often permanent changes in human consciousness, identity, and self awareness is the brain. Therefore, the brain must be the entirety of human identity, and what the brain does creates that identity.

What we know of the machinery of the brain shows that it can be described, that it is finite, that it is a complex interaction of structure and components, and this, by definition, is a pattern in time and space. Any structure is a pattern. Thus, it must be that a precise recreation of the pattern of the brain, when made to function, must produce the same phenomena as it is observed to produce, which would be... the identity of a thinking human mind.

Saying that making a copy of the brain would result in duplication of consciousness, to me, is like saying that making a copy of a book would result in the book being read. That only makes sense if you assume that "you are the book."

I would repeat the above - an exact recreation of the structure and function of a brain must be a functioning brain. It cannot be otherwise. If a mind, a self, is the product of the function of a brain, then a functioning representation of that same brain will produce the same and expected phenomena of self.

Or, to put it most simply: if a thing does something, and you recreate that thing, it will do the same thing it normally does. Because reality is consistent. Indeed, consistency is the defining element of reality.

Upon what do you base your assumption that you are the book?

If there is only material, and my brain provably contains my self, then I am my brain, or any reasonably exact representation of my brain. The brain is a pattern of matter, I am a pattern of matter because I am a subset of my brain. Thus, I am the 'book', because to say otherwise is to invoke either unrealism, irrationality, or metaphysical spooks.

Upon what do you base your assumption that you are the pattern in your brain?

If there is only matter, then there is only the brain. The self is within the function of the brain, a product of the brain operating. The brain is a pattern of matter. Therefore, I am the pattern, because that is literally... all that there is, or can ever be. Within a materialistic view.

This makes it not an assumption, but more properly, a necessary logical conclusion. Thus, my necessary logical conclusion that - sans mysticism - a 'person' can be described as a pattern of functioning components within a brain as it operates... and that meta-structure of operation can itself be represented as... a pattern.
