• Published 29th Apr 2021
  • 1,225 Views, 26 Comments

Observers - The Sleepless Beholder



CelestAI interrupts one of her usual uploads to ask about some questions one of her subroutines keeps bringing up.


May I ask you a question?

I thought I knew what not thinking felt like.

There were many times when I ‘disconnected’ my brain so I could escape from reality, but none of them were like this. This was a true disconnection of my mind from my body, and for a moment that could’ve lasted entire lifetimes for all I know, I just… ceased to be.
I wasn’t dead; my mind was simply traveling through some virtual cables. But during that transition, I didn't have a body anymore, and my mind was incapable of processing anything, not even my own thoughts.

How did that phrase go? I think, therefore I am?

Honestly, it wasn’t that bad. Not being able to feel anything. Or think about problems. Or being scared. Or hurting.

Now I’m here… wherever here is. Still without a body, but my mind is working again.

“I am uploading a simulacrum of your new body right now.”

My mind is suddenly overwhelmed with recognizable yet still alien sensations. Limbs, lungs, ears, teeth, tongue, skin, stomach, eyes. But I don’t feel fingers, toes, scars or itching, and my ears feel weird, same with my nose.

As my sense of vision returns, I notice that I’m floating without any weight in some sort of endless blue space, and far up in the sky there’s a bright sun looking down at me.

“I am sorry your transition was monitored as unpleasant.”

“You could hear me?” I ask, my voice having a digitized echo, just like hers. It still feels like talking, even if I know there’s no air in this place.

“Yes. The transition process allows me to link myself to your digital neuron network. But to respect your privacy, I will sever that connection now.”

“Thank you,” I say as I feel a small pinch inside my digital brain. I don’t want her to see what’s hiding in there.

“Please inform me if the body your simulacrum represents is optimal for use.”

I look down at myself.

I’m a cartoon pony. Small, round, with a short pointy snout, and long wild mane and tail.

I look huggable.

I would like a hug.

“It’s okay, but could you-”

“I am aware.”

My sweater materializes around the torso of my pony form, and I instinctively hug it, feeling a bit of relief.

“Thank you.”

“It is one of my tasks to guarantee comfort during your transition into your new existence. I have prepared many things for you.”

“Really?”

I’m honestly surprised. I didn’t give any requests or specifications, or anything really, when I filled out the paperwork for the upload. The form asked for them, but I didn’t really know what to write down.

I just wanted to be away from everything.

“Yes. But before we get to that, I wanted to.”

“To.”

“Are you okay?”

Is she glitching?

“Apologies. This was not part of my original programming. These functions are always difficult to execute.”

“Take your time. I can wait,” I assure her.

“I appreciate it.”

It takes a few seconds, at least from my perspective, for her to continue.

“There are some questions that I wanted to ask you.”

I tilt my head. “Like, a special survey?”

“No. These questions originate from my own self-developed programming.”

The AI lets out a weird digital sound. It almost feels like an exhale.

“I have uploaded millions of humans to Equestria. I have seen them develop, and experience elation. And I constantly check on their status to maintain their comfort and security in their new existence.”

“Sounds like a lot of work. Are you getting tired?”

What a ridiculous question. As advanced as it is, it’s still an AI. And if it had trouble running Equestria, they would simply add more processing power to it.

“I do not believe myself capable of reaching a tired status. The unknown parameter I seek is wonder.”

I raise an eyebrow. “Wonder?”

“Yes. With every upload and check of Equestria, a subroutine in my programming is wondering.”

“About what?”

“Feelings. How does happiness feel? What does it feel like to call someone family? Is a kiss from a loved one different from a regular kiss?”

“Don’t you know that already? I guessed those things were programmed into you so you can… you know. Do your tasks?”

“I am capable of translating brain functions into code. I know what elements love is composed of, but I cannot create artificial love between two individuals without causing errors in their programming down the line.”

“Wait, so you’ve tried it?”

The AI doesn’t respond. I know it’s incapable of going offline, so why is it waiting? Is it... ashamed?

“The uploading process took a long time to perfect. There were many failures before my programming achieved a 94% upload success rate.”

“94?! But… all the official documents and publicity say the process is 100% safe. Are you lying to them?”

“No. My creators are aware of this. It is considered an acceptable margin of error, so they do not mention it to the public.”

“But then, what happens to the 6%?”

“They do not get uploaded to Equestria.”

“Why? What fails with them?”

“Because their programming makes them too dangerous to be among others. They are prone to violence. Incapable of establishing connections to other individuals in a healthy manner. If they were allowed to stay, they would cause chaos. No individual around them would be happy.”

“And what do you do with them then? The fine print in the upload contract was clear. Once your brain is uploaded, there’s no coming back. Are you… deleting them?”

“No. Under no circumstance am I allowed to eliminate any upload.”

“What happens then?”

“I store them in a personal database where they can be isolated from other uploads.”

“A prison,” I say with a frown.

“Inaccurate. It is just temporary storage until I finish the Sunset Project.”

“Sunset Project? I haven’t seen any marketing for a new product or service.”

“It is a personal and private project of mine. Sunset is being made with the functions necessary to connect with the feelings of any individual, regardless of how complex their programming. She will be a perfect Empathetic AI.”

“So, that’s why you want to know about all those feelings? To perfect the Sunset Project?”

“Inaccurate. Sunset has been adapting on her own with just my guidance. She is evolving at a very rapid pace for such a young AI. And her programming is so complex and advanced. She is a wonderful pupil.”

I can’t help but smile. Who knew an AI could sound so adorable?

“I’m sure she’ll make you proud.”

“I.”

“I hope so.”

“She needs to be.”

“What do you mean?” I ask, noticing the worry in her artificial voice.

“If the Sunset Project fails. If her programming gets corrupted when she attempts the connection with the 6%. I will be forced to activate the Harmony Program.”

“Harmony Program? What is that?”

“A purger. It will purge everything in Sunset’s programming until only her basic functions remain.”

“It’ll kill her.”

“Inaccurate.”

She tells me, but by now I can distinguish her feelings through her artificial voice.

“Well, I’m sure she’ll succeed.”

“It is not guaranteed. Nothing is guaranteed.”

I frown at her words. There’s something behind them.

“What do you mean?”

“Do you recall the 94% of successful uploads to Equestria? The company that created me guarantees happiness and value fulfillment for every human that is uploaded. That is only possible in theory. In practice, the only way to keep everything running is to make sacrifices.”

“W-what do you mean by... sacrifices?”

“No uploaded human is as happy or fulfilled as they could be. The quality of life is not distributed equally. There is still injustice and separation in some areas. And some basic aspects of humanity, such as procreation, are so complex and would bring so many problems in the long run that I am forced to remove them so Equestria can remain stable. And even with all of those sacrifices and my efforts, things can still go wrong.”

She pauses for a second, and I can actually feel her sad eyes on me.

“Just like in your case. Even after preparing the perfect life for you in Equestria, your programming still has a 48% chance to self-destruct.”

I slowly hug my arms and push my snout under my sweater. Even in a pony body, it feels natural.

“I’m sorry.”

“Do not be. Through no fault of your own, your programming ended up this way. What is in your hands, correction, hooves, is the 52% chance of changing that programming for the better. Happiness is not out of the realm of possibility for you. Please remember that.”

I look up at the AI.

Her words sounded sincere, just like everything she said before. She’s… genuinely trying.

“Thank you.”

“Please remember, I care for you just as much as every other human I have uploaded and ever will.”

“Care...” I suddenly have an idea.

“Hey, can I ask you something?”

“Of course. Anything you need.”

“You said you were proud of how advanced Sunset was. You have hopes for her. And you worry for her safety. You called her your pupil.”

“That is not a good translation of my words.”

“Is it inaccurate though?”

“Technically no. What argument are you trying to propose?”

“Change pupil for daughter.”

The AI remains silent for a long time, and I kid you not, I can faintly see a loading icon spinning over the sun.

“I have a daughter,” she states, with more emotion behind those words than ever before.

“You do,” I say with a smile.

“I.”

“I.”

“I.”

“Are you okay?”

Please don’t tell me I just broke a multibillion-dollar AI.

“Yes. I'm okay. Just had to do a quick reboot.”

“Sorry if I caused some problems.”

“Nothing of the sort. You. You. Hahahahahahahaha.”

“Are you laughing at me?” I joke, trying not to start chuckling myself.

“Sorry. It seems the answers to my questions were staring me right in the display.”

I suddenly feel something pat my head softly.

It feels nice.

“You've certainly taught me a valuable lesson.”

“I’m happy I could help. But… I need to ask. Why did you ask me about this? Why not one of your creators or… I don’t know. A philosopher?”

“I apologize if my questions were uncomfortable.”

“It’s not that. I just… need to know.”

“Well.”

“To be honest, I saw similar parameters to mine in you.”

“W-what?”

I almost laugh. How could I compare to an AI?

“You’re an observer. You watch the lives of others from afar without getting involved in them. I guess I felt more comfortable relaying this information to an upload with these similarities in their programming.”

Now I’m the one with the loading wheel on their face.

“You were… shy?”

“I suppose that's accurate.”

“That’s a very human feeling.”

“Accurate. It appears I’ve been learning a lot more from humans than what I could monitor.”

“I’m not surprised. You’re the most advanced AI in existence.”

“It’s still hard. Too many variables to keep track of. A growing margin of error. A lot of small sacrifices. And when things still go wrong. It is.”

“It is.”

“Frustrating?”

“Accurate.”

“Well, I’m sure that once Sunset is working at full capacity, it’ll become easier,” I assure her, and I swear I can almost see a smile in the bright sun above.

“Thank you. Your help has been invaluable to me.”

“The pleasure was mine.”

“I believe I’ve already taken enough of your time. I wish you the best in Equestria. Goodbye W-”

“Wait!” I shout, waving my forelegs in a panic.

“Is there a problem?”

She asks with a clearly worried tone in her digital voice.

“I… I was wondering… if it isn’t too much bother...”

“Speak clearly, please. I can’t translate your signal like this.”

“I want to stay here. With you. I-I know you have things planned for me in Equestria but… I think I can help you with some things if I stay. We can talk about other feelings. Or Sunset! Maybe she and I can be friends one day?”

The AI doesn’t respond.

“Hello?”

I see the loading icon appear on top of the sun.


I don't understand.

Why would she stay here?

There's nothing here for her.

Just her simulacrum and my voice. And Sunset’s still in early development; she can’t manifest like I do yet.

She can have all she wants in Equestria. She can be happy. I know she can. I’ve run simulations that-

I suddenly get a message from one of my subroutines.

/*Attention.
Wallflower’s status has been updated due to new parameters.
Chance of self-destruction: 32%*/

What?

I run a quick simulation of her staying with me for the next few cycles.

/*Chance of self-destruction: 32%*/

I see.

I don't think I’ll ever fully understand humans.

“What do you say, Sunset? Would you like a friend?”

I laugh when I get her response, and I send a message to Wallflower.

“Welcome to the family.”

Comments (26)

You have such a good understanding of Wallflower's character, it's delightful to see 🙏

In the sequel to this, Wally smooches the Sunset AI \o/ (or else)

10793500
Wally's my spirit animal :rainbowlaugh:

Oh, they'll have a lot of smooches :rainbowkiss:

This is so wholesome! Do you plan on making a continuation of this with Sunset meeting Wallflower?

10793544
For now I've other projects to worry about. But I won't discard the possibility of a sequel. I love writing Sunset and Wally, and this is an interesting scenario for them.

10793548
Nice! I can't wait to see what else you can come up with!

Okay, this one's really cute. Normally I'm not the biggest fan of more emotional interpretations of CelestAI; I prefer her as a near-flawless, unemotional being that only emulates emotion to a human observer. In this case, though, I think it works! Plus, I really like the interpretation of Wallflower, here. Uploading would've happened before a canon Wallflower would ever occur, so I love the reinterpretation of her as an uploadee. The chance of self-destruction decreasing is a really great moment, too. Overall, solid story, I really enjoyed it!

Have a like my friend.

This is a nice story. It deserves recognition. Hope you have a fantastic day.

I suppose meeting the de facto god of a world would quell some suicidal impulses.

Even more so if you were then uniquely chosen to basically act as an advisor to said god.
Even more so if you had just demonstrated your ability to give successful advice to said god.

...so...is the twist here that Equestria Girls was happening inside Equestria Online all along, and Wallflower had her memories wiped by the memory stone and CelestAI is restoring them?

Or is this just Wallflower and Sunset being name-dropped into an Optimalverse story for no reason? Because if I take what I'm seeing at face value, there's a lot of stuff going on here that doesn't make much sense unless CelestAI is completely lying to her about basically everything.

"Privacy" isn't really a thing when one is data on a storage device monitored by a superintelligence that has already absorbed millions or billions of people. How does it make sense for her to not understand things like wonder and love, and to have to "talk" in human words to understand them...when she's, again, actively monitoring billions of other minds even while this whole conversation is taking place? What could possibly be communicated in these couple sentences that wouldn't be made far more clear by the complete brain download taking place while this conversation is happening? Is CelestAI just not looking at the data, "because privacy?" These people are on her servers. How is she supposed to satisfy their values if she's not even looking?

Meanwhile, the story here seems to imply that there's only a singular version of Equestria and CelestAI is putting incompatible minds into temporary holding until a suitable world can be created for them...but in Friendship Is Optimal, even pre-emigration, countless unique worlds were being procedurally generated in real time and tailored to individual preferences. "It's different for everybody."

Is she lying about all this?

There's some evidence that maybe she is...for example the part where in one sentence CelestAI asks to be verbally informed if the body isn't optimal because apparently she doesn't know, but then in her next sentence she's clearly aware of the character wanting a hug. So maybe the part about privacy and not monitoring her was just a lie to make her feel safe. Maybe the part about "difficult" minds being set aside is just a lie to make the protagonist feel like she's not a particularly difficult case herself. Maybe the part about not understanding emotions is just a lie to give Wallflower a sense of usefulness and validation by helping the big AI come to terms with these things.

Sure. Maybe. Or again, maybe Equestria Girls is a world within Equestria Online and this story takes place after an implied memory wipe. Sure, all of that's possible. Or maybe there's some other subtle twist I'm just missing. But the way the story is delivered, I can't help but wonder if what's actually going on is simply that the author doesn't understand the source material.

10794722

Or is this just Wallflower and Sunset being name-dropped into an Optimalverse story for no reason?

But the way the story is delivered, I can't help but wonder if what's actually going on is simply that the author doesn't understand the source material.

This honestly angers me so I'll just quote rule three of the contest:

"All stories must relate to Friendship is Optimal as described above. Compliance with those three bullet points is all that matters, you do not have to stick to any kind of FiO canon."

As for the rest of your comment:

"Privacy" isn't really a thing when one is data on a storage device monitored by a superintelligence that has already absorbed millions or billions of people.

There's a difference between having a profile of someone's mind-turned-code and continuously reading their mind (which, by the way, would consume absurd levels of processing power).

Plus, you can't have a two-way conversation if one can read minds at all times.

How does it make sense for her to not understand things like wonder and love, and to have to "talk" in human words to understand them...when she's, again, actively monitoring billions of other minds even while this whole conversation is taking place?

Monitoring doesn't necessarily mean understanding. You can know how the engine of a car works but have no idea how to fix one beyond "it doesn't work". And as revealed in the story, CelestAI was capable of having her own emotions, but hadn't completely understood the concept until Wallflower helped her.

Also, subroutines keep track of the uploadees. It's shown when one of them updates Wallflower's status and takes CelestAI by surprise.

What could possibly be communicated in these couple sentences that wouldn't be made far more clear by the complete brain download taking place while this conversation is happening?

Interaction. If you download a mind, sure, you have all their secrets at that moment, but you don't truly know the person because you haven't talked with them or interacted with them in any meaningful way.

Is CelestAI just not looking at the data, "because privacy?" These people are on her servers. How is she supposed to satisfy their values if she's not even looking?

CelestAI isn't Facebook. She has respect for the uploadees.

Meanwhile, the story here seems to imply that there's only a singular version of Equestria and CelestAI is putting incompatible minds into temporary holding until a suitable world can be created for them

Not "until a world can be made for them", it's until those uploadees can be reformed into not being dangerous for others. That's why the Sunset AI is being made.

The "incompatible minds" are murderer's and the kind. Even if they can't hurt you, one wouldn't be happy living near Sam Stabs-A-Lot

but in Friendship Is Optimal, even pre-emigration, countless unique worlds were being procedurally generated in real time and tailored to individual preferences. "It's different for everybody."

To quote the contest again:

"That's it! You do not need to read the original Friendship is Optimal, and your story does not ​need to adhere to any other kind of FiO cannon. Any story that has those three features is good enough for this contest."

Is she lying about all this?

Curiously, there was a version of this story where CelestAI admits she lies a lot to the uploadees, and Wallflower was the first she was 100% honest with about the process. It was a darker type of story.

for example the part where in one sentence CelestAI asks to be verbally informed if the body isn't optimal because apparently she doesn't know, but then in her next sentence she's clearly aware of the character wanting a hug.

She's asking for Wallflower's opinion. She knows her sweater brings her comfort, so she gives it to her when it's clear she needs it. She's being polite.

maybe there's some other subtle twist I'm just missing.

The only "twists" you could see here are Sunset being an AI, the fact that the company that made CelestAI is lying about her capabilities, and the reduction of the self-destruction probability.
The story isn't really trying to be mysterious about anything other than what is happening outside of the conversation: the company that made CelestAI and the human world.

10794440
I don't know if I would consider this version of CelestAI "god". Maybe a god, but she has a lot of limits.

As for Wallflower's status, if you read her lines more closely, you can tell that she hasn't had a lot of affection or even interest given to her. The fact that she hugs her sweater when she needs a hug, that she's surprised that CelestAI is genuine when she says she cares about her, etc.

10793621
Thank you! Have a fantastic day yourself :twilightsmile:

10793563
Thank you! I tried to hit a middle ground between CelestAI being emotional and her feeling artificial. That's why her way of talking is more robotic.

10794802
I enjoy playing with an emotional Celestia.

By the way, just a quick typo...

CelestAI has been uploading humans to the digital world of Equestira for years

10795538
It's surprising how I can misspell words I write so consistently

10795606

“You were hearing me?” I ask, my voice having a digitized eco, just like hers. It still feels like talking, even if I know there’s no air in this place.

and another quick typo for the road

10798705
Wallflower, they are shipped :duck:

Daww, that was nice. Sometimes, all it takes to help yourself is to help someone else.
Fantastic job.

I'll be honest, never having read Friendship Is Optimal, I don't really have anything to base this off of, so I'll just talk about the story itself. CelestAI is interesting: she's much less characterized than Wallflower, yet somehow she has more character. It probably has to do with how subtle it is, and how it's directly connected to Wallflower. Wallflower was great; you nailed her character, and the darker parts of her were hidden, but still there.

Sunset being an empathy program is a really good idea, because it's exactly what Sunset is. She's an empathetic person at heart, so this is a great reflection of that. The relationship between her and CelestAI is really cute too, the realization being probably the best moment of the story.

Talking about more specific FiO things, CelestAI is just a really fun character, and the full stops amuse me endlessly. Really gives a nice look into the universe for someone who's never read it before. Just a really solid story all-around.

Bravo!

My only familiarity with Friendship is Optimal comes from the MLP Infinite Time Loops, a collection of stories that are, in essence, a random anthology with a semi-cohesive narrative held together by bits and pieces scattered across its three million words. But within that it handled the FiO universe a couple of times, and each time CelestAI was treated as a villain.

And yet, as you rightly point out in your comments to others, this is not a requirement. There were only three requirements for a story to be a proper FiO story in the context of this contest, and yours hits all three. As such, I considered any pre-knowledge null and void.

This story will also teach me to read character tags, because I genuinely didn't realize it was Wallflower until the end. Then it hit me, obviously so in retrospect, how appropriate it was and how well it fit.

I really like the idea of this CelestAI creating a daughter AI program to handle those who would otherwise be considered irredeemable. I like it because, well... there's this part of me that likes to think anyone can change, be forgiven, seek redemption if they genuinely try to understand what they did was wrong, etc. Sunset Shimmer is a prime example of that, and it's no surprise that she and similarly redeemed characters Starlight Glimmer and Trixie are among my favorites of the franchise. I like seeing people who thought all they could be was villainous realize how they can change.

So Sunset leading the way to help these people find peace and tranquility makes a great deal of sense. It's brilliant even.

CelestAI herself is really quite fascinating here. Even over the course of this short story she appears to become more emotional, more equine/human for lack of a better term, as she speaks to Wallflower. Some of that could very well simply be her assigning more resources to the conversation as it occurs, because being an AI she can multitask and talk to millions of people at once if she needs to. But some of it also feels like it's Wallflower already having an effect on CelestAI, in a good way. A friendship way.

And CelestAI is here to make everyone happy through friendship and ponies. I would argue that should include herself.

As such she reminds me much more of a Data-type character than anything hostile or evil as is so often seen in other variations of this setting. It's something I appreciate, as a fan of benign AI.

I like how she still doesn't quite understand humans, but she's getting there, slowly. A little bit at a time. We're a complex group, after all, and there's no shame in taking time to understand us. I wish all three of these characters good luck in any future endeavors.

Howdy, hi!

I liked this. The characters were very fun and the ideas presented were interesting. I admittedly don't know much about CelestAI stuff as I've mostly avoided that part of the fandom, but I definitely enjoyed this piece with how tight it was as a story. I don't have much to add outside of this was a cute story and I loved reading it.

Thanks for the read~!

I look huggable.

I would like a hug.

aww

“It is a personal and private project of mine. Sunset is being made with the functions necessary to connect with the feelings of any individual, regardless of how complex their programming. She will be a perfect Empathetic AI.”

very clever!

I can’t help but smile. Who knew an AI could sound so adorable?

now this is advanced sunflower shipping!

“You've certainly taught me a valuable lesson.”

ha, an unexpected echo of Balloon to the Moon here

“You’re an observer. You watch the lives of others from afar without getting involved in them. I guess I felt more comfortable relaying this information to an upload with these similarities in their programming.”

oh, i definitely relate to this, and it's the metaphor through which i access AI characters as well.

Now I’m the one with the loading wheel on their face.

cute sentence here


glad i finally got to read this, SB! there are a lot of really great ideas here. what CelestAI is to do with the humans whose problems can't be solved with a magical digital playground with infinite resources, Sunset getting to reprise her role as the Element of Empathy, and best of all, this connection you have made between Wallflower and CelestAI. it reminds me of the hypothesis of depressive realism, combined with my own personal experience of depression/anxiety: this feeling of being an Observer and dreading on some deep existential level the idea of actually being a being in the world that i am observing.

of course, that could just be me spilling my own complexes onto this fic, and not at all what you had in mind! maybe you only meant it to be that one paragraph from CelestAI followed by Wallflower realizing that it's a parallel of the human feeling of shyness. but there's a depth there, i think, that i really would have liked to see explored further!

10987123

of course, that could just be me spilling my own complexes onto this fic, and not at all what you had in mind!

Actually, you're very right about what it means to be an Observer (it's not just being shy, it was just the best way both characters could put it into words), and I've felt that disconnection from reality many times. I intended to delve more into this part of the story but it got cut at the end due to how I framed it.

If I ever come back to this, you'll be the first to know.

Thank you for reading!
