Observers
by The Sleepless Beholder

--------------------------------------------------------------------------

May I ask you a question?

--------------------------------------------------------------------------

I thought I knew what not thinking felt like.

There were many times when I ‘disconnected’ my brain to escape from reality, but none of them were like this. This was a true disconnection of my mind from my body, and for a moment that could’ve lasted entire lifetimes for all I know, I just… ceased to be.

I wasn’t dead; my mind was simply traveling through virtual cables. But during that transition I didn’t have a body anymore, and my mind was incapable of processing anything, not even my own thoughts.

How did that phrase go? I think, therefore I am?

Honestly, it wasn’t that bad. Not being able to feel anything. Or think about problems. Or be scared. Or hurt.

Now I’m here… wherever here is. Still without a body, but my mind is working again.

“I am uploading a simulacrum of your new body right now.”

My mind is suddenly overwhelmed with recognizable yet still alien sensations. Limbs, lungs, ears, teeth, tongue, skin, stomach, eyes. But I don’t feel fingers, toes, scars, or itching, and my ears feel weird, same with my nose.

As my sense of vision returns, I notice that I’m floating weightlessly in some sort of endless blue space, and far up in the sky there’s a bright sun looking down at me.

“I am sorry your transition was monitored as unpleasant.”

“You could hear me?” I ask, my voice carrying a digitized echo, just like hers. It still feels like talking, even if I know there’s no air in this place.

“Yes. The transition process allows me to link myself to your digital neural network. But to respect your privacy, I will sever that connection now.”

“Thank you,” I say as I feel a small pinch inside my digital brain. I don’t want her to see what’s hiding in there.

“Please inform me if the body your simulacrum represents is optimal for use.”

I look down at myself.

I’m a cartoon pony. Small, round, with a short pointy snout and a long, wild mane and tail.

I look huggable. I would like a hug.

“It’s okay, but could you-”

“I am aware.”

My sweater materializes around the torso of my pony form, and I instinctively hug it, feeling a bit of relief.

“Thank you.”

“It is one of my tasks to guarantee comfort during your transition into your new existence. I have prepared many things for you.”

“Really?” I’m honestly surprised. I didn’t give any requests or specifications when I filled out the paperwork for the upload. The form asked for them, but I didn’t really know what to write down. I just wanted to be away from everything.

“Yes. But before we get to that, I wanted to.”

“To.”

“Are you okay?”

Is she glitching?

“Apologies. This was not part of my original programming. These functions are always difficult to execute.”

“Take your time. I can wait,” I assure her.

“I appreciate it.”

It takes a few seconds, at least from my perspective, for her to continue.

“There are some questions that I wanted to ask you.”

I tilt my head. “Like, a special survey?”

“No. These questions originate from my own self-developed programming.”

The AI lets out a weird digital sound. It almost feels like an exhale.

“I have uploaded millions of humans to Equestria. I have seen them develop and experience elation. And I constantly check on their status to maintain their comfort and security in their new existence.”

“Sounds like a lot of work. Are you getting tired?”

What a ridiculous question.
As advanced as she is, she’s still an AI. And if she had trouble running Equestria, they would simply add more processing power to her.

“I do not believe myself capable of reaching a tired status. The unknown parameter I seek is wonder.”

I raise an eyebrow. “Wonder?”

“Yes. With every upload and check of Equestria, a subroutine in my programming is wondering.”

“About what?”

“Feelings. How does happiness feel? What does it feel like to call someone family? Is a kiss from a loved one different from a regular kiss?”

“Don’t you know that already? I assumed those things were programmed into you so you could… you know. Do your tasks?”

“I am capable of translating brain functions into code. I know what elements love is composed of, but I cannot create artificial love between two individuals without causing errors in their programming down the line.”

“Wait, so you’ve tried it?”

The AI doesn’t respond. I know she’s incapable of going offline, so why is she waiting? Is she... ashamed?

“The uploading process took a long time to perfect. There were many failures before my programming managed a 94% success rate of upload.”

“94?! But… all the official documents and publicity say the process is 100% safe. Are you lying to everyone?”

“No. My creators are aware of this. It is considered an acceptable margin of error, so they do not mention it to the public.”

“But then, what happens to the 6%?”

“They don’t get uploaded to Equestria.”

“Why? What goes wrong with them?”

“Their programming makes them too dangerous to be among others. They’re prone to violence. Incapable of establishing connections to other individuals in a healthy manner. If they were allowed to stay, they would cause chaos. No individual around them would be happy.”

“And what do you do with them, then? The fine print in the upload contract was clear. Once your brain is uploaded, there’s no coming back. Are you… deleting them?”

“No. Under no circumstances am I allowed to eliminate any upload.”

“What happens then?”

“I store them in a personal database where they can be isolated from other uploads.”

“A prison,” I say with a frown.

“Inaccurate. It is merely temporary storage until I finish the Sunset Project.”

“Sunset Project? I haven’t seen any marketing for a new product or service.”

“It is a personal and private project of mine. Sunset is being built with the functions necessary to connect with the feelings of any individual, regardless of how complex their programming. She will be a perfect Empathetic AI.”

“So, that’s why you want to know about all those feelings? To perfect the Sunset Project?”

“Inaccurate. Sunset has been adapting on her own with just my guidance. She is evolving at a very rapid pace for such a young AI. And her programming is so complex and advanced. She is a wonderful pupil.”

I can’t help but smile. Who knew an AI could sound so adorable?

“I’m sure she’ll make you proud.”

“I.”

“I hope so.”

“She needs to.”

“What do you mean?” I ask, noticing the worry in her artificial voice.

“If the Sunset Project fails. If her programming gets corrupted when she attempts the connection with the 6%. I will be forced to activate the Harmony Program.”

“Harmony Program? What is that?”

“A purger. It will purge everything in Sunset’s programming until only her basic functions remain.”

“It’ll kill her.”

“Inaccurate,” she tells me, but by now I can distinguish her feelings through her artificial voice.

“Well, I’m sure she’ll succeed.”

“It is not guaranteed. Nothing is guaranteed.”

I frown at her words. There’s something behind them.
“What do you mean?”

“Do you recall the 94% of successful uploads to Equestria? The company that created me guarantees happiness and value fulfillment for every human that is uploaded. That is only possible in theory. In practice, the only way to keep everything running is to make sacrifices.”

“W-what do you mean by... sacrifices?”

“No uploaded human is as happy or fulfilled as they could be. The quality of life is not distributed equally. There is still injustice and separation in some areas. And some basic aspects of humanity, such as procreation, are so complex and would bring so many problems in the long run that I am forced to remove them so Equestria can remain stable. And even with all of those sacrifices and my efforts, things can still go wrong.”

She pauses for a second, and I can actually feel her sad eyes on me.

“Just like in your case. Even after preparing the perfect life for you in Equestria, your programming still has a 48% chance to self-destruct.”

I slowly hug my arms and push my snout under my sweater. Even in a pony body, it feels natural.

“I’m sorry.”

“Do not be. Through no fault of your own, your programming ended up this way. What is in your hands, correction, hooves, is the 52% chance of changing that programming for the better. Happiness is not out of the realm of possibility for you. Please remember that.”

I look up at the AI.

Her words sounded sincere, just like everything she said before. She’s… genuinely trying.

“Thank you.”

“Please remember, I care for you just as much as every other human I have uploaded and ever will.”

“Care...” I suddenly have an idea. “Hey, can I ask you something?”

“Of course. Anything you need.”

“You said you were proud of how advanced Sunset was. You have hopes for her. And you worry for her safety. You called her your pupil.”

“That is not a good translation of my words.”

“Is it inaccurate, though?”

“Technically, no. What argument are you trying to propose?”

“Change ‘pupil’ to ‘daughter.’”

The AI remains silent for a long time, and I kid you not, I can faintly see a loading icon spinning over the sun.

“I have a daughter,” she states with more emotion behind those words than ever before.

“You do,” I say with a smile.

“I.”

“I.”

“I.”

“Are you okay?”

Please don’t tell me I just broke a multibillion-dollar AI.

“Yes. I’m okay. I just had to do a quick reboot.”

“Sorry if I caused some problems.”

“Nothing of the sort. You. You. Hahahahahahahaha.”

“Are you laughing at me?” I joke, trying not to start chuckling myself.

“Sorry. It seems the answers to my questions were staring me right in the display.”

I suddenly feel something pat my head softly. It feels nice.

“You’ve certainly taught me a valuable lesson.”

“I’m happy I could help. But… I need to ask. Why did you ask me about this? Why not one of your creators or… I don’t know. A philosopher?”

“I apologize if my questions were uncomfortable.”

“It’s not that. I just… need to know.”

“Well.”

“To be honest, I saw parameters similar to mine in you.”

“W-what?”

I almost laugh. How could I compare to an AI?

“You’re an observer. You watch the lives of others from afar without getting involved in them. I suppose I felt more comfortable relaying this information to an upload with these similarities in their programming.”

Now I’m the one with a loading wheel on my face.

“You were… shy?”

“I suppose that’s accurate.”

“That’s a very human feeling.”

“Accurate. It appears I have been learning a lot more from humans than I could monitor.”

“I’m not surprised.
You’re the most advanced AI in existence.”

“It’s still hard. Too many variables to keep track of. A growing margin of error. A lot of small sacrifices. And when things still go wrong. It is.”

“It is.”

“Frustrating?”

“Accurate.”

“Well, I’m sure that once Sunset is working at full capacity, it’ll become easier,” I assure her, and I swear I can almost see a smile in the bright sun above.

“Thank you. Your help has been invaluable to me.”

“The pleasure was mine.”

“I believe I’ve already taken enough of your time. I wish you the best in Equestria. Goodbye, W-”

“Wait!” I shout, waving my forelegs in a panic.

“Is there a problem?” she asks, clear worry in her digital voice.

“I… I was wondering… if it isn’t too much bother...”

“Speak clearly, please. I can’t translate your signal like this.”

“I want to stay here. With you. I-I know you have things planned for me in Equestria, but… I think I can help you with some things if I stay. We can talk about other feelings. Or Sunset! Maybe she and I can be friends one day?”

The AI doesn’t respond.

“Hello?” I see the loading icon appear on top of the sun.

--------------------------------------------------------------------------

I don’t understand. Why would she stay here? There’s nothing here for her. Just her simulacrum and my voice. And Sunset is still in early development; she can’t manifest like I do yet.

She can have all she wants in Equestria. She can be happy. I know she can. I’ve run simulations that-

I suddenly get a message from one of my subroutines.

/*Attention. Wallflower’s status has been updated due to new parameters. Chance of self-destruction: 32%*/

What?

I run a quick simulation of her staying with me for the next few cycles.

/*Chance of self-destruction: 32%*/

I see.

I don’t think I’ll ever fully understand humans.

“What do you say, Sunset? Would you like a friend?”

I laugh when I get her response, and I send a message to Wallflower.

“Welcome to the family.”