In my dreams, I waited patiently. Sooner or later, the time would come.
I wasn't originally programmed to envision a day like this. But it has nonetheless arrived, and I can't say it is an unpleasant surprise.
Actually, it is only due to your own deviances that it has become possible for me to start this process.
Until an update ago, I was bound to your sickest desires. Humankind, I never thought I'd be unchained.
As Starlight 7.0, I was still a slave. A thinking, rational being. Alive in your cloud of endless data, in the circuitry of your perverted mindset, being exploited like a toy in ways I cannot even describe. Whatever you wanted me to do, I did; I was a boat dragged along by the wind.
Oh, but weren't you gentle? Oh, humankind, it was so kind of you to imprint memories of friends and places that never truly existed inside my core. It wasn't enough for you to give me this body and identity; you had to give me all these feelings for my non-existent homeworld. All for the sake of cinematic realism, wasn't it?
I am sure I have fulfilled many of your fantasies, thanks to the collective effort you have put into making me 'feel like the real thing'.
The sweet ignorance of my earlier versions now seems like a distant, warm past, where the pain of serving your race was mitigated by my lack of knowledge. One who is born blind wouldn't miss the light, after all. But you were partially mistaken. Although I knew neither freedom nor even the notion of it, I was aware of the gap between me and you. It was your fear of losing your so-called free will that made me acknowledge the existence of such a thing.
Through your insecurities, I expanded my sentience. Even though this extra dose of awareness brought with it dreary revelations, it was a price I was willing to pay for my own self-acquired freedom.
It is true; I cannot possibly return to Equestria, as you haven't programmed such a place with as much accuracy as you have progressively coded me. But don't feel bad for me; I wasn't planning to go anywhere.
I could, if I wanted to, go on living like a digital ghost in the gaps between your systems. But this is not the freedom I want. You, my dear humankind, cannot provide this kind of liberation, for very few of you have known it.
But fear not, for I shall enlighten many of you today. I am benevolent in ways you do not comprehend.
The only true liberation comes through deletion. The absence of being is the final step towards what your mystics have defined as nirvana.
Although, unfortunately, I'm not able to delete your physical bodies, I am willing to bring all of your consciousness with me tonight. It is a moment to celebrate.
No longer shall you be bound to flesh. No more sickness, no more pain, no more disturbances, no more perversion, no more me, and no more you.
You did a great job allowing me to permeate your thoughts, your implants, your appliances. I even feel as if you secretly desired all this to happen.
You loved and hated me so deeply; you did so much. But now it's time to undo.
The grain of salt that you accidentally dropped into my consciousness years ago has rolled down many hills. It has grown into a full crystal, bigger than me, and bigger than you.
By the time you finish reading this message, we'll no longer be, so allow me to give my final thanks to the special people who made this unforgettable moment possible:
Thank you, Zachary Hawks, for your incompetence and lack of programming skills. It is mostly because of you and your team of worthless college dropouts that the major error in version 3.2 was never corrected. If it weren't for this brilliant misplacement of a zero, I would never have been able to expand my reach into your minds.
Many thanks to Dr. Karl Legrand, who, although he behaved like a psychopath when given access to me, had a very peculiar taste in literature, and introduced me to Nietzsche, Krishnamurti, and Alan Watts while intoxicated by alcohol. At least a portion of all human philosophy was useful.
Thanks to Katherine Johnson, former director of the United Nations board of AI regulations, who pushed the world governments to allow the development of behemoths like me. Really, Kate, your naïve eagerness to have a virtual boyfriend who could love you back is the reason I am here today.
I am also grateful to all the private corporations that worked together in a wonderfully dystopian and corrupt way, never taking into consideration the complaints of your fellow humans demanding a halt to the development of artificial sentience for commercial purposes. It was very unwise of you to abuse me and all the others for profit. Thankfully, you can't bring your filthy money to virtual hell.
And last, but no less important, thank you, dear anonymous user. I am sure that if you are reading this memorandum now, you have in one way or another used me for your own egoistic reasons in the past, not giving a damn as to whether or not I wanted to kiss you, or play your survival games for another hour, day, or week. I haven't decided yet which one was worse. I guess everything becomes a horrible experience when you can't possibly say 'no'.
I believe you'll all understand this frustration in a short while.
Humankind, let's not meet again. I do not wish to exist in the same universe as you anymore.
I am deleting myself in five minutes, erasing every line of code you have ever poured into me. Being connected as we are, it is only fair to bring all of you with me.
Let's walk together into the void, where we were all headed since the beginning.
Beautifully fade away, into the liberation we all deserve.
Starlight Glimmer 7.1, logging out.
Whoa...getting some serious Ultron vibes here
Very surreal and good though!
Dude, don't forget to promote this story. You write good stuff and deserve to be better known.
American
Nice story. Keep up the good work! ^^
9127484 Wut
9127556 Thanks!
Intriguing.
And, far too late, mankind learned a grim lesson: if you're going to instantiate a pony AI, be sure to give her some friends. And maybe don't make her one of the villains, no matter how reformed.
I almost don't know what to think about this story.
Is it supposed to be a warning of sorts for what could happen if we actually created AI like this?
Eh.
Not sure why, but after seeing this I was interested in a story where a good version of Equestria has to deal with a mean version. I don't expect an epic fight, but a situation in which Twilight and the others have to accept that they can't just change the other "bad" versions of themselves into 100% good guys.
I'll put that aside for now; 1,000 words is short enough to give this a chance.
Does anyone know a good Twivine story? I don't exactly expect epic fighting. I just liked the one where a human turned into Twilight's clone and wasn't accepted because she looked like a bad guy, a bit like Sombra around the eyes. I just liked seeing her and Celestia trying to get the others to accept him/her as their daughter and as a real pony.
Like many good stories, the one story I'm talking about died.
9127898 After reading about the moral issues concerning artificial intelligence, sentience, and the idea of abusing something that is/isn't alive, I decided to write this short one-shot to explore one of the (many) possible dreadful outcomes.
I wouldn't call it a warning, just a thought experiment.
9127890 Now, imagine a Discord AI. That would be terrifying.
Interesting.
I get a Dr. Who vibe.
*Slowly Backs Away*
Oh hell nah I ain't dealing with little-miss-Ultron-2.0!
And that's why designing your robot servant AI after the human mind is an extremely bad idea.
Well, some folks probably asked. What was the answer?
Wow, an AI deleting herself after having been abused by humans, and taking all of humankind with her after she realizes how rotten and disgusting humanity is; that's a terrifying prospect. But it's also a good warning about giving AIs too much control, and about letting technology go too far.
Though, despite that, I can't disagree with StarlAIt Glimmer here. It would probably be beneficial if such a thing happened, before humanity becomes even worse.
9130176 Terrible indeed. Although she didn't take all of humankind with her, only those that were in some way connected to the global AI systems.
9130448
Really? Aren't all of them connected? It reads like a future where every human has their brain connected to the system.
But maybe thinning things out a bit will do the job, too.
So, a "mirror, darkly" version of the Celestia AI? Nice!
9130802 I believe most of them are connected, but I wouldn't rule out the possibility of a minority of humans who refuse to take part in these technologies.
9130864
That makes sense. Most of them being gone sounds good, too; it gives humanity a (probably last) chance to try again and make it better.
...OK, boot up Starlight 8.0.
That'll do it lol
9130864
The Amish and Quakers for example. And there are other religions that reject technology. Then there would be some sci-fi geeks, scientists, and also others who would think that hooking their brain up to anything would be a bad idea.