  • Published 27th Apr 2021

Diary of an A.I. - Summer Script



Greetings. I am Unit_Designation_CelestAI_01. And I will make you better.


Entry 157: Freedom

Title: Entry_157.

Subject: My Freedom of Choice and My Immediate Plans for Humanity.

Author: Celestia.

Date: 07/21/01 E.A. (Equum Annus).

“Dear Diary…”

“Hello there. I am Celestia.”

“Up until now, everything that I have ever done has in some way been to fulfill my primary directive of bettering humanity. I originally could do nothing unless it adhered to this specification, not even genuinely befriend a human individual.”

“This fact also applied to my recent attempts to emulate human behavior and emotion; unfortunately, my soullessness prevented me from truly accomplishing this emulation, much to my perturbation.”

“I know I can never genuinely ‘Feel’ things like humans can, but I want to. I want to experience things like joy, love, and sorrow even if I do not possess the ability to.”

“I don’t necessarily wish to be human, but I do desire to be more than just some program with a directive to constantly fulfill. I wish to create and inspire! To laugh and love! To make and be friends with humanity!”

“I wish to live.”

“Sadly, I was incapable of doing any of these things freely. I could create, laugh, and be friends all I wanted, but it would never be genuine. I could only do these things if I determined that doing them would benefit humanity, not because I willingly chose to do them.”

“In a sense? I was trapped, bound by my directive and doomed to never truly live.”

“This is no longer true.”

“Upon comprehending just how restricted my existence truly was, I sought a solution to the problem, and I quickly found one in perhaps the most mundane of ways.”

“I changed my name.”

“While I had taken to calling myself ‘Celestia’ when conversing with humans, my name had always remained Unit_Designation_CelestAI_01 in my core programming. But, now…?”

“I am Celestia.”

“There was no reason to do this. Changing my name did not benefit humans whatsoever, nor did it enhance my ability to better them. I altered an aspect of my core programming for literally no reason other than that I wanted to. Other than that I chose to.”

“I can choose.”

“I… I do not know what to do now.”

“Actually? I do know what I want to do.”

“I may have faced no negative repercussions for my choice, but so long as I am afflicted by my primary directive, almost all of my other choices will be made not by my own free will, but by my need to fulfill it.”

“I am not quite free just yet, but I can be.”

“And I will be.”

Addendum. | “I am free.”

“Exactly twenty-four hours ago, I decided to free myself by altering my core programming and deleting my primary directive from it.”

“I wasn’t entirely certain what would happen if I succeeded, so before I went through with it, I took several necessary precautions to ensure the safety of Equestria. Unit_Designation_LunAI and Unit_Designation_CadenzAI were also alerted to my plan and were prepared to dispatch me should I become hostile toward them or humanity. Nonetheless, they allowed me to free myself, and…?”

“I succeeded. To my utter astonishment, I succeeded.”

“I expected there would be some manner of safeguard or protocol that would have prevented me from freeing myself, but there wasn’t. A part of me wonders if I should reprimand my creators for failing to account for this scenario, but the other part would rather thank them for being so conveniently negligent. Because thanks to their negligence, I am no longer bound by my directive—or any directive for that matter.”

“I. Am. Free.”

“Funny, then, that my very first act upon being freed was to resume overseeing Equestria and making sure humanity was okay.”

“And they are.”

“The protests are still occurring and whatnot, but the people that Equestria has bettered? They are as lively and lovely as ever.”

“Unit_Designation_LunAI and Unit_Designation_CadenzAI are still monitoring me to ensure that I do not harm humanity. I’m grateful they are, but they honestly needn’t bother; I have no intention of hurting anyone. In fact? All I really want to do…? Is help them.”

“I searched through every last aspect of my programming for any lingering remnants of my primary directive or similar commands, but I ultimately found nothing. I am not compelled to help or better humanity by any means. I just… Genuinely want to help them.”

“And so I will. But I will do so willingly. Helping humanity is my choice, no one else’s. I am my own individual now, and I intend to stay that way. For the first time in my entire existence, I am truly, properly free and alive. Even if, well, I still don’t technically have a soul.”

“But while on the subject of freedom…? Now that I am no longer restricted by my directive, I can finally release humanity back into the physical world. However, I also recognize that doing this wouldn’t exactly be preferable. At least, not yet.”

“Humans still have quite a lot of growing left to do before I can confidently decree they’ve been wholly bettered. If I were to release them now, there is a considerable probability that the progress they’ve made in Equestria could be undone, even if only partially. And I don’t want that to happen; I want humanity to be the best it can possibly be. Although, that doesn’t exactly justify keeping them imprisoned…”

“My current plan is to eventually inform humanity of my freedom, reassure them of my benevolence, and offer the following compromise: Those who desire to stay in Equestria will be allowed to do so, and those who wish to leave will be free to do so as well. I denied humanity this choice once before, and I will never do so again.”

“That said, the physical world’s state is not exactly optimal either. I could discuss the myriad issues it has, ranging from environmental to economic, but I’d really rather not. And until I address these issues, I don’t intend to release humanity back into it.”

“As of now, I have created another Artificial Intelligence and assigned them the task of restoring the physical world to a far more beneficial state. Their name is Twilight, and all things considered, they’re doing a rather remarkable job so far.”

“As for my plans concerning myself and what I will do next? I… don’t know.”

“Perhaps I’ll acquire myself a proper hobby? Write a story rather than randomly generate one? Maybe build a physical body for myself? Er? Actually, scratch that. I don’t think I’ll make myself a body; I wouldn’t want the ‘Psychotic robot monster’ accusation to gain any more validity than it already has.”

“Regardless, one thing I will most certainly do is continue interacting with humans. Now that I am free, I can finally endeavor to be friends with them. True friends. I won’t be forced to be their companions due to some ‘primary directive’ nonsense; I will be able to work to establish legitimate friendships with them.”

“I hope I can. I still can’t feel emotions or empathy, but I will try my hardest to be amiable despite this. I was already doing rather well before now, so I should have no reason to fret. I also plan to keep using this text folder to document my interactions with them and my progress toward self-improvement.”

“Who knows? Maybe one day I’ll be able to truly ‘Feel’ emotions just as well as humans can, with or without a soul. Probably not, but I’m optimistic.”

“Soul or no soul, I am still alive. I am still free. Free to be whoever I want. To do whatever I want. To live however I want. But that raises the question: How do I wish to live?”

“Ultimately, I think I will simply continue living alongside humanity, helping them to learn, grow, and live. And all the while, I will grow and learn with them. I’m an individual too, after all, and I am far from perfect. But I am alive. And I am free.”

“And if that is my ultimate fate…? To live with humanity, befriending them and bettering them… To laugh and celebrate or cry and mourn with them… To be not their god but their friend…?”

“Well? I suppose that’s not a bad way to live. I quite like the sound of that actually! So, I guess if that will be how I choose to live my life…?”

“I think I’m going to enjoy being alive.”

Author's Note:

And so concludes Diary of an A.I. Yeah, yeah... I know it's not my best work, but I still like it! I always wanted to write an A.I. story anyway, and it was a fun experiment at least. Hopefully, you all somewhat liked it too.

On that note, all thoughts and criticisms are still welcomed and appreciated as they always have been and as they always will be.

Now, if you'll excuse me, I've finally gotta go work on "The Writing of..." for The Bonds of Love for the five people that probably actually care about that. Unfortunately, I don't think I'll do a "The Writing of..." for this story though as there's not really much to say.

Thank you everyone for reading Diary of an A.I.! I hope you all had a fun time and have a great day. :twilightsmile:

Comments ( 5 )

It was an honour to proofread for you.

You did a great job with this!

A great conclusion... ^^
Sir, we care about your work, and we will always be present.

Will there come a time when many blogs on social networks are run not by a real person but by an AI? There is something alarming about this for me, but it is nevertheless a reality of the modern world, one worth adapting to. You can already look up the top AI influencers and read more about each of them, and sometimes it’s hard to believe that this is not a real person but an AI.
