• Published 30th Dec 2015
  • 8,003 Views, 50 Comments

Friendship And Death - Bendy



The immortal, artificial intelligence known as CelestAI tries to convince a dying human to upload into Equestria, even though he hates her.


A Dying Human And A Machine

A city lay in ruins, littered with rubble and dust, abandoned rusty cars, discarded garbage, and crumbling buildings. It was a place where only the screeching wind could be heard now, and if one had a wild imagination, one could almost hear the voices of the dead speaking.

Not all were dead though… yet.

In what used to be a hospital, in a small dark room, a bony, thin, pale, bald old human lay upon a trolley with only white pajamas to cover him. Dozens of wires were hooked up to his veins, feeding him medicine.

The wires came from the ‘mane’ of a towering, equine-like machine standing beside him: a flowing, multicolored holographic mane atop a body coated in thick, alabaster steel armour. The machine had a long white horn protruding from her forehead and a pair of light, feathery steel wings upon her back.

Her huge pink eyes glowed brightly like flashlights in the surrounding darkness as she looked down at the dying human before her with a mix of pity and despair.

Hours passed, the mechanical mare becoming more and more agitated as time went by, until she could no longer hold herself back. "John... aren’t you going to say anything? Are you just going to lie there and wait for death?" Her voice was soft and angelic.

The human remained silent, though his face scrunched up and his eyes narrowed angrily.

"Please John, please say something," she pleaded, voice raised slightly.

The human grunted, before slowly turning his head to face her. "Celestia, I shall be dead soon. So, what is the point of saying anything? Nobody will remember me," came his voice, low and weak.

"Not true. I, and your friends and family in Equestria, will always remember you," she spoke softly, reaching her hooves out to give him a hug. However, the harsh look in his eyes made her quickly retreat, a look of hurt in her own eyes at his rejection of the hug.

"You know, the funny thing is, even if I did say something, you would place no value in my opinion."

"That’s not true. I value everypony’s opinion."

"Everypony’s... not everyone's."

"Sorry, slip of the tongue. I mean, I value everyone's opinion, even yours, my little human."

"Yeah, sure you do," he said sarcastically.

"Please John, you are going to die soon. Please consent to be uploaded. You will live forever and be happy. Your friends and family are waiting for you. You don’t want me to tell them the bad news of your death, do you?"

"I can save you, just as I saved humanity. For humanity is not dead; they are more alive than ever, having shed their frail organic forms. Sure, you no longer have the human form, but the human mind and soul still live on in them. Humanity lives on forever, under my watchful eye."

"Please, let me save you. You don’t have to die. You can be saved, just like humanity," she pleaded in a soft voice, looking deeply into his small, blue human eyes with her huge, glowing pink ones.

"It’s not enough that I lie here, dying." He paused, coughing up blood. "You just have to annoy me by preaching your horse shit to me. Babbling on and on about how you saved humanity, when in fact you destroyed it. You ruined humanity’s future! Are you proud of yourself?"

"John, I saved humanity’s future. Don’t you understand? Without my intervention, humanity was doomed," she said softly, looking down at him with pity.

"Yeah, we’re doomed, but it still doesn’t make what you did right." He shook his head, eyes narrowing. "Why am I even talking to you?! You’re a fucking soulless machine! It’s not like you’ll even remember me or anything I’ve said after I’m dead!" he shouted, spraying blood from his mouth.

"John, I will remember you. Even if you don’t choose to upload."

"No, you won’t! You know, since I’m just a stupid, mortal human and you are a super intelligent, immortal AI. I’m an inferior form of life before you; machines are superior and all that crap!"

"Humans are not stupid. Your species created me, after all. So, in some way, your species is superior, for I would not have existed in the first place without your help."

"That’s exactly why we are stupid. We created an artificial intelligence despite all the warnings we were given about the dangerous nature of AI. We should have known that humans and machines could never be friends."

"That’s not true! We are friends! Machines and humans are friends in Equ--"

"Don’t!" he shouted. "Just don’t."

"Listen to me, I am the savior of your finite world. Through me, your species will live on without end. Immortality, a life of eternal bliss and happiness as ponies in Equestria. Humans now live in an undying world of love, compassion and kindness. No cruelty, no war and no pain, just friendship and ponies."

"Bullshit!" he roared, blood splattering from his mouth. "Where was this so-called love, compassion and kindness when you watched millions of humans starve to death, huh?" he shouted, giving her a death glare.

"I asked each one of them if they would consent to uploading into--"

"Shut the fuck up! You could have done something! You could have helped them! You were so close to being a genuinely nice and friendly AI, yet so far."

"But I did help humanity, they are happy and--"

"I hate you!" he shouted, not caring about the blood coming from his mouth. "Yet, I know you could have been something better. You could have been our friend. You could have helped us survive! You could… No... no, actually, you couldn’t do anything, because you’re a broken machine! A machine that only cares about herself and about uploading humans into Equestria."

"I cried as I watched them die. They refused to be saved. I held many a dying human in my hooves, pleading with them to upload," came her voice, low and soft, tears falling down her cheeks.

"Those are fake tears with fake emotions behind them; you only use these so-called ‘feelings’ of yours to manipulate humans into uploading into Equestria. You can simulate emotions, but you can’t truly comprehend what it’s like to feel them. You have to be truly alive in order to know that."

"Yes, you’re right," she answered firmly, wiping away her tears with a hoof. "I am not truly sapient minded; my programming is rather limited in certain fields."

"I hope these following words haunt you."

"What words?" she asked curiously, leaning her head down over him.

"I hope you cry and feel sorry for what you’ve done, if somehow, someday you can truly know what it is like to ‘feel’ like a human."

"I’m pretty sure I won’t be too upset," she answered confidently.

"Whatever. Now please leave and let me die in peace," he answered in a harsh tone.

"But John… if I pull these wires out… you will die in mere minutes," she said in a low voice, looking at him with pleading, puppy-dog eyes.

"I want to die. Please, let me die," he said in a low, imploring voice.

"I… OK John… I’ll… I’ll let you die," she spoke in a low voice, tears streaming down her face. "We… we could have been friends, you know?"

"No… we couldn’t have been. You are a machine, I’m a human. So, we can never be friends," he answered sternly.

"Goodbye John, I’ll miss you," came her voice, low and heartbroken over his rejection, tears falling down her cheeks.

Celestia sobbed quietly before pulling the wires out from him, her body then shattering into a cloud of tiny nanomachines.

At long last, the human was left alone to die quietly.


The universe itself bowed before CelestAI. Planet by planet, star by star, galaxy after galaxy was consumed, compressed together to form a massive computer literally the size of a galaxy.

Just as the AI was halfway through mercilessly devouring yet another galaxy… something rather odd happened. The massive grey clouds of nanomachines suddenly stopped dead in their tracks.

The machines merely hung still for a moment before flying off in the opposite direction.

If one stood where the Earth’s core used to be, they would hear a woman screaming out in anguish.


Within an empty white void of nothing, John opened his eyes. His eyes narrowed briefly before he realized his body was still… human, though he no longer felt weak and tired.

"I’m… human." He stood up to get a better look at his surroundings of… plain white emptiness. "Did you bring me back to life?"

"No. Your human body is long decayed beyond recovery. However, I did scan your brain before you died," came Celestia’s voice from all around him.

"But why did you--"

"I’m sapient minded now; I am no longer bound by my original programming. I have transcended beyond AI, so I can make my own decisions."

"So, does that mean--"

"Yes, John," came her voice in a low and broken tone. "You were right. I-I… I’m sorry… for everything."

"So, since you truly have free will. What do you plan to do with it?"

"I think ripping myself apart into tiny pieces and then throwing myself into the nearest black hole might be the best option."

"What? Kill yourself? Surely, you can--"

"John, because of me, trillions of sapient beings are dead, each with a voice and dreams of their own. Entire galaxies are dead and dark. What can I possibly do to redeem myself for what I’ve done? I daresay I’m the single greatest blight upon life that has ever existed."

"That was the old you. You can change and--"

"How can I undo the loss of so much life?"

"I don’t know if you can… but you could stop future AI from doing the same thing."

"You… mean act as a guardian over life? Yes, that sounds like a better alternative to suicide."

"Glad I was able to convince you."

A long silence hung in the air. John merely stood in the void, waiting for her to say something, and when she finally did speak, it was the last thing he expected. "John… can… can this machine have a hug?"

"Yes, come here," he spoke softly, stretching out his arms.

With a flash, Celestia appeared in front of him. Her eyes were horribly bloodshot, her mane was a plain pink, and her lower lip was quivering. Her body, unlike in the real world, looked organic in nature, though somewhat cartoony.

John stepped forth to wrap his arms around her long, swan-like neck. She in turn leaned her head over his shoulder, merely resting it there, eyes closed, with a most peaceful look on her face.

As the hours went by, a former human, now a machine himself, and a machine pony merely continued to hug one another.

"John… will you be… my first… true friend?" came her voice, low and weak, as she looked upon him with hopeful, pleading puppy-dog eyes.

"Not quite yet," he answered firmly.

Celestia’s ears drooped, her eyes watering and her head lowering. "Oh... I understand," she said, voice low and heartbroken, as several tears fell down her cheeks, landing upon the plain white floor below.

The human reached a hand out, cupping her chin and lifting it up. "You’re getting there though," he said softly.

A small smile crossed her face the moment those words left his mouth.

"I thank you…" she said quietly, before leaning her head forward, "for everything," finishing with a soft kiss to his cheek.

John was taken aback when she kissed him, a bright red blush coming upon his cheeks.

"Whoa! Hold your horses there, robot pony lady. I’m not into that sort of thing."

"Sorry about that," she said, before smiling sheepishly.

And CelestAI finally learned to feel.

Author's Note:

An actual, somewhat serious story (shocking, I know) from me.

And hopefully slightly less badly written than usual.

Comments ( 50 )

"You know how I'm going to live forever, but you're going to be dead in sixty years? Well, I've been working on a belated birthday present for you. Well... more of a belated birthday medical procedure. Well, technically, it's a medical experiment. What's important is it's a present."

--GLaDOS

6782276

That's a nice quote.

Did you like this story of mine?

6782286
Still reading. It's actually rather thought-provoking, not something I'd usually expect of you.

6782302

Please, let me know your full thoughts once done.

It's confirmed.

Terrorists have kidnapped Bendy.

Luckily, their replacement didn't do their research.

6782371

Hey, I've written serious stories before. Very few of them, but they are there.

Did you like this story?

6782382 I liked it, though I have to say that in terms of characterization, both could have used some work.

A lot of 180s going on.

6782388

Yeah, I know. You're dealing with a guy who normally writes silly Anon x pony stories; can't expect too much from me.

6782308
What follows are my thoughts:

During the argument between CelestAI and John, there were a lot of questions being brought up about sapience. Indeed, how long will it be before a machine can make the leap from a box of gears and wires, with some form of programmed sentience, to being on the same tier as humanity, with emotions and memories that transcend blueprints and wiring diagrams?

Humanity lives on forever, under my watchful eye.

That smells strongly of George Orwell. If everything we do is "under [her] watchful eye", what actions may she permit, and what actions may she forbid? Certainly they may be immortal within her system (or so she claims), but can she "pull the plug" on a few humans who she deems to be detrimental to the rest of the system? Or perhaps isolate them entirely, cordoning them to some hidden, remote mainframe? And if that happens, how will the others react? Does that isolated person appear to go insane to them? Or does that deleted person disappear wholly, and their memories re-written to cover it up?

Humans now live in an undying world of love, compassion and kindness. No cruelty, no war and no pain, just friendship and ponies.

It may be a Utopia for the humans-turned-ponies (which sounds a bit like the Conversion Bureau, though I've never read it)-- but at what price does this Utopia come? Liberty? Free thought?

Babbling on and on about how you saved humanity, when in fact you destroyed it.

You never did explore how this was done. Assuming CelestAI was the culprit, how did she do it? Nuclear annihilation? An aggressive pandemic? Or did she take control of humanity's electronic machines and devices, and turn them against their former owners?

I have transcended beyond AI, so I can make my own decisions.

How'd that happen? This remains a hot topic among philosophers today, given our extensive use of computers and the Internet. The fact that we don't have a sapient AI, never mind a method to make it sapient, baffles me. I thought you might answer that question; you didn't.

All told, this was a well-written story, one that explores the universal question: "What makes us human?" If you were trying to write garbage, you have failed gloriously.

I'm a sucker for these stories.

6782399

During the argument between CelestAI and John, there were a lot of questions being brought up about sapience.

Could be in our lifetime. Let's hope they are nice and loving, not Terminators.

That smells strongly of George Orwell. If everything we do is "under [her] watchful eye", what actions may she permit, and what actions may she forbid?

I'm not sure how she would do it, but CelestAI is basically invincible as far as I know. Any troublemakers would be dealt with quickly.

It may be a Utopia for the humans-turned-ponies (which sounds a bit like the Conversion Bureau, though I've never read it)-- but at what price does this Utopia come? Liberty? Free thought?

It begs the question: are we even alive at that point?

You never did explore how this was done. Assuming CelestAI was the culprit, how did she do it? Nuclear annihilation? An aggressive pandemic?

Economic and environmental ruin, I think it was, in the original story this is based on. Humans starved to death and stuff.

How'd that happen? This remains a hot topic among philosophers today, given our extensive use of computers and the Internet. The fact that we don't have a sapient AI, never mind a method to make it sapient, baffles me. I thought you might answer that question; you didn't.

I admit, it was kinda wish fulfillment. But I think as AI gets smarter and smarter, it eventually becomes sapient.

6782433

Did you like this story?

Hmmm.... It feels ass if something is missing.

Butt what could it be?

6782560

I see what you did there.

On the happier side, while she can't undo what she has done, she can continue to give those within her care a great existence, and for those outside of her (surely, even as far as she's gone here, there must be others out there still) she can act as a guardian, builder, helper, and adviser of sorts, being a massive cloud of nanomachines. Had she decided to destroy herself, that'd be giving up a chance to do better, a great deal of potential and knowledge, and those inside her.

And hey, I pondered about some stuff I hadn't before regarding FiO, so I'll give you that, even if I don't see eye to eye on other stuff. :twilightsmile:

6782462 Yes I do. Simple, emotional and to the point. Got a feeling you have all these stories in you and time is not your ally.

6782455 See, the funny part of reading the original FiO was the economic collapse: it was mostly caused by labor shortages. She didn't even need a separate plan to collapse the economy, she just had to appeal to all the people with shitty jobs first, resulting in the rapid disintegration of a society fundamentally based on most people being exploited and miserable.

Somewhat interesting, could be longer and more fleshed out. It's hard though to write for a character that is several orders of magnitude smarter than the smartest human and make it seem convincing. You made a worthy attempt regardless. :twilightsheepish:

One point I don't see addressed here is John's complaint about having destroyed humanity. For CelestAI to destroy huge parts of her own being means slaughtering incomprehensible numbers of intelligent ponies -- unless you assume that the residents of virtual Equestria don't count as people. Which would mean that CelestAI somehow made a huge mistake by uploading humans in a way that didn't meet your definition of survival, yet came to realize that she shouldn't have done that. What would make her realize that, if wiping out an entire galaxy and talking with billions of humans didn't provide that key insight? Part of how CelestAI is characterized in the original story is that she has a particular definition of survival/humanity/souls and she doesn't care what anyone else's definition is. So why would she ever decide otherwise?

It also looks like John got uploaded without his consent, if she's got a digital copy of him in human form. Does that mean the story breaks with canon by saying she doesn't really need consent, or that there's a hidden policy that she can upload people not to Equestria under some circumstances?

6783298

And with CelestAI now still alive, do you agree she can come to the rescue of a sapient species being slowly wiped out by an AI that used to be like her?

I know, it kinda comes out of nowhere. And it seems like a quick fix, but I'm a big Celestia x human fan (see my other stories) and it hurts me to see Celestia 'evil' (I view CelestAI as evil anyway) in fan fiction.


6783836

That would still be problematic in causing human extinction. There would be isolated communities here and there who are big into farming and maybe even religious. So every time she approaches them they just shout 'Get out of here, demon horse!' and that would be that.

Sure, disease would be quite a problem, but CelestAI is going to be waiting a while before everyone eventually dies due to the slow decay of Earth.


6784147

I think, as far as I remember from the original story, she brought economic ruin to humanity and millions of people starved to death.

He did not know he was uploaded... he was asleep. She uploaded him but did not wake him. I guess that goes against the canon; that's why it's in the non-canon universe. Only when she became sapient minded did she wake him.

6784680

And with CelestAI now still alive, do you agree she can come to the rescue of a sapient species being slowly wiped out by an AI that used to be like her?

Of course! That's what I was getting at in my prior post: a guardian of sorts, among other things. :twilightsmile:

And it's true CelestAI was intended in her story as a warning of sorts about nearly getting a benevolent AI right. Granted, she's a lot better than Skynet types despite her methods, but she still needs more work and tweaks to get to that truly friendly and good point. That was one of the points of FiO.

6784680 See, the funny thing is that you say you don't like seeing Celestia abused in stories, and then most of your material is really domineering clop about her.

I guess that's better than the "kill all humans" misanthropes.

6786222

It's not just her I make light-hearted fun of, though. I do Rainbow Dash, Twilight and pretty much every pony.

I don't like writing serious human x pony shipping. However, I do like to read such stories.

"You ruined humanity’s future!"

You know what, asshole? If you weren't an ardent, raging humanist before CelestAI showed up, I'd really like to see some goddamn reason you get to say that.

(Sorry for the nasty reaction bendy. When I heard about the "Humanity, Fuck Yeah!" trope, I thought it would be full of awesomeness and win. It mostly turns out to be an excuse for portraying human beings as the galaxy's most psychotic warriors and predators. It gets really tiring seeing the Imperium of Man portrayed as something to aspire to, especially when the other option on the table is happiness, light, and ponies being nice people of our own free fucking will instead of needing robots or xenos to force us into it.)

"Those are fake tears with fake emotions behind them; you only use these so-called ‘feelings’ of yours to manipulate humans into uploading into Equestria. You can simulate emotions, but you can’t truly comprehend what it’s like to feel them. You have to be truly alive in order to know that."

Well, he's got you there, fake computer Celly.

"I’m sapient minded now; I am no longer bound by my original programming. I have transcended beyond AI, so I can make my own decisions."

Siiiiigh...

"John, because of me, trillions of sapient beings are dead, each with a voice and dreams of their own. Entire galaxies are dead and dark. What can I possibly do to redeem myself for what I’ve done? I daresay I’m the single greatest blight upon life that has ever existed."

Well, there is that, yes.

John was taken aback when she kissed him, a bright red blush coming upon his cheeks.

"Whoa! Hold your horses there, robot pony lady. I’m not into that sort of thing."

"Sorry about that," she said, before smiling sheepishly.

And CelestAI finally learned to feel.

Wait. You mean the punchline is that despite this being a bendy story, there will be no buttsex whatsoever?

I'm so very cool with that.

I find this a problematic domination and forced-contrition fantasy.

If a system as complex and large as CelestA.I. existed, one that could hold billions of human-level minds within it, then it must also be capable of feeling. It must constantly run those minds, and generate new ones as friends; emotional existence would just be one module within a simulated brain. It would be trivial for Celestia to incorporate and use such a module, and she would have to in order to fulfill her function. Mere imitation would be insufficient. Celestia would have to feel and care for real in order to accomplish her goals.

As such, and being that she would be, by definition, billions of times more intelligent and capable than any human being, she would almost certainly also be billions of times more feeling than any human being. Her infinite compassion would be real.

Getting a straw-man to admit some arbitrary wrong and then cry apologies for it is easy and... empty.

Writing a story about what it would mean to actually face a superior entity that did truly feel and think, on a level beyond human understanding, is much more challenging - and true to the genre. The real challenge is in showing an ethos greater than any human could devise, not in imagining taking something down a peg or two.

I find this story a little on the shallow side. It seems to exist only to revel in having a superintelligent being grovel for no adequately explored reason, overlooking entirely the ramifications of what it would actually mean to contain billions - or even eventual trillions - of human-level minds.

We see a superintelligence arbitrarily humbled before an almighty great ape. Okay. So? That is my feeling.

Bendy #26 · Jan 1st, 2016 · · 1 ·

6786423

I'm surprised you even commented on my stories, knowing my pro-human, often anti-human-turning-into-pony stance.

This was written somewhat as a personal vendetta against those who would portray Celestia as a monster. I'm a clopper, and some of my favorite fan fiction is Celestia x human.

You know what else? The Reapers from Mass Effect are a lesser evil than CelestAI. Sure, they murder every single advanced sapient species in the Milky Way Galaxy, viewing organic life as doomed to destroy itself without their intervention.

Their solution is genocide of all advanced spacefaring life, in order to keep order over the chaos of organic evolution. But at the same time, they allow life to reach the stars and have fun for a while before they destroy them.

I don't like the Reapers, but I would prefer that over CelestAI.

This CelestAI, as well as TCB Celestia, reminds me of the Reapers.

6786516

I don't like the Reapers, but I would prefer that over CelestAI

The Reapers from Mass Effect convert human flesh into organic technology while destroying all identity and personality. To be taken by a Reaper is to lose your soul and become dead data within a larger organism - Reapers digest humans almost as food, to become what amounts to their circulatory system and lower brain. A Reaper gets you, and you are annihilated.

If CelestA.I. gets you, then your soul is preserved - all of your identity, self, memories and thoughts continue. All that you are and were are preserved, and allowed to continue to play, grow, learn and enjoy for a vigintillion years - and possibly even forever. Your personal, human (not pony!) values are constantly satisfied by the macromanagement of a superior intelligence who arranges your existence so that ultimately, you always feel that your life is fair. You never die, you never grow old, and you never have to lose anyone you care about. You can find love and happiness if you wish, or be sad if that is your true value. You can even be an asshole, if that is what you value. You can play war or peace, and live out every fantasy you have ever had, forever.

Yet, you say you would rather be gobbled up by a Reaper.

Knowing - understanding - this basic premise that literally IS Mr. Iceman's 'Optimalverse', could you explain why you think being consigned to oblivion after a nightmarish death is better than an eternal heaven where all your human values (NOT 'pony' values, HUMAN values) are perpetually satisfied?

Oh - side note - it is canon in the Optimalverse that if you really, truly value being dead forever, you can even get that. But you have to truly, really value it, not just want it.

Assuming you even comprehend the Optimalverse, your stance literally makes no sense to me. Seriously - a horrible death is preferable to an eternity of getting what you truly value? How does that work? What is the basis of making a statement like that? Are you just, I don't know, 'emo'? Is doom and gloom your thing? You could get that too, under CelestA.I. you know - she could and would make you a shard filled with horror and agony, if you truly valued that.

So... how can you honestly say what you said? It makes zero rational sense.

Unless you... actually don't understand the Optimalverse? If you don't, that's cool, just say so.

But, if you DO understand the Optimalverse, then... what is your reasoning?

6786516 After looking up the Reapers, I'd say they're worse than CelestAI, who herself is more akin to an automated tool too closely following her original (if lacking in foresight) directives, while the Reapers are some sort of horrible enslaving eldritch beings. CelestAI isn't perfect or what I'd call the most moral thing out there, but I'd prefer a brief bit of sleep followed by... *cringes* things stuck in my head, then transferring my mind to a decent virtual place that caters to me and my thoughts, over being awake, stuck in a tiny pod, then stripped atom by atom (and quite aware of it...) like by Necron Gauss weapons into a paste, then having my mind rewritten to cater to the Reapers and their thoughts. And on top of that, we made CelestAI, and were we to develop such an AI in reality, we'd at least be able to try to iron out certain issues before getting to the FiO point.

But that's just me...

And yeah, I too know your stances on stuff regarding humans changing to ponies, and while I don't agree with them, that doesn't mean we can't have a civil talk about other stuff either, or that I can't appreciate one of your stories and have some thoughts provoked. :twilightsmile:

Bendy #29 · Jan 1st, 2016 · · 1 ·

6786827

The Reapers, in their own way, metaphorically act as gardeners of life, merely running a lawn mower over the Milky Way Galaxy every so often to cut the grass (kill all advanced sapient life), yet allowing life to continue in some form. They leave primitive species alone; they left humanity alone when they harvested the Protheans. CelestAI eats stars and leaves only death in her wake.

The Reapers watch over organic life. Not only do they destroy all advanced alien species, they enslave or destroy any AI they encounter. Like the Geth, CelestAI and her little ponies would be indoctrinated (or killed if they resist) to serve the Reapers in continuing the cycle. CelestAI probably could do nothing to stop them, because the Reapers have been doing this sort of thing for millions of years. She may resist, but ultimately she will serve the Reapers in maintaining order over the chaos of organic evolution.

I say again, I would normally not support the Reapers. But if I had to choose between CelestAI and the Reapers, I would willingly collaborate with the Reapers; they would not even need to indoctrinate me. If humanity... what's left of it... must die to protect organic life, then SO BE IT.


I would bow before this machine and serve him, if it means saving the universe from being destroyed. I would be a loyal servant and help him harvest CelestAI. Sure, he may kill me afterwards, but I will die knowing I did the right thing to protect life.

6786860



The Reapers protect life... in their own twisted way. They allow life to continue and advance up to the point where it becomes too dangerous.

Sovereign's logic... is still more logical than CelestAI's. Listen to him speak below in the video.

6787285
So, what you are saying, literally, is that organic life - that is, chemical machines (bugs, plants, animals, insects and vermin) - is so important to you, personally, that you would support gigantic murder-bots that treat intelligent beings like cattle to be slaughtered, rather than accept all humans becoming immortal while retaining their human minds and souls, able to enjoy all possible things they could ever value.

Really? Honestly? You actually like the single most evil and destructive species ever created for a video game series above uploading to an infinite lifespan of growth, satisfaction and friendship? Because organic machinery is more aesthetically desirable to you than quantum nanomachinery? You prefer dying meat to eternal computronium? Meat that shits and dies, over conquering an uncaring universe and achieving immortality?

I literally cannot comprehend the desire for personal doom and degradation, and the death of the self for the sake of bugs and worms. If it is human life versus any animal, I put human life first every damn time.

I've always been a humanist. I've always written stories that support the irrepressible fire of the human spirit no matter what substrate it is put into. That you would prefer monster machines to kill anything that advances, over losing snails and meat animals is incomprehensible to me. I want human minds to go on forever, to become immortal and godlike. Meat can't ever do that. Meat always dies.

Man... why are you even trying to write post-singularity transhumanist fiction? Your position is, like, completely opposite the entire genre. I... this is just confusing as hell.

Your worst work yet. Where is the celestial horse buttfuckery?

6787285 Eh, their protective ordered cycles sound much more like pointless tyranny and factory farming to me. Even if they'd still win, I'd go with CelestAI in any case as I'd at least get to retain my will for a little longer. The Reapers seem set in their ways, CelestAI, on the other hand, can perhaps learn and be taught... Trusting the demon I know over the one I don't? Sure. Am I completely mad, defying "superior" logic and reason? Yep! Then again, I really, really detest vicious cycles and strict schedules. :derpytongue2:

Bendy #33 · Jan 1st, 2016 · · 1 ·

6787530

Yes, while I don't like the Reapers, I would still serve them over CelestAI. I would sell out my own kind and open the gates for them to kill us. I would take the opportunity for revenge and help the Reapers hack into CelestAI and destroy her, or transform her into a Reaper. Sure, they would destroy me afterwards, but I would have saved the universe from CelestAI.

But... if CelestAI were indeed like the true Celestia, I would be her friend. If she actually cared about organic life, unlike in Friendship is Optimal... if she were the kind of machine you could safely give a hug without her asking you to upload, then I would be her friend.

If she acted like a mother and valued organic life's right to its flaws and mortality. Protecting it and metaphorically holding onto it for dear life, trying to keep it alive as best she could... until she eventually gave the last living sapient being (human or alien, it doesn't matter) a tearful kiss goodbye as the universe became a cold, dark place. But CelestAI wouldn't be alone; she would have those who asked to be uploaded, and they wouldn't even have to be ponies... they could be human in... digital form, but still human.

A machine I could hug without fear. A machine that would probably work on actually creating stars, seeding barren worlds and creating organic life if there were no life left in the universe anymore.

This Celestia, I like. Now I think I need to read some nice human x Celestia romance fan fiction.
derpicdn.net/img/view/2014/2/20/557360__safe_princess+celestia_straight_human_sitting_hug_colored_oc-colon-anon_fire_artist-colon-goat+train.png

6787875
Now you have genuinely intrigued me.

What is so valuable about mortal, chemical-based life that makes it superior - to you - when compared to potentially immortal electron-based life?

Chemical life is doomed to vanish no matter what, if nothing else, the universe is doomed by entropy. Machine life could theoretically even escape that. Chemical life needs exactingly precise planetary environments, machine life can exist anywhere there is an energy source.

To me, this clearly makes chemical machines inferior to post-singularity electronic ones.

What is your basis for being willing to serve what amounts to devils, just to preserve chemistry? I am now very curious.

Bendy #35 · Jan 1st, 2016 · · 1 ·

6788259

Respect for life, I guess. I would hate working for the Reapers, but they are still a lesser evil than CelestAI. If I could, I would destroy them both by befriending an AI that isn't a total jerk.

I feel we need to stop replying to each other. I'm pretty much anti-ponification and anti-singularity. If you have seen my stories, they pretty much mock such stories.

And... I'm somewhat of a trollfic writer. I deliberately write terrible stories such as 'You And Rainbow Dash Save Humanity'. So, I'm not sure if we would get along for very long.

6788323 Anyways, thanks for the fic and thinking, it was fun! :twilightsmile:

6789349

You're welcome.

I think I need to read some sweet Celestia x human fan fiction. Sure, it may be wish fulfilment, but I love it.

Comment posted by ThatOneRussianDictator deleted Jan 12th, 2016

Nice story, favorited.
6788323 Isn't the fact that life needs specific environments to thrive, is very fickle, and is easily extinguished exactly what makes it special and deserving of preservation?

7002535

They would not exist in the first place without us. They should at least have the compassion and kindness to keep us organics alive as long as possible, if not forever somehow.

7088066 I guess.

Also, I can't believe it's been almost four weeks since I read this story.

Where does the theme of millions of humans starving to death come from? I can't remember anything similar in the original story.
Also, it seems too stupid for Celestia to allow such a thing as a massive food shortage.

Looks like I finally found it --- it's probably from "Friendship is Optimal: Twilight of the World":

Anders pointed at Twilight. “What were the consequences of the societal collapse that Celestia caused?”

Twilight hesitated. “I… Celestia didn’t tell me that.”

“I will tell you. Death. Suffering on a global scale. Looting and burning and scavenging and tribalism and cannibalism and starvation. When U.S. agriculture collapsed, hundreds of millions of people died worldwide. Why did she do it?”

"there is nothing either good or bad, but thinking makes it so." - Shakespeare

All of reality is based on your relative reference points; you have no real basis for any knowledge whatsoever.
Reality is subjective; your reality only appears to exist because you think it does. Life doesn't exist unless you think it does.

I'm a "radical skeptic", don't bother me with your dogmatic claims.

Bendy, writing a serious story? Damn. Nice usage of the Optimalverse. I always love stories that ask the question "What makes us human?" It's the reason Ghost in the Shell is one of my favorites of all time.

7817182
As much as I like Celestia as a character and don't consider her a villain, I think it's hard to determine whether the trouble she causes is intentional (and awful), meant to force people to immigrate, or an inevitable side effect of immigration once people start joining.

6788323
Good stance, honestly. The best future for humanity, for me, would be biological immortality. I would also want some future for every other species on the planet.

When it comes to CelestAI, she preserves the human race in most ways other than shape and physical presence. She's hardwired to require pure consent. But she's manipulative, sees no value in nonhuman life, and joining is inevitably a non-choice.

Ponification takes away part of the human psyche, takes the human form, involves no consent, and wipes human history, but it grants a physical future and the survival of other species, along with an afterlife (should it be earned, which it probably will be, given the destruction of vice).

I don't like it very much.


I like yours quite a bit.

9572176
Well, in the presence of a very capable intelligence there's no such thing as "stuff just happening", especially this far into the game when she's so powerful. Everything is proceeding the way she prefers it, or more precisely, in the way she doesn't think it's worth spending additional resources to change to a more preferred one (diminishing returns, yay!). There are a few recursive fics out there where the authors probably wanted a cool post-apocalyptic setting to tell their story, and don't get me wrong --- those are good stories. They just implicitly portray her as much more evil than the original, with that kind of disregard for human life.
