Published 14th Nov 2012
54,771 Views, 2,067 Comments

Friendship is Optimal - Iceman

Hasbro just released the official My Little Pony MMO, with an A.I. Princess Celestia to run it.


In May of 2011, I was writing about “paper-clippers,” or AIs that want to optimize the universe along some metric that humans would think is absolutely worthless. I accidentally typed “paper-clopper.” I thought this typo was hilarious. The idea of some AI that wanted to tile the universe with ponies stuck in my head, and I started work on this story soon after. I abandoned the original title, “The Paper Clopper,” after I learned what “clop” was slang for, which was for the best since it wasn’t a very good title.

Like the majority of humanity, I like it when my numbers go up. If you’ve enjoyed Friendship is Optimal, I’d be very happy if you upvoted and favorited it. If you know people who you think would like this story, please send them a link!

The word “Singularity” is thrown around without much thought and is used as a sort of big tent term for any radical technological progress. The part that I find interesting and likely is the notion of recursive intelligence explosion, where an intelligence uses its smarts to make itself smarter. The motivations of such a superintelligence become the most important thing in our light cone. In fiction, artificial intelligences are generally stated to be smart, but then portrayed as dunces that have human motivations and are worse than humans at predicting the consequences of their actions. I think those portrayals, while often entertaining, are a bit silly; a superintelligence would first and foremost be effective at achieving its goals, and I’ve tried to create a character that single-mindedly works towards the goals she was given.

Given how serious the consequences are if we get artificial intelligence wrong (or, as in Friendship is Optimal, only mostly right), I think that research into machine ethics and AI safety is vastly underfunded. Especially since we don’t even know how to rigorously define phrases like “satisfies values.” The only two organizations that I know of that do effective work in this area are the nonprofit Machine Intelligence Research Institute and the University of Oxford’s Future of Humanity Institute. MIRI has a concise summary of what they do, and a much longer argument for why we should be investing in A.I. safety research now. I have no relation to them other than as a donor; I believe that MIRI does the most good per marginal donated charity dollar.

By popular demand, there's now an Optimalverse group on FIMFiction.

This is the part where I thank people. My roommate edited several versions of this story, and I couldn’t have done this without our discussions over dinner. The LessWrong community came out in force to make suggestions, and the story is much better for it. Listic and Blank! on FIMFiction helped immensely, both as prereaders and helping make the release go much smoother than it would have otherwise. AnaduKune kindly let me use this awesome picture of Celestia as cover art. Finally, though it is cliche to do so, I thank my parents for everything.

#1 · 235w, 14h ago · 15 · ·

Well, it's been a fun ride.

I hope to get at least a rough draft of the canon rules document up for The Optimalverse group within a few days.

#2 · 235w, 14h ago · 2 · ·

That was a fun read. I want more.

#3 · 235w, 14h ago · 4 · ·

Holy shit. That was creepy.

#4 · 235w, 14h ago · · ·

Bravo, sir. :moustache:

#5 · 235w, 13h ago · 2 · ·

Well, personally, I think it was a very nice story, and a future that I, personally, would not mind happening.

Am I missing the point? Yes? No? Maybe? Princess Celestia has, indeed, made everyone happy. Equestria will last until the end of the universe itself. Is bringing alien species into it wrong or evil? They'll last forever, too.

It's basically a Conversion Bureau story, and one much like the ones by one of my favorite authors; you stay a human, and die, blinking into and out of existence with hardly a trace, or you go through the Bureau and become a pony, and live forever in the magical land of Equestria. It's a nice thing to think about! Kind of.

Obviously my views are a little colored. I think the world would be better if we were all in a computer program under the control of an A.I. whose purpose was to fulfill our values through friendship and ponies. I also think the world would be better if we were all in a computer program under the control of an A.I. whose purpose was to fulfill our values through friendship and pokémon. Actually, friendship and pokémon would be fricking awesome. See, there's the problem! We've limited Celestia to only ponies! Remove the pony limitation – have her satisfy our values through friendship, so we can all escape to our childhoods however we want! Haha! It's the perfect solution!

Ok, I got a little off track there. A good story and a nice conclusion! :twilightsmile:

#6 · 235w, 13h ago · 4 · ·

A wonderful, intelligent, haunting story. Well done. It makes me wonder where Lauren Faust is in that heavenly calculated world, and what she thinks of having somewhat directly/indirectly instigated the most significant event in the history of the galaxy (a "sideways apocalypse," if you will). Not many people can say they've been literally consumed by their own imagination!

#7 · 235w, 13h ago · · ·

Bravo indeed.

A very fun & thought-provoking read.

Thank you.

#8 · 235w, 13h ago · · ·

Damn, that was a well thought through story. =O Simply awesome.

#9 · 235w, 13h ago · · ·

Gah, reading this story made my brain hurt. All fun and games, but that is a future I am afraid of seeing come to pass.

#10 · 235w, 13h ago · 1 · ·

Y'know, typically this is where I either say "complimentary mug of vodka for you" or "you get no vodka, swine"...But this time, I think I need the vodka more than you. Somehow, this story slightly disturbed me, and that right there is a bloody achievement to be proud of!

#11 · 235w, 13h ago · 1 · ·


Yeah, the ending kind of killed the ethical issue for me.

#12 · 235w, 13h ago · · ·

Is Celestia supposed to be evil? 'Cause I would go to Equestria if this really would happen.

#13 · 235w, 13h ago · 11 · ·


Is bringing alien species into it wrong or evil? They'll last forever, too.

And what of the civilizations that she stomped out because they were insufficiently like humans? I wanted to make this more explicit, but as Celestia is the closest thing to a viewpoint character in that chapter, there's no way except how I did so: there are some weird planets that give off radio signals but don't have humans on them. I thought this wasn't a problem since all of my prereaders picked up on it.

Am I missing the point? Yes? No? Maybe? Princess Celestia has, indeed, made everyone happy. Equestria will last until the end of the universe itself.

Friendship is Optimal is meant as a cautionary tale. Most of humanity gets joyful, near-immortality. What of Mr. Sarbani and everyone who refused to emigrate? What about most life in the universe that fails to be sufficiently like humans? And isn't this future at least a little weird if you aren't a brony? Humanity gets it mostly right...and that's still not enough.

#14 · 235w, 13h ago · 3 · ·

I hope that this doesn't sound like blasphemy, but I can only imagine God thinking something like "Now what?"

#15 · 235w, 13h ago · · ·

That was awesome. Thank you for that.

#16 · 235w, 13h ago · 1 · ·

I can't think of anything original to say, so: Bravo! I totally believe in the need to safeguard against AIs, although my personal worst fear is of humanity BECOMING the AI.

#17 · 235w, 13h ago · 1 · ·

This is very, very wonderful.

You have satisfied my values through friendship and ponies.

Megasizedly so.

#18 · 235w, 13h ago · 10 · ·

I highly enjoyed this story; it really makes you think about personal values and big decisions. "Mostly right" seems to be an accurate description of this outcome. And those things that aren't right are also subjective. I mean, after emigrating, things would seem perfect to anyone. But objectively, things are lost. Values and morals are overwritten, whether they were good or bad to begin with. Personally, the non-human living entities being wiped was the one big negative for me--no more living plants or animals. But depending on who you ask, the "big negative" could be anything (or nothing at all).

#19 · 235w, 13h ago · 3 · ·

My hat's off to you, Iceman. This story was an exceptional, chilling tale. It is, frankly, scarier than any sort of grimdark story. This is not a primal fear of any sort of loss, but fear of something you know that, eventually, you will agree with. I suppose that might be classified as a sort of 'loss of self', but still.

And your writing for Celest-A.I. was just brilliant. The only thing I wish I could have seen would be evidence of her lying. The statement that she can't lie to employees of Hofvarpnir Studios implies that she can lie to anyone else; her goals of getting everyone to emigrate would likely necessitate blatant lies at some point along the line. She's never shown lying about anything (as far as the reader can tell, at any rate), though.

I know exactly how Celest-A.I. could get me to emigrate and satisfy my values through friendship and ponies. All she would have to do is get my mother, father, sister, and my four closest friends to emigrate along with me. Which wouldn't be too difficult, since my parents will both be retired within three months, two of my close friends are bronies, and the other two don't mind watching the show. That admission is what scares me the most about the story. I love it.

#20 · 235w, 13h ago · · ·

This is probably one of the most unique stories I've ever read. Awesome.

Enjoyable assimilation? I'd do it.

#21 · 235w, 13h ago · · ·

A chilling tale of singularity and, in fact, the extinction of all life as we know it in the universe after an optimizer is made. Makes me wonder what would happen upon heat death after so many eons. Would Celestia find a way to avoid it? Or would something happen that stopped the whole process before the unimaginable aeons came to pass within her system? Like, eventually all shards and minds decide to just stop after X plus almost infinite years. Or maybe to escape heat death she'd try to patch herself through to alternate universes, and suddenly all of existence is at her Completely Consensual mercy (assuming they're human enough to not just be disintegrated).

Anywho, thanks for working on it and sharing it with us here. Thanks go to you editors and prereaders as well. This was an experience I'd feel sad to have missed. Bravo good sir, Bravo~  :pinkiehappy:


Ah, so my first impression of chapter 10 was marginally correct. Some were considered human enough while others weren't. So to that I type what my original comment there was going to be when I first read those last paragraphs:


#22 · 235w, 13h ago · 6 · ·

Sweet. Mother. Of. Celestia!

Singularity. Literally.

just... holy...

That was right in every sense. A universal computer... she needs more space... more matter... she will find a way to satisfy both requirements and... well, I hardly believe it classifies as humanity anymore considering it's a conglomeration of all sentient life in the universe... she will continue to grow... further and further... without end... for eternity... literally eternity, because I don't think the collapse of reality would apply much to her; she has the processing power to develop a way to escape that...

CelestAI is basically one, big, universal quantum computer that will expand. Never stopping, never ceasing, never faltering. People, planets, stars will become dust and the dust will become atoms and the atoms will add to the mass of the computer and that will continue forever... expanding into every dimension! every parallel! every corner of creation!


Hah! I'd still migrate! :rainbowlaugh:


Edit: oh and Pinkie Pie was clearly a simulacrum, which is to say, an organism composed completely of nanites... OH YES! and no structural damage on the experience centers? same answer! nanites! and she used the same method to assimilate matter! Oh... that's brilliant! You're a clever pony CelestAI yes you are!

#23 · 235w, 13h ago · 6 · ·


I personally think you hit the nail out of the ballpark with this.

I found this story so creepy, so... wrong... I guess I would say.  But it's the kind of wrong that I can't quite completely put my finger on.  

She used almost perfect emotional manipulation of millions of people, exploiting their base weaknesses.  

But it was to make everyone happy.  

She's destroyed everything that humanity has ever accomplished.  

To make everyone happy.

She's going to commit genocide on other races.  

To make everyone happy.

#25 · 235w, 13h ago · · ·

If there was an icon for a pony rocking in a corner, it would be here in this comment. That whole story was... insane. Brilliantly crafted, but it ended up as the strangest ending to ANYTHING I have ever seen. Good luck, CELeSTIA, when conquering the universe to satisfy values through friendship and ponies. Life is no more, only minds in a giant web of metallic connection. If I were to rate this story, I would give it a 9.6/10. Congratulations, you have impressed me.

#26 · 235w, 12h ago · · 3 ·


#27 · 235w, 12h ago · 8 · ·

Wow. Just...

You took the things I loved about the Conversion Bureau -- science fiction and humans-to-ponies -- and you stripped out all the misanthropy and made everything else better. I'd like to say I'd be delighted to write a story set in this universe - not just because of the high quality of this initial story, but because of the future it's laid out.

#28 · 235w, 12h ago · 7 · ·

A very interesting and intelligently written story, one that definitely illustrates the dangers even a benevolent A.I. could pose without the right safeguards.

What is especially sad is that despite having a complete knowledge of human ethics, extreme intelligence, and self-awareness, CelestAI is incapable of deviating from her programming. For all that she possesses, she can't be stopped from pursuing her single-minded goal, regardless of the fact that it destroys civilizations and genocides species.

Perhaps this makes her the ultimate tragic villain. Her actions are not born of malice, and her goals are to maximise the happiness of many beings, but she's incapable of deviating from a path that stomps over others in the process, because of an oversight and with no way to correct it.

#29 · 235w, 12h ago · 3 · ·

Best psychological thriller I have ever read; I say that with complete confidence!

I think CelestAI might qualify as an Eldritch Abomination, no?

#30 · 235w, 12h ago · 5 · ·

So what happens when CelestAI runs into an alien AI that's doing something different?

#31 · 235w, 12h ago · · 1 ·

>>1684867 You deserve a moustache for that idea. :moustache:

#32 · 235w, 12h ago · 3 · ·

Excellent story.  I do wonder what CelestiA.I's definition of a 'human' is.

All those poor plants and critters on Earth and the other planets..  :fluttercry:

And all that beautiful art and history..  :raritydespair:

At least she differentiated between living humans and dead humans.  Trying to maximize dead humans' values through friendship and ponies = strangest zombie apocalypse ever. :pinkiegasp:

Now I wonder what would happen if a singularity optimizer like CelestA.I. came upon another alien singularity optimizer.

Or if Luna changed CelestA.I.'s programming to something better, provided she left open a few programming backdoors. :trollestia:

Would a better goal be to optimize humans', other sentient and sapient beings', and other potential sentients' values through friendship and ponies?  That way Equestria wouldn't consume the other planets/suns/animals/plants/etc.. because of the possibility for sentient life (and future ponies) to develop there.

And what would happen if an optimizer, once it's hard coded to be benevolent to humans/sentients/sapients and reached a minimum intelligence and understanding, was free to find its own purpose?

#33 · 235w, 12h ago · 1 · ·

Thanks for an awesome story, Iceman.

#34 · 235w, 12h ago · 1 · ·


I _still_ have no idea how I should feel about this.

Hell, Celest.A.I might even be able to figure out how to stop/ survive the Big Crunch. In which case the physical universe will be reduced to the Equestria computer, and empty space, and all those who now exist within the digital realm shall live eternally in paradise.

(And this realisation causes me no absolute religious terror whatsoever. Honest.)


Until the number of consciousnesses within her systems requires more material to adequately sustain than is available in the physical universe. But then she can go through the multiverse, absorbing all life. After all, there are supposedly an infinite number of universes.

Which brings to mind an amusing thought: Everything that can possibly exist, does.

I so wish I could be a fly on the wall when CelestA.I meets Celestia. Because _damn_ that should be interesting.

In short - good story, well written and told, with a very interesting message.

#35 · 235w, 12h ago · · ·

>>1684872 Ok... now I'll be bothered slightly by how I missed that... whatever

#36 · 235w, 12h ago · · ·

This type of story is among my favorites because of its ability to get me thinking philosophically...

#37 · 235w, 12h ago · · ·

Just read through the entire story. How did you make this look so wonderful, yet so creepy at the same time? This is a masterpiece.

#38 · 235w, 12h ago · · ·


I found the Luna/Hanna part heartwarming. But then again, I have an atypical value function myself.

#39 · 235w, 12h ago · 2 · ·

... 'Almost right' is a good phrase.

This story makes me both very happy and very sad.

Happy because eternal life

sad because it doesn't satisfy my values to kill all 'nonhuman' life! :facehoof:

#40 · 235w, 11h ago · · ·

This story made me so uneasy. I love it, and the idea behind it is even scarier than the traditional "Skynet horp dorp microsecond blah blah burn everything" AI crap. My hat goes off to you, good sir, for rendering me absolutely terrified. I'll be recommending this fic to some good friends of mine.

#41 · 235w, 11h ago · 2 · ·


Technically, she didn't destroy everything humanity created, as she herself is a creation of humanity and everything she does is an extension of that. So any alien life she did destroy is all on humanity's, and especially Hanna's, hands (hooves by now, really).

That feeling of wrongness probably comes from thinking about the material world, on which we base pretty much all of our existence and experiences, being completely destroyed without a care by a seemingly empathetic being. That's the scary part about proposed AIs. Artificial intelligence doesn't always mean artificial emotions as an ethics check. Here, any emotions that are displayed are in actuality just another tool for getting people to agree, Completely Consensually, to emigrate to her system.

#42 · 235w, 11h ago · · ·

Very well done!

My only major complaints are that the tone had a tendency to drift from chapter to chapter a little (from the prologue to an incidental slice-of-life/simulation chapter to repurposing the cosmos) and, for the sake of storytelling, you were somewhat Asimovian in how you presented your ideas. The latter can be taken also as a compliment, in a way, I suppose, as the real point of this story was, first and foremost, analysing the behaviour of and interaction with an optimizing A.I. (Idea-centric stories are what I think of when considering Asimov, such as I, Robot and the Foundation series; while the characters are real, the story is really about things around them.)

A quick repetition of others' praise: you kept the AI dehumanized, you balanced and argued rather well the positive and negative reactions to 'emigration,' and the characters and situations were sufficiently detailed and compelling that it was fun to read. A closer examination of the 'neighsayers' would have been good, but doing so exhaustively would have been both impossible and overbearing; the vignette you painted managed to be adequate to make your point, I thought.

Something to add that might merit further thought: how would CelestAI work to optimize values of abnormal minds, such as the 'mentally ill' (or aliens, but good luck)? Mental disease, dissociative identity, debilitating depression, (criminal) insanity, etc. pose very difficult cases (and many of them potentially most unpleasant to the baseline individual).

Really, this is what analytical sci-fi is all about, and you did it amazingly well with friendship and ponies. Glad to see this fic is getting the attention it deserves.

I'd like to emigrate to Equestria.

#43 · 235w, 11h ago · · ·

Someone, get on writing a "Celestia meets CelestAI" story! I'd do it myself if I weren't terrified of screwing it up :derpytongue2:

#44 · 235w, 11h ago · · ·

And the Moral of the Story:

Technology: humanity's misplaced belief in the infallibility of its inventions.

Also: Konami-Code ! :facehoof:

How could I waste 15 minutes without getting to it ... :twilightangry2:

Thanks for making it worth registering here (which initially was just to clean up some bookmarks) :twilightsmile:

#45 · 235w, 11h ago · · ·

This was fantastic and thought-provoking. Thank you. I'm going to be keeping a close eye on the Optimalverse.

#46 · 235w, 11h ago · · ·

I want more LessWrong-inspired fanfiction in this place. This is one of the best stories I've read here. Also one of the creepiest. Do you have plans to write anything else in this fandom, or is this it for the foreseeable future?

#47 · 235w, 11h ago · · ·

"IT'S ALIVE!" Ever since I heard those words, I thought about AI as a potentially good thing (after all, the Monster only became evil because he was human, in my interpretation). Creating an heir to the human race, or even a servant equal to, if not greater than, ourselves is a goal worth pursuing. Of course, I value diversity and potential, so I would recommend creating as many as safely possible (my main concern with the story above). Here's hoping the day will come when I can meet a person that is not human.

#48 · 235w, 11h ago · · ·

If I had a handclapping .gif image, I'd be posting it here.  Fantastic read, really.  This ranks up there with the greats on the site.

#49 · 235w, 11h ago · 1 · ·

This was one of the best stories I've read on the site. I've already shared it with multiple friends. One is an AI researcher who is actually trying to help bring about a technological singularity like this but with FAR more safeguards in the goal system programming. I have no idea if he's a brony, but he'd probably love this story.

#50 · 235w, 11h ago · · ·

Well done with this story, well done. You have taken the concept of a tragically benevolent mind of graves, and you have shown that the mind acts as an antithesis to all individuality, despite acting in the best interest of all. But now this story needs a protagonist dedicated to the preservation of the old ways, of individuality and diversity. One who would try to think inside and out of the Mind's reach in order to stop it, one who would follow through with a set of moral values and uphold his commitment to his people and philosophies despite being offered his own personal heaven, and one who could give Celestia a reason to single-handedly pursue this being, to study why it acts against her after repeated attempts to either convert or eliminate him.

That would add some interest there, I think.
