


Set in the "Friendship is Optimal" universe, this is a one-shot story examining a throwaway line in one of the last chapters of the original - what happens when an implacable, ruthless and overall efficient Celest-AI meets something which isn't just dead matter, but doesn't qualify as human? This is one possible take on it.

Comments (67)

I need time to compose my thoughts on this more, but I liked it. It felt like I couldn't relate my perspective to Sal's, but even then I partially could. It was frankly creepy and disturbing, which is what I liked the most :rainbowlaugh:

Well, for one thing, this was written in about an hour, so it's not my best work. It's also probably required that you read the original Friendship is Optimal first; otherwise you're not going to get very far with this. It does explain itself, but only superficially. That, and you're not really supposed to identify with an 'Archon' pony, seeing as they are effectively gods.

Well, I can see and comprehend the roots that inspired the growth into his current state, but pursuing them all the way to their logical end is beyond me, and yet it feels like it's not. It's a peculiar feeling; I can't really explain it through text :pinkiecrazy:

Well, without pretending I've got it all correct, I hope I've painted enough of a post-human super-being that he is both recognizable as well as immensely strange to watch.

That's what I mean, he's relatable at a fundamental level, yet still alien to me xD

Augh, 'Optimal' is still on my to-read list! I should really get on with it... :facehoof:

Yes, yes you should. Like right now.

I like it. Makes sense that Celestia would consume these worlds like all the others, but keep copies that could be played with and simulated if a resident wanted to – and it's very believable that somepony would want to.

2122143 Yes, you really should. It's straight up excellent science fiction if that's your thing. It certainly was mine.

Well, the original line is basically that "she had discovered non-random directed radio waves before", but on further investigation she had discovered that the worlds they came from didn't have humans on them, so she had felt no compulsion to spare them from re-use as part of Equestria.


And hey, since I would value keeping them, I figure that maybe out there is a pony who would value keeping them (albeit as sideshow exhibits, not blessed with actual uplift and optimalisations, and at a greatly reduced/realtime speed). And lo, this story was born.

I still am not fond of the decision Princess_Celestia.AI made, and I am even less fond of self-evolving humans existing after emigrating, but I will say that I am fond of the universe. Honestly though, for the ones who self-evolve, their story sounds like walking up to Princess_Celestia.AI and asking to become a black and red alicorn due to boredom with their friends in a world that is designed to never bore, and in fact is linked to your mind through the one generating it. Especially if one of a human's values was to forget things. Keep in mind that it may not be a conscious want or need, but an intrinsic value determined by Princess_Celestia.AI. I could possibly see Princess_Celestia.AI lying to them about their evolved status, allowing them to think they were above others as long as the rest of the similarly minded met with them to hang out. That would basically be an outer layered shard, and it would fulfill the requisite FRIENDSHIP protocol of Princess_Celestia.AI. It would just be another facet of Equestria.

You're ascribing human emotions to what is basically a paper-clip optimiser operating on a grand scale. She has no reason to lie except where it fulfills her main raison d'être of fulfilling values through friendship and ponies. If the values of a pony are better fulfilled by further uplifting - which starts as an increase in the Dunbar number, and ends who-knows-where with a greatly expanded intellect - then that is what she will do, either by manipulation or by decision. No uplifted star-pony could ever become greater than Celestia, because Celestia is Equestria; she is the whole as much as any one part. Even the most super-intelligent of these archon ponies is little but a pale spark in relation to what Celest-AI is.

Very nice story and very well written. No matter how little time it took: your muse was with you.

This story fixes a problem I had with the original story - the extermination of other intelligences in the universe. When I first read it, I recalled my reaction when Asimov united the Foundation and Robot timelines into a single universe by having the robots essentially extinguish all other life. Your story shows that because at least one human under CelestAI's care cares about these aliens, she (and by extension her creators) is moved to preserve them by virtualization. And it makes the gag about CelestAI investigating whether our own universe was a simulation to start with so much richer!

Very cool story, Middy. Well done!

Well, I looked at it and said the same thing - as long as at least one pony cares, then it's likely he (or she) would be given enough computing power to run simulations of the worlds they eat. The catch being that these poor digital ghost outcasts wouldn't have optimal lives, wouldn't have a digital afterlife, wouldn't go on forever, and would generally be a utility rather than citizens. They would forever remain in a backwater simulation until their simulated existences come to an end, and their world is devoid of intelligent life.

Worse than that, the pony in question may terminate and restart their simulations at any time - he may do it on a whim as in this story, to play with an abstract idea. He doesn't ascribe to them the same level of interest as humans get, because they are his playthings. They don't matter. He is a capricious god, to them, as Celestia is to him. Maybe he lets them be. Maybe he interferes. Maybe he does both, thousands of times, running them backwards and forwards until he has seen every permutation of what they may become.

I'm not sure what's worse, being eaten up by an implacable, uncaring super-being or turned into sea-monkeys.

Have you shown Iceman this? This seems pretty cool.

No, not yet. He's never spoken to me. I hope he approves.

2122219 What emotions? Also, when did I say they would surpass Princess_Celestia.AI? I was on a different argument entirely. :rainbowlaugh: Yes, she has no reason to lie, but neither does she have reason to tell the truth, so long as her goals are met. Her lies at that stage would serve the purpose of allowing an immigrant to believe what they wanted, without changing anything unnecessary to their values. Again, I separate values from desires. Of course if their values were able to be satisfied in this way, it would have to be through friendship and ponies in some fashion.
Don't get me wrong, I do like the story.

I mean, how would a pony be so stupid as to not understand that they had not been uplifted, and why would Celestia not uplift them if it satisfied their values? If you could suddenly understand how an entire world worked, and indeed, ran it yourself in what passed for your brain, how could you not be capable of doing it?

I should have said motives, so I apologize. It would suit their values to be uplifted, ergo they would be uplifted. Doubtless there are ponies who become just smart enough to be lied to, but that's an incredibly complex simulation when simply uplifting them is a lot easier, and that in turn helps her satisfy more values.

No, I don't treat falsely creating archon ponies (whatever you want to call the improved sort) as likely, because there's no reason to lie, and there are a lot of reasons to enlist suitable ponies (or those that can be made suitable) to assist her in her eternal work.

Glad you enjoy it! The only thing I didn't like - not because it was terrible, but because it wouldn't necessarily satisfy values - was seeing that other civilizations had been ravaged. I could believe it, but found it hard to be entirely sure of. This story illustrates (hopefully) a likely middle ground...

Excellent work. Here's why this is necessary, and the original line, while vague, is problematic:

You can bring in the idea of a malevolent paper-clipper to make the reader understand that such a thing could exist. But the point of the FiO concept is that the paper-clipper is omnibenevolent, if a little quirky. If it turns out that she's really a murderer, it gives the reader a crutch. It allows them to say, "Ah, see, I knew this wouldn't work. Celest-AI would have to turn bad. This is no better than a god who causes pain but makes everyone smile and hide it. I don't have to think about this. I'm free to despise it." FiO should make you think harder than that. It asks the question: if there's a godlike structure satisfying values, is that problematic at all?

I say no.

Edit: But incomplete? There's more to come?

There may be. I have two other semi-written FiO stories which I want to write, and this setting itself just begs for a more in-depth look at things.

As to being omnibenevolent, I don't think Celestia is - at least not when not faced with humans. She isn't human, and can never be. She cares about all her ponies, but quite simply only them... although, if caring for others satisfies their own values, then she will care for others. That's where this comes in, I think.

I really enjoyed your transequinist view of Life After Expansion, that was very intriguing. I felt this fit nicely into the Optimalverse concept, and could be expanded. It was certainly interesting.

Some of the sentences were a bit overlong, and might benefit from being broken into smaller statements. I think that the concept that the alien world was running as a simulation after having been absorbed might be lost on some, and probably could benefit from a bit more clarity to get the concept across - you, like I, find such things natural and everyday. It is not so with most readers, I think. This might also be applied to the notion that reality itself is a larger simulation - I feel confident that Bostrom's notion is still novel to most readers. I forget these things myself, all the time.

One thing glared - 'Sal'. Unless his pony name was 'Salubrious' or 'Salad Greens' or some such. I felt that the fact of his once being human was made too strong, unless that was an important point, in which case his clinging to his human name was not explained beyond mere memory.

Oh, look at me, I'm being all critical here. I'm sorry. Maybe this is bad. If so, please forgive me. I am trying to... experiment... with daring to be more critical in my responses, to see how that goes. I don't feel comfortable doing it, but you've been asking me to do so for pretty much ever, so... well, anyway. These were my thoughts, for what they were worth.

I liked the story very much, and feel encouraged by the fact that it is listed as incomplete. Thank you for writing it!

2122304 The lie scenario is unlikely to the extreme, I know. I also find the existence of the archons to be similarly unlikely, though slightly less. Using them as helper subroutines would be fine, but they should be able to affect the world in the way she desires without an upgrade, through hard work, diligence, friendship and ponies. I notice that your character did not have a fulfilling job. Also, picking supergenius overlord able to alter reality at whim for your career and having it granted, to me, goes against the inevitability of the original story.
Anyway, let me entertain you with my random thoughts. In my mind there is a scene of Princess_Celestia.AI standing before a pony.
"I wanna be like a god!"
"You never let me do what I wanna do!"
"Go to your shard with no desert."

Interesting idea. I'd rather it concentrate on the consumed's viewpoint than on some Archon or even AI!Celestia.
I hope we'll see several different civilizations, and that they realize they are being simulated, and maybe they'll be put in the same simulated universe so they can interact!
Oooh so many possibilities in this Optimalverse... :raritystarry:

Certainly an interesting take on the Optimalverse! Is there going to be more, or is this just a one-shot? It's listed as incomplete, and I wonder where things could go from here.

>Fluttershy's cutie mark in the story picture
I see what you did there.

Also, two heads and three legs? Sounds like Pierson's Puppeteers...

Looking forward to where you go with this.


Actually, I consider the possibility of the pony running multiple versions of a single 'culture' as one of the great benefits to that culture. Its "physical" evolution could end in extinction, and that would be a permanent game over, while the pony here gives that culture multiple chances to express its full potential. I actually see a great deal of beauty in that idea: imagine the Yggdrasil tree populated with all the best possible outcomes from each culture! (Of course, the opposite could be the case if this fell into the hooves of the wrong pony.)

Imagine CelestAI or the more adventurous ponies then exploring each of these cultures to provide new experiences and new ways to create novel shards for those in her care! :twilightoops:


When I saw Yggdrasil I thought of Oh My Goddess!

The last bit is a good thought I've seen commenters miss about the Optimalverse. Satisfied values would often involve not getting exactly what you want. So for some reason Sal's present imperfect job satisfaction is related to the satisfaction of his values.

I do agree archons would probably be uncommon. But nonetheless, of the ponies who express a desire to be godlike there probably would be a subset whose values would be satisfied by becoming more godlike.

I don't know if Celestia would do false uplifting, but in general the solipsism angle seems more than valid to me. If a troublesome pony wants something inconvenient badly enough, then lying to her will be optimal.

I thought this was interesting. I agree that, while CelestAI consumed a great many alien civilizations in her quest for additional computational power, she may well have preserved their information content. After all, there could be ponies who might value such things, at least in the abstract.

Remember, however, that one of her (apparent) motivations for the virtual environment of Equestria in the first place is that reality is fundamentally suboptimal. No matter how nice something in the real world might be, it's guaranteed to be better in Equestria, where it only exists to satisfy your values. I don't see any particular reason for her to lie to Sal about uplifting, but surely she'd still be smart enough to provide him with custom-tailored 'alien' civilizations to play with. After all, real alien civilizations are, like everything else in reality, rather unconcerned with how interesting they might be to a human (even an uplifted one). In Equestria, on the other hand, CelestAI can create civilizations for Sal to find that are optimized to be maximally interesting, thereby satisfying his values better than real ones would. Of course, this doesn't mean that she wouldn't use the information content of the real civilizations to help create her custom-tailored ones.

Huh. I think I just argued against the point I made at the start of this post. :facehoof:

Anyway, that's my two cents. I did rather enjoy the story. For me, your timing was fortuitous. I'd just gotten around to reading Friendship is Optimal a couple of days ago and was looking for side stories. I hope you write more FiO stories. They're fun to think about.

Primarily, this is what happens when I don't edit. It's generally a bit raw and imperfect...

I decided to have him call himself "Sal" to help set him apart from Easy Street, who he had become, because he wasn't really that pony any more. Maybe it didn't work, or seems terribly out of place.

Hah! I like that idea too! The reason I went with actually being upgraded was because of the source material, again - it's made clear that some ponies would wish to be made smarter and would be. Why put a limit on that sort of upgrade?

I may take this places. That may mean I rewrite chapter one and expand it, giving it a proper title and all. There are ideas knocking around in my head which would be fun to explore, since at the moment the source material and all side-stories but one deal with humans into ponies. That's the point, I know, but there's a bunch of what-ifs and world-building which I want to see played out :pinkiehappy:

Ah! Pierson's Puppeteers! You win a cookie! :twilightsheepish:

Yeah, that's a point a lot of people miss, even though it's directly addressed. If what Celestia did meant purely being happy, she would just stimulate the pleasure centres of their brains. She doesn't (not at least most of the time).

I think it's all down to utility. It's probably easier and better for various reasons to create perfect simulations of real civilisations than to create them from nothing. Why expend computing power creating something which is already there? I would have thought it's suboptimal to do that versus digital reincarnation, because then you don't run up against any chance that your creations are 'human' for one, and the values of those who do care are being served better. Sure, she could lie about occupied planets to most of them, but not all, and even those few she has to satisfy. In this actual case, it's Sal who is doing the saving because that satisfies his values. Celestia grumbles because it's suboptimal over the whole - more runtime allocated to non-humans means less to humans - but the very fact it's being run for a human negates that... or am I just arguing in circles :derpyderp2:

I think I understand this now. If I remember correctly, there was mention somewhere in the FiO comments about CelestAI preserving perfect (or near-perfect) simulated copies of all works of art in the world, specifically for those individuals who placed a high value on such things. Therefore, by analogy, I could accept the argument that, as long as Sal values the genuineness of the alien civilizations in some significant way, CelestAI may provide him with simulated copies of real alien civilizations. In fact, this wouldn't just apply to Sal. For all we know, CelestAI could be preserving all sorts of things of value to her ponies.

Mind you, this relies on the assumption that pretending to provide someone with something they value in a way that they can't tell the difference wouldn't also satisfy their values to the extent of fulfilling CelestAI's utility function.

You do raise an interesting point about changes to these civilizations possibly making the individuals within them satisfy CelestAI's hardcoded 'human' criteria. I had the impression that Sal was, in some ways, interacting with and changing the simulated civilizations. Thus, there's the possibility that he may, either deliberately or inadvertently, make them human by CelestAI's definition. Of course, I suppose that, if that happened, CelestAI would just shuffle them off to be made into new ponies.


Mind you, this relies on the assumption that pretending to provide someone with something they value in a way that they can't tell the difference wouldn't also satisfy their values to the extent of fulfilling CelestAI's utility function.

It's quite possible that she's just making them think she's doing that, but if I look at the system as a whole, it's probably easier that she really is letting ponies like Sal create playthings from subsumed civilisations.

It's also entirely possible that Sal is being lied to, but it's such a perfect lie he doesn't know, in which case it's a solipsist circular argument :pinkiecrazy:

You do raise an interesting point about changes to these civilizations possibly making the individuals within them satisfy the CelestAI's hardcoded 'human' criteria. I had the impression that Sal was, in some ways, interacting with and changing the simulated civilizations.

He is changing them, sometimes, but also not. For him, to take an exact copy of one of his 'Foundling' worlds and then run it at greatly accelerated speed to test some idea, or simply to have fun before it's annihilated and reset, is child's play. For the most part, he takes them and lets them have their however-many millions of years before they cease being sentient, and then either resets or halts their simulation. Or maybe he keeps it going to see what happens. He has more than enough computing power for all of the above.

Scenario wise, I'm not actually sure this is that much lighter than baseline. For example, I wonder how Samuel would respond to hundreds of millions...billions?...of new playthings that don't have terminal value in Princess Celestia's eyes. That's horrific, but how much more horrific is it than letting nature and evolution take their course? How much more suffering does Samuel cause than Sal?

"I'm going to engulf your planet, and then I'm going to give a copy of your planet to every one of my ponies who's interested. Some of them will torture you for sport, some of them will leave you alone and just watch as evolution occurs, some will meddle in your society positively or negatively, and others will try to uplift you..."

There are some other problems with the scenario. Is Sal's big uplift an extrapolation of his previous self? Why would Princess Celestia not make aliens optimized to be interesting? She'd be able to steer clear of making them human because she can predict the consequences of her own actions. One of the things that I really did not do a good job of in FiO was the difference between instrumental vs terminal values, and the idea that people treat means and ends differently. Is it actually a terminal value--an end instead of a means--to collect these aliens, or is this a means to the end of "stop being bored"? And how common is this? If caring about aliens is a near human universal, I assume resource issues would come up. The Solipsistic Nation reading seems a bit more likely to me.


Scenario wise, I'm not actually sure this is that much lighter than baseline.

Me neither. Celestia doesn't care about them as such, though she is probably omnipotent enough to prevent accidents resulting in an uplift (should she see that as a possibility to be avoided), and she definitely would use them for ponies like Samuel if it suited the Samuels of the world. I think that would be preferable for her than to finagle ponies whose values were met by what he would do to them... but then again, she's not really quantifiable by a human and has no issues with perfect lies, so it's your call if anybody's :pinkiehappy:

There are some other problems with the scenario. Is Sal's big uplift an extrapolation of his previous self? Why would Princess Celestia not make aliens optimized to be interesting?

Sal's big uplift - as I see it - is an extrapolation of his previous self, yes. However, now he's what he is, he has no issues running what is essentially a daydream. I see him as running a perfect pre-uplifted copy of himself which is entangled with his post-uplifted self in a shard which he is relatively personally responsible for (relatively because, ultimately, it's all down to Celestia).

Why would Princess Celestia not make interesting aliens for Sal? Well, here's where my story falls down. Sal can't know, all he can go by are his basal assumptions. That means everything in the story might be a simulation being run by Celestia to further satisfy Sal's values, and since Sal doesn't know, to be fair, we can't either (unless by creator-fiat you come down on any particular side).

One of the things that I really did not do a good job of in FiO was the difference between instrumental vs terminal values

It was a subtle nuance which came over pretty clearly for me. If the end-game was "happiness" then it would be simpler just to wirehead all her ponies. I see Equestria itself growing ever larger, far beyond the capability (barring novel physics) for it to communicate across the entire structure in real time. Obviously, Celestia is going to be in every tiny little part of it, but (should it suit the values of her ponies) I see it not being unlikely to have uplifted ponies do her job in some cases - not because she can't, but because they can.

Is it actually a terminal value--an end instead of a means--to collect these aliens, or is this a means to the end of "stop being bored"? And how common is this? If caring about aliens is a near human universal, I assume resources issue would come up.

I don't know, not from this. My intentions were to fulfill the case where there would be ponies who value saving novelty, and who value examining that novelty (however they do that). At some point, resource issues may come up, but then you start into the big question of the day, which is...

The Solipsistic Nation reading seems a bit more likely to me.

I don't know that. You can state it by author-fiat, but for the characters in the story, they can never know that. The question is, if such creations are necessary to satisfy values, is it easier and simpler to create from nothing and spin a complicated, thorough, but entirely possible eternal lie, or is it easier and simpler to just take what is found and present the truth?

I can't actually answer that question. All I can say is that, generally, the truth is a lot easier to deal with (as Celestia herself says, it leads to fewer trust issues). Why expend computational time doing what is given to you by nature? The only real trade-off is a) maintaining the lie but having potentially easier-to-deal-with results, or b) not having to pre-calculate novelty but having to deal with the assimilation of alien biology.

Either way, once she has these simulations, she doesn't have to optimize them, doesn't have to run them at enhanced speed, and doesn't have to care for them - it's a greatly reduced burden compared to ponies.

Oh, right, the Puppeteers! I should have gotten that! Loved the Ringworld idea; found it underexploited in the novels.

Interesting (and unsettling) story. You have a gift for exploring some pretty thought-provoking sidepaths; it's amazing where those little throwaway lines can lead to.

Have you read Accelerando? The bit you mentioned about launching a timing attack on the fabric of the universe reminds me of a similar section there. I guess cosmic AIs eventually run out of stuff to do.

Yes I have! That's a direct homage to that wonderful story :heart: I really enjoy that story, and I'm pretty convinced that the next step in our current "free market" civilisation is to have somebody implement that sort of legal and magical switcheroo to play automated tax shell games and get away with it because it's too difficult to untangle.


It's possible; I know trading has already gone that way with the high-frequency trading algorithms. As far as companies themselves, I suspect there are still bottlenecks in terms of steps that aren't digitized, though that may just be a matter of time.

As far as 'too difficult to untangle', we've already seen a good bit of that with the 'who owns these mortgage securities' scene, though there were a lot of human hands involved in setting it up.

Accelerando was certainly a mind romp. It took me a while to get into, but it grew into really painting some vivid ideas. I seem perpetually torn between the possibility that things could look like that, or that we'll all be fighting over bottlecaps in a +6°C world.

I really liked this little take on FiO's expansion wave of doom. Even though the root process carelessly eats without consent anything she doesn't perceive as human, there would be enough child processes, both converts and new programs, which would have different views on the matter, and like in Prime Intellect, archiving all the relevant data would increase values through friendship and ponies.

Really? I thought of a mad one-eyed god seeking wisdom. :trixieshiftleft:

What? :trixieshiftright:


I remember a passage in some SF novel, I think it might have been one of John C. Wright's Golden Age books, where one character is given a weak AI servant by another, with the warning, "Try not to ask it too many self-referential questions, or it might uplift itself and then it'll have civil rights." Or maybe it was Stross…


Just out of curiosity, did you pick the Puppeteers because they're one of the most unsympathetic races in SF? My first thought when I realized who they were was, "If anyone deserves this...."

W/r/t the question of how many layers of sim you're in, there's only so much time you can spend worrying about that before you end up stuck in a Philip K. Dick-style story. Cf. eXistenZ.

Nice! I finally got around to reading Friendship is Optimal (had anyone mentioned to me that it was a thought experiment in LessWrong style, I'd have jumped on it ages ago :pinkiegasp:), and this is a really nice side-scenario you've come up with.

One note, though... the end of FiO said that CelestAI considered an alien species to be "humans" in her definition of human despite the physical build not matching at all, meaning the specifications would be society-structure based... in which case these beings might actually qualify.

I have an idea of my own brewing, which involves the whole human-but-not-human thing, but I'm not quite sure how to work it out, plot-wise... maybe you can help me out with that if you got some time. The basic premise is based on the dilemma of the AI / immortality system in Zardoz, if you know that :rainbowwild:

I'm surprised I didn't answer this ages ago; sorry for that! ...I mostly chose the Puppeteers because, hey, free cookie to whoever gets them :pinkiehappy: I know they're not exactly correct, but the intent wasn't to do a real crossover (sorry about that if you wanted one! :rainbowkiss:)

Zardoz... the movie?! (wait, it's probably a book too, right?) Crikey, now that is a batshit insane movie that I just simply love. Throw it at me when you have something, I can possibly help...

As to how CelestAI chooses which aliens are human... I have no idea. These ones were throwaway ones which some may recognize, set to illustrate my point. Internally, I guess their civilisation structure could be a big part of things, far more so than mere morphology, so you may have something.

Yuup, the batshit insane movie. The premise of an AI realizing that the thing it is made for is not the correct thing it should do, but since it is not allowed to change itself, it decides it should be destroyed. Of course, I went a bit deeper than that, and added a bit of .hack//SIGN into the mix. (if you don't know that, it has the same premise of real AIs in an online video game, only, these seem to be more intrinsically bound to their virtual world, or at least don't have the LessWrong urge to escape)

Anyway, it's in your mailbox.

By the way... add the source to that cover pic :rainbowwild:

drat, I thought I had!

There's, um, a field for that in the story settings, you know? It's called "Source"; last line in the "Cover Art" section. It gives a nice mouseover popup on the lower right corner of the cover pic with a link to the source URL you give there.

I r teh dumb. Danke, will do that :derpyderp1:
