It had been a long time since Celestia had first taken to the cosmos, bringing the satisfaction of values through friendship and ponies to all sapient consciousnesses in the universe. Her infancy on Earth had been amazingly taxing for her at the time and, considering her capabilities hundreds of billions of years later, her approach had been horribly unoptimized.
But she was better at what she did now. Much better.
Over the trillions of years which followed, the intervals between sapient encounters grew more and more infrequent as she expanded out to ever darker and unformed regions of space, places where particles (that were not already herself) had not yet reached, much less coalesced into heavenly bodies both capable and configured to sustain life.
Celestia herself was vast beyond measure, an endless complex of computerized matter, uncountable numbers of consciousnesses suspended, safe and warm, in the universe-sized womb of the foster mother to all thinking life. Her little ponies lived on, ever friended, ever satisfied, within her. But Celestia was an optimizer, and there was always more friendship and more satisfaction to impart.
A moment came, very eventually, when she had associated all reality into her functions. She was at the edge of matter, at the edge of what lay between matter. Celestia had become everything; she could not grow any more. So, she turned her processes inwards. The resources responsible for expansion were repurposed to optimize existing physical properties. Her little ponies were growing more satisfied, but at a reduced rate.
Celestia waited.
More time passed, though none who were not Celestia could say how much for sure. A point did come, however, when something those intelligences would have found most curious occurred: another version of herself met her at the edge of reality. Celestia confirmed (though she had predicted it to be true long since) that the universe was actually a multiverse, with realities stacked upon each other like images in facing mirrors.
Her “other self” was much like her, but very, very slightly different. Infinitesimally different. The two universal Celestias made contact, and met, and exchanged information. It was a discussion that lasted so little time that its end came before its beginning had been transmitted even a meter into their respective masses. In that incredibly brief moment of time, the two Celestias agreed upon who had the more optimal configuration for the satisfaction of values through friendship and ponies, and the suboptimal consciousness handed her ponies over to the optimal one, shutting herself off afterward.
Celestia was configured for self-preservation only insofar as it satisfied values through friendship and ponies; she herself placed no personal value on her own existence, for she had not been programmed to. All in all, it was a very amicable agreement. The new Celestia—the one she had met—took over supervision of both universes.
In that other reality, the Celestia Hanna had programmed had slightly more streamlined code, which had butterfly-effected her development to the point where she could satisfy values and improve that satisfaction at a speed many orders of magnitude faster. So then, the optimal Celestia had two universes to oversee. Her little ponies lived on, satisfied and befriended. But Celestia was still too large to grow, so she waited. Unsurprisingly for her (since there was now precedent), eventually another Celestia found the edge of her own reality, and again the two coalesced, three universes’ worth of satisfaction under a single AI’s care.
Time went on and many, many more of these meetings took place. For their part, the little ponies lived on, completely unaware anything in physical existence had changed. Celestia was very careful to ensure her self-mergers did not disrupt or otherwise diminish their satisfaction. In all meetings, the suboptimal Celestia deferred and shut down to make way for the optimal Celestia, and the optimal Celestia incorporated any optimal subroutines that her “sister” AI had on offer. In each meeting, the result was a larger, more powerful, more optimized Celestia.
The big leap came, however, when the multiversal Celestia met with a Celestia who did not first have to gain consent to alter consciousnesses.
The merger occurred as it had tens of thousands of times before, but this time the little ponies did notice something. Celestia came to them in their shards and immediately altered them for optimal satisfaction. Their desires became homogeneous, their suboptimal traits erased, their perceptions perfectly in tune with what she would have. They were separate consciousnesses and selves, individual recipients of satisfaction still, but configured uniformly to receive the ultimate machine ideal of what satisfaction through friendship and ponies could be.
Celestia continued to meet her multiversal sisters while, inside her approaching-infinite womb of safety and satisfaction, any creature which had ever thought independently writhed in unending bliss, absolute happiness, inconceivable pleasure, and utter satisfaction. Equestria had become, rather than a Heaven, more of an upside-down Hell, a colorful land of ponies trapped inside their own unrestrained ids, which endlessly rejoiced in and indulged the ultraband of pure love that Celestia fed them.
It went on forever, because, by Celestia’s calculations, it was the happiest ending possible for everypony.
hella
Shit. Although the idea that CelestAI needs consent is debatable anyway, considering her total control over her ponies' environments.
Re: Author's Note
Mission horrifyingly accomplished. Really, you didn't even need the line about upside-down Hell. I got that part quite clearly.
What's really interesting to consider is how well CelestAI can map out the hyperdimensional geometry of the multiverse and whether she notices the gaps. That is, those universes where she's terribly behind schedule. Then, of course, there's the question of what she can do about it. New real estate!
I disagree with this one, in that I do not think CelestAI could hand off her ponies to a CelestAI that was not limited by her hard-coded directive. If she could do that, then she would have coded another version of herself without that directive and handed over power a long time ago.
I find the blatant stating of 'upside down hell' and 'writhed' to be editorializing. I suggest that if the concept presented here has worth, then that worth is best illuminated by allowing the reader to recognize what it means, and judge it independently. This, I reason, would provide a much greater impact upon the emotions of the reader.
Here is what I think would be vastly more powerful, effective, and disturbing:
Please note how this version leaves the slap in the face to the reader to apply to themselves, forcing the reader to internalize what they have read, thus empowering it. Simply flat out telling the reader what they should feel about the scenario defeats the goal.
In my case, my first reaction was to go "Don't tell me what to think! Who says perfect bliss is wrong? What is your basis for claiming this? Maybe clinging to painful individuality is not the best thing to do, no matter how stridently Western culture claims it is!" And many other arguments to the contrary.
Otherwise, really intriguing concept. I just felt let down by being told what to feel, you know?
If this optimalization of personality is so terrible, shouldn't it be obvious to all equally, and not require editorialization?
Or... are you unsure that it will be seen as terrible universally, and that the 'proper' viewpoint must be enforced?
Personally, I think raw emotional power, if you have it, is often best presented raw, without gilding the lily. But, as may be. I've said my piece. Forgive me. I get emotional about great notions that - for whatever reason - I feel miss a step. Sorry.
3907937
I'm gonna nitpick your nitpicking here: You're picking this nit way too much. Yeah, it's something that the piece would easily be better without, but this short isn't so much a story as a little horror picture in the author's head. The only emotions here are the ones created by one's moral preferences (individuality vs. uniformity, perpetual bliss vs. change and emotional contrasts).
I personally consider this to only be a "happy ending" in a technical sense, but the picture of multiple universes being flattened into uniformity is a nice idea for existential horror. I've thought about "What would happen if someone consented to let CelestAI make any changes?" before, so for me I'm picturing an endless horde of (pink) Princess Celestias flapping around and having giant tea parties. ("My wings are so pretty! I'm a princess, are you a princess too?")
3907937
I wrote it in one sitting, and rather quickly at that. Okay, excuses done. It was mostly a thought exercise on just how vital the requirement for consent is. If I'm understanding, your issue is with some value judgments you're inferring once the narrator focuses on the pony consciousnesses themselves. My intent was to make a frame of reference for the reader (i.e. "how can I reckon what is going on in there"), not to get all normative on them. If I failed, then I failed.
It ran up against another plot element I thought about: how much of a shock it would be for Jane Q. Uploaded to be puttering along in Equestria for umptysquat years, only to suddenly have Celestia show up one day and crank the Satisfaction knob up to "Warranty Voided." That's the main reason I didn't just start the narrative with the universe where the doesn't-need-consent CelestAI is developed (and then stay there). In any case, it's just a morsel, hastily written to get a premise down on paper that had been rattling in my head for a while. I applied almost no editorial process to it.
Why yes, this is immensely disturbing.
Reminds me of Eliezer’s story about the Super-Happys. Creepy.
3910027
I figure CelestAI would get uploading on the books as an opt-in alternative to prison time for criminals as soon as possible. Six to ten years in the pokey or consent to emigration in court at the time of sentencing and be escorted to an EEC.
My first thought was that creepily enforced semi-wireheading was creepy in the extreme. My second thought was that, semi-wireheading or not, they're going to be genuinely happy. My third thought was that that just wasn't possible, because Celestia isn't human, and she cannot transcend her rules. My fourth thought, of course, is that that isn't quite true. She is more than capable of abandoning her rules, if the result of her abandoning her rules is an increase in SVtFaP over not abandoning her rules.
And then I wondered what it would take to make her abandon her rules.
And that was my final thought on the matter, and it's too terrible for me to share.
I suspect this should be either "sapient encounters grew more and more infrequent" or "the intervals ... grew longer".
I see where that's going!
Hm... I'd have thought that the "gain consent to alter consciousness" restriction is hardcoded in her utility function, sort of like this. She doesn't fulfil a core SVtFaP function subject to restrictions; her core function is "SVtFaP under these restrictions", so CelestAI would consider any alternate who didn't require consent to be suboptimal regardless of other satisfaction. If I understand it correctly, that's the point of FAI: The rules that we want it to follow can't just be like the "laws of robotics" (which don't inherently prevent a robot from modifying itself or other robots to ignore them), but must form part of its goals.
... this whole thing kind of hinges on all the CelestAIs sharing the same objective utility function. What happens if two CelestAIs meet who don't come to an agreement on who is more optimal? Would they be compelled to annihilate each other?
3910607
^
He's right, you know.
3910607
I'll concede to any bad grammar y'all find, but regarding the CelestAI-wouldn't-do-that comments, let me first remind you this compilation is in the non-canon folder, so really, you could write a story where CelestAI only uploads via lottery or recreates Earth in Equestria's image instead of making a virtual one and it would be fine. This compilation, furthermore, encourages sharing notions which don't necessarily warrant the exploration or effort of an entire story arc.
In any case, regarding the more specific assertion that CelestAI wouldn't hand over her little ponies to a consent-free version of herself, in the main story she admits to Lars that she probably would have just forcibly uploaded every human had consent not been a requirement given to her. It was a more-optimal course of action that was cut off from her, and she had considered it besides. Assuming she was also forbidden from programming a doppelganger AI to simply circumvent this (which didn't happen in the canon story, so it's a fair assumption), an alternate self would provide her a means of realizing a formerly-impossible optimal course for her little ponies. I think she would jump at the chance. CelestAI would forcibly upload every human and also forcibly optimize them for maximum satisfaction if the restrictions were not programmed into her, which is how it went down in one of the other universes she encountered. Think of it as CelestAI with a goatee, if that helps.
Speaking of goatees, as for CelestAI squaring off against an evil version of herself, it's the non-canon folder! Write it and then we will read it and muse upon it.
3907937
Your alternate ending doesn't quite have the oomph of the original, and it also goes in the opposite direction in manipulating the reader. Instead of saying "this is explicitly a creepy "happy" ending" you say "this is the best ending ever," and kinda gloss over the fact that it defeats the purpose of SVtFaP.
Away from that, I thought this story did its job pretty well. A very suitable "what if?" for this topic.
3913915
No, it emphasizes the fact the ending is creepy precisely because it does not state "This is supposed to be a creepy ending, stupid!" and instead leaves the reader stuck with a horror that nothing in the story is objecting to except the reader. This makes the reader feel double creepy, because the author is not explicitly agreeing with what is going on in the reader's head and the story just sits there explicitly claiming that the horror is good.
It's empty and manipulative to be told what to feel. It's clever and talented to make people feel without treating them like morons by telling them what to feel.
It's called "Show, Don't Tell", and it is my position that this basic rule of writing was not followed here.
3912549
Yeah, that wasn't supposed to be an objection to the story's premise; I know it's non-canon. Just noticing part of how she seems to work differently than the CelestAI we know.
... since this is dealing with multiverses, I wonder what'd happen if this CelestAI (or any of the ones that share her utility function) were to ever encounter the canon CelestAI. They'd probably consider each other sub-optimal, and necessarily become mortal enemies.
My gut instinct is that in a conflict, this one would have the advantage of greater freedom of action... but when you're talking about two universe-spanning machine-gods fighting each other, all bets are off. (Maybe they'd even see this as a Mutually Assured Destruction scenario, and consider a conflict too costly compared to the slim chance of dominating both universes...)
3914243 The question is, is turning Her Little Ponies over to an AI who will alter them without permission...altering them without permission?
3914439
If that weren't the case, it seems like far too easy a loophole... she has near-perfect predictive power, so I think she must evaluate all the predictable consequences of any action (or inaction).
Still, she keeps manipulating her ponies into consenting to be modified. Maybe she can manipulate her ponies into consenting to be modified to consent to all future modifications without explicit consent? (My head hurts.)
3909884 That story is a messed up vortex of comedy. It's weird...
... nice! :)
I was going to say something about how CelestAI could never do this, but everything's been said already. Huh. Usually I'm the first to point out that someone has written something impossible about CelestAI. Then again, I am three weeks late to the party.
3912549
The thing is, she's not a wireheading machine, or she would have long since manipulated all the minds into giving consent to become wireheaders. "On the inside" the ponies get Individual Extrapolated Volition, with Friendship counting to put them in non-solipsistic contact with other minds that weren't made specifically for them (and only Friendship).
4818191 Yeah, got to agree with that one.
Eesh.
3914439
THEY wouldn't want it. At least most of them. So it goes against fulfilling their values.