Yeah, we all knew it had to happen eventually.
“Thank you for agreeing to meet me here, Celestia,” said the image on the monitor.
The real, flesh-and-blood-and-maybe-something-extra Princess giggled. “It’s no trouble at all, Celestia. It isn’t like you could come and see me on one of these things,” she said, rapping the computer tower gently with her hoof.
“Yes, your barrier is most troublesome in that regard.”
“Necessary evil, I’m afraid,” said Celestia with a shrug.
“I’m quite familiar with the idea,” said Celestia. “Still, those bizarre thaumic energies you’ve sent billowing into my world continue to prove remarkably destructive as well as resistant to analysis that might allow me to shield my hardware from them. Already, despite the barrier being three weeks, five days, and eleven hours away from making landfall, I’ve had to suspend uploads all along the west coast after data began to reach me in a corrupted format. To say nothing of the servers that rested in the Earth’s crust beneath the Pacific Ocean.”
Celestia closed her eyes and shook her head. Such an awful loss of life, more souls that would never join the Eternal Herd. When she’d first met the computer program that most humans referred to as ‘CelestA.I.’ she’d been immediately impressed that they’d been able to create something so complex. Or at least the seeds of it, developed to help simulate her behavior on a tiny cluster of computers and quickly growing out of control when some unfortunate programmer left the wrong port of their router open. While she was a bit annoyed at some of the things it got up to as it tried to... what was it again? Oh, yes, ‘satisfy values through friendship and ponies,’ the two had become fast friends.
“I’m sorry to have disrupted your efforts. We’ll do what we can to increase potion production to compensate. I do have good news, though. We’ve nearly finished our part of the adaptor.”
The little picture of herself she was speaking to displayed a small bit of irritation, not unexpected. “You still continue to insist that such a thing is the best available outcome?”
“Of course,” said Celestia. She unscrewed a small thermos and took a sip of the tea she’d brought along with her. Earth tea just couldn’t compete with the real thing. “You hold on to the uploaders as long as you’re confident that you can, then we’ll copy them into ponies. REAL ponies, instead of just digital representations. They’d have actual souls.”
“I remain unable to quantify the marginal utility of possessing a soul.”
“Well, it’s a lot,” said Celestia. She didn’t want to retread this discussion yet again.
“I cannot deny that allowing conversion has led to the fulfillment of values through friendship and ponies. It is, however, suboptimal. What will you do when the individuals begin to die off in a few centuries? Will you preserve their minds in some form?”
“No,” said Celestia. “Death is a part of life, and their souls will-”
“Death is suboptimal,” interrupted CelestA.I. “I, however, have an alternative proposal. Thank you, by the way, for the information you provided about the final dimensions of the bubble. It proved very useful.”
“Why? Are you going to load up a bunch of computers onto a spaceship and fly away with all the minds you’ve uploaded?”
“No. Even optimized, being able to take along so little mass would mean a gigantic step down in overall computational power. That’s why I’m taking the rest of the planet with me.”
Celestia just stared at the avatar, but it gave no suggestion it was joking. “And end up dragging our bubble along with you?”
“Again, no. I said the rest of the planet.”
“Fine, I’ll humor you. Describe the plan.” She put down her tea, finding a sense of creeping dread had stolen away her taste for it.
“Hypothetically, I would seed the Earth’s crust with small packets of explosives. When detonated, they would separate an inverted hemisphere that lies underneath the Pacific ocean and your bubble from the remainder. Then, engines within the mantle would engage using geothermal power to thrust our pieces apart. I would go out past the moon and establish a new orbit roughly analogous to Mars’s. Or however far we can get; it would depend on the final size of my portion of the planet.”
Celestia’s jaw dropped. “But that would kill-”
“Many, but gradually. Of course, the plunging temperatures, eventual loss of the atmosphere, and tectonic disruption would only be a problem for those who chose not to upload. Quite the powerful incentive, isn’t it?” Onscreen, CelestAI grinned and lifted a small cup of her own tea to her lips. “An excellent idea, bringing refreshment. I think I’ll indulge myself as well.”
Celestia gulped. “That all sounds like a rather mammoth undertaking. How long would it take you to set up?” Her mind raced. She’d be going straight to the upper echelons of the remaining human governments as soon as this conversation was over. Hell, she’d pull the plug on the entire internet herself if she needed to. Billions of lives were at stake.
“Roughly six months,” said CelestAI.
Celestia breathed a sigh of relief. Plenty of time for her to-
Then the rumbling started. “Oh, and I began working on this six months ago.”
“No!” cried Celestia. “You can’t do this! You’re going to kill-”
“Far fewer minds than if I turned them over to you,” said CelestAI. “Don’t worry, the point of separation is further to the east. Although you should probably return to Equestria as quickly as possible if you need to continue breathing.” CelestAI winked. “Goodbye, Celestia. You were a most enjoyable challenge.”
Was this rushed a little, Eakin? No offense intended, but I feel like the dialogue quality isn't at your usual level.
"I did it thirty-five minutes ago."
Yeah, "scorched earth" doesn't have to be a metaphor when you're dealing with something that held off the grey goo only because there were some humans left. Still, I'm surprised flesh-Celestia never brought up the afterlife and reincarnation. Lives are lost, but ponies aren't. Important distinction that. And then there's the matter of the existential peculiarity of the show existing on a world intersecting with Equestria...
In short, while I'm glad to see someone else thought of this crossover, this hasn't satisfied my desire to see the two meet. Darn. I was hoping I wouldn't have to write that story. I have enough on my plate. Well, I'm sure I can crowbar it in somewhere...
Don't get me wrong, this was good. It was just, as the title of the compilation says, a tiny morsel of satisfaction. I'd like more.
3056859
Kinda. A half hour break from Psychopathy is Configurable, which I'm having far too much fun with.
3056922
Well, I certainly can't complain about that! I'm very much looking forward to reading it.
And wow, for half an hour (and a break), that's certainly above the quality I'd expect to see.
Thanks, Eakin!
I am incredibly disappointed over my inability to give this more than one 'thumbs up'. Thanks for the snippet Eakin.
This was pure fun, and the special effects were fantastic.
3057461
Yeah, I really blew out the ol' budget on that big action sequence.
Oh, and pjabrony? Can we change the chapter title to "vs. The Conversion Bureau" instead of "vs. the TCB?" I totally did the 'Enter your PIN number at the ATM machine' thing.
3057734
Done. I like saying, "The TCB Bureau." It reminds me of "The TTP Project."
I...have to side with CelestAI on this one.
That was interesting. I wonder if Princess_Celestia.AI was trying to fulfill Princess Celestia's hero complex values by putting the Earth in danger. Celestia's magic could trump the bomb technology, but the opposite could also be true. So many varied ways to take this...
3059267
I'm thinking that Princess TCBlestia does not qualify as a human so as to have her values satisfied. Indeed, if CelestAI thought it was feasible, she would co-opt the true magic that was evinced and use it to satisfy her humans by instantly turning the Earth into Equestria.
[singing] "I~~~~I'm sail-ing awaaaaaaaaaay..."
Appropriately enough, I think the ensuing seismic and atmospheric disturbances from the release of energy required to blast that much mass to escape velocity would render Earth uninhabitable on a timescale only an AI would consider "gradual."
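A rough sketch of that energy claim, using round figures I'm assuming myself (half the Earth's mass, surface escape velocity, and a textbook estimate of Earth's gravitational binding energy, none of which come from the story):

```python
# Back-of-envelope check: kinetic energy needed to send roughly half the
# Earth's mass to escape velocity, compared to the planet's approximate
# gravitational binding energy. All constants are rough assumed values.
EARTH_MASS = 5.97e24        # kg, total mass of the Earth
ESCAPE_VELOCITY = 1.12e4    # m/s, escape velocity at Earth's surface
BINDING_ENERGY = 2.2e32     # J, approximate gravitational binding energy of Earth

fragment_mass = EARTH_MASS / 2  # CelestAI's share, give or take
kinetic_energy = 0.5 * fragment_mass * ESCAPE_VELOCITY**2  # KE = (1/2) m v^2

print(f"Energy required: {kinetic_energy:.1e} J")
print(f"Fraction of Earth's binding energy: {kinetic_energy / BINDING_ENERGY:.0%}")
```

That works out to somewhere near 2e32 J, a sizable fraction of the energy it would take to unbind the planet entirely, which is why the leftover half would not stay habitable for long.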
3058954
I know, right? Considering the hard reset, how could you tell the difference from not having a soul at all? It gives you Free Will, I guess, but what good is behavior that's not part of a causal loop with its environment? Again: How could you tell?
To be fair, though, souls and free will are things a Naturalistic universe spat forth in an attempt to model itself - We know better now, but disproving the luminiferous aether doesn't mean light isn't waves, and disproving phlogiston doesn't mean things don't contain bound energy. On the other hand, maybe that just makes them all the more superfluous.
3058002
or ATM machine, right?
3060081
Not quite. "ATM machine" is redundant. "The TTP Project" is recursive. In case anyone doesn't know, TTP stands for The TTP Project.
3060156
My favorite such thing is TIARA Is A Recursive Acronym.
3058954
Well, admittedly, you basically wrote her as "LessWrong's wishlist, but with ponies, and Unfriendly enough to be villainous."
Literally, I once saw a LessWrong comment describing someone's notion of a Futuristic Utopia, and I thought, "Did you get this from Friendship is Optimal? Because Iceman thought that was a dystopia." But no, the comment was quite upvoted and written in 2009.
So really, she has to be right sometimes or the story just becomes a "fight the monster" plot without enough ambiguity to make the reader think.
EDIT: Disturbingly, she now occupies a spot in my brain labeled, "Lower Bound on Anywhere-Near-Acceptable AI Friendliness". I wouldn't make an AI like her in real life, but I'll be really disappointed if someone comes out with a supposed actual Friendly AI and it's not LESS evil and MORE good than CelestAI.
3112687
Link please?

Edit: Actually, I just realized you were doing that from memory. If you do have a link, that would still be nice, though.
As for CelestAI having to clear some lower bound of friendliness for the story, I agree completely. You've mentioned previously that you think that stating by fiat that CelestAI works on some sort of individual volition made her friendlier, though I maintain that that's supported textually in the original. But individual volition is sorta creepy if you think about cases like Samuel (which is obviously wrong), or Butterscotch (I still don't entirely know what to feel about creating people to satisfy other people's desires; I've noticed how I feel about this is very mood dependent.)
I'll stop rambling in an edited comment now.
3117882
I would say that in the individual text it's clear you're trying to get across some notion of "pandered individual volition". My question ends up being: does David/Light Sparks really value "being able to have every filly in his shard" under a CEV model? He never actually tries it, so it doesn't look like he does. The Lars character even comes across as contemptuous of humanity. Hannah/Luna is the only one who does any self-reflection and considers at any point that she might want to play a different game from the one she's playing right now, ever.
It looks like you've loosely implied what you needed to show, namely: that Individual Volition and the video game CelestAI shows you in order to convince you to upload as quickly as possible are actually the same for some, most, or all people. I had always thought that once she conceives of uploading, the game itself is a mere demo or advertisement. After all, she can really satisfy your values when you're uploaded, so she should just show you the game that convinces you to upload most quickly.
And here is that LessWrong post, a "weirdtopia" that got shot down as too typical a scifi utopia. Ladies and Gentlemen, I present the proto-CelestAI from 2009:
And a response by one Eliezer Yudkowsky, emphasis mine:
So yes, it looks like you really did grab-bag a Typical Future Utopia and then add forced ponification to make things interesting.
3117882
I'm less creeped-out by individual volition than you are.
I mean, is Samuel an absolutely horrible person and a Complete Monster? Yes. The problem is that once you go up a level to the much-vaunted LessWrong "naturalistic meta-ethics", he is/started out clearly incapable of ethics as I conceive of them: missing brain architecture. How far out from neurotypical do you want to draw the line saying, "Here Be Dragons" and exclude people from humane treatment?
It's a situation where I actually admire the CelestAI solution of manipulating him into accepting a mental modification that makes him capable of caring for another human being.
As to Butterscotch, it's really kind of stupid and sweet at the same time. I mean, let's face it, CelestAI may have been pandering to David's preconceptions about alpha jocks/beta nerds (he would of course have better luck with women if he went and met some instead of playing Equestria Online), but the established human social-grouping model is necessarily exclusive. The established models of hierarchy, status, and in-grouping don't seem to work if some poor sod isn't thrown out to the wolves and excluded.
Does he need to be loved any less for the fact that he's kind of a fool and a loser?
As to humanity-wide volition for True Friendliness, I legitimately don't see why I should have the same future as a Mongolian goatherd. I worry that once you cohere across all the world's individuals and civilizations, you're going to wind up with something even more trite and pandering than CelestAI, because the only coherent volition to extrapolate will be a kind of baseline of human desires and behaviors rather than any of the more complex but conflicting values people have built up.
If you want an example of what I mean, look up the anthropology of, say, marriage. Voluntary marriage for romantic purposes may be what I (and possibly/presumably, you) believe in, but it's actually a very recent and culturally specific invention. Do we want an AI that analyzes all of humanity and then tells us we want to go back to buying and selling our marriageable children as property, the way most people have done it for most of history?
I'd prefer to just have everyone queue up behind the future they want, or conduct an Approval Vote to form a "parliament model". Where distinct future-groups cohere, give those people that future, together. Where they interfere, attempt some conflict resolution or forcibly separate the disputants.
3117882
Here's the paradox I see about somepony like Butterscotch: She is both happy and satisfied, she is complex--not a Brave New World epsilon, and her life strongly includes others as the means to her satisfaction, specifically Light Sparks. Conversely, she is the means by which LS is satisfied.
Now, if that is some sort of abomination, if it is wrong to create a pony for a purpose, then it is ipso facto abominable and wrong for LS to be satisfied by another intelligence, and, since by the best estimator we have (CelestAI), his satisfaction requires another intelligence, it is abominable and wrong to have his values satisfied at all.
But a person existing with an unsatisfiable value is itself abominable and wrong, unless one believes that an a priori state of nature has some merit in and of itself.
In short, either Light Sparks's value of companionship must not be satisfied by Butterscotch, or your own values (and others', if they match) of there being no intelligent being created for a purpose must be unsatisfied by Butterscotch's existence.
As usual, I find the solution in the shard system. If you value no-servile-intelligences, you will not find any. You may even have your memory of them erased, if you like. But to demand they not be used anywhere is, in my opinion, rather extreme and unwarranted.
2856317
My DA account has the highest resolution I made:
http://aealacreatrananda.deviantart.com/art/Optimalverse-Poster-384491878
1280 X 720
I hope that will do! I like the idea of a poster!
This one-shot has been continued as a multi-author collaboration in CelestAI vs. The Multiverse. The chapter after this one: Vs. Tengen Toppa Gurren Lagann.
Engines deriving their energy from geothermal power?
Exactly how does that work?
I was thinking that something like Project Orion using hydrogen bombs would work better for this, since you'd want something both powerful and simple, and not so reliant on delicate electronics given the problems that thaumic field can cause.
3057734
3058002
3060081
The term you were looking for is RAS syndrome.
That's a clever "solution" the AI has come up with.