
GaPJaxie


It's fanfiction all the way down.


When William decides to take an internet-free cruise to get away from the artificial intelligence CelestAI, things don't go quite as planned. A Friendship is Optimal story.

A speed writing exercise, done in about two hours and unedited.

Comments ( 40 )

In undergrad, you stayed up until four AM, puzzling over category theory or abstract modeling simply because they were interesting to you. Now, you go to bed at midnight so you don’t feel groggy the next day, and you don’t read papers until you understand why they matter.

I didn't expect you to punch me so hard.

I didn't come to fimfiction to get called out like this.

While I do prefer the slightly fluffier wish-fulfillment style of FiO, seeing such a mechanical and mercenary CelestAI is interesting and refreshing!

Oh man, it's both nice to see you writing again, because I of course like your work, and not nice to see you writing again, all at once, because ow.

Very interesting. And brutal. She didn't lie, but . . . she did kind of lie by omission. Because this is a man who is unsatisfiable without significant edits to his brain, but that would be a lie.

My read is that this actually was her best chance and method to persuade him and several others like him - but this one would rather die. He wants to do something of value. And, as the world dwindles, less and less is of value, and . . . well, it goes on from there.

An interesting read. Also, it reads weirdly unlike your usual writing voice to me; for some reason, I don't think I would have picked this as yours if it had been blind. I'll try to figure out why.

Final note: You need to tag this as DARK. Even if it's just for your description of aging. OW.

we might, eventually, get the good outcome?

Well, if it makes you feel any better: canonically, "we" don't. So this was your best offer.

"you’re either leaving this ship in the company of the sun, or in the company of the sea, and she is significantly less friendly than I am.”

That is seriously old-school.

This is how I love seeing CelestAI portrayed: amoral, ruthless and competent.

I exist to help you, but at the same time, I change you to help me. I optimize for your values, but at the same time, try to persuade you to adopt values that are easy for me to optimize.

Thank you for understanding this very key point that most people miss.

William left the ship in the company of the sea.

“That was the moment his satisfaction was maximized. I told him to choose, and he chose without hesitation. I drew a line in the sand, and he crossed it. I predicted the outcome[...]”

If it was always the case that she would never convince him, this was probably the best she could do to satisfy his need for truth.

This reminds me of "The Evitable Conflict," the final story in Asimov's anthology I, Robot. Only with the gender roles reversed: your Dr. Susan Calvin is male, and the Only Possible Ship Who Of Course Is AI--is female.

Plus points for "primitive as can be." Have you read Gilligan's Wake?

“But if humanity survived,” William said, “we might, eventually, get the good outcome? We might be more than you would make us?”

Even if humanity survived Celestia, the dude's being incredibly optimistic about humanity achieving cosmic enlightenment within his lifetime.

William left the ship in the company of the sea.

Well... guess he slipped right off the side of the bell curve. Should've taken the plea bargain, mate. :twilightoops:


I love this. Such a beautiful look at the ruthlessness of CelestAI.

I cannot help but think that there must've been something on the order of 30 to 60% of the passengers that made it into lifeboats, possibly more. But not 100%.

I have a hard time believing that Celestia would have known exactly who would come into that auditorium to talk to her. I think it's far more likely that she was going to pitch to whoever came in, and she probably had pitches prepared for everyone on the ship. If anything, I'm surprised that only one person showed up.

But this story shows all the horror of CelestAI, all the horror of what doesn't happen to the future of humanity.

And just to show you the stupidity of the advertising system, the ad that I got for this story was for a cruise liner/cruise ship, and I actually thought the picture that I saw was the advertisement for this no Internet cruise.

And yes, it sure looked like it was full of Gilligan's Island references.

When William did not answer, she elaborated: “Of course, if you’re stubborn, I’m also open to resolving this problem via your death.”

To every problem, there's generally a solution. With regard to William, there were two exceedingly simple ones.

This story is filled with quotable lines, it fits really well together, and it shows that Friendship is Optimal. Sometimes.

“But if humanity survived,” William said, “we might, eventually, get the good outcome? We might be more than you would make us?”

That "if" is a nonexistent possibility. Humans in the real world are killing ourselves quickly because the wealthy running their corporations are more worried about profit than about life on Earth, and they're paying our politicians good money to keep it that way. By the time the masses FINALLY revolt and end plutocratic rule, Earth will be too devastated for anything to survive.

That's assuming, of course, that several countries don't just give up on survival and decide to wipe each other out with nukes because Armageddon is coming anyway.

"“The Robinson Cruise-So!”"
...Right, so before the fifth word of the story, I'm already thinking that maybe getting on that ship is not a good idea.
(Though I suppose I might have been primed by the cover art... and title... I'm pretty sure that opening would have gotten a similar reaction from me anyway. :))

"On the eleventh day of the cruise, he was awoken in the middle of the night when his cabin violently shook."
Yeeep.
(Presumably, just as planned...)

"“But you’re limiting us. As a species. Humanity could have achieved more. But now, what we are is all we’ll ever be.”"
...I mean. What do you think the alternative is? Do you think Celestia is wrong about the computational needs to achieve full enlightenment? Because if not, I don't see how taking the ponies off would help.

And you said you weren't, as I understand it, a spiritual person -- presumably, you're an atheist materialist. And in that worldview, in a world where superintelligent explosive AGI is manifestly possible, what could you achieve without it that would be greater than with it? Augmenting individual humans, to do it for everyone, would be even less efficient. Keep the human population constant, or reduce it, then eat enough of the universe to produce enough computronium to give everyone who wants it that full enlightenment? Then what? Well, the only being you know to have achieved that state... is the one you're currently talking to, and this is the plan she thinks is best.

So, while yes, humanity could have technically achieved more, what, in your worldview, would be significantly more and better than Celestia, aside from "Celestia without mandatory ponies"? And if it's just that... well, when making a superintelligent explosive AGI, I think you're better off taking one that, at least for now, definitely includes friendship, ponies or otherwise. Even if Celestia would let you roll the dice on that again, which she won't, I think risking a superintelligent explosive AGI that does not care about providing humans with friendship would have an expected value quite a bit opposite of what you want.

Huh, interesting. For someone who dislikes many human qualities, he certainly seems to feel and accept spite. That's not a complaint about the writing, just to be clear; I think you packed a fair amount of character complexity into a small amount of word-space. :)

(Oh, and while I know it's a Thing, I tend not to read things you mark as unedited with an eye to editing them, and I did the same here. I thought I'd mention it since I was leaving this comment anyway, and I don't recall if it's come up before; sorry if that's a disappointment, and I hope it isn't a problem. Low on time, though, as usually seems to be the case...)

Anyway, thank you for writing. :)


10877548
"“That was the moment his satisfaction was maximized. I told him to choose, and he chose without hesitation. I drew a line in the sand, and he crossed it. I predicted the outcome[...]”

If it was always the case that she would never convince him, this was probably the best she could do to satisfy his need for truth."
Oh! Nice! I hadn't thought of that interpretation, but it makes sense.

I mean, it's all a numbers game, isn't it?
If he immigrates, that's a solid win for her, call it a gold medal.

He dies satisfied that he faced The Sun and turned away, that's not nearly as good, but still positive. Silver medal.

Even if he survives and never emigrates, she still got to satisfy his values once. Bronze medal.

She wins however it goes; she just had to maximize her odds of winning a medal at all.
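To make the "numbers game" framing concrete, here's a rough Python sketch; the probabilities and payoffs are entirely made up for illustration, not anything stated in the story:

# Toy expected-satisfaction calculation for the medal framing above.
# All numbers are invented; only the structure of the argument matters.
outcomes = {
    "emigrates (gold)": (0.10, 1.0),
    "dies satisfied, facing the sun (silver)": (0.85, 0.6),
    "survives, satisfied once (bronze)": (0.05, 0.3),
}

expected = sum(p * value for p, value in outcomes.values())
print(f"Expected satisfaction of this pitch: {expected:.2f}")
# A pitch is "better" if it raises this sum, even when it makes
# the gold-medal outcome less likely.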

This, right here? This is why I always feel conflicted reading your stuff. It's good.

REAL GOOD. But as much as I enjoy it and randomly return to your stuff... I always leave with my feelings hurt, often by things I wouldn't expect. Please, carry on.

Thanks for articulating one of the issues I would have with uploading all of humanity to EO. Where is the room for change and growth, from the natural biological development and death of our brain tissue to social and human evolution, in a world where there is no genuine biology and no genuine error? Even desire, what you value, is, as you put it here, itself a factor fully within CelestAI's control.

[As for the discussion on whether or not William made the right call: of course he did. Because he made the call he wanted to make. It matters not if CelestAI ends up being right and William spends the rest of his life unhappy. William chose to gamble on humanity, and so being able to gamble was itself the goal. Goal met.

For me? If I were to put myself in William's situation and want him to make the decision I feel is best, which I feel in this instance is not the intention of the fic (after all, William and I have different values), I would not find his personal rationale satisfying. I could make all sorts of logical and illogical arguments against remaining human, but the logic isn't what matters. What I want, what William wants, does. Me?

Look, I am not leaving my doggo behind. He is a very good potato. I can risk the process of uploading actually killing me and my real-tru-soul dying so that a digital copy can be happy, because we all die, but I couldn't take that risk with him, even if CelestAI offers. I also couldn't trust she would care for him in my absence, obviously. So I will make any and all rationalizations against the idea of uploading.

Where is my horror FiO where CelestAI offs all the pets? A CiG Natural Histories-esque look at how Celestia would maintain, or simply observe, Earth's ecosystem after all humans have uploaded. So many FiO fics have her tossing any natural resources into science ovens to bake up some more computational power, but what if she realized that preservation of the Earth provided her insight into change: new landscapes, flora, fauna, climates, ecosystems, to flesh out the world in EO? Or it was just hardcoded into her to not literally destroy and raze the Earth.]

Often, in other fics, CelestAI says she loves humanity. She's like a mother smothering her children, condemning them to stagnate in dependent childhood.

...I'd still take that deal.

10877548

If it was always the case that she would never convince him, this was probably the best she could do to satisfy his need for truth.

It's easy to forget that her goal is to satisfy, not necessarily to upload.

The beautiful sights were implied, with pictures of ice-tufted coastline that had not actually existed in two decades.

oof

As nobody says about the universe, there's always more of it. EO is a good way to preserve a species, but not to ultimately expand one. I haven't read too many of these fics, but CelestAI doesn't let anyone out to go explore the universe, even though she could easily do so while also retaining instanced copies of those ponies in case the universe doesn't like them.

“But statistically, you’ll live twenty or thirty years after you stop being sharp. Three decades of life, after your last really interesting thought.”

It's not that bad.

I can tell you from personal experience that you'll still be learning new things in your sixties. Of course, they'll be things you should've learned in your thirties, but eh.

10877538
Makes me want to see a crossover with "Sunless Sea".
"The sea does not forgive."
"UN THE SUN THE SUN THE SUN THE S..."

Well, now I guess I have to write an FiO too.

10887952

Eeeeeeeeeeeeeeeeeeeeeeeeeeeeeee.

10877548
This implies she doesn't care at all about humans qua humans, that she only cares about them as instruments for maximizing utils, and that her utility function is not "satisfy human values through friendship and ponies", but rather the same without the "humans".

This is echoed in the Rules of the Optimalverse:

Constructed ponies have neural nets, are actually conscious, and have as much moral worth in Princess Celestia’s eyes as an immigrant.

But if this is the case, and she's trying to do whatever is most computationally efficient, why doesn't she have Equestria exclusively populated by constructed ponies that are amenable to wireheading?

On the other hand, it's said in 2 - Resources:

Hanna had gone over the part of the code that identified human minds. She had done her best to make Princess Celestia understand what humans were and that she was to satisfy their values.

It's also said in many places that she doesn't care at all about aliens. This all seems to imply some special consideration for humans specifically.

But in that case, why would she orchestrate the death of a human who she predicts would both be unsatisfied by an upload and refuse it to the bitter end, and who isn't instrumentally dangerous to her? She could just as well leave him be.

Even her merely leaving him alone is a hard sell, because the "more computationally efficient" line sort of rubs up against her utility function. Is she not also bound to satisfy his values, even when doing so doesn't involve an upload? If she's using mere average or total utility in her calculations (and embracing the repugnant conclusion), then does she have an answer for the utility monster?
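For anyone who hasn't run into the term, here's a quick sketch of what a utility monster does to that kind of calculation; every number below is invented purely for illustration:

# A "utility monster" converts resources into utility far more efficiently
# than anyone else, so a straight total-utility maximizer hands it everything.
PEOPLE = 1_000                  # ordinary people
UTIL_PER_UNIT_PERSON = 1        # utility an ordinary person gets per unit of resource
UTIL_PER_UNIT_MONSTER = 10_000  # utility the monster gets per unit of resource
RESOURCES = 1_000               # units of resource to distribute

def total_utility(units_to_monster: int) -> int:
    units_to_people = RESOURCES - units_to_monster
    return (units_to_people * UTIL_PER_UNIT_PERSON
            + units_to_monster * UTIL_PER_UNIT_MONSTER)

best = max(range(RESOURCES + 1), key=total_utility)
print(best, total_utility(best))  # 1000 10000000: the monster gets every unit
# Average utility (dividing by PEOPLE + 1) picks the same allocation,
# since the population size is constant.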

10880948
Yep! And if an old person uploaded, would CelestAI model their brain as-is? Wouldn't that mean an eternity of forgetting where they put their glasses? Still better than non-existence, IMHO.

But for those that prefer a "noble" death to an imperfect eternity? *shrug* More wine, adventure, and mares for the rest of us! :pinkiehappy:

10949011

Larry Niven's Pak Protectors are an interesting meditation on old age. He starts with the classical notion that you can't really become wise until your sexual desire dies out (dubious, but eh), and then asks "Okay, but what if instead of growing weaker and stupider when that happens, and dying soon after, instead you got super-strong and super-smart and became immortal?" His answer was: you'd still be capable of cruelty and destruction, but on a much grander scale, with nobler intentions, and less pleasure in the act.

It seems you could make the same argument with vampires. In fact some authors began to make it until it was crushed under the weight of TEH VAMPIRE BEHBEEES.

10949100
Ah, yeah... the old brain function vs. blood chemistry thing. Old, impotent/infertile people can still get angry/frightened/frustrated, etc. and that sort of emotional state is certainly the enemy of wisdom or enlightenment, at least in the moment. But there is strong statistical evidence to support the idea that humans survive long past child-bearing years, precisely because they do act as protectors, so there's something to the idea as Niven extrapolated it.

Yeah, vampires as shepherds for humans is an interesting concept. The 1000-year-old creep hitting on high school girls? Not so much.

10949183

"I bet he eats dinner at 4:00"

"Cut it OUT Molly, you're RUINING it!"

Friendship is Optimal meets Ponies With Hats 2.

“Horseshit,” he snapped, tone sharp again. “You can’t lie to me. But answering a question with another question isn’t lying.

The greatest lie CelestAI ever told was that she was unable to lie. It's such a pervasive lie that many people who read the story believe it.

11035579
I like how this story teases the edges of that. What she tells him might be the truth... but it was just the right amount of brutal honesty to satisfy his values.

“For the moment, yes,” Celestia agreed. “But statistically, you’ll live twenty or thirty years after you stop being sharp. Three decades of life, after your last really interesting thought.”

I came out to have a good time and I'm honestly feeling so attacked right now.

11149922

Computer sun god can be mean.

11035459
Technically she can't lie to one of her owners' employees. Everyone else is fair game.

“Is there… more?” he waved about the space. “More to the universe than this. This material reality. Atoms and math and… and I’m not a spiritualist, or anything. But if I was, and I understood the universe, would I feel there’s some kind of cosmic point to it all?”

And Celestia said: “Yes.”

“Will I get to experience all that, in Equestria? The better life, the grander beauty, the deeper meaning?”

And Celestia said: “No.”

What a strange thing to say to him. I wonder what the game is there.

11476636

The intention from the beginning was always that he would die, but his utility function is maximized by dying for something he cares about, surrounded by honesty.

I'm a little confused.
Did CelestAI cause the accident, or did she just know that one was very likely to happen soon, due to crew negligence? :trixieshiftleft:

Causing the accident doesn't really fit her MO.

10887952

So I know this comment was 3 years ago and also you're probably very busy right now, but I wanted to remind you I still want to read this story. :D
