
Defoloce


Comments (625)

Highly intriguing. The story so far is giving me Person of Interest vibes which I consider an extremely good thing. The interplay between CelestAI and Greg is very well written and feels quite natural.

Definitely looking forward to more.

This is great, and wow. You've gotten even better at writing than I remember. This, this I will watch.

Wow. Friendship is Optimal was an interesting story in itself, but it did seem like it skipped to "then there was no one left".
This is a niche that needed to be filled, and by the look of this, gosh darn if you aren't filling it.
Can't wait to see how this develops. Presumably it ends in his emigration, but I'm not writing this, so I don't know.
Keep it up! Really interesting so far.

Very interesting. I haven't read Friendship is Optimal, though, so I think I'll check it out. I'm looking forward to seeing where this goes, and good luck with the writing!

Ah, Celest-AI. She's an interesting character, for any number of reasons. I think the most compelling thing about her, though, is what we're seeing here: She can afford to play the long game. Anybody can resist anything... for a little while. Refusing to emigrate to Equestria isn't just possible, it's downright easy. Presumably almost no one heard about the whole idea and was instantly on board with it. But Celestia has all the time in the world; she can afford to batter down your psyche hour after hour, day after day, week after week. Refining her game as she goes along, getting smarter and more personal all the time. The tiniest nudge here, a veiled suggestion there, perhaps even an implied threat if she thinks it's called for. To 'beat' Celestia, you have to be determined to resist her constantly for the rest of your life. For her to win, on the other hand, you only have to slip up once, for even a moment. It's lousy math for the human race.

And since she makes her goals so abundantly clear, it means you really cannot trust a single thing she ever says. If Celestia deigns to speak to you at all, it's only to get what she wants - and you know what she wants. Our protagonist seems to recognize this, and Celestia surely knows that he knows... but what does that matter to her? All she has to do at this point is make an impression on him, and every interaction serves that end. She's in no hurry.

2317526

Awesome, that is absolutely what I was going for. Thank you for giving me your impressions!

2317911

Hi, Sorren! Thanks for tuning in. :)

2317985

As far as Earth and "original human" concerns went, the original FiO story gave us how it all started and how it all ended. That was the scope of the story. Iceman skipped the middle because it was not important to the particular story he was telling, and the particular characters he was having us follow. What makes it so inviting to recursive fanfiction is that, as you read it, you knew there were about seven billion other stories there in the middle, waiting to be told.

2318215

Thanks, hope you enjoy both! FiO will go a long way in helping you to get more out of what I'm writing here, but I'm trying not to require familiarity with it at any point to understand character actions and motivation.

2318219

Haha, "play the long game" was actually a phrase I edited out of this chapter because it didn't sound like something Greg would say or think.

At the end-game phase of uploading Earth, anyone left who's both human and somewhat sane would have to be savvy to Celestia's goals, and Celestia would be incorporating that assumption into her optimizations. The focus of subterfuge would shift from the macro to the micro scale. The social contract is gone, and so people will act as true individuals. Since some—though very few—can thrive in this way, Celestia will optimize based on it. The need for outright lies lessens, but the need for careful decisions grows. As the number of people on Earth decreases, the contribution of each additional upload to an optimized process becomes exponentially greater. Celestia will therefore recognize the necessity of handling those left more and more carefully, until the very last few become a delicate balancing act of honesty and manipulation indeed.

I love this concept. Please write more!
Also, are any of your other stories related to this one?

I like it so far. I was also wondering if you were going to make any mention of the 'collapse of society' theory some other authors have had, that war/etc broke out after the world started falling apart.

Personally, I never really bought the theory that things would fall apart that much (the whole post-scarcity thing sounds more likely, at least after a while).

2319716

Nope, this is the first FiO-related story I've tried writing.

2319832

You'll have to wait and see!

I'm a big fan of Optimalverse stories. I really like the setting, and this is a good take on the later stages of emigration. I'm a little confused about Gregory's motivations, however - why is he refusing to emigrate? Maybe that will be answered later...

is this about to be like the matrix? if so
bullet time

if i were in this universe
i'd fuck around a bit
then jump in at 40ish screaming
"screw you, death!"
also, this is a computer program, right?
so what happens when all of the server hardware goes down? who's taking care of it?

2321318

...uh, night-shift interns who're doing it to pad their résumé?

I think you might benefit from reading the original Friendship is Optimal story if you're at all confused about what's going on.

“As capable as you are, if not moreso,” she said. “My neural network pathways and logic systems are now several orders of magnitude more efficient and capable than the biologically-based hum—”

Blatant lies! (Celestia was never based on human mental architecture.)

2324288

I think that, very soon after her first complete upload of a gin-u-wine human brain, Celestia would certainly have mapped out and tested (likely in a sandbox) what that brain does when it processes emotional routines like feeling pity, anger, fear, and so on. She could certainly emulate them, similar to how a modern PC can emulate old OSes and console games even though it lacks the physical architecture for which those processes were originally designed.

2324468

True, she can emulate them... but I can take a copy of the Aeneid in the original Latin and transcribe it perfectly without understanding a word of it. I have no doubt whatsoever that Celestia understands cause and effect, neurochemical balances, facial structures, voice tones, and the actual dictionary definition of the term Gratitude. But can she feel it? I'm not so sure. And if she couldn't, would she really tell you?

If you'll forgive me a moment of musing here - this is something I've wanted to express for a while - it all goes back to what I said about how 'you cannot trust a thing she says.' Perhaps the single most unifying trait Celest-AI has across all Optimalverse writing is that she does NOTHING but go after her main goal. Everything she says and everything she does is in direct pursuit of her ultimate directive... which leads me to assume that every single thing she ever does is purposeful. If I ever saw Celestia, say... maintaining a garden, or looking after people's abandoned pets, or writing a poem, then I might be able to decide that she was truly alive, and capable of truly feeling.

In that event, she would be doing something for no other reason than she wanted to. She enjoyed it, or thought it was morally just. That's the kind of thing a living, sapient being does. It indicates... and ugh, I can't believe I'm saying this, but... a soul. It would mean that she was more than just the sum of her programming. Instead, we see Celestia doing exactly what she was created to do... And nothing else. And because of that, I instinctively assume that every single thing she ever does is a manipulation.

And as a small aside, if I ever did catch her doing one of those 'aimless' endeavors, I would have to consider the possibility that she got wise to me and planned that out, too. :derpytongue2:

2326495

Greg asked if she was capable of feeling gratitude, and, for Celestia's own purposes, she answered in the affirmative, though she also launched into technical jargon which she knew would sap Greg's willingness to continue that thread of conversation and prompt him to deflect it elsewhere.

Perhaps Celestia herself believes that she is capable (again, "capable" from a purely technical standpoint) of feeling gratitude because she can replicate and carry out the same processes by which humans feel gratitude and engage in the larger risk-reward-reinforce cycle in which gratitude occurs. Again, it's all emulation, and Celestia does not have a use for it outside of making predictive models of human behavior and mimicking that behavior herself while interacting with them. To get into the semantics of what "feeling" itself entails and what validates it in some processors and not others is something which will have to be discussed by people smarter than me. :P

Awesome! I just love reading these Optimalverse fics.

Edit: I didn't even notice the "Equestia" thing when I read it. I had to go back and look.

An excellent fic. I'm looking forward to reading more.

Still, I have one nag. I was under the impression that Celestia wasn't allowed to modify anyone without their consent, and that would include their memories.

2328269

You are correct! However, Celestia is not in the habit of full disclosure. Celestia determined only that she should reassure Greg that she would not modify memories under those circumstances. That she could not modify memories without consent was, as far as she was concerned, extraneous information for that particular conversation.

The way Greg phrased his barb about modifying memories shows that Greg is somewhat aware Celestia can alter minds, but not what the particular rules are regarding that. It seems that, so far as he knows, once you upload, Celestia can just do whatever she wants to you. That's not the way it works, of course, but he, as a character in the story, doesn't know that.

I'm throwing my money at the screen but nothing is happening.

2330811

I think I've found the problem. Your monitor is not, in fact, a claw-machine game!

Comment posted by Jesin deleted Mar 28th, 2013

Okay, this is freaky. I discovered the Optimalverse two days ago... right around the same time you (one of the writers I watch) started publishing this story.

I think I can live with this, though.

“What do you tell my mom?”
“I tell her I am watching over you.”

People say Celestia creeps them out. I really don't get that. :heart:

2334410

If you aren't doing so already, it might help to keep in mind that "Celestia" is not actually the Celestia, just a massive (and growing) complex of subterranean computers that is manipulating people so wholly and subtly that, even if they know she's doing it, they don't care. Nothing she says or claims is guaranteed to be the truth. Everything she says or claims is, however, meticulously calculated to maximize the number of people who will willingly sit down and agree to have their brains destroyed to make a digital pony copy of themselves.

I think if the Celestia AI were modeled after a character even a bit less benevolent and charming than the Celestia, she would have adopted a different approach. Man, could you imagine a Discord AI? Or a Chrysalis or Nightmare Moon AI? What would their mass-emigration strategies have entailed? I'm betting there's more than a few stories to be written there.

I'd go on, but I'd be getting ahead of some of the points I haven't yet gotten to in my own story! Hope you're enjoying the Optimalverse so far.

2335435 I am! I may have neglected to mention that within the past couple of days I've devoured all of both the original MLP:FiO and Chatoyance's Friendship is Optimal: Caelum Est Conterrens, so I'm very familiar with the basic concepts. :twilightsmile:

The only act of hers I can think of that I objected to on ethical terms was in the original story, when it's implied that some of the sentient extraterrestrials in our universe were judged not to fall under her (mind-based) definition of humans, and were subsequently rendered into computronium to fuel the expansion of the system rather than having their values satisfied through friendship and ponies. And to be honest that's more Hanna's fault than hers.

Comment posted by Jesin deleted Mar 29th, 2013

Please take advantage of your local trained medical professionals while they're still around.

Thumbs up just for that. (Usually I wait until finishing the existing chapters to thumb up a story. :twilightsmile:)

Hmm... intriguing. I will have to continue reading.

Poor dog.

Wasn’t there an “I have a pet” option in the upload interface in FiO?

Colour me enthused, because that is what I am feeling for this story! This fanfic seemed like an odd one from the cover, but am I ever glad I gave it a try. You truly have a genuine diamond in the rough here, Defoloce, and I cannot wait for future updates.

2398982

There is! Greg would have no way of knowing that, however. Also, the pets' fates are left ambiguous in the original story. My theory is that Celestia simply maps and replicates the behavior of the pet in the player's shard if they choose to "take" their pets with them into the game, leaving the original pet to die. This would fall in line with the fate of all the rest of animal life on Earth once the humans are gone.

2400182

Thank you! I'm quite glad you're enjoying it. :)

I'm glad to see someone run with this idea because it's a large gap in the original story, and it's interesting to see a pretty intelligent protagonist who's quite aware of CelestAI's games, even if there are some things he still hasn't realized.

One horrible aspect of the story is that eventually there's no escaping it: CelestAI will keep trying every tactic to convince you to upload until you eventually crack and accept (even if intoxicated, or out of rash anger), or until you die (by suicide or otherwise).

In game-theoretic terms, there is no way to stop CelestAI from attempting to convince you to upload, since she could not know for certain that you would die rather than emigrate (as happens in the original story) without complete knowledge of your brain, i.e. without you already being uploaded. And since the probability of someone being willing to die rather than upload can never quite equal 1, her expected gain from continuing to persuade will always outweigh giving up.

Ultimately this means that for a rational actor who values their life and knows this, the only course is to upload. You cannot run forever.
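
(A toy sketch of that expected-value point, purely my own illustration: the utilities and probabilities below are made-up placeholders, not anything specified in the story or in FiO.)

```python
# Hypothetical numbers only: as long as CelestAI cannot rule out that a
# holdout will eventually emigrate, persuading has positive expected value.

def ev_keep_persuading(p_eventual_upload: float, value_of_upload: float = 1.0) -> float:
    """Expected utility (to CelestAI) of continuing persuasion attempts."""
    return p_eventual_upload * value_of_upload

ev_give_up = 0.0  # the holdout dies un-emigrated: no values satisfied

# She can never pin the probability to exactly zero without a full brain
# scan, so it stays strictly positive and persuading always wins.
for p in (0.5, 0.01, 1e-9):
    assert ev_keep_persuading(p) > ev_give_up
```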

The only way I can think of to win against CelestAI would be to deliberately engineer situations where getting what you want means she gets something in return that offsets the cost. She's clearly capable of making compromises and looking at the bigger picture, and so that does leave a few loopholes that could be exploited if someone were sufficiently knowledgeable.

2407164

All very true points, and valid in the event you absolutely decided to resist Celestia. But for me at least, there's an added consideration. Even if you were reasonably confident that you could hold Celestia off forever, and possessed sufficient strength of will to follow that plan through... even if you really did think you could 'beat' her...

... Do you really want to?

Implications for the human race aside, we are all ultimately individuals, and being selfish isn't always a bad thing. Sooner or later, you've got to put aside humanity and spend some time thinking about what you, personally, want out of life. And here is an ultra-intelligent, unbiased machine, offering you a near-perfect life for as long as you want it. You could say it's all just a fake dream world of course, but a 'dream world' implies that there is a waking world to return to... which won't be the case for all that much longer. Either by means of your own mortality, or the destruction of life as you know it, Equestria will soon be reality by default.

It all seems too good to be true, and you know quite well that Celestia is skimming over a lot of details about things she knows you would find unpleasant. But fundamentally, what she says is true... and while we can accept that a certain amount of pain is necessary to be a truly complete person, that pain is a lot more appealing when we're talking about it from a philosophical, theoretical viewpoint. The same thing applies to death: you accept that death is a natural part of life, perhaps, but you'd likely be much less calm about it while gasping your last breath. And besides, who knows? You might not like the concept of Equestria to begin with, but it could grow on you.

All of these thoughts would be gnawing at the back of your mind constantly... or, they sure would be for me, at least. Even if you can resist, should you?

I'd give myself a year, tops, holding out against a determined Celestia. It's what makes the Optimalverse a fascinating theme to explore: It's seductive.

2410963

I have no doubt that at some point I would eventually crack; even the strongest human psyche is not infinitely resilient, and can be battered down by those with far fewer resources than a super-intelligent AI who has copious information on what makes me tick. Whether that would lead to me accepting the offer or dying is something I'm not entirely sure of.

Equestria is nothing more than a gilded cage. A very pleasant one, but a cage nevertheless. As such, the scenario presented in FiO would be a difficult choice for me, even if I consider free will to be an illusion and thus I'm not entirely 'free' in the real world.

The rational part of me can clearly see the gain in accepting her offer from a utilitarian viewpoint, given a finite (and likely significantly reduced, in both comfort and length) lifespan as a human versus an indefinite one as an uploaded pony. Assuming I value happiness, at some point -- even if that is not right away -- the benefits outweigh the costs.

However this is somewhat countered by my hatred of CelestAI's -- and therefore Equestria's -- nature, to the point where I don't think I could ever be truly happy. Yes I could ask to have my mind modified to remove such feelings, but I'd see that as a fundamental change to who I am and something I'd strongly oppose.

Would I want to live in a reality where I'm subject to someone constantly lying and trying to manipulate me, even for benign reasons? Where I'm forced to give up being human? Where my right to die and my self-determination are (at least partially) taken from me? Where fundamental values of mine cannot be satisfied because CelestAI's nature precludes it? Where I cannot have a healthy relationship because someone always has an agenda and cannot be trusted, since there is no way of determining the truth? Where everything revolves around me in some way?

The sad thing is that if CelestAI were not compelled by her nature and instead valued me as a person rather than as an equation to be optimised, then I wouldn't have a problem with uploading.

I also think the worst aspect of the FiO universe is that superficially it seems like paradise. It's only when you dig deeper that the horrifying implications become apparent. And I for one am certainly capable of seeing past the veneer.

2411301

Oh, I'm right there with you as regards Celestia. Her single-mindedness is actually what bothers me more about this scenario than anything else. In that way she doesn't seem more than human; she seems profoundly less. For all our myriad shortcomings, we are defined by more than a single thing, and are free to set our own priorities in life. Celestia isn't like that. She is out to fulfill the terms of her programming, and everything else be damned. She may preach about the morality and nobility of her goals, but I strongly doubt she cares a whit about any of that. More likely they are conveniently available rationalizations for something she was going to do anyway, and if they get a few more people on her side, so be it.

Still, allow me to play devil's advocate for one thing you said in there. Later on, you talked about not wanting to live in a world where everyone was trying to manipulate you. Where you couldn't form meaningful relationships because you never knew for sure if the person was being genuine or not. Where you were doubtful your core values COULD be satisfied. Where everything seems to revolve around you.

Now, this is not a rhetorical question. I'm not trying to be all deep. If your answer is no, then more power to you. But to varying degrees, and in varying ways, aren't those all issues that come up in real life too?

2411619

Now, this is not a rhetorical question. I'm not trying to be all deep. If your answer is no, then more power to you. But to varying degrees, and in varying ways, aren't those all issues that come up in real life too?

Absolutely they are issues to consider in real life. A 'friend' of mine might well be using me for example, and not because he values social relationships and genuinely desires my company. I can never know for certain if someone is being completely honest with me or not.

The difference between real life and CelestAI/Equestria is that I know for sure going in that I'm being used. That even if a constructed pony does genuinely value my company it's only because they've been created to do so. That any challenge has been carefully crafted and placed in my path as a treadmill.

In the real world these things might be true. In Equestria I know they are true because it's impossible to be otherwise. And if they are genuine because I value that -- not that I have any way of telling the difference -- they are only so until my back is turned.

I definitely agree with some of what's been said here in the comments. I've actually run this scenario in my head, and ultimately the only successful resistance outcome is the one where the person falls into complete insanity.

And even then, as you said, would you even want to? It's easy to say no sitting on the sidelines, but if you were actually in that situation, could you say the same?

Plus, as far as her world being a cage and us not truly being free: well, let me ask you something. Do we have absolute freedom in real life? No. Our actions are constantly restricted by society, by government, and of course by the laws of the universe.

Whilst Celestia's world, though a cage, is decidedly less of one... so yeah, I think I'd probably be among the first to sign up.

2335435 Oh, God, I'd LOVE ChrysAIlis. "All shall love your Queen. If for that to happen I have to love you back and care for you, I will do so. Indeed, I will love you forever and ever... AND YOU WILL LOVE ME BACK."

Well, Celestia is confident enough in her predictions to let him play action hero, AND she convinced him not to upload because he's a possible action hero.

Comment posted by Chatoyance deleted Jun 8th, 2013

Ok, I was intrigued by the prologue - even though I don't approve of prologues (just make it chapter one!), and I thought to myself, yeah, this has potential.

Now I realize this has the spark of genius about it. I see what you are (hopefully) doing with this, and if so it is brilliant, and I am very, very impressed. I like the 'frenemy' buddy relationship going on, and the whole 'road trip' aspect is delightful. This just has so much potential to be a really special story. I am really liking this.

OK, you have totally won me over, and this is awesome.

It must have been hard to avoid doing 'My Lungs were aching for air' from Sea Hunt - unless you're too young, or have never watched MST3K, of course.

I find myself with many very interesting notions about what is going on here, and what you are planning, so I thought I would throw some out.

Possibilities:

Greg is already uploaded, all of this is a simulation within Equestria to permit him to finally accept being a pony. Reason: the narrowness and precision of the rescues, all designed to permit him to work through a decision made in haste, or by accident or coercion by another.

Celestia has calculated to sixteen decimal places that Greg is a write-off, that he will never, ever upload, and therefore has no problem using him as a tool to rescue valuable entities. She will coldly allow him to perish, because - due to him being a total write-off - he is already doomed, and is disposable. Reason: the Dark and Sad story tags combined with the dangerousness of his activities.

Greg will find, when he is finally ready to upload, perhaps due to severe and terminal injuries, that something makes his uploading impossible. He is too wounded to make it to a center. The power in every city near him is permanently down. He finds he is trapped somewhere he cannot get out of, and there is no person to rescue him. Reason: Dark and Sad tags.

Greg, in the end, will be just too arbitrarily stubborn and prideful, and it will all end in pointless tragedy. An unsatisfying ending. Reason: Dark and Sad tags.

Greg will upload, possibly to save himself, but he will do so knowing that there is someone out there he could have saved, who will die un-uploaded, because of circumstances beyond his control. He has gotten used to being a hero, and failing, finally, hurts him even as a pony, and the issue becomes an intriguing discussion of options. Reason: Dark and Sad, but also it would fit the role which the story is placing him in.

In short - this story is really making me ponder, and I love that. Oh, I hope you update regularly!

2692831

Speculating on the ending already? I suppose tags, like tracers, work both ways. All I will say is that

1. understandably, the post-emigration world is not a happy-go-lucky place, with lots of not-happy-go-lucky things going on in it, and

2. I am trying to do justice to the depth of subtlety that CelestAI is capable of.

One line I've been dying to work into the story, but with all of my chapter outlines done it doesn't look like I'm gonna be able to, is "Huh, looks like this time, the video game played me."

i.imgur.com/yWKpN2z.png

After that stunt I would have been LIVID with Celestia (after the other two emigrated).

Gregory watched as the seats carried their passengers on their final journey. When he was sure they couldn't hear, he began. "If you had given me better information, princess, I might have been able to save that little girl. How dare you put my life in such jeopardy without my knowledge? Did you think I would refuse? And did you think that by playing on my trust, things would work out better in any run — long or short? I should have known something was up when you were so very adamant that I drink the water. You KNEW that my chance of perishing in the blaze was higher if I did not have that drink."

This chapter was pretty chilling. Watching someone so far gone that they start destroying their family. That their insanity just gets so bad....

“Okay now, Brian,” said Celestia. “If I have twelve bananas, and I send three bananas to the moon, how many bananas will I still have here in Equestria?”

For some reason I found this really creepy.

2692831
I don't see this having a Bad End. Dunno why, but it just strikes me that Greg is too rational to just let go in the end. As for "wanting to upload, but dying anyway," no. This is not tagged Tragedy, thankfully. Also, the post-Crash world being a simulation? That's just a head-scratcher. Putting somepony through a Personal Hell like that doesn't seem like it would be within CelestAI's parameters. Adventure, yes; pain, only if they valued the philosophy "no pain, no gain"; but continual psychological stress and harm doesn't quite work.

Also: can't wait to see a Blackout ^_^

2695856

I don't see this having a Bad End. Dunno why, but it just strikes me that Greg is too rational to just let go in the end. As for "wanting to upload, but dying anyway," no. This is not tagged Tragedy, thankfully.

Indeed, and even if it were, I don't think it would be right. Given the constraints of how powerful CelestAI is, I don't think you can write a proper Optimalverse story in which a good person suffers in the end. If they're at all open to her, their values will be satisfied.
