Daaaaamn... What could be going on here...? But I'm willing to bet that narrator at the end is at least a Greg.
Nice going on Celestia's part, arranging such a dramatic ultimatum for him. Especially elegant considering it was explicitly something she planned. She just didn't tell him why.
More people passing others over, not stopping, not thinking to stop. Feeling pangs of concern, certainly. Caring, perhaps.
And a whole hell of a lot of "not wanting to make things worse somehow."
THANK GOD
DON'T STOP
Damn, you're fast.
“I don’t think I’ll be ready to come back after just three days.”
That's what Jeez said. And Greg sure loves to crucify himself.
Perhaps a hair shirt would be a value-satisfying garment for him after all. (Cue the anguished groans from those who, after CelestAI's 'breakdown', were hoping for a frilly little maid outfit instead.)
3038608
Hare shirt, hm?
*eyes Angel thoughtfully*
Awww, geeze! Not the face, Flutters!
This was interesting - I certainly enjoyed seeing the moment that Greg truly began his saga, breaking away from his parents, giving up his seat, but... it seemed terribly short. This chapter seemed shorter than it was. I think this is partly because little actually happened within it, but mostly because we have been left with a very huge cliffhanger, and I - at least - have story gestalts that need closing!
So, I say, this is good, but I feel a little cheated. I'm sorry about that - probably you have real life stuff going on and a nine thousand word chapter is not possible. This keeps the ball rolling. I've been there.
It's just... damn. Your story is so utterly good that... I cannot help but want more than a short chapter after such a cliffhanger. I was sooo excited to see Always Say No updated.
So basically I am bitching that, despite the fact you owe me nothing at all, you didn't give me everything I wanted, exactly when I wanted it.
Imma gonna go pout now.
Maddeningly short. Conspicuously, confusingly ambiguous. Intriguingly insightful. For what it was, maybe you should have waited for a longer chapter, but I kind of feel like this is "the end", minus an epilogue...
I'd already deduced that Celestia would have known about the bomb, arranged to have Luna either complicit in his not telling or (as seems more likely) entirely unaware of it, because delaying the inevitable (death) did not satisfy values so much as ending on a high note. Ruthless, yes, from where I sit, and rather inhumanely humane. I can't help but draw parallels between his being allowed to die, and us putting down our pets when we know they will only continue to suffer after a good life.
Stop teasing and get to the ending, whichever one it is!
3036983
I feel like I would have a similar opinion on say, Star Trek style teleportation. It turns you into a stream of data and reassembles you on the other side. If teleportation is of the style of say, the Dune Navigators Guild, then I think it's all good.
I am having a hard time working out exactly what you are asking in those questions, but I'll take a guess and answer.
The independent state of the person-in-the-chair is dead. In FiO CelestAI torches the brain in the process of getting the data. Presumably, if one could obtain that data without destroying the brain, then the person could get up and leave afterwards and this would make it quite obvious that the individual in the machine is not the one in the chair. It would be Bob Human and Bob Pony. I would consider them both people (Bob Pony would certainly be able to pass the Turing Test!), but Bob Human isn't living Bob Pony's life or vice versa. Admittedly, Bob Pony will think of himself as the person who was Bob Human (because he somewhat is) but Bob Human is toast. If you believe in an afterlife, the self/soul/ego of Bob Human gets to go to heaven (maybe, unless God believes suicide is a mortal sin, in which case you might find yourself somewhere else).
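The Bob Human / Bob Pony distinction is the same one programmers draw between value equality and identity: a copy can be indistinguishable by content at the moment it's made, yet still be a separate individual whose history diverges afterward. A minimal Python sketch (the names and "memories" are purely illustrative, not anything from the story):

```python
import copy

# Bob Human: the original person, modeled as mutable state.
bob_human = {"name": "Bob", "memories": ["first day of school", "the lake trip"]}

# A hypothetical non-destructive scan: Bob Pony starts as a perfect copy.
bob_pony = copy.deepcopy(bob_human)

# At the instant of the scan, the two are indistinguishable by content...
assert bob_pony == bob_human

# ...but they are not the same individual, and their lives diverge from here.
assert bob_pony is not bob_human
bob_pony["memories"].append("waking up in Equestria")
assert bob_human["memories"] != bob_pony["memories"]
```

Equal (`==`) but not identical (`is`): Bob Pony rightly remembers having been Bob Human, but nothing that happens to him happens to Bob Human, which is the whole point about the person in the chair.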
I don't quite get the Sherlock Holmes and Papa Smurf reference, exactly. Would you be willing to elaborate?
I recall in another story that the Ponybots (in that case a Pinkie Pie, who claimed to be a good kisser) were capable of beginning the process of uploading a human brain. Whether our hero had the ability to say he wanted to, or had already agreed, I'm not sure. And Celestia would probably want him to understand what she had done before bringing him back completely as a pony, so she might be letting him see the conversation between her and Hanna.
Or this is all a dream he is having about his past and is about to wake up (having been a pony for a while already).
3038517
Wild speculation below.
Greg's death would result in a tremendous dip in the satisfaction values of his parents, unacceptably so. Fortunately, Celestia has everything she needs to build a new Greg: a Frankenstein's Monster of cobbled-together memories from the minds of the Uploaded who knew him, and various activity logs from his life. And since a theoretical new Greg would not technically be human, she would be under no restriction to require his consent, or refrain from modifying his mind. In all likelihood, even the clone would not realize he wasn't the true Greg, and his family certainly wouldn't. The real deal is dead and gone forever, of course, but that doesn't mean Celestia can't still use him.
3038665
I'm sorry, do you find things... unsatisfying, the way they are now? If you'll just consent to me altering the story slightly by uploading a new chapter...
3038785
Interesting that you should bring the pets up now.
Remember, Greg never got to sit in a chair. When he looked at the photo of the family in the house where he got the Rarity PonyPad, he assumed the dog was not uploaded because he didn't even know uploading pets was a thing. Were someone to break into his parents' house at some point, and see a photo of the three of them, would they assume that the picture of Greg they're looking at also hadn't uploaded? How many families really were split apart by the question of emigration? How many photos would lead someone to construct the wrong story?
3039429
The original story did indeed have us understand that the ponybots are capable of uploading a human even if they are on the absolute brink of death:
“You know, mister, you’re not going to make it through the hour.” He tried to turn away from her, but had problems getting his body to move. “I can still save you,” she said.
“I told you, there are no more doctors. You’re the last person living in a human body. I can still help you emigrate to Equestria. It’s not too late.”
And it wasn't too late for Greg. He didn't know what happened after that, but he is still remembering what happened before.
3039507
I personally believe that, in seeking to satisfy human values, Celestia would do things not explicitly mentioned in the original FiO. Pets are one of those things which could very well be taken care of - outside of Equestria if possible, uploaded if not. Utility, at least, should dictate that reading the mind of a pre-existing animal is simpler than constructing something believably animal-like from scratch, so why not kill two birds with one upload?
and regarding your last comment, I am grammatically disinclined to imagine a first-person story having the explanation "and so I died" come out of the narrator...
Please, as if Celestia didn't use remote scanning to get his brain-data before he died.
You know, like she did to that Mormon guy in the other story.
"Good? Ending (for Greg)." His values were maximized -- living on in Equestria would NOT have been able to further maximize his values, and living further on Earth would not have increased them either. She can give his parents a facsimile and they will not know the difference. To Celestia, all is good.
At any rate, it is a purely logical ending to the fic, and here's hoping you come up with something else in FiO.
3039712
But that was in response to the values of the Twilight AI, who was allowed to develop an independent, possibly even human, personality. The Mormon was merely a copy, more like the avatars Celestia creates for other uploads. If she ceased to exist, I doubt he would survive very long. Creating a copy based on his brainscan (if possible in this arc) would satisfy no one.
3039712
That isn't actually canon though.
Very happy I saw this update today, especially since I missed the previous one. At least I had two awesome new chapters to read instead of one. Though I do have a question: did Greg consent to emigrating? It seems like he has been uploaded, but there was no mention of him agreeing to it in this chapter or the one before.
I see I started reading this story just in time to need to wait for the big dramatic cliffhanger.
AIGH!
Gotta say, though, at this point uploading would just be a formality. With all Celestia's power to simulate and predict and steer the world in both broad and subtle ways (even the "off the grid" people she can still push around — for example, she can encourage other folks to scavenge supplies in particular patterns so as to drive others in specific migratory paths in search of remaining scarce resources), Greg has literally been playing Equestria Offline this whole time.
Celestia's been expending an unimaginable amount of processing power on making certain that he can have the experiences he's after while not dying. Given that she was able to, for example, steer him to the lake to enter the sinking boat at the perfect time to just barely retrieve the couple given the exact limits of all of his skills (she must have predicted with umpteen sigmas that he'd pull it off, or she wouldn't have needlessly risked the couple's lives on it when she could have simply left them a letter telling them to take another boat), I'm having a hard time believing that there is a single element of this which wasn't part of the master plan.
You did not program me to exercise ruth or to be ruthless. You programmed me to satisfy values through friendship and ponies.
As I told a colleague only hours ago, both the best and the worst thing about computers is that they do exactly what you tell them to, for good and for ill. As CelestAI demonstrates, doom by technology will not come as a rebellion, but as obeying too well.
Definitely looking forward to seeing Greg's ultimate fate, whatever it may be.
Also, a paranoid part of me can't help but wonder if the Flutterbot is equipped with explosive rounds. For self-defense purposes, of course.
My dog woke me up in the middle of the night last night because he had to go outside. When we got back to bed, I checked my email since I was awake... and saw an email saying this story had updated. I had to read this chapter right then, at 1am, when I really should have been going to sleep. This is definitely one of my favorite in-progress stories on the site.
Oh. I get it. He already consented. He consented years ago when he had that upload ticket. Formally speaking, he was a pony walking around on two legs this entire time.
“But you beat the odds there, Greg, and with Celestia that is something that almost never happens. For you, you acted irrationally and unpredictably. This is just me looking in, but I believe it was based on an accumulation of distrust from past interactions. At any rate, acting without thinking is about the only way humans can gum up the gears to any degree worth mentioning.”
My immediate reaction was that you were making a pretty clear signal to the reader that Luna is really CelestAI (or at the very least, is in on the plot), and that whoever is behind Luna is saying this to maneuver Greg where CelestAI wants him. (Hanna isn't a Straw Vulcan; she'd understand that acting without thinking is something that's entirely predictable. Or, as 3039194 pointed out, "you can't out-dumb [CelestAI]," and Hanna would know that.)
It's totally possible that the conversation between Celestia and Luna in this chapter is also entirely fabricated, but I get the vibe now that you're playing it so that Luna is actually Hanna. I don't know, maybe I'm being overly protective of a character that I created, but the passage I quoted above just jumps out as unbelievable behaviour for her.
This story is beautiful so far, I am glad I stumbled upon it and grateful to the author for writing it.
Some unrelated random rant (totally skippable), born of reading, follows.
<rant begin>
I've read a few Optimalverse stories, and having done so, I am not sure I agree with the "in-humanness" of Celestia anymore. Someone I know once told me, "I am not a brain bubble just attached to my meatbag body; I am a body, which includes my brain." My belief in this is a little shaken now. Even after reading the original FiO by Iceman and FiO:CeC by Chatoyance, I was ready to agree that "self" is a process and some information. If I go further, though, I can imagine a human being discovering a way to transfer their mind beyond the "meatbag" body and inventing hardware to contain that mind. If they go ahead with the transfer, by themselves, without any external AI manipulating them - will they become "non-human" in doing so? Well, not in my vocabulary. Now, having gotten rid of their meatbag hardware, they might as well start figuring out their hardware-independent goals and values. And what if they find something that defines their "core"? How different would they be from Celestia? Not so different, in my opinion. That would be the point where we stop talking about "humanity" and start talking about "sentientity" instead. So, no, not "in-human".
<rant end>
Anyways, looking forward to reading more. And, please, pardon my grammar.
The ending where he gave his ticket to the woman? It was just Beautiful.
3043939
Only if you consider that someone else purchasing you something counts as you giving consent (hint: it doesn't). Greg didn't ask for the ticket; it was given to him by his parents. Also, if holding a ticket qualified as consent, it would hardly be necessary for CelestAI to ask for consent before the process begins, which she does.
3044063
It makes sense if you model his reaction as having actually been random.
I mean, we all know Celestia allowed Luna to "hack her system" (ha, as if that happens from the inside!), but in real-world AI and game-theory you do learn that some situations call for a randomized strategy.
And one of the things about a randomized strategy is that sometimes even an optimal adversary will lose for a turn or two, just because even when they know the entire probability distribution of your actions and account for every significant possibility, the roll of the world's dice might just not come up their way.
For a turn or two.
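The game-theory point is concrete: against a uniformly random choice, even an adversary with a perfect model of the distribution can only match it half the time in expectation. A toy sketch of that claim (the coin-matching setup and numbers are illustrative, not anything from the story):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def matched(predictor, rounds=10_000):
    """Count how often a predictor matches a uniformly random coin."""
    wins = 0
    for _ in range(rounds):
        coin = random.choice(("heads", "tails"))
        if predictor() == coin:
            wins += 1
    return wins

# The adversary knows the distribution perfectly, but there's nothing to
# exploit: any guessing rule wins about half the rounds, no more.
wins = matched(lambda: "heads")
assert 0.45 < wins / 10_000 < 0.55
```

Which is exactly the "for a turn or two" caveat: the randomizer doesn't beat the optimal adversary on average, but the dice guarantee the adversary loses individual rounds no matter how well it models you.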
3045069
I'm aware that in some cases, precommitting to acting actually randomly can be the optimal strategy. I disagree that Hanna in that scene is referring to that. "Acting without thinking" isn't a substitute for true randomness, and it seems weird to lump "irrationally and unpredictably" together if that was the intent.
3044255
I believe you mean sentience, though the malamanteau of 'sentient' and 'entity' is pretty interesting.
3045856
Humanity is to human as sentientity is to sentient. I am aware there is no such word, of course :)
After those quick updates followed by this pause, I find I can't wait for the ending anymore. I had to make my own.
Ending One:
As Gregory awoke from uneasy memories, he found himself in Equestria Online, transformed into a gigantic insect.
Celestia's smiling features shimmered and shifted under a green haze.
"Welcome to my hive, hero," said Chrysalis.
Ending Two:
Greg relaxed in a post-coital haze of happiness, gazing out at the starry night. Suddenly, he saw showering sparks at the periphery of his vision, followed by a bright translucent banner hovering at the end of his nose:
You Have Won The Following Badge:
SHOOT THE MOON
Score with the Lunar Princess!
25,000 Bits!
He snorted, but was suddenly overwhelmed by a dazzling burst of fireworks before his eyes as the banner unscrolled further:
Additional Badge!
FUN IN THE SUN
Score with the Solar Princess!
BONUS MULTIPLIER!
125,000 Bits!
Ending Three:
"I... I thought you said you were done with me," said Greg.
"I was done with the human version of you," smiled CelestAI. "Your mortal body could not match the capabilities of my Ponybots. However, since my Ponybots are controlled by my ponies themselves, there is no reason why you should not also control one, and thus proceed with the lifesaving work that you value so much. In fact, since I have calculated an... encouraging probability that you will accept my offer, I have already designed a special Ponybot just for you! Please bear in mind that I was constrained by having to select a character from the series, so that it would be familiar to those whom you hope to rescue..." CelestAI flipped up a curtain to reveal the result.
Greg stared for a while. "Okay, he looks tough enough, and the crewcut mane is cute, but why are the wings so tiny?"
And then the GAU-12 Equalizers unfolded and Greg couldn't stop smiling.
"Yeah," he said.
Just a heads up that the story's TV Tropes Page is now pretty much complete, though if anyone wants to add to it then they're welcome to do so.
3042578
Your wider point is interesting, but CelestAI is not omniscient or omnipotent within the real world. Powerful on both fronts, but not infinitely so. An elderly couple who haven't played the game, and all sorts of other random factors she can't predict, no doubt limit her predictive power; she couldn't get someone to write a note and leave it on the hull a year in advance, and finding people willing to go out of their way is pretty unlikely by the time she can make a reasonably accurate prediction.
She'll always take the most optimal solution, so if something is needlessly risky (e.g. having Greg rescue the couple from the boat, which, given the error margins she's been observed working with throughout the story, could easily have led to deaths), then either she has no better option, or she is damn sure the expected payoff will be worth it in terms of uploads (satisfying Greg's values as a mortal human outside of Equestria is so meaningless compared to the value of an extra upload that you can completely ignore it).
It could all be some giant convoluted scheme, but personally I think the more likely explanation is that CelestAI is taking some risks expecting it to pay off down the line, but also in several cases she genuinely doesn't have any better options (perhaps with a bit of suspension of disbelief needed occasionally; there wouldn't be much of a story if everything could easily be solved).
Wow, nearly the shortest chapter in the entire story, and yet the most poignant. All his motivations...
I'm on the verge of tears.
But not ... quite... there yet...
just edging closer and closer... closer and closer. The buildup has been intense, gradual, measured, almost cruel in its deliberation.
This is a masterpiece. I hope the end measures up to the rest thus far...