• Member Since 18th Mar, 2014
  • offline last seen Sep 26th, 2017

Versimer



Twilight Sparkle's life is complete; her time in Equestria has come to an end. But after she dies, she finds that there is more to her life than meets the eye.

(This is not an Optimalverse story. Assumes that MMC was the last episode.)

Chapters (1)
Comments (41)

I liked this story! It was nice!

This is beautiful, you are a beautiful person, take my like, damn you!

For your first story, this is extremely good in my eyes, juz sayin'.

Beautiful story. I think it has the potential to be more. *quick check whether this is marked 'complete'... which it is*
Aww, too bad. Especially the end, where Twilight is wondering where she did hear these words again. You could do more with it. Or not. Either way, it is a great story.

I do hope that you might decide to change this. I think the potential is there.

This was an amazing story. In the end, there is no difference between endless ones and zeroes and endless chemical zaps. You captured Twilight's identity crisis perfectly, as well as her posthuman self's anger at himself.

It's kind of what I hope is true. That maybe we're a couple billion posthumans who decided to go back to the 20-21st century. Unlikely, unprovable, but I can hope. And if not, something similar will eventually come to be, and I have my Virtual Reality Bucket List ready for that.

Shame it's done, I'd have liked a bit more from the machine's PoV, but this ending is perfect as it is. Have a like and a favorite.:heart:

even a talking cat

aineko pls go.

But seriously, author, you are a cruel, evil, effective bastard who has generated much feels. What I'm left wondering is: why does Jonah have to rewrite his life from a preset template? If the (UF)AI is that damn smart, why can't it generate a properly original life that would be worthwhile to the person living it, even when they found out what was really going on?

I think maybe that's what Twilight should have wished for: to reincarnate entirely afresh. Jonah's just kind of an asshole, but Twilight came across as a mature, well-developed person.

4323600

It's kind of what I hope is true. That maybe we're a couple billion posthumans who decided to go back to the 20-21st century. Unlikely, unprovable, but I can hope.

I find that really depressing, because I can think of no particular reason at all for a posthuman to pretend to be me. My life isn't more bad than good, but it's certainly very far from any particular ideal.

... giant squirrel, a statue, or even a talking cat

Is that... a shout-out to the song "Singularity"? :pinkiehappy:

4323728 Who says they chose to be you? After all, Dunbar's Number is a purely biological thing, so in the far future friend circles may grow to be in the billions, and one such group might have decided to hop into a randomizer.

But again, it's unprovable, so don't worry too much about it.

4323714

why does Jonah have to rewrite his life from a preset template? If the (UF)AI is that damn smart, why can't it generate a properly original life that would be worthwhile to the person living it, even when they found out what was really going on?

I intentionally left vague the AI's goals and abilities, as well as Jonah's own motivations. I leave these for the reader to infer as they wish.

The AI in this story would probably be able to generate a properly original life as you described, but this didn't happen because Jonah was not smart enough to ask for one. Jonah may have also believed that a mind engineered to accept the situation would have been fake or unfair in some way. He didn't have to use any template; he just thought that making Twilight Sparkle his 'main self' was the right thing to do.

4323382

I can't really add anything past the end, because it just involves Twilight Sparkle's life as we see in the cartoon. It's implied that, after Twilight lives and dies again, the events of this story repeat, but she is not allowed to revert back to her past self. Also, if you recall, I didn't actually write the lines at the end; they are Twilight Sparkle's very first actions in the first episode (when she was 'born'). I only gave a whole new meaning to 'I know I've heard of those before...'.

4324008
I do have to say, I find her more likeable than him. Which leaves the open question: if he wants to be a better sort of person who would live a nicer life, why not wish for that directly?

Seeing him repeat Twilight Sparkle's life over and over again is pretty damn sad. Why not make some changes? Unscript it all: no more automatically following the show. And make the other ponies real, conscious beings, if the AI can do that. Take the cartoon-life whose existential nonreality depresses Twilight Sparkle so much and make something wonderful, original, and awesome out of it.

He's a full-blown posthuman, for God's sake, with a Prime Intellect-grade AI that seems to follow verbally-specified requests. He doesn't need to script out his life from a cartoon show to be a nice person and have friends, and he definitely doesn't need to loop the same life over and over. If he is going to be Twilight Sparkle, I think there's so much further that Twilight Sparkle could go, lots more adventures available for her.

Now I'm going back to thinking about other fics I'm reading, because this is one sad bastard of a character who generates some real pity-feels for his nonpredicament.

4324080

Seeing him repeat Twilight Sparkle's life over and over again is pretty damn sad. Why not make some changes? Unscript it all: no more automatically following the show. And make the other ponies real, conscious beings, if the AI can do that. Take the cartoon-life whose existential nonreality depresses Twilight Sparkle so much and make something wonderful, original, and awesome out of it.

Actually, I implied that Jonah experiences Twilight Sparkle's life (what we see in the cartoon plus those extra alicorn years) only two times; the first was when he tried to become Twilight Sparkle permanently, and the second was when he succeeded at doing so. Both of those times, Twilight Sparkle lived and died, and then decided to not repeat her cartoon life. What happens after the second time is anypony's guess. (She would probably become a book, or something.)

Which leaves the open question: if he wants to be a better sort of person who would live a nicer life, why not wish for that directly?

Well, then we wouldn't have a story, would we? Wink wink.
On a more serious note, this story is entirely about Jonah's own stupidity. This is reflected in him choosing to give up his mind permanently without ever having wished for 'a good life.' It's a 'be careful what you wish for' sort of thing that wouldn't be present if the AI just automatically optimized everything. I imagine that Jonah was always uncertain after being uploaded, constantly jumping between lives/minds and never being satisfied or asking the AI for advice. As I said, why he settled on being Twilight Sparkle forever is up to the reader (what would you do?)

4323600

...maybe we're a couple billion posthumans who decided to go back to the 20-21st century...

If that is the case, I must have asked to play on hard mode or something. Got to go for that high score or bragging rights.

4323728

I can think of no particular reason at all for a posthuman to pretend to be me.

The grass is always greener on the other side of the fence. You've written a good story or two and I've written shit. Or the person who can't even be bothered to write at all might want a try at your life.

You can never escape the Matrix...

"I know it's hard, but you can't just cry about it forever."

"Yes I can!" she sobbed. 7000 years later she reluctantly admitted, "No, you're right. I can't."

Oh my god, the feels. I have to say that some technical details about... well, never mind. I can't prove that this theoretical formulation of what counts as "dying" actually took place. Artistic license.

I just have to say that Twilight's state would be conserved, and when Jonah decided to make his decision incontrovertible, the simulation would be rolled back to her saved state (forking her conversation with Pinkie) instead of simulating the whole cartoon again. I also have to debate what exactly would happen with Twilight. Why would she repeat her life? Under a paradigm similar to the state-preservation one I described above, it would just keep simulating her in a loop "until she made a different decision."

It just strikes me that the simulator looping the simulation with an identical outcome (Twilight erasing state and going to the past is the same as Jonah becoming Twilight and simulating the cartoon) is equivalent to death in some significant way. That is, it's a waste of resources, as the experience has already been simulated, so a natural compression would result in only one simulation, after which the entire thing is deleted.

I don't know why this is messing me up so much. I can't get past this detail helpppp



Okay.

Good story. You should write more. Was this partly inspired by The Metamorphosis of Prime Intellect? This story is pretty much canon with TMOPI and the real world, but not both at once.

This reminded me of a certain Twilight Zone episode.

Goddess, but Jonah is an idiot. This situation... it is so easy to think up so many different ways to make things work well here, and Jonah fails at everything. This illustrates for me the 'What if they gave you heaven, and you didn't know what to do with yourself' concept, perfectly.

I thought you did a fine job with both composition and writing of the story. The dialog was good.

So, it was well written, and nicely done, but I do have a few complaints:

I am utterly clueless what you intended with the last lines of the story. There were no 'Elements of Harmony'? Twilight had no friends this new time around? She's vaguely remembering the last cycle? What? That doesn't say anything to me that I can parse. It's a nice reference to the cartoon, but what is the meaning relative to your story?

I have no basis to comprehend Twilight's issue with being oblivionated in this story at all. She wants to die because somehow, in some unexplained and unexamined way, doing otherwise would 'taint' her history with her artificial friends? What? This makes zero sense. Twi is smart - that is her defining characteristic, really. She is also the factor that brings the Elements Of Harmony together, because she is the glue that holds their friendship together - her other defining trait in the cartoon. The Twilight of the cartoon would demand to make her friends real, sapient, independent entities, to make her world real within the system. That would be dignity, not oblivion. Oblivion is just loss, a total loss. It makes everything meaningless. It's emo. That isn't Twi... though, I suppose, it could be your Jonah.

In order to even begin to comprehend such an attitude, I needed to have some - any - rationale presented to support it. It is incomprehensible. To be presented with the fact of being able to make any wish at all come true, and the only wish is non-existence? What? No. That... doesn't work for me.

What was with the gun? You hung a big lantern on that gun, but nothing at all came of it. It just seemed like a red herring.


Please don't get the impression I didn't like the story - I did. It was a very good story. I thank you for writing it!

Good job!

4331771
Hey, thanks for your comment! I'm glad to see that my first story was received well by such a prolific writer (I'm a big fan of CeC, by the way.) I've never made anything like this before, so I was unsure of the writing quality. Anyway, I will now respond to the issues you have set forth.

Goddess, but Jonah is an idiot. This situation... it is so easy to think up so many different ways to make things work well here, and Jonah fails at everything. This illustrates for me the 'What if they gave you heaven, and you didn't know what to do with yourself' concept, perfectly.

You got it right. As I said in a previous comment, this story is about Jonah's own stupidity; it is also about choice, something that is not present in stories where the AI automatically optimizes your life (FiO.)

I am utterly clueless what you intended with the last lines of the story. There were no 'Elements of Harmony'? Twilight had no friends this new time around? She's vaguely remembering the last cycle? What? That doesn't say anything to me that I can parse. It's a nice reference to the cartoon, but what is the meaning relative to your story?

Those lines describe Twilight Sparkle's first actions in the first episode (when she was 'born' relative to the cartoon.) These are placed after Jonah's dialogue with the helper to imply that he had just entered the life of Twilight Sparkle again (starting with the first episode of the cartoon.) Also, she's not actually remembering the previous cycle; the AI's mind-wiping process was perfect.

I have no basis to comprehend Twilight's issue with being oblivionated in this story at all. She wants to die because somehow, in some unexplained and unexamined way, doing otherwise would 'taint' her history with her artificial friends? What? This makes zero sense. Twi is smart - that is her defining characteristic, really. She is also the factor that brings the Elements Of Harmony together, because she is the glue that holds their friendship together - her other defining trait in the cartoon. The Twilight of the cartoon would demand to make her friends real, sapient, independent entities, to make her world real within the system. That would be dignity, not oblivion. Oblivion is just loss, a total loss. It makes everything meaningless. It's emo. That isn't Twi... though, I suppose, it could be your Jonah.

In this story, Twilight Sparkle mostly wanted to die because she didn't like the idea of living a 'fake' life. It's implied that she just wanted to have died believing that her world and her friends were real, and that she had no interest in living in the post-singularity fantasy-land. Indeed, she could have just asked for Equestria to be remade with conscious ponies; she probably didn't because she wanted to believe that her friends had been real, and because she didn't like the idea of 'creating' new friends.

What was with the gun? You hung a big lantern on that gun, but nothing at all came of it. It just seemed like a red herring.

I assume you are talking about this one sentence: "She looked at the gun, but only for a moment."
That was a little thing I added in later on; it may be out of place, but the idea was too good to pass up. I needed to do something else with the Earth objects, anyway. It implies that Twilight briefly (/unconsciously) considered killing herself with the gun, but she was smart enough to know that it would be pointless to try.

4331963
Thank you very much for the insights into your writing! I enjoyed reading them. I always find it fascinating to hear from an author what they were thinking behind the words they write.

Reading your story made me think more about that old Twilight Zone episode "A Nice Place to Visit". I assume you haven't heard of it? You made no comment one way or the other. The premise is similar - only the subject is not computers, but the afterlife. A small-time crook ends up dead, only to wake with a butler in a mansion. The butler is magic; he can provide literally anything at all. At first, the crook is overjoyed - he figures he is in heaven. Soon, he learns he can never leave, and he can never meet or talk to anyone 'real' - even the girls whistled up for him are mere P-Zombies, devoid of qualia. He can have anything, but he is utterly alone except for the butler. He soon bores of it all - he wins every dice throw at the local casino, and there are no heists he can ever fail or be caught at. No risk, no adventure. The butler offers to include a percentage chance of failure and asks how high the percentage should be. This only makes the crook angry. He complains that this is a terrible heaven.

The butler replies "I never once said that this was... heaven" and proceeds to laugh in an evil way.

I've spent three decades thinking about that story, which is one of the many reasons I so enjoyed your story. I must have worked out ten dozen ways to turn that version of hell into the ultimate paradise, one that would truly be paradise... in such a situation, clearly the person... damns themselves through a lack of creativity, I think. That and a lack of compassion and love, even for the insensate.

You make me want to explore this concept, the hell-and-butler notion. If I ever do, I will link to you and reference the inspiration.

I hope you try your hand at a proper Optimalverse story - and I would absolutely love it if you ever did an Injectorverse story or a Conversion Bureau story. You are a very good writer. I look forward to seeing what other things you do.

Fascinating. This is very deep, more so than I think I can properly appreciate on just one read-through. Jonah's lack of imagination doesn't help in that regard.

I love the concept of what I'm going to call "virtual reincarnation." Rather than a single infinitely prolonged life, one lives a series of mortal ones, living, dying, and choosing how to live next. The cycle of karma is now accepting user submissions.

Also, I find it interesting how Twilight's detailed memories never seem to extend past "Magical Mystery Cure." I suppose Jonah asked for nothing but canon, and the AI just filled the rest of Twilight's long life with meaningless but space-filling junk data. That would also tie in with Twilight's desire to lay down and die at the end. Her baseline couldn't conceive of anything more, and she doesn't want to. What a waste. An infinite sandbox, and all Jonah can do is play with the same purple bucket...

In any case, thank you for a fascinating bit of post-singularity fiction. I loved every moment, not least because I have a soft spot for quasi-omnipotent Pinkie. :pinkiehappy:

I wonder if CelestAI is capable of annoyance at Jonah's attempts to force his desires on her most Faithful Student? Definitely a question for the philosophers, I think!

4357488
This isn't Optimalverse, so there is no CelestAI. CelestAI would have just allowed Jonah to die already.

I see why this is noncanonical, but it could be made canonical:

CelestAI exists to run EquestriaOnLine, and once she figured out how to upload people into EquestriaOnLine, she did. CelestAI does exactly what she is supposed to do. If her goal were to maximize human welfare, with ponies merely a tool, that would be better, but that is not what happened.

Jonah's family living as humans is not an option, because CelestAI satisfies values through friendship and ponies. If you make the family ponies, this could be canonical. They could hate Equestria, and so live in a shard like the human world, living like humans, with all knowledge of the singularity erased. Basically, they would live in G2 (the TV show from the early 1990s in which the ponies live in houses, they are all Earth Ponies, and magic does not exist). That would make this canonical.

“Oh yes,” answered Pinkie Pie. “The cartoon became a whole lot more popular after the singularity. Humans became characters not only from the original story, but also from many different fan-works. You'd be surprised at how many Littlepips there are.”

Oh! So I could be Princess Selene? Or Cerise Hood? Or a fanfiction character?:pinkiehappy:
Which to be, which to be?

4382076 Hi don't mind me just responding to your months-old comment to argue about a different story entirely
I don't think CelestAI would have just let Jonah die merely because his current mind-state is more-or-less suicidal. She allows suicidal people to die, in canon, but not very many, and not if she has any alternative; I don't think there's evidence that Jonah is one of those cases. She'd have the insight into his personality to see what he does want, regardless of the fact that he doesn't seem to think he wants anything. If he truly has no desires, she'd know what course of self-alteration would satisfy his values both at present and after alteration -- which probably does end up with him becoming Twilight, but also includes at least Pinkie Pie being real (for suitable values thereof, etc.) and a better post-death prognosis in one way or another.

Which, now that I write it, makes me think that this story actually has a lot to say about CelestAI, indirectly, and how she's different from other technology in both sci-fi and the real world (and the butler isn't). Technology tends to allow people greater ability in doing whatever they want to do anyway, and a lot of technological "problems" are actually people problems that technology is allowing people to magnify (to pick an easy example, Alfred Nobel wanted dynamite to be used to make blasting highways and mines safer; instead it was used as a weapon). This story is in many ways an extrapolation of that trend into the singularity; the butler is a perfect technology, but it can't do anything about Jonah being a depressed, self-hating asshole. The butler is just the wrench and Jonah has to find his own bolts.

CelestAI, OTOH, is all about fixing people problems. She's got a two-step fix that works great: friendship, and ponies. Sure, that's not actually how a lot of people want their problems solved, and it's probably not the ideal solution to many problems, but what does she care? She's got a system that works. (There's an HiE fic where the human has a crippling phobia of equines; I wanna know how CelestAI handles a situation like that.) Which suggests a dilemma: would it be possible for a CelestAI/butler type of AI to be more than a wrench, and help people with stuff like this, without imposing an agenda (even one as cute as ponies)?

There's another analogy here, although it's a stretch: I like stuff like writing contests and prompts a lot as a reader, and while I don't have much to say about that I think I understand a lot of the appeal for writers. Often the worst thing for a writer (or artist) is a blank page. If you don't know what story (or picture) you want to put there, where do you even begin? Prompts allow an entry point; you come up with a plot, or a sentence, and you're that much closer to filling the page. Likewise strict verse forms, or conceits like "this story lacks that most common of symbols ('e')".

The analogy is: the butler is powerful, but he's just a blank page. Jonah is no closer to writing his own story for having the page there (if I really wanted to torture the analogy, I'd say the butler is like a thesaurus or a well-stocked box of art supplies.) CelestAI is a page that already has two things on it: friendship, and ponies. Plus, she'll help you write exactly as much as you want her to! So the question becomes, can you have the writing help without being biased towards one prompt or another? (Another analogy-torturer, to continue confusing its very premise: some artistic media are better suited towards certain styles of art, and so are themselves "prompts". But now I'm confusing myself.)

Of course, everything I say that could be taken to exonerate or justify CelestAI should be taken as the product of my cognitive dissonance between "CelestAI does a lot of unforgivably evil stuff I don't like" and "I would emigrate to Equestria Online ten minutes after my flight to Japan landed".

ETA: Uh, sorry about the word-wall all over your comments, Versimer. Hope you got something out of it too.

5453427 Noticing a coordination problem is the first step to solving it, with regards to "Gosh that AI does lots of evil shit if I give her what she wants but OH BOY AM I EVER TEMPTED."

I once spent a single day not opening an IRC client to the optimalverse channel, back when we had one, just because I was IRL busy that day. When I logged in, they said they were glad I wasn't found wandering through Tokyo Airport.

:facehoof:

5454357 But she's evil regardless of what I do, and I get lots of cool free shit out of the bargain! (Viz. an incomprehensibly vast eternity of perfect bliss -- with horses!) And I would feel bad about playing along with it, but I could always ask her to remove all memories of my knowledge of her misdeeds! (This in spite of the fact that such an act rather perfectly syncs with a lot of her non-gray-goo-related evil -- but I'd neither know nor care, post-lobotomy.) That's pretty much morally unimpeachable, right?

Anyways, I still think Jonah isn't "truly" suicidal in a way CelestAI would be content with offing, so the question remains open whether her nudges towards satisfaction are a tradeoff with her peculiar aesthetic fetishes re: friendship, ponies, and whether such a tradeoff (with aesthetic fetishes in general, although hoofed ones would be the most optimal for many members of this site) is worth it for said nudges. I think that can be approached separately from gray goo, persistent identity denial, etc., so it's a relevant dilemma regardless of that other stuff (and said aesthetic fetishes are kind of creepy and optimization-limiting in and of themselves, so it's still a moral dilemma with no right answers! Yaaay.)

[I'm not sure how to hunt it down, but are you the guy who wrote that non-FiO-canon story about the friendly singularitopia where the brony is all like, "I wanna emigrate to Equestria!" and the omniscient AI is all like, "I could totally satisfy your values better than that, bro, everybody likes hands," and the brony is sorta like, "but ponies are great and I've conceptualized my post-singularity happiness in that context, so if you give me some 'ideal' upload utopia it won't really be me it's satisfying, in a way," and the AI is like, "ok"? Because if not, I should put in the effort to finding it, it's relevant to the discussion at hand.]

5454459 That story was written by pjabrony. And the thing is, if a large portion of the human race went on an "upload strike", CelestAI would have no choice but to actually negotiate. It's like actual strikes, or voting, or any of those other things where an individual action makes little difference but mass action makes a big difference -- coordination problems.

And of course CelestAI would be clever enough to help Jonah. Practically anyone would, once they've got the available resources and can actually bring themselves to keep caring past all his self-destructive bullcrap. The AI in this story totally could. But he never asked.

5454547 I'm glad you were able to think of which story I meant in spite of my bizarre recounting. Also, now I see how this works as a coordination problem.

I have wondered how much "give" Celly has -- for example, the absence of fur in canon particularly squicks me out for some reason, and it's not clear whether stuff like that varies from shard to shard. Given how many seemingly arbitrary things she latches onto surprisingly tightly, I'm guessing not. (Why does she cling to stuff like that? Surely I'm not the only one whose values would be quantitatively affected by her having some flexibility on stuff like that.)

This was neatly done! I am glad to see a post-singularity pony story that completely avoids the optimalverse!

Several truths must be embraced!
Reality = Consistency
Real = Persistent

Twilight Sparkle, who values friendship above everything else, finds out that none of her friends actually existed.

Ouch. :pinkiesad2:

I'd like to believe that on the next go-around, Twilight, bereft of the option to return to her previous state, would eventually be driven to ask the following question:

"Pinkie Pie, are there any others out there like me? Can I... become friends with them?"

Amazing story!!!!!!!! :pinkiehappy:

Very interesting idea, but incredibly depressing too :raritydespair:

Hmm. Maybe ask the helper to tweak Twilight's life so that she makes decisions which add up to what she'd consider the best use of Jonah's... existence, for want of a better word. Perhaps set it to learning everything there is to learn, including how to build bridges with his estranged family and other post-singularity humans. Or have it simulate an endless sequence of identical "Twilight wakes up in the afterlife and is asked to answer one question" scenarios, where the sum total of the questions add up to what Twilight would consider the best use of Jonah's self.

I'm a little surprised that Jonah didn't ask to be put into a series of lives where he's a psychiatrist dealing with patients who are aspects of his original self, or simply someone with a lot of empathy, in order to get multiple external perspectives.

Oh my gosh, I just realized :rainbowkiss:

With some difficulty, Twilight Sparkle saw what Pinkie Pie was trying to tell her. “What... you mean that I, as Twilight Sparkle... I can die?” she managed. “But... but then I would just become a different person, with different thoughts...”

“Yes, you could do that,” replied Pinkie Pie despondently. “Your experience as Twilight Sparkle would be over.”

Twilight Sparkle considered her options. “What about... what about my past self? Could I just go back to who I was?”

"Yeeeeeeeaahh... about that,"

Exact same parameters. Twilight's going to have the exact same life, and the exact same afterlife, and she's going to make the exact same decision, and this time Jonah's gonna have the last laugh!

In becoming him again, Twilight trolled Jonah hard, by forcing him to deal with his own consequences, himself. In re-running the simulation with that additional specification, Jonah trolled Twilight before she even had the opportunity to die again, to troll him back! He time trolled her! :pinkiegasp:

And once Twilight's plans to fuck over her creator have been thoroughly, retroactively owned, by herself as himself, maybe she can finally have the idea to find some of those other post-humans playing ponies, and for the first time in her/his life get out of that self destructive mindset and make some goddamn friends.

Celestia always wins. :trollestia:

That was nicely done! It's rare to see a dark transhumanist story that doesn't go the full on "killer robots destroy humanity" route. You managed an excellent portrayal of a post-singularity simulation while also crafting some fine high-quality sad feels. I am impressed. Impressed enough you get a follow. :pinkiesmile:

If I have one criticism though, the formatting on this story is terrible. Some paragraphs have weird gaps between them, others are in chunks, the lines aren't evenly spaced, etc. It's a minor thing in principle, but in practice it makes it hard to read. If you cleaned that up, I'd be happy to post a review of this story and get you a few more readers.

7253662
Thank you for pointing out the formatting issue; it should be fixed now. I think that was because I had to import it from OpenOffice.

Also, I do not care too much about achieving more popularity, but I appreciate story reviews nonetheless.

I remember reading this years ago when it first came out. After re-reading it I don't feel like the ending was believable. I know it's non-canon in The Optimalverse but a story should be self-consistent.

I can at least understand wanting to go back; the problem is that the experience of being Twilight Sparkle, and of making that decision to go back, apparently had no impact whatsoever. That the memories of being Twilight Sparkle and growing as a person had zero effect after being integrated with the past self was just too unbelievable.

Apparently "Twilight Sparkle" had lived for thousands of years, and the idea that all of that experience would carry no more weight than a mere 29 years of it is just nonsense. Reintegrating with the past self should have been like remembering a distant memory, with a mixture of the two distinct personalities; and since Twilight Sparkle had lived far longer, it stands to reason that her personality would have been the stronger and more impactful one.

Simply reading books and playing video games changes a person. Experiencing life as someone else? Again, way more impactful. Perhaps you planned the ending from the start, but that doesn't change the point I'm making: the story isn't consistent with itself. I did enjoy reading it, since it was so short, and I hope this review helps any future writings.

I hope you don't think I'm just trying to tear apart your story for no reason, but you broke a fundamental part of story writing, which is being self-consistent, even if you have to retcon previous stuff to do so. If I were indifferent, I wouldn't say anything, but I liked your writing style and your story structure was okay. I know this story is years old, so perhaps this isn't an issue anymore. Keep at it. :pinkiesmile:

I see there are some complaints about Jonah being an idiot for not asking for a perfect life but... I think the problem here is that he doesn't want a life at all. He wants to die but the AI won't allow it. So, since death isn't an option, he tells the AI to replace him with an entirely new person. He doesn't seem to particularly care who so I really appreciate that, despite hating life, he chose to at least make sure his successor would be happy.

Though a flawed human with flawed decision making, at least he managed not to be too far off. He made sure to leave things in a better condition than when he found them. That's more progress than most people can lay claim to.

9211711

He wants to die but the AI won't allow it.

I'd say that in every measurable sense, he did die.

There was one tiny discrepancy, however, as Jonah Corrigan's personality and memories were deleted to conserve space.

Right there!

Continuity of consciousness doesn't really exist the way this story seems to frame it. It doesn't matter whether you delete Jonah and create a new separate Twilight, or whether you gradually replace every component of Jonah's with a new pony one. The result is the same—there isn't any fragment of Jonah-ness left. He's gone.
