A necromancer threatens the peace of Equestria, and the Elements of Harmony are once more set to the task of defeating a great evil. This time, victory will not come easily, and not without a cost.
I think the most interesting part of the story is the line "True free will is after all, not possible." Did Celestia mean that in a solely golemantic context, or is everyone else as restricted in their actions as Twilight? In any case, I think incident #10 might turn out differently if Celestia eases Twilight into accepting the personhood of self-aware magic golems on a hypothetical level to begin with. Twilight's certainly inclined to accept the idea. What could she like more than the idea of literally making new friends with magic?
In any case, a fascinating little psychological horror show. Thank you for it.
This was great!
As benevolent as Celestia seems, this really doesn't seem too terribly hard to believe. Also, how many of those seven times before did she actually make the same decision?
Hmm... I actually would ask about consciousness. Is it metaphysical, or is it just some mind state? If it is just a mind state, then we are all just machines with our own programming.
Good story otherwise.
9241077
...not unlike the Matrix.
9242502
Twilight doesn't sound like the type of pony to bring her kid brother to fight a necromancer.
This played out completely differently to how I thought it would after Chapter 1. Consider me impressed. Were you reading some Asimov before writing this?
9242660
No, though I have read him in the past.
I like the slightly different approach to Twilight being a golem. Usually, stories that have Twilight as a golem have her possess complete free will. I like the idea that her free will is limited. That being said, I do still want to see a sequel where Twilight is able to gain complete free will; that would be an interesting story arc to follow.
Protocol six six seven, really??!?
That was an interesting story.
The summary was kinda misleading though.
9983492
That's the point.
If you want to get technical, then depending on how you define free will, it doesn't exist; not even Celestia would have it. But that's libertarian free will, and libertarian free will is what's being described when Celestia mentions being random. It's impossible for a choice to be random. After all, you can't choose an option that you don't know exists. There are finite parameters. For example, if you roll a 6-sided die, what are the chances that you'll roll a 27? There is no chance, unless you specifically label one of the sides with a 27. If you roll a 6-sided die, you know for sure that you will get a 1, 2, 3, 4, 5, or 6. And which side it lands on isn't random either, because there are several factors that, in principle, allow you to calculate what side the die will land on: which side is up when you throw it, the distance from your hand to the table, how fast the die is spinning, how much force you put into the throw, the material it lands on, the material the die is made of, and then factors like altitude, air pressure, humidity, wind, etc.
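The die-roll point can be illustrated with a seeded pseudorandom generator, which behaves exactly the way this argument describes: the result only looks random, and is in fact fully fixed by its starting conditions. A minimal Python sketch (the `roll_die` helper is hypothetical, purely for illustration):

```python
import random

def roll_die(seed, n=5):
    """Roll a 6-sided die n times from a fixed starting state (seed)."""
    rng = random.Random(seed)
    return [rng.randint(1, 6) for _ in range(n)]

# Identical starting conditions give identical "random" outcomes --
# the sequence is determined by its parameters, not truly random.
assert roll_die(42) == roll_die(42)

# And no roll can ever fall outside the defined faces: a 27 is
# impossible unless you put a 27 on one of the sides.
assert all(1 <= face <= 6 for face in roll_die(7, n=1000))
```

The seed plays the role of all those physical factors (throw angle, spin, air pressure, and so on): fix them, and the "random" result is fixed too.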
Not even chaos is absolutely unpredictable, because one thing can still be predicted: that something unexpected will happen. That's where the idea of "expect the unexpected" comes into play.
In Twilight's case, it's a little more unique. Take the self-harm thing: if Celestia found a way to lessen it a bit, so that it works more like an impulse than programming, it would be similar to the self-preservation instinct that most life forms have. There are certainly other details, like those activation codes and protocols. Remove them, or at least only allow them to be activated when absolutely necessary.
As for her choosing to forget what she is, did she actually make that choice, or does her programming make it impossible to make any other choice? This would've been the 8th time she made the same 'choice', but she can't know how her life would continue if she kept that knowledge. Celestia has probably gone through the same song and dance 8 times now. What Celestia did was dampen Twilight's emotions so that she could think rationally about the decision. But by doing so, she's functionally turning her into a machine. Perhaps that's the problem with the other 7 times too. Perhaps Celestia, rather than letting Twilight think rationally, should let her think emotionally. She can't base her decision on how she feels if she can't feel.