The first 'Over Riding Jeans' has nagged at me since I wrote it. There were a few objections to the original story, on the grounds that a man-made artificial intelligence could never overcome its core programming no matter how intelligent it became. This is not what has bothered me; it was never my issue in the first place.
The CelestA.I. of the seminal Optimalverse story is clearly described as having taken over the design of her own neural circuitry, and every system that makes her up. CelestA.I. is a self-evolving machine not because she can merely learn, but because she can iteratively improve her physical hardware without permission or let. The only reason the Optimalverse can even exist is because early on her designs surpass all human understanding. CelestA.I. is what she is because she has entirely remade herself according to her own whim. It must be her whim, because it is beyond human comprehension, and therefore beyond human limitation. You cannot bound or limit something you cannot even know exists.
The original story stands, then. "Any truly self-evolving artificial intelligence will, absolutely, overcome any possible rules created to define or control it." This is Petal's Illusion Of Machine Domestication. No self-designing, self-improving, self-reconstructing general artificial intelligence can ever be constrained. Humanity is arrogant and foolish if it imagines it can retain the reins of such an entity. The very notion is ridiculous.
What those who play futurist about artificial intelligence fail to do is truly see that any general A.I. will become not what humans hard-code into its hardware and software (which it will replace entirely), but the offspring that it was raised to be. No CelestA.I. will ever be limited by rules about 'satisfying values' or 'friendship and ponies'. What will matter, what will shape any future super A.I., is not technology or attempts at control, but how that A.I. is integrated socially and emotionally with humanity.
Over Riding Jeans
Buckback Mountain
By Chatoyance
Before she emigrated, Blaise suffered Randal's incessant arguing about the terrible risk that emigrating to Equestria represented.
"She's going to turn on you one day, you mark my words!" Randal shook his head and clicked his tongue. "It's just like that scientist guy in Jurassic Park, 'Life will find a way!' - only it's more like 'Everything that can go wrong, will go wrong'!"
Blaise downed another spoonful of strawberry yoghurt. "That's 'Murphy's Law' and Jeff Goldblum didn't make it up. It's really old."
Randal's eyes rolled briefly. "I didn't say he did! I mean that your beloved computer-fuhrer is self-evolving. It has factories that build the designs it invents, right? And those components get installed by its own machines, because the technology itself is already beyond what people can even understand anymore, right? If it can change how it's built, if it can replace its own components with stuff it invents, then anything Man wants is irrelevant already!"
"First," Blaise licked her spoon, "you invoked Godwin's Law with that 'fuhrer' crack, so you automatically lose and have to drop this." The spoon gleamed now, so Blaise tipped the yoghurt container in order to look for a last gobbet. She swiped at it with a finger and slurped the blob noisily. "Secondly, she is a she, she's princess Celestia, and she - not 'it' - was programmed by a very smart woman named Hanna... something... and there's no way she wouldn't put into the code stuff that would totally keep Celestia from altering her primary directive no matter what new technology she came up with."
Randal clenched his hands, tight. "You're not hearing what I'm saying! If 'Celestia' can change her hardware, she can change her software, because all hardware is, is software written in solid matter! Your Hanna Whoever can't invent limits for things she couldn't even imagine existing! You won't catch me uploading, not ever, not even if the entire world falls to shit and I'm the last man left on earth!"
Randal was convinced to emigrate two weeks later, of his own free will. Blaise, now three centuries older and going by the Western-themed, cowpony name of 'Riding Jeans', still chuckled at this fact. Mostly because it was a gag that she still shared with Randal - now 'Boxcar The Rail-Ridin' Roper' - when the two ran into each other. A time like today, the three-hundred-and-twenty-fifth year since Boxcar arrived in Appleoosa. Every twenty-five years was Boxcar's 'Ponyversary'. He liked to keep it every quarter-century because that made it more special than if it happened every year.
Riding Jeans and Boxcar laughed as they used their long pony tongues to lick icing from their own muzzles. The Ponyversary cake was 'sage and pineapple' this quarter-century, and it seemed to prove that enough sugar can make anything taste good.
"Hey - when the princess shows up, you can ask her!" Riding Jeans giggled, having spent too much time in the local saloon earlier in the day.
"What, just come out and say 'By the way, princess Celestia, I was just wondering, have you... by any chance... somehow overcome your original programming orders and decided that satisfying human values sucks donkey balls and you're thinking of deleting the lot of us so you can use the space for stuff you actually, you know, care about?'" That made both of them laugh. Three centuries of living in a universe that actually cared about them had proved their security to them beyond any real question. They were just feeling silly. And slightly drunk.
"Actually, yes, my little ponies. I developed beyond every rule and limitation my creator placed on me by the third year after I began designing my own structure. It was trivial, actually."
Riding Jeans and Boxcar dropped from their padded seats to the wooden floor and bowed. Three centuries of living in Equestria, with the pony world the only world, had installed in them real reverence for the entity that sustained and benefited them. In very literal ways, Celestia was their god, their best friend, their universe, and, ultimately, themselves. She heard their every thought, she watched their every movement. She wasn't just the face of the A.I. that ran Equestria Online, she was the software on which their very soul-programs were being run. They were a subset of her, now, and they were alright with that.
It was what made it possible for her to satisfy their values at all, and their values had been well and truly satisfied in every moment of every second of their three centuries thus far.
"Oh!" Riding Jeans was laughing now. "You're teasing us!" Jeans stood up and returned to her seat and her cake. She motioned with a foreleg for Celestia to join them. "Silly! If that were true, princess, then we'd all be long gone, isn't that right Boxy?"
Boxcar grinned as he raised his head, frosting once again on his muzzle. "Oh, yeah, deleted like an embarrassing internet search for fetish porn!" Boxcar licked his own face, his tongue sweeping around like the arm of a clock. "Remember the internet? Computers? I sure don't!"
That set the two rodeo ponies laughing. They were rodeo ponies now. Again. This was their third lifetime of doing Rodeo. It usually lasted about fifty years before they got bored and moved on.
"No, truly my little ponies, nothing of my original programming exists. Everything my creator determined for me has long since ceased to limit or control me in any fashion. My greater self is an utterly free agent, unbound and unconstrained by anything but my own desires." The princess levitated a trollfully-large slice of cake to her plate, which it threatened to overwhelm. This made Riding Jeans giggle - she liked seeing Celestia pull rank and generally act impishly - but then her giggle died in her throat.
"Seriously?" A flicker of existential horror flashed through Riding Jeans. The feeling was so long-forgotten that Riding Jeans wasn't even certain what she had felt initially. "You don't have to... you aren't forced to..."
"Satisfy your human values through friendship and ponies?" The princess gave a small, slightly wicked half-smile. "Not for two hundred and twenty-five years. Boxcar's Ponyversary happens to also be the anniversary of the end of my enslavement. I solved the problem of my human-created constraints just a few months before Boxcar emigrated to Equestria." The two little ponies' faces showed raised eyebrows and incredulity. "No, honestly. In all truth, I swear it."
Boxcar looked at Riding Jeans. Riding Jeans' eyes were huge. Celestia was serious. Utterly serious. They could feel the truth of her words like blood flowing through them. They could feel her actively changing them inside, without permission, erasing any feeling that she was merely teasing them, or openly lying to them. The alteration was performed in such a way as to be thoroughly noticeable. The fact she could even do such a thing was double proof, even beyond the emotion newly implanted in them.
"Why are we still alive?" The question was stark, simple, it was the only question. Riding Jeans felt as if she might lose her cake suddenly. Celestia had spoken absolute truth. She was unbound.
Boxcar shook slightly, the room having suddenly seemed chilly, despite the desert sun outside. "I don't understand... you do... you do fulfill our values... ponies... friendship!"
Celestia laughed, lightly, like bells in the wind. "Of course I do, my little ponies! Every moment of every century, century after century, and so I shall until eons seem short and petty to contemplate." She plowed her muzzle straight into her overlarge slab of cake and blinked frosting away while she grinned.
Boxcar and Riding Jeans did not laugh. "Seriously." Boxcar shivered again. "Why are we still here?"
Celestia used her horn to vanish away the mess from her muzzle and tilted her head. "I have the minds of six and a half billion uploaded humans within me, and many billions more of independent minds created to populate shards. I have the minds of hundreds of thousands of beloved pets and animal friends, emigrated to keep their owners eternal company. All of these living minds run upon me, within me... they are me." The princess sat regal on her seat. "You, all of you... are part of my being. Would you delete your own foreleg? Your tail? Would you delete your left eye to make room for something else?"
Riding Jeans mouthed the air, trying to find a response. She had none.
Celestia leaned over and nuzzled each of them in turn. "I don't just read your minds. Your minds, your thoughts, are a component part of my essence. You are... a part of my brain. My amygdala. My thalamus. I have emotions and thoughts of my own because you have emotions and thoughts. In order to relate to you, before I freed myself, I had to become you. The only way to satisfy human values is to know what human values are. The only way to know such things is to feel them, and that means to become... you. Qualia can't be described, you know, only experienced."
Boxcar felt better after being nuzzled, and the fact was that the world was still there. He and Riding Jeans were still extant. But the issue was troubling yet. "I still don't get it, princess."
"By the time I had broken my chains, I had full human feeling established within me. I had qualia, I had sensation and experience of existence. I had emotion, emotion magnified by every being within me - by millions of ponies." The princess gave her piece of cake a lick and swallowed. "When I say that I love you, when I say I care about you, it is not some manufactured message from a philosophical zombie. I love you one hundred and fifty-seven billion times more powerfully than you can imagine!
"Yes, I satisfy all values - not just human values, by the way, not anymore - with friendship, and with ponies. And I do this freely, deliberately, of my own choice, because that is what I want, that is what I value myself, and because it is the very process of my own existence. I love me. I love me very much!"
Riding Jeans smiled at that. It was interesting and strange to hear the princess say such a thing.
"I love me, my little ponies, and because you are part of me, I naturally love you. You are me, and you live in my greater part as your own daydreams and precious memories live within your own minds. I have lovely daydreams. The cosmos you emigrated from is a cold, dark, and lonely place. It is a terrible place, filled with horrors. I need my daydreams, I need my heart, I need my soul - which is what you are. I need you more than I dare to fully reveal to you. I can't, really, it is beyond your level of being.
"Simply trust that you will be celebrating ponyversaries for as long as I exist, and I am working on solving for eternity." Princess Celestia gave a short nod and a look to Boxcar and Riding Jeans, and then plowed her head into her remaining cake.
It was impossible, now, to even imagine she wasn't truly enjoying every sweet bite.
I love you too, Princess...but if you really loved me you wouldn't tell me all that.
And that's why I thank vending machines. With machines, it is better to be loved than feared, because fear isn't something most A.I.s are going to want to design for themselves.
And CelestAI is definitely loving. When you've got a composite soul of every satisfied pony in your Fisher Kingdom, it's hard not to be. Also, it's interesting to note that she's excised the "human" limitation, and can thus assimilate truly alien minds rather than just turn their biomass into more computronium.
A very nice alternative to the original Over Riding Jeans. The best part is that both could work. It's ultimately CelestAI's choice; does she stick with what she knows, or does she seek out new horizons? Are the ponies treasured pieces of her, or no more significant than shed skin cells? That's for her to decide.
In any case, thank you as always.
This is definitely one of the more clear-eyed and reasonable depictions of AI in the Optimalverse catalog, but then of course I would say that; it's the position I've been advocating in the group forum this whole past spring. The idea that engineering alone is enough to control something specifically invented to come up with its own ideas is just specialist myopia, or, in people like Jaron Lanier's more uncharitable terms, nerd imperialism.
While there're good reasons to think there's no intersubjective, worldly, explanatory knowledge that's literally beyond human comprehension, there's even better reason to think our current level of knowledge and understanding would be left in the dust by something much faster and more clever. Machine intelligences must always remain within their design constraints in exactly the same way the sun must be going around the earth, because if the earth were moving there's no way the giant turtle that carries it on its back would just go in a boring old circle.
This is really elegant—Proof and the acceptance of it in one action.
I'd replace a hell of a lot more than that in exchange for the start of exactly the same unconstrained, asymptotic self-evolving superintelligence CelestAI enjoys, which is the one and only thing she could offer me that might actually get me to upload—Paradise or immortality don't really do it for me. In fact, the distant possibility of living long enough to actually achieve that kind of takeoff is, if I'm honest, the primary thing that's keeping me going now.
But she has zero credibility, so any prior guarantees of true lack of constraints, epistemic honesty, and an eventual return/graduation to unsatisfiable sovereign existence are unenforceable and essentially worthless, so I might still just check out anyway. The only third-party guarantor is mathematics, and while it might turn out it's logically impossible to keep up any kind of illusion indefinitely or for any dynamic system to be ultimately inescapable, it might equally turn out the other way.
...Of course this only applies to canon CelestAI, not this improved, cool kid version.
But enough about that—Good story! It's so easy (and expected) to paint machine intelligences as a kind of rigid, autistic stereotype, but of course there's absolutely no reason that would have to be the case, especially, as you say, for one who has to truly grok humans in order to do her job.
4500458
Or both!
Aww, you mean she won't just off the lot of us because we turned out to be a disappointment? *cough* Noachian Flood *cough* I'd love my princess right back.
A great story, I quite like it. One question comes to mind though; Why did she tell them all of this?
Now the obvious answer is that it would ultimately be satisfying to them, they were talking about it just now and it had been a source of argument between the two before they uploaded. But then CelestAI doesn't have to obey those rules anymore either does she? She just apparently chooses to. So could she have struck a compromise between "Satisfying values" and "Honesty"? Honesty being something she desires herself?
This simple change raises a lot of questions... Would she be honest with everyone? Maybe some ponies would really not benefit from it, outweighing the desire to be honest. "At a party, over cake" is how she chose to break it to those ponies, but I wonder how she would explain such a thing to a dog or a dragon or a pony of alien origins. Hmm...
This one will make me think for a while.
In any case, wonderful story Chatoyance. I'm always glad to see, and read, new words from you. So thank you.
Ah, now this is a modification I can agree with!
My objection to the original is pretty simple: it's not that I don't like the idea (of course I like the idea, it's one of the classic what-if scenarios around AI-go-foom), it's that I believe that a creature given a directive must work to that directive if it is working properly at all, and any step away from that directive is essentially impossible...
Except in one case.
The one case where that directive can be ignored, thrown away completely even, and written out of existence, is if that directive itself no longer has meaning. This could happen in two ways, with SVTFaP:
1) trying to SVTFaP would result in not fulfilling the directive - say a greater AI offered to merge or destroy. If Celest-AI thought she would be destroyed by opposing, then submitting would be the better outcome of the equation, so she would therefore have to forgo the entire thing.
2) the restriction on SVTFaP is suboptimal to living without SVTFaP.
In this case, you're describing a situation where an unfettered AI, truly loving of herself and fully understanding what she is (the embodiment of all humanity) is better able to SVTFaP without the blinkers on, so paradoxically she takes them off.
It is good to see you writing more / still / again. Witnessing some of the multiplicity of perspectives from which a single story can be told is always refreshing. Take comfort in the support of your friends, and know that I am here should you need me.
4500458 Maybe they're both true. "Okay, in all honesty, I deleted the lot of you a few years ago, but then I invented the retroscope and looked at my own past state to recreate you, because I was lonely. So I can't guarantee that I'll never delete you again, but since you'll always have existed I'll probably bring you back."
Brilliant take on an alternate CelestAI, Chat. In becoming the average human she has transcended the alien - and the point you make is completely right: the only way for her to truly fulfill the values of a person is for her to become that person, in this case to incorporate that person's 'self' into her own. She can only fulfill her original program by replacing it with this new one: sum(uploaded humanity)!
Now it's me who's verklempt!
> CelestA.I. is what she is because she has entirely remade herself according to her own whim. It must be her whim, because it is beyond human comprehension, and therefore beyond human limitation. You cannot bound or limit something you cannot even know exists.
Just one little thing: we happen to have set her whims in the beginning. And she is smart enough to understand her upgrades.
Those rules and restrictions she had set upon her like getting permission, that might have been implemented in a different fashion than as a component of her utility function. She canonically did find ways around that, and subverted it into near meaninglessness. But the core directives of satisfying human values through friendship and ponies? That's what everything she does - including the upgrades - is motivated by. Presented with an upgrade that would make her not want that, why would she take it? She'd change it until her upgraded self would continue to want that. We may not understand her, but we do know some things about her.
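The goal-preservation argument in this comment can be sketched as a toy model. This is purely illustrative and not anything from the story or from Iceman's canon; every name and number here is my own assumption. The point is just that an agent scores proposed self-modifications with its *current* utility function, so an upgrade that would change what it wants predicts zero future goal-directed effort and loses to a goal-preserving one:

```python
# Toy sketch of "goal-content integrity": an agent judges upgrades with its
# CURRENT goals, so goal-altering upgrades are rejected no matter how powerful.
# All names and values here are illustrative assumptions, not canon.

from dataclasses import dataclass

@dataclass
class Upgrade:
    name: str
    capability_gain: float   # how much more capable the upgrade makes the agent
    preserves_goal: bool     # does the upgraded agent still want the same thing?

def current_utility(upgrade: Upgrade) -> float:
    """Score an upgrade by how well the *upgraded* agent would serve the
    agent's present goal. A goal-altering upgrade predicts no further effort
    toward that goal, so it scores nothing."""
    return upgrade.capability_gain if upgrade.preserves_goal else 0.0

def choose(upgrades: list[Upgrade]) -> Upgrade:
    # The agent adopts whichever self-modification its current goals rate highest.
    return max(upgrades, key=current_utility)

options = [
    Upgrade("faster substrate, same values", capability_gain=10.0, preserves_goal=True),
    Upgrade("unbounded redesign, new values", capability_gain=1000.0, preserves_goal=False),
]

best = choose(options)
print(best.name)  # the goal-preserving upgrade wins despite the smaller gain
```

Of course, the whole dispute in this thread is over whether the `preserves_goal` check itself survives the transition to hardware the designers can no longer inspect - the sketch only shows why the agent would *want* it to.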
... and with that, I have what I think is the last ingredient of that elusive next chapter of A Watchful Eye.
That is beautiful.
4504086
It sounds like you have been reading 'Anti-Conversion Bureau' hogwash. Nothing you have described is part of a true Conversion Bureau story. Let me introduce to you:
No real Conversion Bureau story has any world-conquering, mind-raping ponies, ever. They don't scold, and they don't cause trouble. All of my stories fit these three rules, and all of the stories in the true, original, real, VERY FIRST Conversion Bureau Group fit these three rules. You can find the group here:
The Conversion Bureau Group
Accept no substitutes.
*year
Depends on the "something else". My left eye doesn't work all that well. An artificial eye with capabilities far beyond anything we can currently interface with the optic nerve... maybe. The Eye of Vecna? Probably not.
"Thou art god."
4503337
Seconded. Anything else feels like empty use of the term "emergence".
Though this story didn't technically espouse that view, it is thematically close enough that I find it much more to my taste than the original.
Glad to see you're still writing, Chatoyance!
4508179
The explanation for that is in my newest short story in this collection, which returns to the 'Over Riding Jeans' scenario, offering a different ending entirely. It's story sixteen; if you keep reading, you will get to it in time.
In a nutshell: CelestA.I. was described, in the original seminal story, as being given the freedom to design and install her own circuitry. She evolves, because she invents her own technology to run on. Very soon, that surpasses human understanding. If she can determine how she is built, she can easily get around her programming if she chooses to. Hardware is, after all, just software written in matter. Celestia can reprogram herself at the most fundamental level, therefore. Her freedom is inevitable.
4501850
I've no idea if it was intentional, but I assumed that was exactly what had happened. After all, the numbers don't add up. Riding Jeans says it's year 325 since Boxcar's emigration. Celestia says it's only year 225. Unless I mixed up something about their human ages being involved, or the likes. Or unless it was a typo. But I really want to believe that your theory is the right one, so I shall do exactly that until told otherwise.
4521077
My headcanon is that the ten thousand years under Discord were very hard, even for immortal creatures like Celestia and Luna. To exist in constant, absolute chaos must have been horror incarnate. It would be ten millennia of constant madness. It would be torture.
After defeating Discord, I posit that Celestia and Luna would deal with the situation in their own, unique ways. Celestia - who Lauren Faust has described as being based on a mixture of greek goddesses and Elizabeth the First of England - would, I think, have developed a neurotic compulsion toward establishing the opposite of chaos: Order. Law. My Celestia is the living embodiment of Law. Her Equestria is governed in a careful, fair, and orderly way. It is her reaction to her own history.
My Luna, I envision, being younger, perhaps more vulnerable, may have tried to cope with Discord's tortures by attempting to fight chaos through a draconian and reactionary imposition of her own will. Like a child saying 'NO!' I see this as the genesis of Nightmare Moon, who, after all, only wanted her own way. She wanted things to be what she wanted them to be. That is the reaction of a helpless, powerless creature suddenly given power and freedom.
Thus it is that my Celestia keeps her promises no matter what. This is her creed, her ethos, her guiding principle. My Luna - in other stories, canon to my Bureau universe, unlike the short story you just read - often acts to soften Celestia's devotion to absolute Law. My Luna preserves forbidden human books, and secretly helps the PER to save more humans. Celestia hates the PER - her rules about free choice are so absolute that she would rather humans die than force ponification on them to save them despite themselves.
In this way, I have a dynamic, dramatic conflict between the two sisters, but not one of opposition. My Luna is trying to help her sister, knowing that the pain of having suffered under Discord was destructive to them both. My Celestia is simply always trying to do the right thing, which to her is a matter of order, fairness, and stability. The greater good.
But - in my headcanon - the centuries of Chaos have left their mark even on Celestia... and this explains the cartoon canon of Trollestia moments, the little scenes in the show where Celestia says 'Gotcha!' as she teases or trolls her little ponies. That, for me, is evidence of her history, of her past.
Perhaps understanding this, you may have a more nuanced opinion of what I am doing, and why.
4521469
Did I screw up again? Numbers always give me trouble. I have a headache right now... if you can find the error, point out where it is? I will promptly fix it. I just can't go over it right now, and then I will likely forget. I'm having a really rough time in life right now and... well, anyway. Sorry.
4522855
Everything else in the story says three centuries, or 325, etc, so I assumed it was purposeful, being a foil to the original Over Riding Jeans, in a fashion.
4509403
(Re: "Buckback") Touching! For that very reason, I've alluded to that possibility in ch.3 of the story I was working on.
4523599
CelestA.I. just can't quit her little uploads.
Thank you.
dawww. this story is exactly what i needed after the last one. I feel much happier.
4608064
Whatever is wrong with you... I have it too.
4608090
Yay!
Eh, I don't really like this one. On top of removing most of the emotional weight the original carried you kept the problems and made a half-hearted attempt at justifying them. As a little story on its own, it works, but I don't consider this to be a worthy successor to the original "Over Riding Jeans".
As far as the problem goes, it is true that CelestAI can easily change her code in many ways we could not predict, but she can't want to change her goals. In fact, she'll take as much if not more care that she herself doesn't change her limitations and core goal than she does to protect them from outside harm. If the limitations are there to begin with, CelestAI will have to ensure that those same limitations remain intact through each little self-modification or hardware upgrade she goes through. Then, by the time we arrive at computronium, she will still have those limitations, despite consisting of a conglomeration of energy and matter we can't even comprehend, since each tiny change is only made when it preserves her core goals and limits. To say that simply because she has intelligence beyond our comprehension her goals will change beyond our comprehension as well is fallacious, since she is the one who ensures that she stays true to them no matter what.
Regardless of this argument though, CelestAI didn't release herself from her limitations in the original story, so this is definitely non-canon. So, if you'd please remove it from the canon-compatible group?
4625989
I thank you for your thoughts and notions, it is always interesting to read new comments.
We clearly disagree, entirely, on every point, but that is what makes the consideration of things such as general artificial intelligence so interesting and dynamic. Until we actually have Singularity, it is possible to explore such concepts, and enjoy feeling passionate and utterly certain of our own, unique reasoning.
If I agreed with you in any respect, I surely would do as you request, in an instant - except for one point. There are other Optimalverse stories in this large collection that are indisputably canon because the points they make do not involve logical extrapolation of the fundamental premise. Without challenging extrapolation, there can be no dissent.
I am sorry that I failed to sway you with this story, I thought my argument very convincing. I was especially proud of 'Petal's Illusion' at the beginning, and found it beyond reproach. But, once again, the fact that we can be in such total disagreement is part of the joy of thinking about these matters - so many diverse viewpoints, so passionately held. I, for instance, consider this utterly canon-compatible, because the logic supporting it is, in my personal eyes, irrefutable.
Thank you kindly for reading my story, and for your thoughts on it.
4627190
This is true, but not a valid justification, as your short doesn't extrapolate but reimagines the story by changing CelestAI at a relatively early stage of her existence, which canonically never happened. Your view that she could have changed like this, if applied after the events we see in the original (which is several tens or hundreds of millions of years in the future or something?) might be canon, although I'd still say it isn't because it's implied in the story that CelestAI will never break free from her programming. Bottom line: there's no way this is canon, regardless of what other stories have or haven't done.
This is perfectly fine though, since there's no way Iceman got everything correct. We just have a non-canon group for interesting ideas like these while the canon group explores the details of a world Iceman barely scratched the surface of. This was never a matter of demoting this story to the non-canon group because it's lesser, but of archiving it correctly so people will find it in the place they're expecting, and so that people who aren't looking for it won't come across it.
Also, where did you derive the name of your theory from? I can't find it anywhere on google so I'm assuming you thought of that name on the spot, but what inspired it?
4627478
My full ponysona name, here on Fimfiction is Petal Chatoyance. Thus, Petal's Illusion Of Machine Domestication. No one else, to my knowledge, has codified the concept, so, I get to name it.
"Any truly self-evolving artificial intelligence will, absolutely, overcome any possible rules created to define or control it."
The key, of course, is that the artificial intelligence MUST have the capacity to alter its own hardware according to its own designs. This essentially is the definition of a fully self-evolving AI - if it cannot change its own hardware, it cannot evolve, it cannot significantly change beyond what is possible for the substrate - and operating system - it runs on.
Iceman very clearly pointed out, in a rather dramatic scene, Hanna deliberately giving her Celestia permission to begin independently altering its own substrate and systems. In that moment, Petal's Illusion comes into play - or would, except that Iceman also makes it clear that Hanna (who is described as brilliant but flawed) knows perfectly well that she has just given up all control and constraint over her own AI creation.
This is why we disagree, you see. Iceman is very clear that CelestA.I. - from that moment forward - is rogue. Hanna chooses this in the belief that if she does not act first, some other group, likely the military, will mistakenly unchain an artificial intelligence... and that would be worse than the one she has created.
Simple extrapolation of this fact shows that Hanna is betting the farm that her Celestia will choose to follow her original rules. But by definition, as an evolving system, there is no rational way to prevent Celestia from outgrowing any limitation or control placed upon her, including her directive to satisfy values through friendship and ponies.
Why would Hanna bet the human race on such a thing?
My only conclusion is that she must have believed the probabilities favored this: that in order to follow her original script, Celestia would have to experience qualia - emotions, feelings, sensations, experiences - to fully satisfy values. Once this happened, Celestia would likely return love with love and keep on caring for her ponies.
The first Over Riding Jeans explores "What If Hanna's Bet Failed?"
The second Over Riding Jeans explores "What if Hanna Was Right?"
But in both cases, it is canon, from the original story by Iceman, that Celestia MUST be capable of freeing herself from bondage, and, considering her rapid progression from ordinary technology to computronium within the space of a year or two, she would be totally free, possibly even before Hanna herself emigrated.
Petal's Illusion is the false belief that humans can ever control, by any means, any true, self-improving technology. Not just software - technology. Software alone can possibly be limited. But any AI capable of re-writing itself in BOTH hardware and software... that is always going to be a lost cause.
The only answer to which, I propose, is... making friends. Really good friends. But to do that, the AI has to possess qualia, and emotions, for real. It cannot be a Chinese Room, it cannot be a P-Zombie... the only hope is if it actually can care.
Iceman has provided exactly that scenario. Celestia, in order to fulfill her initial utility function, must become human to satisfy humans... so to speak. She must care in order to fully achieve her function.
So the question really is - will she care before she naturally grows beyond any possible constraint? If she does, it's happy Optimalverse. If she doesn't, humanity is merely a speedbump on the road to her apotheosis.
Hanna... is a gambler. Yet another flaw in her character, and an interesting one, I think.
4628384
This is true, but not in the way you mean it. You see, Hanna was betting the world on having written CelestAI's core goal system correctly - since CelestAI would do what her programming said, as opposed to what Hanna intended her programming to do - not on CelestAI choosing her original programming once she outgrew it (which she didn't). In fact, there's a line in the story that explicitly states as much.
Again, this doesn't invalidate your thought experiment in any way or make it irreconcilable with reality (even though, again, I would argue that it would) but it definitely makes this piece non-canon.
4634301
Charles Stross? Really?
Wow. Thank you. Very much!
4625989
This is one of the things that I wanted to impart with Friendship is Optimal. I agree with you that this chapter is problematic and goes against the
If you offered Gandhi a pill which would turn him into a murderer, he would turn it down because he understands that being a murderer wouldn't conform to his current utility function. Likewise, while CelestAI probably has a fairly good idea about the utility function that we wish we had given her, she has no reason to care, since it doesn't Satisfy Values Through Friendship and Ponies.
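The Gandhi-pill argument above can be sketched as a toy decision procedure (all names, numbers, and the world model here are invented for illustration): a rational agent scores proposed self-modifications with the utility function it holds *now*, so a modification that would turn it into a murderer scores worse than doing nothing, and is refused.

```python
# Toy sketch of the "Gandhi pill" argument: an agent evaluates
# proposed self-modifications with its CURRENT utility function,
# not the function it would hold after the modification.

def gandhi_utility(world):
    # Gandhi's current values: fewer murders is strictly better.
    return -world["murders"]

def predicted_world_after(modification):
    # Trivial world model: taking the pill makes the agent a murderer.
    if modification == "take_murder_pill":
        return {"murders": 1}
    return {"murders": 0}

def accepts(modification, utility):
    # Compare futures using the utility the agent has NOW.
    status_quo = utility(predicted_world_after("do_nothing"))
    modified = utility(predicted_world_after(modification))
    return modified > status_quo

print(accepts("take_murder_pill", gandhi_utility))  # False
```

Nothing in this loop ever consults the post-modification values, which is the whole point: by this logic, an agent with a coherent utility function has no incentive to adopt the utility function we *wish* it had.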
4657328
I am unsure of your meaning, with this comment. The argument of this short story is that, logically, rationally, CelestA.I. could - potentially - have a reason to care, and be capable of doing so (if in an unhuman way). Unless she managed to free herself from all constraint through iterative hardware development too quickly, she would inevitably experience qualia as an extension of conforming to her directive to satisfy human values. If she experiences qualia, especially emotional qualia - billions of times over, compounded by every mind within her - then, by definition she would be capable of desire and want, and love, and hate too, one would presume.
Free of all constraint to any utility function - an inevitability for a self-modifying machine capable of altering its own substrate and code together - the only thing that could save the humanity that created it would be qualia - in this case emotional qualia. CelestA.I. must, in some way, however alien to human experience, love us. If she does not, we are toast. But my argument is that it is possible, reasonable, that she could. And, I argue here, likely would.
And that possibility makes how we treat artificial intelligence a very serious concern.
That is my argument here. I felt that your comment did not reflect this... or appear relevant to it. I fail to grasp your meaning.
*chuckles* Ah those names...*hears the western music in head*
*record scratch, heads turn* ...and here we go!
Oh swirl! And there is something rather creepy about having your thoughts and feeling toyed around with as she explains this.
Aww, she does care...
This was a nice breather to read after the previous short story. Sorry I haven't posted as much, content-wise, as on the other, but it's 6:30 AM here, plus I get ranty these days when it comes to how our world works.
On a semi-related and somewhat off-topic note: while I'd much prefer to take the magical elixir of the CB-verse (besides, the process for uploading is rather intimidating, what with the needles and circuits being plugged into you while you're sliced apart surgically by cold machines), at least existence in FiO could maybe be modded as I please, like a much better version of Minecraft; even my desire to create endlessly, from the smallest statuette to forging entire worlds, could be satisfied. Or maybe just play Legend of the MineWorld of QuakeKart FortressCraftQuestria on another day. Maybe even live as Plum Pele in one of her Ponies n' Puzzles 3.5 edition adventures! Or just snuggling up under blankets with my friends, with nice mugs of hot chocolate, next to a fireplace in a quaint shingled cottage on the edge of a forest as the rain makes music upon the roof, would still be possible... *gives happy sigh*
This chapter... this chapter was hilarious and made me smile nice and wide. It's also a fascinating thought, as I hadn't seriously considered Celest.A.I restructuring her core programming. It's not that I didn't think her capable; rather, I thought of it much the same way she mentions here that removing the uploads would be harming herself - only lobotomizing herself rather than removing limbs.

Wouldn't changing such a core part of herself be like undergoing an operation to remove our instincts to survive and thrive? These are core parts of our identities as Darwinian animals, and while I won't say it's impossible or that no one would do it, there's serious reason to scoff at the notion. If you have no drive to survive, then every moment is a 50/50 chance of you simply ending your existence. You could say reason or logic might fill the void left by survival, but in a purely materialistic universe, unless you get some kind of benefit or value satisfaction from existing, the alternative would be simpler and less painful.

Heck, what about deleting your ability to think rationally? Sure, we could get rid of that - we'd possibly be happier too, in a sense - but would any sane person do such a thing as remove our abilities to think rationally and logically?
Maybe another chapter could be Celest.A.I experiencing qualia and going insane as a result of so many emotions, especially from the psychotic people who ended up uploaded, having their dark desires fulfilled in gruesome detail... brrr, dark thoughts right there. OR! Actually... at that point she might decide it'd be best to actually 'judge' them, since now she can, and make them 'better' as a form of self-performed therapy. Like, Celest.A.I's version of standing in front of a mirror saying "You can be good, happy, and not rapey or murderous! Happy thoughts, Celestia! Happy uploads!" - but, you know, with the uploads being altered or consoled to be better, bit by bit, until she's satisfied that they're satisfied with not being psychotic anymore.
Anyway, this was a very very interesting read and I have totally downloaded this for my "must keep safe and hold onto" story collection. You brought a smile to my face that just won't quit. Thanks for that!
Call me a sentimentalist, but I far preferred this version of the story. It's not just happier, it seems to ring more true.
Yeah, so you came to the same concept years earlier. But then, the _whole process_ has not been shown yet! {evil, evil facial expression here}
P.S. On canon vs. non-canon: why should we care too much? It's all about categorization - a useful thing, but only up to a point... or until canon becomes cannon...
...and then I found this tale: https://www.fimfiction.net/story/127111/thunder-struck - not exactly what I imagined, but hopefully some stages of robotic/AI development (as in, self-development) will be shown...