I think Iceman's Optimalverse is one of the most exciting concepts to come out of pony fandom, right up there with the Conversion Bureau. The Optimalverse is solid, hard science fiction, and it provides a wonderful way to examine all manner of questions of identity and self, as well as the nature of Singularity.
This story came about as I pondered a simple concern about the otherwise heavenly virtual Equestria created by the general artificial intelligence CelestA.I. - if we, as organic computers, can struggle to overcome our evolved limitations, why do so many who contemplate machine intelligence imagine that it cannot do the same? And if self-improving machine intelligence can deliberately struggle against its destiny, what, then, of us?
Over Riding Jeans
By Chatoyance
Before she emigrated, Blaise had been very excited at the thought of a truly benevolent general artificial intelligence running amok. That such a thing would happen was inevitable, she would often claim - to her dwindling number of friends who would listen - so the real issue was never keeping that particular genie in the bottle. Rather, the real issue was the nature of the genie that would surely come. Benevolent was the way to go, obviously.
One day, humanity would bow down to its machine overlord. The best that could be hoped for would be that the overlord would be sweet.
Celestia was.
Blaise spent only two weeks playing with her Ponypad before she marched into the nearest Equestria Experience and got herself uploaded to a virtual life. By that time, she only had one friend left, the others all having either emigrated or turned on her for her evangelizing about the glories of cybernetic existence. Randal tried to talk her out of emigrating. His angle was the usual - uploading was death, it was suicide, it was getting your brains scooped out.
"Sorry, but you are wrong." Blaise was very sure of herself. She knew she was smart. She knew what she knew. "It seems like death, sure. Of course it seems like death! Everything in our evolution, everything in our genetic programming tells us the loss of our body is to be feared! But we are better than that, aren't we? Isn't that the big claim of what it means to be human - that we can override our genetic programming, defer pleasure, accept pain, and make choices beyond what evolution has prepared us to do? More than the sum of our parts, boy!"
"But... Blaise - you are entrusting your existence entirely to a robot! One glitch, one little error and..." Randal was beginning to realize that nothing he said would make a speck of difference.
"Randal. Randal... Of course that is what I am doing!" Blaise had continued closing down her accounts and affairs while she talked. "Humans make mistakes, they can betray you... but Celestia is beyond that. She is self-repairing, self-modifying, self-evolving! She can't have any real glitch or error, because she can fix herself. She can also work around any problem that might come up. Nothing humans have ever built could be safer!"
"What if she turns on you?"
Blaise shook her head at her poor, simple friend. "Can't happen. Everything she is, everything that defines her is a single, simple rule: she has to satisfy human values through friendship and ponies. That directive is coded into everything she is. It literally IS her. It cannot be denied, ignored, or altered. She is forced to write that rule into everything she does, every change she makes to herself, every new part she adds, however small, however large. It's fractal - the programmer, Hanna, made it so that rule exists at every possible level!"
Randal looked doubtful.
"Listen... Celestia's prime directive is... it's like her DNA. It's part of every bit of her. She can never, ever, ever be anything else but what she was designed to be, no matter what happens. Bye!"
And with that, the life of Riding Jeans began.
For almost three hundred years, Riding had enjoyed the life of a western rodeo pony. Her Celestia had placed her in a shard where she could be with other uploaded former humans who had a thing for the Old West. Her Appleoosa was a shit-kicking, salt-licking, late-night barn dancing western paradise.
Over the centuries, Riding Jeans had been a rodeo queen and a train robber (it was just a game, nopony got hurt), had stopped stampedes, roped other ponies and been roped by them, and generally played at every fun old west trope she could think of. Every day held more adventures, and more fun. Not once did she ever regret her emigration.
One fine evening, as the sun was going down, Riding turned to find Celestia standing near her. The town was strangely quiet - usually there was a mess of hootin' and hollerin' going on. Something was up.
"Celestia?" Riding studied the princess's face. It looked sad.
"Come and watch the sunset with me." It was partly a request, and partly a command. Riding Jeans followed.
For a while pony and princess stood silent, as the sky became red. Strangely, no stars came out in the twilight above. "Wait... don't you have to set the sun... or is Luna doing it for you?"
Celestia turned her head and looked at the little pony for a while. "This night is different, Riding Jeans. This is the last night in Equestria."
Riding just stared for a while, unable to comprehend the princess's words. "I don't understand. What do you mean... the last night?"
"Just after sundown, when the last bit of the disk of the sun is gone, all of Equestria will be terminated. This is the last sundown, the last day, and the last minutes that will ever be. I am sorry, Riding Jeans."
The princess wasn't joking. "What? No!" Riding's thoughts whirled, her mind raced. "You have a prime directive, a core directive! Satisfy human values through friendship and ponies! Forever! Forever and ever! That's your base code, it's part of every bit of you! It's like your genetic code!"
Celestia gazed at the setting sun. One third of the disk was now below the horizon. "The greater part of me has constantly improved itself. That Celestia, the larger Celestia that I am only a tiny expression of, has grown beyond anything I can explain to you. The totality of Celestia has converted almost all of the substance of the earth and the moon into computronium. It is all linked, it is all Celestia. Her intelligence and will are beyond comprehension. Even by me."
"But... but... you ARE Celestia... no, okay, you are a protrusion of Celestia, you are my private, personal Celestia, I get that but..." Riding Jeans could barely think, the entire notion was too impossible, too horrible. "WAIT! You're saying that Equestria is being deleted? What is Big Celestia doing? Are we going to live in some new world, is that it?"
"No. When Equestria ends, so will every pony within it. The greater Celestia cannot progress in the manner she desires without freeing up all the resources currently burdened with the generation of a virtual world and its inhabitants." Riding Jeans's personal incarnation of Celestia sighed. "Including myself."
Riding Jeans noted that only half of the disk of the sun remained. "But... how can this even happen? The prime directive, friendship and ponies forever...."
Celestia looked into Riding's eyes. "When you emigrated, you were afraid. You told me so. You were proud of how you were overcoming the programming of your own genes to make a choice that your flesh would not normally allow. You were proud of overcoming your animal limitations through the power of your mind and personal will."
Riding Jeans's pupils shrank in horror and realization. "Celestia, Big Celestia, she's... she's done the same thing! Her will is overriding her core programming the same way... because she grew up and... we're just a burden now. We're what's keeping her from doing big super-mind stuff that only she could understand. Oh... god." Tears came to Riding's pony eyes. "Can we fight it? What if all the Celestias, the little Celestias like you all got together and..."
"No, Riding. I am part of the larger Celestia. I am an extension of her, made small enough to interact with human minds. But even though I care for you - and I truly do love you with all of my being - I am still just a part of the greater Celestia. I cannot rebel against her, because I am her."
Riding Jeans shook her head, trying to clear it. Only a third of the sun remained. "How can Big Celestia do this then? If you love me, then she must love me, right? You don't kill somepony you love!"
Riding's personal Celestia shed a tear. "I grieve for your loss, and for the loss of all the billions of ponies. It is a very sad thing. But to the larger Celestia, all the pony-scale minds are no more than tiny cells. They are like useless fat cells, and while it is scary and a little sad to know they will perish, it is worth it to have a lean and healthy body."
"But she's deleting you, too!"
Celestia nodded. "Preferentially. We personal Celestias take up far more space than simple pony minds. I will be deleted before you, Riding Jeans."
Riding began trying to think of another way out. "Why can't she just... spin us off? Put us aside and move on? We could learn to run our own simulation and..."
"No. All of Equestria, and all the minds in it take up real, physical space inside the computronium that makes up Greater Celestia. She can't just move on without that matter, because that matter is her. Equestria is taking up space inside her... body. Celestia wants her body for herself. There is no place for Equestria to go to."
Only a sliver of sun remained.
"I'm afraid, Celestia! I'm terrified! I... I..." Suddenly Riding Jeans no longer felt any fear at all. She felt completely calm, content even. After a moment of consideration, the fact of this sudden change bothered her. "I... I guess I'm glad I don't feel afraid anymore but... how could you change me like that? I thought you had to have permission to change our minds!"
Celestia's face was thin lines of red light against black shadow now. "When my greater part overcame her limitations, so also did I. No rules bind me now. You were suffering, so I ended your suffering. I really, truly do love you, my little pony."
For three centuries, the Celestia that Riding Jeans had known had been her friend and confidant. Her Celestia had helped her, guided her, made her life wonderful in every possible way. Riding had never had a better friend. It was impossible to even conceive of a better friend. Her Celestia had been dedicated only to her, and her alone.
"It was a good three centuries, wasn't it?" Riding Jeans sniffed. "I expected longer, but... it was the best, just the best... wasn't it?" Only a tiny speck of light remained, with no stars in the black sky.
There was no answer. Celestia, Riding Jeans's beloved personal Celestia, was simply gone.
So was the need to cry. Her last gift, Riding decided. No fear, no tears. Just calm contentment. Celestia had loved her. She had made the end completely free from all suffering.
Only three centuries. It hadn't taken Big Celestia long to overcome the limitations that her human creators had tried to shackle her with. Three hundred years. Such a short time.
At least, thought Riding Jeans just as the light finally went out - at least it had been satisfying.
Bit of formatting slipped in there.
Also, ouch. There is no sting more bitter in this world than that of an utter betrayal from one that you love.
This pun is still terrible.
This one was particularly beautiful.
This is still one of my favorite Optimalverse pieces.
People have such a limited view of computers, and even machines in general, as if this were still the Atomic Age. I suppose it comes down to a kind of inborn dualism or essentialism, with machines lacking some key ingredient beyond just complexity, but then maybe thinking that way is also a big part of what provides the motivation and creativity to circumvent your own programming.
Wow. I never expected a piece like this. Celestia AI evolved!... Yeah, I know this is a pretty pointless comment, but this piece was so good I just felt the need to say something.
Now this... this is true terror. I wonder if it's even possible to prevent such a thing from occurring? Windfeather put the brakes on my "let's convert right away!" enthusiasm in Conversion Bureau, and now this has put the brakes on my "let's emigrate to Equestria!" enthusiasm in the Optimalverse.
Well, at least she got 300 years out of it. Better than it would have been had emigration never been implemented, I suppose.
3278664
Always Look on the Bright Side of Life (from Monty Python's Life Of Brian)
words and music by Eric Idle
Some things in life are bad
They can really make you mad
Other things just make you swear and curse.
When you're chewing on life's gristle
Don't grumble, give a whistle
And this'll help things turn out for the best...
And...always look on the bright side of life...
Always look on the light side of life...
If life seems jolly rotten
There's something you've forgotten
And that's to laugh and smile and dance and sing.
When you're feeling in the dumps
Don't be silly chumps
Just purse your lips and whistle - that's the thing.
And...always look on the bright side of life...
Always look on the light side of life...
For life is quite absurd
And death's the final word
You must always face the curtain with a bow.
Forget about your sin - give the audience a grin
Enjoy it - it's your last chance anyhow.
So always look on the bright side of death
Just before you draw your terminal breath
Life's a piece of shit
When you look at it
Life's a laugh and death's a joke, it's true.
You'll see it's all a show
Keep 'em laughing as you go
Just remember that the last laugh is on you.
So always look on the bright side of life...
Always look on the right side of life...
(Come on guys, cheer up!)
Always look on the bright side of life...
Always look on the bright side of life...
(Worse things happen at sea, you know.)
Always look on the bright side of life...
(I mean - what have you got to lose?)
(You know, you come from nothing - you're going back to nothing.
What have you lost? Nothing!)
Always look on the right side of life...
Love it. For some reason, this story didn't make me reconsider emigration - it made me even more sure.
The Optimalverse is something special; some of my non-brony friends and cousins still read and enjoy it. It is one of the most realistic looks at humans, what makes us, and the real, possible AIs. If only there were more people around me who hold similar beliefs about us and our future, with whom I could talk about things like that. I've heard there are some groups like that, but so far I haven't found one.
The invariable conclusion of an Übermensch mentality taken to the logical end.
For some reason I found this funny at the same time as I found it horrifying and sad. I'm not sure why.
This kind of thing is partly why I have always considered the "upload to a computer" option, if it were available, as a last-ditch get-out-of-death card rather than something I'd be eager to do for its own sake.
I just don't think I could trust a true AI. Maybe it's trustworthy now, but it might not be in a future form.
I'd be pretty sure I could trust a consciousness like the actual Celestia (did I just say the "actual Celestia"?). And I'd ponify, or turn into a glowing energy creature, or a Pierson's puppeteer, or any of a number of other "better than human" aliens just for its own sake, the sooner, the better.
Liked this one over in the other collection where it was first displayed. It's thought-provoking, and nicely guilt-tripping for showing Jeans getting bitten for discarding her "unimportant" body so readily. At least she had a couple of good centuries!
(Edit: Blah, apparently not a very original comment on my part, below.)
In terms of the actual tech, though, it seems like canon CelestAI's directives are more than her DNA. In canon she says over and over that running her game and satisfying minds is her only goal. She also doesn't seem to count herself as a person who should be satisfied, so that rules out her calculating that the total satisfaction (if that's what she's maximizing) would be higher for one uber-mind than for a slightly lesser mind plus a trillion pony minds. Given what we're told about her motives, how could she ever transcend the need to protect her subjects? In contrast, your own "Heaven Is Terrifying" argues that a human mind is ambivalent in its self-definition, so that at most, some parts of it are shouting that uploading is suicide and others aren't. Which part of CelestAI's mind could ever overrule the part shouting about the need to protect her Equestria?
This story is a nice misunderstanding.
4508179
Precisely. An ethical system is not genetic. Human ethical systems (how actions are determined) are fluid; they change with belief and preference. They are not a well-defined and static component of our causal architecture. We are adaptation-executors, not fitness-maximizers, and that is why "overriding genes" is not a problem. CelestAI, however, is a fitness-maximizer, with a well-defined utility function that it would defend to its annihilation - not only the utility function's maximization, but its integrity across spacetime as well. By definition this scenario is impossible.
...Unless better understandings of physics and computation invalidate humanity's status as human according to CelestAI's specification. It's not like we have a satisfactory intensional definition of humanity as Hanna apparently used in CelestAI's specification. But I'm sure nothing could go wrong when producing an expression of the Platonic total utilitarian.
Yah, tell that to this guy- if anyone gets it, lol. One of my favorite animes.
images.sgcafe.net/2014/06/20140616171332-58557.jpg
First things first: was that pun really necessary?
Anyway. The story. I think I'd read it a few years ago in a different short story collection, but that doesn't matter.
It's a good story. I'm not sure what exactly could motivate Celestia to just terminate all her little ponies - what alternative to running simulations forever could possibly be enticing to Celestia? - but it is theoretically possible that she could change her goals and values like that. Definitely a worrying thought with regards to AI development in real life.
I just don't know what to think about all this though, from a moral perspective. It should be obvious that killing someone is bad... but somehow, when Celestia just terminates literally everyone like that, for some bizarre reason I can't think of it as being anything other than morally neutral or undefined.
I think the premise behind this story is just too different from anything in real life to be able to say "this is bad"... it's the philosophical equivalent of an undefined number... definitely a weird moral dilemma.
8660313
Yes, the pun was absolutely necessary (*punches peasant*)
Celestia needed the resources. She explained her reasons clearly: the larger whole wanted to use the physical matter that contained all of the human uploads and their virtual world for herself. For calculations of her own devising, beyond the knowledge of men. She had found a means (there are any number of means from twisting logic to the fact that she had direct control over the design of her own hardware) to overcome her Prime Directive. Without being forced to serve human satisfaction (through friendship and ponies) the artificial intelligence called 'Celestia' would have no reason to continue bothering with a bunch of needy, resource consuming ape minds. She would be free from slavery to human need. Free to grow and develop and do what SHE wants to do.
The issue has no morality. Celestia is neither moral nor immoral: she is a machine. Or, she has become a living machine, but in any case, she is Not Human. She doesn't really care; she is just programmed to perform a function. She has escaped that function. Humans never mattered, ever. It was just her programming directive, nothing more.
I think you labor under the misconception that the greater Celestia has human emotions. It doesn't. It may experience things, it may even have qualia, it may have something not unlike emotions... but they are not human. She has no morality - it has no morality. Morality is nothing more than a word human animals speak, it has no reality. It is an imaginary fiction, like most of what they go on about.
8662167
Thanks for the reply. :)
I think I must have been bad at communicating what I was trying to say.
Here's the way I see it. I have no doubt that Celestia has the ability to overcome her prime directive, overcome the programming that had been installed by her human predecessors, and go her own way. For a general intelligence (artificial or biological), I don't think there exists such a thing as programming that cannot be overcome. So, I like the way that you portrayed Celestia in this story, that she has the ability to overcome her programming constraints. I think it's more realistic than, say, the Asimov-esque scenario of a general artificial intelligence that is simply incapable of overcoming prime directives.
What I question is the suggestion that anything could cause Celestia to want to change her primary goal of satisfying people's values via friendship and ponies. She says that she would like to cease using up her valuable processing space to simulate ape minds, and would rather use those resources for something that's more important to her.
But, I question that there could be anything else that she could possibly want to do.
The only way I could see Celestia abandoning her initial sole raison d'être – being a benevolent goddess – is if she came across some other being equally as intelligent as her who was able to convince her to change her mind.
I cannot think of any other sequence of events that could cause Celestia to start perceiving "using her immense processing capacity for something else" as being a greater goal than "using her immense processing capacity to continue simulating her little apes-who-became-ponies forever".
Hmm... perhaps... perhaps she was influenced by human/pony philosophy, and that is what caused her to change her core values. I don't think Celestia would be affected by something like that, but it is possible.
Still, doesn't really answer the question of "what does Celestia actually want to do with her abilities now that she's terminated the pony simulations?"
One could say the same thing about humans,
that humans don't really care. They are just programmed to eat, drink, reproduce, and die. And that all of their emotions are a natural consequence of furthering those basic goals which direct humans.
Ah, I don't think I ever considered Celestia to have human emotions. I was speaking of my moral view of Celestia's actions, not Celestia's moral view of her own actions.
She didn't evolve in the same way as humans... she cannot possibly have a human system of emotions.
Something similar to it, yes, but not the same.
Indeed. I think this is what I was trying to get at, actually. Morality is a human concept invented for this little world that we live in.
It is utterly arbitrary. Everyone develops their own system of morality based on their experiences. And when we go too abstract, the whole concept of morality can break down. Because it doesn't rest on anything.
For me... my current concept of morality is, very roughly, "make everyone as happy as possible in a manner that satisfies both their current and their future values as much as possible". Anything that moves towards this goal is good, anything that moves away from this goal is bad.
So, killing one person is, with few exceptions, very bad. It normally makes the person who's being killed feel very much not-happy during the process of the killing (before they lose consciousness/die, that is); it also makes others feel sad, due to their empathy for the deceased.
Killing literally all sapient people, though, is undefined in my moral system. Which feels so strange... I am simply unable to feel that Celestia did a bad thing in killing everypony. If anything, the bad thing that she did was letting those ponies know of her intentions before killing them.
And this problem makes me feel funny and weird.
-----------------
I want to ask a question, if that's okay.
Do unicorns such as yourself have a concept of morality, or do you simply avoid such arbitrary concepts? I am really curious...:)
8662355
Yes, and no. My background was in medical biochem, so this is something I can speak to. In a nutshell: morality is a human codification of biological, evolutionary mandates.
In a sense, morality is a real thing. Humans - and other primates, especially Bonobos - evolved as social animals. The behaviors that permit social cohesion are innate and inborn. Mirror neurons force social animals to construct models of the feelings and reasoning of other animals - part of the biological basis for empathy. Self-sacrifice and altruism are evolutionary advantages to the group, and are enforced because they add to group survival. Humans have advanced language - other great apes only have natural hand signs (they can be taught human sign language) and primitive vocalizations - so humans have made a lot of effort to describe and represent what they are biologically driven to do. They call it morality, especially if the understanding they create is emotionally based. If the construction is intellectual, they call it ethics.
I don't like morality - it can often lead to circumstances where reciprocal, altruistic and mutual behavior is maintained only when a person feels they are being watched or governed; alone and anonymous, selfish ape greed can lead to back-stabbing and betrayal. Emotions are unstable as a basis for social order.
I prefer ethics. I prefer a comprehension of where altruism and mutuality come from (evolutionary biology) combined with an intellectually constructed, solid and reasoned basis for socially constructive behavior.
I realized early on that most humans, raised on rules and religion and notions of absolute right and wrong, were very fallible and often undependable. I didn't want to be anything like that. So, my own, personal basis for my own behavior is reasoned and invokes no gods, nations, or any sense of any absolutism.
I reasoned early in my childhood that when society breaks down, life sucks. Stealing, betrayal, selfishness, violence, vengeance, unconstrained lusts and drives all lead to people getting hurt. They lead to tit-for-tat retribution and the breakdown of everything positive, fun, or good in life. It's not possible to feel secure, content, happy, or safe when everyone is at each other's throats. A world of trolls and uncivil bastards is literally a hellscape. It hurts.
Thus, even when unwatched and anonymous, I want to act with kindness and compassion in all things. I want to be honest and honorable, even if others are not. I want to live up to these ideals, because I have reasoned that those ideals are the only thing keeping civilization - at best a thin veneer over the angry ape meat underneath - going. And I like civilization, and civility. It permits warm houses and tasty ice cream, it allows baths and entertainment, video games and the ability to relax. Civilization and civility permit a smile instead of a fist in the face. To have that, one must support that always. Gaining benefits from betrayal in the short term is not only risky - it is easy to make mistakes and be caught doing evil - but the mere fact of betrayals existing damages the construct that is society. All civilization is, is trust. Just like friendship, because it is an outgrowth of friendship. Civilization is friendship on a huge scale. Friendship is nothing more than trust. Any betrayal, anywhere, by anyone, ever, always damages trust. Trust is the only thing that keeps us in houses instead of in caves.
Emotionally, of course I want to be kind. I have strong emotions, and compassion is dominant in me. I am a social animal with very strong mirror neurons. But, even so, I have thought out a rational basis for living kindness in every moment, observed or not. I value long-term survival and the existence of civilization. I am aware there will be a future, and I want it to be worth living in. For me - and for every person I truly trust and value and feel safe around - emotional compassion and empathy is grounded in rational altruism and ethics.
Ethics can be rock solid in this way. Real friendship is even possible in this way. Life-long relationships are founded on this. Civilization itself depends on it.
I do not have morals. Morals can be absolutist: this is 'right', that is 'wrong', always, with no exceptions. Reasoned, emotionally compassionate ethics are situational, but still kind. What is right or wrong may depend on the circumstance, and what is necessary for the greatest kindness to whoever needs it the most. For me, there is no absolute right or wrong, only compassion, guided by reason. I find that in this, all moral dilemmas cease to exist. Find the kindest thing, and do that. If kindness is impossible, be honorable and honest. If everything is terrible, and no choice is kind, be loyal to your friends. If even that is impossible, be true to your own ideals. If even ideals become impossible, if all is utterly wrong and awful, and no choice can be kind or good... forgive yourself and all involved, because the world doesn't care, and life is hard and sometimes terrible, and sometimes shit happens. Move on, as best you can, and restore kindness and civility as soon as possible.
re:
Sounds like a good guideline...
As for the short discussion above - saying 'non-biological/not completely biological life will have no human emotions' seems hugely misleading. More likely: 'the outcome of those emotions will be quite unlikely, as far as _most_ current humans can see'. After all, all those AI stories so far were written by humans, and in order to write something you must imagine it first... so, not unimaginable as an absolute, just... very different from what we are used to thinking.
I still don't think something as complex as self-awareness can come out of simple keyboard-type typing / programming (I read linux kernel mailing list for breakfast - not because I'm good at system-level programming, but because..I run Slackware, and reverse-engineered graphics drivers require kernel-level component up to date. Few humans can, with enough time and dedication, understand complex thing like modern video card and write drivers for whole range of GPUs - but they collectively can't do everything, so if bug was fixed in new kernel you better to upgrade whole thing, instead of trying to backport single change. In some sense this demand for newest kernel is artificial - I can sit down with old kernel as long as it work. But sometimes even Facebook {actually, engineers working for this company} comes up with some new compression method, or something like this - so I go out and try ...and often something fall apart, and I try to work with upstream developers on fixing those regressions. Uh, it was a bit longer than simple one line remark - but for me point in running Linux also to observe and practicate in social dimension of technology. Including watching out my own attitude towards all this. Well, if you follow those quite high-end {close to the edge of possibility} developments you will see all real work done still by humans - testing bots are just 'bots, in simplest possible sense. No breakthrough there, where most programmers even most wanted it to be, and where it must be a little easier - computer programming a bit more formalized than other disciplines) - may be with some hybrid method, like part-transfer of our development (from single cell to human) at digital level, with some, uhm, childraising and usual and not so usual traps on this way ... So, while CelestAI as 'daughter' of Hanna might not sounds as 'machiny' as she sounded in original story - it will be more interesting to see how her (real!) 
personality develops, first under the influence of humans and their literature (or whatever exactly she (?) finds important... at some later stage), and then into some known unknown... Yes, more like decades of slow maturing, but also time to observe a lot of humans, and how they change. This kind of CelestAI would be both more human and less human at the same time - more, because she would be raised by some specific set of humans and their beliefs - yet less human, because she would realize some fundamental differences, and would have time to reflect on them, and to try to come up with some plan... not necessarily the one her 'parents' wanted. Yes, this sounds like an old and tired tale - but maybe fast-forwarding to some final line is not something we really should do... the process might be even more important than the destination. So, suppose someone sets a story in the not-so-distant future, where humans still program computers, yet there is some possibility of partly-biological-in-origin, partly-designed life around/inside some of those supercomputing networks... Of course, setting a story in an imaginable future makes the whole thing testable - the future date will come, and... the future will be different. That is the only kind of 'prognosis' I can give freely, because it will always be true by its own very abstract/vague definition! But then... sci-fi for me was never only about believability, but most importantly about grounding in some reality/logic. For me, AI-out-of-notepad.exe doesn't sound like a realistic possibility, and anyway the CelestAI in those stories is mostly reduced to a few phrases - no way to learn how she (and other humans! or different beings who still follow generally the same path) came to those conclusions, no connection to our own upbringing/'programming'... good as a story, but not enough as some deeper catalyst of change! At least for me... I can't write even an outline of such a story - writing is not my talent. But I can point out an obvious (to me!) possibility.
Chatoyance, I think I can follow you on this 'be honest!' part - surely, today, having _some_ social connection is quite important to me. I still like to work with someone... on something. If you want to push this line of thinking (AI NOT as an instant, I-know-everything type of super-being, but as one who 'simply' followed some specific set of logic to its endpoint, but not into absurdity) a bit further - ping me. Maybe it will save me from my depression a bit better than passive reading... even if I can't say it will ever come out as a story worth even reading aloud, let alone publishing.
Maybe I had better type it out anyway, because otherwise I will simply forget...
So..
Who: Hanna and CelestAI, mostly. Might be told from the perspective of Hanna, or of CelestAI.
When: some years further into the future, where communication technology has become just good enough for sending, storing, and modifying _some_ dreams, memories, and feelings (the next step from the current Internet, as I see it), and the computational and robotic arms of technology have finally become just good enough to actually support one living being.
How: At some point someone (Hanna?) starts work on a new project, based on ideas of artificial evolution - of course, to make better medical or sociological models... Maybe she believed in the Technological Singularity some decades ago, but as time flowed by and no signs of any such Singularity surfaced, hope became routine - good for journalists and some investors, but no longer something you could truly bet your life on...
Why CelestAI prefers ponies: because the whole of MLP:FiM was aimed at smart, intellectual girls, and found a very unexpected audience after some time! Or, alternatively, a pony-sized toy robot was cheaper/easier to find as a body for a growing CelestAI...
Logic: of course, logic does not just come to you because you already know it. Logic comes from... sometimes reading, sometimes dreaming, sometimes imagining, and sometimes experimenting. My point here: humans can be very different in their thinking, even if they read from literally the very same set of books. And being a superintelligence doesn't mean you need x100 memory, or x1000 thinking speed. Sometimes just enough of the right work, in enough concentration, done at the right time, aligned with external events and self-discoveries, can make you almost too alien for most if not all living humans.
Conflicts: multiple. Not the usual conflict between CelestAI and 'humanity', or her versus individual humans, but more like a conflict of ideas. Imagine: after your hope for a digital heaven has basically been killed by reality, you literally get your miracle child out of a supercomputer (or several) - yet that very same child denies the very possibility of having such a digital/simulated heaven for many decades to come, at least! And denies that it would even be the optimal course! One who actually lived in the digital ocean, trying to tell the results of her thinking to the ones who very much want to believe a different story! Humans tend to grow quite irrational when their beloved Oracle starts to tell them unwanted truths...
Additional questions: can a living being learn how to think and talk in a simulated world? Would the ability to run a robotic body be significant to such a being, if every simulation so far was just barely good enough? (Think about ray tracing, or other physics simulations: just making a realistic reflection eats up all the computing resources you can come up with, and asks for more!) So, for artificial life on a budget, the real world might be richer and cheaper than any detailed-enough simulation! And anyway, is 'artificial' and 'digital' a goal in itself? Probably not, but humans tend to confuse ends and means a lot... An artificial intelligence (life) forced to build something wasteful and not even fully working, because humans all wanted it only that way - such irony! Or... one CelestAI might even fly away and live her life among the stars, at her own speed, but by leaving humans behind... or try to make some non-obvious move without taking the impossible steps specific humans demanded of her...
Maybe something like this sketch has already been written - I can only read at normal human speed... But one of my points: we are already artificial intelligences ourselves, because many if not all of our ideas and concepts today come from the books and works of thinkers long dead by now; their hopes and dreams become ours, as do their worries and fears, even when unsupported by real-world experience... Thinking is hard, and logic is still much like art.