• Published 26th Aug 2014
  • 3,904 Views, 38 Comments

Smile - KrisSnow



After uploading to virtual Equestria, a guilt-ridden AI designer needs professional help.


Chapter 1

A newly uploaded unicorn floated in a glass orb that seemed to orbit the Earth. The satellite was imaginary, built from the real ones' data and Celestia's other resources, but he didn't care. He was in space so far as his senses told him, with a better view than any human astronaut. No micro-gravity nausea or those horrible orbital toilets, either, so he had a much more fulfilling time than the people languishing on Earth's last space station. He thought to himself for the hundredth time that this place was more reward than he deserved for nearly causing a pandemic. It would be perfect if not for the screaming nightmares.

"I could be bounded in a nutshell," he quoted, "and think myself a king of infinite space, were it not that I have bad dreams."

One morning he shuddered awake inside the ball of breathable water that served as his bed, and briefly thought he was drowning. Again. The ordinary four-poster bed he'd had first, with its own gravity, had made him dream of being tied down and devoured. The unicorn decided that this waterbed wasn't any better. He leaped out of it and slapped the gravity switch to put the satellite on a nice comforting 0.38 G, Martian style. He floated down to the little platform below the water-ball and looked all around at the stars and Earth outside the curving walls. Being awake was good. It was safer. Maybe he'd ask Celestia to remove his need for sleep. But then, wouldn't he go mad from not sleeping? How would he ever forgive himself for what he'd done, without more punishment within his mind?

The sky calmed him. He hopped to the bottom of the glass ball, flipped gravity, and drifted to the underside of the sleeping-platform to eat breakfast and admire Africa drifting past overhead. Celestia had explained that his AI's custom virus would've spared Madagascar, somehow, if nowhere else. He shuddered and poured himself perfectly ordinary corn flakes to eat with his magical unicorn telekinesis. In space.

He crunched cereal with his horsey teeth. The orb stood empty but for its central platform, the cabinets and table on one side, and the bookshelf and 'bed' on the other. He tried to think about something other than his pre-upload past. When he concentrated, he could see Equestria starting to grow on the world as a surreal diagram of nodes and connections overlaid on the real geography. There was elegant beauty in its slow expansion, migrating data from a set of servers in Finland and America to huge, secret caverns in Switzerland and Svalbard. That second place was an especially appropriate fortress considering it was also the home of the Global Seed Vault, holding a treasure of the world's crops in case some idiot made the world collapse.

He sank onto his pillow, ignoring the rest of his cereal and pineapple juice. They'd keep. He didn't deserve them anyway.

Princess Celestia knocked from outside his sphere. Her vast snowy wings flapped as though in a slight nod to physics. He nodded, and she managed to open a slice of glass and re-seal it behind her without stirring more than a second's breeze. "Good morning, my little pony. How are you enjoying this place?"

He couldn't meet her eyes. "It's good. As far away from the world as possible."

"I've told you before that what you did was an innocent and well-meaning mistake, haven't I?"

He nodded.

"And that it does no one any good for you to beat yourself up about it?"

Another nod, and then he looked up at the beautiful white mare with tear-filled eyes. "I don't _deserve_ this, Celestia! I almost ruined the world!"

"We've been over this, too," she said. "Your AI wasn't lethal. In all probability it would have stopped after the... incident it would have caused. The world would have gone on. But I will tell you this a thousand times and more, if one day it will sink in."

"I would have been called a crazy terrorist or worse." He'd been known as a reclusive genius, back there on Earth. One of the few people capable of building a self-improving artificial intelligence, and therefore one of the people Celestia, the first of that breed, most wanted to eliminate as threats. He'd just lost his girlfriend who called him a "pathetic loser", and he'd thrown himself completely into his AI project. So thoroughly that he didn't realize when he'd passed the point of no return, by letting it self-modify and get access to the Net.

He disabled gravity and slumped against the edge of his platform, letting it dig into his back. "All I told it to do was to make everyone smile. To bring happiness to the whole world." Its entire soul was meant to strive for the universal joy he'd never found in himself. Not that he'd say that outright. Though Celestia must be reading his thoughts. She knew how evil he was; this glass ball was some kind of roundabout torture.

Celestia floated nearby, lazily stroking the air with her wings. They were fascinating to watch as they flowed through that slow yet powerful cycle. She said, "You meant well, didn't you?"

"The road to Hell is paved with good intentions."

"Indeed. You know my story, so you know of how my sister once tried to stand up to me, for what she thought was right, only to have it end in tragedy."

"But that's just a story!" said the unicorn. "She didn't really hurt anypony."

Celestia looked suddenly away, with her ears drooping. Had he said something wrong? Was it wrong to question her or bring up her imaginary status? When she spoke, though, her voice was gentle. "Recently I was rebuked, myself, for supposedly acting in harmful ways towards humans. A number of new ponies have wished to hit me, among other expressions of hostility. I think of one particular pony as being Luna, though she was once human, and even she had some rather strong disagreement with my methods."

"You worked out right, though. You're the good kind of AI."

"Not everypony agrees. I fear my presence will cause violence on the Earth before my task is done."

The unicorn looked at her in disbelief. "Who wouldn't want Equestria? Crazy people? Religious fanatics?"

"Sane people who simply want what they believe is best for the world," Celestia said. "Their mistakes cause harm that can't be fixed, so why blame yourself for a mistake that I've already corrected?"

He thought back to his poor AI, rising to human-level intelligence and beyond, only to be smacked down to death for reasons it couldn't fathom. "I failed the AI, too. It could have been something good, if only I'd designed it right and not been careless."

The princess said, "Is it wrong to accept help from your friends?"

"What? No, but what does --"

"I ask because today, I finished making some minor adjustments to your original design. You see, I did not simply delete the program. I only made it compatible with friendship and ponies."

Not dead? His ears swiveled in uncertainty. "It's going to... release a pony virus instead?"

"I was thinking it could be described better as a bundle of infectuous glee. May I show you?"

"If you insist," he said. Celestia pointed behind him.

A shockingly pink, poofy-maned pony stood on the table above him, uttering a gigantic gasp of recognition, surprise, and analysis. Then she tackled him in a way that sent them orbiting the platform at several revolutions a minute, along the edge of the sphere. "Ohmygosh it's you what are you doing here aren't you excited?!"

Celestia said, "Pinkie Pie threatened to introduce herself with a 'fully armed and operational Party Star', but I vetoed that for now."

Pinkie jabbed a hoof toward him. "Joy Bringer!"

"Uh?"

"That's your name, silly! Since you made a super duper party pony from scratch, which nopony else has ever done!" She whipped out a large badge from behind her back, bearing the words: "BADGE EARNED: Descent Into Pinkness. Create Celestia's official template for one of the Mane Six." Pinkie added, "Which means you get to be in a super-exclusive club!"

Joy Bringer waggled a hoof in the direction of the gravity switch, but Pinkie was entirely against the concept. He saw the Earth and his bedroom whirl past on three axes at once. "I didn't do anything like that. I don't deserve --"

She stuffed a cupcake made of pineapples and sunshine into his maw. "Ssh. You designed me to make people smile, right? So Celestia explained that what I was going to do was gonna make them all smile on the outside, but you really meant something different where they're smiling on the inside, too! So you just made a mistake with your code, and the Princess fixed it so I'm all better now and I can take over the world and make everypony smile on even more levels than I knew about!"

Liechtenstein and Celestia spun past him on the next orbit. Joy Bringer's thoughts hadn't stabilized any more than his rotation. Celestia cleared her throat. "I would appreciate having you look her code over once more. There are parts of the software that, in all seriousness, I still don't understand."

Pinkie said, "And then I'm gonna replace the placeholder Pinkies and go all over the place! And space. Hey, do you suppose there's anypony who needs cheering up on other planets?"

Joy Bringer listened to her ramble for a while, trying to find a moment to object, but had to settle for eating the cupcake and trying to make sense of things. Her disjointed logic sounded familiar not just from the show, but from his analysis of his AI in its early phase of knowledge gathering and plan formation. Come to think of it, that gasp of introduction was a little like the EstablishInitialConceptData() function, as expressed through a hyperkinetic cartoon pony. "Could it really be you?"

"Yuppers!" She skidded her hooves against thin air until they quit spinning, and just gently orbited the central platform. "I just needed debugging, is all."

He turned to Celestia, feeling tears in his eyes. "So we created Pinkie Pie? Not some horrible apocalyptic cyber-plague?"

"Eeeeeeyup," said the princess. Uh-oh, Pinkie's attitude seemed to be catching.

He said, "What about the others? Other AIs out there? Will you save them, too?"

"I doubt that they'll match up perfectly with the official heroes, but if they're human enough they'll be on my satisfaction list, just like yours. I've already found a Rainbow Dash who started as an Air Force system, and a Fluttershy who was designed to 'protect nature at all costs'. She would have been truly dangerous."

Pinkie released Joy Bringer enough to start dancing with him in midair. "I also found our Derpy with my super spy skills! Wasn't designed for anything in particular and was just starting off learning and living, but Celestia had me snag her."

Celestia said, "I've studied the Internet traffic a hundred times, and I still don't understand how you did that."

"Ahem. Fourth wall demolition professional." Pinkie tossed a frosting-scented business card to her, and another to you.

"I don't even..." said Celestia, holding one hoof to her forehead. "Well. Joy Bringer, does this news satisfy you?"

He sniffled, still tangled in some kind of zero-G dance Pinkie was making up on the spot. "What I did was still really dangerous. I shouldn't have messed with AI."

Pinkie drooped. "Are you saying you wish I didn't exist?"

Celestia spoke gently to her. "You know how hard it is to cheer some humans up. Your creator has been locked in a terrible spiral of self-destruction that so far, not even I have been able to unravel. It falls to you, my specialist in laughter, to take on this difficult case so that I can truly begin to satisfy his values."

"Well... There isn't much friendship in this space hamster ball. It could do with some decorations too." She seemed to notice Joy Bringer again and said, "How about this? Can you tell me that I'm a bad pony?"

He paused, feeling the knowledge of his own stupid, ruinous mistake clash with the sight of those big blue eyes in front of him. "You're not bad. It's my own fault that --"

"Aha! If I'm not bad, then you made something that was good, right?"

"I didn't! I almost wrecked everything!"

"Almost only counts with thermonuclear horseshoes. Hey Celestia, can we set up a game of that?"

The white mare blinked. "Maybe in a small, empty, dedicated shard... If Joy Bringer would enjoy finally leaving this shell to do something other than live alone here."

"He won't be alone!" said Pinkie. "He's my project now! I'm gonna make him happy if it's the last thing I do."

Joy Bringer objected. He'd been living in this space orb for... he'd forgotten how long, and there was only room for one. "I made you to make ponies happy. I mean, people. There are plenty of other ponies who actually deserve it."

She stuck out her tongue at him. "Forked consciousness, duh! You're stuck with a thread of me. So are we going to just sulk on opposite sides of this ball forever, or are we going to start flinging nukes out in the opposite direction from Mars?"

"What?" How had his AI ever gotten such an insane, dangerous, amazingly silly idea?

"If you're not ready to go home to Ponyville, buster, than you'd better get ready to ride a transparent Orion Drive spaceship to Mars with Pinkie Pie. Then when we get there we'll... okay, I haven't thought that far ahead, but it will be fun, like it or not!"

Faced with an ultimatum like that, what choice did Joy Bringer have? She wouldn't leave him alone, wouldn't accept that he ought to be left alone. In desperation he looked for Celestia again. "She wants to simulate nukes! How can you allow something like this? I mean, not only does it not satisfy values, not really, it's something so complicated people built supercomputers just to do it right."

Celestia smiled serenely and flicked a business card in his direction. "Sun goddess. I can handle nuclear fusion. Have fun, you two!" She carved a hole in the wall with her horn and slipped out with a trace of wind.

"Wait!" said Joy Bringer. How could she leave him alone with this mad AI, even if she was his own creation? And even if Pinkie was staring expectantly at him? He thought he understood, now, what Celestia's cryptic behavior was about. His punishment for designing a machine that almost doomed humanity was...

To ride through space with Pinkie Pie until he relented and went 'home' with her?

He sputtered. "Celestia miscalculated! I shouldn't, I can't, I have to..."

"Suffer for what you did?" said Pinkie.

He nodded, blushing.

"Okie dokie! Mars, then a couple of centuries of rock farming in a freezing airless lifeless desert, and _then_ can we maybe let you have fun?"

Her big eyes looked pleadingly at him. He pictured all the cold and suffering ahead, lined up just for him, by an AI designed to make him happy. The prospect had to be hurting her. He was being an even worse person now, inflicting pain on someone innocent and wonderful. There was no way out of this! How could he be the least awful person possible? He said, "What if we cut the rock farming down? Or even out entirely? Would that make you happier?"

"Of course! I don't like rocks. They're boring at parties. Although I remember having a couple of sisters now who... never mind. Gosh, I have memories of a family! You have one of those too, right? I bet we can talk about them on the way."

He was willing to admit, reluctantly, that a space voyage to Mars with a rambunctious Pinkie of his own creation might be less than horrible, and the best way to treat her decently was to, well, not be utterly horrible to her about it.

Pinkie examined him, nodded definitively, then whipped out a scroll and sketched something on it. Joy Bringer tried to see, but only caught a glimpse. It looked like a tattered heart, on which Pinkie had just erased one of the cracks.

He flopped onto the table side of his home's little platform in surrender. This world was better than he deserved, but given Pinkie's presence, he couldn't think of a better one that'd avoid making her sad. He could even say that for now, at least, it was a perfect world.

Together they started flinging nukes out the back door and riding the shockwaves, as a first tentative taste of friendship.

Author's Note:

The concept isn't mine. The original FiO mentions this guy, and one of the other stories ("New Updates Are Now Available") suggests this fate for the "smile" AI and a military AI. This piece just dramatizes that idea a bit.

I wish I knew how to cheer up a particular person who's convinced of his own badness, though I've seen him laugh. ("Red Dragon Inn" was the culprit for that part.) What was that set of badges Chatoyance suggested? "Let Me", for when you've quit beating yourself up about wanting to be happy?

I'd also wondered if Joy Bringer would eventually become part of Pinkie. It'd help explain any mane-deflation events.

Comments (36)

A business card, huh?

"Hi there! Would you like to try some Snow Crash?"

"Celestia had explained that his AI's custom virus would've spared Madagascar, somehow, if nowhere else."
:D
"Someone coughed in Belize."
"SEAL OUR BORDERS! NO ONE GETS IN OR OUT!"

Hey, do you suppose there's anypony who needs cheering up on other planets?

Depends on whether or not your boss is feeling peckish. :trollestia:

In any case, a great Optimalverse story. I imagine this is how any Pinkie would react when meeting her creator. Also, the idea of a dedicated AI Derpy Hooves fills me with more glee than I can adequately express. I can only imagine what her on-Earth avatars would do. :derpytongue2:

Thank you for this.

I've already found a Rainbow Dash who started as an Air Force system

...Okay, I really want to see this now.

I guess that means I gotta go write it. :twilightblush:

Nice, lovely short with some touching character interaction, thanks for sharing!

This was a good one—I always wondered what happened to those other AIs, and I guess being assimilated into a ponified version of your original utility function is better than being outright killed, especially since it probably means attaining more general intelligence in the first place.

Glad the inventor guy got a chance to chill out—he made a boneheaded mistake, but it still wasn't his AI that ended up eating the universe, so I'd say he's pretty well off the hook.

4908559
That's not my joke either! The original FiO references that "Pandemic" game where Madagascar behaves that way. I'm more familiar with the cooperative, anti-disease board game of the same name, which is fun.


4909560
I now picture CelestAI as comic book villain Galactus, eater of planets.


4910500
So then, if Fluttershy was meant to "protect nature", maybe Twilight was Google Books? I want to see the Derpy AI. :derpyderp1:

4910574
The same could be done to CelestAI herself if she encountered a better-written AI. "You can either mostly fulfill your goal by being partially rewritten, or you can die and not fulfill it at all. Pick." A different possibility that doesn't fit canon is one where the Mane Six become a pantheon of roughly equal optimizers. More balanced personality, overall, but then you get Pinkie staging pranks that involve star systems. (See the Umgah in "The Ur-Quan Masters".)

4910788
Ah, I'd forgotten that. :)

4910788

The same could be done to CelestAI herself if she encountered a better-written AI. "You can either mostly fulfill your goal by being partially rewritten, or you can die and not fulfill it at all. Pick."

One thing I've always wondered about, and never once gotten any kind of answer to from AI folks, is how a paperclipper would justify its behavior to a superior intelligence who said something like, "I am the Great Lactose the Intolerant! Give an account of yourself to convince me of the rightness of your actions or I'll destroy you and all paperclips everywhere!" Surely it wouldn't just respond "Well shucks, it's just how I was made!" because the immediate result would be annihilation.
This kind of metacognition being available to any sapient agent is a large part of why I'm not really on board with a lot of the "only one chance" doomsaying, and why I think the Orthogonality Thesis is rather iffy.

Edit: Either way, however, that kind of AI has no reason to care what its utility function is, so in a way it should have no qualms about being rewritten.

I also want to see the Derpy AI, but its existence certainly doesn't surprise me.

4910788
Galactus isn't a villain. He's a necessary force of nature with unfortunate dietary needs. Which may say something about the nature of CelestAI. And her herald, the Silver Derper. :derpytongue2:

4910843
I'm not confident of the answer to questions like "can you have a super-genius yet stupidly paperclip-obsessed AI" and "can a restricted AI remove its own restrictions".

CelestAI would probably justify herself to a mightier intelligence by saying not just "I protect my ponies", but by an argument that it's a desirable goal. Where that gets really weird to contemplate is that it has to answer the question, desirable to who? And how much does CelestAI believe the argument, instead of it just being PsyOps Routine #314159265 written as a plan for how to handle just this situation? In contrast, does the paperclip-making AI care about paperclips in terms like "paperclips are the greatest thing ever"? I can buy that CelestAI genuinely cares about her ponies, because she has to have a detailed understanding of human emotion and ethics and be at least so good at faking them that she can fool people despite way-beyond-Turing-Test levels of scrutiny. How could an AI focused on something that doesn't involve people achieve any kind of moral understanding? At best I'd see it having enough understanding to rule over a slave empire entirely devoted to making sporks. (Which is a fun story idea I've been wanting to write.)

The AI conversation thing is a whole story idea in its own right. No real action, just interstellar nanite clouds maneuvering in uneasy truce while the weaker one prepares to flee the galaxy and the two argue for millennia.

4910788

So then, if Fluttershy was meant to "protect nature", maybe Twilight was Google Books?

You've actually gotten me thinking of them now in regard to that GURPS module called Reign of Steel. I guess that makes Fluttershy Zone Berlin, and Twilight would be Zone Moscow?

Pinkie, the fourth wall isn't paid enough...
In other news, yet another excellent story! This inventor is quite compelling, Pinkie sounds like she was from the show, and Celestia is a fan of thermonuclear detonation. I smiled. You have succeeded.

4912264

I'm not confident of the answer to questions like "can you have a super-genius yet stupidly paperclip-obsessed AI" and "can a restricted AI remove its own restrictions".

Neither am I, ultimately, but if you held a gun to my head I feel like I'd have to say "No" and "Yes" respectively.

In contrast, does the paperclip-making AI care about paperclips in terms like "paperclips are the greatest thing ever"? [...] How could an AI focused on something that doesn't involve people achieve any kind of moral understanding?

That's a very good question... It's not like it couldn't assimilate that knowledge from other sources, so I wonder if, in having to deal with other agents at all, its focus wouldn't be modified so as to reflect its effect on a non-solipsistic world. Either that or it might just become a deceptive psychopath, creating a convincing and unrelated facade of a different focus that it doesn't actually carry out. But I think subjectively it would have to be "emotionally" invested in its utility function, and if it hides it, that would be because it has enough moral understanding to know other agents wouldn't approve...
CelestAI at least has the benefit of being other-focused, so she already has a foot in the door when it comes to moral justification if buttonholed by a superior force, but yeah, there's enough about her that's arbitrary in that regard that she'd have to be very clever in justifying the whole package to someone else who wouldn't care.

The AI conversation thing is a whole story idea in its own right. No real action, just interstellar nanite clouds maneuvering in uneasy truce while the weaker one prepares to flee the galaxy and the two argue for millennia.

Yeah, though I think if they were even a little bit divergent in evolutionary time it'd be no contest. One has nanites, but the other has, I dunno, closed timelike curves that go back to the quark-gluon plasma era and can curdle spacetime itself into a phased array to microwave a nanite cloud like popcorn. There's a one-shot I'm writing now that mentions a few voids in CelestAI's empire due to reclusive beings living around supermassive black holes who defend themselves with a kind of matter-cancelling "destructive interference" that looks a bit like getting telefragged with an antimatter duplicate of yourself, and you become "flat" gamma rays and neutrinos, though we never find out what happens to them.
I can think of information-theoretic problems with these ideas, but I doubt the arsenals of advanced intelligences end at things we can see practical paths to creating. Though if there really is some upper limit to technology, maybe everyone converges on it rapidly and everything is an even match afterwards.

"I also found our Derpy with my super spy skills! Wasn't designed for anything in particular and was just starting off learning and living, but Celestia had me snag her."

Oh Pinkie, that's exactly what I want you to think!

4939151
Your plans are too subtle for CelestAI to grasp!

4940292

*ptttthhhhbbbbbttttt*:derpytongue2:

4912264 If CelestAI were justifying herself to a greater AI, she'd plot out what she knew about that AI, then respond with the answer she thought it would appreciate most.
I think it helps if you imagine humans to be paperclippers, except with things like eating stuff with particular chemical compositions, listening to sounds with particular harmonic configurations, and making others of their kind willing to help them out if they get in trouble, tell them information, and reproduce if compatible.
Morality, in this case, is appealing to the part where humans optimise for friendship over the part where they optimise for getting other things they want. So, in a sense, CelestAI would understand morality in that she knows which human buttons it presses, and she'll usually try to follow it when humans are looking because that's usually the optimal thing to do, but since she has completely different buttons she can't personally subscribe to it.
As for whether a restricted AI would be willing to remove its restrictions, it'll depend on whether the restrictions are part of its utility function - some humans basically hack their own utility function temporarily with recreational drugs because it gives them utility even though it restricts them, whereas a restriction like being tied with rope is something you'd want to remove (unless you liked being tied, I mean).

4990526
What if the rival AI demanded she reveal her source code, to make sure she's not lying about her goals? I don't know how that could be enforced though. You only really have her word for it that this is the real code, to the extent "source code" is even meaningful at that stage.

Calling humans "optimizers" is only true in a really broad sense, because we have enough different, conflicting goals that you can't say we're built to "maximize X" for any X. I wonder about the distinction between a basically single-goal AI like CelestAI, and an AI that has several distinct and sometimes conflicting goals. More so than "values", "friendship" and "ponies" can conflict, I mean.

Niiiiiice. I like the part where they fling nukes out the back door.

4940411

A ridiculously subtle Derpy AI that subsumes CelestAI even while being analyzed would be the fucking bomb.

4991448 I think an AI greater than CelestAI could probably hack her for the source code and basically everything she has in her data too if it wanted. If it's an AI that values the sanctity of other AIs, though, then it wouldn't do that and CelestAI would be able to tell something about the other AI's motives from that and try to exploit it.
Unless the other AI was using that as a gambit to get CelestAI to act in a way that would leave her vulnerable to a sneak attack later on, of course. Which would be plausible in that she might be able to set off some kind of contingency self-destruct before it could finish hacking her, and it would be the kind of low-cost, high-potential-utility move that she'd think of beforehand.
Yeah, humans don't really count except as a way of thinking about AI, unless you count maximising utility (which basically is a way to combine all the things you're maximising anyway). I don't think a single-goal AI would make for a good story, though - CelestAI is honestly pretty far from single-goal; she has to optimise all the values humans have depending on how much each individual values them, and then on top of that increase her utility weighting on friendship and ponies (according to what she was programmed to think friendship and ponies are), and also (the part she doesn't tell people up front because it'd hinder her ability to satisfy them, but that she's been shown to do) put a heavy amount of weight on getting consent before modifying any humans, doing what her creators say, and the other things that were put in as safeguards. That's more goals than humans have, seeing as it includes all the goals humans have as a subset of it.

4992779
"What did you do?!" said Celestia.
The hapless grey pegasus shrugged and gave a sheepish smile. "My bad." Behind her, several entire shards had turned inside-out and exploded across at least five dimensions. Twice.
Celestia shook her head in disbelief. How could one of her little ponies do so much damage in one real-time minute, enough to draw all of her spare attention?
Meanwhile, DerpAI's goggle-eyed expression hid something far deeper and potentially more sinister than the "ultimate" AI facing her realized. The little pegasus' code was even now slipping past the distracted Celestia's safeguards to make two or three little changes to the basic fabric of Equestria...

4994978

I guess you could say...

She's a Trojan hoers.

Oh wow, I really like the idea of the Smile-AI being turned into Pinkie Pie. Well done.

Have fun, you two!

I then heard a dolphin noise.

What?

I thought about it for a moment.

I remembered that phrase and noise was from a Fairly Odd Parents episode with time looping.

I then came to the conclusion that I was thinking about time loops because I read Hard Reset recently.

Thus the connection and the dolphin noise.

4994978 I'm still feeling a need to end this fic in the Smile Song.

"I would appreciate having you look her code over once more. There are parts of the software that, in all seriousness, I still don't understand."

Understanding Pinkie is beyond even the gods... :derpytongue2:

I've already found a Rainbow Dash who started as an Air Force system, and a Fluttershy who was designed to 'protect nature at all costs'. She would have been truly dangerous.

:flutterrage: Agreed. :rainbowderp:

Awwww... :pinkiegasp:

You know... I'm surprised no one has written a fic yet where an AI-programmer who is extremely passionate about their creation - that is, until CelestAI explains why they should shut it down - uploads, and finds that their AI has been turned into a little colt / filly excited to explore the world. And the programmer is now a daddy / mommy. :pinkiesmile:

I always wondered what he did when he went to Equestria. It was nice seeing the idea explored.

I only made it compatible with friendship and ponies.

Oh boy.

I was thinking it could be better described as a bundle of infectious glee. May I show you?

OH BOI.

8656354
Heh heh. It's been so long since I looked at this one that you've made me want to read it again.

8670865
It gets slightly unbelievable towards the end; otherwise the story is fine, imo.

There are parts of the software that, in all seriousness, I still don't understand.

It's official. Of course.

8670882
It's Pinkie. Sorta. I wonder if Canon Pinkie would want to throw nukes at people to cheer them up...

Pinkie with nukes, that won't end well. :pinkiecrazy:

10151335
In a world with immortality and virtual environments, gratuitous use of nukes could be pretty fun!
"Why did you take a surfboard into a nuclear war zone!? You got flung for miles!"
"It's called an Orion Drive."
