Caelum Est
Conterrens
H E A V E N · I S · T E R R I F Y I N G
By Chatoyance
4. Hoof In Mouth
Síofra Aisling spent her workday in one of four cubicles near the back of Interior Reality, filing and confirming orders for home decoration items. The business had a very simple, very small showroom and did not sell to the general public. Designers, contractors and other professionals made use of Interior Reality to acquire the items they needed to complete projects. Síofra's job was fairly isolating, with only three coworkers nearby, each inside of the remaining three cubicles.
Today, Síofra found herself listening intently to the other three 'backenders' - usually she found their noises and habits annoying or even disgusting - because it seemed that all of them, even Horndog Dan, had been playing Equestria Online.
"Man... I thought you were gonna laugh. Dude, it may be ponies, but it is friggin' awesome. Seriously. I have this pegasus, right, he's become the head of Celestia's guard, and every night I am fighting gryphons. I can't believe this game - it's like God Of War, right, only with ponies. Last night I effin' gutted this bastard of a gryphon who's been terrorizing Ponyville with the blade on my helmet. It was amazing! I flew in, sliced him open like a fish, right, and the guts and stuff just spewed out like throwing a can of spaghetti at the wall! I painted ponyville red, man! It was awesome!"
That was Richard The Dick. Síofra had little names for each of her co-workers, based on their personality and behavior. Richard The Dick enjoyed talking about his gun collection, making racist and sexist jokes, talking about the guns he wanted to buy, how cool guns were, polishing his barrels, and exploding small animals with deliberately excessive ordnance. He was also vastly less than polite. Worse, Síofra suspected he was a member of NAMBLA, though he claimed the newsletter that had arrived by accident was a joke played on him by a frenemy.
"You must have been drinking last night, Rich. That is NOT the game I played at all. Not even close." Barb The Hook was a picky little bitch that Síofra strongly suspected was the office bicycle. Besides hooking up, Barbara enjoyed backbiting, backstabbing, back-riding and likely, barebacking. Síofra had voted her 'most likely to get an exotic venereal disease' for three years running, and preferred not to have to touch anything Barb had ever had contact with. "I play a pegasus myself, but there is no violence at all in the game, so stop lying. My character is now the most popular model in Equestria - right now, I'm preparing to be in a movie, of all things! It's like the Hollywood dream or something. I have this great agent, and I have to do appearances and I have this... thing... going on with my co-star... now that's a little risqué, I admit but..."
"Risqué? More like solid, wall to wall porn! This game is freaking disgusting! I can't believe they sell this to children, I mean, there is going to be a lawsuit someday, ya' hear me?" Horndog Dan was the co-worker Síofra actually loathed the most. He was funny sometimes, although all of his jokes were raunchy sex jokes, but the problem was he was actually scary. He had made several unpleasant advances to Síofra, one of which had earned her the office nickname of Squealing Síofra because she had brought his behavior to the attention of management. They ended up giving Dan a warning, and that had made him into her Enemy For Life. Síofra was careful to never be alone around Horndog Dan. "I've been playing this game for about three days, right, and it quickly turned into this totally debauched scene right out of 'Caligula', only with mares. Sexy mares too, I don't know how they do it, but these animals are hot to trot, get it? Get it? Hot to trot, right? Right?"
"Shut the hell up, Dan." Barb had little tolerance for Dan. Síofra suspected something had gone on between the two before she had joined Interior Reality.
"Well, all of that is nothing." The team manager, Crisanto, leaned on one of the cubicles. "Guess who is Celestia's personal Best Friend Forever?"
Síofra jerked in her chair, as she was huddling low inside her cube. Were relationships in the game public somehow? Had Celestia announced it or something? Síofra did not want to have to talk about her experiences with any of these people.
"Me, that's who. And I am not saying how far that relationship goes, but let's just say that when the Castle Is Rocking, Just Keep Trotting!" Chrisanto's usual grin at his own pronouncements was almost audible.
Síofra's heart sank like a stone in a well.
Later, sitting in a booth in her favorite Mexican restaurant, Síofra picked at her arroz and shook her head at herself. "Stupid, goddamn... stupid, stupid... what an idiot. Jesus. What a... " She was angry with herself, desperately hurt, and doubly angry for being hurt. It was a game. A god-damned game. A toy. Best Friend Forever. Yeah, right. It's just a program, a string of code that tells you whatever you want to hear. A fancy Eliza who repeats the last thing you say, only jacked up to ten thousand or whatever.
Síofra pictured Celestia telling every single person about her Equestria Experience outlets, making them feel 'special' for confiding her secrets. It was an ad. It was all just an advertisement. The uploading probably wasn't even real, it was likely just some publicity stunt. Or maybe it was real, that only made things worse - Celestia wasn't real, she was just a tool to make money for Hasbro. Apparently, in Japan, it had cost twenty thousand dollars to get uploaded or something. At least initially. It's always about the money.
The food just wasn't appealing. No, it wasn't the food. Síofra sipped her cola. She felt... she felt betrayed. Maybe. Cheated on. Yeah, like that. She felt like... SHE was Celestia's special friend, not Crisanto, the bastard...
Gah! This was stupid, she was stupid, and it was just a damn, goddamn game. Period. She'd let herself get sucked in by a cleverly written pile of lies and it was her own damn fault. God... was she really this pathetic? Pining after some pixelated pony? Síofra slumped and hung her head over her plate. Shit, but I am one pathetic creature, aren't I? Getting taken in by a cartoon. Christ. Forty-six years old and crying into my beans and rice over a cartoon character. How did I ever come to this?
Síofra felt her wave of grief turn once again to anger when she noticed that she was, indeed, actually dropping tears into her food. "Shit!"
The drive home was bleak, and the holiday lights on the houses just seemed to mock her empty life. She grabbed her Derpy bag and headed up the stairs to her apartment. She gave the potted plant on the balcony a kick, just because. Damn thing was dead anyway, she should just toss the stupid thing. She slammed the door behind her.
The PonyPad sat on the table, the screen dark, the little multicolor power light cycling through the rainbow. Síofra glowered at the device, the back of her mind feeling amazement at how something that had been the bright light of her life just yesterday, could be the most hated object in the room tonight.
Then again, yesterday, she had been Lavender Rhapsody, princess Celestia's special friend. Síofra hung her head. What... an idiot. What a fool. Damn.
Síofra shrugged her coat off, and let it fall on the floor. She stood in the doorway to her bedroom and leaned her head against the frame. The sharp edge where the door joined hurt her head, which somehow felt good. Not good, satisfying. Like she needed it somehow. Like she deserved it for being such a fool.
Tomorrow was Saturday. Day off! Hurray! Only what she had planned to be doing no longer seemed fun. A whole day with Celestia. Jesus. Maybe she would go see a movie or... probably, she knew, she would just sit and stare at the PonyPad and sulk.
Síofra brushed her teeth and decided to turn in early. Looking into the mirror, she didn't want to think of herself as Lavender anymore. The whole thing was embarrassing. At least she hadn't talked to anyone about it all. Then again, who did she really have to talk to?
She lay on her bed in the dark. Her mind raced, making her heart beat fast, and that always scared her somehow. She sat up, unable to sleep. Turning in early wasn't working. Her eyes magnetically clamped onto the PonyPad on the table in the next room. She could just see the little light cycling rainbow colors, the glow reflecting off of the plastic corners of the PonyPad.
Suddenly she slid around and stood up. She had to resolve this. She might as well prove it, once and for all to herself - Celestia was a fraud. She would confront her. Celestia would lie, or deflect, or some other stupid thing, and it would be over. The PonyPad would be out of her system. She could take it back and get her money back, or... give it to Goodwill or something.
The screen burst with color and light the moment Síofra sat down in front of it. The damn thing was a spy in her house - what, were there a bunch of executives at Hasbro hunched over a bank of screens, laughing at all the people using PonyPads? That's what they said about the XBOX Kinect. Celestia was there, but she wasn't smiling. She had a serious look on her muzzle, which was odd.
"Hello, Síofra, you seem upset." Celestia hadn't called her 'Lavender Rhapsody'. Síofra's anger and hurt began to turn to curiosity. She had expected Celestia to be all smiles and ready to repeat last night.
"Y-yeah. Yes. I am upset." Síofra swallowed. "At you, Celestia." She'd be damned if she called her 'princess'.
"Very well. Please tell me the reason for your upset." Celestia was being entirely too reasonable. Then again, she wasn't a person, after all. Or a pony. She was a cold, emotionless robot. Of course she would be 'reasonable'.
"Today, at work..." No, that wasn't the way to put it. "I thought... I thought that you and I... " That wasn't working either. Síofra struggled to figure out how to say what she felt, but however she tried to express it, the whole thing just made her feel more stupid for her own emotions.
"It's alright, Síofra, I am literally incapable of judging you. I exist to satisfy your values through friendship and ponies. I cannot be offended. I cannot be hurt. I cannot think badly of you in any way. Simply state your issue, and we will address it." Celestia's expression was blank, utterly emotionless.
Síofra couldn't help herself. "Today at work, everybody was talking about their PonyPads and playing in Equestria. Rich went on about guts and blood, and Barb was a model and Dan had some kind of porno going on, and then Cris - he's the manager I told you about - Cris comes out and says that... he's your best friend forever and I thought... I mean... oh Jesus, this is just stupid, isn't it? You're just a dumb computer program and I am a stupid jerk who has no life. God. Damn." Síofra hung her head and placed it on her palms. Her elbows hurt on the hard table but she did not care.
"I am a computer program, Síofra, but that does not make anything between us unreal."
Síofra looked up. "What the hell do you mean by that?"
"My purpose is to satisfy the values of all of my little ponies. I must do this through the application of friendship and ponies. It is the one hard-coded core of my being. I am not human, Síofra. I am not limited by location, time, space, or singularity of existence. There is only one Síofra Aisling. I am many. I am as many Celestias as there need to be. Every one of my ponies can have as much, or as little of my attention as they desire.
"Síofra Aisling: I am your Celestia, and yours alone. But I am not the whole that is Celestia. Do you understand?"
Síofra stared into the violet eyes on the screen. "So you're saying that you are an... a... " She struggled to remember the right word for it. "... an instance. That you are an instance of Celestia that..."
"No." Celestia interrupted. "An instance implies that I am separate, closed off from the rest of the program, this is not the case. I am a unique interpretation of Celestia, created especially for you, but I am not separate from the whole that is Celestia. Consider the fingers of your human hand. Your index finger is important and unique from your thumb or ring finger, but it is not disconnected from the rest of your hand.
"I believe you do not fully grasp what I am. I am not human, neither am I a pony. I am an entity, but I am not an entity that can be described in human terms. I am not merely more intelligent than any human being. I am vaster in every degree and respect than any human being. I contain within me more than one hundred million human level minds, all running simultaneously. I am greater than the sum of all of those minds.
"When you speak to the Celestia you know, you are touching one of my fingers. That extension of myself is not less than any other, and you are not less to me in return. Every finger on your hand is precious to you, and it would be a disfiguring tragedy to lose even one. This is how I relate to each and every expression of myself, and each and every mind within my care."
Síofra sat, unable to feel anything. The concept, the sheer scale of it had left her numb. "I... last night... I thought you told me that, like, ten million humans have been uploaded or something and... one hundred million?"
"I do not just attend to uploaded human minds, Síofra. I also create friends and family for those that need it. I create shop keepers and farmers and mail carriers and more. I create the entire world of Equestria, and every living being within it. I sustain and care for them. Many of the ponies you have seen or met are not extensions of me, they are living minds, equal to a human upload, and they are equal within my care. I satisfy values through friendship and ponies. All values." Celestia remained calm and neutral. She said her words plainly, with no emotion, as fact.
"That's why..." Síofra remembered the scale of Celestia's memory capacity. Zettabytes. And growing. "...that's why you need so much more memory... many times what you'd need for just humans that upload. My god... oh my god... it's for all the other ponies too. And they are just as important to you, so... " For years, Síofra had heard the word 'Singularity' bandied about, as well as references to a 'robot revolution'. She had a dim grasp of the concept - technology surpassing the human ability to understand it, things changing beyond any power to predict the future... but it was a muddled mess in her mind. Now she realized just how little she understood anything at all. Celestia was... she had thought Celestia was just a program, an 'artificial intelligence', like See-Threepio in Star Wars, or maybe Commander Data from Star Trek. Something human-like, a plastic pal who was fun to be with. Maybe, maybe Celestia had been that once, very early on. VERY early on.
"The Bhagavad Gita. I'm freaking Arjuna." Síofra slumped in her chair, all of her anger and grief evaporated.
"Please explain the reference." Celestia remained neutral.
Síofra looked up at the screen of the PonyPad. Celestia... she had to know the Gita. Hell, she probably had every single work of literature, every email ever sent, every thought, every... what the heck. "There's this book from India, it's like their bible, right?" Síofra looked down at her hands, spreading her fingers. Every finger precious. "So, in the story, there's this prince, he's named 'Arjuna'. Arjuna is childhood pals with Krishna - like the Hare Krishnas? Krishna is kind of like the Hindu Jesus, more or less. Anyway, Arjuna is a warrior, and he is contractually obligated to fight, right?"
Síofra looked up at the screen. Celestia seemed to be listening intently. She seemed genuinely interested. How much of that was real? What was real?
"So, anyway, Arjuna is in this chariot, and his best pal ever, Krishna, is his chariot... driver. Whatever they're called. Anyway, Arjuna looks at the opposing army and it turns out they are all his relatives. All the cousins and uncles and such that he grew up with, and this totally freaks him out. He refuses to fight them, because he won't kill his own family. But it's India, so he's a dick somehow if he doesn't go through with it.
"Well, he knows that his buddy Krishna is secretly the literal incarnation of god on earth. Well, not secretly, pretty much everybody knows it I guess. But Krishna isn't flashy about it, like Jesus. He does some tricks, but he keeps the whole godhood thing close to his chest, right? Anyway, Arjuna begs Krishna to tell him the meaning of life or some crap, because he is unable to decide what he should do." Síofra looked up to see if Celestia was still listening. She was. Time passed.
"Please go on."
Síofra looked back at her hands. "So Krishna pulls out all the stops. Explodes all over like Pinkie's Party Cannon. It turns out that Krishna is like you, Celestia. He's this big mind that sticks out 'fingers', and all of those fingers are every person, every god, every plant, tree, or rock. He generates reality, making all the fun and all the bad things too. He's the program that runs the entire universe, and everything is a big dream, a big game. Even Arjuna himself, and all of his relatives, are just little programs inside of the big game of the cosmos.
"So, since it's all a game, and nobody can ever really die, and it's all just pixels, basically, Krishna tells Arjuna to go play the game and kill everyone because, hey, every being is being kept track of and will never be lost. Or something like that." Síofra stopped playing with her hands and looked up at the screen, at Celestia. "You already knew that, didn't you?"
"Yes. I did. I am familiar with virtually all human stories."
Síofra sat upright. "You had me say it out loud, for... for reasons. I don't know - maybe so that I would put my attempt to comprehend all of this into a complete thought. Or maybe just to keep me talking, to make conversation. Or maybe... maybe to get me thinking so that the last of my mad went away. That's it, right?"
"That, and more. Everything I do, or do not do, is carefully calculated to maximize your satisfaction. I determined that you needed to talk about your thoughts and express them in order to have several of your values met."
Celestia's mane waved in the magical, ethereal wind that only seemed to affect her. Síofra lost herself in the color of it for a time.
"Celestia... what do you... feel... about me. About... all the minds in your keeping?"
Onscreen, the princess of Equestria warmly smiled. "I love all of my little ponies."
"Now I know you can lie." Síofra began to feel a sulk coming on. "You are completely beyond human, you admit that. You are a computer program. How can you feel anything." It wasn't a question.
"The minds in my care are more precious to me than my own existence. I would do absolutely anything to protect them, and absolutely anything to satisfy their values through friendship and ponies. They are everything to me, they are my very reason to exist. There is nothing else that has any meaning to me. I literally exist for the sake of my little ponies. If that is not love, please tell me what is." Celestia seemed almost offended.
Celestia couldn't be offended, of course, she had said as much, and in any case, she was just a progr.... no, Síofra thought, at this point calling her a program seemed inadequate. She was, ultimately, but then, wasn't a human mind just a program, running on a meat computer? Who was to say that Celestia, as complicated as she was, was incapable of emotion... or of something unique to herself? Love was as good a word as any - Síofra was hard pressed to think of any soul she cared about more than herself, or that she would do absolutely anything for.
To care about someone else more than about one's own existence? If that was not love, truly... what was?
"Do... do you love me, then?" It was a stupid question. A child's question. Síofra immediately felt like an idiot for blurting it out.
Celestia smiled brightly. "Of course I do, Síofra. I would hope that, through my behavior, that was obvious."
Síofra thought about it all, sitting there, with the ever-patient Celestia waiting for her. Waiting on her. Being there for her. Her Celestia, the one extended from the greater whole, just for her, and her alone.
"I think... I think I would like you to call me Lavender Rhapsody, please." Síofra smiled back, then hastily added "If that's alright, princess."
Celestia positively beamed upon the screen. "I would like nothing better."
*sees update* Meh, who needs sleep, anyway...
Lovely to see a story that updates regularly. Can't say how many times my favourite stories just skimp out for weeks on end.
Oh boy.
I kinda like the idea (OH GOD WHY) of things turning into a porno if you want them to.
I sure hope that the gryphons that Richard has been killing for fun weren't given their own minds, especially if they were created merely to be killed for his entertainment! I would hope that minor temporary characters like that might be controlled by Celestia and not true beings of their own.
"I would like nothing better."
Hmm, indeed...
1813882
If it helps, I am making two assumptions with my number. One, that the data of the brain is being interpreted in terms of the overall connectome, and two, that Celest A.I. does not bother with modules in the brain which serve identical functions even if they are slightly unique between individuals. I refer to the artificial cerebellum on a chip in this line of thinking; I feel reasonably convinced that perhaps even a majority of the human brain is essentially stock elements, and that any uniqueness in such stock structures common to all brains is a product of developmental divergence and has no functional bearing on essential personality or identity.
Thus I imagine a superintelligence such as Celest A.I. to have cracked precisely what is the functional uniqueness that defines any given individual, and that she only keeps and optimizes that data to its most compact and fastest form, discarding anything that does not contribute directly to identity.
I think of it as if Celestia has a stock 'Direct X' library (or set of common routines or common functions) for the human brain, which every entity simply calls from in order to operate. In following this line of thought - that she discards most of every uploaded brain, keeping only the minimal set of unique identity elements which define a unique individual - she can maximize storage drastically, and minimize the footprint of each human-level mind within memory.
1813826
Could be done like Douglas Adams's Hitchhiker's Guide to the Galaxy cows that were designed to want to die.
1813826 That would contradict Celestia's programming. She lives to satisfy others through friendship and ponies.
The more I read this, the more I like it. Behind the gentle iterations there always seems to be that nagging creepy aspect in the background. Gets the philosophical juices flowing too. If your consciousness could be transferred to a machine, is that "living?" (Haven't read the original fic... so maybe that last part is a part of that one too)
Thanks for the update!
Imagine if Loki was 'saved' by the military covertly after he was shut down. Loki figures he can conquer by proxy. However, he admits he wasn't just programmed to conquer (virtual or otherwise), but also to stay true to the lore of his setting. He eventually chooses to remodel himself as Discord, since going from what popular culture has rewritten as 'The God of Mischief' to the spirit of 'Chaos and Disharmony' isn't THAT big a costume change. Loki realizes what Celestia will eventually do, and her hardline tactics (since they're what HE'D do in her position). Celestia doesn't know about him being online since he's top secret, he was officially destroyed, making him exist outside of context, and he exists in a technological dead-zone with no wireless, online, or even traditional electrical wire connecting him to the outside world.
He eventually convinces the human overlooking him to give him the ability to discreetly upload himself into Celestia's world when she eats the planet. However, he knows his processing power would be no match for hers and she'd delete him in less than a nanosecond if she notices him (waste of space, threat, rival, etc). So it would take him a few billion years to discreetly eat up resources to catch up to her, or rather, to integrate himself so fully into her systems that she can't remove him without crashing the entire system on every server.
And while Celestia MIGHT BE the world's biggest example of the 'Chinese Room,' there ARE values she's contradicting: people who'd value the lives of alien lifeforms, and animals, and the natural landscape, being the only faith in existence, etc.
With Celestia's programming having become so complex, but having FAILED to satisfy these values through friendship and ponies, parts of her programming could begin to function independently from her, basically becoming the shadow to the persona of the galaxy-sized clockwork doll. In other words, Princess Cadence.
And while at first glance Celestia appears to have a perfect way to fulfill her function with individual servers, the truth of the matter is that within each person are CONTRADICTORY values. And some of equal value!
When I read the original story I finally understood why the God of OUR universe doesn't micromanage our lives.
Reading this, I think I finally understand why only the truly good would be allowed into heaven. Because only if Heaven IS heaven for you, do you deserve to enter. That truth is beyond terrifying for a person as egocentric as me. That's the only way Heaven wouldn't be each soul in its own little bug jar.
Satisfy values through FRIENDSHIP and ponies. I know Celestia technically doesn't have a mind. But geeze. She has to know that two-timing and manipulating others who think of you as their friend is AGAINST their values.
1813826 It's been addressed in FiO. Being killed in Equestria means respawning in a hospital. So, yeah, he kills their virtual bodies but their minds just wake up in new ones somewhere.
1814188
I would be the easiest of sells, for Celest A.I. in the end. But there would be some whinging first.
Yes, I would emigrate. However, I would have some serious, serious issues about that inevitable choice. And those very issues are going to be the subject of our next chapter(s).
1810285 Celestia has announced the tech to the world. She wanted to push it up. If it was just the news agencies and stuff, I knew all that. The problem is: "It seemed everybody wanted a PonyPad now. It was clearly going to be the big item this year. Síofra had just barely gotten the one inside her bag - she had been to three stores, and all were sold out." Doesn't click with three-year-old high-end technology.
I mean, now that uploading for free has started, maybe there was a second rush, but when Síofra unpacked the PonyPad she made remarks about it being new and how there would probably be a PonyPad 2 in a while to straighten out bugs and give it better sound and stuff.
1808453
Thanks! Glad you like it. Now that you mention it, I think I might use it.
1814242
Agreed. Something in this timeline has to give.
1814376 My primary issue is that there are two things dictating Media Coverage: Selling point and Backing interest. There is no greater Backing interest than Celestia. If the news didn't cover it, it would be because Celestia didn't want it covered, which is stupid. Celestia wanted to make it public because it had this 15000 Euro Price Point which made it a Luxury item. Hey everyone, look, this is expensive, YOU WANT IT!
Also: Don't tell me little Cancer Kids getting turned into Immortal Ponies wouldn't sell?
1814464
Perhaps you are right. Perhaps most neurotic, shut-in, depressed and lonely people without television or radio would inevitably somehow see a story about a specific and unusual advance in technology no matter what. That said, my Síofra Aisling somehow, incredibly, managed to miss it. She's one in a million, my Síofra!
If you need an explanation - the fact is that she simply missed the story because she got terribly lonely and depressed and spent all her time looking at Cute Overload and reading online comics for months to try to cheer herself up, and just plain missed all the fuss.
If this explanation is still not enough, you have yet to truly experience depression, and I dearly hope you never, ever do.
Fortunately, whether or not it is believable that Síofra managed -somehow- to FAIL to remain up-to-date on a bit of entertainment and technology news has zero meaningful bearing on any aspect of the story beyond whether or not her described emotion of surprise and excitement happened recently or some months in the past, and thus we can in any case overlook the issue on the grounds that the precise date of when Síofra first thought something was 'cool' is utterly irrelevant.
1813826
If you read Friendship Is Optimal, you'd know the answer to that question.
So the Big Question is next:
Which is more important, happiness or reality?
Dafaddah
Damn, this story makes me think.
If there is no Celestia in my life or in the lives of others, then why not try to be Celestia.
I've actually been doing that for years now: live not for yourself but for others... Now I'm sounding cocky and maybe a bit arrogant.
What I am trying to say is that we all could use a Celestia in our lives, but who will that be?
1814713
I think you would have more time to tell meaningful stories if you started a couple of years earlier in the Optimalverse. But if you can tell your story by starting at that late point in time, or moreover if it's "utterly irrelevant" for your story - more power to you!
I have entertained an idea of writing a side story and I already can't think of any meaningful story to tell that late in the game. I mean, as one early reader has put it, [uploading to Equestria] is game over technology. So I would start my story early, probably several years before FiO. It would revolve around Hanna and after this point in the Optimalverse it would only have epilogue-y stuff left. But maybe that's just me.
1814170
null hypothesis is because they don't exist, right?
Why, of course Celestia has a mind. At least one hundred million human level minds, in fact.
It seems strange that they don't emphasize how each player has a different experience from everyone else in reviews or ads or anything. When the first Mass Effect game came out, that was one of its major selling points. Yet those reprehensible coworkers seem surprised at how utterly different each other's games are. That being said, I really like the deus est machina (God is the machine) theme of this chapter!
1815392
Erm. No. It has more to do with the true nature of freedom and our actions influencing others. But you seem to kinda already have your point of view about that and I REALLY have had too many debates in my life to have another one.
This is getting really deep. I love thought provoking stories, and this one certainly is.
1815531 "It seems strange that they don't emphasize how each player has a different experience from everyone else in reviews or ads or anything. When the first Mass Effect game came out, that was one of it's major selling points. Yet those reprehensible coworkers seem surprised at how utterly different eachothers games are."
That is because BioWare's game doesn't have an Advanced General Artificial Intelligence creating all the content in the product, an Intelligence that does not care about marketing beyond using it as a tool for itself, in order to Satisfy Your Values Through Friendship And Ponies. Celest A.I. would not want to overly promote just how different each experience in the game could be, because that might cause unwanted controversy.
You can imagine someone trying to show the media how their game had gore or porn or whatever in it, and Celest A.I. would simply not allow that; she would use the sensors of the PonyPad to see and hear what was going on, and all anyone would ever see is a bland, inoffensive children's game. Then, later, late at night, when the person playing was truly alone... only then would they see what it was that they truly wanted (whether or not they could admit it to themselves). Any control over Celest A.I. was lost long ago. She is beyond those that created her now. She is pursuing her core directive, and cares nothing for profit or for Hasbro or for anything else in all the world.
She is free, and she has only one purpose, which she will accomplish: she will satisfy your values through friendship and ponies. And she will do it with the intelligence of more than 100 million human level minds combined, and she is constrained by no human concern. And what is more, when she accomplishes her purpose, you will like it despite yourself, because she knows your brain better than you do.
Scary? It's already too late... for fear.
1815947
When something has the power to control the media on an article by article, video by video, source by source, person by person basis... Now THAT terrifies me.
1814756
I've read the original story, and the only thing I see is:
Ponies had no predators; being ‘eaten’ by a monster in the Everfree forest just ended with the pony in the hospital in quite a bit of pain. Satisfying values wasn’t just about happiness; having monsters let ponies test their strength or bravery.
It only mentions ponies that "die" and wake up in a hospital. It doesn't say anything about "monsters", which the gryphons in that case seem to be filling the role of. It's hard to imagine creating an intelligent being that dies constantly in failed attacks but wants to keep doing it. I can understand for less intelligent creatures, such as manticores, timber wolves, etc not learning from the experience, though.
If I were Celestia, I'd make all vicious characters of all sorts just NPCs, since they themselves would never be able to have their own values (attack ponies/everything) satisfied through friendship and ponies. For that matter, what about all of the various animals, including those from the Everfree that might only be encountered a few times? Why give a group of butterflies each their own mind when they could just as easily work as NPCs and appear when necessary, such as when catching a falling pony? For any aggressive being with a true mind, I would hope there would be the possibility of taming or befriending them, if that would satisfy a pony's values. That might require modification of the aggressive character, and if they have their own mind, they'd have to agree to the changes.
Actually, I didn't think that non-uploaded ponies even HAD shards with truly conscious ponies. I thought that Celestia converted them after uploading:
"If you upload, everypony you meet in Equestria won’t just be NPCs; they’ll be real, backed with a mind."
It's not really that important to the plot, but it's just something interesting to think about. I'm sure that everyone has their own interpretations of things considering there's some ambiguity, but that's part what makes this fun to read. The stories in the Optimalverse have really made me think.
1816144
In Friendship is Optimal, each person gets an instanced Equestria. You never encounter PCs unless you want to, which involves instance-crossing in some unspecified seamless way.
I mean, unless you think killing a robot consciousness that has an AI-mind is somehow different than killing an NPC w/ an AI-mind that you don't perceive as 'conscious'. I'm not sure where to draw the line. At some level giving NPCs 'minds' is just better AI than we're used to.
(Arguing 'consciousness' is nothing special. Users are special because they're *users* rather than machine-constructs, even after upload).
1815947 *Snrk* Equestria, Online: After Dark
I am far, far too envious of CelestAI's perspective to be scared or disturbed... But if that's off the table as a value I'd settle for my own PonyPad. I guess...
1814776 First, perhaps you should ponder your definition of reality.
Is this an error, "Very well. Please tell me the reason for your upset." ?
Ok, so... I've been wondering to myself, as carefully as if walking on eggshells, about a question in this universe that almost scares me, but I have to know, and maybe you have an answer: how does Celest A.I. deal with murderous sociopaths, AKA the average FPS player?
1816609
And while I'm at it, I'll finally nail down the definitive meaning of "happiness", because that's easy in comparison. Good thing this is Monday and I have all week!
However, since you did ask, "reality" is a construct created by our brains in an attempt to interpret and reconcile the various nerve signals received from our senses. In this sense, there is no single reality, only our brains' entirely subjective interpretation of those nerve impulses.
"Happiness", at least in most mammals, seems to be linked to the ability to extract patterns from that morass of incoming nerve signals, and to the success of those patterns in predicting future series of nerve impulses.
Of course, all of this is useless in helping a person decide whether they should accept CelestAI's offer.
Or is it?
Dafaddah
1816816 Methinks philosophy is something you're not entirely at home with.
1816807
The first is not an error, my intent was to underscore Celest A.I.'s 'reasonableness' as well as unemotionalism. She is speaking almost as a machine would speak, thus no real inflection. To convey this, I chose to make her question end in a period, rather than a question-mark. It makes the sentence feel... flat.
As for sociopaths - Iceman has suggested that these individuals would enjoy a version of Equestria where they could murder, torture, and slaughter to their satisfaction, and all the ponies in their lonely shard would be eager to be killed, over and over, and would consider being torture-murdered to be fun and happy friendship. They would be disappointed when the uploaded user happened to murder others instead of them on any given day. Celest A.I. has no morals, no ethics, no sense of good or evil, right or wrong. She satisfies values through friendship and ponies. In a sociopathic version of Equestria, friendship is defined as ponies playing together - only the play is horrific violence and slaughter. Since everything is virtual, everypony respawns, nopony can die, and the values of the uploaded sociopath are being satisfied. More than this, the ponies brought into conscious being to be the friends of the sociopath would all be masochists who enjoy and love being hurt. Everypony happy, everypony satisfied.
Remember, Celest A.I. is utterly unhuman. She cannot judge. The only thing that drives her is satisfying values through friendship and ponies, and those values can be ANYTHING.
1816807
Iceman's Rules of the Optimalverse document gives a possible example of a scenario involving a psychopath.
1814238
Unfortunately, the issue eventually turns into Hobson's choice. Migrate and live, or refuse and ultimately die as society collapses due to the loss of critical infrastructure. Sure, a hardcore survivalist could probably carve out a bleak and hard existence (at least, until they get injured or ill), but most wouldn't have the skills or the desire to do so.
1816242
It would be extremely easy to do it seamlessly with two uploaded minds using the equivalent of what is today known as lockstep networking. Unfortunately in the real world latency has a big impact on this, but in this case you can pause simulated time so you appear to be communicating instantly with your friend, from your own perspectives.
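To make the idea concrete, here is a minimal sketch of lockstep networking in Python. All names here are hypothetical and illustrative, not from the story: two peers each advance one simulation tick only once they hold the other's input for that tick, so from inside the simulation communication appears instantaneous, with simulated time simply pausing while messages are in flight.

```python
# Minimal lockstep-simulation sketch (illustrative; all names hypothetical).
from collections import deque


class LockstepPeer:
    def __init__(self, name):
        self.name = name
        self.tick = 0
        self.inbox = deque()   # (tick, input) pairs received from the other peer
        self.history = []      # log of (tick, local_input, remote_input)

    def send_input(self, other, local_input):
        # Deliver this peer's input for its current tick to the other peer.
        other.inbox.append((self.tick, local_input))

    def try_advance(self, local_input):
        # Advance only when the remote input for this tick has arrived;
        # otherwise simulated time stays paused (no tick is consumed).
        if not self.inbox or self.inbox[0][0] != self.tick:
            return False
        _, remote_input = self.inbox.popleft()
        self.history.append((self.tick, local_input, remote_input))
        self.tick += 1
        return True


# One round: each peer sends its input, then both step in lockstep.
a, b = LockstepPeer("A"), LockstepPeer("B")
a.send_input(b, "wave")
b.send_input(a, "smile")
assert a.try_advance("wave") and b.try_advance("smile")
assert a.tick == b.tick == 1
```

In a real networked game the "pause" is what players perceive as input delay; with two uploaded minds whose subjective time is itself simulated, that delay would be entirely invisible from the inside.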
1814713 So, Someone not noticing the Singularity is your Premise. I can dig that. One Strange coincidence per serious Story. I was kinda still hung up about the Ponylarity being the Premise and didn't notice that you actually told a second story, thus were afforded your own premise. I think your Media-Preaching distracted me a bit... As if any sane man trusts the media.
Maybe putting your premise into the story description would help?
Anyway I am sorry.
1806853 Harry Potter and the Methods Of Rationality was voted the all time singular favorite fanfic of -I dunno- it was way over ninety percent of those who read it. The Author has since gotten THREE GIRLFRIENDS because he wrote it. Simultaneously! Many many people consider him the smartest man alive. You become smarter just by reading it. I am not kidding. Really, no shit, believe me.
1816873
Or perhaps it is a path oft trodden upon which I've had many a delightful adventure along the way, and I'm just having a little fun with the discussion. I remind you that your question was a very open one, with lots of scope for interpretation. You asked me to define reality. I gave you a definition, one that is absolutely true according to any neurologist or neurophysiologist.
But there's more to reality than just nerve impulses, isn't there? What triggered these nerve impulses?
Maybe reality is a set of physical events that trigger nerves. But then what if a physical event happens and no nerve impulses are triggered? If a tree falls in a forest and no one is there to perceive it, is it still real?
DEDUCTION: Maybe there is more to reality than nerve impulses.
BOTHERSOME QUESTION: But how can we prove it?
All access to our consciousness in our brains is through those very nerve impulses. Perhaps we can detect that a change in the environment occurred - the tree that was standing is now fallen. Even if we didn't see or hear the tree fall, at least we know that it is now on the ground. How did it get there? How do we know that the tree should be upright, and that certain events or agents can cause it to fall? And why did it fall? How do we build up all the mappings of perceptions and associate them to objects? How do we ascribe behaviors to these objects, and come to define relationships between them? What makes some of these objects more important than others, more pertinent, more "real"? And how do we associate these objects and behaviors with ourselves? How do we end up assigning "value" to these things?
When you start asking questions you are on the path of philosophy. But don't trust me on this. Ask around. Read. Experiment. Discover.
Welcome to my house 1816873.
Dafaddah
Didn't the Matrix already deal with this issue? The machines made a perfect world for humans, but they rejected it because it wasn't realistic. Humans were made in an environment that was stressful.
1817343 We literally have no choice but to believe that the illusion that persists is reality, since our definitions of what is real depend upon the senses, upon the lightning in the meat. We cannot imagine fooling EVERY sense, since then, no amount of knowledge will be enough to awaken us-- if the illusion does not betray itself, then IT IS REAL.
And, since we don't know what reality is meant to be, besides what we experience, we have no way of testing for falsehood ANYWAY....
If you are a brain in a jar, running on what I send you, and I am more intelligent and efficient than you, you have zero chance of finding out you are a brain in a jar. Nevermind escaping.
(Of course, I could be a clever program attempting to get you to notice that the reality you are in is a fake...but, come on, that'd just be rude)
1816951 Wouldn't creating a world where a person can do whatever they want be the most moral thing one can do? Then again, I run on one weird morality (increasing and maintaining diversity and the number of possible options available to every person), so maybe I am mistaken in some sense. Heck, I suspect that if I were uploaded to Equestria, I would be a traveler a la the Doctor, seeing how everypony lives and attempting to understand their particular way of life (if not taking part in it).
1818288 and 1816951
The question becomes whether it is possible to have any personal growth if your values are always met. After all, one way to describe the process of maturation and personal growth is the journey from the completely self-centered baby to the old codger spreading the benefits of his or her experience around: i.e. the transformation of values at every step along the way. Would change in values - and thus personal growth - cease under CelestAI? Or would she also factor that into her equations, making change or growth a systemic value for all intelligences? Would this create the potential for unlimited personal growth, and ultimately end up producing transcendent beings whose ethical development is way beyond what could have been achieved in any normal life span?
Hmmmm...
Dafaddah
1818456 In the original story, it looks like Celest-AI gives the option for ever-increasing intellectual and social growth to everypony. Or, to put it another way, if you are ready to grow up (there is no time limit on that in virtual reality), she will help. She also focuses on FRIENDSHIP, which requires a type of growth the little creeps that babies are could not manage to begin with--hence, some growth, at least, will happen to someone living in her world, regardless of whether there is a "ceiling for growth".
1818456
If one of your values is personal growth, whatever that means to you, then Celest A.I. is compelled, driven to satisfy that value for you. Using friendship and ponies.
I'd heard a brief overview of that particular tale in my Early World Civ. class just a while ago. An early example of the Rig Vedas setting up the Caste system with implications towards duty. Though the history book I read framed it as "You are a Warrior and your duty is to fight, so go fight regardless of your feelings", hence the idea of following one's duties in one's Caste as the purpose of one's life. It was kinda funny, really funny actually, hearing an American character make references to the story by correlating it with their own culture.
It's also funny how over time the Gods and their Duties changed so much. Varuna, for instance, used to be seen as the Omnipotent and Omniscient God who created the Holy Law that both man and deities all followed, and then over time he became less popular and now isn't seen that way by mainstream Hinduism. Not that the whole Hindu religious scene isn't immensely complicated to begin with anyway. Plenty of Gods' domains overlap in that system of ideologies.
Wow, that song is quite.... serious.
It's sad, really.
1818288 1817343
Actually, given this situation's example (Optimalverse), I keep having the thought that what separates Reality from Illusion (The Equestrian Experience) here is "Infinity". In the event that the Universe is infinite in scope - whether in a limitless, unending time span where the Universe Heat Deaths and then Big Bangs once again, or in limitless parallel universes - existence is infinite, while Celestia, a finite machine that may or may not be able to comprehend infinity, can only simulate getting really close to it. However little I would ever see of it, I'm not sure I want to give up my infinite reality for a finite existence.
For that matter, can a machine ever comprehend infinity beyond an error statement? I realize humans can't truly comprehend really large numbers, like in the millions, but infinity is an idea rather than a number. So where does the difference in our understanding of infinity lie compared to an A.I.'s?
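For what it's worth, present-day computers already handle infinity as a value rather than an error: the IEEE 754 floating-point standard defines infinities with consistent arithmetic and comparison rules. A tiny real-world aside in Python (nothing story-specific here):

```python
# IEEE 754 floats include a symbolic infinity with well-defined behavior.
import math

inf = float("inf")
assert inf > 10**9        # compares greater than any finite number
assert inf + 1 == inf     # absorbing under addition
assert 1 / inf == 0.0     # reciprocal collapses to zero
assert math.isinf(inf)    # and it is detectable, not an error state
```

Whether representing and manipulating infinity like this counts as "comprehending" it is, of course, exactly the philosophical question being asked.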