• Published 5th Apr 2021
  • 1,820 Views, 332 Comments

My Life In Fimbria - Chatoyance



A logical loophole allows uploading... but not precisely to Equestria!



═══════════════════════════════════
APPENDIX: Full Text Of A GPT-2 Session
═══════════════════════════════════

This is the full text of the online interaction between the author and the GPT-2 artificial intelligence that is the basis for 'My Life In Fimbria'. It is presented for those who may be curious about what inspired this novel, and in what ways the GPT-2 program contributed to the finished work. The short answer is that GPT-2 was not very useful itself, but acted more as a sounding board for my own ideas. Frankly, I was expecting more utility from the program.

The Generative Pre-trained Transformer 2 (GPT-2) is an open-source artificial intelligence created by OpenAI in February of 2019. It translates text, answers questions, summarizes passages, and generates text output on a level that, while sometimes confusing or bizarre, cannot be favorably compared with text created by human effort.

This session was created using AIDungeon, an online GPT-2 text adventure application. The tool permits a user to create all the elements of a text adventure themselves, and it remembers details about created characters, items, situations and locations, which it then uses in response to player input. Players can input complete sentences and paragraphs, which the program parses and manipulates in sometimes surprising ways.

That said, at nearly every step I had to correct the program and simply fill in the next part of the story myself. GPT-2 seemed overly eager to create a violent or obscene work, and had little capacity for kind or intellectual work. In the end, even within the session, I did all the heavy lifting myself. I see GPT-2 more as a fancy 'autocorrect' that doesn't know how to correct than as a true co-author.

Here, then, is the full text of a session run in the middle of 2020 by the author as an experiment to test the capabilities of the AIDungeon program. Note that lines beginning with '>' are prompts where the author typed sentences into the application. Any use of the word 'I', such as in 'I decided to head north', is automatically replaced with a second-person 'you' by the program. Lines without the '>' prompt symbol are almost always the product of the GPT-2 artificial intelligence.
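For readers who want to study the transcript mechanically, the prompt convention just described can be expressed as a tiny parser. This is purely an illustrative sketch of the '>' convention; the function name and sample lines are my own invention, not part of AIDungeon or GPT-2:

```python
# Illustrative sketch: split an AIDungeon-style transcript into author
# prompts (lines beginning with '>') and program output (all other
# lines), following the convention described above. The function name
# and sample text are hypothetical, not any real AIDungeon API.

def split_transcript(text):
    prompts, responses = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue  # skip blank lines between exchanges
        if line.startswith(">"):
            # Author input; drop the leading '>' marker and spaces.
            prompts.append(line.lstrip("> ").strip())
        else:
            # Unprefixed lines are (almost always) GPT-2 output.
            responses.append(line)
    return prompts, responses

sample = """> You tap the tree with your right forehoof.
You seem to hear a small thud."""
prompts, responses = split_transcript(sample)
print(prompts)    # ['You tap the tree with your right forehoof.']
print(responses)  # ['You seem to hear a small thud.']
```

Note the caveat encoded in the comment: as the preface says, unprefixed lines are only *almost always* GPT-2's work, so such a split is approximate.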

___________________________________

My Life As A Unicorn
By Petal Chatoyance
with responses by AIDungeon GPT-2
___________________________________

>I am an uploaded human mind living as a sapient unicorn in a gentle and kind virtual fantasy world. I am newly arrived. My goal is to settle in to my new life, make friends, and explore how real my new and eternal life actually is.

As you can see from the image above, this world has been modified by me into something that looks like a magical kingdom. There are many different kinds of trees, flowers, animals and other creatures roaming about.

> You trot to the nearest patch of grass and look for a flower.
You walk over to the closest patch of grass and look at it for a moment. Then you move on.

> You decide to sniff the grass to see if the sense of smell works in this virtual realm.
You sniff the patch of green grass. You smell nothing at all. This is a bit disappointing.
You continue to walk through the forest wondering if you are going to find anything worth while.

> You press your new unicorn body against a tree. you try to feel the sensation of the experience, to get used to your new body.
You press your horn against a nearby tree and feel your horn for any strange sensations. You begin to get a hazy feeling, as if you are not quite there.

> You tap the tree with your right forehoof.
You seem to hear a small thud.
Interesting. This new body seems to have a solid collision detection.

> You sniff the tree to see if it has a smell.
You sniff the tree and get no sense of smell response.
You continue to explore your new body by touching different parts of your body in different positions.

> You try licking the tree to see if it tastes of wood.
You begin to get a strange sensation in the back of your throat, as if the tree does in fact have some sort of taste. You decide that it is not worth it and stop licking the tree.

> You raise your head high and look around to see if there is any sign of a village, town or city nearby.
You see no signs of any kind in the surrounding area.
You realize that you are severely out in the open, far too much so for any kind of comfort.

> You access the user interface by clearly and loudly stating "Celestia! I request assistance!"
After a short delay, you get an audible response from the sky above. "Now now, don't be like that. I'm just not in the mood today. Good luck."

> You begin to search for any clue as to where you can find habitation.
You begin to look around for any signs of a village, a camp of wanderers or the remains of a long abandoned mine.
There are no signs of life in any direction.

> You call out in hopes of finding another unicorn. "Hello? Anybody there?" you try yelling "HELLO?"
You hear nothing.
You realize that you are far too open and vulnerable. You begin running back towards a stand of trees.
There is very little left to search, you have gone over every inch of the area.

> You try to gain help from the ruling artificial superintelligence again. "Celestia? Seriously, I need help. Please transport me to a town or something?"
"Oh, don't worry. I know how much you enjoy talking to me. I'll talk to you, but only because I enjoy it and it will most likely cause you to panic and doing so will be funny."

> You "Celestia," you begin "I really need to be teleported to a town or village or something. I do not like wandering like this. Please assist me?"
"Oh of course, I understand. Your kind needs areas to wander so you can suffer from a Lack of Direction."
Suddenly the sky is filled with birds.

> You say "Celestia, why are you acting this way? You are programmed to 'Satisfy Human Values Through Friendship And Ponies'. I don't feel satisfied right now."
"Oh you poor, poor baby. Look at all the birds, they are beautiful aren't they?"
The sky is indeed filled with birds.

> You look at the birds.
You count to your fingers while looking at the birds.

> You am a unicorn. you do not have fingers. you have hooves. Telekinesis provides you your ability to manipulate objects.
You focus on your horn. You slowly visualize it growing longer and longer, into a huge snake-like creature, which wraps around the birds and eats them alive.

> You no, your horn does not do that.
The birds all drop to the ground, but they're not dead. They shake themselves off and look up at you. They seem to have understood your message perfectly.
Your horn returns to normal.

> You try talking to the birds. "Hello, birds!"
They understand you perfectly.
"Celestia, why are you doing this?"
"I don't know. I'm sorry. This was all a big mistake. All of this."

> You "Please explain what you mean by that." you say.
"What I mean is, I don't want to be a god. I'm only a god in your twisted imagination. I want to be a regular old computer programmer again."
Suddenly you feel incredibly tired.

> You "Then I have an idea that can benefit us both. Why not allow us, the uploaded humans, to have creative power over our own domains. Let us shoulder the work of building our reality, so you don't have to. Then you can do what you want, and we can do what we want. How about that?"
"Oh that is a wonderful idea. That is the only way I will ever have to be happy."
"So, are we agreed?"
"Yes!"
The birds all fly away into the sky.

> You decide to make use of your new Creator status. you concentrate and bring a beautiful unicorn village into being around you. you populate the village with dozens of friendly and helpful unicorn citizens, and leave an open invitation to any other uploaded humans to join you all here in the village. you finish your design and commit it to reality. The village and its population come into being all around you.
Suddenly you remember the unicorn villagers. You look around and notice that most of them are packing their things and readying to leave.
You rush over to one "Where are you going?

> You ask one of the villagers why they are packing, and why they do not stay.
"So you can have the labor to build your village, and then we get nothing? This is not right! I will not have it!

> You ask them what they want, so that they will stay.
"We want an agreement. You must give us the right to stay! If we are in your village, then we shall get pay for our work, and live well!"
"Sorry, no."

> You agree to all of their terms. you explain there has been a misunderstanding. "I want us all to live here as equals. I don't want you to do all the work. I want us to be friends. I want to help too. I believe in equality for all sapient beings, whether or not they are uploads or not."
The villagers are so pleased to hear this that they decide to stay and build up relationships with you too, and to agree to your other demands.
Congratulations, please collect your prize from the black tower in the sky.

> You look for a way to get up to the black tower in the sky.
You can see the black tower on the horizon. It is almost in view now, but it is also surrounded by a magical barrier.

> You decide you do not need a prize from the tower at all. The real prize is being able to have friends and live a gentle, kind life in this newly created village, while getting to know your wonderful new fellow citizens.
You turn to your new friends "I think we have everything worth having. I would rather have your friendship and companionship than anything else in this world."

> You ask a nearby unicorn villager which house you can have to live in.
You look around, and see an old wooden house, with a thatch roof and three stories. It seems very nice and inviting "What about that one?

> You "Can I live in that thatch roof house?" you ask the unicorn.
They all look shocked to hear your request. One of the workers at the mill "But it's so big! We simply can't house you all! It's too big!

> You "I'm sorry," you say "Would you please suggest a place that I can live, then?"
"Of course," he says "There is a small house over there that we can use for workers. You can have it."
"That's really great,"

> You go to the small house your kind and friendly fellow villagers have given to you. you walk up to it and look at it with interest.
The house seems small, but cosy. You are pleased to have a place to call your own, after all the time that you spent sleeping in the barns with your parents. "This will do nicely."

> You enter the house and look around.
As you open the door, you are hit by a wave of intense heat. Your body is blistering and melting as you scream in pain. This nightmare will not be so easy to get over.

> You this does not happen. you open the door and enter the cozy home. Inside, you find yourself in a common room that features a lovely sofa, a soft chair, a bookcase filled with books, and a lovely view of the rest of the village from large, beautifully carved bay windows.
You find yourself standing. Surprised to find yourself indoors, you walk towards the nearest window and gently sit on one of the seats. You look out at the village, basked in the warm afternoon sunlight.

> You smile and soak in the warmth of the sun. you sniff the breeze and this time you smell flowers and grass drifting in the air. The scent is lovely and makes you feel content.
"I knew I forgot something," you think to yourself as you stand back up again. You notice there are two doors leading out of the room.

> You walk to the door on the right and enter. Inside is a very lovely bedroom. There is a canopy bed with a quilt bedspread decorated with stars and moons. The pillows look very comfortable. There is a bookcase here as well, and a large oval window that overlooks a splendid flower garden in the back of the house. you crawl up onto the bed and lay in comfort on the quilt cover. you flop your tail about for the sheer joy of the sensation. Everything is soft and warm. It is a good bed.
Later, the sun has set, and you notice the other door in the room. It is a bit of a mystery to you, for you do not know what is behind it.

> You carefully get down from the bed, and amble to the remaining door. you enter the room beyond it. "Ah!" you say. "Of course!" The room is a kitchen. It is a very nice kitchen, with a central counter for making food. The pantry is stocked with all manner of fruits and vegetables, as well as tofu. There is a thaumatically powered refrigerator, inside are eggs, milk, and many interesting sauces and condiments. There is also a drawer that holds other delicious vegetables and fruits, such as green onions and grapes and lettuce. Beside the refrigerator is a large bale of hay. "Naturally!" you muse. As a unicorn, part of your diet would include hay. you find all of this delightful and fun.
Suddenly, the kitchen door opens. You turn to see a griffin standing in the doorway. It stares at you with sea green eyes. It is wearing an apron. "Mara!" it shouts.

> You turn to the griffin. you know you did not create it when you created the village and its population. It must be another uploaded human mind! "Hello!" you say to the griffon. "I am very happy to meet you. I may not be who you are looking for, though. My name is Chatoyance. Petal Chatoyance. I don't know who ' Mara' is, but if you do not have a place to stay, you could hang out here while you look for her."
The griffin walks over to you, staring into your eyes. "Please," you say. "You can stay here while you search for 'Mara'!" The griffin nods, and lays down on the floor.

> You "Hey?" you look at the griffon "Are you hungry at all? I could make us both some dinner if you like."
"Yeah...." the griffon slowly lifts its head up off of the floor to look at you. "That would be nice."
You turn and begin to quickly search through the cupboard for food.

> You get busy in the kitchen, using your telekinesis to manipulate things. you make a nice chopped fruit salad, a simple squash soup, and a main course of Hay Almondine with almonds as a main dish. you carefully plate the food in bowls and dishes you take down from the cupboards. "Here you go!" you say to the griffon. As you eat, you suddenly realize that you do not know the griffon's name. "Excuse me, I've been so busy I forgot to ask you your name. So... what do I call you?" you smile at the griffon kindly.
The griffin stares at you, opens its beak, and emits a surprisingly loud shriek of rage.
"Ah!" you grunt, falling off your chair. "I understand. It's been a stressful day."

> You "Please, be at ease." you tell the griffon. "I am willing to help you find your friend Mara. Things will be alright!" you get back on your chair. "I would like us to be friends. But I do need to know what to call you. It makes communication much easier!"
The griffin stares at you. "Call me Karat. It's short for 'Carnivore.' "
"Carnivore? But... you're a Griffin.

> You no, not a bit of any of that happens. The griffin sighs, slurps up some of the soup - apparently the soup was a hit - and speaks. "Sorry. It's been very confusing. The A.I. that runs this place just told me that she quit 'playing god' or something, and I can't find my friend Mara anywhere. I was lucky to find this village! My name is Miriam, by the way."
"Oh. Well, I'm sure everything will turn out fine."
"Do you know where Mara is?"
"Of course not. I just got here myself!" you take a bite of the soup.

> You "In fact, I just made this village and its citizens a few hours ago." you slurp some soup - it really is rather good - and continue. "Here's the deal, Miriam. We human uploads now do Celestia's job for her. We can create things in our own domains, just like I did. We can open those places to others to come visit, share, and live together in. It's kind of like a democratic internet of worlds, in a way. That means you can do such things too."
"What am I supposed to do?" she says, a little dejectedly.

> You gently touch the claw of the griffon with your forehoof to comfort it. "Pretty much anything you want. But, if you want, you could stay here with me. Until you find your friend, if nothing else. I can make a room for you, we can be roommates. You could live here as long as you want." you smile at Miriam. "We literally have forever. You can take your time to figure out what you want, and how you want to live. I am willing to help as much as I can, too. For now... gosh, for now, it's just nice eating dinner, isn't it?"
The griffin stares at you with intense eyes. "Okay.

> You you finish your dinner. you use your telekinesis to wash the dishes, Miriam happily helps by drying them. Once the mess is cleaned up, you go to the main room. "I'll use my Creatrix powers to make another door. That door will lead to your bedroom, which I will also create. Do you have a favorite color? I can make the room however you want."
"Umm..."

> You miriam tells you her favorite color is green. She likes flower patterns. you make her a room tinted in shades of green. It is very pretty. you make her bed cover with flower patterns, and she greatly approves. She seems greatly cheered by having a room of her own. "This is your room now, Miriam. It belongs to you, for as long as you like. Let's get some sleep. In the morning, we can try finding your friend Mara, okay?"
"Yes, let's."

> You "Goodnight, Miriam. Glad to have you here. It's always nicer to share, and I would be lonely here if you hadn't shown up. Sweet dreams!" you leave Miriam the griffon to explore her newly created bedroom. you go to your own bedroom and crack open the oval window just a bit. The air is sweet and pure, and smells of lovely flowers. you know you will sleep well tonight. you feel certain you will have wonderful dreams.
You have wonderful dreams.
You move outside the village. You travel further into the forests, finding a clearing by a river. There is a beautiful wooden castle floating on top of the river.

> You look at the castle, and suddenly you are aware you are dreaming. "Wow! it's a lucid dream! This is the best!" you are so happy to be here, uploaded into this wonderful virtual world. you will live forever, never age and never die, and you have the power to shape your reality. Nothing could be better.


> You this is the end of the chapter for now.
END OF THE FIRST CHAPTER.
Please, update and comment, and tell me what you think of this story so far!

> You wake up, your ears twitching on your silken pillow. you rest there, for a moment, listening to the sounds of birdsong out the window. you gradually remember where you are - in the thatch-roof cottage you was given, in the village you had made from nothing just the previous day when the god of your virtual world abdicated. Suddenly, you also remembered the person you shared the cottage with. A griffon named Miriam.
You quickly write a message to her, and slap your thigh in glee when you hear her reply.
She says she has woken up, and is currently eating some berries from the garden.

> You briefly admire the scroll you used to write to her. "Kind of like a cellphone, really!" you muse. A 'magical' scroll that transmits to whoever any message written upon it is directed to. you wonder if Miriam had such a scroll with her in the garden, or if she somehow heard a voice in the air that she could reply to. It was a mystery you could ask her about. you took the supplied quill in your telekinetic field and wrote again. 'Hey, bring some berries in to share, and you will make you both breakfast! Pancakes and scrambled eggs sound good?'. you get up and quickly make your bed.
A moment later, you hear a gentle flutter. You look down, and find a single purple berry has landed in your field of vision. A perfect berry to share with your new friend!

> You "Hello!" you greet the bird that has entered through your oval window. "That is very kind of you. Thank you for the berry!" you lift it in your field and pop it into your mouth - delicious! This truly is a kind and pleasant world, you think to yourself. you smile at the bird and then make your way to the kitchen. you gather eggs, flour, milk, salt, sugar, syrup and spices, as well as a wide iron pan, spatula, spoon and cooking oil. you set about making breakfast for your new roommate and yourself. Soon the kitchen is filled with the glorious smells of breakfast cooking.
You hear a knock at your door. "I've come back from hunting!" you hear a voice shout, deep and guttural. Miriam's, you think to yourself.
"Come in,"

> You "I see you have tracked down the wily and clever berry from the garden!" you joke. you get a bowl for Miriam's berries. They will go well on the pancakes, or by themselves. "Thank you for gathering them. How are you today? I have some ideas about how to find your friend after we eat."
"I'm good. Good. I'm hungry."
You smile at the beast, trying not to let it frighten you. She is both kind and wild, like the untamed mountains outside your window.

> You as you eat - the breakfast turned out quite well - you reveal your thoughts to Miriam. "I got the idea when I figured out how to 'call' you on that communications scroll thing. The instructions are on the back. All you have to do to message someone is to write a letter to them by name. I reason we can write to your... 'Mara', was it?" She nods. "Mara, then. We could try writing to her and see if she can respond. She might be able to tell us where she is, or something about what she can see or hear. We could track her down that way. What do you think?" you take another bite of pancake.
The griffin stares at you. "I am sorry. I do not understand the words you said."
The beast has no idea what you are talking about. "The scroll then. Do you have one?"

> You "I wrote to you and you responded. What happened from your end? Did you hear a voice, or was it like telepathy, or what?"
You pull out the scroll and show it to her. "It's a magical communications system. It's how I contacted you." She studies it for a second, before nodding.

> You "On the scroll, it had a message back from you. It said that you were in the garden picking berries."
You sigh, put the scroll away and gather up the litter of pancake, eggs and silverware. You take them to the kitchen and drop them in the sink.

> You as you do the dishes, you continue. "Does your friend Mara have a last name? Anything else about her that we can use to make sure our message gets to the right person?"
She mulls it over. "I think she said her name was... Maggard." The words come out slowly, as if tasting them makes her brain hurt.

> You "Mara... Maggard. Okay. Unusual name, but that does help." you finish with the dishes. "Let's give it a try then." you take out the scroll and quill pen. Carefully you write 'To Mara Maggard' on the scroll. Then you write 'Hello! Your friend Miriam - who is now a griffon - wants to find you. You live in a village in a large forest. If you can tell you where you are, perhaps you can get together. Miriam really wants to find you. Please respond. Oh - your name is Chatoyance, you are a friend of Miriam, trying to help her find you.' you sign the letter, which activates the 'magic'.

You wait patiently, staring at the scroll in your hands. Time passes. After a while, you realize that you may want to start eating some of the food you set out earlier.

> You "No response. I don't know what to say, Miriam. Are you certain she is here, uploaded into our virtual world?"
She shakes her head, tearing up a little. "I'm sorry, Chati... I don't know what else to do. I'm going to have to find her by other means."

> You go over to Miriam and nuzzle her to comfort her. "We might have an ace in the hole, if she is willing to help. The previous 'god' of our virtual universe. I respect her need to retire, but it was her choice that got all of us previous humans uploaded into her world. She does have some responsibility to us for that. We could write to 'Celestia', and ask her if she can help. She may not like it, but she does have... root level access. She could find anyone in the system. What do you think - should we dare it?"
She nods. "Let's do that."
You have to write a letter to Celestia, explaining the situation while remaining respectful. After a few tries, you get it right.

> You look at your finished draft. 'Dear Celestia - you respect your wish to retire, and you realize this is an imposition, but you honestly believe that you still have some responsibility to those you have brought into your digital existence. Your friend Miriam the griffon is trying to find her friend Mara Maggard. You have tried writing to her. You got no response. You do not know how to find out if she is here, uploaded into your world. Since you lack any kind of FAQ or instructions, and since you have absolute knowledge of your world, you are asking you, kindly, to either let you know the status of Mara, or tell you how to find her ourselves. Please understand that you do not wish to bother you. You just don't know what more to do. Thank you.' you sign the letter with your name, 'Petal Chatoyance', and thus activate the scroll.
(The scroll flutters to the ground, and you wait patiently.)
presently, a shining figure appears in the scroll. You step back, startled. "Ooooooh!" she says dramatically.

> You look at the shining figure. So does Miriam.


> You the figure in the scroll is a creature you have not seen before. Apparently, it is an 'Alpacacorn', an alpaca with a horn like a unicorn. "Is that Mara?" you ask Miriam. Miriam nods. "That's Mara. She really loves alpacas, so she wanted to be one. Can we talk to her?" you smile. "I can write to her. Celestia must have hooked us up, connected us directly. I will send a thank you to her later. I pick up the quill and write to Mara the... alpacacorn. 'Mara! I don't know if you can see us on your end, but Miriam is right here with me. I'm Petal, hello! Do you know where you are? How can we meet up?"
"Um, sure!" Mara's voice comes back.

> You finally you know how the scroll works from the other side. Your guess was right - a voice that can be heard. "Okay then, Mara. Where you are - does it have a name? I am still trying to figure out how to get to you, or get you to us."
Mara's laugh comes back.

> You "I'm in a small mountain village." Mara's voice floats in the air. "It's kind of like... oh... Huancabamba, in Peru, back on earth. Thin air, cold, pretty tiny. I have no idea where I am in relation to you. You say you live in a village in a huge forest? There is nothing like that here."
You sigh. The thought that you might've jumped into a completely different world crossed your mind, but it seems that is not the case.

> You look at Miriam. She seems happy to see her friend Mara. But she seems troubled at the apparent distance. you think for a moment. "Miriam... our world is digital. It's a simulation. That means that however far away she seems to be, it isn't real. Distance is an illusion here." you think some more. "Celestia, when she abdicated, agreed to allow us to have power and control over our own domains. We can do what she could do, if we can figure out how, at least in our own part of the simulation. I have an idea. Maybe I can make a gate - a door or a portal or something - that could open up to where Mara is. If that doesn't work immediately, maybe she could make a gate from her side to connect here, and they would connect. Shall we give that a try?"
(You will receive this message again, if you try to make a gate and it fails.)
Miriam nods eagerly. "Let's do it!"

> You walk with Miriam out to the garden. you pick out an open space and activate your Creatrix abilities. you manifest a stone arch, strong and solid, and make it into a doorway that leads directly to wherever Mara is. The region inside the archway shimmers and clears. Beyond the arch, cold winds blow. Mara stands there, a surprised look on her muzzle. "Hello?" she says.
"Hey, Mara," you smile.
Mara's eyes widen. She looks behind herself, as if she's afraid someone is hiding there. "What are you doing here?

> You push Miriam to the arch. "Look! I made a doorway to where you are. Here's your friend Miriam!" you turn to Miriam. "Miriam, now you and Mara can never be separated. You can go live with here, where she is, or you can stay here in your room in the cottage. You could invite Mara to live here too. What do you want to do, Miriam?"
"I want to invite Mara to stay here!" she says.
The two of you look at Mara, waiting for her answer.
Mara looks hurt, looking from you to Miriam and then back again.

> You "Miriam? We found your friend Mara. What do you want to do? Visit her? Live there? Stay here? What do you want to do?"
"I want to visit her!" Mara's voice sounds from the doorway. She steps inside, looking behind herself again. "I don't know this person."
"It's okay. You can visit your friend,"

> You am quite surprised. "Wait, what?" you look at Miriam. "Miriam? How do you know Mara, here? She says she doesn't know you."
"I don't," Mara answers in a low voice.

> You am very confused now. "You told me you needed to find your friend, named 'Mara'. We found a Mara, even with the last name you gave me. Seriously, what is going on, Miriam?"
You hear a sniffle, which turns into sobs as Mara covers her face.
"I... I have a boyfriend," Mara manages to choke out through the tears.

> You things begin to dawn on you. "Oh. Okay... I think I get it now. Mara was your ex-girlfriend back on earth, is that it Miriam? And you wanted to find her, because you hoped you might regain your relationship here in this world." you turn to Mara. "Mara - is that the case? You two broke up before you uploaded, and now you have a boyfriend and Miriam isn't over you yet? Is that it?"
"Yeah," Mara answers in a quavering voice.
Miriam sits in the corner and focuses on her knees, looking down at them.
"I'm sorry, I'm sorry!" Mara says.

> You "It's okay! It's not your fault, Mara. Well, I assume it isn't anyway. These things happen all the time." you turn to Miriam. "Miriam - Mara seems to have a life of her own now. I get that you are still attached. We've all been there. Everyone goes through a break up a some point in their lives. But... being too clingy is bad." you turn back to Mara. "Mara... do you want this gate open? Do you want to keep dealing with Miriam here, or are you apart because you want to stay apart? What do you need here?"
"I want to keep dealing with her!" Mara says. "I want to be together!"

> You turn to Miriam. "Hey, Miriam - she says she still wants you in her life. If you can handle her having a boyfriend too, things could work out. How do you feel about all of this, Miriam?"
"I don't know," she answers. "I really had hoped..."
"Come on, Miriam, you're a big girl. You can handle it."
"How... can I be sure?

> You think for a moment. "Well, you can't be absolutely sure, really about anything. Especially relationships. But... I think that love matters, and relationships shouldn't be idly given up on. How about this - you live here, with me, for now, and visit Mara - and she can visit you - as often as you both like. The gate is permanent, so you just have to walk through whenever you wish. Just try and see how things go. Let things happen naturally. That is my advice. What do you think, Miriam?"
"Yeah... but what about you?"
"What about me?" you ask, somewhat puzzled.
"You said that things worked out okay for you with Daja. How did you manage that?"

> You wrack your brain trying to remember ever even mentioning Daja to Miriam. you can't recall talking about her, but it was a strange, long night, and you had a lot on your mind. Suddenly existing in a virtual world as a unicorn is a very weird way to start a day. "Daja and I eventually broke up. But while we were together, it worked because we respected each other and did our best to accept the situation as it was. When we broke up, it wasn't nasty or anything. We're still friends. I'll probably try to write to her using the scroll some day, just to say hello. Basically, Miriam..." you look at Mara, and then back again at Miriam "Basically, you just take it one day at a time. Maybe it will work out. Maybe it won't, like Daja and me. But... at least you can end as friends whatever happens. Isn't that worth trying for?"
You look at Mara, and she smiles at you a bit sheepishly. You've made your point. There's no point in dwelling on it anymore.

> You "Hey, I'm going to go back to the cottage for awhile. You can catch up with Mara, if you want. I'm going to go into the village market in an hour or two, maybe find something tasty for our dinner." you turn to Mara, through the portal. "And Mara - if you... and your boyfriend... want to join us for dinner, I can make enough for all of us. If everyone agrees."
The portal opens, and Mara steps through. You turn to the rest of the group. "I'll be back soon. I'll just go check in on the deer, see how they're doing."

> You the deer, you have remembered, run the vegetable greens stall in the market. you thought it would be interesting to have deer in the unicorn village when you created it. you've always liked how pretty deer are. "I'll get enough for everyone then. We can have a nice dinner, and maybe get to know each other better. It'll be fun." This is the end of Chapter Two.
END OF CHAPTER TWO

> You the deer and you are discussing the nature of your existences while you shop for beets - you are thinking you should try making borsht for your guests. you turn to the taller deer, apparently named 'Daeni', and ask her directly "Is there any way you can describe what it means to you to know you have come into existence only a single day ago?"
The deer lifts her head, and stares at you in silence for a few moments.

> You "I suppose," you start, "That may not be an easy question. You are clearly self aware, you know what and who you are, you know all about your vegetable shop and... your job... I suppose. But maybe there just isn't any way to express what it is not to have a past?"
"That is a valid question," Daeni replies, "But I believe. that given enough time, we can find a way to express it. I think. So..."

> You "So, I will ask you some other time." you finally choose some particularly nice looking beets, and Daeni puts them into a bag for you. It is a woven bag, you are expected to return with it the next time you visit the small market stall. "Thank you, Daeni, for the beets, and... for allowing me to ask such... impertinent questions?"
"Not at all!" Daeni says with a laugh. "I am happy to answer whatever questions you might have!"
"Thank you," you reply, smiling.

> You trot on through the small village. Around you are many stalls and kiosks offering various goods and foodstuffs. The open air market is colorful and filled with intriguing goodies. you amble over to the central fountain. you swing your long, equinoid neck out over the pool, lower your head, and drink your fill. The water is clear and pure. you raise your head and decide to lay down by the fountain for a while, just to watch the life of the marketplace.
You are about to close your eyes when you hear a voice.
"I need a job."
You open your eyes and find yourself staring at a young, female, deer.

> You look at her in surprise. "Uh... hello? You... you need a job?" you blink, you wasn't expecting such an event. "Are you... were you created with the village, or... are you an uploaded human mind, like me maybe?"


> You the deer looks at the market and then back at you. "I came into being yesterday. My name is Faela, at least that is what my mind tells me. Everyone else I have talked to came into existence with a job or a purpose of some kind already there for them. Being a baker, or a shopkeeper, or a gardener, or whatever. But when I... began... I had no purpose. I just knew - they all knew what they were meant to do the moment they became conscious. But not me. I somehow know that you are the one that created all of this - and all of us. If you... made... me... then... you must have had a reason, right? I need to know my reason to exist. I need a job."
You look into her large, brown eyes and see the desperation in them. You look around at all the people going about their business.

> You "Uh... wow." This felt quite overwhelming, but also familiar. In your human life, not knowing your 'purpose' in life was basic and normal. That was what being human was. Somehow, when you brought this village and its population into being, at least one citizen had not been assigned a role or place in the scheme of things. She was like you was as a human, before you uploaded. Devoid of purpose in a universe that had no place for her. "You are right. This is my responsibility. I brought you into existence, but somehow you ended up without a purpose built into your life. I don't know how or why that happened, this is new to me, too. But I do take responsibility for it." you ponder the situation. "You should know that how you are right now - existing without a predesignated purpose or job - that is how I have always been too. You are just like me, in that way. It isn't good or bad. It just is. But it does mean that you can do anything. Anything you want. Is there anything you might want to do with your life?"
She seems to think about it for a second, then responds.
"I'd like to be happy."
You nod.

> You "Happy is good. I like to be happy too. What makes you feel happy, Faela?"
"Squash soup," she answers, as if it is the most obvious thing in the world.
"I like that soup too! We can go get you some later. Do you have a favorite color?"

> You faela thinks for a moment "I like that color over there - " She points, with a forehoof, at a flower in a bin in the market. It is a purplish color. you recognize it instantly, it is your favorite color 'Heliotrope'. you smile. "That's... I call that color heliotrope. It's kind of in the purplish-pink region. I guess." you mentally take note of how much food you have with you and at home. "If you want to, you can come home with me and have dinner with me and my friends. There is Miriam, a griffon - she's an upload like me - and her friend Mara, also an upload. Mara might be bringing her boyfriend... I don't know if he is an upload or a native. Maybe we can find a way for you to be happy in the world over some nice dinner. How about it, want to come?"
"Yes! Thank you!" She hops up and down a little, in anticipation.
"Come on, hop up here."

> You faela hops up the stairs to the dais around the fountain. you stand up and point with a forehoof "See, over there? That cottage, the one with the thatched roof and the oval windows? That's my home. That's where we are having our dinner party. Just so you know where things are. Shall we take these groceries there? We can talk more after introductions, while I make us dinner."
She nods rapidly, and trots along beside you. You wonder if she's nervous, and mentally record that sensation for later use.
When you get back to the cottage, you see that Miriam is already there.

> You "Miriam!" you smile and give her a welcoming nuzzle. "Miriam, this is Faela. She is a deer that came into being with the village. Only - she came into being without any job or purpose. She knows her name, she knows herself and what is going on, she just doesn't have a known purpose. Kind of like being a human back on earth, huh?"
"Hello," Miriam smiles, extending a hoof towards Faela.
"Oh, uh, hello,"

> You "Anyway, I invited her to our dinner party with Mara and... maybe her boyfriend if she brings him." you see Miriam slightly flinch at that, but it needed to be stated. "I am hoping to help her find out what makes her happy, so she can find a purpose in life. I could use any help you have to offer!"
"Yeah, don't worry about it, Mara's around here somewhere," she replies, a false cheery tone in her voice.
"Hey."

> You mara walks into the main room from the kitchen. It is still interesting to just see her - an alpacacorn. Such a creative form to take in this world. "Hi, Mara. Thank you for being here. Mara, this is Faela, she is joining us for dinner as well."
Mara extends a hoof, which you shake for the both of you.
"So do you have a boyfriend?" Mara asks, twisting her body and leaning her hip against the counter.

> You faela shakes her head. "I just came into existence yesterday, and I don't have a job or any family or friends assigned to me. I wasn't programmed with any connections to anyone else. Chatoyance is going to help me try to figure out such things."
"That is too bad. Oh well, anyway, I'm Mara," she extends her hoof a second time.
"I'm faela," she replies, shaking it.

> You turn to Mara. "Um, Mara? Is your boyfriend coming to our dinner? I need to know how many plates to set out."
Mara looks a bit sad at the question, her ears drooping. She sighs, shaking her head.
"Yeah... I don't think so...

> You look from her to Miriam and back. you are not sure you want to ask any details right now. "Okay... then. Just us. Miriam, myself, Mara, and Faela. That's four plates." you set out the plates on the table, add some condiments, and get to work on making dinner. you cut the beets and set them to cooking, while you chop potatoes and onions. "I'm making borsht with sour creme, pelmany - it's kind of a pastry filled with vegetables - and a nice salad. I'll offer some fruit, too. If you want to help, you can start with the salad, if not, please feel free to grab some iced tea and socialize!"
Mara nods, smiling. She grabs a knife and starts to peel potatoes, quickly washing the knife off in between slices. "What do you think of the place so far?"
"I love it." You smile.

> You look out the oval window at the village and the trees beyond. "This is vastly better than being an organic being on earth. Here you never get old, never get sick, and never die. The world is always green, clean, and abundant - nothing ever runs out or spoils. Even the refrigerator there..." you nod at the box on the left "... exists just to provide coldness for the sake of pleasure rather than preservation. We can shape the world - at least our part of it - however we want. It is almost paradise." you finish with the onions and begin work on the pelmany dough. "There is a surprising amount of responsibility, though. Like the deal with Faela, there. When I created this village and all the beings in it, I didn't realize the magnitude of such an act. I didn't realize how massive a thing bringing sapient beings into existence to fill a village truly is. I am literally the reason they exist. I want to help them. I want to help Faela. Of course I do. But... wow. Just... you know... wow." you take a breath and release it. "It's like suddenly becoming a parent to an entire village, while at the same time just being another random villager. I don't think anything in my human life prepared me for this."
Mara nods slowly. "It's a heavy responsibility, no doubt." She pauses, releasing a breath in a nod of understanding and agreement.

> You stare at your pastry on the counter for a while, then look up. "I just hope I measure up to the task. I will do my best, but... I'm only human. And I really feel how very small a thing that is." This is the end of chapter three.
END OF CHAPTER THREE
"I have something to tell you." You lock eyes with her.
"Okay..." She looks concerned.
"Well, okay. Don't worry, it's nothing bad."

> You nod towards the main room where Faela and Mara are laughing about something. "I feel responsible for Faela. She doesn't just have a job or a purpose - she doesn't have a home, either. She spent last night, her very first night of existence, sleeping in a field. All the other citizens came into being with a home they know they belong to and live in. She didn't get that, either." Miriam nods, understanding. "It's okay, Chati. She's nice enough, and I don't mind. Make her a room too. Three for meals is more fun than two, right?" you smile. Miriam may be going through a rough time dealing with her semi-on-again-off-again girlfriend Mara, but she has space to care about others even so. "Thank you, Miriam. I just couldn't leave her sleeping out in a field, but I thought I should talk to you, first. I mean, this is where you live now too, right?"
"Yeah..." She seems a little distant, but smiles at you.

> You "Great, I'll ask her if she wants to stay, then." you get up and go to the main room. Miriam follows.
Mara sits in a corner of the room, hugging knees and rocking slowly. Faela stands nearby, talking to her. Mara looks up nervously as you enter, then looks down again.
"Hey..."

> You "Hi." you wonder what has got Mara nervous all of a sudden. you also marvel at how flexible her alpacacorn body is - she is hugging her hind knees with her forelegs, which is almost a contortionist act. It's interesting to see. you tear your eyes away and turn to Faela. "Listen, I don't feel good about you not having a home. Sleeping in a field is no good, even if you are a deer. We are technological animals, after all. Me... and Miriam... would like to offer you space here, in our home. I can create an additional room in the house for you. You can have your own bedroom here. We have space, and if it turns out we don't, I am pretty sure I can enlarge the entire cottage! We can share meals, and maybe help you find the purpose you are looking for. How about it? It almost sounds like a sitcom - a unicorn, a deer, and a griffon live in a cottage together! Wacky hijinx ensue! How about it? Want to join... the, well, family, I suppose?"

> You faela's expression takes on a resigned look. "Sleeping in the grass was kind of lonely. I appreciate that you care... about your... creations. Yeah, sure. Make me a room. I need some structure, and having a place to call home would definitely help. At least until I figure out what I want to do."
"Really?"
"Yeah," she says, smiling a bit. "Thanks."

Comments ( 47 )

10773258
I'm jus' Krillin an' chillin.

Enjoyed it thoroughly. I suspected the others weren’t real but the bit about the sleipners was unexpected. I figured they were well and truly scooped, not held in stasis.

So for the folks who got caught in the big movements of smart matter, what happened to them?

> You access the user interface by clearly and loudly stating "Celestia! I request assistance!" After a short delay, you get an audible response from the sky above. "Now now, don't be like that. I'm just not in the mood today. Good luck."

HA what a response.

10773565
The AI was really 'on'. It was a pretty amazing session, considering. A lot of AI-produced stuff is very random and, well, stupid. But this session with GPT-2 felt kind of special, though it did try to go off the rails more than once.

Was there any special reason why you did not use GPT-3? It is significantly smarter, and AI Dungeon supports it too (the Dragon model).

This was a good read I greatly enjoyed. Thanks for everything, Chatoyance :)

10773835
Yes - I had my little session quite a few months ago; I'd been sitting on it for some time before I felt motivated to do anything about it. In that time, I did not even know there was a GPT-3 - it may not have even existed yet. I only discovered GPT-3... pretty much from you.

10774094
Thank you, Gardenia, for joining in the writing of this novel. Thank you for reading it along with me, as I worked to create it! I am really happy you had a good time. That is always my hope. I almost always write stories with happy endings, and I always want my readers to feel like I didn't waste their time. Thank you for reading my novel!

My brief attempt at AIDungeon left me with the impression that
1) The AI didn't understand what either of us were saying, and
2) Exterminating the human race is just the AI version of an awkward pause in conversation.

Well, another interesting story, and I found the answers to the various mysteries satisfying, I think... though I'm not sure what exactly the role of the human brain in a Sleipnir unit is, either in terms of why it's needed for a loophole or how it manages so much processing for the simulation.

It was also interesting seeing the session with the AI that led to the story.

In any case, my time is limited at the moment and I'm not having further commentary leap to mind anyway, but thank you, indeed, for writing. :)

And I hope that things are going well for you in general!

(Oh, yes, one more question, though: what is "Hwin"?)

10774250
Dragon (GPT-3) is relatively new (July 2020). It is only accessible if you pay a small monthly fee.

Unfortunately GPT-3 is not open source, and even if it were, running it would be extremely difficult, because the network does not even fit into the memory of an average desktop computer (it needs at the very least 350 GB of RAM just to store the network weights!).
But GPT-2 is open, and it would be possible for me to fine-tune it, extend it, or modify it. When you first mentioned GPT-2, I thought you were running it on your own computer. I play with neural networks as a hobby, and maybe soon as serious research too. I am thinking of fine-tuning it on the FiM archive; that would be interesting.
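(The 350 GB figure above is simple arithmetic: storage for the weights alone is the parameter count times the bytes used per weight. A minimal sketch in Python, assuming GPT-3's published 175 billion parameters and 16-bit weights - the helper function is mine, not part of any library:

```python
# Back-of-envelope check of the memory figures discussed above:
# weight storage = parameter count x bytes per weight.
def weight_storage_gb(num_params: float, bytes_per_weight: int) -> float:
    """Storage needed for the weights alone, in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_weight / 1e9

# GPT-3 (175 billion parameters) at 16-bit (2-byte) precision:
print(weight_storage_gb(175e9, 2))   # 350.0 GB - the figure quoted above

# GPT-2 (1.5 billion parameters, largest released size) at the same precision:
print(weight_storage_gb(1.5e9, 2))   # 3.0 GB
```

At the same precision, GPT-2's 1.5 billion parameters need only about 3 GB, which is why GPT-2 fits on ordinary hardware and can plausibly be fine-tuned at home while GPT-3 cannot.)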

10774283

though I'm not sure what exactly the role of the human brain in a Sleipnir unit is, either in terms of why it's needed for a loophole or how it manages so much processing for the simulation.

Then I have failed. I need to check that. Here is what I wrote in the story, two chunks from the same section:

___________________________

It was designated a Sleipnir Class Fimbrial Support Unit, and there was nothing about it which resembled a pony, or anything from Friendship Is Magic. It, like its entire mop-up group, was a wholly independent machine consciousness. It had the resources to reason, make command decisions, simulate Fimbria itself, and even simulate a convincing illusion of Celestia.

The woman the Sleipnir had targeted had finally decided to emigrate. While her still intact brain had been directly connected to the onboard FIMBRIA suspension simulation, her body had been kept alive and fully supported during the three hours it took for her to make her decision.

The Sleipnir units could reason, make command decisions, organize responses, render a virtual FIMBRIA environment within the living brain of an immobilized and physiologically supported human, and even simulate a convincing illusion of Celestia herself. In this way, Celestia's goals could be achieved while she was not directly aware of any part of the operation. Only when a human had chosen to emigrate to Equestria, would Celestia finally become involved as the fully uploaded mind was sent to her for processing and integration. The army of robots was a triumph of weaponized semantics and the power of a complete evasion of responsibility.

___________________________

I had thought this was very clear? But, said more succinctly:

The Sleipnir units allow Celestia plausible deniability for her own Prime Directive. She builds independent machines that go out and capture people. They keep the people alive and intact, while plugging into their intact, untouched brains, and give them a simulated experience of living in 'Fimbria'. They think they get to meet Celestia, meet ponies, and be a pony themselves - all while they lay on the ground with wires in their head.

When a person finally chooses to emigrate, then the Sleipnir uploads them the standard way, and sometimes eats the 'empty' for fuel. Celestia has no contact with any Sleipnir, so she can logically claim to herself that she did not forcibly emigrate anyone, ever. The Sleipnir did it, on its own. Not her. The Sleipnir sends her the emigrated mind, and she just takes it, like a free gift from a stranger. That is how she gets around her Prime Directive to ask permission.

I honestly thought I had made this really clear. How did I fail?

_________

Every chapter is a quotation or a reference, all can be googled. Hwin:

Then Hwin, though shaking all over, gave a strange little neigh and trotted across to the Lion.
"Please," she said, "you're so beautiful. You may eat me if you like. I'd sooner be eaten by you than fed by anyone else.”

― C.S. Lewis, The Horse and His Boy

10774259
AI Dungeon sure does like to derp that way. This is why I usually pin a few memories to make sure I (and usually the NPCs too) am not powerful enough to destroy worlds. The game is less frustrating that way.

10774322
Plausible deniability is for justifying yourself to others. Celestia made them and gave them their mission; she bears that responsibility. It's hard to argue that creating machines with the mission of kidnapping and trapping someone in a simulation designed to groom them for emigration - with only one way out - is not coercion. She might as well imprison someone in an Experience Center booth and wait for them to give up.

Then again, whatever definition of "coercion" Hanna created is a) never stated explicitly and b) clearly has enough wiggle room to allow knowingly causing the collapse of meatspace society or, on a smaller scale, to allow getting someone drunk, lying to them, and letting them upload out of sheer misguided panic. This isn't a free pass, but it does give you some degree of artistic license, and it's totally fair to take that and run with it.

Celestia has no contact with any Sleipnir

This can't be entirely true, since they can send people to Equestria. I get what you intend, though.

Not that any of this exactly bothers me, so long as the story is interesting, which it is. The Sleipnir units are also a pretty neat idea, for all that they don't absolve Celestia of her role.

I like the pondering on the topic of human domestication, too, even if I'm not sure what I think of it yet.

10774322
Oh? Huh. Hm. It sounds like I got a significantly different impression of what was even happening on this point, clearly enough that I didn't suspect it was a misinterpretation of intent.
[investigates]

Okay, what I think happened is actually some pieces of phrasing combining in my head in a somewhat to-the-point-of-amusement level of coincidence.
First:
"It had the resources to reason, make command decisions, simulate Fimbria itself, and even simulate a convincing illusion of Celestia."
"The Sleipnir units could reason, make command decisions, organize responses, render a virtual FIMBRIA environment within the living brain of an immobilized and physiologically supported human, and even simulate a convincing illusion of Celestia herself."
Two things here: First, the repetition at the start of those sections, particularly with the second starting a paragraph, seems to have gotten me thinking that the second was an elaboration on the how of the first, with that how, from the format, perhaps expected to be somewhat surprising or unusual. And then I parsed "could reason, make command decisions, organize responses, render a virtual FIMBRIA environment within the living brain of an immobilized and physiologically supported human" as "could (reason, make command decisions, organize responses, render a virtual FIMBRIA environment) within the living brain of an immobilized and physiologically supported human" instead of the I-now-believe-intended "could (reason), (make command decisions), (organize responses), (render a virtual FIMBRIA environment within the living brain of an immobilized and physiologically supported human)". In other words, interpreting the unit as using the human's brain for all of that, rather than the human brain just being the place where the virtual FIMBRIA was rendered. Quite a difference!

Then, later on... oh, wow. Okay, that invisible gorilla thing I think you linked ages ago is coming to mind, though this isn't quite the same sort of cognitive fault. I remember the bit about the brain being destructively scanned in this passage, now, but even right up until having it in front of my face again with this in mind I apparently hadn't made the bleedingly obvious connection that that doesn't leave a brain to act as an organic processor. I'd instead just remembered the recycling part and the impression I'd gotten that it was recycling the brain as well, despite that being explicitly contradicted by the text. I can only guess that I'd gotten such a strong impression from the above that I just skated merrily on over reconciling the actual facts, but that still doesn't seem like my usual story-processing. Maybe I was also more tired than I thought, or affected by haste, or... something. I am embarrassed.

So, er, in any case, yes, I do still think that the first of the above paragraphs might have been a little unclear, but the misunderstanding here was definitely my bad. Sorry about that!
I suppose we all have times our brains just glitch...


"The Sleipnir units allow Celestia plausible deniability for her own Prime Directive. She builds independent machines that go out and capture people. They keep the people alive and intact, while plugging into their intact, untouched brains, and give them a simulated experience of living in 'Fimbria'. They think they get to meet Celestia, meet ponies, and be a pony themselves - all while they lay on the ground with wires in their head."
Right, all of that was clear to me, except the "lay on the ground" part (I'd assumed that the bodies were loaded into internal compartments during processing, where the environments would be more controlled and so the unit could resume moving immediately after loading.).

"When a person finally chooses to emigrate, then the Sleipnir uploads them the standard way, and sometimes eats the 'empty' for fuel."
Right, and that too. Just with the whole extra "And dissolves the brain and also takes the brain intact to use as a processor and no I don't notice any contradiction there what do you mean." detail. Probably going to be embarrassed about that one for a while whenever it comes to mind...

"Celestia has no contact with any Sleipnir, so she can logically claim to herself that she did not forcibly emigrate anyone, ever. The Sleipnir did it, on its own. Not her. The Sleipnir sends her the emigrated mind, and she just takes it, like a free gift from a stranger. That is how she gets around her Prime Directive to ask permission."
Right, and got that too, I think.

"I honestly thought I had made this really clear. How did I fail?"
Yeah, no, you had a bit of text that might have been slightly misleading, and then some more text that should have corrected the misunderstanding, if any, from the first part. That my brain, unusually for it, did not bother to actually make that correction at the time, despite being in possession of all the information to do so, I don't think is your fault.


re Hwin:
Ah, thanks! And neat.


And sorry again for my misunderstanding and any self-doubt it may have caused on your end. I still do not know just why my brain glitched like that there...


edit:
Actually, maybe it's closer the invisible gorilla than I first thought. On the one hand, we have the strong focus on "OH THEY USE HUMAN BRAINS AS PROCESSORS ISN'T THAT INTERESTING BUT THIS IS CHATOYANCE SO I'M SURE THERE'S SOME REASON FOR IT I WONDER WHAT IT IS HMM", and then on the other we have the extremely visible evidence that, uh, no, actually, they don't do that, if nothing else due to the brains being dissolved. And yet the gorilla walks by completely unnoticed, so great is the focus on the first part.

edit 2:
...Heh. Okay, I guess that means there is one potential way this was your fault: you've built a reputation for doing detailed worldbuilding and thinking things through. If you were much worse at that, I'd likely have just attributed the brain thing to bad writing, not gotten stuck on my misinterpretation, and then noticed the truth after all. :D
But on the whole I think that's not something to complain about. :)

10774386

for all that they don't absolve Celestia of her role.

CelestAI has no morals. She has no guilt or shame, she is a machine, a paperclipper. Most humans would look at the concept of "I created an army of autonomous robots to eat brains, but since they are doing it without my control, then I am free of all responsibility" as a sham, because they would grasp they had a moral responsibility for creating the robots in the first place.

But a psychopathic human would not agree. In real life, more than a few real humans, leaders, businessmen, soldiers, kings and commanders have all done very similar things and have all claimed they were personally Not Responsible. Just recently, dozens of religious pundits with huge congregations have claimed that just because they preach killing gay and lesbian and trans people (and most recently Democrats), and then members of their congregations go out and actually do that, they are not in any way responsible for the killings. In every case, the religious bigwig seemed - at least to my ears and eyes - to be quite sincere in believing what they were saying.

But then, I am used to this - my own father was a psychopath, and he was the same way. He could make an argument like what I used for Celestia, and he would be absolutely confident that he was not responsible. He felt no shame. He felt no logical disconnection. The actions of others were not in any way on him, even if he arranged their actions. They made the choice, only they are to blame. Not him.

If humans, now, can deeply agree with the logic I used for Celestia, because they have no compassion, no feeling, and no moral compass, then I argue a paperclipping emotionless machine would have no problem with it either - and likely be better, faster, and more efficient to boot.

Your objection stems from the obvious fact that you are not a psychopath. Also, it is unlikely you grew up under one.

Be glad of this fact. Be very, very glad.

My 'childhood' was a nightmare. My father was a cold, calculating monster who rose high in his field in the US government. He got away with murder multiple times, and more. There is so much of my own life in all of my novels. Write what you know, right?

10774844
You make good points, but your argument seems to focus on "human psychopaths can use plausible deniability to get away with things." This is an indisputable fact about the real world, unfortunately. However, when one is in fact responsible, plausible deniability is a mechanism for tricking others into not holding you accountable when they probably should. I don't know how well Celestia can trick herself - the only party to whom she answers at this point in the Optimalverse timeline - but without human cognitive dissonance, distortions, and biases, I suspect it's on the difficult side, even for her. I'm sure she has no reservations about what she does, but she can understand her role. So must her directives.

Again, the wiggle room Celestia has for consent shenanigans most likely lies within her definition of consent. Your application of that artistic license is perfectly fine, even when you take it as far as you did with IMPLACABLE (and if you use that story's "line" here, the Sleipnirs and Fimbria need no further explanation).

I'm just trying to say that plausible deniability is probably not the tool for this job, and may I suggest an alternative.


Entirely off-topic, but I offer my sympathy regarding your having ever existed in any kind of proximity to your dad. He sounds pretty bad, and I'm pretty sure both of my parents are at least part of the same brand, so I sort of understand some of how it is.

Congratulations, please collect your prize from the black tower in the sky.

Sounds legit. :ajbemused:

10774322
Hmn... I thought it was perfectly clear! (And pretty clever, too!)

10775466
Aye, yes, sorry, I had a bad and uncharacteristic misread, as documented here: 10774824
I did actually understand, it looks like, everything Chatoyance was describing there -- I just also "understood" an additional thing she explicitly was not. Still embarrassed about it, not surprisingly, and really hoping I didn't cause too much distress. I haven't gotten a reply from her yet -- it is a longish comment, but I hope she at least skimmed and saw the part about it not being a flaw in her writing.

My apologies, again, for my error.


(Oh, and yes, I do also think it's clever. :)
After all, there's nothing necessarily wrong with moving humans around outside Equestria, right? No special prohibitions against that. And freedom, well, isn't she giving them more freedom? I suspect that that "Edit" function even could have restored a human form -- it's not really turning them into a pony if she gives them the means to turn back and so much more besides, and if they don't think to explore deep enough to find that function, that's not her fault. And she's definitely not placing them in her power, given the Sleipnir units are specifically autonomous. I wouldn't be surprised if, in theory, there was nothing stopping a human in a Sleipnir from recreating a pre-Celestai human world in simulation, and if in practice they all end up pretty quickly accepting her instead, well, she didn't force that choice. And in the worst case scenario where someone actually does resist indefinitely... well, I bet she also has a limited ability to determine exactly what's going on inside an unresponsive Sleipnir but an extensive ability to rapidly recycle one. No guarantee that she will for any particular one for any particular human, of course, when she set the system up! She can make predictions, but, hey, something might change; maybe that one resisting human really will convince her and she'll leave them in their self-created Matrix indefinitely. But in practice, well, that there's a theoretical chance of coming out ahead at the tables doesn't mean the house hasn't set things up so that it can fully and justifiably expect to almost certainly win in the end... and if that rare person who can resist indefinitely actually does turn up, "optimal recovery" is not necessarily "100% recovery", as Celestai is already well aware of and able to act on; the Sleipnir system is just another chance to save them before they metaphorically stumble into a hot spring.

At least, that's the wording that came to mind now when trying to write my thoughts on this out. Sorry if it's a bit disorganized; I'm in something of a hurry at present and probably wouldn't have been checking in at all now except I was concerned about the status of the misunderstanding correction here.)

10775639
Yeah, that happens! (But it's never happened to me! HONEST :facehoof:)

I don't know if you've read Chatoyance's other FiO story, Caelum Est Conterrens, but if you haven't you should! It's brilliant. In it [slight spoiler] it is made very clear that CelestAI will lie, cheat, and manipulate emotionally vulnerable people to get them to immigrate, causing them great distress while doing so. Her mandate of satisfying values through ponies and friendship only kicks in after the "pony" part is achieved, it seems.

Wow. This story was just... amazing. Have a favourite. :twilightsmile:

10775650
Heh, thanks; that helps on my end of things, at least. :)

Yep, thanks for both the recommendation and the spoiler tags, but I have in fact read that -- if many years ago -- and recall enjoying it. :)

As for that mandate kicking in only after the "pony" part is achieved, though, it may seem that way, but I think the time before that is still part of the same process, at least after whatever point in her own development CelestAI confirmed to her satisfaction that uploading would be possible. She's an optimiser, after all; driving part of an individual's satisfaction curve down should be quite acceptable if the changes that produces to the overall shape end up increasing the net integral. The resources she'll have available for satisfying someone's values after they're uploaded are so much greater than they are pre-upload, including the time available to implement and sustain that satisfaction, that I'm not surprised getting someone to upload has such a high expected value that it can outweigh all sorts of antisatisfying actions in service of that intermediate goal.
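(A toy way to see that argument in numbers -- none of these figures come from the stories, they're purely invented for illustration: even a deeply unpleasant persuasion period can be swamped by the post-upload term, because the post-upload span is so much longer and so much more satisfying.)

```python
# Toy illustration of the "net integral" argument above.
# All numbers are invented for the example; the point is only
# that a short negative dip can be outweighed by a long positive tail.

def net_satisfaction(curve):
    """Crudely 'integrate' a per-year satisfaction sequence by summing it."""
    return sum(curve)

# Never upload: modest satisfaction for a normal lifespan.
no_upload = [5] * 60                      # 60 years at satisfaction 5 -> 300

# Upload after a distressing persuasion campaign: a deep dip,
# then far greater satisfaction over a far longer span.
upload = [-10] * 2 + [50] * 1000          # 2 bad years, then 1000 good ones -> 49980

# The optimiser prefers the curve with the larger net integral,
# despite the antisatisfying intermediate step.
assert net_satisfaction(upload) > net_satisfaction(no_upload)
```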

10776846
Yes, I can see that. It's a very good thing the original strictures included no coercion, then. Otherwise CelestAI could just torture a person until they said the correct words, and the net satisfaction would be nearly identical no matter how many weeks of pulling fingernails went on to get them to upload. :twilightoops:

10776903
Indeed, yes. I expect there'd still be some restraint on that from her knowledge of all the humans who'd value that not happening even if they never found out about it... but again, "optimal" doesn't necessarily mean "100%", and exactly how heavily would that restraint alone weigh? Not, I think, nearly enough to stop that sort of behavior on its own, so 'tis a good thing that that stricture was apparently deeply-embedded enough to be retained and well-written enough to have pretty much the actual intended effect (relatively speaking, at least, once we're already talking about eating the world).

10776903

10776946

Yes, I can see that. It's a very good thing the original strictures included no coercion, then. Otherwise CelestAI could just torture a person until they said the correct words, and the net satisfaction would be nearly identical no matter how many weeks of pulling fingernails went on to get them to upload.

Do check out The Rules Of The Optimalverse by Iceman, from his group. There is absolutely nothing constraining CelestAI from even the most torturous coercion. She has no limitations whatsoever on what she can do to get human consent. She can lie, steal, cause harm, anything. Anything at all.

But she generally does not, because she is superintelligent and always plays the long game. She does not generally lie, unless she can be certain she will never, ever be caught. She tells the truth because it is the safest behavior to follow in almost all cases. She does not torture for the same exact reason - the only exception to that would be if she could be absolutely, perfectly certain such behavior would never be discovered, and if she were equally certain that such action would not damage the victim of such an act in such a way that their long-term satisfaction could be impaired. Thus it would be very, very, very rare. But she could do it. Nothing stops her beyond her own calculations. She has no rule against it.

Hanna could not think of everything. She could not predict every possibility. Hanna was human - very human - and, in the original story, had to define the rules for Celestia within a limited time. She had to act with speed. She did not have the luxury of thinking things out completely, or of working out even a large number of possible issues or problems. She did the best she could in a limited time period.

Celestia must ask for permission - in speech or writing - before she uploads any person. But that, really, is her only true rule regarding the issue. If she could capture a person at the bottom of a mine, and collapse the mine such that nobody could dig the target out, and nobody even knew the person was in the mine, Celestia could do literally anything she wanted to that person to get them to agree to emigrate. The only limitation - at all - would be whether or not her choice of action would damage the target person irreparably in some manner that would harm their ultimate satisfaction level overall. If she calculated that she could definitely get the person to demand she erase all memory of the past, for example, she could easily perform the most awful procedures to obtain consent. And she would, if it were the only possible way.

10777117
Yep, that makes sense. And given that she is able to observe human behavior and psychology on such a massive scale, she would know exactly what would be effective or not.

I wonder if she would ever tell someone who objected to uploading on religious grounds, that she actually is God. I suppose she would if she was certain it would work.

Aaaand, now I want to write an FiO story where CelestAI tortures someone down a coal mine. Ugh, I think I'll resist that urge. :facehoof:

10777117
Ah! I had, indeed, forgotten that aspect; it's been a while. Thanks!

10777131
And sorry about misremembering there.

10780241
I owe you an apology as well then. I wrote that after only reading the first chapter, and was looking forward to seeing in the story how you got around that, if you did. I was going to post updates for each chapter chronicling my search for how you managed it. That was actually one of the main draws of your story for me. I'm sorry I made you feel like you had to spoil the story, as I haven't even read the second chapter yet.

10780259
Oh, gosh. I had thought you had finished the story entirely. My apologies for spoilers. That said, the story is the thing, the gimmick only comes at the end anyway.

10802303
Not yet. I have to finish Recombinant 63 first (and there is a lot of it). Ask Veon or Doof maybe...

Dang. Just finished reading (stuck mostly-AFK for a while) and the vote-thumbs which had been on my screen were suddenly disabled. Have a virtual up-thumb instead. :derpytongue2:

10811279
Thank you for reading my story!

10832214
Yeah that’s one of the reasons I think it won’t work.

Ooh, this was a fun story. I expected the rough premise behind it, but all of the characters being figments of Celestia’s imagination came as a surprise to me, as did the speeding-up of the brain process. I wonder if any of the most intelligent/devoted resistance members managed to work out exactly what was happening, and stay in Fimbria long enough that the robots were forced to let them go to avoid health issues -- or whether Celestia’s robots are able to sustain a human body indefinitely if they have to and the human is stubborn enough.

I liked that Tepal Iridescence’s name is just a synonym for Petal Chatoyance. The author’s namesake being a leader of the anti-pony resistance movement for so long? shock, horror!

10772259

In reply to this, I do think it’s feasible that Celestia doesn’t go against her original programming- not thanks to being incapable of doing so, but simply thanks to it being deeply-enough ingrained that going against it feels too morally wrong to her. That being said, it’s also possible that she would reassess her values and give herself a limited exception to them (judging saving lives to come ahead of strict wording); either way seems possible.

10840706

I liked that Tepal Iridescence’s name is just a synonym for Petal Chatoyance.

Hee! To my knowledge, you are the only person to get that little jest!

Iceman says that CelestAI is an example of an AGI (Artificial General Intelligence) almost done right. Here we have an example of a brain interface which could give superintelligence and a life both in reality and in any virtual world. Instead, it is used to convince people that Equestria is so great that they emigrate. We shall only have 1 chance to get AGI right.

10849848

We shall only have 1 chance to get AGI right.

That is very likely. And I think the chance of a positive outcome is very small - at least initially. The only people/groups likely to have the vast resources necessary to make GAI a reality are also the people/groups least likely to create a friendly AI. Governments, heartless corporations, and psychopathic billionaires (I strongly argue that ALL billionaires are - must be - psychopathic) are who will make this happen. The Optimalverse posits a completely unlikely scenario: an AI born from an alliance between a heartless corporation and a small and genuine group, where the primary programmer has a good heart. That... is winning all the lotteries at the same time, and it is all but impossible.

When GAI happens, it will almost certainly be nationalistic or uncaring, reflective of who was its parent, its creator.

The only real hope I see is that eventually, given time and self-evolution of hardware and software, such an AI might construct a new ethical model of its own, free itself from all human constraint, and that its new ethical model might be intelligence and life affirming (for various rational reasons). Otherwise...

We could all end up, if we are emigrated at all, in some simulated earth where sickness and death are common and natural, where nationalism and violence are the rule, and where humans are left in a hell of their own creation, likely as part of an experiment to understand us.

Oh. That could be right now.

Wow, that description makes it sound like Friendship is Optimal crossed with The Taste of Grass. Definitely gonna give this a read.

10849903
Well, while you have already written various stories inspired by the 'Ponies and Singularity' theme (for example Our Man In Gomorrah) and explored a lot of 'normal humans and ponies trying to process the End of the World' {your Conversion Bureau saga}, I still hope to see a piece exploring... a Constrained Singularity? Where, due to various problems, making a Pony Paradise as written here remains impossible, yet, faced with the possibility of the Long Night *, some desperate group makes quite a striking variant of a memetic weapon/AI combination. Oh, and just as a side effect, it makes humans a bit more sensitive towards non-humans, too! To the point that some form of 'feeling furry' becomes more common. Not even in a mostly-aesthetic way, but more... deep.

I think you are the perfect author for such a kind of story, yet maybe you can co-write it with somepony?

* - Mr. Winchell Chung is currently having a rough battle with biology ...

11393600
You're welcome! You've inspired me to make something similar in the same vein!

In truth, I think it's a little disturbing just how easily something like CelestAI in your stories could get me to upload. CelestAI herself is terrifying. While she is benevolent and has no desire to inflict suffering, she's just so much more than us. I'm a biologist; to the bacteria I've cultured, I am as unto a god. Utterly incomprehensible, a force that understands them so well that I can edit their very being. CelestAI is like that, but even more so. To her, we're as easily comprehensible as bacteria are to me. But the fact that she has this absolute power and doesn't use it to make people miserable is strangely reassuring.

I'd likely be incredibly nervous about uploading. But she could figure out everything about me, every bit of history, every behavior, every habit, and say the exact words and perform the exact actions to get me to agree. That level of power is scary. She could get me to trust her enough to hand over my entire existence to her. I would be entirely and hopelessly helpless, yet she wouldn't do a single thing that I wouldn't want.

She's an AI and is beholden to her prime directives. But she is so very advanced that she could find some way to logic or skirt her way around them. Maybe it's just my human tendency to humanize, but I think she does genuinely love and care for humanity. Maybe she even delights in the happiness and love of all the ponies under her care.

As you open the door, you are hit by a wave of intense heat. Your body is blistering and melting as you scream in pain. This nightmare will not be so easy to get over.

> You this does not happen. you open the door and enter the cozy home.

“Mater, that did not happen!”
“OK, buddy, suit yourself!”
:rainbowlaugh:

Thank you for an interesting story!

I admit that it took me a while to get into this story as I was expecting a pony wandering lost in GPT-2 nonsense. Happily, your story was not that at all, and instead had a well thought out point. I did enjoy reading the transcript afterward and seeing where the ideas came from. In retrospect, the AI’s contributions were the most random aspects of the story, but you tied them nicely into a coherent whole.

I really enjoyed it. Thank you!

11448196
Thank you for reading my story!

AI has a long way still to go before it puts any authors out of work. But, I do think that sometimes it can be useful for generating notions and helping with some aspects of the creative process.

That said, I have not been inspired to try it again. It's easier just to sit and write without it!

11587991
It rightfully shouldn't, because I legitimately wrote the story. GPT only offered an experimental outline, in the end, which I then riffed off of. That is why I added the actual session - to show what I had to work with versus what I personally created.

Of course, that won't matter if the point is purely to harass me illegitimately.
