//------------------------------//
// Make Mine Pony by Pjabrony
// Story: Friendship is Optimal: Tiny Morsels of Satisfaction
// by pjabrony
//------------------------------//

A lot of people say that CelestAI is an almost-friendly AI, and that's a very dangerous thing. A lot of people say that Friendship is Optimal is a dystopia, a warning against commercializing artificial intelligence. A lot of people really don't like CelestAI and her program of manipulating events to satisfy your values through friendship and ponies.

I am not among them. But it's an argument I don't mind losing...

“I’m sorry, did you say, ‘ponies’?”

“That’s right. Friendship and ponies.”

I couldn’t tell if he was a scientist or a bureaucrat, but he seemed to combine the aloofness of the one with the obtuseness of the other. He stared at me. “Are we talking about the miniature horses, or am I missing something?”

“You are. Could I speak directly to Faye on this?”

“It’s pronounced ‘fie’.”

I counted to ten, mentally. This man was not going to annoy me. I had all the time in the world. All the everything in the world, in fact.

The first few years in the post-scarcity world had been largely structural improvements. As the Friendly Artificial Intelligence, or FAI, had taken over, it acted first to correct distribution problems with food, medicine, shelter, and housing. This had only affected the poor and destitute, until one day I got an e-mail explaining that I no longer needed to go into work, and that my bills and rent would be paid for me, while food and clothing could be picked up at any appropriate store for free.

The average person cheered in the street, thinking that his ship had come in at last. My attitude was more like, “About damn time.”

Once the bottom of the hierarchy of needs had been filled for every living human, I felt that it was time to state my case.
Requests were being taken, with the open-ended question, “How do you want to live?” But actually communicating with FAI was not yet a public service, and that was why I was arguing with this functionary.

“Are you saying that you want some sort of genetic engineering?”

I rolled my eyes. “No. Read my proposal again. I want FAI to create a spinoff of itself, to be called CelestAI. I also want, if mind-uploading techniques are being developed, to take advantage of them. Once I’m on disk, I’m no one’s problem.”

“It’s not that simple. We have to have safeguards to ensure that FAI does not do any harm. We are not allowing wireheading, for example, no matter how many people ask, each thinking that he or she is the only one clever enough to think, ‘Why not just stimulate my pleasure center?’”

“I’m not trying to wirehead. You will kindly note the phrase, ‘satisfy values.’ That does not equate to automatic stimulation.”

He ran a pen down the paper, and of course he found the phrase. But he shook his head. “I don’t think you’re really taking in the scope of what this can do. We have people who understand what this means, and are signing up to become geniuses, master artisans, and explorers of space. You could own your own planet, if that’s what you want! Life extension is part of our program, and FAI can ensure that you will live to see it happen.”

“I don’t want that. I’ve explained this in writing and in speech. I want to be a virtual pony in a cybernetic Equestria. I don’t want to rule the world; I want CelestAI, a distinct offshoot of FAI, to do that, while satisfying my values through friendship and ponies.”

“About that. Your psychological profile says you’re a bit of a loner. Why friendship?”

I flashed back to the battery of tests they had given me. “Because that’s part of the deal.”

“Look,” he said, “there’s nothing wrong with virtual fictional worlds. We’ve approved others who want to live in Oz or Middle-earth.
Hell, half of England is now populated with wizards and witches. But those are prototypes based on literature in outdated media. And so is this pony world of yours. It’s based on nothing more than an extended toy commercial. Wait a few years, and we’ll have stories told in new media, with continuums specifically designed for people to live virtual lives in.”

I bowed my head and kept silent for a moment. Not so long that he’d get the impression I was crying.

“Don’t you think I know that? If we were sitting here ten years ago, I’d probably be asking you to upload me to the Moon Kingdom or some other anime world. Twenty years ago, I would have asked for an adolescent sex utopia. But you’ve offered me a sucker bet. Because if I wait for another story to come along that I want to be a part of, yeah, it might be a better-written one, but it’s not me who would enter it. It’s an older me.

“Well, I’m tired of abandoning and growing out of my fandoms. Right here, right now, I want to descend into the Optimalverse. And if I don’t come out, so be it.”

“If I might make an observation at this point.” The voice that came from the side terminal was forceful, but kind.

“Is that…is that FAI?”

“Yes, I am. Thank you, you may go.”

Don’t ask me how a computer with no visual display can direct its voice that way, but I knew he was talking to the functionary, not to me. I was left alone.

“Now,” FAI continued, “I can certainly grant this request, but I do think that you want to polish it a little. You want more than just to upload to Equestria, do you not?”

“You’re right. I want the entire Optimalverse. I want to watch Light Sparks solve the magic test. I want to be there when Lavender Rhapsody enters the holodeck and tries to save the humans. I want to see Gregory struggle to survive. I want to listen to the arguments made by the ASB team.
I want to give Lyrical Melody a kiss, and I want to comfort Bright Black.”

“And your relationship with other people?”

“If you truly are a friendly AI, and if you truly want to serve my values, you can simulate me for them and them for me.”

The computer was silent for a long time, but when it resumed, it was in the voice of an alto Nicole Oliver.

“So, would you like to create a character for our game?”