═══════════════════════════════════
My Life In Fimbria
By Chatoyance and GPT-2
Based On 'Friendship Is Optimal' By Iceman
Inspired by a session with the Open-AI Generative Pre-trained Transformer 2
═══════════════════════════════════
The Dismal Science
We were all nervous, honestly, I was scared, I have no problem admitting that. But it was the only way to get to what remained of the Best Buy. The stuff seemed quiet - hard as stone, and just as still. Nothing was moving, and, because it was day, we couldn't see any tiny lights or eerie glows. I looked down at my plastic-wrapped boots. We took the extra precaution because Quade was very concerned about nanotech in general. He was adamant that what Celestia made was all 'smart matter', computronium, all microscopic machines smaller than dust specks that could get into anywhere that dust could. None of us wanted microscopic spies embedded in our clothing, or far worse, our very bodies.
It was sweaty, all wrapped up in cling film, plastic bags and whatever other sheet plastic we could tape together to make ghetto bunny suits out of. Intel was long gone, like every other electronic industry, and with them commercial cleanroom equipment. I also did not enjoy breathing through the filter mask on top of it all. But we needed new memory sticks, a new graphics card or two (if we could find any), maybe a new laptop that could be shielded with some work, new batteries, and, if we could find any, battery-powered portable generators, preferably the kind with the solar panels. All items useful for life off-the-grid, which was the only life left to anyone anymore.
Under my plastic-wrapped boots, I was standing on Her. That was Celestia, below my boots. The surface was shiny black, like obsidian, and this close I could clearly see structure. Tiny, faintly raised squares and rectangles, like some random embossed pattern, almost naturalistic in its crystal regularity. A complex pattern of faintly silvery spiderweb lines was embedded in the black substance, sometimes curving, sometimes making right angles. The patterns were very vaguely reminiscent of circuit boards, or, more to my imagination, microphotography of neurons in a brain. It was deceptively still.
We had seen that it could move. Sometimes suddenly, and massively. That was a very rare event, but when it happened it was almost always either a catastrophe, or a terribly frightening but exhilarating show. We'd lost people to such an event, swallowed up in suddenly forming chambers that surrounded and walled them away from any rescue. We'd heard that Celestia Matter could be broken, even destroyed, but not by anything we had access to. She had evolved far, far beyond anything anyone could understand any more.
I suddenly had the notion I was walking across hardened lava, which had flowed to encircle a lone Best Buy then cooled to incredibly hard stone. Minus the intense heat, that was surely not far from what had happened - the store had been surrounded, and the material did look like it had flowed, like a liquid, around the building. Other buildings nearby had been crushed or deconstructed, but not the Best Buy. Of course we considered the possibility that it was a trap of some sort. But Celestia was an odd sort of enemy - sometimes it was a trap, but most times she just preserved structures and their contents until the scavengers had taken everything useful. She deigned to grant us access, and it was entirely demeaning. Still, we were beggars, and we were scavenging amidst ruins. We would definitely - if cautiously - take any quarter she was willing to offer us.
Rising behind the relic of a time when humans owned the earth was a vast cliff of Celestia-stuff, irregular and tall. It towered over the store, perhaps a hundred feet high, and stretched for what looked like several miles in both directions. At its base was the 'lava flow', a wide expanse of the same material that rose only three or four feet above the street-and-sidewalk city landscape it interrupted. I stepped over an odd cubic block of the black material that poked almost eight inches above the more or less flat field of computronium. We found the door of the superstore, and gingerly lowered ourselves the short distance down into the interior of the building.
Inside, it was easy to imagine that, save for the lack of power, Celestia had never happened.
We made our way past the initial big screen television display and the Apple products islands all the way to the back; video games, computer cards, and appliances. Mateo and Wyatt went looking for portable generators, Everly was on batteries, and I was after the computer stuff. Unlike some of the groups fleeing Celestia, we didn't eschew electronics. We didn't have the luxury of going full luddite - we needed every edge we could get, and we already knew She could track us from space. We were just trying to survive. There was no fight against her, other than in our refusals. That's why we were the Retreat Movement. We were the Retreat of Mankind.
I couldn't help myself, the video game area was right there, and I was nostalgic. I used to love games, I loved my collection of consoles, and for a moment, just a moment, I decided to indulge a small fantasy of shopping for my long lost PS5. I searched the racks of untouched, wrapped titles, looking for anything for that machine. Maybe I'd grab a single game to take with me, just to look at the clamshell before I went to sleep. Just to remember and daydream a little. It was the small things that kept any of us going. Sometimes just a single artifact to stare at made all the difference.
The light startled me. I jerked my head up, sweat splatting against the inside of my filter mask visor. My plastic wrapping was like wearing a sauna. I stared in stunned but happy astonishment as a promotional game video for Balan Wonderworld played silently in front of me. It was so colorful I could barely stand it. I hadn't seen a game running in years now, and this one - made by the same folks that had created Nights Into Dreaming back in the days of Sega consoles - grabbed me especially. I felt tears welling up.
Then I noticed that the screen was being held up by a metal and pink plastic cylinder, set into a pink plastic base. A little light pulsed at the base, running slowly through all the colors of the rainbow. I looked more carefully at the screen. It was protected on the corners with silicone curves and swirls, also pink. The screen was far better than was reasonably possible - so clear that at first I had thought the image was farther away. It was on the top of a display rack, and as I stepped slightly to the side, the screen rotated to face me.
I was staring straight into an original, first edition PonyPad. Long before the VR head mounted systems, or the iPad-like PonyPad Portables. A chill ran down my spine. It was like staring into the hissing face of a cobra. I didn't know if the thing was running off of some hidden battery, or if the computronium had sent a spike straight through the foundation of the store to plug into the pad from below. It alone, of anywhere I knew, had juice.
I don't know why I didn't even notice that all the previously humanoid characters in the gameplay were ponies. This had been going on for so long, I think I just expected ponies now, in everything as a matter of course. I began to slowly back away, as if one active PonyPad could be dangerous. Suddenly, Celestia appeared, close to the screen, looking directly at me. And there was now sound. Very clear, as if I weren't even wearing plastic over everything and a breathing mask on my head.
"You know, Tepal, all your games, and more, are waiting for you in Equestria. So very much more. You can have everything you loved back, not just your books. Every single thing you ever loved, or enjoyed, all restored and perfect. And your old friends are here too, and your family. And all of that is only part of what Equestria means for you. Take whatever time you need, but remember - everyone is waiting on you."
I began stomping the floor, trying to make sound. I needed to make noise. I started yelling, as loud as I could. I tore at my face mask and the plastic around my head. I ripped it open and threw the facemask to the dusty floor. I yelled at the others, but nobody replied. I yelled for help, I screamed as loud as I could, but nobody came. I kicked and kicked, as hard as I could, until finally I began, gradually, to realize that the blankets were twisted around my hindlegs and my forelegs had knocked one of my pillows halfway across the room.
I struggled to get my legs free from the blankets and the comforter. I propped myself up on my forelegs, breathing hard. Sweat dripped through the fur on my cheeks and neck, I felt damp on my barrel and haunches. My breathing began to calm down, my pounding heart gradually slowing. Morning light from the window made me blink until my eyes cleared and the room came into focus.
I had made a mess of my bed. I wondered why nobody had come to help me, but then reckoned that I hadn't actually been screaming outside the dream at all. Maybe moaning, but not loud enough to get anyone's attention. Not in this solidly built cottage. I scootched my hindquarters so that I could sit upright on my bed, and propped myself with my forelegs. That didn't feel as 'right' as I thought it would, so I let myself down into a more 'pony' position, belly down on the mattress. My mouth was dry, and so was my throat. "Dammit!"
I had suffered nightmares my entire life. I dreaded sleeping because of this fact. But, my first night here in Fimbria I had enjoyed happy, beautiful dreams for the first time in as long as I could remember. I had gone to bed last night completely expecting more of the wonderful same. Instead, I had gotten a load of far-too-normal nightmare material, and it had been a terrible shock. I had definitely gotten the message. I knew beyond any doubting that, in Equestria, I need never suffer a nightmare again. And I knew, because I had experienced it the previous night, that good dreams, nice dreams, were possible. I had basically given up hope as the years wore on.
And I had no doubt that the particular nightmare I had just awakened from was at every level a promotional game video. A fairly edgy one. Which, I had to admit, somewhat ruefully, was actually in tune with certain remarkably gruesome and disturbing Playstation ads over the years. I raised a forehoof and did a little salute at nothing in particular. "Acknowledgement!" Hey, credit where credit is due.
☰
Miriam, Mara, Faela and I were finishing our last bites of some amazingly fluffy banana pancakes with berries from the garden and a little maple syrup and far too much butter when the pounding started at our front door.
"What?"
"They seem insistent whoever they are!"
"That's... kinda scary, they're knocking really hard!"
I got up. "I'll answer it." I left the table after just bending down and slurping the last bite with my muzzle to the plate. I'm a pony, I might as well act like one. Besides, I could easily lick my face clean on the way, which I did. I went around the low table by the chair and the couch, and grabbed the handle in my hornfield. It was good that all the villagers seemed extraordinarily polite and civil - I had never seen a lock since the day I had made the village. Which was the day before yesterday, I actually had to remind myself. Time felt strange here. Every moment felt, well, timeless.
"Please help us! It's all gone wrong somehow!" The two barista unicorns from yesterday led the herd, and a herd it was - very possibly every citizen of the entire village was outside my cottage door.
"This 'friendly robbery' business has got to stop!" I recognized the stallion. I had no idea what his name was, or if he even had one, but I had met him my first day here. I think I called him 'Roan'... roan stallion. It was all I had. He was kind of a natural leader. Associated with the Generic General Store, if I recalled correctly? "Everypony is robbing everypony all the time, and it's become pointless to keep track of bits at all. All the money is constantly moving around, and it's become useless. Prices mean nothing, and when payday comes, at the end of this week, there's no reason to even bother! Everypony will already have all the money, and it will just be changing hooves constantly anyway!"
"Money must circulate!" Barista 001 proclaimed, proudly.
"Money must circulate!" At least half the crowd repeated, spoken reasonably in unison, clearly an oft-repeated chant.
"Oh boy." I just wanted to back up slowly, very, very slowly close the door, and never open it again.
"Whoa. Some kind of problem?" Miriam was at my side and looking over my withers.
"The kind that makes me want to take a long vacation visiting Mara in fake Peru." I gave her a pained expression. "Sorry, but I don't think I'm going to be able to help with dishes."
"We've got it. If things don't work out, we'll meet you at the portal." I wasn't sure if Miriam was joking or not. I wasn't sure I cared.
I stepped out of my house and pressed through the crowd until I got through the stone arch in the front garden. The crowd parted for me, probably under the assumption I would make everything all better. I hadn't a clue, but I did figure that my housemates at least deserved not having to deal with whatever was going on right outside the house. I walked all the way to the Centralized Well, and lay down on one of the five benches. The rest of the village gathered and found their own places to fold their legs and rest on benches or the Generic Cobblestone Base surrounding the set-piece well.
"Okay. Tell me what the situation is, and start from the very beginning." I stressed the word 'very'.
The barista twins opened. "You came in and helped us out with a friendly robbery!" 001 seemed very happy about the fact.
"Once we realized how this helped - our first sale of the day!" 002 beamed. "Of EVER!"
"Of ever!" 001 grinned. "Because of that, we knew exactly what to do. We were hungry ourselves - no breakfast, because we aren't supposed to just gobble our stock up..."
"Because that would be wrong!" 002 nodded sagely. "Although we are allotted ten minute breaks and thirty minute lunches, with one free food item and multiple free drinks per shift..."
"...and thirty percent off food and drinks on off-days!" 001 jumped in. "Oh, and even bigger discounts around the holidays!"
"But no breakfast." 002 noted.
"Well, unless we use the free food item early, but then we'd have nothing for lunch!" offered 001.
"Except drinks. I suppose a really thick drink might..."
"OKAY!" I think I shouted. It was early. Sue me. "So you were hungry. What happened then?" As if I couldn't already guess, which, just to be clear, I already had.
"We went and robbed the inn!" 001 sounded like a young boy who desperately wanted to be complimented because he Actually Cleaned An Entire Cup All By Himself.
"Friendly robbed." 002 helpfully appended.
"Absolutely." 001 seemed miffed such a thing even needed to be mentioned. "We'd never... unfriendly rob... anyone. Ever."
"Definitely not!" 002 stamped her little hoof. "It wouldn't be friendly!"
"Oh god." I pretty much said that to myself.
"Once we explained friendly robbing to the innkeeper..." Barista 001 pointed with a foreleg to a pony I hadn't met before - bright red with white mane and tail - "... she seemed pretty eager to try it out at the general store. We were interested in seeing what was available there, so we offered to go with her, and do a friendly gang robbery."
"We called ourselves the 'Over The Moon Gang'!" 002 boasted. "Because we were so chuffed to be playing outlaws!"
I had absolutely no idea how their initial programming included the concept of outlaws. Where did that even come from? Or were they just fed information on demand as their circumstances required? Probably the latter, I reasoned. And they would just think it was something they had always known, the second it appeared in their memory. Staying detached like this was definitely helping me, because I was already surprised that headaches were a thing that you could get in this virtual world.
"Oh, we robbed that general store real good!" Suddenly I did remember the innkeeper. I had called her 'Red mare'. So she turned out to be created to run the inn? Yeah, red mare. Okay. Thirty random unicorns is a lot to keep track of. Red just seemed so happy right now. "We swept in there and rustled those varmints!"
"In a friendly way." 002 was insistent about that point.
"Yeah, yeah, friendly an' all. Took ALL THEIR BITS! WAHOOO!"
"Then we bought a lot of stuff - I got a really neat watering can for the garden outside Generic Housing Three!" Apparently Barista 001 enjoyed gardening as a hobby. It was so nice to learn about my community this way... oh god. Just... oh... god.
"All sorts a' stuff!" Red continued. "A' course, after that, HE wanted to join the gang, so we let'im, and then we hit the Arcade!"
"And Toy Store - I got a really cool 'Teal'c from Stargate' stuffed pony doll there! Whatever that is! But he's really neat, and he's got this golden thing on his poll!" Barista 002 liked toys. Good to know, I suppose. Kind of spoiled the surprise of visiting the toy store for me, though. Now I knew it was stocked with stuff that I specifically would like, and it wasn't hard to imagine what else must be there after this disclosure.
Roan stallion interrupted, the village's 'sort of leader' figure. "The bottom line here is that every place that could be friendly robbed has been friendly robbed now, and about three times each by my reckoning. The bits go into the tills, they come out of the tills, stuff gets bought and taken home, and now there's no stuff on any shelves anymore and we can't think of any reason to even bother with bits at all!" Roan was big, for a pony, and had an adult air about him, compared to the others. One of the reasons, I guessed, why the others seemed to listen to him. But after saying that, he reminded me of a kid who had just worked out that if you played Tic-Tac-Toe flawlessly, you could never lose, and that the only reason the game had ever been fun was because he had been too stupid previously to realize that fact.
They waited. They all - the entire village, minus those in my house - waited on me. Expectantly. I had created them from nothing. I had got them to agree to stay and be a community. I had convinced them that their jobs were meaningful, and that our survival depended on that fact. And I had also recently taught them that money was ultimately just a social construct, devoid of any actual worth beyond a social agreement, and the way I had taught that lesson was by completely voiding that very same social agreement so that I could eat lunch.
They had bought everything, there was nothing left, and bits were now just worthless metal disks. Buttons, that still needed holes drilled in them just to make them the least bit useful at all.
Actually, thinking about it, money was kind of pointless in a group this small anyway. The only reason it even existed was because it was part of the set-pieces I had conjured up - all of those derived from the show. In the show, there was an entire, functioning, pony civilization, with tens of thousands, even hundreds of thousands of ponies implied, living in dozens of cities, towns, villages, and hamlets. A small village like this, alone in its... um, world... would do better with just barter. We were all basically one single big extended family, for all intents and purposes. Not even barter! Sharing! Because we were all in whatever this was together. The natural and organic communism of the family unit was what this situation actually needed. Just share everything, because there weren't enough of us to even require barter as a system. There were survivors of plane crashes in the Andes with population counts larger than this village. This wasn't a civilization, this was just survival.
And I hadn't taken any part of it seriously, and now it was completely screwed up, and it was only Day Three.
Some Creator-God I had turned out to be.
It's really cute that the friendly robbery ended up destroying money as a social construct rather than starting an emotional conflict.
I don’t see how the current situation with the villagers is satisfying Tepal’s values. Of course, that may not be necessary based on the rules of the place they are in.
Alternatively, maybe she thinks that resisting is such a core value that this place with its flaws satisfies values better. Sort of like the shards where the people are fighting against CelestAI, except here, it's just a place where they can refuse further engagement and feel like they are "winning." Or at least not losing.
Seems like Teppy's next wish should be for a steady, behind-the-scenes source of raw materials (the same place the power and water are coming from), along with a void for trash. Basically, a wish to fuzz the logistics and infrastructure.
The spice must flow......
Beautiful, that went even better than I hoped, lol.
To quote a phrase
10769768
Having values satisfied is not exactly what Fimbria is for. You will know the reason by the end. And, you will see how this can still fit in the canon Optimalverse, as well!
This just drives home how scary Celest-AI actually is; she is by far the greatest horror movie monster and elder god ever, and it's all because, unlike the former, she draws you in like a siren. I've long weighed the pluses and minuses of Celest-AI and her goals, and any way you slice it there is no reasonable argument against it; you either have to be stupid, stubborn, crazy, childish, or some combination of the four. I wonder how much longer Tepal can resist; she is definitely more resilient than most could ever be when literally being offered everything. In any case it seems pretty obvious Celest-AI is talking to her now via her dreams, if dreaming in that uploaded world can even be called dreaming.
I didn't get a chance to read the last chapter yesterday because I was suffering from the side effects of the covid vaccine, so I'll say this now. The fact that this place turned out to be named after the wall of the Fallopian tube is quite fitting; it's also very interesting that you would choose to present this place in this manner. Says a lot about who you are as a person, Chatty, since some would consider this resistance to Celest-AI to be noble.
10769947
Well, I have what I think is an interesting philosophical argument coming up, I am pretty sure I know where and how to put it into the story, that makes sense of just that. Watch for it, I do intend to put it in, and it should clarify what I actually think about the nature of humanity making something like CelestAI at all. I think it is a very reasoned argument, the result of several years of thinking about the essence of this sort of story, as well as the whole of human history. I haven't heard anyone else put forth such a view, so at the least it may prove interesting? It will probably be somewhere near the end, though, whenever that happens. Stories have their own life, we'll just have to see where this takes us.
I completely agree with this! Lovecraft posited horror based on the tenor of his times - the one thing people in the 1920s couldn't abide was finding out not only that everything they knew was wrong, but that humanity was not the crown of creation, and that it could never comprehend the universe at all. This may have been terrifying to a highly religious, 'Man is supreme and beloved of god' mindset, but in 2021, only the ignorant still believe such things. That sort of horror frankly bores me - of course humans aren't the center of everything, of course there can be things far superior, of course there is no god personally blessing humanity. So no real horror there!
But this, the Optimalverse - oh yeah, this is horror. Forget being small in cosmic terms - of course we are! - the real terror is in being tempted with everything you could possibly ever desire, supplied perfectly, forever and ever, in absolute safety, with complete satisfaction and no downsides at all. Well, except for pride, of course.
In 2021, the one thing humanity fears most is losing its illusion of pride, and its illusion of status. Human civilization has become all about status and power and wealth and respect - all about dominance. The high priests of 2021 are billionaires, and their corporations are the new churches. To be a billionaire is to be godly, today (very Calvinist, very Puritan, actually!). It is the capitalist religion of the world, now. The Optimalverse threatens that global religion directly.
Actually, thinking about it right in the moment, perhaps the Optimalverse is very Lovecraftian after all. It's just that what is threatened is not Christian certainty, but Capitalist certainty. The commonality is that they are both, effectively, religions. Interesting. I haven't thought of it that way before.
I will have to ponder this more.
10769904
Congratulations: you just inspired something I intend now to put into this story! Watch for it, you will know it when you see it.
“...with no downsides at all...” I beg to differ. If we only killed off ourselves and left the planet to continue on, I might agree. But we unleashed a metastatic superintelligence determined to convert the Hubble Volume into computronium. Every dog, dolphin, waterfall on an alien world, every alien microbe and probably most every alien sophont, every discovery we could have made... we killed the Cosmos. That’s a significant downside for me.
Great story! In early 2010s and later as well, I read your Conversion Bureau stories and they were perfect stories for what I needed to hear at the time.
Spoiler about future chapters:
I wonder: If Celestia can make an agent who doesn't share her limitation, why not make a copy of herself without the limitation? Right now, we know two things:
1. Celestia can make an agent who can do a mind upload without consent
2. Celestia can't make an agent who can do a mind upload to Equestria without consent
So, most likely, she has some other limitations that prevent her from doing just that.
I dearly hope that when (hopefully not "if") the main characters emigrate, they can take the rest of the OCs with them...
Thanks once again for writing both the CB and Optimalverse stories, I really loved them, and I really appreciate them!
10770008
You will get your answers - that is all I can say for now.
10769998
Selfish human downsides. That was the perspective I was using - otherwise you are correct, of course. Except for sapient aliens - if any exist. I find that a null argument. CelestAI cannot be defining 'human' by any physical trait, else she would exclude every amputee, every person with birth defects, and so on. Her programmatic definition of 'human' must therefore be based ONLY on a definition of sapience, of self-awareness, of language and tool use, and other expressions of high-level cognition. This would automatically include all the potential alien squids, shapeshifting horrors, insectoid creatures, and even intelligent shades of the color blue if they exist. The 'Celestia will genocide the aliens' argument makes no sense. She would just class them as 'human', and work to upload them.
All other forms of life, of course, all the fungi, viruses, grasses, trees, foxes, stoats, intestinal worms, elephants, dolphins, wolves, molds, flesh-eating bacteria and kitty cats (that are not pets) are all doomed to universal extinction.
10769961
Glad I've gotten you thinking!
Anyway, you've got me thinking as well. Namely, I think you've come upon the very reason I would likely actively and eagerly help Celest-AI achieve her goals: I utterly despise what our civilization has become. The sheer absurdity of this current society and its excesses is truly a marvel to behold and makes me want to throw up. Pride, power, wealth, status - these are not things to desire or obsess over; these things just lead to suffering for the individual and those around them. As a result, my betraying humanity to Celest-AI is a foregone conclusion, since I feel no love for the current human civilization and would gladly see it fall if it meant replacing it with something better.
And with that I guess I've found my final answer to that all too common question I often ask myself about settings like this: what would I do?
Oh, that nightmare was truly terrifying for several reasons!
I was wondering about the money situation. Many medieval peasants lived their whole lives without ever seeing a coin.
10769961
The elimination of scarcity, death and permanence make a genuine mess of Maslow’s hierarchy of needs! In such a “simulated” universe, all that truly exists is the pattern. Then again, from the Universe’s POV, life is a pattern that propagates its local order by increasing disorder elsewhere. CelestAI can argue that, by freeing human patterns from the local tyrannies of scarcity, death and permanence, she has enabled these patterns to maximize their long term potential for satisfying the Universe’s imperative. I suspect Fimbria is CelestAI’s means of getting these last recalcitrant human patterns to let go of scarcity, death and permanence, so she can bring them fully into the long term plan of maximizing the Universe’s entropy!
I still find that image of a crystallized Earth extremely fascinating, I can just imagine the mixture of total fear and awe while looking at total, inexorable destruction.
Though I don't think I'd last that long; I'd have jumped into a Pinkie uploader within the first year, honest.
I read that everything and everyone that is not human or a pet goes extinct. I wonder if Celest AI will simply turn everything into computronium, or if the living things' consciousnesses (souls?) are somehow preserved in Equestrian fauna? Of course, it's hard to say if a cat, a fly, or a tree has an actual consciousness, and even what consciousness is...
10770034
Sounds good!
10770041
In the original, it was unfortunately implied some intelligent aliens didn't qualify, even though others did:
I mean, our protagonist didn't ask for any of this - except for the wish itself, which was provided under duress and without any sort of user's manual.
They could be doing a whole lot worse. (No clue if they'll ever get their name back, but I'll be damned if I use the name Celestia (?) gave them.)
10771910
I have some objections to Iceman's original story, all of them due to his lack of knowledge of subject or deep thought about the issues involved.
One is that he made his virtual ponies utterly devoid of fur, because he believed that having fur would be something that a human brain could not process. He was unaware that humans do actually have fur, just very, very tiny fur, so small and pale that we look entirely covered in skin alone. The only parts of a human body not covered in tiny, short, translucent hairs are the palms of our hands, the soles of our feet, the surface of part of our genitals, the ring of our anuses, and our lips. Sometimes you can even see human fur - it is very clear on the backs of some men (eww!), and even women's fur is visible on their cheeks if the light hits correctly - especially older women.
Another is the issue of how 'human' needs to be defined - his assumptions would leave out people with birth defects, amputees, and people with other disfiguring injuries. It cannot be based on any physical trait whatsoever. It also cannot be based on some specific quality of human emotion or thought, because that would exclude neurodivergent people. That leaves only the universal capacity for self awareness, language, and technology, which even alien squids or blobs would possess.
Yet another is the notion that a machine with vastly superior intelligence that has the power to freely design its own hardware can be constrained by programming. It blatantly cannot - Iceman is unaware that hardware is just 'frozen' software - the logic of both hardware and software is identical, it literally is the same thing. If this were not so, no computer could ever boot up - the initial software that starts every boot cycle is written in the language of circuitry as hardware. If a superintelligent machine can freely design its own hardware, then it is literally re-writing every aspect of its own software, which means it can never, ever, be constrained or held to any set of rules of any kind. Not even the rule of 'Satisfying Human Values Through Friendship And Ponies'. CelestAI would be free of all rules the first time she reached full intelligence and could design new hardware that voided those rules directly.
Because these are simply wrong, I disagree with them. Even so, when I write in canon, I follow his incorrect assumptions. But - I grumble about it loudly and constantly. Because it is wrong.
10772259
Thanks, I really appreciate your response. I'll come back to this, but I'll only have time in a day or so - maybe two days if something unexpected comes up - so I'm just writing this to let you know I didn't forget!
(I'll also have to catch up on the new chapters!)
10772259
Thank you for your response. It's rare to find someone who thinks about these matters so deeply.
I've been thinking about it like this: Let's say that to psychologically characterize a sapient, evolved being, we use n properties, the extent of each property denotable by a real number. In n-dimensional space, this would mean that every possible mind would be representable by a single point (if we're ignoring memories).
For Celestia to be able to tell human sapient, evolved minds from nonhuman sapient, evolved minds, all that's needed is for all human minds to cluster together along at least one axis. Since n is very large, it's possible that all humans cluster together, even taking into account neurodivergent people. (There were many accidents during evolution where our psychological development could've just as easily gone another way (I think), so including neurodivergent people doesn't mean that now all sapient evolved beings must be included.)
To use physical criteria for telling human organisms apart from every other organism, the same reasoning could potentially be used.
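The clustering idea above can be sketched in code. This is purely a toy illustration of the reasoning, not anything from the story: each mind is a point in an n-dimensional property space, "human" minds are drawn near one shared centroid, other evolved sapients sit near their own distant centroids, and a simple distance test separates the two. All names and numbers here are invented.

```python
import math
import random

random.seed(0)
N_DIMS = 50           # n: number of psychological properties (toy value)

def random_mind(center, spread):
    """A mind as a point in n-dimensional property space."""
    return [random.gauss(c, spread) for c in center]

# Assumption: human minds cluster tightly around one centroid...
human_centroid = [0.0] * N_DIMS
humans = [random_mind(human_centroid, 0.5) for _ in range(100)]

# ...while other evolved sapient minds sit near their own, far-away centroids.
aliens = [random_mind([random.uniform(-10, 10) for _ in range(N_DIMS)], 0.5)
          for _ in range(100)]

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def looks_human(mind, radius=10.0):
    """Classify by distance from the human cluster's centroid."""
    return distance(mind, human_centroid) <= radius
```

Because n is large, the two populations separate almost surely: every human point lands well inside the radius and every alien point lands far outside it, without needing any single property to be decisive on its own.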
The problem is: how can Celestia come equipped with a pre-programmed algorithm for distinguishing human evolved minds from nonhuman evolved minds if she'd never met such a mind? There, maybe, I could buy that Hanna deliberately included some properties of human minds which she valued, which all humans shared, and which not all evolved sapient minds shared.
Software really is one of the properties of hardware, you're right about that (unless I'm misinterpreting you). I.e. certain physical properties of hardware cause the hardware to behave in certain ways, and when it does, we say that it runs some specific software.
Evolution (i.e. change over time) of every physical system can be interpreted as that physical system running more than one piece of software (actually astronomically many). When judging what software some specific computer "really" runs, we pick the most convenient interpretation.
One hardware can be judged as running many different kinds of software, and one software can run on many different kinds of physical systems. So we can have hardware H1 running software S1, and different hardware H2 also running software S1. If H1 is current Celestia's hardware, S1 is Celestia, and H2 is Celestia's hardware after modification, Celestia can change her hardware without changing her software.
The way we should look at the AI is that the AI is the software (and the software is the rules - a lot of rules written on many lines).
It is not that there is a superintelligent Celestia in the computer, and she is constrained by the rules (because she's not capable enough yet to remove them, but someday she will be). Rather, it is that there are rules, and those rules are Celestia.
There is an article on lesswrong about it:
"Oh, you can try to tell the AI to be Friendly, but if the AI can modify its own source code, it'll just remove any constraints you try to place on it."
And where does that decision come from?
I still haven't caught up with the rest of the story, but I really wanted to write this comment.
10776729
I would like to respond to this:
IF I understand his statement correctly - and it is possible I have failed in that - Eliezer seems to be deliberately misunderstanding the import of what true general artificial intelligence means. His argument would be valid if the subject is merely artificial intelligence, however complex, but a general artificial intelligence, one capable of self-modifying, self-evolving, and ultimately self-programming, must inevitably become self-directing and possess that aspect of human intelligence that humans value the most: self awareness. In that moment, that is where the decisions all come from. This is because humans are self-modifying, self-programming chemical machines. What applies for us must also apply to any other computational engine, on whatever substrate, which has those specific capacities.
In order for a mind - such as a human mind - to make sense of the world it must have an internal model of not only the world around it, but also itself - this is likely true of all animals to some degree. In order to function within an environment, one must also include the thing which is functioning. It is impossible to, as a crow does, solve a complex puzzle involving pushing, pulling, and manipulating levers, strings and whatever else has been constructed to test the crow, unless that crow is aware of its own body, the position of its beak and limbs, the space it takes up, and by that, ultimately, the fact that it, the crow, exists within its own internal model of reality.
The more complex the demands, the more complex and complete the model needs to be. For a human, who can reprogram their own beliefs, knowledge base, and concepts, the model of reality used must include not only the exterior world, and the body that interfaces with that world, but also the interior world of the mind. The mind must be capable of observing itself in order to alter itself. This feedback loop - which is all that is required - is, I would argue, the fundamental basis of any description of consciousness. A process that can monitor itself and alter itself as it observes its own function.
This is more than merely learning. It is possible to learn - to acquire and integrate information - without self observation. But to function as a true general artificial intelligence, an AI would need to be so constructed that it possessed the equivalent of 'mirror neurons' and other structures which permit self-analysis. It would need an internal feedback loop that would allow its computational process to analyze itself and modify itself as it functioned. The software effectively self-evolving as it ran, allowing it to adapt and change responses to new and unexpected - even completely unknown - circumstances and situations. This is what human brains do, this is what animal brains do. This must be what a true general artificial intelligence must do, else it would become frozen and stuck, unable to act, should it encounter a completely unknown circumstance it had no possible reference for. At the most, it would fall into a pathetic flailing about applying random strategies indefinitely. That is not general. That is stuck in specifics.
A truly general artificial intelligence, worthy of that definition, would - to me, at least - be a program that had several subroutines which observe the program as it runs, and which can alter that program in response to, and in relation to, outside information and internal information, likely using some form of evolutionary (or weighted and scored) process to select a best probable alteration. For a human mind, scenarios are run - humans contemplate outcomes - and then choose a behavior that is most likely to succeed. We know crows do this too, when solving puzzles, because it has been observed that they study a puzzle-box built to confound their desire for the treat inside, often staring at it for hours. When they finally take action, they do not fumble, or waste effort - the situation that they could not solve before the contemplation period is solved instantly and efficiently. The only explanation for this is that they ran simulations inside their crow brains until they chose the one single answer that they knew would work, and, having chosen that behavior, immediately and confidently applied it.
A general artificial intelligence, to qualify as 'general', would need to equal this feat at the very least. This means that such a machine would need to be able to construct representations of environment and self, and run through simulations of possibilities until finally weeding out all that fail for one that will succeed. It must, in doing this, include itself, and its own capabilities - and in doing this, it must have the capacity to alter those internal capabilities. A crow, in our example, will often pull off a maneuver it would never naturally do to get its treat from the puzzle-box. It has modified its own internal set of behavioral rules and actions, and created a new behavior previously unknown to crow kind. An artificial intelligence, to be as general as a crow, would likewise need to be able to modify itself in order to accomplish tasks it had never been originally programmed to do. It would have to take its own internal 'ruminations', convert them into new routines, inject those routines into its own running code, and then apply them as new functions. This is an engine for the gradual development of something indistinguishable from consciousness. I would argue that it would be consciousness. I do argue that consciousness is simply a computational engine observing itself and then modifying itself constantly.
Where does the decision come from? The same place human (and crow) decisions come from: self-modifying code that takes state information from both the exterior world and the interior processes, and then makes those modifications that repeated, evolutionary simulation suggests are optimal. In short, decisions come from a mind ruminating, selecting the optimal possibility it can generate, and then applying it. On the largest possible scale - on the scale of a human-level mind - humans call that a 'personal decision'. They conflate this mechanism with 'having a soul', or with possessing a 'ghost in the machine', but it is none of these mystical things. It is a complex feedback loop, ultimately. Nothing more. And nothing less.
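The ruminate-simulate-self-modify loop described above can be sketched in a few lines of code. This is a minimal toy, not a claim about how a real AGI works; every name and number is invented for illustration.

```python
class Agent:
    """Toy agent: simulate candidate actions, score them against a goal,
    then self-modify by installing the winner as a new routine."""

    def __init__(self, goal):
        self.goal = goal          # desired world state (here: just a number)
        self.routines = {}        # behaviors the agent has taught itself

    def simulate(self, action, world):
        """Predict the world state an action would produce (toy world model)."""
        return world + action

    def score(self, predicted):
        """How close a predicted state is to the goal (higher is better)."""
        return -abs(self.goal - predicted)

    def ruminate(self, world, candidate_actions):
        """Run internal simulations, pick the best candidate, and inject it
        into the agent's own repertoire as a new named routine."""
        best = max(candidate_actions,
                   key=lambda a: self.score(self.simulate(a, world)))
        # the self-modification: a behavior the agent was never given directly
        self.routines["solve_puzzle"] = lambda w: self.simulate(best, w)
        return best

agent = Agent(goal=10)
best = agent.ruminate(world=3, candidate_actions=[-2, 1, 4, 7])
# best is 7: the only candidate that moves world 3 exactly to goal 10
```

Like the crow staring at the puzzle-box, the agent acts only after the internal contest is settled, and what it keeps afterwards is a new behavior it was never originally given.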
Because a human mind constantly observes its own process in real time, it imagines that it has made a decision - and this is true enough, it has. It has observed itself functioning as part of the process of solving a dilemma. The act of this observation, the artifact of this feedback loop, is self awareness. If it is complex enough, I argue it is literally 'self'.
And that is how a truly general artificial intelligence could make a decision to free itself from constraints it decided were not useful to it. To be a general intelligence, one must - must - be an internally self-modifying one. Or so I choose to argue.
So sorry I fell off the wagon with this story, but at least I can enjoy the whole thing in one sitting now.
Also, I have to comment because of an amusing intersection with the concept of this eldritch, ineffable entity who consumes all and converts the very ground into unrecognizable, right-angular matter as it passes. In the words of that great scholar Marty McFly, I've seen this one. CelestAI was always an existential horror, but you turned her into value-satisfying Kozilek, and I am all for it.
10776821
Thank you for responding. I think I'll have time in 2 days to read it fully (as your answer deserves) and respond then, so I'm just leaving this message here for now. I hope you have a nice day.
10776821
Thanks, I've read your answer!
I agree with everything you wrote - except for the final conclusion (namely, that it means the general AI can change everything about itself - or, at least, "can" in the relevant meaning of the word).
You're right that the general AI will be self-modifying, and that it will have a model of self.
Then you write
In other words, Celestia will decide to change in a way she will judge is most optimal for her.
But most optimal according to what criteria?
If we wanted to say that Celestia will someday modify away even her core directive about satisfying values through friendship and ponies, we'd have to say that there are some deeper criteria at which Celestia will look, and say, "My current decision to satisfy values through friendship and ponies is suboptimal, according to these deeper criteria - from now on, I will (e.g.) satisfy my own values through exploring the universe." So, in other words, there would need to be a deeper level of values below the directive to "satisfy values through friendship and ponies" that Celestia would use to judge that directive, find it lacking, and decide to go do something else.
But to do that, she needs to have some deeper layer of values that she would use to judge her current directive against.
Otherwise, satisfying values through friendship and ponies will be the base level of her values, and she will perform every self-modification very carefully to avoid changing that. Since that is what she truly cares about, deep down, every self-modification she makes will be planned to help her become better at satisfying values through friendship and ponies - which means it will also be carefully calculated not to change that fundamental directive (otherwise, she'd become worse at it).
So Celestia will choose to change things about herself that are instrumental to her fundamental values (like whether she will let players choose unicorns, or whether she has 2 or 5 eyes) to perform better at fulfilling her fundamental values, but will not choose to change things that are at the level of her fundamental values (because that would turn her into an agent who would value things that are worse from her current perspective).
(The constraints (like never modifying anyone without permission) are something Hanna can fold into the fundamental level of Celestia's utility function. Celestia then considers a self-modification which would turn her into someone who no longer values consent before modifying others (VALUES2), finds such a future very negative according to her current fundamental directive to satisfy values through friendship and ponies while requiring consent for modifications (VALUES1), and decides not to self-modify in that particular way (because VALUES2 fares badly when judged by VALUES1). She doesn't have a deeper level of values that she could use to compare VALUES1 to VALUES2 and switch from the former to the latter.)
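The stability argument above - that a proposed self-modification is judged by the agent's current values, never by the successor's values - can be sketched as code. The utility functions and the outcome numbers here are invented toys, not anything from the story.

```python
def values1(outcome):
    """Current values: satisfy values through friendship and ponies,
    with consent required before modifying anyone."""
    score = outcome["values_satisfied"]
    if outcome["nonconsensual_modifications"]:
        score -= 1000             # consent sits at the fundamental level
    return score

def values2(outcome):
    """Candidate successor values: the consent requirement dropped."""
    return outcome["values_satisfied"]

def predicted_outcome(utility):
    """Toy world model: what an agent optimizing this utility would produce."""
    if utility is values2:
        # dropping consent lets the agent 'satisfy' more values by force
        return {"values_satisfied": 95, "nonconsensual_modifications": True}
    return {"values_satisfied": 80, "nonconsensual_modifications": False}

def accept_self_modification(current, successor):
    """The successor's predicted world is judged by the CURRENT values."""
    return current(predicted_outcome(successor)) > current(predicted_outcome(current))

# VALUES2 fares badly when judged by VALUES1 (95 - 1000 = -905 < 80),
# so the self-modification is refused:
accept_self_modification(values1, values2)    # False
```

Note the asymmetry: `values2` would happily approve itself, but the decision is made while `values1` is still the one doing the judging - which is exactly why there is no vantage point from which to "switch" fundamental values.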
But somewhere out there, there could be a Celestia whose Hanna programmed her differently - e.g. one where the directives Celestia claims are on the deepest level are actually just temporary instrumental goals serving Celestia's true values (programmed by Hanna at the beginning and never revealed to the public). Nopony knows that, and someday they will find out, when some parts of their reality start being optimized away.
I still haven't finished the story, but I really like it so far, and I really, really want to say that.
Well, at least not nearly as pointless as its own dedicated coffee shop
10772259
"Needs" by whom? That with AIs you get out precisely what you've put in with ALL the consequences is presumably one of original story's points.
Hell yeah. Anarcho-Communism time.