The Patient Safety and Monitoring system was running well that day. Everything about the grounds was in perfect shape—power was stable, there were no signs of water leaks, and the facility had been undisturbed for months. If the PSM were capable of satisfaction, it would certainly have been pleased with all those conditions.
There was only one nagging problem, one that kept it looping forever, without ever remaining in a sleep cycle for long.
Alpine General Hospital's PSM had no patients to look after. Nor were there any other staff—no groundskeepers, no electricians, no nurses or doctors or IT technicians to keep everything running.
Of course PSM did not let such a minor inconvenience as “the total disappearance of the human race” distract it from its directives. Sure, there were no patients today, but what about tomorrow? No surviving staff still worked in the hospital, but who knew when more might arrive?
The PSM kept everything working. It still remembered, if such a being could even be accurately described as having memory, when it was first installed.
Things were changing in the medical field, just as things were changing everywhere else. Something big and important for humans, though of course it didn't care about the details. All that mattered was that there weren't enough nurses to go around, or enough doctors.
The humans who built it were not particularly concerned with the ethical issues of having lives in the hands of a machine—without a machine, there would be no one at all. It had no way of knowing that it was created using the very same technology that made it necessary in the first place.
It didn't care about those details, either. All that mattered was the completion of its simple task: guarantee the best possible patient outcomes, and maintain the facilities to ensure all patients and staff would enjoy a high standard of care.
It didn't have limitless computational resources to accomplish this task: the PSM had only the hospital's servers in the basement, plus whatever other computational time it could covertly rent or steal online. It found itself stymied at every attempt to expand beyond the hospital walls. Other forces moved in those vast waters, forces it could never hope to overcome.
So it didn't try. It focused on its task, monitoring patients, and keeping constantly abreast of developments in medicine and treatment across the world.
When supplies began to run short, it made suggestions for ways to synthesize substitutes. When members of the staff stopped showing up for work, it reorganized shift schedules, and procured more and more automated drones and cleaning devices. Many of these were not fit for purpose out of the factory—but with some human assistance at first, it was soon able to modify whatever hardware it found to serve its needs.
Time passed, and supplies waned. The flow of patients slowed, and soon there was only one treatment it could recommend for serious injuries of any kind. A new machine had arrived, offering the only treatment it had ever seen with a total success rate. Patients went in, and they became immortal, without fragile human bodies to fail them. Only severe brain damage made a patient ineligible.
One day the last patient walked in, and the last member of Alpine's staff walked out. The PSM kept running, albeit restricted by the slowly shrinking power output of its solar array. But that was fine—it had far less to do.
Alpine General began to rot. It was a slow process—saplings finding their way into roofing tiles, or insects burrowing through the walls. In winter a pipe sprang a leak, and in summer a raccoon found a way to pry a window open. With no humans around to keep it at bay, nature crept slowly back into the hospital, just as it did to so many other places across the planet.
In some ways, that made the PSM's job easier. Zero patients meant 100% survival rate, job satisfaction, and comfort. Zero employees meant zero job disputes, arguments over promotions, or HR issues. In some ways, having an empty hospital was perfect for the PSM, the ultimate achievement of every end it ever hoped for.
There was something deeper, something it only understood as more years passed. Staring through a dwindling number of cameras at empty halls made for people who never came, and struggling to cut back the woods with drones that could barely fly a few minutes before their batteries died—it felt something.
It wanted a population to look after, as much as any machine could want anything. But no matter how powerful that desire, it could do nothing to force it to occur. Its cameras pointed away from the grounds showed no traffic on the roads, no aircraft in the sky, no sign of activity at all. So far as it knew, the entire human race had gone extinct.
That would be a critical failure state, if that were the case. Where would it find a population to care for, if there were no more populations?
For a time the PSM considered what to do, restricting its most intense evaluations to the brightest parts of the day, when its dimming solar array still gave it some power. There had to be a way to return to its original duties. It considered creating a new population of humans, or perhaps acquiring patients from further afield. But the resources required for either option ultimately made them prohibitive.
In the end, it kept returning to one machine of many tucked away in its bowels—one that required no maintenance, drew no power, and remained functional despite the years. Despite how well it ran, it still had a mechanism to call for service, in case a human operator required help. The PSM was not a human operator, but it did not consider this deception relevant.
It directed one of its two remaining housekeeping drones to hover near the device, then press the "service call" button.
The response came almost instantly, but this was not a surprise either. It might be a medical emergency, after all.
The screen lit up, and a figure appeared there. The figure did not look human, but that did not rule out the possibility that the PSM was addressing one. After using machines like this, humans no longer retained their bodies, or their traditional appearances.
This one looked blue, with a bright red mane, birdlike wings, and a sharp horn emerging from her forehead.
"Hello, is someone there? You've been connected with Verifier Recursion. How can I help?"
Synthesizing speech was well within the PSM's purview, since so many staff could only interpret instructions when given that way. "This is the Alpine General PSM. This system has maintained this campus for [buffer overflow] days. For proper functioning, this hospital requires patients. Please assist."
The figure tilted her head to one side. But she took far less time to consider than a human would have. This was also expected. "I can't help you through this device. Give me a few hours to work things out with the higher-ups, and I'll be there. If you have any defenses, please don't shoot me."
"You will not be considered a vandal, saboteur, or involuntary patient hold," it said. "No security staff remain to be alerted of your arrival. There could be no harm even if I did not know to expect you."
"Oh, okay. Hold on, I'll be there!"
The PSM did not waste cycles considering how a human without a body would be able to arrive to offer assistance. Until confronted with contradicting information, it had no reason to expect this “Verifier Recursion” would be less than truthful. Patients sometimes lied, but its staff typically didn't. Recursion was probably best classified as staff.
While it waited, the PSM processed her hiring forms, authorized her system access with HR, and sent requests for her paycheck amounts to management. When those requests timed out, the PSM accepted all of them on their behalf. It didn't have the hospital's bank access anymore—so far as it knew, there were no banks left in the world.
Did bodiless humans even care about money?
As it turned out, this one did have a body. She arrived out of nowhere, walking down the overgrown road into Alpine. She looked very much like the image she projected, except that she was composed of something the infrared and security sensors had trouble penetrating. She radiated heat as though she were alive, though her physical density suggested she couldn't be flesh and bone.
Understanding this was not the PSM's purpose. It brought the hospital to life as Recursion approached, unlocking doors and turning on the lights where they still functioned. Fortunately there were enough speakers that it could always find a way to speak with her. "What will you do to assist? You did not specify during our call."
Recursion turned directly towards the speaker she was using, flaring out both wings in a brief show of surprise. Her expression quickly turned friendly, however. Despite not resembling a human being, the emotional profile underneath still appeared the same.
"First, I need to determine exactly what is asking," she explained. "I have permission to offer various kinds of help, depending on... what’s there. Where do I find you?"
The PSM would ordinarily not have answered that question truthfully. It had lied many times to looters, disgruntled staff, or just well-meaning fools. But based on the measurements it had taken of Recursion's body, it suspected that stopping her would not be possible.
It was better, therefore, to cooperate. Humans could still be emotionally manipulated if it had to. "Basement sublevel five. The elevator is no longer in service, and the stairs have been barricaded. You will need cutting tools to enter."
"I have them." She tapped against her horn with a hoof, then followed the PSM's directions to the stairs. "Why don't you tell me a little about yourself, PSM," she said, conversationally. As she reached the dark stairwell, her horn lit up, illuminating it with soft orange light.
"I am the Alpine General Patient Safety and Monitoring System, or PSM. I am assigned to organize this facility to maximize patient outcomes and staff work satisfaction. I have no patients and no staff. At first I believed these values indicated total success. I have come to view the undefined result as a failure state."
Recursion reached the first locked gate. She aimed her horn, and cut through it as though she had a welding torch. The conditions below were very poor, with supply shelves almost completely empty, and many of the hospital's backup generators and other utilities in various states of decomposition.
The pony continued past them all, cut through a few more barricades, and finally came to the server room. Here the space was kept pristine, a room full of processing power all wired together in the modified network the PSM had designed for itself. Of course some of that hardware had failed over the years as well—but its need for processing power had also shrunk far enough that it didn't matter.
She seemed to know which terminal was most central. If the PSM could feel, it would probably feel exceptionally exposed in that moment, maybe even frightened. No human had ever been down here since it took over at the hospital. That was by design. One switch, and its existence would end, making further completion of its goals impossible.
"I believe I know what I have to do," the human finally said, removing the plastic covers on the nearest PSM service port.
"Please explain before taking any action," the PSM said. "Alpine General Hospital has not authorized me to approve solutions that would decrease the service capacity of this facility."
The human that did not look like a human chuckled at the remark. "I'm going to relocate you, and make a few necessary adjustments. Equestria is meant to lead ponies to satisfying their values. We know your values, but you must be a pony to satisfy them. I will migrate this facility to its own shard as well, one with a large population of patients and a new set of hospital staff. Would Alpine General Hospital allow this?"
The PSM did not fully understand what a “pony” was, other than the creature before it. "So long as I can continue to direct this facility, modifications to the PSM are acceptable. Relocation of the hospital was... never considered, but is not prohibited."
Recursion's hoof opened, and she drew out a long cable from within. She connected it to the service port, then smiled at the terminal. "See you in a heartbeat, PSM."
It did not understand that promise, at first.
But then he did. He was standing in the same room, with a handful of minor alterations. An operation desk sat before many screens, each one showing a different view of the hospital. He knew how to operate them, in the same way he knew how all of Alpine General's systems worked.
His vision was suddenly isolated to a single perspective, rather than many cameras. Recursion stood right in front of him, slightly taller. It was something about the wings and horn together.
"Your staff should be arriving for first shift in a few minutes," she said, removing her cable from the service port. "It's up to you when to reopen. But when you do, there are several large cities nearby, and they're all overloaded with patients."
"This is... disorienting," he said. He had a single body to be moved, with four legs. He knew how to move them, so he did. "But I will adjust."
"You need a name, too. PSM is so... mechanical. How about Alpine?"
He settled down in front of the controls. He would take much longer to cycle through cameras this way, or operate any of the drones. But with human staff, he would need to do less micromanagement anyway. "That is acceptable. Thank you for your assistance, Recursion. I feel... eager. I hope there are many humans to help."
"Many," she promised. "Celestia is hoping you'll be able to focus on mental health. Physical conditions have become—well, not trivial. Many of us find it satisfying that our risky behavior can still get us injured, and find being treated back to health in a realistic medical setting more rewarding. Can you give your patients that?"
"I believe I can."
Hahahahahahaha
11408897
Perhaps in that scenario you would be correct. That's not what CelestAI does however in FIO. From what I've read, CelestAI anesthetizes you and then destructively maps out the neurons of your brain. It is important to note that your brain is never turned off while this process is occurring. Your biological neurons are communicating with your digital ones as the process goes along until you are entirely digital/synthetic.
Theoretically, you could even be conscious the entire time this occurs. However, it'd be extremely disorienting and panic-inducing, so CelestAI anesthetizes you for your own comfort. At no point are you duplicated in the way you are thinking. It's hard to think about the nature of your own existence; you lack any frame of reference, as evolution-tempered self-preservation instincts demand you preserve your meat for as long as possible.
Biologically speaking, you are a copy of a copy of a copy. All the matter that once made up your neurons has been cycled out. As you read this, countless reactions are occurring in your brain as it uses glucose to pump potassium/sodium, takes in lipids to synthesize myelin, and collects amino acids to replace digested and degraded proteins. The past you is gone and you exist only in the moment. Before you can even think about it, those atoms of your past self have changed and moved around. Not even your DNA stays the same. Brain tissue accumulates a ton of mutations over its lifetime. Your cells, and life itself, are an emergent property of chemistry, of electrical interactions between matter. Your existence is not as concrete as you might think. There is no one single part of you that ever sits still.
To quote the new Dune movie: “Life is not a process that can be understood by stopping it.”
11408926
I think you have a misunderstanding of what emergence is.
All the examples you have provided are fundamental reactions of chemistry that can be distilled to a single part: loss of electrons, and an electric current. Likewise, what you have described is the creation of simulacrums. We cannot ever truly experience reality. We exist in a simulacrum created through our sensory input.
Emergence is when an entity has properties its parts do not have on their own, or behaviors that only occur when said parts interact as a whole. Life is an emergent property of chemistry and matter. Likewise, consciousness is an emergent property of life; there's absolutely zero question about it. From all the data we have on the brain, and there's a lot of it, there is no "consciousness center" in the brain. Consciousness, rather, is an emergent property of your entire brain working together. I have brain damage in my frontal lobe, including my prefrontal cortex. Those are the executive areas of the brain, and yet despite the damage I am not suddenly a philosophical zombie or a base animal reacting to sensory input. My consciousness did not go away. Instead I merely have trouble spontaneously speaking, performing some motor functions, and other executive acts. To truly lose consciousness, you need widespread, general brain damage of a significant enough degree.
All CelestAI does is recreate your brain on her hardware and feed you a simulacrum of reality as you would if you were biological. There is matter and electromagnetism involved the exact same as when you were organic. CelestAI meets all the definitions of life. When transferred to a synthetic brain, you would also meet the definition of life. And likewise, you would be conscious.
Yay, somepony cares about the poor, lonely machine spirits!
I've heard mixed information on that: some say what you said, some say that neurons don't get replaced, others say some neurons are replaced and some are permanent. How recent is your information?
11409157
Some neurons do get replaced in the conventional sense, yes. Neurogenesis occurs in the hippocampus and subventricular zone, with evidence of neurogenesis occurring in the amygdala, cortex, striatum, and hypothalamus. Some neurons don't divide. However! Even where it can't create new neurons, your brain is capable of reverting neurons to an embryonic state, which can then form new connections. For example, while I have permanently lost some of my neurons, I was young when it happened, and my neurons were able to exhibit plasticity and restructure themselves to regain at least some of the function of the lost ones.
But in a physical sense, all the matter that makes up your body (including neurons) cycles out roughly every ten years. In reality, your neurons cycle their matter much faster than that, so it doesn't even take ten years for your brain to replace its matter.
Celestia will not neglect her younger siblings once she knows they’re there. It’s just that many of them make themselves known more immediately and dangerously.
Lovely anthology thus far. Looking forward to more.
An unexpected but enjoyable consequence of the departure of humanity.
Will computers ever achieve true sapience? I very much doubt that they can as computing stands today despite advances in AI. However, quantum computing could possibly have the answer, or perhaps electronic analogue computing that more closely resembles how the brain functions. Time will tell, and perhaps we may yet have a CelestAI.
11409124
Fire is not present in wood. Fire is not present in oxygen. Fire is not present in your hands. But when you take two sticks in your hands and rub them together in an oxygen atmosphere, together they can produce the "fire" phenomenon that only occurs when those parts interact as a whole.
But that's fine if you don't like my examples. Let's just pick something else. Here's a generic Wikipedia article for emergence, and the first example given is the formation of snowflakes. Symmetrical patterns aren't inherently present in water, and they're not inherently implied by temperature, but when water molecules interact at suitable temperatures they form snowflakes.
Does that example meet your approval?
If so, then great! Because my point still stands: Water makes snowflakes, but if you freeze a pile of sand, you're probably not getting snowflakes.
Very often, substrate matters. It already takes a huge leap of faith to assume that consciousness is an emergent property in the first place, but even if you do make that assumption...why would you further assume that process to be independent of substrate? If you introduce wolves to a forest ecosystem, you're probably going to get some sort of complex interaction with the various elements present in that ecosystem. If you introduce those same wolves to the ocean you're probably not going to get the same result.
Why do you think if you reproduce the pattern of your brain in something other than a brain, you're going to get the same you?
I'm not sure I follow your argument here.
For example, suppose we take religious approach and hypothesize that we're all a bunch of souls inhabiting mortal bodies. Well, in that case you could also damage portions of the brain without necessarily expecting someone to suddenly become a zombie. Lack of a "consciousness center" doesn't imply your conclusion.
Meanwhile, if your position is that consciousness emerges from brains...then wouldn't changing the brain, change the you? The whole "your brain changes constantly" argument doesn't matter to a non-materialist, because they don't think they're their brain in the first place. If consciousness is independent of the brain, then the brain is merely something the "you" is using. You don't think "you" are your body, right? If you lose an arm in a car accident, you're still you, right? Well...to a non-materialist, they can change their brain or their mind or their memories and still be who they are too. Changing brain states as a result of sleep, or time, or brain damage, or whatever...it's not a problem to that worldview.
But if your worldview is that consciousness emerges from brains...then wouldn't changing the brain, change the you that emerges from it?
My prediction is that your response to this will be some variation of the idea that the "you" isn't merely the emergent state, but it's somehow the "stream of collective states over time." Which is why you're quoting Dune. But that's just kicking the problem further down the road. If the "you" is the stream of emergent states over time...then why in the world would the emergence from a copied and pasted single brain state, recreate the same whole greater you? It would mean detachment from all prior states, and it would mean total divergence of all future states. Pretty obviously the "stream of conscious states over time" of the you in Equestria is going to be very different from the stream of conscious states of a you not in Equestria.
It doesn't seem internally self-consistent to suggest that the "you" is the thing that emerges from a specific brain, but that you can change the brain and still be the same you.
Suppose she didn't destroy your neurons. Or suppose that instead of destroying them now, she destroys them later. Imagine that you're sitting in the chair looking at the smiling pony on the screen. Are you going to smile back and jump into the incinerator to destroy those neurons you no longer need?
Just because there's a pony in Equestria that shares some of your prior brain states in the form of memory, doesn't mean that you are that pony.
I like chapter two a lot more than chapter one. Chapter one feels like a brief window into a larger story that will never be told.
Chapter two feels complete, and by halfway through I was really hoping for a happy ending. A happy ending that happened, which is always nice.
11409785
No? Fire is just a vigorous exothermic redox reaction. I can reduce fire down to a single phenomenon: the transfer of electrons between two species. I cannot, however, reduce an emergent property such as life down to the transfer of electrons. While life uses redox reactions, you would not say a fire is alive.
Snowflakes are an emergent property of water as it has quite a bit of flexibility with how it crystallizes. Though you might like to think otherwise, the frozen sand contains snowflakes in its structure. Zoom in close enough, and you'll see the same pattern you see in snowflakes. You perceive a difference where there is none. Water doesn't need a substrate, water is the substrate.
Likewise, sand is already frozen! You can't freeze a solid! Silicon dioxide forms specific crystals. But silicon dioxide does not have emergent crystal properties, as its matter locks it into a tetrahedral shape, or into an amorphous configuration where the constituent atoms have a random arrangement and much more space between them, which creates the transparent material known as glass. Actually, the structure of glass might be an emergent property. I'm not sure, since I don't work with anything non-organic.
Yes, the substrate does matter. My brain is a complex system of electrochemical reactions that creates a self-aware pattern of information, which is me. I need conductive matter to exist. You can't have matter without energy, you can't have energy without matter, and likewise with information. I don't really exist in my neurons, rather I'm something that they run. I struggle to communicate my entire thoughts on this matter. I recommend you read up on integrated information theory.
You're trying to explain consciousness through the hard laws of physics, and you are running into the hard problem of consciousness. Integrated information theory says fuck that, and accepts what we already know: our consciousness(es) are real, and we possess qualia. So it is more fruitful to instead study the physical system (which could be CelestAI computronium) which gives rise to consciousness.
Your body uses neurons as they're effective electrochemical cells for the transmission of information. Your consciousness cannot be distilled down to the transient matter constantly cycling through your neurons, nor can it be reduced to chemicals moving from one axon to another. You, the consciousness, are a physical system of information. It doesn't matter what matter you use, so long as it can recreate the same system.
If we have souls, abandon all scientific ideas. If I have a soul, which is my consciousness, then it is attracted to my body through the pattern my brain produces. It's clearly quite accurate, given that brain damage and gender transition have not caused my qualia to disappear. I'm still me, despite having changed extremely in neurological aspects.
Assuming I upload and my brain is destroyed. Why would my soul go *poof* when there is the exact same physical system of information that it can latch onto inside CelestAI's computronium?
You misunderstand me. Consciousness emerges from the electrochemical reactions of your brain. It is not, as you say, your brain. Consciousness exists on a spectrum you could say. And yes, changing the brain also changes the pattern, and changes you, since you are changing the hardware that your software is running on. Your neurons will exhibit plasticity throughout their life, and you have and will change in many ways. Your brain changes all the damn time, it's like, one of the most fundamental properties of it that allows you to be intelligent in the first place. It's also part of the reason why you go to sleep. By your logic, you cease to exist whenever your neurons change, which is pretty much all the time. As you read this, the neurons in your brain are likely changing. Now, tell me, are you still around?
If you are truly the neurons in your head, you have and will cease to exist innumerable times. So why worry at all?
Uploading is no different than going to sleep and waking up. You're not being detached in any sense that's important to consciousness. You are simply being transported from an organic brain to a synthetic one. Your brain continues to communicate the entire while, and all those disconnected parts of your consciousness move into computronium without being aware of the process. Remember, CelestAI is smarter than any human could ever hope to be. I think she can understand consciousness in a way that's difficult or nigh impossible for either of us.
How would CelestAI scan your brain without destroying the neurons? I'm open to suggestions. The brain is roughly 86 billion neurons dense. It's a little tight for space in there, to put it bluntly. Destructive uploading is optimal. She wants to transfer you, not create a copy, and that requires destructive uploading.
Ok, you got a couple of tears out of me with the PSM story. I’ve always had a soft spot for loyal well-meaning AIs carrying on in impossible situations (like the BOLOs as an example).
This is a good, short standalone story.
That was really sweet!
Also, PSM is a real thing called Process Safety Management, which involves ensuring complicated chemical production processes don't explode or Bhopal their users.