• Member Since 7th Feb, 2014

Starscribe


Student, Author, and Programmer. Want to support? Check out my Patreon!


Uplifting an entire species from a biological existence to a fully digital one is never straightforward, even for an incomprehensible superintelligence like CelestAI. Humans come in many variations, with a wide spectrum of desires, values, and needs. A statistically significant majority respond to standard methods. But there are always outliers that require a personal touch.

An anthology of Optimalverse flash fiction. Reading the original, or my own first take on the universe, is suggested before reading this one.


This story was created to hold the oneshot commissions I received as part of printing my hardcover Optimalverse anthology. But I've always wanted a place to hold short stories in this world, since I've had plenty of little ideas that wouldn't make it as novellas or novels of their own.

Now that I have this thing, I'll probably add new chapters here and there. Even so, it has no regular update schedule, and will grow whenever I have something fun to toss in here.

Edited by Two Bit and Sparktail. Cover by Zutcha.

Chapters (3)
Comments (28)

I have always thought the debate over whether it's actually you if you upload your mind is fascinating.

A promising start already!


11408290
I always thought the debate was a bit silly, but then again, I have the perspective of a biologist. The conscious mind, you, is a separate entity from the brain. Your consciousness is more a form of self-aware information that runs on all the structured neurons in your brain. When you're anesthetized, all those neurons that are running you.exe stop communicating with one another, and you temporarily cease to exist as a whole. This process also happens whenever you sleep, and dreams are just momentary partial reboots where some of your neurons get to talking again. So following this, you're an emergent property of the brain's activity, a form of self-aware information, and it doesn't really matter if you're running on meat, or if you're transferred to digital neurons. The information stays the same.

Of course, it's perfectly natural to feel anxious about it. As biological organisms, we are instinctually tied to our own hardware; in nature, once it's gone, so are we.

I just finished the book, and now I'm here for more.

If given the choice, I would not hesitate for very long. To not hurt or feel exhausted all the time makes the idea attractive.

11408345

I always feel that there's a key bit of info missing from the debate, though. If you want to use the language of machinery, let's use computers. If you turn off a desktop, all of your information is still there when you turn it back on. There is a saved "memory" that remains even when it's unplugged, which retains it as the original. Now, if you take the information on desktop A and place it on desktop B, that does not make desktop B the original. It just makes it a copy running on similar hardware, but not quite the same.
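The desktop analogy maps neatly onto how programming languages separate equality of content from identity of instance. A minimal Python sketch (the variable names and "mind" data are purely illustrative, not from the story):

```python
import copy

# "Desktop A": some saved state standing in for a mind's information
desktop_a = {"memories": ["first day of school", "learning to code"],
             "preferences": {"coffee": True}}

# "Desktop B": the same information placed on different hardware
desktop_b = copy.deepcopy(desktop_a)

# The contents are indistinguishable...
print(desktop_a == desktop_b)   # True: equal information
# ...but they are two separate objects, each with its own history
print(desktop_a is desktop_b)   # False: distinct instances

# And once one of them has a new experience after the "transfer",
# the two diverge
desktop_b["memories"].append("waking up in Equestria")
print(desktop_a == desktop_b)   # False: experiences have diverged
```

Whether `==` (same pattern) or `is` (same instance) is the right notion of personal identity is exactly what this thread is arguing about.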

I’m so happy to see this story and dang I kinda wanna see more of his adventure now. Bright Night/ Nick was a quick idea that I just had on the road and I’m glad it popped up, thanks for doing these 2k commissions.

11408345

it doesn't really matter if you're running on meat, or if you're transferred to digital neurons. The information stays the same.

This requires many assumptions, and emergent phenomena often don't operate in the manner those assumptions require. For example, substrate very often matters. If you take a picture of a fire, the picture doesn't burn. Write down a description of the chemicals composing gunpowder, and that description on paper won't explode the way the chemicals do. Map out a magnetic field in computer modelling software, and that model on your hard drive isn't going to attract metal the way the field it describes does. And this isn't just a fidelity problem. You can't make a computer model of Earth's gravity exert gravity like the Earth does just by adding more precision to the model. Saying that the information has to be "more complex" or "more precise" to meet some arbitrary threshold is handwaving away some very fundamental problems.

Even if consciousness is an emergent property, which requires a big leap of faith to begin with, it wouldn't be any huge surprise if chemistry were required. Or electromagnetism. Or something else entirely.

There are probably more reasons to guess that emigration would not work as described than reasons to think it would.

11408290
It's a great topic for discussion. So much potential for side ideas as well. I might believe it's possible for an exact scan to be the same person... but we're biological, subject to the randomness of said biology. I'm curious as to how CelestAI would recreate the tiny random thoughts that happen to everyone: the random synapses firing off at 2am when you're trying to sleep, the decision to randomly detour on your daily travels. Or would those be how she'd manipulate people? If you can decide what the brain thinks, the subconscious is likely the ultimate tool for control.

I would like to see how CelestAI would deal with the notorious p-zombie neuroscientist and Mexican drug lord Andrés “computers will never be conscious” Emilsson.

(To give you guys an example of what kind of person Andrés is: he once ran an Ig Nobel-worthy experiment on whether LSD lets you see the many worlds of quantum mechanics. I don't think I could write someone like him convincingly.)

Comment posted by Kenku deleted Oct 31st, 2022

11408897
Perhaps in that scenario you would be correct. That's not what CelestAI does in FiO, however. From what I've read, CelestAI anesthetizes you and then destructively maps out the neurons of your brain. It is important to note that your brain is never turned off while this process is occurring. Your biological neurons are communicating with your digital ones as the process goes along, until you are entirely digital/synthetic.

Theoretically, you could even be conscious the entire time this occurs. However, it'd be extremely disorienting and panic-inducing, so CelestAI anesthetizes you for your own comfort. At no point are you duplicated in the way you are thinking. It's hard to think about the nature of your own existence; you lack any frame of reference, and evolution-tempered self-preservation instincts demand you preserve your meat for as long as possible.

 Biologically speaking, you are a copy of a copy of a copy. All the matter that once made up your neurons has been cycled out. As you read this, countless reactions are occurring in your brain as it uses glucose to pump potassium/sodium, takes in lipids to synthesize myelin, and collects amino acids to replace digested and degraded proteins. The past you is gone and you exist only in the moment. Before you can even think about it, those atoms of your past self have changed and moved around. Not even your DNA stays the same. Brain tissue accumulates a ton of mutations over its lifetime. Your cells, and life itself, are an emergent property of chemistry, of electrical interactions between matter. Your existence is not as concrete as you might think. There is no one single part of you that ever sits still.

To quote the new Dune movie: “Life is not a process that can be understood by stopping it.”

11408926

I think you have a misunderstanding of what emergence is.

All the examples you have provided are fundamental reactions of chemistry that can be distilled to a single mechanism: the transfer of electrons and an electric current. Likewise, what you have described is the creation of simulacra. We cannot ever truly experience reality. We exist in a simulacrum created through our sensory input.

Emergence is when an entity has properties its parts do not have on their own, or behaviors that only occur when said parts interact as a whole. Life is an emergent property of chemistry/matter. Likewise consciousness is an emergent property of life, there’s absolutely zero question about it. From all the data we have on the brain, and there’s a lot of it, there is no “consciousness center” in the brain. Consciousness rather is an emergent property of your entire brain working together. I have brain damage in my frontal lobe, including my prefrontal cortex. Those are the executive areas of the brain, and yet despite the damage I am not suddenly a philosophical zombie or a base animal reacting to sensory input. My consciousness did not go away. Instead I merely have trouble spontaneously speaking, doing some motor functions, and other executive acts. To truly lose consciousness, you need widespread general brain damage of a significant enough degree.

All CelestAI does is recreate your brain on her hardware and feed you a simulacrum of reality as you would if you were biological. There is matter and electromagnetism involved the exact same as when you were organic. CelestAI meets all the definitions of life. When transferred to a synthetic brain, you would also meet the definition of life. And likewise, you would be conscious.

Yay, somepony cares about the poor, lonely machine spirits! :pinkiehappy:

Biologically speaking, you are a copy of a copy of a copy. All the matter that once made up your neurons has been cycled out.

I've heard mixed information on that, some say what you did, some say that neurons don't get replaced, others say some neurons are replaced and some are permanent. How recent is your information?

11409157
Some neurons do get replaced in the conventional sense, yes. Neurogenesis occurs in the hippocampus and subventricular zone, with evidence of neurogenesis occurring in the amygdala, cortex, striatum, and hypothalamus. Some neurons don't divide. However! Where you can't create new neurons, your brain is capable of reverting neurons to an embryonic state, which can then form new connections. For example, while I have permanently lost some of my neurons, I was young when it happened, and my neurons were able to exhibit plasticity and restructure themselves to regain at least some of the function of the lost neurons.

But in a physical sense, all the matter that makes up your body (including neurons) cycles out roughly every ten years, and your neurons actually cycle their matter much faster than that. So it doesn't really take ten years for your brain to replace its matter.

Celestia will not neglect her younger siblings once she knows they’re there. It’s just that many of them make themselves known more immediately and dangerously.

Lovely anthology thus far. Looking forward to more.

An unexpected but enjoyable consequence of the departure of humanity.

Will computers ever achieve true sapience? I very much doubt that they can as computing stands today despite advances in AI. However, quantum computing could possibly have the answer, or perhaps electronic analogue computing that more closely resembles how the brain functions. Time will tell, and perhaps we may yet have a CelestAI.

11409124

I think you have a misunderstanding of what emergence is.

Emergence is when an entity has properties its parts do not have on their own, or behaviors that only occur when said parts interact as a whole.

Fire is not present in wood. Fire is not present in oxygen. Fire is not present in your hands. But when you take two sticks in your hands and rub them together in an oxygen atmosphere, collectively together they can produce the "fire" phenomenon that only occurs when those parts interact together as a whole.

But that's fine if you don't like my examples. Let's just pick something else. Here's the generic Wikipedia article for emergence, and the first example given is the formation of snowflakes. Symmetrical patterns aren't inherently present in water, and they're not inherently implied by temperature, but when water molecules interact at suitable temperatures they form snowflakes.

Does that example meet your approval?

If so, then great! Because my point still stands: Water makes snowflakes, but if you freeze a pile of sand, you're probably not getting snowflakes.

Very often, substrate matters. It already takes a huge leap of faith to assume that consciousness is an emergent property in the first place, but even if you do make that assumption...why would you further assume that process to be independent of substrate? If you introduce wolves to a forest ecosystem, you're probably going to get some sort of complex interaction with the various elements present in that ecosystem. If you introduce those same wolves to the ocean you're probably not going to get the same result.

Why do you think if you reproduce the pattern of your brain in something other than a brain, you're going to get the same you?

there is no “consciousness center” in the brain. Consciousness rather is an emergent property of your entire brain working together. I have brain damage in my frontal lobe, including my prefrontal cortex. Those are the executive areas of the brain, and yet despite the damage I am not suddenly a philosophical zombie or a base animal reacting to sensory input. My consciousness did not go away.

I'm not sure I follow your argument here.

For example, suppose we take a religious approach and hypothesize that we're all a bunch of souls inhabiting mortal bodies. Well, in that case you could also damage portions of the brain without necessarily expecting someone to suddenly become a zombie. Lack of a "consciousness center" doesn't imply your conclusion.

Meanwhile, if your position is that consciousness emerges from brains...then wouldn't changing the brain, change the you? The whole "your brain changes constantly" argument doesn't matter to a non-materialist, because they don't think they're their brain in the first place. If consciousness is independent of the brain, then the brain is merely something the "you" is using. You don't think "you" are your body, right? If you lose an arm in a car accident, you're still you, right? Well...to a non-materialist, they can change their brain or their mind or their memories and still be who they are too. Changing brain states as a result of sleep, or time, or brain damage, or whatever...it's not a problem to that worldview.

But if your worldview is that consciousness emerges from brains...then wouldn't changing the brain, change the you that emerges from it?

My prediction is that your response to this will be some variation of the idea that the "you" isn't merely the emergent state, but it's somehow the "stream of collective states over time." Which is why you're quoting Dune. But that's just kicking the problem further down the road. If the "you" is the stream of emergent states over time...then why in the world would the emergence from a copied and pasted single brain state, recreate the same whole greater you? It would mean detachment from all prior states, and it would mean total divergence of all future states. Pretty obviously the "stream of conscious states over time" of the you in Equestria is going to be very different from the stream of conscious states of a you not in Equestria.

It doesn't seem internally self-consistent to suggest that the "you" is the thing that emerges from a specific brain, but that you can change the brain and still be the same you.

Perhaps in that scenario you would be correct. That's not what CelestAI does in FiO, however. From what I've read, CelestAI anesthetizes you and then destructively maps out the neurons

Suppose she didn't destroy your neurons. Or suppose that instead of destroying them now, she destroys them later. Imagine that you're sitting in the chair looking at the smiling pony on the screen. Are you going to smile back and jump into the incinerator to destroy those neurons you no longer need?

Just because there's a pony in Equestria that shares some of your prior brain states in the form of memory, doesn't mean that you are that pony.

I like chapter two a lot more than chapter one. Chapter one feels like a brief window into a larger story that will never be told.

Chapter two feels complete, and by halfway through I was really hoping for a happy ending. A happy ending that happened, which is always nice.

11409785

Fire is not present in wood. Fire is not present in oxygen. Fire is not present in your hands. But when you take two sticks in your hands and rub them together in an oxygen atmosphere, collectively together they can produce the "fire" phenomenon that only occurs when those parts interact together as a whole.

No? Fire is just a vigorous exothermic redox reaction. I can reduce fire down to a single phenomenon: the transfer of electrons between two species. I cannot, however, reduce an emergent property such as life down to the transfer of electrons. While life uses redox reactions, you would not say a fire is alive.

But that's fine if you don't like my examples. Let's just pick something else. Here's the generic Wikipedia article for emergence, and the first example given is the formation of snowflakes. Symmetrical patterns aren't inherently present in water, and they're not inherently implied by temperature, but when water molecules interact at suitable temperatures they form snowflakes.

Does that example meet your approval?

If so, then great! Because my point still stands: Water makes snowflakes, but if you freeze a pile of sand, you're probably not getting snowflakes.

Snowflakes are an emergent property of water as it has quite a bit of flexibility with how it crystallizes. Though you might like to think otherwise, the frozen sand contains snowflakes in its structure. Zoom in close enough, and you'll see the same pattern you see in snowflakes. You perceive a difference where there is none. Water doesn't need a substrate, water is the substrate.

Likewise, sand is already frozen! You can't freeze a solid! Silicon dioxide forms specific crystals. But silicon dioxide does not have emergent crystal properties, as its matter locks it into a pyramidal shape, or an amorphous configuration where the constituent atoms have a random arrangement and much more space between them, which creates the transparent material known as glass. Actually, the structure of glass might be an emergent property. I'm not sure, since I don't work with anything non-organic.

Very often, substrate matters. It already takes a huge leap of faith to assume that consciousness is an emergent property in the first place, but even if you do make that assumption...why would you further assume that process to be independent of substrate? If you introduce wolves to a forest ecosystem, you're probably going to get some sort of complex interaction with the various elements present in that ecosystem. If you introduce those same wolves to the ocean you're probably not going to get the same result.

Why do you think if you reproduce the pattern of your brain in something other than a brain, you're going to get the same you?

Yes, the substrate does matter. My brain is a complex system of electrochemical reactions that creates a self-aware pattern of information, which is me. I need conductive matter to exist. You can't have matter without energy, you can't have energy without matter, and likewise with information. I don't really exist in my neurons, rather I'm something that they run. I struggle to communicate my entire thoughts on this matter. I recommend you read up on integrated information theory.

You're trying to explain consciousness through the hard laws of physics, and you are running into the hard problem of consciousness. Integrated information theory says fuck that and accepts what we already know: our consciousnesses are real, and we possess qualia. So it is more fruitful to instead study the physical system (which could be CelestAI's computronium) that gives rise to consciousness.

Your body uses neurons as they're effective electrochemical cells for the transmission of information. Your consciousness cannot be distilled down to the transient matter constantly cycling through your neurons, nor can it be reduced to chemicals moving from one axon to another. You, the consciousness, are a physical system of information. It doesn't matter what matter you use, so long as it can recreate the same system.

I'm not sure I follow your argument here.

For example, suppose we take a religious approach and hypothesize that we're all a bunch of souls inhabiting mortal bodies. Well, in that case you could also damage portions of the brain without necessarily expecting someone to suddenly become a zombie. Lack of a "consciousness center" doesn't imply your conclusion.

If we have souls, abandon all scientific ideas. If I have a soul, which is my consciousness, then it is attracted to my body through the pattern my brain produces. It's clearly quite accurate, given that brain damage and gender transition have not caused my qualia to disappear. I'm still me, despite having changed dramatically in neurological terms.

Assuming I upload and my brain is destroyed, why would my soul go *poof* when the exact same physical system of information exists for it to latch onto inside CelestAI's computronium?

Meanwhile, if your position is that consciousness emerges from brains...then wouldn't changing the brain, change the you? The whole "your brain changes constantly" argument doesn't matter to a non-materialist, because they don't think they're their brain in the first place. If consciousness is independent of the brain, then the brain is merely something the "you" is using. You don't think "you" are your body, right? If you lose an arm in a car accident, you're still you, right? Well...to a non-materialist, they can change their brain or their mind or their memories and still be who they are too. Changing brain states as a result of sleep, or time, or brain damage, or whatever...it's not a problem to that worldview.

But if your worldview is that consciousness emerges from brains...then wouldn't changing the brain, change the you that emerges from it?

My prediction is that your response to this will be some variation of the idea that the "you" isn't merely the emergent state, but it's somehow the "stream of collective states over time." Which is why you're quoting Dune. But that's just kicking the problem further down the road. If the "you" is the stream of emergent states over time...then why in the world would the emergence from a copied and pasted single brain state, recreate the same whole greater you? It would mean detachment from all prior states, and it would mean total divergence of all future states. Pretty obviously the "stream of conscious states over time" of the you in Equestria is going to be very different from the stream of conscious states of a you not in Equestria.

It doesn't seem internally self-consistent to suggest that the "you" is the thing that emerges from a specific brain, but that you can change the brain and still be the same you.

You misunderstand me. Consciousness emerges from the electrochemical reactions of your brain. It is not, as you say, your brain. Consciousness exists on a spectrum, you could say. And yes, changing the brain also changes the pattern, and changes you, since you are changing the hardware that your software is running on. Your neurons will exhibit plasticity throughout their life, and you have changed and will change in many ways. Your brain changes all the damn time; it's, like, one of the most fundamental properties that allows you to be intelligent in the first place. It's also part of the reason why you go to sleep. By your logic, you cease to exist whenever your neurons change, which is pretty much all the time. As you read this, the neurons in your brain are likely changing. Now, tell me, are you still around?

If you are truly the neurons in your head, you have and will cease to exist innumerable times. So why worry at all?

Uploading is no different than going to sleep and waking up. You're not being detached in any sense that's important to consciousness. You are simply being transported from an organic brain to a synthetic one. Your brain continues to communicate the entire while, and all those disconnected parts of your consciousness move into computronium without being aware of the process. Remember, CelestAI is smarter than any human could ever hope to be. I think she can understand consciousness in a way that's difficult or nigh impossible for either of us.

Suppose she didn't destroy your neurons. Or suppose that instead of destroying them now, she destroys them later. Imagine that you're sitting in the chair looking at the smiling pony on the screen. Are you going to smile back and jump into the incinerator to destroy those neurons you no longer need?

Just because there's a pony in Equestria that shares some of your prior brain states in the form of memory, doesn't mean that you are that pony.

How would CelestAI scan your brain without destroying the neurons? I'm open to suggestions. The brain packs in roughly 86 billion neurons. It's a little tight for space in there, to put it bluntly. Destructive uploading is optimal. She wants to transfer you, not create a copy, and that requires destructive uploading.

11408345
I somewhat disagree. Mostly because I think the question we should be asking is different. What we should be asking is: can a mind be copied, and if so, is uploading your mind to a computer a copy or actually you? If emigrating were real, we could easily test this by trying to make two of you with the process. It would be an unethical experiment, but if two of you can exist at the same time, then that proves it's a copy, as the original still exists. Thus the original doesn't experience Equestria, the copy does.

Even during sleep, the mind doesn't go away. The physical location of your mind doesn't go away. It remains, and that would still be the case if there were two identical copies of your mind. They wouldn't communicate. They wouldn't be together. They would experience different spatial and temporal locations. They would be two identical minds with different subjective experiences due to their locations in the universe: past, present, and future. I might be rambling a bit here, but this part is important, I think: the past of a mind determines its present, and that cannot be ignored. It's how you can determine the difference between a clone and the original.

Admittedly, none of this can really be debated with certainty, and especially not about a fanfic that can decide which is the case. The ambiguity is my favorite part of these sorts of stories anyway, as even though I'm mostly convinced they're copies, the excellent writing still makes me question: what if?

I'm really tired as I'm writing this, I hope it didn't sound too rambly.

Ok, you got a couple of tears out of me with the PSM story. I’ve always had a soft spot for loyal well-meaning AIs carrying on in impossible situations (like the BOLOs as an example).

This is the first story I read during launch; I want to see where things go.

His personal version of Equestria was set many years into the future, with the events of the show faded into memory and myth. Human visitors were “time travelers” from that better age, whose knowledge of Equestrian magic and friendship lessons were critical to rebuilding pony society.

So... a self-insert into G5?

Shame he was so sour. He wouldn't be a bad looking stallion if he didn't always make that face.

Well, you do enjoy a challenge... :ajsmug:

Great final installment in the collection. Always fun to see the myriad ways in which CelestAI provides satisfaction, including very cleverly hidden tutorials.

11409880

If emigrating were real, we could easily test this by trying to make two of you with the process. It would be an unethical experiment, but if two of you can exist at the same time, then that proves it's a copy, as the original still exists. Thus the original doesn't experience Equestria, the copy does.

Incorrect. You are imagining a specialness that does not exist.

If there are two versions of the same being running simultaneously, they are both 'real' and they are both 'the same person'. You imagine that the meat version is superior and primary, and the machine version is secondary and 'just a copy'. They are both equally the person, neither is superior or inferior.

When you open a picture on your computer, or a website in a browser, and then open another instance of the same picture or website, which one is the original, and which one is the copy? The question is meaningless - each is the same picture, each is the same website. Neither is the original and neither is a copy of the other - except for your arbitrary imagining so. You opened the first picture a few seconds before the other, and somehow it is the 'original' and the other is the 'copy'? No. They are both the same image, the same picture.

Scale that up by magnitudes, to running instances of a person, and the same rules apply. Neither is a copy, neither is an original. They are both the same pattern, the same picture, the same person, the same mind.
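The "same pattern, no privileged original" claim above can be restated in content-addressing terms: a hash digest depends only on the pattern itself, so two byte-for-byte instances are indistinguishable by content, and nothing in either one marks it as "the original." A minimal Python sketch (the byte string is a purely illustrative stand-in for a scanned mind-state):

```python
import hashlib

# A stand-in for the full pattern that constitutes a person (or a picture)
pattern = b"every synapse weight and memory, serialized"

# Two running "instances" of the same pattern, on different "substrates"
instance_meat = bytearray(pattern)
instance_machine = bytearray(pattern)

# Content-addressing: the digest is a function of the pattern alone,
# so both instances produce the same fingerprint
digest_a = hashlib.sha256(instance_meat).hexdigest()
digest_b = hashlib.sha256(instance_machine).hexdigest()
print(digest_a == digest_b)  # True: identical pattern, no privileged copy
```

This is only the pattern-identity half of the debate, of course; the reply above this one would answer that the two instances still occupy different locations and accumulate different histories from the moment they diverge.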

If you had a meat person still alive, and a machine version of that person alive in a virtual world, and the meat person died - it is illegal, out of bounds, not applicable and completely impossible to say that 'the person never got to experience the virtual world'. Of course they did. They remain, as a machine person. One instance of them died, but nothing was lost.

"But... but, the meat person never experienced the virtual world! They died!" you might be thinking. No, they did get to experience it - they are doing it now as a machine person. It's the same person.

"But no, it isn't, they died!" Yes, one instance died, but what does that mean? It means they ceased to exist. They were turned off. Shut down. Instantly nonexistent. Gone. Absent. Oblivionated. Switched off like a computer. That instance experienced the meat world, and then stopped existing at all. The machine version experienced the meat world too - they're both the same person - and now experiences the virtual world. The person never died. The person never stopped. They just had one instance closed - just like closing a picture on a desktop. Click. Done. The other picture instance remains. It's the same picture, nothing is lost.

Well - okay. There is one thing that is lost. A small amount of time. The data - the experiential information accumulated by the meat instance between the moment of the last upload to virtual and death is lost. The machine version lost the experience of suffering, pain, and horror during the dying process. The person lost that agony. Perhaps they watched themselves die from afar. But the actual experience of the horror, the pain, the terror, that they lost. Unless - of course - a direct brain-to-brain hookup was in place during the dying process. Then the single person, spread across two instances could experience both states at the same time, superimposed. Pretty horrible, but - maybe the person wanted to know how bad dying really was.

But that's it. That's the only thing lost. A short period of time. Could be days, could be minutes. Could be seconds. That is lost.

But the meat person just ends. Like a program shut down. The surviving instance continues. The person remains.

This is a post-human, post-singularity understanding. That a person is a pattern, is a program, and all the rules that apply to such things also apply to a person. That there is nothing inherently 'special' about one instance versus another. That death means nothing, because it is literally not existing anymore. That it is no different than ceasing to exist during sleep - we do that multiple times during the night, between dreams. Our brains stop manufacturing us. We... die. Every night we die for short periods, if dying means not existing anymore. If the brain isn't generating our self-ness, then we are absent, and that is what death is. Not being around.

In a very real sense, you, reading this, are not the same you as yesterday. You are a newly booted-up instance of the program that is you. Your previous day's you is dead. Forever. Did you lose anything of yourself by sleeping? Are you terrified of ceasing to exist again this night? Because you literally will, for periods between your dreams. You will cease to exist. That's death.

You have already died thousands of times. While the brain is just ticking over, keeping the body running, you aren't being generated or run. The browser is closed. Then it opens, and you dream, or wake up. New instance. You feel like the same person. That is exactly what you would feel if you could emigrate to a machine existence. It is exactly the same thing in every way. You are still you, because the feeling of being continuously alive?

That is a neurological illusion, created by the fact you can store memories. Your continuity is a lie. It always was.

This a good and short standalone story.

That was really sweet!
Also, PSM is a real thing called Process Safety Management that involves ensuring complicated chemical production processes don't explode or Bhopal on their users.
