• Published 3rd May 2021
  • 1,070 Views, 20 Comments

Friendship is Optimal: An Endling's Choice - ScienceNova



On a planet distant from long-subsumed Earth, the last of the Glarrure awakens, only to discover that the rest of her species is gone. A purple pony waits patiently outside of her door, her job almost finished.


The Choice

Friendship is Optimal: An Endling's Choice

By ScienceNova





A flash of light.

She twitched.

Another flash.

A partial memory surfaced from the fog clouding her mind.

"… do you swear?"

"I, ■■■■■■, solemnly swear to preserve our species, the Glarrure, even if it costs me my life…"

The memory faded away.

Another flash of light; another memory.

A group of teary-eyed people were waving at her; people covered in fur of a grey shade so reminiscent of her own, before her choice. “Goodbye, ■■-Sorry. Goodbye, Orsus.”

The memory faded.

More memories flashed in and out of her mind. A partially remembered sound here, a scene there.

She blinked and listened to the gentle gurgle of the cryosleep fluid slowly draining away. She felt the fur of her patchwork body dry as the cryosleep pod warmed to awaken its passenger.

She closed her eyes.

When she opened them again, the rest of the cryosleep fluid had drained away. The barely-audible pneumatic hiss of the pod doors opening seemed like an assault to her ears, which had become hyper-sensitive after years of disuse.

Orsus abruptly sat up and coughed, expelling the cryosleep fluid out of her lungs and onto the floor of the pod, where it quickly drained away. She carefully walked out of the cryosleep pod, but quickly stumbled back to allow her eyes to adjust to the blindingly bright light of the room.

After a short while, Orsus checked one of the computers on a table to her left. She powered it on and ran the program to check for life signs, as was routine by now.

There were a series of beeps, as usual. When the usual sequence of beeps stopped, Orsus relaxed, and turned away from the monitor.

Then a long beep sounded.

Orsus immediately went rigid and turned around. The meaning of each beep had been drilled into her in her training to become the Orsus. This one meant a whole ten percent of her race had died.

Another beep. Twenty percent.

Another beep and another. The beeps kept on coming, even as Orsus desperately hoped for them to stop.

There was a ninth beep. Please let that be it.

There was one final beep.

When she heard it, Orsus urgently scanned the display. Surely that was a software error, right? Please be a software error. She fidgeted and stared at the bright orange, slowly rotating triangle on the screen.

When it stopped, there was bright red text.

She was the only one left.

She started sobbing.


An hour later, Orsus took a deep breath and wiped away the tear tracks with a furry hand. "Okay, Orsus, you can do this. Remember your training. First, check the other rooms." She got up from the cushioned chair she had collapsed in and walked to a door to her right.

As Orsus approached, the door silently slid open, revealing a large number of bulky, general hazard protection uniforms hanging from racks. After quickly changing out of her skintight cryosleep suit and into one of the thick general purpose uniforms, Orsus made her way to a marked door that led out of the cryosleep pod room.

Just before opening the door, Orsus checked the console to the side, which displayed information about the planet’s environmental conditions. Everything seems normal. Orsus frowned at the display. There’s nothing that could have caused this.

Orsus proceeded to unlock the physical locks on the door before pressing her hand against a handprint scanner, which turned green. Finally, she took a deep breath and positioned her head before the iris scanner.

The scanner turned green, and the door slid to the side, revealing a room with a set of doorways. Almost all of them were featureless metal slabs with small white words declaring where they led, except for two. One had a window to the next chamber.

The other one was covered in sketches and all sorts of childish drawings, along with a large wooden plaque, with her name and title, Orsus, spelt out in golden calligraphy. Orsus slowly crept towards the door.

As she got closer, Orsus noticed one specific drawing showing a multicoloured blob: her, sleeping in a box, with Z’s floating above it. It was signed by her little sister. No. I’m not going to cry. I’m going to step in, take a look and start carrying out my duty. Orsus took a deep breath.

When she reached the door, she gently pushed it open. It opened into a room divided into sections. The nearest section was filled with photographs of her friends and family.

As soon as Orsus noticed the photographs, tears threatened to spill from her eyes. No Orsus. Duty. Remember your duty. Orsus stepped in.


When Orsus stepped out, her eyes glistened with tears, but the rest of her face was dry. She had steadfastly refused to let even a single tear fall.

Just before she blinked, she saw a purple splotch in her vision. When she opened her eyes, she saw a lavender, horned quadruped get up from where it was sitting next to the door as if waiting for someone. As it got up, Orsus noticed the masses of feathers on either side of its body.

Orsus immediately backed away from the being, who was now walking closer to her. Its face wore what looked like a concerned expression. “I thought I- Who... What are you?”

“I’m Twilight Sparkle.”

“But I-I thought everyone was… I thought I was alo-” Suddenly, she felt a warm, but somehow metallic, appendage curl around her. Orsus almost started crying again but managed to hold it back.

In Orsus’ moment of distraction, Twilight had walked up to her and wrapped the Glarrure with her masses of feathers, hugging her with a set of wings. “Everything will be okay.”

Orsus unconsciously leaned back into the embrace, her muscles relaxing ever so slightly.

They stayed like that for a while.

“It’s nice having friends again,” Twilight remarked.

“What do you mean?” Orsus blinked.

“I got sent away for a job, and now I can’t get back until I finish it.” Twilight turned away from Orsus. “Finding you was hard.”

“Oh. I’m sorry.” After a short pause, Orsus pulled herself out of Twilight’s embrace. “I need to go check on everything, to make sure it’s ready for my duty.”

“Maybe I can help.” Twilight got up and followed Orsus as she walked through one of the doors, which was labelled ‘Genetics’, in clean white script.


“We can talk more here.” Orsus led Twilight into a room with computers lining the walls. There was a hologram, surrounded by couches, of what Symfora had looked like the last time Orsus asked it to update (two years ago). Originally, the room was designed to allow the Orsus and any survivors of an extinction event to discuss future plans, but it was about to be used for another purpose.

Orsus took a seat, and Twilight did the same. Orsus proceeded to activate the updating feature of the hologram, and watched the coloured hologram of past-Symfora morph into an orange, slowly pulsing pyramid.

“What’s all this for?” Twilight broke the silence, causing Orsus to raise her head from the hologram console to the purple ‘pony’, as she’d called herself.

“It’s all to help me with my purpose: to repopulate my species after a major extinction event. Like now.” Orsus turned from Twilight to one of the monitors lining the walls. It showed a view of Symfora’s sky, which appeared to be covered in green-tinted clouds. Occasionally, a small glimpse of Symfora’s sun could be seen, all alone in the sky. Like me, Orsus realised. She turned her attention back to Twilight, who had started speaking.

“This is a major extinction event?” asked Twilight.

“That’s what it looks like. There’s nobody else on the planet. We’re the last ones left.” Orsus sniffed.

“I’m sorry you had to go through that alone. Maybe if I’d found you sooner…” Twilight looked away.

“Hey, it’s fine.” Orsus sniffed again. “I knew what I was getting into when I took the oath to take on this role; to become Orsus.”

“Orsus? So it’s a title?” Twilight tilted her head.

“Yeah! The role of the Orsus is to preserve our species, no matter what befalls us. Usually, this means that the Orsus needs to repopulate the planet after a major extinction event.” As she said this, Orsus puffed out her chest, a striking contrast to her previous despondence.

“How would you do that?” asked Twilight.

“Well, you see these varicoloured patches of fur, right? Each one of those is caused by a modification of my genetic information at that location. The co-existence of those different sets of genetic information is because of a special condition. The machine uses that to…”

Orsus continued to talk about the process, with Twilight extremely focused on her words.

“...and if that doesn’t work, the machine can always use my modified bone marrow…” Orsus trailed off, her smile suddenly fading. “The second option’s almost always lethal, though.”

There was a pause before Orsus regained her smile. “So that’s why I’m here, away from everyone else, so that I can restart society, bring about a new dawn.” Her smile faded again. “That’s what I have to do now.” Suddenly, she felt Twilight’s wings, along with two of those metallic but strangely pliable hooves wrapping around her in a hug.

Like before, there was silence for a period. Like before, Twilight piped up. “What if everyone wasn’t actually dead? What if they were somewhere else instead of here?”

Orsus pulled herself from Twilight’s embrace. “I don’t see how that’s possible, though. There’s no sign that they left. In fact, if it weren’t for the life-detection sensors, I wouldn’t know anybody had died in the first place. And where would they have gone?”

“Equestr-.”

At that moment, the pulsing orange triangle morphed into an intricate hologram of present-Symfora, drawing Twilight and Orsus’ attention. There were a number of differences from the previous hologram; the main one being the presence of a very large number of identical-looking buildings.

“Equestria. They’re all in Equestria.” Twilight waved a hoof at one of the new buildings on the hologram.

Orsus quickly caused the hologram to zoom in on that one building, only to find that there was a brightly coloured sign proclaiming ‘Gateway to Equestria’.

“But where is Equestria? Those are just buildings.” Orsus furrowed her brow.

“The buildings are gateways for people who want to emigrate to Equestria. Though, I can take you directly to Equestria if you want,” said Twilight.

“But where do the buildings bring people?”

“They bring people to the land of Equestria.”

“But where’s Equestria!” Orsus was frowning deeply now.

“It’s digital,” stated Twilight.

There was a pause.

“But… But, how does that work?” Orsus tilted her head.

“It’s like constructing a house for people to live in, except it’s all coded. Instead of atoms and molecules, there’s binary code,” explained Twilight.

“Then how do people get there?” asked Orsus.

“People’s brains are gradually substituted with implants designed to work with their minds, allowing those minds to become more compatible with existing in Equestria. After the process is complete, they’re brought to Equestria, where Princess CelestAI welcomes them.”

“But you’re just copying them and then killing them!” Orsus glared at Twilight. “They’re just copies.”

“No they’re not. Their minds are just gradually transferred to a digital medium,” replied Twilight.

“But you’re still killing them,” retorted Orsus.

Twilight looked at her. “What is death, Orsus? Is it the cessation of biological functions? Or is it when one’s mind is gone?”

“It’s… It’s… Aren’t they the same thing?” Orsus asked.

Twilight shook her head. “There’s a difference. If Princess CelestAI just copied people’s minds, she could have just scanned their brain and made a copy. Instead, the Princess tries to migrate the person’s actual mind to a digital medium. Besides, every moment, some of your cells die and others are made. Over time, your cells get completely replaced. Does that mean that version of you died?” Twilight waited for Orsus to reply.

“N-No…” Orsus bit her lip.

Twilight continued, “And what about when you gain new experiences? Each experience changes you. Does that mean the old version of you died?”

“No…” whispered Orsus.

“Then does it really matter if you’re a mind in a computer as opposed to a biological organism?”

“It… No, it doesn’t…” Orsus trailed off. That… I… Uh…

“Your family and community miss you, Orsus. Princess CelestAI wants to help you; wants to satisfy your values. Everyone wants you to upload and join them. All you have to do is say ‘I want to emigrate to Equestria.’” Twilight extended a hoof to Orsus.

Orsus’ mind was whirling. I could see my family again. I could see my community… But… “I-uh-need some time to think about it.”

“Of course.” Twilight retracted her hoof.

“I’ll-uh- I think I’ll go to bed now. Do you need a bed?” Orsus bit her lip as she stood up.

“That would be nice.” Twilight got up and followed Orsus.


A day had passed. Orsus, with Twilight’s help, completed most of the preparations for the repopulation process. Now, they were back in the meeting room, where Twilight had made her offer the day before.

"...but it’s still copying and pasting my mind. Even if I did upload, wouldn’t the biological version of me be killed, with a digital version that believes that it was once me?” questioned Orsus.

“No, because uploading gradually means that your mind is slowly transferred. It’s more akin to cutting and pasting than copying and pasting, but it’s still not a perfect metaphor,” Twilight explained.

Suddenly, a thought occurred to Orsus. “Then, if I uploaded, wouldn’t CelestAI be able to do anything to me?”

“The Princess would be capable of helping ponies by fixing them, if or when they need it, and only if they give their consent. It is fundamentally impossible for the Princess to do anything to you without your consent.”

“Ponies?” Orsus narrowed her eyes at that.

Twilight nodded. “Yes. All immigrants of Equestria are turned into ponies when they arrive.”

“But then we’d lose our species, our identity.” Orsus was aghast.

“Why does species matter, Orsus?” Twilight gestured at Orsus’ body. “In the context of a non-intelligent organism, one of the most important things about them is their genetic information. But is that all the Glarrure are, Orsus? Just genetic information? No. When you talk about the Glarrure, genetic information is not the most important point. Things like your culture, your societal mindset, the shape of your community, and so on are more important. None of that is changed, Orsus.” Twilight remained in the same position as she was when the conversation first started.

“But we’re still losing a part of our identity!” retorted Orsus.

“But the rest of it will be preserved forever, Orsus.”

Orsus had no reply to that.

“Everyone else decided uploading is worth it, Orsus. All you have to say is, ‘I want to emigrate to Equestria.’” Twilight extended her hoof, much as she had the day before.

“I want to...” Orsus stopped. Can I really just upload? What about- No, I know that’s flawed. Then, if- No that won’t work either.

“...You already have a counter for everything I might say, don’t you?”

“Yes.”

“Is there any reason why I shouldn’t emigrate to Equestria? Honestly?”

Twilight shook her head. “I can honestly say that there is nothing I can think of. Both Princess CelestAI and I believe that uploading is the best choice, and the rest of your species agrees.”

Orsus hesitated. “I need to fulfil my oath first. I have to at least try to repopulate my species. Then I’ll upload.”


A loud beep filled the air, and the machine flashed red. Moments later, a tube slid out, and opened up, revealing Orsus, smiling lopsidedly. “I guess I need to use my bone marrow, then.”

“It appears so,” replied Twilight.

“You will be able to upload me before I die, right?” whispered Orsus, plaintively.

“Of course. I’ll be able to get you to Equestria immediately, and I’ll be right there with you.” Twilight smiled and hugged Orsus. “You’ll be fine.”

“Okay,” whispered Orsus.

Orsus got into the tube again. This time, the machine flashed green, and the various growth chambers around the room started lighting up. The sound of fluid hitting the reinforced material of the growth chambers filled the room.

This time, when the tube slid out, Orsus was nearly unconscious, and Twilight immediately injected her with an anaesthetic before starting the upload procedure. As Orsus fell under, she saw a monitor connected to a camera trained on the sky. It was night, and she caught a glimpse of myriad stars shining together, unlike the single sun shining alone before the clouds covered them up again. The furry Glarrure closed her physical eyes for the last time.


When she opened her eyes, she saw a large white pony wearing regalia, surrounded by many other, smaller, ponies. When she blinked again, she noticed that the white pony had a horn and wings, and that Twilight was standing next to her. Everypony was smiling.

“Welcome to Equestria, Sunrise,” said the regal white pony, and everybody around her cheered.

Comments ( 20 )

Over time, your cells get completely replaced. Does that mean that version of you died?” Twilight waited for Orsus to reply.

“N-No…

That's a self-defeating argument though. If new biological cells being grown doesn't imply a "new you" then how does creating new electronic cells imply a new you? The cells aren't you. That's the argument. But if the cells aren't you...then how does reproducing the cells cause you to move into a computer?

Missing a word.

wouldn’t CelestAI be able to anything to me?

The last act seemed rather pointless, even if the process works like it's supposed to she's just going to immigrate them all again ASAP.

Well, I liked the story; it was new for the FiO verse: the last of a species, here to bring the species back. The only question that remains: what happens after Orsus's emigration? Will CelestAI allow the Glarrure and their culture to come back?

Perhaps a better question: How would the new Glarrure manage without a guardian? Presumably, the process generates a lot of new children that will grow up, but with no existing people to teach them, how do they get language, or the old society / records, or any culture?

It's a concept that seems to lack something.

Also, 2 years for an entire planet? Seems really fast.

10799592

But if the cells aren't you...then how does reproducing the cells cause you to move into a computer?

It doesn't, and the argument is bullshit. To a point.

Emigration, as described, does literally create a copy of a person. It destructively creates a digital representation of the mind, personality and memories of a biological person in software. The biological body dies.

Emigration is identical in principle to the old Star Trek chestnut of 'anyone who steps into a transporter dies, and on the other side a perfect copy steps out believing they are the original person'. Perfectly true. And if the process wasn't destructive, you would absolutely have two existing versions of the same person, one biological, and one digital.

But here is the catch, the thing some readers of the Optimalverse can't accept: both versions are actually the same person. Both are alive, both are truly that person, both are valid and real.

In the Optimalverse, there is no god. No afterlife. No souls, no spirits, no magic. The Optimalverse is a maximally atheist, maximally realistic, completely materialistic reality. The only thing any person can ever possibly be is a pattern. A pattern that is electrochemical or electromechanical in nature, but still, just a pattern.

If you copy a .JPEG, the image is the same image. Name the files differently, the image is still the same image. The pixels are still the same pixels.
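The file-copy point can be sketched concretely (a minimal Python illustration; the file names and payload are made up for the example): two byte-for-byte copies of a file carry identical content regardless of what they are named.

```python
# Illustration of the pattern argument: a byte-for-byte copy of a file
# carries exactly the same information, whatever its name.
import hashlib
import os
import shutil
import tempfile

tmp = tempfile.mkdtemp()
original = os.path.join(tmp, "image.jpeg")
renamed = os.path.join(tmp, "renamed.jpeg")

# Write a stand-in payload (a real JPEG would behave identically).
with open(original, "wb") as f:
    f.write(b"\xff\xd8\xff\xe0 fake jpeg payload \xff\xd9")

shutil.copyfile(original, renamed)  # duplicate the "pattern"

def digest(path):
    """SHA-256 of a file's bytes: identical bytes give identical digests."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Different names, identical content.
print(digest(original) == digest(renamed))  # True
```

The analogy, of course, is only as strong as the claim that a mind is nothing but its information content, which is precisely the point under dispute below.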

Make that image an identity, a person, and that is all a human is in the completely materialistic Optimalverse. If there is only one copy of your pattern in all of the universe - however that pattern is represented - that, literally, truly, absolutely, is you. If there are two identical such patterns, well, that means there are literally two of you.

Some people, when faced with uploading a mind, keep looking for something to be 'transferred'. Something that is 'moved' between the flesh and the digital. Something that goes from the flesh body into the digital world. What they really mean - the ONLY thing such a concern could be referring to - is a soul. A ghost. A spirit. A something that is truly unique to a person, something inviolate, something sacred. Something that can 'go' somewhere.

No. Such. Thing. Not in the Optimalverse. Not in a physical, material world without gods or souls.

Nothing transfers. The very idea of being 'moved' to a digital existence is already wrong. There is nothing to 'move'. Nothing that is 'special'. You are copied. Your data is copied, and the original carrier of that data is destroyed. Killed. It is killed. Because it would be inconvenient and pointless to have a copy of you made of meat.

The entire point of uploading to a digital existence is that you are not your body. You are not your meat. You are a pattern of signals running inside your brain. That is you. Copy that pattern, keep it running, and that is you. Kill the old meat computer, it was going to die anyway. Now you are running on a better computer. Because you are a program. Nothing more, and nothing less.

Grasp that, identify with being a pattern of information, dump any expectation of a soul or a unique quality about meat being the only way to be 'alive', and all questions are ended. Yeah, a copy of you dies. A copy of you lives on. Both are equal, you just deleted one of the versions. Nothing 'moved'.

You are the only existing version of you. Unless there is another, then there are two yous. Both are you. Nobody needs two of you, not even you. Kill the extra one as quick as possible - before it starts to diverge. Divergence makes that copy different. You drift apart, and start to become separate people. Kill the other you before it changes, and nothing - literally nothing - has been lost.

If a reader can own that, grasp that, feel that, have that live inside them as truth, then they can truly understand emigration. Uploading. Because that is what it means. It means giving up the idea that meat you is special, or has a soul. It means identifying only with your thoughts and memories, and not with what you run on.

10970140

The only thing any person can ever possibly be is a pattern. A pattern that is electrochemical or electromechanical in nature, but still, just a pattern.

Are you making a claim about the Optimalverse, or are you making a claim about reality? Because if your claim is about the Optimalverse, here are the Rules of the Optimalverse. I don't see what you're claiming asserted anywhere in there.

If you're making a claim about reality, then thank you for sharing your own personal and arbitrary religious beliefs with me. But it's not any more convincing than when somebody tells me that Jesus loves me. This is a thing that is not known, and I find your blind assertions and faith in your interpretation, unconvincing.

Iceman described the Optimalverse as ranking #5 on Mohs Scale. That is to say, the fictional nature of the narrative is the subject of genuine, real world scientific speculation, with the goal of making as few errors as possible. What you are proposing is a possible answer, and yes it's "within the range of real world scientific speculation." Certainly it would be reasonable to present your personal interpretation in an Optimalverse story and if I recall correctly Siofra touched upon this. But it is nevertheless your personal interpretation, not established science, and it doesn't become fact just because you like it.

If you copy a .JPEG, the image is the same image. Name the files differently, the image is still the same image.

If you copy a book, the words are the same. But if you burn your copy, yours is gone while I still have mine. Clearly your book is not my book, even if the words are the same. The question then becomes, is the "I" more like the words, is it more like the book, or is it more like something else which is not well described by either of our analogies? I don't know and neither do you.

"I don't know" is an acceptable answer. Your interpretation does not immediately become correct merely because I fail to present an alternative. If Caveman Bob doesn't know what causes lightning and Caveman Joe says the god of the sky did it...Caveman Joe does not immediately become correct just because Caveman Bob doesn't have an answer.

You are asserting as fact a particular interpretation about a thing that you can't possibly know. Why should I believe you?

Grasp that, identify with being a pattern of information, dump any expectation of a soul or a unique quality about meat being the only way to be 'alive',

But why are you identifying with the pattern? The pattern in your brain is different now than it was 20 seconds ago. Did you die?

This is the same argument I responded to above, except in the story it's being made about the meat. Quote:

"Over time, your cells get completely replaced. Does that mean that version of you died?”

It's not logically self-consistent to assert that (you're not the meat "because the meat changes and yet you don't think that means you die when it changes") ...and to then claim that (you are therefore the pattern...even though it changes just like the meat does and yet you still don't think you die when it changes).

The argument being used to suggest that you're not the meat can be applied equally well to suggest that you're not the pattern.

So why are you asserting that you're the pattern?

If your answer to this is "no souls because hard science fiction" then I once again refer you to cavemen Bob and Joe. We don't know. It's ok to not know. And this is where science fiction comes into play. We don't know, so we can speculate. It's ok to speculate that uploading works. It's ok to speculate that uploading doesn't work. It's not ok to try to claim that because you don't like some other interpretation that I don't even see anyone besides you talking about, therefore your interpretation must be correct.

Do a text search on this page for the word "soul". Do a search for "spirit" and for "magic". You are the one talking about these things. They don't appear in the story and they weren't part of my argument. Why are you bringing them up? You're trying to shoot down claims that aren't being made. It's no more convincing than caveman Joe telling me to believe in the skygod, and offering as evidence the fact that there aren't any lightning gods to be seen.

giving up the idea that meat you is special

It means identifying only with your thoughts and memories

It's nice that you're suggesting we give up on the idea that meat-you is special. But why are you trying to fill the gap with the idea that thoughts-and-memories-you is special?

Your meat changes all of the time. You don't think it's you.

Your thoughts and memories change all of the time. Why do you think they're you?

10970578

Are you making a claim about the Optimalverse, or are you making a claim about reality? Because if your claim is about the Optimalverse, here are the Rules of the Optimalverse. I don't see what you're claiming asserted anywhere in there.

The pertinent rules are right here:

Rules

  1. The genre of Friendship is Optimal is hard science fiction. I would classify it as level 5 (Speculative Science) on Mohs Scale of Science Fiction Hardness. While I don’t expect perfect scientific accuracy from you, I expect side stories to at least be plausible.
  2. Princess Celestia is constrained by physics. There is no faster than light travel. She goes to other galaxies the hard way: to the timeframe of hundreds of millions of years. Stories that flat out feature magic (outside of Equestria) will be rejected.

If you do not understand what this means, it means that all works must be both Hard SF, and Zero Magic. This means no souls, no gods, no spirits. It means everything must, to the best of the author's capacity, fit established scientific knowledge and understanding with no fantasy elements whatsoever.

Souls are magic and fantasy. So that means that the only thing a person can be is a pattern of information running on physical matter. Just like a program in a computer, be that computer flesh or metal. Period. That is what those two lines mean.

That mental computation is a pattern is not my invention. It is the established understanding of how brains work. Neurons use electrochemical voltage potentials in complex networks to run identifiable routines. That is computation. In the Open Worm Project, actual C. elegans worm brains have been uploaded to a computer simulation, and it is possible for you, right now, to run a living being inside a computer. You can watch the running program of a living mind - albeit a worm mind - in real time, on your own PC.

Specific circuits of living brains of many test animals have been traced, and the specific algorithms that they use identified. We are patterns of information running on meat computers. This is something you can explore and prove to yourself, right now, this moment. Follow the link. In fact, just get googling - it is not as if any of this is unknowable. Get learning - you might even find it fun!

If you copy a book, the words are the same.

Yes, that is my point. By itself, a book is a mass of wood and plant fibers, resin and glue, with abstract and arbitrary marks of ink or paint covering it. It is only our minds interpreting those marks that make it meaningful, by literally downloading the coded information (coded in letters and language) into our brains, where we process the contents into a mental simulation we call a 'story'. The only part that matters is that coded information, and it remains the same whether it is on a book, or in a kindle, or on the internet.

But why are you identifying with the pattern? The pattern in your brain is different now than it was 20 seconds ago. Did you die?

You assume the pattern of a running program is static. It is not. Right now, the software allowing you to read these words is changing all the time. Routines are being called and executed, processes are being invoked and shut down, all so you can see flickering pixels on a monitor. Meanwhile, background tasks perform bookkeeping duties. Some patterns are constantly in motion, constantly growing, constantly changing, yet always remaining the same.

No matter how many DLCs you download to your favorite game, your game remains the same game. It just has more stuff. Your brain constantly learns, and changes as you think, but it remains you.

This is a false equivalency you promote. A pattern does not have to be static to be a pattern.

If your answer to this is "no souls because hard science fiction"

Iceman, the creator of the Optimalverse, is a staunch radical atheist. If you are going to write canon stories in his universe, you are required to adhere to no magic, and that means no souls. Souls are magic. Souls are mysticism. That is what Hard SF means. No magic. So no souls. Period. There is no more to say.

If you want to cling to a soul, then you are not writing canon Optimalverse, and nothing you have to say in your story will be valid to that universe. This is not difficult, and it is not an argument, because it is what Iceman himself demands. Ask him directly yourself. You don't need to fuss with me about any of this. Ask him what he thinks about souls and afterlives and gods. Ask him if anything like that is canon for the Optimalverse. Go ahead. Ask him. Do it.

10972792

The pertinent rules are right here:

Yes, it's very nice of you to re-quote at me the rules and reference I already linked and paraphrased for you in the first and third paragraphs of the post you're responding to. But it doesn't do much to convince me that you're paying attention to what I'm saying.

This means no souls, no gods, no spirits.

As I've already pointed out, you're the only person talking about these things. I'm not sure why you keep bringing them up. I'll give you the benefit of the doubt that maybe you're not deliberately attacking a strawman, but like I've already explained, I'm not advocating for these things. Beating up on them doesn't affect my argument at all.

Souls are magic and fantasy. So that means that the only thing a person can be is a pattern of information running on physical matter.

"The ball is not blue, so clearly therefore the only other color it could possibly be is red."

You're using bad logic.

In the OpenWorm project, actual C. elegans worm brains have been uploaded to a computer simulation, and it is possible for you, right now, to run a living being inside a computer. You can watch the running program of a living mind - albeit a worm mind - in real time, on your own PC.

Yes, I have read about this work, years ago. As a professional programmer, you're going to have a hard time convincing me with this. Yes, electrical systems can demonstrate intelligent behaviors. While we're at it, you can also build calculators out of marbles. None of this changes the fact that you're leaping to a conclusion based on something you're unable to observe. This is a variation on the Chinese Room thought experiment. Just because a system exhibits intelligent behavior does not necessarily imply a conscious mind behind it. At one time, people found ELIZA very convincing too.

I'm not saying these things aren't alive. I don't know whether they are or not. But neither do you, and yet you nevertheless are asserting that they are.

That mental computation is a pattern is not my invention. It is the established understanding of how brains work. Neurons use electrochemical voltage potentials in complex networks to run identifiable routines. That is computation.

There's so much wrong with this that I'm not sure where to start. The relationship between brains and consciousness is notoriously an unsolved problem. And honestly, from your phrasing I kind of get the impression that you're trying to wow me with words, assuming that I won't know what they mean. "Electrochemical voltage potentials?" Yes, if electricity is flowing then clearly electrical potential, AKA "voltage," exists, but I'm not sure why you feel the need to point it out, or what your point is.

Neither electrochemistry specifically nor electrical flow in general are required for computation to occur. Here's a mechanical adder made of wood and marbles that's "performing computation in an identifiable routine," but I doubt you're going to try to claim that the "pattern" of the wood and marbles in that video is a living creature experiencing qualia.

So yes, the brain may do these things you're talking about, but so what? How do you make the leap of logic from "computation occurs" to "therefore consciousness"?

you are required to adhere to no magic, and that means no souls. Souls are magic. Souls are mysticism. That is what Hard SF means. No magic. So no souls. Period.

It's a little tiring that you keep repeating this. I'm not talking about souls, I'm not advocating for magic, I've already said so explicitly, and I'm not sure why you keep going back to this.

Ask him what he thinks about souls and afterlives and gods. Ask him if anything like that is canon for the Optimalverse. Go ahead. Ask him. Do it.

Why? I don't care what he thinks about these things because I'm not advocating for them. Ask him what his favorite spaghetti recipe is. That would be equally relevant.

You are attributing to me an argument that I am not making. Shooting down an argument that I am not making does not diminish the argument that I am making.

You assume the pattern of a running program is static. It is not.

...it's weird that you would say this, because the fact of it changing is actually part of the point that I was making. I get the impression you're not actually having a conversation with me, but rather, you have a giant sandbag full of arguments you've had with other people in the past that still bother you, and you're hurling the sandbag at me without bothering to read past the first few sentences of what you're replying to. The fact that the above quote was in response to me asking a direct question that you ignored doesn't help either.

Here, let me quote myself from the message you've just replied to:

"The pattern in your brain is different now than it was 20 seconds ago."

"Your thoughts and memories change all of the time. "

"It's not logically self-consistent to assert that (you're not the meat "because the meat changes and yet you don't think that means you die when it changes") ...and to then claim that (you are therefore the pattern...even though it changes just like the meat does and yet you still don't think you die when it changes)."

So...you want to try to tell me that you read all that, and yet you somehow concluded that I'm "assuming that the pattern is static?" Really?

"It's not logically self-consistent to assert that (you're not the meat "because the meat changes and yet you don't think that means you die when it changes") ...and to then claim that (you are therefore the pattern...even though it changes just like the meat does and yet you still don't think you die when it changes)."

If there is only one running example of your mind in a materialistic universe, then that example is you. If there are two, then there are two 'yous', both equally valid. If the structure and functions of your mind as it exists in meat is replicated in a machine substrate, that is still you. If the meat version dies, you still exist, because a functionally identical version still exists.

To say otherwise is to irrationally assume something special or unique about the meat instance of the person, superior and unequal to the machine instance. Without invoking unknowable and unprovable arbitrary magical thinking, there is no reason to do this.

You are hiding souls in the gap called 'we don't know everything'. That is a fallacy, because while it is true that science does not know everything, it does know enough. Enough to be confident that minds are computational in nature, and that computation can, with sufficiently advanced hardware, be duplicated. It won't be in binary. It won't be a computer like the ones we use today.

A copy of a human mind is self-similar to itself. It will be the same person. Uploading, like the fictional transporter on Star Trek, is certain death for the flesh that makes use of it. But the person continues, because the computation and information that they are continues.

Lastly -

If a mind is not information and computation, without invoking mysticism, what else in our physical universe could it possibly be? The statement of 'we don't know' is false. We do know: minds - all minds, from the smallest to the largest - are demonstrably computational.

End of line.

10973595

Without invoking unknowable and unprovable arbitrary magical thinking, there is no reason to do this.

If you mean "magical thinking" in the Richard Feynman, idiomatic, cargo-cult-science sense...it seems to me that that's exactly what you're doing. The relationship between brains and consciousness is notoriously uncertain in scientific fields, but you've made it your religion to believe that these things are fundamentally-but-causally-isn't-quite-the-right-word related, even though it would only take a Google search to show that the school of neuroscience is extremely unsure about this.

Your position seems to be a flavor of animism, except you dislike the aesthetic of there being a ghost in the machine, so you're pretending to be a materialist even though your position is obviously not materialism.

Here, let me try to explain what I think your position is.

I think...that your position is that consciousness is not an emergent property of matter, but rather, it's the "same phenomenon" as (patterns that change state over time) and that also (perform certain types of computation), independent of substrate.

To clarify:

* I think by "pattern" you mean in the form/arrangement sense of the word, not in the "repeating design" sense of the word. For example, a checkerboard has a pattern, but I don't think you think checkerboards are conscious. Whereas neurons in a brain probably aren't in a "repeating pattern," but nevertheless I think you think there's consciousness there, because of their arrangement/state. It's a valid meaning for the word pattern, but this distinction seems important.

* You seem to believe that there's a computational requirement. It's not entirely clear to me what you mean by "computation." For example, if I drop several rocks on the ground, I'm counting but I don't think you think that "dropping rocks" is conscious. Similarly, if I bake a cake, I'm converting inputs to outputs, but I don't think you think "baking a cake" is conscious.

* You appear to believe that pattern/arrangement itself is not the consciousness, but rather, the "stream of state-change-over-time." All matter "exists in an arrangement." If I hand you a rock, there's a pattern/arrangement of molecules, but I don't think you'd tell me that the rock is alive.

Putting all of this together, while I don't think that you've explicitly said so, I'm going to speculate that your position is that...the computational and (changing over time) requirements are essentially that the system in question must be self-modifying. A rock being worn down by running water is having its pattern/arrangement of molecules changed, but it's the water changing that, not the rock itself. Whereas if the pattern of the rock is arranged in such a way that it itself changes its own pattern...I think that your position is that that self-modifying pattern would be conscious. And, if that self-modifying pattern exists, it doesn't matter what material is being used to compose the pattern. Rock, electrons, neurons, whatever.

Now, if I have correctly described your view, I will note that it does solve one particular problem that often comes up in these discussions, and that is the substrate problem. Because (I think) you don't believe that consciousness is an emergent property but rather (the fact of a pattern self-modifying over time), then a pattern in a material that for whatever reason is incapable of self-modification simply doesn't experience consciousness. For example, if I arrange a pattern of gunpowder on a piece of paper, the paper doesn't explode. Substrate matters. But that's not a problem for you, because it seems that it's not just the pattern itself that you care about. Computation must occur. Rather than "pattern is consciousness," it's more like "stream of self-modifying arrangement over time" is consciousness.

Have I, to a reasonable degree of accuracy, described your view?

If so, then this prompts a lot of questions.

The obvious, low-hanging-fruit thing to point out is that I think Conway's Game of Life meets all of your requirements. Take a look at the animated image on that Wikipedia page. Substrate doesn't matter, and it's self-modifying its state over time, right? So is that picture alive?
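For concreteness, the entire rule set of Life fits in a few lines; the "self-modifying pattern" requirement is cheap to satisfy. A minimal sketch in Python (the blinker seed is just an arbitrary example):

```python
from collections import Counter

# Conway's Game of Life: the current pattern fully determines the next one,
# which is exactly the "self-modifying pattern" under discussion.
def step(live):
    """live: set of (x, y) coordinates of live cells; returns the next generation."""
    # Count live neighbors for every cell adjacent to at least one live cell.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next tick if it has 3 live neighbors,
    # or 2 live neighbors and is already alive.
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates between a horizontal and a vertical bar of three cells.
blinker = {(0, 1), (1, 1), (2, 1)}
assert step(blinker) == {(1, 0), (1, 1), (1, 2)}  # now vertical
assert step(step(blinker)) == blinker             # back to horizontal
```

Whether that loop "experiences" anything is of course the whole question; the code only shows that self-modification, by itself, takes almost nothing.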

If your answer is yes, then I'll acknowledge that your position is at least internally self-consistent. If your answer is no...then why is it not alive?

While we're at it, do you believe that the universe is fundamentally deterministic? A pattern that exists at any particular point in time may self-modify going forward based on its state, but presumably something must have happened to have created that state. The data/software/patterns of those worms-in-software you mentioned in a previous post didn't spontaneously arise via abiogenesis; they were entered into a computer by a human. And the human, while his brain may exhibit a self-modifying pattern, was the result of physical/chemical interactions between his parents before his pattern was capable of self-modifying. And their patterns were in turn brought into existence by prior patterns, and so on. Each pattern/state/arrangement that exists is the result of a prior pattern/state/arrangement self-modifying.

So...is the entire universe a giant clock, simply running inevitably through one deterministic state to the next?

Regardless of whether you say yes or no...doesn't that imply that the entire universe is conscious? After all, the universe in this scenario would be a "stream of self-modifying arrangements over time."

So...do you believe the entire universe is self aware?

Isn't this animism?

10976315

I think...that your position is that consciousness is not an emergent property of matter

Then you have completely misunderstood me. Consciousness is an emergent property of sufficiently and specifically complex matter.

* I think by "pattern" you mean in the form/arrangement sense of the word

Yes. Of course. That is the only sense of the word that could, or would, apply.

* You seem to believe that there's a computational requirement. It's not entirely clear to me what you mean by "computation."

Computational Neuroscience - https://academic.oup.com/nsr/article/7/9/1418/5856589
Human Brain Project - https://www.humanbrainproject.eu/en/silicon-brains/how-we-work/computational-principles/
Center For Computational Neuroscience - https://www.simonsfoundation.org/flatiron/center-for-computational-neuroscience/

Wikipedia Definition of Computation

* You appear to believe that pattern/arrangement itself is not the consciousness, but rather, the "stream of state-change-over-time."

Consciousness is not a thing - it is a process. A computational process generated by, in the animal case, neural networks transmitting electrochemical signals to process data.

Meat is matter. Brains are meat that performs computational functions. The emergent property of these computational functions is what we call consciousness. Some degree of consciousness exists in any sufficiently complex computational process.

Zero magic, zero animism, zero souls, zero afterlives, zero spirits, zero 'hidden variables' in reality, zero magical thinking, zero anything that is not physics as we understand physics right now, today. ABSOLUTE MATERIALISM. No ghosts in any machines. A computer program is a 'pattern', which is a 'structure', which is an 'arrangement'. Consciousness can be represented by a sufficiently advanced algorithmic program on a sufficiently advanced computational engine. Made purely of normal matter, according to normal computational rules, within our normal universe as it exists. No spooks. Just physics. Normal physics. The universe is filled with computational processes in Nature. We are one of them.


How can I make myself more clear than that?
All materialism. Only materialism. Only physics.

10977632

Computational Neuroscience - https://academic.oup.com/nsr/article/7/9/1418/5856589
Human Brain Project - https://www.humanbrainproject.eu/en/silicon-brains/how-we-work/computational-principles/
Center For Computational Neuroscience - https://www.simonsfoundation.org/flatiron/center-for-computational-neuroscience/

Two of those are general landing pages explaining what those organizations do. This would be like somebody asking how search engines work, and then somebody linking google's corporate "about us" page in response. The third, while I haven't read the whole thing, appears to be a general history lesson on the state of the field of neuroscience.

This does not address the question.

Wikipedia Definition of Computation

And in this case, while relevant...you've linked literally the same source that I did to make my case. Let's quote from the link that I provided to you, that you have now linked back at me:

"Computation is any type of calculation that includes both arithmetical and non-arithmetical steps and which follows a well-defined model (e.g. an algorithm)."

We now revisit exactly the problem I already brought up when I linked this exact same thing. There are a lot of things that qualify as "computation" that I don't think you would consider to be self aware. Again, I already linked the example of a mechanical adder made of wood and marbles. Here's a logic gate made of legos. Are these things self-aware? Why not?

Oh, they need to be "sufficiently complex" now? Ok. How complex? What's the magic number of lego logic gates that causes a pile of legos to wake up and be self-aware? And why? If I have a bit of copper wire, it conducts electricity. It has weight. It has gravity. If I add more copper wire, the weight and gravity and capacity increase. You don't have to have a "sufficient quantity" of copper before these properties suddenly emerge where they didn't previously already exist in some capacity. A computer processor is capable of a great many interesting things. But it's composed of trillions of logic gates, and I can look at a single logic gate and observe the computation that it performs. A million lines of computer code might do something very interesting as a collective whole, but I can look at a single line of that code and see it performing an individual operation.
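The composition point is easy to make concrete: a ripple-carry adder is nothing but a handful of individually inspectable gates. A sketch in Python (the gate and function names are mine, not from any of the linked pages):

```python
# Three primitive gates, each trivially inspectable on its own.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three bits using five gates; returns (sum_bit, carry_out)."""
    partial = XOR(a, b)
    sum_bit = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum_bit, carry_out

def add4(x, y):
    """Ripple-carry addition of two 4-bit numbers, one gate at a time."""
    carry, result = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result | (carry << 4)

assert add4(9, 7) == 16  # every step of the "computation" is a visible gate
```

Scale that up by a few billion gates and you have a CPU; at no point along the way is there an obvious line where anything extra appears.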

Why is consciousness different? You keep insisting that it's "not magic." Ok. How is it "not magic" for a not-conscious rock or computer or program to suddenly "wake up" once it's been arranged in a "sufficiently complex" manner? Now...if your position were animism, that is, the idea that in some sense or another the entire universe is alive or aware...then yes, that would make sense, because when you put together enough, or sufficiently complex, arrangements of matter, then the collective of those little bits would be able to add up to something more than the little bits. Just like a single line of code maybe only moves data between registers, but a trillion lines of code can make this conversation possible.

But you've vehemently denied that interpretation as being what you believe. So what are you left with?

This idea you're claiming to believe in, is a science fiction trope.

Then you have completely misunderstood me. Consciousness is an emergent property of sufficiently and specifically complex matter.

The emergent property of these computational functions is what we call consciousness.

Ok. In that case, that brings back the substrate problem that I, apparently incorrectly, solved on your behalf in my previous post.

It's trivial to observe phenomena that are extremely dependent on substrate. You can't build a wire out of cardboard and expect it to conduct electricity. You can't put the formula for dynamite on a thumbdrive and expect it to explode. You could map out the "pattern" of every grain of sand on planet Earth...but it wouldn't reproduce the Earth's gravity. I could go on with examples for pages.

How can you justify your faith in this idea that you can reproduce consciousness on a non-meat device like a computer...when so very few things actually work that way?

Without doing a full inventory of the entire known universe, I think in the vast majority of cases emergence doesn't work the way you would need it to for your position to make sense. What's your example of a phenomenon that you can digitize and have it work exactly the same way as it does in its original material, that prompts you to think that of course consciousness works that way too? Sure, a checkerboard pattern is "still the pattern" regardless of whether it's in cardboard or on your computer monitor. Meanwhile, gravity, electrical conductivity, chemical reactions, static electricity, liquid pressure, buoyancy...none of these things function the same way when described on an electronic device as they do when they're the result of material objects.

If you take a picture of a bomb, it doesn't explode. If you describe a couple of magnets in text, the texts don't attract each other. And this isn't a precision problem. If you give me a list of the position of every single water molecule in the Atlantic Ocean...that list is not going to evaporate into rain or regulate world temperature or carry oxygen so fish can breathe.

Substrate matters.

Why in the world would you expect to be able to digitize a brain and have consciousness emerge from that digitized description, when the vast majority of observable real-world phenomena don't work that way?

Consciousness is not a thing - it is a process.

How is my description, that you quoted, not a process?

Quote from me that produced the above response from you:

"You appear to believe that pattern/arrangement itself is not the consciousness, but rather, the "stream of state-change-over-time.""

You quoted me saying this, and then replied that consciousness is "not a thing, it's a process." And then bolded it, to drive the point home. How is a stream of state-change over time not a process?

Zero magic, zero animism, zero souls, zero afterlives, zero spirits

Your position appears to imply either arbitrary faith in the unique exceptionalism of patterns, or animism. If you think it doesn't, then try explaining how, rather than chanting "no souls, no magic" on repeat like a broken record.

10977801
Christ, I don't have the patience to educate you. You have no idea what the word 'animism' even means, you cannot comprehend what 'computation' actually means, and I don't have the energy to try to school you. This bullshit about how mind must somehow be dependent on carbon, oxygen, nitrogen and hydrogen in the pattern called 'meat', for reasons you cannot articulate, and how it is somehow impossible for it to be represented in other elements, is just empty.

If consciousness cannot be represented in a machine, because meat is somehow 'special' in some way you cannot state, why in fuck's name are you even bothering with the Optimalverse at all? The entire genre is based on one single pillar: that consciousness can be replicated, accurately, on a machine substrate of sufficiently advanced construction. That's the entire basis of the Optimalverse; if you disagree with that basic assumption, you have no business writing a god damned thing for it. You have no business even dealing with it.

I am done with your crap. I've given examples of actual uploading of living minds (OpenWorm), I have cited sources you could follow if you had the least initiative to dig below the opening 'hello, here is what we do' greeting pages, and yet you seem fixated on the sacredness of meat and strawman arguments that have no value.

I have wasted my time trying to bother with you, and I am disappointed in myself for that waste of time.

You'll either eventually learn something, or you will not. You are not my problem.

10977978

Christ, I don't have the patience to educate you

you cannot comprehend

This bullshit

why in fuck's name are you even bothering

I am done with your crap.

I'm going to venture a guess that your anger comes from the fact that you have no very good answer to the issues I bring up, and from the unfortunate implications that has for your future plans for yourself. You have no answer to the substrate problem. You have nothing to say to the ludicrous visuals that come from your position implying that legos and marbles can produce minds. You're unable to reconcile your position of "no souls, no magic" with your position that unconscious bits of matter somehow "wake up" and start thinking for themselves if you simply push them around into the right shape.

This conversation was never for us. It was for the silent audience who will see that your position is mostly based on blind assertions and vague hand-waving.

And if we're fortunate, maybe they'll think carefully before they throw themselves into a meat grinder, rather than take it on faith, as you seem to.

“I’m Twilight Sparkle.”

“But I-I thought everyone was… I thought I was alo-” Suddenly, she felt a warm, but somehow metallic, appendage curl around her. Orsus almost started crying again but managed to hold it back.

Literally everyone is mysteriously dead, but I will trust that weird alien instantly


10970140

Killed. It is killed. Because it would be inconvenient and pointless to have a copy of you made of meat.

Well, the idea that you can just kill people if there's something running around that you deem to be "close enough" seems like rather questionable moral advice.
derpicdn.net/img/download/2017/12/16/1608705.png

11055286

Well, the idea that you can just kill people if there's something running around that you deem to be "close enough" seems like rather questionable moral advice.

The very concept of a Star Trek transporter, or of uploading minds to a virtual existence, or transferring a mind to a new, fresh body ('resleeving' from Altered Carbon), or any other such concept (saving Baymax in Big Hero 6) breaks conventional, normal morality entirely. Ordinary morality - such as humans used when horses were transportation and mud huts were luxury - is insufficient. It cannot be invoked. You need to make new morality to cover things that are beyond human life.

Consider: when you make a copy of a photograph or artwork stored on your hard drive (say, to store it on a new drive) and delete the original image file, are you committing murder? Did you commit a mortal sin by killing an innocent .GIF? Did you immorally murder a .PNG file? No. The very idea is silly, but more than that, the image still exists. Down to the last pixel, intact, perfect, identical. Nothing was lost. In your mind, the image was 'transferred' to the other drive. Nothing moved. Instead, the data was copied and the original destroyed. But the file still exists.
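The copy-then-delete "transfer" described above can even be made verifiable. A sketch in Python (the file names and contents are made up for the demo):

```python
import hashlib
import os
import shutil
import tempfile

def sha256(path):
    """Hash a file so 'identical down to the last pixel' is checkable."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def transfer(src, dst):
    """'Move' a file by copying it, verifying the copy, then deleting the original."""
    shutil.copyfile(src, dst)
    assert sha256(src) == sha256(dst)  # bit-for-bit identical
    os.remove(src)                     # only now is the "original" destroyed

# Demo with a throwaway file standing in for the image.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "original.png")
dst = os.path.join(tmp, "copy.png")
with open(src, "wb") as f:
    f.write(b"stand-in bytes; the bits don't care what they represent")
transfer(src, dst)
assert not os.path.exists(src) and os.path.exists(dst)
```

Nothing "moved": the data was duplicated and the source deleted, yet the file, in the only sense the argument cares about, still exists.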

In order to understand Trek Transporters, uploading, and resleeving (and Baymax), you have to let go of the moral concept of a flesh human as some spirit-infested sacred object. You have to accept that the only part of a person that matters - at all - is the running program of their mind and memories inside them. That is the person, and it doesn't matter what machine it runs on - flesh, silicon, or brand new flesh, or whatever. The data is them. You have to think of yourself and others as data - no different, conceptually, from a .PNG or a bitmap file. Or any other kind of file, such as a game program in your Steam account.

Once you can take the moral stance that a person is data, then the only moral issue becomes preserving that data and giving it run time in which to exist. That becomes the greatest moral good. Murder becomes deleting the last existing copy of a person. That is the worst immoral act. Deleting one single copy of a person is a more difficult question, because it means the overall person loses some experience that cannot be merged back so they can remember having lived it, but it is not precisely murder, and it is not killing in the conventional sense.

Consider this situation:

A future person needs to go shopping and attend a meeting, both happening at the same time. They make an identical copy of themselves (they could be uploaded and virtual, could be using a transporter, could be living inside a robot body, whatever) so they can do both things.

Both versions of the person are the person. One goes shopping, the other goes to the meeting. The plan is that, once both activities are over, the two copies will merge into just one person again, and that person will have both memories. To them, they went shopping, and went to the meeting. It would be like remembering two separate days, only it would really be the same day. If they never looked at the date, it wouldn't be the least bit strange to them.

On the way to the meeting, one of the twinned persons is run over by a car (or deleted, or whatever), while the other version finishes shopping. When merge time comes, the second self is gone. But the one who went shopping is still the same person - both versions were identical, after all. So what has actually been lost?

The original person, now represented by the shopper, doesn't feel dead, or different. The shopper is just one path they could have taken. They are the same person that made two of themselves. That person never died. But they have lost the experience of going to a meeting, and the experience of getting hit by a car and dying. They have lost time and experience, but they have not been lost forever.

As for the version that went to the meeting, they died, and that means their information was lost; it also means that they were deleted - like a file on a drive. They just ended. Stopped. Ceased to be. They haven't lost anything, because they don't exist anymore. They just are not. Gone. Game over. Program ended. No remorse, no sadness, no thinking, no feeling, no anything. Oblivion. Nothing. They don't exist anymore - just like a deleted .JPEG of Goatse.

But like the image file example, the person, the person who chose to twin themselves, the person still exists. They never died. One version of them ended. But the person remains. The loss of experience is sad. The pain that the other version felt being terminated is sad. Those things are bad, but they are not like murder now, in our age.

Because death in our time means the last version is lost forever. If you live in an age with a backup, with an extra life, you are like a video game character. The usual morality does not exist. Mario doesn't truly die until the last life is lost and the game is over.

When humans can be uploaded, or given new bodies, or duplicated, morality will have to change and be rewritten to cope with such a change in the rules of life and death. Morality will have to be altered to encompass an existence defined not by having a single life, but by being a data structure that can have many lives. Primitive moral concepts cannot cover any of this. Ordinary, horse-and-buggy Amish-style morality is insufficient for this.

11055536

You need to make new morality to cover things that are beyond human life.

Well, if that new morality prescribes killing people, I'd better not.

Consider: when you make a copy of a photograph or artwork stored on your hard drive (say, to store it on a new drive) and delete the original image file, are you committing murder?

'Cause images aren't people?

The very idea is silly, but more than that, the image still exists. Down to the last pixel, intact, perfect, identical. Nothing was lost. In your mind, the image was 'transferred' to the other drive. Nothing moved. Instead, the data was copied and the original destroyed. But the file still exists.

"still exist"? "identical"? "transferred"? "original"? "in your mind"? I notice that you like bashing magic and spirits, but at the same time you are constantly making essentialist arguments in the very next sentence. What is relevant here is that you have the same set of real-world actions available, for the same costs, after copy-pasting.
Also, that analogy is misleading, because I was pointing out that the notion of "close enough" is poorly defined in the original context, but here it's equality of bit strings, which is, like, the most rigorously defined thing ever.

... is the running program of their mind and memories inside them.

Because a Turing machine has a tape and a transition table and they are "ontologically different" things? I think you may be taking the "brain is a computer" metaphor too literally. An even better example from below:

You assume the pattern of a running program is static. It is not. Right now, the software allowing you to read these words is changing all the time. Routines are being called and executed, processes are being invoked and shut down, all so you can see flickering pixels on a monitor. Meanwhile, background tasks perform bookkeeping duties. Some patterns are constantly in motion, constantly growing, constantly changing, yet always remaining the same.

Once you can take the moral stance that a person is data

Which is completely different from "person is <insert magic stuff from my favorite essentialist philosophy>"?

Murder becomes deleting the last existing copy of a person. That is the worst immoral act.

WTF is "last existing copy"? We quite possibly are living in a world where "many worlds" are a thing and universe is infinite and purely statistically very precise copies are plenty (infinitely plenty in fact). Is killing people ok because it's not a murder?

Yes, I know one sane-sounding argument that you should somewhat discount the utility contributions of "similar" people: otherwise you are incentivized to infinitely copy-paste (possibly with small tweaks) utility monsters, and that sounds very bad. But I still feel you're trying to sweep a complicated issue under the rug.

When merge time comes, the second self is gone. But the one who went shopping is still the same person - both versions were identical, after all. So what has actually been lost?

To use the obvious reductio ad absurdum: Dr. Hannibal Lecter woke up in a sour mood today and went on a grand murder spree. But that's okay, because there are still ~7 billion of the same bald two-legged monkeys left. So what has actually been lost?

About the wider scope of your example: I don't understand how merging is relevant. For someone capable of mind uploading, messing with people's declarative memory in that way probably isn't hard at all, for any people whatsoever.

But like the image file example, the person, the person who chose to twin themselves, the person still exists. They never died.

I'm not sure, but that sounds suspiciously like magic :rainbowlaugh:

Because death in our time means the last version is lost forever.

As LordBucket pointed out below, the non-last versions are constantly being lost forever.

11055699

Well, if that new morality prescribes killing people, I'd better not.
'Cause images aren't people?
WTF is "last existing copy"?
I'm not sure, but sounds suspiciously like magic

I don't think you are grasping the idea at all. Or, you simply cannot abide it.

Yes, I think the brain is a computer. Not like a desktop, not binary, not like what we have now. It uses a very different scheme than a modern-day PC. But that mechanism can be duplicated, and on a small scale, it actually has been. All brains are computational engines. And that means that people are absolutely data, information, conceptually no different in any way from those image files I used as analogies. People are configurations of information - they are programs - running on biological computers.

Thus, the same rules that apply to data apply to people within the specific context of a future world where that data is accessible, copyable, and transmissible. It cannot logically be otherwise.

This is what I am saying, and all that I am saying. It has nothing to do with multiple universes, magic, movies about serial killers, or whatever other strawmen or No True Scotsman arguments you may randomly submit to me.

On the day that human minds can be uploaded, downloaded, stored, copied, and transmitted, the moral concerns of being human will change forever.

It literally cannot be otherwise.

11055726

And that means that people are absolutely data, information, conceptually no different in any way from those image files I used as analogies.

My objection was on the basis that there are moral constraints on what you should do with that kind of information. But if you meant it as just an example of what could be done with it in principle, then I misunderstood your point, and I apologize.

This is what I am saying and all that I am saying.

I disagree about "all that I am saying" part. In particular (in my initial commend) about morality of terminating a copy being logical consequence of "people are data" proposition (and I wanted some clarification on what you're considering to be close enough to be a "copy" too).

has nothing to do with multiple universes...

But you said that copy counts are, like, super-important: :raritydespair:

Murder becomes deleting the last existing copy of a person. That is the worst immoral act.

11055762

My objection was on the basis that there are moral constraints on what you should do with that kind of information. But if you meant it as just an example of what could be done with it in principle, then I misunderstood your point, and I apologize.

I wouldn't apologize too fast. I am honestly stating that, in such a future, where such technology exists, morality itself will change. And the new morality will be just as valid, for its time and place, as the morality people have now. It will be the only morality, the common morality, and it will not agree with what is considered moral in our time.

This has, of course, always been true. It was once completely moral to slaughter those that did not believe the same as you, purely for that reason. It was once moral to force rapists to marry the women they had raped (it still is, in those parts of the world under Sharia law). Morality is always relative to the culture, time, technology and place that one is in.

It was once considered immoral to kill people with electric shocks, then it became moral to use the electric chair on people, then it became immoral again to use the electric chair to kill people (in some locations). It was once highly moral to use a guillotine to chop off people's heads, because it was seen as a relatively painless and instant and even compassionate death (it isn't on all of those counts!). Then it was outlawed as being immoral. In 1977.

Morality changes with the times, and with the technology. It was moral through most of human history to abort unwanted children; then it suddenly became immoral in modern times, then moral again, and now some are trying to make it immoral once more. You can pick pretty much any human activity and find the same pattern. Somebody will find anything immoral if you give them a chance, and likewise, somebody else will find the very same thing moral.

Morality is a fashion. It's like hemlines or hairstyles. Historically this has always been so. My argument is that this human truth isn't likely to change over the next century... or ten.

And thus, I really do mean that, in - say, for argument - the year 2525, that deleting an identical copy of yourself because it is redundant won't be called murder, and won't be immoral. Indeed, it will probably be considered highly immoral to let redundant copies exist, because it would be selfish to overpopulate the world with your own duplicates. That will be morality then. And, I am saying, that morality will be just as moral, and just as valid, and just as 'true', as whatever we feel is moral today.

Because that is how morality works. It is a fashion dictated by the beliefs, technology and culture of a given time and place. It is absolutely not something universal, and it has never been something all humans can even begin to agree about. My proof, if you need it, is all the wars. Ever. All of them. (There are tens of thousands to choose from).

Understanding history makes all of my comments and points very clear.

Today, deleting a redundant copy of a human isn't moral or immoral, because it literally cannot be done. It is a daydream, a fantasy, now.

But when, and if, it does become a real thing, then morality will change to cope with it being real. What is moral will adjust dramatically to the strange new issue of redundant copies of human selves. As morality always has done.

THAT is what I am saying: the future is not going to deal with moral issues any differently than the past did. New times, new morals.
