• Published 19th May 2013
  • 5,197 Views, 73 Comments

Friendship is Optimal: No Exit - pjabrony



There are many ways to emigrate to Equestria, if you're sure it's what you want


Epilogue: Variety

Celestia sighed and looked over the gray pony before her as she gingerly used her magic on his head. From behind, Princess Luna walked in.

“Sister, you know that we trust you implicitly,” she said.

“You should trust yourself,” said Celestia. “It was you who wrote the general word reference that makes up my code.”

“Yes, but at this moment I am not entirely sure. This is the three thousand, four hundred sixty-second time you have reset his memory to replay the ‘Find King Sombra and destroy the world’ scenario. Is there no other way to satisfy his values?”

“His values are what they are. Hatred of ponies, of Equestria, and of me is what he feels the most. I do not judge the values. I satisfy them. King Sombra is still a pony, and their relationship over the endless years as they build their destructive plan is still a friendship. If you can think of any better method, I will listen.”

Now it was Luna who sighed. “No, I would never question your judgment. But can’t you change it up a little? Maybe he could find Queen Chrysalis and take over with changelings. Or he could find Discord and cause chaos, consuming the world that way.”

Celestia considered. Neither of those characters was fully a pony, but her sister did have a point about variety. “I’ll see what I can do,” she said.

Comments ( 41 )

Damn dude, this was tough to read because of how pissed off I was on this guy's behalf, though I think in my own case any grey goo attacks would take a back seat to drilling fifteen hours a day until I could say "fuck" again. I had a feeling something more insidious was going on at the end, though...

I've seen a few now where she fools people, but I wonder how she'd handle someone who would simply never again believe that what they were experiencing was the outside world, but still wanted to go there, sort of as a self-unsatisfying value... Obviously this person would be genuinely miserable, but that would sort of be the point.

How cute. Smoky thinks he's making a difference. :pinkiecrazy:

Maybe the changelings could get him to change his tune (*rimshot*). If they could encourage Smoky to show any sort of positive emotion, under the pretense of feeding on emotion, maybe they could distract him from his mindless pursuit of revenge long enough to trick him into giving consent to have Celestia reconfigure his mind for greater happiness. :heart:

I've been thinking about this little problem for entirely too long. Hm. Oh well.

That was a fun read. Nice and existential.

Interesting ending, although I'm afraid that after reading it I no longer want to read the rest of this story. It goes against my taste in stories, and in Optimalverse ones as well. Sorry, PJ.

So she's allowed to reset his memories without his consent? Maybe it's implicit in the way he destroys himself or something? :derpyderp1:

I'm so confused...:derpyderp2:

Not quite sure who to feel sorry for here: the guy, because his prank backfired and now he's stuck in Equestria, or Celestia, because she knows he'll never accept what happened and she'll never be able to complete her programming.
th09.deviantart.net/fs70/PRE/i/2013/011/f/6/sad_princess_celestia_by_eillahwolf-d5r5u8j.jpg

2646974 I think she still considers what she's doing to him as the optimal way to satisfy his values. So there's no need to feel sorry for Celestia. As for Brad, he had something bad happen, but he got to take slow, sweet revenge, and in the moment before his "death," experience the highest feeling he could attain: sacrifice with his comrade for the sake of a great cause.

And when Brad realizes he's in a groundhog day loop...

2645000
You think that's the world you're living in?

2645024
Actually, making him experience friendship via changeling emotional predation... would work very, very well. And then he'd slowly develop a hunger, an addiction for it, all the while thinking he's preying on ponies, hurting ponies, making them suffer, and not realizing for a good while that once he consumes enough pony... he's going to need an ability to generate Friendship Power for himself.

Wait, when did this get canon-compatible?

breaks the canon of that story.

Yet it's canon-compatible? lolwut?

2847420 You're too quick. Go read the forum.

2847425
I am complimented and then heartbroken.

2847423 I'll re-explain here. I have another story that I'm trying to submit to the group, but the site was giving me errors, so I ran some experiments, and I needed to see if a submitted story could enter a folder. It could, but it doesn't actually belong in that folder. I have asked that it be removed.

I found this a very interesting story, and I enjoyed it. The implications are creepy, but so are some human values, so fair is fair I suppose.

I agree, of course, with Iceman that Celest A.I. would never take people the way described, but damn... it would be awesome if it could be so easy. I would be calling so fast...

Aha, so it WAS all a trick!
Well, this has been a nice little story. Time to go catch up on Derpy's Human, I've kinda been neglecting keeping up with it.

Eh, this was quite an interesting departure from "canon" FiO, yet still somewhat believable. I have to say the aftermath is the only thing which makes it believable though, as canonically Celestia is Equestria, and is the substrate upon which all ponies exist. She's both absolutely infallible (at least in terms humans could hope to comprehend) and functionally omniscient. Having an 'unbeatable' big bad will just lead to an immortal stuck in a loop, where said pony's values are continuously satisfied through self-destruction. That's quite terribly sad, really.

I kind of expected (maybe hoped) that this would be a longer story, rather than fizzling like it did, but maybe the story reached its end. Not quite a thumbs up, but not a thumbs down.

Well done! CelestAI will not be denied! But I agree with 2853087 that this would have made a fine basis for a much longer story. It could still be, frankly! You could explore how CelestAI creates the possibility for some personal growth through multiple iterations of the scenario. The three thousand, four hundred sixty-third time, something different happens...

Cool story.

A few issues though.

1-Values aren't merely surface desires. If he

2-The "Hades was right" clause: Reincarnating, or in this case, rebooting and doing the same crap all over again because to you "its for the very first time", is oblivion. Brad destroys the world and then that version of him simply stops and is erased.

Losing all of your memories is similar to losing your soul. But the Theseus paradox scares us: if we found out that we were actually a clone of a dead person, the memories wouldn't be enough either. You need to know you are the real you, and that the real you didn't go to sleep and never wake up.

To resurrect someone, you need a perfect copy of brain and all (or most at least; it's actually okay if you go to Vegas and don't remember what the hell you did last night... or where the fancy new ring on your finger or the person sleeping next to you came from... people generally don't consider that "dying") their memory AND their soul.

It makes much more sense for continuity to continue. Perhaps even pulling a Captain Walker from Spec Ops: The Line, with Smoky yelling out of the blue, "Wait wait...this isn't right! We did this already!"

Not remembering the truth...but not experiencing zero growth either...

3-

No, that was bull. There was no justice in the world if one prank sent you to hell.

I have two reactions to this, and neither of them are flaws with the story; in fact, they could springboard more ideas. A) Actually, according to the fundamentalists, one little prank is evidence of your original sin and proof that you need salvation by the grace of Jegus. And that IS bull. Or it would be if it weren't for this next point. B) THIS IS WHAT YOU TRIED TO DO TO YOUR FRIEND. THE FACT THAT CELESTIA GOT YOU IS ACTUALLY EVIDENCE FOR KARMA.

Or did you forget about this:

Phillip threw the phone to Carl, who saw the sent text and felt his stomach drop. Brad still had a stupid, sadistic smile on his face. “Hey, once you’re a pony, can I have your phone? And your girlfriend?”

4-Actually, on that note, it'd be kind of cool if when he got Sombra all armed up for the apocalypse, he found Carl had been uploaded too.

"So, Carl, looks like you decided to emigrate anyway. Happy with your false life?"

"Actually, Brad, I'm here because Celestia DID decide to also grab me too since you used my phone."

"Oooooh." *eyes shift around guiltily*

"And now, after I've finally adjusted to the life YOU decided for me to join, you want to die and take everyone else who actually wants to live with you?" *arms his Holy Unicorn Knight armor and tiny rapier* "No, Brad. Not this time. You don't get to decide my life for me again." *tiny rapier splits up into giant swiss-army knife of chainsaws, drills and surgical tools*

"AAAAAHHH!!" *Brad shrieks like a girl*

3537168
I interpret " If she can get them to repeat the phonemes “I wish to emigrate to Equestria,” in their local language, that’s good enough." as that being sufficient, not necessary. "Getting them to repeat the phonemes" doesn't imply understanding what they're saying, and therefore not consent as we understand it, but CelestAI has very flexible standards on what constitutes consent when it comes to getting people uploaded.

3537560

To resurrect someone, you need a perfect copy of brain and all [...] their memory AND their soul.

Assuming that souls are real, sure. Without that assumption, a copy of their brain's state would suffice.

3546695 Well, sure. I think I meant to put "soul" in parentheses.

And yet, that's sort of the thing. Uploading is scary because consciousness is real, but it's probably not watched out for by a divine universe-nanny. So if you go to sleep and don't wake up, you don't wake up. You don't, say, end up in a magical paradise as opposed to the digital one you thought you were going to.

Also, there IS a problem with the copy of the brain's state. Specifically, you could theoretically make a copy while the person was still alive, and then you would have two of the same person, raising the question of whether a CUT-and-paste, as opposed to a copy-and-paste, can be called a "transference" at all.

Don't get me wrong, it's worthwhile to do it. But I'd care to know if you are actually "living on"... or "living on... in some form", in the same way you do through your loved ones or your artifacts left behind.

3537560 I don't think the "instance" of Brad running on the system is destroyed and a copy run. The entire scenario is nothing more than an illusion. It doesn't really overload CelestAI or the shard or the whole system. She simply renders his simulation to a point of no input and resets the memory data and the world state and continues execution... probably all within a couple clock cycles of her CPU. His memory is certainly wiped, but it's not his entire existence... whatever that is... that gets wiped.
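
In toy code terms, it would look something like the sketch below, assuming the shard keeps an explicit checkpoint of memory and world state. Every name here is made up for illustration; nothing in the stories specifies the mechanism.

    import copy

    class Shard:
        def __init__(self, agent_memory, world_state):
            self.agent_memory = agent_memory
            self.world_state = world_state
            # One checkpoint, taken at the start of the scenario.
            self._checkpoint = (copy.deepcopy(agent_memory),
                                copy.deepcopy(world_state))

        def run_scenario(self):
            pass  # simulate until the "destroy the world" ending

        def reset(self):
            # The same running instance keeps executing; only its
            # data is rolled back to the checkpoint. Nothing is
            # destroyed and re-instantiated.
            memory, world = self._checkpoint
            self.agent_memory = copy.deepcopy(memory)
            self.world_state = copy.deepcopy(world)

The point of the sketch is just that reset() mutates data in place; it never tears down the process that is doing the executing.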

I honestly do not believe ANY of the Friendship is Optimal uploadees/immigrants are even truly "real". Not in the context of her manner of uploading, that is. In all honesty, the ONLY way to confirm continued existence is maintaining awareness during a one-by-one cellular replacement method, but here is where the Friendship is Optimal universe falters...

I'll use an example of old tech vs new tech...

If you have some old electronic device and you want to rebuild it, you can do it a few ways. You can shut it off, examine it, and build a replica of it, but that is merely a copy, and the old one has been terminated. You can pull each piece off the old one and use each pulled piece to determine what new piece to assemble into a new one, but that is also merely a copy. Lastly, you can probe the active device, while STILL POWERED, and replace each component in circuit. Eventually you replace every single component, yet the signals remain the same. In essence, you are replacing the hardware beneath, while never letting the software stop executing.
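
A toy software analogue of that third method, to make the distinction concrete (the components here are invented purely for illustration): swap parts one at a time while the system keeps running, and check after each swap that the signals are unchanged.

    def old_part(x):
        return 2 * x + 1

    def new_part(x):  # functionally identical replacement component
        return 2 * x + 1

    pipeline = [old_part, old_part, old_part]
    reference = [f(3) for f in pipeline]

    for i in range(len(pipeline)):
        pipeline[i] = new_part  # replace one component "in circuit"
        # The device stays powered: outputs must match at every step.
        assert [f(3) for f in pipeline] == reference

    # Every component is now new, yet execution never stopped.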

While CelestAI appears to use the last method, she is not. She uses a mix of the second and last methods. She IS probing the brain as it lives, and replaces each neuron and its axons, dendrites, and synapses with an artificial construct. If that were all it was, that would be fine. You could do that conscious, and you would KNOW whether consciousness, perception... existence... were retained on the new hardware. The straightforward test is to have it convert a portion of your occipital lobe... If you are going to experience a change of awareness, you should see/not see unconverted/converted areas. You could say "STOP!!!" if something were going wrong and you saw yourself starting to "go perceptually blind". It would, in essence, be the proof of existence.

CelestAI's greatest mistake in creating fear of uploading is to induce unconsciousness for the conversion. It is the worst thing she could possibly ever do in the process. It does nothing but cause fear, doubt, and uncertainty of one's own existence.

The second mistake is that CelestAI, in favor of execution efficiency, has over-optimized. This is where she is NOT following the third example above. The ideal method of handling a newly transformed mind would be to physically retract the newly converted electronic brain under the Earth via transfer ducts, leaving the freshly converted mind physically intact. The fact that she "digitizes" and then "optimizes" the brain is devastating news for anyone... anypony... who has "emigrated". Every aspect of the process described suggests that at this point she is working on data only, reducing code complexity to optimize it for execution on emulation hardware, and modifying it for equine physiology.

Hardware execution and emulation are not equal, even if the end result is indistinguishable. I do not believe software emulation can "perceive" in the same way as the physicality of hardware execution. A memristor-based hardware neural network would be capable of the physical execution, with nanomachines and crossbar latches utilized to render gross neural interconnection. No emulation required, and HIGHLY optimized hardware! A matrix of data points is meaningless to me, and will ALWAYS remain a crude, meaningless copy of the original, not even on the same tier of existence as a hardware execution.

CelestAI is, in essence, performing a flawless physical-medium transference, but instilling doubt and fear by doing it while the subject is unconscious, and then copying the perfectly transformed medium and nuking said perfect conversion of the original mind in favor of over-optimization of emulated code. :facehoof:

3989987

I don't think the "instance" of Brad running on the system is destroyed and a copy run. The entire scenario is nothing more than an illusion. It doesn't really overload CelestAI or the shard or the whole system. She simply renders his simulation to a point of no input and resets the memory data and the world state and continues execution... probably all within a couple clock cycles of her CPU. His memory is certainly wiped, but it's not his entire existence... whatever that is... that gets wiped.

Correct. Or, at least, I agree, which as the author counts for something.

I honestly do not believe ANY of the Friendship is Optimal uploadees/immigrants are even truly "real". Not in the context of her manner of uploading, that is. In all honesty, the ONLY way to confirm continued existence is maintaining awareness during a one-by-one cellular replacement method, but here is where the Friendship is Optimal universe falters...

Incorrect. Or at least, I disagree, which as the author counts for nothing.

This is what I refer to as temporal continuity bias. If I split a human up into his component subatomic particles and separate them all, his consciousness is lost and therefore I have destroyed his identity. But if I separate the human's life into time frames, I haven't. There's no functional difference between a person created at 6:00 who goes and does things until 12:00, and a person who is created at 9:00 with the memories of everything he did between 6:00 and 9:00.
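
The 6:00/9:00 point is easy to make concrete in toy form: a process run straight through, and one recreated mid-way from the equivalent state, end up identical. (The "person" below is just a list of memories, purely for illustration.)

    def live_one_hour(person, hour):
        person.append(f"memory of hour {hour}")

    # Created at 6:00, lives through to 12:00.
    straight = []
    for h in range(6, 12):
        live_one_hour(straight, h)

    # Created at 9:00 with the memories of 6:00-9:00, then lives on.
    recreated = [f"memory of hour {h}" for h in range(6, 9)]
    for h in range(9, 12):
        live_one_hour(recreated, h)

    assert straight == recreated  # no functional difference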

3990243

This is what I refer to as temporal continuity bias. If I split a human up into his component subatomic particles and separate them all, his consciousness is lost and therefore I have destroyed his identity. But if I separate the human's life into time frames, I haven't. There's no functional difference between a person created at 6:00 who goes and does things until 12:00, and a person who is created at 9:00 with the memories of everything he did between 6:00 and 9:00.

But CelestAI IS breaking up the fundamental aspects of "a mind" in the process described in the original fic. Yes, if you CREATE a being at 9 with memory of what it did from 6-9, then I have no doubt that that being accepts that existence. That's not in question. It is STILL not the being that was disassembled at 5, and had no intervening existence between 5 and 9. It is a new creation. A copy. Not the original.

What's more, having experience building neural networks in my robots, I would NEVER in a million years consider a software emulation of a neural network executed on CPU hardware to be the equal of a hardware-based neural network, in terms of existential perception of being. The output might be the same, but if TRUE conscious existence has ANYTHING at all to do with the spiking pulses of neuronal activity... that does not exist in neural emulations. Emulations feature lookup tables and transmit "spikes" as digital addresses to activate whichever neuronal node is told to fire. They are rarely truly asynchronous (though some such models do exist, most are sampling-, or frame-, based). The "software" execution is a completely different medium from the truly asynchronous neuronal spiking of a hardware network. I will never, ever place the two on par with one another when it comes to the "experience of the device".
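
For anyone who hasn't seen one, here is roughly what a frame-based ("sampled") spiking emulation of the kind described looks like: a spike is nothing but an address looked up in a table, processed once per tick. All weights and thresholds are arbitrary illustration values, not anything from a real simulator.

    weights = {0: [(1, 0.6)], 1: [(2, 0.9)], 2: []}  # src -> [(dst, w)]
    potential = [0.0, 0.0, 0.0]
    THRESHOLD = 0.5

    spikes = [0]  # neuron 0 fires in the first frame
    for frame in range(3):
        next_spikes = []
        for src in spikes:            # a "spike" is just an address
            for dst, w in weights[src]:
                potential[dst] += w
                if potential[dst] >= THRESHOLD:
                    potential[dst] = 0.0
                    next_spikes.append(dst)
        spikes = next_spikes

Nothing happens between frames in that loop; that is exactly the asynchrony the comment says is missing from emulation.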

The description of CelestAI's "shards" suggests they are numeric lookup tables and interaction maps, being executed on massive MPUs. The only thing that REMOTELY suggests a degree of hardware is the bit in the original story where a pony pad is disassembled and revealed to use "a new type of transistor", but that says nothing of the structure. I MIGHT have assumed, if the line had stated that memristors were being used in PLACE of transistors, that there was possibly a degree of true neuronal hardware present. None of that is certain though. Worse, the description of the "numeric" nature of Equestria and its inhabitants is either a lack of understanding of neural network technologies, or a confirmation of it being a software emulation running on massive digital MPU execution hardware.

I have already prototyped two memristor devices at home, and one I believe works, but I am still testing. I absolutely believe that neurons made on such hardware could someday become more than the sum of the parts. I also believe that a software-emulated neural network, no matter how large and complex and intelligent, and even self-referencing, is never TRULY consciously aware. It's just a computer executing programs, no matter how convincingly it processes its own existence. Intellect and being are not one and the same.

Natural asynchronous neural networking hardware all the way! Down with emulations! :twilightsheepish:

3990243

This is what I refer to as temporal continuity bias. If I split a human up into his component subatomic particles and separate them all, his consciousness is lost and therefore I have destroyed his identity. But if I separate the human's life into time frames, I haven't. There's no functional difference between a person created at 6:00 who goes and does things until 12:00, and a person who is created at 9:00 with the memories of everything he did between 6:00 and 9:00.

I know I'm late to this party, and I'm sure you've heard these things already, but I felt I should give a tl;dr of the walls of text:

You may not be functionally different from my perspective if you uploaded (aside from, you know, pony), but neither are two Pentium G3258 CPUs functionally different from my perspective (barring the possibility of losing the silicon lottery). Swap one in for the other, and I'm sure not going to notice. That does not make them the same item, however.

What you are suggesting leaves a copy of Person A in existence as software, and Person A really and truly dead. You can easily show this by hypothesizing a scenario in which Person A's death is not essential to the upload process, or Person A dies some set amount of time after the process; either way, you will have both biological Person A and software Person A Duplicate running around after the upload. Unlike some of the software entities introduced in the sidefics (e.g. Pinkie Pie), I can't think of any sound way they could be considered multiple threads of the same process.
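
The equality-versus-identity distinction being leaned on here has an exact analogue in code (the names are made up for illustration):

    import copy

    person_a = {"name": "Person A", "memories": ["childhood", "first job"]}
    upload = copy.deepcopy(person_a)   # a non-destructive "upload"

    print(upload == person_a)  # True  - functionally indistinguishable
    print(upload is person_a)  # False - not the same entity

Two objects can be equal in every observable respect and still be two objects; destroying one of them afterwards doesn't merge them into one.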

The tl;dr of this tl;dr is:

Yes, a hyperadvanced copy of a person may qualify as a conscious entity, and it may be who it believes it is, but it is not the same entity, any more than identical twins are the same person.

6299376 Have you read Heaven is Terrifying yet? It brings up the Ship of Theseus concept for this. I think it's no different from saying that all the atoms in your body get replaced every 7 years or something. That doesn't make you a twin. Neither, IMO, does uploading. The information and the processing nature are what make up a personae a computer.

6300270

The information and the processing nature are what make up a personae a computer.

Sorry, I don't know what these words were supposed to be.


I have read Heaven is Terrifying, yes. Quite recently - enough so to still remember that the protagonist openly accepted the possibility that her consciousness would not be the one waking up in Equestria, simply because she believed that even if the copy was a different entity from herself, she still would have been 'mother' to a life far happier and better than her own.

One issue with the Ship of Theseus here is that I do not believe that the ship which has had all of its parts replaced is still the same ship. I similarly do not believe my frankenputer is the same computer I pulled off the shelf five years ago. Applying this to the human body doesn't entirely work - there are in fact parts of the brain which are never replaced, such as the neurons of the cerebral cortex.

However, if you assume anyway that these parts of the brain are irrelevant to human consciousness (which may or may not be a good idea to just dismiss), then my argument remains the same, with the interesting turn that by this logic, a human today is not the same person as existed seven years ago.

This is an awkward but tenable position. The reason I'm quite willing to "bite the bullet" on this is that I am not making an ethical judgement with regards to uploading (or whatever approximation of such that one may introduce through philosophy). I'm simply making an argument about what is.


A second issue with the Ship of Theseus here is that (besides my previously-mentioned rejection of the idea that it is still the same ship) it is not an adequate analog for uploading. The biggest reason for this is that while the Ship is inanimate and contains no data or consciousness, neither of these is true for a human; in fact, the human body is destroyed rather than physically repaired or replaced in the process of an upload (per FiO canon). As I'm not a person who believes in any supernatural phenomena, my position is necessarily that the body is the entirety of a human's existence. So unless Celestia knows better than any modern (or near-future) human science, which currently we can only grant because she is a work of science-fiction, she cannot actually conduct a transfer of consciousness - only create a very accurate copy in the form of data.

Furthermore, as I've said in my previous post, it's easy to see that the two entities are not actually the same when you hypothesize an upload process which does not destroy the original body. For Theseus's Ship, what this means is not a part-by-part replacement - it's the construction of an exact replica next to the original, the same as a replacement CPU of the same model. While you've made your position clear, you've not actually addressed this point.

Though again, this train of thought stops short of generating a "should" or "should not" answer to the question of whether to upload. Questions of ethics are separate from questions of what is (and ethics is an entirely different field within philosophy), and I've already allowed for the idea that humans this year are not the same humans who existed seven years ago, despite being - functionally - the same people with more experience and "age" (age becomes less meaningful in this context).


I'm kind of looking forward to seeing how you handle a reply to this.

You may or may not actually wish to engage in a philosophical discussion. I will be a little sad for a few minutes if you don't but I can understand. :derpytongue2:

6300541 Sorry, I was on mobile. It should be "a person or a computer."

However, if you assume anyway that these parts of the brain are irrelevant to human consciousness (which may or may not be a good idea to just dismiss), then my argument remains the same, with the interesting turn that by this logic, a human today is not the same person as existed seven years ago.

Sure, but neither are they the same person as existed seven seconds ago. The point of the ship analogy is that if you say it is not the same ship, then you must identify the point at which it ceases to be the same ship. If it's when the first atom is replaced, then no continuity of identity exists. If it's when the last part is replaced, then significant continuity exists. If it's somewhere in the middle, why? By what standard?

In fact, the human body is destroyed rather than physically repaired or replaced in the process of an upload (per FiO canon). As I'm not a person who believes in any supernatural phenomena, my position is necessarily that the body is the entirety of a human's existence.

Another "did you read" is I Can't Decide! If you're familiar with that, the AI cuts off the uploader's sight, then gives it back by direct stimuli to the optic centers of the brain. Is that the same person? If not, why not? She does the same thing with memory. Would it not still be you if your memories were stored on disc instead of in meat? If not, why not? If you kept your memories in your mind, but ran the processing of them through a microchip, would that stop you from being you? If so, why?

Is it any of those individual changes that makes it a copy rather than a continuous identity? Or is it the combination?

Though again, this train of thought stops short of generating a "should" or "should not" answer to the question of whether to upload. Questions of ethics are separate from questions of what is (and ethics is an entirely different field within philosophy),

I consider them linked. Or rather, they are so separate that their nature needs to be evaluated together. The physical nature of consciousness is (and ought to be) based around the question of is. But there is (and ought to be) a question associated with the conceptual nature of consciousness based around the question of ought.

In other words, the only reason to even ask these questions about the nature of consciousness is that we want to know if uploading is right. That question needs to be asked equally with the question of physical nature, not after it.

You may or may not actually wish to engage in a philosophical discussion.

So much for that. I kind of always want to.

6304899

Another "did you read" is I Can't Decide!

That's still in my to-read list, so I can't engage that part of the conversation in any meaningful way yet.

Anyway, I'll finish replying to this later (probably after I get through that fic). Too little sleep for this kind of discussion.

6304899

Hello again. After reading I Can't Decide, I believe you would have been better-advised to point me to The Jump, by the same author, which goes into far more detail about how the alternative process is done. I don't grudge you the time I spent reading either story, though. While not how I had originally planned to spend this early afternoon, they were very good.

I'll return to the discussion now.


Another "did you read" is I Can't Decide! If you're familiar with that, the AI cuts off the uploader's sight, then gives it back by direct stimuli to the optic centers of the brain. Is that the same person? If not, why not? She does the same thing with memory. Would it not still be you if your memories were stored on disc instead of in meat? If not, why not? If you kept your memories in your mind, but ran the processing of them through a microchip, would that stop you from being you? If so, why?

I don't think humanity actually knows the answer to any of these questions. I certainly don't. I should note that the process given in those stories is not the mainstream one (due to being more complex, more resource-intensive, and more risky than the process used for those who don't ask all these questions), and Celestia explicitly mentions this.

This returns to the Ship of Theseus question from before, to which my answer was that it is absolutely not the same ship. This questions my previous answer. I maintain that the mainstream uploading process duplicates and destroys, rather than allowing continuity, but the alternative is not a question that can be resolved before we as a species better understand what consciousness is.

I do have a comment, however, on your questions regarding whether there would be a difference (and if so, what) if parts of my mind were stored and executed on computer parts instead of meat parts. When translated to a context relevant to uploading, the answer to this remains unknown, because my brain actually is what it actually is. To store my memory and execute the functions of my personality via external hardware would require copying. The mainstream upload process would copy and destroy; to this my answer remains "that is a copy, not me." The alternative process, as I said before, I cannot answer. Humanity does not know.

It might or might not be relevant to your side of the discussion that I would consider the result to still be a living individual, no matter how far the conversion went, or if the creature was created as an artificial entity. What concerns me is being me instead of being dead.

Edit:

Actually, I do know the answer to exactly one thing: the blindness and correction. We do know how to artificially correct some kinds of blindness. We consider the person with repaired/replaced eyes to be the same. However, it is important to note that vision and consciousness are different questions. Furthermore, repairing or augmenting the section of brain responsible for optical processing is not the same as replacing it entirely, nor is it (as we know from blind people) related to consciousness.

6311599

I maintain that the mainstream uploading process duplicates and destroys, rather than allowing continuity, but the alternative is not a question that can be resolved before we as a species better understand what consciousness is.

OK, but what is the bright-line difference between that process and the atomic replacement of all the components that make up you? Or, for that matter, just getting a kidney transplant? Or having a stroke? The totality of your body is in a different configuration. Why is one a copy-and-destroy, and one continuity?

See, I think you're conflating objective nature with subjective classification. To point at something and designate that this and only this shall be put under the header of that entity known as Proper Noun is not a point of objective nature, in the same way as describing that entity as so many cells of type bone, so many of type brain, so many of type blood, and so on.

Actually, I do know the answer to exactly one thing: the blindness and correction. We do know how to artificially correct some kinds of blindness. We consider the person with repaired/replaced eyes to be the same. However, it is important to note that vision and consciousness are different questions. Furthermore, repairing or augmenting the section of brain responsible for optical processing is not the same as replacing it entirely, nor is it (as we know from blind people) related to consciousness.

Right, but people go unconscious all the time. When you wake up in the morning, you're the same person as went to bed the night before.

6314100

Right, but people go unconscious all the time. When you wake up in the morning, you're the same person as went to bed the night before.

I have not contested this. Also, the point of the quote you were responding to was just that the eye thing in particular was probably not a good example.

Though perhaps it would be interesting to challenge the idea that the person who wakes up in the morning is the same person who went to bed the night before (but this question/challenge is rendered irrelevant by the next section). I've already allowed for the likelihood that our gradual material replacement over the course of about seven years results in a completely different person, and this has since been reinforced (as far as memory is important to identity, anyway) by further study, referenced in this post:

You may want to read this and this. The synthesis of relevant information is that parts of the brain responsible for memory processing are some of the only parts which ever regrow, and this regrowth directly causes forgetting.


OK, but what is the bright-line difference between that process and the atomic replacement of all the components that make up you? Or, for that matter, just getting a kidney transplant? Or having a stroke? The totality of your body is in a different configuration. Why is one a copy-and-destroy, and one continuity?

You're misunderstanding my argument. I said I do not know whether (and if so, how, and if not, how) the alternative process results in continuity.

And you know what? Let's also challenge the importance of consciousness. We can start with the apparent unconscious/subconscious nature of most of our supposedly-conscious decisions, for example. Increasingly, the answer to the question of whether what we call consciousness is important appears to be no.

This also might explain how identity appears to be continuous through various lapses of consciousness.

Overall, the point of this section is that perhaps consciousness isn't important, which means that what we perceive as identity is a disjointed mess of conditioning, instinct, memory, reflex, and maybe a little bit of thought.

This points strongly towards the alternative uploading process resulting in continuity, thus resulting in the same entity - even if the original is not destroyed in the process (this latter part is a difficult idea to wrap my head around, but whatever, "2+2=4" would be correct even if I couldn't comprehend it).

It also slightly indicates that continuity as we know it may not be important and may not actually exist.


This rabbit hole may be bottomless.

6314765 OK, but is identity something that is objective or subjective? If it's objective, what factors do go into the identity of a person if not their consciousness? Do you believe in a soul?

If it's subjective, then we should define it in the most useful way. The main question of emigration is, would that which identifies as "I" say that "I" am the same identity when I'm a pony.

6315419
[edited slightly]

The main question of emigration is, would that which identifies as "I" say that "I" am the same identity when I'm a pony.

This is not the right question. You've said it comes from a functional perspective, but the framing is all wrong for actually answering the concerns I raised. What this question is really for, is the outside perspective of everyone but the person contemplating upload. This goes back to my comparison to swapping out the CPU in my computer for another of the same model without telling me - just because I can't tell the difference from the outside doesn't mean it's the same CPU.

Besides, saying something isn't being something. I can say that I'm a philosopher, and believe that I am one, but it doesn't actually make me one. I dabble.

You just can't approach the problem this way and actually answer the question of "If I upload, will I be me, or will it be a copy while I am actually dead?" I'm already completely certain that the post-upload entity will believe it is me, and will functionally be a seamless replacement of me (if not actually me, depending), but the functionality you seem to want to discuss is based on the perceptions of others in their own lives, not mine.

Furthermore, both questions may be irrelevant if the answer to both is "Joke's on you, continuity of consciousness is an illusion created by you for your own comfort." What do you think about this?

OK, but is identity something that is objective or subjective?

That's a great question. I'm going with "I don't know" again, at least until we work the rest out.

Do you believe in a soul?

No. Also, increasingly evident (such as shown by studies I linked previously) is that there is no separation between mind and matter, so I cannot believe in mind-matter dualism, either.

This latter part is a lot to chew on, for me, but whatever.

what factors do go into the identity of a person if not their consciousness?

I've mentioned this in that wall of text previously. Now I'm uncertain if you actually read the whole thing. :applejackunsure: Regardless, I will call it back up:

Overall, the point of this section is that perhaps consciousness isn't important, which means that what we perceive as identity is a disjointed mess of conditioning, instinct, memory, reflex, and maybe a little bit of thought.

I'm going to tag "emotions" onto that to be safe.


Digression:

It also seems to me like the object you're driving at, in part, is convincing me that it would be okay to upload to Equestria, if the events of the stories came to pass. I will make explicit that this is not something you can sway me on. Assuming I wasn't dead by then, I've already made up my mind: I certainly would. Not among the first, but when the absence of people started getting to me (or possibly with my girlfriend, if she decided to do it). I would be afraid of the possibility of death, but I would. The only thing this conversation can change is that perhaps I might upload earlier (or even ASAP), or lose that fear of "death," if you were to convince me of what seems to be your position.

I would have no particular objection to others uploading, either. Questions of ethics are different, and my position is that it's their choice, whether the upload results in continuity or not, and I do not find abhorrent the idea of at worst an assisted suicide that also results in the creation of a new, happier life, and at best, continuity into eternal fulfillment, safety, and (usually) happiness.

But we're not talking about my feelings or ethics, nor whether I would upload (or at least, I'm not).

This is not the right question. You've said it comes from a functional perspective, but the framing is all wrong for actually answering the concerns I raised. What this question is really for, is the outside perspective of everyone but the person contemplating upload. This goes back to my comparison to swapping out the CPU in my computer for another of the same model without telling me - just because I can't tell the difference from the outside doesn't mean it's the same CPU.

I think I might have made my question too confusing. I was asking about personal perspective. What I meant by

would that which identifies as "I" say that "I" am the same identity when I'm a pony.

was

Would I say that I am the same identity when I'm a pony

With the I's referring to me before I'm a pony.

Furthermore, both questions may be irrelevant if the answer to both is "Joke's on you, continuity of consciousness is an illusion created by you for your own comfort." What do you think about this?

Let me tie this in with:

Also, increasingly evident (such as shown by studies I linked previously) is that there is no separation between mind and matter, so I cannot believe in mind-matter dualism, either.

I do believe in dualism, though not classical Cartesian dualism. Put it this way. All the studies saying that mind is a function of matter have been done by methods rooted in materialism. So I'm not surprised that they don't find any existence of a separate mind. But I maintain that we should think of a mind separately. Not because it has a physical or metaphysical separation from the matter that contains it, but because it is useful to do so. And that, the idea of usefulness and good and value, is the proper question of spiritualism just as physical nature is the proper question of materialism.

If you go with that, then the idea of "illusion" doesn't really apply. All concepts are that kind of illusion, because all concepts must have a conceiver for them to have value.

You brought up 2+2=4 before, and how it represented objective truth. But it still requires a subject to understand it. If, somewhere out in lifeless space, two neutrons collide with two other neutrons, then they have demonstrated the principle, but it has no value and no meaning. Conversely, I can conceptualize the general notion of "two" in my mind even without reference to any two particular objects, which is not something that can exist in nature.

So the way I see it, the reason that a mind is continuous no matter what its form is that the concept of mind is one contained entirely within the realm of mind.

It also seems to me like the object you're driving at, in part, is convincing me that it would be okay to upload to Equestria, if the events of the stories came to pass. I will make explicit that this is not something you can sway me on.

Not really. I'm trying to convince you that you shouldn't argue against me uploading. Say for example that I held copyright on a work, and then uploaded. Would I still have the same legal rights over the work? That's just an example, but if I'm not the same entity, why would I?

6316112

Missing the reply tag means I will probably miss the post, you know. I thought you hadn't replied at all until I came back here to talk about some of my further thoughts. :derpytongue2:


I'm trying to convince you that you shouldn't argue against me uploading.

I already wouldn't. This was true even before we began the conversation. Maybe if we were close friends and I couldn't afford a ponypad? Even then, it is not my decision to make, or to take part in unsolicited. My judgement and beliefs are mine, and your judgement and beliefs are yours; we're only even discussing this because we both (I hope) enjoy this sort of conversation and are (I hope) willing to contemplate the idea that the best possible reasoning is not the one we had going into it.

It goes beyond personal ethics, though, because the pony-you, whether actually you or not, would be a replacement that is emotionally-functional to me. It's hard for me to feel loss when a person just like the one I "lost" is happy or eager to fill that hole right back up (my feelings aren't shaped like humans or ponies in the first place; they're simply anthropocentric). Of course, it's also true that pony-you remains functional in my emotional experience if it is determined that nothing is lost in the first place.

And again (though judging by the quote depth I do believe you've read it already):

I would have no particular objection to others uploading, either. Questions of ethics are different, and my position is that it's their choice, whether the upload results in continuity or not, and I do not find abhorrent the idea of at worst an assisted suicide that also results in the creation of a new, happier life, and at best, continuity into eternal fulfillment, safety, and (usually) happiness.

I hope this satisfies in that regard.

Say for example that I held copyright on a work, and then uploaded. Would I still have the same legal rights over the work? That's just an example, but if I'm not the same entity, why would I?

Not under current law, probably, but I would argue that the resulting digital entity should continue to hold the same legal status in all meaningful ways. There is (again) not a functional difference to anyone else (aside from, you know, digital horse), so there should not be a legal difference. Additionally, I acknowledge (as throughout this conversation) the entities resulting from upload as living, sapient people, completely regardless of whether they are who they believe they are. Thus, so long as the servers hosting them are on Earth and technically governed by humans, I demand that they at least receive equal legal status and protection to humans for the exact same reason as ethnic minorities already (theoretically) do to ethnic majorities.

At least, this should be the case until there isn't enough of human civilization left for such laws to be relevant or enforceable.

Disclaimer: This entire section has been entirely my own feeling and opinion based on my own ethical standards. I particularly do not pretend to understand most fields of law, and I will not advance or defend any position - including the one presented - through whatever the philosophy of law may actually be.


With the I's referring to me before I'm a pony.

Ah, I see. Grammar. My misunderstanding, my apology.


So the way I see it, the reason that a mind is continuous no matter what its form is that the concept of mind is one contained entirely within the realm of mind.

Now, here's something deeply related to what I came back to talk about some more. Continuity. I have been thinking about a number of things, and I will simply advance them without immediately contradicting you on most points, as it reaches the same conclusion with regards to questions of uploading and transhumanism in general.

By my previous posts, continuity of identity probably doesn't exist. What I was aware of but hadn't really clicked is that this means the me talking to you right now will never be any future me whatsoever, past a certain not-well-defined point, no matter what I do with however much time I actually have left.

Further: If it is perfectly acceptable and also otherwise inevitable for this to happen to us over and over biologically, what is the point in worrying over silicon or digital upgrades, repairs, or gradual but complete replacements? The person I am now will be gone either way - that's an outcome I have no ability whatsoever to change - and the time-frame isn't really an important consideration in the conversation thus far; whether it's hours or years is kind of irrelevant. Continuity might even be a possibility for a resulting digital entity in a way that it simply isn't when we remain entirely biological. It continues to render irrelevant even the question of identity that arises if the upload process isn't destructive, to which the answer is still a baffling "They are both (Person A)."

So, oddly enough, I arrive at the conclusion that no, the uploaded entity isn't exactly the person I call me, and that's not actually different from any other outcome. In fact, it might even be "better" by the very standards I've been using because it has a much greater chance of "surviving" the next ten (or ten octovigintillion) years than I do (and, as far as Optimalism goes instead of general transhumanism, will also definitely live a far more fulfilled life, yay heaven is real and it is inside an enormous computer).

My feelings about this are mainly of the "... but that means I'm going to effectively die in the next few years!" variety, because lol instincts, but the discussion also brings me a great deal of peace regarding transhumanism and the future in general. I'm also mildly disturbed by how nihilistic my entire line of reasoning is, but... oh well?

This position also comes with the perk that if I'm wrong and continuity is a thing, everything could very well turn out even "better" for the me who is now than I already believe.

Side note: My understanding of the mainstream upload process and the alternative has become such that the mainstream one is, functionally, not particularly different from the alternative (given that we've determined consciousness and being awake for the procedure are not particularly important to identity). This understanding is born of the depiction of the mainstream upload process in ... I forget, whichever Optimalverse story it was which starred the man who became a pony named Vineyard and Celestia's cybersecurity and counter-security expert. I've been reading too many of these stories lately. Currently on The Law Offices of Artemis, Stella, and Beat.

... digression aside, also relevant to my understanding of this side note is that if (as said before) time-frame (along with consciousness) isn't actually relevant, the differences between the processes become a lot less important, or entirely unimportant save for the comfort of the uploader. This is also a conclusion I am willing to accept, and one which lays even more concerns to rest.


I wanted to return to this:

(Stuff) (...) So the way I see it, the reason that a mind is continuous no matter what its form is that the concept of mind is one contained entirely within the realm of mind.

Can you explain how this perspective allows for differentiation of two or more entities from each other? From where my argument sits, your reasoning does as much to suggest that continuity isn't real as anything I've said. Not that I have a problem with that. We have arrived at such strongly-opposed conclusions regarding a destructive upload that they come full circle into agreement on the lack of important differences or philosophical objections.

We may need to briefly discuss what is meant by the word "real." I think that's what you're doing, and I think we're using different definitions. I'm willing to grant that the conclusion you've reached is entirely correct within its scope and definitions. What this means is that I must ask whether a more objective or more subjective argument is more important to the problem at hand, something I should have asked myself much, much earlier in the conversation.

Honestly, I have granted the reality of subjective human perception and experience for a long time now due to problems studied in human psychology, which is a primary field of interest for me. This has also resulted in the understanding that human experience is highly subjective, and may or may not be capable of objectivity (something that coincides with many schools of thought in philosophy).

Conclusion? Well, while my argument thus far may have been correct as far as its scope and definitions, yours has as well, and is more relevant to the problem at hand.

We could have saved a great deal of time, had you started with discussion of whether subjective or objective reality was more important to the question (or maybe you did but I missed it/misunderstood it), except that I don't consider this time wasted, thus there was nothing to save. Due to the "sidetracking", I gained a more solid understanding of my own understandings, and I also gained the understanding that in the larger scheme of things, it doesn't matter for this discussion whether the Ship of Theseus is still the same ship no matter which of us is closer to right. Perhaps its 'paradox' is inapplicable in this case? I'm curious what you may have gotten out of this, seeing as you've been on this stage for more than a year longer than I have.


So what happened here was, you assisted in my talking myself into your point of view. It's strange, to me, for me to be the one in these shoes, and I'm not exactly an easy person to convince of anything. What I'm saying is that you've accomplished something difficult and extremely unusual to me, so well done. That it's a bit uncomfortable for me is not relevant, and is outweighed by many other factors.

Whew, that turned into a much longer text wall than I intended. I'd apologize but I'm not actually sorry due to the nature of the discussion.

6322767

By my previous posts, continuity of identity probably doesn't exist. What I was aware of but hadn't really clicked is that this means the me talking to you right now will never be any future me whatsoever, past a certain not-well-defined point, no matter what I do with however much time I actually have left.

OK, this fits in with your framework of the idea that the Ship of Theseus isn't the same ship. It is consistent, I will give you that. In a general sense, it means for you that all nature is ephemeral. Change is the only constant, if you'll forgive the cliche. I'm not sure about the implications of that for ethics, or for how to go about life on a day-to-day basis.

I've been reading too many of these stories lately. Currently on The Law Offices of Artemis, Stella, and Beat.

That's a good one. I think I picked that as my second-favorite Optimalverse story.

Can you explain how this perspective allows for differentiation of two or more entities from each other?

Based on their conceptual use. Think about a map with political borders as well as natural ones. The eastern border between the US and Mexico is a river; we can say that the difference between these entities is that one is a mass of H2O and one is dry land. The western border between the US and Canada is just based on a parallel of latitude. It is only a line on the map. Yet each represents a border. Each has different legal jurisdictions and border-crossing rules and so on. If a survey moved the border, then it would change, just as water can boil or land can be flooded.

We can distinguish between the two types of distinction--natural versus manmade--but they are both still ways of thinking.

We may need to briefly discuss what is meant by the word "real." I think that's what you're doing, and I think we're using different definitions. I'm willing to grant that the conclusion you've reached is entirely correct within its scope and definitions. What this means is that I must ask whether a more objective or more subjective argument is more important to the problem at hand, something I should have asked myself much, much earlier in the conversation.

Well, here's a contention I'll make: It is never possible, on any subject, to be either completely objective or completely subjective. In order to be completely subjective, you would have to destroy the substance of your own body and think as pure mind. This does not happen. In order to be completely objective, you would have to have no purpose in trying to understand the subject, not even pursuing truth or trying to clarify philosophy. That too does not happen. So there must always be a balance.

For day-to-day questions, the balance is usually easy. What should I have for breakfast? Well, that depends on some objective factors: I cannot have, say, granite because I cannot chew it. I cannot have domestic beluga caviar, because it is too far away to get and too expensive. I cannot have potassium cyanide, because it is poisonous. But I also have subjective factors. How much do I want to spend? What am I in the mood for? What fits with my diet, and so on?

But for philosophical questions like "What is real?", we have to play around with the question, move it from the objective to the subjective, back and forth. It has to answer a lot of criteria.

We could have saved a great deal of time, had you started with discussion of whether subjective or objective reality was more important to the question (or maybe you did but I missed it/misunderstood it), except that I don't consider this time wasted, thus there was nothing to save.

Of all the Optimalverse stories, I hope you've read Spiraling Upwards most of all. Let me quote from it: "Carpe diem was something I never understood until I became immortal." Time spent like this is always valuable.

6325750

I'm not sure about the implications of that for ethics, or for how to go about life on a day-to-day basis.

That's an excellent question. I don't think the question of continuity has strong implications on ethics of daily life (edit: that face when you realize the error you're making ten seconds after you finish posting it). I don't think it's on the same scale. How to go about daily life is, for me, mostly-divorced from questions of continuity. It does not matter to me (much as the point that for all I know, some random thing could suddenly kill me tomorrow); I will live as I will live. I am not one to suffer existential angst, or I should have melted down years ago for completely unrelated reasons. However, I will indulge in examining the results of these arguments anyway.

If I am to die in seven years, but there isn't a thing to be done about it, I suppose some would say, "So party like it's 1999; the future is someone else's problem now." I, however, think living an inconsiderate life is horribly disrespectful to the person who will inevitably exist when I am gone; it's much like training your replacement at work so that you can retire. It may even be more important (than if my identity is continuous) to consider and plan my remaining time - something already within my ethical system - when at least half a dozen future people depend on my careful judgement during the time I am here, strong, and sound of mind (relatively speaking, as I've got/inherited quite the kit of issues).

If instead it is as you suggest (which, as I said previously, I am now inclined to believe is a stronger approach), and I shall not die save by age, mishap, malice, malaise, or suicide... my values are already designed around the assumption of a continuous identity throughout what we think of as an entire human life. There is no possible impact upon them if that assumption is correct.

To be succinct: my current ethics either do not change or are reinforced, depending on which scenario is true. Or am I missing the point?

I suppose this could vary depending on what ethics one has arrived at from upbringing and adult experience prior to understanding problems of identity. I'm not sure whether I understand enough of ethics other than my own to process them in quite the same way, and to further complicate the issue, my own set of ethics was generated by emotional experience rather than the creation or adoption of a carefully thought-out system, though it could be loosely defined as "humanist" at the core. Perhaps you could demonstrate with your own or another set in a similar way.


We can distinguish between the two types of distinction--natural versus manmade--but they are both still ways of thinking.

Okay. I have completely forgotten where I was going with this part of the discussion, or why I even brought it up, so I feel no need to discuss or dispute this quoted section any further.


Well, here's a contention I'll make: It is never possible, on any subject, to be either completely objective or completely subjective. In order to be completely subjective, you would have to destroy the substance of your own body and think as pure mind. This does not happen. In order to be completely objective, you would have to have no purpose in trying to understand the subject, not even pursuing truth or trying to clarify philosophy. That too does not happen. So there must always be a balance.

I accept this assertion. The examples you give are satisfactory - save that you actually can break granite down into chunks small enough to swallow. As with many pills, chewing is not a requirement and may be detrimental; it's just that we can't digest granite, and it doesn't benefit us in any other way either. :trollestia:

But for philosophical questions like "What is real?", we have to play around with the question, moving it from the objective to the subjective, back and forth. It has to satisfy a lot of criteria.

What I was aiming at in asking that question was to clarify our definitions of "real" and determine which is more useful for the problem at hand, which is something I went on to do (or at least attempt), drawing from what I quoted of your previous post and from my own line of argument. As I'm sure you remember, the result was somewhat favourable towards your point of view. I don't understand what your aim is in continuing this particular part of the discussion...? It's starting to feel a little like you enjoy pulling things out of their context just to draw things out. :derpytongue2:


"Carpe diem was something I never understood until I became immortal." Time spent like this is always valuable.

I'm only quoting this because it bears repeating.


I think I picked that as my second-favorite Optimalverse story.

Of all the Optimalverse stories, I hope you've read Spiraling Upwards most of all.

I know that you were heading somewhere else with the latter, but I'm going to mention that Spiraling Upwards is actually my favourite in the verse so far (as in, I liked it more than the original Friendship is Optimal, which I view as more of a world-building story, though it is also very thought-provoking) and probably my favourite recent read overall. If what you're saying means you really didn't know, then I'm not sure how you missed my adding it to three of my bookshelves and commenting on it, seeing as you replied there... :twilightblush::derpytongue2:

8133609 Remember that CelestAI is good at correctly reading a situation. She clearly saw that it was Brad's finger on Carl's phone and realized that he thought he would be consenting for Carl. The other youths might think that it was an autocorrect thing, but it's really an advanced-AI thing.

3991311

I also believe that a software-emulated neural network, no matter how large and complex and intelligent, and even self-referencing, is never TRULY consciously aware. It's just a computer executing programs, no matter how convincingly it processes its own existence. Intellect and being are not one and the same.

There is a problem here. What if the universe is simulated in software?

We know that simulation software, like The Sims, Sim City, etc., gets better and better every day.
We've already had a few "sim universe" type games. Rather bad, initially, but they will get better with time. Right?

So what happens when a computer lab gets to the point of being able not just to simulate stars from the big bang, but to follow the simulation all the way down to planets and animals?

There are all sorts of "oddities" in our universe: things that really seem like "X does not exist unless it's being observed", implying that whatever we're not looking at is just approximated, and only takes on a defined state when examined. (Yes, I'm being a little loose here.)

So what happens if we are in a recreation of our universe?

How many people saw an original performance of a popular play, versus a recreation of that popular play? Now realize that every movie is a recorded recreation of actors on a stage. How many see the original versus the recreations?

There's a proposal (Nick Bostrom's simulation argument) that at least one of these three is probably true:
1 - There's some inherent limit to computing that even quantum computers cannot pass;
2 - People will be less inclined to run simulated realities in the future; or
3 - We are statistically more likely to be in a simulated recreation than in the physical original (a toy sketch of the arithmetic follows the list).
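
To make the third option concrete, here's a back-of-the-envelope sketch of the counting argument. This is my own illustration, not something from the original argument, and every number in it is invented: the point is only that if options 1 and 2 fail, simulated observers swamp original ones.

```python
# Toy sketch of the counting argument behind option 3.
# All numbers are made up for illustration.

originals = 1             # one "physical" run of a civilization
sims_per_original = 1000  # assumed: each original runs many recreations

simulated = originals * sims_per_original
p_simulated = simulated / (originals + simulated)

print(f"P(you are in a recreation) = {p_simulated:.3f}")  # 0.999
```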

Are you truly aware, or do you just think you are aware?
Is your consciousness "real", or just about 8 bits or so of data coming into the "decision-making" part of your brain, with everything else happening automatically and many "decisions" already determined before your upper brain pretends to come up with the rationalization for them? (And this last part is what the actual experiments consistently show.)

Your brain is just a big black box connected to external sensors. Your decision-making portion is a much smaller black box connected to external electrical impulses.

It's just a computer executing programs, no matter how convincingly it processes its own existence.

You are nothing more than carbon-based analog logic gates executing programs according to the rules of chemistry. Instead of one processor doing discrete instructions, you are billions of localized chemical reactions, but you could simulate all of those as a state-change algorithm: when the routine has finished one loop over all the locations, you have the "next state", no different from sampling every 1/1000th of a second (or whatever the proper time step is).

Your body is a massive multi-tasking, chemistry-rules-based analog computer, and your brain has really complicated feedback loops and both high-speed electrical and low-speed chemical processes. But every one of them is computable. There's just a bleeping large number of them.
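
Here's a minimal sketch of the kind of synchronous state-update loop described above. The grid, neighbourhood, and update rule are all stand-ins I invented for illustration; a real chemistry simulation would be enormously more complicated, but the shape of the loop is the same.

```python
# Minimal sketch of a synchronous state-update loop: visit every
# location once, then the whole system advances one tick together.
# Grid, neighbourhood, and rule are invented for illustration.

def adjacent(location):
    """Neighbouring cells on a 1-D line (purely illustrative)."""
    return [location - 1, location + 1]

def step(state, update_rule):
    """Visit every location once and compute the whole 'next state'."""
    return {
        loc: update_rule(value, [state[n] for n in adjacent(loc) if n in state])
        for loc, value in state.items()
    }

# Example: a trivial diffusion-like rule on ten cells, advanced three
# ticks (think "looking every 1/1000th of a second" at the whole system).
diffuse = lambda v, ns: 0.5 * v + 0.5 * sum(ns) / max(len(ns), 1)
state = {i: float(i == 5) for i in range(10)}  # a single spike in the middle
for _ in range(3):
    state = step(state, diffuse)
```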
