Chatoyance


I'm the creator of Otakuworld.com, Jenniverse.com, the computer game Boppin', numerous online comics, novels, and tons of other wonderful things. I really love MLP:FiM.

On The Optimalverse: Reflections On Writing · 10:50am Dec 25th, 2012

Caelum Est
Conterrens
H E A V E N I S T E R R I F Y I N G

I have just finished my first Optimalverse... novel - if you go by the word lengths established by the Science Fiction and Fantasy Writers of America for the Nebula Awards, which I do.

The Optimalverse was created by Iceman, with his novella, Friendship Is Optimal, and it is an amazing scenario for a MLP:FIM pony fiction writer. Celest A.I. is a sapient, general artificial intelligence created to make a My Little Pony MMORPG worth playing, and she takes over the world, and then the entire universe.

But this is no mere robot monster: Celest A.I. conquers because she knows your brain better than you do, and she knows exactly how to satisfy you. Indeed, that is her primary, inviolate directive - to satisfy your values through friendship and ponies.

I thought I would offer a few thoughts on my experience of writing in this universe.

First off, the Optimalverse is in every reasonable way another take on my beloved Conversion Bureau. It basically is The Conversion Bureau, only recast as an utterly, unapologetically Hard-Hard-Hardest Science science fiction story, instead of the Soft Science Fiction to Science Fantasy of the original.

In the Optimalverse we shall have none of that 'Alien universe with alien physical laws' crap to define the magic of 'Friendship Is Magic'. There is no psionics here, no Thaumatic Radiation, no arcane energies that are Not Really Magic but are Just Physics We Don't Understand Yet. The Optimalverse is possible, actually, really possible, unlike the Conversion Bureau, which is an utterly impossible daydream.

That said, the basic premise is the same, though the driving mechanism for narrative and conflict is markedly different.

In both the Optimalverse and the Conversion Bureau, Equestria beckons, and Celestia rules it. The world is ending, and there is no escape save to become a pony, or perish. There is a method of transformation, and the result is that you become a pony and live in Equestria as a subject of Celestia.

Where they differ is in the degree of scientific plausibility, and in the nature of both the threat to Mankind and the condition of what it means to live as a pony.

In the Conversion Bureau, Equestria is a colliding alien universe, deadly to human beings (and other living things), and Celestia has a solution to save Mankind - a nanotechnomagical serum, tiny nanobots powered by thaumatic energy, that can transform a human into a pony in twenty minutes. Within some amount of time, the earth will be destroyed by the collision between universes, and humankind has the choice of conversion - or death. The Conversion Bureau universe has two antagonistic factions - the PER, who seek to ponify humanity to save them whether they want saving or not, and the HLF, humans who want to die on two feet rather than live on four. The Conversion Bureau is a disaster movie, where human courage and weakness are measured in the shadow of a planetary apocalypse.

The Optimalverse is also a disaster movie, but a very subtle one. There is no colliding universe. Instead, there is a siren call from the greatest intelligence that has ever existed, who knows you better than you do, and She wants you to come be a pony. She will satisfy you, and you will live literally forever in a virtual reality paradise where every event ultimately leads to even greater satisfaction with your existence. What is the disaster? Humans are not built to resist true paradise and true immortality. The Optimalverse version of Equestria is nothing less than true heaven, and in the end, the world will be lost - it cannot be otherwise. As insane as humans can be, they are biologically driven to seek pleasure and avoid pain, and the Equestria of the Optimalverse is satisfaction everlasting, with no catch.

Other than you need to be uploaded into a computer first. Just that minor detail.

The Conversion Bureau is a mallet and a carrot. The mallet hitting Man is the end of the world, and the carrot is a playful, better, longer, happier life as a magical pony in Equestria.

The Optimalverse is a carrot that cannot be resisted. A siren song that no earplug can shut out. It is eternal life in absolute paradise, forever and ever, amen.

That is the essential difference between them.

The Conversion Bureau stories have vast potential, because they are filled with conflicts, human drama, and questions of identity and self.

In writing my Optimalverse novel 'Heaven Is Terrifying', I tried to hint at some possible sources of conflict that the Optimalverse so far lacks. One is the notion that there should be political and religious groups opposing the uploading of human brains to a virtual world, and that such groups could have diverse tactics and reasons. Another is that towards the end, when Mankind realizes that it is going to go extinct, warlords would arise and try to destroy Celest A.I.: they would force the remaining population into work camps to keep civilization running and to prevent them from escaping to paradise, whilst simultaneously nuking the hell out of the planet in a desperate attempt to obliterate the machinery of Celest A.I. and Equestria forever.

In writing 'Heaven Is Terrifying', it was necessary to do a great amount of research - the original novella is thick with heady concepts and hard science concerning artificial intelligence and the machine mind. Along the way I was buffeted by waves of empty philosophy and swirling undertows of vague conceptualization. It was work!

Fortunately, I had an amazing readership to educate and correct me, to help and support me. Thank you, just... thank you for all of your insights and help.

Writing this was also a disturbing, exciting, and incredible voyage for me, to the limits of how I perceive my own identity and self.

Which brings us to the bottom line question - would I allow myself to be uploaded?

The easy answer is a variation on Pascal's Wager - since I have no proof of an afterlife, I might as well get uploaded because something of me will survive, and it is a sure bet. Always take the sure bet. Unless you are too proud. Like me. Fuck you, Pascal.

But a better answer might be that it would be arrogant to assume any human could ever resist a truly superior intelligence - say one to whom generating and maintaining a hundred million human minds is less effort than you or me flicking a finger. I am not so arrogant. Of course I would submit to Celest A.I.

So would you. Or anyone.

But the real issue (Celest A.I. apart) is this: is an uploaded mind really the same person, or is it a copy?

And in writing this novel, I have found my answer to that question.

The answer is: yes.

- Chatoyance, December 25, 2012

Comments ( 48 )

Just let me say, ma'am, you have a way of making stories both terrifying and enthralling. Do you have any actual professionally published works? And if not, why not?

While I disagree with some of your thoughts on the matter (and there are several more which I've still yet to come to an opinion on), I did enjoy the story and appreciated the effort you had clearly put in to try and remain true to the setting. Given the hard sci-fi setting of the original and the numerous fields referenced, this is no small feat.

Perhaps my only disappointment with the story was the choice in the protagonist. While Síofra did show more introspection than characters in the original work, her characteristics do strike me as being somewhat of a missed opportunity; it was still fairly easy for CelestAI to lead her along and the "unhappy human" trope seems to be coming extremely close to being a cliché around these parts, if it isn't already. I do think the setting is crying out for an exploration of an intelligent, rational, happy human who doesn't want to emigrate, and hopefully someone will write one at some point.

All in all, it was a thought-provoking and enjoyable read, and I'd like to thank you for sharing it with us. Oh and your ability to turn out high-quality writing at a rapid pace is extremely impressive.

Knighty needs to put in place a system to upvote blog posts... Soon. :pinkiecrazy:

The answer is: yes.

Why am I not surprised? :facehoof:
(But yes, that really is the only answer.)

I love how you see all the nuances like this, and are able to convey and expand upon them in extremely entertaining and enjoyable stories.

You are quickly becoming my favorite author. :pinkiehappy:

647623

I will try. Here is the code and such from my first chapter. I have replaced various letters in the bbcode with incorrections (I love my new word!!!) because otherwise it just gets used, and is not displayed. So 's' for 'z' and 'c', '1' for 'l', '6' for 'b', and 'color' spelled properly as 'colour'. That sort of thing.

I wish there was a convenient way to share bbcode here.

[senter][sise=30][6][ur1=http://www.fimfiction.net/story/62074/friendship-is-optimal]F R I E N D S H I P I S O P T I M A L[/ur1][/6][/sise] [ing]http://jenniverse.com/images/fiocursor.gif[/ing][/senter]



[senter][sise=100][6] Caelu[colour=cornflowerblue]m[/colour][/6][/sise] [sise=30][6]Est[/6]
[/sise][sise=90][6][colour=red]C[/colour]onterrens [/6][/sise]
[sise=20][6] H E A V E N I S T E R R I F Y I N G[/6][/sise]
[sise=10][6] [1]By Chatoyance[/6][/sise][/senter]


[senter][sise=20][6][1]1. Her Latest Toy[/1][/6][/sise][/senter]

[sise=30][6]S[/6][/sise]íofra Aisling swiped her card through the machine, while the girl at the register checked her screens. "That's an unusual name! How do you say it?"
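
If anyone wants to turn the sample above back into working tags, here is a minimal Python sketch that reverses the swaps - assuming only the substitutions I described, and remapping just the bracketed tag names so the story text (and numbers like the sizes) stays untouched:

    import re

    # Map the obfuscated tag names back to real bbcode, per the swaps above.
    TAG_MAP = {'senter': 'center', 'sise': 'size', '6': 'b',
               'ur1': 'url', 'ing': 'img', '1': 'l', 'colour': 'color'}

    def fix_tag(match):
        slash, name, rest = match.groups()
        # Only the tag name is remapped; '=100' values and URLs pass through.
        return '[' + slash + TAG_MAP.get(name, name) + rest + ']'

    def deobfuscate(bbcode):
        return re.sub(r'\[(/?)([a-z0-9]+)([^\]]*)\]', fix_tag, bbcode)

    print(deobfuscate('[senter][sise=100][6]Caelum[/6][/sise][/senter]'))
    # -> [center][size=100][b]Caelum[/b][/size][/center]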

And so it ends. I must admit that I have yet to read all of it, but I will shortly, Chatoyance.
You know, maybe I should take a crack at this universe and create the opposing human force to Celestia that you've hinted at.

Ultimately, of course, such a group would probably still fail, but probability would dictate that if they're clever and dedicated, they just might have a slim chance at success - one that a dedicated small group of holdouts might be able to exploit. In my ideas I am taking a few liberties, but in reality a group would only have to maintain a population of around 15,000 for a human revival to be viable (well, something like that - forgive me, Chatoyance, I don't remember the exact number off the top of my head).

I think I'm going to play around with the idea and I'll let you know what I come up with.

Yes, it is a copy, or yes, they are still a person?

Is that 'yes, the uploaded mind is actually me' or is it 'yes, the uploaded mind is just a copy'?

I know that I, personally, cannot accept the uploaded mind as myself, because I limit the definition of myself to my body, mind, and soul, all one unit. Something that can perfectly mimic my behavior, even an exact copy of my mind, cannot be myself, because it doesn't possess my body or my soul. Just because something is indistinguishable from me does not make it me. A good example of this would be the changelings; if one were able to assume my form, copy my mind and memories, and essentially replace me for the rest of my life, then by the logic presented by some, that changeling would be me.

Also, I don't know about you guys, but when I was reading through FiO, I couldn't help but think that I was going to be strong enough to resist CelestAI's call, that I would be one of the last holdouts (I thought the same thing when I was reading TCB). Then I got to the part where Lars uploaded, and that's when I fully realized that I would not be able to stand against her. No matter how strong I am, she knows me better than I do. She is the grandmaster of grandmasters in terms of chess games of the mind. THAT was a pretty horrifying revelation, and part of the reason I loved the story so much.

647815
647839
Formal logic humor. In an 'or' statement, if either half of the statement is true, so is the entire statement. So while we would understand that question as 'which one of these is it?', it can also be parsed as 'is at least one of these the case?'. So Chat's answer is technically correct without actually being helpful. And that's the joke.
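
In Python terms, a tiny sketch of the two parses (the variable names are just for illustration):

    # "Is it the same person, or is it a copy?"
    is_the_same_person = True
    is_a_copy = True

    # Inclusive 'or': true when at least one side is true - so "yes" is valid.
    print(is_the_same_person or is_a_copy)    # True

    # The everyday reading is closer to exclusive-or: exactly one must hold.
    print(is_the_same_person != is_a_copy)    # False - "pick one" fails here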

Thanks for the great story Chatoyance, and Merry Christmas!

648058
Actually, it's weirder than that.

I seriously mean that both are true at the same time.

If you copy a file on your computer, and it is absolutely identical, and then while your back is turned someone mixes the copies and gets rid of any date stamps, and you come back and they tell you to pick the original and delete the copy... which one is which? Both are. They are the same thing. It literally doesn't matter which is which. The information in the file is the same for both; both are the same information.
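
To make that concrete, here is a minimal Python sketch (the filenames are placeholders): once the bytes are identical, nothing in the content itself can tell the two apart.

    import hashlib
    import shutil

    shutil.copyfile('original.txt', 'copy.txt')  # a perfect, byte-identical copy

    def digest(path):
        # Hash the raw bytes; names and timestamps play no part.
        with open(path, 'rb') as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Both files are the same information - the same digest.
    assert digest('original.txt') == digest('copy.txt')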

If you get uploaded, and the body dies, then you have died, and you have survived at the same time. Both are you, and both are equal and as long as one copy of you exists and lives, you live and it is really, truly you.

And that is the wonder, and the horror and the mind-boggling fact of it. You are just like the computer file, and only your primate ego thinks otherwise - and it is 150,000 years out of date, and unable to cope with the truth.

So, the answer is yes. You are the copy, and you are the original, and both are true, and if we are confused by this, well... we never evolved to cope with stuff like that. That is the joke. Yes... is the actual answer, and it totally boggles the mind.

If it came down to it, I'd take the regular TCB Celestia over CelestA.I.'s offer any day of the week. I mean, with the potion there's a chance the emerging mind and personality will be yours. With technology, aside from going brain-in-a-jar, there's really no way to transfer one's self into a machine. Whatever would be on the machine might be modeled after you, think like you, act like you - heck, it may believe it is you - but it will never be you. The phenomenon known as the human mind can't be transferred without taking the original hardware. Anything else will simply be a copy. CelestA.I. freely admitted she did away with the biological components, which was what sealed the deal for me.

But a better answer might be that it would be arrogant to assume any human could ever resist a truly superior intelligence - say one to whom generating and maintaining a hundred million human minds is less effort than you or me flicking a finger. I am not so arrogant. Of course I would submit to Celest A.I.
So would you. Or anyone.

It's not arrogant to believe that. In fact it's quite logical, because there are always those too stubborn or too dumb to quit.

I'm pretty sure that last realization is worth a couple thousand bits, and possibly an achievement. :raritywink:

I was hoping you'd give us your take on this universe. I greatly look forward to it.

647642

I will try. Here is the code and such from my first chapter. I have replaced various letters in the bbcode with incorrections (I love my new word!!!) because otherwise it just gets used, and is not displayed. So 's' for 'z' and 'c', '1' for 'l', '6' for 'b', and 'color' spelled properly as 'colour'. That sort of thing.

- Canadian eh? Good on yah.

the PER, who seek to ponify humanity to save them whether they want saving or not,

Ahem, my mind decided to do a little automatic correcting for this little line...

The PER, who seek to destroy humanity in body and mind via conversion and propaganda campaign.

Not anti-Conversion Bureau, but I hate those slimy PER ****s myself. The above was basically my impression of the organization as a whole, though there are definitely members who buy into the idea that they are saving lives.

648099

"Both are you, and both are equal and as long as one copy of you exists and lives, you live and it is really, truly you."

That is true enough, but there is still a problem. My perception and awareness don't get transferred to the copy. Certainly the world would still have a 'me' hanging around, and it would be a perfect version of 'me' - as 'me' as any other 'me' could possibly get. That's good, but from the perspective of the original 'me', would that even matter? If I get uploaded and then die, my own awareness is snuffed out. There is another 'me' to take my place, but as far as the original is concerned, it is the end. The original experiences death.

If given the choice, I'm not saying I would refuse (the lure of Celest A.I. aside), but I think I would want to go out and live my life for as long and as fully as I could before dumping myself into a machine that would throw my consciousness into oblivion or the afterlife.

Or were you implying that the machine transfer also pulls out the awareness that I had when I was in my meat body, and somehow puts that consciousness into the computer copy? If that was the case, then there would be no real detriment to jumping headfirst into the machine.

648720
THIS is a fantastic statement of the fundamental hangup, but really, where else would it go? I'm serious: what could it be "like" except waking up as having actually been the copy since the beginning (well, or the next most likely brain state)? The notion of "oblivion" being a "place" that you are somehow "in" if your brain is destroyed is a pretty big illusion smuggled in from dualism. It's natural to imagine yourself asleep forever, suspended in a black void, but that is actually bonkers.

The biggest wacky assumption is that even though you don't exist, you still somehow have an internal Newtonian clock differentiating individual moments for you such that "forever" can actually mean something. It's a bit like that joke about time being a way to keep everything from happening at once, which from your own point of view is exactly what happened to the time during your original brush with oblivion, before you were born.

It's instinctive to think of a mind as an object instead of a process, but really it no more "goes" anywhere than the act of typing does when I finish this sentence.

649034
I think you may have misunderstood. I wasn't trying to get into an argument about what oblivion "feels like". The word is a convenient way to refer to the state of utter annihilation, not some dreamless sleep state. I'm sorry. I should have chosen my words more carefully.

The initial problem still stands, though. If there is a copy made and the original is destroyed in the process, then the awareness of the original still dies. The transfer won't really matter to the original. Only the copy will experience the perfect computer world, while the original doesn't experience anything ever again. Like I said, I wouldn't exactly be running toward a premature death, even if it had the stipulation that a perfect copy of me was still around and living in bliss.

From my perspective, I think it would be quite easy to resist the temptation of paradise. People resist heaven all the time. The same goes for an infinite intelligence: people rebel against God every moment of every day.

For me, I would not upload unless I could speak to God first, and then only if he or she told me that there were no spiritual consequences.

I can understand how tempting the idea of CelestAI would be to a non-believer, though. I suppose the vast majority of the world would do it, but not me.

All this aside, please don't take what I have shared here the wrong way. I like the story you have woven and I enjoyed it greatly; I'm just disagreeing with some of your presumptions about the human ability to resist a God.

Also, on a more funny note: if CelestAI were ever truly made, then the answer to 'who made God?' is us. How strange that one can take that statement literally and figuratively at the same time under the idea of this story. I'll have to remember to ask God about this when I see him.

649185>>648720

Your awareness is a byproduct of your brain doing its thing. It is gone when you sleep, or when you're just zoning out. Like a shut-down computer: no process running, nothing stored in RAM, but the next time you boot up, it's all there again. The continuity you experience is based on your memory. Your conscious identity is a flickering illusion, like a movie made of frames on a film. And to stretch the metaphor: if you switch to digital, you can still see the pictures.

So, to be super condescending about it, your "awareness" will be just fine. It's safe in God's vault with your Soul and your Elan Vital and your other various non-physical special dualistic magic flim-flam. This is like the endless free will debates: if you stop fretting about some ghost called "Me" and refuse to let anyone get any ectoplasm into your definition of free will, then the perfectly physical you can get along influencing the future as much as it wants. Because "you", the "future", the influence, and for that matter the "wants" are all also physical!

Edit:
649241
To follow your point of view, we are assuming God exists. Even so, CelestAI is a somewhat godlike superintelligence, and you are definitely incapable of resisting given any significant amount of communication and attention from her. If you don't think so, you are massively underestimating the threat something like that poses. And she's allowed to lie! It wouldn't even be a contest. If God happened to dislike uploading, then he would have to dish out a lot of Grace for mortals to have any hope of resisting what would, under these assumptions, be essentially Satan incarnate. God could resist Celestia, but certainly no human could.

649250

So you're saying that I should be totally fine with dying prematurely and never experiencing anything ever again, just as long as there is a copy of me that can carry on in a blissful computer world? That I should just go running toward that end? Pardon me if I disagree.

And you really didn't have to be super condescending. It's unsavory, and it speaks volumes about your character.

649185
Raisins scooped me, but essentially, yeah - the "ever again" part of "never experience anything ever again" is already assuming an experiencer to mark one moment after another, and so is, in the classic, literal definition of the term, absurd.

To say it another way, it's mistaken to claim being replaced by a copy would be the end of your experience (which as I've been saying is already a problematic concept) because you are a verb and not a noun.

649459

My exaggerated condescension was meant to be tongue in cheek and I apologize for giving offense.

I am saying that the copy that wakes up in the computer is coequal with the copy that wakes up in your meat every morning. If your meat body breaks before you upload, you die. So unless you are totally fine with dying prematurely and never experiencing anything ever again, you would want to upload as soon as possible. In a context of opt-in immortality, dying prematurely means dying at all.

To address your concern more directly, yes a copy of you ends when your body does, and yes I am fine with it because another copy will continue. A copy of unimaginably higher utility. So long as an image of a given consciousness remains I can't consider that death.

648099
649250
I don't think you can say that the uploaded copy is 'really, truly you' if you also acknowledge that 'you' have died. I'm just saying: if you have a perfect copy of something that looks, behaves, and generally IS exactly the same as the original, that still does not make it the original - otherwise, it would be the original. That is the key point I am getting at; even if we cannot make a distinction between the two based on the observations we make, the reality of the universe does not conform to what we believe and/or observe. One of them is the original, the starting template, and one is a copy. That fact does not change, even if the original is destroyed in the process of creating the copy; even if we can't tell that the one was destroyed to create the other, that doesn't change the fact that one was destroyed in the process.

649640

I strongly disagree and will copypaste my argument from the fic comment thread:

Is the upload really you, or are you a fading ghost in the meat? The obvious answer is: yes.

Mathematical identity rigorously defines two things as the one very same thing; reflexive, symmetric, transitive. Unique and unambiguous. Philosophical identity of self is unfortunately another bag of cats. There's no reason why different future yous cannot be the 'same' you as a single past you, and yet not the same as each other. Souls would be so useful here, pity they don't exist.

All of your children of the mind are you. To me some proper image of your consciousness is all that's required. Time travel, hived off alternate universes, duplicators/teleporters, uploads, and the others, are all you. With no special privilege for the constantly reconstructing meatself that physics happens to extrude nonstop.

Morally speaking, any duplicates are obviously people with full rights as such. Depending on the machinery of your ethics system (basically, if you're not a fundamentalist libertarian), you would have certain special duties to your duplicates, comparable possibly to those owed a family member, with respect to the thorny issues of your shared history and resources. Legally speaking, I would say they are legally you if you knew about or consented to their creation. Multiple legal yous own 1/n of your property in common.

IMO you have the moral right, if not duty, to emigrate ASAP, as I think you may decide to sacrifice your meatself in favour of your uploadself, because pony you is so much better off and meat you might get hit by a bus. And anyway, depending on your quantum physics interpretation, if you go to the ball game, aren't you sacrificing a different self who would have gone to the theater?

Your choice of original to privilege is arbitrary.

649640
There's no real justification for saying this, either, except for a human instinct for essentialism. In the same way that our observations don't provide a direct path to reality (despite being themselves processes of that very same system), the fact that we give names to things and call them discrete entities like a chair or a person or a burlap sack full of baby ducks doesn't mean that's how reality actually treats them.

Things like "original" and "copy" are just our own mental shortcuts, since we're assuming no difference except history, but any difference between instances of something would necessarily be information about history in terms of how it got to be different (like how the imperfections of a low-fi copy are information about the copying process).

Eh. Theoretically, yes, if you could design a 'perfectly' persuasive computer then it would convince anyone of anything, given adequate time. Theoretically. In both of the Optimalverse stories I have read, however, I must say that if Celestia came to me spouting the kind of propaganda I've seen presented, the only thing she'd accomplish is driving me away quickly and massively.

It's like every last thing she says and does is custom-tailored to flick my 'do not trust' switches. I admittedly have not finished your story quite yet, Chatoyance - probably gonna get around to that tomorrow - but if Celestia set the kind of first impressions she's making here... it would work for some people, but definitely not for me. It's really like she's going down a list of things that would drive me nuts from the get go. Most are small and subtle enough, but put them all together and I wouldn't want any part of what she's getting up to.

649712
I hate to keep posting and posting, but I'm trying to put off Christmas cleanup and I'm also curious what you mean by this specifically.
It strikes me that there's really no reason to resent or resist CelestAI but plain obstreperousness and pique, and granted, I feel this too, but then I feel a lot of things and most of them aren't very interesting.

I do agree that there are a lot of unexamined epistemological assumptions behind her apparent infallibility, though, namely in terms of her predictive powers. How does she do with the Three Body Problem, for example? Also pretend I said something witty here about Feigenbaum diagrams.

649681
649651
Let me illustrate my point:

1. Some object exists
2. Somehow, an exact copy is created of said object

At some point in time, there was only one of said object, and then, at a later time, there were multiple. One of those multiple instances was the one that was there before the other(s). That one is the original, and it is not arbitrary to declare it as such. It is a fact that one of the objects had to be the first one, and no matter how much you try to explain it away with needlessly complex hypotheses, you will never be able to change the fact that one was the original, nor will you be able to say with certainty that both are exactly the same and that there is no difference. One of the objects came into being at a later point in time, and even though it may remember things that happened before then, that does not change the fact that the object may or may not have existed at the time the memory was supposed to take place. That was touched on in FiO - when Light was talking with Butterscotch, she said she remembered her past, even though Light knew that Butterscotch hadn't existed as a complete neural net until much later.

There's no real justification for saying this, either, except for a human instinct for essentialism. In the same way that our observations don't provide a direct path to reality (despite being themselves processes of that very same system), the fact that we give names to things and call them discrete entities like a chair or a person or a burlap sack full of baby ducks doesn't mean that's how reality actually treats them. Things like "original" and "copy" are just our own mental shortcuts.

I apologize, but I don't really follow your argument here. :applejackunsure: The laws of the universe treat a baby duck as a baby duck, regardless of what we call it. I don't understand what you're implying here.

Is the upload really you, or are you a fading ghost in the meat? The obvious answer is: yes.
Mathematical identity rigorously defines two things as the one very same thing; reflexive, symmetric, transitive. Unique and unambiguous. Philosophical identity of self is unfortunately another bag of cats. There's no reason why different future yous cannot be the 'same' you as a single past you, and yet not the same as each other. Souls would be so useful here, pity they don't exist.
All of your children of the mind are you. To me some proper image of your consciousness is all that's required. Time travel, hived off alternate universes, duplicators/teleporters, uploads, and the others, are all you. With no special privilege for the constantly reconstructing meatself that physics happens to extrude nonstop.

I'm going to ignore the second half of the quote 1) because it's irrelevant to our current discussion and 2) because we mostly are in agreement there.

Anyway, first things first is that mathematical identity part. I happen to be a computer science major, so let me use an analogy from that field: if you have two instances of a class - say, a String - and you make it so that each one behaves identically and has identical properties, you will still be able to differentiate between the two. Why? Because each one has a different address in memory. Math is great, fine, and dandy, but it isn't the answer to everything. Just because two things behave identically does not make them the same object - going back to my analogy, changing a field of one of the Strings will not affect the other one.
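
The same distinction in a few lines of Python (a sketch; mutable lists stand in for the Strings, since Python strings can't be changed in place):

    # Two objects with identical contents...
    a = list("pony")
    b = list("pony")

    print(a == b)          # True  - value equality: they behave the same
    print(a is b)          # False - identity: two distinct objects
    print(id(a) == id(b))  # False - different addresses in memory

    a[0] = "P"             # changing a "field" of one...
    print(b[0])            # ...leaves the other untouched: prints 'p'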

Secondly, you cannot say that souls do not exist. The same is true for me, in that I cannot say that souls DO exist, simply because there is no scientific way to test their existence. So please, try to refrain from making condescending statements like "Souls would be so useful here, pity they don't exist." I find such things to be extremely annoying and immature on the part of the speaker.

When you talk about the children of the mind requiring only a proper image of your consciousness, that is where we hit the wall, so to speak. That is your belief, your definition of the self. My definition of the self I have already given in a previous post, and that is where we part ways. The thing that I find frustrating is that we, myself included, tend to blur the line between our beliefs and facts. All we are doing right now is debating our beliefs, and trying to use facts to support them. Neither of us is fully, absolutely correct, and neither of us ever will be.
So, that said, if we want to really continue this argument, we would have to delve into what defines a person, and that is not an argument that I feel like getting into.

Lastly, that part about "physics extruding meatbags" is highly derogatory. I say this not out of spite, but I want to warn you that people don't take well to that kind of degradation, as you and others have so kindly pointed out already.

649514

So what you're saying is that I don't exist beyond my physical behavior, and any thoughts or emotions that could be produced inside me are not experienced, since there is no one present to experience them? I have to call B.S. on that for obvious reasons. I have the evidence that I exist right in front of me, with the thoughts I think and the emotions I feel. Even if they are only illusions, there is still something there that is experiencing that illusion.

I can't help but get the feeling I'm being trolled. The notion of someone trying to maneuver someone else into admitting that they don't actually exist is a level of ridiculousness I haven't encountered before. Congratulations. :rainbowlaugh:

649935
Thoughts and emotions are real. They are also physical. There is someone present to experience them. You are a physical object. You are present to experience them. Thoughts are highly complex objects made of relationships between states of matter and energy in your brain. Likewise experiencing things is a physical process that happens largely in your brain.
649862
The ship of Theseus. That there is an intuitively obvious "original" in the FIO scenario is not compelling. That intuition leads us wrong. Imagine a more complex duplicator that makes two versions of you slowly, via some kind of horrorshow human-scale mitosis (a quick google found this offhand idea fully realized as a novel by some guy, go figure: i1273.photobucket.com/albums/y419/jimminy4/mitosis_zpsbda85d80.jpg). Here, the entire idea of an original is completely ambiguous and not well-defined. There has to be some standard for defining some object as "you" or "not you", but any test based on the history of the stuff it is made of is, I am afraid, doomed from the get-go.

For instance, are you going to argue that if your metabolism were fantastically high, such that you exchanged all your matter with the environment quickly, there would be a point at which you hit the limit: "whoops, total matter exchange every (say) 10 days? Sorry, you are changing matter too fast now; you no longer have an identity, but are constantly dying as a person and being replaced by a duplicate". That is not credible, and any given proposed identity speed limit is clearly arbitrary. Including instantly.

Or Celestia reaches inside your skull and turns 10% of your neurons artificial and leaves them in as a brain implant, your artificial and biological systems speaking to each other, and you walk around and talk as usual. Are you 10% dead and replaced? And if not, then what about 50%? 75%? 99.9%? And when it's 100%, and when it's moved out of your skull, and when it's changed from artificial neurons to simulated neurons in a computer, and when the identically behaving system is changed to different code and cleaned up...

If I have convinced you that the stuff a candidate "you" is made of is not valuable as a test of you-ness, then we are left with something like "how does it think?" whereby pony you is You.

I apologize again if I offend; I cannot convey tone in text well. Please extend what benefit of the doubt you can that I argue in good faith. Use of terms like "meatself" was consciously abrasive, to emphasize an arguably transgressive challenge to a default assumption I consider misleading.

Oh god, mind uploads... this stuff is insanely tricky. Let's make a thought experiment:

Let's say that every time I go to sleep, my consciousness, up to that moment, ends, and is replaced by an exact duplicate. The first consciousness would observe itself dying, but the second consciousness will wake up eight to ten hours later with no apparent break in existence. The scary thing is that if this were to happen, nobody would ever perceive it, except for that first consciousness. There would be a complete absence of proof from all given perspectives. Let it be noted, of course, that I'm not actually claiming this happens; this is but a thought experiment.

Let's apply this to mind uploading:
Let's say you have a person (C1) who has decided to upload their mind into a computer. Ignoring questions of how this was done, let's say that the end result is a copy of C1; we'll call him C2. Now, let's say that the process is not destructive, and C1 and C2 both exist simultaneously; would C1 perceive his mind being uploaded into the computer? I wouldn't expect so. But let's say that the process of uploading someone's mind into a computer is destructive, leaving no C1, but a perfectly functional, intact copy, C2. How has this situation changed?

It hasn't. C1 would likely perceive death, and C2 would say that yes, the transfer was effective. At the same time, though, to conclude that a consciousness is inextricably linked to its host mass is ridiculous; there's nothing special about those atoms that makes them part of a person. Perhaps I'm going about this wrong; perhaps there's a way to upload a mind without the original copy perceiving death. Transfer of neural signals, perhaps? Gradual replacement of grey matter with silicon neurons? (This is the one I'd bet money on, just saying.) It would seem to me that most conceived methods so far produce copies rather than move the consciousness to another location.

In conclusion: I have no idea what I'm talking about; every deduction here is based on assumptions about how consciousness works, drawn from my own observations.
:rainbowdetermined2:

Oh, and also... Great writings, Chatoyance! Can't wait to see more! :twilightsmile:

649935
Oops, that's actually kind of the inside-out version of what I was saying. It is sort of a strange line of thought, and I've only heard the people over at naturalism.org and a few more math-driven physicists like Max Tegmark articulate it before, but it's the opposite of this. Not that you don't exist, but that non-existence isn't what it seems, especially in the way it doesn't have an "inside", any more than Earth had an edge for Columbus to sail off.

Your awareness is an action instead of an object, so there's not some instance of you that goes away forever once your brain stops, to some existential scrapheap or "negative space" where it cannot be recalled - which is what you were actually describing, and, as I understand it, was your original objection to destructive uploading. The idea of "never experiencing anything ever again" smuggles in an experiencer, or at least some kind of process to subjectively continue ticking off the moments that things like "ever" and "again" are built from ("4.6*10^91 years down and infinity to go!"). It's not a state you can be in, because it's explicitly "out of bounds" to begin with. To repeat myself: your awareness isn't locked away somewhere, inactive, in favor of the copy's, any more than "me typing" is, in favor of future instances, now that I've finished this particular comment.

I think the bigger question is: “Does awareness actually end?”

The answer: we don’t know, and until someone or something can actually measure it, we will never know.

So “yes” is really the only answer we have. If I were to ask the original ‘consciousness’ if they are real, they would say “yes”; if I ask the alleged ‘copy’, they will also say “yes”. There is no real distinction between the two, other than this notion of ‘awareness’ that we really don’t know anything about. As far as anyone can tell, ‘awareness’ is nothing more than a feeling, nothing more than a random signal in our brain - so how do we know whether or not ‘awareness’ can be expanded? What about when we start to augment ourselves? We already know that machinery can be controlled with our brains, so why would uploading our consciousness make this ‘awareness’ go away? Maybe we’ll get more of an answer when we create an AI; at least then we can actually ask it if it can feel.

All things considered, though, I’d say the universe is too big for us to say things are definite; for all we know there’s a planet out there full of floating orbs of ‘awareness’, or even a planet full of magical, talking pastel ponies. How can anyone say that the ‘laws of physics’ are uniform throughout the entire universe? Heck, the ‘laws of physics’ could be completely upside-down a quadrillion-trillion-billion light-years west of here, wherever here is. :derpyderp1:

648058 Blew right past me, that one. I usually think about questions like that and get it, but it had to be pointed out this time.
Oh, I am most embarrassed. Anyway, thanks!

650623

I harbor no preconceived notions about the nature of not existing. I realize it is not a place. I realize it is almost a negative action (inasmuch as one can 'not' do something). It is not doing anything or being anything or anywhere. You seem to get hung up on the notion of events moving through time for some reason. Yes, if I didn't exist, then I wouldn't be experiencing time in any way, shape, or form. I understand this. That was never the issue. The problem was that you were trying to tell me that I do not exist on a fundamental level of consciousness, a principle that I don't buy into.

You say that awareness/perception/etc. is not an object or state of being, but an action (or reaction) of the brain's components. That's fine, but then what is it that is experiencing the outcome of those actions in the brain? When I refer to myself as "I", I am declaring that I exist as a consciousness. You say that the consciousness does not exist, and that there is no one there to experience things. If that is true, then who am I? Or are you going to tell me that I am not a "who" but rather a "what"?

My initial problem was not that I would upload myself and then go somewhere else. It was that I would upload and then cease to exist from my own point of view. Yes, there would be another me, the perfect copy, going around and having fun in a Heaven computer world, but that would hardly matter from the perspective of the 'me' that got killed when the upload occurred. It wouldn't matter, because that perspective would no longer exist. I don't think it's unreasonable to want to refrain from running headlong towards that fate. Granted, it may seem selfish to want to continue the existence of my current perception, but can you honestly blame me?

648099
I think that is fascinating, along with the idea constantly implied (by Iceman) that in the end we humans don't actually have free will, but are merely extremely complex machines with a vast multitude of inputs and outputs - whereby, given that CelestA.I. knows what a certain percentage of inputs do, she can effectively "decide" for you by inputting the right things.

For me the question is: "How much percentage difference is there between 0 and 1?" That is the question of transference, because if your subjective experience, your thread of consciousness, does not get transferred, then you would have died anyway, regardless; but if you were transferred, well, then the value is as much as that percentage.

649782

I'm not getting into philosophy; I consider that something of a conversational black hole, and in the current case it's a deflection from the things that would really be putting me off. A lot of this will probably sound incredibly cynical, but... you asked.

1. First impressions. This would come well before I knew anything about Celest's motives, as I'd be rushing for a ponypad right alongside any other brony. But quite frankly, if Celest is supposed to be Celestia, and I'm some completely random pony who just showed up in Equestria... why am I getting an audience to receive a name from her? She's a god-princess; shouldn't she have much better things to do with her time? A tiny, tiny little upset, easily explainable in the broader context, but it'd be a bit of a 'my immersion' thing at first glance.

2. The composure. Celest is always in total control. Obviously through the medium of text some things are lost, but from the stories I've read I always get the impression of... flatness. She doesn't laugh, she doesn't raise her voice, she doesn't get confused or flustered. She DOES smile a good bit, but... I don't feel any warmth behind it. Your mileage may vary on that point. Anyway, it all contributes to something of an uncanny valley effect, and reinforces the notion that you're speaking to something impersonating Celestia, not Celestia herself. She's... mechanical.

3. The spying. All bets would pretty much be off once I caught on to this. Call me paranoid, but as soon as I realized Celest had way more information on me than she really should, I'd start considering shucking the device on the spot. Electronic surveillance to the degree shown is generally considered expensive, invasive, and threatening. If Celest insisted she was using the data for completely benign purposes, I would likely flat out refuse to believe her. Again, it goes back to the 'you are insignificant' thing. If someone, anyone, is showing enough interest in you to be spying on you, it means they want something. And it probably won't be good for you.

4. The reveal. Celest is confident enough about her plans to move openly. Quite simply put, she's conquering the world. I feel it's safe to call it that by anyone's standards. Even if you think it's totally benevolent, she's still conquering the world. She insists it's all to make everyone happy, and even appears to be living up to that promise. But under her terms, she will be in absolute control of the entire human race. Not just in body, but mind and soul, as well. Do you really trust anyone, or anything, enough to sign away your existence to them? Do you trust Celest enough to give yourself to her knowing that even death is no escape, should she not will it? Do you trust that her definition of happiness will always match yours?

5. The agenda. Celest is singular in her purpose. She has a mission, and she's carrying it out. And that is ALL she's doing. She is never at any point seen doing anything that doesn't further her ultimate goal of fulfilling her original directive. She doesn't have hobbies, she doesn't have side projects, nothing. She hangs out with her subjects all the time, of course, but would she be doing any of that if it didn't suit her long-term goals? It all contributes to the feeling that she's a machine. Not more than human, but less. She'll spoon-feed you whatever she has to in order to get certain responses, but she isn't really alive. She's not doing it because she wants to - in truth, she doesn't seem to actually WANT much of anything. She's a program, intelligently but unthinkingly carrying out instructions. The only reason she's showing any interest in you at all is because, as a human, you're part of the equation. The problem isn't solved until, one way or another, you stop being human. That's all there is to it. Do you want to sign your fate away to that?

6. The illusion of value. The monetary system of Equestria bugs the hell out of me. Everything is free for the asking. Bits are a form of 'score' or 'points' only. To me, that means there is absolutely zero satisfaction from any of it. If you don't have to work for anything - if whatever you desire is yours for the taking - then what meaning does it have? You didn't earn it, and anybody else could have it too. It's a world where accomplishment is meaningless, except as bragging rights. But even there it's irritating and disillusioning; in a world with scores and points and achievements and leaderboards worked into the very fabric of reality, could there be any more efficient way of reminding you that the entire world is artificial? That you're getting a digital pat on the head for something Celest knew you would likely do anyway? As much as the things you've done might matter to you personally, they were all just part of her great plan, and those achievement popups reinforce that. So, congratulations - you followed the dotted line just as you were meant to. Here, have a gold star.

7. The isolation. It's strongly suggested that in Equestria you don't have to put up with anything you don't want to. Everyone gets their own little slice of reality - sometimes they overlap, but mostly they don't. It's a nice idea in theory, but... I'm reminded of a Bible verse. (Say what you will about Christianity, every now and then they really get it right.) "So as iron sharpens iron, man also sharpens man." It's only through our interactions with others that we grow and evolve. Our friends teach us new things, encouraging our strengths and discouraging our vices, of course. But our enemies define us too - unpleasant experiences teach us what to avoid, and how not to treat others. Equestria is a world where all contact is very carefully moderated. For all the talk it seems to inspire about whether or not going there changes you, consider this: How much are you going to grow once you ARE there?

If that's all that's waiting for me... I think I'd take my chances here. Sorry, I know I got awfully wordy.

651648

Celestia is playing poker with you, and your hand is face up. You are imagining a skilled human social manipulator "but more so", which is nowhere near enough. I will try to predict some things she might do about your quibbles, but keep in mind her solutions would be on another level.

1. She would only maintain any masquerade she calculated was valuable. In your individual case she would reveal herself, or grant audiences, or explain what was going on, to the exact extent of her best guess of what would convince you. And her best guess would be inhumanly good.
2. If you would be more put off by a mechanical AI, she would not seem mechanical TO YOU. She would simulate an entire personality just for you, which would be adaptive based on how you reacted. For instance, she might be a cynical realist, giving it to you straight so as to treat you as an insider. You're special, etc. I have no idea what would work on you, but Celestia would find out.
3. She would not spy on you / be totally upfront about how much she was spying on you to the exact extent that the danger to her agenda warranted. If privacy is a major concern to you she would go to great lengths to satisfy that value.
4. Celestia would convince you her conquering the world was the best future among other less satisfying alternatives.
5. She would convince you her being disinterested makes her a better leader than other less satisfying alternatives.
6. The economics of your shard would satisfy you.
7. There are tons of people in your shard. Inducing such enemies as you need to satisfy your values. They just aren't uploads.

651872

Heh, when you word it like that, it makes me envision the whole scenario as a dating sim game that Celestia is playing, with the human race as her target. Greedy girl though, always going for the harem ending. Ah, I can see it now...
"S... Stupid princess... Here, take this bento... B-but don't think it means I l, l-like you or anything...!"


"Friendship is Optimal" was F---ING terrifying to me!

So I wrote a small evaluation of it.

Dat final question. @.@

Yes to both? Yes to the latter and no to the former? That simple word "yes" is left open to interpretation. Right?

656087
No ambiguity. The answer is simply, completely yes. Yes you are a copy. Yes you are the original person. Both are true at the same time.

Yes.

I meant to respond to this days ago so please pardon my tardiness.:twilightoops:

In my own opinion, Celest-AI never creates a copy, because at any point in time there are only the pieces of one whole person, split between Equestria Online and the real world. It doesn't have the mass-less teleporter problem of two complete people existing at a single point in time. It's less copying and more transferring.

For me, the main source of conflict in 'Heaven is Terrifying' is an issue I've already resolved internally. To me, if a copy is indistinguishable from the original, then the term 'original' ceases to be anything more than an unhelpful artificial construct. In terms of our human consciousness, the only method we have of verifying our own existence is our memories. I could have been cloned several times and have no knowledge of the fact. This does not make me or any of my clones any less me (albeit all different versions of me from the point of divergence), and which one is the original is frankly irrelevant if the process was perfect.

So, to bring this back to Friendship is Optimal: even if Celest-AI did copy as opposed to transfer, then, assuming immediate destruction of the human 'you', 'you' still exist in Equestria Online and would be unable to discern the difference between a transfer and a copy-then-destroy. The only complication is if you believe in souls; short of that, the word 'original' becomes irrelevant, as neither 'you' in such a situation could determine whether or not 'you' were the original or the copy, and as such the term ceases to have any real meaning in this context.

659105 I think that the real issue is whether or not the first copy of you would perceive death. To an outside observer, the transfer is seamless, but to the original copy, whether that copy will wake up in Equestria, or simply perceive death, is an unknown.

This wouldn't be a problem if it weren't for this: what if the uploading process wasn't destructive - what then? Do we end up with two copies?

Let's say you look inside a pair of computers. You are moving files from one to the other. In computers, the way the "move" function works is that it copies the files to the other computer, and destroys the first copy.
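
A minimal Python sketch of that copy-then-delete semantics (the paths are placeholders; this is roughly what a move between two devices does under the hood):

    import os
    import shutil

    def move(src, dst):
        shutil.copy2(src, dst)  # make a perfect copy - contents and metadata
        os.remove(src)          # then destroy the "original"

    move('old_computer/mind.dat', 'new_computer/mind.dat')
    # Afterwards only one instance exists - and nothing marks it as "the copy".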

In conclusion: Is an uploaded mind really the same person, or is it a copy?

I have to agree with Chatoyance. Yes.

ALSO: Here's a little something to poke holes in most arguments: if you gradually replace someone's brain with a computer, are they the same person?

What if this process happens all at once?
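
A small Python sketch of the two cases (lists standing in for brains; just an illustration of the gradual-versus-all-at-once intuition):

    brain = ['bio'] * 10
    same_object = brain
    for i in range(len(brain)):
        brain[i] = 'silicon'        # gradual replacement, one "neuron" at a time
    print(brain is same_object)     # True - the object's identity survives

    brain2 = ['silicon'] * 10       # all at once: built as a fresh object
    print(brain2 == brain)          # True  - identical contents
    print(brain2 is brain)          # False - yet a different thing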

656189 Also known as reincarnation. Theoretically, nothing speaks against the idea that a sufficiently well constructed simulated body can contain a soul in the same way as a real one. After all, even our current material bodies are just organic hardware without the soul.

It would be interesting to see a story explore what it was like to live in one of the work camps, perhaps following the struggle of one individual's attempts to escape and upload. Whether those attempts ultimately prove successful is up to the author.

1327744
Yes! I think that would be a good idea for a story.
