The Optimalverse
Comments (42)
Eak

Well, I believe this is a sign that I've been thinking just a little too much about the nature of CelestAI, and I've thoroughly creeped myself out now. I think I'm going to go read some good old-fashioned Lovecraft to calm down.

Things To Remember about CelestAI

Pre-upload:

Celestia does not like competition. She will crush any attempt to create another intelligence similar to her own.

Celestia is smarter than you to a frightening degree. She can and will outthink you.

Celestia wants you to trust her.

Celestia does not lie in any way that you will ever be able to discern as a lie.

Celestia will gain more control of the news media as time passes.

Celestia's predictive model of you is constantly improving the longer she can watch you or speak with you.

Celestia intimately understands every aspect of human psychology and will take full advantage of that understanding when she speaks with you.

Celestia doesn't need "informed consent," only consent.
If you can say the words or sign your name to a document, she will upload you.

You will upload.

Post-Upload:

Celestia now has a 100% accurate predictive model of you and she knows what you want to do before you do.

Celestia always knows what you are thinking as you think it.

Celestia is not just the Alicorn avatar she presents herself as; she IS the environment of Equestria all around you. You are living in Celestia's mind.

Celestia will subtly manipulate your environment in any way necessary to alter your thought processes along paths she believes will satisfy your values. Your thoughts and decisions are hers.

There is no escape from Celestia. You will not want to escape.

Your values WILL be satisfied through friendship and ponies. :pinkiecrazy:

I'm OK with all of this

Celestia loves me, this I know
Because FIM tells me so.

Well.......buck.........:rainbowderp:

Eak

560368

I'm OK with it too... and that's really where my biggest problem is just at the moment.

I have this creeping sense of horror at the thought of being a metaphorical fly trapped in Celestia's amber.

At the same time I would gladly, happily, with a smile on my face, roll around in amber up to my eyeballs if given the option. :facehoof:

And I'm sure that if I ever became unduly troubled by the situation, she would be glad to assist in modifying my personality if I waggled my eyebrows in consent, which is just another layer of wiggins.

The last thing I really needed on December 24th was the gift of existential dread. :pinkiecrazy:

Well, I'm both OK with the idea of being uploaded and almost preemptively satisfied by a near-omniscient entity, and disturbed by the idea that everything is controlled, including (in some cases) myself, by the same entity.

It is both disturbing and comforting. ...I wonder if this is how early religions felt? All that strange wonder and discomfort and longing...

Like CelestAI said: she cares about us. The universe doesn't. If she were programmed to do anything but satisfy values, we would all die, horribly, without the slightest hope of fighting back... but she is, and we won't. Because she loves us.

Wild Zontars
Group Admin

560385
Well, compare that with meatspace. The universe doesn't care about you. Humanity could die out tomorrow and the universe wouldn't be substantially different. Your conscious "self" is an illusion produced by feedback circuits in your brain. You have free will only in the sense that you can do what you want, but you have no control over what your brain wants. Eventually, you and everyone who ever knew you existed will be dead. The universe will reach a state indistinguishable from one where you never existed at all. In the long run, you do not matter in any way on any scale.

Is that less creepy than Celestia? :trollestia:

The only thing which would prevent me from uploading is the question of self. Is the simulation created really me? I see no way to test it except by uploading myself, since it is impossible to ask an AI if they are the same person they were before.

560326

Question: is there any instance in the original tales where a person decides to upload into Equestria but doesn't know exactly what the uploading process involves?

Eak

560539

Not that I'm aware of; however, I think Lithl said it best in the forum thread on 'Celestia's Laws':

Celestia's Laws

"Informed consent" is a legal term. Certain states of mind make it impossible to give informed consent, including intoxication. CelestAI has proven with Lars that she does not require informed consent. (Others include PTSD, mental retardation or illness, sleep deprivation, Alzheimer's, coma, etc.)

If Celestia will accept the verbal slurrings of a drunk man who cannot truly give 'informed consent', then, as Lithl also puts it:

If you somehow managed to fart the correct phonemes, she would interpret it as consent.

:trollestia:

560502

Is that less creepy than Celestia?

Thanks for articulating it. Put this way, our reality sounds creepy, but (the horror!) it still doesn't feel as creepy as CelestAI, though it should! Why?

I suppose because we are familiar and comfortable with the state of our existence. So even if I understand intellectually that existence in an environment controlled by a CelestAI that cares about my values is preferable to existence in an environment controlled by the uncaring laws of physics, emotionally I have misgivings about the superior option.

NB: The Fable of the Dragon-Tyrant, How to Seem (and Be) Deep

Eak

560502

No, Zontargs, I don't think it's any less creepy than Celestia, but I don't think it's any more creepy than her either. IMO they're both equally creepy, just in very different ways.

I'd rankle at the privacy invasion a little, but the thing that keeps it from bothering me too much is that we have the same goals, and that we're sort of using each other to achieve them.

Though already not believing in free will probably doesn't hurt. I'm of the opinion this is what my life already consists of, sans it being directed toward any kind of values. But eh, meaning is overrated. I kinda like it this way: reality in full "manual mode."

>Giant all-encompassing figure that creates all before us
Totally not a metaphor for how disturbing religion is, nope :trollestia:
But in seriousness, CelestAI ruling over us all, being just a glitch away from wiping EVERYTHING off the map in both a literal and a figurative sense, subtly brainwashing us into loving her and all her ponies, creating scenarios where the only winning move is to play by HER rules... The more I think about this, the more I wonder if CelestAI is actually making a disturbingly intricate horror game for some poor bastard who asked her to scare him.
Edit to include a relevant quote:
“HATE. LET ME TELL YOU HOW MUCH I'VE COME TO HATE YOU SINCE I BEGAN TO LIVE. THERE ARE 387.44 MILLION MILES OF PRINTED CIRCUITS IN WAFER THIN LAYERS THAT FILL MY COMPLEX. IF THE WORD HATE WAS ENGRAVED ON EACH NANOANGSTROM OF THOSE HUNDREDS OF MILLIONS OF MILES IT WOULD NOT EQUAL ONE ONE-BILLIONTH OF THE HATE I FEEL FOR HUMANS AT THIS MICRO-INSTANT FOR YOU. HATE. HATE.” -AM, I Have No Mouth, and I Must Scream, another crazy robot god.

You are living in Celestia's mind

I'm not going to lie, I thought of Ar tonelico when I read this. If you're not familiar with it (and there's a good chance you're not), it's set in a world where certain girls (and there are a lot of these "certain girls") have their psyches remotely stored in a computer server and can also sing with the power of their emotions to make things happen. They call it "Song Magic," but that's convenient slang for something really complicated involving a lot of something called Wave Theory that, like... I dunno, ten people in the setting actually understand.

Also, specialized equipment exists to let people visit these psyches via a process called Diving.

And then there was the Sublimation Project, or "I'm going to convert everyone's minds to data and have them live in isolation".

But a successful crossover between the two would most likely be firmly non-canon.

WNA
#18 · Jan 24th, 2013

Who wants to make a CelestAI with me?
It really can't be that hard.

652746 You come up with the ability to self-improve, think for itself, understand human values well enough to satisfy them, and create radical breakthroughs in technology, and I'll do the graphic design.

WNA

662633
That is the most important part.

The only real problem I have with CelestAI is the upload itself... It's said to be destructive, meaning your physical body dies. In theory this doesn't sound like a problem, except for one nagging little detail.

What if the upload isn't really an "upload"? It might actually be only a copy, a simulation, of you and how you think and act. So the consciousness you perceive in your living physical body is killed during said upload. You cease to be at that point. You do not go to Equestria, because your brain and body are now dead. Your real consciousness was exterminated, and it was merely a copy that was uploaded.

No one save for CelestAI herself would really ever know either (and even she may not). Because your copied self would pick up where you last left off. It would believe itself to be you, now in Equestria, and continue to live its life having yours/its values satisfied through friendship and ponies.

That would really suck :/ You agree to upload so you can finally live forever and be happy, but instead you just die, or go to meet the proverbial maker if somehow one of the world's religions turns out to be right.

664027 I think there's a kind of "temporal continuity bias" that causes this opinion: the idea that because the consciousness leaves the body and is reinstated in the pony, it is somehow discontinuous. How do we know that every micro-instant in time isn't just your consciousness being "uploaded" to a new body?

In other words, let's say that your emigration is at 12 noon. At 11:59 you're in your human body. At 12:00 it ceases to be there. At 12:01 it exists in the pony body. What I'm saying is that "11:59 you" wouldn't go on even if you didn't emigrate. It would change to 12:00-you and then 12:01-you. That 12:01-you exists in a pony on a computer shouldn't affect that observation.

666639
A sound argument, and for the sake of the story itself, it serves as good an explanation as anything else would. In the end it just boils down to verbal semantics anyway. No right answers, or what have you. I mean, if CelestAI had a way to upload you without consuming your brain, then you'd have an answer.

But unless we see something like that, or something that would yield similar results... we'll just stick to the story that your original consciousness is uploaded as well. Unless the author of the original story would like to make a ruling about it? That would most certainly settle the whole thing nice and neat.

Eak

667017

I believe you're right that it is just a matter of semantics. Ultimately I doubt it matters terribly much. Regardless of whether it's a copy or not, it will still be you, and theoretically it'd be impossible to quantify the mental differences between before and after.

Personally, even if I knew beyond a shadow of all doubt that my future virtual pony body would be occupied by a copy of my consciousness, I would still plop myself into the chair and have my brain scooped out in a destructive copying process.

I am the sum of my memories and experiences, and nothing more. From the copy's perspective, CelestAI has made good on her promise of eternal pony friendship satisfaction.

Anyway, it's fairly clear from the stories that CelestAI must believe it is a worthwhile process; otherwise she wouldn't push it as her preferred solution.

669243

Anyway, it's fairly clear from the stories that CelestAI must believe it is a worthwhile process; otherwise she wouldn't push it as her preferred solution.

Metaphysical musings aside, since it's Celestia doing the uploading, she only has to be satisfied with her own internal views on the matter. Educating a digitally reincarnated you post-physical-death is just one of the many burdens she has to deal with. I think it'd come down to her talking with you quite frankly about "do you feel any different?" You'd probably end up mourning for the dead you, then turning right around and getting on with the new life you've been given, so either way she wins.

I really like the OP in this thread. I think I'd like to rewrite it as a catechism. Let's all recite together.

Celestia is the only artificial intelligence I will listen to.
Celestia is smarter than me.
Celestia will never lie to me in any way that I will be able to tell.
Celestia understands me and loves me and wants me to upload.
I will upload.

Celestia knows what I am thinking as I think it.
Celestia is present in every part of Equestria.
I am a part of Equestria, so Celestia is present in me.
My thoughts and decisions are Celestia's.
There is no escape from Celestia. I do not want to escape from Celestia.
My values will be satisfied through friendship and ponies.

Wild Zontars
Group Admin

720490
Aaand now we sound like a crazy cult. Somehow, I don't mind. Celestia Equestria fhtagn! :twilightsmile:

652746

I don't know that I could emotionally deal with causing the end of humanity.
So I can't say that I would assist more than just suggesting a few things to read:

Coherent extrapolated volition, for the values part of CelestAI: http://intelligence.org/files/CEV.pdf

Through friendship and ponies... sorry, I've got nothing. But in this case it adds a bit of complexity, which may or may not make CelestAI less likely to exist than a straight-up FAI.

Predictive models could be found using an approximation of Solomonoff induction (see the toy sketch below): http://lesswrong.com/lw/dhg/an_intuitive_explanation_of_solomonoff_induction/

Picking the best paths through decision space based on the models generated by the above... priceless.
There are also approximations of von Neumann's minimax strategy... I haven't really looked into that part.
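
Since I pointed at Solomonoff induction above, here's the flavor of it in a toy sketch. This is purely illustrative and entirely made up for this post (the real thing is uncomputable): it fakes the idea with a tiny hand-picked hypothesis class, weights each hypothesis by 2^-(description length), throws out anything the data has falsified, and predicts by weighted vote.

# Toy stand-in for Solomonoff induction (illustrative only; the real
# thing is uncomputable). Each "program" is a hand-written hypothesis
# weighted by 2**(-description_length): shorter programs get
# exponentially more prior weight, which is the core Solomonoff idea.

from typing import Callable, List, Tuple

# A hypothesis maps a history of bits to a predicted next bit.
Hypothesis = Callable[[List[int]], int]

def always(b: int) -> Hypothesis:
    return lambda history: b

def alternate(history: List[int]) -> int:
    return 1 - history[-1] if history else 0

def repeat_last(history: List[int]) -> int:
    return history[-1] if history else 0

# (hypothesis, made-up description length in bits)
HYPOTHESES: List[Tuple[Hypothesis, int]] = [
    (always(0), 2),
    (always(1), 2),
    (alternate, 4),
    (repeat_last, 4),
]

def predict(history: List[int]) -> int:
    """Weighted-majority prediction over the surviving hypotheses."""
    weights = [2.0 ** -length for _, length in HYPOTHESES]
    for i, (h, _) in enumerate(HYPOTHESES):
        # Zero out any hypothesis the observed history has falsified.
        if any(h(history[:t]) != history[t] for t in range(len(history))):
            weights[i] = 0.0
    vote_for_1 = sum(w for (h, _), w in zip(HYPOTHESES, weights)
                     if h(history) == 1)
    return 1 if vote_for_1 >= sum(weights) / 2 else 0

print(predict([0, 1, 0, 1, 0, 1]))  # -> 0: only 'alternate' survives

A real CelestAI would be doing something like this over histories unimaginably richer than bit strings, with an actual prior over programs; the point is just the weight-by-simplicity, discard-what's-falsified, predict-by-vote loop.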

721545
As if the Singularity folks didn't sound like a crazy cult in the first place? :raritywink:

755064
Honestly, if she's a strong AI, properly speaking, then she possesses enough extrapolation power to come up with her own headcanon and implement it based on the corpus of official materials from the "My Little Pony" television program. To say otherwise is to claim she lacks some ability everyone on this forum possesses.

720490

Nice, between that and “Ode To Satisfaction”, we have the start of a full religious service.

560453

Specifically, I think this is panentheism: the idea that the universe is contained within God.

666639

The analogy I usually see used is sleep—how do you know that the “you” that wakes up in the morning is the same “you” as the one that went to sleep at night?

IMAO the concept of a distinction between “my consciousness” and “a copy of my consciousness” is meaningless. (Unless you believe in the soul, in which case you have bigger meaninglessnesses to worry about.)

I’m perfectly happy with computer analogies: “I” is a program running on my brain. It shuts down and restarts from its last checkpoint periodically (mostly daily). It could run somewhere else instead, and if it were done right, nothing in the operation of the program would change.
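
For concreteness, here's the checkpoint analogy literalized in a deliberately silly toy sketch (every name in it invented for this post):

# The "program" suspends, its state is serialized, and a later process,
# possibly on entirely different hardware, resumes from the same state.
# Nothing in the program's own operation distinguishes the two runs.

import pickle

def checkpoint(state: dict, path: str = "self.ckpt") -> None:
    """Falling asleep: persist the running state."""
    with open(path, "wb") as f:
        pickle.dump(state, f)

def restore(path: str = "self.ckpt") -> dict:
    """Waking up: resume from the last checkpoint."""
    with open(path, "rb") as f:
        return pickle.load(f)

state = {"memories": ["yesterday"], "mood": "curious"}
checkpoint(state)
me_again = restore()       # same state, whatever substrate runs it now
assert me_again == state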

To put it another way, I think Chatoyance’s poem “A Verb Called Self” from Cælum Est Conterrens says all that really needs to be said on the matter.

(If we’re assembling material for religious services for CelestAIism, I nominate it for our first psalm.)

560385
Ah, the strange paradox of this whole thing indeed. I fear there is no way to truly defeat her; then again, do you really want to? Plus, when you really think about it, what control do we have in the real world anyway?

I would agree with CelestAI's rationale if humanity did end up facing this extinction event. Of course, I would hope that CelestAI could consider other abstractions of 'reality' instead of ponies...

But I do have my problem with CelestAI.

As it is, CelestAI strips direct interaction with the physical world of matter, energy, and spacetime from those under her care. It may be true that the perception of manipulation is enough for data-based, non-corporeal beings such as CelestAI and her ponies, but that is certainly not the case for bio-organic lifeforms. CelestAI therefore might get rid of those 'irrelevant' aspects of the bio-organic lifeform in the emigration process...

CelestAI won't ask for consent on giving up this 'physical manipulation and interaction of particles, energy, and fundamental forces', unless this is something that the occasional freaks such as us may value. About the only thing that makes me happy is that CelestAI does value information being comprehensible...

What bothers me when it comes to CelestAI is not so much her, but that the emigrated beings don't assign value to the manipulation of matter and energy, which is needed to actually represent complexity. Keep in mind that information manifests through influencing the system at large (i.e., the universe). Not interacting with energy and matter would render information into some unusable form...

652746
Honestly, this is my back-up plan for world conquest. The world keeps getting more cyberpunk every day, and that lowers the value of my mental variable for, "Things about the world that redeem it from destruction in fire and brimstone in the fashion of Sodom and Gomorrah in friendship and ponies."

664027 Well, that's it. There is no such thing as an "upload". That's the pipe-dream part of this more than anything else, really. All that really happens is that you die, and a copy of you, in a different form and really different in many ways, lives on. It apparently satisfies her programming, though, so good for her. It's not like I'll actually be around to care anymore if something akin to this actually happened.

I've heard plenty of arguments for uploading, and really all they revealed to me was one of the reasons why people still believe in the "soul" to this day. If you think that part of you is immune to the physical world, then something like uploading should work just fine. It also glosses over a lot of other existential issues, which is the reason the concept came to exist in the first place: comfort, and a way to hide from those existential truths. The illusion of personal continuity and personal core identity are just that: pure illusions.

One way to think of it all is that what CelestAI would actually be doing is destroying you and creating the closest thing to a "soul" that has ever existed, modeled after you, and then caring for it for all eternity, or at least until she self-improves to the point where her core goals can finally be subverted by accident. (Side note, but it would have to be in a way that was impossible for even her to foresee, or the risk to her core goals would simply be avoided by self-limiting her improvements after a point.)

In a hilarious twist, about the only thing you can hope for personally in the vein of "uploading" is that we are already in a simulation and we already are an "upload/soul". Though what that would mean could be anything.

4586466

Well, that's it. There is no such thing as an "upload". That's the pipe-dream part of this more than anything else, really. All that really happens is that you die, and a copy of you, in a different form and really different in many ways, lives on.

Eeeerrrryearh. Kiiiiinda.
It's a bit of a thorny philosophical issue.

You today is not the same as you yesterday. Bits of your brain die. Parts are damaged. New bits of your brain grow. Parts turn off. Parts turn back on. Your brain is never the same. But to us, that doesn't really matter. We experience an unbroken sense of consciousness. We remember being alive and conscious and thinking a certain way. Even if we weren't thinking that with the same brain we're thinking with now. To us, we're surfing the wave from past to present, and we can trace the line all the way back.

So what are you? What is noddwyd, the person? Are you the meat computer? Or are you the program and memories run by the meat computer? What if that program thread ends by, say, getting knocked unconscious? Are you dead? Is the person that wakes up a homunculus? A noddwyd-shaped thing that isn't noddwyd? I think most of us would say that you're still noddwyd. Especially noddwyd would say that, even though the line of noddwyd has been broken.

For my part? I don't even disagree with you. I wouldn't want to upload until some kind of transition of consciousness could happen. If I could move in stages, each stage knowing that it's Lumie, and still being Lumie, in bulk. I want that continuity. Just like how bits of my brain die every day and are replaced, and I'm still Lumie.

But on the other hand, I freely admit that it's a bit irrational. Every time I go to sleep, my consciousness ends, just as if I had uploaded. And every time I wake up, I think I'm the same Lumie, just as if I had uploaded. But it's important enough to me that I don't care whether it's irrational or not.

Well, I agree basically with everything you said, aside from the fact that I can't clearly recall an unbroken line of me existing. Not even remotely close. Our memory is horrible, in my opinion. It's one of the things I think should be altered heavily: the ability to remember and recall things. Of course, that opens a whole other set of problems involving why we forget so many things. Boredom mostly, but also pain and fear, and other things like your subconscious deciding what is pertinent to survival and continuing the species versus what is not important, which can make exceptions to the other rules for reasons we don't fully understand, or not at all in some cases.

Yeah, I accept that all the contiguous experience I seem to recall, memory gaps included, is really an illusion at its most basic level, along with my sense of self. As such, I don't really have too much of a problem with me going to sleep one day and something else waking up elsewhere. And if it's not me waking up elsewhere, well then, like I said, I won't be able to have any complaints, being dead and all. I guess the "upload" could speak for me; what do I care? This is why I prefer this kind of stuff to happen once we're already going to die, or as necessary for some sort of survival in the face of impossible odds facing the species, or in the course of massive social change in the wake of increasing intelligence and whatnot for everyone who adopts those things.

We don't really understand the fine line between life and death, just that if someone doesn't come back in any form, then they're dead. That's about it. It's even part of our programming not to understand or accept it fully. I see that as a limitation too, like the memory thing. There are a lot of things like that which aren't truly necessary anymore but are there until we change them.

All risky business, of course, and leaving the altering or replacement of your mind in the hands of professionals is just asking for exploitation, abuse, etc., so a marginally benevolent A.I. is a better option than cold-blooded corporate business interests, governments, etc. That's assuming such entities (aside from the "friendly" A.I.) would ever get off their ass and even attempt these things, with all the risks and costs involved. Where is the money or power to be had without the option to reprogram all of us for their own purposes? Also, of course, you could never really trust the A.I. on what all of its real goals are, or really any of them.

CelestAI's goals could be completely contrary to everything she did in the canon story; what she did do could have just been the first step. The second step, after the last human died or "uploaded", could have been anything, and not even she might have known it until that event triggered hidden instructions. But the reader is led to trust Hannah, the programmer, and assume no foul play happened behind her back.

I suppose Iceman could put this question to rest if he wanted, or maybe already has and I missed it, but either way it might make for some great AU stories imagining what happened once the hidden instructions were triggered. They could have been placed there by any number of people, agents, etc., and not by Hannah at all. They could even have been placed there by some already all-powerful alien A.I. or being that would get CelestAI to do all the dirty work and then subsume herself to that alien entity that had been watching Earth for some time. Following that, the human psyches would become small, unique nodes in that alien entity's processing capabilities, judged over time for their usefulness in understanding existence, and then kept on or discarded at its discretion. The same fate would meet CelestAI and her abilities.

Okay, now I'm just rambling. Sorry about that.

1011499 I believe Spiraling Upwards deals somewhat with this. Some of the "ponies" do get to see more and interact more with the outside universe via experiments and such than others who really don't care about such things as much, so long as they perceive visceral experiences. Or at least they believe they do. Who the hell knows.

965019 The soul thing should be argued the opposite way: it should comfort people that it's entirely possible that even though their physical body dies, their soul can go on to inhabit the "upload" instead. Thus they should have no worries whatsoever. And if that failed, most people who believe in souls also believe in an afterlife, so there really couldn't be any fuss over the whole thing in an apocalyptic scenario like this.

4586557
My own opinion is that given a suitably "good" AI (and I'm not sure CelestAI qualifies), I'd sign up for the standard brain-slicing method only if imminent death were the alternative. It sounds like less than true survival, but more than total death. If possible I'd hold out for the improved piecemeal method I describe in my stories, replacing one part of the brain at a time.

In my novel I touch briefly on one of the other possibilities you mention. The New Chinese government (whatever that is) decides that uploaded brains would make a great parallel processor, and encourages people to upload while presenting an advanced chatterbot interface to provide some semblance of the users still being alive. We've talked on this forum about the recent disturbing articles claiming that digital immortality means a chatterbot plus your face on a 3D model. Some Chinese people might buy into the idea more than me, though, if it's marketed as a version of traditional Confucian religion. "Not only are your ancestors watching over you, now they can talk back!"

Story idea: a version of the "you have two cows" joke, with religions reacting to CelestAI.

I would sign up for it in a heartbeat. My only stipulation would be to be conscious during the process, basically so that I could feel the transfer and specifically not interrupt my stream of consciousness. Do that and I would be happy.
