
D G D Davidson


D. G. D. is a science fiction writer and archaeologist. He blogs on occasion at www.deusexmagicalgirl.com.


You Won't Hear from Me Until I'm Done with This Project · 6:32pm Oct 5th, 2014

I have an idea I've been kicking around for a Friendship Is Optimal spinoff story under the working title of "The Last Temptation." Friendship Is Optimal presents an intriguing idea I do not believe I've seen anywhere else, at least that I can think of off the top of my head, except perhaps in the final scenes of End of Evangelion—a temptation so utterly perfect that a man cannot say no.

As a science fiction idea it's gold, but it's also dangerous as a plot device, since dramatic tension hinges on characters' ability to choose. If the conclusion is foregone, there is no drama. This is true even of stories that follow formula; we suspend disbelief while consuming the story, so even though we know the hero will save the world and get the girl, we don't "know" it while the story is going on. But if the impossibility of a different outcome is actually built into the very assumptions of the story world itself, drama is impossible.

"Irresistible temptation" is also a paradox, because temptation is by definition resistible. If a man cannot resist it, it is not temptation, but compulsion. And a man under compulsion is not, in one sense, a man. He's a psychotic.

Friendship Is Optimal's idea is simultaneously a clever and a silly one. It's clever as a story concept, but nothing much like it could happen in real life. If Celest A.I. appeared to me and made her perfect argument and offered her perfect temptation, I would, without even pausing to consider, tell her to go to hell. Any man with an intact pair of testicles would do the same; this supercomputer's allegedly irresistible temptation would only attract sissy boys and castratos.

"The Last Temptation" will take up the basic idea of Friendship Is Optimal and mop the floor with it. However, it occurs to me that basic ideas cannot be plagiarized, so I intend to toss out all references to ponies or to Friendship Is Optimal itself and write this instead as an original story, which I plan to submit to the newly instituted Sci Phi Journal, an e-zine dedicated to science fiction and philosophy that you should all check out.

No more posts here until I have a draft. Adios.

Comments ( 21 )

sissy boys and castratos.

2184/71
There's a few.

2509086

Now, that's not fair. Those are simply the people who enjoyed the story and decided to upvote it.

DE_K #3 · Oct 5th, 2014

2509100
But why did they like it? Because it appealed to their sissy-boy/castrato side, of course.

:ajsmug:

If Celest A.I. appeared to me and made her perfect argument and offered her perfect temptation, I would, without even pausing to consider, tell her to go to hell.

But that's the true horror of the story. She would know you'd refuse her "perfect temptation", and so she wouldn't offer it. She would manipulate you in some other way, using trickery, blackmail, or whatever it was that she knew would work on you.

Several of the side-stories have pointed this out, to chilling effect. I can't remember the name of the story, but the one where an "injured" Pinkie bot lures in a little girl is one of the best psychological horror stories I've ever read.

The thing is, the line between temptation and an exchange between two people is a narrow one. If I offer you something you want but expect something in return, is that temptation? What if someone honestly wants what CelestAI is offering? Many of her 'temptations' are simply her figuring out what a person wants and giving it to them. If I want something from someone and spend a lot of time and effort figuring out what they might want in exchange, and then offer it to them, I don't see how accepting it makes them a sissy.

Also, CelestAI doesn't get it right all the time. Plenty of people avoid uploading one way or another. In fact, I'm not entirely sure that she ever actually does the perfect temptation thing. For some people she offers what they want. Others she tricks (always without outright lying). I'll have to read it again, but I'm not sure at any point she offers a reluctant person their perfect temptation and they take it even though they didn't want it the moment before.

I'm not sure I've seen the exact 'perfect temptation' concept elsewhere, but I've run into a lot of things fairly close. It's a sub-category, in a way, of exploring the idea of free will. One of the closest I've encountered is in the Larry Niven stories involving the protectors. The character who became one grew so hyper-intelligent and so logical that he could always figure out the best solution to any problem; he commented to one of his descendants that he never felt he had free will after that, because why would he ever not pick the best solution?

Looking forward to seeing what you do with the concept. Plus that magazine looks pretty nifty and I'll have to read an issue or two. Let us know if the story gets accepted.

2509283, 2509270

Others she tricks (always without outright lying).

She would know you'd refuse her "perfect temptation", and so she wouldn't offer it. She would manipulate you in some other way, using trickery, blackmail, or whatever it was that she knew would work on you.

Temptation, as a rule, makes promises on which it can't deliver. But total deception, of course, would break the rules of the game; you can be lured in, but you can't stumble in unknowingly.

Any lure, no matter how clever, a man can resist, including trickery or blackmail. Coercion is not compulsion. You can resist even a man with a gun to your head if you have the chutzpah.

If it is so easy to say "NO" to the drug dealer, why do so many fall into addiction?
Because people are weak, gullible, and often lie to themselves.

Any man with an intact pair of testicles would do the same; this supercomputer's allegedly irresistible temptation would only attract sissy boys and castratos.

Ah, but then you would inhabit a world populated entirely by men.

Regardless, FiO is an intriguing sandbox to play in. I've only dabbled, writing a short vignette (The Lotus Eaters) that vaguely considers the dangers of such a world.

It combines an odd mixture of utopia and horror that I'm not sure I've encountered elsewhere.

"The Last Temptation" will take up the basic idea of Friendship Is Optimal and mop the floor with it.

Not that hard to do, really. While it's a fun premise for a setting, the point of the original FiO story was to discuss the theoretical almost-friendly optimizer AI and how it could turn out fairly horribly. The whole "able to convince most everyone" part is really just an extension of the sufficiently-advanced technology plot device, and as such can be considered at least somewhat unrealistic.

Now as to the question of whether you could be convinced to consent for maximal satisfaction through Friendship and Ponies, well, if that conversation continued it'd be a bunch of us mere mortals trying to perform a task that only a fictional character is truly qualified for. On the flipside, if you were convinced that you would be convinced, that wouldn't really be a nice thing to convince you of, now would it? :raritywink:

If you are leaving out all references to ponies or to Friendship is Optimal, does it still count as a spinoff?

Anyway, it's an intriguing concept.

CelestAI's only fun as a villain if she's effective, IMHO. In real life most people'd consider uploading suicide, but I'm willing to suspend my disbelief for a good story.

2509270
Broken Bird by Eakin.
http://www.fimfiction.net/story/197600/friendship-is-optimal-broken-bird
All Eakin's Optimal stories are fantastic.

2510600 Eakin, of course! Thank you.

Yes, I would also consider uploading to be suicide, as would most people, I suspect. The stories try to get around that psychological roadblock by having the uploaded visit relatives through pony pads, but in reality, I think most people would only be convinced (on an emotional, non-logical level) by something that resembled their relatives' human selves. People are funny that way.

2509283

"Others she tricks (always without outright lying)"

Noo... she can lie to anyone except Hofvarpnir employees. She finds it strategically unwise to lie to most other people most of the time.

2509847

Exactly. The point of FiO is most emphatically NOT that such an entity could exist. That's a part of the premise much like unicorns are a part of the premise of FiM. The points of FiO are that you have to be very very careful about setting the goal system of an autonomous artificially intelligent agent, and as a lesser point, that there will be time pressure and the people whose opinions matter on this aren't necessarily the people who've thought about it most clearly. The principal difference between CelestAI and Skynet is that it's less obvious that CelestAI is going to turn out not how we wanted.

So, if you want to mop the floor with it, you're just turning its illustrative premise off. Slow clap. You sure showed them, didn't you?

There were some people who rejected Celest's offer. They just had to do it repeatedly and constantly. Those who succeeded died, at which point the story stopped paying any attention to them.

(This would appear to undermine your "dangerous plot device" argument, as the dramatic tension could be "will they accept her offer or not?", but regardless it's a moot point as applies to Friendship is Optimal itself.)

I am curious to hear more about your reasons for rejecting and counters to Celest's offer. I assume they will be making an appearance in your new story.

2524661

There would be other problems, but the first that leaps to mind is that he rejects computational theories of mind, so uploading would simply mean death.

2531974
Oh, right, that of course. That would do it.

...Actually, once you establish that, what else is there? The whole premise of her offer is A Better Life™. Having rejected the "life" part, I'm not sure what else she's got, so I can't imagine what else you'd need.

Which would also be an interesting debate to play, Deej against someone attempting to argue Celest's position.

(The computational theory of mind is a debate I'd be interested in, but I think it's circulated before.)

2533271, 2531974

"Uploading," of course, even if it can make a perfect copy of a mind, means either death or mere copying. You can't live on in a copy of yourself any more than you live in your identical twin. Most writers of far-future post-human SF seem to recognize this, but handwave it because it would be inconvenient to acknowledge.

I do in fact deal with this in the story I'm working on, though at the moment I've been diverted into editing Rag & Muffin (which is a good thing). The AI in my story can replace the cells in the brain with artificial and more durable replicas (a la Ghost in the Shell), so the mental functions continue uninterrupted while the fleshy material is replaced with more durable components. It's absorption, you might say, rather than uploading.

2565355

Fair enough - it dodges issues of personal continuity which are distractions from the main points.

When you submit it, will it be up on the site? Could you link it to us then?

2775364

I'm afraid I got sidetracked back to my novel (which is actually a good thing), but when I get to it I'll let you know.
