LessWrong 316 members · 64 stories
Comments (15)

Should we compile a list of stories that mostly go against rational, pro-optimisation values?

I mean like this one, where it's set up such that an intelligent approach to a problem doesn't work and they should have trusted in the magic of friendship (read: desperation).

I don't mean this idea as an attack on any stories, or as encouragement of rude reviews! No trolling the authors plx. It's more a kind of "if you don't understand what we're about, here are some of the things that frustrate us."

Titanium Dragon
Group Contributor

I haven't read the story in question, but it is worth noting that in a world where the magic of friendship is literal, not taking it into account in your plans is irrational. If your goal is to optimize outcomes, and you get inferior outcomes by acting "rationally", then you aren't actually acting rationally.

4767161 From what I've read of the inspiration, I have a feeling that this is Basilisk territory. Changing such a fundamental premise of reality would necessitate restudying rationality almost from the ground up.

4767009

That said, though, I'm not sure that this is a good idea. I don't think that codifying our antithesis is as effective as showing our rationale directly, and it may inadvertently draw ire to us.

Titanium Dragon
Group Contributor

4767280
Not really. Rationality is about drawing rational conclusions from whatever arbitrary rules exist. Indeed, rationality is itself a set of arbitrary rules - all logic is. Whatever the rules of reality are, you can apply logic to them in order to optimize outcomes.

We don't even know all the rules of reality, and yet we still apply rationalistic principles to our daily lives.

Bad Horse
Group Admin

4767009 4767285 I think what you're asking is, Can we have a folder for "Anti-rational stories"? My thought is: Who's responsible for deciding what goes into those folders?

I won't do it. I can put stories into LW with a preliminary skim, but I can't render a judgement against them without a thorough evaluation. It's not worth my time. And if somebody else is willing to put the time into doing it, well, they should put their time into evaluating the stories in "Nominated" instead, which are more important.

It's a reasonable idea, but as mod, it feels like something that would take my time and make people angry at me, and would steal time from evaluating positive-example stories which we can't spare. I wouldn't consider doing it unless "Nominated" were cleaned out.

4767429 Did you miss the part where I said Basilisk? As in, the intellectual Kobayashi Maru? The only winning move is not to play?

This restructuring of rationality would necessitate thinking rationally about when the rational outcome requires you to not think rationally. I'm all for thinking rationally, and especially using that word to encompass every possible win condition, but we're talking about antitheses, here.

4767285
4767613

Fair enough. I was in 2+ minds about the idea myself. (Note I'm not saying the story was badly written - more that it ties itself in quasi-religious knots making Celestia right and Twilight wrong - but I suspect including it in a "what not to do" list would make it look like we were calling it generally terrible. And I'd hate to do that.)

4768587

Aye - I felt this was more or less where the linked story was going. The FiM-verse, by contrast, has high-INT, low-WIS Twilight muddling through and doing the best she can; she does learn from experience and tries to be sensible wherever that doesn't get in the way of the show writers' Rule of Funny (and occasional Idiot Ball holding).

I dislike Reality Check's worldview, but I thought his story Parting Words (to which Overconfidence, linked above, was a reply) was funny and on point.

Titanium Dragon
Group Contributor

4768587
A small strike force is often better than an army for many tasks. When the US took out Osama bin Laden, it sent in only a very small team, which allowed the strike force to get in and out. If you live in a world where enormous amounts of force can be concentrated in a small number of individuals, then using a small force gives you a number of advantages (stealth, enemies underestimating you because of your small numbers, the enemy being unable to distinguish you from a non-threatening group like scouts, etc.) without a serious drop in firepower.

Indeed, in real life, the US took advantage of this principle against Japan: the planes that dropped the atomic bombs flew in small groups that were indistinguishable, to the Japanese, from reconnaissance flights. Japan lacked the fuel to intercept every formation that came over, so its best option had been to devote its remaining resources to fighting the firebombing runs. Once the US could destroy a city with a single plane, Japan had no effective defense: it couldn't afford to scramble fighters against every scouting group on the chance that one carried a bomb, but if it didn't, its cities would simply get nuked - and even if it tried, it would run out of fuel and be unable to stop future strikes or the firebombing of its cities. It put them in a position of Zugzwang.

It isn't rational to believe that the same rules apply to all situations to begin with. The idea that you'd have to rework rationality in a world which worked differently is wrong; rationality is about taking rational actions in the context of the situation you're in. Rationality is a process, not an end. What is rational in a world with different physical rules is going to be different, but that's not because rationality is different, but because the rules of the world are different.
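To make "rationality is a process, not an end" concrete, here's a toy sketch - all the names and payoff numbers are invented for illustration, not taken from the story or anywhere else. The decision procedure is held fixed, and only the world model fed into it changes:

```python
# Toy sketch: one fixed decision procedure, two different "worlds".
# All names and payoff numbers here are invented for illustration.

def best_action(actions, outcomes, utility):
    """Pick the action whose predicted outcome has the highest utility."""
    return max(actions, key=lambda a: utility(outcomes[a]))

actions = ["small trusted group", "large army"]

# World A: raw force dominates; trust confers no extra power.
world_a = {"small trusted group": 20, "large army": 80}

# World B: friendship is literally magic, so trust multiplies effectiveness.
world_b = {"small trusted group": 90, "large army": 40}

utility = lambda outcome: outcome  # outcomes are already utility scores here

print(best_action(actions, world_a, utility))  # -> "large army"
print(best_action(actions, world_b, utility))  # -> "small trusted group"
```

Same procedure, different facts, different optimal action - which is the whole point.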

In a world where friendship is literally magic, trusting your friends literally makes you more powerful. This makes trust a much stronger commodity in Equestria than it is in real life. Moreover, being able to power magical gems with friendship gives you an objective measure of it - you know that these people are friends because the magical shiny jewels wouldn't work otherwise.

4768730

I agree with you. The fic in question is not anti-rationalism; it is just anti-overconfidence. There is a difference.

The bottom line of rationalism is to find the proper way to win. If you can imagine a situation where your rational, pro-optimization methods fail against other methods, then you are not defining your optimization methods correctly.

EDIT: By the way, here is an article from LessWrong about this very concept:
http://lesswrong.com/lw/nc/newcombs_problem_and_regret_of_rationality/
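
For instance, here's a back-of-the-envelope version of the Newcomb calculation that article is about, using the standard $1,000,000 / $1,000 payoffs and a predictor with accuracy p (the crossover threshold just falls out of the algebra):

```python
# Expected payoffs in Newcomb's problem, given a predictor with accuracy p.
# Standard payoffs: the opaque box holds $1,000,000 iff the predictor
# predicted one-boxing; the transparent box always holds $1,000.

def ev_one_box(p):
    # Predictor right with prob p -> opaque box is full, you get $1,000,000;
    # wrong with prob (1 - p) -> opaque box is empty, you get nothing.
    return p * 1_000_000

def ev_two_box(p):
    # Predictor right with prob p -> opaque box is empty, you get $1,000;
    # wrong with prob (1 - p) -> you get the contents of both boxes.
    return p * 1_000 + (1 - p) * 1_001_000

for p in (0.5, 0.6, 0.9, 0.99):
    print(p, ev_one_box(p), ev_two_box(p))
# One-boxing dominates once p > ~0.5005: which strategy "wins" depends on
# how reliably the world's rules link your disposition to the box contents.
```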

4768730

Okay. There are three possibilities:
1. You have never heard of the infamous Basilisk, and have no idea what I'm talking about. Good for you. Tell me so and I'll explain it differently.
2. You refuse to discuss Basilisk-class problems, since the best defense is not to engage. Something something Litany of Gendlin, considering that you are not actively engaged in a Basilisk, or likely to become so, as this world does not run on fuzzy feelings.
3. You somehow think that a long-winded analogy that didn't really seem to go anywhere was an adequate answer to how to combat Basilisks. (If you honestly think you do have an answer, you should probably let Eliezer know - he might be interested in that.)

In my opinion, you are practically descending into Applause Lights and No True Scotsman, and it's rather annoying to have to keep addressing the superficiality of your comments rather than your arguments themselves.

If any of the above capitalized terms is unfamiliar, please do say so, or Google them. For most of the internet, I don't mind a large epistemic distance, but as you are a moderator on a LessWrong group, I was hoping for a little more.

Titanium Dragon
Group Contributor

4768789
This has nothing whatsoever to do with basilisks of any sort, Roko's or otherwise. Indeed, a story about basilisks would fall under the opposite category - stories that Less Wrong folks would be interested in - because basilisks are interesting to a lot of them (hence all the attention Roko's Basilisk got). There's nothing basilisk-like about this story, and in any case, basilisks are stupid anyway - they're utterly harmless unless you drive yourself crazy over a mental problem which doesn't correspond to anything real.

The story in question was a response to a story which was ranting about the Crystal Empire episode, and how Twilight and Celestia's behavior in that episode was poor and irrational. The response fic has Twilight's attempt to bring a small army with her blow up in her face, dooming her men and resulting in the Crystal Empire disappearing again, remaining under Sombra's rule, and the group only barely escaping with their lives. There's nothing basilisk-like about it - the story is about Twilight thinking that she could roll in with an army and steamroll Sombra without understanding why Celestia didn't do it that way in the first place, and why Celestia might employ the method that she did.

There's nothing anti-rational about it, in the end; it was merely a response fic to a response fic. It wasn't particularly good, I don't think, but there's nothing basilisk-like about that.

Rationality is about making decisions based on fact and reason. What the facts are may change in any given situation; indeed, in real life, as we make new discoveries, we update our beliefs to match the evidence we're given, in proportion to its strength. In real life, we don't know all the facts, and a rational individual is aware of this. And if our beliefs about how the world works are wrong, we change our beliefs and behavior so as to act more optimally.

If you are looking at a hypothetical situation - about a different world, or about a hypothetical construct - rationalism doesn't change one whit. It is merely the results you get which change depending on the facts and strength of evidence you're provided with.

In this regard, it is much like science. Many people think of Science as the set of things that scientists say are true, but ultimately, science is a method for understanding the world. The scientific method is a powerful thing because it doesn't care which reality it is being used in - people use the scientific method in video game worlds all the time to discover how stuff works or to try stuff out. There's no reason why you couldn't use the scientific method in any given universe. If you live in a universe where friendship is a literal extrinsic magical force, why does that change rationalism or science? It doesn't. It changes your knowledge. It changes how the world works. It changes what things you think are plausible, and what actions you might take. But it doesn't change the process by which you explore the world or make decisions based on that knowledge: you're still testing hypotheses and acting in accordance with the facts.
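
As a hypothetical sketch of that video-game point (the game mechanics and every number here are invented purely for illustration): hypothesize a hidden drop rate, gather trials, and check the hypothesis against the data.

```python
# The scientific method applied inside a game world: hypothesize a hidden
# drop rate, run repeated trials, and compare the result to the hypothesis.
# The game mechanics and numbers are invented purely for illustration.
import random

def run_trials(true_rate, n):
    """Kill n monsters; count how many drop the item."""
    return sum(random.random() < true_rate for _ in range(n))

hypothesized_rate = 0.10                     # say the wiki claims 10%
drops = run_trials(true_rate=0.05, n=1000)   # the world's actual rule differs

observed = drops / 1000
print(f"observed rate: {observed:.3f} vs hypothesized {hypothesized_rate}")
# With 1000 trials, an observed rate near 0.05 is strong evidence against
# the 10% hypothesis. The method is the same whether the "universe" is
# physics or a game engine; only the facts you discover differ.
```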

The fact that the facts of that world and the facts of the real world are not the same is so much water under the bridge. In a universe where friendship is magic, only a fool would deny it.

There's nothing that requires "reconstructing rationalism", unless you've made rationalism into a set of religious beliefs or doctrines (which isn't actually rational to do, but many people fetishize it). What you're doing is rebuilding your assumptions, not rebuilding rationalism. In the real world, we assume that souls don't exist because there's no evidence that they do and a lot of evidence that they do not, but if we were thrust into the magical world of Equestria, we'd have to question that assumption because there is a lot of new evidence (cutie marks, behavior being affected by having magic drained away, shapeshifting) which might indicate that we are wrong about it. That doesn't mean that we are wrong, but we should be a lot less sure of our assumptions about other worlds when significant things have changed.
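
That "less sure" has a precise form - Bayes' rule: posterior odds equal prior odds times the likelihood ratio. A minimal sketch of the souls example, with made-up numbers standing in for the real (unknown) probabilities:

```python
# Minimal Bayes update: how much should new evidence move a belief?
# The prior and likelihoods below are made-up numbers for illustration.

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) from a prior and two likelihoods."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Hypothesis: "souls exist" in some world. Observing magic that directly
# manipulates minds (cutie marks, magic draining, shapeshifting) is far
# more likely if the hypothesis is true than if it is false:
prior = 0.01
posterior = update(prior, p_evidence_if_true=0.8, p_evidence_if_false=0.05)
print(round(posterior, 3))  # ~0.139: strong new evidence, much weaker doubt
```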

There's nothing contradictory about saying that someone in world X should do action Y, even when in real life action Y would be a bad idea, because the rules of world X mean that instead, it is a good idea.

Indeed, it can be a healthy intellectual exercise to think up hypotheticals and see how they play out. Being able to contemplate things which are untrue is a useful ability to have, because you never know when something you believe to be untrue might turn out to be true. Mental flexibility is helpful in general, and can also help with your empathetic abilities, in understanding how other people think, feel, and might behave.

Bad Horse
Group Admin

4768789 (CC 4768730) He just doesn't think this is a basilisk.

But whether or not it is: Say we were taking geometric measurements in the real universe, and we found out Euclid's fifth postulate was wrong. You're saying we'd have to rebuild logic from the ground up. TD is saying we'd just throw out that postulate, and maybe find a different one, but still use the same rules of inference.
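
To spell the analogy out (this is a standard textbook example, not anything from the thread): replace the parallel postulate, keep the rules of inference, and a familiar theorem changes while the logic doesn't.

```latex
% Same rules of inference, different postulate, different theorem.
% Euclidean plane (parallel postulate holds): angles of a triangle sum to pi.
% Hyperbolic plane (postulate replaced; constant curvature K = -1),
% by Gauss--Bonnet: the sum falls short of pi by the triangle's area.
\[
\underbrace{\alpha + \beta + \gamma = \pi}_{\text{Euclidean}}
\qquad \text{vs.} \qquad
\underbrace{\alpha + \beta + \gamma = \pi - \mathrm{Area}}_{\text{hyperbolic},\ K = -1}
\]
```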

book_burner
Group Contributor

4767009 I deeply disagree with having such a folder of stories. If the space of possible ideas is a vast, vast space, and the space of useful, rational ideas is a tiny space, then giving people stories from the vast space of bloody-stupid is far less useful than giving them stories from the tiny space of useful and good.
