LessWrong · 316 members · 64 stories
Comments (22)

So everyone's favorite science denier RealityCheck (insert all your audible groans *here*) decided he was tired of being mocked for being a YEC and wrote up a defense of YEC. Or at least he tried to.

It went about as well as you would expect. Perhaps the highlight is that RC apparently cannot tell the difference between religion and science: he claims repeatedly that he does not hold his particular religious views because of faith or any such piddly thing, but because scientists have proven his religion. Through experimentation. Yes, really. He managed to fail BOTH science AND theology forever.

Anywho, I wrote (mostly for my own amusement, but also hopefully in a manner that is educational) a long, long, long, LONG, detailed, well-researched rebuttal (including literally dozens of links to research papers, etc.) to RC's rant. I also begin by copying and pasting his original at the top of the post.

I hope you read it and enjoy. Any constructive criticism is welcome, though I would prefer that you leave it in the comments of the blog post itself rather than in this comment thread, as more comments drive up the traffic to the post.

Very thorough. I'd like to think I learned a thing or two.

I am going to do something a bit unpopular and defend RC a little. I certainly disagree with many of his views, but he has come out very strongly in favour of universal life extension, and to my knowledge he has never done anything as hypocritical as defending capital punishment. For that reason I consider him something of an ally: even if we disagree on everything else, if we succeed in our shared goal we would have time to sort out all of our other disagreements.

Hope springs eternal, I suppose

book_burner
Group Contributor

4619419 You know what? I just don't care. Sorry. I know he's denying science, I know he's a religious weirdo in certain ways, but you know what?

The Great Alicorn Hunt is freaking fun, and is an actual example of transhumanist literature that doesn't make me feel like I'm surrounded by self-centered psychopathic creeps. It's something I can use, at least in company where pony material is admitted, as a counterpoint to shit like The Transhumanist Wager. And I need that, because without it, it gets fucking hard to explain why the hell I even hang out with these people.

4619551

The Great Alicorn Hunt is a lot of fun. It was my first exposure to transhumanist pony fics. Whatever RC might say on his blog, he does treat the characters with respect, at least in the stories of his I've read.

Oh for fuck’s sake! He was banned and his blog post deleted. Apparently disagreeing with Reality Check is a bannable offence now? It’s especially galling considering the sheer amount of information that was in the post, and the fact that it could quite easily have served as a convenient reference for anyone needing to rebut the “a true Christian believes in Young Earth Creationism” crowd.

4619705
Because "banned for disagreeing with Reality Check" is the first and most likely cause of his ban anyone should think of.

Did anyone get a copy of that blog post? Google's cache doesn't seem to have picked it up.

Edit: Something everyone should think about before getting into long arguments: "Someone is WRONG on the Internet!"

4619888
I cannot be certain that he was banned for that, no, but even if he did something else that would warrant his being banned, it still wouldn’t warrant the deletion of the blog post, as a refutation of poor reasoning isn’t a personal attack.

4619551 I think of him as the Orson Scott Card of pony fiction: he can word pretty good, but it's best (at least for the blood pressure) to ignore most of the things he says outside a story.

4619551
Christ, another thing I have to read now? The real tragedy of the human condition is how many browser tabs it forces you to leave open.

Oh well, any palate cleanser will do after The Transhumanist Wager.

4620105 And I'm always on edge when reading RealityCheck stories, wondering when the objectionable parts of his worldview or style will suddenly break through into the awesome story I'm reading. Alas, one of the few cases where ignorance would be bliss.

It would be nice to know what onlyanorthernsong was actually banned for. I somehow doubt it was simply posting a rebuttal blog post; given how he spammed it around to various groups (like this one), I'm guessing there may have been some level of conflict involved beyond that. Myself, I just unfollowed RealityCheck a long time ago and trusted that his stories had enough attention for the good ones to float to the top without me paying attention specifically. That seems best when the differences in worldviews are so insurmountably vast. I do engage him sometimes when I see his comments on other stories, but that's public commentary and so fair game IMO.

4619551

and is an actual example of transhumanist literature that doesn't make me feel like I'm surrounded by self-centered psychopathic creeps

Don't forget you're here forever! :pinkiecrazy:

But from what I've read of the guy, this sentiment feels rather ironic. I don't know a thing about the rest of his views, but from the comment sections of his I've stumbled onto, it looks like the guy enjoys telling people off for things like benefiting from healthcare on a pretty regular basis.

book_burner
Group Contributor

4620824

I don't know a thing about the rest of his views, but from the comment sections of his I've stumbled onto, it looks like the guy enjoys telling people off for things like benefiting from healthcare on a pretty regular basis.

Holy shit, really? What is it with some people? Like, how can you want people to be immortal at some point but not want them to be healthy and whole right now?

Don't forget you're here forever! :pinkiecrazy:

How could I forget? We have that whole other group to remind me!

4622273
To be fair, I don't really remember whether he brought up healthcare specifically, but he's pretty much in the "all taxation is a crime" camp of thought.

book_burner
Group Contributor

4622957 You know who are really silly people? Non-consequentialists.

Iceman
Group Contributor

Taking a step back...

Remember that the purpose of a belief, to most people, is in-group signaling, and it's only weirdos like LessWrongers who (sometimes!) are able to treat beliefs as anticipation-controllers instead of as costly precommitments to belonging to a social group. Remember that your interlocutor doesn't accept your citations any more than you'd accept citations to Answers in Genesis. Also take the outside view on your own behaviour: are you really doing this to advance truth, or to stick it to the enemy tribe?

(Outside of FAI, where the question is a matter of life or paperclips for the universe, I consider it an open question whether the expected value of having better predictions about the future outweighs the expected value of the increased coordination mediated by costly tribal beliefs. I am frustrated that a lot of the arguments in the LessWrong-o-sphere about epistemic vs instrumental rationality aren't actually trying to come up with expected value calculations on a case-by-case basis, and are instead making sweeping statements about how one is always better than the other.)
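To make "case-by-case" concrete, here's a deliberately toy sketch of the kind of comparison I mean; every probability and payoff below is invented purely for illustration:

```python
# A deliberately toy case-by-case comparison; every number below is invented.

def expected_value(outcomes):
    """outcomes: iterable of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# Case 1: the belief mostly matters for prediction (say, planning a project).
accurate = expected_value([(0.8, 100), (0.2, -50)])     # better forecasts
tribal   = expected_value([(0.5, 100), (0.5, -50)])     # worse forecasts, no coordination payoff
print(accurate, tribal)   # 70.0 25.0 -> epistemic rationality wins this case

# Case 2: the belief mostly matters for coordination (say, good standing with your group).
accurate = expected_value([(0.9, 10), (0.1, -5)])       # slightly better forecasts
tribal   = expected_value([(0.8, 10), (0.2, -5)]) + 40  # slightly worse forecasts, plus a coordination bonus
print(accurate, tribal)   # 8.5 47.0 -> the costly tribal belief wins this case
```

Obviously the honest version of this has to fight over where those numbers come from, which is exactly the part the sweeping statements skip.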

book_burner
Group Contributor

4623950 Personally, I think the question is what you mean by the "LW-o-sphere." Large portions of it appear to not only have different "tribal" affiliations from the rest, but to genuinely anticipate the world in distinct ways. For example, Slate Star Codex readers seem to be unusually religious for "rationalists".

Remember that the purpose of a belief to most people is in-group signaling, and it's only weirdos like LessWrongers who (sometimes!) are able to treat beliefs as anticipation-controllers instead of belief as a costly precommitment to belonging to a social group.

I think that's a little disrespectful of humans. Our brains actually do implement anticipation-control, whether or not we use the verbal label "belief" for that, or instead for some other weird social-ritual thingy. In fact, in brass-tacks environments of business or government, where people need to plan out things that actually matter to them, they often discard at least some of their "beliefs" for "pragmatism" or "being realistic", aka: the things they actually expect to see happen.

"Believe" versus "believe in" at least roughly captures the distinction.

Iceman
Group Contributor

4624507

Personally, I think the question is what you mean by the "LW-o-sphere."

Sure. The neoreactionaries and the polyamorous cuddle pile crowd don't see eye to eye on a number of issues. I was more thinking about posts on LessWrong explicitly discussing the issue of instrumental vs epistemic rationality.

In fact, in brass-tacks environments of business or government, where people need to plan out things that actually matter to them, they often discard at least some of their "beliefs" for "pragmatism" or "being realistic", aka: the things they actually expect to see happen.

I disagree. In my post-collegiate life, I've worked for two companies.

The first was a research firm with DARPA funding. And while I'll spare you the seedy details, the CEO had found a way to produce nothing of value while continuing to get those sweet government contracts. While I never met our "competitors," I did read their publications, which were obviously bullshit. Now, if you're willing to expand "actually matter" to "acquire as much government money as possible," then sure, but as you started this section with a note about respect for humans, I suspect this is not what you are talking about. Except...I don't believe it. It is entirely possible that our CEO was highly Machiavellian, to the point where he understood that playing stupid was the correct course of action, but...there were too many incidents of buffoonery. No.

The second is a household name; you know of this company. I have now been there longer than I have ever been at a single place in my entire lifetime. Practically, there is no displacing us from our market niche. Having a ton of money insulates you from reality, allowing you to play crazy status games. My reporting chain above my immediate boss is obviously playing Game of Thrones with each other. Our PM leadership (and even some of engineering!) make broad proclamations that are completely divorced from what our users want, what eng can deliver on a sane timeline, or what we can actually ship because of internal company politics.

Can businesses be highly consequentialist? Sure. I just dispute that it's common.

Meta: I disagree that I am being disrespectful to humans. There are really good reasons why we're the way we are. Whenever you ask why a behavior is the way it is, ask yourself: "Is this a Nash equilibrium?" Why act as a virtue ethicist or deontologist instead of a consequentialist? Because holding some things taboo, committing that you won't do them no matter what, is a costly signal that you aren't going to trade that thing off against some amount of utility in the future; it is akin to throwing your steering wheel out the window while playing chicken. Making credible precommitments is necessary. I've plugged this book before, and I'll do it again: Why Everyone (Else) Is a Hypocrite.
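If you want the steering-wheel point spelled out, here's a toy version of chicken; the payoffs are invented, but the structure is the standard one:

```python
# Toy game of chicken; payoffs are invented. Entry = (row player's payoff, column player's payoff).
payoffs = {
    ("swerve",   "swerve"):   (0, 0),
    ("swerve",   "straight"): (-1, 1),
    ("straight", "swerve"):   (1, -1),
    ("straight", "straight"): (-10, -10),  # head-on crash
}

def col_best_response(row_move):
    """The column player's payoff-maximizing move, given what the row player will do."""
    return max(("swerve", "straight"),
               key=lambda col_move: payoffs[(row_move, col_move)][1])

# If the row player credibly precommits to "straight" (steering wheel out the window),
# the column player's best response is to swerve: -1 beats -10.
print(col_best_response("straight"))  # -> swerve
```

The precommitment is costly precisely because it rules out your own escape hatch if the other player doesn't cooperate, and that is what makes it credible.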

More locally meta: the whole be-respectful-to-humans thing is really weird, and appears to me to be a side effect of a certain FIMFictionism where, because of a certain historical story, people lined up into "Humans are Bastards" and "Humans are Wonderful" camps, which are totally not beliefs that pay rent, aren't even proper hypotheses, and are really, really friggin' weird.

book_burner
Group Contributor

More locally meta: the whole be-respectful-to-humans thing is really weird, and appears to me to be a side effect of a certain FIMFictionism where, because of a certain historical story, people lined up into "Humans are Bastards" and "Humans are Wonderful" camps, which are totally not beliefs that pay rent, aren't even proper hypotheses, and are really, really friggin' weird.

I could hug you. I mean, in my case, I'm also referring to the general tendency of LW-o-sphere people to go around saying, "Humans are bloody stupid", which is often implied to be a contrast with Rational Bayesians (perhaps as embodied as AIs). So my object-level disagreement there is that the project of Bayesian AGI would have succeeded a lot more quickly if the "human minds are so very dysfunctional" belief was true.

As it is, there are different schools of thought in cognitive science and AI, and at least in some of those, there are lawful, non-coincidental, necessary reasons for why the human mind works the way it does, where we can look at the model and say, "Yes, it appears that any mind you could build which you wanted to really work would have to be like humans in these couple of ways." Also as it is, the AGI field is bogged down in exactly the problems you encounter from trying to design a mind ex nihilo as a Rational Bayesian Reasoner, like Pascal's Mugging or the recent article on AIXI being arbitrarily stupid if you write it in the wrong programming language.

The proof of superiority for the Rational Bayesian Utility-Maximizer will be the ability to actually build one that does what we want it to do (eg: not paperclips, nor any of the other "Just UFAI Things"). Until then, our human minds work, at all, at least when we're actually trying to use them, and we ought to be actually trying a lot more often.

I disagree. In my post-collegiate life, I've worked for two companies.

The first was a research firm with DARPA funding. And while I'll spare you the seedy details, the CEO had found a way to produce nothing of value while continuing to get those sweet government contracts. While I never met our "competitors," I did read their publications, which were obviously bullshit. Now, if you're willing to expand "actually matter" to "acquire as much government money as possible," then sure, but as you started this section with a note about respect for humans, I suspect this is not what you are talking about. Except...I don't believe it. It is entirely possible that our CEO was highly Machiavellian, to the point where he understood that playing stupid was the correct course of action, but...there were too many incidents of buffoonery. No.

The second is a household name; you know of this company. I have now been there longer than I have ever been at a single place in my entire lifetime. Practically, there is no displacing us from our market niche. Having a ton of money insulates you from reality, allowing you to play crazy status games. My reporting chain above my immediate boss is obviously playing Game of Thrones with each other. Our PM leadership (and even some of engineering!) make broad proclamations that are completely divorced from what our users want, what eng can deliver on a sane timeline, or what we can actually ship because of internal company politics.

Can businesses be highly consequentialist? Sure. I just dispute that it's common.

Oh, you work for Microsoft? Because that sounds a lot like Microsoft. Or possibly Google: from what I hear about their internal politics they're almost, but not quite, as bad as Microsoft. Or maybe IBM?

By the way, I work at a firm with government research funding, too. And you know what? Not only are we not sucking it down to produce nothing, we're sucking it down to produce some pretty neat stuff, and we've got it explicitly written into our bylaws that we're not allowed to treat government grants as operating revenue: we ship a product on its own merits and treat the grant as profit margin, or we don't take the grant. That's only somewhat on ethical grounds: it's also on grounds of not poisoning our own incentive structure.

My boss isn't a trained rationalist. He just knows where his brass tacks are.

4619549
At the same time, however, he is not one who really believes in freedom of speech. I am banned from his account by virtue of not being a yes-man and not agreeing that "A Voice Among The Strangers" is the best fic ever.

4619551 Oh thank goodness I'm not alone. I've felt bad for a long time for hating RC's politics but legitimately enjoying his stories.

book_burner
Group Contributor

So anyway, I took a calculated trolling risk of noting that RealityCheck doesn't make his pony characters into Christians, but rather Deists. If he suddenly turns The Great Alicorn Hunt and all his other fics into massive evangelical circlejerks, I'm dearly sorry. And also really, really not-sorry :trollestia::trollestia::trollestia::trollestia::trollestia::trollestia:.
