

The Harry Potter Fan Fiction Author Who Wants to Make Everyone a Little More Rational

"Harry Potter and the Methods of Rationality," by AI specialist Eliezer Yudkowsky, has commanded a sprawling, cult-like international fan base thanks to its message of rational thinking.

Image via Wiki Commons user Ultra-lab

There's something weird happening in the world of Harry Potter. Something I can't quite get my head around.

On March 14, the most popular Harry Potter book you've never heard of, Harry Potter and the Methods of Rationality, will come to its conclusion. It has been running online as a fan fiction for the past five years. It is 600,000 words long and contains 112 chapters. By the end, we'll be looking at a grand total of 700,000 words and 125 chapters. This will put it somewhere between Gravity's Rainbow and Route 66 in terms of length.


It has over 7,000 Reddit fans, 26,000 reviews, and a fan-made audiobook. There will be worldwide wrap parties to celebrate its culmination; throughout its pages there are cameos from its readers as Hogwarts students, little walk-on roles that make them feel alive. Its finale is met with the kind of furor that surrounds a Murakami release, celebrated with a passion we only see these days for things like Netflix posting a new series of House of Cards in its entirety.

Much like all things these days, this came to me through word of mouth. Most people agree that it's brilliantly written, challenging, and—curiously—mind altering. HPMOR (as it's known by its fans) is categorically not your average piece of fan fiction.

Here, for example, is a random clutch of chapter titles: "The Fundamental Attribution Error," "Positive Bias," "Working in Groups Pt. 1," "Pretending to Be Wise Pt. 2," and "The Machiavellian Intelligence Hypothesis." Not, I'm sure you'll agree, your average fan fiction fodder.

There is no place here for Harry and Ron fucking or Harry as a rare species of incubus who has to have sex once a day to survive. Or, indeed, his brief spell as "a wild card that will change the Japanese middle school tennis scene." There's about as much sex as a night at home with the box set of Downton Abbey, so its popularity confused me. What's going on?

Why are regular, hard-working humans spending their time on a piece of fan fiction that doesn't even include a subplot where Hogwarts falls in love with a gigantic squid? And what the fuck have the methods of rationality got to do with the Boy Who Lived?


Imagine a book where Harry Potter is not a sex-starved, self-hating little dweeb but instead a miniature Ravenclaw Spock with a taste for deductive reasoning and a bowl haircut from Diagon Alley's hottest new barber, and you'll be some way toward understanding its charm.

HPMOR reads like the originals after a lifetime spent playing Nintendo's Brain Training. Similar scenes feel like they've been moved at right angles to themselves and shunted into the fourth dimension. That is to say, I could sense the shadow of a shape I couldn't quite understand, as if the book was trying to tell me something about life.

Turns out, it was. The most common thinking error humanity makes is the systematic thinking error, or cognitive bias. Take the sunk cost fallacy.

This can be explained by remembering the last time you went out to a club because, "I've already drunk six beers so might as well." That existence-questioning hangover you get that feels like your intestines have collided with the entire Heineken brewery? Yeah, that's nature's way of telling you to stop making decisions based on a logical fallacy.

All those times in the original when Harry grieved over his dead parents or said precisely the wrong thing to Cho Chang to get in her pants? Turns out he was acting irrationally.

This new Potter, though, doesn't. He's basically the Jesus Christ of Rational Thought. He owns this book. He hits Voldemort out of the fucking park with a bunt while scratching his ass with his foot. And—here's the kicker—if you start copying him—that is, making rational decisions that overcome cognitive biases—you, too, can make life your bitch.


Welcome to the world of rational thinking, the art of being Less Wrong.

On February 28, 2010, somewhere between the western edges of San Francisco's Bay Area and the Pacific Ocean, 35-year-old American rationality and AI specialist Eliezer Yudkowsky uploaded the first ever chapter of HPMOR, "A Day of Very Low Probability." The success it went on to have could have been predicted by no one.

The act of reading this literature is like stepping into a parallel universe one millimeter away from our own. On the one hand, it feels convenient that Yudkowsky chose the most popular character on the planet to write about, but on the other, it makes perfect sense. Because what's more irrational than using a spell to turn into a cat?

"I'd been reading a lot of Harry Potter fan fiction at the time the plot of HPMOR spontaneously burped itself into existence inside my mind, so it came out as a Harry Potter story," Yudkowsky told me on the phone. "If I had to rationalize it afterward, I'd say the Potterverse is a very rich environment for a curious thinker, and there's a large number of potential readers who would enter at least moderately familiar with the Harry Potter universe."

Eliezer Yudkowsky. Image via Wikimedia Commons

Reading his website, it becomes quite clear that Yudkowsky is not your average fan fiction author. He is far more likely to talk about the Twelve Virtues of Rationality than about how sad he was when Dumbledore died. His updates for his fan fiction include links to a place called the Center for Applied Rationality, where he is a Curriculum Consultant (an unpaid role, so it's unlikely he's making any money from his writing personally).


It "pursues," Yudkowsky said, "what I see as an important common project for the human species, namely taking all humanity's wonderful cognitive science research and trying to translate it into teachable skills for thinking better in real life, doing better in our own lives and the world."

There's a curious correlation between the work at CFAR and that which occurs at Yudkowsky's day job at the Machine Intelligence Research Institute, whose main goal seems to be to ensure that Skynet never happens. The former helps humans think more like machines. The latter makes sure super-smart computers think like us.

I was interested in finding out how much Yudkowsky's work and personal beliefs factor into his Potter story. "Overwhelmingly, of course," he says. "It informs every shade of how the characters think, both those who are allegedly rational and otherwise. You can't spend years studying cognitive science and come out not having any opinions about how literary characters would realistically think, or, if you could, it would be sad."

The website for CFAR reveals a lot about the aims of the association—helping people overcome flawed thinking to self-improve. "What if," the website asks, "we could shrug off our feelings of defensiveness, and honestly evaluate the evidence on both sides of an issue before deciding which legislation to pass, what research to fund, and where to donate to do the most good?"


HPMOR's official website contains a page asking for support and funding for the Center, though Yudkowsky told me he is now "superfluous" to the company as those who spent most of "their whole working weeks prototyping the teaching units rapidly went past the point where my helpful advice could be any use to them."

These teaching units range from walk-in sessions to weekend getaways called "minicamps": Rejection Therapy (now known as Comfort Zone Expansion), Againstness Training (how to react under pressure), and a handy rationality checklist. A workshop weekend costs $4,000 a person, or you can volunteer as their test subject.

Make no mistake, this is a self-help system, just as something like Dianetics originally was. Facebook is one of their previous clients and—logic—Facebook is evil.

If this all sounds slightly cultish to you—a sacred text, a big bold call-out for test subjects, the promise of a happier life, the call for donations on top of fees—that's because there are similarities here to the growth of other belief structures. Only in Silicon Valley would we get a group that treats the human mind like an app.

Having said that, a lot of the stuff they do actually sounds good and feels about as dangerous as Colin Creevey at a Quidditch match. Rather than basing their ideas on aliens that crash-landed on our planet millions of years ago, everything is empirically based. Why not see if you're acting rationally, right now?

You do begin to wonder if this really is a cleverly disguised academic exercise or a marketing tool for a belief structure rather than a piece of fan fiction. It's clear that HPMOR feels more like Flatland than The Philosopher's Stone, though I concede that the vast majority of its readers may just be there for the magic—quite literally.

I asked Yudkowsky if he'd be willing to go the way of 50 Shades and, after a bit of a facelift, sell the book as an original: "That's not possible in this case. HPMOR is fundamentally linked to, and can only be understood against the background of, the original Harry Potter novels. Numerous scenes are meant to be understood in the light of other scenes in the original HP."

When, in the late 1940s, science fiction author L. Ron Hubbard began work on Dianetics: The Modern Science of Mental Health, no one could have anticipated his brand of self-help later becoming the center of a multimillion-dollar religion. It's strange, but it doesn't seem a stretch to say there are echoes of that movement here—hiding just behind one 11-year-old boy's scar.

Follow David on Twitter.