The Base Rate Is A Hell of A Thing


    Nick Maggiulli wrote a blog post a few weeks ago titled Respect the Base Rate, which opens with this wonderful story:

    The best decision I ever made in my life was a complete fluke. It happened during the beginning of my senior year in high school. I was compiling the list of which colleges to apply to when my godfather asked me whether I was applying to Stanford. To be honest, I didn’t know much about Stanford at the time other than they were a private school. And private schools were definitely off the list.

    You see, neither of my parents graduated from college so I didn’t grow up knowing anything about college or what options were available to me. So when I started looking at where to apply I focused on the one thing that I did understand—cost. As a result, I was only looking to apply to public colleges in California, mostly the University of California (UC) schools. Public because it was cheaper and California so I could keep the CalGrant money (~$10,000 a year).

    After explaining this to my godfather he insisted that I apply to Stanford because he knew they would provide some sort of financial aid package. I agreed and started doing research online before applying. Unfortunately, I quickly realized that getting into the school was easier said than done.

    After scrolling the College Confidential online forum for hours I realized I was in over my head. One student who got accepted had discovered their own asteroid, another was a stuntman, and another founded a $1M charity. How was I supposed to compete with these kids?

    I didn’t think I had any real chance, but I did learn that Stanford had a single-choice early action (SCEA) program that allowed you to apply early, but you could only apply early to them and no other school. Because I felt so unsure about my chances of getting in, I convinced myself that I would apply early just to get it over with. By mid December I would know if I was in or not.

    Unfortunately, I only had nine days until the early application deadline. So I scrambled through the numerous essays and short responses to put my application together and sent it in on the day it was due.

    Unbeknownst to me, I had just made the single best decision of my life. However, I didn’t realize this until a few weeks after I had applied. Because in those weeks after applying I kept scouring the College Confidential forums and learned something huge.

    In the year prior (2007), the overall acceptance rate into Stanford was a sobering 10.3%. However, the acceptance rate for those who applied early was closer to 20%. Through sheer dumb luck I had doubled my chances of getting into one of the most selective schools in the country.

    The argument that Maggiulli makes, of course, is that one should learn to ‘respect the base rate’. He continues:

    The base rate is simply the probability of some event occurring when you have no other information.  In this case, the base rate of getting accepted as regular applicant was 8%, while the base rate for getting accepted as an early applicant was 16%. Without any other information, you should assume that you will experience the base rate. (emphasis mine)

    I want to take Maggiulli’s argument further. Base rate thinking is incredibly useful — especially when applied in its explicit form, where you have a clear reference class and a specific percentage available. But if you internalise this style of thinking, one implication is that you will think ‘what has happened to many others is quite likely to happen to me as well’, and you will think this about many things. This is a surprisingly handy lens, even when you’re not able to point to a probabilistic reference class.

    Let’s put this another way. One of the most commonly cited benefits of reading history is that you’ll learn what is often called the ‘lessons of history’. When I was a novice reader this always sounded a bit wishy-washy to me: what, exactly, was I expected to learn from the Peloponnesian War? That I shouldn’t pick a fight with a militaristic rival city-state? Or that I should protect naval power at all costs? And what might I learn from reading The House of Morgan? That bankers were once really, really powerful, and now are less so?

    The recommendation makes a bit more sense if you take a step back and pick out patterns from the stories of the people you’ve read about. If the same pattern crops up again and again across multiple stories of multiple people, written by multiple authors, in different industries and in different eras, the obvious conclusion to draw is “oh, that’s happened to lots of people, perhaps I’ll experience that in my life as well”.

    (Though I’ll admit — it’s a bit harder to apply when you’re reading ancient history. I still do not know what I’m supposed to learn from the Peloponnesian War.)

    Base Rate Thinking in Practice

    But let’s start from the top. The basic form of base rate thinking is actually really easy to do! As Maggiulli describes, you:

    1. Select a reference class. In Maggiulli’s opening story the reference class is really simple: Stanford’s acceptance rate for single-choice early action applicants versus its overall acceptance rate. Other times it is trickier; you want to select something that is reasonably representative of your situation.
    2. Perform some adjustments. This doesn’t really apply to the Stanford example, but sometimes you want to adjust upwards or downwards based on extra information that you have. For instance: “I’m a really good writer; if I work very, very hard on my essays, I might increase my chance of acceptance by five percentage points.”
    3. Make a decision based on the base rate of that reference class. This is clear: in Maggiulli’s case, you apply for SCEA, with the understanding that you still might not get in — we’re talking probabilities, after all. Nothing is guaranteed. (A minimal sketch of these three steps follows this list.)
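
    To make these three steps concrete, here is a minimal sketch in Python. It is my own illustration, not anything from Maggiulli’s post: the two base rates are the ones quoted in the Stanford example, and the five-percentage-point “strong essays” adjustment is a made-up figure purely for demonstration.

```python
# A minimal sketch of the three steps above. The base rates come from the
# Stanford example in the post; the +0.05 "strong essays" adjustment is a
# made-up, illustrative number.

# Step 1: select a reference class and look up its base rate.
reference_classes = {
    "regular decision": 0.08,
    "early action (SCEA)": 0.16,
}

def adjusted_rate(base_rate: float, adjustment: float = 0.0) -> float:
    """Step 2: nudge the base rate with whatever extra information you have."""
    return min(max(base_rate + adjustment, 0.0), 1.0)

# Step 3: compare the options and decide, remembering these are probabilities,
# not guarantees.
for name, base in reference_classes.items():
    with_essays = adjusted_rate(base, adjustment=0.05)
    print(f"{name}: base rate {base:.0%}, with strong essays ~{with_essays:.0%}")
```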

    Later on in the piece Maggiulli argues that he would never regularly ride a motorcycle, because the base rate for motorcycle fatalities is roughly 18 times that of passenger cars (controlling for the number of miles driven). This is another example of good base rate thinking in action — you look at the overall probabilities and use them to inform your decisions.

    Naturally, which reference class you select is an important part of the method — if, for instance, you have good reason to think that the base rate doesn’t apply to you (you’re a particularly skilled motorcyclist, or you are thinking of riding in a different country) then perhaps motorcycle riding is acceptable. And again — this is probability we’re talking about; you may ride a motorbike for years with zero problems, or you may ride a motorbike for the first time in a relatively safe environment and die on your first night out; nothing is guaranteed.

    But there are more advanced applications of the idea. For example, the most famous base rate story is probably Daniel Kahneman’s Israeli textbook story, which he recounted in Thinking, Fast and Slow:

    In the 1970s, I convinced some officials in the Israeli Ministry of Education of the need for a curriculum to teach judgment and decision making in high schools. The team that I assembled to design the curriculum and write a textbook for it included several experienced teachers, some of my psychology students, and Seymour Fox, then dean of the Hebrew University’s School of Education and an expert in curriculum development.

    After meeting every Friday afternoon for about a year, we had constructed a detailed outline of the syllabus, written a couple of chapters, and run a few sample lessons. We all felt we had made good progress. Then, as we were discussing procedures for estimating uncertain quantities, an exercise occurred to me. I asked everyone to write down their estimate of how long it would take us to submit a finished draft of the textbook to the Ministry of Education. I was following a procedure that we already planned to incorporate into our curriculum: the proper way to elicit information from a group is not by starting with a public discussion, but by confidentially collecting each person’s judgment. I collected the estimates and jotted the results on the blackboard. They were narrowly centered around two years: the low end was one and a half, the high end two and a half years.

    So far so good. Now Kahneman presents the twist:

    Then I turned to Seymour, our curriculum expert, and asked whether he could think of other teams similar to ours that had developed a curriculum from scratch. Seymour said he could think of quite a few, and it turned out that he was familiar with the details of several. I asked him to think of these teams when they were at the same point in the process as we were. How much longer did it take them to finish their textbook projects?

    He fell silent. When he finally spoke, it seemed to me that he was blushing, embarrassed by his own answer: “You know, I never realized this before, but in fact not all the teams at a stage comparable to ours ever did complete their task. A substantial fraction of the teams ended up failing to finish the job.”

    This was worrisome; we had never considered the possibility that we might fail. My anxiety rising, I asked how large he estimated that fraction was. “About 40 percent,” he said. By now, a pall of gloom was falling over the room.

    “Those who finished, how long did it take them?”

    “I cannot think of any group that finished in less than seven years,” Seymour said, “nor any that took more than ten.”

    I grasped at a straw: “When you compare our skills and resources to those of the other groups, how good are we? How would you rank us in comparison with these teams?”

    Seymour did not hesitate long this time.

    “We’re below average,” he said, “but not by much.”

    This came as a complete surprise to all of us—including Seymour, whose prior estimate had been well within the optimistic consensus of the group. Until I prompted him, there was no connection in his mind between his knowledge of the history of other teams and his forecast of our future.

    We should have quit that day. None of us was willing to invest six more years of work in a project with a 40 percent chance of failure. Yet although we must have sensed that persevering was not reasonable, the warning did not provide an immediately compelling reason to quit. After a few minutes of desultory debate, we gathered ourselves and carried on as if nothing had happened. Facing a choice, we gave up rationality rather than the enterprise.

    The book was completed eight years later. By that time, I was no longer living in Israel and had long since ceased to be part of the team, which finished the task after many unpredictable vicissitudes. The initial enthusiasm for the idea in the Ministry of Education had waned, and the textbook was never used.

    In this particular case, Kahneman asked Seymour Fox for a comparison class, since Fox had the most experience with such projects. And then he asked Fox for a base rate, and Fox came up with “about 40%”. Notice that none of the probabilities were ever super specific, nor was the reference class particularly large; Kahneman was really asking the question: “when other teams like us were working on similar things, what was the most likely outcome?”

    This is kind of the point. Not everything we do will have neat probabilities attached. But the style of base rate thinking (which Kahneman sometimes calls ‘taking the outside view’) is broadly applicable.

    I think this style of thinking is very useful, and we’ll spend the rest of this piece going through a few examples.

    Common Sense Applications of Base Rate Thinking

    When I was in university, I spent a fair amount of time talking to older people: people who had kids, who were some way along in their careers. A large portion of them said “Yeah, kids are good”, and only a small minority didn’t seem to ever want kids.

    I remember thinking that the odds seemed good that I would want kids myself — even if I couldn’t quite imagine having kids yet, or imagine the path that would lead to me wanting them. A bunch of friends said that they never wanted kids — one of them said “kids below a certain age are basically lemmings” — so while I put myself in the ‘I’ll probably want kids at some point’ camp, I was really working off the base rates.

    Today, a good number of my friends have kids, or are on their way to having kids, including my aforementioned ‘kids are lemmings’ friend. The split between ‘friends who have kids’ and ‘friends who don’t’ is about the same as the split I observed in the generation above me, while I was still at university, regardless of what we thought when we were younger.

    *

    Barbarians at the Gate is probably one of the best business narrative books I’ve ever read; it details the ‘fall’ of RJR Nabisco — or, more accurately, the machinations behind the then-largest private equity deal in history. The first couple of chapters of the book are basically the story of Ross Johnson, the man who eventually became CEO of RJR Nabisco, and they describe how Johnson used politics and charisma and backstabbing to climb the corporate ladder and secure the good life for himself and his cronies. They also describe how Johnson wasn’t a particularly good business operator — but that it did not matter; RJR Nabisco was a cash cow, and Johnson could buy private jets and build luxurious private lounges for his own use, with virtually zero supervision from the board.

    There are various lessons you may draw from this story. Some would read it and go “oh wow, horrible backstabbers can get ahead in life”; others would read it as “even in corporate America, it’s possible for boards of large companies to phone it in”; still others might — God forbid — use this as inspiration to take control of a cash cow of their own.

    My take was “oh, people like Ross Johnson exist, and chummy boards probably also exist, I shouldn’t be too surprised when I encounter them in my career.”

    This sounds like an incredibly conservative takeaway, generic to the point of uselessness. But multiply such takeaways across dozens of books over many years, drawn from different industries and different eras, and the patterns eventually do add up to something. When it comes down to it, human beings fall into a relatively small distribution of personalities, behaviours, and vices. Taking advantage of that distribution by reading as much about it as you can seems like a smart thing to do.

    *

    A more interesting application of base rate thinking is to figure out if you belong to a select reference class.

    I spend quite a bit of my time reading finance, mostly because I’ve found value investing a particularly rich source of information on businesses and markets and moats. And amongst the things you quickly learn when you read so much about finance are the following four facts, repeated ad nauseam across what seems like all the books:

    1. Good investors are rare.
    2. Good investors seem to spend non-trivial amounts of time getting good at investing.
    3. It is difficult to become a good investor, it seems, because you need to have the right temperament: you need the conviction to stick with an investment when you are right, and the ruthlessness to sell when you learn that you are wrong. Many investors, including Buffett, believe that temperament is genetic. (Related idea: it’s how you act when you’re down, not when you’re up, that matters.)
    4. It also turns out that the majority of concentrated value investors who outperform the market over a long period of time spend one third of that time down (source). Over a 20-year period, one third of the time is effectively six to seven years of bad returns, even if those years are spread out over the entire period. No wonder temperament is so important.

    This combination of facts made me curious: did I have the temperament necessary to be a good value-style investor? I was fully aware that I had little to no training in investing; I was equally aware that the odds of me being a good investor were very low. On top of that, I knew that I was primarily a business operator, which meant a very different worldview and skillset from that of an investor. I knew that precious few people were good at both.

    So: how could I tell if I had what it took? Four years ago, I constructed a test to see if I had the temperament necessary to do stock picking. I decided to buy into a very volatile ‘investment’, in the hope of observing how I felt when it inevitably crashed. The rules were as follows: I would put in an amount of money that was meaningful (I chose the price of a top-of-the-line iPhone), but not so much that I would be ruined (I didn’t put in, say, all of my savings). And I chose Bitcoin — because it was likely to subject me to wild swings, and because I didn’t have to wait years for a market correction (as I would have to with, say, equities).

    And exactly as I expected, a month after I bought an iPhone’s worth of Bitcoin it crashed on me. And I felt terrible. It was amazing. I watched my brain do backflips on itself, obsessing over the iPhone I could have bought with all that money.

    A year or so afterwards, Bitcoin corrected. I sold it for a modest $500 gain, bought AirPods for myself and my girlfriend, and have stayed away from active investing ever since. I reasoned that my emotional regulation skills were terrible, and that perhaps I was not willing to improve them enough to get good at investing. I believe the professionals when they say “investing is a life-long education, and its teacher is loss”; I weighed that against the cost of getting good at running a business, and decided I enjoyed the latter too much to ignore it. This is all a roundabout way of saying — yes, the odds of my sucking at investing are too high, and I have too much respect for the base rate to think otherwise.

    Wrapping Up

    In my previous post on Time Preference I wrote about the arc of an average career. I tried to sell you on the idea that an average career is about 40 years long, and that it is useful to think about it in the stages of early game/mid game/end game, grafted onto those 40 years.

    Given this post, you can probably guess that I constructed those stages from reading broadly (and from asking career questions of whoever was older and willing to talk to me) — I couldn’t help but notice that regardless of era, industry, career path, or gender, the stages of a career seemed to map to pretty much the same spectrum of ages. In other words, when it comes to careers, we should probably also respect the base rates.

    There is one other thing that I should mention. I suspect that my willingness to embrace base rates is a function of my experiences in university. I learnt — possibly in my second year but definitely by my third — that I wasn’t very smart compared to the rest of my cohort. I struggled at Computer Science, and repeated enough math classes that I completed my degree in five years, instead of the usual four. And so when you tell me that most people would experience X, or when you tell me a story about some person backstabbing another, I pay close attention; I am very willing to believe that these things would also happen to me. I believe that because I think I am quite average.

    My smart friends, on the other hand, always seem less likely to believe that what they read about in books and in blog posts would necessarily apply to them. And perhaps they are right. Perhaps their reference classes are different.

    But I think that in most cases, the base rate is too powerful to ignore. To put this concretely: you may show me a handful of people in their 60s doing interesting things, and a smaller handful of people in their 70s doing interesting things, and argue that age isn’t as big a deal if you are an outlier. But I’ll point out that the percentages fall as the decades go up, and that you cannot know that you are an outlier until you reach those ages. Respecting the base rate simply means that I do not think I will be very different from them. Respecting the base rate means that — while in some specific ways I think I am different and unique — in all the other ways, I believe that I am not so different after all.
