This is part of the Expertise Acceleration topic cluster.

Practice As The Bar For Truth


    I realised recently that whatever rigour exists on this blog may be boiled down to a single sentence: use practice as the bar for truth.

    Contrast this approach with the alternative. Most people use analysis as the bar for truth. Using analysis is an entirely cerebral exercise: you examine the structure of an argument to see if it is internally consistent; you use sophisticated reasoning to construct properly calibrated beliefs about some position or technique or framework. Or at least some people do this; others don’t have an explicit bar for truth in their heads; still others simply match a piece of information against what they currently know, or use a cognitive shortcut to decide if they believe the thing. Perhaps they perform a gut-check to see if it ‘feels’ right. Or perhaps they decide to debate it out, in the comments of some online forum. Analysis is a lot of work.

    But practice is a shortcut. You apply something to your life and if it works, it’s true. You get to skip over some of the up-front analysis you might otherwise need to do. You don’t allow yourself the luxury of debate.

    More importantly, practice comes with a rigour of its own.

    The Rigour of Practice

    The type of rigour that emerges from practice comes in a slightly different form from what we’re used to. It isn’t quite like the rigour that comes from good analysis.

    I think the first taste I got of this was when I spent a couple of years trying to apply deliberate practice to my life. I had good reason to believe it would work: references to deliberate practice abound in blogs, on Medium, and in popular science books (“deliberate practice is the best known method for building expertise,” writes Cal Newport in So Good They Can’t Ignore You, which — as far as I can tell — remains perfectly accurate). So I thought: “surely deliberate practice would also work for me?”

    But alas, it did not.

    To be precise, deliberate practice contained ideas that I’ve found useful, but proper DP may only be performed under an instructor, in a domain with an established pedagogical approach. I didn’t know this until I failed to get DP to work, multiple times, over the course of about four years. Eventually, I got fed up enough to dig into the actual literature — not the blog posts or the popular science books that cited K. Anders Ericsson’s research, but the actual books and papers Ericsson had written himself. And what I found surprised me.

    I discovered that ‘deliberate practice’ was simply a name for the forms of practice that Ericsson found in easily measurable domains with — and this is key! — an established pedagogical approach. Ericsson did this because he was an academic, and the incentives for academics are to pursue novel research results that might survive peer review. So music and chess were included in Ericsson’s research program because training methods in music and chess are well established and success in both domains is clear; business and jazz were out because neither had established training methods.

    In other words, Ericsson went looking for skill domains with clear rules and well-explicated methods for instruction … and he did this because it was easier, and also because it hadn’t been done before. In this, he succeeded beyond his wildest dreams.

    I don't mean to belittle his achievements: thanks to Ericsson and his collaborators, we now have a working theory of practice as applied to fields with an established pedagogical approach. This remains a gigantic contribution to humankind. But the problem with DP is that it just doesn’t seem to be as useful when applied to everything else.

    I tell this story because it is an example of how practice may uncover the practical limitations of a scientific idea. Everyone else who had been writing about deliberate practice assumed it was the gospel truth for skill acquisition; very few, it seemed, had actually tried to apply it to their lives.

    Occasionally Ignoring Science

    “That’s all well and good,” I hear you say, “But then why take the time to put things to practice first? Why not read the literature and evaluate it directly? Why not do analysis before trying things out?” These are good questions. Obviously we have to be a little thoughtful about the interventions that we try for ourselves. But let me give you a counter-example to illustrate how putting things to practice early leads to benefits that analysis alone might not uncover.

    Carol Dweck, speaking at a TED event. (Bengt Lennartsson, CC BY-NC-ND 2.0)

    You’ve probably heard of psychologist Carol Dweck and her work on the ‘growth mindset’. (And if you haven’t, here’s her TED talk).

    The idea behind growth mindset is simple: people who believe they can improve and grow in their abilities, so long as they work hard, do better than people who believe that ability is innate. This is an incredibly attractive idea, because it is so easy to implement: just tell kids that good ability comes from growth and hard work, that it is not innate, and they will do better in their exams and therefore in life!

    The problem, of course, is that the original experiments that Dweck performed to justify growth mindset are too neat — the interventions are tiny, the results are clear, the correlations are strong, and this should arouse our suspicions because effect sizes from interventions in psychology are rarely this good.

    And so when the replication crisis swept through psychology during the mid 2010s, people began sniffing around ‘growth mindset’ with increasing suspicion. You can take your pick of summary articles: Scott Alexander being doubtful in 2015, Scientific American in 2019, Jay Lynch’s skeptical Medium piece from 2018, some coverage of a successful replication (and debate in the comments) over at Marginal Revolution in 2018.

    The current consensus seems to be that growth mindset replicates, but that the effect sizes are tiny — so there’s still no clarity about whether mindset interventions matter in practice. People are still fighting this out. My take is that the research might not survive into the new decade; many things about applied mindset interventions seem too neat to be true, and the declining effect sizes should make us squint carefully at the research.

    But this is where practice makes me pause. The thing is, I have tried growth mindset in my life, and it’s worked wonders for me.

    In my final year of university, I was failing a bunch of freshman math classes that I had been retaking since my first year, with little to no success. In fact, I had been failing math for the entire duration of my university life, partly because I had a bad foundation (coming from a high school with average academics in a small city in Malaysia does that to you), but also because I didn’t want to admit that I had a problem. I doubled down on my computer programming classes, which I enjoyed; I refused to think about the calculus and linear algebra modules that always seemed to trip me up. I kept retaking them, and the students that attended kept getting younger and younger.

    Learning about growth mindset unlocked a whole bunch of things for me. It made me see that I had become too fixated on “oh, I’m just not good at mathematics, don’t even try.” It turned out that I had made my mathematical weakness part of my identity, and in so doing, I had begun to believe that mathematical ability was innate. I had become a ‘fixed mindset’ thinker, and it prevented me from improving.

    The thing that led me to Dweck’s research wasn’t her bestselling book or her TED talk; it was a blog post by Aaron Swartz, written in 2012. He wrote:

    I used to think I was introverted. Everyone had always told me that you were either an extroverted person or an introverted person. From a young age, I was quite shy and bookish, so it seemed obvious: I was an introvert.

    But as I’ve grown, I’ve found that’s hardly the end of the story. I’ve started to get good at leading a conversation or cracking people up with a joke. I like telling a story at a party or buzzing about a room saying ‘hi’ to people. I get a rush from it! Sure, I’m still not the most party-oriented person I know, but I no longer think we fit into any neat introversion/extroversion buckets.

    Growth mindset has become a kind of safe word for my partner and I. Whenever we feel the other person getting defensive or refusing to try something because “I’m not any good at it”, we say “Growth mindset!” and try to approach the problem as a chance to grow, rather than a test of our abilities. It’s no longer scary, it’s just another project to work on.

    Just like life itself.

    Growth mindset worked for Swartz’s introversion, and it worked for my hangups over math class. Maybe this was a placebo. Maybe a therapist would have been just as effective, without ever invoking growth mindset. But shortly after learning about growth mindset, I shook off my problems, went to the freshmen in my classes for help, and scraped a C in those courses to graduate, albeit a year late. And I no longer believe (or at least, I no longer believe as strongly!) that I am innately bad at math — now, it’s a mix of “I’m still behind and I need to put in a lot of work to get better”, mixed in with a bit of sadness because I wish I’d worked harder at math earlier in my life.

    Surprising Second Order Implications

    The Dweck story makes me very conflicted whenever growth mindset comes up in casual conversation. On the one hand, yes, it is unlikely to stand as an example of good science. On the other hand, holy hell did it work for me. So what do I say? I’m not sure. But maybe it doesn’t matter — I’d gotten something useful from the research and it’s made a difference in my life. What more can I ask from an intervention?

    The point I’m making with the Dweck story is that something doesn’t have to be true scientifically for it to be useful for you. What makes a thing useful for you is if you have tried it out in your life and, err, you’ve found it to be useful. That’s it! It’s a simple bar. You don’t have to worry about replications over larger sample sizes (though this should increase your confidence in a method) — if it works for you, it works for you.

    In fact, I would go so far as to argue that if you want to be effective, you don’t have to hold extremely coherent and true beliefs about the world. You simply have to figure out what works for you through actual application, and if that is helpful to you, you may keep it as part of your arsenal as you move on to the next challenge in your life.

    This seems like an odd position to take, but as far as observations go, it’s surprisingly common amongst the most effective people I know. In contrast, the ones who like to hang around and discuss the finer epistemic points of some topic or other tend to not get much done. I once wrote a series about ‘Traditional Chinese Businessmen’ — that is, the generation of businessmen in South East Asia who were mostly uneducated, highly superstitious, and not particularly good decision makers when it came to personal affairs. They had to deal with genocide, and war, and corrupt third world governments, and yet they seemed to be very effective in business. And I contrasted this with LessWrong, a rationalist community filled with people who desired to hold coherent and true beliefs about the world. I repeated the criticisms of onetime community members Aaron Swartz and Patri Friedman, who pointed out that, for all the supposed epistemic rigour the community possessed, very few of them seemed to have accomplished very much.

    I think there’s a simple reason for this: epistemic rigour isn’t enough to guarantee personal effectiveness. You need to put things to practice, because you want to test against reality. You may argue about some approach till you’re blue in the face, but so what? Does it work for you? The only way to tell is to try.

    One unexpected side effect from using practice as a bar for truth is that I’ve found myself leaning away from intense online debates about the finer points of some technique or viewpoint or framework. At the back of my mind, I find myself asking: “Has this person actually tried to implement it? Do they know what they’re talking about? Or are they debating for the sake of debate?”

    The easiest way to suss this out is to ask for a concrete story of application. In the best case, you get a reference that you can use when you attempt to apply the thing in your own life. In the worst case, you get an excuse, which you can then take as a sign to discount whatever the person is saying — without rejecting it as entirely false, of course.

    In fact, another acceptable outcome would be a person saying “I haven’t tried this, but I don’t see why this wouldn’t be helpful”, to which you may reply: “Let’s both try it and compare notes in a month or so.” This doesn’t work for every topic, but there are a surprisingly large number of things where this approach is tractable. And if your goal is to become more effective, you owe it to yourself to at least give it a go.

    Using practice as an arbiter for truth has some other nice properties. For instance, it was what led me to my views on mental models — it seemed like everyone was talking about how mental models were this amazing thinking approach, but I found it difficult to put into practice, and I couldn’t really get a concrete applied story of the mental models approach from anyone I talked to. I eventually concluded that it was a fad.

    The nice flip side of this stance, of course, is that it leaves me open to changing my mind. The only thing that needs to happen is to have someone tell me a concrete, applied story about using ‘mental models’ in their lives, and I would take notice and listen. Such stories are doubly useful, because they give me new approaches to test.

    Finally, using practice as a standard has led me to believe that debate isn’t an effective method for seeking instrumental truths. I think the better question to ask is “Is there an application or a test that I can formulate — however imperfect — to verify this thing that we’re arguing about?” If so, my time is better spent doing that, instead of arguing with said person.

    (I still fail, of course, because debate is so easy and practice is so hard, but the point is to try!)

    Wrapping Up

    I wish I could say that practice as a bar for truth is some unique invention that I came up with. But it isn’t: in the field of epistemology, there are four classical theories of truth, and I plucked ‘use practice as the bar for truth’ from one of those four theories — the philosophical tradition called pragmatism. To quote Wikipedia:

    Pragmatism considers words and thought as tools and instruments for prediction, problem solving, and action, and rejects the idea that the function of thought is to describe, represent, or mirror reality. Pragmatists contend that most philosophical topics—such as the nature of knowledge, language, concepts, meaning, belief, and science—are all best viewed in terms of their practical uses and successes.

    That sounds really high-falutin’, but the general idea is that what is true is whatever is useful to you. This is a practitioner’s epistemology: a philosophy for people who want to get things done, not merely sit around and think about them. It isn’t perfect, and I don’t think it’s well-suited for certain types of truths, but boy is it good when applied to one’s career.

    And so I’ll freely admit that whatever rigour exists in this blog is attributable to pragmatism. Heck, the fact that so many of my posts are actionable is a natural side effect of using ‘practice as the bar for truth’.

    But ultimately, if I’m being honest, I use this standard mostly to protect me from myself — I know myself pretty well, and I know that I would love nothing more than to argue for hours on the Internet.

    Here’s to hoping that I’ve stopped for good.

    Related: Ray Dalio's Believability, The Hierarchy of Practical Evidence.

