The Forecasting Series

Gregory Northcraft observes that ‘the difference between experience and expertise is that expertise is having a predictive model that works’.

We are often called on to make predictions about the future, in our careers and in our lives. It seems useful, then, to look at the outer limits of human forecasting, if only to see what is possible there.

The Forecasting Series is a collection of posts about the work of Philip Tetlock and his collaborators: a corner of the judgment and decision-making research literature that explores the limits of geopolitical forecasting. Along the way, we grapple with questions like ‘is cognitive bias avoidance possible?’ and ‘is predicting the future even possible in an uncertain world?’

This is not a complete series: I plan to write more posts as interesting research trickles out in the wake of the Good Judgment Project's conclusion. Read this collection if you want a primer on the best research we have on predicting the future.

Posts In This Series

  • The Difference Between Experience and Expertise (prelude) — Gregory Northcraft argues that ‘the difference between experience and expertise is that expertise is having a predictive model that works.’
  • Forecasting Under Uncertainty — Many experts believe that modelling the future is impossible, and that we are stupid to try. They argue that the future is uncertain, and that, unlike risk, uncertainty cannot be measured. Why do we believe that forecasting the future is even possible? On what basis do we think that uncertainty can be overcome?
  • How Do You Evaluate Your Own Predictions? — The first step to improving your predictions is to use a scoring system that keeps you accountable. This is important because humans are great at lying ... to themselves and to others! This essay contains a summary of the scoring system used in The Good Judgment Project (GJP), and presents an attempt to apply it to an ordinary career. (A minimal code sketch of the scoring rule appears after this list.)
  • How The Superforecasters Do It — Superforecasters are the best-performing forecasters in the GJP's four-year forecasting tournament. This post is a summary of the techniques used by GJP's superforecasters, and an accounting of the interventions the researchers developed to improve forecasting performance throughout the program.
  • Five Takeaways on Predictions (For The New Decade) — A quick summary of the series so far, written to coincide with the end of the 2010s and in response to the flurry of new-decade predictions published at the start of the 2020s.
  • The Limits of Applied Superforecasting — A mea culpa. What if superforecasting isn't as useful to the average practitioner as I had thought? This piece explores the evolution of my thinking on this topic after a few months of application. My tentative conclusion is that the better, more tractable adaptation is learning to move quickly in the face of uncertainty.
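
Since the scoring system the GJP used, the Brier score, anchors much of this series, here is a minimal sketch of its binary-outcome form in Python. This is my own illustration rather than code from the project; the tournament itself used the multi-category variant of the score, which ranges from 0 to 2 instead of 0 to 1.

```python
def brier_score(forecasts: list[float], outcomes: list[int]) -> float:
    """Mean squared difference between the probabilities assigned to
    yes/no questions and what actually happened (1 = event occurred,
    0 = it did not). Lower is better: 0.0 is perfect foresight, and
    a permanent '50/50' hedge scores 0.25 on every question."""
    assert len(forecasts) == len(outcomes), "one outcome per forecast"
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Three questions: two confident, correct calls, plus a 60% call on
# an event that failed to happen.
print(brier_score([0.9, 0.1, 0.6], [1, 0, 0]))  # ~0.127
```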