How To Optimise for Success, A Theory

In a recent post on Ray Dalio's Hyperrealism, I dealt with the idea that ‘You should embrace reality and deal with it’, arguing:

I've found there to be a difference between what is useful and what is factually accurate. (For philosophy nerds, this is the difference between the ‘pragmatic’ theory of truth and the ‘correspondence’ theory of truth.) For instance: IQ scores are correlated with a whole host of positive life outcomes, including health, longevity and prosperity. But what are you to do, as an individual? The answer: you should ignore your IQ score. What is ‘factually accurate’ at the population level says nothing about your individual life outcomes. It is also useless as an input on how you should live your life, especially given that you cannot change your IQ, your level of privilege, or the knowability of the world. Useful information here, on the other hand, would be of the sort that Dalio peddles: e.g. ‘embrace your reality and deal with it.’

I'm starting to realise this philosophy could be extended to an absurd degree.

Let's do a thought experiment. Say you take a random guy off the street, and somehow reprogram his mind such that he only adopts beliefs that are ‘useful’ to him. The definition of ‘useful’ here is that a belief contributes to him achieving his goals. If a belief hinders him, he discards it and looks for something better to replace it with. If it neither helps nor hinders him, he gets to decide whether to keep it.
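Stated baldly, the reprogramming rule is simple enough to sketch as a toy loop. (This is purely illustrative — the function name and the coin flip standing in for ‘he gets to decide’ are my own inventions, not part of the thought experiment.)

```python
import random

def filter_beliefs(beliefs, effect_on_goals):
    """Apply the 'usefulness' test to a set of beliefs.

    effect_on_goals(belief) returns +1 if the belief helps the
    person achieve his goals, -1 if it hinders him, 0 if neutral.
    """
    kept = []
    for belief in beliefs:
        effect = effect_on_goals(belief)
        if effect > 0:
            kept.append(belief)      # useful: keep it
        elif effect == 0 and random.random() < 0.5:
            kept.append(belief)      # neutral: keeper's choice (a coin flip here)
        # hindering beliefs are dropped; replacements are sought elsewhere
    return kept
```

Run this loop over a decade or two of lived experience and you get the experiment's premise: whatever survives is, by construction, useful.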

After reprogramming him, you leave him alone for a decade or two. My question is this: when you come back, what kind of outcomes do you think you'd find?

My suspicion? I think you'd get a pretty darned successful person.

Optimise for Usefulness

The problem with such hypotheticals is that you can’t prove them. You can’t clone the man and run this experiment in parallel, observing the difference in outcomes between the two clones. In the same way, I can’t prove that “optimise for usefulness” is the best philosophy to follow.

But I have my suspicions.

My suspicion is that the vast majority of successful people hold some variant of this belief. These people range from rigorous thinkers to superstitious schmucks, but they all seem to run some variant of “I’ll adopt a bunch of beliefs and see what works; then I’ll keep what helps me and throw away the stuff that doesn’t.” The rigorous ones do this consciously. The superstitious ones do it intuitively. Whichever way they do it, it appears as an undercurrent in their lives.

I’ll give two examples, at two extremes.

In 1992, Steve Jobs gave a talk at MIT’s Sloan School of Management. I’ve embedded the video below, but the relevant bit is around the 4:02 mark.


In it, Jobs talked about NeXT Computer’s big bet. He cited Paul Strassmann’s The Business Value of Computers, which argued that the enterprises that got the best return on their IT investments were the ones that invested in ‘operational productivity’ instead of ‘managerial productivity’. (These two terms are fancy ways of saying that the successful companies invested in ERP systems instead of Excel.)

Jobs’s bet was that NeXT could develop a programming environment so productive that it would allow developers to build operational-productivity apps far faster than on any other platform. This would then convince large enterprises to go with NeXT computers as their platform of choice.

With the benefit of hindsight, it’s easy to see what Jobs got right and wrong. He got the bit about programming productivity right; NeXT’s operating system serves as the foundation for macOS, iOS, and watchOS today. It’s not terribly wrong to say that Apple’s insane contemporary valuation had its roots in Jobs’s insight back in 1992. But he got the business model for NeXT critically wrong: it turned out that selling overly expensive computers wasn’t the best way to deliver a really powerful programming environment to enterprise users.

What intrigues me about Jobs’s talk is that you can see the evolution of his mental models. It was clear that he had been thinking deeply about the technology industry for some time, and it’s also clear that some of his thinking from 1992 survived — there’s a bit in the talk where he argued that technology was basically a cycle of innovation ‘windows’, and if you were alert and prepared, you could get in while a window was opening (or, if necessary, and at great cost, push a window open yourself). You can draw a clear line from that belief in 1992 to Apple’s product strategy today, with the iPod, the iPhone, and arguably every product since.

But Jobs also changed a bunch of his mental models. For instance, he switched NeXT away from hardware, despite arguing in his talk that only the revenues from selling computers could generate the cash necessary to hire a salesforce for his enterprise play. This switch to software allowed NeXT to land at Apple, which needed a new operating system in 1996.

It’s difficult to tell what Jobs learnt from his experience at NeXT; I’m sure it’s a lot more nuanced than what I’ve just described. But let’s move on.

My second example, at the other extreme, is Robert Kuok: the sugar tycoon, real estate magnate, palm oil extraordinaire and one-time commodities trader who has for the longest time been the richest man in Malaysia. His biography was interesting for many of the same reasons I found Jobs’s talk interesting; in a chapter on Chinese businessmen, he argued that most of his successful contemporaries succeeded not through formal education, but by ‘distilling wisdom from the air (…) always watching, always listening, always thinking’.

Kuok built his empire on the back of simple businesses with undifferentiated products. Sugar, steel, plywood and flour, and later: hotels, shipping, and natural oils. It wasn’t necessary for him to evolve his thinking as rigorously as Jobs did, but you could see the way he approached each new industry he expanded into. In a chapter on creating the Shangri-La chain of hotels, Kuok argued that the core principles of the hotel business were simple — you needed to give a man a place to sleep, to eat, and to shower. But while the principles were simple, he always preferred to partner with an experienced businessperson before entering a new industry; that way, he could watch and learn the principles that made the industry work. He called this ‘distilling wisdom from the air’.

My takeaway is that while all successful people do this, some industries require it to a degree that other industries don’t. Jobs needed to constantly update his mental frameworks to a degree that Kuok didn’t; the technology industry was a lot more difficult to figure out than undifferentiated commodities for developing markets.

Superstition as Failure Mode

One of the biggest problems with scientific research for the common man is: “how do I use scientific result ‘X’ in my life?” — where X is whatever it is that science writers argue is necessary for success today. This is sometimes easy to do, as in the case of Bacopa monnieri (conclusion: take it for better short-term memory, but only if you don’t get diarrhoea), but it is harder when it comes to research on grit and willpower.

The answer, then, is to adopt a result and see if it grants you an improved ability to achieve your goals. If it does, keep it. If it doesn’t, then it doesn’t matter whether the result is true in the scientific sense; at the level of the individual it is useless, and you should discard it.

This leads us to some pretty annoying side effects. For instance, if you adopt a belief that is scientifically suspect and find that it helps you, this method causes you to keep the belief regardless of the truth. In addition, harmless beliefs that don’t affect your goal-achieving ability are going to stick around. See, for instance: Ray Dalio’s adoption of the debunked MBTI personality test at Bridgewater, or Kuok’s strong belief in Chinese superstition. These beliefs have stuck around because these people have either found them helpful, or because they haven’t harmed their success.

I think the best way to think of this philosophy of “optimise for usefulness” is to see it as a kind of evolutionary pressure. As with evolution, the sets of beliefs that work for success tend to be remarkably similar, and arise independently over and over again. But then … sometimes you get weird bugs like the human appendix. Nobody has much of a clue why the appendix still hangs around, but it doesn’t seem to actively harm anyone, so nature lets us keep it.

Of course, the meta thing to do here is to apply this philosophy to my belief in this philosophy. That is, I should adopt “optimise for usefulness” only to the extent that I find it benefits me. If at any time “optimise for usefulness” prevents me from achieving my goals, I should discard it for something else.

I have researcher friends who call me out for what they see as sloppy thinking on certain things. They have a point. The demands of intellectual rigour for a practitioner are not as high as in academia. It’s not fully rational to believe in Chinese superstition or MBTI personality tests or — for that matter — in religion. But if I’m right about “optimising for usefulness”, then the pursuit of effectiveness calls for believing in some sloppy truths.

Perhaps the final takeaway here is that the practitioner optimises for what is useful, whereas the scientist optimises for what is true. You either have the privilege of rigour in the academy, or you have to face the evolutionary pressure of believing in sloppy truths. Pick your poison; your success demands that you do.

Update: this story about Jobs and the NeXT acquisition reveals that the reality was a lot more complicated than I thought.