Book Depository: Black Box Thinking by Matthew Syed

I didn’t find this book at ROLI, but Matthew Syed’s everywhere these days and this one looked like it might be worth a glance.

Matthew Syed is a tall man with no hair. His book is full of this kind of observation that never goes anywhere: the muscular build of a bereaved man; the hairstyle of a pilot who committed suicide. Here we are, forensically deconstructing the aftermath of a medical accident; suddenly, Syed jump-cuts to a widower’s eyes, welling as his tapering fingers tremble.

This gives the reader two problems. First, the book’s central message ends up feeling diluted and insincere, because he is constantly distracting us from it. Second, these soft-focus vignettes sit poorly with the tragic flavour of much of the material. Sometimes they feel voyeuristic. At other times, when you feel Syed is building up to a killer punch, he pulls it. This surgeon’s tyranny in an operating theatre nearly killed somebody; his obstruction of the subsequent investigation nearly led to more deaths. But remember, he’s a hero. They’re all heroes. Atul Gawande trod this ground years before Syed, and did it authentically, as a driver rather than a back-seat passenger.

Syed’s book, then, is a poor recruiter for a great employer. At its core is a simple, powerful and universal message about the power of scientific inquiry. Here again, though, some flaws are unforgivable. Central to a writer’s integrity is a clear and honest use of words. You can’t tell your readers that science is what they need in their lives and then, in the next sentence, cut off its limbs to fit your bed.

Syed refers to ‘open-loop’ and ‘closed-loop’ thinking in a way that, for no good reason, inverts the established meaning of these terms. Hence a ‘closed-loop’ system, which to the millions of us with technical training is something ‘closed’ by a path that provides corrective feedback, is now ‘closed’ in the sense of ‘guarded against feedback and the influence of evidence’. Did anybody edit this book?
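For the avoidance of doubt about the convention Syed inverts, here is a minimal sketch of my own (nothing like it appears in the book) of a closed-loop controller: the loop is ‘closed’ by the feedback path that compares the measured output with the target. The setpoint, gain and toy plant model are arbitrary illustrations.

```python
# Minimal sketch (not from the book): a proportional controller whose loop is
# "closed" by the feedback path from the measured output back to the input.
# The setpoint, gain, and plant response below are arbitrary illustrations.

def run_closed_loop(setpoint=1.0, gain=0.5, steps=20):
    """Drive a trivial first-order 'plant' towards a setpoint using feedback."""
    output = 0.0
    for _ in range(steps):
        error = setpoint - output      # feedback: compare the output with the target
        control = gain * error         # corrective action proportional to the error
        output += 0.3 * control        # crude plant response to the control input
    return output

# An open-loop version would apply a precomputed control signal and never look
# at the output again -- which is exactly the "ignores evidence" sense Syed wants.
print(run_closed_loop())   # converges towards 1.0
```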

Lesson one: collect data about everything you’re doing. The title ‘Black Box Thinking’ refers to the two data recorders that capture the cockpit voice and telemetry in aircraft, so that crashes and near-misses can be better understood. Dispassionate forensic analysis of this data provides vital information about what went wrong. If you don’t have data, you’re reliant on lucky guesses to prevent disaster, and highly susceptible to the interference of people with their own agendas.

Lesson two: depersonalise this information, and don’t use it to shame people. The fear of shame leads to the deliberate concealment of errors, so everybody loses opportunities to learn. Humans are fallible under stress, and the first duty of a crash investigator is to improve flight safety. Before critical failures, there are near-misses, and people must be allowed to report and challenge these without fear. The exemplary attitude in aviation allows mechanical problems to be caught at an early stage. Best practice is also improved in the cockpit. In-flight checklists control the narrowing of a pilot’s concentration under stress; improved human factors fix problems with the flying controls; Crew Resource Management addresses the psychological difficulties of cockpit hierarchy. This is why, even as aircraft become more complicated, and the skies more congested, civil aviation gets safer.

Lesson three: learn by building, make marginal gains, iterate often, create theories and try to falsify them. Syed summarises with unusual concision, ‘If I want to be a great musician, I must first play a lot of bad music. If I want to become a great tennis player, I must first lose lots of tennis games. If I want to become a top commercial architect known for energy-efficient, minimalist designs, I must first design inefficient, clunky buildings.’ Nobody gets great without a lot of practice. If you’re a product company, solicit feedback from customers at an early stage, while you’re still a bit embarrassed by your offering: you’ll find out whether you’re doing a great job of designing the wrong thing.

On the subject of iteration, there is another use of the term ‘Black Box’ that is more commonly employed by engineers. A Black Box model is one in which a system is characterised merely by measuring and relating its inputs to its outputs, without attempting to understand the internal processes that connect them. This could have been woven into the central chapters on evolution and marginal gains, where it would have bolstered the book in many places, but it isn’t: the dual meaning is dismissed in a footnote on page 33 and never mentioned again.
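For readers who haven’t met the engineer’s sense of the term, here is a minimal sketch of my own (again, not from the book) of black-box modelling: drive the system with known inputs, record the outputs, and fit an empirical relationship without ever opening the box. The quadratic ‘system’ below is a stand-in for real physical measurements.

```python
# Minimal sketch of black-box modelling: characterise an unknown system purely
# by relating measured inputs to measured outputs, with no model of its internals.
import numpy as np

def unknown_system(x):
    """Stands in for a physical process we can only observe from the outside."""
    return 2.0 * x**2 + 0.5 * x + np.random.normal(0.0, 0.1, size=x.shape)

inputs = np.linspace(0.0, 1.0, 50)       # chosen test inputs
outputs = unknown_system(inputs)         # measured responses

# Fit an empirical input->output relationship (here, a quadratic).
coeffs = np.polyfit(inputs, outputs, deg=2)
model = np.poly1d(coeffs)

print(model(0.7))   # predict the output for a new input without opening the box
```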

In a central chapter, Syed notes that Unilever employed physicists and biologists to approach a difficult optimisation problem from two directions. Detergent granules are produced by firing a hot, pressurised liquid through a nozzle into air, where it rapidly solidifies and lands as a powder. The powder has to have the right grain size and consistency, and be adequately mixed, so the nozzle design is critical.

First, as Syed narrates it, physicists tried to characterise how the nozzle worked by modelling the flow of fluid through it. Their failure to build a decent nozzle highlighted the intractability of the problem. A team of biologists then successfully optimised the nozzle with a typical ‘black box’ approach: starting with an existing, poorly-functioning prototype; measuring the powder it produced, tweaking its design, and favouring the best-performing candidates over dozens of generations; finishing when it was as good as it seemed it was going to get. Hundreds of prototypes later, the ‘black box’ approach produced a great nozzle.
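In outline, the biologists’ procedure is an evolutionary search, something like the toy sketch below. This is my own illustration, not Unilever’s actual process: score_design() is a made-up stand-in for physically measuring the powder a prototype produces.

```python
# Sketch of the biologists' style of optimisation: mutate a design, measure it,
# keep the best-performing candidates, repeat. The 'design' here is just a list
# of parameters and score_design() is a hypothetical stand-in for a measurement.
import random

def score_design(params):
    """Hypothetical fitness function; in reality this is a physical measurement."""
    return -sum((p - 0.6) ** 2 for p in params)   # quality peaks at params == 0.6

def mutate(params, scale=0.05):
    """Small random tweaks to an existing design."""
    return [p + random.gauss(0.0, scale) for p in params]

design = [random.random() for _ in range(5)]      # existing, poorly-functioning prototype
best_score = score_design(design)

for generation in range(100):                     # "dozens of generations"
    candidates = [mutate(design) for _ in range(10)]
    for candidate in candidates:
        score = score_design(candidate)
        if score > best_score:                    # favour the best-performing candidate
            design, best_score = candidate, score

print(best_score)   # stops improving once it is as good as it seems it will get
```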

Syed narrates this as a victory for the empirical, evolutionary approach. Dyson, who created thousands of prototype vacuum cleaners in order to arrive at the DC01, is press-ganged into Syed’s war. Take that, theorists!

[Image: Unilever nozzle]

Had Syed been a scientist, had he taken his own advice, he would have seen this story as more than a battle between schools of philosophy. Both teams’ methods were aligned with best practice: each collected data, analysed it dispassionately, and thus approached the truth. One school attempted to find a theory to solve the general case, realised that its best guesses were false, and conceded defeat. The other school set out to attack a specific, smaller problem, and succeeded. My conclusions would be:

  1. Failure informed the approach that led to success, as it often does. Failure’s a great teacher, but a slow and expensive one.
  2. Reducing the scope of the problem allowed a different tool to be used, which succeeded. The price of success was the abandonment of a general understanding of the problem.

Unilever have a brilliant nozzle, and the method that produced it, but they’ll never know why it works, or whether there’s a design they missed that produces three times as much powder. Every time they want to increase flow through the nozzle or reformulate their detergent, they’ll have to make a hundred more prototypes.

Lesson four: understand and eliminate cognitive dissonance. Resist the temptation to spin failure as a success, or deny that something went wrong. Accept such failures as an opportunity to learn and improve.

If you’re involved in a technical discipline, you’re already a servant of hard physical truths. No amount of post-event rationalisation will excuse a prototype that is not fit for sale. (Although, if you’re building Mars landers for the European Space Agency, it seems you can crash-land at every attempt, act as though you succeeded, and continue to get funding, but I’m talking about real jobs.)

An outsider’s perspective on the scientific method may help a wider audience to discover it. There are certainly pickings in this book for technical readers, but it’s principally for an audience who don’t get, or even seek, the kind of feedback from their work that a technologist must. Recommend it to your boss. Next time you have a corridor conversation, though, remember that ‘closed-loop’ is open-loop, ‘open-loop’ is closed-loop, and ‘black-box’ means collecting and responding to data. Or ‘science’.
