Book Depository: The Five Dysfunctions of a Team by Patrick Lencioni

The first, and by far the more interesting, part of this book is written as a short story. It’s about a CEO called Kathryn, who is appointed to fix a software company. Agonisingly, she turns around its dysfunctional leadership team, its decision-making, and its results. The second part unceremoniously picks apart the desiccated carcass of the tale, adding worksheets and teaching material.

Pyramids as diagrammatic aids are troublesome at the best of times, as it’s never certain from first glance what the illustrator wants to say about the hierarchy of information. Do we head from base to summit, as one might climb a real pyramid, or is our eye supposed to scan from apex to base, starting with the smallest stage before tackling ever larger ones? How important are the middle sections? How do the tiers interrelate? The Five Dysfunctions themselves are depicted as a pyramid. Thus the only graphical aid in the book requires the whole book to explain it.

Never mind. The core message of this book is that great teams argue, all the time. They argue to determine the company’s goals; they argue to decide how money and time are spent; ideally, they argue without ego. Teams whose meetings contain no conflict hold very boring meetings, where nothing is decided and people zone out, dumbly acquiesce, or cower in fear. Problems aired in company meetings matter to everybody, and everybody has a different perspective on what’s important and why, so meetings should be dramatic: full of tension, disagreement and discussion. Lencioni says that the most fundamental dysfunction is an absence of trust.

Trust is thus drawn at the base of the pyramid. This is the first place you visit when fixing a team, and the thing that must be right before anything else can be addressed. Every member of the team has to trust the others. To invite disagreement, they must allow themselves to be vulnerable to criticism and counter-argument without their openness being abused by ad-hominem attacks or disingenuous political manoeuvring, and this requires responsible facilitation. The antithesis of trust is invulnerability: if you don’t argue, you won’t ever lose.

Absence of trust / Invulnerability

Without complete trust, there cannot be honest conflict. The next stage of the pyramid concerns unfiltered conflict. In a supportive and respectful environment, ideas are primal, truth wins, and it doesn’t matter who has volunteered a suggestion, only how appropriate and useful it is. Conversely, in places where a person is shouted down for political reasons or without adequate justification, there will be people who are not honestly satisfied with the eventual decision, or the motives behind it.

Fear of conflict / Artificial harmony

The result of hours of conflict and resolution is a plan that the whole team can commit to. If there isn’t universal buy-in, the overall mission of the company cannot be coherent or complete.

Lack of commitment / Ambiguity

Commitment to a plan then leads to responsibility to deliver that plan. This means maintaining high standards throughout the whole team, and yet more conflict: holding team members to account if they miss targets, ignore work, or get distracted by other goals.

Avoidance of accountability / Low standards

People may be members of many teams, while also attending to their personal ambitions. One of these teams must come first. If you’re on the leadership team, this is your first team. Maintaining trust and confidentiality in this team is paramount because it’s how a company succeeds. It’s important that your team can expect you to work towards the collective goal, that conflict will be handled with discretion, and that the whole team can get on your case if you don’t perform.

While it’s easier to let people pursue individual, ego-led goals than to hold them to account in front of the team, it’s the wrong thing to do.

Inattention to results / Status and ego

These problems are simple to state but, as we see in other books, it’s not always possible to dismantle political structures that teams evolve to deal with day-to-day situations. Sometimes teams cannot be made to work like this without replacing parts of them, or galvanising them with an external crisis.

Book Depository: Black Box Thinking by Matthew Syed

I didn’t find this book at ROLI, but Matthew Syed’s everywhere these days and this one looked like it might be worth a glance.

Matthew Syed is a tall man with no hair. His book is full of this kind of observation that never goes anywhere: the muscular build of a bereaved man; the hairstyle of a pilot who committed suicide. Black Box Thinking luxuriates in glib journalistic dazzle. In one instant, we are forensically deconstructing the political aftermath of a medical accident; in the next, the writer’s attention alights on a widower’s eyes, welling as his tapering fingers tremble. This is modern self-help in Dale Carnegie’s image: recounted with a ferocious zeal that feels claustrophobic and contrived, and presented with a chattiness that sits poorly with the importance and tragic flavour of its material. It occasionally verges on voyeurism but, where a sting of criticism might hit home, it is immediately emolliated by condescension. After reading of a surgeon whose tyranny nearly killed somebody, we are reminded that even a doctor who, in an instant of hubristic idiocy, almost kills a patient, and then obstructs attempts to investigate procedural errors, is nevertheless a hero. They are all heroes. Atul Gawande trod this ground years before Syed, and did it better: directly, sincerely, and sensitively.

Syed’s book is a poor recruiter for a great employer. At its core is a simple, powerful and useful message, but some flaws are unforgivable. To an engineer, mathematician, or scientist, the predominant sin of this book is the insouciance with which recognised terms are misappropriated. Syed earnestly believes that the world would be better if the scientific method were more widely understood and better applied; if objective truth were served as slavishly as the will to power. He wants his friends to know this too. But remember, journalist, that we engineers will continue to practise and hone our trade in our tiny rooms long after your friends follow the siren call of something more lucrative. Our tools and our knowledge are ours: do not abuse them.

Syed refers to ‘open-loop’ and ‘closed-loop’ thinking in a way that, for no good reason, inverts the established meaning of these terms. Hence a ‘closed-loop’ system, which to millions of us with a modicum of technical training is something ‘closed’ by a path that provides corrective feedback, is now ‘closed’ in the sense of ‘guarded against feedback and the influence of evidence’. Did anybody edit this book?
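
For the record, here is a minimal sketch of the conventional engineering usage, with an invented setpoint, gain and plant purely for illustration: the loop is ‘closed’ by the feedback path that compares the output with the target and corrects it.

```python
# Conventional control-engineering usage: the loop is 'closed' by feedback.
def open_loop(setpoint, steps=20):
    """Apply a fixed, pre-computed command and hope: no measurement, no correction."""
    state = 0.0
    for _ in range(steps):
        state += 0.5 * setpoint        # command is never adjusted by the outcome
    return state

def closed_loop(setpoint, k_p=0.4, steps=20):
    """Measure the output, feed the error back, and correct continuously."""
    state = 0.0
    for _ in range(steps):
        error = setpoint - state       # feedback path: compare output with target
        state += k_p * error           # corrective action proportional to the error
    return state

print(open_loop(10.0))    # drifts to wherever the fixed command takes it
print(closed_loop(10.0))  # converges towards the setpoint
```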

Lesson one: collect data about everything you’re doing. The title ‘Black Box Thinking’ refers to the two data recorders that capture the cockpit voice and telemetry in aircraft, so that crashes and near-misses can be better understood. Dispassionate forensic analysis of this data provides vital information about what went wrong.

Lesson two: depersonalise this information, and don’t use it to shame people. The fear of shame leads to the deliberate concealment of errors, so everybody loses opportunities to learn. Humans are fallible under stress, and the first duty of a crash investigator is to improve flight safety. Before critical failures, there are near-misses, and people must be allowed to report and challenge these without fear. The exemplary attitude in aviation allows mechanical problems to be caught at an early stage. Best practice is also improved in the cockpit. In-flight checklists control the narrowing of a pilot’s concentration under stress; improved human factors fix problems with the flying controls; Crew Resource Management addresses the psychological difficulties of cockpit hierarchy. This is why, as we know, civil aviation becomes safer even as aircraft become more complicated.

Lesson three: learn by building, make marginal gains, iterate often, create theories and try to falsify them. Syed summarises with unusual concision, ‘If I want to be a great musician, I must first play a lot of bad music. If I want to become a great tennis player, I must first lose lots of tennis games. If I want to become a top commercial architect known for energy-efficient, minimalist designs, I must first design inefficient, clunky buildings.’ Prepare to produce a lot of dross on the road to success. All performers are poor at first; nobody gets great without a lot of practice. Solicit feedback from customers at a really early stage, when you’re still a bit embarrassed by your product: you’ll learn if you’re doing a really great job designing the wrong thing.

On the subject of iteration, there is another use of the term ‘Black Box’ that is more commonly employed by engineers. A Black Box model is one in which a system is characterised merely from measurement of its inputs and outputs, without attempting to understand the reasons for this relationship. Woven into the central chapters on evolution and marginal gains, this meaning would have bolstered the book in many places, but it wasn’t: the dual meaning is dismissed in a footnote on page 33 and never mentioned again.
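
As a toy illustration of the engineer’s sense of the term (the measurements below are invented), a black box model is no more than a curve fitted to observed inputs and outputs, with no claim about why the relationship holds:

```python
import numpy as np

# 'Black box' characterisation: only measured inputs and outputs are available
# (the numbers here are made up), and a curve is fitted with no theory of mechanism.
inputs  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # e.g. a nozzle setting
outputs = np.array([2.1, 4.3, 8.9, 16.2, 25.4])  # e.g. a measured flow rate

coefficients = np.polyfit(inputs, outputs, deg=2)  # purely empirical fit
model = np.poly1d(coefficients)

print(model(3.5))   # interpolating within the data is reasonable...
print(model(50.0))  # ...extrapolating far beyond it is a guess
```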

In a central chapter, Syed notes that Unilever employed physicists and biologists to approach a difficult nozzle optimisation problem from two directions. This nozzle must create detergent granules by firing a hot, pressurised liquid into air, where it solidifies and lands as a correctly-sized powder. First, physicists tried to characterise how the nozzle worked by modelling the flow of fluids. Their failure to build a successful working model showed just how intractable the problem was. A team of biologists then successfully optimised the nozzle with a typical ‘black box’ approach: starting with an existing, poorly-functioning prototype; measuring the powder, tweaking the nozzle design, and iterating the best-performing candidates over dozens of generations; finishing when it was as good as it was going to get. Hundreds of prototypes later, the ‘black box’ approach worked, and Syed narrates this as a victory for the empirical, evolutionary approach. Dyson’s creation of thousands of iterations of vacuum cleaner to arrive at the first commercial prototype of his dual cyclone is advanced to bolster the cause. Take that, physicists!

Unilever nozzle
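
The procedure described above is, in essence, an evolutionary search: measure the candidates, keep the best, mutate them, and repeat. Here is a minimal sketch, with an invented fitness function standing in for the physical powder measurements:

```python
import random

def fitness(design):
    """Stand-in for measuring the powder a prototype produces.
    (An invented objective: the real evaluation was a physical experiment.)"""
    return -sum((parameter - 0.7) ** 2 for parameter in design)

def mutate(design, scale=0.05):
    """Tweak each nozzle parameter slightly."""
    return [parameter + random.gauss(0, scale) for parameter in design]

# Start from poorly-performing prototypes and iterate over generations.
population = [[random.random() for _ in range(4)] for _ in range(10)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)   # rank by measured performance
    survivors = population[:3]                   # keep the best candidates
    population = survivors + [mutate(random.choice(survivors)) for _ in range(7)]

best = max(population, key=fitness)
print(best, fitness(best))
```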

Had Syed been a scientist — had he taken his own advice — he would have seen this story as more than a battle between practices. Both teams’ methods align with scientific best practice: each collected data, analysed it, and approached a truth. The physicists attempted to develop a theory that would solve the general case, and failed. The biologists set out to attack the specific case, and succeeded. My conclusions are:

  1. Failure informed the approach that led to success, as it often does. Failure’s a very good teacher, but a slow and expensive one.
  2. Changing tactics saved the project, at the cost of limiting its scope. What was sacrificed was a general understanding of the system.

Unilever have a brilliant nozzle, but they’ll never know how they came to this design, or whether there’s an even better one. Now there’s no way to double its capacity or apply it to a slightly different fluid without building another hundred prototypes.

Lesson four: understand and eliminate cognitive dissonance. Resist the temptation to spin failure as a success, or deny that something went wrong. Accept such failures as an opportunity to learn and improve.

If you’re involved in a technical discipline, you’re already a servant of hard physical truths, and no amount of post-event rationalisation excuses a non-working prototype.

An external perspective on the scientific method will help a wider audience to understand it. There are certainly pickings in this book for technical readers too, but it’s principally for an audience who don’t get, or even seek, the same class of feedback from their work that a technologist does. Recommend it to your boss. Next time you have a corridor conversation, though, remember that ‘closed-loop’ is open-loop, ‘open-loop’ is closed-loop, and ‘black-box’ means collecting and responding to data. Or ‘science’.

Book Depository: The Power of Habit by Charles Duhigg

Charles Duhigg’s book creaks under the weight of the examples he throws at the wall to support every observation. They become wearisome after a while. Its message could have been stated just as effectively in a pamphlet, so here it is.

The habit loop is what happens when a sensory trigger precipitates a routine, which then leads to a reward of some kind. Over time, neural connections that link the trigger to the routine are strengthened in anticipation of the next reward, until the routine happens without conscious thought. Animals can be trained to follow surprisingly complex routines by exploiting the habit loop.
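
Stripped to its bones, the loop is simple reinforcement of an association. A toy sketch of my own (the numbers are not Duhigg’s): each completed cue-routine-reward cycle strengthens the link between cue and routine until the routine fires without conscious thought.

```python
# Toy model of the habit loop: each completed cue -> routine -> reward cycle
# strengthens the cue-routine association until the routine becomes automatic.
LEARNING_RATE = 0.2
THRESHOLD = 0.8            # above this, treat the routine as automatic

association = 0.0
for day in range(1, 15):
    reward = 1.0                                        # the routine paid off today
    association += LEARNING_RATE * (reward - association)
    print(f"day {day:2d}: association={association:.2f} "
          f"automatic={association > THRESHOLD}")
```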

A keystone habit is a single change introduced into a daily routine. It exploits the habit loop to precipitate one small improvement, and the reward from this can be used to power ever-larger changes.

Putting a piece of fruit on your desk to trigger a health regime, so you don’t go searching for snacks, is a simple example. Keeping a register of things you eat is another. At the company level, as Paul O’Neill did with Alcoa, you might be able to focus the organisation on one goal because that goal necessitates other changes you want to see. In the case of Alcoa, the goal was zero accidents. This required transformations in the chain of command and in industrial processes that enabled Alcoa to become much more successful and less wasteful, without these being explicit goals.

Exploit the habit loop when you can. Reduce the change you want to a single keystone habit, one metric to improve, or one difference to make.

Self-discipline has more influence on long-term success than intellectual ability. Willpower is like a muscle:

  1. It develops and improves with practice;
  2. It can tire through overuse, which offers a speculative explanation for why high-flying businessmen and senior politicians so regularly make spectacularly poor decisions in their personal lives;
  3. You can burn it out altogether for a while, after which it’s weakened and recovers only slowly.

Willpower, like habit, is fed by positive reinforcement: completing the habit loop requires a personal reason for applying it. Any kind of reward will suffice, but you need a carrot even if you already have a stick.

Willpower is vulnerable to pressure. To form new habits under hard conditions, train with those conditions in mind. Rehearse particularly stressful encounters or difficult situations that upset you. Plan for the moments when your willpower will falter, and replay successful scenarios like videos in your mind’s eye; the rehearsed responses will become the better habit.

Starbucks sees its service as more important than the quality of its coffee. It trains employees, some of whom have anger problems, using the LATTE method (listen; acknowledge; take action; thank the customer; explain). This serves a social purpose too. Using this method, staff write a plan for how they’ll deal with an abusive customer, which helps them to maintain their professionalism under fire.

In An Evolutionary Theory of Economic Change (Nelson and Winter), the case is made that companies aren’t families but battlefields in a civil war. A functional equilibrium is established through a network of truces between ambitious people. These truces may hold so well during business as usual that they become impossible to change, yet they are too rigid for organisational improvement and may break down entirely in a crisis. Disaster is then inevitable. The 1987 King’s Cross fire, and the Fennell Report after it, illustrate a dysfunctional organisation in a crisis, and a way to transform it. Desmond Fennell fanned a media circus and allowed people to be shamed in public. It can be worth stirring up a catastrophe rather than letting it die down because, when people are vulnerable, it is a rare opportunity to face failure, make changes, and establish new rules and habits.

Some social movements succeed while others fail. Three things are needed: friendships between individuals, a community with specific, identifiable interests, and leadership that is able to devolve power to the ranks as it inspires them. Rosa Parks, the Montgomery bus boycott and Martin Luther King are the archetypal example. Rosa Parks was not the first black woman to be arrested on a bus, but her high social standing, combined with a growing awareness of the Civil Rights movement, was enough to trigger change. The leadership of such a movement has to be able to establish a strong culture, and then stand back so that the movement can be owned and led by its people.

If you can include a core of religious faith, as the Civil Rights movement did, and as Alcoholics Anonymous does, you provide a stronger way of displacing destructive cycles of habit with helpful, community-focused ones.

Weak ties, soft power, and peer pressure are how individuals advance themselves. Weak ties are acquaintances and friends of friends: these networks get people their next job or their customers. Soft power is power that influences rather than coerces: the kind that makes you attend an event because you think that certain people will expect you to.