UK floods: Crises are often foreseeable, so why do leaders ignore the facts that could avert them?
The crisis management gurus all agree. In the majority of cases – some estimate up to 70% of the time – crises are foreseeable. The root causes are visible. The facts are known. But, for a combination of reasons, leaders either avert their gaze or stumble on in ignorance. The outcome is the same: hurried, panicky measures, promises to do better next time, to spend a bit more and "learn lessons". And still the flood waters rise. Lather, rinse and repeat.
The next few days will probably see arguments about who cut which budget and when, whether cash increases represent a fall "in real terms" or not. The prime minister will frown for the cameras and boast about his £2.3bn ($3.4bn). Others will point to graphs that show a significant drop in resources.
The numbers game is not the main thing, however. There is a bigger point: a wilful lack of preparedness for something experts have been predicting for years. We should be expecting unprecedented climatic conditions and extreme weather. This is what climate change brings us. Some farming methods and the possible mismanagement of rivers merely exacerbate a problem that is already, through neglect, out of control.
This situation, with floods causing misery and expensive damage in several parts of northern England, is what Max Bazerman and Michael Watkins, in their book of the same name published over a decade ago, called a "predictable surprise". A predictable surprise is, in their words, "an event or set of events that take an individual or group by surprise, despite prior awareness of all the information necessary to anticipate the events and their consequences".
Climate change and its consequences are a prime source of such "surprises". As Bazerman and Watkins wrote (in 2004): "It is probably the most significant and potentially destructive predictable surprise that society is currently ignoring ... When the disaster starts to occur, we can count on [the politicians] to deny that they had any reason to believe the problem could become so large."
Why do leaders ignore the facts that are in front of their eyes? Bazerman and Watkins talk about "cognitive biases": preferences (or avoidance mechanisms) that stop them from taking action. We might call these biases blind spots – although, as Margaret Heffernan has pointed out, some organisations suffer from "wilful blindness": they choose not to recognise something awkward or embarrassing, to their cost.
Maybe it will be politically inconvenient to point out to a boss that more money has to be spent. Perhaps highlighting a problem will open a can of worms. It could be that the work required to deal with a problem will take years to carry out, and little credit will be earned by those who do it. Maybe people will just cross their fingers and hope that nothing goes wrong on their watch.
For some or all of these reasons, necessary action is delayed or avoided. And the people at the top may not even know that action is being avoided. The former US defence secretary Donald Rumsfeld famously talked about the "unknown unknowns", details that no-one in command knows anything about. Unknown unknowns can lead to unpredictable surprises, calamities that strike out of the blue.
But Rumsfeld did not mention "unknown knowns", which can be every bit as lethal. These are pieces of information that are known somewhere in the organisation, just not at the top.
As Bazerman and Watkins explain, on September 11, 1996, the US General Accounting Office, the investigative arm of Congress, published a report entitled "Aviation Security: Urgent Issues Need to be Addressed". This report warned that "events such as the [1993] World Trade Center bombing have revealed that the terrorists' threat in the US is more serious and extensive than previously believed".
Five years later, to the day, al-Qaeda seized four planes over northeastern America with the appalling consequences we all remember. Reports that summer from security agents in the field – "unknown knowns" – never seemed to be picked up by those at the top. One report did land on a desk that mattered. The CIA's daily brief to the President on August 6, 2001, was labelled "Bin Laden determined to strike in US".
Not all crises can be prevented. But most of them can. If information flows freely, "unknown knowns" can be eradicated. If people feel able to deliver bad news to the top, it will get through. Senior people have to pay attention, notice what the facts are telling them, and act on them. This can all be done. It just takes effective leadership.