Grenfell Tower, Catastrophe and Systemic Change

by ukcivilservant

This blog summarises a fascinating and important book by Gill Kernick.

Gill lived on the 21st floor of Grenfell Tower until 2014.  Three years later, seven of her immediate neighbours died in the catastrophic fire. 

Gill works with high-hazard industries to build their leadership capabilities and culture in order to prevent catastrophic events.  She therefore imagined that a residential fire that killed 72 people would engender a desire to learn and change.  She was wrong.  Her book and accompanying podcasts explore the reasons why.


Catastrophic events are by nature low probability – but they have extremely high consequences.  They don’t just happen.  We create them.  Their frequency and/or intensity can be reduced, but we have to do this in a quite different way from the way in which we reduce the impact of slips, trips and falls.

Excellent leaders avoid catastrophes by deploying chronic unease – constantly searching for dangers and vulnerabilities.  They are aware that the difference between a near miss and a catastrophic event is often a matter of chance.  So it is vital to be on the lookout for weak signals that something is not right, that something could go wrong.  These can always be found (in hindsight) after major disasters.

Chronic unease also means guarding against the natural reaction of ceasing to listen when deluged by complaints.  The opposite reaction is the better one.

Effective leaders are empathetic.  They demonstrate that they care, and can imagine what could happen if a ‘fix’ doesn’t work.  They explicitly or implicitly wonder whether they would be happy if they or their child were asked to work or live in the environment under their control, knowing the risks that they do.

It is not difficult to find examples of leaders failing to respond properly to danger signals.  There were, before the fire, many well-documented expressions of Grenfell residents’ concerns about the performance of their Tenant Management Organisation, including concern about fire risk.

And the Lakanal House fire coroner (2009, six deaths) had recommended that buildings such as Grenfell should be fitted with sprinklers, and that ‘stay put’ advice should be reviewed.  Neither had happened by the time of the Grenfell fire, eight years later.  Here is a minister’s response to fellow MPs’ concerns about the delay:

‘I have neither seen nor heard anything that would suggest consideration of any of [the recommendations of the Lakanal House coroner] is urgent and I’m not willing to disrupt the work of my department by asking that these matters are brought forward.’

Here are two other examples.

Before the Ladbroke Grove rail crash, seven drivers had failed to see and stop at the red light missed by one of the two drivers who died, along with 29 others, at Ladbroke Grove.  The train operator had asked Railtrack ‘as a matter of urgency what [they] intended to do about this high-risk issue’ – but reply came there none.

Jill Rutter tells how senior BP managers knew that their Texas City refinery was a disaster area.  But its problems could only be addressed by closing it down for long periods, and that would have harmed BP’s bottom line at a time when it was a stock exchange darling reporting increasing profits each quarter.  Fifteen people died in the subsequent explosion.

Learning from Disasters

The second way to avoid catastrophes is to ensure that responses to disasters involve systemic change. 

Most inquiries lead only to piecemeal change.  They assume a controllable, predictable world in which an error can be prevented through technical solutions or new bureaucratic, command-and-control rules designed by experts.  The Mid-Staffs inquiry, for instance, led to 270 recommendations which undoubtedly became a tick-box exercise for NHS trusts around the country.

Rules and regulations are never a perfect answer.  They are by their nature reactive and so cannot displace the need to continually search for vulnerabilities.  Compliance is never 100%, and it falls sharply – and enforcement becomes very difficult – if the regulations are complex, which is too often the case.  (Examples might be the pre-Grenfell building regulations and the Covid guidance and legislation.)  Some organisations and individuals actively seek to circumvent regulation.

Systemic change assumes a complex and changing world in which disruption and experimentation are likely routes to improvement.  Those leading systemic change have strong values and draw on the expertise of all stakeholders. They understand and focus on the true purpose of the system – hospital staff caring about (not just ‘for’) their patients, in the case of Mid-Staffs. 

Tick-box and regulatory activities are attractive because they are outputs – they can easily be measured.  But they are not outcomes, valuable in themselves.  So how do we know that systemic change is happening?  The answer is to look out for leading indicators.  Do people feel that they are encouraged to report dangers?  Are there plenty of near-miss reports?  Do patients feel well cared for?

Will Grenfell lead to systemic change?  The signs are not good.  Political and media attention has moved on and, four years later, there has been relatively little progress in addressing even the most urgent issues such as the cladding on other buildings.  There has also been little or no sign of the construction industry accepting responsibility for contributing to the disaster in any way – a necessary precursor to avoiding future problems.

Other Lessons

Gill draws a number of other lessons, for regulators, decision makers and others. In brief …

Regulators need to be adaptable to cope with new dangers and technologies, and should not lose sight of their core purpose, such as ensuring safety.  Resource-constrained enforcement can (but should not) lead to regulatory obligations becoming seen as a maximum target which might be attained if you are lucky, rather than a minimum which must be achieved without fail.

Decision makers should avoid number-counting and tick-box consultations.  Many other more effective mechanisms are available. 

Decision makers should in particular challenge their assumptions about whose knowledge counts.  It can be experts (e.g. during the Covid crisis) but it might well also be those on the front line – including junior staff and local residents.  There was little sign, for instance, that emergency managers were involved in designing the UK’s response to Covid-19.  And the government looked far too ‘blokey’.  The involvement of women, minorities and SMEs in planning would likely have led to better decisions.

Bureaucracies (and companies run like bureaucracies) compile excellent risk registers but should avoid regarding this as ‘job done’ and then failing to take the necessary action to reduce the key risks.  The UK’s national risk register, for instance, gave prominence to a possible pandemic whose impact was given the highest possible rating – and yet the 120,000 Covid-19 deaths suggest that the country remained woefully under-prepared.

Resilience is often seen as paying a big insurance premium for cover for a disaster that you hope won’t happen, or won’t happen on your watch.  But resilience can often be bought in inexpensive ways.  The UK’s pandemic preparations, for instance, did not necessarily need expensive stockpiles.  There could most likely have been plans to requisition manufacturing capacity.

Last, and certainly not least, Gill asks …

Why do we find it so difficult to learn?

One problem is our obsession with blame, which doesn’t fix anything, least of all the systemic issues.  Blame can be helpful if it sharpens debate and exposes issues.  But it is more often unhelpful: everyone naturally tries to avoid being blamed, so blame tends to inhibit debate.

For a start, behaviour is context-dependent.  There is little point in identifying someone who has made a mistake if you then replace them with someone else who will in future make similar mistakes because they remain subject to the same pressures and constraints.

This is particularly true in government. We all – and journalists in particular – like to ascribe blame to one person.  In government, we like to assume that everything is about, and caused by, Presidents, Prime Ministers and other political leaders.  But the truth is much more complex.  Political leaders work within a system and most decisions are taken at much lower and/or more local levels.  There is seldom if ever any serious political will to bring about systemic change following a catastrophe, especially once the media spotlight has moved elsewhere.  The key political imperative is to avoid blame. 

This absence of political will is exacerbated by weaknesses in parliamentary governance and accountability.  There is in particular no process that ensures that recommendations from public inquiries are implemented, or assessed for effectiveness.

It is therefore devoutly to be hoped that Gill’s book will help bring about genuine change in the UK’s political system, including both the media and senior officials.  If there is no change then we are surely not honouring the 72 who died.

Further Reading and Listening

This blog can only summarise a well-sourced and well-argued book.  If you would like more detail and analysis then I suggest you begin by watching Gill’s fascinating conversation with Diane Coyle and Jill Rutter on the Bennett Institute website.

You can also listen to Gill’s podcast series, Catastrophe.

And then Gill’s book can be bought here:

… or on Amazon.

Martin Stanley

Editor – Understanding Regulation