
News & Views

Safety Culture Take Off

I often go running along the Flitch Way, a disused railway line, from my home in Great Dunmow, Essex, to Hatfield Forest. The forest is a beautiful National Trust site, designated a Site of Special Scientific Interest, and contains a nature reserve. People go there to have picnics, walk their dogs and row on the boating lake. It also happens to be three miles from Stansted Airport.

At 18:36 on 22 December 1999 the tranquillity of the forest was about to be disturbed. A cargo plane, Korean Air flight 8509, a Boeing 747, was cleared for take-off from Stansted; less than 60 seconds later it crashed into the forest, killing all four crew. The subsequent investigation found that an instrument called an Attitude Director Indicator (ADI), which tells the pilot the aircraft’s orientation relative to the horizon, had failed. The pilot seemingly believed the plane was level when in fact it was banking irrecoverably towards the ground. That’s it, except of course, it’s not. Plane crashes, like construction accidents, are far more likely to result from an accumulation of minor errors that come about through failings in systems and in cultures.

The flight crew on the plane’s preceding flight from Tashkent to Stansted had identified a problem. They had reported it to the ground engineer and an attempt was made to fix it. However, there is no record of the ground engineer briefing the outbound crew about the ADI fault, and nothing on the cockpit voice recorder (CVR) suggests that they were aware of it. During the flight, several alarms comparing the first officer’s functioning ADI with the pilot’s malfunctioning one began to signal visually and audibly. The warnings were individually cancelled prior to the final impact. The CVR did show that the flight engineer made several warning calls indicating he was aware of a problem with the angle of bank, but he failed to understand, or to communicate, how extreme it was.

One of the safety recommendations from the investigation was for Korean Air to update its flight crew training to ensure it was adapted to accommodate Korean culture. The inference was that the deferential nature of employee-manager relationships may discourage subordinates from highlighting issues even in potentially fatal circumstances. The CVR picked up this snappy rebuke from the commander to the first officer on the Stansted flight: “Make sure you understand what ground control is saying, before you speak”.

In poorly developed safety cultures people often suffer from cognitive dissonance – the tension we feel when our beliefs are challenged by evidence. Rather than accept the fault, they reframe the evidence and self-justify the original decision. When the environment fosters a growth mindset it enables us to embrace and use the diversity of thought offered by managers, peers and assistants. After all, the biggest gains in safety often come not from the actions of directors or managers but from operatives who provide feedback on the use of equipment and the procedures that are in place. We need to create psychologically safe environments to facilitate this. People need to feel membership of a team without fear of rejection or humiliation. They need to feel safe to ask questions that will enable them to learn. Furthermore, we need to harness people’s natural desire to contribute, to make a difference and to challenge the status quo. To do that, we must give them respect and permission to dissent when they think something needs to change.

Within my own business we have a monthly safety award to encourage suggestions for innovation. In the last quarter alone this has led to improvements including: the removal of grease buckets from MEWPs (mobile elevating work platforms) when working at height; bringing to our supplier’s attention a problem with the design of the clamps for concrete hoses; strengthening of blowing out butts; and an alternative method for tethering augers during transportation in our yard. We also have quarterly safety workshops with all staff to discuss best-practice procedures. These provide space for raising concerns and developing new ideas away from the pressures of the working environment.

Returning to the airline industry: it is now an example of best-practice safety culture. In the early days of aviation this was of course not the case, but now all planes are equipped with almost indestructible black boxes recording the electronic systems and cockpit conversations. These support investigations that are carried out quickly and publicly. Open reporting and evaluation help ensure those accidents and incidents don’t happen again. Importantly, every near miss is viewed as a learning opportunity.

The Federation of Piling Specialists (FPS), through its Safety, Plant and Operations group, provides a mechanism for doing this in the piling sector. Additionally, the European Federation of Foundation Contractors’ accident investigation training has helped equip our people with the skills to do it effectively. The resulting near-miss reports have doubtless saved industry workers from serious injury. However, in the most serious cases we often encounter a reluctance to share information, and that stops us learning from failure. This is driven by a desire to maintain privilege over documents that may be self-critical and to keep them from falling into the hands of a regulator such as the HSE. There is extensive regulation and case law surrounding the circumstances in which this litigation privilege can be claimed, and I won’t pretend to understand all of it. What I do know is that adopting this position, when sharing the information could benefit fellow industry workers, is morally questionable.

A complementary approach is something known as a pre-mortem. Here team members are told to imagine that the accident has already happened and are asked to suggest plausible reasons why. The project manager ensures that everyone is able to contribute. The benefit is that it unburdens the team from a defensive mindset and enhances their ability to change future outcomes. I’ve found that this works particularly well on new techniques or complicated schemes, feeding into mitigation requirements. However, it could also be used successfully when analysing existing processes with poor safety records.

Construction has struggled to make a quantifiable improvement in accident frequency rates in recent years. If we are to do so, we must look to other industries and learn from their practices and cultures.

Author: Steve Hadley, Chair, Federation of Piling Specialists