Flight time: what practitioners can learn from pilots

09 December, 2019 / indepth
Will Peakin

Lessons from the evolution of human factors in aviation can be applied in a healthcare setting

“Would you behave differently if you shared the fate of your patient arising from any error you made at work?”

Professor Peter Brennan, a consultant maxillofacial surgeon at Portsmouth Hospitals NHS Trust and an Honorary Fellow of the Royal College of Physicians and Surgeons of Glasgow, posed the question recently, pointing out that this is the position pilots find themselves in as a matter of course.

In aviation, we assume we are going to make mistakes, and our whole mindset and system is based around that.

Captain Niall Downey, Aer Lingus Pilot

“If they make a mistake, everyone dies,” he said. “If I make a mistake, I walk away from the experience.”

Professor Brennan said that his eyes were opened to the phenomenon of ‘Human Factors in aviation’ – optimising the relationship between people and systems in order to improve safety and performance – around nine years ago. A friend of a patient he was treating happened to be a British Airways 747 Training Captain.

“I got chatting to him,” recalled Brennan, “and he came to theatre and really opened my eyes to better team working, effective communication, reducing hierarchy, and workload management.”

Professor Brennan was speaking at a conference on Human Factors, organised by the Royal College of Physicians and Surgeons of Glasgow in November. He highlighted the importance of apparently simple things, such as being properly hydrated and fed, and of good interpersonal relationships, as well as more challenging concepts such as “flattening hierarchies” and how to ensure the adoption of a “no-blame culture”.

Taking a lead from other high-risk organisations, including aviation and air traffic services, the conference examined Human Factors and their relevance to errors in practice. In the run-up to the conference, Dr Richard Hull, the college’s Honorary Secretary, who co-organised the conference with Professor Brennan, outlined to Scottish Dental the thinking behind hosting the event.

“‘Never events’ are simply that; they should never occur. An audit of never events in Wessex showed that Human Factors were implicated in over 80% of cases reported. Since the Kegworth air disaster 30 years ago, where Human Factors contributed to an accident which resulted in the deaths of 47 people, airlines and other high-risk organisations have embraced the relevance of Human Factors.

“There has not been a single death due to human error on a UK-registered airline in over three billion passenger journeys. While the NHS environment is very different, we have much to learn to promote safe working, in a no-blame culture, to ultimately give better, safer health care for our patients.”

The aim of the conference was to help people working in health care, dental professionals included, to recognise the relevance of Human Factors in their day-to-day practice and performance. It was important, said Dr Hull, for people to understand the specific features of errors and the scale of the problem. He added: “Errors are everyone’s problem and we need to do the maximum to prevent them.”

Medical errors are common and largely preventable, the conference heard. In the UK, around one in 10 hospital admissions involves some form of human error, ranging from relatively minor incidents to never events and deaths, the latter estimated at up to 5,000 patients per year. Analysis of so-called never events has found that Human Factors are responsible for the majority of these mistakes.

Professor Brennan underlined the view that healthcare cannot be compared exactly with aviation, “but we can use the many Human Factors that aviation and other high-risk organisations know so well: enhancing team working, effective communication, workload management, reducing hierarchy and professionalism, among others”.

“If our work on Human Factors prevents serious error for just one patient, then we have succeeded. We are gaining recognition internationally and helping to promote our specialty as a leader in this area.” He said that most errors start at the organisational level and end with the unsafe act itself. “Most of my work has been looking at the preconditions; if you can block those conditions, you can almost certainly prevent the error from occurring.”

He added: “High-risk organisations – aviation, rail, nuclear energy, National Air Traffic Services – they recognise the importance of Human Factors. The only way to embed Human Factors across healthcare is that top-down, bottom-up approach, so that we meet in the middle. There’s a wealth of evidence to show that senior management is core; not just in practice, but also the regulator, the Colleges.”

Professor Brennan showed a slide of a man he had operated on; the right side of his head had been penetrated by the blade of an angle saw he had been using to cut tiles in a shipyard. The preconditions, said Professor Brennan, were that he was new to the job, he was unsupervised, and he had been set a time limit to complete the task. The unsafe act – the error – was that he pressed down too hard. The blade sheared off, went through the visor he was wearing and sliced into his face below his eye socket.

“A simple mistake that should never have happened,” recalled Professor Brennan. “It was a seven-hour operation, involving bone grafts to rebuild the orbit.” The outcome was positive, he said. “His vision was fine. I got that result because every two hours, I walked away for a 10- or 15-minute break. And actually, you finish quicker than if you work for seven or eight hours, because your performance falls with time.”

Captain Niall Downey, a pilot with Aer Lingus, described himself as a “recovering cardio-thoracic surgeon”; he switched careers in the nineties, from medicine to aviation. At the beginning of his presentation, he asked delegates: “Has anyone here ever made a mistake?” A delegate answered: “Every day.” Captain Downey responded: “So, we’re in the right room. In aviation, we assume we are going to make mistakes, and our whole mindset and system is based around that.” His time in cardiac surgery, he said, had been different: “We weren’t allowed to make mistakes. If you did make mistakes, you weren’t allowed to talk about it. I think there is a better way.”

To underline the urgency of his message, Captain Downey reviewed studies of deaths caused by human error in healthcare systems – some of which put the number at ten times the figure quoted above. Extrapolating, he said, while showing a slide of the passenger cabin of a 174-seat Airbus A320: “Each one of those seats is a funeral in the Scottish healthcare system due to human error. Every 10 days, we crash one of those. It doesn’t get covered by the BBC and we don’t have to tell the CAA (Civil Aviation Authority). That’s your environment. We changed our environment over the last 40 years.”
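As a rough check of the arithmetic behind that image (the 174 seats and the 10-day interval are Captain Downey’s figures; the annual total below is an inference, not a number given at the conference):

\[
174 \ \text{deaths per crash} \times \frac{365 \ \text{days per year}}{10 \ \text{days per crash}} \approx 6{,}350 \ \text{deaths per year}
\]

That is the same order of magnitude as the higher estimates he cited (roughly ten times the 5,000-a-year UK figure quoted earlier), once Scotland’s share of the UK population is taken into account.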

Charting accidents and incidents, and the number of deaths, in aviation from 1920, he said there was a steady climb to 1977, and then a descent to the point where there are now fewer than 1,000 deaths a year in commercial jet aviation worldwide, out of around four billion passenger movements. It was in 1977 that two Boeing 747 passenger jets collided on the runway at Tenerife airport, killing 583 people.

“That was a watershed moment in aviation,” said Captain Downey. “We decided as an industry, we needed to do things differently. It began as ‘cockpit resource management’, became ‘crew resource management’, and has evolved over the past 40 years into full-blown Human Factors.”

Captain Downey said that we need to stop focussing on a ‘no-blame culture’. “We don’t have a no-blame culture. If I make a bollocks of something tomorrow, I will be blamed, I will be held responsible. But if I report it, I won’t be sacked for it.” What aviation has instead, he said, is a ‘Just Culture’, which covers “honest human mistakes, not deliberate error or gross negligence, but it means we can make mistakes and admit to them”. In contrast, he said, healthcare has a “name, blame, shame, and retrain” culture.

In aviation, he said, when an error is admitted, they look at the system to uncover the ‘tripwire’ that led to the error and “we then try to engineer the tripwire out of the system and replace it with a safety net”. Crew resource management covers communication, leadership, situational awareness and workload management. “Just Culture. Systems. Crew Resource Management. That’s our basic three-stage system,” said Captain Downey, “and that’s the system that we are trying to get across to you guys. You can’t just transplant it in, but the underlying DNA is good. We can genetically engineer it for your environment.”

That process of “genetically engineering” aviation’s three-stage system for a healthcare setting is something that Captain Graham Shaw, a Senior First Officer Training Pilot for British Airways, and Captain Chris Holden, a flight instructor with British Airways, have undertaken within the NHS. Captain Holden looked back to the early days of NASA, when it was found that high-performing individuals did not work well together in teams. The response to that problem has evolved into what is today termed an ‘integrated competency-based structure’, in which there is no separation between technical and non-technical competencies.

“It is one skill set,” said Captain Holden. “You can use the competencies on a personal level, see your own strengths, and apply them to a team. There are technical skills – clinical knowledge and procedural conduct – and social skills – professionalism, communication, leadership, and teamwork. They should be evidence-based and observable. You can also track data. It’s about creating a bespoke version of competence for your own healthcare environment, but in principle they are broadly similar to those of any high-performing team.”

Captain Shaw said that the process of embedding this system in healthcare can face barriers. “Some are systemic, some are down to the individual,” he said. Systemic barriers include regulation, a lack of ring-fenced funding to support training, a perceived lack of relevance, the absence of an open culture, and a belief that use of the World Health Organisation’s Surgical Safety Checklist is sufficient. Individual barriers include a lack of clarity on how to implement the system, and a lack of training.

“That’s where we come in; to help people recognise great behaviours, get teams to work together so that those behaviours spread throughout the organisation,” said Captain Shaw. “We can’t fix all those [barriers] while on the day job, but we can give ourselves the skills and knowledge to understand problems, to build an effective and empowered team, with everyone in the room working together to support each other, to use human factors as a final layer of defence when other protection layers in the system fail.

“Systemic shortcomings don’t prevent us from making the most of the resources we do have. Human Factors are key parts of our skill set; we must nurture them as much as our technical skills. They form a final layer of defence. Can we come together as a team and make the most of what we have? Human Factors won’t embed through stand-alone training. Integrated practical team training in the workplace is needed.

“Protect the patient, look after each other,” said Captain Shaw. “It’s the fundamental point.”


Resources

Brennan PA, Davidson M. Improving patient safety: we need to reduce hierarchy and empower junior doctors to speak up. BMJ. 2019; 366: l4461

Brennan PA, Oeppen R, Knighton J, Davidson M. Looking after ourselves at work: the importance of being hydrated and fed. BMJ. 2019; 364: l528
