Protect the patient, look after each other
At a conference on ‘Human Factors’ towards the end of last year, Niall Downey, a pilot with Aer Lingus, showed delegates a photograph of the lever which controls an aircraft’s undercarriage. The handle is wheel-shaped. To raise the undercarriage, you move the lever up towards the word ‘Up’. To lower the undercarriage, you move the lever down towards the word ‘Down’. Captain Downey’s dry sense of humour was on display throughout his presentation. “We assume passengers are prone to errors too,” he said, showing a picture of the approach to the passenger exit at Belfast City Airport where, on the floor, are printed the words: “Have you collected your luggage?”
You may well laugh.
However, the apparently absurd simplicity of a control designed to be operated not by stupid people, but by highly qualified, highly skilled personnel has its roots in a series of crashes that blighted the US Air Force during World War Two, when B-17 Flying Fortress pilots would, for no apparent reason, land their planes without lowering the undercarriage or, worse, pitch their craft into the ground, killing all on board. At the end of the war, the Air Force assigned a psychologist to investigate.
As an article in Wired magazine described recently, when he began looking at the aircraft, talking to pilots, and sitting in the cockpit, he did not see “pilot error”; he saw “design error”. Many of the critical controls felt exactly the same to the pilot’s hand. The psychologist subsequently created a system of distinctively shaped knobs and levers that made it easy to distinguish all the controls of the plane merely by feel, so that there was no chance of confusion even when flying in the dark.
“By law, that ingenious bit of design – known as shape coding – still governs landing gear and wing flaps in every airplane today,” noted the article. “You couldn’t assume humans to be perfectly rational sponges for training. You had to take them as they were: distracted, confused, irrational under duress. Only by imagining them at their most limited could you design machines that wouldn’t fail them.”
You would think that this approach to design and function would pervade high-risk environments today. Captain Downey went on to show a series of shocking photographs, including identically designed labels for diamorphine hydrochloride (except that one dose was 30mg, the other 5mg) and a children’s cough syrup bottle with virtually the same look as a bottle of hydrogen peroxide.
His exhortation to practitioners attending the Human Factors conference hosted by the Royal College of Physicians and Surgeons of Glasgow (Read more in our article on Human Factors) was to approach every patient’s treatment with two questions in mind: “Where could this go wrong?” and “What’s Plan B?” Captain Downey also urged them to consider: “What’s in it for me?” What’s in it for practitioners is that by redesigning systems to benefit themselves, they can reduce the mistakes which harm their patients and result in professional censure and financial penalty.
Also in this edition is a feature on the Safety Climate Survey which, in 2017, was piloted in 14 dental practices in three NHS boards and is now available to every dental team in Scotland. Healthcare Improvement Scotland (HIS) is working closely with the Chief Dental Officer to support quality improvement activity in General Dental Services.
“Undertaking the survey may seem daunting, but it only takes 15 minutes and I found it to be a valuable exercise,” commented Irene Black, Clinical Lead for Dentistry at HIS (Read more in our article on Safety). “It enabled us to learn more about the dynamics of our team and it provided the opportunity to discuss and share different perspectives.”
Team working. Effective communication. Reducing hierarchy. Workload management. It’s what practitioners can learn from pilots. As another speaker put it: “Protect the patient, look after each other.”