Thursday 27 June 2013

Book of the month: The Field Guide to Understanding Human Error by Sidney Dekker


The Field Guide to Understanding Human Error is a must-read for anyone interested in human factors and patient safety, and a "should-reread" for those of us who read it a while ago. I will spend the rest of this post explaining why.

Sidney Dekker is a professor in the School of Humanities at Griffith University, Australia, and has a host of publications on the subjects of human error and system failure. The thrust of Dekker's book is that there are two views on human error: the Old View (which seeks to find the person or "bad apple" at fault) and the New View (which looks at the organisation, procedures and processes within which the person is working). Dekker makes a convincing argument, both for the existence of these two views and for the need to move from the Old to the New.

Where's that "Bad Apple"?
Dekker defends the human within the complex system and repeatedly urges the incident investigator to see the events unfold as the people who were involved in the incident did. He asks us to stop wasting time on "should have" or "if only" discussions and instead focus on why the actions that were carried out must have seemed reasonable to the person(s) at the time.

The book explores the organisational causes of errors such as procedural/organisational drift, "borrowing from safety", production pressures and goal conflicts. Dekker argues that for someone to be accountable for something, they must be both responsible for it and have the requisite authority to change it without punishment. This means you cannot blame a person for a mistake if they had no power to change the environment or processes which set the scene for it.

Dekker's last chapters are a call to arms. He urges individuals to adopt the New View, safety departments to be informed and informative, and organisations to embrace the difficult, political changes required to become safer.

Dekker also provides the human factors devotee with a host of well-written sentences. The following are a few examples which cut to the heart of some of the topics in the patient safety arena:
Asking what is the cause (of an accident) is just as bizarre as asking what is the cause of not having an accident. Accidents have their basis in the real complexity of the system, not their apparent simplicity. (p. xiii)
Cause is not something you find. Cause is something you construct. (p. 76)
"Human error" problems are organizational problems, and so at least as complex as your organization. (p. xiv)
If you conclude "human error", you may as well not have spent money on the investigation. (p. 6)
If your explanation (of an accident) still relies on unmotivated people, you have more work to do. (p. 12)
A safety culture is a culture that allows the boss to hear bad news. (p. 171)

However, at least one of the quotes is probably best not used:
Reprimanding "Bad Apples" is like peeing in your pants. You feel warm and relieved first, but soon you look like a fool. (p. 9)

Although the word "simulation" is not in the index, this book provides a lot of advice which can be applied to simulation-based education. In Chapter 11 (Human Factors Data), Dekker suggests questions that could be used in the debrief after an accident. Some of these questions may be useful in a simulated scenario debrief, such as:
  • What were you focusing on?
  • What were you expecting to happen?
  • If you had to describe the situation to your colleague at that point, what would you have told (them)?
Dekker also provides examples of a number of error mechanisms, which may be useful when debriefing:
  • Cognitive fixation
  • Plan continuation
  • Stress
  • Fatigue
  • Buggy or inert knowledge
  • New technology and computerization
  • Automation surprises
  • Procedural adaptations
In the Preface, Dekker tells you both what each chapter is about and how it fits in with the whole book. The chapters cover such diverse topics as hindsight bias, root cause analysis and how safety departments should function. They are well written, use illustrations where appropriate, and define words which may be new to the reader, such as "counterfactual", with examples. Although the topics are diverse, Dekker uses each chapter to advance his argument in a coherent, stepwise fashion with real-life examples. This makes the book extremely easy to read.

Chapters 1 through 8 look at the two Views mentioned above, explaining how the Old View is recognisable in language and inquiries and why Old View thinking will not improve safety. Chapters 9 through 16 explain how to analyse and report an incident using New View methods such as a high-resolution timeline. Chapters 17 through 20 explore the organisational changes required to adopt the New View. Chapter 21 provides a useful bullet-point summary of the book's main themes.

There is very little to fault in this book. There are some small typographical errors (e.g. on p. 110, when explaining transcription codes and notations), and a few of the chapters could perhaps have been fleshed out (some are only six pages long) or amalgamated with others.

Picking up "The Field Guide..." to write this review was like sitting down with an old friend for a cup of tea and a chat. As with the old friend, there is common ground to cover and new thoughts and ideas to share. This book has something for everybody, from "sharp end" worker to chief executive. Buy or borrow a copy; you will thank me for it.

Monday 24 June 2013

The safest operating theatre in the world

Pictured on the right is the safest operating theatre in the world. You, dear reader, may have thought that your hospital has the safest theatre (or OR, in the US) and may therefore be dismayed at this turn of events. Fear not. You can also lay claim to this accolade, on one condition: you need to find an empty theatre.

In an empty theatre no patient is being harmed. No errors are being made. Nobody at the sharp end is making holes in the cheese of harm prevention. In the patient-less theatre nobody is sitting in the hot seat, frantically flicking through The Field Guide to Understanding Human Error while hoping for a heroic recovery to restore patient safety.

This is a fundamental aspect of patient safety in surgery; if we carried out no surgery, no patient would come to harm due to surgery. As we all appreciate, this is not going to happen in the foreseeable future. We will continue to operate and patients will continue to be harmed due to human error in its various shapes and guises.

We must then look to reducing or minimising harm, bringing the risk of harm down to as low as reasonably practicable (ALARP). We could classify the modifiers of this risk as internal or external (to a person). Internal modifiers might include physical or mental fatigue, knowledge, skill and expertise. External modifiers might include the team around you, the leadership of your team/department/organisation and the equipment available. Both internal and external modifiers are affected by the culture of the organisation in which we work. Although there are signs of change, much of the past and current human factors/patient safety work has combined exhortations from government and the upper echelons of the health service to "Be safer!" with grassroots work to find out what can and needs to be done. The latter is exemplified, in the UK, by the Clinical Human Factors Group (CHFG), led by Martin Bromiley.

In May 2013 Andrew Hughes tweeted:
Culture is behaviour over time but the support, fostering and rewarding of that behaviour is a top-down activity.

The first two sentences of the guidance explaining the UK General Medical Council's (GMC) expectations of all registered doctors are:
Patients need good doctors. Good doctors make the care of their patients their first concern.
Providing the environment that makes the care of patients our first concern is, in many respects, the responsibility of the organisation in which we work. Unfortunately, the organisation inevitably has to balance conflicting demands such as financial and waiting-time targets. Although these targets, such as length of wait in A&E, may contribute to patient care, they are also likely to reduce patient safety. As an example, imagine a patient on the operating list who is about to breach their "weeks before surgery" target while the list is running late. The safest thing for this patient may be to cancel and reschedule the operation. The target, however, puts pressure on the theatre staff, who may be fatigued at the end of a long day, to carry on with the surgery for this patient.

What needs to change? There are a number of measures that may be undertaken to promote patient safety: "human factors" workshops for all (senior) medical managers and healthcare personnel; face-to-face dialogue between management and the people at the sharp end; and, lastly, unfailing perseverance by those who have embraced the patient safety culture to continue to "be the change you wish to see".

Post scriptum: You may be wondering why I didn't put in a picture of our theatre simulation suite and argue that this was the safest operating theatre in the world. In a future post I will discuss the simulation suite and the issue of safety for patients and for participants. In a nutshell, in the wrong hands the simulation suite may become an unsafe environment for both of these groups.