Thursday 19 December 2013

SimTechDay 2013: Backseat drivers take centre stage by Scott Rudnicki-Bayne

Simulation technicians: we take care of the equipment, set up scenarios and make the “magic” work. We’re confederates and actors. We keep scenarios on track; we’re the guys and gals who ride in the backseat of the simulation education wagon. The inaugural Scottish Clinical Simulation Centre (SCSC) SimTechDay on 27 November 2013 was our day to take centre stage. Techs and other interested parties from all over Scotland descended on Forth Valley Royal Hospital (FVRH) in Larbert to network, share ideas and hear from our guest presenters.

Guest presenters:

  • Ian White, a Laerdal Field Service Engineer, spoke about Laerdal’s commitment to improving after-sales and technical support, and shared a few tips on maintaining the appearance of SimMan.
  • Kevin Stirling, Lecturer in Simulation at the University of Dundee and Finance Director of ASPiH, talked about the role of ASPiH, how Scottish SimTech could integrate this meeting into the national network of technicians, and how ASPiH might be able to support this.
  • Andrew Middleton, of Scotia UK, presented “Effective Video and Audio Recording for Simulation and Debrief” and showed us some new AV kit.
  • Nick Gosling, from St George’s Advanced Patient Simulator Centre, talked about mobile simulation strategies, giving us a lot to consider when taking our simulated patients out for “in-situ” or “mobile” scenarios.
  • Sarah of Sarah’s Scars showed us how to create realistic-looking burns and open wounds with stage make-up, which will undoubtedly help increase the fidelity of our scenarios.
Presenters were available throughout the day to speak with us about any specific issues we had or developments we’d like to see.

What went well?

Don't do this at home
On reflection, some aspects worked out really well. The support from presenters, attendees, sponsors and the SCSC team was fantastic. The ability to network and discuss issues with people in similar roles was both rewarding and motivating. Every presentation offered something for everyone: Ian White showed how talcum powder can improve the feel of SimMan’s arms and minimise adhesive residue; Kevin Stirling initiated debate on a Tech Room at ASPiH; Andrew Middleton provided individual support on SMOTs; Nick Gosling shared his huge wealth of knowledge and experience in simulation; and Sarah made moulage look simple.

What were the challenges?

It's just a flesh wound.
There were also areas to improve upon, both in organising and in hosting an event like this. The programme was overpopulated for the time available, which meant some presentations were a bit rushed. Next time I’d provide time for the presentations rather than squeezing presentations into the time. I’d also consider using more rooms, for two reasons. Firstly, due to the overwhelming interest we sadly had to turn away some people who wished to attend. Secondly, the variety of techs (some from universities, some from healthcare, some new to the role, some involved for years) meant widely varying levels of knowledge and experience. A larger number of attendees might give us the opportunity to run parallel sessions tailored to these distinct groups, where appropriate. I will also work on my skills as chair, e.g. formally introducing presenters and verbalising appreciation of their time and efforts more clearly.

Final thoughts

All in all, a great day, focusing on some of the issues in our wide and varied job descriptions. A big thank you to all who attended, presented and helped make this event the success that it was. The feedback from attendees was excellent, and planning has already started for SimTechDay 2014.

Scott Rudnicki-Bayne (SimTech, Scottish Clinical Simulation Centre, Larbert)
@5imTech1

Monday 16 December 2013

It's not my fault, it's the drifting, complex system.


One explanation people give for not embracing a systems-based approach to incident investigation is that it allows the "bad" individual to escape punishment.

In their book "Crisis Management in Acute Care Settings", St Pierre, Hofinger, Buerschaper and Simon state:
"Misapplication of (Reason's) Swiss-cheese model can shift the blame backward from a 'blame-the-person' culture to a 'blame the system' culture. In its extremes, actors at the sharp end may be exculpated from responsibility"
The concern is that some individuals (Robert Wachter labels them the "incompetent, intoxicated or habitually careless clinicians or those unwilling to follow reasonable safety rules and standards") will not be held accountable. These deviants will blame "the system" for not stopping them earlier, for not making it clear enough what constituted a violation, or for not training them better. Wachter's 2012 paper "Personal accountability in healthcare: searching for the right balance" argues that lines must be drawn to distinguish simple human mistakes from sub-standard performance.

In his book "Managing the risks of organisational accidents" James Reason also calls for the drawing of a line "between acceptable and unacceptable behaviour" and calls for the replacement of a "no-blame culture" with a "just culture".

Drawing the line

A number of people/organisations have proposed "line-drawing" mechanisms:

  • David Marx, who says we must differentiate between "human error" (a slip or lapse), "at-risk behaviour" (wilful shortcuts) and "reckless behaviour" (taking a substantial and unjustifiable risk)
  • The English National Patient Safety Agency (NPSA): their decision tree asks us to carry out four tests (deliberate harm, incapacity, foresight, substitution); a sketch of this sequential logic appears after the figure below
  • Leonard and Frankel, who consider four questions (impairment, deliberately unsafe, substitution, prior history)
  • Reason (see figure)

James Reason's decision tree (although it looks more like a hedge)
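
To make the sequencing concrete, here is a minimal sketch (in Python) of how an NPSA-style run of the four tests might be expressed as a procedure. The function name, question wording and outcome labels are illustrative assumptions made for this post, not the NPSA's official tool, which should be consulted directly.

    # Illustrative sketch only: four sequential tests in the style of the
    # NPSA incident decision tree. The outcome labels are this post's
    # simplifications, not official NPSA guidance.

    def classify_act(deliberate_harm: bool,
                     incapacitated: bool,
                     foresaw_unjustified_risk: bool,
                     peer_would_have_done_same: bool) -> str:
        """Apply the four tests in order and return a (simplified) outcome."""
        if deliberate_harm:                # deliberate harm test
            return "individual culpability: disciplinary/legal referral"
        if incapacitated:                  # incapacity test (e.g. ill health, substances)
            return "occupational health referral"
        if foresaw_unjustified_risk:       # foresight test
            return "review individual conduct (at-risk or reckless behaviour)"
        if peer_would_have_done_same:      # substitution test
            return "system problem: fix the system, support the individual"
        return "possible training/supervision deficiency: support and re-train"

    # Example: an error that a comparable peer would also have made
    print(classify_act(False, False, False, True))
    # -> system problem: fix the system, support the individual

The point of the sketch is simply that the tests are applied in order, and that "blame" is only one of several possible outcomes.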

Not "Where?" but "Who?"

In his book "Just Culture", Sidney Dekker argues very convincingly that the line is arbitrary and the definitions fuzzy. Without being able to travel back in time and read the mind of the individual at the sharp end of the error, how can we be 100% sure whether an act was intended? Instead, Dekker suggests the debate should centre on who gets to draw the line(s): peers, regulators or prosecutors.

Abolishing retrospective blame

One idea that may inform the debate is that it may not be sensible, effective or just to try to hold "someone" accountable in retrospect, once an error has been committed and a patient harmed. Errors discovered during an adverse event analysis should therefore not be used to "blame" individuals. As a corollary, however, it may be appropriate to hold people accountable for their current and future behaviour. It is therefore just as important to have systems in place to check and enforce correct behaviours as it is to be able to analyse past events.

As an example, consider the surgical pause described in the WHO surgical safety checklist. It makes little sense to blame a person or a team, in retrospect, for not completing the pause correctly once an error has been reported. It is much better to build an active safety culture which picks up that the pause is not being done correctly before a patient is harmed. It is this proactive approach to safety which is still missing in many places.
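
As a purely hypothetical illustration of what "proactive" might look like, the sketch below (in Python; the class name, data model and 95% threshold are invented for this post) monitors reported surgical-pause completions and flags drifting compliance before any adverse event triggers a retrospective hunt for someone to blame.

    # Hypothetical sketch: proactive monitoring of surgical-pause compliance.
    # All names and the 95% threshold are illustrative inventions.

    from collections import deque

    class PauseComplianceMonitor:
        """Track the most recent `window` cases and flag falling compliance."""

        def __init__(self, window: int = 50, threshold: float = 0.95):
            self.results = deque(maxlen=window)  # True = pause done correctly
            self.threshold = threshold

        def record_case(self, pause_completed: bool) -> None:
            self.results.append(pause_completed)

        def compliance_rate(self) -> float:
            return sum(self.results) / len(self.results) if self.results else 1.0

        def needs_attention(self) -> bool:
            # Flag drift before a patient is harmed, rather than
            # apportioning blame after an adverse event.
            return self.compliance_rate() < self.threshold

    monitor = PauseComplianceMonitor()
    for completed in [True] * 40 + [False] * 10:
        monitor.record_case(completed)
    if monitor.needs_attention():
        print(f"Pause compliance at {monitor.compliance_rate():.0%}: review practice now")

The design choice worth noting is that the monitor looks at behaviour in aggregate and in advance, which is exactly the shift from retrospective blame to prospective accountability argued for above.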

This post concludes with Don Berwick's thoughts:

"Blame and punishment have no productive role in the scientifically proper pursuit of safety."

References:


  • Berwick D. Quoted at: http://www.england.nhs.uk/2013/12/12/never-events-news/
  • Dekker S. Just Culture: Balancing Safety and Accountability. Farnham, UK: Ashgate Publishing Ltd; 2007.
  • Leonard MW, Frankel A. The path to safe and reliable healthcare. Patient Educ Couns 2010;80:288–92.
  • NPSA incident decision tree: http://www.ahrq.gov/professionals/quality-patient-safety/patient-safety-resources/resources/advances-in-patient-safety/vol4/Meadows.pdf
  • Reason JT. Managing the Risks of Organisational Accidents. Aldershot, UK: Ashgate Publishing Co; 1997. (Decision tree from: http://www.coloradofirecamp.com/just-culture/definitions-principles.htm)
  • Wachter RM. Personal accountability in healthcare: searching for the right balance. BMJ Qual Saf 2012;0:1–5.
  • Wachter RM, Pronovost PJ. Balancing "no blame" with accountability in patient safety. N Engl J Med 2009;361:1401–6.

Thursday 12 December 2013

Book of the month: "Just Culture: Balancing Safety and Accountability" by Sidney Dekker (1st edition)

This is the second book by Sidney Dekker to be reviewed on this blog (the first was his earlier book, "The Field Guide to Understanding Human Error"), a testament perhaps to both Dekker's readability and his relevance to the patient safety movement.

About the author

Sidney Dekker is a Professor in the School of Humanities at Griffith University in Brisbane, Australia. He has published a raft of books on the topics of safety and failure. Dekker is also a pilot and therefore brings practical experience of the workings of a high-reliability industry to his writing.


Who should read this book?

"Just Culture" is aimed at anybody with an interest in how to bring about the conditions required to make the title of the book a reality, from individual practitioners to hospital managers to legislators. In terms of simulation centres, it will inform both your day-to-day debriefing skills, as well as your response to requests for the assessment of "bad" practitioners.


I haven't got time to read 149 pages… (but can manage 149 words)

Dekker's main argument is as follows:
  1. 'Everybody' agrees that the incompetent or reckless individual should be held accountable.
  2. These individuals form a minority; the majority of errors are made by well-meaning practitioners who simply need re-training or support.
  3. Unfortunately, the decision as to who is incompetent is:
    1. Usually made "after the fact", when a number of biases may make it very difficult to judge the intent of the individual.
    2. Fraught with adverse consequences: blame and potential criminal/civil legal proceedings may reduce patient safety, as adverse events become less frequently reported through fear of a similar fate.
  4. To achieve a just culture one must answer three questions:
    1. Who gets to draw the line?
    2. What is the role of domain expertise in drawing the line?
    3. How protected are safety data from the judiciary?

What's good about this book

Dekker uses some very powerful true stories to illustrate the tension between safety and accountability. Primarily these are stories of individuals who were scapegoated for systemic failings: a nurse convicted of wrongly preparing a drug infusion, a captain accused of putting his passengers at risk, a police officer who shot an unarmed man.

Dekker discusses how the legal system, which is meant to provide "justice", is very poor at grasping the complexities of individual cases. The atmosphere of a courtroom, several years after a lethal error, is very different from that of a busy intensive care unit, and so it may be impossible for the person "at the sharp end" to convey the multifactorial causes of an error.

Dekker also confirms something that many suspected: even in aviation (a high-reliability industry where error reporting is the norm) it is not unusual for senior pilots to withhold information if they think they can "get away with it". The reason? According to Dekker's source:
"Because you get into trouble too easily. The airline can give me no assurance that information will be safe from the prosecutor or anybody else. So I simply don't trust them with it. Just ask my colleagues. They will tell you the same thing."
Dekker provides useful definitions of reporting (informing supervisors, etc.) and disclosure (informing clients, patients, etc.) and explains why both are important. He also discusses how errors are often sub-divided (after the fact) into technical errors (honest mistakes made while learning a task) and normative errors (mistakes made by people failing to fulfil their role obligations).

Lastly, Dekker provides us with a step-wise approach to developing a safety culture and encourages us to start "at home, in your own organisation".


What's bad about this book

Dekker uses a number of examples showing how things go wrong but the book is very sparse on incidents where a "just culture" worked. It would have been useful to see some practical examples of just cultures in healthcare.

Final thoughts

Dekker does not pretend that realising a just culture will be easy or quick. However, he does make a good argument for aiming for one: not only will it be "just" and safer, it is also likely to be good for morale, job satisfaction and commitment to the organisation (p. 25).