Monday 21 March 2022

We need to talk about SBAR

You can’t have a discussion about communication without stumbling over the mnemonic SBAR. Transplanted from the US Navy, it is the most common handover tool mentioned in the secondary care debriefs I’m involved with.

From “Online library of Quality, Service Improvement and Redesign tools” 

There are benefits to using some sort of mnemonic that all parties involved in a communication are familiar with. It reduces unnecessary questions, structures the information and suggests a minimum dataset that the caller should be able to relay.

There are significant downsides to SBAR.

First, nobody actually uses it (or if they do, they don’t use it in simulation). Everybody talks about it, everybody refers to it, people say that’s what they use, but they don’t. People use RAB, ABR, ABS and all sorts of other methods for conveying information.

Second, it’s not what the receiver wants to hear. In particular, the receiver does not want to wait however many minutes it takes to get to the R. The receiver wants to hear the R up front because then they know if they need to get out of bed, get someone else to take over the patient they’re dealing with, sit back to listen to the rest of the story to provide advice, etc.

Third, the SBAR does not confirm accurate receipt of the information. (In the picture above this is suggested at the bottom of the SBAR tool).

Fourth, SBAR has become so ingrained within healthcare that it will be difficult to replace, even though it is a poor cognitive aid.

There is a better mnemonic: ISOBAR.

From Porteous et al., “iSoBAR — a concept and handover checklist: the National Clinical Handover Initiative”

ISOBAR includes identifying yourself and the patient, and a read back at the end. (The A has been changed from Assessment to Agreed Plan.) Now, if we just make it that little bit better by adding another R (for “Reason for calling” perhaps) and taking away the B, we would get IRSOAR (eye-arr-soar). 


It’s not quite as catchy as SBAR. It just about squeezes into the “magic number 7” rule. But if it could overcome those barriers it would be a tool that might actually be used, and useful.



Tuesday 19 June 2018

The A word


In a recent blog post Paul Phrampus argued that we should not shy away from the word “assessment” in simulation. He states: “...every simulation is an assessment!”

Words shape our world. We communicate through the words we use, to relay information and influence people. The words we use also tell listeners about us, think of “economic migrants”, “illegal aliens” and “hostile environment”. Lastly, our vocabulary has a direct effect on our thinking; if we don’t know the words and their definitions it is difficult to think rationally about a subject.

Assessment

The word evokes feelings of stress. It is synonymous with judgment, passing and failing, a dispassionate observer providing an objective grade based on performance.
Some try to soften the word (formative assessment) or its synonyms (good judgment). Yet the people being assessed are unlikely to be reassured.

I would like to offer an alternative: “analysis”. Why not stop assessing people’s performance and start analysing it? The word evokes less stress and does not suggest judgment, but rather a review of what happened. For those who like modifiers, perhaps “gap analysis” would work: the observer’s role is to look for the gaps in performance.

If we are analysing we are not assessing. We can be open and clear about what the goals of the scenario and debrief are. And we can remind ourselves that, in the end, it does not matter what we think the performance gaps were. What matters is that we have, through analysis and conversation, facilitated the realisation of these gaps in our learners. Moving from assessment to analysis may also help with another common problem that Paul has identified.

Monday 30 April 2018

Book of the month: Human Factors & Ergonomics in Practice (by Steven Shorrock & Claire Williams (eds))


(Conflict of interest: I have had a number of chats with Steven Shorrock, as well as email and Twitter correspondence, and ran a 1-day Human Factors for Surgeons course with him. I have tried to give an objective review.)

About the editors
Steven Shorrock BSc, MSc, PhD (@StevenShorrock) is a chartered ergonomist and human factors specialist and a chartered psychologist. He is the European safety culture program leader at EUROCONTROL and adjunct senior lecturer at the School of Aviation, University of New South Wales, Sydney, Australia.

Claire Williams BSc, MSc, PhD (@claire_dr) is a chartered ergonomist and human factors specialist. She is a senior HF/E consultant at Human Applications and visiting research fellow in HF and behaviour change at Derby University, Derby, UK.

About the contributors

There are 45 other contributors, including internationally recognised names such as Ken Catchpole, Sidney Dekker, Erik Hollnagel and Martin Bromiley. The contributors are based in North America, Europe and Australia. As befits the title, they are mainly involved in applied, practical human factors and ergonomics.

Who should read this book?

This book should be on the bookshelf (actual or digital) of all those who are involved in HF/E work. This includes the "HF/E curious" with no formal qualifications in HF/E, experienced chartered ergonomists, and those who are purchasing the skills of HF/E practitioners. The book will also resonate with simulation-based educators through themes such as safety culture and HF/E in healthcare.

In summary

The book is divided into 4 parts (31 chapters), as well as a foreword and afterword.
Part 1, "Reflections on the Profession", considers the definition of HF/E, as well as its history and current practice. 
Part 2, "Fundamental Issues for Practitioners", looks at some of the roles HF/E specialists have to adopt and the challenges they face. These challenges include carrying out research when the employer is looking for a practical solution or when the information gathered could be sensitive or embarrassing.
Part 3, "Domain-specific issues", details the musings of specialists currently engaged in a number of different domains. These include "obvious" sectors such as aviation, oil and gas exploration, and the nuclear industry, as well as less well-known sectors such as web engineering, agriculture, and the construction and demolition industry.
Part 4, "Communicating about Human Factors and Ergonomics", explains how to engage with executives as well as those at the sharp end.

I haven't got the time to read 413 pages...

Each Part has its own summary to give you an idea of what is going to be discussed and, as should be the norm with edited books, every chapter also starts with a single-paragraph practitioner summary. You can therefore decide which chapters are most likely to be of benefit to you (although see "What's good about this book?" below).


What's good about this book?

There is a degree of soul-searching here not often seen in textbooks. For example, in Chapter 5, "Human Factors and the Ethics of Explaining Failure", van Winsen and Dekker refer to the case of Karl Lilgert. Karl was jailed after the ferry he was in charge of, the “Queen of the North”, sank in 2006. In their verdict, the Supreme Court of British Columbia stated: “Maintaining situational awareness at all times and in all circumstances is key to proper navigation.” Situational awareness (SA) is a construct which HF/E practitioners (and others) use to explain human behaviour. When terms such as SA are (mis)used by the legal profession, how much responsibility do HF/E practitioners bear? Similar arguments can be made regarding the use of “human error”. Although HF/E practitioners might know what they mean when they say “x% of accidents are due to human error”, the media and public often do not (p.87).

The book also reflects on the tension between the HF/E practitioners who work in research/academia and those who work in industry, as well as the place for those who are not formally qualified in HF/E. As with all professions, each group has different priorities, ways of working and cultures. In Chapter 1, Shorrock and Williams argue for a middle ground in which there is “collaboration among those with expertise in theory, method, and aspects of context… and those with deep expertise in their jobs, working environments, and industry” (p.14).

Hollnagel’s chapter on “The Nitty-Gritty of Human Factors” (p.45-62) is a good read. He talks about a pragmatic approach to human factors and counsels caution when using constructs such as “short term memory”. He advises us to remember that these constructs have been created to explain some observations but that they are constructs, with limitations.

Although healthcare workers might not be immediately drawn to a chapter entitled “Becoming a Human Factors/Ergonomics Practitioner” (Chapter 12), it is worth a read for those of us involved in simulation and education. It explores a number of challenges faced by those who want to certify as HF/E practitioners, as well as those who run the courses which lead to certification. In particular, there is a sense that the courses provide graduates with knowledge but perhaps not the skills required to enter the workplace. A similar problem is seen in healthcare, where nurses graduate with the skills required to do the job from the first day, whereas doctors often have a significant amount of “on the job” learning to do. This inability to perform Miller's "shows how" and "does" is something that we can use simulation-based education to address.

Healthcare practitioners and HF/E workers involved in healthcare must read Chapter 13, “Human Factors and Ergonomics Practice in Healthcare”. This details some of the issues that affect HF/E work in healthcare, including the proliferation of checklists and the “try harder” mentality. Shelly Jeffcott (@drjeffcott) and Ken Catchpole (@KenCatchpole) are rightly optimistic about the future of HF/E in the healthcare setting. Simulation-based educators will also be pleased to see reference to simulation in design and procurement (p.189).

What's bad about this book?

The authors explain why we should adopt a Safety-II approach, spend more time looking at the system than at the person, and appreciate that the system is complex and intractable. However, there is a dearth of information about what to do in practice. When an avoidable death occurs in healthcare (or another industry), there is little chance that bereaved families will be satisfied with explanations of complex systems. It would be useful for the reader of “HF/E in Practice” to be given an introduction to current HF/E methods and their uses.


Final thoughts

This is the best book I have read on Human Factors/Ergonomics. Its focus on the applied, practical aspects of HF/E makes it relevant to front-line workers as well as managers and researchers. If the General Medical Council is serious about wanting to involve HF/E professionals in its work, then council members would do well to read this book.

Sunday 22 April 2018

Making Care Better: Lessons from Space

Healthcare Improvement Scotland supports continuous improvement in health care and social care practice and this event is part of their QI Connect WebEx series connecting health and care professionals with improvement expertise from all over the world.
This event took place on 8 November at the Planetarium within the Glasgow Science Centre, with more than 120 health and social care colleagues in attendance and many more attending virtually by WebEx.


“This is always a difficult presentation for me, but it is one of hope. The hope is that the people who hear it will tell the story and spread the word. The similarities in what we did, in terms of understanding, mitigating and minimising risk: it is as much a part of your everyday job in caring for your patients as it is of mine. To me, I owe it to the next generation of people who climb into the next space craft. I don’t want them to end up in the same situation as my friends, the crew of Space Shuttle Columbia.”

Dr Nigel Packham
Born in London and now living in Houston, Texas, Dr Nigel Packham is no stranger to the world of healthcare. Both his parents were clinicians: his father a urologist and his mother an ophthalmologist; his brother is a recently retired general physician. Nigel himself works at NASA Johnson Space Center as lead for flight safety, and managed the review which led to the public release of the Columbia Crew Survival Investigation Report in 2008.

On 16 January 2003, Space Shuttle Columbia (STS-107) embarked on her 28th orbital flight, which was to be a 16-day science mission. At 81.7 seconds into the flight, a piece of foam detached from the external fuel tank and struck the left wing of Columbia, causing significant damage. Whilst in space, Columbia appeared to perform normally, and the crew of seven completed their scheduled experiments successfully and without any cause for concern.

On 1 February 2003, Columbia deorbited and reached the entry interface with the Earth’s atmosphere (around 400,000 feet in altitude) travelling at 24.5 times the speed of sound. The planned touchdown at the Kennedy Space Center, Florida, was at 14:15 GMT. At 13:58 GMT, Mission Control reported an issue with the inboard tyre pressure on the left side of the Shuttle, and by 13:59 GMT they had lost communication with the crew.


“In the space of 10 seconds we went from being in control to being out of control.”

The tragedy of the final moments before the disintegration of Space Shuttle Columbia was graphically shown from different perspectives: through those watching on the ground; through a video simulation depicting a vehicle out of control; and through the eyes of the crew, who bravely battled to regain control.

Each of these perspectives shows the same tragic events unfolding from a different viewpoint. The story of NASA’s learning from the Columbia disaster holds lessons for health and social care.



Is it safe?

The simple fact, as Nigel explains, is that space travel is not without risk and, as in health and social care, we need instead to ask the question: ‘is it safe enough?’ How we manage risk is key. We must identify and understand the likelihood of any risk and mitigate to minimise its potential impact.

But, who ultimately accepts these risks? In space travel, this would, of course, be the astronauts themselves. Within health and social care, we have a responsibility to ensure that people are supported to make an informed decision about their own care and understand the risks they are ultimately facing. The principles of ‘Realistic Medicine’ now apply, not only globally, but also universally.

The consideration and interpretation of risk changes with the accumulation of knowledge. At the outset of the Space Shuttle Programme in 1981, the risk of a disaster was estimated at between 1:1,000 and 1:10,000. By the completion of the programme in 2011, 135 flights later, modern tools had revised the estimated risk for that first flight to 1:12. New data and accumulated learning made NASA radically reassess their quantification of the risks of space travel.

So what about health and social care? How should we systematically interpret our perception of risk based on our experience of incidents, both locally and nationally? How do we share our knowledge and learning so that we can prevent further tragedies? 


“These were our friends"

Following the Columbia disaster, NASA has carefully considered its culture and leadership model. The decision to publicly share the final investigation report would, no doubt, have been a difficult one, given the sensitivities for the families and loved ones of the crew, but also for the NASA staff who were responsible for guiding the Shuttle safely back to Earth.

As part of their commitment to continuous improvement, NASA now routinely collect and share examples of real and potential adverse events at different stages: from blast-off, to orbit, re-entry and landing. Each stage is described, as well as the implications for improved and ultimately safer systems. Sharing this internally to improve their own safety procedures is one thing, but NASA goes a step further by proactively sharing with other spacefaring nations so that they too can learn and avoid making similar mistakes.

Ensuring that we too create a culture within health and social care which supports openness and learning is essential if we are to continue to make care better. The events in Mid Staffordshire NHS Trust highlighted the fundamental difference in perspectives between the Trust Board, the regulators who oversaw that Trust, the staff, and the families caring for their loved ones. The voices of the weakest, the junior doctors and the families, were not heard until it was too late. From the bedside to the boardroom, there was a deep and fundamental failure to listen and to act.


Lessons for Healthcare

There is much that we can learn from NASA and other high-risk, highly reliable organisations, specifically how they have continued to develop processes to support learning and improvement following close calls or poor outcomes. Though there are inherent differences between healthcare and space flight, it is evident that success in both fields ultimately depends on the interaction between systems, people and environment.

The key is to better understand these interactions within a complex system, and their relevance when things do go wrong. When reviews of ‘incidents’ or adverse events are performed in healthcare, there can be a disconnect between the reviewers and the individuals or teams involved in the care of the patient, including differences in understanding of the challenges faced at the various levels of the system meant to support the provision of good care. Feedback from reviews may be delayed or not shared at all. This highlights the significance of the concept of Work-As-Imagined versus Work-As-Done in healthcare, which often constrains the conduct of effective reviews. The result is a lost opportunity to understand weaknesses within the system, a possible misdirection of improvement effort, and difficulty in capturing and sharing learning.

We believe a significant opportunity already exists in healthcare to address these challenges, and we are working on optimising this process for NHS Scotland. Mortality and Morbidity (M&M) or similar peer review meetings describe a systematic approach that provides members of a healthcare team with the opportunity for timely peer review of complaints, adverse events, complications or mortality. This facilitates reflection, learning and improvement in patient care. Importantly, such peer review processes also provide the opportunity to explore and inquire into the significant majority of good practice that occurs daily in patient care.

When carried out well, structured M&Ms have advantages over other review processes: they take place as near to the event or patient experience as possible, and they help promote a culture which supports openness and learning in organisations. They provide an opportunity for teams to seek multiple perspectives and to describe and discuss the complex systems issues and interactions which may have contributed to the event; these factors can be missed when carrying out case note reviews or audits of care. M&Ms also facilitate the sharing of learning and immediate feedback, ensuring concerns are addressed at once and helping to mitigate the risk of errors recurring while a relatively lengthy review process is undertaken. This brings Work-As-Imagined and Work-As-Done closer together, and provides an opportunity for a better understanding of risks and for sharing learning from frontline to board to improve care.



Authors:
Jennifer, Nigel and Manoj
Manoj Kumar, National Clinical Lead, Scottish Mortality & Morbidity Programme, Healthcare Improvement Scotland / Consultant General Surgeon, NHS Grampian @Manoj_K_Kumar

Robbie Pearson, Chief Executive, Healthcare Improvement Scotland @rpearson1969

Jennifer Graham, Clinical Programme Manager, Healthcare Improvement Scotland @jennigraham8