This is the first blog post here. I will bring up subjects that often crop up in discussion and offer them from my perspective for a light airing.
If you have ideas, or would like to write a blog post for us, please don’t hesitate to contact us.
To offer a short discussion of Situation Awareness: from the get-go, I cannot mention S.A. without a deferential nod to Mica Endsley. If you want to know more about her work, this is a good place to start:
Here Mica describes Situation Awareness as “the perception of elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future.” (Endsley, 1988).
In our early days at Trent Simulation & Clinical Skills Centre we would earnestly consider how to design and deliver simulations that enabled learning and assessment of the factors described in publications such as Anaesthetists’ Non-Technical Skills (ANTS) (view the Handbook).
Here Situation Awareness is described in three components that reflect Endsley: Gathering information, Recognising and Understanding, and Anticipating. The handbook describes Behavioural Markers that may demonstrate good practice. This caused us some angst, as the observable behaviours are often surrogates for unobservable, nebulous concepts such as “Understanding”, and they frequently rely on observable communication. This raised the question of whether, given that we frequently exhort simulation participants to ‘think aloud’ and so on, what we were actually measuring was firstly a surrogate and secondly a simulation artefact. Were we helping practitioners to become good at simulation rather than to carry behaviours for improved situation awareness back to the workplace?
The Latin root of some of our words about thinking, pensare, is a frequentative of pendere, meaning to weigh or hang, so essentially those ancient Romans understood that thinking is repeatedly weighing things up. There are, or have been, several PDSA-type loops formulated along these lines: OODA (Observe, Orient, Decide, Act); DODAR (Diagnose, Options, Decide, Act, Review); FORDEC (Facts, Options, Risks, Decide, Execute, Check). (“When you mess…” one of my anaesthetist colleagues would remind our final-year medical students, “…reassess.”) Technology offers ever more refined data delivery to support such analysis and inform S.A., but what about the (cough) human factor?
The AHRQ Teamwork Behaviours and Training System “TeamSTEPPS” discusses Situation Awareness as “the state of ‘knowing what’s going on around you’”, which is achieved by team members engaging in ‘Situation Monitoring’ and the continuous development of a ‘Shared Mental Model’. What I like about this is that S.A. is identified as an outcome and Situation Monitoring is identified as a skill, and so presumably can be trained (and assessed). This, though, begs the question of whether a team member who has the available capacity and capability to thoroughly monitor the current situation can or will effectively deliver that information to the individual attempting to form an accurate projection of the future status: the only person in the current team able to pattern-match from available internal schema.
When we find communication to be at fault, as so frequently we do, is the failure of communication to deliver S.A. the routine reason?
Endsley reports that as much as 88% of human error is due to problems with S.A., and furthermore that the most common cause is not getting basic data, data which other team members may have easy access to. I wonder whether it is here, in the smallest nuances of human behaviour, that teamwork can flourish or flounder, and whether, given the ad hoc nature of many clinical teams and the multiple priorities and heightened emotional content that are frequent components of daily working life, clinical human factors can provide behavioural insights less obtainable anywhere else.