Healthcare, harm and human factors.

Dr Jenny Porter

The assignment of blame to the human operator, or ‘human in the system’, most proximal to a catastrophic event occurs routinely after adverse events in healthcare. These quick judgements can impede a complete assessment of the events themselves and of the series of minor deviations from safe practice that frequently culminate in the adverse event1. Investigations of so-called ‘high-consequence’ accidents in non-healthcare industries such as commercial aviation have shown how interactions between humans and complex technology at many points in the design, maintenance and operation of systems can collectively generate an accident. The most obvious failures are those that immediately precede the event: usually unintentional errors, and occasionally deliberate deviations (violations), sometimes made in an attempt to mitigate harm. We now understand that human failure outstrips hardware or software failure2. Human factors engineering offers at least a partial solution: the design of processes, equipment and the environment to optimise human function and performance3.
In healthcare, however, current estimates of iatrogenic injury to patients remain substantially higher than the harm attributable to motor vehicle accidents in the USA, UK and Australia4. Commercial aviation has been held up as the ‘gold standard’ of safety management for healthcare4. Accidents in commercial aviation are extraordinarily expensive, and there are strong financial incentives to avoid them. Fatal accident rates in commercial aviation are 25-fold lower than anaesthesia-related mortality, itself less than 1 per 100,000 healthy patients undergoing routine surgery5. In contrast, poor outcomes due to suboptimal care in healthcare are less costly than in aviation and rarely harm those responsible for the patient's care. Attitudes to safety are notably positive in aviation: clear standards and rigorous requirements are the norm2.
Pilots and engineers hold licences that are specific and limited, and aircraft undergo rigorous certification. The culture facilitates learning from catastrophe4, and the industry has thus succeeded in reducing blame at the ‘sharp end’ of adverse events. A number of aspects of aviation safety management have been successfully introduced into medical practice, but there are obvious limits to this extrapolation6. It is widely acknowledged that managing an emergency invasive procedure in a sick patient is a poor analogy for flying an aircraft that has passed every safety check, with the option of rescheduling should inclement weather prevail6,7. Human factors training, simulation and checklists have been successfully imported into medical training. Checklists, however, do not replace briefings, which in aviation are undertaken routinely and regularly before and during a flight. The surgical safety checklist often excludes meaningful dialogue about ‘what ifs’ between team members6. The checklist is rarely challenged, and the prevailing authority gradient discourages speaking up6. In contrast, crew resource management in aviation has been used successfully to address not only the so-called ‘cockpit gradient’ but also communication, leadership, interpersonal skills such as conflict management, vigilance and crisis preparation. Another principle of aviation safety management is that design ‘trumps’ training: user-centred design of devices and processes is deemed superior in reducing error. The alternative, training operators to avoid the pitfalls of a poorly designed system, is evident in the ‘workarounds’ prevalent in healthcare3. The aviation industry conducts annual assessment of performance, whereas competence assessment has not yet become the norm for clinicians in Ireland. In addition, pilots' working hours are strictly controlled, unlike those of senior medical staff across Europe and the USA7.
Whilst the issue of fatigue is acknowledged in healthcare, it has not been adequately addressed to date. A safety culture is essential for safety management systems to survive and develop; Reason describes the attributes of such a culture as ‘informed, wary, just, flexible, learning’8. Healthcare safety culture has evolved beyond the pathological stage, but the lack of robust, systematic risk management implies that we still have a culture that responds to high-profile events (e.g. Bristol, Winnipeg) with repair but without systematic improvement9. The area of incident, adverse event and near-miss reporting merits comparison between the aviation industry and healthcare. In aviation, confidential, non-punitive reporting has replaced anonymous reporting as the means of identifying occurrences of poor performance. Confidential crew surveys examine teamwork, leadership, error and speaking up10. Unlike in healthcare, expert observers in the cockpit during normal flights record the management of both safety threats and errors. Incident reports in aviation are brief and serve as a trigger for wider investigation and action; they are focused, specific and harnessed to a process that increases risk awareness. Healthcare's adoption of incident reporting illustrates the pitfall of importing a practice from another industry in its most simplified form, without considering its practical application to safety improvement. In healthcare, incident investigation lacks established methodology and attitudes remain entrenched in a blame culture characteristic of both pathological and reactive cultures11,12,13. The focus is on data collection: all incidents are reported, and increased reporting rates are celebrated, indeed flagged as an indicator of safety performance. Reporting in healthcare often appears to be a one-way process.
The challenge of developing effective patient safety strategies in healthcare is summarised by Jon Lloyd, MRSA prevention co-ordinator at the Centers for Disease Control and Prevention, who concludes that healthcare professionals must ultimately be enabled to define every dimension of a specific problem, to select a specific key strategy and to apply the ‘appropriate socio-technical solution’14. Clearly, change is needed to attain this ambitious endpoint. In the operating room, patient safety depends on effective working relationships between all members of the perioperative team and an awareness of roles and limitations; these can be enhanced using in situ simulation and non-technical skills training. Team briefings before commencing a list or a difficult case, in which information, concerns and intended strategies are shared, are another effective measure. The findings of a review of the twelve emergency front of neck access (FONA) cases in the NAP4 report, which identified team process failure in every case, substantiate the crucial role of teamwork in patient safety15. Finally, the importance of customising system design has been underlined by a 2018 white paper from the Chartered Institute of Ergonomics and Human Factors in the UK, stating that each NHS organisation should have access to human factors expertise3. This expertise needs to be supplemented by knowledge of psychology and design in order to embed safety science in healthcare, in line with every other safety-critical industry.

References:
1. Vincent C. Framework for analysing risk and safety in clinical medicine. British Medical Journal 1998; 316(7138): 1154-1157.
2. Carlisle J. ‘Humanware’: the human in the system. Anaesthesia 2019; 74: 965-968.
3. Marshall S. Human factors and the safety of surgical and anaesthetic care. Anaesthesia 2020; 75(Suppl 1): e34-e38.
4. Hudson P. Applying the lessons of high-risk industries to health care. Quality and Safety in Health Care 2003; 12(Suppl 1): i7-i12.
5. Gaba D. Out of this nettle, danger, we pluck this flower, safety. Healthcare versus aviation and other high-hazard industries. Simulation in Healthcare 2007; 2(4): 213-217.
6. Rogers J. Have we gone too far in translating ideas from aviation to patient safety? Yes. British Medical Journal 2011; 342: c7309.
7. Gaba D. Have we gone too far in translating ideas from aviation to patient safety? No. British Medical Journal 2011; 342: c7310.
8. Reason J. Human Error. New York, NY: Cambridge University Press; 1990.
9. Boysen PG. Just culture: a foundation for balanced accountability and patient safety. The Ochsner Journal 2013; 13: 400-406.
10. Helmreich RL. On error management: lessons from aviation. British Medical Journal 2000; 320: 781-785.
11. National Academies of Sciences, Engineering and Medicine. Designing Safety Regulations for High-Hazard Industries. Washington, DC: The National Academies Press; 2018.
12. Macrae C. The problem with incident reporting. BMJ Quality and Safety 2016; 25: 71-75.
13. Andersen J. What are the challenges for healthcare in learning from other industries? Forum 2010: 4-6. In: Beyond Traditional Patient Safety Tools and Techniques. CRICO RMF.
14. Lloyd J. How can competing patient safety improvement strategies be harnessed? Forum 2010: 9-13. In: Beyond Traditional Patient Safety Tools and Techniques. CRICO RMF.
15. Flin R. Human factors in the development of complications of airway management: preliminary evaluation of an interview tool. Anaesthesia 2013; 68: 817-825.