ICU Management & Practice, Volume 19 - Issue 4, Winter 2019/2020

Shaping the Human Side of Medical Devices in Critical Care: The Implications of Human Factor Studies

An overview of Human Factors Engineering (HFE), a multidisciplinary science in which human behaviour, capacities, and engineering principles are used to explore why errors occur, and how the likelihood of preventable harm could be reduced.

What Do We Know About Medical Device Errors in Critical Care?


Adverse events and errors are frequent in technology-rich critical care environments, such as Intensive Care Units (ICUs). In such a clinical setting, patients are more likely to experience treatment- or procedure-related adverse events due to the complexity of their conditions, workload fluctuation and the need for urgent intervention (Garrouste-Orgeas et al. 2012). A number of studies have reviewed incidents in critical care units, including equipment failure, unplanned dislodgement or inappropriate disconnection of lines, catheters, or drains, and errors related to medication or airway complications (Valentin et al. 2006). For example, Welters et al. (2011) reviewed all critical incidents in 9 critical care units (level 2 and 3 beds) in the UK and found that 30% of all incidents (the largest group) were related to medical devices. One third of these were due to faulty equipment, followed by incorrect handling and unfamiliarity with the equipment.

Implications of Technology Development


New technology does not always enhance safety in healthcare. Some studies report a positive outcome following introduction of new technology while others indicate no such benefits (Nuckols et al. 2008; Rothschild et al. 2005) or even adverse events related to new technology (Han et al. 2005). Human factor studies have an essential role to play in understanding these issues and facilitating these innovations whilst improving their safety.

It is well recognised that many errors are caused by poorly designed systems that fail to account for human actions and needs, and for the interactions between people and the system in which they work (Garrouste-Orgeas et al. 2012; Reason 2000).

Some technological advances incorporate measures to mitigate these errors (e.g. electronic health records, computerised provider order entry systems, bar-code medication administration, smart infusion pumps) (Hassan et al. 2010). However, unexpected errors often occur when a new technology is introduced, due to a number of newly generated, and sometimes unanticipated, human-device, device-device, and human-human interactions (Garrouste-Orgeas et al. 2012).

Role of Human Factors Engineering


Human Factors Engineering (HFE) is a multidisciplinary science in which human behaviour, capacities, and engineering principles are used to explore why errors occur, and how to reduce the likelihood of preventable harm to individuals (Russ et al. 2013). Studies in HFE have demonstrated that performance, efficiency, quality, and safety are the result of the interaction between people and the system in which they work (Scanlon and Karsh 2010). It has been argued that medical experts need further assistance in the adoption of HFE methods to avoid adverse events, to deal with errors, to optimise the relationship between humans and devices in the context of use and to support human performance (Borsci et al. 2016), especially in complex environments such as ICUs. Regulatory standards (e.g. IEC 62366, Medical Devices-Application of Usability Engineering to Medical Devices) have been developed and should be widely adopted to help medical device manufacturers understand and use HFE during the development and validation of medical devices (Hegde 2013). These standards aim to reduce the occurrence of unforeseen situations and require an understanding of the complex human-device-environment interactions.

In such a complex ‘sociotechnical environment,’ errors may occur in a variety of ways. This is because operators with different skills, mental models and familiarity with existing devices are required to use new technologies whilst simultaneously adapting to a changing clinical environment. The term ‘sociotechnical systems’ (STS) has been used to pinpoint the role of choice and organisational design in the interaction between people (the social system), tools, technologies and techniques (Wilson and Sharples 2015), and in recent years has been applied to system ergonomics. This approach to the design of work systems, human task/job requirements, human-machine and human-software interfaces (Hendrick and Kleiner 2001) allows HFE to examine not only individual (i.e. micro) issues but also wider social and organisational factors (i.e. macro) issues (Wilson and Sharples 2015). Each sociotechnical context can be characterised by specific workflows, work cultures, rules and constraints of communication, and social interactions, along with a set of technologies. In these circumstances and within a clinical setting, human errors are rarely the ‘fault’ of the clinician. Rather, they emerge from the clinician’s needs and expectations while using new technologies in a particular environment and doing a particular task (for example, the technology may not be designed for the end user’s mental model of what the technology is actually doing; the environment may be inadequate or filled with interruptions; and tasks may require intense cognitive workload) (Scanlon and Karsh 2010).

Key Variables in Human Factors Engineering for Medical Devices


At the individual level, the following factors are widely investigated in device evaluation in medical practice to fully understand and/or model the device use (Borsci et al. 2016). These factors, in combination, impact upon the way in which care processes are delivered, with promising outcomes for patient safety, quality of care and improved adoption of medical devices:

  • Acceptance of the device use (Davis 1989), consisting of perceived usefulness, ease of use, and attitude towards a device;
  • Usability, defined as effectiveness, efficiency, and satisfaction of product usage in the specific context (ISO 9241-11:1998);
  • User experience, defined as a person’s perceptions and responses that result from the use or anticipated use of a product, system, or service (ISO 9241-210:2010);
  • Expectations before use of the device and the reaction of users to the device during and after use, including physiological reaction assessments (Shadbolt et al. 2015);
  • Intuitiveness of a technical system when, in the context of a certain task, the particular user is able to interact effectively, whilst not consciously using previous knowledge (Naumann et al. 2007);
  • Trust towards systems, including a set of beliefs that a person has before they use or experience a technology or system, built throughout the relationship between user and system, and dependent on the cumulative experience with a specific system (Borsci et al. 2018);
  • Assessment of the simultaneous impact of individual, organisation, tasks and technology on quality of care and patient safety, as in the System Engineering Initiative for Patient Safety (SEIPS) model (Carayon et al. 2006).
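To make one of these variables concrete: the three ISO 9241-11 usability measures listed above (effectiveness, efficiency, satisfaction) are typically operationalised as task completion rate, time on task, and a subjective rating. The sketch below is purely illustrative, assuming hypothetical session data from a device evaluation; the field names, data shape and rating scale are this author's assumptions, not part of the standard.

```python
# Illustrative sketch: summarising ISO 9241-11 usability measures from
# hypothetical device-evaluation sessions. Field names and the 1-5
# satisfaction scale are assumptions for the example only.
from statistics import mean

def usability_summary(sessions):
    """Each session is a dict with 'completed' (bool), 'time_s' (float),
    and 'satisfaction' (1-5 rating). Returns the three usability measures."""
    return {
        # Effectiveness: proportion of tasks completed successfully
        "effectiveness": mean(1.0 if s["completed"] else 0.0 for s in sessions),
        # Efficiency: mean time on task, over completed sessions only
        "efficiency_mean_time_s": mean(
            s["time_s"] for s in sessions if s["completed"]
        ),
        # Satisfaction: mean subjective rating
        "satisfaction": mean(s["satisfaction"] for s in sessions),
    }

sessions = [
    {"completed": True,  "time_s": 42.0, "satisfaction": 4},
    {"completed": True,  "time_s": 58.0, "satisfaction": 5},
    {"completed": False, "time_s": 90.0, "satisfaction": 2},
]
print(usability_summary(sessions))
```

In practice such measures are gathered per task and per user group, and interpreted against the specific context of use that the standard emphasises, rather than as absolute scores.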

Conclusion


Healthcare is a complex sociotechnical system. Healthcare innovation requires human factor engineers to help innovate safely and effectively to enable clinicians (and other users) to optimise their interactions with technology and reduce associated risks to patients.

Key Points


  • Human Factors Engineering (HFE) is a multidisciplinary science in which human behaviour, capacities, and engineering principles are used to explore why errors occur, and how to reduce the likelihood of preventable harm to individuals.
  • Medical experts need assistance in the adoption of HFE methods to avoid adverse events, to deal with errors, to optimise the relationship between humans and devices in the context of use and to support human performance.
  • Healthcare innovation requires human factor engineers to help innovate safely and effectively.


References:

Borsci, S, Buckle P, Hanna GB (2016) Why you need to include human factors in clinical and empirical studies of in vitro point of care devices? Review and future perspectives. Expert review of medical devices, 13(4):405-416.

Borsci S, Buckle P, Walne S, Salanitri D (2018) Trust and Human Factors in the Design of Healthcare Technology. In Congress of the International Ergonomics Association, 207-215. Springer, Cham.

Carayon P, Hundt AS, Karsh BT et al. (2006) Work system design for patient safety: the SEIPS model. BMJ Quality & Safety, 15(1):i50-i58.

Davis FD (1989) Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS quarterly, 319-340.

Garrouste-Orgeas M, Philippart F, Bruel C et al. (2012) Overview of medical errors and adverse events. Annals of intensive care, 2(1):2.

Han YY, Carcillo JA, Venkataraman ST et al. (2005) Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics, 116(6):1506-1512.

Hassan E, Badawi O, Weber RJ, Cohen H (2010) Using technology to prevent adverse drug events in the intensive care unit. Critical care medicine, 38:S97-S105.

Hegde V (2013) Role of human factors/usability engineering in medical device design. In 2013 Proceedings Annual Reliability and Maintainability Symposium (RAMS).

Hendrick HW, Kleiner BM (2001) Macroergonomics: An introduction to work system design. Human Factors and Ergonomics Society.

ISO 9241-11 (1998) Ergonomic requirements for office work with visual display terminals – Part 11: Guidance on usability. Brussels: CEN.

ISO 9241-210 (2010) Ergonomics of human-system interaction – part 210: human-centred design for interactive systems. Brussels: CEN.

Naumann A, Hurtienne J, Israel JH et al. (2007) Intuitive use of user interfaces: defining a vague concept. In International Conference on Engineering Psychology and Cognitive Ergonomics, 128-136. Springer Berlin Heidelberg.

Nuckols TK, Bower AG, Paddock SM et al. (2008) Programmable infusion pumps in ICUs: an analysis of corresponding adverse drug events. Journal of General Internal Medicine, 23(1):41-45.

Reason J (2000) Human error: models and management. BMJ, 320(7237):768-770.

Rothschild JM, Keohane CA, Cook EF et al. (2005) A controlled trial of smart infusion pumps to improve medication safety in critically ill patients. Critical care medicine, 33(3):533-540.

Russ AL, Fairbanks RJ, Karsh BT et al. (2013) The science of human factors: separating fact from fiction. BMJ Qual Saf, 22(10):802-808.

Scanlon MC, Karsh BT (2010) The value of human factors to medication and patient safety in the ICU. Critical care medicine, 38(60):S90.

Shadbolt N, Smart PR, Wilson JR et al. (2015) Knowledge elicitation. Evaluation of human work, 163-200.

Valentin A, Capuzzo M, Guidet B et al. (2006) Patient safety in intensive care: results from the multinational Sentinel Events Evaluation (SEE) study. Intensive care medicine, 32(10):1591-1598.

Welters ID, Gibson J, Mogk M et al. (2011) Major sources of critical incidents in intensive care. Critical Care, 15(5):R232.

Wilson JR, Sharples S (2015) Evaluation of human work. CRC press.
