ICU Management & Practice, ICU Volume 14 - Issue 1 - Spring 2014

Authors

Alan D. Ravitz, PE1

Peter J. Pronovost, MD, PhD2-4

Adam Sapirstein, MD2,3

Johns Hopkins Applied Physics Laboratory, Laurel, USA1; Armstrong Institute for Patient Safety and Quality, Johns Hopkins Medicine, Baltimore, USA2; School of Medicine, The Johns Hopkins University, Baltimore, USA3; Bloomberg School of Public Health, The Johns Hopkins University, Baltimore, USA4

[email protected]

 

The intensive care unit (ICU) is broadly viewed as the epicentre of a high reliability organisation (HRO) in a healthcare system. After all, ICU clinicians care for the sickest patients with the most complex, high-technology therapies and monitors. Moreover, the Leapfrog Group developed standards for the best model of care in the ICU – an intensivist-led care team (Leapfrog Group 2011).

 

However, the facts tell a very different story about reliability in the ICU. We know that only 42% of hospitals in the USA that responded to the Leapfrog Group’s hospital survey reported compliance with the ICU Physician Staffing standard (unpublished data, Leapfrog Group, February 17, 2014). Therapies to prevent avoidable harm are delivered erratically. For example, only 20% to 40% of patients at risk for ventilator-associated lung injury receive appropriate, weight-based tidal volumes on the ventilator (Pronovost et al. 2010). Clinicians may also be overly optimistic about the quality of care they provide. Scales and co-workers (2011) found that only 50% of eligible ventilated ICU patients had the head of their bed elevated to > 30 degrees before the quality improvement intervention was implemented. Most importantly, patients continue to experience harm at an unacceptable rate that far exceeds the level of harm in an HRO; avoidable error is considered to be the third leading cause of death in the USA (Wachter et al. 2013).
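To make "appropriate, weight-based tidal volumes" concrete, the short sketch below (Python) computes a lung-protective target using the widely cited convention of roughly 6 mL per kg of predicted body weight, where predicted body weight is derived from height and sex (the ARDSNet formulas). The 6 mL/kg target, the formulas, and the example patient are assumptions drawn from general critical care practice, not figures taken from this article.

# Illustrative sketch only: lung-protective tidal volume target of ~6 mL/kg
# of predicted body weight (PBW). Formulas and target are assumptions from
# general critical care practice, not from this article.

def predicted_body_weight_kg(height_cm: float, sex: str) -> float:
    """ARDSNet-style predicted body weight: depends on height and sex, not measured weight."""
    base = 50.0 if sex == "male" else 45.5
    return base + 0.91 * (height_cm - 152.4)

def target_tidal_volume_ml(height_cm: float, sex: str, ml_per_kg: float = 6.0) -> float:
    """Weight-based tidal volume target in mL (default 6 mL/kg PBW)."""
    return ml_per_kg * predicted_body_weight_kg(height_cm, sex)

if __name__ == "__main__":
    # Example: a 175 cm male patient has a PBW of about 70.6 kg, so the target
    # is roughly 423 mL, regardless of his actual measured weight.
    print(round(target_tidal_volume_ml(175, "male")))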

 

The HRO model was developed by examining commonalities of industries that require near error-free performance, such as commercial aviation and nuclear power. Despite operating in high-risk environments, HROs create systems to manage the complexity of technology and task performance (Sutcliffe 2011). High reliability organisations apply systems engineering to ensure that technology, work processes, and culture are carefully integrated and orchestrated to deliver high levels of safety. Compare this to the current ICU system design recommended by the Leapfrog Group. In this model, the core of the system, its data storage and knowledge management, is the intensivist and the care team. The model has evolved naturally with the development of critical care, but its deficiencies are obvious. It relies on the flawless performance of an intensivist-led team, a model that depends on the heroic performance of individuals and one that we have argued is outdated, under-engineered, and doomed to fail at high frequency (Pronovost et al. 2014). If the ICU is to reduce harm and work toward becoming an HRO, its system design will need to mature.

 

Systems engineering has been applied successfully in HROs to virtually eliminate errors and catastrophic failure. Systems of systems (SoS) have been created in which many subsystems are integrated and become interdependent, reducing harmful errors and improving efficiency. Like an HRO, safe and high-quality healthcare depends on the interaction of many systems, aligned purposefully to achieve common safety and quality goals (Shekelle et al. 2013). By applying systems principles, we believe it is possible to create a safe, productive SoS for ICU care (Christianson et al. 2011).

 

We have developed a comprehensive plan to start integrating the many constituent subsystems that comprise ICU care to create a SoS. Our plan is based on the US Navy submarine force’s Advanced Processor Build (APB)/Acoustic Rapid COTS Insertion programme (Stevens 2008), which began in 1998 and continues today. Our plan will design, implement, iterate, and evaluate a systems approach, articulating the necessary components and partners. The SoS plan, adapted for healthcare, involves the following seven major elements (see Figure 1).

 

1. Concept for Integrated Healthcare Delivery System.

The high-level description of an integrated healthcare delivery system is that of a system of systems (SoS). The constituent elements that comprise this SoS include all subsystems, such as ICU settings (eg, surgical ICU, medical ICU), operating rooms, emergency departments, primary care offices, and home care. Subsystems are identified and their mutual interactions described. Each subsystem is characterised as a ‘black box’, with defined inputs and outputs connecting the boxes.
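As an illustration of how such a black-box decomposition might be recorded, the sketch below (Python) lists subsystems with their inputs and outputs and the directed flows between them, then checks that every flow references a known subsystem. All subsystem and flow names are hypothetical examples, not elements of an actual hospital model.

# Minimal, illustrative black-box description of a SoS: named subsystems and
# the directed data flows between them. All names are invented examples.
from dataclasses import dataclass, field

@dataclass
class Subsystem:
    name: str
    inputs: list = field(default_factory=list)   # data the subsystem consumes
    outputs: list = field(default_factory=list)  # data the subsystem produces

@dataclass
class Flow:
    source: str    # producing subsystem
    target: str    # consuming subsystem
    payload: str   # what is exchanged (orders, vital signs, handoffs, ...)

subsystems = [
    Subsystem("Emergency Department", inputs=["triage data"], outputs=["admission orders"]),
    Subsystem("Operating Room", inputs=["surgical schedule"], outputs=["post-op handoff"]),
    Subsystem("Surgical ICU", inputs=["admission orders", "post-op handoff"], outputs=["discharge summary"]),
]

flows = [
    Flow("Emergency Department", "Surgical ICU", "admission orders"),
    Flow("Operating Room", "Surgical ICU", "post-op handoff"),
]

# Consistency check: every flow must connect two known subsystems.
names = {s.name for s in subsystems}
for f in flows:
    assert f.source in names and f.target in names, f"unknown subsystem in flow {f}"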

 

2. Concept for Integrated Healthcare Delivery Subsystems.

The subsystems are detailed using a Concept of Operations (CONOPS). A CONOPS provides the vision and purpose for the (sub)system and an analysis of the system’s operational needs and mission requirements. The CONOPS describes the roles and activities of each user, the operational process, and operational command structures. Importantly, key performance parameters, interdependencies between subsystems, and the facilities, equipment, hardware, software, and personnel associated with the subsystem are defined. The CONOPS also describes known gaps in existing capabilities. These gaps are flags for innovators to develop new solutions that fulfil the vision described in the CONOPS. Healthcare has yet to produce such detailed CONOPS, and this work will advance the field.
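The sketch below (Python) suggests how the CONOPS elements named above (user roles and activities, key performance parameters, interdependencies, associated resources, and known capability gaps) might be captured as a structured document that innovators and integrators can work against. Every field value shown is an invented example, not content from an actual CONOPS.

# Illustrative sketch of a CONOPS captured as structured data, mirroring the
# elements named in the text. All field values are invented examples.
conops_micu = {
    "subsystem": "Medical ICU",
    "vision": "Deliver harm-free, evidence-based critical care",
    "user_roles": {
        "intensivist": ["lead rounds", "set daily care goals"],
        "bedside nurse": ["execute care plan", "monitor the patient"],
    },
    "key_performance_parameters": {
        "lung_protective_ventilation_compliance": ">= 0.95",
        "head_of_bed_elevation_compliance": ">= 0.95",
    },
    "interdependencies": ["Operating Room", "Emergency Department", "Pharmacy"],
    "resources": {
        "equipment": ["ventilators", "physiologic monitors"],
        "software": ["electronic health record", "clinical decision support"],
    },
    "known_gaps": [
        "ventilator settings are not checked automatically against predicted body weight",
        "device data are not integrated into a single display",
    ],
}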

 

3. Call for Innovation for Candidate Solutions.

Innovators across industry and academia are asked to develop candidate solutions to the system gaps identified in element 2 of the model. In contrast to current top-down approaches, the SoS programme provides a goal-directed approach, focusing on a problem to be solved or a job to be done.

 

4. Learning Laboratory.

The Agency for Healthcare Research and Quality (AHRQ) defines a learning laboratory as: “…places and professional networks that allow multidisciplinary teams to identify interrelated threats to patient safety, stretch professional boundaries, envision bold design innovations, and take advantage of brainstorming and rapid prototyping techniques that other leading-edge sectors of the economy employ…” (U.S. Department of Health and Human Services 2013). The Johns Hopkins Armstrong Institute implemented a robust process to analyse novel candidate solutions to fill gaps across four dimensions: culture, workflow, technology, and learning and accountability. These four dimensions characterise the key aspects of integrated socio-technical solutions, purposefully designed and integrated to enhance overall safety and quality in the ICU.

 

5. Systems Integration.

After evaluation in the Learning Laboratory, candidate solutions need further refinement before integration into an operational production system. Academia (eg, the Johns Hopkins Armstrong Institute) and a systems integrator, filling a role analogous to that of Lockheed Martin or Boeing in aviation, should collaborate on additional laboratory assessments and production-level refinements. The goal of this collaboration is to develop a product that can be broadly implemented. It is important to understand that this development includes both technical and social components. The product must be rigorously tested and evaluated to ensure that the integration delivers the needed performance and is aligned with the key performance parameters identified in the CONOPS.

 

6. Production Integrated Healthcare Delivery System.

The vision described in elements 1 and 2 is realised here. The capabilities defined previously are integrated into a comprehensive care delivery model that is used in clinical settings. Because of the magnitude of building a healthcare SoS, it will be impossible to immediately implement the full set of integrated systems envisioned in element 1. Instead, project teams must incrementally develop and build capabilities in sequence by repeatedly cycling through the plan (Figure 1).

 

7. Outcomes Analysis.

Measurement in clinical settings is essential for benchmarking the performance of new systems and allows continuous improvement of overall SoS performance. Each successive pass through the development cycle described in Figure 1 will reveal unanticipated performance deficits and unintended consequences. Accordingly, an outcomes analysis is essential for keeping the overall effort precisely focused on the ultimate vision. This analysis allows the focus to adjust as challenges and gaps are revealed in real-world settings.
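As a minimal sketch of this feedback loop, the example below (Python) compares hypothetical measured performance against the key performance parameters declared in a CONOPS and flags any shortfall as a gap to address on the next pass through the cycle. Metric names, targets, and measured values are illustrative assumptions.

# Illustrative outcomes analysis: measured performance is compared against
# CONOPS key performance parameters (KPPs); shortfalls become gaps for the
# next iteration. Metric names, targets, and values are invented examples.
kpp_targets = {
    "lung_protective_ventilation_compliance": 0.95,
    "head_of_bed_elevation_compliance": 0.95,
}

measured = {
    "lung_protective_ventilation_compliance": 0.78,
    "head_of_bed_elevation_compliance": 0.96,
}

gaps = {
    metric: {"target": target, "measured": measured.get(metric, 0.0)}
    for metric, target in kpp_targets.items()
    if measured.get(metric, 0.0) < target
}

# Gaps feed back into the CONOPS (element 2) and the next call for innovation
# (element 3) on the following cycle through the plan.
for metric, detail in gaps.items():
    print(f"{metric}: measured {detail['measured']:.2f} vs target {detail['target']:.2f}")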

 

While this SoS approach is mature in other industries, it is grossly underdeveloped in healthcare. Neither clinicians nor technology companies can do this alone. Healthcare needs a learning laboratory that convenes clinicians, engineers, researchers and others to design the healthcare systems patients deserve, clinicians want, and the country needs.

 

Source of Funding and Conflicts of Interest:

There was no funding. Dr. Pronovost reports receiving grant or contract support from the Agency for Healthcare Research and Quality, the Gordon and Betty Moore Foundation (research related to patient safety and quality of care), the National Institutes of Health (acute lung injury research), and the American Medical Association Inc. (improve blood pressure control); honoraria from various healthcare organisations for speaking on patient safety and quality (the Leigh Bureau manages most of these engagements); book royalties from the Penguin Group for his book Safe Patients, Smart Hospitals; consultant fees as a strategic advisor to the Gordon and Betty Moore Foundation; and stock and fees to serve as a director for Cantel Medical. Dr. Pronovost is a founder of Patient Doctor Technologies, a startup company that seeks to enhance the partnership between patients and clinicians with an application called Doctella. Dr. Ravitz and Dr. Sapirstein report no conflicts of interest.

 



References:

Christianson MK, Sutcliffe KM, Miller MA et al. (2011) Becoming a high reliability organization. Crit Care, 15 (6): 314.

Leapfrog Group (2011) ICU Physician Staffing (IPS) Factsheet. [Accessed: 16 February 2014] Available at http://www.leapfroggroup.org/media/file/FactSheet_IPS.pdf

Pronovost PJ, Bo-Linn GW, Sapirstein A (2014) From heroism to safe design: leveraging technology. Anesthesiology, 120 (3): 526-9.

Pronovost PJ, Murphy DJ, Needham DM (2010) The science of translating research into practice in intensive care. Am J Respir Crit Care Med, 182 (12): 1463-4.

Scales DC, Dainty K, Hales B et al. (2011) A multifaceted intervention for quality improvement in a network of intensive care units: a cluster randomized trial. JAMA, 305 (4): 363-72.

Shekelle P, Wachter R, Pronovost P et al. (2013) Making health care safer II: an updated critical analysis of the evidence for patient safety practices. Evid Rep Technol Assess, (211): 1-945.

Stevens J (2008) The how and why of open architecture. Undersea Warfare, Spring. [Accessed: 24 February 2014] Available at http://www.navy.mil/navydata/cno/n87/usw/spring08/HowAndWhy.html

Sutcliffe KM (2011) High reliability organizations (HROs). Best Pract Res Clin Anaesthesiol, 25 (2): 133-44.

U.S. Department of Health and Human Services (2013) Funding Opportunity Announcement. Patient Safety Learning Laboratories: Innovative Design and Development to Improve Healthcare Delivery Systems (P30). [Accessed: 24 February 2014] Available at http://grants.nih.gov/grants/guide/rfa-files/RFA-HS-14-005.html
 
Wachter RM, Pronovost P, Shekelle P (2013) Strategies to improve patient safety: the evidence base matures. Ann Intern Med, 158 (5 Pt 1): 350-2.