Introduction
Among healthcare providers, and especially those caring for intensive care unit (ICU) patients, one of the greatest concerns to emerge in recent decades has been the steady global escalation of antibiotic resistance among certain strains of bacteria, typically accompanied by rising mortality rates (Centers for Disease Control 2019; Weiner-Lastinger et al. 2020a; Weiner-Lastinger et al. 2020b). Where once clinicians largely had to contend only with hospital-acquired methicillin-resistant Staphylococcus aureus (MRSA) (David and Daum 2010), over the past few decades both the number of pathogens developing resistance to multiple antibiotics and the extent of their spread have increased dramatically. Globally, an estimated 700,000 people currently die each year from infections caused by antibiotic-resistant bacteria (Wall 2019), a number projected to rise to as many as 10 million annually by 2050 (O'Neill 2016).
In 2017, the World Health Organization (WHO) published a list of 12 families of bacteria it considered to pose the greatest threat to humans (Mancuso et al. 2021; Mulani et al. 2019). The list groups bacteria into three tiers – critical, high, and medium priority – based on the urgency with which effective interventions need to be developed and put into clinical use. Prominent among these are the six multidrug-resistant pathogens designated by the acronym ESKAPE: Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter species (De Oliveira et al. 2020; Mancuso et al. 2021; Mulani et al. 2019). A recent example of one such organism’s transition from benign contaminant to highly lethal pathogen, concomitant with its development of antibiotic resistance, is Acinetobacter baumannii (Visca et al. 2011). As recently as the late 1990s, A. baumannii was considered minimally pathogenic and rarely life-threatening (Peleg et al. 2008). Now, however, it has been labelled “an urgent threat” by the U.S. Centers for Disease Control because of rapid increases in its pathogenicity, resistance to antibiotics, and associated mortality, now upwards of 50% (Mohd Sazlly Lim et al. 2019; Perez et al. 2020; Visca et al. 2011; Weiner-Lastinger et al. 2020b). Mortality rates from nosocomial sepsis with gram-negative antibiotic-resistant bacteria have been reported to reach 80-85% (Mathers et al. 2011; Snitkin et al. 2012).
This article examines the management of multidrug-resistant pathogens from the perspective of prevention, categorised into preventing further resistance from developing and preventing its spread once encountered, and covers both therapies currently available and those still being tested.
Preventing Drug Resistance from Developing
To understand how to prevent further drug resistance from developing, one must first appreciate the forces that have driven its emergence thus far. These forces are multiple and multifaceted. First, bacteria innately change over time in response to their environment – whether that environment is within a human, an animal, or a plant, or on some non-living surface – and this process sometimes results in changes towards antibiotic resistance (Cepas and Soto 2020; Iramiot et al. 2020). Such changes outside of humans may be accelerated, however, for example by the commercial use of antibiotics in food production (Oliver et al. 2011; Tollefson et al. 1997). That said, though antibiotic resistance has been documented in the intestines of beef cattle and other animals, whether, how, and to what degree this adversely affects human health remains unknown (Oliver et al. 2011; Tollefson et al. 1997).
It is widely believed that the principal driver behind the rapid escalation in the number and severity of infections caused by antibiotic-resistant human pathogens has been the often injudicious overuse of broad-spectrum antibiotics, especially among ICU patients (Kollef and Micek 2014; Lindsay et al. 2019; Mulani et al. 2019; Strich and Palmore 2017; Teerawattanapong et al. 2017; Wall 2019; Wunderink et al. 2020). Such use is understandable, since a sizeable percentage of ICU patients either have life-threatening infections upon admission or develop them while in the ICU. Antibiotics have unquestionably saved the lives of millions of patients who otherwise would have died. However, their overuse has led to the development of bacterial strains that are resistant to almost all forms of anti-bacterial therapy. Again, Acinetobacter baumannii is a prime example of how rapidly antibiotic resistance can develop. In a 2009 study of ICU patients conducted in South Korea (Jang et al. 2009), for example, Acinetobacter resistance rates against imipenem and meropenem were both just 4.5%. Yet, five years later, resistance rates to imipenem and meropenem were reported as 45% and 49%, respectively (Viehman et al. 2014); and, in our own recent ICU experience, they now may exceed 90%.
Antimicrobial stewardship
This alarming escalation of resistance among many bacterial strains has led to the concept of antimicrobial stewardship (Medina and Pieper 2016; Strich and Palmore 2017; Wunderink et al. 2020), which calls for optimising the selection and dosing of antibiotic medications and reducing the duration of therapy (Strich and Palmore 2017). Antimicrobial stewardship programmes (ASP) have been examined, both in individual trials and in a recently published meta-analysis, and have been shown to be feasible to implement and to result in decreased antibiotic use and costs, shorter treatment times, and reduced incidences of antibiotic-resistant infections without worsening patient outcomes (Karanika et al. 2016; Katsios et al. 2012). In a meta-analysis extracting data from 26 studies with observation periods before and after ASP implementation ranging from six months to three years, Karanika et al. (2016) identified a pooled reduction in total antimicrobial consumption of 19.1% (95% confidence interval 7.5–30.1%), with the reduction even greater within ICUs (39.5%; 6.4–72.5%). Similarly, the use of broad-spectrum antibiotics declined by a mean of 18.5% (5.0–32.0%) for carbapenems and 14.7% (1.7–27.7%) for glycopeptides, and overall antimicrobial costs fell by 33.9% (25.9–42.0%). Importantly, these reductions were accompanied by reductions in the risk of infection with methicillin-resistant Staphylococcus aureus, imipenem-resistant Pseudomonas aeruginosa, and extended-spectrum beta-lactamase-producing Klebsiella species, as well as shorter hospital stays and no increase in mortality.
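To make the arithmetic behind such pooled estimates concrete, the brief sketch below shows how a before-and-after percentage reduction in antimicrobial consumption is calculated; the consumption figures are purely hypothetical and are not data from Karanika et al. (2016).

```python
# Illustrative only: how a before/after reduction in antimicrobial consumption
# is typically expressed. The figures below are hypothetical, not study data.

def percent_reduction(before: float, after: float) -> float:
    """Relative reduction, expressed as a percentage of the pre-intervention value."""
    return 100.0 * (before - after) / before

ddd_before_asp = 1250.0  # hypothetical consumption, defined daily doses per 1000 patient-days
ddd_after_asp = 1011.0   # hypothetical consumption after ASP implementation

print(f"Reduction: {percent_reduction(ddd_before_asp, ddd_after_asp):.1f}%")
# Prints "Reduction: 19.1%" - the same form as the pooled estimate cited above
```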
One additional component of ASP is audit and feedback, whereby one or more healthcare workers with specific expertise in antimicrobial stewardship – independent of both the clinical team and any formal infectious disease consultations – provide regular (e.g., multiple times weekly, if not daily), prospective written and/or oral recommendations on antimicrobial use for specific patients to the ICU clinical team (Lindsay et al. 2019). In their meta-analysis of eleven published case-control studies designed to evaluate the impact on mortality of ASPs incorporating a prospective, routine audit and feedback process, together encompassing 10,545 cases and 9510 controls, Lindsay et al. (2019) identified no increase in the relative risk (RR) of mortality (RR=1.03; 0.93–1.14). However, they also concluded that all 11 studies were at high risk of bias and that none reported data on the programmes’ antibiotic use or on the incidence of infections with resistant organisms.
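For readers less familiar with the metric, relative risk is simply the ratio of the event proportions in the two groups; the minimal sketch below illustrates the calculation with hypothetical counts, not the data pooled by Lindsay et al. (2019).

```python
# Illustrative only: relative risk (RR) of mortality with versus without an
# audit-and-feedback ASP. The counts below are hypothetical, not study data.

def relative_risk(events_exposed: int, n_exposed: int,
                  events_control: int, n_control: int) -> float:
    """RR = risk (event proportion) in the exposed group / risk in the control group."""
    return (events_exposed / n_exposed) / (events_control / n_control)

# Hypothetical counts: 103 deaths among 1000 ASP patients vs 100 among 1000 controls
rr = relative_risk(103, 1000, 100, 1000)
print(f"RR = {rr:.2f}")  # 1.03 -> an RR near 1 suggests no detectable effect on mortality
```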
Rapid testing
One major contributor to the overuse of broad-spectrum antibiotics is undoubtedly the delay that typically occurs between when a presumed septic patient is admitted to the hospital and when culture results return. Conventional (i.e., microbial growth-based) methods have the advantage of not only identifying an organism but also characterising its antibiotic resistance and susceptibility. An all-too-common disadvantage, however, is that these results typically take 36-72 hours to return after samples have been sent. Conversely, novel molecular methods that do not rely on microbial growth often yield results within one to a few hours. In a meta-analysis published in 2020, De Angelis et al. (2020) assessed the results of 20 studies involving 1930 isolates and calculated pooled sensitivity and specificity estimates for two major commercial systems – Verigene® and FilmArray® – of 85.3% and 99.1% when phenotypic comparators were used and 95.5% and 99.7% when genotypic comparators were used. The meta-analysis did not examine how these results affected antibiotic use or patient outcomes, however, and both must be studied to fully establish the clinical utility of these tests. Nonetheless, if rapid testing proves both accurate and capable of reducing unnecessary broad-spectrum antibiotic use without impairing outcomes, it is reasonable to assume that such tools could become valuable in the fight to reduce both the development and the spread of antibiotic resistance.
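As a point of reference for how such accuracy figures are derived, the sketch below computes sensitivity and specificity from confusion-matrix counts against a reference method; the counts are hypothetical and are not the isolates pooled by De Angelis et al. (2020).

```python
# Illustrative only: sensitivity and specificity of a rapid molecular panel,
# judged against a reference (phenotypic or genotypic) comparator.
# The confusion-matrix counts below are hypothetical, not pooled study data.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of comparator-positive isolates the rapid test also calls positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of comparator-negative isolates the rapid test also calls negative."""
    return true_neg / (true_neg + false_pos)

# Hypothetical counts for a single resistance determinant
tp, fn, tn, fp = 211, 9, 1099, 10

print(f"Sensitivity: {100 * sensitivity(tp, fn):.1f}%")  # ~95.9%
print(f"Specificity: {100 * specificity(tn, fp):.1f}%")  # ~99.1%
```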
Antimicrobial de-escalation
Moving from the initiation to the cessation of antimicrobial therapy, another approach to potentially reducing the development of antimicrobial resistance is antimicrobial de-escalation (ADE) – narrowing the spectrum of empirical therapy and/or shortening its duration – though this approach continues to be considered controversial and lacking sufficient evidence to justify its widespread use (De Bus et al. 2020; De Waele et al. 2020; Lakbar et al. 2020; Tabah et al. 2020; Tabah et al. 2016). As of 2020, ADE was not yet recommended for widespread use by a combined task force of the European Society of Intensive Care Medicine (ESICM) and the European Society of Clinical Microbiology and Infectious Diseases (ESCMID) Critically Ill Patients Study Group (ESGCIP) (Tabah et al. 2020). In a 2016 meta-analysis whose authors included members of ESGCIP and in which two randomised controlled trials and 12 cohort studies were assessed (Tabah et al. 2016), the investigators found considerable variability in the definition of ADE; that ADE was consistently associated with reduced symptom severity scores (with p values ranging from 0.04 to <0.001); and that pooled data revealed a reduced relative risk (RR) of mortality (RR=0.68; 0.52–0.88). However, because none of the studies was designed to investigate the effect of de-escalation on antimicrobial resistance, whether this therapeutic approach should be adopted for the explicit purpose of reducing antimicrobial resistance was considered unresolved (Tabah et al. 2016).
Procalcitonin
Procalcitonin is a peptide precursor of the calcium-regulating hormone calcitonin that, over the past two decades, has come into common use as a tool to guide both the initiation and the cessation of antibiotic therapy, given the propensity of serum procalcitonin levels to rise in the setting of active infection (Branche et al. 2019; Cleland and Eranki 2022; Kip et al. 2018; Kyriazopoulou et al. 2021; Meier et al. 2019; Schuetz et al. 2017). Its effectiveness in reducing antibiotic use and improving patient outcomes, including mortality and length of hospital stay, has been demonstrated both in randomised clinical trials (Kip et al. 2018; Kyriazopoulou et al. 2021) and in two meta-analyses of individual patient data. In one, individual patient data were extracted from 13 clinical trials on 523 patients with positive blood cultures (Meier et al. 2019); in the other, individual patient data were analysed from 26 controlled trials totalling 6708 participants (Schuetz et al. 2017). Both meta-analyses revealed reduced antibiotic use and decreased mortality, with the odds of death decreasing by 17% (OR=0.83; 0.70-0.99, p=0.037) in one analysis (Schuetz et al. 2017) and the mortality rate decreasing by 7.9% (from 29.8 to 21.8%; 1.8-26.6%) in the other (Kip et al. 2018).
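As an aside on interpreting these figures, the sketch below shows how an odds ratio is computed from two mortality proportions and how it maps onto the phrasing 'odds of death decreasing by 17%'; the proportions used are hypothetical, not the pooled data from the cited meta-analyses.

```python
# Illustrative only: odds ratio (OR) for mortality, procalcitonin-guided therapy
# versus usual care. The mortality proportions below are hypothetical.

def odds(p: float) -> float:
    """Convert a probability into odds."""
    return p / (1.0 - p)

def odds_ratio(p_intervention: float, p_control: float) -> float:
    """OR = odds of death with the intervention / odds of death with usual care."""
    return odds(p_intervention) / odds(p_control)

# Hypothetical mortality proportions
p_guided, p_usual = 0.218, 0.252

or_value = odds_ratio(p_guided, p_usual)
print(f"OR = {or_value:.2f}")                                        # ~0.83
print(f"Relative reduction in odds = {100 * (1 - or_value):.0f}%")   # ~17%
print(f"Absolute mortality difference = {100 * (p_usual - p_guided):.1f} percentage points")
```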
Nonetheless, despite the test’s low cost (often roughly $10 USD), its cost-effectiveness has been called into question (Kip et al. 2018), and concerns have been raised about its overuse, given its relative lack of specificity: serum procalcitonin levels can also rise with viral infections and a number of other non-infectious states, such as certain malignancies and renal failure. Moreover, though its effectiveness in decreasing antibiotic use has been documented, its impact on reducing antibiotic resistance has not yet been adequately studied.
Emerging supplementary antimicrobial therapies
Since the advent of sulphonamides prior to World War II, antibiotics have progressively come to be viewed as the cornerstone of anti-bacterial therapy; and, until the recent surge in antibiotic-resistant organisms, there were few valid reasons to question this. This has now changed, however, and the era of merely adding newer antibiotics to pre-existing ones appears to have passed. Among the various supplementary therapies that have been developed to fight infections are bacteriophages (Abedon et al. 2011; Chan et al. 2013; Czaplewski et al. 2016; Górski et al. 2017; Kutter et al. 2010; Lin et al. 2017; Moelling et al. 2018; Sybesma et al. 2018), antimicrobial peptides (Berglund et al. 2015; Du et al. 2017; Mahlapuu et al. 2016; Rios et al. 2016), photodynamic light therapy (Cieplik et al. 2018; Hu et al. 2018; Tomb et al. 2018; Wozniak and Grinholc 2018), and silver nanoparticles (Liang et al. 2016; Miller et al. 2010; Munger et al. 2014; Peng et al. 2017; Radulescu et al. 2016; Verbelen et al. 2014). The rationale behind using such supplementary therapies is two-fold. First, adding them to standard antibiotics may lessen the degree to which already-resistant organisms withstand those antibiotics. Second, their use may spare clinicians from prescribing broad-spectrum antibiotic cocktails before culture and sensitivity results return. To date, however, data supporting their effectiveness in humans remain limited.
Of the four options listed above, bacteriophage therapy has by far the longest history and the most published data, its use actually antedating that of antibiotics. Bacteriophages are viruses with the capacity to invade and kill bacterial cells. Among their advantages are their high host specificity for certain bacteria, including several phages with documented in vitro effectiveness against ESKAPE organisms (Abedon et al. 2011; Chan et al. 2013; Górski et al. 2017; Lin et al. 2017; Mulani et al. 2019; Sybesma et al. 2018), and their ability to adapt to bacterial changes, thereby limiting the potential for their bacterial hosts to become resistant to them (Mulani et al. 2019). Their major limitation is the current dearth of supportive published data beyond in vitro studies, anecdotal case reports, and small case series (Mulani et al. 2019).
Much the same is true for the other supplementary antimicrobial therapies. Clinical trial evidence is limited to two randomised controlled trials of silver nanoparticles, both involving their topical use in wound dressings: one demonstrated reduced wound-healing time for leg ulcers (Miller et al. 2010), and the other demonstrated effective wound healing, with good tolerability, in burn patients (Verbelen et al. 2014). That said, photodynamic light therapy is already being used widely to treat dental, skin, and soft tissue infections (Mulani et al. 2019).
Preventing the Spread of Antibiotic-Resistant Organisms
Environmental cleaning, decolonisation, and source control
Other strategies have been developed, used, and tested to reduce the incidence of antibiotic-resistant infections, including environmental cleaning, decolonisation, and source control. Though each of these three processes involves cleaning, they differ in scope: environmental cleaning generally entails decontaminating the environment around the patient (e.g., medical equipment, latrines, sinks) (Carling et al. 2008); decolonisation involves decontaminating the patient and any catheters, lines, tubes, or other equipment in current direct use on the patient (Decker and Palmore 2013; Huang et al. 2013); and source control entails controlling microbe transmission between patients and caregivers by all means, including cleaning but also ensuring that personal protective equipment – like masks, shields, gloves, and gowns – is used and fits appropriately (Lagunes et al. 2016). One meta-analysis was recently conducted to assess the use of ASP in conjunction with these three other strategies – environmental cleaning (EC), decontamination methods (DM), and source control (SC) – to reduce the incidence of antibiotic-resistant infections relative to standard care (Teerawattanapong et al. 2017). This meta-analysis, which assessed 42 studies encompassing 62,068 patients, revealed that combining standard care with ASP, EC, and SC was the most effective approach. Indeed, the CDC has already published guidelines on how to successfully decontaminate the ICU environment, patients, and healthcare personnel, guidelines that mention each of these approaches (Sehulster and Chinn 2003). However, research has shown that, while some aspects of decontamination are applied adequately and appropriately, others – like the decontamination of several objects at high risk of becoming contaminated with nosocomial pathogens, including bedpan cleaners, toilet area handholds, doorknobs, and light switches – are not performed as consistently or thoroughly (Carling et al. 2008). This is important, because some common antibiotic-resistant organisms, like A. baumannii, thrive on inert surfaces that are commonly overlooked during decontamination efforts, including computers and computer keyboards (Lu et al. 2009), medical charts (Chen et al. 2014), and objects as distant from the patient as elevator buttons, door handles, staircase railings, telephones, and water taps (Bhatta et al. 2018).
Hand hygiene
Proper hand hygiene would seem an obvious measure to stop the spread of microbes, especially in an ICU, and it is considered one of the overriding goals of infection control by both the CDC and the WHO. Yet, despite considerable research demonstrating this simple measure’s efficacy and extensive efforts to promote its use in hospitals, compliance remains as low as 40-60% and may be even lower in ICUs (Erasmus et al. 2010; Kowitt et al. 2013; Stahmeyer et al. 2017). The most likely explanation is the time hand washing requires: in one German study of ICU nurses, mean compliance was 42.6% and the average time nurses spent per hand wash was just 6.8 seconds, 23.2 seconds less than the 30 seconds recommended in WHO guidelines (Stahmeyer et al. 2017). The investigators further calculated that, given 218-271 hand-washing opportunities per patient, the average nurse would need to spend an additional 58-70 minutes washing their hands, per patient, over a 12-hour shift, rendering the simple practice of thorough hand hygiene almost impossible to implement without additional changes to ICU nursing care, like additional staff.
Isolation
Another strategy that is simple in concept, but highly dependent on available resources and hence sometimes very difficult to implement, is isolation (Gasink and Brennan 2009; Landelle et al. 2013; Rosenberger et al. 2011; Strich and Palmore 2017), a practice considered one of the core elements of infection control by agencies like the WHO and CDC (Sehulster and Chinn 2003). Isolation is generally used in two clinical settings: to prevent the transmission of micro-organisms from a patient already known to harbour an antimicrobial-resistant organism, like methicillin-resistant S. aureus; and to prevent potential transmission from a patient in whom the nature of the infection is not yet known – a process termed empiric isolation. Both objectives require considerable adjustments, including isolating not only the patient but their nurse as well, and considerably more time spent fulfilling source-control and decontamination protocols like hand hygiene, masking, gloving, gowning, and the decontamination of all equipment. Isolation has been documented to work, however, especially during infection outbreaks (Klein et al. 1989; Palmore et al. 2011; Rosenberger et al. 2011; Snitkin et al. 2012).
Education
Ultimately, the effectiveness of any measure designed to reduce antimicrobial resistance and the spread of resistant organisms depends on how, how well, and how consistently it is implemented; and this, in turn, depends on all affected parties being educated both in the process itself and in the need for such measures. As clearly demonstrated for such a conceptually simple and easily justified practice as hand washing (Stahmeyer et al. 2017), proper performance relies on everyone – not just those in direct contact with patients, but also those generating nurse schedules and hospital administrators issuing staffing guidelines – being on board and in full agreement. It also requires that patients and their visitors be educated in required infection control practices and why they are necessary. All of this depends on researchers continuing to generate new and improved methods of infection control, along with the empirical evidence to justify their use.
Conclusion
The exponential increase in antimicrobial resistance in recent decades has created a global health crisis. Generating new, improved antibiotics to replace the old ones is no longer enough. That said, numerous supplementary strategies already exist that have been shown to reduce, or show promise for reducing, the development and spread of resistant organisms – including antimicrobial stewardship, rapid testing, antimicrobial de-escalation, using serum procalcitonin levels to guide the initiation and cessation of antimicrobials, and various on-site infection-control procedures like environmental cleaning, decontamination, source control, consistent hand washing, and, when necessary, patient isolation. Several novel avenues of antimicrobial therapy are also being developed and tested, including antimicrobial phages and peptides, photodynamic light therapy, and silver nanoparticles. What is needed now is strict adherence to practices already documented to be effective, combined with concerted efforts to further develop and test those approaches whose effectiveness remains unknown.
Conflict of Interest
None.