Features from HHM

The "Defining Clean" Series from Healthcare Hygiene Magazine

Defining “Clean” in Sterile Processing

Patient-Ready Scopes Remain Contaminated After Reprocessing

 

By Kelly M. Pyrek

Note: This is the second in a series of articles examining how “clean” is being defined in the healthcare environment.

For the last several years, researchers have been sounding the alarm about processed, presumably patient-ready endoscopes that have retained bioburden and pose a risk to patients. And the Food and Drug Administration (FDA) has been doubling down on its scrutiny of scope manufacturers since high-profile outbreaks and patient deaths linked to contaminated scopes have been making headlines.

The discovery of still-contaminated scopes coming out of sterile processing departments despite manual cleaning and high-level disinfection (HLD) cycles in automated endoscopic reprocessors (AERs) has changed everything we thought we knew about how to reduce and eliminate risk posed by invasive medical devices and instruments.

In 2014, Cori Ofstead, MSPH, president and CEO of Ofstead & Associates, and colleagues presented a paper at the annual APIC meeting that alarmed the infection prevention community with its startling conclusion that despite guideline adherence by sterile processing technicians, endoscopes remained contaminated with debris and microorganisms. Furthermore, Ofstead, et al. (2014) found that visual inspections performed during manual cleaning did not identify the debris that researchers could visualize on white swabs during data collection, and rapid-indicator tests detected contamination on endoscopes with and without visible debris. Additionally, cultures confirmed viable microorganisms after manual cleaning and HLD.

In their study, endoscope reprocessing was directly observed during 60 encounters with 15 used colonoscopes and gastroscopes at a large tertiary-care medical center. Researchers documented adherence with guidelines. Surface swabs were used to sample distal ends, control handles, ports, caps, and buttons. Water samples were obtained from suction-biopsy channels and auxiliary water channels. Samples were tested for protein, blood, carbohydrates, and adenosine triphosphate (ATP) using rapid indicator tests. Aerobic cultures were performed, and positive cultures were sent to a reference lab for species identification.

Researchers visually inspected 500 endoscope components and conducted 588 rapid indicator tests, including surface protein, surface ATP, water ATP and dipsticks for protein, blood, and carbohydrates. Cultures were performed on 88 channel effluent samples. No residue was visible on endoscopes after manual cleaning. Residue was seen on swabs or in effluent for 31 percent of post-manual cleaning samples and zero post-HLD samples. After manual cleaning, samples exceeded benchmarks for ATP (46 percent of biopsy ports) and protein (75 percent of handles). Post-HLD tests revealed persistent contamination (p<.05). Colony counts from bedside-cleaned channel samples were higher than manually cleaned and disinfected counts.

This study came on the heels of a 2013 paper by Ofstead tackling endoscopy-associated infection (EAI) risk estimates and their implications, which shattered any remaining complacency about scope-associated infections.

As Ofstead and Dirlam Lang (2013) remind us, “Recent audits have documented widespread lapses in infection control involving medical equipment. Inspections of multiple facilities determined that certain endoscopy equipment was not properly reprocessed for up to several years. Direct observation in a multisite study revealed that endoscopes were virtually never reprocessed in accordance with guidelines. The implications of these lapses are unknown because no epidemiologic studies have determined the risk of EAI associated with reprocessing quality.” In their paper’s conclusion, they asserted that, “Evidence indicates that current EAI risk estimates are inaccurate, outdated, based on flawed methodology, and can have profound effects on patients. These extremely low risk estimates are used to justify the lack of reporting, routine monitoring, patient notification, and laboratory testing following a lapse. In 1993, researchers recommended prospective studies involving both patient monitoring and laboratory cultures be conducted to evaluate the risk of transmitting infections via contaminated endoscopes. Today, there remains a need for epidemiologic studies to accurately estimate the risk of EAI and other complications. Prospective studies should involve observation of reprocessing practices, microbial testing, and outcomes assessment. The results could be used to develop criteria for patient notification and reporting of reprocessing lapses and assist decision makers in determining what actions to take when a lapse occurs.”

Lichtenstein and Alfa (2019) remind us of the parameters that define the steps of cleaning, disinfection and sterilization:

  • “Cleaning refers to removal of visible soiling, blood, protein substances, and other adherent foreign debris from surfaces, crevices, and lumens of instruments. It is usually accomplished with mechanical action using water, detergents, and enzymatic products. Meticulous physical cleaning must always precede disinfection and sterilization procedures, because inorganic and organic materials that remain on the surfaces of instruments interfere with the effectiveness of these processes. Mechanical cleaning alone reduces microbial counts by approximately 10³ to 10⁶ (three to six logs), equivalent to a 99.9 percent to 99.9999 percent reduction in microbial burden.”
  • “Disinfection is defined broadly as the destruction of microorganisms, except bacterial spores, on inanimate objects (e.g., medical devices such as endoscopes).”

According to the researchers, three levels of disinfection are achievable depending on the amount and kind of microbial killing involved:

  1. High-level disinfection (HLD): the destruction of all viruses, vegetative bacteria, fungi, mycobacteria, and some, but not all, bacterial spores. For liquid chemical germicides (LCGs), HLD is operationally defined as the ability to kill 10⁶ mycobacteria (a six-log reduction). The efficacy of HLD depends on several factors, including the type and level of microbial contamination; effective precleaning of the endoscope; presence of biofilm; physical properties of the object; concentration, temperature, pH, and exposure time to the germicide; and drying after rinsing to avoid diluting the disinfectant.
  2. Intermediate-level disinfection: the destruction of all mycobacteria, vegetative bacteria, fungal spores, and some nonlipid viruses, but not bacterial spores.
  3. Low-level disinfection: a process that can kill most bacteria (except mycobacteria or bacterial spores), most viruses (except some nonlipid viruses), and some fungi.
  • “Sterilization is defined as the destruction or inactivation of all microorganisms. The process is operationally defined as a 12-log reduction of bacterial endospores.”
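
To make the log-reduction arithmetic in these definitions concrete, the short sketch below converts the quoted figures into surviving fractions. It is an illustration of the arithmetic only; the log values come from the passages above, and nothing else in the sketch is drawn from the guidelines themselves.

```python
# Illustrative arithmetic only: the log10 reductions quoted above
# (3-6 logs for mechanical cleaning, 6 logs for the HLD mycobacterial kill,
# 12 logs for the sterilization endpoint) expressed as surviving fractions.

examples = [
    ("Mechanical cleaning, low end", 3),
    ("Mechanical cleaning, high end", 6),
    ("HLD mycobacterial kill (LCG definition)", 6),
    ("Sterilization endpoint (bacterial endospores)", 12),
]

for label, logs in examples:
    surviving_fraction = 10 ** (-logs)   # e.g., 3 logs -> 1 in 1,000 organisms survives
    print(f"{label}: {logs}-log reduction leaves a surviving fraction of {surviving_fraction:.0e}")

# A 3-log reduction corresponds to a 99.9 percent reduction and a 6-log
# reduction to 99.9999 percent, matching the percentages in the cleaning
# definition quoted above.
```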

Lichtenstein and Alfa (2019) remind us that, “HLD of endoscopes eliminates all viable microorganisms, but not necessarily all bacterial spores. Although spores are more resistant to HLD than other bacteria and viruses, they are likely to be killed when endoscopes undergo thorough manual cleaning. In addition, survival of small numbers of bacterial spores with HLD is considered acceptable because the intact mucosa of the GI tract is resistant to bacterial spore infection. Endoscope sterilization, as opposed to HLD, is not required for ‘standard’ GI endoscopy, as a reprocessing endpoint of sterilization has not been demonstrated to further reduce the risk of infectious pathogen transmission from endoscopes. Sterilization of endoscopes is indicated when they are used as ‘critical’ medical devices, such as intraoperative endoscopy when there is potential for contamination of an open surgical field. In addition, individual institutional policies may dictate sterilization of duodenoscopes and linear endoscopic ultrasound instruments due to elevator mechanisms that have been difficult to clean and eradicate all bacterial contaminants with HLD alone.”

Achieving the levels of clean defined by guidelines and recommendations takes some skill, but it is possible. However, the complexity of endoscopes requires rigorous adherence to currently accepted reprocessing guidance. The endoscope features that challenge reprocessing procedures, according to Lichtenstein and Alfa (2019), include:

  • Complex endoscope design with several long, narrow internal channels and bends that make it difficult to remove all organic debris and microorganisms (e.g., elevator channel and elevator lever cavity of duodenoscopes).
  • A large variety of endoscope models from different manufacturers require different cleaning procedures, devices, and materials.
  • Occult damage (e.g., scratches, crevices) to the endoscope can sequester microorganisms and promote biofilm formation.

Human factors are an undeniable component of the narrative around lack of compliance with reprocessing guidelines. For example, Ofstead, et al. (2010) demonstrated that compliance with all the reprocessing steps occurred for only 1.7 percent of flexible endoscopes reprocessed when cleaning steps were performed manually and disinfection was automated, compared to 75.4 percent compliance when both cleaning and disinfection were automated. As the researchers note, “Until recently, the only aspect of this process that was monitored was to test the MEC [minimum effective concentration] of the high-level disinfectant to ensure it contained a sufficient concentration of the active ingredient … Often staff are not aware of additional channels in new models of endoscopes and are not trained on specific cleaning requirements. The use of different sizes and types of channel brushes for the various channel sizes, the fact that some channels cannot be brushed, and the multitude of different types of cleaning brushes available makes duodenoscope reprocessing a confusing process prone to human error.”

Predating the CDC’s warning to the FDA by almost two years, the Association for the Advancement of Medical Instrumentation (AAMI) and the FDA held a workshop in 2011 that actually addressed the definition of “clean” by identifying as one of its clarion themes the importance of gaining consensus on ‘how clean is clean’ and on adequate cleaning validation protocols for reprocessing reusable medical devices.

The workshop identified three key challenges and barriers to “clean”:

  • Lack of understanding and lack of a consistent definition for the meaning of “clean” for reprocessed medical devices
  • Lack of specific criteria and endpoints for measuring whether a device is clean
  • Lack of standardization of clinically relevant test soils for validating the effectiveness of reprocessing methods

To address these challenges, AAMI and FDA recommended three priority actions:

  • Research the essential factors to be considered when defining “clean” for handling and reprocessing medical devices.
  • Develop a common definition or explanation of “clean” for reprocessed medical devices.
  • Standardize test soils for validating the reprocessing of specific types of medical devices.

At that time, AAMI had established seven clarion themes:

  1. Gain consensus on “how clean is clean” and on adequate cleaning validation protocols for reprocessing reusable medical devices.
  2. Create standardized, clear instructions and repeatable steps for reprocessing whenever possible.
  3. Pay early, iterative, and comprehensive attention to reprocessing requirements throughout the device design process.
  4. Make human factors and work environment factors priorities when developing reprocessing requirements.
  5. Improve information collection and sharing to broaden the use of best practices in reprocessing.
  6. Improve reprocessing competencies by strengthening training, education, and certification.
  7. Create a greater sense of urgency and understanding throughout the healthcare community about the consequences of inadequate reprocessing.

Soon after the workshop, AAMI issued a technical information report (TIR30:2011) that acknowledged, “There are few tests that can be used to verify cleaning. To verify cleaning of a given device, one must have a test soil and a quantitative test method for detecting residual soil after cleaning. If cleaning protocols that could be used for verification were in wide use today, they could help ensure that adequate cleaning is accomplished so that a device can be reliably disinfected and/or sterilized before it is used on the next patient. The manufacturer must validate the instructions for reprocessing a reusable device before marketing it. In addition, manufacturers must consider:

  1. that exposure to chemicals, such as cleaning agents, could alter the material used in the device
  2. whether the materials of construction will absorb or adsorb chemical agents, which could then gradually leach from the material over time
  3. how cleaning processes could affect the function of the device
  4. that cleaning processes and tools must be able to contact all areas of the device that could become contaminated.”

AAMI TIR30:2011 represented a compendium of processes, materials, test methods and acceptance criteria for cleaning reusable medical devices that remains an essential roadmap for reaching the destination of “clean” devices. However, one of the best documents to follow, according to Susan Klacik, clinical educator at IAHCSMM, is AAMI’s standard on flexible endoscope processing, ANSI/AAMI ST91:2015 Flexible and semi-rigid endoscope processing in healthcare facilities. This standard has been under review since its original publication, Klacik says, and the committee hopes to have the revision completed in 2020.

Klacik adds that it is essential for SPDs to double-check their scopes: “Use a borescope and other cleaning verification products,” she advises. “Perform competency reviews on all scopes by all staff members that process the scopes.”

Industry is also wrestling with the recommendation that scopes should be sterilized going forward. “In 2017 AAMI held a scope stakeholders meeting with leading experts in the infection prevention, sterilization, disinfection and endoscope field,” Klacik says. “The consensus was to transition semi-critical items to sterilization. This is intended to be a gradual transition, as it cannot occur quickly since many processes need to be in place.”

Until then, ANSI/AAMI ST91:2015 Flexible and semi-rigid endoscope processing in healthcare facilities, remains a go-to document for guidance.

AAMI TIR30:2011 reminds us that, “Cleaning is normally accomplished by manual wiping, brushing, or flushing or by using mechanical aids (e.g., ultrasonic cleaners, washer–decontaminators, washer-sterilizers) in conjunction with water and detergents to remove foreign material.”

The document adds, “In the past, a device was considered ‘clean’ if the person who was performing the cleaning task observed no visible foreign material. Today, however, more devices have long or narrow opaque lumens, crevices, hinges, acute angles, serrated edges, junctions between insulating sheaths, coils, or other designs that make it difficult or impossible to rely on the traditional visual endpoint. In addition, visual observation might not be adequately sensitive to detect levels of soil that could interfere with subsequent reprocessing.”

And as Grein and Murthy (2018) remind us, “… semi-critical medical devices are far more likely to be associated with disease transmission compared with critical or non-critical devices. Semi-critical devices such as endoscopes are often contaminated with a high degree of bacterial bioburden, possess long channels or intricate designs that are challenging to clean, and are prone to biofilm production when moisture is present. Also, as described by Rutala and Weber, any breach in the reprocessing protocol can lead to significant contamination. Specifically, the cleaning step may reduce the bacterial burden by 2 to 6 log₁₀, and HLD may reduce it by an additional 4 to 6 log₁₀, for a total of 6 to 12 log₁₀. Because GI endoscopes may contain 10⁷ enteric microorganisms after use, the margin of safety in HLD of GI endoscopes is low to nonexistent, in stark contrast with the 17 log₁₀ margin of safety in sterilization of surgical equipment.”
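
The margin-of-safety arithmetic described above can be laid out explicitly. The sketch below is a back-of-the-envelope illustration using only the figures quoted in the passage (10⁷ organisms, 2 to 6 logs from cleaning, 4 to 6 logs from HLD); it is not a calculation from the cited papers themselves.

```python
# Back-of-the-envelope margin-of-safety illustration using the figures quoted
# above (Rutala and Weber, as summarized by Grein and Murthy 2018).

import math

initial_bioburden = 1e7   # enteric organisms a GI endoscope may contain after use
cleaning_logs = (2, 6)    # log10 reduction attributed to the cleaning step
hld_logs = (4, 6)         # additional log10 reduction attributed to HLD

for c, h in [(cleaning_logs[0], hld_logs[0]), (cleaning_logs[1], hld_logs[1])]:
    total = c + h
    margin = total - math.log10(initial_bioburden)  # logs "to spare" beyond the bioburden
    print(f"cleaning {c} + HLD {h} = {total} logs; margin of safety = {margin:.0f} log10")

# Worst case: 6 logs against 7 logs of bioburden gives a margin of -1 (nonexistent).
# Best case: 12 logs against 7 gives a margin of 5, still far below the roughly
# 17-log margin quoted above for sterilization of surgical equipment.
```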

They add that, “Prompt bioburden removal before HLD is the most important step of reprocessing, because the presence of bioburden impedes the effectiveness of the high-level disinfectant. The cleaning procedure includes precleaning, a leak test, and manual cleaning and reduces the number of microorganisms and organic debris by 4 logs or 99.99 percent.”

However, they acknowledge that the complexity of cleaning places a significant burden on technicians and their supervisors to ensure that every step is done correctly before HLD: “Some AERs currently on the market perform automated cleaning in addition to HLD, although they do not replace the initial immediate cleaning step performed at bedside. Although automation provides greater standardization and reduces the risk of human error, the reliability of these devices is yet to be confirmed through independent peer-reviewed studies in clinical settings.” They add, “Endoscope reprocessing requires meticulous attention to detail and rigid compliance with reprocessing instructions. Unfortunately, lapses are common and frequently implicated in exposure events or outbreaks.”

Recently updated multi-society guidelines provide current recommendations for critical steps in reprocessing flexible GI endoscopes and incorporate guidance specific to duodenoscopes. Experts emphasize that strict compliance with HLD processes is a critical requirement for all endoscopes. For duodenoscopes, all personnel must additionally be trained and knowledgeable in new recommendations for additional flushing and cleaning steps for the elevator channel. Experts advise healthcare facilities to implement recent interim guidance from the FDA for duodenoscope reprocessing and ensure compliance with updated recommendations as they become available. In response to the multiple duodenoscope outbreaks, the FDA and CDC outlined four optional enhanced disinfection measures for consideration by healthcare providers to decrease the risk of infection: microbiologic testing of duodenoscopes after reprocessing, ethylene oxide (ETO) sterilization, use of liquid chemical sterilants for HLD, and repeat HLD.

Regarding sterilization of certain scopes, Grein and Murthy (2018) note, “ERCP endoscopes and reusable accessories, such as biopsy forceps, are used in sterile body cavities and as such, many experts consider that they should be classified as critical devices.”

The future remains unwritten, but as Grein and Murthy (2018) acknowledge, “Many issues remain unresolved in the current guidelines owing to the lack of robust data to develop specific recommendations. However, it is clear that compliance with accepted guidelines for the reprocessing of GI endoscopes between patients is critical to the safety and success of their use and that, when these guidelines are followed, pathogen transmission can be minimized. Increased efforts and resources should be directed to improve compliance with these guidelines and to future research in prevention of GI endoscope related infections. In the meantime, health care facilities should improve their own internal quality control processes, regularly reinforce necessary competencies, and consider performing post-procedure infection surveillance. Until methods to sterilize these devices can be implemented to ensure optimal patient safety from infection risks associated with GI endoscopy, continued vigilance is required to ensure strict adherence to current reprocessing guidelines and to detect infrequent infections that may signal breaks in adherence to current processes, design flaws that increase risk, or damaged equipment.”

Despite the chorus of voices echoing the criticality of proper reprocessing, investigators looking into the kinds of contamination and defects found in processed scopes have uncovered a host of alarming discoveries. Guidelines are increasingly recommending the use of lighted magnification and visualization following reprocessing to identify any residual contamination, as studies comparing visual and microscopic analysis have demonstrated that visual inspection alone is insufficient to determine cleanliness.

Researchers have confirmed that what technicians can’t see with the naked eye can hurt their patients. Ofstead, Wetzler and Heymann, et al. (2017) conducted a study involving visual inspections with a borescope, microbial cultures, and biochemical tests for protein and ATP to identify endoscopes in need of further cleaning or maintenance. Three assessments were conducted over a seven-month period; the control group endoscopes were reprocessed using customary practices and were compared with intervention group endoscopes subjected to more rigorous reprocessing. At final assessment, all endoscopes (N = 20) had visible irregularities. Researchers observed fluid (95 percent), discoloration, and debris in channels. Of 12 (60 percent) endoscopes with microbial growth, four had no growth until after 48 hours. There were no significant differences in culture results by study group, assessment period, or endoscope type. Similar proportions of control and intervention endoscopes (about 20 percent) exceeded post-cleaning biochemical test benchmarks. ATP levels were higher for gastroscopes than colonoscopes. Eighty-five percent of endoscopes required repair due to findings.

Surprisingly, the researchers found that more rigorous reprocessing was not consistently effective. Seven-day incubation allowed identification of slow-growing microbes. The researchers say their findings bolster the need for routine visual inspection and cleaning verification tests recommended in new reprocessing guidelines.

During the final assessment, researchers observed discoloration, scaly deposits, debris, scratches, and dents on external surfaces. Gastroscope insertion tubes were commonly stained yellow or orange, and buckling was also observed. Irregularities were often found on distal ends. Borescope examinations revealed numerous irregularities, including discoloration, scratches, and filaments of debris protruding into channels. Researchers observed fluid in 19 of 20 (95 percent) patient-ready endoscopes, which were stored vertically after reprocessing.

Researchers tested every endoscope in use at the final assessment and found that a similar proportion of endoscopes in each group exceeded the post-cleaning benchmarks for ATP (20 percent control and 30 percent intervention) and protein (20 percent control and 20 percent intervention). Overall, more gastroscopes exceeded the ATP benchmark (67 percent gastroscope and 7 percent colonoscope), but there was no difference in protein levels (17 percent gastroscope and 21 percent colonoscope). There were no differences in cleaning effectiveness by endoscope age or use history, and the highest post-HLD microbial colony count was found in one of the newer adult colonoscopes (ACs). Although gastroscopes were found to be more highly contaminated than colonoscopes, they were used for fewer procedures than colonoscopes. Every endoscope had <10 CFU except one intervention AC with 15 CFU. Positive control samples were highly contaminated (ATP, 4,831 RLU; protein, 29 µg/mL; and microbial cultures, >600 CFU). Negative control samples had low ATP levels (9 RLU), negative protein tests (0 µg/mL), and no microbial growth.

Technicians conducted ATP tests after manual cleaning for all intervention endoscopes. Post-cleaning benchmarks (<200 RLU) were met during 301 of 304 (99 percent) colonoscope encounters (mean, 17 RLU and median, 11 RLU), and during 69 of 143 (48 percent) gastroscope encounters (mean, 571 RLU and median, 214 RLU). In 16 (11 percent) gastroscope encounters, the ATP levels were still high after double manual cleaning and 2 cycles of cleaning and HLD in the AER.
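
A minimal sketch of how a department might operationalize post-cleaning ATP checks like those described above appears below. The <200 RLU benchmark comes from the study; the endoscope identifiers, readings, and the "re-clean and retest" response rule are hypothetical illustrations, not Ofstead's protocol.

```python
# Hypothetical illustration of a post-manual-cleaning ATP triage rule using the
# <200 RLU benchmark cited in the study. Scope IDs and readings are invented.

ATP_BENCHMARK_RLU = 200

readings = {  # endoscope ID -> post-manual-cleaning ATP reading in RLU
    "colonoscope-01": 11,
    "gastroscope-07": 571,
    "gastroscope-12": 214,
}

for scope_id, rlu in readings.items():
    if rlu < ATP_BENCHMARK_RLU:
        print(f"{scope_id}: {rlu} RLU - passes benchmark, proceed to HLD")
    else:
        print(f"{scope_id}: {rlu} RLU - exceeds benchmark, repeat manual cleaning and retest")
```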

The take-home message from this study is that the researchers found microbial growth in samples from 60 percent of endoscopes, and that residual fluid was found in most endoscopes, which suggests insufficient drying methods that can foster the growth of bacteria and fungi.  As the researchers emphasized, “This study demonstrated that more rigorous reprocessing practices may not be sufficient to ensure that patient-ready endoscopes are free from residual contamination, particularly when the endoscope has defects that could harbor organic debris and biofilm. Visual inspection and routine monitoring for biochemical markers of residual contamination may be essential to identify suboptimal reprocessing and proactively identify endoscopes in need of repair or refurbishment. Residual fluid found inside endoscopes indicate that current industry standards do not effectively dry endoscopes, which is essential to minimize growth of environmental contaminants and potential pathogens. The association between visual abnormalities, biochemical markers of contamination, microbial growth, and the potential for adverse patient outcomes is not known. Research is needed to establish optimal methods and frequency for assessing endoscopes for visual abnormalities, residual contamination, and microbial growth, as well as a schedule for routine maintenance. At this time, the goal is for every institution to have documented proof that their endoscopes are in good working order and are not contaminated in ways that put patients at risk.”

Some researchers disagree that current cleaning and disinfection methods are adequate, and some have raised the issue of contamination and damage to endoscopes that accumulate over extended time and use. Ofstead and Wetzler, et al. (2016) evaluated flexible endoscope damage and contamination levels at baseline and two months later in an ambulatory surgery center and found that post-cleaning test results exceeded benchmarks for all gastroscopes and no colonoscopes. Microbial growth was found in samples from 47 percent of fully reprocessed endoscopes at baseline and 60 percent at follow-up. Borescope examinations identified scratches, discoloration, debris and fluid inside endoscopes, and further documented that irregularities changed over time.

Researchers found that total microbial growth was <10 CFU for every endoscope except a pediatric colonoscope (PC) with 17 CFU at baseline and a different PC (24 CFU) at follow-up. Cultures were more commonly positive for gastroscopes (80 percent) than colonoscopes (33 percent) at baseline, but not at follow-up (60 percent for both). Common skin and GI flora and nonpathogenic soil bacteria were found. Positive controls (pre-cleaned endoscopes) had very high colony counts and ATP levels >600 RLU, whereas negative controls (sterile materials) had ≤2 CFU and ATP <30 RLU.

Irregularities observed during borescope examinations of patient-ready endoscopes included scratches, discoloration, surface damage, debris, and residual fluid. Debris retrieved from one endoscope was later determined to be a fragment of channel lining. Channel irregularities appeared to change over time, with additional damage observed in endoscopes that were reassessed two months after the baseline. Discoloration found in control group channels at baseline was similar two months later but was reduced in the intervention group. After the baseline assessment, two endoscopes were sent out for repair due to borescope examination findings. During the two-month assessment, one endoscope failed a leak test and three other endoscopes were quarantined and sent out for repair based on borescope examinations and tests for residual contamination. The manufacturer determined all the endoscopes to have multiple critical defects.

Ofstead and Wetzler, et al. (2016) concluded that their findings confirmed results from other studies that found GI endoscopes were frequently contaminated despite reprocessing in accordance with guidelines: “Our findings lend support to new recommendations for enhanced visual inspections and cleaning verification. ATP tests and borescope examinations allowed damaged and contaminated endoscopes to be identified so they could be re-reprocessed or repaired as needed to prevent biofilm buildup and potential transmission of infection.”

In the recent webinar, “Conducting visual examinations of flexible endoscopes: A focus on channels and ports,” researcher Cori Ofstead, of Ofstead & Associates, summarized her many inquiries into the contamination of presumably patient-ready endoscopes. She reminds us of why proper endoscope reprocessing is important:

  • Endoscopes are contaminated during procedures
  • Contaminated endoscopes transmit pathogens effectively
  • Infections and injuries have been linked to every type of endoscope
  • Endoscope-related infections and injuries should be preventable
  • Current reprocessing and maintenance practices are far below standards
  • There are implications for patient safety and public health

The cause-and-effect relationship between damaged and contaminated endoscopes is very real. As Ofstead pointed out, reports to the FDA include endoscope defects contributing to mucosal trauma and bleeding in the patient, requiring the procedure to stop. Additionally, debris and bioburden retained in scopes after reprocessing have been reported to the FDA. The type of retained material ranges from human and foreign tissue to debris from channel shredding.

Ofstead emphasized that when she and her team examined scopes, most of the so-called “patient-ready” instruments retained bacteria even after cleaning and HLD. What’s more, these contaminated scopes were linked to infectious outbreaks among patients. Ofstead further emphasized that visual inspection could have prevented these adverse events. AORN, AAMI and SGNA all recommend this step in their guidelines:

- Perform visual inspection every time a scope is used

- Use good lighting and magnification

- Consider using a borescope for channels

- Evaluate endoscope cleanliness

- Look for any visible damage or defects

Additionally, manufacturers’ IFUs recommend visual inspection, Ofstead said, advising technicians to look for any irregularities, including scratches, cracks, chips, tears, pitting, peeling, stains, discoloration, deterioration, objects protruding into the channel, and adhesion of any foreign bodies. Manufacturers advise users never to use a scope with irregularities on a patient, to troubleshoot using the IFUs, and, in the worst-case scenario, to send the scope out for repair.

In her webinar, Ofstead offered the following critical insights:

- Routine visual inspections could have identified the damage and retained debris that harmed patients

- Visual inspection of endoscopes will find problems; ensure you have a plan for responding to these findings

- Retained moisture fosters the growth of bacteria, mold and biofilm; scopes should be completely dry before storage

In a presentation at the IAHCSMM annual meeting in April of this year, Ofstead summarized findings from a survey of IAHCSMM members (see related article on page XX), with the overall conclusions that:

  • Endoscope reprocessing does not work as envisioned
  • Patients and technicians are at risk of infection and injury
  • Contributing factors include: endoscope design and durability issues; inadequate guidelines and IFUs; lack of sufficient education; and pressure to cut corners
  • Solutions require active collaboration by all stakeholders

She advised that manufacturers develop tools for success by:

  • Simplifying IFUs and evaluating their clarity and feasibility
  • Confirming that IFUs can be followed precisely and consistently
  • Conducting research to ensure products work as intended in real-world settings and to identify gaps
  • Addressing the real-world challenges reported by techs
  • Responding to scientific evidence with innovations

She advised guideline-issuing bodies, regulatory agencies and accrediting agencies to:

  • Recognize the need for clear direction in the field
  • Address the complexity and inconsistency of standards
  • Require competency assessments for all personnel
  • Ensure that adequate time and resources are allocated
  • Develop customizable quality checklists for auditors
  • Make sure surveyors are trained sufficiently to assess endoscope reprocessing setups and practices

She also made the following recommendations for technicians:

  • Review IFUs and guidelines
  • Embrace opportunities to learn from vendors and IPs
  • Take the time to do every step correctly, every time
  • Approach cleaning verification with curiosity
  • Document findings from visual inspections
  • Collaborate to troubleshoot issues
  • Share insights with others

She also made the following recommendations for sterile processing supervisors and managers:

  • Invest in continuing education for reprocessing staff
  • Ensure IFUs, policies and guidelines are accessible
  • Arrange for routine, preventive maintenance of equipment
  • Insulate your staff from pressure to cut corners
  • Foster a spirit of continuous quality improvement
  • Empower technicians to report issues and challenges
  • Provide healthcare institution leaders with evidence of SPD-related capabilities and needs

Researchers have documented the ill effects of improperly dried scopes: moisture may foster microbial growth and biofilm development in endoscopes, and retained fluid is associated with higher ATP levels and microbial growth.

Ofstead and Heymann, et al. (2018) evaluated the effectiveness of drying and storage methods for fully reprocessed endoscopes and assessed associations between retained moisture and contamination. Examining scopes in three hospitals, researchers performed visual examinations and tests to detect fluid and contamination on patient-ready endoscopes. Fluid was detected in 22 of 45 (49 percent) endoscopes. High ATP levels were found in 22 percent of endoscopes, and microbial growth was detected in 71 percent of endoscopes. Retained fluid was associated with significantly higher ATP levels. Reprocessing and drying practices conformed with guidelines at one site and were substandard at two sites; damaged endoscopes were in use at all sites.

As the researchers point out, “Reprocessing guidelines describe drying as critically important, but there is no consensus among experts and guideline-issuing bodies on best practices for endoscope drying and storage. Alfa and Sitter reported that 10 minutes of purging with forced air reduced Gram-negative bacilli in endoscope channels. However, this method has not been widely embraced because it requires space, equipment and time. Instead, many institutions rely on alcohol flushes and brief air purges before hanging endoscopes vertically in hopes that residual fluid will drain out or evaporate.”

The researchers also observed substantial defects in all 45 endoscopes, and these irregularities included discoloration, white or black residue, scratches, gouges, non-intact channel lining, debris inside endoscopes, damaged distal ends, insertion tube buckling, and dented channels.

Multiple reprocessing deficiencies were observed at two of the three sites, where dirty-to-clean workflow and PPE use were substandard. At these sites, leak testing and manual cleaning were inadequate, and site personnel stated that their AERs' automated cleaning cycles had been disabled to save time. Technicians wore the same gloves for handling manually cleaned endoscopes, loading AERs, and removing disinfected endoscopes. No hand hygiene was performed between reprocessing activities. No cleaning-verification tests or visual inspections of endoscopes were done.

As Ofstead and Heymann, et al. (2018) observe, “After HLD, dripping-wet gastrointestinal endoscopes at Site A were carried by hand to a storage cabinet and hung vertically without wiping external surfaces. Alcohol was flushed through channels manually because their reprocessing system did not perform alcohol flushes. Residual fluid drained onto the cabinet floor. Ventilation grills had visibly dirty filters that were reportedly never changed, and there was no active ventilation. Wet cystoscopes, ureteroscopes, and intubation endoscopes were removed from the automated reprocessing system and carried by hand to a small, unventilated, metal storage cabinet. Insertion tubes were inserted into dirty, reused Styrofoam blocks, which technicians reportedly used to protect distal ends. Due to insufficient storage space, one ureteroscope was stored horizontally on the cabinet bottom. There was no protocol for cleaning storage cabinets, which were visibly dirty. Following HLD, the AER at Site C performed an alcohol flush and air purge. After removal from the AER, technicians wiped external surfaces with reused towels and used an air pistol for 15-20 seconds to evacuate fluid from channels while holding the endoscope with dirty-gloved hands. The air pistol did not have a pressure regulator, and pressures were high enough to result in spray being ejected in a visible plume from the distal end during brief bursts of forced air. Endoscopes were stored vertically in closed cabinets with ventilation grills. Cabinet fans were present but unplugged or disabled. Although a weekly cleaning protocol was described, blue lint was observed on cabinet floors, and technicians could not recall the last time they cleaned the cabinets. ATP tests were completed for at least 1 cabinet at each site. ATP levels in storage cabinets at all three sites indicated residual contamination (maximum levels on cabinet door handles, interior walls, and floors at A: 898, 247, 44; B: 53, 900, 85; C: 161, 286, 4219 RLU).”

The researchers add, “Although it is tempting to conclude that retained moisture was responsible for fostering microbial growth, researchers identified several variables that could have affected reprocessing outcomes. At Sites A and C, researchers observed numerous quality breaches that were unexpected given their Joint Commission accreditation and affiliations with large healthcare systems. In addition to violating several reprocessing standards, Sites A and C intentionally disabled AERs' automated cleaning cycles because of pressure to achieve faster turnaround times. Omitting this cleaning step presumably reduced the effectiveness of HLD. Given these breaches and contamination found, both sites followed researchers' recommendations to convene multidisciplinary teams to assess risk, determine whether patient notification was warranted, and address quality issues.”

During endoscopic procedures at all sites, clinicians used silicone-containing products as lubricants and de-foaming agents (e.g., infant gas relief drops with simethicone, cooking oil sprays, and silicone sprays), which are not water soluble. Endoscope manufacturers state that these products may interfere with reprocessing effectiveness. Researchers have found that simethicone is not removed during reprocessing. The Canadian Association of Gastroenterology states that simethicone products are universally used in endoscopy; however, their use should be minimized because simethicone residues may contribute to biofilm formation and microbial growth.

After observing sticky residue on surfaces and debris adhering to channel linings, researchers learned that some clinicians administer tissue glue during procedures. Reprocessing personnel reported difficulty removing this glue from endoscopes.

 

References and Recommended Reading:

Alfa MJ, et al.  Comparison of clinically relevant benchmarks and channel sampling methods used to assess manual cleaning compliance for flexible gastrointestinal endoscopes. Am J Infect Control, 42. Pp. e1-e5. 2014.

Alfa MJ, et al. Validation of adenosine triphosphate to audit manual cleaning of flexible endoscope channels. Am J Infect Control, 41. Pp. 245-248. 2013.

Alfa MJ, et al. Establishing a clinically relevant bioburden benchmark: a quality indicator for adequate reprocessing and storage of flexible gastrointestinal endoscopes. Am J Infect Control, 40. Pp. 233-236. 2012.

ANSI/AAMI ST91:2015, Flexible and semi-rigid endoscope processing in healthcare facilities. (2015), pp. 1-70.

AORN. Guideline for processing flexible endoscopes. Sterilization and disinfection. (2016), pp. 675-758.

Association for the Advancement of Medical Instrumentation (AAMI). A compendium of processes, materials, test methods, and acceptance criteria for cleaning reusable medical devices. 2011.

AAMI and FDA. 2011 Summit on Reprocessing: Priority Issues from the AAMI/FDA Medical Device Reprocessing Summit.

Coton T, et al. New flexible endoscopes: Surprising bacterial colonization post-disinfection. Clin Res Hepatol Gastroenterol, 41. Pp. e63-e64. 2017.

FDA. Infections associated with reprocessed flexible bronchoscopes: FDA safety communication. Sept. 17, 2015.

Fushimi R, et al. Comparison of adenosine triphosphate, microbiological load, and residual protein as indicators for assessing the cleanliness of flexible gastrointestinal endoscopes. Am J Infect Control, 41. Pp. 161-164. 2013.

Grein JD and Murthy RK. New Developments in the Prevention of Gastrointestinal Scope-Related Infections. Infect Dis Clin North America. Vol. 32, No. 4. Pp. 899-913. December 2018.

Kovaleva J. Endoscope drying and its pitfalls. J Hosp Infect, 97. Pp. 319-328. 2017.

Lichtenstein D and Alfa MJ. Cleaning and Disinfecting Gastrointestinal Endoscopy Equipment. In: Clinical Gastrointestinal Endoscopy (Third Edition). 2019

Neves MS, et al. Effectiveness of current disinfection procedures against biofilm on contaminated GI endoscopes. Gastrointest Endosc, 83. Pp. 944-953. 2016.

Ofstead C. Webinar: Conducting visual examinations of flexible endoscopes: A focus on channels and ports. 2019.

Ofstead CL, Heymann OL, et al. Residual moisture and waterborne pathogens inside flexible endoscopes: Evidence from a multisite study of endoscope drying effectiveness. Am J Infect Control. Vol. 46, No. 6. Pp. 689-696. June 2018.

Ofstead CL, Wetzler HP, Heymann OL, Johnson EA, Eiland JE and Shaw MJ. Longitudinal assessment of reprocessing effectiveness for colonoscopes and gastroscopes: Results of visual inspections, biochemical markers, and microbial cultures. Am J Infect Control. Vol. 45, No. 2. Pp. e26-e33. February 2017.

Ofstead CL, Wetzler HP, et al. Assessing residual contamination and damage inside flexible endoscopes over time. Am J Infect Control. Vol. 44, No. 12. Pp. 1675-1677. 2016.

Ofstead CL, Wetzler HP, et al. Simethicone residue remains inside gastrointestinal endoscopes despite reprocessing. Am J Infect Control, 44. Pp. 1237-1240. 2016.

Ofstead CL, Doyle EM, et al. Practical toolkit for monitoring endoscope reprocessing effectiveness: Identification of viable bacteria on gastroscopes, colonoscopes, and bronchoscopes. Am J Infect Control, 44. Pp. 815-819. 2016.

Ofstead CL, Wetzler HP, Doyle EM, Rocco CK, Visrodia KH and Baron TH, et al. Persistent contamination on colonoscopes and gastroscopes detected by biologic cultures and rapid indicators despite reprocessing performed in accordance with guidelines. Am J Infect Control, 43. Pp. 794-801. 2015.

Ofstead C, Tosh P, et al. Persistence of Organic Residue and Viable Microbes on Gastrointestinal Endoscopes Despite Reprocessing in Accordance with Guidelines. APIC 41st annual meeting, Anaheim, Calif. June 7-9, 2014.

Ofstead CL and Dirlam Lang AM. Re-evaluating endoscopy-associated infection risk estimates and their implications. Am J Infect Control. 41. Pp. 734-6. 2013.

Rex DK, et al. A double-reprocessing high-level disinfection protocol does not eliminate positive cultures from the elevators of duodenoscopes. Endoscopy. Dec 13, 2017.

Ribeiro MM, et al. Effectiveness of flexible gastrointestinal endoscope reprocessing. Infect Control Hosp Epidemiol, 34. Pp. 309-312. 2013.

Rubin ZA and Murthy RK. Outbreaks associated with duodenoscopes: new challenges and controversies. Curr Opin Infect Dis, 29. Pp. 407-414. 2016.

Rutala WA and Weber DJ. Gastrointestinal endoscopes: a need to shift from disinfection to sterilization? JAMA, 312. Pp. 1405-1406. 2014.

Saliou P, et al. Measures to improve microbial quality surveillance of gastrointestinal endoscopes. Endoscopy, 48. Pp. 704. 2016.

SGNA. Standards of infection prevention in reprocessing flexible gastrointestinal endoscopes. (2015), pp. 1-31.

Snyder GM, Wright SB, et al. Randomized comparison of 3 high-level disinfection and sterilization procedures for duodenoscopes. Gastroenterology, 153. Pp. 1018-1025. 2017.

Visrodia K, et al. Duodenoscope reprocessing surveillance with adenosine triphosphate testing and terminal cultures: A clinical pilot study. Gastrointest Endosc, 86. Pp. 180-186. 2017.

Defining “Clean” in the Healthcare Environment: A Microbial Standard is a Moving Target

By Kelly M. Pyrek

This is the first article in a series that examines how we define the concept of “clean.” This feature originally appeared in the October 2019 issue of Healthcare Hygiene magazine.

In its time, the 2003 Guidelines for Environmental Infection Control in Health-Care Facilities was a good starting point for healthcare institutions to establish and implement their cleaning and disinfection protocols and policies.

Fast-forward to late 2019, and the guidelines seem quaint and almost antiquated, as the environmental hygiene research agenda is being pushed and stretched to its limits, and as investigators pursue key scientific inquiries relating to surface cleaning and its impact on patient-centered outcomes and healthcare-acquired infection (HAI) rates.

Carling (2013) summarized much of the current thought on the role of the environment in HAI prevention: “Over the past decade, multiple studies have shown that approximately 30 percent to 60 percent of surfaces in the patient zone of individuals colonized or infected with C. difficile, VRE, or MRSA are contaminated with these organisms. Although less widely studied, several reports have confirmed similar rates of contamination with A. baumannii in colonized or infected patient rooms. Furthermore, several studies have shown significant environmental contamination with C. difficile, MRSA, and VRE in rooms of patients not in isolation for these HAPs [healthcare-associated pathogens], raising the possibility that such contamination is related to prior room occupants and ineffective disinfection cleaning practices.”

He continues, “Indeed, multiple studies have now confirmed that there is an approximately 120 percent increased risk of a susceptible patient becoming colonized or infected with a wide range of HAPs if the individual previously occupying that room was colonized with that organism. Although not being able to define causality with respect to transmission because of limitations in study design, extensive covert studies have uniformly confirmed that opportunities for improving environmental cleaning can be identified in many healthcare settings. These studies took place in a wide range of healthcare settings in which a systematic evaluation of environmental cleaning was performed using the same fluorescent marking system. As a result of the above findings in acute-care hospitals, as well as pioneering studies in research hospital settings, a multisite project using identical process improvement interventions based on objective performance feedback of cleaning thoroughness using a fluorescent marker “test soil” system was performed. Highly significant improvement in terminal room disinfection cleaning was confirmed in two large independent groups of hospitals.  Several reports have now shown that improved environmental cleaning decreases HAP contamination of surfaces. In four comparable clinical studies objectively evaluating thoroughness of environmental cleaning over many months, contamination of patient zone surfaces decreased an average of 64 percent as a result of an average 80 percent improvement in thoroughness of disinfection cleaning.”

Carling (2013) adds, “Although the complexity and cost of studies to evaluate the impact of decreased patient zone HAP contamination on acquisition has limited such undertakings, two landmark studies found similar statistically significant results. The 2006 study by Hayden et al., which confirmed a 66 percent reduction in VRE acquisition as a result of a 75 percent improvement in thoroughness of environmental cleaning, as well as the more recent study by Datta et al., which found a 50 percent reduction in MRSA acquisition and a 28 percent reduction in VRE acquisition as a result of an 80 percent improvement in environmental cleaning, clearly show that direct patient safety benefits can be realized by improving the thoroughness of patient zone disinfection cleaning.”

The word “clean” is used quite frequently, but do we truly understand what that means? How does that translate for the end user? A globally or even nationally accepted standard as a definition of “clean” in the healthcare environment has been elusive.

In 2004, UK microbiologist Stephanie J. Dancer proposed a microbial standard for what is considered to be “clean” in healthcare settings: 2.5 CFU per square centimeter as an aerobic colony count. In the subsequent years, the use of ATP testing became more common and a value of 100 RLU was promulgated, depending on the manufacturer – some manufacturers recommended a value of <250 RLU. But these values weren’t a recognized standard, and variability persisted.

For example, in a review of the literature pertaining to background and findings on standards and benchmarks for the cleaning of high-touch surfaces, Campbell, et al. (2014) observed, “Visual inspection, the most common, if not only, evaluation used in the facility industry, was found to be wholly unreliable in the measure of surface contamination. ATP and ACC were recommended as effective measures with benchmarks of <100 RLU and <2.5 CFU/cm2, respectively. This review found that in the healthcare industry as well, cleaning effectiveness has largely gone unmeasured and is maintained by subjective evaluations. Researchers also found that in every study completed to measure cleaning effectiveness using these methods with various benchmarks, current cleaning practices left surfaces organically and microbiologically contaminated.”
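
A simple sketch of how the two benchmarks cited by Campbell, et al. (2014) could be applied to audit data appears below. The thresholds are the ones quoted in the review rather than a universally accepted standard, and the surface names and readings are invented for illustration.

```python
# Illustrative audit check against the benchmarks cited by Campbell, et al. (2014):
# ATP < 100 RLU and aerobic colony count (ACC) < 2.5 CFU/cm2.
# The surfaces and readings below are invented for the example.

ATP_LIMIT_RLU = 100
ACC_LIMIT_CFU_PER_CM2 = 2.5

surfaces = [
    # (surface, ATP in RLU, ACC in CFU/cm2)
    ("bed rail", 85, 1.9),
    ("overbed table", 240, 4.2),
    ("IV pump keypad", 60, 2.7),
]

for name, atp, acc in surfaces:
    atp_ok = atp < ATP_LIMIT_RLU
    acc_ok = acc < ACC_LIMIT_CFU_PER_CM2
    status = "meets both benchmarks" if atp_ok and acc_ok else "fails - re-clean and re-sample"
    print(f"{name}: ATP={atp} RLU, ACC={acc} CFU/cm2 -> {status}")
```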

“’Clean’ is usually defined as the removal of dirt or unwanted matter,” says Charles P. Gerba, PhD, professor of microbiology and environmental sciences at the University of Arizona. “I prefer ‘hygienic’ since the goal is to reduce the transmission of infectious microorganisms. I believe this can be done with the application of quantitative microbial risk assessment. I think the healthcare area is in need of standards; they are used to control the spread of infectious microorganisms in water and food – and it is time we apply these same concepts to disinfection and cleaning – otherwise we have no scientific way to judge effectiveness of different interventions.”

Gerba says he is in favor of using the CFU standard “because you need a certain level of bacteria to have a high probability of infection.” He adds, “ATP may also be useful to judge or compare interventions – but both have limits.”

Gerba says a definitive microbial standard for surface cleanliness is feasible, explaining, “Application of quantitative microbial risk assessment with knowledge of the number of pathogenic organisms on a surface and then modeling the transmission can give number(s) for guidance.”

He points to a study on which he was a co-author a few years ago, which concluded that a reduction in bacterial numbers on a fomite by 99 percent (2 logs) most often will reduce the risk of infection from a single contact to less than 1 in 1 million. This quantitative microbial risk assessment (QMRA) by Ryan, et al. (2014) included a problem formulation for fomites and hazard identification for seven microorganisms, including pathogenic Escherichia coli and E. coli O157:H7, Listeria monocytogenes, norovirus, Pseudomonas spp., Salmonella spp., and Staphylococcus aureus. The goal was to address a risk-based process for choosing the log₁₀ reduction recommendations, in contrast to the current Environmental Protection Agency (EPA) requirements.

For each microbe evaluated, the QMRA model by Ryan, et al. (2014) included specific dose-response models, occurrence determination of aerobic bacteria and specific organisms on fomites, exposure assessment, risk characterization, and risk reduction. Risk estimates were determined for a simple scenario using a single touch of a contaminated surface and self-inoculation. A comparative analysis of log10 reductions, as suggested by the EPA, and the risks based on this QMRA approach was also undertaken.

The researchers found that aerobic bacteria were the most commonly studied on fomites, averaging 100 CFU/cm2. Pseudomonas aeruginosa was found at a level of 3.3 × 10⁻¹ CFU/cm2; methicillin-resistant S. aureus (MRSA), at 6.4 × 10⁻¹ CFU/cm2. Risk estimates per contact event ranged from a high of 10⁻³ for norovirus to a low of 10⁻⁹ for S. aureus.
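
To show the kind of arithmetic behind a QMRA like Ryan, et al.'s, the sketch below chains a surface concentration, a hand-to-surface transfer, and an exponential dose-response model into a single-contact risk, then applies a 2-log (99 percent) surface reduction. Only the MRSA surface concentration comes from the figures quoted above; the transfer parameters and the dose-response coefficient are placeholders chosen for illustration, not values from the study.

```python
# Minimal single-contact QMRA sketch in the spirit of Ryan, et al. (2014).
# The transfer efficiency, contact area, self-inoculation fraction, and the
# dose-response parameter k are hypothetical placeholders for illustration only.

import math

surface_conc = 6.4e-1      # CFU/cm2 on the fomite (MRSA figure quoted above)
transfer_efficiency = 0.3  # fraction transferred fomite -> fingertip (assumed)
contact_area_cm2 = 2.0     # fingertip contact area (assumed)
inoculated_fraction = 0.35 # fraction reaching a susceptible site via self-inoculation (assumed)
k = 1e-4                   # exponential dose-response parameter (assumed)

def single_contact_risk(conc_cfu_per_cm2: float) -> float:
    dose = conc_cfu_per_cm2 * contact_area_cm2 * transfer_efficiency * inoculated_fraction
    return 1 - math.exp(-k * dose)  # exponential dose-response model

baseline = single_contact_risk(surface_conc)
after_2log = single_contact_risk(surface_conc / 100)  # 99% (2-log) surface reduction

print(f"baseline single-contact risk: {baseline:.2e}")
print(f"after a 2-log surface reduction: {after_2log:.2e}")
```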

“A standard based on quantitative microbial risk assessment makes sense,” Gerba confirms. “It is used in both the water and food industries to set minimum treatment requirements. For example, treatment of drinking water to prevent transmission of infectious waterborne organisms is set for a one-day/one-time event risk of 1:1,000,000.”

The History of the Quest for a Standard

As we have seen, Dancer (2004) was among the first to question the concept of “clean,” noting that “…the importance of a clean environment is likely to remain speculative unless it becomes an evidence-based science.”

She had called for microbiological standards with which to assess clinical surface hygiene in hospitals, based on those used by the food industry. She had mused that a standard would require identifying a specific 'indicator' organism, the presence of which would suggest a requirement for increased cleaning. For example, indicators would include Staphylococcus aureus, including methicillin-resistant S. aureus, Clostridium difficile, vancomycin-resistant enterococci and various Gram-negative bacilli.

The standard would also indicate a quantitative aerobic colony count of <5 CFU/cm2 on frequent hand-touch surfaces in hospitals, explaining that, “The principle relates to modern risk management systems such as HACCP, and reflects the fact that pathogens of concern are widespread. Further work is required to evaluate and refine these standards and define the infection risk from the hospital environment.”

Before the role of the environment had gained great acceptance, Dancer (2004) had asserted, “Hospital patients can acquire organisms from many sources, including the environment, but the extent to which the latter contributes toward HAI is largely unknown. This is because cleaning has never been regarded, let alone investigated, as an evidence-based science. The difficulties in measuring cleaning efficacy are compounded by the lack of standardized methodologies and are rarely quantitative. Environmental screening usually takes place on an ad hoc basis after an outbreak, but it is patently impossible to screen the entire surface of a ward and finding the outbreak strain is not guaranteed. Furthermore, organisms still must be transmitted to patients. As this is thought to occur via staff hands, strategies for controlling HAI are more likely to favor improvements in hand hygiene than comprehensive screening programs. Cost-benefit and lack of standardized methodologies might also explain the perceived reluctance of private cleaning companies to participate in screening. Certainly, most microbiologists would be cautious about taking environmental samples from hospital wards on a routine basis.”

Dancer (2004) had pointed out the business case for cleaning before the reimbursement landscape began to change significantly, and was ahead of her time in leveraging cleaning as a risk management strategy: “As cleaning could be a cost-effective method of controlling HAI, it should be investigated as a scientific process with measurable outcome.”

To achieve this, Dancer (2004) said, it would be necessary to adopt an integrated and risk-based approach that would encompass preliminary visual assessment, rapid sensitive tests for organic deposits and specific microbiological investigations. “Such an approach has already been established by the food industry to manage cleaning practices in a cost-effective manner…”

Dancer (2004) had proposed possible bacteriological standards for assessing surface hygiene, based on standards applied in the food industry but modified to reflect the differences between risk management in food preparation and the risk for acquiring infection in hospital. As we have seen, two features of the standards were the identification of an indicator organism posing a potentially high risk to patients in any amount, and the quantitative assessment of organisms found within a specified area, regardless of identity.

Dancer (2004) had suggested that there should be <1 CFU/cm2 of the indicator organism(s) present in the clinical environment and noted that the identification of an indicator organism should generate immediate cleaning and disinfection practices. Repeat sampling would be mandatory, and risk assessment would determine a hygiene review, additional cleaning, or even the closure of a clinical area for deep cleaning if appropriate.

Dancer (2004) had proposed that the internationally recognized figure of <5 CFU/cm2 could be used as a starting point in working toward a standard of clean in the healthcare environment: “The finding of ≥5 CFU/cm2 from a hand contact surface, whatever the identity of the organisms, indicates that there might be an increased risk of infection for the patient in that environment. This should generate an evaluation of the cleaning/disinfection practices and frequencies for that surface. This is based on three suppositions: first, an increased microbial burden suggests that there has been insufficient cleaning. This would increase the chances of finding a pathogen. Second, a heavy microbial burden may mask the finding of a pathogen. Third, a heavy concentration of certain organisms implies an increased chance of finding an epidemiologically related pathogen.”

As Dancer (2004) had observed, “We need to be able to judge cleanliness by the same standards, even if this is done by empirically grading set situations. There are already internationally agreed microbiological standards for air, water and food preparation surfaces, so why not for surfaces in hospitals? … Widespread adoption of standards would allow risk assessment and evaluation of infection risks to patients (and staff) in hospitals. The ability to compare results between different clinical units and different hospitals would contribute toward further evaluation. Infection control and domestic personnel could justify their actions regarding routine and incident measures. Cleaning efficacy could be subjected to internal audit, with feedback to managers and the infection control committee for regular review. These standards would allow national and local audits on hygiene to be conducted on a scientific basis, rather than the ill-defined and almost certainly subjective criteria used to date. Visual assessment of hygiene has been shown to be a poor indicator of cleaning efficacy.”

At the time of her seminal study, Dancer had indicated that much more research was needed around all available microbiological methods, the role of rapid methods such as bioluminescence, clinical surface definitions, sampling indications and frequencies, and responsibilities and cost. She also recommended that researchers “attempt to equate the environmental findings with the probability of acquiring a hospital infection,” which has been the Holy Grail in all aspects of infection prevention-related interventions for decades.

For example, in 2004, Dettenkofer, et al. performed a systematic review of the impact of environmental surface disinfection interventions on occurrence of HAIs. The authors concluded that the quality of the studies existing at that time was poor, and none provided convincing evidence that disinfection of surfaces reduced infections.

Experts continued to disagree about the validity of current benchmarks for defining “clean” surfaces and to debate their merit as meaningful surrogate measures for HAI transmission.

Four years after Dancer’s paper was published, Al-Hamad and Maxwell (2008) asked “how clean is clean?” and confirmed that, “Although microbiological standards have been proposed for surface hygiene in hospitals, standard methods for environmental sampling have not been discussed.” Their study sought to assess the effectiveness of cleaning/disinfection in critical care units using the wipe-rinse method to detect an indicator organism and dip slides to quantitatively determine the microbial load.

The researchers microbiologically surveyed frequent-hand-touch surfaces from clinical and non-clinical areas, targeting methicillin-susceptible (MSSA) and methicillin-resistant Staphylococcus aureus (MRSA). A subset of the surfaces targeted was sampled quantitatively to determine the total aerobic count. MRSA was isolated from 9 (6.9 percent) and MSSA was isolated from 15 (11.5 percent) of the 130 samples collected. Seven of 81 (8.6 percent) samples collected from non-clinical areas grew MRSA, compared with two (4.1 percent) from 49 samples collected from clinical areas. Of 116 sites screened for the total aerobic count, 9 (7.7 percent) showed >5 CFU/cm2 microbial growth. Bed frames, telephones and computer keyboards were among the surfaces that yielded a high total viable count. The researchers suggested that combining both standards would give a more effective method of assessing the efficacy of cleaning/disinfection strategy.

Nante, et al. (2017) remind us that “…methods to assess hospital environments cleaning can be considered an integral part of infection prevention and control programs. Among these, the most known and used are visual inspection, microbial methods, fluorescent markers and adenosine triphosphate (ATP) bioluminescence. The latter measures the presence of ATP on surfaces. The ATP bioluminescence consists in a swab, used to sample a standardized area, which, subsequently, is placed in a tool that uses the firefly enzyme ‘luciferase’ to catalyze the conversion of ATP in adenosine monophosphate (AMP): this reaction results into an emission of light which is detected by the bioluminometer and quantified in relative light units (RLUs). The presence of ATP on surfaces, obviously, is a proxy of organic matter and, consequently, of microbial contamination. This method has been used in food industries for over 30 years. Its use in the healthcare environment is growing, but it is still controversial, in that different tools consider different threshold values, and, therefore, this technique seems not to be standardized.”

Around the same time as the Al-Hamad and Maxwell (2008) paper, Lewis, et al. (2008) acknowledged that, “Calls have been made for a more objective approach to assessing surface cleanliness. To improve the management of hospital cleaning the use of ATP in combination with microbiological analysis has been proposed, with a general ATP benchmark value of 500 RLU for one combination of test and equipment.”

In their study, Lewis, et al. (2008) used this same test combination to assess cleaning effectiveness in a 1,300-bed teaching hospital after routine and modified cleaning protocols. Based upon the ATP results, a revised, stricter pass/fail benchmark of 250 RLU was proposed for the range of surfaces used in the study. This benchmark was routinely achieved using modified best-practice cleaning procedures, which also gave reduced surface counts: for example, aerobic colony counts fell from >100 to <2.5 CFU/cm2, and counts of Staphylococcus aureus fell from up to 2.5 to <1 CFU/cm2 (95 percent of the time). The researchers say that benchmarking is linked to incremental quality improvements, that both the original suggestion of 500 RLU and the revised figure of 250 RLU can be used by hospitals as part of this process, and that they can also be used in the assessment of novel cleaning methods.
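As a simple illustration of how such a pass/fail benchmark is applied in practice, the sketch below classifies a set of ATP readings against the 500 RLU and 250 RLU cutoffs discussed above and reports the resulting failure rate. The readings themselves are invented for illustration; only the benchmark values come from the text.

```python
from typing import Iterable

def failure_rate(rlu_readings: Iterable[float], benchmark: float) -> float:
    """Fraction of sampled surfaces whose ATP reading exceeds the benchmark."""
    readings = list(rlu_readings)
    failures = sum(1 for rlu in readings if rlu > benchmark)
    return failures / len(readings)

if __name__ == "__main__":
    # Hypothetical post-cleaning ATP readings (RLU) from ten surfaces.
    readings = [120, 480, 95, 310, 205, 600, 150, 275, 90, 130]
    for benchmark in (500, 250):
        rate = failure_rate(readings, benchmark)
        print(f"Benchmark {benchmark} RLU: {rate:.0%} of surfaces fail")
```

Tightening the benchmark from 500 to 250 RLU naturally increases the failure rate for the same set of readings, which is why, as the studies that follow make clear, the choice of cutoff has direct operational consequences.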

The Methods of Monitoring “Clean”

Carling (2013) reminds us that, “Effective strategies are required to assess the effectiveness of environmental cleaning and disinfection in healthcare settings to reduce HAIs. One of the most basic ways to assess contamination following environmental cleaning and disinfection is visual inspection but concerns about the adequacy of visual inspection alone have necessitated the development of technology-based approaches, such as the use of ACCs, which are a culture-based method for assessing environmental contamination; other methods include the use of invisible fluorescent markers placed on high-touch room surfaces before cleaning, with UV light inspection following cleaning. Bioluminescence-based ATP assays have been developed as another alternative that offers direct, rapid feedback and provides a quantitative measure of cleanliness; however, the detected presence of ATP does not necessarily indicate viable pathogens on the tested surface. As genomic and polymerase chain reaction (PCR)-based technologies become less expensive and more widespread, these may also have a role in assessing environmental contamination and effectiveness of disinfection.”

As we have seen, the challenge, Carling (2013) acknowledges, is the need to identify standardized criteria for determining that surfaces are “clean” using these monitoring modalities: “At the heart of the issue is that while routine cleaning and disinfection strategies may not result in a completely sterile environment, consensus is needed on the threshold of contamination below which pathogen transmission is minimized and can be considered safe. Studies have emphasized [that] the importance of monitoring the operational processes associated with cleaning and disinfection practices, and properly training and managing environmental services personnel tasked with these duties, are additional elements necessary for preventing transmission of HAIs. Strategies for assessing compliance may include use of checklists, direct observation (open or covert), and surveys of personnel and patients. Process evaluation and improvement should also consider important human factors and logistical concerns that interact with environmental cleaning procedures, including workflow, staffing, staff training and supervision, collaboration between support services and clinical staff, institutional leadership, and patient preferences.”


Casini, et al. (2018) confirm that “Although not validated, microbiologic standards for a safer hospital environment have been proposed as three colony-forming units (CFU)/cm2 on surfaces; this value is related to an ATP value of 100 RLU/100cm2. Maintaining counts below these thresholds may assist in reducing HAIs.”

The researchers add, “Several methods have been used to assess environmental cleanliness; one such method is the ACC assay, which reveals the amounts of cultivable bacteria present on surfaces. The original quantitative ACC-based standard for defining the surfaces in a ward environment as clean was less than 5 CFU/cm2, but this value has been reduced to 3 CFU/cm2. Currently, the non-culture ATP bioluminescence assay is extensively used to evaluate cleanliness because readings can be obtained on site. Because of its presence in living organisms, ATP was first used as an indicator of cleanliness in the food industry. Subsequently, ATP measurements have been employed to assess hospital cleanliness using different benchmark values expressed in RLUs. Quantitative results are available in less than 5 minutes with these assays; this makes it possible for infection prevention or housekeeping staff to monitor the adequacy of cleaned surfaces. The microbial evaluation of surfaces is useful for monitoring the effectiveness of cleaning and disinfection practices.”

The aim of the study by Casini, et al. (2018) was to evaluate cleaning procedure efficacy in reducing bacterial contamination. The researchers analyzed surface contamination using cultural methods and ATP detection, performed with a high-sensitivity luminometer. The values 100 CFU/cm2 and 40 RLU/cm2 were considered as the threshold values for medium-risk category areas, while 250 CFU/cm2 and 50 RLU/cm2 were defined for the low-risk category ones. The cleaning/disinfection procedure reduced the mean bacterial counts from 32±56 CFU/cm2 to 2±3 CFU/cm2 in the low-risk area and from 25±40 CFU/cm2 to 7±11 CFU/cm2 in the medium-risk one. Sample numbers exceeding the threshold values decreased from 3 percent and 13 percent to 1 percent and 5 percent, respectively. RLU values also showed a reduction in the samples above the thresholds from 76 percent to 13 percent in the low-risk area.

As Casini, et al. (2018) explain, “Our results cannot be considered as indicators of microbial contamination, considering also that ATP molecules present on surfaces may not have a microbial origin. Currently, ACCs of < 2.5–5 CFU per cm2 on hand-touch sites have been assigned as a microbiological limit. An additional international institution also uses microbiological standards incorporating the presence of indicator organisms. Their identification depends on the risk to human health and on the matrix inspected.”

The researchers point to the variability of ATP system benchmarks depending on the type of luminometer used. These range from 25 RLUs to 500 RLUs for 10–100 cm2 on hospital surfaces. They note: “A poor correlation between microbial contamination and RLU values has been demonstrated, where the values of the former are low. For this reason, the relationship between the two indices considered is not evident. Different values should be chosen, depending on patient risk; surfaces in outpatient clinics are not necessarily as critical for infection risk as sites beside a ventilated patient receiving intensive care. ATP can also be confounded by disinfectants, microfiber products, and manufactured plastics used in cleaning and laundering industries. If an ATP assessment was introduced into hospitals, it should help to monitor cleaning quality and its failures, even when there is not a serious risk for patients. In conclusion, since a visual assessment does not offer reliable information on infection risk to patients, high-risk (hand-touch) surfaces in hospitals should be subjected to a scientific screening method to monitor the overall levels of microbial dirt. If they were integrated into a formal monitoring regimen, ATP and/or microbiological benchmarks would help to identify unacceptable soil levels and associated patient risk provided they were systematically collected over time and interpreted accurately.”

In their review, Nante, et al. (2017) found that the country where the studies were conducted may have influenced the choice of the RLU cutoff values: “For example, in the U.S. the most often used value corresponded to 500 RLU …These differences among the benchmark values make difficult the comparisons between measurements carried out with different tools … Another limitation of this technique could be the residues of detergent or disinfectants on the surfaces, which may require rinsing of these surfaces before the use. Despite of these considerations, some advantages of this technique can be listed, such as the possibility to provide real-time results (within 20 seconds of sampling), its simplicity of use (which makes possible the adoption of the method not only by trained healthcare staff), and the quantitative results. The latter allows comparisons between pre- and post-cleaning or between different surfaces. In conclusion, the ATP bioluminescence could be considered a practical, useful method to assess hospital hygiene, performing better than visual inspection, if properly adopted, also being aware of its possible limits.”

The variability, despite the benefits of a monitoring method such as ATP, is a significant concern. Chai, et al. (2018) observe that, “The lack of environmental sampling standardization in healthcare hinders the ability to objectively assess and compare the quality of articles evaluating the efficacy of newer antimicrobial technologies. This variability needs to be addressed by regulatory agencies. The many variables in each of the four process steps (collection, transport, recovery and culture) can independently influence the quality of the sampling methods and inter-study comparisons are thus admittedly difficult. It is tempting to suggest a limited number of environmental sampling methods to facilitate standardization. Unfortunately, this is a challenge specifically because the selection of each method within the four process steps depends upon the surface, its size, shape, and location, and the results desired (qualitative versus quantitative).”

The researchers add, “At a minimum, a description of methodology should consider these elements: 1) moisture must be present at the time of sampling, 2) a neutralizing solution is necessary to arrest residual disinfectant action, 3) a physical dissociation method must be used to release organisms from the collection device prior to culturing, and 4) special consideration is required for the collection and culturing of spore-forming organisms.”

Rawlinson, et al. (2019) acknowledge that evidence on how best to sample these surfaces “is patchy and there is no guidance or legislation in place on how to do this.” Their review assessed current literature on surface sampling methodologies, including the devices used, processing methods, the environmental and biological factors that might influence results. The researchers emphasize that, “Although the numbers of cells or virions recovered from hospital surface environments were generally low, most surfaces sampled were microbiologically contaminated. Of the organisms detected, multi-drug resistant organisms and clinically significant pathogens were frequently isolated and could, therefore, present a risk to vulnerable patients. Great variation was found between methods and the available data was incomplete and incomparable.” They add, “In light of the changing awareness of the risk the surface environment poses, more hospitals are considering instigating routine monitoring of their environments, either to assess cleaning or as part of a continuous risk assessment.”

Rawlinson, et al. (2019) summarize that, “Simple CFU numbers per cm2 provided by total viable counts (TVCs) often do not reflect the true risk to the patient, as studies show that surfaces with the highest bioburden are not always the surfaces with the most multidrug-resistant organisms (MDROs) which are of greater clinical concern. TVC sampling is frequently undertaken in order to monitor cleaning, rather than as a risk assessment. Seventy-three studies sampling the hospital environment were reviewed with varying contamination of surfaces likely due to studies using different sampling methodologies, processing methods and targeting different organisms on different surfaces.”

The researchers conclude that, “Background environmental monitoring of the hospital surface environment is not enforced by law or legislation and hospitals are under no obligation to monitor surfaces. Hospitals that choose to sample may use in-house guidelines or guidelines from the food or pharmaceutical industry. There are no comprehensive guidelines available for hospital sampling and there is little evidence-based literature on efficacies of sampling methods under different conditions which exist in the real hospital environment.”

In their evaluation of three methods for monitoring hospital cleanliness (visual monitoring, ATP bioluminescence and microbiological screening of five clinical surfaces before and after detergent-based cleaning), Mulvey, et al. (2011) found that visual assessment did not reflect ATP values or environmental contamination with microbial flora, including S. aureus and MRSA. There was a relationship between microbial growth categories and the proportion of ATP values exceeding a chosen benchmark, but neither reliably predicted the presence of S. aureus or MRSA. An ATP benchmark value of 100 RLUs offered the closest correlation with microbial growth levels <2.5 CFU/cm2. The researchers add, “The original quantitative standard stated that ACC on hand-touch sites should not exceed 5 CFU/cm2 but this has since been reduced to 2.5 CFU/cm2. The qualitative standard states that any pathogen isolated should be <1 CFU/cm2 on surfaces. Well-cleaned surfaces with little organic material yield <250 RLU, whereas poorly cleaned surfaces can yield >1,000 RLU. These values are dependent upon the make and model of equipment used, since one RLU is not necessarily the same as that decreed by another type.”

Mulvey, et al. (2011) indicated from the studies they analyzed that light, moderate and heavy growth were classified as hygiene ‘failures,’ while no growth and scant growth were hygiene ‘passes.’ They explain, “Mean ATP values were examined against microbial growth to assess whether ATP levels could be used in place of microbial growth to predict hygiene levels. The sensitivity and specificity of ATP at all possible benchmarks corresponding to observed ATP values were calculated against ‘gold standard’ microbial growth categories and used to construct a receiver operating characteristic (ROC) curve. Both pre- and post-cleaning measurements were included in this analysis. Routine cleaning appears to have had some effect on ATP values since they decreased by 32.4 percent for most sites … Microbial counts also decreased after cleaning [48 percent (43/90) of sites], although a similar proportion were similar after cleaning as they were before cleaning [42 percent (38/90) of sites]. Just nine of 90 sites demonstrated higher microbial growth after cleaning.”

As Mulvey, et al. (2011) observe, “As the number of microbial colonies changes at a specific environmental site, so does the RLU value, but the variability of RLU in a short-term study made it difficult to choose an ATP benchmark designed to identify unacceptable levels of soil. The lower the benchmark is set, the more sites will fail; conversely, the higher the benchmark is set, fewer sites will fail. We found that the benchmark showing the closest proportionate failure rates to a study using a 250 RLU benchmark was 100 RLU. This value corresponded with an ACC of <2.5 CFU/cm2, which has previously been used as a surface hygiene benchmark. As ATP systems become more sophisticated, these benchmarks will continue to require revision downwards.”
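The benchmark-selection logic Mulvey, et al. describe can be illustrated with a short sketch that computes sensitivity and specificity for candidate ATP cutoffs against a microbial-growth “gold standard” (here, an ACC at or above a chosen level counts as a hygiene failure). The paired readings below are invented; only the general approach of sweeping candidate benchmarks, as in a ROC analysis, mirrors the study.

```python
def sensitivity_specificity(samples, rlu_benchmark, acc_fail_cfu_per_cm2=2.5):
    """Treat ACC >= threshold as a true hygiene failure and an ATP reading
    above the benchmark as a predicted failure; return (sensitivity, specificity)."""
    tp = fn = tn = fp = 0
    for rlu, acc in samples:
        truly_fails = acc >= acc_fail_cfu_per_cm2
        predicted_fail = rlu > rlu_benchmark
        if truly_fails and predicted_fail:
            tp += 1
        elif truly_fails:
            fn += 1
        elif predicted_fail:
            fp += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

if __name__ == "__main__":
    # Hypothetical paired (RLU, CFU/cm2) measurements from the same sites.
    samples = [(80, 0.5), (150, 3.0), (400, 6.0), (90, 2.6),
               (700, 1.0), (120, 0.8), (260, 4.2), (60, 0.2)]
    for benchmark in (100, 250, 500):
        sens, spec = sensitivity_specificity(samples, benchmark)
        print(f"{benchmark} RLU -> sensitivity {sens:.2f}, specificity {spec:.2f}")
```

Plotting sensitivity against 1-specificity across every candidate cutoff yields the ROC curve the researchers describe; the trade-off between missed failures and false alarms is what makes settling on a single benchmark so contentious.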

They caution, “The range and diversity of the ATP results must be carefully considered. Despite monitoring in triplicate, occasional inflated values, for no apparent (visible) reason, skewed the overall results. It is already known that organic soil contains both microbial and human DNA, as well as food debris and liquids. ATP can also be confounded by disinfectants (bleach), microfiber products and manufactured plastics used in cleaning and laundering industries. If ATP assessment is introduced into hospitals, it should be on the understanding that there will be inevitable failures that do not necessarily indicate true infection risk for patients. Sensitivity and specificity of 57 percent mean that the margin for error is too high to justify stringent monitoring of the hospital environment at present. Further work is required to fully assess routine ATP monitoring in hospitals.”

Carling (2013) points to the concept of ‘cleanliness’ versus cleaning: “… it is important to consider the difference between assessments of cleanliness and programs to evaluate environmental cleaning practice. Although optimizing cleanliness of the patient zone surfaces represents the goal of disinfection-cleaning practice, practical and biologic limitations substantially preclude the isolated use of monitoring methods that utilize cleanliness monitoring as a surrogate for evaluating cleaning practice. A further limitation of evaluating cleanliness rather than evaluating cleaning practice relates to the fact that patient zone surfaces often have an intrinsically low bioburden prior to cleaning. Whereas it seems somewhat counterintuitive to optimize environmental cleaning of surfaces that often have low viable bioburdens, it is important to note that both transmissible and infectious doses of HAPs are very low.”

Carling (2013) adds, “Because approximately 60 percent of patient zone surfaces have low or no viable aerobic organisms on them, to use a monitoring tool that evaluates cleanliness, it is necessary to determine whether the object was clean before the monitored cleaning intervention took place. Although somewhat logistically cumbersome, cleaning practice can be evaluated by a system that measures cleanliness if objects with pre-existing very low bioburden or organic matter are eliminated from the evaluation process.”

Upsetting the Apple Cart of “Clean”

In their study applying the suggested ACC standards to monitor environmental contamination, UK researchers Cloutman-Green, et al. (2019) say the data demonstrate that a large proportion of sites screened for bacterial contamination would fail under the criteria suggested by other researchers, particularly those sites closest to patients, suggesting that a new standard might be required.

Cloutman-Green, et al. (2019) point to a lack of government guidance regarding acceptable numbers of microorganisms on hospital surfaces: “Griffith, et al. suggested a site should fail screening and be subject to investigation if it has an ACC > 2.5 CFU/cm2 on an agar contact plate (60 CFU/plate). This cutoff was based on food preparation standards and has been adopted by others. Dancer proposed the cutoff limit of 5 CFU/cm2 (120 CFU/plate) based on U.S. Department of Agriculture limits of bacteria on food-processing equipment, with failures leading to bed space closures and repeat cleaning. Dancer and others have since published articles using the lower cutoff limit of 2.5 CFU/cm2, referring to it as a ‘standard.’”
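The plate-count equivalences quoted above (2.5 CFU/cm2 ≈ 60 CFU per contact plate; 5 CFU/cm2 ≈ 120 CFU per plate) imply a contact-plate area of roughly 24 cm2. The sketch below is a minimal illustration under that assumption: it converts raw plate counts to CFU/cm2 and flags which of the two proposed cutoffs a site would fail. The site counts are hypothetical.

```python
# Contact-plate area implied by the 60 CFU / 2.5 CFU/cm2 equivalence above.
PLATE_AREA_CM2 = 24.0
CUTOFFS = {"Griffith (2.5 CFU/cm2)": 2.5, "Dancer (5 CFU/cm2)": 5.0}

def assess_site(plate_count_cfu: int) -> dict:
    """Convert a contact-plate colony count to CFU/cm2 and test each cutoff."""
    density = plate_count_cfu / PLATE_AREA_CM2
    return {name: density > limit for name, limit in CUTOFFS.items()}

if __name__ == "__main__":
    # Hypothetical colony counts from three sampled sites.
    for plate_count in (45, 75, 130):
        density = plate_count / PLATE_AREA_CM2
        verdicts = assess_site(plate_count)
        failed = [name for name, fails in verdicts.items() if fails]
        print(f"{plate_count} CFU/plate = {density:.1f} CFU/cm2; "
              f"fails: {', '.join(failed) if failed else 'none'}")
```

The same arithmetic underlies the screening failure rates reported in the study described next.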

The researchers’ study applied the ACC standards put forth by Griffith and Dancer for environmental monitoring to wards and outpatient settings at a UK hospital over 18 months to determine their suitability as part of routine infection control monitoring. A total of 1,986 samples were taken at the same time each day, a minimum of two hours after cleaning (performed with chlorine dioxide in the pediatric hospital and with water and microfiber towels in the adult units; other cleaning was undertaken by nursing staff using alcohol wipes). Samples were taken from fixed surface sites of differing heights, touch frequencies, and materials and included bed rails and areas expected to have high levels of contamination, such as the floor and bed wheels. ACC sampling was carried out, and CFUs were analyzed against the criteria proposed by Griffith and Dancer; sites failed screening if they had counts above these suggested limits.

Cloutman-Green, et al. (2019) report that when using the suggested standard of 2.5 CFU/cm2, 93 percent of bed spaces and 32 percent of sites sampled failed screening. Using the suggested standard of 5 CFU/cm2, more than half of bed spaces (66 percent) would have needed to be closed and recleaned and 15 percent of sites would have failed. Results were similar across ward types using the 120 CFU standard, with a range of 9 percent to 27 percent; however, a broader range of failures occurred with the 60 CFU standard (28 percent to 55 percent). Colony counts for bed spaces were similar for the 60 CFU standard with a range of 80 percent to 100 percent failing screening. Using the 120 CFU standard, the range was broader for bed spaces, with between 50 percent and 83 percent failing screening.

Cloutman-Green, et al. (2019) conclude that, “Our data demonstrate that in a dynamic hospital environment, a large proportion of sites screened for bacterial contamination would fail if using the criteria suggested by previous authors, particularly those sites closest to patients. This could lead to the closure of wards or bed spaces, increased costs, and decreased patient care. Surfaces are frequently contaminated even with routine cleaning and [there were] no reported cases of bacterial HAI on the pediatric wards during the monitoring period; therefore, the levels proposed by Griffith and Dancer are not practical to maintain. ACCs are clearly helpful, particularly in research settings where standardization is critical. However, for routine environmental monitoring, the proposed standards may lead to considerable disruption in the absence of a direct correlation with transmission of HAI. We suggest a move away from using ACC for determining microbiology standards in hospital environments and instead advocate monitoring for indicator organisms such as methicillin-resistant Staphylococcus aureus and carbapenem-resistant Enterobacteriaceae.”

What Hospitals Can Do in the Absence of a Definitive Standard

Without definitive standards for defining cleanliness, infection preventionists and environmental services directors must evaluate the literature as well as the current chemistries and technologies in the marketplace to determine a plan of action for their institutions.

Despite efforts over the past 15 years, it has been extremely challenging to pinpoint a system that definitively defines the clinical relevance of the healthcare surface. As Carling says, “There had been some hope for 2.5 CFU/cm2 as being a standard, yet there is no scientific evidence of the relevance of such a value. The problem is that bioburden on either cleaned or uncleaned surfaces is very low, and it has been difficult to develop a reproducible single value system with enough sensitivity and specificity to measure such a value accurately. Furthermore, reproducibly evaluating small and irregular objects adds to the challenge. The other point besides the fact that bioburden is low, is that for the pathogens we worry about, every one of them has a low infective dose. So just because a surface has a low level of bioburden, and you don’t find MRSA on that one little slide you just took off the surface doesn’t mean it’s not right next to it. The same goes for other pathogens such as norovirus and C. difficile.”

Only one thing is certain right now: the lively dialogue over defining a standard of cleanliness, and whether to use the 2.5 CFU/cm2 benchmark, continues. As noted in the commentary by Carling and Huang, “Improving Healthcare Environmental Cleaning and Disinfection: Current and Evolving Issues,” in Infection Control and Hospital Epidemiology (2013), “Since the realistic goal of environmental cleaning and disinfection of patient care areas is not to produce a continuously sterile surface environment but rather to effectively decrease pathogen transmission, multi-center studies evaluating both environmental contamination as well as acquisition also have the potential for identifying a threshold of environmental contamination below which transmission and therefore disease risk is minimized. Identifying such a threshold for key healthcare pathogens could then facilitate additional studies using such a threshold as an acceptable ‘gold standard’ for minimizing disease risk.”

As Carling, et al. (2014) emphasize, “Our findings shed further light on the challenge of defining when an apparently clean healthcare surface might reasonably be considered bacterially contaminated enough to provide evidence of poor cleaning practice or be defined as dirty. Several years ago, it was suggested that the industrial hygiene threshold for defining food preparation surfaces as clean (ACC of less than 2.5 CFU/cm2) could be used to evaluate the cleanliness of near-patient surfaces in healthcare and that surfaces containing heavier bacterial bioburdens be defined as cleaning failures. Although a plausible concept, logistical limitations as well as the fact that the standard has yet to be correlated with the relative risk of transmission of healthcare-associated pathogens have been noted by several authors.”

The Future

In the 15 years since UK microbiologist Stephanie J. Dancer first proposed a microbiological standard, she has continued to emphasize the importance of a quantitative, CFU-based standard. In a paper she co-authored last year, Dancer and her colleagues investigated whether any correlation existed between environmental contamination of air and surfaces in the ICU, and whether there was any association between environmental contamination and ICU-acquired staphylococcal infection.

In this study, Smith, Adams, et al. (2018) screened patients, air, and surfaces on 10 sampling days in a mechanically ventilated 10-bed ICU over a 10-month period. Near-patient hand-touch sites (N = 500) and air (N = 80) were screened for total colony count and Staphylococcus aureus. Air counts were compared with surface counts according to proposed standards for air and surface bioburden. Patients were monitored for ICU-acquired staphylococcal infection throughout.

The researchers found that overall, 235 of 500 (47 percent) surfaces failed the standard for aerobic counts (≤2.5 CFU/cm2). The researchers note that, “The surface standard most likely to reflect hygiene pass/fail results compared with air was 5 CFU/cm2. Rates of ICU-acquired staphylococcal infection were associated with surface counts per bed in 72 hours encompassing sampling days.”

“The standards proposed in this paper do work, but it’s for the UK,” Dancer explains. “We clean with detergent. I doubt if they equate to a hospital that douses every surface imaginable with powerful disinfectants.”

Still, Dancer says that despite knowing how quickly pathogens can return to surfaces, especially given poor hand hygiene efforts, a standard of “clean” could work. “I’m a glass-half-full person,” she says. “Of course, we can define ‘clean.’ You just have to understand the risk.”

References:

Al-Hamad A, Maxwell S. How clean is clean? Proposed methods for hospital cleaning assessment. J Hosp Infect. 2008;70(4):328-334.

Bartlett AH. How Clean Is Clean Enough -- And How Do We Get There? Medscape. Dec. 29, 2014.

Campbell J, Jones C, Hill BB. Cleaning: Finding a Microbiological Standard. Int J Facility Management. 2014;5(1).

Carling PC, Perkins J, Ferguson J, Thomasser A. Evaluating a New Paradigm for Comparing Surface Disinfection in Clinical Practice. Infect Control Hosp Epidemiol. 2014;35(11).

Carling PC. Methods for assessing the adequacy of practice and improving room disinfection. Am J Infect Control. 2013;41(5 Suppl):S20-S25.

Carling PC, Parry MF, et al. Identifying opportunities to enhance environmental cleaning in 23 acute care hospitals. Infect Control Hosp Epidemiol. 2008;29:1-7.

Casini B, Tuvo B, et al. Evaluation of the Cleaning Procedure Efficacy in Prevention of Nosocomial Infections in Healthcare Facilities Using Cultural Method Associated with High Sensitivity Luminometer for ATP Detection. Pathogens. 2018;7:71.

Chai J, Donnelly T, et al. Environmental sampling of hospital surfaces: Assessing methodological quality. Canadian J Infect Control. 2018;33(3):138-145.

Cloutman-Green E, et al. How clean is clean—is a new microbiology standard required? Am J Infect Control. 2014;42(9):1002-1003.

Dancer SJ. How do we assess hospital cleaning? A proposal for microbiological standards for surface hygiene in hospitals. J Hosp Infect. 2004;56(1):10-15.

Dettenkofer M, et al. Does disinfection of environmental surfaces influence nosocomial infection rates? A systematic review. Am J Infect Control. 2004;32:84-89.

Doll M, Stevens M, Bearman G. Environmental cleaning and disinfection of patient areas. Int J Infect Dis. 2018;67:52-57.

Donskey CJ. Does improving surface cleaning and disinfection reduce health care-associated infections? Am J Infect Control. 2013;41(5 Suppl):S12-S19.

Dumigan DG, Boyce JM, et al. Who is really caring for your environment of care? Developing standardized cleaning procedures and effective monitoring techniques. Am J Infect Control. 2010;38:387-392.

Guh A, Carling PC and the Environmental Evaluation Workgroup. Options for evaluating environmental cleaning. Atlanta: Centers for Disease Control and Prevention; 2010.

Hayden MK, et al. Reduction in acquisition of vancomycin-resistant enterococcus after enforcement of routine environmental cleaning measures. Clin Infect Dis. 2006;42:1552-1560.

Lewis T, Griffith C, Gallo M, Weinbren M. A modified ATP benchmark for evaluating the cleaning of some hospital environmental surfaces. J Hosp Infect. 2008;69(2):156-163.

Mulvey D, et al. Finding a benchmark for monitoring hospital cleanliness. J Hosp Infect. 2011;77(1):25-30.

Nante N, et al. Effectiveness of ATP bioluminescence to assess hospital cleaning: a review. J Prev Med Hyg. 2017;58(2):E177-E183.

Otter JA, Yezli S, French GL. The role played by contaminated surfaces in the transmission of nosocomial pathogens. Infect Control Hosp Epidemiol. 2011;32:687-699.

Rawlinson S, Ciric L, Cloutman-Green E. How to carry out microbiological sampling of healthcare environment surfaces? A review of current evidence. J Hosp Infect. 2019 Jul 29. pii: S0195-6701(19)30309-3.

Ryan MO, Haas CN, et al. Application of quantitative microbial risk assessment for selection of microbial reduction targets for hard surface disinfectants. Am J Infect Control. 2014;42(11):1165-1172.

Smith J, Adams CE, et al. Is there an association between airborne and surface microbes in the critical care environment? J Hosp Infect. 2018;100(3):e123-e129.

Weber DJ, Rutala WA, et al. Role of hospital surfaces in the transmission of emerging healthcare-associated pathogens: norovirus, Clostridium difficile, and Acinetobacter species. Am J Infect Control. 2010;38:S25-S33.

The following article is from the October 2019 issue of Healthcare Hygiene magazine.

Candida auris: A Stealth Enemy With Environmental Persistence

By Kelly M. Pyrek

Public health entities such as the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) have compiled lists of the problem pathogens that continue to represent some of the greatest challenges to infection prevention and control (IPC) efforts. Rising on that list is Candida auris, an emerging worldwide public health scourge that is growing in intensity as a healthcare-acquired organism. It is particularly fearsome because of its innate resistance to multiple anti-fungal drugs and its resilience in the face of traditional hygiene measures.

In contrast to other Candida species, C. auris is transmitted easily in the healthcare setting, and it is demonstrating the ability to persist both in the human host and on inanimate surfaces.

“Candida auris is an emerging fungus and one of the headaches with this species is it’s multidrug-resistant, meaning that it is resistant to multiple antifungal drugs commonly used to treat Candida infections,” confirms Rodney E. Rohde, PhD, MS, professor and chair of the Clinical Laboratory Science Program at Texas State University.

C. auris is also transcending traditional classification.

“Infection prevention experts tend to want to group pathogens into groups such as healthcare-acquired or community-acquired but when it comes to resistant organisms you really can't use those terms anymore,” Rohde says. “For instance, in some of the studies I have conducted with different organisms, what we used to think was only in the community, we see those strains or genotypes showing up in healthcare patients, and vice versa. So, even though microbiology and science tells us certain species are strictly found in the healthcare setting or strictly found in the community setting, in reality, it’s more of a global presence now due to so many vehicles and vectors that move in and out of healthcare settings.”

Rohde continues, “When we talk healthcare settings, we tend to only think hospitals, but we must think long-term care, outpatient and dialysis centers, clinics, and school and university health centers. I think it’s a communication problem; as scientists, we love to put things into pots and think we’ve described them adequately, and they are going to stay that way, but one of my favorite sayings is, organisms do not read the book of rules we’ve written for them. However, we teach it that way sometimes to try to make sense of it, but when you get into the real-world trenches, the truth is, we’re not able to do that. Organisms are crossing all boundaries.”

C. auris is insidious and still largely unknown to hospitals and healthcare systems, much like Clostridium difficile was in its early days. Experts tend to agree that C. auris has operated under the radar. Rhodes and Fisher (2019) say that “Since its discovery, C. auris has caused a ‘stealthy pandemic,’ emerging across the globe and is now recorded in all continents except Antarctica. However, C. auris is thought to have been misidentified as C. haemulonii on several occasions, suggesting that C. auris has likely been circulating as a human pathogen before 2009.”

Candida auris as a newly recognized cause of fungal infection is catching healthcare professionals unawares, despite cases being reported for nearly a decade now. Since the first official isolation of Candida auris in 2009, the scientific community has witnessed an exponential emergence of infection episodes and outbreaks in different world regions. According to the CDC, as of June 30, 2019 (the most recent data available), there were 725 confirmed cases, and an additional 1,474 patients had been found to be colonized with C. auris through targeted screening in 10 states with clinical cases.

“Candida, as a fungus, is an unusual and a difficult problem to deal with,” Rohde says. “It’s not typically something you would be looking for in a healthcare setting, and certainly not looking for it in the common population. But it is a factor when looking at the immunocompromised population in context of environmental and healthcare exposure; for example, could a patient have been working in a certain environment where fungi and yeast are present? But to have a fungus emerging and breeding in a healthcare setting is a novel challenge in our lifetime. So, I think we are trying to grasp what that means.”

He continues, “Microbiologists are very concerned about C. auris because you almost have to rethink everything from the laboratory perspective. If you are a clinician in a smaller facility or a rural hospital, you often must wait on sending specimens out to an off-site, more central clinical laboratory, and that eats up valuable diagnostic time. You can typically know pretty quickly if you have Staph or Klebsiella or any of these other multi-resistant organisms if you have access to rapid assays; but yeast and fungi are not typically, at least right now, on the radar for panel testing. Fortunately, it’s starting to get there, obviously, but if you don’t have a trained medical laboratory professional on staff who can look under a microscope or conduct some type of test to rule out yeast -- you could have some slippage with respect to turnaround time on a diagnosis, so there’s a significant challenge.”

Kenters, et al. (2019) confirm that C. auris is a budding yeast that forms white, pink, or purple colonies on CHROMagar and can be difficult to distinguish from C. glabrata: “Some strains form aggregates of cells while others do not. In contrast to most other Candida species, it grows well at higher temperature (40-42° C) … First attempts to identify C. auris using PCR directly from swabs, seem to produce frequent ‘false positive’ results -- positive in PCR, negative in culture swabs. The first report of three cases of nosocomial fungemia due to C. auris showed that this yeast is commonly misidentified as C. haemulonii and Rhodotorula glutinis using traditional phenotypic methods. These widely used routine identification methods for yeasts are based on phenotypic assimilation/fermentation tests using sets of carbon and nitrogen compounds. An investigation of 102 clinical isolates, previously identified as C. haemulonii or C. famata, showed that 88.2 percent of the isolates were in fact C. auris, when confirmed by ITS sequencing. Several studies have since reported that, in routine microbiology laboratories, C. auris remains a problematic, difficult to identify pathogen, because commercial biochemical identification systems lacked this yeast in their databases.”

“C. auris transmission might be happening in a healthcare facility where professionals are completely unaware; they haven’t been able to detect it, or they are just missing it due to false negatives, and it begs the issue of the importance of having access to clinical microbiology services,” Rohde says. “An expert in clinical microbiology and medical laboratory needs to be part of this equation -- and not just for Candida auris, but for all pathogens in the healthcare setting."

He continues, “People may argue with me on this, but physicians and others in healthcare who are responsible for patient outcomes, often do not have the kind of expertise, education and background to immediately identify problematic organisms; not to detract from their other medical diagnostic skills, but you cannot know intuitively which microbe is infecting a patient without a medical laboratorian conducting a test -- it’s just not going to happen. Unless of course you have those cases where a physician might have the training and the background to do so, but that’s rare. Unfortunately, healthcare has become so lean that many laboratories are being consolidated, with their full microbiology toolbox being moved to a central location. This makes sense from an economic perspective, but sometimes for clinicians, that’s not the best scenario.”

That becomes even more of an issue when considering the long list of risk factors for patients, including immunosuppressed state, significant medical comorbidities, central venous catheters, urinary catheters, recent surgery, parenteral nutrition, exposure to broad spectrum antimicrobials, intensive care unit admission, and residence in a high-acuity skilled nursing facility.

Sears and Schwartz (2017) point out that C. auris has been recovered in samples from blood, catheter tips, cerebrospinal fluid, bone, ear discharge, pancreatic fluid, pericardial fluid, peritoneal fluid, pleural fluid, respiratory secretions (including sputum and bronchoalveolar lavage), skin and soft tissue samples (both tissue and swab cultures), urine, and vaginal secretions. Clinically, it has been implicated as a causative agent in fungemia, ventriculitis, osteomyelitis, malignant otitis, complicated intra-abdominal infections, pericarditis, complicated pleural effusions, and vulvovaginitis. They add that, “Much like other Candida species, there is uncertainty about the ability of C. auris to cause true respiratory, urinary, and skin and soft tissue infections despite being isolated from such samples.”

Cortegiani, et al. (2019) observe that “It is likely that many cases are missed, due to its misidentification with other non-albicans Candida spp. (e.g., C. haemulonii) by common microbiological diagnostic methods. Most of the reports occurred in critically ill adults, with risk factors for invasive fungal infections, such as immunosuppression, surgery, or indwelling catheters. The most common form of infection was candidemia, with a crude mortality of nearly 30 percent, but up to 70 percent in some reports.”

Cortegiani, et al. (2019) emphasize the criticality of fighting C. auris in the intensive care unit (ICU): “Despite implementation of countermeasures to limit colonization and infections in ICUs, cases continue to be reported, with a tendency to an endemic pattern. This reflects the ability of C. auris to persist in the clinical environment, facilitating its transmission within the critical-care setting. A multidrug-resistant (MDR) pattern has been frequently observed (around 40 percent), with serious and complex consequences for antifungal therapy.”

The researchers note that due to the progressive spread of C. auris and treatment-related concerns, attention should be focused on the following major issues: worldwide transmission, antifungal treatment resistance, resilience and mechanisms of transmission, implementation of infection prevention and control measures, and surveillance.

Let’s review each issue:

Antifungal treatment resistance
Cortegiani, et al. (2019) note that, “To date, there are no established minimum inhibitory concentration (MIC) breakpoints for susceptibility testing of C. auris. Antifungal susceptibility data from three continents demonstrated that nearly 40 percent were MDR, with strains being resistant to fluconazole (90 percent), amphotericin B (30 percent to 40 percent) and echinocandins (5 percent to 10 percent). Moreover, a small percentage were also resistant to all antifungals available. C. auris demonstrates a high propensity to develop antifungal resistance under selective pressure. Recent studies demonstrated mutations in ERG11 (encoding lanosterol demethylase, the target of azoles) and FKS1 genes (encoding 1,3-beta-glucan synthase, the target of echinocandins). The recommended antifungals for C. auris treatment are mainly based on in-vitro testing and on the most frequently retrieved resistance profiles. Echinocandins are the recommended first-line treatment, pending specific susceptibility testing. Lipid formulation of amphotericin B should be an alternative in patients not responding to echinocandins. Close monitoring to early detect therapeutic failures and evolution of antifungal resistance is needed. New antifungals (e.g., SCY-078, APX001A/APX001, and rezafungin) have been tested with success but they are not available to date for clinical use.”

Resilience and mechanisms of transmission
Cortegiani, et al. (2019) explain that, “Unlike other Candida species, C. auris can colonize different anatomical sites (e.g., skin, rectum, axilla, stool) and contaminate hospital equipment and surfaces, creating a vicious cycle of acquisition, spreading, and infection, particularly in ICUs. Indeed, beds, chairs, and monitoring tools (e.g., pulse oximeters, temperature probes) were contaminated during outbreaks. Recently, Eyre et al. published the results of a patients’ and hospital environmental screening program in Oxford, UK, after 70 patients (66 admitted to a neuro-ICU) were identified as being colonized or infected by C. auris. Seven patients developed an invasive infection during hospital stay. C. auris was detected mainly on skin-surface axillary temperature probes and other reusable tools. In patients monitored with skin-surface temperature probes, the risk of C. auris infection/colonization was seven times higher. Adoption of specific bundles of infection control had no significant effects until removal of the temperature probes. Recent studies have confirmed that C. auris can form biofilms, with a high variation of capacity of production depending on the C. auris strain considered. Biofilm may present reduced susceptibility to hydrogen peroxide and chlorhexidine. Quaternary ammonium compounds and cationic surface-active products seem to be ineffective against C. auris. Chlorine-based products appear to be the most effective for environmental surface disinfection. Chlorine-based disinfectants (at a concentration of 1,000 ppm), hydrogen peroxide, or other disinfectants with documented fungicidal activity are recommended for environmental cleaning by the European CDC (ECDC).”

Implementation of infection prevention and control measures
Cortegiani, et al. (2019) say that, “Usually, outbreaks follow an exponential increase in the number of affected patients. It is mandatory to trace contacts with the aim to achieve early identification and screening of possible colonized patients that might be responsible for persistence of C. auris. Patients potentially or already colonized should be placed in single rooms with contact isolation precautions. Screening should be applied for contacts and patients previously hospitalized in healthcare settings where C. auris isolation was confirmed. Hand hygiene (with alcohol or chlorhexidine handrubs), wearing of protective clothing, and skin and environmental/equipment decontamination should be performed to prevent ongoing transmission.”

Global surveillance
Cortegiani, et al. (2019) emphasize that the emergence of C. auris and the progressive spread of infections caused by other resistant pathogens have strengthened the need for a global surveillance network for antimicrobial resistance for critically ill patients’ safety. The researchers observe, “It is hard to predict future C. auris diffusion. Will there be outbreaks also in countries in which C. auris has not been reported yet? Will new MDR clones continue to emerge? Will we be able to apply effective antifungal stewardship programs and control measures?”

So much about C. auris is still uncharted territory, and as Rhodes and Fisher (2019) observe, “The global emergence of C. auris testifies to the unmapped nature of Kingdom Fungi and represents a new nosocomial threat that will require enhanced infection control across diverse healthcare and community settings.” The researchers add, “Currently, nothing is known about the origins and initial emergence of C. auris; its propensity to survive on inanimate objects within the hospital alongside resistance to disinfection protocols suggests the existence of an unknown non-human environmental reservoir. However, similar to other Candida species, the true nature of C. auris’ ancestral reservoirs currently remains elusive. The detection of clonal C. auris isolates on multiple continents simultaneously with distinct geographical antifungal resistance mechanisms suggests at least four independent emergence events followed by clonal expansion and the ongoing evolution of resistance in response to antifungal therapy … As sequencing technology develops, it is likely rapid sequencing of C. auris isolates can be achieved in 48 hours or less leading to the potential for bedside diagnostics twinned with molecular epidemiology of nosocomial patterns of transmission. Currently, it is not often known when patients become colonized – whether from the hospital environment or endogenous carriers – and the extent of carriage in the community remains largely unexamined.”

Lockhart (2019) indicates that rapid identification of colonized patients followed by isolation and contact precautions can help stem the spread of resistant clones: “Real-time detection methods can not only rapidly identify colonized patients but may also contribute to the rapid detection of resistance. Besides the existing laboratory-developed tests, there is at least one commercially available PCR test for the rapid detection of C. auris. There are currently two real-time assays for detection of anti-fungal resistance in C. auris, one for detecting azole resistance and the other for echinocandin resistance, as well as a report that echinocandin resistance can be detected using MALDI-TOF. These rapid platforms may become essential for the rapid determination of appropriate therapy.”

Environmental Persistence of C. auris
Short, et al. (2019) are sounding the alarm about the environmental persistence of C. auris; in their study, they show that the ability of this multidrug-resistant yeast to form cellular aggregates increases survival after 14 days, which coincides with the upregulation of biofilm-associated genes. The researchers also caution, “Additionally, the aggregating strain demonstrated tolerance to clinical concentrations of sodium hypochlorite and remained viable 14 days post treatment. The ability of C. auris to adhere to and persist on environmental surfaces emphasizes our need to better understand the biology of this fungal pathogen.”

The researchers explain, “A key attribute of its pathogenic repertoire is its ability to survive and persist in the environment, yet the methods employed by this multidrug-resistant pathogen to disseminate throughout healthcare environments are still not fully understood. This has profound implications for decontamination and infection control protocols. Therefore, understanding the mechanisms of spread and survival in the hospital environment is critical, particularly as it persists on hospital fomites, extensively colonizes individuals, and survives as biofilms. Although traditionally biofilms are associated with formation on an indwelling medical device or on a mucosal substrate, recent investigations have suggested that these communities can facilitate residence and survival upon surfaces within a clinical setting. Despite the lack of nutrients, these communities adapt to survive and display increased tolerance to both heat and conventional disinfection treatments compared to a free-floating, equivalent cell. C. auris has been shown to readily transmit between hospital equipment, such as reusable temperature probes, and patients suggesting limitations of current infection control strategies. Commonly used disinfectants have been shown to be highly effective when tested in suspension, yet our previous data indicate that adherent C. auris cells can selectively tolerate biocides, including sodium hypochlorite and peracetic acid, in a substrate-dependent manner.”

To test the theory of biofilm formation being employed as an endurance strategy of C. auris, Short, et al. (2019) performed survival studies using two phenotypically distinct isolates based on their ability to form cellular aggregates. The researchers report, “Similar to previous findings, C. auris was found to remain viable for at least two weeks within a dry environment, regardless of the organic material in which it was suspended. It was shown that aggregating cells survived considerably better than their single-cell counterparts in PBS (>2.5 log2 cfu/mL) and 10% FCS (>4 log2 cfu/mL).”

The researchers add, “To confirm a role for biofilms in facilitating environmental persistence, a panel of biofilm associated genes, selected according to our group's previous transcriptional characterization of C. auris biofilms, was assessed. These genes were highly expressed across both phenotypes; however, comparative analysis revealed increased expression of approximately two-fold of several of these genes, which are involved in adhesion, extracellular matrix (ECM) production, and efflux pumps. ECM production is a well-documented resistance mechanism in pathogenic fungal biofilms of Candida spp. Increasing ECM production could provide the necessary protection for C. auris to survive extended periods of desiccation and retain viability following terminal disinfection.”

Using Infection Prevention and Control to Fight C. auris
Case investigation by public health entities such as the CDC and others has demonstrated that C. auris patients within similar geographic regions commonly had overlapping stays in the same acute-care hospital or long-term care facility, further supporting healthcare exposure as a key method of transmission.

Given the risk of nosocomial transmission of this multidrug-resistant pathogen, Sears and Schwartz (2017) emphasize that, “…infection control measures are vital to slowing the spread of C. auris. CDC recommends that all hospitalized patients with C. auris infection or colonization be treated using both Standard Precautions and Contact Precautions and housed in a private room with daily and terminal cleaning with a disinfectant agent active against Clostridium difficile spores (Cadnum et al., 2017). Receiving healthcare facilities should also be notified prior to transfer of an infected or colonized patient. Infection control precautions should be maintained until a patient is no longer infected or colonized with C. auris although there is uncertainty as to how best to monitor for ongoing colonization (CDC, 2017). There are no clear data on the efficacy of decolonization measures for patients colonized with C. auris, however this has been attempted with chlorhexidine in healthcare facilities during outbreaks.”

Kean, et al. (2018) articulate one of the greatest worries about C. auris: “The ability of this organism to survive on surfaces and withstand environmental stressors creates a challenge for eradicating it from hospitals.”

An experience with surface cleaning and disinfection to help combat C. auris in a U.S. healthcare facility was documented by Marrs, et al. (2017), who reported on two patients with C. auris infections who were admitted to the University of Chicago Medicine (UCM). In their study, the researchers collected environmental samples to assess environmental contamination before and after cleaning. They sampled the following surfaces: bathroom sink drain, bedside table, bedrail, mattress, chair and window ledge. Routine terminal cleaning included applying a 10 percent sodium hypochlorite solution to high-touch surfaces of the patient room and bathroom. The enhanced terminal cleaning process also included removing and replacing privacy curtains, using a single UV disinfection cycle in the room and bathroom, as well as supervision of the process by the environmental services manager.

The researchers note that due to a delay in identification of C. auris for the first patient, pre-cleaning samples were taken more than two weeks after the patient had been discharged. During the intervening weeks, multiple patients had occupied the room and there had been more than three routine terminal cleanings. None of these samples was positive for C. auris. Pre-cleaning, in-residence samples indicated C. auris contamination of multiple surfaces for the second patient. Because of transfers within the institution, there are three sets of post-cleaning cultures for the second patient. All post-cleaning environmental cultures were negative for both patients. The researchers concluded that while routine terminal cleaning may have been effective in removing C. auris from surfaces in the first patient’s room, the enhanced terminal cleaning strategy they used was effective in their facility.

In their study, Kean, et al. (2018) evaluated a panel of C. auris clinical isolates on different surface environments against the standard disinfectant sodium hypochlorite and high-level disinfectant peracetic acid. The researchers note that, “C. auris was shown to selectively tolerate clinically relevant concentrations of sodium hypochlorite and peracetic acid in a surface-dependent manner, which may explain its ability to successfully persist within the hospital environment.”

The implications for infection control are significant, and Kean, et al. (2018) add that, “Understanding the mechanisms of spread and survival of this pathogen in the hospital environment is therefore crucial, particularly as it may persist on plastics and steel, and survive as biofilms. Several recent investigations have confirmed that C. auris is capable of prolonged survival on surfaces and have shown that surface disinfection protocols have variable and unsatisfactory outcomes. Since it has been shown recently that 1,000 ppm of an active chlorine solution is highly effective against these organisms when tested in suspension, the interaction between the pathogen and surfaces is likely to be important in determining survival of C. auris in the hospital environment. Our own work confirms this, with C. auris biofilms being generally insensitive to a range of key antimicrobial agents, thus prolonging their survival capacity.”
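For readers translating these concentrations into practice, the short sketch below illustrates only the arithmetic: converting a stock hypochlorite concentration expressed as a percentage of available chlorine into ppm (1 percent is roughly 10,000 ppm) and estimating the dilution factor needed to reach a target such as 1,000 ppm. The 5.25 percent stock value and the function names are hypothetical illustrations, not a product recommendation or part of any cited study.

```python
# Illustrative arithmetic only: converting percent available chlorine to ppm and
# estimating the dilution needed to reach a target concentration. The stock
# value is a hypothetical example, not a recommendation from any cited study.

def percent_to_ppm(percent: float) -> float:
    """1 percent (w/v) available chlorine is approximately 10,000 ppm."""
    return percent * 10_000

def dilution_factor(stock_percent: float, target_ppm: float) -> float:
    """Total parts of diluted solution per one part of stock."""
    return percent_to_ppm(stock_percent) / target_ppm

stock = 5.25  # hypothetical household-strength bleach, % available chlorine
for target_ppm in (1_000, 10_000):
    factor = dilution_factor(stock, target_ppm)
    print(f"{target_ppm:>6} ppm from a {stock}% stock -> dilute about 1:{factor:.0f}")
```

By this arithmetic, roughly a 1:50 dilution of a 5.25 percent stock yields about 1,000 ppm; actual preparation should of course follow the product label and local policy.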

Kean, et al. (2018) investigated the general disinfectant sodium hypochlorite (NaOCl), widely used for terminal cleaning within the hospital environment, and the high-level disinfection agent peracetic acid, on different substrate surfaces. Four C. auris isolates obtained from various clinical sites were tested on several surface substrates: cellulose matrix, 304 stainless steel, and polyester coverslips.

The researchers report that, “Initially, a standard disinfectant challenge was performed against C. auris on different substrates relevant to the hospital environment. A cellulose substrate was included to act as control for porosity. All four C. auris were significantly killed by NaOCl challenge at 1,000 and 10,000 ppm, irrespective of substrate and strain, though differences were observed between these substrates. Complete eradication was only achieved on the cellulose substrate. On the non-porous materials, significant quantities of viable yeast cells were killed on the steel surface following NaOCl at all treatment parameters, with ∼2.5 log10 reduction, with no significant differences observed at each time-point and concentration tested. Notably, those isolates treated with 1,000 ppm for 5 minutes showed significantly more regrowth compared to the other test conditions. When C. auris was tested on a polymer substrate, 5-minute exposure at 1,000 ppm was the least effective overall; although significant activity was observed, 4.95 log10 was retained on the surface. However, following an increased contact time of 10 minutes, or increased concentration to 10,000 ppm, significantly enhanced activity was observed, with an approximate overall 3.5 log10 reduction. When comparing both increased treatment parameters, no significant differences were observed between the regimens, and no notable regrowth was detected.”
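The log10 reductions quoted above are derived from viable counts before and after the disinfectant challenge. The following sketch is a minimal illustration of that calculation; the cfu values are invented for demonstration and are not data from Kean, et al. (2018).

```python
import math

def log10_reduction(cfu_before: float, cfu_after: float) -> float:
    """Log10 reduction in viable counts after a disinfectant challenge.
    Zero survivors are handled by substituting a hypothetical detection limit."""
    limit_of_detection = 1.0  # assumed assay floor, cfu/mL
    return math.log10(cfu_before) - math.log10(max(cfu_after, limit_of_detection))

# Invented counts for illustration only:
inoculum = 1e6            # cfu/mL recovered from an untreated control coupon
steel_survivors = 3e3     # after 1,000 ppm NaOCl on stainless steel
polymer_survivors = 9e4   # after 1,000 ppm NaOCl for 5 minutes on polymer

print(f"steel:   {log10_reduction(inoculum, steel_survivors):.1f} log10 reduction")
print(f"polymer: {log10_reduction(inoculum, polymer_survivors):.1f} log10 reduction")
```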

The researchers observe, “There was a significant difference in activity between polymer and steel, which could be explained by the general ability of Candida species to adhere and form biofilms that are inherently more resistant. Whereas the isolates on steel responded by ∼3 log10 equally to the treatment regimens, on plastic we demonstrated differential activity depending on concentration and time of exposure to NaOCl. Another study reported greater efficacy of chlorine-based products on steel, but differences in experimental design may explain this, e.g. products and inoculum. Taken together, these data suggest that the standard disinfection procedures are surface-dependent, and that the diversity of fomites in the hospital setting could pose a problem for disinfection.”

Kean and McKloud, et al. (2018) used a three-dimensional complex biofilm model to investigate the efficacy of a panel of antiseptic therapeutics, including povidone iodine (PVP-I), chlorhexidine (CHX) and hydrogen peroxide (H2O2). They hypothesized that the ability of C. auris to form biofilms may be a mechanism resulting in reduced susceptibility to antiseptic agents.

Initially, the antiseptic efficacy of three agents was tested against four C. auris isolates. As Kean and McKloud, et al. (2018) report, “When biofilms were treated with PVP-I for 5 minutes, concentrations of 1.25-2.5 percent were required to inhibit biofilms, which is a 16- to 128-fold change compared with planktonic cells. Increasing the exposure time (10 and 30 minutes) was shown to increase susceptibility to 0.625-1.25 percent, which is an eight- to 64-fold change compared with planktonic cells. CHX was highly active against planktonic cells, whereas biofilms were less susceptible with MICs increasing 2- to 16-fold. Elevated biofilm MICs were also observed following H2O2 exposure, with concentrations ranging between 0.25 and >1 percent required to kill biofilms, i.e. a 16-fold increase in the planktonically-active concentration. Regardless of the antiseptic active used, minimal differences in susceptibility were observed between 24- and 48-hour biofilms.”
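The fold-changes reported above compare the concentration needed to inhibit biofilm-grown cells with the planktonic MIC. The sketch below shows only that comparison; the MIC values are hypothetical placeholders chosen to keep the arithmetic visible, not the measurements from Kean and McKloud, et al. (2018).

```python
# Hypothetical MIC values (% active agent) used solely to illustrate the
# fold-change comparison between biofilm and planktonic susceptibility.
planktonic_mic = {"PVP-I": 0.02, "CHX": 0.0004, "H2O2": 0.0156}
biofilm_mic = {"PVP-I": 1.25, "CHX": 0.0032, "H2O2": 0.25}

for agent, mic in planktonic_mic.items():
    fold_change = biofilm_mic[agent] / mic
    print(f"{agent}: biofilm MIC is about {fold_change:.0f}-fold higher than planktonic")
```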

The researchers observe, “Interestingly, PVP-I, a commonly used pre-surgical wash, was shown to be equally active against both early and mature biofilms when assessed by culture. This agrees with other studies in which PVP-I demonstrated excellent fungicidal activity against C. auris. Ten percent PVP-I has been used clinically for surgical skin preparation in patients with C. auris, with no postoperative infection reported. An interesting finding from this study was the ineffectiveness of H2O2 against both planktonic and sessile cells of C. auris. This finding is inconsistent with previous studies in which H2O2 showed significant fungicidal activity. Discrepancies between these findings are likely due to the test methodologies employed, with vaporized H2O2 assessed in one of these studies.”

The researchers add, “Future studies assessing the efficacy of CHX diluted in alcohol may provide a potential anti-biofilm strategy for successful skin disinfection. Collectively these findings illustrate the need for a greater understanding of the survival strategies of C. auris. Considering the documented high transmissibility of C. auris between patients and the environment, the implementation of stringent infection prevention and control procedures, coupled with the biological understanding of the organism, will ultimately aid the intervention strategies for this emerging pathogen.”

Recommendations
It is quickly becoming evident that current evidence for pragmatic infection prevention and control (IPC) recommendations is lacking, according to Kenters, et al. (2019), who reviewed the epidemiology of C. auris and identified best practices to provide guidance and recommendations for IPC measures, based on available scientific evidence, existing guidelines and expert opinion. The IPC working group of the International Society of Antimicrobial Chemotherapy (ISAC) convened a meeting of infection prevention and mycology experts to review recommendations on IPC measures for C. auris in inpatient healthcare facilities, covering the most common interventions: screening, standard precautions, cleaning and disinfection, inpatient transfer, outbreak management, decolonization and treatment.

The working group identified these best practices to help combat C. auris:

1. Optimal diagnostics
It is crucial to identify and report C. auris correctly in order to provide optimal patient care and treatment and to initiate appropriate IPC measures. All isolates, regardless of body site, should undergo susceptibility testing because levels of resistance vary.

2. Patient screening upon admission
Outbreak investigations have revealed that the screening sites most frequently culturing positive for C. auris were the axilla, groin, rectum and urine. Other sites for screening, even if less frequently positive, are the nose, mouth, external ear canals, catheter urine and wounds. If a patient has open wounds and/or intravascular catheters, these should be included in screening in addition to the standard swabbing sites. Open wounds should only be swabbed if they are not sealed by a dressing, as opening a sealed dressing can put a patient who is a carrier at risk of wound colonization with C. auris. Risk groups on hospital admission are patients previously admitted to intensive care units in endemic countries and transfers from hospitals known to have C. auris. Hospitals where C. auris is present should liaise with receiving hospitals’ infection control teams and ensure that appropriate information is shared. If a healthcare institution already has a surveillance screening protocol in place for patients at high risk of colonization with multidrug-resistant organisms (MDROs), it may be more practical to add C. auris to the existing laboratory testing panel than to implement additional swabbing sites.

3. Infection prevention and control interventions
For healthcare facilities to be prepared for a first case of C. auris, it is important to have a screening protocol as well as adequate IPC procedures in place. Any detection of C. auris should be immediately reported to the IPC department, leading to timely implementation of strict IPC-related measures. Patients colonized or infected with C. auris should be isolated until discharge and flagged for at least one year after the first negative screening culture. When patients are transferred within the institution or to other healthcare facilities, handover of the patient’s C. auris status must be ensured. Screening of direct contacts should be initiated on detection of a ‘first’ case, including contacts who have already been discharged; this may involve tracing back through the patient’s admission if internal transfers have taken place. Every patient should be screened at the axilla and groin, as well as any other relevant sites (e.g. nose, urine, rectum, throat, wounds and catheter exit sites).

The following IPC measures should be implemented for any C. auris case:
- Standard precautions
Hand hygiene is key to preventing transmission of any microorganism, including C. auris. Special attention should be given to adequate compliance with hand hygiene while caring for patients in isolation. Hand hygiene should be performed at the point of care using alcohol-based handrub (ABHR). While ABHR is the preferred choice, soap and water should be used when hands are visibly soiled, and a dedicated handwashing sink needs to be in place.

- Patient environment
Patients colonized or infected with C. auris need to be placed under contact precautions in a single room, ideally with negative pressure, and preferably with an anteroom and en-suite bathroom/toilet. If the latter is not available, patients should use a dedicated washroom or a waterless washing product, as well as a dedicated commode. The use of an isolation room with an anteroom might be preferable, not because airborne spread is assumed, but because compliance with isolation measures might be higher, as the double doors function as a reminder. A flagging system indicating the isolation needs to be visible at the entry to the patient room, and instructions for HCWs and visitors need to be available. All biomedical products and equipment should be single-use or, if reusable, should be left in the patient’s room until discharge and thorough disinfection. Sharing biomedical products and equipment with other wards poses a risk of additional transmission. For mattresses and pillows, HCWs should ensure that they are completely sealed before using them for a C. auris patient, and their integrity should be assessed upon discharge if they are to be used for another patient.

- Personal protective equipment (PPE)
It has become evident that the use of a long-sleeved gown and gloves is sufficient when entering the room of a patient found positive for C. auris. Taking into consideration that people often (unconsciously) touch their face, a surgical mask could be considered to prevent colonization of healthcare staff, since one HCW was found to be transiently positive in the nose during a previous outbreak.

- Environmental cleaning
Enhanced daily and terminal disinfection has been shown to be crucial to controlling the spread of C. auris within healthcare facilities. Cleaning and disinfection of at least all high-touch surfaces should be performed at least twice daily. Terminal cleaning and disinfection of rooms after patient discharge needs to be performed with great diligence. When selecting a product, users should keep in mind its toxicity and select one that is safer to use near a patient. Innovative automated decontamination technologies, such as UV-C disinfection, can be used to ensure optimal terminal cleaning of surfaces, but they are an additional safeguard and not a replacement for the routine cleaning method. Both UV-C and hydrogen peroxide vapor (HPV) require thorough manual cleaning beforehand to be effective against microorganisms. If UV-C is used, the exposure time required for efficacy is longer than that for vegetative bacteria, and a cycle time effective against spores such as those of C. difficile should be selected.

Cleaning and disinfection of reusable equipment is particularly important, especially as these items may be decontaminated at the departmental level by clinical staff. Where possible, dedicated equipment should be used. If dedicated equipment is not an option, equipment and devices must be disinfected thoroughly after every use, in line with the manufacturer’s instructions and with consideration of materials compatibility. The surfaces of reusable items should be periodically examined to check that their integrity is intact and that they can still be effectively decontaminated. Materials that cannot be disinfected should either not be used or be discarded after use. Where possible, single-use equipment is preferred to limit possible spread via inadequately disinfected equipment. Equipment that is cleaned by clinical staff should be audited to ensure that cleaning is effective, and facilities may wish to consider whether formal training in decontamination has been, or should be, provided to clinical staff.

- Patient clothing
The role of patient clothing in the transmission of C. auris is unclear. In the experience of hospitals dealing with outbreaks, patients were asked to use hospital garments or clothes that had been washed at high temperatures. The expert group was not able to give a recommendation on this topic. Given that C. auris has been found to survive on linen, it may be prudent to change bedding and patient attire daily if decolonization or skin suppression is being attempted.

- Patient movement in the facility
Transfer of patients colonized with, or suspected of carrying, C. auris should be undertaken with great care. The treatment of patients should always come first, but if a transfer can be avoided by using mobile equipment, this should be considered. When patients need to go to radiology, for example, they should ideally be placed at the end of the schedule to allow time for terminal decontamination of the area.

- Readmission of previous C. auris positive patient
If their history is known, previously positive C. auris patients should be placed in contact isolation on readmission and screened on three consecutive days. Contact precautions may be stopped if all three screens are negative. Weekly screening is nevertheless recommended, as C. auris may resurface after antibiotic therapy or other interventions such as chemotherapy. These are minimum measures; stricter local MDRO-related guidelines must be followed.

- Outpatient management
Family members and healthcare providers can become colonized; however, this is of minimal risk to the “healthy” individual. There are no guidelines yet for the management of C. auris-colonized outpatients; however, it is prudent to keep the sharing of items to a minimum, in line with the principles for other fungal infections: in particular, towels, clothing, cosmetic items, creams, ointments, etc. should not be shared, even in the absence of studies demonstrating the effectiveness of these measures.

- Healthcare personnel education and training
Compliance with, and correct application of, IPC measures is essential to prevent transmission of C. auris within the healthcare facility. On-site training and auditing are critical to raising staff awareness of IPC measures and containing C. auris. Training should focus on standard precautions, PPE, environmental cleaning and other IPC measures applied to control C. auris. In addition, compliance with and correct execution of IPC measures should be monitored, and direct feedback to HCWs should be provided.

- Outbreak management
The index case, or any unexpected case colonized or infected with C. auris, should be isolated in a single/isolation room; direct contacts should be placed in cohort isolation with contact precautions, and no new patients should be admitted to the affected room. Maintaining cohorts of “proven colonized”, “possibly colonized” and “no risk” patients is important under all circumstances, even if that leads to lowered bed capacity, a reduction in admitted patients or the cancellation of procedures. As C. auris has been cultured from the hands of healthcare personnel, where possible staff should be assigned to one of the cohorts instead of working throughout the whole unit. If the outbreak is large, creating a separate unit for all proven colonized patients might be advisable. Single-use or dedicated equipment should be used. A root cause analysis by Schelenz, et al. found that patients could acquire C. auris within as little as four hours of contact with a positive case or a contaminated environment.

To confirm that a contact patient is negative, three consecutive C. auris screens should be negative. In the absence of published data, it is advisable to space out the three screenings (for example, on days 3, 5 and 7), rather than performing them on days 1, 2 and 3 or even all in one day. Once de-isolated, weekly screening of negative contact patients until discharge is recommended. Healthcare personnel have been identified as carriers in the nose and groin, so during an ongoing outbreak, screening of healthcare staff could be considered, as well as unannounced hand cultures as an educational measure.
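As a purely illustrative sketch of the spaced screening schedule described above (the day 3-5-7 spacing and the number of weekly follow-up screens are assumptions for the example; nothing here is part of the ISAC guidance itself):

```python
from datetime import date, timedelta

def screening_schedule(last_contact: date, weekly_screens: int = 4) -> list[date]:
    """Hypothetical helper: three spaced screens on days 3, 5 and 7 after the last
    contact with the positive case or environment, then weekly screens while the
    contact patient remains admitted."""
    spaced = [last_contact + timedelta(days=d) for d in (3, 5, 7)]
    weekly = [spaced[-1] + timedelta(weeks=w) for w in range(1, weekly_screens + 1)]
    return spaced + weekly

# Example: last contact with the positive case on an arbitrary date.
for screen in screening_schedule(date(2020, 3, 1)):
    print(screen.isoformat())
```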

In an outbreak scenario, cleaning and disinfection of at least all high-touch surfaces should be increased to three times daily, using a product effective against C. auris. Terminal cleaning and disinfection should be monitored with quality indicators that go beyond visual inspection, such as ATP measurement or fluorescent markers. If available, UV-C or HPV could be used after terminal cleaning and disinfection as an additional assurance that the room has been adequately decontaminated and is safe for the next occupant.

No recommendations can be made about the effect of decolonizing patients; in theory, decolonization may lead to a lower burden of yeast on the patient’s skin and thus a lower risk of transmission. This approach has been adopted in outbreaks; however, unconfounded data are lacking to draw that conclusion for C. auris. Currently, there is limited evidence on the use of topical agents to control skin colonization. In one major UK outbreak, 2 percent chlorhexidine washcloths or 4 percent chlorhexidine solution were used to control skin shedding as part of several interventions. However, despite daily CHX bathing, the patients described in the UK continued to be colonized with C. auris. Chlorhexidine solutions may dry the skin in a way that leads to prolonged colonization with C. auris. Some patients remained persistently colonized, possibly due to recolonization from bedding, as Candida spp. have been demonstrated to survive on polyester textiles for up to eight days.

Mandatory national reporting of outbreaks in institutions should be considered, as well as mandatory reporting of infections with C. auris. In countries without laws requiring this, mandatory sharing of outbreak status data with regional healthcare providers is advisable.

References and Recommended Reading:
Cadnum JL, Shaikh AA, et al. Effectiveness of disinfectants against Candida auris and other Candida species. Infect Control Hosp Epidemiol, 38 (2017), pp. 1240-1243.
Chow NA, et al. Multiple introductions and subsequent transmission of multidrug-resistant Candida auris in the USA: a molecular epidemiological survey. Lancet Infect Dis, 18 (2018), pp. 1377-1384.
Chowdhary A, et al. Candida auris: a rapidly emerging cause of hospital-acquired multidrug-resistant fungal infections globally. PLoS Pathog, 13 (2017), Article e1006290.
Cortegiani A, Misseri G, Fasciana T, Giammanco A, Giarratano A, Chowdhary A. Epidemiology, clinical characteristics, resistance, and treatment of infections by Candida auris. J Intensive Care. 2018;6:69.
Cortegiani A, Misseri G, Chowdhary A. What's new on emerging resistant Candida species. Intensive Care Med. 2018. https://doi.org/10.1007/s00134-018-5363-x.
Cortegiani A, et al. The global challenge of Candida auris in the intensive care unit. BMC Critical Care. Vol. 23, No. 150. 2019.
Eyre DW, Sheppard AE, Madder H, Moir I, Moroney R, Quan TP, et al. A Candida auris outbreak and its control in an intensive care setting. N Engl J Med. 2018;379:1322–31.
Kean R, et al. Surface disinfection challenges for Candida auris: an in-vitro study. J Hosp Infect. Vol. 98, No. 4. Pages 433-436. April 2018.
Kean R, McKloud E, et al. The comparative efficacy of antiseptics against Candida auris biofilms. Int J Antimicrob Agents. Vol. 52, No. 5. Pages 673-677. November 2018.
Kenters N, et al. Control of Candida auris in healthcare institutions. Outcome of an ISAC expert meeting. International Journal of Antimicrobial Agents. Aug. 13, 2019.
Ku TSN, Walraven CJ, Lee SA. Candida auris: disinfectants and implications for infection control. Front Microbiol. 2018;9:726.
Lockhart SR, Etienne KA, Vallabhaneni S, Farooqi J, Chowdhary A, Govender NP, et al. Simultaneous emergence of multidrug-resistant Candida auris on 3 continents confirmed by whole-genome sequencing and epidemiological analyses. Clin Infect Dis. 2017;64:134–40.
Lockhart SR. Candida auris and multidrug resistance: Defining the new normal. Fungal Genetics and Biology. Vol. 131. October 2019.
Madder H, Moir I, Moroney R, Butcher L, Newnham R, Sunderland M, et al. Multiuse patient monitoring equipment as a risk factor for acquisition of Candida auris. bioRxiv. 2017:149054.
Marrs R, Pellegrini D, et al. Successful environmental disinfection to prevent transmission of Candida auris. Session 58: HAI: The Environment. Oct. 5, 2017.
Moore G, Schelenz S, et al. Yeasticidal activity of chemical disinfectants and antiseptics against Candida auris. J Hosp Infect, 97 (2017), pp. 371-375.
Osei Sekyere J. Candida auris: a systematic review and meta-analysis of current updates on an emerging multidrug-resistant pathogen. MicrobiologyOpen. 2018;7:e00578.
Piedrahita CT, Cadnum JL, et al. Environmental surfaces in healthcare facilities are a potential source for transmission of Candida auris and other Candida species. Infect Control Hosp Epidemiol, 38 (2017), pp. 1107-1109.
Rhodes J and Fisher MC. Global epidemiology of emerging Candida auris. Current Opinion in Microbiology. Vol. 52. Pages 84-89. December 2019.
Sears D and Schwartz BS. Candida auris: An emerging multidrug-resistant pathogen. Int J Infect Dis. Vol. 63, Pages 95-98. October 2017.
Schelenz S, et al. First hospital outbreak of the globally emerging Candida auris in a European hospital. Antimicrob Resist Infect Control (2016), pp. 1-7.
Sherry L, Ramage G, et al. Biofilm-forming capability of highly virulent, multidrug-resistant Candida auris. Emerg Infect Dis, 23 (2017), pp. 328-331.
Short B, Brown J, et al. Candida auris exhibits resilient biofilm characteristics in vitro: implications for environmental persistence. J Hosp Infect. Vol. 103, No. 1, Pages 92-96. September 2019.
Welsh RM, Bentz ML, et al. Survival, persistence, and isolation of the emerging multidrug-resistant pathogenic yeast Candida auris on a plastic healthcare surface. J Clin Microbiol, 55 (2017), pp. 2996-3005.