With COVID-19 deaths dramatically down from earlier peaks, US health care providers still confront a different nationwide emergency: hospital errors that kill an estimated 150,000 patients annually.
Increasingly, physicians and hospital administrators say a fuller embrace of air-safety principles is likely to be a big part of any solution. But many are rethinking how to make such efforts more viable than in the past.
Long before COVID-19 devastated millions of families and pushed thousands of doctors and nurses nearly to the breaking point, medical leaders sought to learn from the airline industry’s impressive safety record to reduce a veritable epidemic of surgical and other treatment mistakes. Now, with greater stresses on medical staff, supply chains, and safety measures, rethinking how aviation practices are applied to combat deadly treatment errors appears more vital than ever.
Based on recent evidence, overall hospital safety has declined since the onset of the pandemic. A Centers for Disease Control and Prevention study covering more than 2,900 hospitals revealed a significant increase in bloodstream infections associated with intravenous catheters during one quarter of 2020 versus the same quarter of 2019. A separate study of 148 HCA Healthcare-affiliated hospitals showed surges in various treatment-related infections between March and September 2020.
Scheduled US domestic airlines carried more than eight billion passengers over the past 13 years without a single passenger fatality, a phenomenal record. US hospital blunders, by contrast, needlessly cost an estimated 400 patient lives each day, comparable to a packed jumbo jet crashing every 24 hours.
What accounts for this discrepancy? Veteran leaders from both realms say health care has struggled to fully understand aviation’s safety processes and translate them into safer patient care. Hurdles include reliance on limited techniques such as checklists, technical difficulties sharing safety data across the industry, and cultural challenges persuading veteran physicians to act more like team players alongside junior doctors, nurses, and other hospital staff.
Just a few months ago, the Department of Health and Human Services inspector general concluded that hospital safety improvements generally have stalled, with Medicare patients suffering preventable harm or serious complications at effectively the same rate as in 2008.
Three Strategies To Improve Safety
Experts point to health care’s failures to fully incorporate three fundamental strategies relied on by airlines and their federal regulators: extensive voluntary reporting of serious incidents; prompt and widespread dissemination of information about life-threatening hazards; and user-friendly equipment designs intended to prevent a repeat of the same fatal errors.
In response, from Capitol Hill offices to hospital boardrooms there is escalating discussion of how to better incorporate such lessons. One example is renewed debate over creating a National Patient Safety Board, patterned after the National Transportation Safety Board (NTSB), which probes aviation accidents and makes public safety recommendations. Such an entity could provide the framework for collecting and distributing essential data about medical mistakes.
Systemic lapses in hospital safety were highlighted by the Institute of Medicine’s seminal 1999 report, “To Err Is Human,” which sparked the modern patient safety movement. Over the next two decades, Atul Gawande’s internationally acclaimed book, The Checklist Manifesto, and follow-on publications popularized a bevy of aviation-derived safeguards.
But health care needs to move beyond those early steps. Undue emphasis on medical checklists, which mimic an aviation procedure developed many decades ago, frequently impedes reliance on newer and more effective safety techniques. Checklists’ usefulness in medicine has been oversold, according to Raj Ratwani, director of MedStar Health’s National Center for Human Factors in Healthcare. After initially embracing checklists above other safety practices, physicians and hospital administrators have opted for more sophisticated safety tools, including robust data sharing, enhanced teamwork, and greater responsibilities for junior staff.
Non-Punitive Reporting
Non-punitive incident reporting and swift distribution of details about dangerous “near misses” helped revolutionize modern air safety. Too often, however, those concepts falter in hospital settings due to institutional opposition or fear of management reprisals.
“Progress has been very slow reducing deaths” by applying proactive air-safety practices “to nearly all medical specialties,” according to David Mayer, head of the safety research arm of Maryland-based MedStar Health. “When it comes to acknowledging and documenting mistakes,” he says, “we are still not where aviation is.”
Christopher Hart, a former chairman of the NTSB, puts it more bluntly: “Information sharing in health care is pitiful compared to aviation.”
Promoting transparency and empowering reporting of treatment errors “requires a major cultural change,” says Kathleen Bartholomew, a former nurse and hospital manager, but “that mindset never took hold” in much of medicine. “Nurses are still getting fired for bringing up safety concerns,” she adds, while physicians often remain reluctant to criticize colleagues.
Disincentives for safety reporting prompted nationwide headlines during the recent trial of a former Tennessee nurse, RaDonda Vaught, who was convicted of two felonies for a fatal medication error. Various health care organizations such as the American Nurses Association and the American Hospital Association expressed concern that criminalization of treatment errors would further impede voluntary reporting and data sharing. (She was sentenced to three years’ probation.)
Data Sharing
Medicine already has hundreds of national and state registries collecting data on patient outcomes, complications, and best practices. They are all confidential and shielded from legal discovery. In theory, the model is similar to air-safety reporting.
Yet, unlike aviation, there is no comprehensive database aimed at preventing hospital errors. Information often is contained in digital silos that communicate poorly, if at all, with each other. Typically, incident data are used by the government to financially punish hospitals for safety lapses—but only after problems occur. That discourages timely public disclosure of errors, which translates into limited opportunities for ambitious data mining to uncover precursors, root causes, and essential takeaways.
Generally lacking are urgent recommendations that, in turn, can be quickly adopted by other health care providers. Instead, medical feedback loops primarily focus on reporting the incidence of adverse events. Without more thorough voluntary reporting and deeper data analyses, experts say hospitals are likely to find it difficult to implement effective and sustainable safety programs.
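As a rough illustration of what such shared analysis could involve, the sketch below defines a hypothetical de-identified incident record and a simple cross-hospital tally of contributing factors. The field names and categories are invented for illustration and are not drawn from any existing reporting standard.

```python
# Minimal sketch of a shared, de-identified incident record and a simple
# cross-hospital aggregation. Field names and categories are hypothetical,
# not drawn from any existing reporting standard.
from collections import Counter
from dataclasses import dataclass

@dataclass
class IncidentReport:
    event_type: str           # e.g., "medication error", "central-line infection"
    contributing_factor: str  # e.g., "look-alike labels", "handoff gap"
    harm_level: str           # "near miss", "temporary harm", "severe harm"
    unit: str                 # "ICU", "ED", "med-surg"

def recurring_precursors(reports, top_n=3):
    """Tally contributing factors across reports to surface recurring precursors."""
    counts = Counter(r.contributing_factor for r in reports)
    return counts.most_common(top_n)

reports = [
    IncidentReport("medication error", "look-alike labels", "near miss", "ICU"),
    IncidentReport("medication error", "look-alike labels", "temporary harm", "ED"),
    IncidentReport("central-line infection", "handoff gap", "severe harm", "ICU"),
]
print(recurring_precursors(reports))  # [('look-alike labels', 2), ('handoff gap', 1)]
```

The point of pooling records this way is that a pattern invisible within any single hospital, such as repeated near misses tied to look-alike labels, becomes visible across many.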
User-Friendly Medical Devices And Technology
Moreover, health care hasn’t followed aviation’s lead in incorporating human-centered technology. Modern jetliner cockpits are designed—and then evaluated before and after planes enter service—to ensure the automation they rely on is user friendly. So far, that principle isn’t ubiquitous in operating rooms or other clinical settings.
“What I don’t see are human-factors experts” who are “designing the equipment but also designing the processes” to avoid human mistakes, says Hart, a board member of the Joint Commission, which accredits hospitals.
More user-friendly medical equipment, combined with advanced electronic health records and predictive analytics, could yield dramatic safety improvements for patients. Potential benefits could resemble the way highly integrated cockpits and in-depth data analyses have revved up airline safety since the late 1990s.
At this point, most hospitals rely on software designed to prevent mistakenly giving excessive doses of drugs to infants or providing adult patients with medications that should never be taken together. Another safety push entails installing common designs of switches and control knobs on different brands of equipment, including defibrillators and infusion pumps, to prevent staff confusion that can lead to serious mistakes.
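As a minimal sketch of what such dose-range and interaction checks involve, consider the following Python snippet. The drug names, weight-based limits, and “never together” pairs are illustrative placeholders, not clinical reference values.

```python
# Minimal sketch of the dose-range and drug-interaction checks described above.
# Drug names, weight-based limits, and "never together" pairs are illustrative
# placeholders, not clinical reference values.

MAX_DAILY_DOSE_MG_PER_KG = {
    "acetaminophen": 75,   # hypothetical ceiling, mg per kg per day
    "gentamicin": 7,
}

CONTRAINDICATED_PAIRS = {
    frozenset({"warfarin", "aspirin"}),  # hypothetical "never together" pair
}

def check_order(drug, daily_dose_mg, weight_kg, active_meds):
    """Return warnings for a proposed medication order, or an empty list."""
    warnings = []
    limit = MAX_DAILY_DOSE_MG_PER_KG.get(drug)
    if limit is not None and daily_dose_mg > limit * weight_kg:
        warnings.append(f"{drug}: exceeds {limit} mg/kg/day for a {weight_kg} kg patient")
    for other in active_meds:
        if frozenset({drug, other}) in CONTRAINDICATED_PAIRS:
            warnings.append(f"{drug} should not be combined with {other}")
    return warnings

# An infant-sized patient: 900 mg/day of acetaminophen at 8 kg trips the dose check.
print(check_order("acetaminophen", 900, 8.0, set()))
```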
For their part, aircraft makers and equipment suppliers want to harness artificial intelligence to pinpoint emerging dangers. Proposed safeguards, for instance, include systems able to autonomously identify suitable airports, perform essential radio communications, and actually land aircraft and then safely stop them on runways if pilot actions suggest confusion or incapacitation. Without any human intervention, cockpits of the future also likely will carry out emergency maneuvers to prevent deadly stalls and avoid collisions with other aircraft, mountains, or even man-made obstructions.
Eric Horvitz, Microsoft’s chief scientific officer, says that path could unlock “the sleeping giant of health care” innovation. Tech companies and hospitals have already rolled out solutions to help identify patients most likely to deteriorate or suffer complications.
Artificial intelligence (AI) champions also point to existing aviation technology that allows some pilots, immediately after completing a flight, to review a digital and video replay of their performance and compare it to those of other pilots. Similarly, certain robot-assisted surgical systems use algorithms based on previous procedures to help surgeons move the controls more smoothly. And more hospitals are tapping into AI networks to identify emergency department or intensive-care patients at highest risk for strokes or other life-threatening conditions.
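To make the flagging pattern concrete, here is a minimal sketch of a rule-based deterioration flag computed from a patient’s vital signs. Real systems rely on validated scores or machine-learned models; the thresholds and point values below are purely illustrative.

```python
# Minimal sketch of a rule-based deterioration flag computed from vital signs.
# Real systems use validated scores or machine-learned models; the thresholds
# and point values below are purely illustrative.
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate: int    # beats per minute
    resp_rate: int     # breaths per minute
    systolic_bp: int   # mmHg
    spo2: int          # oxygen saturation, percent

def deterioration_score(v):
    """Sum simple point scores; higher totals suggest a patient needs closer review."""
    score = 0
    score += 2 if v.heart_rate > 120 or v.heart_rate < 45 else 0
    score += 2 if v.resp_rate > 24 or v.resp_rate < 9 else 0
    score += 2 if v.systolic_bp < 90 else 0
    score += 3 if v.spo2 < 90 else 0
    return score

def needs_review(v, threshold=4):
    """Flag the patient for clinician review once the score crosses the threshold."""
    return deterioration_score(v) >= threshold

print(needs_review(Vitals(heart_rate=130, resp_rate=28, systolic_bp=85, spo2=93)))  # True
```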
But here, too, medicine appears to be significantly trailing aviation. Aircraft manufacturers already have devised cockpit features that anticipate pilot reactions and can unilaterally take over flight-control systems in dire situations. Physicians, however, continue to resist ceding decision making or control of patient procedures to computer networks—no matter how advanced they may be.
Further complicating matters is that while AI-enabled software and devices are pitched as time-saving, cost-cutting, and more accurate solutions to complex medical procedures, the Food and Drug Administration is playing catch-up policing fledgling initiatives fraught with technical and ethical questions.
Unless the medical community re-envisions how reliable, time-tested aviation principles can boost hospital safety, the status quo portends many more years of well-meaning discussions. What’s likely to remain missing, however, are essential changes to reduce the tragedy of deadly patient errors.