A front-page article in yesterday's New York Times reports: "The federal Centers for Disease Control and Prevention projected this year that one of every 22 patients would get an infection while hospitalized — 1.7 million cases a year — and that 99,000 would die, often from what began as a routine procedure."
A little-reported New England Journal of Medicine study from a few months back concluded that 19,000 people die from preventable infections acquired during the insertion of catheters.
But shock, worry and amazement won't help answer the question: what makes the hospital environment a major killer? While some medical infections will always occur, several European countries have all but eliminated MRSA, one of the deadliest hospital-acquired infections, through aggressive cleanliness campaigns. The New England Journal of Medicine study reports that catheter-related bloodstream infections dropped 66 percent with some minor changes: rigorous hand-washing, thorough cleaning of the skin around catheters, wearing sterile masks, gowns and gloves, removing catheters from patients as soon as possible, and avoiding inserting catheters in the groin area. According to a 2004 Canadian survey published in the American Journal of Infection Control, up to half of all hospital-acquired infections could be prevented with adequate infection-control procedures. And a similar six-year-old American study concluded that up to 75 percent of deadly infections caught in hospitals could be avoided if doctors and nurses used better washing techniques. (Studies show that physicians fail to clean their hands before treating patients more than half the time, and that 65 percent of physicians and other medical professionals go more than a week without washing their lab coats.)
Nevertheless, it is wrong simply to blame front-line medical workers for these unnecessary infections. Data shows that successful behavioral change depends on vigilant supervisors who put adequate preventive measures in place and demand proper cleaning practices. Likewise, supportive management, a culture of respect, proper staffing levels, ongoing education programs and sensible shift scheduling have all been shown to improve the health and safety of hospitals, for both patients and workers. ("By nearly doubling cleaning staff hours on one ward," US News and World Report explains, "a hospital in Dorchester reduced the spread of MRSA by nearly 90 percent.")
The biggest barrier to improvement, however, is our economic system, which focuses on cures and technology because that's where the biggest, quickest profits are found. Pfizer isn't likely to fund studies into the role hand-washing plays in hospital-acquired infections, since it sees no profit in doing so. Billions of dollars are spent annually on developing new drugs and medical technologies, but little is spent on basic hospital infection control, even though this would save a greater number of lives, because there has been little economic incentive to do so. A company makes a profit when a new MRI machine is purchased; the bottom line that benefits from better hand-washing techniques is measured only in lives.
It has taken a public outcry just to get some states to force hospitals to track and report hospital-acquired infections. And unlike the inspectors of restaurants and cruise ships, the Joint Commission, the body that inspects and accredits US hospitals, does not measure cleanliness.

An over-reliance on the profit motive outside the hospital door also causes infection-control problems. More than 70 percent of hospital-acquired infections are resistant to at least one common antibiotic, and infections resistant to antibiotics significantly increase the chance of death.
This increase in deadly multi-resistant bacteria is, in large part, attributable to our overuse of antibiotics, which is connected to drug companies' bottom lines. Doctors, faced with patients demanding quick cures, and encouraged by a pharmaceutical industry that spends tens of billions of dollars on advertising, over-prescribe antibiotics. "Prescribing antibiotics has become so common that many doctors literally are just phoning it in," a recent USA TODAY article explains.
According to an analysis of 1.5 million insurance claims for antibiotic prescriptions in 2004, 40 percent of people who filled an antibiotic prescription had not seen a doctor in at least a month. Without seeing the patient, how can a doctor determine whether the symptoms are the result of a viral infection, which doesn't respond to antibiotics, or a bacterial infection, which does? This over-prescription of antibiotics accelerates the growth of multi-resistant organisms.
And in the case of the Clostridium difficile superbug, which has killed many hospital patients over the past few years, antibiotics perturb the bacterial flora of the intestine, opening the door to the superbug. (One study found, according to the Times of London, that "reducing the number of prescriptions for broad-spectrum antibiotics [which kill a wide range of bacteria] from about 53 per 1000 admissions to 17 per 1000 caused the number of cases of C difficile to fall by two thirds.") Additionally, half of all antibiotics sold each year are used on animals, according to New Scientist. Industrial farmers give their animals constant low doses of these drugs not only to treat infection but also to promote growth. These low, sub-therapeutic doses are especially problematic because they create ideal conditions for resistant organisms to emerge. Data shows a strong correlation between increased use of antibiotics on animals and the emergence of resistant strains in animal populations, with mirrored increases among people.
To end this practice, the European Union recently banned antibiotic growth promoters; Washington and Ottawa, kowtowing to the animal industry, have done little.

Hospitals can be much safer and healthier places. Tens of thousands of lives could be saved if real health outcomes were given priority over profit-making opportunities.