Friday, June 28, 2013

Spreading the wealth

One of the underappreciated side effects of antibiotic use is its ability to promote antibiotic resistance. It's no surprise that antibiotics would selectively enrich resistant strains of bacteria, favoring their reproduction over that of susceptible strains. More unexpected are the findings that antibiotics induce a shotgun blast of resistance genes into their local environment, where they may find new homes.

Bacterial genomes play host to bacterial viruses (bacteriophage) that quietly reproduce along with the host, and sometimes encode genes that are of some benefit to their hosts - including genes for antibiotic resistance. However, when these viruses sense that the host is in trouble, they will excise themselves, replicate (usually to the point of killing the host) and leave to find a new home.

Evidence is mounting that this phenomenon - which has been known for 70 years - can cause the spread of resistance both within and between different types of bacteria, including MRSA, Enterococcus and the gut microbiome in general. The upshot is that using an antibiotic increases the likelihood that previously susceptible strains will be made resistant, chipping away at the usefulness of that antibiotic.

The odd thing about this behavior is that it spreads the resistance gene at the expense of the individual bacterial host - despite harboring the resistance gene, the host is killed anyway by its bacteriophage parasite. This most likely occurs because the bacteriophage jump ship whenever they sense that the host is in distress, whether the source of the stress is starvation, excessive heat, toxic chemicals or antibiotics.

This behavior may be an example of selfish genes in action. If antibiotics are in the environment, the selective advantage of a resistance gene is at a maximum - it is a seller's market for them. In killing their original host, they have made tens or hundreds of copies of themselves. If more than a few percent of the infectious particles find new homes, then their strategy has been a success, and will continue to be successful as their newly resistant hosts outcompete susceptible rivals.
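The break-even arithmetic here is simple enough to sketch. Both numbers below are illustrative placeholders chosen to match the text ("tens or hundreds of copies", "more than a few percent"), not measured values:

```python
# Rough break-even arithmetic for a phage-borne resistance gene.
# Killing one host yields `burst_size` phage particles; if a fraction
# `success_rate` of them establish themselves in new hosts, the gene
# comes out ahead whenever the expected number of new copies exceeds
# the single copy that was sacrificed.
burst_size = 100       # illustrative: "tens or hundreds of copies"
success_rate = 0.03    # illustrative: "more than a few percent"

expected_new_copies = burst_size * success_rate
print(expected_new_copies)       # 3.0 new homes per host killed
print(expected_new_copies > 1)   # True: the strategy pays off
```

With these (assumed) numbers, sacrificing one resistant host yields three resistant hosts, which is all the selfish-gene logic requires.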

However, this rationale is true regardless of whether the host is stressed or not - if by killing the host the phage progeny can find 5 new hosts to infect, then obviously it is to their advantage to do so. Lytic phage follow precisely this strategy, killing every host they infect in order to reproduce as quickly as possible. Not surprisingly, bacteria have evolved a number of mechanisms, such as CRISPR and restriction-modification systems, to thwart bacteriophage infection.

Thus the selfish gene view of antibiotic-induced resistance gene mobilization needs a corollary: that antibiotic-induced stress renders bacteria more susceptible to phage infection, possibly by suppressing bacterial defense mechanisms.

A study from James Collins' lab provides indirect evidence that this is the case. Modi et al. exposed mice to ampicillin and ciprofloxacin, and followed the spread of phage-encoded resistance genes. Resistance genes were not only more abundant after antibiotic exposure, but also linked to a more diverse set of bacterial genomes. In other words, it appears that bacteriophage were able to infect a wider variety of bacterial strains and species after antibiotic treatment than they were before. These findings support the idea that, from the resistance genes' selfish point of view, it's a good idea to look for new hosts when antibiotics are around.

The upshot is that antibiotic use not only encourages the spread of resistance by differential survival of resistant bacteria, but by dissemination of resistance genes into the environment. These genes can be picked up by different species that may not previously have served as hosts. This mechanism suggests a new danger of imprudent use of antibiotics: it allows resistance genes found in nonpathogenic strains to find their way into pathogens.

Wednesday, June 26, 2013

More on looking the part

Mike Edmond has a few comments regarding the study on lab coats and cleanliness from Dr. Silvia Munoz-Price and colleagues. Lab coats are washed about every 12 days on average, even though 90% of the respondents were aware that they were potentially contaminated. The most prevalent reason given for wearing lab coats? (Hint: it is not to do lab work.) Instead, it is "to symbolize their profession", a reason that must be immensely gratifying to some.


The empire strikes back

Stung by court orders and new FDA policies aimed at reducing the indiscriminate use of antibiotics in animal feed, the National Cattlemen's Beef Association is mounting an "education" campaign in Congress.

Some 80% of all antibiotics sold in the US are given to livestock. The primary use is not to treat individual sick animals, but to cause them to gain weight more rapidly. Feed antibiotics are given at subtherapeutic levels and appear to alter the gut microbiome in a manner that leads to increased fat accumulation.

The NCBA's press release mentions none of this, of course. It claims that antibiotics improve animal health and thus protect the food supply - all good stuff. In fact, the opposite is true. The danger of spreading antibiotic resistance through this practice has long been recognized, and the entrance of resistant organisms into the food supply has been confirmed.

It's pretty clear what's going on here. The FDA's guidelines are voluntary because it does not have statutory authority to enforce them without exposing itself to hundreds of appeals from well-funded industry groups. Congress could change this - and that is what the cattlemen are trying to prevent. Presumably they will use the same tactics that are employed by climate-change deniers: throw out a bunch of half-truths and distortions, and then use the resultant confusion to claim that there is a controversy that provides an excuse not to act.

Tuesday, June 25, 2013

Return of VRSA?

A very unsettling report in The Lancet today from Portugal, of what is apparently the first appearance of vancomycin-resistant S. aureus in Europe.
Vancomycin is the principal treatment for systemic MRSA infections. A broad-spectrum antibiotic, it is often prescribed on the basis of symptoms and in the absence of any microbiology data. As a result, it is the most-prescribed antibiotic in US hospitals.
Given this history, resistance would be expected to emerge. About 20% of the population carries S. aureus, and so exposure of Staph strains to vanco is continuous and widespread, even in patients who do not have Staph infections. These are the perfect conditions for the development of resistance. Indeed, the rate of vancomycin resistance in another common Gram-positive pathogen, Enterococcus, has risen to some 30%.
Yet outright vancomycin resistance in S. aureus has remained vanishingly rare. A few cases emerged in Michigan and Pennsylvania in the early 2000s, but were contained and have not re-emerged. In response to these cases, the NIH chartered the Network on Antimicrobial Resistance in S. aureus (NARSA) in an attempt to get out in front of a potential epidemic of untreatable Staph infections.
The epidemic never materialized, and no one is sure exactly why. The gene that confers vancomycin resistance in most Enterococcus strains, vanA, also produces resistance when introduced into S. aureus, and was found in the VRSA isolate described in the Lancet paper. S. aureus strains that harbor vanA are not obviously less fit. Even if they were, they would be expected to evolve - the first MRSA strains grew very poorly and were quickly outcompeted by susceptible strains in the absence of methicillin selection. S. aureus does not readily take up DNA from other organisms in its environment. But Staph and Enterococcus are frequently found together in wounds (as they were in the Portuguese case) and other infections, and even once-in-a-trillion events like vanA gene transfer must have occurred many millions of times.
We've been virtuous (chartering NARSA was an act of prudence and foresight), but mostly lucky. An outbreak of VRSA that spread like MRSA would create an epidemic of truly terrifying proportions. Almost everyone expects it to happen eventually. The question is whether we will use the respite we've been granted to develop effective alternatives to vancomycin when it begins to fail.

Monday, June 24, 2013

Why there aren't more antibiotics, continued

Short answer: The low-hanging fruit has all been picked. For that matter, so has the medium-hanging fruit, and much of the very highest fruit as well.

It's worth taking a minute to review just how most antibiotics have been discovered and developed. Nearly all human pathogens (Mycobacterium tuberculosis is an exception) grow readily in flasks and Petri dishes. This growth typically takes a day or two to cover a plate and can be easily detected by simple inspection.

Spread Staph aureus on a plate and put a drop of some test compound on the plate. Come back the next day, and if your drop has killed the bacteria, there will be a clear spot. Congratulations, you have discovered an antibiotic, and possibly a drug lead compound.

This is how the first antibiotic compounds, such as sulfonamide, penicillin and streptomycin were discovered. Modern methods of screening are much faster and more sophisticated, but rely on the same principle - expose bacteria to a compound and see if they stop growing or are killed.

Of course lots of things kill bacteria but aren't useful drugs - they are toxic, or are poorly absorbed. Medicinal chemists address these problems by adding various chemical groups to a lead compound and repeating the testing process until they find a derivative of the original compound that works as desired.

Pharma companies got very good at this in the 50s, 60s and 70s, and are even better at it today. Even though resistance began appearing almost immediately, there was a robust pipeline of new antibiotics that would keep working when the old ones would not.

So why aren't we finding new antibiotics at a rate that keeps up with resistance anymore? Well, think about all the criteria that a useful antibiotic has to fulfill: it must kill, or at least stop the growth of pathogenic bacteria; it must have high potency so that only a small amount of drug is needed; it preferably is water-soluble and absorbed through the digestive system; it persists for several hours once absorbed and is not rapidly cleared or degraded by the liver and kidney; and neither it nor its breakdown products are toxic to humans.

Bacteria and humans share the same basic biochemistry: we use oxygen to burn sugar to create energy in the same way; our DNA encodes and expresses genes in very similar ways; and we replicate our DNA in very similar ways. One of the biggest differences between humans and bacteria is that they have a cell wall and we don't.

There are thus a finite number of differences between bacteria and humans; only so many biochemical candidates for antibiotic targets. Pharma companies have been working on them for decades now, investing many millions in the effort. More importantly, soil fungi (whose biochemistry is more similar to humans than it is to bacteria) have been at work on this problem for billions of years. They are in a never-ending worldwide effort to find compounds that will stop the growth of bacteria, and thus give them an advantage in the contest for life.

The really good solutions that they found enabled them to grow more and thus become widespread. That's why most good antibiotics in the soil are not hard to find, but easy - they helped fungi to be successful.

This is why the best antibiotics were discovered first, with penicillin being the prime example. The class of drugs to which it belongs - the beta-lactams - are still among the most effective and best-tolerated antibiotics. They target the bacterial cell wall, a structure that has no human counterpart.

The likelihood of finding a new class of antibiotics that is as good as the beta-lactams is zero. New antibiotics will be much more narrowly targeted, preferably toward biochemical pathways that are found only in certain groups of pathogens. Because this approach requires a substantial R&D effort, pharma companies will only make the effort if the financial incentives and the regulatory environment are right. Until then, we can't expect to see much progress.

 

Saturday, June 22, 2013

Rising Plague, the trailer

Brad Spellberg, author of Rising Plague, is at work with Rich Proctor and David Gilbert on a documentary that shows the human costs of antibiotic resistance. The trailer is now available. It shows a scenario that is increasingly common: healthy young people are devastated by infections that hospitals can do little to treat. The two subjects of the trailer survive, but their health is permanently compromised.

It's a very good explanation of the causes and consequences of antibiotic resistance, and of what we need to do about it. Please watch.

 

Wednesday, June 19, 2013

Antibiotic journalism, the bad and the good

Two responses to the G8 ministers' communication on the threat of antibiotic resistance:

  1. Hysteria and hyperbole reading like bad science fiction from the Sunday Times
  2. A very smart, skeptical and informed interview of Sally Davies, the UK Chief Medical Officer by Geoffrey Carr of the Economist. Killer question: "Is there any way that doctors can be induced to behave more sensibly?" (at about 7:10). Dame Sally's response was suitably diplomatic.

FDA flexibility with FMT

The FDA issued a statement Monday stating that it will not require filing and review of an IND for Fecal Microbiota Transplant procedures, so long as there is adequate informed consent. The agency still asserts its authority to regulate the procedure, and intends to issue guidance covering this "enforcement discretion".

This seems about right to me - adequate informed consent would be a part of any responsible practitioner's protocol at present anyway, so no one will be denied treatment. In the meantime there is an opportunity to have a conversation about how to ensure the procedure can be made safe and effective and part of the medical mainstream. The worst outcome would be to leave it unregulated and suffer the inevitable disaster that will discredit FMT.

Broad-spectrum badness

Here is another report describing high rates of inappropriate antibiotic use, this time in outpatient dialysis centers. Fully 30% of the antibiotic prescriptions written were inappropriate, usually because criteria for infection were not met, or because a broad-spectrum antibiotic was used when a narrow-spectrum antibiotic was the better choice.

In an earlier post I made the point that developing broad-spectrum antibiotics is not necessarily a good thing. The rationale here is that such antibiotics, while providing some therapeutic benefit, are usually not optimal. Narrow-spectrum antibiotics are preferred when the identity and antibiotic susceptibilities of the infecting bug are known, from the standpoint of both patient benefit and antimicrobial stewardship.

Broad-spectrum antibiotics thus enable second-best prescribing practices - they allow doctors to treat patients without ordering a microbiology report, or to neglect following up on the report. Broad-spectrum antibiotics such as vancomycin are valuable medicines - but it is apparent that they can also be used as a crutch to support inferior medical practices.

The end of the golden age

I should explain what I mean by "The End of the Antibiotic Era". It's not that antibiotics will cease to be used or to be useful. Or that new antibiotics won't be discovered. Instead, I'm defining the antibiotic era as the period in which the average well-trained doctor could prescribe antibiotics on the basis of clinical signs and symptoms alone, and be highly confident that they would work. This era started in the late 1930s and 1940s with the introduction of sulfonamides and penicillin, and (I would say) ended in the 2000s when methicillin resistance in S. aureus became widespread.

The impact of antibiotics was profound and is certainly comparable to any other technological advance of the period. Only the introduction of clean water and food has clearly had a greater impact on human health, although an argument could also be made for vaccination. Death rates from infectious disease dropped from 250 per 100,000 Americans in 1937 (the beginning of the golden age of antibiotics) to about 50 by 1953. The effect on infant mortality was even more dramatic, dropping from 5500 per 100,000 live births in the late 1930s to half that by the mid-1950s. Even if only half of this drop is attributable to antibiotic use, that works out to more than 120,000 lives saved per year.
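The arithmetic behind that lives-saved figure is straightforward. The mortality rates and the "half attributable" discount come from the text; the mid-century US population is my assumption, plugged in purely for illustration:

```python
# Lives-saved estimate from the mortality figures in the post.
# rate_1937 and rate_1953 are deaths per 100,000 from the text;
# the US population figure is an assumption, not from the post.
rate_1937 = 250
rate_1953 = 50
attributable_fraction = 0.5         # "even if only half of this drop..."
us_population = 150_000_000         # assumed mid-century US population

drop_per_100k = rate_1937 - rate_1953
lives_saved_per_year = drop_per_100k / 100_000 * attributable_fraction * us_population
print(round(lives_saved_per_year))  # 150000 -- consistent with "more than 120,000"
```

Any reasonable population figure for the period gives a number north of 120,000, so the conclusion is not sensitive to the assumption.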

Of course, resistance emerged soon after antibiotic use became common. But pharmaceutical companies became very good at identifying lead compounds (usually from soil microorganisms) and then chemically modifying them to enhance uptake, reduce toxicity and thwart resistance mechanisms. Aminoglycosides, macrolides, tetracyclines, quinolones and cephalosporins were introduced and then improved on, generation by generation. Not only were these medicines highly effective, they became remarkably cheap. The cost of a life-saving course of antibiotics was (and usually still is) less than that of a dinner at a nice restaurant.

Because these drugs were also very safe and nontoxic, they were used as placebos and prophylactics. I can well remember our whole family lining up for penicillin injections in the 1960s when we reported to the doctor's office with colds. I'm sure he knew that we did not have bacterial infections, but saw little downside: we went away happy, he made a few extra bucks, and could rationalize that he was practicing preventive medicine. In the current era, anyone who practices this sort of Dr. Feelgood approach to antibiotic administration should be considered a public nuisance and threat to public health, and be dealt with accordingly.

As we leave the golden age of antibiotic effectiveness, there are several changes that we can expect to see:

  • More people will sicken and die from bacterial infections. This is already happening of course. I don't expect mortality rates to return to pre-antibiotic levels because of better public health infrastructure and supportive care, and because new antibiotics will be introduced. But nearly everyone will know someone, or be someone, who has suffered from a serious infection that could not readily be resolved due to resistance.
  • Because of this suffering, antibiotic effectiveness will come to be viewed as a public resource, or commons. Abusing antibiotics is no less a tragedy of the commons than fouling the water or air - shotgun prescribing may provide an immediate benefit to a sick individual, but it will end up making many more people sick.
  • Doctors will not be able to resolve this tragedy on their own. Most are aware that prescribing antibiotics empirically (i.e., in the absence of any test results that indicate the appropriate drug to prescribe) contributes to the problem of increased resistance. But faced with a seriously ill patient, it is too much to ask of doctors that they withhold a treatment that might work, due to theoretical concerns that someone else, somewhere down the line, is somewhat more likely to get an untreatable infection.
  • Therefore antibiotic use will become much more regulated. Your GP, upon seeing a spot in your chest X-ray, cannot start treating you with cisplatin or any other powerful cytotoxic therapy. Similarly, the use of whole classes of antibiotics will increasingly be restricted to specialists, often with the involvement of hospital pharmacists.
  • As a result, the cost of antibiotic therapy will rise significantly. On the whole, this will be a good thing. Antibiotics will be used more discriminately, and profit margins will increase, incentivizing new discovery R & D. So long as these costs are equitably distributed, public welfare will be increased.
  • A market will be created for diagnostics that can rapidly determine antibiotic resistance and susceptibility, so that antibiotics can still be prescribed in a timely way, but based on evidence. Several accelerated tests, including one from my former employer (MicroPhage, Inc), have been introduced. None has gained much traction - hospitals don't yet see appropriate antibiotic use as a sufficiently compelling problem to warrant the extra costs of testing, which they usually cannot bill to insurers. This will change.

In short, the end of the golden age of antibiotics will not mean the collapse of civilization - climate disruption or new virus emergence are much more likely candidates for that role. But people will die, changes will have to be made, and we'll all wish we had made them much sooner.

Monday, June 17, 2013

Looking the part

Wow. Stephanie Dancer has written an editorial for the British Medical Journal lashing out at the apparent plague of doctors who, by not wearing neckties, "...intimate a lack of personal hygiene and correspondingly lower standards of hygienic behaviour". She wants all such "scruffiness" to stop immediately, because lack of a tie could "indicate something more sinister".

Several studies (like this and this) have found that neckties carry pathogens, which is hardly surprising as they are a) handled by ungloved hands, b) apt to brush up against contaminated surfaces and c) rarely laundered.

Mike Edmond has disposed of the more incoherent of Dancer's objections and arguments. But it's pretty clear from the spluttering tone of the editorial that it is really all about maintaining status because "Doctors are members of a distinguished profession and should dress accordingly". I suspect that her notion of an ideal patient-doctor relationship would look a bit like this:

No one has ever proved that a patient was infected by a contaminated necktie (or long-sleeve shirt, or lab coat), but ... really? I would think that knowing you are carrying around a contaminated item of clothing would be all the evidence that's needed to make a change. Unless of course your first priority is to look like a TV doctor.

 

Friday, June 14, 2013

Why there aren't more antibiotics

Well, there are more reasons than can be covered in a single post, so I'll start with just one: economics. More specifically, return on investment. And the economics here, as in so much of medicine and health care, are driven by fear rather than by a rational cost-benefit analysis.

Antibiotics have been so good for so long that few people fear losing their life to a bacterial infection. Antibiotics are one of the few medicines that reliably cure people rather than just alleviate symptoms or slow disease progression. And they do it with so few side effects that doctors feel comfortable prescribing them just in case they might actually be helpful, or to make a patient feel like they are being treated. In other words, they are used as both prophylactics and placebos, as well as effective treatments for specific diseases. I can't think of any other class of medication that answers to this description.

And they are cheap. A daily dose of vancomycin costs about $10, and it will reduce the mortality rate of systemic MRSA infections from about 80% to 30%. Think about that - for a couple hundred dollars, patients go from near-certain death to a decent chance of a full recovery. Daptomycin, at around $120/day, is considered a very expensive antibiotic, prescribed reluctantly for MRSA by some physicians, even though it will give better outcomes for some infections.

For perspective, compare these costs to those of the newer targeted cancer chemotherapies: $50K (and higher) for treatments that make patients miserable and extend life by weeks or months at best.

Any rational analysis of life-days saved by the most expensive antibiotics would conclude that they are more valuable than targeted cancer chemotherapies by at least a factor of ten. A rational market would price them accordingly, attracting more investment into new antibiotic R&D.
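A minimal version of that rational analysis, with the assumptions labeled loudly: the antibiotic course cost and mortality drop are from the post, but the survival gain for a cured patient and the cancer-therapy figures below are illustrative placeholders, not data:

```python
# Back-of-envelope comparison of cost per expected life-day gained.
# Antibiotic figures (course cost, mortality drop) are from the post;
# the 10-year survival gain for a cured patient and the cancer-therapy
# numbers are assumptions for illustration only.
vanco_course_cost = 200             # "a couple hundred dollars"
mortality_drop = 0.80 - 0.30        # 80% -> 30% with treatment
assumed_years_gained = 10           # assumption: life-years for a cure
antibiotic_days = mortality_drop * assumed_years_gained * 365
antibiotic_cost_per_day = vanco_course_cost / antibiotic_days

cancer_course_cost = 50_000         # "$50K (and higher)"
cancer_days_gained = 60             # assumption: "weeks or months at best"
cancer_cost_per_day = cancer_course_cost / cancer_days_gained

value_ratio = cancer_cost_per_day / antibiotic_cost_per_day
print(value_ratio > 10)  # True -- supports "by at least a factor of ten"
```

Even if the assumed survival gain is cut by an order of magnitude, the ratio stays well above ten, which is why the exact placeholders don't matter much.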

But that's not likely to happen, and the chief reason is fear. Because antibiotics have been such good drugs (and cancer drugs have been such bad ones) we don't fear death from bacterial infections nearly as much as we fear cancer. The result is that cancer has been romanticized. Think about how many book, movie and mini-series plots are driven by a battle with cancer; how cancer patients are encouraged to view their illness and treatment as a hero's journey; and how we as a society have declared war (the most romantic pursuit of all) on cancer.

There's no mystery here about the economics of antibiotic development - they stink. Drug companies are much better off putting their resources into drugs they can charge stratospheric prices for (i.e., cancer treatments) or ones they can keep patients on for life (like statins). Drugs that are cheap and actually cure people in a couple of weeks are just not going to be on their radar.

 

Thursday, June 13, 2013

A new target for broad-spectrum antibiotics?

New targets for antibiotics are not so common, so the recent report from Ken Keiler's lab of small molecule inhibitors of the trans-translation pathway in bacteria is pretty noteworthy. Collaborating with Novartis, who presumably provided the library, they identified several inhibitors that are effective at low micromolar concentrations. This is about 100 times more potent than streptomycin, the first antibiotic discovered to target bacterial protein synthesis, and comparable to more modern protein synthesis inhibitors such as clindamycin.

Furthermore, the inhibitors work on a biochemical pathway that appears to be confined to bacteria, and is found in a wide variety of bacteria. Thus it appears that Keiler et al have found promising leads for a new class of broad-spectrum antibiotics.

However, I'm not altogether certain that this is good news. The current exemplar of a broad-spectrum antibiotic is vancomycin, the most-prescribed antibiotic in US hospitals. It's cheap, has clinical benefit for a wide variety of bacterial infections, and is only mildly toxic. These properties make it a default choice when the identity of the infecting bacteria is unknown. And since it usually takes 3 days to isolate and identify bacteria, it's easy for a physician to continue vancomycin therapy, even when practice guidelines indicate that a switch to a more selective agent is warranted.

The results are predictable. Enterococcus, which was once considered a commensal, has become widely vancomycin resistant and opportunistically virulent as other microbiota are wiped out. There are increasing reports of vancomycin tolerance and clinical failure in S. aureus infections.

A new class of broad-spectrum antibiotics would be welcome of course, as it would provide physicians with a tool against bacterial infections that do not respond to other treatments. But it would also further enable lazy prescribing practices by those doctors who are unable or unwilling to order the microbiology workup, or to bother understanding it. The inevitable result will be the emergence of resistance.

There probably aren't a lot of new targets out there for antimicrobial development. We can't afford to waste any of them. Yet this is precisely what will happen so long as any doctor is allowed to prescribe any antibiotic for any indication.

 

Wednesday, June 12, 2013

The resistome: it's not the 1%

PLoS One published a paper by Walsh and Duffy that reports the following:

  • Culturable soil bacteria are all antibiotic resistant at the >20 µg/ml level
  • 80% are multi-drug resistant
  • Efflux pumps and innate resistance are the predominant mechanisms
  • The genes and enzymes responsible for clinical resistance don't seem to be in evidence

So it appears at first glance that soil is not acting as a reservoir of antibiotic resistance genes waiting to be recruited by clinical strains. Given that nearly all clinical antibiotics are derived from soil microbes, this is something of a surprise. It would have been a nice tidy story if Walsh and Duffy had shown that we get antibiotic resistance from the same place we get antibiotics.

 

So where do resistance genes come from? It seems unlikely that they could have arisen de novo in the clinic in the last 70 years - it is much more plausible that there is a well-established environmental reservoir that serves as a resistance bank from which clinical strains have been making withdrawals.

 

The best bet is still the soil, as that's where the antibiotics are. Walsh and Duffy did not want to make any assumptions about enzymes or genes, so they ran a phenotypic screen: they exposed soil bacteria to antibiotics and asked if they still grew as well in their presence as they did in their absence. The weakness of this approach, as the authors acknowledge, is that only about 1% of bacterial strains will grow as pure cultures under lab conditions. The rest require more complex conditions for growth, most especially including other microbes that form a community. It's these community-loving bacteria - the 99% - that are the most likely source of current and future resistance genes.

 

For more on the sources of resistance in the environment, see this paper.

 

Stools, fans and glands

There is a fair amount of outrage over the FDA's decision to begin regulating fecal transplants. After all, it is a method that appears to be truly effective for treating refractory C. difficile infections, which will kill some 14,000 Americans this year. Since the treatment has not been cleared by the FDA, practitioners will have to file an Investigational New Drug application for each study. The time for preparation and approval will be well over a month, meaning that this therapy will effectively become unavailable for acutely ill patients. The frustration that is being felt by patients and doctors is entirely understandable.

But the FDA's position is equally understandable and should result in greater benefit to the patient population in the long run. Fecal transplantation is now practiced by a small cadre of expert physicians and microbiologists who are well-informed and highly motivated to make the procedure work. But every aspect of the procedure - from making the fecal "cocktail" to testing its safety to administering the dose - is a home brew based on some science, a little bit of experience, and fairly large doses of intuition.

Possibly the optimal conditions for these parameters are so broad and so forgiving that the precise details really don't matter much. That would be great. But we'll never really know until standardized protocols are tested against each other. In the meantime, we are sure to see a lot of jockeying for credit.  After all, there is probably a Nobel Prize at stake here, as there should be for the creators of a treatment that might save many thousands of lives.

 My experience with research scientists is that they tend to have the bitterest fights over the least important details of protocols: when they are all equivalent, the precise choice of procedure is a marker of personal validation, and emotions run high. I expect doctors to be no less insecure and egotistical.

So if the FDA were to do nothing here is what we could expect:

  • Dramatic differences in quality of treatment between institutions
  • Belated discovery of serious side effects
  • Inconclusive and unresolvable controversies over best practices
All this would end up discrediting the procedure for a long time, possibly forever, as confused physicians turn to other, more mainstream alternatives such as fidaxomicin.

The FDA's actions will undoubtedly deprive some patients of a valuable therapy. But they will keep fecal transplantation from becoming the modern equivalent of goat gland therapy.

The origins of second-best medicine


Medicine is not a science, but an art empowered by science. So it is little surprise to find that the same social factors that shape other workplaces - habit, hierarchy and deference to colleagues' personal sensitivities - shape the antibiotic prescribing practices of physicians. Who likes to stand up and tell an esteemed colleague that they are making a mistake? Well, some people do, but they soon get labeled as habitual contrarians to be ignored or suppressed. Not a career path for success.

Esmita Charani and colleagues at The National Centre for Infection Prevention and Management in London have put a bit of science behind the suspicion that doctors behave in their workplace pretty much like everyone else does in their workplace. In a series of structured interviews with doctors, nurses and pharmacists, they found that evidence-based guidelines and policies counted for much less than personal experience, intuition and personal authority when it came to actual antibiotic prescribing practices. 

How big a problem is this?  After all, doctors care about giving the best care possible, and want to see their patients get better.  If poor prescribing practices were leading to bad outcomes, surely they would stop those practices - right?

The problem is not that doctors don't care; it's that they come equipped with human brains. And human brains are notoriously unreliable when it comes to evaluating success rates. We are good at remembering a few spectacular successes, thinking that they validate our competence. We forget the equally spectacular failures, believing that they are due to an unforeseeable combination of circumstances and bad luck. Most of all, we are just plain bad at distinguishing real differences in success rates from natural fluctuations due to pure chance.

Baseball offers a great example of this phenomenon. Over the course of a week of baseball games, a star player will typically get 7 or 8 base hits. A journeyman would get 6, and a scrub would get 5. The difference between best and worst is a single event every few days. No one could possibly tell the difference just by watching. That's why baseball teams keep score - so that they always put the best players out on the field, rather than trust to recent experience, reputation or intuition.
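A quick simulation makes the point concrete. This is a minimal sketch with assumed numbers (about 25 at-bats per week; a .300 hitter for the star, .200 for the scrub, which reproduce the weekly hit counts above); it shows how much the week-to-week counts bounce around and overlap.

```python
import random

random.seed(42)

def weekly_hits(batting_avg, at_bats=25):
    """Simulate one week of base hits for a hitter with the given average."""
    return sum(random.random() < batting_avg for _ in range(at_bats))

# Assumed figures: a .300 star averages 7-8 hits in ~25 weekly at-bats,
# a .200 scrub averages 5 - matching the numbers in the text.
star_weeks = [weekly_hits(0.300) for _ in range(10)]
scrub_weeks = [weekly_hits(0.200) for _ in range(10)]
print("star, hits per week: ", star_weeks)
print("scrub, hits per week:", scrub_weeks)
```

Run it a few times with different seeds: in many individual weeks the scrub out-hits the star, which is exactly why an eyeball judgment fails and a season-long box score is needed.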

For critical diseases, such as Staph aureus bacteremias, the differences in outcomes between the best prescribing practices and the second-best are about the same as the differences between a star player and a scrub. MSSA patients who are prescribed general-purpose broad-spectrum antibiotics have roughly a 25% chance of dying; those whose prescriptions are evidence-based and specifically targeted have their risk of death reduced by half.

An individual doctor, who might see only 25 or fewer MSSA bacteremia cases per year, is not likely to notice the extra death or two that results from suboptimal prescribing practices. No doubt that patient was weaker to begin with, and had other complications. Nor will they be aware that the survivors spent more time in the ICU and more time in the hospital than necessary: they have treated their patients with a therapy that was helpful, and most have survived. Why question what seems to be working?
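The same kind of simulation shows why the extra deaths are invisible at the level of one doctor's caseload. This sketch uses the rates assumed in the text (25 cases per year, ~25% mortality on broad-spectrum therapy, roughly half that on targeted therapy) and compares ten simulated years of each.

```python
import random

random.seed(7)

def annual_deaths(n_cases=25, mortality=0.25):
    """Deaths among n_cases bacteremia patients at the given mortality rate."""
    return sum(random.random() < mortality for _ in range(n_cases))

# Assumed rates from the text: ~25% mortality on broad-spectrum therapy,
# roughly half that (12.5%) with targeted, evidence-based prescribing.
broad = [annual_deaths(mortality=0.25) for _ in range(10)]
targeted = [annual_deaths(mortality=0.125) for _ in range(10)]
print("broad-spectrum deaths per year:", broad)
print("targeted deaths per year:      ", targeted)
```

The two lists overlap heavily year to year, even though one practice kills twice as many patients on average: only pooled data over many doctors and many years can separate them.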

Individual human brains are basically incapable of getting this right; we can't distinguish best from second-best outcomes whose frequencies differ by 10 or 20%. We need well-designed and well-controlled studies to identify these differences, and then we need to pay attention to them. But this can't happen if we defer to authority, trust intuition over published best practices, or remain quiet for fear of treading on a colleague's turf.

About 5,000 people are killed by MSSA infections each year in the US, and it is likely that the majority received suboptimal antibiotics. So it is safe to say that the behavior described by Charani et al. is responsible for up to a thousand excess deaths per year - for just one bad bug. That seems a pretty high price to pay to manage doctors' egos.