Tuesday, December 31, 2013

High prices for antibiotics - a good thing

The problem of antibiotic resistance is getting another profile boost with its inclusion in the NYT's "Room for Debate" series. There is actually little debate in the section, save the attempt by the Pork Council to downplay the threat of overuse of antibiotics in the livestock industry. Instead, there are good, but somewhat unconnected pleas for test-driven use of antibiotics (David Gilbert), better financial incentives for antibiotic R&D (Brad Spellberg), a more rigorous, top-down approach to infection control (John Bartlett), and advocacy of phage therapy (Matti Jalasvuori).

I'd like to connect the dots by suggesting that most of our problems with antibiotics have an underlying root cause: that antibiotics are too cheap.

Cheap antibiotics do not produce a return on investment for R&D. Cheap antibiotics are too readily prescribed for minor respiratory infections that are predominantly viral. Cheap antibiotics can be used as growth promoters for livestock, and as a substitute for clean living conditions.

If a life-saving course of antibiotics cost thousands of dollars, rather than hundreds, pharma companies would race to develop them. If Z-packs cost hundreds of dollars, doctors would not be so quick to prescribe them for sore throats. If it cost tens of dollars, rather than pennies, to dose hogs with tetracyclines, then routine use of them would cease.

Antibiotic susceptibility in bacteria is not precisely a finite resource - it's not as though only a certain number of prescriptions can be written before a given drug becomes useless. But it is certainly a common resource, one that can be depleted rapidly or husbanded for the use of future generations. We have been choosing to deplete antibiotics' effectiveness rapidly, but we don't have to.

We could preserve effectiveness through a top-down approach, placing limits on who can prescribe antibiotics or requiring test results before prescription. But such regulations would quickly become outdated as new technologies and diseases emerge, and regulation would provide no financial incentive for R&D investment.

In contrast, high prices would make development of narrowly-targeted antibiotics more attractive, and would rein in overuse in both humans and livestock. Although individuals might pay more, society and the health care system would pay less, as there would be fewer patient deaths and fewer long stays in the ICU due to untreatable infections.

How would we obtain high-enough prices for antibiotics? Cancer drugs often cost many tens of thousands of dollars for a course of therapy. It's not clear how they obtained this pricing power, other than the fear of cancer that our society has cultivated. We've lost our fear of bacteria, but starting a scare campaign to restore it hardly seems ethical or feasible.

A conservation tax would raise prices to end users, but it would not provide a ROI for developers unless the tax proceeds were channeled back to them - a system that would be susceptible to abuse. Another approach might be to set a floor on prices, much as we do for milk and other commodities. A guaranteed price would reduce market risk for developers, and would also suppress overuse of currently available antibiotics. Tying the floor price to trends in the development of resistance would make this system more effective, and somewhat less arbitrary and prone to abuse.

Higher antibiotic prices would reflect their true value to society. They would let us get to a future where new antibiotics are being developed, while old ones maintain their usefulness. It's a step we need to take.


Wednesday, December 25, 2013

Empiric therapy - not a substitute for rapid testing

In the last post I wrote about the value proposition of narrowly-targeted antibiotics as developed by Spellberg and Rex. Their model has an implicit assumption - that the hypothetical high-priced antibiotic will be prescribed in a timely manner to patients who will benefit from it, and only to those patients. My point in that post was that a companion rapid diagnostic would be needed in order to realize this scenario, that such a test is likely to have a significant cost as well, and that this cost needs to be factored into the analysis.
David Shlaes has taken issue with this view, explaining that
"Luckily, physicians don't wait for the diagnosis before treating. They treat empirically - and if they are in a hospital where MDR Acinetobacter is a risk - they will use a new expensive drug to cover for that possibility until they have confirmation or not. "
He is of course correct. No physician faced with a failing patient would withhold a drug that might have a significant therapeutic benefit. But this practice has all sorts of implications for the value of narrowly-targeted antibiotics, and none of them are good. I'm not sure that "luckily" is the adverb I would have chosen to describe the situation.
Let's be clear - empiric prescribing is a necessary response to imperfect information. But there should be no illusions that physicians have some magic powers that make empiric prescriptions much better than guessing. Indeed, a careful study of prescribing practices is entitled "Empirical antimicrobial therapy for bloodstream infections ...: No better than a coin toss". As the title suggests, when faced with a binary choice between choosing an antibiotic that is optimal for MRSA and one that is optimal for MSSA, doctors did no better than predicted by chance. There is no reason to think they would do any better in choosing optimal antibiotics for Gram-negative infections.
Combine this element of chance with the fact that the organism of interest in this scenario is not all that common, and the case for the value of an antibiotic narrowly targeted toward it can deteriorate pretty quickly. Acinetobacter is unlikely to account for more than 10% of Gram-negative infections in any hospital in North America today. Although not dominant, that level is high enough that it must be considered in any patient with a serious infection, and physicians will have to treat accordingly.
If they prescribe the antibiotic to cover possible Acinetobacter infections in half of these patients when only 10% are actually infected, then there are several unintended consequences, all of them bad (except for the 10% who actually have Acinetobacter infections).
First, the value proposition for the antibiotic becomes far weaker. In Spellberg and Rex's median case the antibiotic costs $10K per course, and the net cost for an additional year of life saved is only $3K (anything less than $50K is considered a favorable cost-benefit ratio). But if the antibiotic has no benefit for the patients who turn out not to have an Acinetobacter infection (and this is stipulated in their scenario), then every course given to those patients is a dead loss economically. If 50% of patients receive the antibiotic when only 10% are infected with Acinetobacter, then the cost per year of life saved goes from $3K to $15K, a much less attractive proposition.
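To spell out the arithmetic in that first point, here is a minimal sketch: under this simple framing, the cost per life-year scales with the ratio of patients treated to patients who actually benefit. The $3K baseline and the 50%/10% scenario are taken from the discussion above; the linear scaling itself is my simplification, not a reproduction of Spellberg and Rex's full model.

```python
# Sketch: how empiric over-prescribing dilutes the cost-effectiveness of a
# narrowly targeted antibiotic. The $3K baseline and the 50%/10% scenario come
# from the text above; the simple linear scaling is my assumption, not
# Spellberg and Rex's full model.

def cost_per_life_year(base_cost, fraction_treated, fraction_infected):
    """Scale the perfectly-targeted cost per life-year by the number of
    courses given per patient who actually benefits."""
    courses_per_benefiting_patient = fraction_treated / fraction_infected
    return base_cost * courses_per_benefiting_patient

# Perfect targeting: every course goes to a true Acinetobacter infection.
print(cost_per_life_year(3_000, 0.10, 0.10))  # -> 3000.0
# Empiric use: half of patients treated, but only 10% actually infected.
print(cost_per_life_year(3_000, 0.50, 0.10))  # -> 15000.0
```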
Second, the non-Acinetobacter patients are exposed to the risk of adverse events, with no corresponding clinical benefit to compensate. Since this is a hypothetical drug, there is no point in trying to quantify this risk, but it is surely not zero.
Third, this level of overuse will accelerate the development of resistance to the drug. The drug may be narrowly targeted by definition, but it is biologically implausible that it would have no negative effect on the fitness of any of the thousands of bacterial species that comprise the human microbiome. And once resistance develops in one species, it is only a matter of time until it spreads to many others, including the targeted pathogen.
Now imagine that there are several narrowly targeted antibiotics available: one for Pseudomonas, another for carbapenem-resistant Enterobacteriaceae, etc. In the absence of a rapid test, how will a physician decide which to prescribe? Should she prescribe all of them? If she does, all of the negative effects of inappropriate therapy listed above would be compounded.
All of these problems go away if there is a rapid test. If I were a pharma exec, I would not begin considering a program to develop a narrowly-targeted antibiotic unless I felt pretty confident that a companion rapid test could also be developed. The path to clinical and economic value is just too full of land mines otherwise.

Thursday, December 19, 2013

Can't have the drug without the test

As I've written before, antibiotics are too cheap. Drug companies simply can't get sufficient ROI on a product that is used once and costs a few hundred dollars. They would much rather develop drugs that patients take every day for the rest of their lives, or develop cancer medications that bring in $50-100K for a course of therapy.

Via David Shlaes, I see that Brad Spellberg and John Rex have looked into the question of how much a course of antibiotics could cost and still return a net benefit to all parties. Their hypothetical drug is narrowly targeted toward carbapenem-resistant Acinetobacter baumannii (CRAB), a bug for which all current therapies are ineffective or highly toxic.

They find that at $10K for a course of therapy, an Acinetobacter-specific drug which reduced mortality from 20% to 10% would end up costing some $3000 per year of life saved. The comparable value for Avastin, a best-selling cancer drug, is $168,000. In other words, even a $10K antibiotic is actually pretty cheap.
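One piece of that arithmetic is easy to reproduce: the drug cost per death averted is just the course price divided by the absolute mortality reduction. The sketch below does only that step; getting from there to roughly $3,000 per year of life saved requires assumptions from Spellberg and Rex's model (life-years gained per survivor, discounting, offsetting hospital costs) that I am not reproducing here.

```python
# Back-of-the-envelope piece of the Spellberg/Rex scenario: drug cost per
# death averted. Their full model (life-years per survivor, discounting,
# offsetting hospital costs) is what yields the ~$3,000-per-life-year figure,
# and it is not reproduced here.

course_price = 10_000     # $ per course of the hypothetical CRAB drug
mortality_without = 0.20  # mortality with current (ineffective or toxic) therapy
mortality_with = 0.10     # mortality with the new drug

deaths_averted_per_course = mortality_without - mortality_with   # 0.10
cost_per_death_averted = course_price / deaths_averted_per_course
print(cost_per_death_averted)  # -> ~100000.0 (drug cost per death averted)
```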

What Spellberg and Rex don't address is how that cheap/expensive antibiotic gets prescribed in the first place. Since the hypothetical drug is narrowly targeted, there first needs to be an identification that the infectious agent is indeed Acinetobacter, and that it is in fact carbapenem resistant. Standard hospital micro lab procedures can do this, but it typically takes 3 days to get a result. In that time the patient is being treated with ineffective antibiotics, and may well be past the point of recovery by the time the correct diagnosis is made.

The obvious answer - and I know Brad is an advocate of this - is to develop rapid diagnostics that would identify Acinetobacter and determine carbapenem resistance in a few hours from a patient sample or positive blood culture. The first part of this is very doable using nucleic acid technologies. Verigene has a research-use test that identifies Acinetobacter spp, and other platforms surely have the same capacity. However, carbapenemase genes come in lots of varieties, and developing a gene sequence test for them that has high sensitivity is no trivial task. And it's necessary, because 25-50% of Acinetobacter infections are still susceptible to carbapenems, and you don't want to use the new, expensive antibiotic unless it is known that other agents won't work.

But let's say it's doable - what then? To have an impact, the hospital micro lab will have to test every Gram-negative blood culture with this test. Acinetobacter infections are on the rise, but they are still only a few percent of bloodstream infections in the US. Let's say they are 4%. That means that the micro lab will have to run - and pay for - 25 tests to catch a single positive case. Current PCR tests for MRSA, which are much simpler than a prospective carbapenemase test, cost $50-100. A dedicated CRAB test would have to cost at least $200, when fully capitalized and staffed. So the micro lab will have to spend $5000 to make a single diagnosis that drives use of the new $10,000 antibiotic.
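The lab economics in that paragraph reduce to a two-line calculation; here it is spelled out, using the assumed 4% prevalence and $200-per-test figures from above.

```python
# The micro lab's testing burden per actionable CRAB diagnosis, using the
# figures assumed above (4% prevalence among Gram-negative bloodstream
# infections, ~$200 per test when fully capitalized and staffed).

prevalence = 0.04    # assumed fraction of Gram-negative bloodstream infections
cost_per_test = 200  # assumed fully loaded cost of a dedicated CRAB test, in $

tests_per_positive = 1 / prevalence
lab_cost_per_diagnosis = tests_per_positive * cost_per_test
print(tests_per_positive)      # -> 25.0 tests run per positive case found
print(lab_cost_per_diagnosis)  # -> 5000.0, lab spend per $10K course prescribed
```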

That's a lot, but it still makes good economic sense, from a societal standpoint, to spend that money and treat the infection effectively. But society (at least in the US) doesn't make the spending decision - individual actors, such as doctors and microbiology lab managers, do, and their incentives do not always align with society's.

We found this out the hard way at MicroPhage. Our test allowed patients with methicillin-susceptible S. aureus infections (about half of all S. aureus infections) to be taken off empiric vancomycin and be treated with more-effective and less-toxic beta-lactams. It cost $50, and a typical hospital would have to spend about $2000 in testing to get an actionable result. The savings in health care costs would be $10,000+, making the test very cost-effective.
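The same kind of arithmetic applied to the MicroPhage test, using the figures just quoted; the number of tests per actionable result is my own inference from those figures rather than a number from the original analysis.

```python
# The same arithmetic applied to the MicroPhage MSSA test, using the figures
# quoted above. The ~40 tests per actionable result is inferred from the $50
# price and the ~$2,000 of testing per actionable result; it is not a figure
# from the original analysis.

cost_per_test = 50                    # $ per test
testing_cost_per_actionable = 2_000   # $ of testing per actionable result
savings_per_actionable = 10_000       # $ downstream savings per actionable result

tests_per_actionable = testing_cost_per_actionable / cost_per_test
net_savings = savings_per_actionable - testing_cost_per_actionable
print(tests_per_actionable)  # -> 40.0
print(net_savings)           # -> 8000; savings accrue elsewhere, not to the lab that pays
```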

It was an utter flop in the marketplace, with sales of about $50K in its first and only year on the market. The problem? The micro lab manager, who had to buy the test, paid all of the costs and saw none of the savings. That was pretty much a deal-breaker.

Absent significant structural reform, the hypothetical CRAB test would likely meet the same fate. And without the test, the antibiotic would have much less impact, and likely fail to generate a decent ROI.

None of this is meant as a criticism of Spellberg and Rex. But it points out another layer of structural barriers to the development of new antibiotics that we have unwittingly erected. Unfortunately, until all of these problems are solved, it's as if none of them are solved.


Tuesday, December 17, 2013

The sore throat menace

The IDSA's report on infectious disease IVDs is out now, and it is squarely focused on the need for tests that will enhance antibiotic stewardship efforts. In particular, a test that would distinguish viral from bacterial infections is needed. Tens of millions of patients seek medical aid for upper respiratory infections each year in the US. About 10% of them have bacterial infections, but 60% or more go home with a prescription for antibiotics. Worse, the fraction of these prescriptions that are for broad-spectrum antibiotics is steadily increasing.

This is pure folly in so many different ways. Antibiotic susceptibility is a finite resource. Of the few sore throats that are bacterial infections, nearly all are due to Strep pyogenes, and nearly all strains are susceptible to penicillin. How many patients who went home with Cipro later ended up with C. difficile diarrhea or became infected by some cephalosporin-resistant bug?

One place the report's authors did not go is advocating restrictions on physicians' ability to prescribe antibiotics as promiscuously as they have become accustomed to doing. Guidelines and education are all that is advocated. But this approach has been in place for decades and has stalled out: the antibiotic prescription rate for upper respiratory infections went from 80% of patients in the '90s to 60% in the '00s, and has stayed there ever since.

[Figure: antibiotic prescribing rates for upper respiratory infections over time, from Barnett and Lindner 2013]

Two developments are needed to have an impact. The technological fix would be a point-of-care test that distinguishes viral from bacterial infections, and this is certainly near the top of the IDSA's wish list. But availability of a test that will improve prescribing practices is not enough to ensure its adoption. We developed such a test at MicroPhage, and it sank like a rock in the clinical marketplace. There also needs to be some form of coercion - or, if you like, encouragement - to do the right thing rather than the expedient thing. The growth of electronic medical record keeping means that it should be possible to track MDs who prescribe excessive amounts of antibiotics. A letter from the state medical board, or the FDA, might be a good way of gaining the attention of these miscreants. Often, just knowing that someone is watching is sufficient to improve behavior.

If this seems heavy-handed, consider that these physicians are creating a public health hazard, while providing minimal clinical benefit to their patients and exposing them to an increased risk of adverse events. I think that is sufficient rationale for impinging on physician autonomy.


Wednesday, December 11, 2013

Breaking good?

Scheduling time with a doctor is difficult now. If the Affordable Care Act survives and is successful, it is likely to get harder, as 10 or 20 million newly insured citizens will be added to the patient pool. Doctors will have to churn through patients even faster than they now do, or we will need more doctors.

Is this possible? Is the number of doctors limited by the number of people who have the desire and ability to qualify for an MD? Or are we limiting the supply of MDs by constraining med school admissions and certification of foreign-trained doctors? The answer is almost surely the latter. Western European countries have, per capita, about 3 doctors for every 2 practicing in the US. Perhaps it's just a coincidence, but they also have better outcomes and lower costs than the US. Although the US public does not seem to benefit from our doctor shortage, doctors do - they are paid much more than their European counterparts.

The ostensible reason for limiting the supply of doctors is to maintain the quality of licensed physicians and thus protect the public. But a look at the process for screening applicants to medical school suggests that other factors are just as important.

I was reminded of this by a recent essay in the New York Times in which Barbara Moran, an aspiring med student, both deplores and defends the role of organic chemistry classes in weeding out prospective doctors. As anyone who has taken this class knows, pre-meds are frantic to get an A in it; anything less will disqualify them from admission to med school.

The ostensible rationale for this requirement is that doctors need a grounding in O-chem to understand biochemistry and pharmacology. This may sound good to anyone who is ignorant of both fields, as well as of the practice of medicine, but it is complete bunk. Even if doctors did remember what they transiently learned in O chem many years prior, there is simply no medical scenario in which they would put their knowledge of the principles of alkene oxidation into practice.

It's true that doctors should have a good understanding of biochemistry. But human biochemistry involves a very limited number of types of reactions (plant and microbial biochem is another story), and these are all mediated by enzymes in aqueous solution at neutral pH, not by the various catalysts and solvents one learns about in O chem.

Moran acknowledges that O-chem is pretty free of any practical applications for the practice of medicine, but defends its inclusion on the pre-med obstacle course anyway. She quotes her teacher, who claims that organic chemistry teaches "...inductive generalization from specific cases to something you’ve never seen before." A useful skill in doctors (and other human beings) to be sure, but inductive generalization is a part of just about any intellectual discipline, from philosophy to astrophysics.

Knowing some chemistry is good. Chemistry explains the transformation of one form of matter to another, and the flow of energy that drives living beings. No one would want to be treated by a doctor who didn't understand acid-base or redox chemistry, or who didn't know what a protein is, or how vitamins work.

But getting an A in organic chemistry is not necessary for any of this, nor is it a guarantee that a prospective doctor really understands the fundamentals of biochemistry. What it does signify is that a student was willing and able to memorize long lists of formulas and recite them quickly on demand. This is not a useless skill - anatomy and lists of drug contraindications require good memory skills. But there are plenty of good ways to assess this ability - such as testing prospective doctors' knowledge of anatomy and drug contraindications.

O chem is in reality a signifier, in much the same way that fraternity hazing rituals are. It shows a basal level of ability, but more importantly, it shows the commitment of the candidate to becoming part of the group. Or perhaps "caste" is the more appropriate term. Doctors, like other castes, are notoriously jealous of their prerogatives. They overwhelmingly oppose expanded responsibilities for nurse practitioners, despite 50 years of evidence that this does not result in reduced quality of care. They strive to maintain a dress code that sets them apart from their patients.

This is not the worst example of chemistry being used for a less than noble purpose. But it does make me wonder just how many potentially good doctors we have lost because they could not quickly regurgitate reaction diagrams that they would never use again.