Tuesday, May 30, 2006

Medical expertise

Previously I posted on the myth of expertise. Experts in many fields often perform no better than chance when making predictions.

This Businessweek article claims that no more than 25% of medical recommendations are supported by statistical data. Most medical practice is guided by tradition or by the doctor's intuition. The article profiles David Eddy, a heart surgeon who spent some time learning mathematics and earned a PhD in operations research. He revolutionized the field by pushing for "evidence-based" medicine. What is amazing to me is that this took so long and happened so recently.

...Even today, with a high-tech health-care system that costs the nation $2 trillion a year, there is little or no evidence that many widely used treatments and procedures actually work better than various cheaper alternatives.

This judgment pertains to a shocking number of conditions or diseases, from cardiovascular woes to back pain to prostate cancer. During his long and controversial career proving that the practice of medicine is more guesswork than science, Eddy has repeatedly punctured cherished physician myths. He showed, for instance, that the annual chest X-ray was worthless, over the objections of doctors who made money off the regular visit. He proved that doctors had little clue about the success rate of procedures such as surgery for enlarged prostates. He traced one common practice -- preventing women from giving birth vaginally if they had previously had a cesarean -- to the recommendation of one lone doctor. Indeed, when he began taking on medicine's sacred cows, Eddy liked to cite a figure that only 15% of what doctors did was backed by hard evidence.

A great many doctors and health-care quality experts have come to endorse Eddy's critique. And while there has been progress in recent years, most of these physicians say the portion of medicine that has been proven effective is still outrageously low -- in the range of 20% to 25%. "We don't have the evidence [that treatments work], and we are not investing very much in getting the evidence," says Dr. Stephen C. Schoenbaum, executive vice-president of the Commonwealth Fund and former president of Harvard Pilgrim Health Care Inc. "Clearly, there is a lot in medicine we don't have definitive answers to," adds Dr. I. Steven Udvarhelyi, senior vice-president and chief medical officer at Pennsylvania's Independence Blue Cross.

What's required is a revolution called "evidence-based medicine," says Eddy, a heart surgeon turned mathematician and health-care economist. Tall, lean, and fit at 64, Eddy has the athletic stride and catlike reflexes of the ace rock climber he still is. He also exhibits the competitive drive of someone who once obsessively recorded his time on every training run, and who still likes to be first on a brisk walk up a hill near his home in Aspen, Colo. In his career, he has never been afraid to take a difficult path or an unpopular stand. "Evidence-based" is a term he coined in the early 1980s, and it has since become a rallying cry among medical reformers. The goal of this movement is to pierce the fog that envelops the practice of medicine -- a state of ignorance for which doctors cannot really be blamed. "The limitation is the human mind," Eddy says. Without extensive information on the outcomes of treatments, it's fiendishly difficult to know the best approach for care.

The human brain, Eddy explains, needs help to make sense of patients who have combinations of diseases, and of the complex probabilities involved in each. To provide that assistance, Eddy has spent the past 10 years leading a team to develop the computer model that helped him crack the diabetes puzzle. Dubbed Archimedes, this program seeks to mimic in equations the actual biology of the body, and make treatment recommendations as well as figure out what each approach costs. It is at least 10 times "better than the model we use now, which is called thinking," says Dr. Richard Kahn, chief scientific officer at the American Diabetes Assn.

WASTED RESOURCES

Can one computer program offset all the ill-advised treatment options for a whole range of different diseases? The milestones in Eddy's long personal crusade highlight the looming challenges, and may offer a sliver of hope. Coming from a family of four generations of doctors, Eddy went to medical school "because I didn't know what else to do," he confesses. As a resident at Stanford Medical Center in the 1970s, he picked cardiac surgery because "it was the biggest hill -- the glamour field."

But he soon became troubled. He began to ask if there was actual evidence to support what doctors were doing. The answer, he was surprised to hear, was no. Doctors decided whether or not to put a patient in intensive care or use a combination of drugs based on their best judgment and on rules and traditions handed down over the years, as opposed to real scientific proof. These rules and judgments weren't necessarily right. "I concluded that medicine was making decisions with an entirely different method from what we would call rational," says Eddy.

About the same time, the young resident discovered the beauty of mathematics, and its promise of answering medical questions. In just a couple of days, he devoured a calculus textbook (now framed on a shelf in his beautifully appointed home and office), then blasted through the books for a two-year math course in a couple of months. Next, he persuaded Stanford to accept him in a mathematically intense PhD program in the Engineering-Economics Systems Dept. "Dave came in -- just this amazing guy," recalls Richard Smallwood, then a Stanford professor. "He had decided he wanted to spend the rest of his life bringing logic and rationality to the medical system, but said he didn't have the math. I said: 'Why not just take it?' So he went out and aced all those math courses."

To augment his wife's earnings while getting his PhD, Eddy landed a job at Xerox Corp.'s (XRX ) legendary Palo Alto Research Center. "They hired weird people," he says. "Here was a heart surgeon doing math. That was weird enough."

Eddy used his newfound math skills to model cancer screening. His Stanford PhD thesis made front-page news in 1980 by overturning the guidelines of the time. It showed that annual chest X-rays and yearly Pap smears for women at low risk of cervical cancer were a waste of resources, and it won the most prestigious award in the field of operations research, the Frederick W. Lanchester prize. Based on his results, the American Cancer Society changed its guidelines. "He's smart as hell, with a towering clarity of thought," says Stanford health economist Alain Enthoven.

Dr. William H. Herman, director of the Michigan Diabetes Research & Training Center, has a competing computer model that clashes with Eddy's. Nonetheless, he says, "Dr. Eddy is one of my heroes. He's sort of the father of health economics -- and he might be right."

..."At each meeting I would do the same exercise," he says. He would ask doctors to think of a typical patient and typical treatment, then write down the results of that treatment. For urologists, for instance, what were the chances that a man with an enlarged prostate could urinate normally after having corrective surgery? Eddy then asked the society's president to read the predictions.

The results were startling. The predictions of success invariably ranged from 0% to 100%, with no clear pattern. "All the doctors were trying to estimate the same thing -- and they all gave different numbers," he says. "I've spent 25 years proving that what we lovingly call clinical judgment is woefully outmatched by the complexities of medicine."

9 comments:

  1. Thanks for the post. Evidence-based medicine is an interesting trend. Positive to the extent that it genuinely improves the scientific basis of medical decision-making. Not-so-positive to the extent that it is used to justify "take an aspirin and call me in the morning" therapy-on-the-cheap. "Hey, there's insufficient *evidence* that new unfamiliar therapy you want is effective! Claim denied."

    I find that a lot of intelligent, well-educated people are disaffected with medicine because (my interpretation) they get opaque non-explanations from poorly-informed doctors. I can't help but wonder if the public's routine experience of spurious medical pseudo-science isn't a factor in the popular sentiment against science in general and evolution in particular. After all, if medical 'science' is so routinely wrong and self-contradictory, how could scientists be right about evolution?

    When you pit particle physics against theology, the contest is rather uneven. But most people have never really learned any hard physics or chemistry. The 'science' they hear about is limited to random medical assertions in the media about what is or isn't good for you or social 'science' arguments about what our kids need, etc. That stuff isn't obviously or consistently superior to the commonplace wisdom available from traditional sources like the Bible.

  2. STS,

    I agree completely. If you are scientifically literate, you will often be disappointed with your interactions with doctors.

    But the article points out something even stronger. In the past I had assumed that doctors' training was good and their decision making sound even if they couldn't articulate their reasons. (They learned something by rote and are just following the rules.) However, the article points out that most of the rules they learned are not even well justified!

  3. I can't believe I'm coming to the defense of doctors, but have either of you ever looked at clinical data? Nothing is ever clear-cut. We don't understand many of the most basic aspects of the biology behind diseases. Evidence-based medicine sounds good and should be practiced, but it will be decades, maybe centuries, before we can validate every single procedure. Currently, we just have correlations. High cholesterol is correlated with heart disease. We have theories of why it should be, but no real firm proof. Until we have biological mechanisms behind every disease, medicine will partly be art. To me it's amazing that they can even do the things they do with the little knowledge that they have.

    The real problem is that there is a huge disparity in competency among doctors. That is what I think the public doesn't understand. Only when people start shopping around for the best care will the quality of doctors become more uniform, and even rise. Doctors also have to start reporting on their colleagues if they see that harm is being done. We could go on for days about what is wrong with medicine today.

  4. But some of Eddy's examples (chest X-rays, etc.) seem pretty egregious.

    Also, the idea that you might be doing the wrong thing and not know it seems to have taken a long time to take hold.

    Anonymous, 6:00 PM

    carson chow wrote, "We have theories of why it should be, but no real firm proof."

    But you're conflating "empirical proof that something works" with "an understanding of how something works."

    Take the case of cholesterol and heart disease. A practical question is: statins lower cholesterol, but will that necessarily lead to improved cardiovascular outcomes, and improved health outcomes in general?

    The answer to this question can be gleaned (at least at the level of statistical inference) without ever knowing precisely how things work; see the sketch at the end of this comment.

    "The real problem is that there is a huge disparity in competency among doctors."

    But the only useful, clear-cut concept of medical competence is statistical measurement of outcomes (weighted for things like case mix, of course). (This is aside from outliers, like con artists.) Which gets back to the epistemological issue I pointed out above.

    ---------------

    Put me down as a firm advocate in favor of evidence-based medicine. It's time we stopped wasting so much GDP on a lot of questionable stuff.
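
    Here is the sketch mentioned above -- a minimal example with entirely made-up numbers (the arm sizes and event counts are hypothetical, not taken from any real statin trial). It just compares event rates in a treated arm and a control arm and puts a confidence interval on the difference; no knowledge of mechanism is needed anywhere.

        import math

        # Hypothetical two-arm trial; all counts are invented for illustration.
        treated_n, treated_events = 5000, 150   # patients on the drug, cardiac events
        control_n, control_events = 5000, 200   # patients on placebo, cardiac events

        p_t = treated_events / treated_n        # event rate, treated arm
        p_c = control_events / control_n        # event rate, control arm
        diff = p_c - p_t                        # absolute risk reduction

        # Standard error of a difference of two proportions, and a 95% CI.
        se = math.sqrt(p_t * (1 - p_t) / treated_n + p_c * (1 - p_c) / control_n)
        lo, hi = diff - 1.96 * se, diff + 1.96 * se

        print(f"event rates: treated {p_t:.3f}, control {p_c:.3f}")
        print(f"absolute risk reduction: {diff:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
        # If the interval excludes zero, the outcome data alone favor the treatment,
        # whether or not anyone understands the underlying biology.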

  6. Fromm hits the epistemological nail on the head -- if you don't have statistical evidence, how do you even know you are doing the right thing? (i.e., helping the patient, or at least not causing harm.)

    "To me it's amazing that they can even do the things they do with the little knowledge that they have."

    What do you mean "do the things they do"? Suppose I decide to operate on a cancer patient. After a painful and risky procedure, she gets well. What do I conclude? Did I accomplish something amazing, or would she have gotten better on her own (and the surgery just made things harder for her)? You can't answer any of these questions without "evidence-based" medicine. Otherwise you are just doing random things or trusting your instincts, however plausible the course of action might seem ("tumor bad, excise it, patient got better!").

    Think about the analogous problem of portfolio managers in finance. Joe made money last year. Does that mean he has great "clinical instincts," or was he just lucky?
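
    To put rough numbers on the analogy (purely illustrative, not real fund data): suppose a manager with no skill at all has a 50/50 chance of beating the market in any given year. A few lines of arithmetic show how easily luck manufactures an impressive-looking track record.

        # Illustrative only: a no-skill manager beats the market with probability 0.5
        # in each year, independently of the others.
        p_beat = 0.5
        years = 5
        streak = p_beat ** years                 # one manager, five straight wins
        print(f"P(a given no-skill manager wins {years} years running) = {streak:.3f}")

        # Across a large pool of managers, someone almost surely has such a streak.
        n_managers = 1000
        p_any = 1 - (1 - streak) ** n_managers
        print(f"P(at least one of {n_managers} shows the streak by luck) = {p_any:.3f}")
        # About 0.03 for a single manager, but essentially 1 across a thousand of
        # them -- which is why a short track record, like a handful of anecdotal
        # cures, proves very little.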

  7. For portfolio managers, a career may not be long enough to separate luck from skill in a purely statistical manner.

    More broadly, statistical evidence is all very well, but it doesn't serve us terribly well because there are too many variables. A may be strongly correlated with B, but you haven't got enough time (or grant money) to control for all the potential confounding factors.

    This is why stronger theory -- some form of causal explanation in which the intermediate steps of the causal mechanism are elaborated -- is essential for robust scientific knowledge. But even as that becomes available in biological systems, it may remain quite challenging to measure and relate enough variables to make good predictions -- it may still resemble econometrics or weather prediction rather more than we would like.
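
    A tiny simulation of the confounding worry (the variables and effect sizes are invented, purely for illustration): A and B are both driven by a third factor C, so they look strongly correlated even though neither affects the other, and restricting attention to a narrow slice of C makes the association largely vanish.

        import random

        random.seed(0)
        n = 10_000

        # Hypothetical data: C is a confounder driving both A and B;
        # A has no direct effect on B at all.
        C = [random.gauss(0, 1) for _ in range(n)]
        A = [c + random.gauss(0, 1) for c in C]
        B = [c + random.gauss(0, 1) for c in C]

        def corr(x, y):
            mx, my = sum(x) / len(x), sum(y) / len(y)
            cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
            sx = (sum((a - mx) ** 2 for a in x) / len(x)) ** 0.5
            sy = (sum((b - my) ** 2 for b in y) / len(y)) ** 0.5
            return cov / (sx * sy)

        print(f"raw corr(A, B)        = {corr(A, B):.2f}")    # about 0.5: looks real

        # Crudely "control for" C by keeping only cases where C is near zero.
        near = [(a, b) for a, b, c in zip(A, B, C) if abs(c) < 0.1]
        A0, B0 = zip(*near)
        print(f"corr(A, B | C near 0) = {corr(A0, B0):.2f}")  # close to zero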

    Anonymous, 11:41 AM

    sts wrote, "More broadly, statistical evidence is all very well, but it doesn't serve us terribly well because there are too many variables. A may be strongly correlated with B, but you haven't got enough time (or grant money) to control for all the potential confounding factors."

    Often false. You can control other variables through a well-designed experiment, in many if not all cases. That's the whole point of the field of statistics known as "experimental design."

    Your comment certainly applies to a lot of observational studies, but many more randomized or very well designed case-control studies could be set up.

    One of my favorite examples (I might be getting the details wrong here; it's been a while and I don't have the cite in front of me) is the use of a bone marrow transplant technique to treat some kind of nasty breast cancer. IIRC it was breast cancer that had spread a lot, and the technique is to take out some of the patient's marrow, give them extremely high doses of chemo, then put the marrow back in (the marrow part is to give them back a functioning immune system).

    After a long study with a pretty high sample size, it was concluded that in terms of some reasonable measure of outcome (probably months of life extended), this did no better than conventional chemo and yet cost a ton more.

    The trials were either randomized or very carefully case-controlled.

    I claim that this statistical study showed this expensive technique was useless. Yet I doubt anyone fully understands the subtleties of the mechanism of chemo, particularly why it works relatively well in some situations but not others, the dose dependence (that's particularly relevant in this example), etc.

    Interesting "political" thing is that many of the MDs who had staked their careers (or a big chunk thereof) on the technique continued to back it, whereas the "evil" insurance companies wanted to (perhaps did) begin to refuse to pay for it. I agree with the insurance companies and wonder whether such MDs should be allowed to continue practicing medicine.

    "This is why stronger theory -- some form of causal explanation in which the intermediate steps of the causal mechanism are elaborated -- is essential for robust scientific knowledge."

    Yes, of course. And I completely agree that it's good to pursue such knowledge -- it might lead to better treatments, whereas statistics can only evaluate treatments.

    But robust knowledge of mechanism is simply not usually necessary for evaluating treatments.
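
    Evaluation itself takes numbers, though. As a back-of-the-envelope illustration of why a trial like the bone marrow one above has to be large (the response rates below are invented, not from the actual study), here is the standard approximate sample-size calculation for comparing two proportions.

        import math

        # Hypothetical effect: conventional chemo gives a 30% response rate; the new
        # regimen would have to reach 35% to justify its extra cost and toxicity.
        p1, p2 = 0.30, 0.35
        z_alpha, z_power = 1.96, 0.84        # two-sided 5% test, 80% power

        p_bar = (p1 + p2) / 2
        n_per_arm = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                      + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
                     / (p2 - p1) ** 2)

        print(f"patients needed per arm: about {math.ceil(n_per_arm)}")
        # Well over a thousand patients per arm just to detect a 5-point difference
        # reliably -- which is why "my patients seem to do well on it" settles nothing.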

    Anonymous, 11:46 AM

    steve wrote, "To me it's amazing that they can even do the things they do with the little knowledge that they have." What do you mean "do the things they do"?

    I somewhat agree with Carson. E.g. CABG surgery, which IIRC has been shown to be effective.

    But a lot of the amazement should really be reserved for life itself, not medicine. That is, medicine is sculpting and tweaking life -- e.g., in CABG, it's not as if medicine invented the entire process, since our biological endowment is responsible for the graft taking and then functioning properly. So in many ways medicine is just riding on the coattails of life.

    Can you imagine fixing cars that way?
