Opinion Column

Money, Medicine and Myths

I was on a phone call with fellow health policy types back during the presidential primary season when the conversation turned to pay-for-performance. A physician on the line immediately began to bristle at the idea that doctors’ pay should (or could) be linked to certain quality standards. The group, made up of moderate-to-liberal Democrats, quickly moved on to broader areas of agreement.

As health care reform enters the treacherous terrain of actual legislative language, that conversation carries an important warning about hidden hot buttons. Proposals that seem eminently reasonable can trigger angry responses even from seeming allies. And there is no hotter hot button than efforts to change how physicians are paid to encourage higher-quality, lower-cost care.

Despite decades of studies about misaligned incentives that injure and kill hundreds of thousands of patients and waste hundreds of billions of dollars, many Americans still sincerely believe that just leaving doctors alone to use their best judgment is all the health care system needs.

President John F. Kennedy once observed, “The great enemy of the truth is very often not the lie — deliberate, contrived and dishonest — but the myth — persistent, persuasive and unrealistic.” Nowhere is that truer than in the myths surrounding the “good old days” of medicine. For example, in gauzy memory, the post-war era when most patients still paid medical bills out of pocket may look like a utopia where autonomous professionals were free to tenderly minister to their patients. Some no doubt did, but an honest look at how U.S. health care functioned with virtually no oversight or guidelines looks a lot closer to nihilism than Nirvana.

Doctors paid on volume responded accordingly. Popular magazines documented an epidemic of unnecessary operations so widespread that one state legislature passed a law requiring surgeons to deposit allegedly diseased organs with the local county clerk so they could be examined later. (The governor vetoed the bill.) A brutally honest American Medical Association report in 1955 to its own House of Delegates concluded that physicians “think about money a lot — about how to increase their incomes, about the cost of running their offices, about what their colleagues in other specialties make, about what plumbers make for house calls and what a liquor dealer’s net is compared to their own.”

Nevertheless, when a decade later Congress passed Medicare and Medicaid, the landmark guarantee of access to care for the elderly and the poor, lawmakers gave hospitals “cost plus” reimbursement, let physicians charge “usual, customary and reasonable” fees and prohibited government from exercising “any control over the manner in which medical services are provided.”

Medicare and Medicaid took effect July 1, 1966. By the next June, the government was sponsoring its first conference on controlling medical costs and the AMA was pleading with doctors to show economic restraint. Meanwhile, hospital costs climbed 13 percent annually from 1966 to 1969, the year Richard Nixon became the first president to proclaim a national health care “crisis.”

On television, Marcus Welby, M.D. was a huge hit. It was not a documentary. In the real world, the U.S. Surgeon General denounced doctors for providing care that was often “fragmented and impersonal.” In 1972, Congress finally gave Medicare and Medicaid the power to disallow “any costs unnecessary to the efficient provision of care,” explicitly establishing the principle of “no more blank check.” As a society, we have been grappling with the right balance of physician autonomy and accountability ever since.

Most physician leaders understand quite well that professional norms are important but cannot by themselves provide a reliable check on behavior. It’s just easier to talk publicly about the corrupting influence of drug-company honorariums than, say, the way in which a doctor who displeases colleagues by pressing too hard to reduce unnecessary care can find his referrals from those colleagues radically reduced.

“Physicians are neither saints nor sinners,” wrote Dr. Mike Magee, a second-generation physician who founded the Pride in Medicine Project (now called Positive Medicine) in the 1990s. “The truth is that doctors are like everyone else.”

Still, the aspiring saints could use some assistance. Ultimately, the 30 percent waste in American medicine can be eliminated only if those providing medical services act in partnership with those who use them. (Yes, patients have responsibilities, too.) That cannot happen without fundamental payment reform. Health information technology, for example, provides powerful tools, but designing a system that supports, empowers and enables the use of those tools is by far the more important activity.

Eighty years ago, the staunchly Republican chairman of the Committee on the Costs of Medical Care, Dr. Ray Lyman Wilbur, wrote these words: “The unprecedented growth of medicine, the enormous expansion of personnel and facilities and the investment of billions of dollars have created issues from which society cannot escape merely through its own optimism or through confidence in the high character of medical practitioners.”

Change “billions” to “trillions” and Wilbur’s advice still rings true today.

Michael L. Millenson, a Highland Park, IL-based consultant, is also a visiting scholar at the Kellogg School of Management and the author of Demanding Medical Excellence: Doctors and Accountability in the Information Age.
