Personal Observations on the Changing Scene in American Medicine – 1955 to 2010

Clifton K. Meador, M.D.

When I entered private practice in 1962 (after graduating from medical school in 1955, completing a medical residency, serving two years in the Army Medical Corps, and completing an N.I.H. Fellowship in Endocrinology), there was no Medicare, no Medicaid, and very little medical insurance of any kind. Patients paid with cash, vegetables, meat, or nothing.

We turned no one away for lack of insurance or inability to pay. When I saw a patient, I had no idea whether they had insurance or could pay. The medical insurance of those times paid only if the patient was admitted to a hospital with a known diagnosis. For patients with insurance, we admitted all we could in order to get tests and imaging studies paid for. This led to an abuse of hospitalizations and many bogus diagnoses. Almost any diagnosis would justify the admission, including pseudo-diagnoses such as achlorhydria, hiatus hernia, or even retroverted uterus. This misuse of admissions to obtain insurance coverage was common. There was no insurance for outpatient care, and that did not change until late in the 1960s.

To illustrate how irrational medical insurance was, consider this: I had a female patient in the late 1960s with hyperthyroidism whom I wanted to treat with radioactive iodine (I-131). She had Blue Cross insurance. I called to tell them I could treat her as an outpatient for around $150. If I admitted her, it would cost over $1,000. The insurance company refused to cover her as an outpatient, so I admitted her, costing the company over $900. There were many examples of this sort of absurdity until insurance began to cover outpatient care. Once outpatient coverage came into existence, it became very difficult to get patients admitted to hospitals.

We gave free care as a “professional courtesy” to physicians, ministers, and veterinarians and all members of their families. “Professional courtesy” disappeared some time ago and is now illegal under Medicare rules (deliberate free care is judged to be an enticement!).

Hospitals were largely charitable organizations, barely able to stay alive. I saw this firsthand in the early 1960s, when we had to pay cash on delivery to get milk for the nursery at the University Hospital in Birmingham. No cash, no milk.

Around 1966, Medicare and Medicaid came into existence. By the late 1960s, for-profit health care companies and hospitals appeared on the scene, relying on the steady stream of money from the federally funded Medicare program. In addition, capital costs of hospitals were allowed to be reimbursed as a "pass-through." This steady source of federal money for expansion of facilities, coupled with the increasing profits of the for-profit companies, funded an explosion of new technologies. Wall Street for the first time saw health care as a fundable commodity and began to fuel the rapidly expanding medical-industrial complex with stock offerings. Stock prices soared, feeding the expansion of programs across the board. The preceding increases in N.I.H. funding in the 1950s and 1960s had produced the discoveries and new knowledge that were needed before new technologies could be developed. (Scientific knowledge precedes technologic or engineering development in most cases.)

The timing of this convergence of forces and sequence of events was remarkable:

*Scientific discoveries from N.I.H. increases in funds in the 1950s and 1960s

*Inventions in technology and engineering

*Federal funding of Medicare

*Medicare capital pass-through for equipment and expansions

*Wall Street recognition that health care was a fundable commodity

*Creation of for-profit companies with Medicare funds and stock offerings

These events led to the exponential increase in growth and costs of medical care. But to sustain all of this, the business of medical care needed to grow. And grow it did, not always attending to clinical need or the lack thereof.

The public, previously reluctant to seek medical care, and then only when sick, now came in droves to physicians and hospitals. The AMA's restraint on physician advertising was struck down by federal court orders in 1975. The floodgates were opened, and the public began to be saturated with appealing ads for hospitals, drugs, tests, and procedures, whether needed or not. Each major television channel soon had medical experts extolling the latest device or drug. Drug companies began to monger new drugs to treat new and thinly defined ailments. For the first time, drug ads were aimed directly at the public. The flood of patients became a tsunami, and the cost of health care soared to 17% of the gross national product.

Now add to this federally funded technological growth another factor: the appearance of well people for the first time. As medicine made more inroads and technical improvements, the public came in ever increasing numbers. For the first time, well people started to appear, seeking detection of early disease in the hope that it would be curable. In addition, a whole new category of patients began coming to see doctors. Sidney Garfield of the Kaiser Health System dubbed these the "worried well." The well had now become worried, noticing smaller and smaller symptoms that generated more and more worries. The public's expectations for curative medicine became unreasonable and unsustainable.

A system originally designed to find disease in sick, symptomatic people was now faced with finding disease in well, asymptomatic people. This fundamental change has received little attention. One of my favorite quotes says, "There must be something the matter with someone who goes to see a doctor when there is nothing the matter." Another comes from an unnamed medical resident who, when asked to define a well person, answered, "A well person is someone who has not been completely worked up." The new technologies, almost all visualizations, could find smaller and smaller lesions, whether they were the source of a patient's problem or not. False positive test results skyrocketed. As visualization increased in power through the new technologies, listening to patients decreased. It was as if the whole auditory modality of medicine were vanishing: if you could not see the disease, it did not exist.

Prior to Medicare, patients came only when they were sick or thought they were sick. All had one or more symptoms. The job of the physician was to determine whether the patient was sick and, if so, with what sickness. In those times, disease could be considered "digital": like an on or off signal, it was either present or not. That is no longer the case.

We have entered the age of "analogue" diseases. Consider atherosclerosis of the vessels. It begins early and builds gradually along a scale or continuum. When does the process become a treatable disease, say of the coronary arteries? Consensus seems to say that a 50% stenosis (narrowing) of an artery is treatable disease, but we know that some doctors treat 40% or even 30% stenoses. With analogue diseases there is no easily definable answer. This is not like typhoid fever, which is either present or absent. We now have a whole array of analogue diseases, or even pre-diseases. In effect, we have converted nearly the entire population of America into people with analogue diseases, needing medical attention at earlier and earlier stages.

Lewis Thomas noticed and wrote about this change in the American public in "Doing Better and Feeling Worse," published in 1977:

“Nothing has changed so much in the health-care system over the past 25 years as the public’s perception of its own health. The change amounts to a loss of confidence in the human form. The general belief these days seems to be that the body is fundamentally flawed, subject to disintegration at any moment, always on the verge of mortal disease, always in need of continual monitoring and support by health care professionals. This is a new phenomenon in our society.”

Some of this rush to medical care is fueled by a large misunderstanding by the public about the difference between life expectancy and human life span. The rise in life expectancy from birth has been widely and incorrectly attributed to advances in curative medicine. People believe that the human species is being made to live longer mainly by medical care. This is not the case. The rise in life expectancy from birth is attributable to reductions in deaths in childhood, mostly before the age of one year. Most of the reductions in childhood and infant deaths came from public health measures: clean water, clean milk, and immunizations. More people are living to older ages because they did not die in childhood. Humans as a species are not living longer life spans. The life span of humans is fixed at 85 years, plus or minus 5 years. It has not changed in recorded history.

The lack of understanding and acceptance of a fixed life span is another factor driving up the cost of medical care. A large percentage of Medicare funds is spent in the last six months of recipients' lives. Much of this money goes to futile care that prolongs death, not meaningful or useful life. Sometimes it appears that medicine has abdicated one of its most essential roles: telling a family that all that can be done has been done, and that it is now time to stop futile care and begin pure comfort care and support. We need to define futile care and explain palliative and hospice care to the public.

Another force behind the increases in use and costs of medical care, sometimes overlooked, is the near disappearance of primary care. At the level of primary care, or first-contact care, somewhere between 25 and 50 percent of patients have stress or a psychological basis for their symptoms. There is simply not a definable medical disease behind every symptom. It takes time and listening to understand the sources of human distress. That is the job of primary care. By listening to and understanding the patient, the primary care physician makes careful use of tests and imaging procedures. The good primary care physician knows the high rate of false positives that comes from jumping too soon to unfocused or indiscriminate testing. There is an old dictum that says, "It is more important to know the patient with the disease than to know the disease." That is the heart of a primary care physician.

A seldom discussed feature of the decline of primary care occurred in the early 1980s, when managed care began to expand. Primary care internists and family physicians charged for their time, for lab work done in their offices, and for simple x-rays, also done in their offices. This was the classic cottage industry. The internist or family physician could, in one or two visits, define the problem within the office practice. Managed care mistakenly went after primary care reimbursement early on, refusing to pay for lab work or x-rays done in the office on the grounds that primary care physicians were doing unnecessary tests. These office procedures accounted for over 50% of revenues for primary care. When reimbursement for office lab and x-rays was refused, internists and family physicians had to double or even triple the volume of patients seen to stay at the same revenue level. This led to shorter and shorter visits and much less attention to listening and sorting out psycho-social problems. The overscheduling led to long waits for appointments. The overcrowding of emergency rooms can be traced to these same root causes. Referral directly to specialists became a common method of dealing with complex patients. In addition to the reduction in time available for patients, there was a much larger increase in costs from having to use commercial labs and imaging facilities. The loss of the cottage industry decreased quality and increased costs of care.

As primary care fades away, the process of careful listening and sorting out life stresses is going away. The public, unfiltered by primary care, will go directly to specialists, as they already do in large numbers. When this happens, the prevalence of medical disease (read "probability of disease") falls, and when prevalence or probability of disease falls, the rate of false positive results increases greatly. (At a 2% probability of disease, even a very sensitive and specific test will yield positive results of which about 72% are false positives; see the sketch below.) Specialists are not trained to sort out social or psychological factors; they are trained to test and to do procedures to detect diseases of their specialty organs: heart, kidney, lung, bladder, brain, spinal cord, and so on. Given the analogue, continuum nature of many disease processes, they often find lesions at very early stages. These lesions then have to be followed by periodic examinations, so eventually many people will find themselves being followed by a specialist for one lesion or another. The real causes of their distress, if social or psychological, will persist and go unaddressed by the specialist.
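The 72% figure can be reproduced with a short Bayes' rule calculation. The sketch below is illustrative only: the article does not state the test characteristics, so a sensitivity and specificity of 95% each are assumed for the sake of the arithmetic.

```python
# Illustrative Bayes' rule arithmetic behind the ~72% false-positive figure.
# Assumed values (not given in the article): 95% sensitivity, 95% specificity.

prevalence = 0.02    # pre-test probability of disease (2%)
sensitivity = 0.95   # P(test positive | disease present)
specificity = 0.95   # P(test negative | disease absent)

p_true_positive = prevalence * sensitivity                # 0.019
p_false_positive = (1 - prevalence) * (1 - specificity)   # 0.049

share_false = p_false_positive / (p_true_positive + p_false_positive)
print(f"Share of positive results that are false: {share_false:.0%}")  # ~72%
```

Run the same arithmetic at a 20% pre-test probability, such as a primary care physician's filtering might produce, and the share of false positives among positive results drops to roughly 17%. That is the quantitative core of the argument for keeping primary care in front of testing.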

So here is the current situation as I see it: The public is seeking care far beyond any need or reason. The influx of well, worried well, and worried sick people into a system designed to find medical diseases in sick people leads to large increases in false positives. Advertising brings even more people into the system, further lowering the probability of disease and producing even more false positives. The excessive overuse of tests leads to huge increases in profits, with built-in incentives to do even more testing and procedures. The analogue or continuum nature of many disease processes leads to more and more findings of early lesions. Working through the false positive problem either generates more and more repeated testing OR it produces people labeled with diseases they do not have.

(Removing false labels is very difficult, a nearly impossible task. If anyone doubts this, try stopping thyroid replacement in an obese female proven to have normal thyroid function, or try stopping B12 injections in older patients when there is no proven B12 deficiency.)

Almost all suggested approaches and solutions to the high costs of medical care have been exclusively financial or payment based. While financial considerations are very important, the underlying causes of the high costs lie in misdirected clinical and diagnostic thinking combined with financial incentives to do more and more. Very few suggested changes focus on the details of the clinical system or on the needed but lacking emphasis on evidence-based medicine.

It is quite clear that medicine has become a very big business.  It’s also a truism that all businesses must take in more money than they spend, call it black ink or profit.  The business we are currently in calls for more and more to be done to patients. The procedures and tests must grow and expand.  I suggest we are in the wrong business.

Instead of making money from doing things to people, we should make the needed money from keeping people healthy by doing less and preventing more. The transition from our current business system to a new business system will take a massive shift in thinking. I’m not sure it is possible.

The only way out of this problem is to greatly increase the primary care specialties. These are the doctors and nurses who can listen and think through a clinical problem BEFORE jumping to testing. How to accomplish the needed increase in primary care has not been defined. There is simply too much money in the system and no easy way to redistribute it among the specialties or the multiple businesses.

Clifton Meador, MD is Executive Director of the Meharry-Vanderbilt Alliance and Professor of Medicine at both medical schools.

6 thoughts on "Personal Observations on the Changing Scene in American Medicine – 1955 to 2010"

  1. Dr Meador:

    This is one of the best summations I have read in some time. It crystallizes the fact that we expect too much – all of which has led to the ever-increasing cost accelerators, i.e., over-worrying, over-diagnosing, over-treating and over-prescribing.

    Thank you for such a wonderful commentary.

  2. Dr. Meador,
    I am a fourth generation family doc in a small town. Your article is the best I have seen regarding the fundamental problem we are witnessing today in healthcare. Knowing whether someone is sick or not sick, when he/she needs treatment for an 'analog' disease, and how to gracefully allow one to die a dignified and appropriate death following an unambiguous terminal condition, were the most difficult, and yet most rewarding, parts of my job. I consider psychology as 40% of my daily work for just the reasons you mentioned. Thanks for sharing your insights.
    Thomas Schwieterman MD.

    1. I am glad to know there are real doctors still out there. I hope more will come along. Thank you for your kind words. Clifton

  3. This line in particular gets me: “As primary care fades away, the process of careful listening and sorting for life stresses is going away.” I guess there’s no money in listening! It has to be tremendously frustrating for careful physicians to treat patients in this sort of environment. We are fortunate that the vast majority of doctors in our country are trained so superbly. A good doctor’s curiosity and patience can’t be suppressed, no matter how short the time is.

  4. Your comment regarding careful listening is truly the unheard scream of the medical practitioner in America. As an RN, I have (unfortunately) often found the primary cause of a symptom or concern of a patient by simple, careful conversation. Many times, these people were hospitalized and treated for totally unrelated conditions at a high cost in both dollars and suffering due simply to a failure to have enough time with a patient. In many offices, 15 minutes is an office visit. That’s just enough time to say hello, and do a blood pressure!
