I recently encountered an incontrovertible and mind-blowing fact: 1970 is equidistant from 1918 and 2022. In 1970, when I was 22 years old, 1918 seemed impossibly long ago. My grandfather was 20 years old in 1918. From my current perspective as a retired family doctor, 1970 doesn’t feel nearly as far from today as 1918 did in 1970, when I was a first-year medical student.
- Marc Ringel, MD
Looking Back While Looking Forward
Especially with respect to medical practice, the gulf between 1918 and 1970 seems a lot greater than the one between 1970 and today. Medical textbooks from the early 20th century appear to stop just short of recommending bloodletting. There were no antibiotics. Radiology was in its infancy. In 1918 insulin for diabetes was still four years away. Tens of millions were dying worldwide from the influenza pandemic. Thanks to flu vaccines, imperfect though they are, the world is unlikely ever again to experience an influenza epidemic as deadly as the one that in 1918 infected an estimated 500 million people and killed 50 million, including Grandpa’s mother, sister and niece.
Today we have coronavirus to contend with, a bug that, like the influenza virus, regularly evades the immunologic reach of vaccines. The delta, omicron and now BA.2 variants have been teaching us the hard way that we must be eternally vigilant. So far there have been almost half a billion COVID-19 infections, over 6 million deaths, and 10.5 billion vaccine doses administered worldwide. I have little doubt that, thanks to medical science, COVID-19 will be at worst a nuisance before 52 more years pass.
The body of medical knowledge has grown greatly in the 104 years since influenza decimated my grandfather’s family, when the only vaccines available were for smallpox and pertussis (whooping cough). We don’t even immunize against smallpox anymore because, thanks mostly to astute, aggressive case tracking, that virus was declared eradicated from the face of the Earth in 1980 (except for supplies that still reside in government laboratories for possible use in germ warfare, which scares the devil out of me).
When I was a medical student there were no such things as CT scans, not to mention MRI and PET scans. Ultrasound imaging was in its very infancy. The first of those devices I saw, still experimental, took up most of a room. It had a large articulating arm that translated the movements of the transducer head, which the operator slid over the patient’s skin, onto a TV monitor that displayed a very fuzzy image of the innards. The device looked something like a pantograph, the mechanical gizmo that engravers have employed since the 17th century to reduce and transfer designs from paper to metal. You have likely encountered the handheld ultrasound machines that clinicians use every day now to get a look inside nearly every part of the body, from head to toe. Good quality images can be transmitted directly to an electronic tablet or any other viewing device. If I were to go back to active practice I’d need to take a crash course in bedside ultrasonography--an indispensable aid to clinicians’ senses of touch, hearing and sight--before I could be considered an up-to-date clinician.
All sorts of great new medicines have been released since I started training: a wide range of anti-hypertensive and diabetes drugs; potent antibiotics; oral anti-inflammatories; antivirals, including for HIV and hepatitis; psych drugs (a mixed bag); chemotherapeutic agents; immunizations against a host of childhood and adult diseases including polio, diphtheria, tetanus, hepatitis, cervical cancer, shingles, meningitis, pneumonia, and ear infections.
But is medical care that much better than it was when I started down my career path? Lifespan has taken a hit in the last couple of years because of COVID and problems related to it, like suicide. On average, though, Americans’ longevity has crept up since 1970, from 70 to 78 years. It was 54 years in 1918. That all sounds pretty good until you compare the US to other countries. In 2022 projected longevity is greater in 45 countries than it is here. Hong Kong tops the list at 85 years, followed closely by Japan. Several European countries--Switzerland, Italy, Spain, and Iceland--made it into the top ten. You probably didn’t expect to find Cuba, Qatar, Chile, Costa Rica, Slovenia, French Guiana, Poland and Estonia ahead of us too. The United States also lags in most other statistical indicators of health, like child mortality. Yet in 2020 we spent $4.1 trillion, 19.7% of our GDP, on healthcare, the largest share of the largest GDP in the world. In 1970 it was 6.9%, the biggest chunk of the biggest GDP back then. Our national health statistics looked pretty awful at that time too.
It seems pretty clear that, for all the science America is rightfully proud of--for example, leading development of COVID-19 vaccines at blinding speed--and for all the money we spend, we’re not doing very well at caring for our population, and haven’t been for a long, long time.
Then there’s information technology. Outlays for health information systems have gone through the roof (including for telehealth which, as you know, I heartily approve of). Nearly every clinician now uses a computer to document care, to receive and send reports, and to communicate with patients, colleagues and institutions. You’ve no doubt seen a data processing device in the hand of, or near, just about every provider you’ve consulted in the past decade. The doctor’s gaze is too often divided between you and a computer screen. A healer’s most precious assets are compassion, scientific knowledge, experience and, simply, attention. Thanks to the demands of electronic devices, the average physician now spends less than half their workday in the presence of patients.
In 1970 a doctor’s only electronic distraction was phone calls, a problem that could be addressed by simple protocols. Individuals didn’t even own computers. My closest encounter with computing devices was observing overburdened graduate students lugging heavy boxes brimming with punch cards across campus, on their way to or from the computing center, where they were scheduled for precious time to process hard-won data with homemade programs on the institution’s mainframe. Today I depend on and adore the instant access that the phone in my pocket gives me to the whole world of medical literature.
Up until after WWII, the great majority of American doctors were generalists, working in small private practices. They took care of patients and families over time, getting to know them well. Today, thanks mostly to corporatization and the profit motive (including in “non-profit” institutions that avoid taxes by calling their profits “margin”), more than half of US physicians are employees rather than owners of their practices. Fewer than a third are in primary care.
I know my complaining about how things have gotten worse in healthcare over the last 50-plus years makes me sound like a geezer, not unlike the seasoned doctors I considered geezers when I was a young recruit. Those guys (almost all of them were men--one thing in American medicine that has gotten better) complained all the time about the course our profession had taken since they were wet behind their professional ears.
I didn’t retire out of despair over the less humane, less efficient, and more costly direction healthcare has taken over the course of my career. No matter my situation, it has always been a privilege to practice medicine; to be let into people’s lives; to help and even to heal.
But darn it, I wish things would get better. I pray that 52 years from now some old doctor like me will look back with satisfaction, not just at the new drugs and diagnostic and surgical procedures that came along during their career, but at how much more potent and healing healthcare has become for both patient and doctor. I don’t see how things could get much worse. I said that in 1970 too.