The Good Doctor: A Father, a Son, and the Evolution of Medical Ethics

by Barron H. Lerner


Overview

The story of two doctors, a father and a son who practiced in very different times, and of the evolution of the ethics that profoundly influence health care
 
As a practicing physician and longtime member of his hospital’s ethics committee, Dr. Barron Lerner thought he had heard it all. But in the mid-1990s, his father, an infectious diseases physician, told him a stunning story: he had physically placed his body over an end-stage patient who had stopped breathing, preventing his colleagues from performing cardiopulmonary resuscitation, even though CPR was the ethically and legally accepted thing to do. Over the next few years, the senior Dr. Lerner tried to speed the deaths of his seriously ill mother and mother-in-law to spare them further suffering.
  
These stories angered and alarmed the younger Dr. Lerner—an internist, historian of medicine, and bioethicist—who had rejected physician-based paternalism in favor of informed consent and patient autonomy. The Good Doctor is a fascinating and moving account of how Dr. Lerner came to terms with two very different images of his father: a revered clinician, teacher, and researcher who always put his patients first, but also a physician willing to “play God,” opposing the very revolution in patients' rights that his son was studying and teaching to his own medical students.

But the elder Dr. Lerner’s journals, which he had kept for decades, showed the son how the father’s outdated paternalism had grown out of a fierce devotion to patient-centered medicine, which was rapidly disappearing. And they raised questions: Are paternalistic doctors just relics, or should their expertise be used to overrule patients and families that make ill-advised choices? Does the growing use of personalized medicine—in which specific interventions may be best for specific patients—change the calculus between autonomy and paternalism? And how can we best use technologies that were invented to save lives but now too often prolong death? In an era of high-technology medicine, spiraling costs, and health-care reform, these questions could not be more relevant.
      
As his father slowly died of Parkinson’s disease, Barron Lerner faced these questions both personally and professionally. He found himself being pulled into his dad’s medical care, even though he had criticized his father for making medical decisions for his relatives. Did playing God—at least in some situations—actually make sense? Did doctors sometimes “know best”?
 
A timely and compelling story of one family’s engagement with medicine over the last half century, The Good Doctor is an important book for those who treat illness—and those who struggle to overcome it.

Product Details

ISBN-13: 9780807033418
Publisher: Beacon Press
Publication date: 05/13/2014
Sold by: Penguin Random House Publisher Services
Format: eBook
Pages: 240
File size: 2 MB

About the Author

Barron Lerner is the author of four previous books on medicine and a frequent contributor to the New York Times’ Well column, TheAtlantic.com, Huffington Post, and several blogs. He lives in Westchester County, New York, and is a bioethicist, historian of medicine, and internist at New York University’s Langone School of Medicine.

Read an Excerpt

Chapter Two
Super Doctor
 
I have written several books on the history of medicine, and I readily admit to having had unrealistic expectations about how many copies of each would be bought. Tuberculosis, breast cancer, and even celebrity patients were less compelling topics than I had anticipated. Still, I had to chuckle when I read of my father’s onetime plan to publish a book entitled Consultant, detailing his experiences seeing patients with infectious diseases at the Veterans Administration Hospital, Mount Sinai Hospital, and other Cleveland medical institutions in the 1960s and 1970s. As my agent could have told him, the subject was too “specialized.”
 
Fortunately, however, my father saved his notes, which not only illuminate his early medical career but also provide a moving depiction of him at the height of his powers, as he was practicing an intense type of medicine that might best be described as all-consuming. My dad provided a crucial service to internists, surgeons, and other physicians by diagnosing the illnesses of their sick patients and then prescribing effective antibiotics. His overarching concern for the physician-patient relationship also shone through in many of the cases that he documented.
 
As an infectious diseases consultant, my father turned out to have a front-row seat to some of the emerging ethical issues—such as medical errors and the limits of medical technology—that would soon burst forth into the public spotlight. But his approach to these issues remained largely based on paternalism and beneficence: How could and should the doctor help his patient navigate these complicated and often controversial questions? Given his background and training, which stressed “Doctor knows best,” he could hardly have chosen a different approach.
 
Meanwhile, I was a fairly typical teenager, trying to balance schoolwork, friends, and jobs. Two important decisions I made during these years were to have a bar mitzvah and to work for two summers at a nursing home at which my father was the medical director. The first represented an important exploration of my Judaism, although, like my father, I had great ambivalence about religion. The second turned out to be a dry run for my becoming a doctor.
 
Not all cases of infection require an infectious diseases consultation. Garden-variety pneumonias and urinary tract infections, for example, can be treated with a standard assortment of antibiotics. The task becomes even easier if a culture of the infectious material—such as sputum, urine, or blood—grows a specific organism. The microbiology laboratory can then test specific drugs against the bacteria in question, simplifying the choice of medication.
 
So when my dad was called in on a case, it was a good bet that it was complicated, either because the source of infection could not be determined or because the choice of antibiotic was unclear. For the most part, my father was happy with this arrangement. Like most physicians, he loved difficult and unusual cases, as they made for interesting discussions in the hospital corridors, on rounds, and at the citywide infectious diseases conferences he inaugurated in Cleveland in the late 1960s. Plus, even though my dad was incredibly busy, being a consultant provided him with considerably more flexibility and independence than his colleagues had, with their regular office hours and hundreds, perhaps thousands, of patients.
 
A typical case that my father saw in the early 1970s was a man with a lymphoma and an unusual pneumonia whose doctors were deciding whether or not to do a lung biopsy, which would involve opening the man’s chest. Deducing that the man had pneumocystis pneumonia, which occurred in immunosuppressed patients and would later become a common malady of the AIDS era, he convinced the team to skip the biopsy and treat the patient empirically with antibiotics. The man recovered completely.
So did a teenager with congenital heart disease who had endocarditis. Despite being treated with penicillin, he was still running temperatures as high as 105. The patient’s cardiologist was worried because the infection was not getting better, and he consulted a surgeon about replacing the infected heart valve—a major operation that the boy might not have survived. Noting that, despite his high temperature, the patient appeared to be improving, my father implored the physicians to hold off on surgery. He added another antimicrobial agent to treat what he thought was a small pneumonia. The patient never required surgery and was cured of both infections.
 
A man with pancreatic cancer had recurrent bacterial blood infections. He had recently become infected with a highly resistant strain of an organism called Serratia. The team was at a loss as to what to do to save the man’s life. My father reached into his bag of tricks and suggested trying an older antibiotic, tetracycline, not normally used for this type of blood infection. The patient recovered, although he ultimately died from the cancer.
 
When an eighty-five-year-old man was admitted to the hospital with a severe infection of his neck, my father was able to diagnose a condition that was very rare: Ludwig’s phlegmon. The infection, which had first been described by German physician Wilhelm Frederick von Ludwig in 1836, had become uncommon in the antimicrobial era but was still featured in infectious diseases textbooks. The infection required drainage of the abscess in the operating room, and my dad scrubbed in for the procedure. The surgeon had so little experience with such cases that he took my father’s recommendation that he do an extra-long “guillotine incision” of the neck to help treat the patient.
 
My dad’s consult notes, which reflected his intimate knowledge of the diseases in question, were often tutorials for the doctors (and patients) involved. When an elderly man with advanced lung disease continued to have fevers and positive sputum cultures, my father wrote: “In patients with chronic restrictive pulmonary disease, who have trouble raising secretions, antibiotic therapy of an acute pulmonary infection leads to bacterial overgrowth of the respiratory secretions.” The treatment: stop antibiotics and pound on the man’s back four times daily to mobilize his phlegm. It worked.
 
As in this case, my father’s successes often resulted from using fewer as opposed to more antibiotics, an approach he had learned early on from Louis Weinstein and one that he would impart to generations of Case Western Reserve medical students, house officers and fellows. One woman with multiple myeloma, who had been admitted repeatedly for infections, had a pneumonia that would not respond to any treatment. Given her overall condition, my dad, thinking it was cruel “to put her through any more torture,” recommended that antibiotics be withdrawn and the patient made comfortable. The pneumonia, or whatever the lung condition was, resolved.
 
In another case, a woman had severe diarrhea probably related to previous use of an antimicrobial. A visiting professor had seen the patient and recommended “massive antibiotic therapy” to clear out what he thought was an infection in the intestines. But my father had noticed that in addition to the diarrhea, there was mucus in the stool, indicating that the patient’s immune system was already fighting the diarrhea. He recommended stopping all antibiotics and simply giving her sugar water, “the way one would treat an infantile diarrhea.” The patient recovered over the next several days.
 
Finally, my father consulted on a man who had been admitted with three weeks of high temperatures from an unclear source, a condition termed fever of unexplained origin (FUO) in a well-known 1961 paper by Paul Beeson and his Yale colleague Robert Petersdorf. With antibiotics, the patient’s fevers had come down but had not gone away. My dad convinced his colleagues to avoid any further invasive testing and send the patient home. They agreed, and the temperatures gradually disappeared.
So how did my father instinctively know when to be aggressive and when to cut back? He would have cited his clinical expertise, beginning with his years as a medical resident and an infectious diseases fellow in Boston and continuing with his experience as an attending physician in Cleveland. Indeed, after the case described above, he planned to compile a series of his FUO cases, those patients “who spontaneously recover without any definitive diagnosis and without ever again getting into trouble.” This, after all, was the sort of clinical research that my father did during his years in Cleveland—retrospective studies of cases that shared a common characteristic, usually infections caused by a particular rare organism. The research was top-notch and published in excellent, peer-reviewed journals. In order to improve his knowledge and conduct his research, my dad diligently tracked down patients who had been discharged, often sending them personal letters.
 
But my father’s style of research belonged to an earlier era. By the late 1970s, the randomized clinical trial—in which large numbers of patients were enrolled in formal studies and followed prospectively over time—had come into ascendancy. Researchers sought major grants from the National Institutes of Health (NIH) and often collaborated at multiple medical centers throughout the country. Only through this type of sophisticated scientific analysis, biostatisticians argued, could true knowledge be obtained. Case studies like those done by my father were interesting but not necessarily representative.
 
Still, on any given day at any given hospital, consultants like my dad and Weinstein could amaze colleagues, students, and patients. In one instance, the VA doctors asked my father to see an unusual case of pneumonia that had stumped everyone. But he knew this bacterium well. “It was Nocardia,” Robert Bonomo, who trained under my dad, told me a few years ago. “He came over and nailed it.” On another occasion, when several physicians were evaluating a complicated skin infection, my dad was the only doctor present who knew all the planes of tissue where bacteria could hide. J. Walton Tomford, another Cleveland infectious diseases specialist, fondly recalled attending the citywide conference at which my father and the two other local infectious diseases gurus, Marty McHenry and Manny Wolinsky, had the opportunity to show off their vast knowledge about the field—even taking one another on at times. “It was like our church and synagogue,” Tomford told me. These doctors also loved to visit their colleagues’ hospitals. In one memorable case, my father asked McHenry to come to the Mount Sinai to convince a very reluctant Orthodox Jewish woman that she needed to have a lung biopsy. When McHenry, a devout Catholic, took out his rosary to pray for the woman, she quickly acquiesced.
 
In another instance, when no source of infection could be found in an older woman with a high fever, my father suggested that an artery along the side of her head—which was not at all tender—be biopsied to look for inflammation. The medical literature suggested that the condition he was looking for, temporal arteritis, caused only low-grade temperatures in the elderly, but my dad had seen three other cases with high fever. To the surprise of everyone, even my father, the biopsy was positive and the diagnosis was made. “This really represents the evolution of a consultant’s experience,” he subsequently wrote.
 
Once, an internist asked him to see a young woman who had developed a rash on her right forearm several days after receiving the antimicrobial ampicillin for a fever and sore throat. My father saw her at her home because she lived near us. I presume he took with him his black doctor’s bag, which he used for his infrequent house calls and which contained the slides, syringes, and other equipment he needed to make a diagnosis. The rash was very distinctive, extending in a linear pattern from her elbow to her wrist. It was petechial—that is, composed of small purple spots caused by broken blood vessels. My dad immediately suspected meningococcemia, the severe bacterial blood infection on which his future mentor Weinstein had been lecturing on the day they met in 1960. Employing a bit of showmanship, my father asked the woman if she had recently been playing tennis. Startled, she said that she had, a few days before, around the time she had first seen her internist. He later attributed this feat of clinical acumen to the concept of locus minoris resistentiae (place of least resistance), taught to him by Weinstein. Due to the vigorous motion in the patient’s forearm caused by playing tennis, the meningococcal bacteria had preferentially settled there and caused a rash. Before leaving, my dad put a drop of liquid from one of the lesions onto a slide, returned to the hospital, and did his own Gram stain, confirming the diagnosis.
 
In yet another case, my father overruled an “excellent internist and competent ophthalmologist” and prescribed a low dose of an antileukemia drug for a woman who had shingles that involved her eye. The patient’s symptoms dramatically improved by the next morning. The basis of his decision? Observations that he and an infectious diseases colleague at nearby University Hospital had made that, in fact, contradicted a controlled study of the medication that had recently been done at the NIH. “Another aspect of the case that is certainly worth commenting on is the beautiful demonstration of the value of a specialist in a given situation,” my dad later wrote. “This is really a minutia type of therapeutic maneuver and can only come about through word of mouth and personal experience.” Although he closely reviewed the results of major clinical trials, he passionately asserted that keen clinical observation remained the most important way to approach illness and care for sick people.

To what degree was my father truly bucking the trend in medicine that increasingly favored population-based data over clinical intuition? In her 2013 book on Sister Kenny, the famous polio nurse of the 1930s and 1940s, the historian Naomi Rogers argues that Kenny actually constructed an alternative pathophysiological model of the disease based on her personal bedside observations. That is, she rejected the then-current scientific explanations for how polio crippled patients, feeling that the mechanism responsible was the muscle spasms she observed, as opposed to nerve damage. Kenny’s novel treatment strategy for polio, which involved mobilizing paralyzed muscles as early as possible, followed directly from her unorthodox perspective. Seeing, in other words, was believing. I encountered this epistemological conundrum when researching breast cancer: certain patients insisted that their cancers were caused by toxic exposures and others assured me that screening mammograms had saved their lives even though controlled-study data indicated that both scenarios were unlikely.
 
Firmly grounded in scientific medicine, my father did not hold radical beliefs. But he practiced what the sociologist Charles L. Bosk called, in his 1979 book on surgical training, “clinical individualism.” My dad believed that his observations and research constituted a type of clinical reality that was lost when physicians considered only the characteristics of a given disease among large populations of patients. More provocatively, he thought that doctors could use their clinical acumen, experience, and even empathy to reach conclusions about how specific illnesses acted or were likely to act in specific patients. Without scientific proof, of course, these claims could always be contested. But for my father and those trained in a similar manner, such insights needed to be considered at patients’ bedsides.
 
Regardless of their beliefs about how medical knowledge was best generated, my dad’s colleagues largely appreciated his insights and were glad to have a meticulous, experienced, and compassionate physician looking over their shoulders. But he was not universally beloved. My father had extremely high standards and little tolerance for those who he believed were lazy or incompetent. If a patient needed antibiotics emergently and was not getting them, according to Robert Bonomo, my dad would yell, “You need to do this now!” until it happened. In one instance, my father learned that a senior surgeon was planning to follow his usual routine and give a penicillin compound to a patient who was already on the operating table. Believing that the patient had a penicillin allergy, my father went to the operating room and demanded that the patient be given a different antibiotic. The situation was tense but Phil Lerner ultimately prevailed. In his early years as a consultant, my father weighed carefully “the pros and cons of stepping on toes.” Later in his career, he often directly confronted certain colleagues, which probably contributed to an informal nickname that he acquired over the years: the Madman of the Mount Sinai.
 
My father justified such conduct by an intense devotion to his patients, which emanated from his upbringing and training and was, for him, the heart of medical practice. He went the extra mile not only to comfort his patients but also to demonstrate behaviors that he hoped other doctors might emulate. Of course, my dad’s ability to sit down with patients, spend time with them, and answer their questions was enhanced by his job as a consultant in the postwar, pre–managed-care era. Paul Beeson had advocated the same humanistic approach. So did fictional television doctor Marcus Welby, who was on the air from 1969 to 1976.
 
A representative case was that of a twenty-eight-year-old woman admitted for severe intestinal bleeding and several related complications whom my father described as “emotionally-shocked,” “terribly sick,” and “frightened.” Even though she turned out not to have an infection, my dad was the one who calmed her down during a “fantastic screaming spell” that he suspected was due to the “frightening stillness” of the regular ward once she had left the “hustle and bustle” of intensive care. Appreciating his kindness, the woman relied on him both during and after her hospital stay. “A very rewarding experience,” he later wrote, “but really outside my primary area of interest.” So, too, perhaps, was the time that he spent a half hour speaking with a woman with a severe bone infection who had become upset when one of her other physicians had made a lighthearted remark about her condition. Another patient my father bonded intensely with was a young man who showed “considerable fortitude and maturity” while battling ultimately incurable endocarditis. Although my dad strove to maintain a strictly “professional relationship” with his patients, he admitted that he had developed a “genuine affection” for this man.
 
One of my father’s more interesting encounters involved a patient who wrote to him to protest what he believed was an excessive fee for a ten-minute consultation. My dad wrote him back and explained that the consult had also involved discussions with the man’s doctors and nurses as well as a review of his chart and X-rays. The patient was evidently impressed because he paid the entire bill and added in an extra five dollars for the time it had taken my father to write a reply. This gesture was perhaps the 1970s equivalent of paying the doctor with a chicken. The five dollars was, of course, returned. This encounter was quintessential Phil Lerner: he went the extra mile to explain and demonstrate his philosophy of doctoring, and he gained a patient and a friend in the process.
 
My father was hardly the only intensely engaged physician at the Mount Sinai. Much more so than today’s practitioners, doctors of my dad’s generation viewed the medical decisions they made as almost personal—and defended them with ardor. My father was even known to get into fights with his brother, Allan, himself a passionate patient advocate. Once, when my dad noticed a house officer watching him and his brother argue with each other about a particular case, my father turned to the resident, rolled his eyes, and asked, “See what I have to put up with?” On another occasion, my father walked into the medical intensive care unit to find a surgeon and a gastroenterologist who disagreed about a patient’s care actually physically fighting with each other. In this case, my dad served as a peacemaker, helping to pull the two men apart. Recalling what he regarded as the Mount Sinai’s heyday, he later wrote that he “could bring any of my patients or anyone of my family into this hospital with the absolute assurance that the finest medical care was available in a stimulating, friendly and warm environment where everyone took pride in their work and was rewarded with the satisfaction of a job well done, even when the patient ran into problems, and even when death was the final chapter.”
 
My father’s concern for his patients was only enhanced by the fact that so many of them had a personal connection to him. Having lived most of his life in Cleveland and its suburbs, he knew many Jewish families. Of course, he also had personal connections to many non-Jewish patients, who were often former classmates, friends, and colleagues. In the words of the historian David J. Rothman, “doctor and patient occupied the same social space,” promoting a shared relationship. Meanwhile, the poor and minority patients my dad met for the first time at the Mount Sinai—including many he would then follow for years—got the same royal treatment. Just as my father’s choice of profession was in part out of gratitude that he’d grown up an ocean away from the Holocaust, his devotion to these ward or service patients, as they were called, was his way of acknowledging his good luck in the face of so many ongoing catastrophes around the globe. His goal was to “take extra pains with the service patients, to be certain they are reassured and confident in your care, and come to believe that you really care about him or her as an individual.” One way he did this was to take advantage of his flexible schedule. “It’s so simple,” he wrote, “to make an extra visit in the afternoon for these special cases, come back to report a new lab test result, review an X-ray [or] reassure that the scheduled test is necessary, important and will lead to some conclusive information.” Illness, he underscored, was “frightening.” When I read these words many years later as a professor, I had to smile. It was the exact sort of advice that I gave to students and residents when teaching them about the history of medicine and the doctor-patient relationship. My dad had acted this way as a matter of course.
 
Another group of my father’s patients consisted of doctors, nurses, other hospital employees, and their relatives. There may be no higher compliment for a physician than to be asked to care for a colleague’s loved ones, and my dad was definitely a “doctor’s doctor.” The 1970s was still an era of professional courtesy, and my father generally waived or reduced his fees when treating coworkers or their family members. This concept of the medical profession as a sort of guild that looked out for its members was a comforting one and may even have contributed to my decision to become a doctor. I remember seeing a few of my dad’s peers for sundry medical issues and feeling as if I was in good hands.
 
Whether he was caring for a friend, a relative, or a stranger, my father’s clinical interactions were always dominated by a paternalistic philosophy. It made sense to him that, since physicians trained for decades, spent long hours in the hospital, and devoted themselves to the care of both the poor and the wealthy, they should call the shots, and patients should acquiesce. Doctor knew best, whether he—and it was usually a he in those days—was renowned Harvard surgeon Francis Moore, polio vaccine inventor Jonas Salk, or heart transplant pioneer Christiaan Barnard. Practicing any other way was an abrogation of one’s duty. Physicians of my father’s era saw their paternalism as not only altruistic, but therapeutic: it was widely believed that if patients followed doctors’ orders (and that is what therapies and other interventions were called), they were more likely to recover. Sometimes in serious illness, New York infectious diseases specialist Walsh McDermott said, the physician himself was the treatment.
 
Perhaps the best example of how paternalism dominated medicine was the fact that physicians in the postwar era routinely lied to cancer patients about their diagnoses. Doctors jumped through all sorts of hoops to try to convince their patients—many of whom were dying of their disease—that they had merely a tumor or an inflammation. Surgery and radiation therapy, the patients were told, were given “to be on the safe side.” When Columbia nephrologist Jay I. Meltzer joined a group practice of more senior physicians, he soon realized that “the best doctors were,” paradoxically, “the best liars.” He demurred, preferring to find out from patients in advance what they would or would not like to know about their diagnoses. Meltzer told me a story about paternalism involving his former Columbia colleague and devoted physician Randolph Bailey. After learning of the famous 1964 surgeon general’s report delineating the dangers of cigarettes, Bailey had told Meltzer that he planned to keep smoking so that his patients, who were unlikely to be able to stop themselves anyway, would not feel “frightened and helpless.”
 
Thanks to the large number of clinical advances in the postwar years—many of which came from experimental research—the medical profession at this time was gaining enormous prestige. The new antimicrobial agents had made formerly ubiquitous and scary diseases, like syphilis, tuberculosis, and bacterial pneumonia, far more manageable. The discovery and synthesis of insulin meant diabetes was very treatable. Salk’s vaccine had led to dramatically lower rates of polio, the dreaded summer plague. Meanwhile, the ability to bank blood made possible more aggressive operations for aneurysms, cancer, and other conditions. By the early 1960s, there were medications to treat high blood pressure, and dialysis machines to prolong the lives of patients with severe kidney disease. It was no longer enough to simply be a kindly and caring physician; the public also wanted doctors who were engaged with the latest laboratory research. “If They Can Operate, You’re Lucky” was the tagline for a cover story in the May 3, 1963, edition of Time magazine that detailed several innovative and aggressive new operations.
 
From a modern vantage point, it may seem curious that patients were so passive when dealing with such a serious topic as life-threatening illness. And there have always been patients who have questioned their doctors and disregarded their advice. But many sick people welcomed the opportunity to have highly trained professionals make all their decisions. One explanatory model for this behavior, introduced by Harvard University sociologist Talcott Parsons in 1951, was called the sick role. Relieved of their normal duties due to their illnesses, patients believed they had an obligation to do what was necessary to get well, specifically by cooperating in the therapeutic process. Thus, while Columbia professor Robert Loeb’s domineering personality bothered some patients, most revered him.

After all, the man was in the hospital seven days a week and knew the names of all his patients, their family members, and even the hospital janitors. The same admiration was shown to breast surgeon Jerome Urban, who pioneered the super-radical mastectomy, a dramatic operation in which a woman’s breastbone and ribs were removed in an attempt to get rid of elusive cancer cells. Urban often slept on the couch in his office so he would be available to resume operating first thing in the morning. Long after such disfiguring surgery had been discredited, Urban’s patients remained quite certain his extraordinary efforts had saved their lives. Late at night, Columbia-Presbyterian surgeon Philip Wiedel pushed a small cart down the hospital’s corridor, quietly entering patients’ rooms and changing their dressings by himself. “A doctor must work 18 hours a day and seven days a week,” wrote one physician from this era. “If you cannot console yourself to this, get out of the profession.”

But by the early 1970s, the situation had begun to change. Over the previous decade, a series of research scandals revealed that some physicians had been willing to put fame and science above their concerns for patients. These violations occurred despite the fact that the Nuremberg Code—written in 1946 in response to the inhumane experiments carried out by Nazi physicians during the Holocaust—had explicitly mandated that all subjects must give informed consent before being enrolled in research. Yet in one case, a doctor studying the body’s immune response to cancer had actually injected cancer cells into the skin of chronically ill noncancer patients at a hospital in Brooklyn. These individuals, many of whom were, ironically, Holocaust survivors, were told only that they needed an injection. And what about the deliberate administration of active hepatitis virus into physically and mentally disabled children at the Willowbrook State School on Staten Island? The researchers argued that since there were frequent outbreaks of hepatitis at the institution, these children would get the disease anyway, and the study might lead to a preventive vaccine. But wasn’t this doing harm to a population that could not consent? The outrage reached a peak in 1972 when an Associated Press reporter revealed that in Tuskegee, Alabama, the US Public Health Service had observed poor southern African American men with syphilis for as long as forty years in order to study the “natural history” of the disease. The researchers had even continued the experiment—depriving the men of treatment—when a highly curative antibiotic, penicillin, became available in the 1940s. Many of the subjects died of syphilis as a result of this deception. That the men in the Tuskegee study were black and the doctors who conducted research at major medical centers were overwhelmingly white was especially objectionable in an era of civil rights protests.
 
Meanwhile, some women with breast cancer were in almost full-fledged revolt. For decades, surgeons had done biopsies of breast lumps while women were under anesthesia. If the biopsies were positive, the surgeons believed immediate radical mastectomies were indicated. Rather than awakening their patients to obtain consent, they preferred to forge ahead with the procedure, although if a woman was married, the surgeon would ask the husband’s permission. When a woman awoke from this operation, she commonly reached for her chest to see whether or not her breast was still there, an experience that many described as thoroughly traumatic.
In the early 1970s, with the support of an iconoclastic surgeon from the Cleveland Clinic, George Crile Jr., a group of activist women began to refuse both this combination procedure and the reflexive use of such a mutilating radical operation for small, localized breast cancers. The best known of these women was Rose Kushner, a Washington, DC, journalist whose story I was privileged to tell in The Breast Cancer Wars, my book on the history of breast cancer. Confronted by this feminist initiative, most breast surgeons initially became more—not less—paternalistic, and at times patronizing. For example, in 1971, another journalist, Babette Rosmond, got her surgeon to agree to do just the biopsy of a breast lump. But when it came back positive and she asked for a few weeks to consider her options rather than immediately undergoing a radical mastectomy, he called her “a silly and stubborn woman” and made the ridiculous claim that she might be dead in a few weeks without the procedure.
 
By the mid-1970s, historians and other authors began to portray the medical profession and the history of medicine in an unflattering light. Medical Nemesis, by the Austrian philosopher Ivan Illich, argued that doctors did far more harm than good. Patients, he wrote, were “defenseless” against “the damage that doctors inflict with the intent of curing.” In The Birth of the Clinic, French historian Michel Foucault saw paternalism as a cloak that gave physicians far too much power and, paradoxically, also distanced them from patients.
 
Neither a breast surgeon nor a gynecologist, my father was less likely to encounter these new activist patients. But issues of patients’ rights were slowly creeping into his world, and at times, his notes contained a glimmer of the coming revolution. For example, one of his patients developed congestive heart failure due to endocarditis but, against the advice of my dad and a cardiologist, was able to successfully maintain a very vigorous exercise program. “We have a situation,” my father wrote, “where the patient, either out of ignorance, or fear, teaches the consultant something and challenges some traditional concepts.” Cases like this reminded my dad that even when he relied on his best medical judgment, he was not necessarily right.
 
More interesting was a case in which an orthopedic surgeon disregarded my father’s advice regarding an infected elbow without telling the patient what had been recommended. My dad, in contrast to the surgeon, thought that the elbow needed to be opened up and drained. My father later wrote that he did not know where his “legal, moral and ethical duty would reside” in this circumstance, but that “ideally,” he would “ignore the surgeon and directly tell the patient what I believe that the proper treatment should be and recommend that she seek this from some other source.” In this instance, at least, the etiquette of not stepping on his colleague’s toes won out over the patient’s right to know my dad’s contrary opinion. He did not go behind the orthopedist’s back and talk to the patient. But the decision clearly made him uncomfortable.
 
 
Coincidentally, twenty years later, when running a session on bioethical dilemmas, I heard a very similar case, one in which a neurosurgeon was insisting on inserting a drain into the head of a boy despite the fact that the radiologist had told the house staff that it was not needed. When asked whether I thought the younger doctors had an ethical obligation to go over the senior surgeon’s head and inform the boy’s parents about this disagreement, my answer was an unequivocal yes. In an era of patient autonomy, the parents simply had the right to know. One can sense my father struggling here to incorporate new ethical norms into a very powerful and familiar style of medicine that he had long believed was best. It was a difficult task.
 
The tension between doctors’ prerogatives and patients’ rights came to a head, most commonly, in questions of medical error. Traditionally, hospitals had dealt with these problems internally, either informally among the involved physicians or at morbidity and mortality conferences. Although mistakes might be admitted during such reviews with the goal of making sure they did not happen again, any acknowledgments of guilt were kept secret from patients, families, and even uninvolved coworkers. The fear of lawsuits was simply too overwhelming. Thus, my father often used errors he had witnessed to provide himself and his colleagues with important lessons for the future, such as “It’s not necessarily gangrene just because there is some gas in the tissues” and “You can’t use a reduced dosage of Loridine in patients who have renal dysfunction.” What to do about the problem of medical errors more broadly was not addressed. In one particularly disturbing case, the father of a young man with a possible brain abscess kept interfering with the efforts of my dad and the other doctors to obtain a diagnostic arteriogram because the test was invasive and involved the injection of dye. As the negotiations persisted, the patient suddenly became comatose: he did have an abscess and it had ruptured, a true medical emergency that might have been avoided. Fortunately, the man survived. Not doing what was best for his patient, my father concluded, had been a huge mistake. “It will never happen again as far as I’m concerned,” he wrote. But, as usual, no larger investigation into the case by the hospital hierarchy ensued.
 
My father’s reticence to go public extended to cases in which patients had been mismanaged prior to being admitted to the hospital. Some patients’ serious illnesses could be traced to questionable diagnostic or therapeutic choices made by previous clinicians. In one case, for example, my dad saw a woman for pneumonia and realized that none of the other doctors had found her very obviously enlarged neck lymph nodes, which indicated that the patient almost surely also had lung cancer. He likely never revealed that mistake to the patient. When a newborn baby developed a serious infection known as toxoplasmosis and my father was called in on the case very late, he experienced “true frustration.” Although he was “not certain that earlier treatment would have made a difference,” he believed that effective therapy had been available, “and it might have made the difference between a child totally retarded and institutionalized and perhaps a child who could have lived a more normal existence.” Again, there is no suggestion in his journals that he ever said anything to the baby’s parents. My father thought that he could be a more effective consultant by educating his colleagues about their errors and reminding them when and why they should call him. But that is where it ended, at least well into the 1980s. A consultant, he wrote, had to be “extremely careful in what he says” to patients and families. Under the ethical codes of the era, the sanctity of the doctor-doctor relationship still took precedence.
 
But my father remained conflicted about this sort of silence and made some gestures that—if not open admissions of errors—acknowledged bad outcomes. In one case, a teenage boy had died of an extremely severe pneumonia that his pediatrician had not originally taken seriously. My dad, who described himself as “shaken by this case,” asked the referring physician if he might seek out the family and explain what had taken place. “I did this,” my father wrote, “and it was not a very pleasant task.” I suspect that, in addition to expressing his condolences, my dad focused on the medical aspects of the case. In another instance, he typed up a three-page single-spaced note for his own files about a boy who had died of severe liver disease after bouncing in and out of several hospitals. “Given the final diagnosis and a treatable disease, could we have expected to save this youngster?” he asked himself. “I think the answer is yes and no.” My father attended the several-hour autopsy but called the “cold and austere” autopsy room “chilling” and felt that he was the only person in attendance with any emotional connection to the case. There is no mention of a postmortem visit with the parents in this instance, but it was this type of complicated case that bioethicists would soon urge physicians to openly discuss with patients and their families.
 
Being a full-time academic meant that in addition to seeing infectious diseases consults at the Mount Sinai and neighboring hospitals, my father taught medical students and house officers and sat on various medical school and hospital administrative committees. He also regularly read up to twelve medical journals, both those devoted to his specialty and others meant for the larger medical community. Keeping up with the literature was essential for any practicing clinician, but particularly for the consultant, who was relied upon for the latest knowledge about new medications and recent scientific studies. Not knowing all such information would have been an abrogation of one’s duty. As was the case in the homes of many doctors of this era, our bookshelves were lined with volumes of certain medical journals that my dad had read and, at the end of each year, had bound. This information was also available at the medical school, but having a home library ever available for consultation was important. In the days before the Internet simplified researching the scientific literature, my father would rip out articles on interesting cases from various journals. When encountering a similar case at work, a lightbulb would go on and he would try to find the jagged pages he had stuck, most often, in a poorly labeled folder. At one point he wrote that some people were “incredulous” that he had no interest in golf, bridge, or other hobbies, but he nonetheless remained focused on increasing his medical knowledge, studying his profession.
 
My dad’s passion for medical research came through vividly in an anecdote I found in his journals about a national infectious diseases conference that he attended. A speaker there had presented exciting data on a new modality, monoclonal antibodies, that held enormous potential for the treatment of infections and cancers. Someone in the audience asked why he had not mentioned the earlier work of another researcher, leading the speaker to praise that investigator’s work, which had gone largely unappreciated. Then another hand went up in the audience. It was the earlier researcher himself, who then recounted his saga, adding that the Nobel Prize–winning immunologist Macfarlane Burnet had once come up to him on a London bus to commend him for “immortalizing the cell that makes antibody.”
 
“The audience went wild,” my father wrote, “well, at least as wild as that type of audience could muster.”

It is not hard to understand why he included this event in his journals. Monoclonal antibodies were the exact type of scientific breakthrough that had drawn my father to infectious diseases in the late 1950s, as the nascent specialty was conquering tuberculosis, polio, and other dread diseases. There was nothing quite so exciting in medicine as learning about a new technology that might save the lives of otherwise doomed patients.
 
My dad fanatically followed his own patients, even when we were on vacation. He scheduled daily calls from Cape Cod or wherever we were staying to review the cases with the residents or fellows covering the infectious diseases service. We traveled only at the end of the month so that the covering doctors would know the service well, a habit that struck me as utterly inconceivable once my wife and I began the arduous task of planning family vacations that suited all our schedules. In later years, when my parents vacationed each June with my uncle in the south of France, my father still kept tabs on his patients. A June 1990 journal entry noted that he was “consumed with worrying about my patients” and had dreamed about the sickest one the previous night. I am not positive, but I suspect he himself paid for the phone calls from France to Cleveland. Indeed, as the only infectious diseases specialist at Mount Sinai, he was technically on call every day he worked there from 1973 until 1993, when he finally hired an associate. And yet, as Robert Bonomo reminded me, my omnipresent father always made sure to let his younger colleagues “spread their wings.” “What are you trying to achieve?” was one of his favorite questions to his trainees.

My father’s various commitments left little time for the other portion of his job: clinical research. The only time he could do this activity was on nights and weekends. And that’s when he did it. Pretty much every night after dinner, if he was not attending a meeting, he would spend hours reading and writing at a table crowded with books and journals. Within the house, my dad was sort of a wandering Jew. For a while, he used the kitchen table after it had been cleared. However, this necessitated reorganizing and removing the materials every night before he went to bed. So at some point, he commandeered the less-used dining-room table, which could serve as a more permanent repository. The fourth bedroom upstairs, his home office, was piled high with journals, books, and medical charts. New York City gastroenterologist and author Michael Lepore called the colleagues of his who spent their free time doing extra patient care, teaching, and researching the “sons of Hippocrates,” physicians who “gave more than they took.” My dad was doing the same thing five hundred miles to the west.
 
I grew up thinking that my father’s constant work was essential and commonplace. With important knowledge to soak up and papers to write, he did what he had to do. Sure, it made him less available to his family, but my admiration for his diligence substantially outweighed whatever frustration I felt. Once I was a teenager, I was just as glad to have him occupied and not bothering me. My sister, however, later admitted that she had felt that his work habits were excessive and even selfish. For my mother, her spouse’s constant immersion in medicine was one of many sacrifices she had had to make in marrying a physician who was both an academic and a workaholic. Having now read his journals, I see that publishing his findings was his way of keeping alive the sort of case-based clinical knowledge that he believed was disappearing.
 
Meanwhile, I was growing up, staying at Hawken School for both middle and upper school. Decades later, when I began researching events in the history of medicine in the 1960s and 1970s, I was glad that I had started Hawken in 1969. The formal traditions that I experienced at the school must have provided a reassuring contrast with the social turmoil that had come to Cleveland, mostly in the form of riots in the largely African American Hough area that abutted Case Western Reserve and its affiliated hospitals. On at least one occasion, the police prevented my father from getting to the Veterans Administration hospital due to violence in the area. Hawken stayed immune for as long as it could. When I started there as a fourth grader in 1969, we called all the male teachers “sir” and stood whenever a female entered the room. Before winter and summer vacations, each student formally shook hands with the entire faculty. Within a couple of years, all these customs, plus the dress code, seemed antiquated and had been relaxed. But my Hawken experiences reminded me about the varying ways in which social change spreads in different communities.
 
A notable event during my middle school years was my bar mitzvah. There had been no question that my father would have an Orthodox bar mitzvah, given his religious household. But, in part due to his subsequent ambivalence about Judaism, our family had never even joined a temple. By the time I got around to deciding to forge ahead, at age twelve, the only Hebrew school willing to prepare me for the event was a local Orthodox school, Yeshiva Adath B’nai Israel, located about a mile from my house. The fact that I was the son of a noted Cleveland doctor probably did not hurt in their decision to accept me as a student. Not surprisingly, YABI was like a different world for me, although one my Eastern European ancestors would have recognized. The male teachers dressed in black, had long beards, and wore traditional yarmulkes and tzitzit. The women were second-class citizens and, to my chagrin, had to sit separately from the men at services. Like my father, I had little use for the prayers.
 
Ultimately, however, I was glad I had a bar mitzvah. All four of my grandparents, plus my great-grandfather Ben, attended. My grandfather Meyer, upset that my father had largely renounced his faith, was thrilled that the service was Orthodox, allowing me to chant in Hebrew for close to two hours. It was especially meaningful to me that Ben, by then a widower and in his eighties, was there. Here was a man who had left Poland without his family or money, somehow leapfrogging over the Holocaust, and who was now attending his great-grandson’s Orthodox bar mitzvah in America. I hope my performance that day was at least partial payback for the remarkable path my ancestors had followed.
And my experiences at YABI were a good stepping-stone for my future career studying bioethics and history. The biblical stories we discussed in class, populated with just and heroic characters (usually Jews) and readily identifiable villains (occasionally Jews, but usually not), were full of ethical lessons about right and wrong. And I recall some provocative discussions about whether these stories were historically accurate or merely moralistic tales that anonymous scholars had penned. I also enjoyed questioning the seemingly straightforward religious traditions that Rabbi Joseph Fabian and the teachers passed down to the school’s students. For example, I remember asking why YABI members offered to come into our homes with some type of blowtorch to remove every last bit of bread (chametz in Hebrew) in our cabinets in preparation for the Passover holidays. “Isn’t the symbolic act of eating no bread for eight days much more important than whether or not you accidentally leave a few crumbs lying around?” I asked. The teacher, as I recall, was appalled at my heresy. The medical profession, I would later learn, could be just as dogmatic about its traditions and beliefs.
 
Meanwhile, middle school was a challenge. I had gained weight, developed a bad case of acne, and, perhaps worst of all, become very socially awkward. As Hawken was still an all-boys school, I had fewer and fewer interactions with girls. My major outside interest was sports.
 
Plus, I was in conflict with my father. “Barron, I don’t know at all,” he wrote in 1977. “He’s a closed creature, a self-contained enigma.” In my defense, this characterization probably applies to a lot of teenagers. And the apple did not fall far from the tree. My dad kept his emotions locked away as well. The thing that bothered him most was my academic performance—or lack of it. Accustomed to my getting all As, he did not understand why this was no longer the case and why it did not bother me more. Truth be told, I don’t really know the answer myself. One’s teenage years are generally a time of rebellion, but, aside from my solitary nature, I was essentially a Boy Scout. I had no allowance and earned my spending money with a paper route that required me to get up at six in the morning. Moreover, there were neither girls nor drugs nor secret parties; I did not even talk back to my parents. Once I was eighteen, I drank only on weekends and moderately. In fact, after I got my driver’s license, I became the de facto designated driver for my friends, even before the concept had gained currency in this country. My friends and I were fairly quintessential nerds, bonding over sports and ogling girls, undoubtedly to their dismay.
 
So perhaps taking it easy at school was my brand of revolt. In retrospect, I was probably exhausted. I remember staying up most nights to watch the monologue on Johnny Carson’s Tonight show, which meant that I was getting only six hours of sleep much of the time. And I was not doing terribly, by any means, just Bs along with my As. I think most of my father’s frustration stemmed from his own childhood history. Having had a father who had been forced to drop out of school, he had busted his butt at academics while also working part-time jobs. But here I was, attending a prestigious private school and seemingly not doing my best. When I got what he believed were bad report cards, he would yell at me and then give me the silent treatment. Typical of the era’s wives and mothers, my mom quietly tried to defend me but never really challenged my father’s authority.
 
Some subjects certainly interested me more than others. My first love was history. Going back to my early obsession with the presidents, I had always been fascinated with the past and would read for hours about old sports events, movies, buildings, and political figures. Many of my sixth-grade classmates were bored during a trip to the Henry Ford Museum and Dearborn Village, but I remember staring intently at the early-twentieth-century cars and gasoline pumps, even imagining myself living in that era. I was very nostalgic as a kid and have remained so as an adult. History tests were always easy for me, as I was a great memorizer of facts and an above-average writer. As a high school senior, I became the coeditor of the peculiarly named school newspaper, the Affirmative No. I also did fine in math, biology, chemistry, and physics, which would help me as a premed in college.
 
My summers were largely spent at home, aside from the annual late August vacation to beautiful Wellfleet on Cape Cod. I bypassed the opportunity to go to sleepaway camp. A typical summer for me was spent doing my paper route and either going to a local day camp or taking classes at the public high school. I also “worked” for my father, requesting reprints for him of medical articles—often from obscure journals—the titles of which he had found interesting. In an era before the Internet and Medline, exchanging reprints was an important way for physicians to share information. I also sent out reprints of my dad’s articles. Even though many requests for them came from South America, Europe, and Asia, and thus required extra postage, my father saw the opportunity to share his work as an honor. My job, for which I was reimbursed, was tedious, but once again I admired my dad’s zeal for obtaining and disseminating medical knowledge.
 
In the summer of 1975, between ninth and tenth grades, I embarked on a new activity. In addition to his other tasks, my father had become the medical director of Montefiore, a nursing home started under Jewish auspices and located in Cleveland Heights, about a twenty-minute bicycle ride from my house. Most of the medical problems among the residents were not infectious in nature, but my father was also a highly competent internist. I suspect he took the job mostly for the extra paycheck it brought in.

As with other nursing homes, Montefiore encouraged teenagers to volunteer as “friendly visitors.” In taking on this position, my main task was to keep the residents company, either in their rooms, outside, or in the small coffee shop. Why did my father want me to volunteer at Montefiore? Perhaps, as with the not-too-subtle interesting clinical vignettes he frequently told me, it was part of his plan to push me toward a career in medicine. But I suspect that it was more related to his larger campaign to get me interested in something other than sports. As usual, I complied with his suggestion. The fact that two cute girls from the public high school were also volunteering there probably did more to motivate me than anything else.
 
But once I started working there, I had to admit that I liked it. One of my earliest memories is my father taking me upstairs to the nursing home’s second floor, which was essentially a large dayroom for patients who needed full-time supervision. He introduced me to an elderly man, impeccably dressed in a suit and tie, who had been the chairman of medicine at one of the local hospitals. We chatted briefly and he wished me luck. When my father and I went back downstairs, I asked him what such an impressive individual was doing in that location. “He is completely demented,” I was told. “He no longer has any idea of where he is.” It was my first encounter with Alzheimer’s disease and an utterly vivid demonstration of how a healthy body could house a severely dysfunctional brain.
 
I still remember the first two residents I visited. One, named Esther, was a warm and very talkative woman who probably qualified as the nursing-home gossip. The second, Max, was a polite and generous man who reminded me of my grandfather Meyer. Truth be told, neither of these people needed a volunteer. They were among the most high-functioning and sociable residents of the home. But I assumed that the head of volunteers liked to start out her new charges with simple cases. I eased into my position as a volunteer with little difficulty. I had always been a child who was quite comfortable speaking with adults; I showed them respect, used humor when appropriate, and was eager to hear their stories. Later on, my older patients would often tell me I was the only doctor who listened to them, surely an exaggeration but something I was nevertheless pleased to hear.
 
The most memorable resident of Montefiore at the time was someone we would now call high maintenance. Ed was wheelchair-bound, for reasons I could never quite understand, and highly emotional. He used his feet to move his wheelchair around the premises, constantly getting into other people’s business. When he learned that I was Phil Lerner’s son, he started crying, an event that would repeat itself many times. He would then proceed to rave about how my father was the most wonderful doctor he had ever met and how I was so lucky to have him as a father.
 
Such unbridled emotion made me uncomfortable, especially if either of the two girls I liked was around. But I could not help but be moved. My dad did something for a living that really made a difference to people. Even though I probably hated him at the time for putting so much pressure on me, I could see what his intense devotion achieved for people in need of not only medical care but an interested friend. When I read his journals thirty-five years later, I came to understand how the combination of scientific knowledge, clinical judgment, and prolonged face-to-face contact with patients and families enabled a physician to truly excel at his or her craft.
 
It was my experience at Montefiore that led me to seriously consider becoming a doctor. “I’m delighted,” my father wrote to me on my fifteenth birthday, “that you show this evidence of empathy and willingness to help those less fortunate than yourself.” I returned to the nursing home for a second summer in 1976. I still had no dates, however, with either the Montefiore girls or any others.
 
The next year, 1977, would be a devastating one for our family. My grandfather Meyer unexpectedly died, and my mother was diagnosed with breast cancer. We all took these events hard, but no one took them harder than my father. His relationship to medicine would never be the same.

Table of Contents

PROLOGUE

CHAPTER ONE
The First Dr. Lerner

CHAPTER TWO
Super Doctor

CHAPTER THREE
Illness Hits Home

CHAPTER FOUR
The Second Dr. Lerner

CHAPTER FIVE
Forging My Own Path

CHAPTER SIX
Treating the Whole Patient

CHAPTER SEVEN
Family Practitioner

CHAPTER EIGHT
Growing Disillusionment

CHAPTER NINE
Slowing Down

EPILOGUE

ACKNOWLEDGMENTS

BIBLIOGRAPHIC NOTE

What People are Saying About This

From the Publisher

“Barron Lerner’s marvelous book—a deeply intimate story about his father and the practice of medicine—touches on some of the most profound issues in medicine today: autonomy, medical wisdom, empathy, paternalism and the evolving roles of the doctor and patient.  This is one of the most thoughtful and provocative books that I have read in a long time, and I suspect that generations of doctors and patients will find it just as thought provoking.”
—Siddhartha Mukherjee, author of The Emperor of All Maladies

“The Good Doctor is a lovely book and a loving book; it's a book about medicine and family and ethics and history which embraces complexity and speaks to all those subjects with wide-ranging compassion and great good sense. And it's a father-son doctor saga with much to say about the healing power of story and understanding.”
—Perri Klass, MD, author of A Not Entirely Benign Procedure and The Mercy Rule

“An absolutely compelling treatise on bioethics told thru the lens of a physician's relationship with his physician father. If you want to understand the modern state of ethics in medicine, read this book.”
—Mehmet Oz, MD, Professor and Vice Chair of Surgery, NY Presbyterian/Columbia

“A heartwarming story about a father-son doctor duo spanning a century, exquisitely showing the evolution of medical practice from antibiotics through bioethics. A small gem of a book.”
—Samuel Shem, MD, author of The House of God and The Spirit of the Place
