The Doctor Who Lit Up in the Exam Room: How American Medicine Learned to Listen

By The Then & Now File

If you walked into a doctor's office in 1965 and noticed a cigarette burning in an ashtray on his desk, you probably wouldn't have thought much of it. Doctors smoked. Nurses smoked. Hospitals had smoking lounges. The American Medical Association ran tobacco advertisements in its own journal well into the 1950s, with copy suggesting that certain cigarette brands were easier on the throat.

This wasn't fringe behavior. It was the mainstream of American medicine — and it's just one thread in a much larger story about how dramatically the culture of healthcare has changed in a single lifetime.

When the Doctor's Word Was Final

Mid-century American medicine operated on a model of authority that would feel deeply uncomfortable today. The physician was the unquestioned expert. Patients were largely passive recipients of whatever treatment the doctor decided was appropriate, and the idea of a patient asking detailed questions, requesting a second opinion, or refusing a recommended procedure was viewed — by doctors and patients alike — as somewhere between unusual and rude.

Informed consent, as a formal legal and ethical requirement, barely existed before the 1970s. The principle that a patient has the right to understand what is being done to their body — and to say no — was not standard medical practice. Surgeons sometimes performed procedures during an operation that patients had not agreed to in advance, reasoning that they were in the best position to judge what was necessary. Courts largely backed them up.

Medical records were considered the property of the physician, not the patient. Many doctors routinely withheld diagnoses from patients — particularly cancer diagnoses — on the paternalistic belief that the patient couldn't handle the information and might lose hope. A patient asking to see their own chart was an unusual and sometimes unwelcome request.

Treatments That Made It to the Mainstream

The list of standard medical practices from the mid-20th century that are now considered harmful, misguided, or outright dangerous is long enough to be genuinely startling.

Thalidomide, prescribed to pregnant women in the late 1950s for morning sickness, caused severe birth defects in thousands of children across Europe and elsewhere before being pulled from markets. The US was largely spared thanks to one FDA reviewer, Frances Kelsey, who held up approval due to insufficient safety data — a story that later helped drive stronger drug testing requirements.

Lobotomies were performed on tens of thousands of Americans between the 1930s and 1960s for conditions ranging from schizophrenia to depression to what contemporaries sometimes described as social nonconformity. The procedure's inventor received a Nobel Prize in 1949. It was presented as a medical breakthrough.

Diet pills containing amphetamines were prescribed freely through the 1960s and 1970s. Hormone replacement therapy was handed out to menopausal women for decades before research revealed it carried significant cardiovascular and cancer risks at the doses commonly used. Bed rest was prescribed for everything from back pain to heart attacks, despite growing evidence that immobility often made outcomes worse.

None of these treatments were the work of negligent or malicious physicians. They were the best guesses of a medical culture that hadn't yet built the systems to rigorously test what it was doing.

The Turning Points That Forced Change

Several forces converged in the 1960s and 1970s to begin reshaping how American medicine operated.

The consumer rights movement, energized by figures like Ralph Nader, began pushing back against the idea that professionals — whether automakers or physicians — were beyond accountability. The women's health movement was particularly significant: books like Our Bodies, Ourselves, first published in 1970, directly challenged the paternalism of mainstream medicine and argued that patients — especially women — had both the right and the capacity to understand their own health.

Legal cases through the 1970s began establishing that informed consent was a genuine requirement, not a courtesy. The National Research Act of 1974, passed in the wake of revelations about the Tuskegee Syphilis Study — in which Black men with syphilis were deliberately left untreated for decades without their knowledge — mandated new ethical oversight for medical research. The damage done by Tuskegee to trust in the American medical system still echoes today.

Evidence-based medicine as a formal discipline began gaining traction in the 1980s and 1990s, pushing the field toward systematic research and away from individual physician judgment as the primary standard of care. Randomized controlled trials became the gold standard for evaluating treatments. Practices that couldn't be supported by data began to face real scrutiny for the first time.

The Culture That Replaced It

Today's medical environment looks almost unrecognizable by comparison. Informed consent is a legal requirement before virtually any significant procedure. Patients have the right to access their own records. Shared decision-making — the idea that treatment choices should reflect both clinical evidence and patient values and preferences — is now a recognized and taught framework in medical education.

Smoking, of course, has been banned from hospitals and medical offices for decades. The American Cancer Society and the CDC run active campaigns to help healthcare workers quit. The image of the cigarette-smoking physician in the exam room feels less like history and more like a scene from a different civilization.

None of this means modern medicine is perfect or that the system doesn't have serious gaps. But the distance between what medicine was in 1960 and what it aspires to be today is genuinely vast.

How Recent "Common Sense" Actually Is

Perhaps the most thought-provoking thing about this history is how recently most of it changed. The idea that a patient deserves a full explanation of their diagnosis and a say in their treatment plan became legally and ethically standard within the last fifty years. The systematic testing of medical treatments against control groups — the foundation of modern clinical research — only became widespread in the latter half of the 20th century.

What we experience today as basic, obvious standards of care were, not very long ago, considered radical departures from the norm.

That's not a criticism of the physicians of the past. Most were doing their best with the tools and frameworks available to them. It's simply a reminder that medicine, like every other human endeavor, is a work in progress — and that the progress, when you look at the full sweep of it, has been extraordinary.