Too Short for a Blog Post, Too Long for a Tweet 181
Here are a few excerpts from a book I recently read, “Natural Causes: An Epidemic of Wellness, the Certainty of Dying, and Killing Ourselves to Live Longer,” by Barbara Ehrenreich.
The paradox of the immune system and cancer is not just a
scientific puzzle; it has deep moral reverberations. We know that the
immune system is supposed to be “good,” and in the popular health
literature we are urged to take measures to strengthen it. Cancer
patients in particular are exhorted to think “positive thoughts,” on the
unproven theory that the immune system is the channel of communication
between one’s conscious mind and one’s evidently unconscious body. But
if the immune system can actually enable the growth and spread of
cancer, nothing may be worse for the patient than a stronger one. He or
she would be better advised to suppress it, with, say, immunosuppressive
drugs or perhaps “negative thoughts.”
In the ideal world imagined by mid-twentieth-century biologists, the
immune system constantly monitored the cells it encountered, pouncing on
and destroying any aberrant ones. This monitoring work, called
immunosurveillance, supposedly guaranteed that the body would be kept
clear of intruders, or any kind of suspicious characters, cancer cells
included. But as the century came to a close, it became increasingly
evident that the immune system was not only giving cancer cells a pass
and figuratively waving them through the checkpoints. Perversely and
against all biological reason, it was aiding them to spread and
establish new tumors throughout the body.
You can think of death bitterly or with resignation, as a tragic
interruption of your life, and take every possible measure to postpone
it. Or, more realistically, you can think of life as an interruption of
an eternity of personal nonexistence, and seize it as a brief
opportunity to observe and interact with the living, ever-surprising
world around us.
Once I realized I was old enough to die, I decided that I was also old
enough not to incur any more suffering, annoyance, or boredom in the
pursuit of a longer life. I eat well, meaning I choose foods that taste
good and that will stave off hunger for as long as possible, like
protein, fiber, and fats. I exercise—not because it will make me live
longer but because it feels good when I do. As for medical care: I will
seek help for an urgent problem, but I am no longer interested in
looking for problems that remain undetectable to me. Ideally, the
determination of when one is old enough to die should be a personal
decision, based on a judgment of the likely benefits, if any, of medical
care and—just as important at a certain age—how we choose to spend the
time that remains to us.
As it happens, I had always questioned whatever procedures the health care
providers recommended; in fact, I am part of a generation of women who
insisted on their right to raise questions without having the word
“uncooperative,” or worse, written into their medical records. So when a
few years ago my primary care physician told me that I needed a bone
density scan, I of course asked him why: What could be done if the
result was positive and my bones were found to be hollowed out by age?
Fortunately, he replied, there was now a drug for that. I told him I was
aware of the drug, both from its full-page magazine ads and from
articles in the media questioning its safety and efficacy. Think of the
alternative, he said, which might well be, say, a hip fracture,
followed by a rapid descent to the nursing home. So I grudgingly
conceded that undergoing the test, which is noninvasive and covered by
my insurance, might be preferable to immobility and
institutionalization.
The result was a diagnosis of “osteopenia,” or thinning of the bones, a
condition that might have been alarming if I hadn’t found out that it is
shared by nearly all women over the age of thirty-five. Osteopenia is,
in other words, not a disease but a normal feature of aging. A little
further research, all into readily available sources, revealed that
routine bone scanning had been heavily promoted and even subsidized by
the drug’s manufacturer. Worse, the favored medication at the time of
my diagnosis has turned out to cause some of the very problems it was
supposed to prevent—bone degeneration and fractures. A cynic might
conclude that preventive medicine exists to transform people into raw
material for a profit-hungry medical-industrial complex.
Then my internist, the chief physician in a midsized group practice, sent
out a letter announcing that he was suspending his ordinary practice in
order to offer a new level of “concierge care” for those willing to
cough up an extra $1,500 a year beyond what they already pay for
insurance. The elite care would
include twenty-four-hour access to the doctor, leisurely visits, and,
the letter promised, all kinds of tests and screenings in addition to
the routine ones. This is when my decision crystallized: I made an
appointment and told him face-to-face that, one, I was dismayed by his
willingness to drop his less-than-affluent patients, who appeared to
make up much of the waiting room population. And, two, I didn’t want
more tests; I wanted a doctor who could protect me from unnecessary
procedures. I would remain with the masses of ordinary, haphazardly
screened patients.
Of course all this unnecessary screening and testing happens because
doctors order it, but there is a growing rebellion within the medical
profession. Overdiagnosis is beginning to be recognized as a public
health problem, and is sometimes referred to as an “epidemic.” It is an
appropriate subject for international medical conferences and
evidence-laden books like “Overdiagnosed: Making People Sick in the
Pursuit of Health” by H. Gilbert Welch and his Dartmouth colleagues Lisa
Schwartz and Steve Woloshin. Even health columnist Jane Brody, long a
cheerleader for standard preventive care, now recommends that we think
twice before undergoing what were once routine screening procedures.
Physician and blogger John M. Mandrola advises straightforwardly:
Rather than being fearful of not detecting disease, both patients and doctors
should fear healthcare. The best way to avoid medical errors is to avoid
medical care. The default should be: I am well. The way to stay that
way is to keep making good choices—not to have my doctor look for
problems.
When a pediatrician prescribed my second child an antibiotic for a cold, I
asked whether she had a reason to believe his illness was bacterial.
“No, it’s viral, but I always prescribe an antibiotic for a nervous
mother.” The prescribing was, in other words, a performance for my
benefit. Muttering that I was not the one who was going to be taking it,
I picked up my baby and left.
If a medical procedure has no demonstrable effect on a person’s
physiology, then how should that procedure be classified? Clearly it is a
ritual, which can be defined very generally as a “solemn ceremony
consisting of a series of actions performed according to a prescribed
order.” But rituals can also have intangible psychological effects, so
the question becomes whether those effects in some way contribute to
well-being, or serve to deepen the patient’s sense of helplessness or,
in my case, rage.
Western anthropologists found indigenous people worldwide performing supposedly
health-giving rituals that had no basis in Western science, often
involving drumming, dancing, chanting, the application of herbal
concoctions, and the manipulation of what appear to be sacred objects,
such as animal teeth and colorful feathers. Anthropologist Edith Turner
in the 1980s offered a lengthy and lovingly detailed account of the
Ihamba ritual performed by the Ndembu of Zambia. The afflicted person,
whose symptoms include joint pains and extreme lassitude, is given a
leaf infusion to drink, then her back is repeatedly anointed with other
herbal mixtures, cut with a razor blade, and cupped with an animal
horn—accompanied by drumming, singing, and a recital of grudges the
patient holds against others in the village—until the source of the
illness, the Ihamba, exits her body.
Does this ritual work? Yes, insofar as the afflicted person is usually
restored to his or her usual strength and good humor. But there is no
way to compare the efficacy of the Ihamba ritual to the measures a
Western physician might use—the blood tests, the imaging, and so on—in
part because the Ihamba itself is not something accessible to scientific
medicine. It is conceived as the tooth of a human hunter, which has
made its way into the victim’s body, where it “bites” and may even
reproduce. If this sounds fantastical, consider that, as an agent of
disease, a “hunter’s tooth” is a lot easier to visualize than a virus.
Sometimes at the end of the ceremony one of the officiants will even
produce a human tooth, claiming to have extracted it from the victim’s
body. And of course the opportunity to air long-held grudges may be
therapeutic in itself.
Most of us would readily recognize the Ihamba ceremony as a “ritual”—a
designation we would not be so quick to apply to a mammogram or a
biopsy. The word carries a pejorative weight that is not associated
with, for example, the phrase “health care.” Early anthropologists could
have called the healing practices of so-called primitive peoples
“health care,” but they took pains to distinguish the native activities
from the purposeful interventions of Euro-American physicians. The
latter were thought to be rational and scientific, while the former were
“mere” rituals, and the taint of imperialist arrogance has clung to the
word ever since.
Physicians have an excuse for flouting the normal rules of privacy: The
human body
is their domain, sometimes seen, in the case of women’s bodies, as
their exclusive property. In the middle of the twentieth century, no
woman, at least no heterosexual laywoman, was likely to ever see her own
or other women’s genitalia, because that territory—aka “down there”—was
reserved for the doctor. When in 1971 a few bold women introduced the
practice of “cervical self-examination,” performed with a plastic
speculum, a flashlight, and a mirror, they were breaking two
taboos—appropriating a medical tool (the speculum) and going where only
doctors (or perhaps intimate partners) had gone before. Many doctors
were outraged, with one arguing that in lay hands a speculum was
unlikely to be sterile, to which feminist writer Ellen Frankfort replied
cuttingly that yes, of course, anything that enters the vagina should
first be boiled for at least ten minutes.
Well before the revival of feminism in the 1970s, some American women had
begun to complain about the heavy-handed overmedicalization of
childbirth. In the middle of the century, it was routine for
obstetricians to heavily sedate or even fully anesthetize women in
labor. Babies were born to unconscious women, and the babies sometimes
came out partially anesthetized themselves—sluggish and having
difficulty breathing. Since the anesthetized or sedated woman could not
adequately use her own muscles to push the baby out, forceps were likely
to be deployed, sometimes leading to babies with cranial injuries.
There was, however, an alternative, though obstetricians did not
encourage it and often actively discouraged it: the Lamaze method, which
had originated in the Soviet Union and France, offered breathing
techniques that could reduce pain while keeping the mother and baby
alert. In the 1960s, growing numbers of educated young women were taking
Lamaze classes and demanding to remain awake during birth. By the time
of my first pregnancy in 1970, it would have seemed irresponsible, at
least in my circle of friends, to do anything else.
We were beginning to see that the medical profession, at the time still
over 90 percent male, had transformed childbirth from a natural event
into a surgical operation performed on an unconscious patient in what
approximated a sterile environment. Routinely, the woman about to give
birth was subjected to an enema, had her pubic hair shaved off, and was
placed in the lithotomy position—on her back, with knees up and crotch
spread wide open. As the baby began to emerge, the obstetrician
performed an episiotomy, a surgical enlargement of the vaginal opening,
which had to be stitched back together after birth. Each of these
procedures came with a medical rationale: The enema was to prevent
contamination with feces; the pubic hair was shaved because it might be
unclean; the episiotomy was meant to ease the baby’s exit. But each of
these was also painful, both physically and otherwise, and some came
with their own risks. Shaving produces small cuts and abrasions that are
open to infection; episiotomy scars heal more slowly than natural tears
and can make it difficult for the woman to walk or relieve herself for
weeks afterward. The lithotomy position may be more congenial for the
physician than kneeling before a sitting woman, but it impedes the
baby’s progress through the birth canal and can lead to tailbone
injuries in the mother.
So how are we to think of these procedures, which some doctors still
insist on? If a procedure is not, strictly speaking, medically necessary
to a healthy birth and may even be contraindicated, why is it being
performed? Anthropologist Robbie E. Davis-Floyd proposed that these
interventions be designated as rituals, in the sense that they are no
more scientifically justified than the actions of a “primitive” healer.
They do not serve any physiological purpose, only what she calls “ritual
purposes.” The enema and shaving underscore the notion that the woman
is an unclean and even unwelcome presence in the childbirth process.
Anesthesia and the lithotomy position send “the message that her body is
a machine,” or as Davis-Floyd quotes philosopher Carolyn Merchant, “a
system of dead, inert particles,” in which the conscious patient has no
role to play. These are, in other words, rituals of domination, through
which a woman at the very peak of her biological power and fecundity is
made to feel powerless, demeaned, and dirty.
In one sense, childbirth rituals “worked.” The women giving birth were
often traumatized, reporting to Davis-Floyd that they “felt defeated”
or “thrown into depression”: “You know, treating you like you’re not
very bright, like you don’t know what’s going on with your own body.”
Yet, having submitted to so much discomfort and disrespect, they were
expected to feel grateful to the doctor for a healthy baby. It was a
perfect recipe for inducing women’s compliance with their accepted
social role: rituals of humiliation followed by the fabulous “gift” of a
child.