wise and seemed to transcend the boundaries of religion, politics, and division I saw elsewhere.
When my mother developed problems in the early 1960s after giving birth, Dr. Lapin had suggested she come to see him once a week, but at the time she felt the arrangement was too open-ended, and she could not afford it. She was seen instead by another doctor, diagnosed with an ulcer, and ultimately received the standard operation of the day, which involved cutting the vagus nerve and removing part of the stomach. This left her with bowel problems for the rest of her life, and with regret that she had not taken up Dr. Lapin's offer of treatment for what she later regarded as postnatal depression.
When my mother consulted him about the wisdom of an operation for my father, Dr. Lapin was slow to comment. But when pressed, he pointed out that my father had a number of illnesses, any of which could kill him before the tumor would. Many people, he said, went to their graves with cancers, heart disease, or other problems, but these were not what killed them. An operation would take a heavy toll on him.
My mother relayed this perspective to my father and suggested that he take six months to build himself up and then have an operation if he felt stronger; he agreed. When this plan was mentioned to the surgeon, Dr. Neligan, he responded, “That's fine, but have him out of the hospital within 48 hours.” When my mother revealed that my father still didn't know he had cancer, the surgeon went straight from the phone to tell him. Without an operation my father would be dead within months, Dr. Neligan indicated, but an operation offered the prospect of a cure. Alarmed, my father agreed, and the operation took place two days later. Afterwards Dr. Neligan said that when they opened my father up, there was little they could do about his tumor. He died six months later, his life almost certainly shortened by the operation.
If there had been progress to speak of in the treatment of lung cancer in the years since my father's death, his medical care might be viewed as one of those sacrifices that at least ultimately benefit others. But there has been little progress, even though advances on almost all medical fronts are trumpeted daily. Genuine progress has been made in some areas, but in most it has been far less than many people have been led to believe. More importantly, when it comes to pharmaceuticals in particular, many of these apparent advances underpin and contribute to what in recent decades has become a relentless degradation of medical care, a replacement of Lapins with Neligans, a quickening march toward Pharmageddon. While drugs played no part in what happened to my father, they have played a huge role in fostering a surgical attitude to medical care, a kind of fast healthcare.
My father's illness came just as I entered medicine, seventy years after a momentous change in Western clinical practice. Around 1900, a series of new diagnostic measures, some based on blood tests, others linked to X-rays, and yet others involving the culture of sputum or urine samples for bacteria, enabled physicians to distinguish among many diseases and find remedies for some of them. Before this, the diagnosis patients got was based on how they looked and what they said about themselves when they walked through the door of a doctor's office: if they were weak and tired, they had “debility”; if they were wasting, they had “consumption.” If they were diagnosed with a tumor, it was because it was visible or could be felt; if they had diabetes, it was because their urine had a distinctive smell. With the development of the new tests, however, the diagnosis came only after a blood test or X-ray confirmed what was wrong, perhaps weeks after the visit to a doctor's office or admission to the hospital. And the tests revealed new conditions such as heart attacks and duodenal ulcers. It became clear that some states of consumption stemmed from tuberculosis, while others did not.
A new breed of physician and hospital emerged. In Boston, Richard Cabot was celebrated for his diagnostic acumen, and the reputation of Massachusetts General Hospital in the early decades of the twentieth century began to soar on the abilities of its physicians, aided by their new technologies, to get the diagnosis right, which, it was presumed, would lead to better medical care. But others were concerned. Alfred Worcester, a professor of hygiene and prominent Massachusetts physician, who was later lauded as a father of both modern geriatrics1 and palliative care,2 lamented that “the demands of modern diagnosis diverted doctors away from developing and exercising their traditional knowledge of human nature.” Worcester was troubled that the new testing requirements for making a diagnosis would alter a doctor's interactions with his patients. Absorbed in the new technologies, doctors would lose their ability to have an ongoing therapeutic influence over their patients.3
Good medical care, we might imagine, should manage to embrace the visions of both Cabot and Worcester. The new techniques after all made a great difference in our ability to help patients, and while humane medical care is wonderful, most people would regard a cure as excellent care even if they don't much like the doctor. Patients in the early twentieth century voted with their feet and sought out the new generation of specialists. But as my father's case illustrates all too vividly, there is a balance to be sought between caring and attempts at curing, and this balance is particularly important in the many instances where cures aren't possible.
Early concerns that medicine might lose its caring soul in exchange for earthly cures were sidelined in the 1940s and 1950s when a host of new life-saving treatments came onstream. While there were also great surgical advances, culminating in the dramas of the first kidney and heart transplants, the key breakthroughs occurred in the pharmaceutical domain. In addition to offering cures in their own right, new drugs like the immunosuppressants and antibiotics laid a basis for developments in surgery and other areas of medicine.
Despite these wonderful breakthroughs (indeed, some critics thought, in part because of them), concerns about medical specialism reemerged in the 1960s, framed in terms of medicalization. Concerned observers argued that we were ceding too much power to a medical establishment that was pathologizing huge swathes of daily life and taking it upon itself to define what it meant to be human, a task for which it was not equipped. The most powerful critique of medicalization came from the Austrian philosopher Ivan Illich in his book Medical Nemesis,4 published in 1975, the year my father died.
In retrospect, the mid-1970s can be seen as close to the acme of medicine's ascendancy. The pharmaceutical industry was at this point still a junior partner to the medical establishment. But as roles have shifted and the power of drug companies has become more apparent, references to medicalization have, since the mid-2000s, begun to give way to references to pharmaceuticalization, which increasingly sees our identities as a series of behaviors to be managed by drug use.
Then in 2007, Charles Medawar, Great Britain's leading healthcare consumer advocate, raised the prospect of something beyond pharmaceuticalization: “I fear that we are heading blindly in the direction of Pharmageddon. Pharmageddon is a gold-standard paradox: individually we benefit from some wonderful medicines while, collectively, we are losing sight and sense of health. By analogy, think of the relationship between a car journey and climate change—they are inextricably linked, but probably not remotely connected in the driver's mind. Just as climate change seems inconceivable as a journey outcome, so the notion of Pharmageddon is flatly contradicted by most personal experience of medicines.”5
By “Pharmageddon,” what Medawar and colleagues (myself included) had in mind was something quite different from a simple pharmaceuticalization, in which we talk about our neurotransmitters rather than our moods, a biological version of secularization.6 Pharmageddon refers not to a change in the language of medicine, or from religious to biological language, but to a process deployed in the first instance in the belief that it would better enable us to care for each other, a process that now seems set to eliminate our abilities to care, a fate that beckons in spite of what everyone wants. At the heart of this process is the turn toward quantification in the middle years of the twentieth century. While genuinely helpful, this turn gave healthcare a set of scientific appearances that a handful of shrewd advisors and marketers have been able to manipulate, infecting our abilities to care as if with a clinical immunodeficiency virus (CIV). As a result, the defense reactions we might expect from prestigious journals and professional bodies just don't happen. Indeed, the virus seems to have subverted these bodies to its own purposes, so that when critical comments are raised they react almost as though it were their programmed duty to shield a few fragile companies from the malignant attentions of a pharmaco-vigilante.