The story of psychiatric meds is all about progress. Isn't it?

As the use of 'miraculous' psychiatric drugs rose, so did the rate of disability for mental illness. A medical journalist asked why, and his findings shed light on even the most commonly used antidepressants.

Navy Corpsman prepares prescriptions. (Crosscut archive image)

Robert Whitaker casts a puzzled eye on the story of progress that the American medical community, the pharmaceutical industry, and the public have learned to tell themselves about “the psycho-pharmacological revolution.”

As Whitaker tells it in his engaging, well-researched book, Anatomy of an Epidemic (reviewed in these pages last week), the conventionally accepted story opens in 1955 with the advent of the antipsychotic medication Thorazine. The event was hailed as comparable to the discovery of penicillin. Psychotic patients who took Thorazine or kindred medications daily on an ongoing basis became asymptomatic enough to leave their hospital wards and return to life in the community.  

Then in the late 1980s came Prozac for depression, followed by a host of similar “selective serotonin reuptake inhibitors.” And then, in the 1990s, a second generation of antipsychotic medications arrived, most notably Risperdal. Doctors, pharmaceutical companies, and the media hailed the new drugs as far more effective than their predecessors and as having less-serious side effects.

In fact, during the half-century since 1955, these medications alleviated disturbing symptoms so widely and so successfully, at least for a good while in most patients, that professionals as well as the public came to view thought disorders (e.g., schizophrenia), mood disorders (e.g., major depression and bipolar illness), and anxiety as biochemical imbalances in the brain that could be put right with chemicals.

The drug industry’s favorite analogy was the way insulin acts on diabetes. The “broken brain” model, in which neurochemical imbalances are said to be correctable by chemicals, now turns up everywhere, even though, Whitaker reports, no scientific evidence supports the theory. It's even mentioned in the curriculum of the superb NAMI (National Alliance on Mental Illness) “Family to Family” course for relatives of people with mental illnesses, which I’ve taught several times.

As psychiatry settled into this medical model during the 1980s and 1990s, media references to the new medicines were commonly couched in the language of common physical remedies, said Whitaker, former director of publications at Harvard Medical School and an award-winning medical journalist, in a presentation at Western State Hospital last November. Taking antidepressants for depression and antipsychotics for psychosis, the public was told, is no different from taking antibiotics for biological infections.

But something’s wrong with this story of progress, argues Whitaker, who has written about psychiatry and medicine for 20 years. If the new drugs are so effective and safe, why are increasing numbers of people becoming disabled by mental illness?

Between 1987 and 2007, as the use of psychiatric drugs surged, the number of individuals on the Social Security rolls for disabling mental illness more than tripled. The rate rose, too: In 1987, one American in 184 (about half of 1 percent) was receiving psychiatric disability payments; 20 years later it was one in 76 (about 1.3 percent).
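Those two ways of stating the trend are consistent once population growth is factored in. Here is a quick back-of-the-envelope check (a sketch, not a calculation from Whitaker's book; the rough US population totals of 242 million in 1987 and 301 million in 2007 are assumptions added here):

```python
# Back-of-the-envelope check of the disability statistics cited above.
# The 1-in-184 (1987) and 1-in-76 (2007) rates come from the article;
# the approximate US population totals are assumptions for this sketch.
pop_1987 = 242_000_000  # assumed US population, 1987
pop_2007 = 301_000_000  # assumed US population, 2007

rolls_1987 = pop_1987 / 184  # roughly 1.32 million people on the rolls
rolls_2007 = pop_2007 / 76   # roughly 3.96 million people on the rolls

print(f"1987: {rolls_1987 / 1e6:.2f} million ({1 / 184:.2%} of the population)")
print(f"2007: {rolls_2007 / 1e6:.2f} million ({1 / 76:.2%} of the population)")
print(f"Growth in the rolls: {rolls_2007 / rolls_1987:.1f}x")  # about 3.0x
```

The per-capita rate itself rose by a factor of about 2.4 (184/76); population growth over the two decades accounts for the rest of the tripling in absolute numbers.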

After exhaustive research, Whitaker speculated that medications might be causing an “epidemic” of psychiatric disability. After all, increasing numbers of people are permanently taking medications with a powerful impact on brain and body.

The problem isn’t just the typical side effects of antipsychotics, such as tics, tremors, lethargy, increased blood sugar and cholesterol, movement problems due to muscle stiffness, and obesity leading to diabetes. In the long run, Whitaker says, studies show that these drugs impair cognitive function and emotional engagement. Statistically speaking, they reduce people's fitness for work and life in the community. 

Before 1955, outcomes for mental illness were better than after the advent of the “miracle” drugs. A 1956 study by the National Institute of Mental Health (NIMH) showed that five years after being diagnosed with schizophrenia, only about a third of the patients needed extended or permanent hospitalization.

About a third continued to exhibit some symptoms but were functioning — living in the community and working. And a third, in the natural course of events, had become wholly free of schizophrenia symptoms.

Whitaker compares those conclusions to present-day outcomes. According to respected studies cited in his book, drug treatments leave people who have schizophrenia or bipolar disorder in worse condition after 15 years, on average, than people with these diagnoses who did not take the drugs. Further, depressed people who take antidepressants long-term are more likely to become chronically depressed, and 20-40 percent develop bipolar disorder.

Overall, the book looks like a post hoc argument: "Because Y happened after X, X must have caused Y." Because psychiatric disability surged after the rise in use of psychiatric medications, the medications must be causing the disability.

But correlation is not causation, of course. Whitaker himself concedes that the increased use of mood-destabilizing street drugs — even marijuana has a higher THC content than it did a half-century ago — may lurk behind today’s steep rise in incapacitating mood disorders such as serious bipolar illness. Other factors could also bear on the statistically higher incidence of poor treatment outcomes that Whitaker cites: today's greater media attention to mental illness, gradually declining stigma, and better access to treatment, all of which may be bringing more people to professional diagnosticians.

Add to the mix an increasingly fragmented and hectic society, the medicalizing of common though unpleasant human feelings like sadness and rage, and the possibility that the bar for "normal" is now higher than it was 50 years ago. Throw in the recent growth of a whole industry of disability attorneys, and parents wanting to exercise today's legal rights to disability accommodations for their qualifying children who are not doing well in school. Then there's the fact that (unlike 50 years ago) both parents in families are typically working now, leaving nobody to care for a disabled grown son or daughter at home. Any and all of the above could be contributing to the rise in psychiatric disability statistics.

More to the point, the rate of Social Security disability is "not a good marker for mental health outcomes," said Terry Lee, a psychiatrist at Seattle Children's Hospital. "I’d agree that it’s a bad outcome, and not what we want." But looking at disability statistics isn't the same as doing a rigorous epidemiological study. For that, Lee told me, "we try to sample the whole population and use a structured instrument." He added, though, that the profession is "very concerned about how many kids are being prescribed those medications."

To Whitaker's credit, then, his book does more than analyze disability rolls. The author found and assembled respected research that makes a good case for choosing non-drug treatment alternatives for mental illnesses more often. Medications are sometimes necessary, he says, but why prescribe them immediately, especially in response to a first-time episode? Symptoms stand a good chance of lifting without them, and a drug, once begun, changes the brain in ways that likely make it necessary to keep taking it.

If medications turn out to be required, why not try gradually decreasing dosage levels as soon as possible? Psychiatrist Robert Hilt at Seattle Children's said that it's "beholden upon the prescriber, even if meds seem to be working, to see if they can’t be tapered off."

Whitaker describes several successful treatment programs for schizophrenia based on “psychosocial” interventions that combine intensive talk therapy, behavior management coaching, peer support, and guided involvement of a patient’s family and community. Although sleep medications may be prescribed as part of the treatment plan, heavier antipsychotics are administered only if psychosocial strategies fail to end a psychotic episode, and then the drugs are gradually withdrawn. For example, a study of “open-dialogue” therapy for schizophrenia in Finland, based on such strategies, showed that after five years 83 percent of patients could return to work and were not on disability.

The centerpiece of Whitaker's argument is that a truly evidence-based model of psychiatric care would focus not just on the alleviation of symptoms but on long-term outcomes. Are the great majority of people who receive treatment for mental illness physically healthy, living in their own homes, socially engaged with others, and working? 

At a February symposium in Portland, a group of psychiatrists from around the country embarked on a project of rewriting current protocols for psychiatric drug use in cases of schizophrenia and depression. They began with the assumptions that not every person with a psychiatric diagnosis benefits from these medications and that not everyone who does need them should take them for extended periods. Criteria for selective use would be specified, as would tapering protocols.

New protocols have not yet appeared, but Whitaker told an audience of residents in psychiatry at the UW Medical Center last week that the goal was to place psychosocial care at the center of treatment, not off to the side. Medications would be resorted to only as needed, and a first episode of psychosis would not automatically be met with antipsychotics.

The result, he said, would be psychiatric care more closely matched to a person's psychiatric condition. Physicians would discern which patients could do without drugs entirely, which ones needed them temporarily, and which ones might need them permanently. The outcome would probably resemble the one-third, one-third, one-third split of the 1956 NIMH study.

It would be a challenge, of course, to assign individuals to the right groups. As Whitaker made his last point, one psychiatrist in the UW audience was heard to mutter, “Flip a coin.”

And even though psychosocial programs save money in the long run, they're expensive in the short term — especially any that would safely house a nonviolent person with schizophrenia for six weeks or more in order to provide her with wraparound care and supervision while any drugs that became necessary were administered, monitored, and then tapered off.

Physicians also worry about sowing public doubts about the safety and efficacy of psychiatric drugs, which can lead individuals who need the medications to refuse them. One psychiatrist, though he hadn't read Whitaker's book, said he considered the author "an irresponsible idiot."

Of course, anecdotal and subjective doubts about the meds are already out there. Some have been aired onstage in Seattle. Elizabeth Kenny, in her monologue Sick, describes how antipsychotic medications, piled on one after another, rendered her almost homicidal. Next to Normal centers on a woman with bipolar disorder whose psychiatric treatment makes her sicker. Whitaker's book contains several arresting, disturbing, moving anecdotes of this kind.

Doubts haunt professionals in the mental-health care field, too. Debra Morrison, program manager at a large behavioral-health services clinic in Seattle, told me how her two sons, both diagnosed with bipolar disorder, struggled with the side effects of medications and finally started flushing them down the toilet. Now, after the passage of several years, she said, "my kids are unmedicated and doing well. The chemical imbalance theory was a great comfort" at first, she observed. "It took a great deal of the fear out of my situation. At the same time, it's overly simplistic. The idea that it diminishes stigma — I don't know if that's been true. It's created a different kind of distortion."

Morrison, who hosted Whitaker at the UW last week, said she was talking to me "mostly as a mother." But she also expressed delight that her counterparts at the UW "are open to the concept that we don't need to use so many drugs every single time."

It's possible to "do it as a last resort, and not think of it as a lifetime thing," Morrison said. "I'm not a purist, not saying it's evil, just saying 'Let's do some real science.' There's some junk science out there being manipulated and perpetrated on us by people making money from it, which isn't exclusive to this field. The infiltration of our university by corporate interests is in a lot of other venues as well." Even so, she said, "my colleagues at the UW are doing wonderful work that's not drug-company sponsored."

On the other hand, of course, there's anecdotal, subjective support for the extended use of these medications — both in the scare stories that make the headlines and in the far more numerous success stories that don't.

Clearly, what's needed is what Whitaker and some medical professionals would like to see: good epidemiological studies based on long-term follow-up of large, randomized groups of patients given different models of care. However, funding for such expensive, ambitious studies has to come from deep treasuries like those of pharmaceutical companies (which have reason not to support research that might show their products to be less effective and more hazardous than advertised) or the federal government (heavily influenced by Big Pharma lobbyists).

Still, the story Whitaker wants to see gain public and professional traction someday soon would be positive, even if it didn't match the false optimism of the one that's so popular now. It would go something like this:

After half a century during which different psychiatric medicines proliferated and their use skyrocketed, doctors grew more artful and judicious in employing them. Prescription protocols became more firmly grounded “in science, not advertisements” (as Whitaker put it at the UW). Effective psychosocial treatments for mental illness grew and flourished, and evidence-based practices defined the appropriate, limited use of particular medications in particular cases.

As a result, physicians were able to return the vast majority of people with mental illnesses to normal, productive lives in the community.

It’s a great story.
