Two “worst fears” have been voiced by those who have been decrying (and have now managed to defeat) proposed changes to the DSM-5, the fifth edition of psychiatry’s Diagnostic and Statistical Manual of Mental Disorders. The proposed revisions would have lowered the diagnostic thresholds for certain conditions, including psychosis, making it possible for teenagers to be treated earlier.
The first was that lowering thresholds and speeding up timelines for treating psychological disorders as diverse as elder depression and adolescent psychosis would stigmatize these new patients. They would then become victims of fear and discrimination from family, friends, coworkers, and, apparently, complete strangers, as the comments section of any post on mental illness demonstrates all too well. The second fear most often articulated is that, for all their trouble, these same consumers would receive poor treatment for what ails them; read: over-medication.
Given the psychiatric profession’s history of incestuous dealings with pharmaceutical companies, and its track record of coercive, sometimes appallingly bad treatment of patients in the asylum era and beyond, psychiatrists as a class have earned little trust from the people they serve. According to a New York Times report, barely 15% of psychiatrists offer patients direct treatment beyond prescribing and managing drugs. Psychotherapy, when consumers can find a way to get reluctant insurers to cover it, is now dispensed by psychologists, social workers and other licensed counselors. Nor have psychiatrists made much, if any, effort to communicate with the public about what they do and why; that is, until now.
Ironically, those who have led the charge to stop the DSM-5 revisions include many from the old guard of psychiatry, led by the editor of the previous edition of the handbook, the DSM-IV. This contingent would like us to accept their position unquestioningly, since they are the experts, when they express concern about doctors (presumably not themselves) handing out too many psychiatric meds to people who don’t really need them. There’s something fishy here. It’s almost as if some in the psychiatric profession have a vested interest in keeping the “truly” mentally ill isolated from the rest of us, because that is the net effect of reserving mental health care for only those who, in the case of schizophrenia, for example, have demonstrated symptoms such as hallucinations and extreme paranoia for a minimum of six months. From my perspective as a parent, it also strikes a potentially fatal blow against secondary prevention, since early intervention for symptoms of youth psychosis has shown that it’s possible to avert more severe disorders such as full-blown schizophrenia.
[Note: I write about my son’s experiences with early intervention for psychosis in my book, A Lethal Inheritance.]
The leitmotif of this increasingly effective anti-DSM-5 advocacy campaign is the charge that by lowering thresholds for mental disorders (that is, by allowing individuals, including children and adolescents, to receive treatment earlier in the disease process for disorders such as schizophrenia or ADHD), the profession would be “medicalizing normality.”
In this essay, I turn this argument on its head and ask: what if “normalcy” were re-envisioned as a broad spectrum, with mild mental illness at one end and optimum mental wellness at the other? And what if psychiatry as a field were to get its act together and formalize staged treatment protocols for mental disorders, such as early-onset psychosis, which is already treated under a strict four-stage protocol in places such as Australia and Canada, much as medicine has developed guidelines for treating stage one through stage four cancers?
The Train Has Left the Station
A minority of mental health professionals and researchers, along with a larger number of laypersons, is rethinking mental illness precisely along these lines. Many, especially those identifying with the “recovery movement,” are choosing to view mental disorders as in no way separate from the physical kind. Nor do they view such disturbances as necessarily abnormal. The word “differences” is often favored.
Within the research community, this re-imagining of mental health as a wider spectrum measuring relative mind/body wellness was partly articulated in the concluding comments of a little-noted March 2011 study in the Journal of the American Academy of Child and Adolescent Psychiatry. First, the authors (respected psychiatric researchers from Duke University) noted that while only one in five young people in an Appalachian population sample met criteria for a DSM disorder at any given time, most of them would meet those criteria by young adulthood. They concluded by stating what few of their colleagues would dare: “As with other medical illness, psychiatric illness is a nearly universal experience.” (Italics mine.)
In saying that most of us are likely to face a mental health problem at one or more points in our lives, the authors are asking us to consider mental and emotional health the same way we do the bodily kind. Following from this premise, mental disturbances, like physical illness, result from hereditary and environmental factors, and occur along a symptomatic spectrum from mild to severe. This is no different from, say, high blood pressure, which at its mildest usually requires taking a medication and making lifestyle changes, but if left untreated can result in death.
An apt comparison may be bipolar disorder. Mild or moderate symptoms can include major depression interrupted by manic episodes of sleeplessness and touches of grandiosity. More serious symptoms include extreme manic behaviors such as overspending, sexual promiscuity, paranoia and hallucinations, until the individual collapses into a seemingly bottomless depression, at which point the risk of death by suicide runs as high as 20%, especially among the young.
How We Got Here
The fact that the health of our cognitive and emotional functions ever occupied anything but an equal footing to physical health can be attributed to two major historical factors, in my view: First, having had religion or any moral code serve as keeper in chief of right behavior was probably never a good starting place for the scientific method. A second obstacle has been the much-critiqued absence of biomarkers for diseases of the brain — no blood test, thermometer reading or rash to provide material evidence of a “real” illness. On this score, there is some real progress to report from the world’s neuroimaging labs.
Dogged neuroscientists with powerful MRI scanners are filling scientific journals with reports detailing structural distortions of the brain resulting from disorders originating there: images of shrunken amygdalae in the brains of schizophrenics, and thinning cortex in the depressed, and even in young people who simply carry a genetic risk for depression but show no symptoms. However, unless and until these imaging tools become available in frontline mental health offices, we are left only with the vagaries of human behavior as a lens through which to recognize and heal mental illness.
Meanwhile, according to the National Institute of Mental Health, far too many would-be patients, especially those under the age of 18, are receiving no treatment at all for their mental health problems, despite the fact that roughly half of all lifetime mental disorder cases start before age 14. As of 2010, NIMH reported that mental illnesses, everything from garden-variety anxiety and depression to bipolar disorder and schizophrenia, are still woefully under-treated in the U.S., with only 60% of adults and about 40% of children and adolescents in need receiving adequate mental health care. Most of the mental health care children receive is provided by pediatricians and other primary care doctors. According to the American Psychological Association, 70% of psychiatric medications today are prescribed by general practitioners, who have medical and pharmacological training but may have limited training in psychiatry and psychopharmacology. Many blame this reality, along with insurance companies that reimburse for medication but not psychotherapy, for the over-prescribing of psychotropic medications to minors.
The Implications of Changing Definitions of Mental Illness
After decades spent developing the technology and expertise to search for the genetic disease pathways of major mental disorders, one decidedly low-tech factor is still the most useful in determining someone’s risk for mental illness and in both diagnosing and treating a disorder: a full knowledge of an individual’s family mental health history. In science-speak, family history is the “best predictor” of a mental disorder. Unfortunately, this proven tool is underutilized, another consequence of stigma.
Family study researcher Terrie Moffitt recently wrote in the foreword to my book A Lethal Inheritance:
“Family history is an essential part of every cardiologist’s interview of every patient. But contrast this with practice in psychiatry. Although family history could be one of the most effective tools in mental-health care, it is often asked in only the most cursory way, and it is seldom used seriously by clinicians to help patients and families understand their risk situation.”
Moffitt went on to say that knowledge of a child’s family history can spell the difference between “treat now” or “wait and see.”
So why is family mental health history given short shrift in assessment and treatment? Some researchers and practitioners complain that the historical information they get from patients about other family members is often incomplete and unreliable; they say embarrassment surrounding the presence of mental illness in a family tree makes getting more accurate information difficult. Diagnosis too often becomes detective work, forcing practitioners to root around in a family’s “dirty laundry,” parsing stories of dark moods, benders, violent tempers, and the absences of shunned, jailed or missing relatives who may have had a diagnosable mental illness but never reached a clinic or doctor to find out or receive treatment for what ailed them. However, in sacrificing this potentially valuable data to the avoidance of embarrassment, treatment providers and consumers are allowing stigma to defeat both science and common sense. By openly inviting and jointly pursuing family history in the diagnostic process, practitioners are in effect granting their “patients” the status of partners in the longer-term processes of treatment and recovery.
Closing the Gap between Research and Available Treatment
Columbia University psychiatric epidemiologist Myrna Weissman told me in a 2009 interview that she sees mental health consumers in the United States facing two big problems. “Beyond the fragmented state of our service delivery system,” Weissman began, “there is an unacceptable gap between research and training.” Translated into everyday consumer reality, this means that even if you get past your primary care gatekeeper to see a mental health specialist, you may not get evidence-based treatment, because research findings are slow to enter practice. The under-utilization of family history and the lack of access to early intervention for childhood and adolescent disorders are two vivid examples of this broken pipeline.
By recognizing the role of family history and by acknowledging at-risk diagnoses, we might also take a few steps forward as a culture in normalizing mental illness. The point is not to permit or encourage the popping of psychotropic pills at the slightest indication of trouble, nor to keep rambunctious kids in their seats at school, nor to line the pockets of psychiatrists and “big pharma.” The real payoff from recasting mental illness as a spectrum of possible states of mind and body to which we’re all subject will come from reducing suffering and regaining some of the human potential now needlessly lost to diseases that no longer have to be chronic or debilitating. Secondary prevention, meaning averting illness in someone who’s at risk, then becomes possible through concerted neuroprotective actions that support healthy minds, not necessarily medications. By intervening early, practitioners can usually begin with less onerous treatments, such as parent education and family therapy, and keep childhood mental disorders from becoming adult disorders.
If taken to its logical conclusion, the argument that mental disorders are universal would call for a wholesale abandonment of the concept and language of “mental illness” as it’s been traditionally used. Not because mental illness doesn’t exist; rather, because of its normalcy as one dimension of the human experience. Neither the “physically ill” nor the “mentally ill” are served by continuing the false separation between the body and the mind in health or sickness. Neurons, the cells that connect thought and behavior, long thought to reside only in the brain, are now understood to also be located in the human stomach (those “gut feelings”) and heart.
In this new paradigm, there is simply illness and wellness, with vast implications for medicine and public policy if this so-called integrative approach to health were to replace our current dualistic stance. In fact, the Affordable Care Act, passed by Congress in 2010, calls for the first steps toward integrating primary care and behavioral (mental health) care, including care for substance use disorders. It suggests a sort of one-stop shopping where, for example, the alcohol dependence and underlying depression that are aggravating your heart or liver problem would be treated by the same team of wellness practitioners, all trained in a prevention model, so that each of these conditions would be addressed far earlier than is typical in our current disease model of health care. The priority given to early intervention and prevention also necessarily elevates the “patient” to a partnership on a par with the practitioners.
Unfortunately, in early 2012 there are signs that this common-sense commitment to prevention may fall victim to other federal budget battles; according to a new report by the Robert Wood Johnson Foundation, a third of the funds set aside for prevention and public health under the Affordable Care Act have been pilfered to fund other priorities, including long-term unemployment insurance and the payroll tax cut. Even if and when these provisions of health reform are fully implemented, it may take much longer for attitudes in the attendant professions, industries and the culture to catch up.
And today we find that we are going backwards, to a time when symptoms hiding in plain sight will be ignored until they are severe enough to fit the industry’s tidy definition of the extremely mentally ill.