The Outbreak Narrative: What has changed this time around?

EDITOR’S NOTE: Points is delighted to welcome past guest contributor, Jessica Diller Kovler (check out her previous post here). Kovler is part of the History of Science program at Harvard University and currently teaches at John Jay College of Criminal Justice, the City University of New York. Her work has appeared in The New York Times, Forbes, and Discover magazines. 

Unless you’ve had your head buried in the sand for the past month, you’ve undoubtedly thought about the recent Ebola outbreak. Even if you have a background in public health, you would probably avoid the New York bowling alley visited by Dr. Craig Spencer (even though the City shut it down the day the news of his illness hit the papers). You’re probably using extra Purell, too, even though we’re relatively knowledgeable about the pathogen’s mode of transmission.

News reporters have scrambled to identify patient zero. Even our most liberal friends are arguing for shutting down the borders. We are assigning blame and looking for answers.

Bloomberg Businessweek, September 24, 2014

As my grandfather would ask at our Passover Seder: “Manishtana?” (What has changed?) As a social historian, I wonder what makes the societal response to Ebola any different from our collective response to the Black Death, typhoid, polio, and HIV. In the past few weeks, people have compared the response to Ebola to the first cholera pandemic of the early 19th century, the 1918 Spanish Flu epidemic, the polio epidemics of the first half of the 20th century, and AIDS in the early 1980s. Perhaps, as some have argued, there is a formulaic narrative in how we respond to outbreaks of disease. But does this narrative also apply to epidemics involving alcohol abuse (or, in the case of the disease I’m about to describe, suspected alcohol abuse)?

From 1915 to 1927, a mysterious illness befell millions worldwide. Its symptoms were wide-ranging—no two patients presented exactly the same—and the illness left many of its survivors in a catatonic, semi-conscious state. Those who “awakened” were left with Parkinsonism, psychiatric sequelae, and severe behavioral disturbances. Almost as quickly as Encephalitis Lethargica appeared in 1915, it seemingly vanished 12 years later. Thousands around the world, however, lived long past 1927, imprisoned—some for decades—in their own bodies. The lack of attention to this disorder beyond its peak has, in recent years, earned the disease the moniker “The Forgotten Epidemic.” (Perhaps you’ve heard of the disease thanks to the 1990 Oscar-nominated film, Awakenings, starring Robin Williams and Robert De Niro, based on the work of Oliver Sacks.)

Yet the history of Encephalitis Lethargica is more than the tale of a forgotten epidemic. It is an illness narrative evoking shifting socio-medical paradigms in the second half of the 20th century that is uniquely tied to the sociomedical response to alcoholism.


The Islands of New York City: How a Real Estate Boom is Turning Former Homes of Crime and Contagion Into Boho-Chic Living—Except for One Tiny Island Off the Bronx (Guest Post)

Editor’s Note: Today’s guest post is a modified excerpt from Jessica Diller Kovler’s upcoming book, The Boys of the Bronx, to be published in 2015. Kovler is part of the History of Science program at Harvard University and currently teaches at John Jay College of Criminal Justice, the City University of New York. Her work has appeared in The New York Times, Forbes, and Discover magazines.

In my city—which, as you may have heard, doesn’t sleep—some nonetheless lethargic neighborhoods have had an awakening of sorts. Many New Yorkers are forgoing the bustling city centers for the far-flung shores of Manhattan as well as some of the city’s 41 adjacent islands, neighborhoods previously considered “The Devil’s Stepping Stones.” (Legend has it that indigenous New Yorkers chased the Devil across the waters of New York, and every time the Devil stepped down on the water, an island was born.) These areas were so removed from the grid that they were used to house the city’s derelict, destitute, profligate, and banished—drug addicts, criminals and those deemed too mentally or physically ill, or even too dangerous to live in “mainland” New York City.

Take Roosevelt Island, where Nellie Bly penned her work on the infamous Women’s Lunatic Asylum; that island is now home to luxury rentals, with Cornell University planning an extension campus for 2017. Randall’s Island and Wards Island, once home to cemeteries, asylums, and contagion hospitals, now host Little League games and the Electric Zoo festival.

Amidst this transformation, one island has been forgotten, though thousands of New Yorkers have (reluctantly) called it home. The last inhabitants of North Brother Island comprise a lost chapter in the story of urban institutionalization, a faded memory of a city grappling with a perceived epidemic of both juvenile delinquency and adolescent narcotics addiction. Now abandoned, its buildings fading behind overgrowth, the island nonetheless reveals why New York institutionalized drug-addicted teenagers, even as a nationwide movement towards deinstitutionalization was beginning to gain momentum.


Presenting Terada Shin: The Life History of a Female Drug User in Prewar Japan

Editor’s Note: Today, Points features a guest post by Miriam Kingsberg, an assistant professor of history at the University of Colorado at Boulder and author of Moral Nation: Modern Japan and Narcotics in Global History (University of California Press, 2013). You can read the Points interview about the book here.

For historians of drugs, user perspectives are often frustratingly difficult to capture. Narcotics consumers generally leave behind few records in their own voice, forcing scholars to rely on the (frequently biased) perceptions of those who come into contact with them: law enforcement, doctors, social scientists, policymakers, etc. In the course of my research on narcotics in Japan and its empire from the 1850s through the 1950s, each of these groups provided critical information. My search for user-authored narratives, however, proved fruitless until virtually the last moment. In 2011, as I was preparing the penultimate draft of my book manuscript, I learned that a collection of documents, formerly inaccessible to scholars due to their poor condition, had been digitized and made available by the National Diet Library in Tokyo. To my delight, I found materials on the Drug Addiction Relief Association [Mayaku Kyūgokai], founded in 1933 as Japan’s first domestic facility for treating narcotics dependence. These sources not only enhanced my understanding of the history of addiction medicine, but also included about twenty life stories by patients, as recorded by doctors at the clinic in the mid-1930s.

Terada Shin (right) with Y. Masa (a fellow patient at the Narcotic Addiction Relief Association)


The Long, Proud Tradition of the Fourth of July Buzzkill

Celebratory drinking has fueled Fourth of July festivity from its inception in the years following 1776, when double rum-rations for the troops, endless toasts at formal dinners, and makeshift booze-stalls at public gatherings became norms. And it was not long before high-minded patriots began to worry over the excesses of republican revelry. Before the Fourth of July oration itself became well established, there emerged within and alongside it a recognizable (if unnamed) theme in Independence Day rhetoric: the identification of that very day’s public drunkenness with whatever was ailing the republic.

All was not well in 1837.

Over the years, Independence Day jeremiads have taken numerous forms, from grim warnings about public health and morals, to wry satire of overzealous exceptionalism, to the ferocious indictment of national shortcomings. Many have focused on intoxication as the essential expression of decay, of hypocrisy, even of delusion.

Complaints begin with the sheer recklessness of the traditional program of events.

Notes from the Field as Massachusetts Does Medical Marijuana

Editor’s note: Today, guest blogger and medical anthropologist Kim Sue offers her observations on how changing marijuana laws have slowly begun to affect the world of the opiate-addicted patients she studies, and the wider society’s assumptions about drugs and the reasons people use them.

I have been closely following the campaign for and roll-out of medical marijuana in Massachusetts as I conduct ongoing ethnographic fieldwork on opiate use and incarceration. Given marijuana’s prominent place in the historical, political, and cultural framings of the War on Drugs, it is critical to consider evolving legal frameworks and cultural attitudes toward the drug.


Last fall, advocates for medical marijuana managed to get it enacted via referendum.

Setting the Record Straight: Part 1

Editor’s Note: Points is pleased to introduce a new guest blogger today. Marcus Chatfield is currently writing a book about coercive therapy in the “troubled-teen industry,” based on research he has conducted as a student at Goddard College. A client of Straight, Incorporated from 1985 to 1987, he is associate producer of the upcoming documentary film, Surviving Straight Inc. Marcus’s five-part weekly series for Points focuses on the research that enabled this program to win the trust of families, media, and high-ranking officials during its operations in nine states between 1976 and 1993.

“The problem, of course is that Straight really does not know what happens to a good many of its graduates. And it will be criticized for this in the future.” Andrew I. and Barbara E. Malcolm, report to the White House drug czar, 1981.

The building that housed the Straight Inc. program in Springfield, Virginia.

Straight Incorporated is one of the most infamous adolescent treatment programs in the history of America’s War on Drugs. Straight was an intervention and prevention program, established in 1976 with a federal grant from the Law Enforcement Assistance Administration (LEAA). The LEAA funded hundreds of behavior-modification programs in America, and many of them were found to be dramatically unethical. The coercive methods used at Straight were not only ineffective but quite harmful for a large percentage of clients. This essay is a critical examination of an article published in 1989 in the Journal of Substance Abuse Treatment (JSAT), entitled “Outcome of a Unique Youth Drug Abuse Program: A Follow-up Study of Clients of Straight Inc.” Authors Alfred S. Friedman, Richard Schwartz, and Arlene Utada claimed that Straight was highly effective at reducing drug use and that 70% of the former clients from the Springfield, Virginia facility were “satisfied” with their treatment. Program executives presented this statistic to parents and the media as scientific proof that Straight worked.


“Generational forgetting”: A year-end reflection

As 2012 comes to a close, there are a few drug- and alcohol-related stories I’d like to forget. But forgetting isn’t always the best way to cope with the unpleasant repercussions of US drug policy. For several generations, social psychologist Lloyd Johnston’s statistics have quantified the adage that those who cannot remember the past are doomed to experiment with bath salts (more on that in a minute).

In 1975, Johnston and his colleagues at the University of Michigan began conducting the nationwide survey Monitoring the Future. By pairing its results with the National Household Survey on Drug Abuse (initiated in 1971), we have been able to get a fairly accurate annual look at drug use prevalence for almost four decades. Both surveys were inspired, in part, by the increase in youthful experimentation with psychoactive substances (especially marijuana) in the 1970s. While substance use trends come and go, academic interest in youthful drug use has remained stable.

The good thing about studying high schoolers: We get older, they stay the same age

In examining trends in drug use over the course of several decades, Johnston and his colleagues noticed a pattern.

Weekend Reads: Micro Edition

Early October is a special time on the college calendar. Undergrads grit their teeth in anticipation of mid-term exams, the Seminoles experience their yearly swoon, and frosh throughout the nation finally realize – not a moment too soon – that laundry machines exist for a reason. The most predictable of early autumn college rituals, however, may be the annual media panic over alcohol abuse.

This year, “butt-chugging” has titillated the media. University of Tennessee student Alexander Broughton has become something of a minor celebrity, having given himself alcohol poisoning – a .45 blood-alcohol content upon his arrival at the University of Tennessee Medical Center – through…*ahem*…anal infusion. Mr. Broughton vehemently denies giving himself an alcohol enema. According to recent reports, he plans to sue someone because, you see, he’s a Christian, and being accused of “butt-chugging” implies that he’s gay. Or something. For their part, police are skeptical, having found a plastic bag of wine (a rosé, for the record) beside pools of Broughton’s blood in the frat’s bathroom.

Does this story of alcohol poisoning, self-abuse, suspended fraternities, thinly-veiled homophobia, and frivolous lawsuits really constitute news? Probably not. You probably couldn’t find a better slice of 2012 fraternity life, though.

Florida’s Cannabis Cannibal? Zombies, Bath Salts, Marijuana, and Reefer Madness 2.0

Editor’s Note: In a “ripped from the headlines” post, guest blogger Adam Rathge historicizes the recent episode of the Florida face-eater, drawing parallels between the contemporary panic over bath salts and 1930s-era alarm over “reefer madness.” A PhD candidate in History at Boston College, Adam is at work on a dissertation entitled “The Origins of Marijuana Prohibition, 1870-1937.”

On May 26, a 31-year-old man named Rudy Eugene tore off the clothes of a homeless man under a highway in Miami, Florida and then ripped off parts of the victim’s face with his teeth. According to witnesses, when a uniformed police officer shouted at him to stop, the attacker allegedly looked up, growled, and then “kept eating the other guy away.” The officer then shot Eugene at least five times, and he fell dead at the scene. The damage, however, was done. Much of the victim’s face was gone: his skin ripped away, his nose bitten, and his eyes gouged. Given the horrific nature of the event, news reports immediately spread around the country, with most observers wondering the same thing: what could possibly make a person eat someone’s face? The answer, of course, was drugs.

The Crime Scene

Reports from the Miami Herald show that local police initially theorized the attacker might have been suffering from “cocaine psychosis,” a drug-induced craze that bakes the body internally and often leads the user to strip naked to try to cool off. Speculation on the internet suggested it was a mix of hard drugs, or perhaps a bad batch of LSD, or even the beginning of a Zombie apocalypse. Just days later, all of these theories were put to rest when the head of the Miami police union publicly speculated that Eugene was actually on “bath salts” – a range of synthetic stimulants that mimic the effects of marijuana, cocaine, and other illegal substances. Though bath salts had previously been blamed for psychotic episodes and wild hallucinations in other cases around the country, the gruesome nature of the face-eating case helped fuel growing fear and hysteria over these legally available substances. Instantly, and for weeks after, police speculation about Eugene’s alleged use of bath salts became fact, firmly solidified in the national media as other shocking stories about this new and horrific drug emerged.

The Presumed Precipitator

To the general public much of this information undoubtedly seemed as novel as it was shocking. As a drug historian watching this story unfold, the developing narrative actually sounded pretty familiar: a sensational story, a new more-dangerous-than-ever-before drug, and law enforcement officials clamoring for strict laws to combat it, all combining to drive a nationwide panic.

Depression Depressants: Why Are We Drinking So Much?

Over the last few weeks, the Anglo-Atlantic world has engaged in a slight moral panic regarding drinking. True, our concern has yet to extend to Swine Flu or Killer Africanized Honey Bee levels. You may, in fact, not even know we’re in the midst of a moral panic. It does seem, however, that alcohol is, both literally and figuratively, on everyone’s lips.

Dean Martin, Iconic Drinker (photo courtesy of Alternative Reel)

In the United States, the Centers for Disease Control and Prevention recently estimated that one in six Americans regularly binge drinks. In Canada, researchers at Dalhousie University concluded that, far from serving as beacons of moral support, our loved ones may be driving us to the bottle. Meanwhile, the British Conservative Party, an entity once led by the prodigious tippler Winston Churchill, is now asking the British public to dry up just twice a week.

One can only speculate as to why people on both sides of the Atlantic are re-acquainting themselves with John Daniels. Perhaps this upsurge in alcohol consumption is the manifestation of recession-related ennui. Maybe the issue is cultural, as the stigma of consuming “bargain-priced” alcohol is diminishing, giving buyers license to drink in greater quantities. Maybe it’s something else completely. Whatever the explanation, don’t be surprised by a drinking song and dirty limerick renaissance in the near future.