
Five Questions with Christian Warren, author of “Starved for Light”


Rickets, a childhood disorder that causes soft and misshapen bones, transformed from an ancient but infrequent threat into a common scourge during the Industrial Revolution, as malnutrition and insufficient exposure to sunlight led to severe cases across Europe and the United States. By the late 1800s, it was one of the most common pediatric diseases, seemingly an intractable consequence of modern life.

In Starved for Light, Christian Warren offers the first comprehensive history of this disorder. Tracing the efforts to understand, prevent, and treat rickets—first with the traditional remedy of cod liver oil, then with the application of a breakthrough corrective, industrially produced vitamin D supplements—Warren places the disease at the center of a riveting medical history, one alert to the ways society shapes our views on illness. He shows how physicians and public health advocates in the United States turned their attention to rickets among urban African Americans and southern European immigrants; some concluded that the disease was linked to race, while others blamed poverty, sunless buildings, and city life, or cultural preferences in diet and clothing. Spotlighting rickets’ role in a series of medical developments, Warren leads readers through the encroachment on midwifery by male obstetricians, the development of pediatric orthopedic devices and surgeries, early twentieth-century research into vitamin D, appalling clinical experiments on young children testing its potential, and the eventual commercialization of all manner of vitamin D supplements.

Read on for a Q&A with Warren about Starved for Light.


Contemporary readers may not be familiar with rickets. What is the disease, and why did it rise so dramatically when, and where, it did?

A reasonable working definition of rickets is that it’s a disease that softens and weakens children’s bones, often producing bowed legs and other deformities. Building strong bones requires calcium and phosphorus; vitamin D is the substance (a prohormone if you’re keeping score) that facilitates and regulates the absorption of those bone-building minerals. Rickets can develop because of inadequate calcium or phosphorus, but the most common cause is a lack of vitamin D.

The growth of dark and smoky cities from the 1600s onward produced a dramatic rise in rickets. With two sources of vitamin D, diet and sunshine, rickets was only likely to grow to epidemic proportions where neither source was adequate; and since vitamin D is rarely found naturally in foods, overcrowded housing and air pollution made cities steady incubators of rickets. And in cities and countryside alike, malnourishment deprived many poor or enslaved children of the mineral building blocks needed to build strong bones, no matter how much sun they took in. By the opening of the twentieth century, rickets was one of the most common childhood diseases physicians encountered.

People of all races and income levels were affected by rickets, but some public health officials linked the disease to race and poverty. How did their assumptions determine their medical and public health decisions?

While income and race are not iron-clad predictors of whether a child will contract rickets (for example, the rickety children studied by physicians in the first published reports in the 1600s were from middling or well-to-do families), poor children were more likely to suffer malnutrition and live in overcrowded, dark housing, putting them at far greater risk of rickets. No surprise, then, that medical and social observers in the peak years of rickets, the late nineteenth and early twentieth centuries, linked the disease and the social environment of its victims. As often happens, this link meant that the attention paid to remediating rickets rode on social attitudes about the victims’ class or racial status. In the Progressive Era, this meant scientists, physicians, and public health professionals invested tremendous resources in investigating its causes, preventives, and cures. Later in the century—when science brought what looked like a cure, in the form of potent manufactured vitamin D added to most dairy products in the U.S.—scientific focus turned to more obscure forms of rickets. The remaining low thrum of new cases was accepted as inevitable and intractable, a sad remnant of a bygone pandemic, now limited to “only” the poor and non-white population. With this sense of victory, medical science all but ignored the time-delayed impact on children who had contracted rickets in the years before the cure, whose distorted pelvises remained into the years after it, threatening their chances for safe childbirth.

How did rickets contribute to the rise of modern obstetrics?

The rising incidence of rickets in the 1600s, which observers of the era were keenly aware of, came at a time when physicians in Europe were energetically professionalizing. Because rickets often deformed girls’ pelvises, its growing prevalence produced a corresponding rise in difficult births, a golden opportunity for male medical practitioners, equipped with the newly invented obstetrical forceps and knowledge of the new field of pelvimetry, to step more boldly into the birthing chamber, previously the domain of lay and professional female midwives.

No matter how skilled these male physicians were, an obstructed pelvis could still have serious consequences. Whatever the outcome, prolonged labor or the use (perhaps more often the misuse) of forceps could produce one of the most troublesome obstetrical complications: tears between the birth canal and the bladder or rectum. Repairing these vesico-vaginal fistulas remained impossible until the mid-nineteenth century, when an Alabama doctor, through dozens of experimental surgeries on enslaved women, developed a successful surgical repair. In cases where birth was impossible, the only option to save the mother involved delivering a dead fetus. Safe cesarean section was not an option before the twentieth century, but the high incidence of rickets-related pelvic obstruction provided powerful incentives to test the limits of safety, and with each year, c-sections became safer, setting America on its course to today, when close to one-third of all births are by abdominal surgery.

Scientists eventually identified vitamin D as an antidote to rickets. What was the role of animal and human experimentation in the early study of vitamins and food fortification?

The search for vitamins—whose absence from the body produced disease—often resembled the great modern search for the microbes whose presence in the body produced disease. One of the strongest parallels between the microbe hunters and the vitamin hunters was their reliance on both animal and human subject testing. For rickets, these studies can be arrayed along an ethical continuum running from more-or-less benign “natural experiments,” in which researchers introduced a therapeutic measure in a setting where rickets was already a debilitating problem, to a range of experiments performed on animals or humans, including institutionalized infants. Starved for Light recounts examples along this downward spiral, including largely redundant studies inducing rickets in healthy infants to test new versions of known, potent anti-rickets medications.

Today, manufacturers fortify dairy products with vitamin D as a means of preventing malnutrition and rickets. Why dairy, and not another food source or product?

Most vitamin D supplements today get their “D” from animal proteins fortified by the Steenbock Process, patented in 1928 by Harry Steenbock at the University of Wisconsin. Cynics might suspect that Steenbock’s home in the Dairy State influenced the decision to make milk America’s “drug delivery” vehicle for vitamin D. They would be only partly correct. Milk was a sensible choice. As one researcher put it, milk was “an automatic method” for universal fortification, “since every child is fed on milk” and the vitamin D would be a fitting adjuvant to “the calcium and phosphorus elements in milk.” We can note that “every child” raises an irony, or perhaps an unchecked racial presumption, in the decision to fortify milk to prevent a childhood disease that most experts assumed affected non-white children, the children most likely to have difficulty digesting milk! Of course, when America started fortifying milk with vitamin D in the 1930s, lactose intolerance was not recognized as a health issue.

By the way, other countries followed different paths toward supplementation: in the United Kingdom, margarine (but not butter) was fortified, the assumption being that margarine was the spread preferred by Britain’s poor.

Author photo by Janis Warren

Christian Warren is professor of history at Brooklyn College. He is the author of Brush with Death: A Social History of Lead Poisoning.

Starved for Light is available now from our website or your favorite bookseller. Use the code UCPNEW at checkout to take 30% off when you order directly from us.