Why America's Public Health Has Always Been Fragmented

The nation has enjoyed public health triumphs, with life expectancy far higher than it was a century ago. But responsibility for health has always been scattered, with disease tracking less a priority than treating individuals.

Vaccinating homeless tenants at New York City's Municipal Lodging House in 1910. (Shutterstock)
When the 20th century began, the leading causes of death in the United States were all infectious diseases — pneumonia, influenza, tuberculosis, gastrointestinal infections and diphtheria. No one worries much about diphtheria anymore. Or cholera or yellow fever or other infectious diseases that triggered terror and caused death throughout the 19th century. 

Improvements in sanitation, antibiotics and vaccinations have all limited the spread of infectious diseases. As a result, life expectancy at birth has increased in the U.S. by more than three full decades of life — from 47.3 years in 1900 to more than 78 today. 

Now, the leading causes of death are all non-communicable problems such as heart disease, cancer and accidents. That is, they were, until COVID-19 became the country’s leading killer.

To some extent, public health has been a victim of its own success. Over the past century, the focus of American health care has shifted from public health — which is concerned with infectious diseases and the overall well-being of the community — to treating individuals. Nearly all the health policy energy is devoted to debating questions about how many individuals should be covered and who should pay the bill. 

As a result, public health is always a low priority, until it’s the highest possible priority. Public health is lucky to receive a penny or two out of every dollar spent on health care. Before the novel coronavirus struck, funding had been cut in half over the past decade for both public health emergency preparedness and response programs at the federal Centers for Disease Control and Prevention (CDC) and the federal hospital preparedness program. 

Throughout American history, public health has been fragmented, with first local governments and later states playing a more active role than the federal government. After nearly every modern epidemic, panels of experts have called for Washington to play a more robust role. Those recommendations typically go nowhere.

The problem during the coronavirus crisis is not so much that the federal government hasn’t taken full charge, says John Auerbach, president of the Trust for America’s Health, a nonprofit that advocates for public health. There’s always been a “division of labor” between the levels of government in responding to public health emergencies. The problem is communication between those levels.

Typically, the federal government takes the lead in setting the medical and scientific strategy, leaving states and localities largely to carry out its plans while every level delivers consistent messages. Federal, state and local responses were well-coordinated following the Sept. 11, 2001, terrorist attacks, Hurricane Katrina, the H1N1 virus, Ebola and Zika, Auerbach says.

In response to COVID-19, however, the CDC has been virtually sidelined. Its daily briefings were canceled back in March. More recently, the White House vetoed a set of detailed guidelines that the CDC prepared for states, localities and businesses to follow in opening back up. In prior crises, you didn't have states competing against each other for resources.

“The gap has been less about having a completely centralized system,” Auerbach says. “Rather, the failing has been when we don’t have a well-coordinated system across the different levels, federal, state and local, with a clear understanding of the importance of each of these layers and a respect for meaningful involvement for all those layers in developing an approach.”

When All Children Were at Risk

Children born in 1850 had only a 50-50 chance of reaching their fifth birthday. As late as 1900, children under the age of 5 accounted for 30 percent of all deaths.

Today, children under 5 account for 0.9 percent of all deaths in the U.S. The difference is largely due to immunizations that have all but wiped out vaccine-preventable infections including smallpox, diphtheria, polio, typhoid and measles. 

There have been anti-vaccine movements at times throughout American history, but vaccination has represented a public health triumph. In 1905, the U.S. Supreme Court ruled in Jacobson v. Massachusetts that states could mandate vaccination when the broader public health was at stake.

New York City had created the nation’s first board of health in 1805, in response to a yellow fever outbreak. The city’s health agencies have not always been effective — for decades, they were controlled by the corrupt Tammany Hall machine — but its board of health provided a template that Boston, Chicago, New Orleans and dozens of other cities quickly copied.

Outbreaks in crowded, dirty cities were dealt with locally. “Not only was there no genuine federal leadership in public health in 19th century America, few states had laws or policies that extended to all of their counties and cities,” science journalist Laurie Garrett writes in Betrayal of Trust: The Collapse of Global Public Health.

At the dawn of the 20th century, the lived experience of coping with contagion led to wide support for public health. Germ theory made scientists more aware of the role of bacteria and viruses and gave them a firmer empirical basis for taking action such as draining swamps.

By 1902, scientists had proven that asymptomatic people could spread pathogens such as the bacterium that causes typhoid fever. New York City health authorities traced a cluster of typhoid cases to a cook named Mary Mallon, who refused to give up her trade despite being a laboratory-confirmed carrier. She was forcibly quarantined on an island and is remembered in folklore as Typhoid Mary.

The Scattered Federal Approach

During the early decades of the 20th century, uneven funding for public health contributed to local outbreaks of diseases such as tuberculosis and diphtheria. The federal approach to health, meanwhile, remained scattered, with responsibility spread among dozens of agencies and reshuffled numerous times.

In 1921, Congress passed the Sheppard-Towner Act to address maternal and childhood health. It put the U.S. Public Health Service — whose duties had long been limited largely to monitoring ports of entry — in charge of sending grant money to states, establishing the now-longstanding pattern of the federal government funding states to carry out Washington’s own health priorities.

During the 20th century, states, cities and counties grew increasingly dependent on federal funding to support their health budgets. This model ran into trouble when federal health spending was cut by about a fourth during Ronald Reagan’s first term in the 1980s, including the elimination of some public health programs. “The first question of most local governments is how much of the federal cuts will be offset by state funding increases. The answer given by most states — none — is not the answer that local governments want to hear,” a 1982 analysis by the University of North Carolina dryly noted.

As state and local governments struggled to balance their budgets in the wake of the Great Recession, public health was a popular place to cut. State and local health departments cut some 55,000 positions between 2008 and 2017. "We haven't invested in the workforce," says Chrissie Juliano, executive director of the Big Cities Health Coalition, a forum for the largest metropolitan health departments. "Even if everything else was perfect, we'd be behind the 8-ball on that."

And not all state or local health departments are created equal. Each operates under different authorities and laws. There’s a wide range of capacity both between states and within them. Some fund public health efforts generously and some have health departments in name only. Boston has a large, well-funded health department, for example, but not all its neighbors do.

“In Massachusetts, we have about 350 local communities,” says Auerbach, a former state health director. “Each one of those small towns has to have its own public health department. Many of those towns have a single person, or maybe two or three.”

Persistent Tensions

Garrett published Betrayal of Trust back in 2000, but her book makes it clear that many of the themes present in today’s debates — tensions between commerce and public health; individual liberties vs. the communal good; blaming or at least shunning poor and immigrant communities as they suffer from the spread of disease — have recurred throughout American history.

When New York City established its Board of Health in 1805, it was well-funded and enjoyed substantial legal authority. By 1819, its budget had been slashed by 94 percent and there were calls for its elimination. “The clash between New York’s wealthiest men of commerce and its civic authorities was a classic conflict between pursuit of short-term profit and prevention of longer-term threats to the populace,” Garrett writes.

Public health has a built-in public relations problem: when it’s successful, people don’t notice, and it’s always hard to prove a negative. A claim that a particular program or approach saved X number of lives never registers as viscerally as lives actually lost.

“How do you quantify somebody not getting sick?” asks Juliano. “As a field, we have to do a better job of talking about how critically important these things are to your community.” 

There’s also been an ironic but perennial tension between public health and health care. Over the course of the 20th century, treating individuals in hospitals and clinics became highly lucrative, shifting resources away from the sort of community treatment that was common during the public health field’s heyday. (In 1890, according to Garrett, a quarter of the health care in New York City was provided free at public dispensaries.) 

Specialized care such as surgery has long been more prestigious and lucrative than the kinds of medical careers central to public health, such as family practice, pediatrics and internal medicine. Meanwhile, demand grew for treatment of chronic conditions such as heart disease and cancer, as opposed to infectious diseases.

In short, there are many forces pushing against public health. Some of the field’s historical purviews, such as clean air and water, have shifted to environmental agencies that aren’t concerned primarily with health effects. Infectious diseases have become less of a concern than chronic conditions. 

The demands of market-based medicine have supplanted public health within the healing professions. Even when they combat the same diseases, public health and private medicine are concerned with different stages. An ounce of prevention may be worth a pound of cure, but prevention remains a lower priority than treatment.

All that, coupled with the balkanized approach governments take to public health, helps explain why there’s been less contemporary attention paid to issues such as disease tracking and monitoring, or to the underlying problems of community health, than there was during the dark days of infectious disease a century or more ago. Public health has long suffered through boom-and-bust cycles, receiving significant funding in response to fresh outbreaks, only to be stripped of it and left ill-prepared for the next one.

Last month, the Robert Wood Johnson Foundation called on Congress to increase funding for public health by $4.5 billion, with the money going out to state and local governments. The plan was endorsed by 160 health groups, with the dollar amount backed by about 60 former state health directors.

“When we came up with $4.5 billion, we all thought it was way too big,” says Auerbach of the Trust for America’s Health, who sat on a blue-ribbon panel that arrived at the figure for the foundation. “But here we are now, and it’s a rounding error; $4.5 billion sounds like a really good buy, if it would alleviate some of the hardship associated with this pandemic, and it certainly would.”

Alan Greenblatt is the editor of Governing. He can be found on Twitter at @AlanGreenblatt.