

A year after the United States bombed its pandemic performance in front of the world, the Delta variant opened the stage for a face-saving encore. If the U.S. had learned from its mishandling of the original SARS-CoV-2 virus, it would have been better prepared for the variant that was already ravaging India.

Instead, after a quiet spring, President Joe Biden all but declared victory against SARS-CoV-2. The CDC ended indoor masking for vaccinated people, pitting two of the most effective interventions against each other. As cases fell, Abbott Laboratories, which makes a rapid COVID-19 test, discarded inventory, canceled contracts, and laid off workers, The New York Times reported. Florida and Georgia scaled back their reporting of COVID-19 data, according to Kaiser Health News. Models failed to predict Delta’s early arrival. The variant then ripped through the U.S.’s half-vaccinated populace and once again pushed hospitals and health-care workers to the brink. Delta’s extreme transmissibility would have challenged any nation, but the U.S. nonetheless set itself up for failure. Delta was an audition for the next pandemic, and one that America flubbed. How can a country hope to stay 10 steps ahead of tomorrow’s viruses when it can’t stay one step ahead of today’s?

America’s frustrating inability to learn from the recent past shouldn’t be surprising to anyone familiar with the history of public health. Almost 20 years ago, the historians of medicine Elizabeth Fee and Theodore Brown lamented that the U.S. had “failed to sustain progress in any coherent manner” in its capacity to handle infectious diseases. With every new pathogen—cholera in the 1830s, HIV in the 1980s—Americans rediscover the weaknesses in the country’s health system, briefly attempt to address the problem, and then “let our interest lapse when the immediate crisis seems to be over,” Fee and Brown wrote. The result is a Sisyphean cycle of panic and neglect that is now spinning in its third century. Progress is always undone; promise, always unfulfilled. Fee died in 2018, two years before SARS-CoV-2 arose. But in documenting America’s past, she foresaw its pandemic present—and its likely future.

More Americans have been killed by the new coronavirus than by the influenza pandemic of 1918, despite a century of intervening medical advancement. The U.S. was ranked first among nations in pandemic preparedness but has among the highest death rates in the industrialized world. It invests more in medical care than any comparable country, but its hospitals have been overwhelmed. It helped develop COVID-19 vaccines at near-miraculous and record-breaking speed, but its vaccination rates plateaued so quickly that it is now 38th in the world. COVID-19 revealed that the U.S., despite many superficial strengths, is alarmingly vulnerable to new diseases—and such diseases are inevitable. As the global population grows, as the climate changes, and as humans push into spaces occupied by wild animals, future pandemics become more likely. We are not guaranteed the luxury of facing just one a century, or even one at a time.

It might seem ridiculous to think about future pandemics now, as the U.S. is consumed by debates over booster shots, reopened schools, and vaccine mandates. Prepare for the next one? Let’s get through this one first! But America must do both together, precisely because of the cycle that Fee and Brown bemoaned. Today’s actions are already writing the opening chapters of the next pandemic’s history.

Internationally, Joe Biden has made several important commitments. At the United Nations General Assembly last week, he called for a new council of national leaders and a new international fund, both focused on infectious threats—forward-looking measures that experts had recommended well before COVID-19.

But domestically, many public-health experts, historians, and legal scholars worry that the U.S. is lapsing into neglect, that the temporary wave of investments isn’t being channeled into the right areas, and that COVID-19 might actually leave the U.S. weaker against whatever emerges next. Donald Trump’s egregious mismanagement made it easy to believe that events would have played out differently with a halfway-competent commander who executed preexisting pandemic plans. But that ignores the many vulnerabilities that would have made the U.S. brittle under any administration. Even without Trump, “we’d still have been in a whole lot of trouble,” Gregg Gonsalves, a global-health activist and an epidemiologist at Yale, told me. “The weaknesses were in the rootstock, not high up in the trees.”

The panic-neglect cycle is not inevitable, but breaking it demands recognition and resistance. “A pandemic is a course correction to the trajectory of civilization,” Alex de Waal, of Tufts University and the author of New Pandemics, Old Politics, told me. “Historical pandemics challenged us to make some fairly fundamental changes to the way in which society is organized.” Just as cholera forced our cities to be rebuilt for sanitation, COVID-19 should make us rethink the way we ventilate our buildings, as my colleague Sarah Zhang argued. But beyond overhauling its physical infrastructure, the U.S. must also address its deep social weaknesses—a health-care system that millions can’t access, a public-health system that’s been rotting for decades, and extreme inequities that leave large swaths of society susceptible to a new virus.

Early last year, some experts suggested to me that America’s COVID-19 failure stemmed from its modern inexperience with infectious disease; having now been tested, it might do better next time. But preparedness doesn’t come automatically, and neither does its absence. “Katrina didn’t happen because Louisiana never had a hurricane before; it happened because of policy choices that led to catastrophe,” Gonsalves said. The arc of history does not automatically bend toward preparedness. It must be bent.


On September 3, the White House announced a new strategy to prepare for future pandemics. Drafted by the Office of Science and Technology Policy and the National Security Council, the plan would cost the U.S. $65 billion over the next seven to 10 years. In return, the country would get new vaccines, medicines, and diagnostic tests; new ways of spotting and tracking threatening pathogens; better protective equipment and replenished stockpiles; sturdier supply chains; and a centralized mission control that would coordinate all the above across agencies. The plan, in rhetoric and tactics, resembles those that were written before COVID-19 and never fully enacted. It seems to suggest all the right things.

But the response from the health experts I’ve talked with has been surprisingly mixed. “It’s underwhelming,” Mike Osterholm, an epidemiologist at the University of Minnesota, told me. “That $65 billion should have been a down payment, not the entire program. It’s a rounding error for our federal budget, and yet our entire existence going forward depends on this.” The pandemic plan compares itself to the Apollo program, but the government spent four times as much, adjusted for inflation, to put astronauts on the Moon. Meanwhile, the COVID-19 pandemic may end up costing the U.S. an estimated $16 trillion.

“I completely agree that it will take more investment,” Eric Lander, OSTP director and Biden’s science adviser, told me; he noted that the published plan is just one element of a broader pandemic-preparedness effort that is being developed. But even the $65 billion that the plan has called for might not fully materialize. Biden originally wanted to ask Congress to immediately invest $30 billion but eventually called for just half that amount, in a compromise with moderate Democrats who sought to slash it even further. The idea of shortchanging pandemic preparedness after the events of 2020 “should be unthinkable,” wrote former CDC Director Tom Frieden and former Senator Tom Daschle in The Hill. But it is already happening.

Others worry about the way the budget is being distributed. About $24 billion has been earmarked for technologies that can create vaccines against a new virus within 100 days. Another $12 billion will go toward new antiviral drugs, and $5 billion toward diagnostic tests. These goals are, individually, sensible enough. But devoting two-thirds of the full budget toward them suggests that COVID-19’s lessons haven’t been learned.

America failed to test sufficiently throughout the pandemic even though rigorous tests have long been available. Antiviral drugs played a bit part because they typically provide incremental benefits over basic medical care, and can be prohibitively expensive even when they work. And vaccines were already produced far faster than experts had estimated and were more effective than they had hoped; accelerating that process won’t help if people can’t or won’t get vaccinated, and especially if they equate faster development with nefarious corner-cutting, as many Americans did this year. Every adult in the U.S. has been eligible for vaccines since mid-April; in that time, more Americans have died of COVID-19 per capita than people in Germany, Canada, Rwanda, Vietnam, or more than 130 other countries did in the pre-vaccine era.

“We’re so focused on these high-tech solutions because they appear to be what a high-income country would do,” Alexandra Phelan, an expert on international law and global health policy at Georgetown University, told me. And indeed, the Biden administration has gone all in on vaccines, trading them off against other countermeasures, such as masks and testing, and blaming “the unvaccinated” for America’s ongoing pandemic predicament. The promise of biomedical panaceas is deeply ingrained in the U.S. psyche, but COVID-19 should have shown that medical magic bullets lose their power when deployed in a profoundly unequal society. There are other ways of thinking about preparedness. And there are reasons those ways were lost.


In 1849, after investigating a devastating outbreak of typhus in what is now Poland, the physician Rudolf Virchow wrote, “The answer to the question as to how to prevent outbreaks … is quite simple: education, together with its daughters, freedom and welfare.” Virchow was one of many 19th-century thinkers who correctly understood that epidemics were tied to poverty, overcrowding, squalor, and hazardous working conditions—conditions that inattentive civil servants and aristocrats had done nothing to address. These social problems influenced which communities got sick and which stayed healthy. Diseases exploit society’s cracks, and so “medicine is a social science,” Virchow famously said. Similar insights dawned across the Atlantic, where American physicians and politicians tackled the problem of urban cholera by fixing poor sanitation and dilapidated housing. But as the 19th century gave way to the 20th, this social understanding of disease was ousted by a new paradigm.

When scientists realized that infectious diseases are caused by microscopic organisms, they gained convenient villains. Germ theory’s pioneers, such as Robert Koch, put forward “an extraordinarily powerful vision of the pathogen as an entity that could be vanquished,” de Waal told me. And that vision, created at a time when European powers were carving up other parts of the world, was cloaked in metaphors of imperialism, technocracy, and war. Microbes were enemies that could be conquered through the technological subjugation of nature. “The implication was that if we have just the right weapons, then just as an individual can recover from an illness and be the same again, so too can a society,” de Waal said. “We didn’t have to pay attention to the pesky details of the social world, or see ourselves as part of a continuum that includes the other life-forms or the natural environment.”

Germ theory allowed people to collapse everything about disease into battles between pathogens and patients. Social matters such as inequality, housing, education, race, culture, psychology, and politics became irrelevancies. Ignoring them was noble; it made medicine and science more apolitical and objective. Ignoring them was also easier; instead of staring into the abyss of society’s intractable ills, physicians could simply stare at a bug under a microscope and devise ways of killing it. Somehow, they even convinced themselves that improved health would “ultimately reduce poverty and other social inequities,” wrote Allan Brandt and Martha Gardner in 2000.

This worldview accelerated a growing rift between the fields of medicine (which cares for sick individuals) and public health (which prevents sickness in communities). In the 19th century, these disciplines were overlapping and complementary. In the 20th, they split into distinct professions, served by different academic schools. Medicine, in particular, became concentrated in hospitals, separating physicians from their surrounding communities and further disconnecting them from the social causes of disease. It also tied them to a profit-driven system that saw the preventive work of public health as a financial threat. “Some suggested that if prevention could eliminate all disease, there would be no need for medicine in the future,” Brandt and Gardner wrote.

[Image: A masked health-care worker in black and white, his face blocked by a white rectangle with a line of fading red crosses across its middle]

This was a political conflict as much as an ideological one. In the 1920s, the medical establishment flexed its growing power by lobbying the Republican-controlled Congress and White House to erode public-health services, including school-based nursing, outpatient dispensaries, and centers that provided pre- and postnatal care to mothers and infants. Such services were examples of “socialized medicine,” unnecessary to those who were convinced that diseases could best be addressed by individual doctors treating individual patients. Health care receded from communities and became entrenched in hospitals. Decades later, these changes influenced America’s response to COVID-19. Both the Trump and Biden administrations have described the pandemic in military metaphors. Politicians, physicians, and the public still prioritize biomedical solutions over social ones. Medicine still overpowers public health, which never recovered from being “relegated to a secondary status: less prestigious than clinical medicine [and] less amply financed,” wrote the sociologist Paul Starr. It stayed that way for a century.

