Not so long ago, public health ranked high in the scientific pantheon. Its heroes—the conquerors of cholera, smallpox, yellow fever, and polio—were celebrated in history textbooks. When the coronavirus pandemic began more than five years ago, public-health leaders were revered by journalists and the public. Americans’ bipartisan trust in scientists led millions of people to obey the unprecedented guidance in the spring of 2020 to remain in their homes.
But then, after the death of George Floyd, the rules suddenly changed—for some people. Isolating at home was no longer mandatory for those joining “protests against systemic racism,” according to an open letter signed by more than 1,200 academics, doctors, and public-health officials and released in early June. The signers acknowledged that leaders of their professions had previously condemned a street protest against lockdown policies—and they reaffirmed their opposition to such protests and other public gatherings. But the Black Lives Matter protests were different: “We support them as vital to the national public health.”
So it was now vital for thousands to march together in the streets, but you still couldn’t take your child to a playground or gather with family in a cemetery to bury your grandmother? Or march in protest of anything except racism? This made no sense, especially to conservatives, whose skepticism kept growing as COVID restrictions proliferated in blue states. By the time the pandemic ended, a survey had found that more than half of Republicans had little or no confidence that “science has had a mostly positive effect on society.” To them, the public-health establishment’s guidance on such topics as childhood vaccines and fluoridated water now seemed less about the science than the politics.
Although many skeptics have overreacted, rejecting sound science in favor of quack theories, they’ve gotten one thing right: A noble profession has been corrupted by politics. This became obvious during the pandemic, but the politicization of the discipline has been going on for half a century. The modern field has redefined the very meaning of public health, and in the process, it has made so many mistakes that it has itself become a hazard to Americans’ health.
The pioneers of public health focused on threats that were genuinely public: epidemics that spread through the water and the air. Halting the spread of cholera and smallpox required collective action, and the government had clear scientific justification to provide clean water, build sewer systems, and sponsor vaccination campaigns. In addressing diseases caused by unhealthy personal behavior, leaders mostly urged public education, not governmental coercion. The surgeon general’s landmark 1964 report on smoking cataloged the evidence of its harm but avoided policy prescriptions. In a congressional hearing that year, the president of the American Cancer Society rejected the notion of banning smoking: “We believe in the freedom of the individual in the matter of cigarette smoking.”
But the distinction between public and personal health began disappearing in the 1960s. Public health was undone, in part, by its success. As they conquered one scourge after another, practitioners began looking for new missions and expanding the definition of epidemic. Over the ensuing decades, the Centers for Disease Control and Prevention, founded in the 1940s to stop the spread of malaria, enlarged its portfolio to address epidemics of tobacco use, firearm fatalities, domestic violence, and racism. The American Public Health Association began campaigning to regulate products it deemed unhealthy, and went on to support income redistribution, nationalized health care, increases to the minimum wage, green energy, and transgender rights.
The field’s transformation was summarized in the title of a book published in 2000 by the economists James T. Bennett and Thomas J. DiLorenzo: From Pathology to Politics. “To a very large extent,” the authors concluded, “the public health movement has increasingly become a collection of liberal ideologies cloaked in the language and garb of health science.” Public health was redefined not merely as the absence of disease but as the presence of social justice, the pursuit of which became a major part of the curriculum in public-health schools. Unhealthy personal choices were recast as public hazards requiring new government interventions.
To combat heart disease and obesity, senators and advocacy groups pressed for federal dietary guidelines in 1980 telling Americans to eat less fat and more carbohydrates. Eminent nutrition researchers objected, arguing that hypotheses linking the diseases to dietary fat were too tenuous to justify the move. These researchers would eventually be vindicated by rigorous studies that failed to demonstrate the purported benefits of low-fat diets—and that revealed harms from carbohydrate-rich diets.
But such scientists were no match for their political opponents. Politicians and journalists portrayed them as outliers and tools of the food industry, because it had funded some of their research. The federal government went on recommending a low-fat diet and issued the now-infamous food pyramid promoting carbs. As the American food industry and public followed that advice, substituting carbs for fats, rates of obesity and diabetes soared.
Undeterred by that fiasco, the federal government issued more guidelines in 2005, telling Americans to adopt a low-salt diet and consume no more than 2,300 milligrams of sodium daily. It set an even lower limit for middle-aged and older people—1,500 milligrams—which meant cutting out more than half of the sodium in a typical diet. Once again, some scientists said that there was no evidence to justify this experiment on the public. They were dismissed as outliers and industry shills, but their concerns, too, were subsequently taken up by other scientists: Large studies found that low-salt diets were actually associated with higher rates of heart disease, stroke, and death than diets with moderate amounts of sodium. The government eventually removed the draconian limit for older adults. But it has continued recommending the 2,300-milligram limit for the general population, which critics say is still unwarranted—and quite possibly harmful—because there’s little evidence that a low-salt diet would benefit people who do not have high blood pressure.
With the transformation of the profession, smoking was no longer defensible on the grounds of individual freedom. Bans on smoking in outdoor spaces, private workplaces, and restaurants were justified with overhyped claims about the deadly perils of secondhand smoke. (Later, when a rigorous study found no link between secondhand smoke and lung cancer, the bans were retroactively justified as a way to “change societal behavior” by “denormalizing” smoking.) Prohibitionists in the U.S. sought not merely a “smoke-free society” but also a “tobacco-free society,” which meant eliminating any form of nicotine, even the safer alternatives to cigarettes being encouraged by authorities in other countries. In Britain, the Royal College of Physicians noted that “nicotine itself is not especially hazardous” and warned that it would be “unjust, irrational and immoral” not to offer alternatives to cigarettes that could save the lives of millions of smokers. In 2015, England’s public-health agency published a review concluding that nicotine vaping is an effective tool for quitting and is around 95 percent less harmful than smoking.
In the U.S., the smoking rate among adults and youths declined sharply after nicotine vaping started growing rapidly in 2010, but you wouldn’t have guessed it from the reaction of officials at the CDC and the FDA. They launched a campaign against vaping, imposing strict new regulations and seizing on weak evidence to issue warnings of potential dangers. In 2014, the director of the CDC, Tom Frieden (who had risen to national prominence as New York City’s health commissioner by crusading against fat, salt, and smoking), warned that vaping could “do more harm than good” by luring young people to smoke cigarettes. But smoking rates among youths and adults have continued declining—and other claimed dangers have been soundly debunked too.
As usual, the few scientists challenging the narrative have been mostly ignored (and denounced) by their colleagues and the media. Polls in recent years show that three-quarters of Americans wrongly believe that vaping is as dangerous as or more dangerous than smoking—and some of them are smokers who have been dissuaded from using a singularly effective tool for quitting.
The most egregious errors, of course, came during the coronavirus pandemic. Before the first outbreaks began, the CDC had drawn up a pandemic plan incorporating the old-school public-health principles of Donald Henderson, the epidemiologist who had directed the worldwide campaign that eradicated smallpox in the 1960s and ’70s. In 2006, he and his colleagues at the University of Pittsburgh analyzed possible responses to an influenza pandemic, including bans on public gatherings, lengthy closures of schools, travel restrictions, and the universal wearing of surgical masks. The researchers did not recommend any of those responses, pointing to evidence that they would be ineffective, and warned that “if particular measures are applied for many weeks or months, the long-term or cumulative second- and third-order effects could be devastating socially and economically.” The article urged public-health officials to not put too much faith in computer models. It stressed the need for leaders to “provide reassurance” and reduce the public’s anxiety by maintaining “the normal social functioning of the community.”
In its pre-2020 pandemic-planning scenarios, the CDC didn’t recommend business closures or mask wearing by healthy individuals, even if a virus were as deadly as the 1918 flu. But once China claimed that its lockdown had stopped the virus, the CDC plan was abruptly abandoned. Instead of reassuring the public, leaders presented doomsday scenarios (based on wildly unrealistic computer projections) that terrified everyone, especially young people, who were actually at a much lower risk of death than the rest of the population. Instead of following the pre-pandemic advice of its own scientists, the CDC copied China’s lockdown.
Why did the public-health establishment want to believe, much less emulate, China’s authoritarian rulers? The sudden enthusiasm for lockdowns has sometimes been blamed on the profession’s partisan bias: If Donald Trump opposed lockdowns, Democrats assumed, they must be a good idea, and the economic damage wouldn’t seem quite so catastrophic if it prevented him from being reelected that year. The idea that the pandemic might discredit Trump even prompted Jane Fonda to describe COVID as “God’s gift to the Left.”
But there was more to it than just election-year politics. The pandemic was also God’s gift to the ambitious social engineers who had come to dominate public-health discourse. For decades, abetted by journalists eager for scare stories, these scientists and activists had been gaining prestige, power, and funding by declaring new menaces that could be conquered only by following their expert guidance, and they had become adept at silencing researchers who questioned their version of “the science.”
Early in the pandemic, a few prominent researchers echoed Henderson’s warning that the unprecedented restrictions on liberty were not justified by the scientific literature and could cause much more harm than the virus. But these researchers—including Harvard’s Martin Kulldorff and Stanford’s John Ioannidis, Jay Bhattacharya, and Scott Atlas—were promptly vilified by colleagues, smeared in the press, and censored on social-media platforms. Other researchers became afraid to openly challenge the establishment, so the officials promoting lockdowns became the public face of scientific authority.
And then the mask came off. The dispensation for the Black Lives Matter protests was so nakedly ideological that COVID policies became an intensely partisan issue. Red states began reopening while blue states remained shut, mandated masks, and kept schools closed long after those in the rest of America and in Europe had reopened. For some, wearing a mask became a political statement—the Democratic version of a MAGA hat. Blue-state governors and the Biden administration adopted some of the world’s most aggressive policies, such as the CDC’s recommendation (which was partially rescinded only last month) that everyone aged six months or older be vaccinated against COVID. In much of Europe, health agencies never recommended the vaccine for healthy children under 5, never required those young children to wear masks, and never mandated vaccines for people with natural immunity. In 2022 and 2023, many European agencies stopped recommending booster shots for most healthy people under 60, but the CDC still recommends the shots for everyone over 18—and still advises parents to consider vaccinating infants and children.
American public-health officials eventually acknowledged having made a few mistakes, such as not reopening schools sooner, but they’ve mostly ignored the evidence that their response to the coronavirus pandemic was the worst debacle in the history of their profession, and arguably the costliest federal-policy blunder ever made in peacetime. Just as Donald Henderson had warned, lockdowns and other restrictions caused devastating economic and social damage while proving largely futile as a means of controlling the virus.
Many studies comparing mortality rates across the United States and Europe have concluded that lockdowns and other restrictions made little or no difference in reducing COVID mortality. Clinical trials of masks failed to show that they were effective, and the rates of COVID infections and deaths in states with mask mandates were virtually identical to the rates in no-mandate states throughout the pandemic. Sweden, initially pilloried for refusing to lock down and for advising its citizens not to wear masks, ended the pandemic with one of Europe’s lowest rates of cumulative excess mortality.
The public’s lingering anger over COVID restrictions helped reelect Trump, who has brought two COVID heretics into the administration: Bhattacharya, the new director of the National Institutes of Health, and Marty Makary of Johns Hopkins, the new FDA commissioner. During their confirmation hearings, they vowed to make sweeping reforms and lamented Americans’ loss of faith in public-health institutions. During the pandemic, both had warned that the mandates for COVID vaccines were excessive and would make Americans leery of all vaccines. Sure enough, a Gallup poll found that the percentage of Americans who say vaccinating their children is “extremely important” fell from nearly 60 percent before the pandemic to just 40 percent last year.
To regain the public’s trust, both researchers promised senators that they would restore scientific rigor and debate at their agencies, but they clearly have their work cut out for them. Their agencies not only still have plenty of ambitious progressives in the ranks but also are now overseen by a particularly avid social engineer, Robert F. Kennedy Jr., the secretary of Health and Human Services and leader of the “Make America healthy again” movement. He wants to restrict seed oils, ultra-processed foods, pesticides, and other products that he considers unhealthy. Critics are already warning that the scientific rationales for these moves are so weak that his public-health campaign, like previous ones, could make Americans less healthy.
But RFK Jr. seems determined to continue in his predecessors’ footsteps, and he has even discovered yet another epidemic. “Anti-Semitism—like racism—is a spiritual and moral malady that sickens societies and kills people with lethalities comparable to history’s most deadly plagues,” he declared, in announcing a task force to combat it. Yes, anti-Semitism is deadly, and you can call it a plague, but that doesn’t mean scientists have the expertise to conquer it.
The public-health profession can go on pretending it knows how to cure every social ill, but it will never earn back the public’s respect unless it returns to its original principles. It needs to choose evidence over ideology—and humility over hubris.