Critics of the US health system argue that a higher proportion of the health dollar should be spent on public health, both to improve outcomes and to contain costs. Attempts to explain the subordinate status of public health in America highlight such factors as distrust in government, federalism, and a bias toward acute care. This article considers these assumptions by comparing public health in the United States, England, and France. It finds that one common variable is the bias toward acute care. That the United States has such a bias is not surprising, but the similar pattern cross-nationally is less expected. Three additional findings are more unexpected. First, the United States outperforms its European peers on several public health metrics. Second, the United States spends a comparable proportion of its health dollar on prevention. Third, these results are due partly to a federalism twist (while all three nations delegate significant responsibility for public health to local governments, federal officials are more engaged in the United States) and partly to the American version of public health moralism. We also consider the renewed interest in population health, noting why, against expectations, this trend might grow more quickly in the United States than in its European counterparts.
Critics of the US health system point out that we do not get good value for our health care dollar: we spend far more per capita than our peers yet rank quite low on various health outcomes. As noted in a recent report, the United States has a “strikingly consistent and pervasive” pattern of poor health outcomes compared to other high-income nations (Institute of Medicine and National Research Council 2013: 1). Efforts to explain this (dismal) cross-national performance often focus on the subordinate status of public health; presumably the key to a higher performing US health system (and better health outcomes) would be to spend a higher percentage of the health care dollar on prevention and public health (and a lower percentage on specialty and acute care). Moreover, the subordinate status of public health in America is presumably rooted in health care federalism: the decentralized American system arguably fares poorly compared to more centralized systems like those found in many European peers.
This article considers these assumptions by comparing public health in the United States, England, and France. The comparison demonstrates that each of the three nations places a far higher financial and cultural value on acute care and biomedical research than on public health and prevention. That the United States has such a fondness for high-tech and individualized medicine is perhaps not surprising, but that there is a similar pattern in England and France is less expected.
Four additional findings are also unexpected. First, the United States spends more than nearly all Organisation for Economic Co-operation and Development nations on prevention, both on a per-capita basis and as a percentage of overall health spending (Gmeinder, Morgan, and Mueller 2017).1 Second, the United States outperforms its European peers on several public health and prevention metrics (including tobacco use and dangerous alcohol consumption) while lagging behind on others (such as obesity and opioid use). The comparative scorecard is quite mixed (Davis et al. 2014). Third, these results are due in part to a federalism twist: a surprisingly strong federal role in US public health policy and a surprisingly strong local role in the public health systems of England and France. Finally, while there is renewed interest in public and population health in all three nations, we (tentatively) suggest reasons that this trend might grow more quickly in the United States than in its European counterparts.
Public Health in the United States, England, and France: Common Roots and Patterns
American history includes an ongoing debate over which level of government should do what. Back in the eighteenth century, for example, the anti-Federalists advocated for local autonomy, boycotted the Constitutional Convention, and argued that the proposed document was likely to lead to a new (American-style) monarchy. At the other end of the spectrum, Alexander Hamilton pushed for a strong federal government (fueled by a powerful executive branch). James Madison proposed a middle ground, a large but weak national government, with its power constrained by the separation of powers and other institutional checks. The constitution that emerged provides significant support for each view, leaving to future generations the endless federalism debate that remains with us today.
Importantly, however, the election of Thomas Jefferson in 1800 ushered in a long-standing era in which states' rights and local autonomy dominated, and limited government became ingrained as a key component of American political culture. And while the states clearly had the authority to act, they generally delegated to local communities two kinds of health and social services tasks: providing classic welfare-related services to the so-called deserving poor (those outside the workforce through no fault of their own) and providing basic public health services in their communities.
The focus on public health increased during the industrial revolution, with its overcrowded cities and increased international commerce, both developments leading to a rise in infectious diseases and mass epidemics. Interestingly, local officials would provide short-term fixes during epidemics (improved sanitary conditions and public education campaigns) but scale back such efforts once the crises ended, prompted by opposition from the business community as well as the antigovernment political culture (Fee and Brown 2002).
In this period, it was generally considered unwise (and perhaps even unconstitutional) for the federal government to enact health and welfare legislation. But there were exceptions, most notably the Marine Hospital Service (MHS), which provided medical care to merchant seamen. There also was a brief federal foray into public health, the Vaccine Act of 1813, under which a “national vaccine agent” distributed smallpox vaccine to thousands each year, a program that ended in 1821 after the accidental distribution of the wrong vaccine led to ten deaths (Singla 1998).
The political dynamic changed somewhat due to the rise of the so-called sanitary movement in the mid-nineteenth century. Several factors were key. First was the emergence and growth of the field of epidemiology, prompted by John Snow's famous recommendation to remove the handle of the Broad Street pump, the source of the contaminated water behind a devastating London cholera outbreak, which ended nearly overnight. Second, and closer to home, were the terrible conditions in the Civil War military camps, which convinced President Lincoln to create a national Sanitary Commission to investigate. And third was the (slow) acceptance of the reality that horrific social conditions, rather than moral decadence, led to the high rates of disease and early death among the poor.
In this context, cities like New York and Chicago established the first permanent departments of public health, followed by the creation of the American Public Health Association in 1872. There also was a push for increased federal action, prompted by the influx of European and Caribbean immigrants, the perceived need to prevent entry to persons with a contagious disease, and the often inadequate local response to smallpox, malaria, and yellow fever epidemics. In 1897, for example, a smallpox epidemic swept through the southern states, overwhelming state and local public health capacities and prompting an aggressive federal response led by the MHS (Sledge 2017).
Over the next twenty years, the MHS increasingly focused on its public health mission, and in 1912 it was renamed the US Public Health Service (Sledge 2017). The federal public health role expanded again in 1921 when Congress enacted the Sheppard-Towner Act, which provided federal funding to the states to create prenatal clinics and fund a range of pregnancy education programs. While Sheppard-Towner funding ended in 1929, the program was reborn as Title V of the 1935 Social Security Act. Title VI of the Social Security Act also included millions to support state and local public health infrastructures.
With a more engaged and effective public-sector infrastructure, the first half of the twentieth century surely ranks as the golden era for the public health community. Between 1900 and 1950, for example, life expectancy increased from forty-seven to sixty-eight years, a remarkable twenty-one-year gain that is even more impressive compared to the nine-year gain (to seventy-seven) during the second half of the century.
What explains the extraordinary gains early in the twentieth century? The keys were public health interventions that led to working sewage systems, safer food, chlorinated water, cleaner streets, better workplace conditions, improved housing, more effective prenatal care, and faster and better responses to infectious diseases. These public health interventions were led by the nearly three thousand local health departments spread throughout the country, supported by their intergovernmental partners at both the state and federal level.
Looking toward the other side of the Atlantic, the public health evolution looks similar, sharing both the philosophical debate about individual versus collective responsibility and the center-periphery tensions over the intergovernmental division of power. These similarities include early beginnings and promising advances from the mid-nineteenth century to the first half of the twentieth century, as well as rapid gains in life expectancy due to, as in the United States, better sewage and water treatment, improved housing conditions, and a recognition that devastating social and economic conditions, linked to industrialization and urbanization, were contributing to outbreaks of infectious disease.
France represents an interesting case to study from this perspective, as the country is often analyzed in international comparisons as an archetype of a centralized state deriving from its Napoleonic tradition of centralization of power. But the country's early debates over public health display striking similarities with those in the United States. For starters, nineteenth-century France was far from the generous welfare state often described in the literature, which in fact only emerged after World War II. At the time, conservative movements were strong and consistently resisted the enactment of collective health care structures, in the name of a so-called ethics of prévoyance, emphasizing individual responsibility over structural determinants of health and blaming poor people for their lack of cleanliness, their poor behavior (alcohol, sexual behavior), and their lack of foresight. With the exception of vaccination against smallpox—the central government in 1808 ordered 2 million doses and sent vaccination doctors to rural areas to reach communities and fight a devastating outbreak—national action remained ill-accepted and of limited scope, and de facto was left to ad hoc initiatives from the municipalities and departments (Bourdelais 2003; Tabuteau 2010a; Bergeron and Castel 2014).
As in the United States, public health initiatives in France developed at the national level from 1870 on, when ideas of solidarity gained strength nationally (Tabuteau 2007). A series of public health acts, including the Public Health Act of 1902, provided a legal framework and a range of policy initiatives, such as new public physicians, free medical assistance for the poor, an obligation to report infectious diseases, and a national committee on public hygiene. These measures were reinforced after the Spanish flu outbreak with the creation of a public health ministry in 1920 (Tabuteau and Morelle 2015). But primary responsibility for the delivery of public health services remained with local governments.
In nineteenth-century England, with its “poor laws” and heavy reliance on the distinction between deserving and undeserving poor in the delivery of social services, the moral debate over the social versus individual determinants of health was also important. As in the United States, early public health interventions in England targeted the navy, the army, and arriving immigrants in an effort to contain outbreaks of infectious disease. Public health initiatives were then supported by the social medicine movement, including sanitary commissioner and poor law reformer Edwin Chadwick and modern nursing founder Florence Nightingale. These “sanitarians” campaigned for sanitation and hygiene measures, such as improved sewage and water systems. The Public Health Act of 1848 established a central board of health but delegated responsibility for clean water, drainage, and sanitation more generally to local authorities.
Over the next several decades, national legislation reinforced the local role in ensuring clean water, basic sanitation, safe food, and hygienic conditions more generally. The Public Health Act of 1866 created drainage districts and reinforced the duty of local authorities to detect and deter nuisances. The Public Health Act of 1875 required local boards of health to have a medical officer and a sanitary inspector, to ensure that regulations on food, housing, water, and hygiene were enforced. The golden age of public health soon followed, with advances similar to those in France and the United States.
Public Health in the United States: Increased Marginalization after World War II
Following his election in 1932, President Franklin D. Roosevelt persuaded the American people that the federal government needed to take the lead in efforts to respond to the stock market crash and economic depression. Roosevelt promised a “new deal” between citizens and the federal government, one in which a powerful (Hamiltonian) executive branch would work with state and local governments to enact an array of social protections and economic reforms. Overcoming fierce resistance from the courts (which initially held much of the New Deal to be unconstitutional), Roosevelt engineered a massive increase in the size and scope of the federal government, including the first comprehensive set of federal social welfare programs (Social Security and Aid to Dependent Children). But while the Social Security Act contained funding for various public health programs (as described above), Roosevelt was convinced by the American Medical Association to drop any effort to enact national health insurance.
Harry Truman succeeded Roosevelt in 1945 and immediately proposed that national health insurance be added to the list of federal social protection programs. While that effort failed, the federal government soon used the tax code to encourage employers to provide health insurance to their employees. In 1946 federal officials also created the Communicable Disease Center, now called the Centers for Disease Control and Prevention (CDC), to oversee the fight against malaria and other infectious diseases, providing scientific guidance and oversight to state and local health departments.
Despite the creation of the CDC, federal health policy after World War II focused on the belief that medical research and specialized medical care would eventually conquer nearly all forms of disease. This assumption prompted the federal government (through the National Institutes of Health) to funnel billions of dollars to academic medical researchers. Congress also enacted the Hill-Burton program, which provided federal funds to stimulate hospital construction and modernization, thereby offering more Americans access to the increasingly sophisticated medical care rendered in state-of-the-art hospitals (Thompson 1981). Two decades later, in an effort to respond to a perceived physician shortage, Congress enacted a host of initiatives designed to increase the nation's physician supply. But as the physician workforce grew, most new physicians specialized, drawn by higher income, higher status, and the demands of keeping up with evolving medical knowledge.
The growth of the medical care system, with physicians emerging as prestigious life savers using new technologies in an increasingly hospital-based health care system, led to dramatic growth in health care spending. Meanwhile, spending on public health seemed less necessary. There were fewer infectious disease epidemics and a growing belief that the nation's sewage, food, and water systems were in good shape. The nation was spending more on health care, but a smaller percentage of the health care dollar was going to support the public health infrastructure.
The declining fortune of the public health community was exacerbated by three additional factors: distrust of government, perceptions of “us versus them,” and the lure of technology.
Distrust of Government: An American Tradition
The post–New Deal expansion of the public-sector agenda fueled an ongoing backlash against government, a distrust of science, and a perception among many that government is simultaneously inept and dangerous. Long before Donald Trump called climate change a hoax and dismissed the mainstream media as fake news, presidents from Nixon to Reagan (and sometimes Carter and Clinton) praised the virtues of a smaller and less intrusive public sector. This distrust of public authority is especially powerful in the public health sphere, where policy makers (albeit typically at the local level) must often balance the public good against individual liberty. Rules requiring seatbelts, motorcycle helmets, and speed limits save lives but also clash with an American culture favoring individualism over collective action. This is especially so for more aggressive public health measures, such as the efforts of Michael Bloomberg, the former New York City mayor, to limit the sale of supersized sugary drinks and to impose other measures designed to reduce obesity.
Making the case for public health even more complicated is that it is often hard to demonstrate a quantifiable return on investment of population-based initiatives. There is, to be sure, a vast literature on the cost savings and cost-effectiveness of clinical preventive care. This research finds that some preventive interventions are cost-saving (child immunizations and counseling adults on use of low-dose aspirin) but that the scorecard is mixed on other services, given the costs associated with screenings, treatment, and even longer lives (Cohen and Neumann 2009). There are far fewer studies, however, that examine the return on investment generated by community-based public health spending.2
The US public health community also occasionally suffers from a credibility gap. In the mid-1970s, for example, a soldier at the Fort Dix military base died from swine flu, prompting fears of a deadly epidemic. President Ford ordered a nationwide vaccination effort to cover 220 million Americans. Some of the first to be vaccinated, however, developed Guillain-Barré syndrome after receiving the vaccine. A national outcry ensued, and the president halted the vaccination program after fewer than 40 million were vaccinated. The political backlash intensified as the feared swine flu epidemic never developed. For years afterward, the public health community was accused of falsely raising fears. The current antivaccination movement has roots in these sorts of unfortunate episodes (in addition to building off the growing antiscience movement more generally).
The swine flu fiasco illustrates another problem for the public health community: its work is invisible and taken for granted when it succeeds (in keeping the water clean and the food safe to eat) but all too visible during a public health crisis. Globalization, for example, has enabled infectious diseases to travel the globe, infecting geographically diverse populations. The recent Zika virus outbreak prompted both horror (babies born with microcephaly) and fear (that the impact would be felt on US soil), and an outbreak of food-borne illness traced to the Chipotle restaurant chain produced a similar political reaction. Public health officials are suddenly in the news, blamed if the epidemic lingers, only to be ignored again once the crisis eases and the newspapers turn to other matters.
One last point: In the 1960s, many local public health departments (especially in rural US communities) began providing clinical services to poor and vulnerable populations without alternative sources of care. But the decision to become an occasional provider of clinical services was controversial, both with some public health leaders (who worried about a loss of focus and mission) and with some in the general public (who increasingly viewed local health departments as focused on the poor rather than the community at large).
“Us versus Them”
There is a seemingly all-too-human tendency to identify with the familiar individual in trouble but offer less sympathy for large groups who seem both different and distant. The so-called rule of rescue suggests that people respond more generously to individual misfortune (at least in persons who look like “us”) than to bad news conveyed in statistical terms (thousands of “them” dying of famine in a faraway land).
At the same time, too many of us still blame the individual for health outcomes caused by a mix of behavioral and social (and sometimes genetic) variables. If obesity is due to laziness and poor eating habits, then why focus scarce public dollars on persons who have no one to blame but themselves? If addiction or mental illness is a consequence of moral weakness (as opposed to a not surprising consequence of social conditions), then why pay for substance abuse or mental health services? Not surprisingly, however, many are more sympathetic if the drug addict or schizophrenic looks like “us,” a phenomenon providing an important insight into a polity more inclined to provide aid to (largely white and rural) victims of the current opioid crisis than to (black and urban) heroin addicts in the past.
Technology and the Promise of Miraculous Cures
Critics of the US health care system often complain that physicians deliver too much defensive medicine: services, tests, and procedures prescribed but not medically necessary. Some blame the nation's medical malpractice system, arguing that physicians order extra tests to protect against potential litigation. Others blame economic incentives, noting that doctors who do more generally earn more. But a third factor is American culture, particularly the common desire for the most technologically sophisticated test and treatment, combined with a desire to have that test and treatment immediately.
The cultural preference for high-tech services (that help sick people recover) over prevention and public health (to keep them from getting sick) is not, of course, a uniquely American phenomenon. As national income increases, so too does a national desire for the latest biomedical procedures. That said, however, this cultural preference is especially strong in the United States, where any effort to even consider the cost of a particular medical procedure is almost immediately labeled rationing.
Finally, the interest groups that produce and deliver the latest technological advances (from pharmaceutical companies to academic medical centers) are politically influential; the public health community is not. It is thus an ongoing challenge to protect scarce public health funding, much less generate significant increases in appropriations.
Public Health in England and France: A Similar Bias toward Curative Care
In the French and English contexts, a comparable bias appears toward curative medicine at the expense of consistent population-based health action. Public health initiatives were marginalized in both countries after World War II, while most policy efforts focused on improving health care infrastructures and access to medical care. As noted by many European experts, the eclipse of public health from political agendas was paradoxical. Arguably, the development of more centralized European welfare states in the aftermath of World War II provided a window of opportunity for developing stronger public health policies, but neither nation followed that route.
In England, the creation of the National Health Service (NHS) nationalized medical services and created a tax-based health system free at the point of service, but the NHS had quite limited responsibilities toward public health, which remained the responsibility of local authorities. Put simply, the NHS nationalized curative, acute, and individual care, while public health and social care remained a local responsibility. This contributed to the English bias in favor of curative medicine at the expense of prevention and public health (Joyce 2009).
The low profile of British public health policies endured even after the 1974 reorganization of the NHS, when public health was nominally integrated into the NHS, leaving only social care to the local level. This move, aimed at rationalizing health services and unifying the different facets of health policy, weakened the role of public health physicians and over time proved detrimental to public health, which lagged even further behind medical care. Subsequent reorganizations of the NHS continued to affect public health negatively. The creation of an internal market in 1991, separating purchasers and providers of care and introducing competition among health providers, further fragmented responsibilities for public health, weakening both national coordination and public health doctors (Scally 1996). Furthermore, the Conservative reforms of the 1990s cut public health budgets while demanding greater efficiency, a combination that proved quite problematic for the public health community.
In France, public health also was eclipsed by questions of access to care (Loriol 2002; Steffen 2000). But national state action in the realm of medicine and health insurance was more contested than in England, mirroring (to a lesser extent) the resistance to public action existing in the United States. For example, the creation of the social insurance institutions in 1945 and the sickness funds (nonprofit occupational health insurance funds governed by the social partners, that is, employers and employee unions) further decreased an already contested legitimacy of the government to intervene in health policy (Tabuteau 2016). Public health also faced strong ideological and professional resistance linked to the tradition of liberal medicine in the French health care system (Hassenteufel 1997). Public health initiatives were continuously opposed by the medical profession, in the name of its professional autonomy and the protection of the patient-doctor relationship. This resistance from the medical profession diminished the scope of public health, reducing it to interventions targeted at specific conditions (cancer prevention, alcoholism reduction, mental health) (Tabuteau 2016).
With the creation of the Sécurité Sociale in 1945, individual doctors maintained private practices and considered prevention within their remit, strengthening the biomedical approach to public health as a main policy orientation. While public health structures emerged, such as centers focusing on occupational medicine, school medicine, and child and maternal protection, these centers were poorly integrated with the dominant liberal medicine. Dismissed as “poor medicine for the poor” (Loriol 2002), child and maternal protection medical teams and occupational doctors did not enjoy, for instance, the right to treat and prescribe. Post–World War II public health in France has been secondary (Loriol 2002; Morelle 1996): poorly promoted, insufficiently coordinated, insufficiently remunerated, insufficiently incentivized, and insufficiently taught (Loncle 2009).
A New Public Health: The Window of Opportunity
Improved access to medical care, seemingly endless technological progress, and the widespread eradication of various infectious diseases all contributed to the eclipse of public health initiatives in Western countries. More recently, however, many have questioned the long-term viability of the current biomedical model. Pointing to the dangers of a minimal focus on prevention and to the rapid, arguably unsustainable growth in spending under a hospital-centered system of care, a growing chorus of voices makes the case for more investment in public health. There is renewed attention to individual behaviors (such as unhealthy eating habits, excessive alcohol use, and smoking), along with the societal factors that encourage such behaviors (food deserts, segregated housing patterns, and inadequate economic opportunity).
The public health infrastructure has strengthened. In 1988 the Institute of Medicine issued a blistering report on the poor condition of the US public health system, noting that 78 percent of local health departments were directed by someone without a public health degree and that the system suffered from too few epidemiologists, inadequate labs and computer systems, an aging workforce with poor communication skills, and too little training in emergency preparedness. More recently, however, new schools of public health have been springing up around the country, and a new generation of active, engaged, and skilled public health professionals works both in revitalized local departments and in a strengthened CDC, which now provides public health leadership both in the United States and in more than a hundred countries around the globe.
The US public health community has achieved some surprising successes. Antitobacco initiatives have led to reduced smoking rates. Highway safety campaigns (from lower speed limits to anti–drunk driving initiatives) have reduced highway fatalities. Fluoridated water is required in much of the country. There are numerous mandatory vaccination programs (and the antivaccination movement remains small). Regular health promotion campaigns focus on healthy eating and physical activity while also promoting community gardens and population-based obesity prevention initiatives.
There also is an increased focus in all sectors of the US health system on population health and population health management. Hospitals and other health care providers are increasingly put at financial risk for the cost and quality of care delivered to large populations. Behavioral economists are turning their attention to efforts to nudge targeted populations to improve their health behaviors. Public health practitioners are looking to new technologies (such as wearable devices) to generate big data that can enable population-based predictive analytics. There are more sophisticated efforts to analyze the return on investment of alternative population-based strategies.
In both England and France there is a similar moment of potential rebirth for public health, with the release of numerous reports, commissions, and policy changes, all designed to encourage a more individualized approach to population health, one aimed at harmful behaviors, and to bring more balance between acute care and prevention. But the potential seems less realized, at least so far. For example, a series of white papers in England emphasized the need to invest in health promotion (UK Department of Health 1987, 1992, 2000, 2003). But these reports were largely symbolic and aspirational, setting targets for health improvement in such areas as cardiovascular diseases, cancers, and mental health but not resulting in additional funding (Webster 1996; Watterson 2003). The New Labour era (1997–2008) added attention to environmental and social variables (health inequalities) largely absent from the previous Conservative government's approach, but here again, little substantive change followed.
The drumbeat of policy papers calling for an increased focus on prevention and public health did lead to an intergovernmental shift, as authority moved away from the NHS and back to local authorities, which would presumably better integrate social and population health. Nonetheless, even with this renewed commitment, which included a goal of transforming the NHS from a “sickness service” to a “health improvement service” (Wanless 2002), the policy initiatives developed by the Labour government have had limited success (Exworthy, Berney, and Powell 2002), as the acute care provided by the NHS continues to capture most of the policy attention at the expense of public health interventions.
In this context, it is hardly surprising that cross-national comparisons document that the British continue to score poorly on a variety of public health metrics, including harmful drinking, smoking, and prevention more generally. Indeed, while Michael Marmot argues that 40 percent of the nation's burden of disease could be prevented through action on the determinants of avoidable chronic conditions, and while the NHS England Five-Year Forward View issued in 2014 calls for a “radical upgrade in prevention and public health” (NHS England 2014: 3), a 2017 report from the House of Lords NHS Committee characterized the nation's efforts as “frustratingly low” and “chronically underfunded.” Moreover, public health funding continues to be cut rather than increased, contradicting the political declarations calling for greater public health efforts.
In France, a similar reintroduction of public health onto the national agenda is visible from the 1990s on, following a series of sanitary crises, such as the contaminated blood scandal. Public health gained additional prominence through the notion of sécurité sanitaire (public health security), leading to the creation of several new health agencies to increase disease surveillance, monitor the quality of care, and promote health education (Benamouzig and Besançon 2005).
As in England, a burst of legislative activity in the 2000s aimed at reinforcing public health as a priority in health policies. The Public Health Law of 2004 represented an important step by setting up a policy framework and establishing long-term goals (Loncle 2009). The law created regulatory agencies and alert and reporting systems and systematized the evaluation of existing programs. It represented an attempt to fundamentally renovate the role of the central state in public health, to centralize and reduce the fragmentation of initiatives, and to consolidate existing prevention policies and initiatives aimed at diminishing social and territorial inequalities.
Nonetheless, while the 2004 law defined one hundred prevention priorities, no specific budget was appropriated, and the law is vague with regard to who is responsible for what. Enduring fragmentation of responsibilities among the ministry of health, the sickness funds, and the regions led to implementation problems and a general lack of data on, and evaluation of, the effectiveness of existing programs. The results are mixed at best: improved metrics with regard to tobacco use and child obesity but less success in reducing harmful alcohol use and adult obesity.
The nation's most recent health law (the 2016 Health System Modernization Act) contains measures aimed at improving access to care, rationalizing patient pathways, and improving public health and prevention. With regard to prevention and public health, the law seeks to reinforce coordination among hospitals, liberal medicine, social services, and regional health agencies around public health priorities. The regional health agencies also have increased powers to develop elaborate regional health plans according to local health needs. At the central level, the fragmentation of responsibilities has been addressed by merging the three public health agencies responsible for disease surveillance, monitoring the quality of care, and promoting health education into a single national public health agency, Santé Publique France, modeled on Public Health England, with missions covering prevention, health promotion and education, and crisis management, and centralizing epidemiological knowledge. While this represents an important step toward the consolidation of public health expertise, critics note that the new structure remains underfunded and understaffed.
Taking Advantage of the Window of Opportunity: Why the United States?
The US system of public health fares rather well compared to other Western nations. This conclusion is perhaps surprising, as one would expect various social and political factors distinctive to the US system to make its public health politics even more difficult. After all, there is in the United States an ongoing backlash against government, a distrust of science, and a perception among many that government is simultaneously inept and dangerous. This trend is particularly apparent when “nanny state” politicians impose rules (from motorcycle helmets to smoking bans) that interfere with Americans' beloved individual liberty. Moreover, Europeans, with their more centralized social welfare systems, arguably have a greater appreciation and acceptance of government regulation, oversight, and laws aimed at protecting the public good.
What, then, explains this public health window of opportunity (in an era in which traditional medical care still dominates)? Why might the United States score higher (or at least higher than expected) on a public health scorecard than many of its high-income peers? One hypothesis is that public health entrepreneurs are more successful in the United States than their European counterparts. More plausible, however, is the impact of the American brand of public health moralism, combined with emerging trends in the American medical care delivery system.
Political scientists have long debated whether policy shapes politics (as opposed to the other way around). James Q. Wilson suggested one approach to the question: look at whether the presumed costs and benefits of the policy are concentrated on particular groups or diffused among large populations. For example, if the costs and benefits are both concentrated on discrete groups, the politics likely will be a battle between those two interest groups. But if the costs are concentrated on a discrete group and the benefits are dispersed among a wide population, then the proposal is likely to be defeated, unless there is a policy entrepreneur who can raise the public visibility of the issue (or a crisis or disaster that does the same thing) such that large segments of the population become more politically engaged.
Public health policy often fits into the latter category: costs are concentrated (say, on pollution-emitting coal mines), but benefits are dispersed (among the general public). Sometimes a catastrophic event can focus public attention on the need for reform; the nuclear plant accidents at Three Mile Island and Chernobyl are examples. But there also are examples of policy entrepreneurs changing public health politics. Rachel Carson's 1962 book Silent Spring encouraged and energized the environmental movement. Ralph Nader's 1965 book Unsafe at Any Speed had a similar impact on car safety regulations. But it does not always take a book by an academic or activist. Candy Lightner's daughter was killed by a drunk driver, and her outrage over the lenient criminal penalties in such cases led to the creation of Mothers Against Drunk Driving (Lerner 2011) and to both changes in criminal law and new public health campaigns (always have a designated driver).
Michael Bloomberg, the former mayor of New York City, used his elected position to serve as a public health entrepreneur. Under his leadership, New York City adopted a “health in all policies” approach, under which all city agencies were instructed to consider the public health implications of their work, be they in charge of schools, the police, or housing. The city also enacted a host of new public health measures and tried (and failed) to go even further (such as the unsuccessful effort to ban the sale of supersized sugary drinks at convenience stores and fast food outlets). Although no longer in office, Bloomberg continues to push (and fund) efforts to dramatically cut soda drinking, and one result is a recent wave of cities (starting with Philadelphia) enacting soda taxes, which have significantly cut soda sales.
At the same time, however, this variable seems equally relevant in other political contexts and seems less persuasive in explaining why the United States might outperform its European public health counterparts. Indeed, other nations were developing health-in-all-policies strategies (and similar initiatives, such as health impact assessments) long before Bloomberg entered politics. Put simply, the need for public health entrepreneurship seems to be a common theme, as opposed to a distinctly American advantage.
The Transforming Medical Care System?
The US health care system is in the midst of a significant transformation. The system is consolidating, and the effort to get bigger is accompanied by an attempt to develop integrated delivery systems in which previously siloed sectors (the hospital, community clinic, office-based physician) all become part of a single organization. These new integrated delivery systems have access to data about utilization patterns and costs that far exceeds what has long been available. Meanwhile, payers are seeking to move away from fee-for-service payment and toward so-called value-based purchasing and, in so doing, are experimenting with efforts to put groups of providers at financial risk for the cost and quality of health care received by defined populations.
These developments present a window of opportunity for the public health community. There is a growing interest in so-called population health management, the effort to use data, incentives, and management tools to provide improved and targeted population-based preventive care. There are numerous efforts, for example, to use community needs assessment data to better target preventive services within particular communities. To be sure, most of these efforts are in their infancy, and the practice may not achieve the promise. For example, New York State received more than $8 billion in supplemental federal Medicaid funds as part of an effort to transform and improve the health delivery system for low-income New Yorkers. This effort will succeed only if health systems can develop novel strategies to improve population health metrics (while also reducing costs).
It is (far) too soon to tell if the trend toward population health management is a passing fad or a meaningful change. Moreover, it is too new a trend to explain any prior public health success. That said, however, it is clear that there is a new commitment to these efforts, by both public and private payers, and that this experiment offers an important opportunity for the public health community.
Public Health Moralism?
The most controversial issues in public health inevitably involve government interference in presumably private behavior. Given America's liberal roots, the public health proponent typically argues that unregulated private behavior can have negative public consequences, on both the health and the budget of the larger community. The motorcyclist without a helmet could well become the paraplegic on Medicaid. The chain smoker, soda drinker, and substance abuser are more likely to incur high lifetime health costs, largely paid by public dollars. At the same time, there also is an increased focus on the population health consequences of certain individual behaviors. By focusing on the impact of secondhand smoke, for example, the public health advocate both stigmatizes the smoker and makes the case that efforts to regulate or tax cigarette use are necessary for the overall public good.
Those opposed to these regulatory interventions often ground their ire in a moral reprimand of their own. The obese are the victims of their personal choices; their failure to eat healthy food and/or engage in regular exercise is their own fault. Perhaps a mild public education campaign is justified (“just say no” to sugary drinks), combined with the hope that the target audience will change its behavior. But addicts presumably will have no one to blame but themselves if they cannot resist the lure of the drug (or the drink, cigarette, or can of soda). And the effort to use the public “police power” to change behavior, perhaps by imposing a cap on the size of large sodas, allegedly moves us toward a nanny state in which Big Brother interferes with our individual liberty.
Over the past few decades, however, public health advocates have become increasingly successful in turning the moral arguments in their favor. The cigarette smoker, the drunk driver, and even the obese are increasingly stigmatized, thereby justifying public health interventions. Moreover, the moral judiciary sometimes shifts its binoculars away from the individual and toward a corporate villain. The tobacco industry was vilified for lying about the dangers of cigarettes. The food and soda industries are accused of marketing to children, who understandably fall prey to their manipulative strategies. Car manufacturers and coal companies are cited for deliberate deceit and actions that are harmful to the public good (Kersh and Morone 2002).
The public health debate thus often turns into competing moral claims. This is not a uniquely American phenomenon, but moral arguments can carry greater weight in the United States than in many of its peer countries. The power of the moral stigma and the focus on the corporate villain are both factors that have contributed to cuts in the percentages of smokers and drunk drivers. To be sure, moral suasion alone is usually not enough: cigarette taxes are perhaps the key driver of lower smoking rates in the United States, and jail sentences have had a similar impact on drinking and driving. That said, however, moralism plays an exceptionally large role in every aspect of American public health policy.
The Politics of Public Health: Some Cross-National Lessons
The politics of public health has common cross-national features, starting with the lack of a strongly mobilized constituency in its favor, in contrast to the traditional medical care system with its powerful and influential interests. Indeed, public health is politically invisible until a crisis (tainted food, an Ebola virus epidemic, a devastating hurricane), and it typically recedes from public view once the crisis ends. Making the job of public health advocates even harder is the difficulty of showing clear evidence of a return on investment for the population health dollar. It also is human nature to identify more immediately with individual misfortune in persons who look like “us” than with bad news about population-based threats that seem distant and hard to measure (such as the impact of climate change). And as nations become more economically secure, a cultural preference evolves for the latest biomedical procedures (that help sick people recover) over prevention and public health (to keep them from getting sick). For all of these reasons, high-income countries spend the vast majority of their health care dollars on acute and high-tech care, and funding for public health and prevention is both minimal and endlessly vulnerable to cost-cutting pressure.
In addition to these common cross-national themes, two factors distinctive to the US system further complicate the life of the public health professional. First is the antigovernment ethos that carries exceptional weight with much of the American citizenry: the view that government is far less competent than the private sector. Second is a growing distrust of science and the rejection by many of scientific warnings of public health dangers. Public health measures in the United States are often viewed as unacceptable challenges to individual rights rather than as necessary protections of the public good. In contrast, Europeans have a greater sense of social solidarity and are more accepting of the centralized welfare state, which presumably should encourage and enable a more proactive public health agenda.
Surprisingly, however, while hardly a public health leader (just consider the dismal and ineffective public health response in the United States to the opioid crisis), the United States also is not a public health laggard. What explains this unexpected outcome? The argument here is threefold. First, the US federal government has had a longer and deeper engagement in public health policy than expected, one that is on par with that of its presumably more centralized European counterparts. Second, US public health advocates have become increasingly successful in turning public health morality in their favor, stigmatizing the cigarette smoker, the drunk driver, and the obese (thus justifying public-sector interventions) while also framing the tobacco industry and other industrial giants as corporate villains. And these moral arguments are often more persuasive in the United States than in its European counterparts. Finally, the US health system is slowly evolving into large (and somewhat integrated) delivery systems that increasingly are at financial risk for the cost and quality of care received by defined populations. The resilience of this trend is unclear, as is its impact on public health politics. That said, however, it is an important variable to track going forward.
That the US public health system fares (modestly) well in a cross-national comparison may be a surprise, but it also is at best a pyrrhic victory: each of the nations discussed in this article would do well to dramatically increase its commitment and spending on public health and prevention. The argument for doing so is clear and compelling and is set forth persuasively in numerous white papers, policy memos, and academic treatises. The inevitable obstacle, however, is politics: interest group dynamics, cultural concerns, and the political invisibility of much of the public health enterprise. Only by overcoming these obstacles will we move toward better, fairer, and more effective ways of encouraging healthier societies.
It is difficult to state with certainty overall spending on prevention and public health, given uncertainties about what to include in the estimate. For example, garbage collection is generally not counted, though it surely is a public health necessity.
While the literature here is thin, there is some evidence along these lines, such as one study that found increases in public health spending linked to reductions in mortality rates (Mays and Smith 2011), a finding consistent with various case studies suggesting a positive return on investment for particular community-based initiatives (Trust for America's Health 2012).