Infectious diseases are a long-standing and continuing threat to health and welfare, and their containment depends on national disease surveillance and response capacities. This article discusses infectious disease surveillance in the United States and the United Kingdom, examining the historical national traditions for identifying and controlling infectious disease risks and how globalization and technical advances have influenced the evolution of each country's approach. The two systems developed in different but parallel ways. In the United States, surveillance remained largely localized at the state level until the early twentieth century and still retains many of those features. The U.K. approach became centralized in the latter part of the nineteenth century and has principally remained so. In both cases, disease surveillance was traditionally conceived as a public good, with national or local authorities holding sovereign rights and powers to protect public health. With the increasingly globalized nature of infectious disease, such notions have shifted toward surveillance as a global public good, and countries have responded by creating new global health governance arrangements and regulations. However, the limitations of current surveillance systems and the strong hold of national interests call into question the provision of surveillance as a global public good. These issues are further highlighted by the introduction of new surveillance technologies, which offer opportunities for improved disease detection and identification but also create potential tensions between individual rights, corporate profit, equitable access to technology, and national and global public goods.
