Abstract

This article argues that the Cold War–era battle between information and uncertainty is a critical origin point for contemporary social theory–informed, data-intensive projects of the US national security state. Beginning in the 1950s, international relations experts and government officials turned to digital computing to help make decisions under the unavoidable pressures of geopolitical uncertainty. By the 1970s, their data banks of political knowledge and novel statistical tools purported to forecast political unrest long before an unaided human could. These efforts sparked a new epistemology of political knowledge, one that is now common in data science, in which designers and users prioritize correlation over causality and the instrumental management of problems over scholarly understanding or explanation. Far from a historical curiosity, this history is a warning. The sensibilities of Cold War technopolitical projects are continually rematerialized in contemporary computational security projects. Left unchallenged, their durability will continue to increase in tandem with the national security state's continued investment in computational social scientific projects for geopolitical management.

Since the late 2000s, the US military has invested millions of dollars in social theory–informed big data and machine learning tools that claim to monitor, predict, and mitigate international political unrest. For their advocates, these projects promise to deliver unprecedented insights and security via sociocultural data processing. Critics warn that they radically extend the US military's hubristic, dystopian quest for global dominance (González 2022). While the technological power the military harnesses for these projects is new, the quest to manage the messy world of geopolitics with digital data has a longer history. In this article, I argue that the Cold War–era battle between information and uncertainty is a critical origin point in the pursuit of data-driven global security. Cold War insecurities, furthermore, justified a new epistemology, one that we more commonly associate with today's algorithmic governance technologies. The Cold War challenge of deciding under conditions of ineradicable uncertainty, I argue, drove social scientists and government officials to embrace a managerial and predictive epistemology—the same epistemology that underpins contemporary computational security projects.

At first glance, this claim may seem counterintuitive. Decades of historical scholarship and popular culture have portrayed the security projects of the Cold War US state as symbols of Americans’ paranoid obsession with technoscientific and political certainty. Early Cold War technologies, such as the Air Force's Semi-Automatic Ground Environment (SAGE), a massive human-computer network that scanned the skies for nuclear attacks, and game-theoretic and rational choice approaches to military strategy that sanctioned mutually assured destruction, have become historical and historiographic icons. These shared reference points in the scholarship on science and the US state conjure a portrait of pointy-headed experts and stiff-spined national security officials bent on producing American security via totalizing networks of technological, intellectual, and political control (Erickson et al. 2013). The historian Paul Edwards (1996: 15) characterizes such projects as efforts to create a “closed world” in which information systems provided “total oversight . . . and technical-rational solutions to a myriad of complex problems.”

Yet, quests for certainty revealed certainty's limits. As Edwards (1996: 109) explains, “The closed world was a leaky container, constantly patched and repatched, continually sprouting new holes.” The obsession with security only accumulated more uncertainties. Misinterpreted radar signals, miscommunications in chains of command, and technological failures threatened to produce the very disasters that command technologies sought to avert. At the mundane level of bureaucracy, too, game theory and systems analysis failed to produce the certainty that national security officials craved. Even Charles Hitch, the RAND economist and systems analysis proselytizer, argued in 1960 that certainty was unattainable; he warned, “No other characteristic of decision making is as pervasive as uncertainty” (Lawson 2014: 46).

As a result, reasoning, deciding, and governing under conditions of uncertainty became a problem of existential significance. “Men must choose what to do,” intoned the decision theorist Ward Edwards, “in the absence of certain knowledge of their [decisions'] consequences” (1968: 34). Hitch suggested that the best solution in such circumstances was to amass “information to reduce uncertainty” (Lawson 2014: 46).

Seeking to understand the complexities of geopolitics and make decisions under the unavoidable weight of uncertainty, international relations experts turned to computers. In the 1960s they created enormous data banks of political knowledge and built automated analytical tools that located statistical patterns in machine-readable data. By the 1970s, those data banks became the centerpieces of computational tools for managing geopolitics. These projects sparked a new epistemology of political knowledge dominant in contemporary data science—the prioritization of correlation over causality and the management of problems over scholarly understanding or explanation. The origins of this promise of liberation through data are not merely a point of historical curiosity. They are a warning. The sensibilities of Cold War technopolitical projects are materialized in contemporary computational security projects. Left unacknowledged, their durability will only grow in tandem with the national security state's continuing investment in digital security projects.

The Informatics of Geopolitical Uncertainty

In 1966, the peace studies pioneer Kenneth Boulding proposed that to combat global insecurity, scholars and government officials should create “a world network of social data stations.” Akin to weather stations, each node would gather “social data”—information about changing demographics, economics, political relationships, and psychological states—and send them to a “centralized information processing centre” for analysis. Equipped with real-time data, analysts could identify “indices of hostility, perceptions of threat, changes in value systems,” and other information critical for preserving humanity in a nuclear-armed world (Hermann 1975: 242).

Boulding imagined that his data network would reduce global hostility by fostering international knowledge and understanding. Instead, national security–minded scholars and their patrons pursued a US-centric version of Boulding's vision. With funding from the US military, the Carnegie Corporation of New York, and the National Science Foundation, international relations experts began to build enormous banks of computer-readable data in the early 1960s. At the University of Michigan, Boulding's colleague J. David Singer launched the Correlates of War project. Insistent that “an epistemological revolution and a cascade of hard evidence” were necessary to achieve global security, Singer and his research team converted the legacies of nearly one hundred major international wars since 1816 into stacks of machine-readable punch cards (Singer 1979a: xiv). Analyzing relationships between variables like battlefield deaths, national military capability, geography, and conflict duration, they searched for statistical patterns that differentiated conflicts that escalated to international war from those that did not (Singer, Bremer, and Stuckey 1979: 266–67).
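To make the project's method concrete, here is a minimal sketch, in modern Python, of the kind of correlational analysis Singer's team performed with punch cards and mainframes: measuring associations between conflict attributes and escalation without positing any causal model. The variables and figures below are invented for illustration and do not reproduce the project's actual data or codebook.

```python
# Hypothetical sketch of Correlates of War-style analysis: search for
# statistical associations, not causal laws. All data are invented.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented records: (capability ratio between disputants, dispute duration
# in months, escalated to war? 1/0) -- stand-ins for punch-card variables.
disputes = [
    (1.2, 3, 0), (3.5, 14, 1), (0.8, 2, 0), (2.9, 9, 1),
    (1.1, 5, 0), (4.2, 20, 1), (1.6, 4, 0), (2.2, 11, 1),
]
capability = [d[0] for d in disputes]
escalated = [d[2] for d in disputes]
print(f"capability vs. escalation: r = {pearson(capability, escalated):.2f}")
```

The output is a correlation coefficient and nothing more: exactly the kind of finding that, as Singer insisted, stops short of causal explanation.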

At the University of Southern California, scholars funded by the Defense Department's Advanced Research Projects Agency (DARPA)1 built a data bank named the World Event-Interaction Survey (WEIS). Guided by the historian turned international relations expert Charles McClelland, they coded the New York Times daily, digitally recording the features of all newsworthy (to Americans) communications and behaviors, from trade agreements to bullets fired, that passed between the world's 160-plus nations. Subjecting these data to a variety of computational analyses, researchers sought the "normal" rates and types of communications exchanged between nations, knowledge that would help them anticipate instability by identifying deviations from normal affairs.2
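The coding procedure can be pictured with a schematic sketch: each newsworthy interaction becomes a structured record, and tallies of records per pair of nations supply the "normal" rates against which deviations stand out. The record fields and category labels below are hypothetical simplifications of my own, not WEIS's actual codebook.

```python
# Schematic, hypothetical reconstruction of WEIS-style event coding.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    date: str      # e.g., "1967-03-27"
    actor: str     # nation initiating the act
    target: str    # nation receiving it
    category: str  # coded type of communication or behavior

# A few invented records coded from (imaginary) news stories.
events = [
    Event("1967-03-01", "USA", "USSR", "accuse"),
    Event("1967-03-04", "USSR", "USA", "protest"),
    Event("1967-03-09", "USA", "USSR", "propose_agreement"),
    Event("1967-03-15", "USA", "USSR", "accuse"),
]

# Tally counts per directed dyad and category: the raw material from
# which "normal" rates, and deviations from them, were computed.
baseline = Counter((e.actor, e.target, e.category) for e in events)
for (actor, target, category), n in baseline.items():
    print(f"{actor} -> {target}: {category} x{n}")
```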

These were just two among dozens of projects that yoked geopolitical knowledge to new information technologies (Burgess and Lawton 1972). Unlike the creators of closed-world projects, who held out the promise of certain security, the scholars who developed computational approaches to geopolitics insisted that the objects of their studies could never be known with certainty. Singer's assertion that he sought the “correlates,” not the causal laws, of war captures this insistence on scientific uncertainty. He explained that the name was an honest reflection of what “scientists can and cannot do.” He insisted, “There is nothing like the pursuit of systematic empirical research to cure one of theoretical certitude” (Singer 1979a: xiv, xviii).

Singer's abandonment of certitude marks a subtle but important departure from long-standing epistemic commitments in the social sciences that emphasized the systematic pursuit of causal knowledge as the key to governing an uncertain world. As Ian Hacking (1990: 108) explains, the “crucial step in the taming of chance” in the nineteenth century was the transformation of “descriptive knowledge of large-scale regularities into laws of nature and society that dealt in underlying truths and causes.” That causal spirit was a powerful presence in academic social science in the 1950s and 1960s. The architects of the behavioral revolution, such as David Easton and Robert Dahl, insisted that the study of politics should model itself on the quantitative methods of “modern natural science” (Gunnell 1993: 224), in which numbers would yield a “robust body of tested propositions” (Gunnell 1993: 214). The causal spirit was not exclusively quantitative; the qualitative study of politics, with its emphasis on history and context, also sought causal theories that could inform policy (Rohde 2021: 360–66).

For scholars who advocated computational approaches to international politics, both the systematic methods of natural science and causal theory implied a level of certainty that was unattainable in the study of politics. The political scientist Morton Kaplan (1968: 1–2) explained that traditional scientific methods were contrived for understanding closed, deterministic mechanical systems. They were “incapable of coping with” social systems, which were open, emergent, and shaped by human purpose.

Kaplan's argument turned on cybernetic theories that influenced social thought after World War II (Heyck 2015). According to Kaplan, McClelland, and others, international affairs was best understood as a system of complex, adaptive systems capable of self-direction and operating in dynamic relationships with their environments. The international system was no clocklike universe of tightly coupled mechanical relationships. It was indeterminate, McClelland explained: “A complex of partial couplings, of localized properties, of stochastic processes.”3

Cybernetic theory encouraged “a shift toward ‘data-driven’ research” in the social sciences (Halpern 2014: 148). Scholars sought to claw back uncertainties, reveal geopolitical patterns, and anticipate futures by creating, processing, and analyzing data. They invested decades creating massive data banks of “real-time” and historical political data: coding daily news sources and yellowed historical documents, scavenging thousands of reports from international organizations and government agencies for nuggets of information whose preservation, aggregation, and analysis might reveal the present and future contours of geopolitics (Rohde 2017).

Yet, they stressed that the fruits of their labor would always remain uncertain. Government secrecy, the lack of reliable sources about small and poor countries, and the absence of international consensus on data categories meant that their datasets were never completely reliable (Russett et al. 1964: 2–4; Mitchell 1968: 305). “Fugitive data”—the term for missing, lost, or otherwise uncollectable records—made it “impossible to produce” a complete dataset (Taylor and Hudson 1972: 2). The World Handbook of Political and Social Indicators, a data bank project funded by the National Science Foundation and the Defense Department and spearheaded by the era's leading political scientists at Yale, amassed an impressive seventy thousand machine-readable records culled from nearly sixty thousand sources. But an additional ten thousand data points were irretrievably missing (Taylor and Hudson 1972: 1, ix).

The compilers of geopolitical data amassed quantitative records, but they were guided less by a trust in numbers than by a trust in computerized data, quantitative and qualitative alike—a faith in the patterns, trends, and correlations that their computers purported to reveal. Some, but hardly all, of the data that researchers created were quantitative; McClelland's database of political communications, for example, represented behavior in machine-readable code rather than numbers. Computational data benefited from the status of numbers, but its advocates insisted on the distinction between data analysis and quantification. Quantification implied "the precision of mathematical models," Klaus Knorr and Sidney Verba (1961: 3) explained, a precision that was highly prestigious but inappropriate to the always provisional status of knowledge about international politics.

As international affairs experts embraced computational approaches, they began to construct an epistemology of data and information processing. Though their approach benefited from the legitimacy conferred by computing's association with quantification, they stressed the fundamental unknowability of the object of their studies. Cybernetic feedback in political systems, driven by the unpredictable workings of fallible, willful humans, involved subjectivities that could not be captured with certainty, quantitative or otherwise. Provisionality, expressed in data patterns and trends, was the best they could offer.

From Knowing to Managing

In the 1970s and 1980s, projects designed to cope with the uncertainties of national security via information technologies moved from academia to the corridors of the Pentagon. Under the increased pressures of government contracts and in close proximity to security logics, these projects helped create a new epistemology of managerialism, one that privileged prediction and management over causality, understanding, and long-standing scientific notions of truth. In 1973, DARPA's Cybernetics Technology Office launched the Crisis Management Program (CMP). The decade-long, multimillion-dollar effort merged social science and information technology to create "interactive, user-oriented, computer-based" systems that monitored international political and military events, provided probabilistic crisis forecasts, and helped national security analysts make decisions (Hopple 1980: 11; Rohde 2017). Following on the heels of the Pentagon's failures in Vietnam, DARPA's investment in computation might seem counterintuitive. The Defense Department had failed to sell the military and the American public on the "appearance of highly rational, scientific warfare" (Gibson 1988: 124). Officials had counted battalion days deployed, helicopter missions flown, artillery rounds fired, villages secured, and, most infamously, dead bodies—whole or partial—as the evidence through which US forces might measure their success. An apocryphal story conveyed the bitter results. In 1967, so the story went, analysts asked a computer in the Pentagon's basement, "When will we win in Vietnam?" The machine's answer: "You won in 1965" (Madrigal 2017).

Yet, rather than widening the scope of knowledge that might be relevant to conflict—the contextual knowledge provided by history or area studies, the diplomat's trained intuition, or the peace advocate's moral reasoning, for example—many security experts doubled down on data (Edwards 1996: 141–44). DARPA nixed its investments in social research save for projects to develop "state-of-the art technologies from the social, behavioral, and computer sciences."4 The CMP funded academics and defense contractors who boasted equal expertise in computing and behavioral science. They converted the databases and statistical tools of the 1960s into an automated probabilistic forecasting system that by the late 1970s claimed to monitor international political and military events and anticipate crises automatically. The Early Warning and Monitoring System (EWAMS) continually measured communications between all of the nations in the WEIS database of coded news stories, which was updated daily via ARPANET. Its algorithms scanned for deviations from "normal" baseline communications between nations. If the volume and types of communications indicated abnormalities reaching critical thresholds, the system issued an alert (Rothe 1982).
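The alert logic described here can be approximated in a few lines. The sketch below assumes a z-score test against a historical baseline with an arbitrary threshold; EWAMS's actual statistics are not documented in this article, and all figures are invented.

```python
# Minimal sketch of deviation-from-baseline alerting, under assumed
# mechanics (z-score test, arbitrary threshold). Data are invented.
from statistics import mean, stdev

def check_alert(history, current, threshold=3.0):
    """Flag `current` if it deviates from the historical baseline by
    more than `threshold` standard deviations."""
    mu, sigma = mean(history), stdev(history)
    z = (current - mu) / sigma if sigma else 0.0
    return z, abs(z) > threshold

# Weekly counts of hostile communications for one hypothetical dyad.
weekly_counts = [4, 6, 5, 7, 5, 6, 4, 5]
z, alert = check_alert(weekly_counts, current=19)
print(f"z = {z:.1f}, alert = {alert}")  # a spike far above baseline
```

Whatever the real thresholds were, the design goal the contractors described is visible even in this toy version: the user sees an alert, not the statistics behind it.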

In 1980, contractors declared that their system produced geopolitical foresights more reliable than those that a human expert, such as a seasoned strategic intelligence analyst, could provide. In retrospective tests, the system forecast real-world crises eight times out of ten (Hopple 1980: 39–40). In tests of its real-world value, it provided "real-time . . . , unambiguous warning" of the outbreak of the Iran-Iraq war.5 Its architects boasted that the system was "simplified enough to allow the user to make few decisions, press few buttons, and have little interaction with the computing machinery or the statistical techniques" (Rothe 1982: 9).

The Joint Chiefs of Staff, the National Security Council, and various military intelligence offices installed the system. A companion software package created by the defense contractor Consolidated Analysis Corporation Inc.6 and named “ExecutiveAids” helped analysts decide what to do when the system issued warnings. A searchable database of three hundred recent crises involving the United States, ExecutiveAids offered historical analogues to contemporary security challenges, identified potential policy actions, and indicated their possible consequences (Andriole and Daly 1979: 48). Users did not need to understand how the systems worked or even what data they collected and parsed. The CMP promised predictive instrumentality, the ability to act on the basis of correlation rather than understanding.
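A speculative sketch of this kind of analogue retrieval: describe the current crisis with a few coded features and rank stored cases by similarity. The feature names, cases, and distance metric are stand-ins of my own devising, not ExecutiveAids's actual design.

```python
# Hypothetical nearest-neighbor retrieval of historical crisis analogues.
from math import dist

# (crisis label, (hostility level, superpower involvement, duration in days))
# -- all invented.
historical_cases = [
    ("Case A", (0.9, 1.0, 14.0)),
    ("Case B", (0.4, 0.0, 90.0)),
    ("Case C", (0.8, 1.0, 21.0)),
]

def nearest_analogues(query, cases, k=2):
    """Rank stored crises by Euclidean distance to the query's features."""
    return sorted(cases, key=lambda case: dist(query, case[1]))[:k]

# A current crisis, coded on the same features.
for label, features in nearest_analogues((0.85, 1.0, 18.0), historical_cases):
    print(label, features)
```

Note what the user never needs: any account of why the analogues are relevant, only that they are nearby in feature space.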

The CMP was a material incarnation of the social scientific conviction that "decision making is essentially an information-processing operation" (Singer 1979b: 134). Its forecasting methodology was similar to traditional probabilistic calculations that located risk in statistical regularities across aggregated aberrant events, like suicides and plane crashes. These forms of probability, however, typically rested on systematically assembled datasets and rigorously verified methods (Porter 1986; Bouk 2015). The CMP dataset, which captured global politics in coded extracts of English-language newspapers published in the United States and United Kingdom, was hardly an impartial, systematic representation of world politics. The system recapitulated in code the "white world order" that had long structured American international relations (Vitalis 2015). Only data that reflected American security concerns, in the form of rising conflicts that might pull the superpowers into confrontation, mattered. Data and values that did not fit into the framework of bipolar rivalry—countries' quests for self-determination, the creation of regional solidarities, demands for human and economic rights—were excluded and invisible. The CMP's indifference to the provenance, original purpose, or quality of its data was part of the new epistemic, managerial culture that emerged with early computational social science projects.

So too was the project's indifference to scientific standards of verification or the generation of causal knowledge. Whereas traditional risk calculations and forecasts were rooted in rigorously tested statistical regularities, the CMP's foundation was an imaginative, speculative, and untested theory that world politics operated like a cybernetic system. All the verification it required was that it produced reliable geopolitical forecasts most of the time. When academic political scientists questioned DARPA's claims that the agency had sufficiently validated the CMP's forecasting capabilities, agency officials accused them of obsessing over the "formal, sterile" development of "empirical theory in the classical sense."7 The head of the forecasting project—himself a PhD in political science—explained that they had no need "to justify ourselves theoretically beyond checking our 'assumptions' with the intuition and experiences of those we are trying to help."8 Nor did they use the system to seek knowledge about the causes of conflict itself or the possibilities for building a world other than one of sustained bipolar confrontation.

DARPA's computational political project was a site in the emergence of what the historian Matthew L. Jones (2018) aptly terms an "instrumentalist culture of prediction," in which prediction and instrumental action supersede the "scientific culture of truth." As early as the 1950s, researchers in pattern recognition and high-dimensional data processing, many of them funded by defense and intelligence agencies, "prized prediction over knowledge about an interpretable, causal, or mechanistic model" (Jones 2018: 674; Mackenzie 2016). Many of the game theorists who dominated nuclear strategy, too, participated in this culture of speculative, predictive management of complex decision problems. By the 1960s, game theory in the human sciences was "less a grand positive theory of social interaction" than a heuristic method for simplifying complex decisions (Erickson 2015: 21; Heyck 2015). The culture of prediction also reflected the preoccupation with uncertainty that inspired cybernetic and computational approaches to international affairs. As Andrew Pickering explains, cybernetics offered clues for "get[ting] along in a world that . . . could not be subjugated to human designs" (Rindzevičiūtė 2016: 208).

The conviction that uncertainty was a fact of international affairs reinforced researchers’ and government officials’ increasing indifference to causal and theoretical knowledge. The architects of computational international affairs insisted that they would never be able to “calculate just when, where, and in what way a particular event will occur” (McClelland 1969: 2–3). After all, they could “neither conduct controlled experiments nor reduce the number of variables that constantly confound our analyses.”9 With truth and certainty perpetually beyond reach, it was better not to bother with them at all. Indeed, in the fast-paced, life-or-death realm of international politics, DARPA's researchers argued, the search for “objective truth” was an obstacle to “solving real world problems,” a luxury that the United States, and by extension the world, could not afford (Andriole 1981: 20). Managing trumped knowing.

In the meantime, projects caught between the old epistemology of causal truth and the emergent epistemology of data processing toiled on in relative obscurity. By the early 1980s, Singer's Correlates of War project had generated over eighty publications, but had not yet yielded “a data-based multi-factor explanatory theory of war in which we have high confidence.”10 Its datasets continued to expand to include more types of conflict—nonstate, intra-state, and extra-state—as well as religion, diplomatic exchanges, trade relationships, and geography.11 Singer retired in 2001, but the project continues its efforts to characterize the “temporal and spatial variation in war” as war's laws remain elusive.12

Doing More, Knowing Less

As Sun-Ha Hong (2021: 1944) writes, “Technofutures leave their mark not so much by delivering on every bold prediction, but by orienting diverse communities toward a common way of talking, thinking, and planning in the present.” The simultaneous pursuit of total national security and the assertion of certainty's impossibility is not just a central paradox at the intersection of science and the US state; it continues to be one of its sustaining features. The Cold War is long over, but US national security agencies have continued to advance that era's technopolitical projects, chasing data to plug the leaks produced by the quest to secure a world that cannot be closed. The network-centric vision of warfighting that inspired the 1990s Revolution in Military Affairs, the National Security Agency's bulk data analysis projects in the 2000s, and the merger of anthropology and computation in the Army's Human Terrain System during the US wars in Afghanistan and Iraq still promised information solutions to the problems of managing and deciding under conditions of uncertainty (Lawson 2014; González 2022).

The instrumental epistemic and political cultures of computational governance have troubling implications for knowledge, political authority, and public moral reasoning. Computer systems have not replaced experts or security bureaucrats, but the forms that they take increasingly locate expert legitimacy and government authority in information systems themselves. Like DARPA's Crisis Management Program, today's networked terminals promise to liberate analysts and decision-makers from much of the labor of knowing, learning, or understanding. Faced with uncertainty, ignorance has become intentional, excusable, and efficient (Suchman 2022).

The path to “the algorithm told me to do it,” whether for a visa application approval or a drone strike, was laid decades ago. The emphasis on management over understanding, predicting over knowing, has not merely propelled the Cold War national security status quo into the present; it has made it seem an inevitable, even unquestionable appendage of the information revolution. The idea that things might be otherwise—that the causes of insecurity and conflict might be mitigated through scholarly and policy understanding; that there might be other, more collaborative and peaceful possibilities for global futures—continues to recede further and further beyond the knowledge or imagination of mainstream geopolitical expertise.

Many thanks to the members of the IAS Science and the State seminar for their generous and thoughtful feedback. Special thanks to Joshua Barkan, Christo Sims, and Perrin Selcer for reading multiple iterations of this article.

Notes

1. Originally ARPA, the agency added the D in 1972 to signal the centrality of defense concerns and applications in its projects. For continuity, I refer to the agency as DARPA throughout this article.

2. Charles A. McClelland, "World Event-Interaction Survey: A Research Project on the Theory and Measurement of International Interaction and Transaction," March 27, 1967, p. 3. Proposals AO 1066—University of Southern California file, box 4, ARPA DSOSSD Accession 330-74-74, HRRO Quantitative Political Science Projects, Record Group 330, National Archives II, College Park, MD.

3. Charles A. McClelland, "Useful Applications of International Event Analysis," 1974, p. 2. Southern California 2518 Management Reports file, box 15, Accession 330-78-0073, R&D Studies Sponsored by ARPA/Cybernetics Technology Office, Record Group 330, National Archives II, College Park, MD.

4. "Memorandum for Director, Program Management," August 4, 1977. MRAO, AOs, Props, & AO Amendments, AO 3196 Decisions & Designs—Crisis Early Warning and Monitoring System file, box 1, DARPA DSOSSD 330-83-0138, Crisis Management Program Files, RG 330, National Archives II, College Park, MD.

5. International Public Policy Research Corporation, "R&D Status Report, 1 October 1980–31 December 1980," 1. Technical Reports folder, AO 3295—IPPRC—Early Warning and Monitoring System, box 2, DARPA Crisis Management Program Files, DARPA DSOSSD 330-83-0138, RG 330, National Archives II, College Park, MD.

6. Today it is better known as CACI, a military contracting company with the tagline "Ever Vigilant" and billions of dollars in annual revenue; www.caci.com (accessed November 11, 2022).

7. Stephen J. Andriole to Raymond Tanter, September 30, 1977. Correspondence, AO 3182 University of Michigan—Conflict Outcomes file, box 4, DARPA Crisis Management Program Files, DARPA DSOSSD 330-83-0138, RG 330, National Archives II, College Park, MD.

8. Stephen J. Andriole to Raymond Tanter, January 4, 1978. Correspondence, AO 3182 University of Michigan—Conflict Outcomes file, box 4, DARPA Crisis Management Program Files, DARPA DSOSSD 330-83-0138, RG 330, National Archives II, College Park, MD.

9. Ronald Young to George Heilmeier, November 10, 1975, p. 2. Correspondence AO 2818 University of Maryland Components of International Behavior file, box 1, DARPA DSOSSD 330-83-0138, Crisis Management Program Files, RG 330, National Archives II, College Park, MD.

10. "Correlates of War Project—University of Michigan, Bibliography," May 1982, 1. J. David Singer file, box 12, Papers of Karl Wolfgang Deutsch, HUGFP 141.xx, Harvard University Archives, Cambridge, MA.

11. "Correlates of War: Data Sets," correlatesofwar.org/data-sets (accessed July 4, 2022).

12. "Correlates of War: History," correlatesofwar.org/history (accessed July 4, 2022).

References

Andriole, Stephen J. 1981. "The Quiet Revolution in the Scientific Study of International Politics and Foreign Policy." Policy Sciences 14, no. 1: 1–22.

Andriole, Stephen J., and Daly, Judith Ayres. 1979. "Potential Applications of Computer-Based Crisis Management Aids to Problems of Physical Security." In The Role of Behavioral Science in Physical Security, edited by Kramer, Joel J., 47–74. Washington, DC: US Government Printing Office.

Bouk, Dan. 2015. How Our Days Became Numbered: Risk and the Rise of the Statistical Individual. Chicago: University of Chicago Press.

Burgess, Philip M., and Lawton, Raymond W. 1972. "Indicators of International Behavior: An Assessment of Events Data Research." Sage Professional Papers in International Studies Series 1, no. 10: 1–96.

Edwards, Paul N. 1996. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, MA: MIT Press.

Edwards, Ward. 1968. "Decision Making: Psychological Aspects." In vol. 4 of International Encyclopedia of the Social Sciences, edited by Sills, David L., 34–42. New York: Macmillan and Free Press.

Erickson, Paul. 2015. The World the Game Theorists Made. Chicago: University of Chicago Press.

Erickson, Paul, Klein, Judy L., Daston, Lorraine, Lemov, Rebecca, Sturm, Thomas, and Gordin, Michael D. 2013. How Reason Almost Lost Its Mind: The Strange Career of Cold War Rationality. Chicago: University of Chicago Press.

Gibson, James William. 1988. The Perfect War: The War We Couldn't Lose and How We Did. New York: Vintage.

González, Roberto J. 2022. War Virtually: The Quest to Automate Conflict, Militarize Data, and Predict the Future. Berkeley: University of California Press.

Gunnell, John G. 1993. The Descent of Political Theory: The Genealogy of an American Vocation. Chicago: University of Chicago Press.

Hacking, Ian. 1990. The Taming of Chance. Cambridge: Cambridge University Press.

Halpern, Orit. 2014. Beautiful Data: A History of Vision and Reason since 1945. Durham, NC: Duke University Press.

Hermann, Charles F. 1975. "Indicators of International Political Crises: Some Initial Steps toward Prediction." In Theory and Practice of Events Research: Studies in Inter-nation Actions and Interactions, edited by Azar, Edward E. and Ben-Dak, Joseph D., 233–44. New York: Gordon and Breach.

Heyck, Hunter. 2015. Age of System: Understanding the Development of Modern Social Science. Baltimore: Johns Hopkins University Press.

Hong, Sun-Ha. 2021. "Technofutures in Stasis: Smart Machines, Ubiquitous Computing, and the Future That Keeps Coming Back." International Journal of Communication, no. 15: 1940–60.

Hopple, Gerald W. 1980. Internal and External Crisis Early Warning and Monitoring. Interim Technical Report. McLean, VA: International Public Policy Research Corporation.

Jones, Matthew L. 2018. "How We Became Instrumentalists (Again): Data Positivism since World War II." Historical Studies of the Natural Sciences 48, no. 5: 673–84.

Kaplan, Morton A. 1968. "Traditionalism vs. Science in International Relations." In New Approaches to International Relations, edited by Kaplan, Morton A., 1–18. New York: St. Martin's.

Knorr, Klaus, and Verba, Sidney. 1961. "The International System: Theoretical Essays; Introduction." World Politics 14, no. 1: 1–5.

Lawson, Sean T. 2014. Nonlinear Science and Warfare: Chaos, Complexity, and the US Military in the Information Age. London: Routledge.

Mackenzie, Adrian. 2016. "Simulate, Optimise, Partition: Algorithmic Diagrams of Pattern Recognition from 1953 Onwards." In Cold War Legacies: Systems, Theory, Aesthetics, edited by Beck, John and Bishop, Ryan, 50–69. Edinburgh: Edinburgh University Press.

Madrigal, Alexis C. 2017. "The Computer That Predicted the US Would Win the Vietnam War." Atlantic, October 5. https://www.theatlantic.com/technology/archive/2017/10/the-computer-that-predicted-the-us-would-win-the-vietnam-war/542046/.

McClelland, Charles A. 1969. "International Interaction Analysis in the Predictive Mode." Technical Report no. 3. January. Mimeo. Los Angeles: University of Southern California.

Mitchell, Robert E. 1968. "Information Storage and Retrieval: Information Services." In vol. 7 of International Encyclopedia of the Social Sciences, edited by Sills, David L., 304–13. New York: Macmillan and Free Press.

Porter, Theodore M. 1986. The Rise of Statistical Thinking, 1820–1900. Princeton, NJ: Princeton University Press.

Rindzevičiūtė, Egle. 2016. The Power of Systems: How Policy Sciences Opened Up the Cold War World. Ithaca, NY: Cornell University Press.

Rohde, Joy. 2017. "Pax Technologica: Computers, International Affairs, and Human Reason in the Cold War." Isis 108, no. 4: 792–813.

Rohde, Joy. 2021. "War." In Society on the Edge: Social Science and Public Policy in the Postwar United States, edited by Fontaine, Philippe and Pooley, Jefferson D., 358–85. Cambridge: Cambridge University Press.

Rothe, Frederick A. 1982. Final Report of the Early Warning and Monitoring System (EWAMS) Project. McLean, VA: International Public Policy Research Corporation.

Russett, Bruce M., Alker, Hayward R., Jr., Deutsch, Karl W., and Lasswell, Harold D. 1964. World Handbook of Political and Social Indicators. New Haven, CT: Yale University Press.

Singer, J. David. 1979a. "Introduction." In Research Origins and Rationale, edited by Singer, J. David, xi–xix. Vol. 1 of The Correlates of War. New York: Free Press.

Singer, J. David. 1979b. "The Scientific Study of Politics: An Approach to Foreign Policy Analysis." In Research Origins and Rationale, edited by Singer, J. David, 133–44. Vol. 1 of The Correlates of War. New York: Free Press.

Singer, J. David, Bremer, Stuart A., and Stuckey, John. 1979. "Capability Distribution, Uncertainty, and Major Power War." In Research Origins and Rationale, edited by Singer, J. David, 265–97. Vol. 1 of The Correlates of War. New York: Free Press.

Suchman, Lucy. 2022. "Imaginaries of Omniscience: Automating Intelligence in the US Department of Defense." Social Studies of Science (June). https://doi.org/10.1177/03063127221104938.

Taylor, Charles Lewis, and Hudson, Michael C. 1972. World Handbook of Political and Social Indicators. 2nd ed. New Haven, CT: Yale University Press.

Vitalis, Robert. 2015. White World Order, Black Power Politics: The Birth of American International Relations. Ithaca, NY: Cornell University Press.