Corpus ID: 16065586

Why Quantitative Probability Assessments Are Empirically Justifiable in Foreign Policy Analysis

  • Jeffrey A. Friedman, Joshua D. Baker, Philip E. Tetlock, Richard J. Zeckhauser
Aristotle counseled us to seek precision insofar as the nature of the subject permits. But how much is too much? This article provides the first systematic test of long-standing debates about how precisely foreign policy analysts can estimate probabilities. Using a data set of 888,328 forecasts drawn from a series of geopolitical forecasting tournaments, we demonstrate that qualitative probability assessments, including seven-step scales employed by U.S. intelligence analysts, systematically… 
2 Citations


Adopting and improving a new forecasting paradigm

  • Ian Speigel
  • Computer Science
    Intelligence and National Security
  • 2021
This article proposes improving the forecast-verification paradigm of the Allied intelligence community (IC) by recognizing that IC forecasts typically pertain to complex situations and therefore require the tools, methods, and concepts of complexity science.



Accuracy of forecasts in strategic intelligence

Significance: Forecasting is a vital part of strategic intelligence, offering policy makers indications about probable future conditions and aiding sound decision making. Nevertheless, there has not…

Change the Analyst and Not the System: A Different Approach to Intelligence Reform

Recent intelligence failures, including first and foremost the mistaken estimate of Iraq's weapons of mass destruction (WMD) prior to the war, show that a prime source of such failures is the…

Reducing Uncertainty: Intelligence Analysis and National Security

The US government spends billions of dollars every year to reduce uncertainty: to monitor and forecast everything from the weather to the spread of disease. In other words, we spend a lot of money to

Towards a Reasonable Standard for Analysis: How Right, How Often on Which Issues?

This article takes the view that largely impossible standards have been imposed on intelligence analysis, chiefly for political reasons stemming from the 9/11 attacks and Iraqi WMD. The article…

Communicating Uncertainty in Intelligence and Other Professions

Recent events have focused new attention on the need for intelligence professionals to present alternative hypotheses to policymakers in a way that makes clear the uncertainties in the evaluation and…

Superforecasting: The Art and Science of Prediction

A New York Times bestseller and an Economist Best Book of 2015. "The most important book on decision making since Daniel Kahneman's Thinking, Fast and Slow" (Jason Zweig, The Wall Street Journal). Everyone…

Expert Political Judgment: How Good Is It? How Can We Know?

Author: Philip E. Tetlock is a psychologist and Professor of Leadership at the Haas School of Business, University of California, Berkeley. The book combines several of his research…

Assessing Uncertainty in Intelligence

Current tradecraft methods attempt to eliminate uncertainty in ways that can impede the accuracy, clarity, and utility of estimative intelligence. A focus on assessing uncertainty suggests solutions to these problems and provides a promising analytic framework for thinking about estimative intelligence in general.

Psychological Strategies for Winning a Geopolitical Forecasting Tournament

In a two-year geopolitical forecasting tournament that produced the best forecasts two years in a row, support is found for three psychological drivers of accuracy: training, teaming, and tracking.

Expert Political Judgment: How Good Is It? How Can We Know?

  • G. Gaus
  • Psychology
    Perspectives on Politics
  • 2007
Expert Political Judgment: How Good Is It? How Can We Know? By Philip E. Tetlock. Princeton: Princeton University Press, 2005. 352p. $45.00 cloth, $19.95 paper. This is a wonderful and important…