Philip Tetlock

About Philip Tetlock
Philip E. Tetlock (born 1954) is a Canadian-American political scientist and is currently the Annenberg University Professor at the University of Pennsylvania, where he is cross-appointed at the Wharton School and the School of Arts and Sciences.
He has written several non-fiction books at the intersection of psychology, political science and organizational behavior, including Superforecasting: The Art and Science of Prediction; Expert Political Judgment: How Good Is It? How Can We Know?; Unmaking the West: What-if Scenarios that Rewrite World History; and Counterfactual Thought Experiments in World Politics. Tetlock is also co-principal investigator of The Good Judgment Project, a multi-year study of the feasibility of improving the accuracy of probability judgments of high-stakes, real-world events.
Wikipedia: https://en.wikipedia.org/wiki/Philip_E._Tetlock
CV: https://www.dropbox.com/s/uorzufg1v0nhcii/Tetlock%20CV%20%20march%2018%2C%202016.docx?dl=0
Twitter: https://twitter.com/PTetlock
LinkedIn: https://www.linkedin.com/in/philip-tetlock-64aa108a?trk=hp-identity-name
Interview: https://www.edge.org/conversation/philip_tetlock-how-to-win-at-forecasting
Books By Philip Tetlock
In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people--including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer--who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They've beaten other benchmarks, competitors, and prediction markets. They've even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters."
The authors show us how we can learn from this elite group. Weaving together stories of forecasting successes (the raid on Osama bin Laden's compound) and failures (the Bay of Pigs) and interviews with a range of high-level decision makers, from David Petraeus to Robert Rubin, they show that good forecasting doesn't require powerful computers or arcane methods. It involves gathering evidence from a variety of sources, learning to think probabilistically, working in teams, keeping score, and being willing to admit error and change course.
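The "keeping score" the authors describe is literal: tournament forecasts were graded against outcomes using Brier scores, the mean squared difference between stated probabilities and what actually happened. A minimal sketch in Python of that kind of scoring, with invented example forecasts and a hypothetical brier_score helper (not code from the book):

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between probability forecasts and 0/1 outcomes.

    0.0 is perfect, 0.25 is what unwavering 50/50 guessing earns, and 1.0 is
    maximally wrong. (The two-category variant used in the Good Judgment
    Project is exactly double this number.)
    """
    if len(forecasts) != len(outcomes):
        raise ValueError("forecasts and outcomes must have the same length")
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)


# Invented example: probabilities a forecaster assigned to four events,
# and whether each event happened (1) or not (0).
forecasts = [0.9, 0.7, 0.2, 0.6]
outcomes = [1, 1, 0, 0]
print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")  # -> 0.125
```

Tracking a number like this over many questions is what lets forecasters admit error and change course, rather than retroactively declaring vague predictions correct.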
Superforecasting offers the first demonstrably effective way to improve our ability to predict the future--whether in business, finance, politics, international affairs, or daily life--and is destined to become a modern classic.
Since its original publication, Expert Political Judgment by New York Times bestselling author Philip Tetlock has established itself as a contemporary classic in the literature on evaluating expert opinion.
Tetlock first discusses arguments about whether the world is too complex for people to find the tools to understand political phenomena, let alone predict the future. He evaluates predictions from experts in different fields, comparing them to predictions by well-informed laity or those based on simple extrapolation from current trends. He goes on to analyze which styles of thinking are more successful in forecasting. Classifying thinking styles using Isaiah Berlin's prototypes of the fox and the hedgehog, Tetlock contends that the fox--the thinker who knows many little things, draws from an eclectic array of traditions, and is better able to improvise in response to changing events--is more successful in predicting the future than the hedgehog, who knows one big thing, toils devotedly within one tradition, and imposes formulaic solutions on ill-defined problems. He notes a perversely inverse relationship between the best scientific indicators of good judgment and the qualities that the media most prizes in pundits--the single-minded determination required to prevail in ideological combat.
Clearly written and impeccably researched, the book fills a huge void in the literature on evaluating expert opinion. It will appeal across many academic disciplines as well as to corporations seeking to develop standards for judging expert decision-making. Now with a new preface in which Tetlock discusses the latest research in the field, the book explores what constitutes good judgment in predicting future events and looks at why experts are often wrong in their forecasts.
In Counterfactual Thought Experiments in World Politics, Tetlock and coeditor Aaron Belkin take up a question political scientists often ask themselves: what might have been if history had unfolded differently--if Stalin had been ousted as General Party Secretary, or if the United States had not dropped the bomb on Japan? Although scholars sometimes scoff at applying hypothetical reasoning to world politics, the contributors to this volume--including James Fearon, Richard Lebow, Margaret Levi, Bruce Russett, and Barry Weingast--find such counterfactual conjectures not only useful but necessary for drawing causal inferences from historical data. Given the importance of counterfactuals, it is perhaps surprising that we lack standards for evaluating them. To fill this gap, Tetlock and Belkin propose a set of criteria for distinguishing plausible from implausible counterfactual conjectures across a wide range of applications.
The contributors to this volume make use of these and other criteria to evaluate counterfactuals that emerge in diverse methodological contexts including comparative case studies, game theory, and statistical analysis. Taken together, these essays go a long way toward establishing a more nuanced and rigorous framework for assessing counterfactual arguments about world politics in particular and about the social sciences more broadly.