Sunday, November 2, 2014

[C154.Ebook] Free Ebook Superforecasting: The Art and Science of Prediction, by Philip E. Tetlock, Dan Gardner

Free Ebook Superforecasting: The Art and Science of Prediction, by Philip E. Tetlock, Dan Gardner

So, when you need the book quickly, there is no need to wait days for Superforecasting: The Art and Science of Prediction, by Philip E. Tetlock, Dan Gardner to arrive: you can download it straight to your device and read it wherever you happen to have time. That convenience is genuinely helpful if you want to make more of your reading time. Why not spend five minutes, and very little money, to get the book right here?


Superforecasting: The Art And Science Of Prediction, By Philip E. Tetlock, Dan Gardner. When, and why, do you like to read? Everyone has their own reasons for picking up a book like this one: mostly a need to gain knowledge from it, though some read simply for entertainment. Novels, storybooks, and other entertaining titles are popular now, but nonfiction and scientific books are often the better choice, particularly for students, teachers, doctors, businesspeople, and anyone else who is fond of reading.

Why should it be this book? A book is one of the easiest resources to search for: by browsing for an author or a genre, you can discover many titles offered for download. An inspiring book like Superforecasting: The Art And Science Of Prediction, By Philip E. Tetlock, Dan Gardner can give you what you need to meet a deadline. And why get it from this site? Ask yourself first: do you really have the time to go shopping and hunt for this book in a bookstore? Many people simply do not.

That is the problem this site solves. We offer referred books of all types and themes, from commonplace authors to well-known ones. Superforecasting: The Art And Science Of Prediction, By Philip E. Tetlock, Dan Gardner is the book you are hunting for; simply visit the link page on this site and proceed to the download. It will not take long to get the book; how long depends only on your internet connection. Just purchase and download the soft file.

It is that simple, isn't it? Why not try it? On this site you can also find other titles in the book collections that may help you find the best solution for your task. Reading the book as a soft file also makes the source easy to reach: you do not have to carry printed books everywhere you go, because with the gadget that is always with you, you can read anywhere and finish the book quickly.

Superforecasting: The Art and Science of Prediction, by Philip E. Tetlock, Dan Gardner

A New York Times Bestseller
An Economist Best Book of 2015

"The most important book on decision making since Daniel Kahneman's Thinking, Fast and Slow."
—Jason Zweig, The Wall Street Journal
 
Everyone would benefit from seeing further into the future, whether buying stocks, crafting policy, launching a new product, or simply planning the week’s meals. Unfortunately, people tend to be terrible forecasters. As Wharton professor Philip Tetlock showed in a landmark 2005 study, even experts’ predictions are only slightly better than chance. However, an important and underreported conclusion of that study was that some experts do have real foresight, and Tetlock has spent the past decade trying to figure out why. What makes some people so good? And can this talent be taught?
 
In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people—including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer—who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They’ve beaten other benchmarks, competitors, and prediction markets. They’ve even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters."
 
In this groundbreaking and accessible book, Tetlock and Gardner show us how we can learn from this elite group. Weaving together stories of forecasting successes (the raid on Osama bin Laden’s compound) and failures (the Bay of Pigs) and interviews with a range of high-level decision makers, from David Petraeus to Robert Rubin, they show that good forecasting doesn’t require powerful computers or arcane methods. It involves gathering evidence from a variety of sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course. Superforecasting offers the first demonstrably effective way to improve our ability to predict the future—whether in business, finance, politics, international affairs, or daily life—and is destined to become a modern classic.
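
That emphasis on "keeping score" has a precise form in Tetlock's tournaments: forecasts are expressed as probabilities and graded with the Brier score once outcomes are known. Here is a minimal Python sketch of the idea, using invented forecasts and the simple binary form of the score:

    def brier_score(forecasts, outcomes):
        """Mean squared error between forecast probabilities and what happened."""
        return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

    # Invented example: probabilities a forecaster assigned to five events,
    # and whether each event occurred (1) or not (0).
    forecasts = [0.9, 0.7, 0.2, 0.95, 0.5]
    outcomes = [1, 1, 0, 1, 0]
    print(brier_score(forecasts, outcomes))  # ~0.0785; constant 50/50 guessing scores 0.25

Lower is better, and because the score punishes confident misses quadratically, it rewards exactly the calibrated, probabilistic style the book describes.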


From the Hardcover edition.

  • Sales Rank: #9383 in Books
  • Published on: 2016-09-13
  • Released on: 2016-09-13
  • Original language: English
  • Dimensions: 8.00" h x 0.80" w x 5.10" l, 0.59 pounds
  • Binding: Paperback
  • 352 pages

Review
A New York Times Editors' Choice
A Washington Post Bestseller
A Hudson Booksellers Best Business Interest Book of 2015
Longlisted for the Financial Times and McKinsey Business Book of the Year Award
Winner of the Axiom Business Book Award in Business Theory (Gold Medal)

“A top choice [for best book of 2015] among the world’s biggest names in finance and economics... Eurasia Group founder Ian Bremmer, Deutsche Bank Chief U.S. Economist Joe LaVorgna, and Citigroup Vice Chairman Peter Orszag were among those giving it a thumbs-up.”
—Bloomberg Business

“The material in Superforecasting is new, and includes a compendium of best practices for prediction… The accuracy that ordinary people regularly attained through their meticulous application did amaze me… [It offers] us all an opportunity to understand and react more intelligently to the confusing world around us.”
—New York Times Book Review

"Tetlock's thesis is that politics and human affairs are not inscrutable mysteries. Instead, they are a bit like weather forecasting, where short-term predictions are possible and reasonably accurate... The techniques and habits of mind set out in this book are a gift to anyone who has to think about what the future might bring. In other words, to everyone."
—The Economist

"Tetlock’s work is fascinating and important, and he and Gardner have written it up here with verve."
—The Financial Times

“Superforecasting is the most important scientific study I’ve ever read on prediction.”
—Cass R. Sunstein, The Bloomberg View

"Just as modern medicine began when a farsighted few began to collect data and keep track of outcomes, to trust objective 'scoring' over their own intuitions, it's time now for similar demands to be made of the experts who lead public opinion. It's time for evidence-based forecasting."
—The Washington Post

"Superforecasting, by Philip Tetlock and Dan Gardner, is one of the most interesting business and finance books published in 2015.”
—John Kay, The Financial Times

"One of Tetlock's key points is that these aren't innate skills: they can be both taught and learned... Tetlock's 'Ten Commandments For Aspiring Superforecasters' should probably have a place of honor in most business meeting rooms."
—Forbes

"The key to becoming a better forecaster, if not a super one, according to Tetlock is the same as any other endeavor: practice, practice, practice."
—The Street

"In this captivating book, Tetlock argues that success is all about the approach: foresight is not a gift but rather a product of a particular way of thinking... In each chapter, the author augments his research with compelling interviews, anecdotes, and historical context, using accessible real-world examples to frame what could otherwise be dense subject matter. His writing is so engaging and his argument so tantalizing, readers will quickly be drawn into the challenge - in the appendix, the author provides a concise training manual to do just that. A must-read field guide for the intellectually curious."
—Kirkus Reviews, starred

"Tetlock and Gardner believe anyone can improve their forecasting ability by learning from the way they work. If that's true, people in business and finance who make an effort to do so have a lot to gain — and those who don't, much to lose."
—The Financial Post

"Superforecasting is a very good book. In fact it is essential reading — which I have never said in any of my previous MT reviews... It should be on every manager's and investor's reading list around the topics du jour of decision-making, prediction and behavioural economics."
—Management Today

"I've been hard on social science, even suggesting that 'social science' is an oxymoron. I noted, however, that social science has enormous potential, especially when it combines 'rigorous empiricism with a resistance to absolute answers.' The work of Philip Tetlock possesses these qualities."
—Scientific American

"One of the best books I've read this year... Superforecasting is a must read book."
—Seeking Alpha

"Keen to show that not all forecasting is a flop, Tetlock has conducted a new experiment that shows how you can make good forecasts, ones that routinely improve on predictions made by even the most well-informed expert. The book is full of excellent advice — it is the best thing I have read on predictions, which is a subject I am keen on... Gardner has turned the research into readable examples and a flowing text, without losing rigour... This book shows that you can be better at forecasting."
—The Times of London

"We now expect every medicine to be tested before it is used. We ought to expect that everybody who aspires to high office is trained to understand why they are so likely to make mistakes forecasting complex events... Politics is harder than physics but Tetlock has shown that it doesn't have to be like astrology."
—The Spectator

“Philip Tetlock is the world expert on a vital subject. Superforecasting is the wonderful story of how he and his research team got ordinary people to beat experts in a very serious game. It is also a manual for thinking clearly in an uncertain world. Read it.” 
—Daniel Kahneman, winner of the Nobel Prize and author of Thinking, Fast and Slow

“Superforecasting is a rare book that will make you smarter and wiser. One of the giants of behavioral science reveals how to improve at predicting the future.”
—Adam Grant, New York Times bestselling author of Originals 
 
“The best way to know if an idea is right is to see if it predicts the future. But which ideas, which methods, which people have a track record of non-obvious predictions vindicated by the course of events? The answers will surprise you, and they have radical implications for politics, policy, journalism, education, and even epistemology—how we can best gain knowledge about the world. The casual style of Superforecasting belies the profundity of its message.”
—Steven Pinker, Johnstone Professor of Psychology, Harvard University, and author of The Better Angels of Our Nature

“Philip Tetlock’s Superforecasting is a common-sense guide to thinking about decision-making and the future by a man who knows this terrain like no one else.”
—Ian Bremmer, Bloomberg Business’ Best Books of 2015

“In this accessible and lively book, Tetlock and Gardner recognize the centrality of probabilistic thinking to sound forecasting. Whether you are a policymaker or anyone else who wants to approach decisions with great rigor, Superforecasting will serve as a highly useful guide.”
—Robert E. Rubin, Former U.S. Treasury Secretary
 
“How well can we predict the future, really? There is no better way to answer that question than to read this book. You will come away disillusioned about the ability of experts, but also enlightened about how the best forecasters do it—and maybe even hopeful about your own prospects.”
—Tyler Cowen, Director of the George Mason University Mercatus Center and author of Average Is Over
 
“For thousands of years, people have listened to those who foretold the future with confidence and little accountability. In this book, Tetlock and Gardner free us from our foolishness. Full of great stories and simple statistics, Superforecasting gives us a new way of thinking about the complexity of the world, the limitations of our minds, and why some people can consistently outpredict a dart-throwing chimp. Tetlock’s research has the potential to revolutionize foreign policy, economic policy, and your own day-to-day decisions.”
—Jonathan Haidt, New York University Stern School of Business, and author of The Righteous Mind
 
“[Superforecasting] shows that you can get information from a lot of different sources. Knowledge is all around us and it doesn’t have to come from the experts.”
—Joe LaVorgna, Bloomberg Business’ Best Books of 2015
 
“Good judgment and good forecasting are rare, but they turn out to be made of teachable skills. By forcing forecasters to compete, Tetlock discovered what the skills are and how they work, and this book teaches the ability to any interested reader.”
—Stewart Brand, President, The Long Now Foundation
 
“Philip Tetlock is renowned for demonstrating that most experts are no better than ‘dart-throwing monkeys’ at predicting elections, wars, economic collapses and other events. In his brilliant new book, Tetlock offers a much more hopeful message, based once again on his own ground-breaking research. He shows that certain people can forecast events with accuracy much better than chance—and so, perhaps, can the rest of us, if we emulate the critical thinking of these ‘superforecasters.’ The self-empowerment genre doesn’t get any smarter and more sophisticated than this.”
—John Horgan, Director, Center for Science Writings, Stevens Institute of Technology
 
“Superforecasting is the rare book that is both scholarly and engaging. The lessons are scientific, compelling, and enormously practical. Anyone who is in the forecasting business—and that’s all of us—should drop what they are doing and read it.”
—Michael J. Mauboussin, Head of Global Financial Strategies, Credit Suisse

“[Superforecasting] highlights the techniques and attributes of superforecasters—that is, those whose predictions have been demonstrated to be remarkably accurate—in a manner that’s both rigorous and readable. The lessons are directly relevant to business, finance, government, and politics.”
—Peter Orszag, Bloomberg Business’ Best Books of 2015
 
“There isn’t a social scientist in the world I admire more than Phil Tetlock.”
—Tim Harford, author of The Undercover Economist
 
“From the Oracle of Delphi to medieval astrologers to modern overconfident experts, forecasters have been either deluded or fraudulent. For the first time, Superforecasting reveals the secret of making honest, reliable, effective, useful judgments about the future.”
—Aaron Brown, Chief Risk Officer of AQR Capital Management and author of The Poker Face of Wall Street
 
“Socrates had the insight in ‘know thyself,’ Kahneman delivered the science in Thinking, Fast and Slow, and now Tetlock has something we can all apply in Superforecasting.”
—Juan Luis Perez, Global Head of UBS Group Research


From the Hardcover edition.

About the Author
Philip E. Tetlock is the Annenberg University Professor at the University of Pennsylvania and holds appointments in the psychology and political science departments and the Wharton School of Business. He and his wife, Barbara Mellers, are the co-leaders of the Good Judgment Project, a multi-year forecasting study. He is also the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics.
 
Dan Gardner is a journalist and the author of Risk and Future Babble: Why Pundits are Hedgehogs and Foxes Know Best.

Excerpt. © Reprinted by permission. All rights reserved.
1

An Optimistic Skeptic

We are all forecasters. When we think about changing jobs, getting married, buying a home, making an investment, launching a product, or retiring, we decide based on how we expect the future will unfold. These expectations are forecasts. Often we do our own forecasting. But when big events happen--markets crash, wars loom, leaders tremble--we turn to the experts, those in the know. We look to people like Tom Friedman.

If you are a White House staffer, you might find him in the Oval Office with the president of the United States, talking about the Middle East. If you are a Fortune 500 CEO, you might spot him in Davos, chatting in the lounge with hedge fund billionaires and Saudi princes. And if you don’t frequent the White House or swanky Swiss hotels, you can read his New York Times columns and bestselling books that tell you what’s happening now, why, and what will come next.1 Millions do.

Like Tom Friedman, Bill Flack forecasts global events. But there is a lot less demand for his insights.

For years, Bill worked for the US Department of Agriculture in Arizona--“part pick-and-shovel work, part spreadsheet”--but now he lives in Kearney, Nebraska. Bill is a native Cornhusker. He grew up in Madison, Nebraska, a farm town where his parents owned and published the Madison Star-Mail, a newspaper with lots of stories about local sports and county fairs. He was a good student in high school and he went on to get a bachelor of science degree from the University of Nebraska. From there, he went to the University of Arizona. He was aiming for a PhD in math, but he realized it was beyond his abilities--“I had my nose rubbed in my limitations” is how he puts it--and he dropped out. It wasn’t wasted time, however. Classes in ornithology made Bill an avid bird-watcher, and because Arizona is a great place to see birds, he did fieldwork part-time for scientists, then got a job with the Department of Agriculture and stayed for a while.

Bill is fifty-five and retired, although he says if someone offered him a job he would consider it. So he has free time. And he spends some of it forecasting.

Bill has answered roughly three hundred questions like “Will Russia officially annex additional Ukrainian territory in the next three months?” and “In the next year, will any country withdraw from the eurozone?” They are questions that matter. And they’re difficult. Corporations, banks, embassies, and intelligence agencies struggle to answer such questions all the time. “Will North Korea detonate a nuclear device before the end of this year?” “How many additional countries will report cases of the Ebola virus in the next eight months?” “Will India or Brazil become a permanent member of the UN Security Council in the next two years?” Some of the questions are downright obscure, at least for most of us. “Will NATO invite new countries to join the Membership Action Plan (MAP) in the next nine months?” “Will the Kurdistan Regional Government hold a referendum on national independence this year?” “If a non-Chinese telecommunications firm wins a contract to provide Internet services in the Shanghai Free Trade Zone in the next two years, will Chinese citizens have access to Facebook and/or Twitter?” When Bill first sees one of these questions, he may have no clue how to answer it. “What on earth is the Shanghai Free Trade Zone?” he may think. But he does his homework. He gathers facts, balances clashing arguments, and settles on an answer.

No one bases decisions on Bill Flack’s forecasts, or asks Bill to share his thoughts on CNN. He has never been invited to Davos to sit on a panel with Tom Friedman. And that’s unfortunate. Because Bill Flack is a remarkable forecaster. We know that because each one of Bill’s predictions has been dated, recorded, and assessed for accuracy by independent scientific observers. His track record is excellent.

Bill is not alone. There are thousands of others answering the same questions. All are volunteers. Most aren’t as good as Bill, but about 2% are. They include engineers and lawyers, artists and scientists, Wall Streeters and Main Streeters, professors and students. We will meet many of them, including a mathematician, a filmmaker, and some retirees eager to share their underused talents. I call them superforecasters because that is what they are. Reliable evidence proves it. Explaining why they’re so good, and how others can learn to do what they do, is my goal in this book.

How our low-profile superforecasters compare with cerebral celebrities like Tom Friedman is an intriguing question, but it can’t be answered because the accuracy of Friedman’s forecasting has never been rigorously tested. Of course Friedman’s fans and critics have opinions one way or the other--“he nailed the Arab Spring” or “he screwed up on the 2003 invasion of Iraq” or “he was prescient on NATO expansion.” But there are no hard facts about Tom Friedman’s track record, just endless opinions--and opinions on opinions.2 And that is business as usual. Every day, the news media deliver forecasts without reporting, or even asking, how good the forecasters who made the forecasts really are. Every day, corporations and governments pay for forecasts that may be prescient or worthless or something in between. And every day, all of us--leaders of nations, corporate executives, investors, and voters--make critical decisions on the basis of forecasts whose quality is unknown. Baseball managers wouldn’t dream of getting out the checkbook to hire a player without consulting performance statistics. Even fans expect to see player stats on scoreboards and TV screens. And yet when it comes to the forecasters who help us make decisions that matter far more than any baseball game, we’re content to be ignorant.3

In that light, relying on Bill Flack’s forecasts looks quite reasonable. Indeed, relying on the forecasts of many readers of this book may prove quite reasonable, for it turns out that forecasting is not a “you have it or you don’t” talent. It is a skill that can be cultivated. This book will show you how.



The One About the Chimp

I want to spoil the joke, so I’ll give away the punch line: the average expert was roughly as accurate as a dart-throwing chimpanzee.

You’ve probably heard that one before. It’s famous--in some circles, infamous. It has popped up in the New York Times, the Wall Street Journal, the Financial Times, the Economist, and other outlets around the world. It goes like this: A researcher gathered a big group of experts--academics, pundits, and the like--to make thousands of predictions about the economy, stocks, elections, wars, and other issues of the day. Time passed, and when the researcher checked the accuracy of the predictions, he found that the average expert did about as well as random guessing. Except that’s not the punch line because “random guessing” isn’t funny. The punch line is about a dart-throwing chimpanzee. Because chimpanzees are funny.

I am that researcher and for a while I didn’t mind the joke. My study was the most comprehensive assessment of expert judgment in the scientific literature. It was a long slog that took about twenty years, from 1984 to 2004, and the results were far richer and more constructive than the punch line suggested. But I didn’t mind the joke because it raised awareness of my research (and, yes, scientists savor their fifteen minutes of fame too). And I myself had used the old “dart-throwing chimp” metaphor, so I couldn’t complain too loudly.

I also didn’t mind because the joke makes a valid point. Open any newspaper, watch any TV news show, and you find experts who forecast what’s coming. Some are cautious. More are bold and confident. A handful claim to be Olympian visionaries able to see decades into the future. With few exceptions, they are not in front of the cameras because they possess any proven skill at forecasting. Accuracy is seldom even mentioned. Old forecasts are like old news--soon forgotten--and pundits are almost never asked to reconcile what they said with what actually happened. The one undeniable talent that talking heads have is their skill at telling a compelling story with conviction, and that is enough. Many have become wealthy peddling forecasting of untested value to corporate executives, government officials, and ordinary people who would never think of swallowing medicine of unknown efficacy and safety but who routinely pay for forecasts that are as dubious as elixirs sold from the back of a wagon. These people--and their customers--deserve a nudge in the ribs. I was happy to see my research used to give it to them.

But I realized that as word of my work spread, its apparent meaning was mutating. What my research had shown was that the average expert had done little better than guessing on many of the political and economic questions I had posed. “Many” does not equal all. It was easiest to beat chance on the shortest-range questions that only required looking one year out, and accuracy fell off the further out experts tried to forecast--approaching the dart-throwing-chimpanzee level three to five years out. That was an important finding. It tells us something about the limits of expertise in a complex world--and the limits on what it might be possible for even superforecasters to achieve. But as in the children’s game of “telephone,” in which a phrase is whispered to one child who passes it on to another, and so on, and everyone is shocked at the end to discover how much it has changed, the actual message was garbled in the constant retelling and the subtleties were lost entirely. The message became “all expert forecasts are useless,” which is nonsense. Some variations were even cruder--like “experts know no more than chimpanzees.” My research had become a backstop reference for nihilists who see the future as inherently unpredictable and know-nothing populists who insist on preceding “expert” with “so-called.”

So I tired of the joke. My research did not support these more extreme conclusions, nor did I feel any affinity for them. Today, that is all the more true.

There is plenty of room to stake out reasonable positions between the debunkers and the defenders of experts and their forecasts. On the one hand, the debunkers have a point. There are shady peddlers of questionable insights in the forecasting marketplace. There are also limits to foresight that may just not be surmountable. Our desire to reach into the future will always exceed our grasp. But debunkers go too far when they dismiss all forecasting as a fool’s errand. I believe it is possible to see into the future, at least in some situations and to some extent, and that any intelligent, open-minded, and hardworking person can cultivate the requisite skills.

Call me an “optimistic skeptic.”



The Skeptic

To understand the “skeptic” half of that label, consider a young Tunisian man pushing a wooden handcart loaded with fruits and vegetables down a dusty road to a market in the Tunisian town of Sidi Bouzid. When the man was three, his father died. He supports his family by borrowing money to fill his cart, hoping to earn enough selling the produce to pay off the debt and have a little left over. It’s the same grind every day. But this morning, the police approach the man and say they’re going to take his scales because he has violated some regulation. He knows it’s a lie. They’re shaking him down. But he has no money. A policewoman slaps him and insults his dead father. They take his scales and his cart. The man goes to a town office to complain. He is told the official is busy in a meeting. Humiliated, furious, powerless, the man leaves.



1. Why single out Tom Friedman when so many other celebrity pundits could have served the purpose? The choice was driven by a simple formula: (status of pundit) X (difficulty of pinning down his/her forecasts) X (relevance of pundit’s work to world politics). Highest score wins. Friedman has high status; his claims about possible futures are highly difficult to pin down--and his work is highly relevant to geopolitical forecasting. The choice of Friedman was in no way driven by an aversion to his editorial opinions. Indeed, I reveal in the last chapter a sneaky admiration for some aspects of his work. Exasperatingly evasive though Friedman can be as a forecaster, he proves to be a fabulous source of forecasting questions.

2. Again, this is not to imply that Friedman is unusual in this regard. Virtually every political pundit on the planet operates under the same tacit ground rules. They make countless claims about what lies ahead but couch their claims in such vague verbiage that it is impossible to test them. How should we interpret intriguing claims like “expansion of NATO could trigger a ferocious response from the Russian bear and may even lead to a new Cold War” or “the Arab Spring might signal that the days of unaccountable autocracy in the Arab world are numbered” or . . . ? The key terms in these semantic dances, may or could or might, are not accompanied by guidance on how to interpret them. Could could mean anything from a 0.0000001 chance of “a large asteroid striking our planet in the next one hundred years” to a 0.7 chance of “Hillary Clinton winning the presidency in 2016.” All this makes it impossible to track accuracy across time and questions. It also gives pundits endless flexibility to claim credit when something happens (I told you it could) and to dodge blame when it does not (I merely said it could happen). We shall encounter many examples of such linguistic mischief.

3. It is as though we have collectively concluded that sizing up the starting lineup for the Yankees deserves greater care than sizing up the risk of genocide in the South Sudan. Of course the analogy between baseball and politics is imperfect. Baseball is played over and over under standard conditions. Politics is a quirky game in which the rules are continually being contorted and contested. So scoring political forecasting is much harder than compiling baseball statistics. But “harder” doesn’t mean impossible. It turns out to be quite possible.

There is also another objection to the analogy. Pundits do more than forecasting. They put events in historical perspective, offer explanations, engage in policy advocacy, and pose provocative questions. All true, but pundits also make lots of implicit or explicit forecasts. For instance, the historical analogies pundits invoke contain implicit forecasts: the Munich appeasement analogy is trotted out to support the conditional forecast “if you appease country X, it will ramp up its demands”; and the World War I analogy is trotted out to support “if you use threats, you will escalate the conflict.” I submit that it is logically impossible to engage in policy advocacy (which pundits routinely do) without making assumptions about whether we would be better or worse off if we went down one or another policy path. Show me a pundit who does not make at least implicit forecasts and I will show you one who has faded into Zen-like irrelevance.

Most helpful customer reviews

120 of 132 people found the following review helpful.
More about superforecasters than about superforecasting
By Jackal
There are two kinds of pop-science books: the deep and thoughtful kind, based on years of research, and the quick and dirty kind, written by a ghostwriter. This book is of the latter kind. Tetlock wrote Expert Political Judgment: How Good Is It? How Can We Know? about a decade ago. That book was deep and thoughtful. I had expected his new book to be an update with ten more years of research and consulting. Sadly, I am greatly disappointed. The book could have been written entirely without additional research input. It starts with a couple of chapters on the history of the standard controlled experiment. There are about 50 pages of real content in the 330 pages of the book. A lot of content is lifted directly from the web (e.g., Fermi forecasting, Auftragstaktik), Malcolm Gladwell style: some insight and some misinterpretation.

The style is **extreme pop-science**. What do I mean by that? Far too many pages; plentiful descriptions of minute, irrelevant details about individuals (so-called human-interest points; I guess that is what they teach in creative writing); never a figure or a number (e.g., 67% is changed to "two thirds"); all difficult material removed or relegated to a footnote. And how come a book with two authors uses the pronoun "I" all the time?

The researcher has run a forecasting tournament for several years. He has loads of data, but he does not provide any analysis in the book. He refers to his research in footnotes, with no explanation or description at all. Instead we get statements like "80% of superforecasters are more intelligent than average." What is wrong with running a regression to find out which characteristics are important? Why spend five chapters going through the characteristics of superforecasters? In the end, apparently, two characteristics stand out: (1) continual updating of forecasts, and (2) being intelligent. That is revealed after around 200 pages of tedious writing. Wtf? I can reluctantly accept dumbing down the book, but it is inexcusable that the footnotes do not offer further help to the reader who wants more depth.

The author likes to give minute details about the superforecasters. Personally, I don't care that Brian likes Facebook updates about cats, that John is retired because he is sick and now likes to collect stuff, or that Steve is an old colleague of the author who likes opera. Who reads and enjoys this written muzak? It goes on chapter after chapter. We "meet" 15-20 superforecasters.

There is a lot about the superforecasters in the book, but the title of the book is "Superforecasting". This is a seriously misleading title. It makes you believe that you will learn tools to become a great forecaster. You get some mostly general points in an eight-page appendix. With the researcher's experience, I would have expected a lot of practical advice.

What is good about the book?
(1) The key message that experts are lousy forecasters and do not want accountability is very important, but that was already in the author's earlier book.
(2) Some useful anecdotes that you probably should pick up if you are teaching/presenting on the topic.
(3) Odd bits of information. I liked the discussion of how the German military used what we consider modern management already 100 years ago. As mentioned earlier, there are 50 pages of really good material in the book.

I bought the hard-cover edition. If you make notes with a normal pencil, be careful because it easily pierces the paper.

The book is worth two stars. If you are an educator and want a few anecdotes, read the book. Others should give it a pass. Instead, sign up for the author's forecasting tournament; you learn more by trial and error. I signed up two years ago and it has been a useful experience. You can also check the video features on edge.org. Then spend your time reading better books. A few rigorous pop-science alternatives:
* Another forecasting perspective is Steenbarger's Trading Psychology 2.0: From Best Practices to Best Processes (Wiley Trading). It is about trading in the market, but it covers many of the topics from a different perspective. Worth reading his earlier books too.
* And if you haven't read Thinking, Fast and Slow, that is a more important book (but also too fluffy for my liking).
* You should also read Taleb's The Black Swan: Second Edition: The Impact of the Highly Improbable: With a new section: "On Robustness and Fragility" (Incerto), but don't buy his fluffy version of the same topic, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets (Incerto).

102 of 110 people found the following review helpful.
This book has a 100% probability of making you think!
By Angie Boyter
Everyone wants to be able to predict the future, whether they are buying stocks, choosing a mate, or deciding how the next presidential election will go, but what, if anything, can we do to improve our ability to predict? Wharton School professor Philip Tetlock has been studying that question since the Reagan era and has observed forecasters from pundits and intelligence analysts to filmmakers and pipe fitters to try to learn why some people are better at making predictions than others. In this book, he describes his work and that of others and presents some techniques that may help all of us make better decisions.
As someone who enjoys reading about topics like decision-making, forecasting, and behavioral economics, I too often find myself reluctantly concluding, “That was well-presented, but there is nothing here I have not heard before.” For a reader new to the subject, it is good that Superforecasting delves into the ideas of people like psychologist Daniel Kahneman, whose description of the biases in judgment that impede our ability to make good decisions and forecasts earned him a Nobel Prize in Economics, and Tetlock appropriately covers topics like these.
I was pleased, though, that he also presented some interesting work I was not familiar with, such as the author's own Expert Political Judgment project to study whether some people really are better predictors than others and, if so, how they differ from the less successful experts, and the Good Judgment Project that was part of an effort to improve intelligence estimating techniques funded by IARPA (the intelligence community's equivalent of DARPA). I was also especially amused by a contest run in 1997 by the Financial Times at the suggestion of behavioral economist Richard Thaler. People were to guess a number between 0 and 100, and the winner would be the person whose guess came closest to TWO-THIRDS of the average guess of all contestants. If thinking about this contest begins to make your head spin, read this book. If it sounds pretty simple to you, then you should DEFINITELY read this book; the answer will surprise you!
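
To make the reviewer's puzzle concrete, here is a small Python sketch (an illustration, not something from the book or the review) of the iterated reasoning the contest invites: if everyone else guessed at random, the average would be about 50, so you should guess two-thirds of that; but if everyone reasons that far, you should take two-thirds again, and so on.

    # Illustrative only: each "level" of thinking assumes everyone else
    # reasons one level shallower, and multiplies the guess by two-thirds.
    guess = 50.0  # level 0: assume other entrants guess at random (average ~50)
    for level in range(1, 8):
        guess *= 2 / 3
        print(f"level {level}: guess {guess:.1f}")
    # Prints 33.3, 22.2, 14.8, ... shrinking toward the game-theory answer of 0.

Real crowds stop after only a few levels of reasoning, which is why the winning answer lands well above zero.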
The history of science was also interesting and often surprising, such as the idea of randomized controlled trials, which are taken for granted today, not being used until after World War II. The book introduces us to people like meteorologist Edward Lorenz, the author of the classic paper asking whether the flap of a butterfly’s wings in Brazil can set off a tornado in Texas, and physician Archie Cochrane, an early advocate for randomized trials and a scientific approach to medical decisions who nonetheless was driven by his human biases to make a decision about his own health that subjected him to a mutilating surgery and could have cost him his life.
After studying and identifying a group of superforecasters and their characteristics, Tetlock asked the natural question: Are superforecasters born, or can we all become superforecasters? As a good scientist, he concludes he cannot answer that question with certainty, but he does lay down some habits of mind that are very likely (Give me a probability here, Phil!) to improve anyone’s ability to make predictions and improve the resulting decisions.
If your aim is to improve your own ability to make predictions, Tetlock will both give you valuable advice and explain how following that rather simple-sounding advice may be harder than you think. I predict you’ll find the book both enjoyable and informative.

161 of 180 people found the following review helpful.
Valuable lessons for forecasting, but lacks a practical recipe: 3.5 stars
By Ash Jogalekar
In the 1990s Philip Tetlock gathered hundreds of experts and "ordinary" (albeit extremely well-read) people and asked them to predict questions of global significance: What will happen to the stock market in the next year? What will be the fate of Tunisia in two years? What impact will Middle Eastern politics have on oil prices in the next six months?

He continued the contest for several years and came up with a shocking answer: the ordinary people who read the daily news and thought about it with depth and nuance were at least as good as self-proclaimed and well-known experts from the financial sector, from government, and from intelligence agencies. These results of the so-called 'Good Judgment Project' were widely publicized by the media under the "there are no experts" drumroll, but as Tetlock and his co-author Gardner indicate in this book, what the media failed to report was the presence of a handful of people who were even better than the experts, albeit by modest amounts. Tetlock called these people 'superforecasters', and this is their story.

The crux of the book is to demonstrate the qualities that these superforecasters have and try to teach them to us. The narrative is packed with very interesting forecasting problems, like figuring out whether the man in a mysterious compound in Pakistan was Osama bin Laden or whether Yasser Arafat had been poisoned by Israel. In each case Tetlock takes us through the thought processes of his superforecasters, many of whom have held day jobs unrelated to forecasting, including plumbing, office work, and construction. In addition, since Tetlock is a well-known psychologist himself, he has access to leading business figures, academics, and intelligence analysts whom he can interview to probe their views.

Tetlock tries to distill the lessons that these superforecasters can teach us. Foremost among them are an almost obsessive proclivity toward probabilistic and at least semi-quantitative thinking and an almost automatic willingness to update their prior knowledge in the face of contrary opinions and new evidence. Open-mindedness, flexibility, and an ability to move quickly between different viewpoints are thus essential to good forecasting. Other lessons include striking a good balance between under- and overconfidence and between under- and overreacting to the evidence, breaking down problems into smaller problems (the so-called Fermi approach to problem solving), recognizing the limits of one's prediction domain, looking for clashing or contradictory causal factors, and dividing the evidence into more and less certain pieces. Finally, being part of a good team and learning from each other can often be a revelation.
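
The "willingness to update" described above is essentially Bayes' rule in action: treat your current forecast as a prior and revise it in proportion to how much more likely the new evidence is under one hypothesis than the other. The following Python sketch, with invented numbers rather than an example from the book, shows a single such update:

    def bayes_update(prior, p_evidence_if_yes, p_evidence_if_no):
        """Posterior probability of "yes" after one piece of evidence."""
        numer = prior * p_evidence_if_yes
        return numer / (numer + (1 - prior) * p_evidence_if_no)

    # Invented case: start at 60% that a question resolves "yes", then see a
    # report twice as likely in a "yes" world (0.8) as in a "no" world (0.4).
    belief = bayes_update(0.6, p_evidence_if_yes=0.8, p_evidence_if_no=0.4)
    print(round(belief, 2))  # 0.75: a measured shift, not a lurch to certainty

Repeated small updates of this kind are the "continual updating" habit the superforecasters are said to practice.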

Tetlock and Gardner's book thus gives us a good prescription for confident forecasting. What I found a bit disappointing was that it does not give us a recipe; hence the 3 stars (actually 3.5, had Amazon permitted fractional ratings). It points out the destination but not the path, and so even at the end I felt myself floundering a bit. To some extent this path is subjective, but in its absence at least some of the prescriptions (such as "break down a problem into parts" or "consider contradictory evidence") sound rather obvious. What Tetlock and Gardner could do in a forthcoming book, in my opinion, is teach us how to ingrain the valuable lessons they learned from superforecasters in our daily habits and thinking, perhaps with case studies. For instance, how do we start to think along the lines of superforecasters the moment we open our daily paper or flip on a news channel? How exactly do we reach a conclusion when presented with contradictory evidence? It's great to know all the qualities that forecasters could teach us, but preaching is not quite the same as practicing, so I think all of us would appreciate some help in that arena. I think there's a great self-help manual hidden in Tetlock and Gardner's book.


