Sunday, November 6, 2011

Deep Insight - Daniel Kahneman: Beware the ‘inside view’

How many times do you remember projects, ventures and adventures turning out faster or easier than expected? Better-than-expected returns, financial or otherwise, are not uncommon, but easier or faster? Next to never. Why?

Here is Daniel Kahneman's insight, in his own words (the whole post is reproduced below):

Why the inside view didn’t work (a book-writing project)

This embarrassing episode remains one of the most instructive experiences of my professional life. I had stumbled onto a distinction between two profoundly different approaches to forecasting, which Amos Tversky and I later labeled the inside view and the outside view.

The inside view is the one that all of us, including Seymour, spontaneously adopted to assess the future of our project. We focused on our specific circumstances and searched for evidence in our own experiences. We had a sketchy plan: we knew how many chapters we were going to write, and we had an idea of how long it had taken us to write the two that we had already done. The more cautious among us probably added a few months as a margin of error.


But extrapolating was a mistake. We were forecasting based on the information in front of us, but the chapters we wrote first were easier than others and our commitment to the project was probably at its peak. The main problem was that we failed to allow for what Donald Rumsfeld famously called “unknown unknowns.” At the time, there was no way for us to foresee the succession of events that would cause the project to drag on for so long: divorces, illnesses, crises of coordination with bureaucracies. These unanticipated events not only slow the writing process, but produce long periods during which little or no progress is made at all. Of course, the same must have been true for the other teams that Seymour knew about. Like us, the members of those teams did not know the odds they were facing. There are many ways for any plan to fail, and although most of them are too improbable to be anticipated, the likelihood that something will go wrong in a big project is high.

-------------------------------


We can generalize this outcome as "life happens" and "entropy interferes". The hyper-optimistic mind of the entrepreneur is naturally at odds with these expectations. After all, "entrepreneur", from the French, is the person with the initiative to start a venture, to set something in motion. Optimism is the foundation -- "I can fill a need that no one else does or can" -- else why would one bother?
That very state of mind creates a mental filter that concedes that "life may happen", but not necessarily in the most undesirable way. It ignores what entropy teaches: statistically, there are far more ways to move away from a desired (ordered) condition than toward it. "Sh*t happens" is therefore, legitimately, the more commonly reported experience.
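That combinatorics can be made concrete with a little arithmetic. Here is a minimal sketch, with purely hypothetical numbers, of why a project exposed to many individually improbable disruptions still faces a high chance that at least one of them hits:

```python
# A minimal sketch (hypothetical numbers, not from the post): if a project
# is exposed to many independent, individually unlikely disruptions, the
# chance that at least one of them occurs is surprisingly high.

def p_at_least_one_disruption(n_risks: int, p_each: float) -> float:
    """Probability that at least one of n independent risks materializes."""
    return 1.0 - (1.0 - p_each) ** n_risks

if __name__ == "__main__":
    for n in (5, 10, 20, 40):
        print(f"{n:2d} risks at 5% each -> "
              f"P(something goes wrong) = {p_at_least_one_disruption(n, 0.05):.0%}")
    # Prints roughly 23%, 40%, 64%, 87%.
```

The point is not the specific percentages but the shape of the curve: stacking even small, independent risks quickly makes "something will go wrong" the expected outcome.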

A prescription for entrepreneurs

Believe in yourself, but ask others with relevant experience for their "outside view" -- those who succeeded and those you may think failed. The ones who failed in similar circumstances can tell you much about entropy and how life happens. Warnings of potholes in the road do not make the road impassable, nor do they recommend quitting; they are an opportunity to avoid breaking an axle.
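One way to make that outside view concrete is the baseline-prediction idea Kahneman describes in the excerpt below: anchor on what actually happened to comparable ventures, then adjust only modestly for what you believe is special about yours. A minimal sketch (the reference-class numbers and the adjustment weight are illustrative assumptions, not anything from the post):

```python
# A minimal sketch of reference-class anchoring: start from what happened
# to comparable teams, then shift only modestly toward your own estimate.
from statistics import median

def outside_view_forecast(reference_durations_years, inside_estimate_years,
                          weight_on_inside=0.25):
    """Anchor on the reference-class median, then nudge toward the
    inside-view estimate. The 0.25 weight is an arbitrary illustration."""
    anchor = median(reference_durations_years)
    return anchor + weight_on_inside * (inside_estimate_years - anchor)

if __name__ == "__main__":
    # Hypothetical reference class, loosely echoing Seymour's recollection
    # of seven-to-ten-year completions.
    comparable_teams = [7, 7.5, 8, 9, 10]
    our_plan = 2.0  # the team's own inside-view estimate
    print(f"Baseline (reference class): {median(comparable_teams):.1f} years")
    print(f"Adjusted forecast: {outside_view_forecast(comparable_teams, our_plan):.1f} years")
```

The design point is the direction of reasoning: the reference class sets the anchor, and the inside view is only allowed to nudge it, not replace it.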

Mentors, perhaps even more than investors, can be your true angels. They may know shortcuts in the trail, dangerous potholes, market insights, contacts -- any of which can apply even in the new-economy business you are creating. Most importantly, they can be an antidote to the entrepreneur's natural over-optimism simply by asking, at the right times, "Really? How do you know that?" They can always be dismissed, but then: beware the inside view.


Marco Messina

----------------------------
The original post follows

Daniel Kahneman: Beware the ‘inside view’

In an excerpt from his new book, Thinking, Fast and Slow, the Nobel laureate recalls how an inwardly focused forecasting approach once led him astray, and why an external perspective can help executives do better.





In the 1970s, I convinced some officials in the Israeli Ministry of Education of the need for a curriculum to teach judgment and decision making in high schools. The team that I assembled to design the curriculum and write a textbook for it included several experienced teachers, some of my psychology students, and Seymour Fox, then dean of the Hebrew University’s School of Education and an expert in curriculum development.
After meeting every Friday afternoon for about a year, we had constructed a detailed outline of the syllabus, written a couple of chapters, and run a few sample lessons. We all felt we had made good progress. Then, as we were discussing procedures for estimating uncertain quantities, an exercise occurred to me. I asked everyone to write down their estimate of how long it would take us to submit a finished draft of the textbook to the Ministry of Education. I was following a procedure that we already planned to incorporate into our curriculum: the proper way to elicit information from a group is not by starting with a public discussion, but by confidentially collecting each person’s judgment. I collected the estimates and jotted the results on the blackboard. They were narrowly centered around two years: the low end was one and a half, the high end two and a half years.
A shocking disconnect
He fell silent. When he finally spoke, it seemed to me that he was blushing, embarrassed by his own answer: “You know, I never realized this before, but in fact not all the teams at a stage comparable to ours ever did complete their task. A substantial fraction of the teams ended up failing to finish the job.”
This was worrisome; we had never considered the possibility that we might fail. My anxiety rising, I asked how large he estimated that fraction was. “About 40 percent,” he said. By now, a pall of gloom was falling over the room.
“Those who finished, how long did it take them?”
“I cannot think of any group that finished in less than seven years,” Seymour said, “nor any that took more than ten.”
I grasped at a straw: “When you compare our skills and resources to those of the other groups, how good are we? How would you rank us in comparison with these teams?”
Seymour did not hesitate long this time.
“We’re below average,” he said, “but not by much.”
This came as a complete surprise to all of us—including Seymour, whose prior estimate had been well within the optimistic consensus of the group. Until I prompted him, there was no connection in his mind between his knowledge of the history of other teams and his forecast of our future.
We should have quit that day. None of us was willing to invest six more years of work in a project with a 40 percent chance of failure. Yet although we must have sensed that persevering was not reasonable, the warning did not provide an immediately compelling reason to quit. After a few minutes of desultory debate, we gathered ourselves and carried on as if nothing had happened. Facing a choice, we gave up rationality rather than the enterprise.
The book was completed eight years later. By that time, I was no longer living in Israel and had long since ceased to be part of the team, which finished the task after many unpredictable vicissitudes. The initial enthusiasm for the idea in the Ministry of Education had waned, and the textbook was never used.
Why the inside view didn’t work
This embarrassing episode remains one of the most instructive experiences of my professional life. I had stumbled onto a distinction between two profoundly different approaches to forecasting, which Amos Tversky and I later labeled the inside view and the outside view.
The inside view is the one that all of us, including Seymour, spontaneously adopted to assess the future of our project. We focused on our specific circumstances and searched for evidence in our own experiences. We had a sketchy plan: we knew how many chapters we were going to write, and we had an idea of how long it had taken us to write the two that we had already done. The more cautious among us probably added a few months as a margin of error.
But extrapolating was a mistake. We were forecasting based on the information in front of us, but the chapters we wrote first were easier than others and our commitment to the project was probably at its peak. The main problem was that we failed to allow for what Donald Rumsfeld famously called “unknown unknowns.” At the time, there was no way for us to foresee the succession of events that would cause the project to drag on for so long: divorces, illnesses, crises of coordination with bureaucracies. These unanticipated events not only slow the writing process, but produce long periods during which little or no progress is made at all. Of course, the same must have been true for the other teams that Seymour knew about. Like us, the members of those teams did not know the odds they were facing. There are many ways for any plan to fail, and although most of them are too improbable to be anticipated, the likelihood that something will go wrong in a big project is high.
How an outside view can help
The second question I asked Seymour directed his attention away from us and toward a class of similar cases. Seymour estimated the base rate of success in that reference class: 40 percent failure and seven to ten years for completion. His informal survey was surely not up to scientific standards of evidence, but it provided a reasonable basis for a baseline prediction: the prediction you make about a case if you know nothing except the category to which it belongs. This should be the anchor for further adjustments. If you are asked to guess the height of a woman and all you know is that she lives in New York City, for example, your baseline prediction is your best guess of the average height of women in the city. If you are now given case-specific information—that the woman’s son is the starting center of his high school basketball team—you will adjust your estimate. Seymour’s comparison of our team to others suggested that the forecast of our outcome was slightly worse than the baseline prediction, which was already grim.
The spectacular accuracy of the outside-view forecast in our specific case was surely a fluke and should not count as evidence for the validity of the outside view. However, the argument for the outside view should be made on general grounds: if the reference class is properly chosen, the outside view will give an indication of where the ballpark is. It may suggest, as it did in our case, that the inside-view forecasts are not even close.

About the Author
Daniel Kahneman is professor emeritus of psychology and public affairs at Princeton University’s Woodrow Wilson School of Public and International Affairs. He was awarded the 2002 Nobel Prize in Economic Sciences for his seminal work in prospect theory, which challenges the rational model of judgment and decision making. This article is an edited excerpt from his new book, Thinking, Fast and Slow, published by Farrar, Straus and Giroux (US), Doubleday (Canada), and Allen Lane (UK). Copyright © 2011 by Daniel Kahneman. All rights reserved.
