Risk Savvy

How to Make Good Decisions
Gerd Gigerenzer | Viking Press © 2014 | 322 pages
Book: getab.li/21214

 

 

Rating: 9 (Applicability 9, Innovation 8, Style 9)


Take-Aways

• A lack of “risk literacy” impedes your ability to assess hazards and make decisions.
• Most people – even professionals in medicine and finance – misunderstand probabilities.
• The “illusion of certainty” is a common risk-analysis error. Nothing is certain.
• Illusory certainty appears as the “zero-risk illusion” or the “calculable-risk illusion.”
• Rules of thumb, or heuristics, can lead to better decisions than complex analysis.
• The “recognition heuristic” springs from familiarity with choices: If one choice is familiar, unfamiliar alternatives seem inferior.
• In situations of unknown risks, “intuition is indispensable.”
• The “law of succession” calculates an increasing probability of something recurring.
• A productive “error culture” minimizes people’s fear of mistakes and unlocks the creativity that making a mistake can inspire.
• Many people fear unlikely hazards but behave dangerously in everyday life.




Relevance

What You Will Learn
In this summary, you will learn: 1) Why people misjudge risk and fail to quantify uncertainty; 2) Why heuristics, or rules of thumb, can lead to better decisions than complicated calculations; and 3) Which rules you can use to improve your risk assessment and decision making.

Review
Gerd Gigerenzer, director of the Max Planck Institute for Human Development, offers a brightly written guide to better decision making. He reports a widespread lack of “risk literacy,” and says that confusion over probabilities is pervasive among average people as well as among professionals in many fields, including medicine and investment management. He recommends using heuristics, or rules of thumb, to derive the best solutions for problems with unknown risks, and he cautions that complicated solutions seldom fix complex problems. Gigerenzer also discusses why you can grant more credence than you might think to both trust and intuition. getAbstract recommends this intelligent overview to everyone who wants to make better decisions and cope with risk more skillfully – and especially to investors, students, entrepreneurs and cancer patients, who may take a keen interest in what Gigerenzer says about probabilities.


Summary


“Whether the context is a weather forecast, a medical decision or a large-scale disaster, being risk savvy requires a basic knowledge of our intuitive psychology as well as an understanding of statistical information.”

“Positive error cultures...make errors transparent, encourage good errors, and learn from bad errors to create a safer environment.”

Uncertainty and Heuristics
Probabilities and risk assessment confuse many decision makers. Consider weather reports. If the chance of rain tomorrow is 30%, does that mean it will rain during 30% of the day (7.2 hours), in 30% of the area the forecast covers, or on 30% of the days to which such forecasts apply? The intended meaning is the last one, yet people have misunderstood such forecasts since probability-of-rain forecasts were introduced in 1965. Such confusion over probabilities extends to many realms of risk, including medicine and finance.

People who crave certainty tend to ignore or dismiss uncertainty. But certainty is an illusion; no one has perfect knowledge of the world. As people seek certainty, they popularize dubious belief systems, including divination and astrological prediction. The Internet and other modern technologies may support new belief systems that increase “apparent certainty.” For instance, modern testing for genetic information and for disease markers such as HIV inspires “blind faith in tests.” Embracing new technology “is one thing. Believing that it delivers certainty is another.” Insurers, investment managers and other “manufacturers of certainty” market the myth that the future is predictable, but it is not.

Rules of thumb, or heuristics, can help you make good decisions in situations that defy statistical analysis. People use rules of thumb both consciously and unconsciously. Used unconsciously, rules of thumb produce judgments based on intuition, itself a form of “unconscious intelligence.” Conscious calculation may suffice in situations with known risks, but in situations with unknown risks, “intuition is indispensable.”

The “Gaze Heuristic”
Consider how US Airways pilot Chesley Sullenberger saved the lives of everyone aboard a jet that lost engine power after colliding with a flock of geese shortly after takeoff from New York City’s LaGuardia Airport. The captain first tried to turn back to the airport.


“You don’t need a background in finance to understand the difference between known risks and unknown risks.”

“In an uncertain world, simple rules of thumb can lead to better decisions than fancy calculations.”

“Like every strategy, simple or complex, trust is neither good nor bad. It all depends on the environment.”

“For the mature adult, a high need for certainty can be a dangerous thing. It prevents us from learning to face...uncertainty.”

Then he fixed his gaze on the airport control tower and watched its position in the cockpit windshield gradually rise. Seeing the tower rise like that – an application of a pilot’s rule of thumb called the gaze heuristic – confirmed that the jet could not make it back to LaGuardia. He knew then that an emergency landing on the Hudson River was the better alternative. The gaze heuristic exemplifies the effectiveness of simple solutions to problems made more complex by uncertainty.

The Illusion of Absent or Calculable Risk
Illusory certainty takes two forms. The first is the “zero-risk illusion,” in which someone presumes certainty based on a perceived absence of risk. The second is the “calculable-risk illusion,” in which someone mistakes a situation of unknown risk for one of apparent certainty presenting only known risks. A turkey, for example, may fall for the calculable-risk illusion. The turkey may initially fear the farmer but then come to expect food and water whenever the farmer appears. Unaware of its pending slaughter, the turkey concludes that it knows all the risks in its world and that the farmer poses few. This is also an application of the “law of succession” that mathematician Pierre-Simon Laplace developed, a formula for the rising probability that something will happen again if it has happened before: After n occurrences, the probability of one more is (n + 1)/(n + 2), so each uneventful feeding makes the turkey more confident – right up to the day of slaughter.

A firm with a constructive “error culture” minimizes people’s fear of making mistakes. Openness to trial and error is a US cultural asset, creating a setting where trying and failing are not necessarily shameful. Errors spur inspiration, and mistakes can lead to serendipity – unintended discoveries made while seeking something else. You “learn by failing, or you fail to learn.”

Sometimes a lack of information can make decision making easier. A survey asked Germans which US city has more people, Detroit or Milwaukee. The Germans were more accurate than Americans in picking Detroit (the correct answer), largely because most of them had never heard of Milwaukee. They applied the “recognition heuristic”: If one choice is familiar, people regard a competing unfamiliar choice as inferior. Another heuristic is “social imitation,” which comes into play when people learn to fear what their group fears without experiencing the hazard themselves, reinforced by “biological preparedness” – an innate caution toward perceived hazards such as snakes and spiders.

Financial Uncertainty
Technological advances have transformed investing, but they have not made financial forecasts flawless. Investing in an uncertain world is always risky; the question for investors is whether illusory certainty dissolves their doubts as surely as the turkey illusion does. Harry Markowitz earned a Nobel Prize in economics for the mean-variance portfolio, which maximizes expected gain for a given level of risk. Yet to allocate the funds in his own portfolio, Markowitz used a far simpler rule of thumb, “1/N”: Put an equal amount into each of N asset classes. Studies show that 1/N portfolios perform better than mean-variance portfolios and 12 other complicated allocation models.
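To make the 1/N rule concrete, here is a minimal sketch in Python; the asset classes and returns are invented for illustration and are not from the book.

```python
# Minimal sketch of the 1/N rule: split capital equally across N asset classes.
# The asset names and returns below are illustrative, not from the book.

def one_over_n_weights(assets):
    """Assign each asset class the same weight, 1/N."""
    n = len(assets)
    return {asset: 1.0 / n for asset in assets}

def portfolio_return(weights, returns):
    """Weighted sum of the individual asset returns."""
    return sum(weights[a] * returns[a] for a in weights)

assets = ["stocks", "bonds", "real_estate", "cash"]
weights = one_over_n_weights(assets)          # {'stocks': 0.25, 'bonds': 0.25, ...}

# Hypothetical one-year returns for each class.
returns = {"stocks": 0.08, "bonds": 0.03, "real_estate": 0.05, "cash": 0.01}
print(portfolio_return(weights, returns))     # 0.0425 -> 4.25% on the equal-weight mix
```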
The “Monty Hall Problem”
The Monty Hall problem illustrates a common misunderstanding of probabilities. Monty Hall hosted the television game show Let’s Make a Deal. Behind one of three closed doors was a valuable prize; behind the other two were worthless items.


“Some psychiatrists argue that doctors, just like ordinary folks, fall prey to the same persistent cognitive illusions.”

“The more patients become aware of doctors’ innumeracy and defensive decision making, the more trust will be eroded.”

“Talking about survival can be useful for surgery and other medical treatments...but in the context of screening it is always a misleading message.”

If a contestant chose a wrong door, Monty Hall often would open the other wrong door, revealing the second worthless item, and then give the contestant a chance to switch doors. What is the probability that the prize sits behind the contestant’s original door after Hall opens one of the others? Under the standard textbook assumptions, the originally chosen door still has only a one-in-three chance, and the remaining closed door has a two-in-three chance, so switching is the better move; the common answer of 50%, a one-in-two chance, is wrong. And because the real Monty Hall only sometimes opened another door, the actual show presented uncertainty rather than a neatly calculable risk.

“Satisficing”
Choosing someone to marry can become an overly complex process. You could take the “maximizing” approach: List each candidate’s pros and cons, multiply them by their probabilities and add up the results. Universities teach this form of rational decision making based on expected utility. But in a world of uncertainty where probabilities are unknowable, “the entire calculation may be built on sand.” Rules of thumb are more likely to lead to good marital choices. One such heuristic is “one-reason decision making: Find the most important reason and ignore the rest.” Satisficing entails first setting a level of aspiration and then choosing the first person who meets it. A recommended third step is falling in love and acting on that inexplicable gut feeling.

Chance and Medicine
Doctors who cannot explain their patients’ test results may compromise the health care they render. A gathering of gynecologists pondered this situation: An asymptomatic 50-year-old woman tests positive in a routine mammogram and wants to know how likely it is that she actually has breast cancer. The base rate of breast cancer among such women is one in 100, or 1%, and about 9% of mammograms given to healthy women produce false alarms. The gynecologists could choose among four answers to the patient’s question: nine in 10, eight in 10, one in 10 or one in 100. The third answer, one in 10, is best, because “mammography produces lots of false alarms...Out of 10 women who test positive in screening, one has cancer.” The other nine are false alarms. Yet most of the gynecologists said that eight or nine of every 10 positive mammogram results were reliable indicators of breast cancer. Natural frequencies make the impact of mammogram screening easier to see (a short calculation below works through the numbers):

• “One out of every 100 women has breast cancer.”
• “This woman with breast cancer is likely to test positive.”
• “Of the other 99 women, nine are likely to test positive.”

So of the roughly 10 women per 100 who test positive, only one actually has cancer. These statistics on actual versus detected cancer raise questions about the value of early screening. For women and men who get treatment for nonprogressive cancers, early detection often leads to unneeded radiation, biopsies and surgeries that reduce quality of life without extending it.
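Here is a minimal sketch, in Python, of the natural-frequency count behind the one-in-ten answer. The variable names are mine, and treating the one woman with cancer as testing positive is a simplification of “likely to test positive.”

```python
# Natural-frequency reasoning for the mammogram example.
# Numbers follow the summary: 1 in 100 women has breast cancer,
# the woman with cancer is assumed to test positive (a simplification),
# and 9 of the 99 women without cancer also test positive (false alarms).

women = 100
true_positives = 1                 # the one woman with cancer tests positive
false_positives = 9                # ~9% of the 99 healthy women test positive

positives = true_positives + false_positives        # 10 positive mammograms
p_cancer_given_positive = true_positives / positives

print(f"{positives} of {women} women test positive")
print(f"P(cancer | positive test) = {p_cancer_given_positive:.0%}")   # 10%
```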

“The war against cancer is not won by early detection. The best defense is prevention and developing better therapy.”

Research produced this misleading comparison: Men whose prostate cancer is detected at age 60 show a five-year survival rate of 100%, compared with 0% for men whose cancer is detected at age 67. The comparison is misleading because it fails to prove that early screening saves lives. In this illustration, every man whom prostate cancer killed was dead by age 70, whether the disease was found early or late; earlier detection merely starts the survival clock sooner, inflating the survival rate without postponing death. Such screening-based comparisons also sweep in nonprogressive or slow-growing cancers, a category that includes many prostate cancers. In fact, “more men die with prostate cancer than from it.” Thus, “as...prostate cancer screening demonstrates, early detection is not always a godsend for patients.”
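A rough sketch of why the survival-rate comparison misleads (lead-time bias); the ages mirror the illustration above, and the helper function is hypothetical.

```python
# Lead-time bias in a nutshell: earlier detection can raise the measured
# five-year survival rate even when no one lives a day longer.
# The ages below mirror the illustration in the summary.

death_age = 70

def five_year_survival(detection_age, death_age=death_age):
    """Fraction of patients still alive five years after detection."""
    return 1.0 if death_age - detection_age >= 5 else 0.0

print(five_year_survival(60))   # 1.0 -> 100% "survival" when cancer is found at 60
print(five_year_survival(67))   # 0.0 -> 0% when the same cancer is found at 67
# Either way, death occurs at 70; detection changed the statistic, not the outcome.
```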


“Only with both skills and a portion of curiosity and courage will we be able to take our lives into our own hands.”

The best defense against cancer is not early screening but prevention and the development of better therapies. Avastin is one of the world’s most widely prescribed cancer drugs, with 2006 sales totaling $6 billion. Yet in 16 trials with 10,000 cancer patients, more people died after adding Avastin to chemotherapy than died with chemotherapy alone. On the prevention side, about half of all cancers are due to behavior: Obesity, a poor diet and a lack of exercise cause up to 20% of all cancers, and alcohol abuse alone accounts for 10%.

Medical advances in the 20th century proved a mixed blessing for patients because they primarily served commercial aims, including the sale of drugs and screening technology. The 21st century “should become the century of the patient,” promoting health literacy and delivering “better care for less money.”

Achieving Better Risk Awareness
“Dread-risk fear” is the sense that you are vulnerable to the dramatic, fatal hazards that dominate news reports, including disease outbreaks and terrorist attacks. News media and governments often emphasize worst-case scenarios about disease and terrorism.

“An intuition is neither caprice nor a sixth sense but a form of unconscious intelligence.”

Trust is “the mother of all rules of thumb.”

For example, such worst-case scenarios inspired needless public panic over mad cow disease and swine flu, which were reported as massive public health threats but proved to be “two famous imagined catastrophes.” Many people who fear unlikely hazards still behave in perilous ways: Fear of dying in a shark attack, for example, may be more widespread than fear of dying in a car accident on the drive to the beach.

Revisions to education could help children become “risk savvy.” Using an “icon box” to represent “natural frequencies” is a useful alternative to teaching probabilities. Imagine a school of magic with 20 students. Five of the 20 have a wand, and four of those five also wear a magical hat. Of the 15 students without wands, 12 wear magical hats. Among the students who wear magical hats, what is the chance that a student also has a wand? The correct answer: 16 students wear magical hats (4 + 12), and four of those 16 also have a wand – a chance of four in 16, or 25%. Research with second-grade and fourth-grade students shows that the solution is much easier to see when the teacher uses an icon box picturing all 20 students, including the 16 with magical hats and, among them, the four who also have wands. (A short count at the end of this summary works through the same numbers.)

Effective “digital risk literacy” starts with understanding the danger of using a cellphone while driving. The reaction time of a 20-year-old talking on a cellphone slows to that of a 70-year-old, and drivers engaged in distracting phone conversations sometimes fail to “see” traffic lights and other vehicles even while looking at them. The inability to stop using handheld devices with Internet access raises further issues as people increasingly “outsource” mental functions like searching and remembering.

Paternalistic responses to hazards include “better technology, more laws and bigger bureaucracy.” Instead of these approaches, society should develop widespread risk literacy among ordinary people, who must grapple with finance, health care and other realms of daily life.
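As a closing illustration, here is a minimal sketch of the magic-school icon-box count described earlier; the variable names are mine.

```python
# Icon-box style count for the magic-school example:
# 20 students; 5 have wands (4 of whom also wear hats);
# of the 15 without wands, 12 wear hats.

students = 20
wand_and_hat = 4          # wand owners who also wear a magical hat
no_wand_with_hat = 12     # hat wearers without a wand

hat_wearers = wand_and_hat + no_wand_with_hat       # 16 students wear hats
p_wand_given_hat = wand_and_hat / hat_wearers

print(f"{hat_wearers} students wear hats; {wand_and_hat} of them also have a wand")
print(f"P(wand | hat) = {p_wand_given_hat:.0%}")    # 25%
```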


About the Author

Gerd Gigerenzer, the director of the Max Planck Institute for Human Development in Berlin, also wrote Gut Feelings.
