2018-07-16 RRG Notes
Introduction
- Most people would prefer not to destroy the world [citation needed]
- Even evil villains generally prefer to have a world in order to perform their evil in
- As a result, if the earth is destroyed, it will probably be a result of a mistake rather than an intentional act
- In order to minimize mistakes, we should study heuristics and biases to see how they could lead us into a situation in which we inadvertently destroy the earth or ourselves
Availability
- Are there more words that start with the letter "r", or more words that have "r" as the third letter?
- When most people are asked, they say that there are more words which start with the letter "r"
- However, in reality, the reverse is the case
- People guess that there are more words which start with "r" because they have an easier time recalling words which start with "r" than words which have "r" as the third letter
- This availability bias means that people systematically mis-estimate the likelihood of various risks, because they estimate based upon how often they've heard about the risk rather than how often the risk actually occurs
- Death from stomach cancer is far more likely than death from homicide but people's risk estimations are the reverse -- homicides are reported in the news, whereas stomach cancers are not
- People don't buy flood insurance even when it is priced artificially cheaply because they think about the worst flood that they have experienced, rather than the worst flood that has occurred
- Building dams reduces the frequency of flooding, which means that people take fewer precautions
- As a result, when flooding does occur, it is far more destructive
- On net, building dams may make flooding more economically damaging, not less
- Societies that are well protected against minor hazards take no action against major hazards
- Societies subject to regular minor hazards treat those minor hazards as an upper bound on the size of the risk
Hindsight Bias
- In hindsight, people routinely judge past events as having been far more predictable than they actually were
- Two groups of citizens were given a hypothetical scenario in which a city did not hire a bridge watchman at a drawbridge
- In one group, the citizens were only given the data that the city had when it chose to make the decision to not hire a bridge watchman
- In the other group, the citizens were given all of the data, plus the fact that a flood had occurred due to a blockage at the drawbridge
- The second group was significantly more likely to hold the city liable for negligence, even after they had been told to avoid hindsight bias -- debiasing attempts were not effective
- People, in hindsight, look at the cost of dealing with the one risk that actually occurred, not the cost of dealing with all the risks at that level of probability that could have occurred
Black Swans
- Nassim Taleb suggests that availability bias and hindsight bias combine to yield a vulnerability to black swans
- A "black swan" process is a process in which most of the variation in a process comes from random, hard-to-forecast, low-probability events
- Example: a financial strategy that earns $10 returns at 98% probability but suffers $1000 losses at 2% probability
- Most years this financial strategy will look to be a sure winner
- However, when there's a bad year, the losses are more than enough to wipe out all the gains from the good years (see the expected-value sketch after this section)
- This is not a hypothetical scenario -- one trader had a strategy that worked without fail for six years, yielding profits of close to $80 million, before he was wiped out by a $300 million loss in the seventh year
- Long Term Capital Management lost $100 million per day during the financial crisis of 1998
- LTCM called what occurred a 10-sigma event, but that's obviously not true -- a ten-sigma event is so unlikely that even if the universe were ten times as old as it is today, the event still should not have been expected to occur
- It's far more likely that LTCM's market models were wrong and underestimated the risk of the markets behaving in this manner
- Hindsight bias predisposes us to think that because the past is predictable, the future is also predictable
- Hindsight bias predisposes us to learn overly specific lessons about the past
- Moreover, the prevention of black swans is not easily seen or rewarded
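The payoff example above can be made concrete with a minimal simulation. This is a sketch, assuming one bet per year and using the notes' hypothetical $10 / $1000 / 98% / 2% figures rather than any real trading data; it shows how almost every year looks profitable while the long-run expected value is negative.

```python
import random

# Hypothetical black-swan payoff from the notes: gain $10 with probability 0.98,
# lose $1000 with probability 0.02 (treated as one bet per year, for simplicity).
GAIN, LOSS, P_LOSS = 10, -1000, 0.02

# Analytic expected value per bet: 0.98 * 10 + 0.02 * (-1000) = -10.20
ev = (1 - P_LOSS) * GAIN + P_LOSS * LOSS
print(f"Expected value per year: ${ev:.2f}")

random.seed(0)
years = [LOSS if random.random() < P_LOSS else GAIN for _ in range(10_000)]

winning_fraction = sum(1 for y in years if y > 0) / len(years)
print(f"Fraction of years that look like winners: {winning_fraction:.1%}")  # roughly 98%
print(f"Cumulative result over all years: ${sum(years):,}")                 # a large net loss
```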
The Conjunction Fallacy
- The conjunction rule of probability states that P(A and B) is always less than or equal to both P(A) and P(B)
- However, adding details to a story makes the story seem more believable, even though the probability of every single one of the details being true decreases
- Example: Linda the bank teller
- Linda is a hypothetical person who majored in philosophy and is interested in feminist causes and social justice
- Is it more likely that Linda is a bank teller or that Linda is a bank teller who is active in the feminist movement?
- People pick the latter statement as being more likely, even though it is mathematically less likely
- The statement with more detail paints a clearer picture, even though it's more likely to be false
- People choose to bet on longer sequences of dice rolls than shorter ones, even though any given sequence of 4 dice rolls is more probable than any given sequence of 5 dice rolls
- We substitute in a notion of "representativeness" for a calculation of probability
- People will pay more to defend against a nanotechnological attack from China than they will to defend against a nanotechnological attack in general
- Vivid, specific scenarios can inflate our sense of security
- People tend to overestimate conjunctive probabilities and underestimate disjunctive probabilities
- People will overestimate the probability of 7 events with 90% probability all occurring
- People will underestimate the probability of at least 1 of 7 events, each with 10% probability, occurring
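A quick calculation makes the last two points concrete. This is a sketch that assumes the seven events are independent (an assumption for illustration; the notes don't specify), plus made-up numbers for the Linda example to show the conjunction rule itself.

```python
# Conjunctive vs. disjunctive probabilities for seven independent events.
p_all = 0.9 ** 7                 # all 7 events, each 90% likely, occur
p_any = 1 - (1 - 0.1) ** 7       # at least 1 of 7 events, each 10% likely, occurs
print(f"P(all 7 occur)       = {p_all:.3f}")   # ~0.478 -- worse than a coin flip
print(f"P(at least 1 occurs) = {p_any:.3f}")   # ~0.522 -- better than a coin flip

# The conjunction rule: P(A and B) can never exceed P(A) or P(B).
# The numbers below are made up purely to illustrate the Linda example.
p_teller = 0.05                   # P(Linda is a bank teller)
p_feminist_given_teller = 0.20    # P(active feminist | bank teller)
p_both = p_teller * p_feminist_given_teller
assert p_both <= p_teller         # adding detail can only lower the probability
```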
Confirmation Bias
- People try to confirm hypotheses rather than try to disprove them
- Example: 2-4-6 task
- The experimenter announces that the sequence 2-4-6 fits a particular hidden rule
- The subject proposes triplets of their own, is told whether each one fits the rule, and then guesses the rule (a small simulation of this task follows this section)
- Even though subjects expressed high confidence in their guesses, only 21% of them actually guessed the rule correctly
- Confirmation bias comes in two forms, "cold" and "hot"
- Cold form is the 2-4-6 example above - emotionally neutral
- Hot form is with emotionally charged arguments, like in politics
- Hot confirmation bias is more resistant to change
- People easily accept arguments for what they already believe and subject counterarguments to more scrutiny
- Two biased observers viewing the same stream of evidence can update in opposite directions, as they selectively choose data which already confirms their pre-existing beliefs
- We must apply our knowledge of heuristics and biases evenhandedly -- apply it to arguments that we accept as well as arguments that we disagree with
- Personal example: plastics in the ocean. I keep hearing that microscopic plastic particles are terrible, but what are their actual effects on marine life?
- This is relevant because we're using this logic to get rid of things like plastic utensils and plastic straws
- People decide what they believe far more swiftly than they realize
- If you can guess what your answer will be, then you already know what your answer will be, to a high degree of confidence
- It's not a true crisis of faith unless things could legitimately go either way
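A tiny sketch of the 2-4-6 task mentioned above. It assumes the hidden rule from Wason's original experiment ("the three numbers are in increasing order") and illustrates why probing only with triplets that fit your own hypothesis never falsifies it.

```python
# Hidden rule in Wason's original 2-4-6 task: the numbers simply increase.
def fits_rule(a, b, c):
    return a < b < c

# Positive-test strategy: only propose triplets that fit your own hypothesis,
# e.g. "each number is 2 more than the last". Every probe comes back "yes",
# so the (wrong, overly narrow) hypothesis is never challenged.
confirming_probes = [(2, 4, 6), (8, 10, 12), (20, 22, 24), (100, 102, 104)]
print([fits_rule(*t) for t in confirming_probes])   # [True, True, True, True]

# A probe your hypothesis predicts should fail is the one that carries information.
print(fits_rule(1, 2, 50))   # True -- so "add 2 each time" was too narrow
```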
Anchoring, Adjustment and Contamination
- People anchor estimates to data that they've just received, even when that data is completely unrelated to the task at hand
- Example: people were asked to guess the number of countries in the United Nations, after watching a wheel of fortune yield a random number
- People who saw the number "65" come up guessed higher than people who saw the number "15"
- Payoffs for accuracy did not change the magnitude of the effect
- People started with the anchoring point, and then adjusted their estimate up or down until it seemed reasonable, then they stopped adjusting
- The generalized form of anchoring is contamination
- Almost any prior information given can contaminate a judgment
- Manipulations to try to offset contamination are largely ineffective
- Placing the subject in a "cognitively busy" environment seems to increase contamination effects
- People will also consistently say that they were not affected by the anchor, even though the experimental evidence showed that they were
The Affect Heuristic
- Subjective impressions of "goodness" or "badness" can produce fast judgments
- People's subjective assessments of the "goodness" or "badness" of a technology colors their assessment of the possible risks of that technology
- Providing information that increases the perceived benefit of a technology decreases its perceived risk, and vice versa
- This effect is magnified by sparse information, which is particularly troubling for the evaluation of future technology
- More powerful technologies may be rated as less risky if they can also promise great benefits
- Biotechnology can create cures for disease, but it can also lead to more potent biological weapons
- Nanotechnology can create new materials and faster computers, but it can lead to new forms of pollution and "gray-goo" scenarios
Scope Neglect
- People are willing to pay only a little bit more in order to have a much larger benefit
- Example:
- The median price that people are willing to pay to save 20,000 birds is $78
- The median price that people are willing to pay to save 200,000 birds is $88 (the implied per-bird arithmetic is sketched after this section)
- Possible explanations include:
- Affect heuristic, combined with availability bias - people reach for a prototypical example of the benefit they're trying to achieve and base their stated willingness to pay on how that single example makes them feel
- People are choosing to buy a certain amount of moral satisfaction, and pay based upon the moral satisfaction they feel rather than the amount of good being done
- People pay based upon how much money they have mentally allocated to the cause area
- Scope neglect applies to both human and animal lives
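The bird figures above imply a striking per-bird arithmetic; a minimal sketch using only the numbers quoted in the notes:

```python
# Implied willingness to pay per bird, using the median figures quoted above.
wtp = {20_000: 78.0, 200_000: 88.0}
for birds, dollars in wtp.items():
    print(f"{birds:>7,} birds: ${dollars:.2f} total -> {100 * dollars / birds:.3f} cents per bird")
# 0.390 cents per bird vs. 0.044 cents per bird --
# a 10x increase in scope buys only about a 13% increase in stated willingness to pay.
```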
Calibration and Overconfidence
- People are wildly overconfident in their estimates
- Even when asked to give 98% confidence intervals, the true value was outside their confidence interval 42.6% of the time
- In other words, people assigned a 2% probability to events that occurred more than 42% of the time
- Letting people know about calibration makes them better calibrated, but still not well calibrated -- after calibration training, the true value still lay outside people's 98% confidence intervals 19% of the time (a minimal way to score this is sketched after this section)
- People don't realize how wide a range they need in order to have high confidence
- This especially applies to planning
- Planning fallacy
- People were asked to estimate the delivery date of their honors thesis
- On average people missed their average case estimate by 22 days and their worst-case estimate by 7 days
- Only 45% of students managed to finish their thesis by their 99% probability interval date
- Reality usually delivers results that are somewhat worse than the worst case
- Calibration and overconfidence form another bias accusation that has to be applied especially evenhandedly
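A minimal way to score the kind of calibration described above: count how often the true value falls outside a stated 98% interval. The intervals and true values below are made-up placeholders, not the study's data; a well-calibrated estimator should be surprised about 2% of the time.

```python
# Score calibration of 98% confidence intervals: surprise rate should be ~2%.
# These intervals and true values are made-up placeholders for illustration.
intervals   = [(10, 50), (0, 5), (100, 200), (3, 4), (20, 90)]
true_values = [42, 7, 150, 10, 60]

surprises = sum(1 for (lo, hi), v in zip(intervals, true_values) if not (lo <= v <= hi))
print(f"Claimed surprise rate: 2%  Observed: {surprises / len(true_values):.0%}")
```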
Bystander Apathy
- People are far less likely to act when they're in a group than when they're on their own
- When people are in a situation that is ambiguously an emergency they look around for social evidence on how to react
- However, everyone else in such a situation is also looking for social proof
- As a result, people's natural instinct to look calm and unruffled kicks in and no one acts unless there is unambiguous evidence that the situation is an emergency
- This is the answer to the question, "If existential risk X is a real threat, why aren't more people doing something about it?"
- It is not unambiguously clear whether X is a real threat
- As a result, there isn't enough social proof to justify publicly acting on X
A Final Caution
- Every true idea that discomforts you will tend to match the pattern of at least one psychological error
- We care about cognitive biases and psychological errors only insofar as they result in factual errors
- If there are no factual errors, then why do we care about the psychology?
- Before you say why someone is wrong, you must prove that they are wrong
Conclusion
- We need to have an organized body of thinking about existential risks not because the risks are all similar, but because we have similar flaws in how we think about them
- Skilled practitioners in a field will not automatically know of the existential risks their field generates
- Right now, most people stumble across the knowledge of heuristics and biases accidentally -- we should formalize and spread this knowledge so that more people outside of psychology know of these results
- Thinking about existential risk falls prey to the same biases and heuristics that we use for all of our thinking, only the stakes are much higher
- When thinking about the fate of all humanity, people use non-extensional reasoning -- they imagine "humanity" as a separate thing, as if the destruction of humanity did not imply the destruction of everything they hold dear