Thinking in Bets

Thinking in Bets – Making Smarter Decisions When You Don’t Have All the Facts

By Annie Duke

The book promises that thinking in bets will improve decision making throughout our lives.

Annie Duke took what was supposed to be a temporary break from graduate school that turned into a twenty-year career as a professional poker player.

She discovered a new kind of lab for studying how people learn and make decisions.

Each poker hand provides immediate feedback, though the signals it gives about decision quality are loose because luck intervenes.

Bet = decision about an uncertain future

Treating decisions as bets keeps emotions out of the process (as much as possible).

Thinking in bets moves you toward objectivity, accuracy, and open-mindedness – and the move compounds over time

2 things that determine how our lives turn out = the quality of our decisions and luck.

Learning to recognize the difference between the two is what thinking in bets is all about.

Chapter 1 – Life is Poker, Not Chess

Pete Carroll got unlucky when, with the ball at the one-yard line late in the Super Bowl, he called for a pass play that was intercepted.

He made a good-quality decision that got a bad result.

Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome = “resulting”

It was the worst result of a call, ever (literally losing the Super Bowl).

If I ask you to remember your best decision and your worst decision from the past year, you will likely identify the decisions with the best and worst results.

Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable.

Our brains evolved to create certainty and order, and we are uncomfortable with the idea that luck plays a significant role in our lives.

Type I error = false positive

Type II error = false negative
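
A minimal illustration with made-up numbers (not from the book), assuming a simple rain-forecast example, showing how the two error types are counted:

    # Hypothetical example: tally Type I and Type II errors by comparing
    # each forecast (the "bet") against what actually happened.
    forecasts = [
        {"predicted_rain": True,  "rained": False},  # Type I: called rain, stayed dry
        {"predicted_rain": False, "rained": True},   # Type II: missed the rain
        {"predicted_rain": True,  "rained": True},   # correct call
    ]

    type_1 = sum(f["predicted_rain"] and not f["rained"] for f in forecasts)
    type_2 = sum(not f["predicted_rain"] and f["rained"] for f in forecasts)
    print(f"Type I (false positives): {type_1}, Type II (false negatives): {type_2}")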

Different brain functions compete to control our decisions (Source: Daniel Kahneman’s Thinking, Fast and Slow about “System 1” and “System 2”)

System 1 = “fast thinking” encompasses reflex, instinct, intuition, impulse, and automatic processing = “reflexive mind”

System 2 = “slow thinking” is how we choose, concentrate, and expend mental energy = “deliberative mind”

Automatic processing originates in the evolutionarily older parts of the brain, including the cerebellum, basal ganglia, and amygdala.

Our deliberative mind operates out of the prefrontal cortex.

Colin Camerer says, “We have this thin layer of prefrontal cortex made just for us, sitting on top of this big animal brain. Getting this thin little layer to handle more is unrealistic.” [because our deliberative capacity is already maxed out.]

Most of what we do daily exists in automatic processing, and the challenge is to figure out how to work within the limitations of the brains we have.

We must be aware of our irrational behaviors, and we can look for practical work-arounds.

Chapter 2 – Wanna Bet?

Chapter 3 – Bet to Learn: Fielding the Unfolding Future

Chapter 4 – The (Accountability) Buddy System

David Letterman interviewed Lauren Conrad (MTV’s The Hills reality star) about her relationship drama and asked “Maybe you’re the problem?”

The internet went wild, saying that Letterman “basically calls Conrad an idiot,” “ripped into Conrad,” and “makes fun of Conrad.”

Letterman’s mistake was offering up the insight in an inappropriate forum to someone who had not agreed to that kind of truthseeking exchange.

Such interactions are reminders that not all situations are appropriate for truthseeking, nor are all people interested in the pursuit.

“Lettermanning” needs agreement by both parties to be effective.

“You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in Wonderland, and I show you how deep the rabbit hole goes. All I am offering is the truth. Nothing more.” – Morpheus

Morpheus asked Neo to make the choice to exit the matrix with him – to opt-in to truthseeking

In Annie Duke’s poker circles, and in our peer groups in life, we can make learning pods.

You have to learn to focus on the things you can control (your own decisions), let go of all the things you can’t control (luck), and work to be able to accurately tell the difference between the two.

Thinking in bets is easier if you have other people to help you.

Remember the buddy system from elementary school?

A good decision group is a grown-up version of the buddy system = truthseeking pod

Forming or joining a group where the focus is on thinking in bets means modifying the usual social contract. It means agreeing to be open-minded toward those who disagree with us, giving credit where it’s due, and taking responsibility where it’s appropriate, even (and especially) when it makes us uncomfortable.

Other people can spot our errors better than we can.

As long as there are three people in the group (two to disagree and one to referee), the truthseeking group can be stable and productive.

Different friends fill different needs.

What’s in the truthseeking agreement? What are the features of a productive decision-making pod?

The most well-known example of a productive group approach is Alcoholics Anonymous (AA).

Bill W. started AA, and he learned that to maintain sobriety, he needed to talk to another alcoholic.

This all sprang from the concept that we can do better with the help of others.

Confirmatory thought promotes a love and celebration of one’s own beliefs, distorting how the group processes information and works through decisions; the result can be groupthink or an echo chamber.

Exploratory thought encourages open-minded and objective consideration of alternative hypotheses and a tolerance of dissent to combat bias.

You can improve the thinking of individual decision-makers when individuals are accountable to a group whose interest is in accuracy.

To avoid confirmatory thought and promote exploratory thought: “Complex and open-minded thought is most likely to be activated when decision makers learn prior to forming any opinions that they will be accountable to an audience (a) whose views are unknown, (b) who is interested in accuracy, (c) who is reasonably well-informed, and (d) who has a legitimate reason for inquiring into the reasons behind participants’ judgments or choices.” said Lerner and Tetlock.

Individuals are accountable to a group whose interest is in accuracy = improved thinking of individual decision-maker

You should encourage and celebrate a diversity of perspectives to challenge your biased thinking.

Blueprint for Truthseeking Charter:

  • Focus on accuracy (over confirmation), which includes rewarding truthseeking, objectivity, and open-mindedness within the group

  • Accountability, for which members have advance notice

  • Openness to a diversity of ideas

We win bets by relentlessly striving to calibrate our beliefs and predictions about the future to more accurately represent the world.

Annie once told a member of her truthseeking group of poker players about her bad luck in losing. In three sentences, her referee laid out all the elements of a productive group charter: “I don’t want to hear it. I’m not trying to hurt your feelings, but if you have a question about a hand, you can ask me about strategy all day long. I just don’t think there’s much purpose in a poker story if the point is about something you had no control over, like bad luck.”

He encouraged her to find things she might have control over and ways to improve decisions about those things.

Our craving for approval is incredibly strong and incentivizing.

A productive decision group can harness this desire by rewarding accuracy and intellectual honesty with social approval.

AA tokens are a tangible reminder that others acknowledge you are accomplishing something difficult.

Annie experienced firsthand the power of a group’s approval to reshape individual thinking habits because she got her fix by trying to be the best credit-giver, the best mistake-admitter, and the best finder-of-mistakes-in-good-outcomes.

You feel disapproval from the group when you act against the charter (and complain about bad luck or something outside your control).

Talking about winning (even if we are identifying mistakes along the way to a win) is less painful than talking about losing, allowing new habits to be more easily trained. Identifying mistakes in hands you won reinforces the separation between outcomes and decision quality.

Once we are in a group that regularly reinforces exploratory thought, the routine becomes reflexive, running on its own. It’s self-reinforced. We internalize the group’s approval.

Accountability is a willingness or obligation to answer for our actions or beliefs to others, and a bet is a form of accountability.

In an environment where you are accountable to others, you become hyper-vigilant about your level of confidence in your beliefs.

Accountability improves our decision-making because we know in advance that we will have to answer to the group for our decisions.

Loss Limit = a preset amount (Annie’s was $600) that you can lose at the game; if that amount is hit, you must walk away from the table

You preset the Loss Limit when you are rational without any emotions that may be pulsing through you after losing a big hand.

Accountability makes future conversations run in your head, like the one where you explain to your group why you kept playing after losing $600.
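
A minimal sketch of the loss-limit idea (the $600 figure is the book’s example; the function name is made up for illustration): the limit is chosen in advance, while calm, so the in-the-moment check is purely mechanical.

    # Hypothetical sketch: the loss limit is fixed before play begins,
    # so deciding whether to keep playing is a mechanical comparison,
    # not an emotional one made after a bad hand.
    LOSS_LIMIT = 600  # preset dollars, agreed on with the accountability group

    def must_walk_away(starting_bankroll: float, current_bankroll: float) -> bool:
        """Return True once session losses reach the preset limit."""
        return (starting_bankroll - current_bankroll) >= LOSS_LIMIT

    print(must_walk_away(2000, 1450))  # down $550 -> False, still under the limit
    print(must_walk_away(2000, 1350))  # down $650 -> True, time to leave the table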

John Stuart Mill is one of the heroes of thinking in bets. He wrote On Liberty which discussed the importance of diversity of opinion, “The only way in which a human being can make some approach to knowing the whole of a subject, is by hearing what can be said about it by persons of every variety of opinion, and studying all modes in which it can be looked at by every character of mind. No wise man ever acquired his wisdom in any mode but this; nor is it in the nature of human intellect to become wise in any other manner.”

A group can expose itself to diverse opinions, test alternative hypotheses, and move toward accuracy.

An individual cannot get the diversity of viewpoints provided by the combined brainpower of a well-formed decision pod.

Decision pods can combat motivated reasoning about beliefs and biased outcome fielding through questions like:

  • Why might my belief not be true?

  • What other evidence might bear on my belief?

  • What sources of information could I have missed or minimized on the way to reaching my belief?

  • What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me?

  • What other perspectives are there as to why things turned out the way they did?

By asking ourselves these questions, we are taking a big step toward calibration.

We don’t know what different information other people have. But they do.

Others aren’t wrapped up in preserving our narrative, anchored by our biases. These are ideal circumstances for promoting accuracy.

CIA “red teams” are dedicated to arguing against the intelligence community’s conventional wisdom and spotting flaws in logic and analysis.

John Stuart Mill’s bedrock principle is that we can’t know the truth of a matter without hearing the other side.

Diversity is the foundation of productive group decision-making.

Case Study: Political judges and scientists

A three-judge panel with two Democrats and one Republican (or the reverse) decides better than a panel of three judges from the same party.

One good judge prided himself on hiring liberal clerks. He would tell his clerks that the conservative side of the issues came to him naturally; their job was to present the other side and to challenge him. He would rather encounter a compelling argument for another position in the privacy of his own chambers than meet it unexpectedly at conference or in a dissent.

Mark Twain said, “Never try to teach a pig to sing. It wastes your time and annoys the pig.”

There is a natural drift toward homogeneity and confirmatory thought. We all experience this gravitation toward people who think like we do.

Scientists aren’t immune, even if they are overwhelmingly trained and chartered toward truthseeking.

IQ is positively correlated with the number of reasons people find to support their own side in an argument.

Making risk explicit rather than implicit refocuses us all to be more objective.

People are more willing to offer their opinion when the goal is to win a bet rather than get along with people in a room.

Accuracy, accountability, and diversity wrapped into a group’s charter all contribute to better decision-making, especially if the group promotes thinking in bets.

Chapter 5 – Dissent to Win

CUDOS comes from Robert K. Merton (an amateur magician in his youth), who died in 2003; the New York Times called him “one of the most influential sociologists of the 20th century.”

One summary of the idea: “an ideologically balanced science that routinely resorted to adversarial collaborations to resolve empirical disputes would bear a striking resemblance to Robert Merton’s ideal-type model of a self-correcting epistemic community, one organized around the norms of CUDOS.”

Communism (data belongs to the group – share data, don’t hoard it)

Universalism (apply uniform standards to claims and evidence, regardless of their source – remove bias about the messenger)

Disinterestedness (vigilance against potential conflicts that can influence the group’s evaluation)

Organized Skepticism (discussion among the group to encourage engagement and dissent)

Merton began writing about the normative structure of science in 1942, and the twelve-page paper he published 31 years later serves as an excellent manual for developing rules of engagement for any truthseeking group.

CUDOS – communism, universalism, disinterestedness, and organized skepticism can be broadly applied and adapted to push a group toward objectivity.

Mertonian Communism = more is more

Not the political system – it refers to the communal ownership of data within groups because an individual researcher’s data must eventually be shared with the scientific community at large for knowledge to advance.

Secrecy is the antithesis of this norm – full and open communication its enactment.

Richard Feynman described the ideal of scientific sharing, “a kind of utter honesty, or a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid, and not only what you think is right about it. You should share other causes that could possibly explain your results…”

More is more, so get all the information out there for everyone.

As a rule of thumb, if we have an urge to leave out a detail because it makes us uncomfortable or requires even more clarification to explain away, that is exactly the detail we must share. The mere fact of our hesitation and discomfort is a signal that such information may be critical to providing a complete and balanced account.

Sharing data and information, like the other elements of a truthseeking charter, is done by agreement.

An agreement to share details pertinent to assessing the quality of a decision is part of a productive truthseeking charter.

After we got every detail out about all the dimensions of the decision, we reached a different conclusion…

Be a data sharer. That’s what experts do. Experts understand that sharing data is the best way to move toward accuracy.

What the experts recognize is that the more detail you provide, the better the assessment of decision quality you get.

Coach John Madden went to a Vince Lombardi coaching clinic, and Lombardi described the power sweep, one play, for eight hours. Madden said, “I went in there cocky, thinking I knew everything there was to know about football, and he spent eight hours talking about this one play…I realized then that I actually knew nothing about football.”

We are naturally reluctant to share information that could encourage others to find fault in our decision-making. Annie’s group made this easier by making her feel good about committing herself to improvement.

The reward will be better habits of mind.

Agree to be a data sharer and reward others in your decision group for telling more of the story.

Universalism – don’t shoot the message

“Don’t shoot the messenger” is good advice for the same reason we want to protect and encourage dissenting ideas: the message should be evaluated on its own merits.

Chapter 6 – Adventures in Mental Time Travel

Back to the Future gave bad advice when Doc Brown told us, “Whatever you do, don’t meet up with yourself!”

Doc Brown warned that “the encounter could create a time paradox, the results of which could cause a chain reaction that would unravel the very fabric of the space-time continuum and destroy the entire universe.”

Timecop cautioned that “the same matter can’t occupy the same space at the same time.”

In real-life decision-making, when we bring our past-self or future-self into the equation, the space-time continuum doesn’t unravel.

As decision-makers, we want to collide with past and future versions of ourselves. Our capacity for mental time travel makes this possible.

Chronesthesia = Mental time travel through the ability to be aware of our past or future

We can recruit other versions of ourselves to act as our own decision buddies

We can harness the power of mental time travelling, operationalizing it, encouraging it, and figuring out ways to cause that collision of past, present, and future as much as possible.

The constant exchange of chips reminds poker players that there is risk in every decision – every decision has consequences.

We need strategies designed to recruit past-us and future-us to help with all the executive decisions.

No strategy can turn us into perfectly rational actors. We can make the best possible decisions and still not get the result we want.

Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them

Effort makes a small difference: more rational thinking and fewer emotional decisions translate into an increased probability of better outcomes.

Good results compound. Good processes become habits and make future calibration and improvement possible.

Night Jerry

Jerry Seinfeld said, “I stay up late at night because I’m Night Guy. Night Guy wants to stay up late. What about getting up after five hours of sleep? That’s Morning Guy’s problem. That’s not my problem. I’m Night Guy. I stay up as late as I want. So you get up in the morning: you’re exhausted, you’re groggy. Oh, I hate that Night Guy. See, Night Guy always screws Morning Guy.”

This is a good example of how we struggle in the present to take care of our future-self.

When we make in-the-moment decisions (and don’t ponder about the past or future), we are more likely to be irrational and impulsive.

Temporal discounting is the tendency we all have to favor our present-self at the expense of our future-self.

Daniel Kahneman recognizes that System 2 should not be considered immune to bias. We can reduce the likelihood of emotionally driven decisions and decrease the influence of bias through self-reflection and vigilance. One way to do this is to take advantage of mental time-travel strategies.

When Night Jerry stays up late, it’s because it benefits him now; he discounts the benefits that come later from going to bed.

Saving for retirement is a temporal discounting problem because the gratification of spending discretionary income is immediate.
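
One standard way to formalize temporal discounting (an illustrative textbook model, not something the book works through): a reward received in the future is worth less to present-us, which is why spending now so often beats saving for retirement.

    # Illustrative exponential discounting: a reward t years away is valued
    # today at reward / (1 + r) ** t. The rate r is a made-up example value.
    def present_value(reward: float, annual_discount_rate: float, years: float) -> float:
        return reward / (1 + annual_discount_rate) ** years

    # $10,000 of retirement spending 30 years out, discounted at 8% a year,
    # "feels" like roughly $994 today -- so present-us under-weights future-us.
    print(round(present_value(10_000, 0.08, 30), 2))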

Retirement planning

Time traveling can get us in touch with that future version of us. It can get future-us to remind present-us. The idea is that seeing our aged future-self can help us make better allocation decisions.

When we think about the past and the future, we engage a deliberative mind, improving our ability to make a more rational decision.

Our vision of the future is rooted in our memories of the past. The future we imagine is a novel reassembling of our past experiences.

The same neural network is engaged when we imagine the future as when we remember the past.

Thinking about the future is remembering the future, putting memories together in a creative way to imagine a possible way things might turn out.

Those brain pathways include the hippocampus (a key structure for memory) and the pre-frontal cortex, which controls System 2. This is our cognitive control center for deliberative decision-making.

Fundamentally, Morning Jerry and Night Jerry are living the same life. We want Night Jerry and Morning Jerry colliding on the decision of when to get some sleep. We want all those Marty McFlys to get the additional perspective of all the other Marty McFlys.

Philosophers agree that regret is one of the most intense emotions we feel.

Nietzsche said that remorse was “adding to the first act of stupidity a second.”

If regret occurred before a decision instead of after, the experience of regret might get us to change a choice likely to result in a bad outcome.

It would be helpful if we could get regret to do some time traveling of its own, and move regret before our decisions instead of after them.

Before Annie Duke sat down at a poker table, she set her loss limit at $600. The limit was predefined by her rational mind, before any emotions could cloud her judgment in the heat of the moment at the table. Because of the loss-limit agreement she had made with herself and her accountability group, she ran through the conversation she would be forced to have when she explained why she kept playing beyond her limit. It gave her a chance to regret the decision before she even bought more chips.

Time Travel Goal: To create moments like that, where we can interrupt an in-the-moment decision and take some time to consider the decision from the perspective of our past and future. We can then create a habit routine around these decision interrupts to encourage this perspective taking, asking ourselves a set of simple questions at the moment of the decision designed to get future-us and past-us involved.

Imagine how future-us is likely to feel about the decision, or imagine how we might feel about the decision today if past-us had made it. The approaches are complementary. Whether you choose to travel to the past or to the future depends solely on which approach you find most effective.

10-10-10-Process

    • What are the consequences of each of my options in 10 minutes?

    • In 10 months?

    • In 10 years?

    • This set of questions triggers mental time travel that cues that accountability conversation

    • How would I feel today if I had made this decision 10 minutes ago?

    • 10 months ago?

    • 10 years ago?

    • Whichever frame we choose, we draw on our past experiences (including similar decisions that we may have regretted)

This recruits into the decision the less-reactive brain pathways that control executive functioning: it activates the neural pathways that engage the prefrontal cortex, inhibits the emotional mind, and keeps events in a more rational perspective. It discourages us from magnifying the present moment.

You can use a 10-10-10-like strategy to recruit your past-self and future-self by asking:

    • How have I felt when I made this decision in the past?

    • How has it generally worked out?

    • When I look back, do I feel I was playing my best (or being my best self)?

    • Do I think this will really matter in the long run?

    • The routine of asking yourself these questions helps mitigate the in-the-moment risk that, as you might be losing your mental edge, you will try to convince yourself to make a bad decision

    • 10-10-10 strategy gets us to imagine the decision or outcome in the perspective of the past and the future.

    • Time-travel strategy calms down the in-the-moment emotions we have about an event, so we can get back to using the more rational part of our brain (a rough sketch of such a decision interrupt follows this list)
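
A rough sketch of a 10-10-10 decision interrupt (the question wording follows the notes above; the helper itself is hypothetical):

    # Hypothetical decision-interrupt helper: pause and answer each 10-10-10
    # prompt before committing to an in-the-moment decision.
    HORIZONS = ["10 minutes", "10 months", "10 years"]

    def ten_ten_ten(decision: str) -> dict:
        answers = {}
        for horizon in HORIZONS:
            answers[f"consequences in {horizon}"] = input(
                f"What are the consequences of '{decision}' in {horizon}? ")
            answers[f"feeling if decided {horizon} ago"] = input(
                f"How would I feel today if I had made this decision {horizon} ago? ")
        return answers

    # Example: ten_ten_ten("keep playing after hitting my $600 loss limit")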

Moving regret in front of a decision has numerous benefits.

It can influence us to make a better decision.

It can help us treat ourselves (regardless of the actual decision) more compassionately after the fact.

By planning ahead, we can devise a plan to respond to a negative outcome instead of just reacting to it.

Coming to peace with a bad outcome in advance will feel better than refusing to acknowledge it.

After-the-fact regret can consume us. Like all emotions, regret initially feels intense but gets better with time.

Time-travel strategies can help us remember that the intensity of what we feel now will subside over time.

Analogy: We make long-term stock investments because we want them to appreciate over years or decades. We would be better off thinking about our happiness as a long-term stock holding. We would do well to view our happiness through a wide-angle lens, striving for a long, sustained upward trend in our happiness stock. Mental time travel makes that kind of perspective possible.

Our decision-making becomes reactive, focused on off-loading negative emotions or sustaining positive emotions from the latest change in the status quo.

This may result in self-serving bias: fielding outcomes to off-load the negative emotions we feel in the moment from a bad outcome by blaming it on luck, and to sustain the positive emotions from good outcomes by taking credit for them. Perhaps it’s better to be humble, gracious, and resilient. Beware of the self-fulfilling prophecy when decisions are driven by (negative) emotional moments.

But wait, there’s more!

The way we field outcomes is path dependent.

How we got there matters more to how we feel than where we end up.

What has happened in the recent past drives our emotional response much more than how we are doing overall.

The zoom lens doesn’t just magnify, it distorts: a negative outcome feels roughly twice as bad as an equivalent positive outcome feels good.

Our in-the-moment emotions affect the quality of the decisions we make in those moments, and we are very willing to make decisions when we are not emotionally fit to do so.

When you think about the outcomes as having happened in the distant past, it is likely your preference for the results reverses, landing in a more rational place. Once we pull ourselves out of the moment through time-traveling exercises, we can see these things in proportion to their relative size.

Legends of the profession said, “It’s all just one long poker game.” This aphorism is a reminder to take the long view, especially when something big happened in the last half hour, or the previous hand, or when we get a flat tire. Taking the long view leads to thinking in a more rational way.
