Source: http://musingzebra.com/illusions-of-the-stock-market/
When Warren Buffett wrote that investors should keep their eyes on the playing field instead of the scoreboard, the emphasis was on focusing on what is truly important. But perhaps his message goes beyond how one should spend one's time; it is also a warning about the peril of constantly checking the scoreboard, the daily stock price. Following the market opens a Pandora's box of psychological misjudgment, like a nudge to a domino that triggers a chain reaction in which biases quickly snowball into insurmountable mistakes.
I’m confident you still remember the story of The Tortoise and The Hare even though you haven’t read it since childhood. Stories simplify reality so we can grasp the cause-and-effect relationships behind an event. We seek causal explanations to feel assured. When there’s a significant move in the market, we want to know what caused it to fall, or how it happened, so we can better predict the future. Not knowing the cause creates anxiety. But the problem is that most market news stories come with hindsight bias.
Like a word highlighted in a word search puzzle, things always look obvious in hindsight. Any event can be explained from the rearview mirror. The cause of an event can be traced back to its origin, with all the indicators that precipitated it identified and analyzed. At times, we use the most plausible, or even a nonsensical, explanation for the past when it’s impossible to identify the exact cause. In Fooled by Randomness, Nassim Taleb points out that on the day of Saddam Hussein’s capture, the Bloomberg headline read “U.S. Treasuries Rise; Hussein Capture May Not Curb Terrorism.” But when bond prices fell back, the headline read “U.S. Treasuries Fall; Hussein Capture Boosts Allure of Risky Assets.” We seldom verify and fact-check what’s true (you can’t interview the market). Rather, we go by the coherence of the story to intuitively decide whether it is true. This folly of accepting any explanation that fits the bill fools us into thinking the future is as predictable as the past. We fail to acknowledge that the future is always uncertain, and that the past is just one of many alternative histories: what happened is only one of the things that could have happened.
I am a psychic when it comes to market prediction. My mind can create compelling stories about why a stock will fall the next day, or how an event will unfold just as I imagined. If it doesn’t, the news offers plenty of hindsight stories to convince me I knew it all along. I feel good about my predictive ability, not knowing I was fooled by the illusion of causality, finding patterns in randomness as if every ripple in the ocean could be explained.
Take regression to the mean – a statistical concept: an extreme outcome that is largely the result of luck tends to be followed by a more moderate one. For example, if you get heads 9 times out of 10 coin flips, which is an extreme outcome, the next 10 flips are likely to land closer to the mean of 5 heads and 5 tails. While it’s easy to see that a coin flip is a game of luck, things are more complicated in business and investing. A business can get a lucky break by riding an industry boom regardless of management’s competency. But luck doesn’t make a good story. A riveting story of success needs a deterministic cause as the leading actor: the CEO’s hard work, dedication, and past failures that explain the company’s success. But as Malcolm Gladwell wrote in Outliers, “The people who stand before kings may look like they did it all by themselves. But in fact they are invariably the beneficiaries of hidden advantages and extraordinary opportunities and cultural legacies.” Reality is where heredity, environment, luck, opportunity, random chance, and hard work intersect to determine our fate. It is far more complicated and uncertain than the single, deterministic factor the news uses to explain it. As a result, we underestimate luck and overestimate our ability to predict, under the illusion of control and causality.
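The coin-flip example can be checked with a quick simulation. This is just an illustrative sketch (the trial counts and the ≥9-heads cutoff are my own choices, not from the article): among players whose first round of 10 flips was extreme, the second round averages out near the mean of 5 heads, because the extreme first round was luck, not skill.

```python
import random

random.seed(42)

def heads(n):
    """Return the number of heads in n fair coin flips."""
    return sum(random.random() < 0.5 for _ in range(n))

# Simulate many players flipping a coin 10 times, twice in a row.
# Among those whose first round was extreme (9+ heads), the second
# round is ordinary: the result regresses toward the mean of 5.
extreme_second_rounds = []
for _ in range(200_000):
    first, second = heads(10), heads(10)
    if first >= 9:
        extreme_second_rounds.append(second)

avg = sum(extreme_second_rounds) / len(extreme_second_rounds)
print(f"Players with >=9 heads in round 1: {len(extreme_second_rounds)}")
print(f"Their average heads in round 2:    {avg:.2f}")  # near 5, not 9
```

The same logic applies to a company riding an industry boom: selecting on an extreme, partly lucky outcome almost guarantees a more moderate follow-up.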
I wasn’t a bright student in school, but rote learning got me through those years. If I couldn’t answer an exam question, I’d pick the most familiar answer: given that I’d memorized it before the exam, the answer I was familiar with should be the correct one. As with everything we do, repetition creates familiarity. What’s familiar often gets confused with what’s true. What’s “true” eventually becomes part of our beliefs. The reason we worry about the next market crisis is that we want to avoid capital loss. Fear produces stress and attention. When you’re afraid, you want to keep in touch with the market (the time I spend checking share prices quadruples when the market is volatile). It becomes a self-reinforcing cycle that feeds on itself: news sells the danger of a crisis, fear keeps our attention on the market, and the news reinforces this habit by selling more fear to justify the act of following the market. Through repetition, we develop the false perception that crises are the cause of capital loss. What’s familiar gets mistaken for what’s true.
This cycle is reinforced by confirmation bias: the tendency to reject contradictory information while seeking out information that conforms to our existing beliefs. Remember, we crave certainty and consistency. Holding conflicting information is stressful. As a result, what’s inconsistent with our beliefs gets erased by the mind. In addition, we want to be seen by others as consistent. Consistency is associated with trustworthiness: a trustworthy person stands by his word and does what he says, while someone who changes stance as fast as the wind is perceived as unreliable. Social conformity fuels the need to back up what we believe, even in the face of changing circumstances. When you mix consistency with trustworthiness, you tie your position to your own character. You confuse net worth with self-worth. We hang on to losers because it’s hard to admit we’re wrong, particularly when reputation is on the line. We get married to our positions.
The incentive of journalism is to profit through readership. It’s easier to sell the trivial but sensational – “This stock returned 1000%” or “Is this the next market crisis?” – than the important yet boring – “Good process is the hallmark of good decisions”. Survivorship bias is at play here. What’s familiar is what’s sensational: the events that make it to the news. We don’t see events that aren’t newsworthy; they didn’t survive. Therefore, what’s available to the mind are the things that appear in the news. This information asymmetry creates availability bias. As Daniel Kahneman puts it in Thinking, Fast and Slow, “The confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story the mind has managed to construct.” The ability to construct a coherent story depends on what’s easily available to the mind. In other words, you are likely to overweight low-probability and recent events, and underweight the distant past, because of the asymmetry of news reporting.
This is known as dread risk: we’re afraid of losing money in a market crisis but have no qualms about everyday speculation, even though the latter has a higher chance of ruining us. A cocktail of recency bias and availability bias is a recipe for disaster. Gerd Gigerenzer sums it up in Risk Savvy: “an estimated sixteen hundred Americans lost their lives on the road due to their decision to avoid the risk of flying [following the September 11 terrorist attack]…This death toll is six times higher than the total number of passengers (256) who died on board the four fatal flights.” Because the attacks were all over the news, information availability made it easy for the mind to concoct a vivid story that misjudged the probability of being killed in a plane crash as higher than that of dying in a car accident. This underlies a crucial reality: probabilistic thinking is hard. And it gets worse when the news magnifies this fallibility. Consider these two statements:
1. The oil price will hit a record high this year.
2. The oil price will hit a record high this year due to Iran sanctions and higher demand from China.
Which is more probable? The second statement sounds more likely, but it is the first that’s more probable: the second describes only a subset of the scenarios covered by the first, so it cannot be more likely (this is the conjunction fallacy). We choose the second statement because it offers a coherent story about what causes the oil price to hit a record; we seek causality. A story has a cause; it is vivid and memorable. Statistics are abstract and forgettable.
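The arithmetic behind this comparison can be made explicit. A minimal sketch, using made-up probabilities purely for illustration (none of these numbers come from the article): statement 2 is the conjunction of the record high with a specific causal story, and a conjunction can never be more probable than either of its parts alone.

```python
# Hypothetical probabilities for illustration only (not from the article).
p_record = 0.30               # P(A): oil hits a record high this year
p_causes_given_record = 0.40  # P(B|A): Iran sanctions and Chinese demand
                              # both materialize, given the record high

# Statement 2 is the conjunction of A and B. By the product rule,
# P(A and B) = P(A) * P(B|A), which can never exceed P(A) on its own.
p_conjunction = p_record * p_causes_given_record

print(f"P(statement 1) = {p_record:.2f}")
print(f"P(statement 2) = {p_conjunction:.2f}")  # smaller, despite sounding likelier
```

Whatever numbers you plug in, multiplying by a conditional probability of at most 1 can only shrink the result, which is why the more detailed story is always the less probable one.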
News does more than distort our perception of risk; it encourages risk-taking behavior. Risk homeostasis is the idea that every individual has an acceptable level of risk. If an activity is perceived to be safe, you’ll take on more risk; with something perceived as risky, you’ll be cautious. The notion that one can act swiftly by following the market creates an illusion of safety that increases the appetite for risk. When you feel safe, you’re relaxed and confident. You trade more frequently. You push the limits. Breaking news incentivizes instant gratification. Short-termism and recency bias are the reasons we pile into a bull run and scramble for the exit in a bear market. We don’t predict; we extrapolate. Short-termism promotes fear of missing out (FOMO) and loss aversion. You judge your decisions by the outcome of gains and losses. But share prices in the short term are distorted by luck, so focusing on the outcome rather than the process only accelerates bad decisions.
Let’s recap what we’ve gone through so far. The news tends to make an event look deterministic, certain, and knowable by finding the most plausible reason to explain the past, which creates the perception that the future is as predictable as the past (Bias: hindsight, causality, certainty, control, predictability). Through repeated exposure, we become familiar with these explanations and, as a consequence, confuse what’s familiar with what’s true (Bias: familiarity, remembering, validity, truth). We tend to seek supporting information to confirm what we believe to be true, and social conformity fuels the need to be consistent (Bias: confirmation, internalism). Familiarity is also a result of availability (Bias: availability, attentional). Since what’s available is mostly sensational, we misjudge probabilities by overestimating recent events or events that appear in the news while underestimating things that don’t get reported (Bias: probability neglect, recency). Risk homeostasis further means the news increases risk appetite through frequent trading (Bias: safety, knowledge, overconfidence, skill). As if that’s not enough, the news incentivizes short-termism by promoting behavior driven by market sentiment (Bias: short-termism, outcome). Outcome-driven judgment increases the tendency to buy into euphoria and sell in panic (Bias: FOMO, loss aversion, herd mentality).
To be clear, these biases don’t operate in a straight line; they feed one another in no particular order while simultaneously triggering others. Regardless, the outcome is certain: a cocktail of fallibilities that leads to overconfidence in one’s ability to predict. What kills most investors is not a market crisis but their own ego and overconfidence. Nor is it the level of your forecasting accuracy that gets you into trouble; low accuracy is never the problem. If you know what you don’t know, you just have to wait until the odds are in your favor before you swing. Rather, it is when you think you know but don’t, when your subjective confidence deviates from objective reality, that the problem starts to snowball.
It may seem like I’m suggesting you pack your bags and move into a cave. Of course not. At a high level, more information does improve decisions, but only to an extent. As Nate Silver explains in The Signal and the Noise, “the volume of information is increasing exponentially. But relatively little of this information is useful – the signal-to-noise ratio may be waning.” While we are generating more signal thanks to big data, the noise is growing at an even faster pace. What we need isn’t more information but better ways to filter the relevant from the irrelevant. How can we build a better filter? Learn how to think well. Cultivate rigorous, second-level thinking. There’s a misconception that one needs to master complex knowledge to become a good thinker. Quite the contrary: good thinking lies in simplicity, in knowing what not to do. It is about learning to control how and what you think, and not letting your mind be polluted by nonsense, so you don’t make silly mistakes. You’ll be surprised by the clarity of your thoughts once you stop following the market. If possible, learn psychology. Just as the language of finance improves investment knowledge, understanding the language of psychology makes you aware of the biases that can affect your decisions so you can find ways to avoid them. Most important of all, seek information that helps you understand the playing field, not information that explains the scoreboard.