Description
For centuries, people in the Old World were absolutely certain that all swans were white. It was a fact they confirmed with every sighting. This belief was logical, deeply held, and based on endless evidence. Then, explorers traveled to Australia and found a black swan. In an instant, a single observation shattered a belief that had stood unquestioned for centuries.
This is the central idea of a “Black Swan.” It is an event that comes as a complete surprise, has a massive and transformative effect, and is then explained in hindsight as if it should have been predictable all along. These events show that our understanding of the world is far more fragile than we think.
The main reason we are so vulnerable to these huge surprises is not that the world is completely random, but that our way of thinking is flawed. We are naturally “dogmatic,” meaning we build ideas about how the world works and then cling to them tightly. We like to believe our knowledge is complete, but human knowledge is always growing. Two hundred years ago, doctors were certain of treatments such as using leeches to cure illness. Today, their confidence seems absurd. We are often the same, blind to concepts that exist outside our accepted view of the world. This narrow-mindedness means we are surprised not because events are truly random, but because our perspective was too small in the first place.
A powerful example of this flawed thinking is the “turkey problem.” Imagine you are a turkey living on a farm. Every single day for a thousand days, the farmer comes out to feed you. He gives you food, fresh water, and a safe place to live. From your point of view, based on all available evidence, the farmer is your best friend. You have no reason to expect anything different tomorrow. Your confidence in this positive trend grows daily. Then, one day in late November, the farmer comes out, but not to feed you. It is Thanksgiving. For the turkey, this is a devastating Black Swan. For the farmer, it was the plan all along. We often think like the turkey, believing that what happened in the past is a reliable guide for the future, and we fail to see the point where the pattern will break.
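The turkey's mistake can be sketched in a few lines of code. This is only an illustration of naive induction; the confidence rule (Laplace's rule of succession) and the numbers are my own assumptions, not anything from the book:

```python
# A hypothetical turkey that judges tomorrow's safety purely from
# how many safe, well-fed days it has already seen (naive induction).

def estimated_chance_of_being_fed_tomorrow(safe_days: int) -> float:
    # Laplace's rule of succession: (successes + 1) / (trials + 2)
    return (safe_days + 1) / (safe_days + 2)

for day in (1, 10, 100, 1000):
    p = estimated_chance_of_being_fed_tomorrow(day)
    print(f"After day {day:>4}: the turkey's confidence is {p:.3f}")

# Confidence climbs steadily toward 1.0, and it peaks on the day
# before Thanksgiving, exactly when the pattern breaks.
```

The point of the sketch is that every new safe day makes the turkey's estimate look better, while telling it nothing at all about the one day that matters.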
We also suffer from a habit called “confirmation bias.” This is a simple problem: we actively look for evidence that proves we are already right, and we ignore or dismiss evidence that suggests we are wrong. If we already believe something, we are unlikely to accept information that contradicts it. For example, if you believe a certain theory, you will probably search online for “proof” of that theory rather than searching for “evidence for and against” it. This habit is like wearing blinders. It stops us from seeing the full picture and makes us extremely vulnerable to surprises.
Our brains are built to find patterns, even when they are not really there. We face a massive amount of information every day. To cope, our brains select just a few bits of information and weave them into a simple, easy-to-understand story. This is called the “narrative fallacy.” We look back at our lives and pick a few key moments to explain who we are. We might say, “I became a musician because my mother played music for me.” This sounds nice, but it ignores the millions of other tiny, random events that also shaped us. We prefer simple stories over complex, messy reality. The problem is that these simple stories give us a false sense of understanding. We think we know why something happened, but our explanation is just a story we invented after the fact.
Because we love simple cause-and-effect stories, we often miss the fact that tiny, seemingly insignificant events can have massive, unpredictable consequences. A butterfly flapping its wings in one part of the world could, through a complex chain of events, eventually lead to a hurricane on the other side of the planet. We cannot possibly trace this path. We only see the hurricane, the final, big outcome. After the storm, we will try to find a simple cause, but we are just guessing. The world is far more complex than our simple stories allow for.
The effect of a Black Swan is not the same for everyone. The impact depends entirely on your access to information and your level of ignorance. Imagine you bet your entire life savings on a “sure thing” racehorse. You’ve studied its track record, the jockey is skilled, and the competition looks weak. It’s a safe bet. But when the race starts, the horse refuses to move. This is a devastating Black Swan for you. However, it is not a surprise for the horse’s owner, who bet against his own horse because he knew something you did not. That tiny bit of extra information meant the difference between ruin and riches.
Another mistake we make is treating all information the same. We need to understand the difference between two types of things: scalable and non-scalable. “Non-scalable” information has natural limits. Human height is a good example. You cannot find a person who is 10,000 feet tall. There is an upper and lower limit. Because of this, we can make meaningful predictions using averages. The average height of a group of people tells you something useful.
“Scalable” information is totally different. It has no real limits. Wealth is the perfect example. There is no physical limit to how much money one person can have. If you put 1,000 people in a room, the “average” wealth might be $1 million. But this could be misleading. It could be that 999 people have next to nothing and one person has $1 billion. In a scalable world, averages tell you almost nothing. The digital world is highly scalable. When we use tools meant for the non-scalable world (like averages) to measure the scalable world (like wealth or markets), we make huge errors in judgment.
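The arithmetic behind this is easy to check with a short sketch. The figures below are invented purely for illustration, in the spirit of the example above:

```python
# Illustrative figures only: heights in centimetres, wealth in dollars.

# Non-scalable quantity: the average is close to any individual value.
heights = [165, 170, 172, 178, 181, 169, 175]
print(f"average height: {sum(heights) / len(heights):.1f} cm")

# Scalable quantity: a single outlier dominates the average.
wealth = [1_000] * 999 + [1_000_000_000]   # 999 modest fortunes, one billionaire
average = sum(wealth) / len(wealth)
median = sorted(wealth)[len(wealth) // 2]
print(f"average wealth: ${average:,.0f}")  # about $1,001,000: misleading
print(f"median wealth:  ${median:,.0f}")   # $1,000: what a typical person holds
```

For height, the average describes the room. For wealth, the average describes almost no one in it, which is why tools built for the first kind of quantity fail on the second.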
We are also far too confident in what we think we know. We try to manage risk by treating the real world like a game. This is called the “ludic fallacy.” In a game, like cards or dice, the rules are clear and the probabilities are known. A casino, for example, knows the exact odds of every game. They spend millions on security to protect against rule-breakers, like thieves or cheaters. They feel safe because they believe they have managed all the risks. But their real, greatest threats are not in the game. The casino is not prepared for a kidnapper to take the owner’s daughter. It is not prepared for an employee to forget to file tax paperwork, shutting the whole business down. These are the real risks, the ones that exist outside the defined rules of the game.
So, what is the defense against these traps? We can never perfectly predict the future. We cannot triumph over randomness. The solution is to have a good, honest understanding of our own limitations. We must recognize what we don’t know. “Knowledge is power” is a common phrase, but sometimes, knowing the limits of your knowledge is far more powerful. A good poker player understands this. They know the rules and the probabilities, but more importantly, they know they don’t know their opponent’s strategy or state of mind. This knowledge of the unknown makes them better, more cautious players.
If you know you are subject to biases, you are more likely to spot them in your own thinking. If you know that you naturally create simple stories, you will be more likely to search for more information to see the “whole picture.” This critical self-awareness is the best tool we have. It cannot stop the next Black Swan from happening, but it can help you avoid investing all your money based on a faulty prediction. By accepting our ignorance, we can reduce the damage when the next impossible thing happens.