
You are a chicken. Yes, you. You look around and sometimes wonder why your owner takes such good care of you. At first, you’re not sure. You’re skeptical. What if he sends you to the slaughterhouse? You have never been there, but you know very well none of your friends have ever come out of that place. You remain on high alert for when that fateful day might arrive, but it never does.

Days go by, and then weeks, months, even years. You are now convinced your owner loves you more than any of those other chickens and would never do bad things to you. Each passing day is additional evidence that you will live to see the next. A thousand days go by like this. A thousand beautiful days. Till, of course, the thousand and first day, when the illusion of safety breaks, and you end up on someone’s dinner plate. You should have never crossed the road.

Now imagine how betrayed the chicken must’ve felt when it was being taken to that terrifying part of the farm. Given 1,000 days’ worth of evidence, the chicken’s trust in its owner was ironically at its highest level when it was eventually slaughtered. Perhaps, if it hadn’t been so foolish as to believe it was special or unique, it would at least have been spared the feeling of betrayal. That one final day completely changed the outlook of the chicken’s life. That one piece of evidence outweighed the previous 1,000 days, and it wasn’t even a contest. This is what’s known as a Black Swan: a single event or observation that comes as a surprise, carries disproportionate consequences, and radically changes our outlook on something. People used to think swans could only be white, until they actually saw a black swan, which reshaped their sense of what is out there.

Nassim Nicholas Taleb wrote a book called “The Black Swan: The Impact of the Highly Improbable” to study this very phenomenon and shine a light on how vulnerable we are to Black Swans, and how we are only becoming more vulnerable with each passing day. In his book, he talks about some fundamentals of epistemology that limit our ability to understand Black Swans before they happen. But first, let’s talk about why our modern society, as technologically advanced as it is, is the perfect nesting place for a black-swan event.

Let’s say we are going to weigh a few thousand people, and at the extreme end of that sample we include the heaviest person in the world. So long as that person is subject to biological constraints like the rest of us, it really doesn’t matter how much he or she weighs. Let’s say 2,000 pounds. Now, how much do you think that accounts for in the total weight of all the people we weighed? The answer is probably less than 0.5%. It shows that even a crazy outlier like a 2,000-pound person doesn’t really overwhelm the average. Taleb calls this ecosystem “mediocristan” to capture how the measurements of the “average” person represent the whole sample quite well.

Now, let’s conduct the same experiment, but with wealth. Let’s gather a few thousand people and include just one of the world’s 3,000 or so billionaires in that sample. How much do you think that billionaire accounts for in the total wealth of everyone in that sample? An overwhelming majority, almost always close to 99%. Contrary to the first scenario, here the outlier overwhelms everything else. Taleb calls this world “extremistan,” as it rewards a few people extremely well but leaves basically nothing for the others.
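A quick back-of-the-envelope sketch makes the contrast concrete. The numbers below are illustrative assumptions, not figures from Taleb’s book: a 2,000-person sample, everyday weights, modest net worths, and a lone $100 billion fortune.

```python
import random

# mediocristan: weigh 2,000 ordinary people (100-300 lbs), then add the
# hypothetical 2,000-pound outlier from the thought experiment.
weights = [random.uniform(100, 300) for _ in range(2000)] + [2000]
print(f"Outlier's share of total weight: {2000 / sum(weights):.2%}")      # well under 1%

# extremistan: give the same crowd modest net worths, then add one
# billionaire worth $100 billion (an assumed figure).
wealth = [random.uniform(10_000, 1_000_000) for _ in range(2000)] + [100e9]
print(f"Billionaire's share of total wealth: {100e9 / sum(wealth):.2%}")  # roughly 99%
```

However you tweak the assumed numbers, the weight outlier barely registers while the wealth outlier swallows the total, and that is the whole difference between the two worlds.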

Taleb says that the modern world is composed of circumstances that are geared towards extremistan, not mediocristan. Money, for all intents and purposes, is just a number in someone’s ledger. The vast majority of it is completely digital. There are no laws of physics or biology constraining it to minimal variance. Sure, most people don’t make that much money, but a few people can make an enormous amount. The same goes for musicians: most don’t sell many albums, while a handful sell staggering numbers. You can run the same thought experiment with book sales, scientific publications, shoe brands, and so on. The point is, the modern economy is very much a winner-take-all system that rewards a small number of people with a disproportionately large portion of the pie. If it were more like the weight example we just talked about, you wouldn’t expect the outliers to be so wild. The fact that they are goes to show just how unpredictable the environment we live in really is. The forecasts we take for granted today often fail to account for the true nature of this unpredictability, for these black-swan events.

You might be inclined to say, “No, these billionaires put the work in day in, day out, and therefore they get to enjoy the fruits of their labor.” Indeed, most of them probably worked really hard. Some of their innovations might later pave the way for a better future for all of us. I am not discounting that. However, the system is not rewarding them proportionately. More importantly, it’s hard to say how much of their success is the fruit of their labor and how much is due to pure chance. If you were to run a few simulations with extremistan-type circumstances, you would inevitably end up with a few Jeff Bezos-like outliers.
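Here is a minimal sketch of that kind of simulation, under admittedly crude assumptions: 10,000 identical “founders” start with the same capital, and each year pure chance either shrinks or grows it. None of the parameters come from Taleb; they are only there to show the mechanism.

```python
import random

random.seed(1)

# 10,000 hypothetical founders, identical starting capital, 30 years of
# coin-flip luck: a bad year multiplies capital by 0.7, a good year by 1.5.
outcomes = []
for _ in range(10_000):
    capital = 1.0
    for _ in range(30):
        capital *= random.choice([0.7, 1.5])
    outcomes.append(capital)

outcomes.sort(reverse=True)
median = outcomes[len(outcomes) // 2]
top_share = sum(outcomes[:10]) / sum(outcomes)
print(f"Luckiest founder ended up with {outcomes[0] / median:,.0f}x the median")
print(f"Top 10 of 10,000 hold {top_share:.0%} of all the capital")
```

Skill never enters the model, yet a handful of purely lucky runs end up holding a wildly disproportionate share. That isn’t proof that real fortunes are luck, only that extremistan-type dynamics can manufacture outliers out of nothing but chance.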

We may be biased into thinking that we understand what causes Bezos-like outliers in our society. You know the usual story. Think outside the box, start a revolutionary company, work extremely hard for a few years, and then smell the roses. Happily ever after. We have all read the autobiographies. We have all watched the documentaries. But when was the last time you read about a person who did all of those things and failed? When was the last time you saw shelves of books about people who failed? Chances are, never. Those stories just never make it.

There is an epistemic bias in all this. Taleb says, “Now take a look at the cemetery. It is quite difficult to do so because people who fail don’t seem to write memoirs, and, if they did, those business publishers I know would not even consider giving them the courtesy of a returned phone call.”


This is despite the fact that advice about what not to do is often more useful than advice about what to do.

But that’s just the economy. That’s just one facet of society. We also don’t understand the socio-political aspects. Take 9/11, for example, which was certainly a Black Swan event. After it happened, you had tons of experts come out and say that they had known for years it was coming.

Well, why didn’t they say anything?

This retrospective distortion of our understanding of a problem is one of the hallmarks of a black-swan event. None of them really knew. If they did, cockpit doors would have been bulletproof long before, pocket knives would never have been allowed into cabins, and the TSA would have been created much earlier. But these things were only instituted after 9/11. If you had suggested such policies in 1991, for example, you probably would not have been taken too seriously, or you would have been shown a spreadsheet suggesting airlines don’t have the money for bulletproof doors.

Thankfully, the likelihood of a 9/11-style event is much lower now than it used to be. Countries around the world are more prepared, more vigilant. However, that also makes these precautions somewhat lose their relevance. Yuval Noah Harari, in his book Homo Deus, cites a paradox about knowledge. He says, “knowledge that does not change behaviour is useless, but knowledge that changes behaviour loses its relevance. The more data we have and the better we understand history, the faster history alters its course, and the faster our knowledge becomes outdated.”

The measures we have taken after a Black Swan event like 9/11 do nothing to improve our odds against the next one. If anything, they might lure us into a false sense of security and in fact worsen our chances of coping with the impacts of the next highly improbable event.

We tend to convince ourselves that we understand risk once we have understood a game of dice or blackjack. However, trying to approximate real-life risks with the same methods used in a closed, artificial game is an oversimplification, and it’s a mistake we commit daily.

This is what Taleb calls the Ludic Fallacy.

We learn simple games and immediately conclude that the stock market works the same way, even though one of these things lives in mediocristan and the other lives in extremistan. If the markets were so well understood, do you think something like the GameStop or AMC saga would ever have been allowed to happen? Sure, a short squeeze is not a particularly new phenomenon. And yet even a non-black-swan event like that one left some of the smartest hedge fund managers scratching their heads and flirting with bankruptcy. This false sense of understanding makes Black Swans that much more dangerous.

There are other reasons why we are increasingly vulnerable to Black Swans. Taleb notes that whereas in the past people might have been reading different kinds of literature and diving deep into locally developed sets of ideas, today arguably the most-read book is Harry Potter. That’s of course not to say Harry Potter is a bad book, but it goes to show how concentrated our attention has become, for better or for worse. For the most part, everyone is dealing with the same handful of ideas. Couple that with the rising complexity and reach of technology, and when something fails, it fails for more people than ever before.

“The Pakistani government tried to shut down YouTube in Pakistan. It ended up shutting down YouTube worldwide,” Taleb recounted. We don’t understand these things. That’s just one way for technology to fail, but it goes to show just how interconnected things are, and while that interconnectedness is often touted as a plus, given sufficiently poor luck, it can really spell doom for us all.

Take coronal mass ejections as an example. These are bursts of plasma and magnetic field from the sun that scientists on Earth know about and expect. The largest on record is the Carrington event of 1859. Its effects were mostly felt by telegraph operators, some of whose equipment was burnt out by the sudden surge. Most of the world went on without a hitch. If a Carrington-class event were to occur today, though, with all the grids, electric cars, and equipment that we now have, the damages would run into the trillions of dollars, and repairs could take decades, if they are possible at all. And with each passing day, with each little step into an electric future, we become more and more vulnerable to such an event. The thing is, this isn’t even a Black Swan event.

In 2012, the likelihood of a Carrington-class event occurring in the next decade was estimated to be around 12%. And yet, despite that high probability, we are not particularly prepared for such an event. Given the nature of the risk, seemingly low probability but high impact, you’ll have a very hard time convincing governments to harden power grids against catastrophic failure, despite all the mounting evidence. So if that’s how little we care about an event we know is bound to occur eventually, imagine how unaware we are of a true Black Swan.
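It’s worth spelling out what that 12% figure implies over longer horizons. The arithmetic below is a deliberately naive sketch, assuming the per-decade probability stays at 12% and that decades are independent, which is exactly the kind of tidy model this essay warns about:

```python
# Naive compounding of the ~12% per-decade estimate, assuming a constant rate
# and independence between decades (both simplifying assumptions).
p_decade = 0.12
for decades in (1, 3, 5, 10):
    p_at_least_one = 1 - (1 - p_decade) ** decades
    print(f"Chance of at least one Carrington-class event in {decades * 10} years: {p_at_least_one:.0%}")
```

Even this toy calculation puts the odds of at least one such event this century at well over one in two, which makes the lack of preparation all the more striking.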

The chicken on the farm, were it somehow spared by some miracle, would never trust another human being after the betrayal it endured. However, few are ever so lucky. Meanwhile, for the owner, the chicken’s death comes as no surprise. It is a routine event, and therefore no Black Swan. The idea of a Black Swan is, therefore, relative to the knowledge one possesses. Hence, our objective is to try to be in the position of the butcher, not the butchered. Taleb says, “I worry less about advertised and sensational risks, more about the vicious hidden ones.”

Of course, the idea of a Black Swan also includes good things: wildly unlikely positive outcomes of chance, otherwise known as life. The odds of being born are sometimes quoted as 1 in 400 trillion. But to be fair, by quoting a number like that I’ve just gone against my own advice. Such a thing can’t really be predicted, can it? For all we know, and for all we don’t, being born is an unimaginably unlikely event that nobody really predicted. So if you are alive, whatever that means, then in the end we are all the Black Swans we’ve been trying to avoid this entire time.

Ironic, isn’t it?
