
by Valentin, March 27 2023, in books

The Black Swan (Nassim Nicholas Taleb, 2007)

The Black Swan deals with the problem of induction, that is, the problem of how we can predict the future based on knowledge of the past. The book argues that, as far as the real world is concerned, we can't, most notably because of Black Swans.

Given a series of datapoints sharing similar properties, a Black Swan is a new datapoint in the series which is unlike any existing one, and which you therefore couldn't have predicted just by looking at past datapoints. Furthermore, the occurrence of the Black Swan has such a big impact that if you happen to have a system built on the assumption that it won't happen, your system will collapse. One should therefore assume that a Black Swan will happen. This can be done without assuming any specific Black Swan (which is, by definition, unpredictable) while still protecting oneself from it (being robust), or better, while benefiting from it (being antifragile).

The book uses the life of a turkey as an illustration: for 1,000 days the turkey is fed by humans. Looking at past data, the turkey only sees datapoints of extreme regularity (which gives it great confidence in the quality of its future predictions), and therefore predicts that tomorrow will be another day of feeding. But when tomorrow happens to fall just before Thanksgiving, a Black Swan occurs: the turkey, instead of being fed, is killed.
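
The turkey's reasoning can be written down as a toy sketch (mine, not the book's): a thousand identical observations produce a maximally confident forecast that is catastrophically wrong on day 1,001.

```python
# Toy version of the turkey's inference (a sketch of mine, not code from the book).
# 1,000 observations of "fed", so the naive forecast for day 1,001 is "fed again".
history = [True] * 1000                     # days 1..1000: fed every single day
confidence = sum(history) / len(history)    # fraction of past days with feeding
print(f"Predicted P(fed tomorrow) = {confidence:.3f}")             # 1.000, maximal confidence

day_1001_fed = False                        # Thanksgiving eve: the Black Swan
print("Prediction correct?", day_1001_fed == (confidence > 0.5))   # False
```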

Still, there is a distinction to be made between things about which you can make predictions and things about which you can't, respectively what Taleb calls "Mediocristan" and "Extremistan". For example, imagine that you have 10 random people lined up, whose average weight you can calculate. Now, a new person is added to the group. Can this single new person significantly change the average? Nope, because human weight is bounded: no matter how big the new person is, they can only weigh a few times more than the others, so the average stays stable. However, if you replace the criterion of weight with the criterion of wealth, it's another story; if the new person happens to be Bill Gates, the average dramatically increases. So weight belongs to Mediocristan, but wealth belongs to Extremistan.
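
A quick numerical sketch makes the contrast concrete (the figures below are made-up round numbers, not data from the book): adding one extreme individual barely moves the average weight, but completely takes over the average wealth.

```python
import statistics

# Ten hypothetical people, in Mediocristan (weight) and Extremistan (wealth).
weights_kg = [70, 82, 65, 90, 77, 68, 85, 73, 60, 95]
wealths_usd = [40_000, 55_000, 30_000, 80_000, 25_000,
               60_000, 45_000, 35_000, 70_000, 50_000]

print(statistics.mean(weights_kg))    # 76.5 kg
print(statistics.mean(wealths_usd))   # 49,000 USD

# Add one extreme individual to each group.
weights_kg.append(200)                  # extremely heavy, but bounded
wealths_usd.append(100_000_000_000)     # a Bill Gates-sized fortune

print(statistics.mean(weights_kg))    # ~87.7 kg: the average barely moves
print(statistics.mean(wealths_usd))   # ~9.09 billion USD: the outlier IS the average
```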

The issue is that much of life belongs to Extremistan, yet many institutions and much of academia try to model it with tools that only work in Mediocristan. This includes, in particular, Taleb's biggest pet peeves: the banking system, economists, and the sordid mathematical tool they use but shouldn't: the bell curve. Bankers are makers of dynamite on which they like to sit until it explodes (and they end up not paying the price of their mistakes when they're bailed out with taxpayer money). Economists are charlatans. An entire chapter of the book is dedicated to debunking the use of the bell curve to model real-world probabilities, mainly because small errors in estimating its parameters compound into huge errors in the tails, to the point where the entire model becomes useless. Taleb has a grudge.
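
To give an idea of what is at stake, here is a sketch of my own (not an argument lifted from the book): the Gaussian makes large deviations astronomically unlikely, while a fat-tailed distribution such as a Pareto keeps them very much on the table.

```python
import math

def gaussian_tail(k):
    """P(X > k sigma) for a standard normal variable."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(k, alpha=1.5):
    """P(X > k) for a Pareto distribution with x_min = 1 (illustrative alpha)."""
    return k ** (-alpha)

for k in (3, 5, 10, 20):
    print(f"{k:>2} sigma: gaussian {gaussian_tail(k):.2e}   pareto {pareto_tail(k):.2e}")

# The Gaussian calls a 20-sigma event essentially impossible (~1e-89);
# the fat-tailed model still gives it about a 1% chance.
```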

The author describes his "barbell" strategy of investing, which consists in putting something like 90% of one's assets into extremely safe instruments, diversifying the remaining 10% across extremely risky ones, and waiting for a massive payoff in one of the risky bets. This was apparently so contrary to conventional investing wisdom that everyone thought he was stupid. Until 2008, when everyone else crashed and he made bank.
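
Here is a toy illustration of the asymmetry (hypothetical numbers of my own, not Taleb's actual trades): the downside is capped at roughly the 10% put at risk, while a single long shot paying off dwarfs that loss.

```python
# Toy sketch of the barbell idea: 90% near-riskless, 10% spread over long-shot bets.
def barbell_outcome(capital, safe_return, risky_multipliers):
    safe_part = 0.90 * capital * (1 + safe_return)
    risky_slice = 0.10 * capital / len(risky_multipliers)
    risky_part = sum(risky_slice * m for m in risky_multipliers)
    return safe_part + risky_part

# Worst case: every risky bet goes to zero -> you still keep ~90% of your money.
print(barbell_outcome(100_000, 0.03, [0.0] * 10))          # 92,700

# One bet pays off 50x, the rest go to zero -> a large overall gain.
print(barbell_outcome(100_000, 0.03, [50.0] + [0.0] * 9))  # 142,700
```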

A significant part of the book explores the cognitive biases that lead economists and other charlatanic professions to keep believing in theories which have never succeeded in predicting anything useful. Those chapters overlap significantly with Daniel Kahneman's Thinking, Fast and Slow, with an emphasis on survivorship bias and what Taleb calls the "narrative fallacy", which consists in gaining confidence in one's model of past history without ever testing its ability to predict future history (and then re-explaining whatever does happen, after the fact).

Taleb is more generous in criticizing the system than in advising what to do once you have understood that the system is broken. A single chapter is advice-oriented, and it boils down to: don't listen to economists and other forecasters; be prepared for various eventualities; and expose yourself to positive Black Swans (unpredictable datapoints that make your life better instead of collapsing the economy), first by maximizing serendipity in your life, and then by trying out as many opportunities as possible.

Quotes

Whenever you hear a snotty (and frustrated) European middlebrow presenting his stereotypes about Americans, he will often describe them as "uncultured", "unintellectual", and "poor in math" because, unlike his peers, Americans are not into equation drills and the constructions middlebrows call "high culture".

Statisticians, it has been shown, tend to leave their brains in the classroom and engage in the most trivial inferential errors once they are let out on the streets.

It takes considerable effort to see facts (and remember them) while withholding judgment and resisting explanations.

After a candidate's defeat in an election, you will be supplied with the "cause" of the voters' disgruntlement. Any conceivable cause can do. The media, however, go to great lengths to make the process "thorough" with their armies of fact-checkers. It is as if they wanted to be wrong with infinite precision (instead of accepting being approximately right, like a fable writer).

I promised not to discuss any of the details of the casino's sophisticated surveillance system; all I am allowed to say is that I felt transported into a James Bond movie.

Prediction, not narration, is the real test of our understanding of the world.

We tend to "tunnel" while looking into the future, making it business as usual, Black Swan-free, when in fact there is nothing usual about the future.

Being an executive does not require very developed frontal lobes, but rather a combination of charisma, a capacity to sustain boredom, and the ability to shallowly perform on harrying schedules.

Engineers tend to develop tools for the pleasure of developing tools, not to induce nature to yield its secrets. It so happens that some of these tools bring us more knowledge; because of the silent evidence effect, we forget to consider tools that accomplished nothing but keeping engineers off the streets.

I do not see a "tree"; I see a pleasant or an ugly tree. It is not possible without great, paralyzing effort to strip these small values we attach to matters. Likewise, it is not possible to hold a situation in one's head without some element of bias.

People can't predict how long they will be happy with recently acquired objects, how long their marriages will last, how their new jobs will turn out, yet it's subatomic particles that they cite as "limits of predictions". They're ignoring a mammoth standing in front of them in favor of matter even a microscope would not allow them to see.

I worry less about embarrassment than about missing an opportunity.

An economist would find it inefficient to maintain two lungs and two kidneys: consider the costs involved in transporting these heavy items across the savannah. Such optimization would, eventually, kill you, after the first accident, the first "outlier".

We currently learn in business schools to engage in borrowing (by the same professors who teach the Gaussian bell curve, that Great Intellectual Fraud, among other pseudosciences), against all historical traditions, when all Mediterranean cultures developed through time a dogma against debt.

Life takes place in the preasymptote.