‘Black Swan’? More like ugly duckling

Just because unexpected events happen doesn’t mean we should throw prediction out the window

Pratik Koppikar
Mercury Staff

We cannot perfectly predict the future. Many try. Some even build their careers on risk analysis and prediction. However, on a macro scale, our models don’t always account for the improbable. Though in his study of the uncertain, “The Black Swan,” Nassim Nicholas Taleb argues that we must ignore empirical predictions to embrace the unknown, his argument falls apart in practice.

Thinking we’ve seen everything, we plan for the expected: the norms and the past. But because we don’t know what can happen, we cannot reliably predict what will. The occurrences Taleb dubs ‘Black Swans’ are unexpected, improbable and unforeseeable events that are nonetheless formative in each of our lives. Forget trying to predict everything, he says; forgo the so-called experts and their predictions and take advantage of uncertainty instead. To embrace the uncertain is to understand that we cannot and should not try to predict Black Swans. Unfortunately, Taleb’s solution falls short: his ideas, claims and arguments are interesting as a thought experiment but baseless when applied to the real world.

One of Taleb’s earliest criticisms of the modern understanding of risk prediction is the idea that there are unquestionable narratives for conveying information. He’s right: as students, we accept certain mainstream economic models (Keynesian, Neoclassical), philosophical narratives (rationality) and scientific theories (relativity) taught in academic settings as fact without firsthand evidence. The narratives taught to us are often accepted without question, until we choose to go back and examine our presumptive notions. Still, for all his observation of our susceptibility to baseless narratives, Taleb presents narratives without backing throughout his entire book. Nearly every one of his arguments arrives with a lengthy backstory about a brilliant idea co-opted from an underappreciated philosopher intellectually cast aside by charlatan contemporaries. Taleb may have identified the poisoned root in our flawed models, but he builds his case from the same soil.

Later, Taleb argues that the ubiquitous presence of silent evidence (evidence we don’t record due to variables we don’t realize exist) is a sign of unreliability in every field. Again, he is right in his identification of the problem. Negative studies (where the hypothesis is not supported) are hardly ever published, and failures are not systematically recorded in a database against which new results can be compared. Because our models of prediction and risk rarely have negative results to compare successes against, they stand as theories without viable alternatives until a Black Swan comes along to disprove them.

Taleb, though correctly identifying silent evidence as an issue, ignores it in his own discussion of risk. He criticizes casinos for spending billions of dollars on anti-cheating efforts (like those to catch card counters) when their recent major losses have had nothing to do with cheaters. Taleb believes cheating is not a source of lost income for casinos at all; the real threats are unpreventable Black Swans like a tiger attacking a stage performer or an executive forgetting to file major tax documents. But Taleb doesn’t even entertain the idea that cheating is no longer a source of lost income precisely because of the measures taken to combat it in the past. Look back a few years and it’s evident that the financial losses casinos incurred from card counters and other cheaters were massive; the silent evidence is that existing systems discouraged cheating to the extent that it is no longer an avenue for lost income. Taleb’s argument in this case is reduced to the notion that dealing with what you can is stupid; focus instead on the fact that you really don’t know anything at all.

Often claiming that many studies back up his assertions, Taleb never cites more than one study per idea, neglecting footnotes or references for the “many others.” Accepting what he believes to be rediscovered theories as fact while lambasting the doctors and scientists who rely on experimental data, Taleb seemingly resorts to cherry-picking ideas to find an alternative solution to the problems he identifies. At one point, while criticizing the models economists and traders use to predict market outcomes, Taleb cites a study that concluded statistically sophisticated or complex models don’t necessarily provide more accurate forecasts than simpler ones. It’s worth noting that the study didn’t support the claim that the complex models were worse than the simple ones, just that they were not always superior. When one of two methods is guaranteed to do the same if not better than the other, what reason is there to choose the more complex one?

Taleb’s hypocrisy in developing his arguments is unfortunate, because his assessment of the issue is spot-on. We are ill-equipped to handle outlier events in society — we have not prepared for the unexpected. Taleb’s concerns are worthwhile and valid, but his solutions are not. How do we account for the unexpected? If, somehow, we implemented measures that accounted for the unexpected, we’d never know they worked. Taleb’s initial example of this phenomenon supposes that if stronger cockpit doors and anti-hijack measures had been implemented in 2000, no one would have celebrated the “prevention” of 9/11. We can only experience the unexpected because we don’t expect it.

Black Swans are unpredictable by nature. There will always be risk in the world, and to assume we will ever know all possible outcomes is disingenuous at best and dangerous at worst. Taleb’s solution to abandon our current models is dangerous for the same reason: doing away with the prevention of risks we’re aware of would only beget greater risk overall. The reality is that we will experience life-altering events when we don’t expect them, and our response and recovery are the most important factors in dealing with them. Our models aren’t perfect, and they likely never will be. Especially amid our current Black Swan, a reality that was unforeseeable a year ago, our best shot at recovery is to use what we’ve got.