There’s a pretty decent review at Scientific American that sums things up quite well:
…he’s brilliant, funny and fearless and tackles consequential topics. What are the limits of science? Of understanding and prediction? Given our limited ability to know and control the world, how should we live our lives? How can we prosper in spite—and even because—of life’s vicissitudes?
A former derivatives trader, Taleb made his reputation by bashing conventional economics and finance, but his scope has always ranged far beyond Wall Street. His Big Idea is that life inevitably serves up surprises, or “black swans”–from AIDS and nuclear weapons to the 9/11 attacks and the internet—that our necessarily retrospective models of reality cannot foresee.
…Here is how he sums up his message in The Wall Street Journal: “We should try to create institutions that won’t fall apart when we encounter black swans—or that might even gain from these unexpected events… To deal with black swans, we instead need things that gain from volatility, variability, stress and disorder.” That is what Taleb means by “antifragile.” He offers some suggestions for achieving antifragility in government, business and other spheres: “Think of the economy as being more like a cat than a washing machine.” “Favor businesses that learn from their own mistakes.” “Small is beautiful, but it is also efficient.” “Trial and error beats academic knowledge.” “Decision makers must have skin in the game.”
Taleb also wrote a decent editorial in The Wall Street Journal this past November:
Learning to Love Volatility: In a world that constantly throws big, unexpected events our way, we must learn to benefit from disorder, writes Nassim Nicholas Taleb.
Several years before the financial crisis descended on us, I put forward the concept of “black swans“: large events that are both unexpected and highly consequential. We never see black swans coming, but when they do arrive, they profoundly shape our world: Think of World War I, 9/11, the Internet, the rise of Google.
I also recommend Taleb’s earlier book from 2007, The Black Swan: The Impact of the Highly Improbable. The review on Amazon sums it well:
A black swan is a highly improbable event with three principal characteristics: It is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was. The astonishing success of Google was a black swan; so was 9/11. For Nassim Nicholas Taleb, black swans underlie almost everything about our world, from the rise of religions to events in our own personal lives.
This certainly sounds like the 2010 return of Fraser sockeye… the big one. And in turn the ‘collapsed’ run of the previous year.
There are some pretty good YouTube clips of Taleb speaking. He’s known to be volatile and unpredictable at times. In one of his calmer appearances, he has a great comparison between plane crashes and bank failures…
He suggests that when a plane crashes, people generally die – however, we learn from those crashes, which in turn reduces future plane crashes.
Yet when big banks crash and fail, we don’t learn from those events; in fact, big banks fail more and more regularly.
Taleb argues, as the handwritten quotes above suggest, that we spend far too much time and resources trying to predict things that we can’t, in fact, predict.
Hmmm… like salmon runs? For one.
And two, predictions of how many salmon can be caught from those badly predicted run sizes… while still retaining the ‘health’ and ‘resilience’ of those runs, or the collection of runs that comprise the Fraser sockeye?
And so on, and so on.
We know that the concept, and the formulaic practice, of determining Maximum Sustainable Yield in fisheries is a deeply failed, flawed, and screwy model. It should be banished, yet it still dominates ‘fisheries management’….
…that term is itself a misnomer… it implies we can ‘manage’ the fish, and in turn the ‘fishers’ who target and bonk them… the latter being somewhat more ‘controllable’… ‘manageable’….
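To see just how simple the machinery behind MSY really is, here is a minimal sketch of the classic Schaefer (logistic) surplus-production model that the MSY calculation typically rests on. The growth rate and carrying capacity below are purely illustrative numbers, not estimates for any real salmon stock; in practice both are deeply uncertain, and the ‘sustainable’ catch swings directly with whatever values the spreadsheet assumes.

```python
# A minimal sketch of the Schaefer (logistic) surplus-production model
# behind the classic MSY calculation. Parameter values are purely
# illustrative, not estimates for any real salmon stock.

def surplus_production(biomass, r, K):
    """Annual surplus production: r * B * (1 - B / K)."""
    return r * biomass * (1.0 - biomass / K)

def msy(r, K):
    """Maximum sustainable yield: the peak of the curve, r * K / 4, at B = K / 2."""
    return r * K / 4.0

r, K = 0.5, 1_000_000   # hypothetical growth rate and carrying capacity
print(msy(r, K))                         # 125000.0
print(surplus_production(K / 2, r, K))   # 125000.0 (the same peak)

# The fragility: a 20% error in the assumed growth rate shifts the
# "sustainable" catch by the same 20%.
print(msy(0.4, K))                       # 100000.0
```

That’s the whole model: two parameters, one curve. Entire harvest regimes hang on how well two hard-to-measure numbers are guessed.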
_ _ _ _ _ _ _
Wading through Justice Cohen’s reports, evidence (or lack of…), and 1000+ pages makes me think of Taleb’s quote above: ‘the limit is mathematical, period, and there is no way around it on this planet. What is nonmeasurable and nonpredictable will remain nonmeasurable and nonpredictable no matter how many PhDs you put on the job.’
And like banks and politicians — no one is held accountable for making bad or wrong predictions based on complex spreadsheets and formulaic equations. As Taleb essentially asks: if we held scientists accountable, would we continue to get the same bullshit models and faulty prediction regimes? He suggests that far too few ‘predictors’ and ‘spreadsheet makers’ are ever held accountable for their ‘predictions’.
What if fisheries scientists, predictors, and prognosticators were held accountable for their predictions? And risk models? And so on…?
Would we all of a sudden see a massive simplification of this ever-growing ‘science’ that is not really a ‘science’? Like economists, fisheries scientists can predict future salmon runs with about as much accuracy as the prognosticators predicting where financial markets will close next Wednesday…
So why do we continue to waste millions and millions of dollars on faulty science, predictions, consultants, legal professionals, and so on?
As has been pointed out in recent posts, and many past posts, wild salmon inhabit vast areas (e.g. the North Pacific) about which we will never, ever be able to make accurate ‘predictions’. So why flaunt and flit about, suggesting that we ‘understand’ things that we don’t…
It’s kind of like lying on one’s resume and then getting the job, or taking performance-enhancing drugs for decades while maintaining innocence, or lining one’s own pockets, or those of one’s friends, while serving as an elected politician…
“Our track record in figuring out significant rare events in politics and economics [and natural systems] is not close to zero; it is zero.”