It is difficult to describe the level of depth and wisdom that Fooled By Randomness brings. On one hand, Nassim Taleb takes me into the world of trading, finance, and economics, and shows me the weaknesses (and irrelevance) of their math, models, and theories. One cannot scientifically measure risk. Tomorrow is unknown, and trying to create models to manage risk is moot. Luck also plays a big part in life, yet humans overestimate the role of skill when explaining success.
Taleb then takes me into his expertise in statistics and shows how statistics can fool people if not understood or studied the right way. He also introduces me to the philosophy of Karl Popper, whose work I have since read and been highly influenced by. Interestingly, Taleb touches on the work of Daniel Kahneman and Amos Tversky, who showed me the many different biases that we humans have. We decide with our emotions, and there is really no way for us to forgo them. Without emotions we cannot decide, but emotions also make us stupid decision makers at times. The irony. Lastly, Taleb touches on Stoicism, an approach to life I have been trying to practice for the past few years.
If people ask me what books I recommend, this one is in my top three.
Notes
Things that come with little help from luck are more resistant to randomness.
It does not matter how frequently something succeeds if failure is too costly to bear.
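To make this note concrete, here is a tiny sketch with made-up payoffs (the numbers are my own illustrative assumptions, not Taleb's): a bet that succeeds 99.9% of the time can still have a negative expectation if the rare failure is catastrophic.

```python
# Illustrative numbers only -- my assumption, not taken from the book.
p_win, gain = 0.999, 1         # wins $1 with probability 99.9%
p_lose, loss = 0.001, -10_000  # loses $10,000 with probability 0.1%

# Frequent success, yet the expectation is negative because of the rare blowup.
expected_value = p_win * gain + p_lose * loss
print(f"Expected value per bet: {expected_value:+.2f} dollars")  # about -9.00
```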
Mild success can be explained by skills and labor. Wild success is attributable to variance.
Behavioral scientists believe that one of the main reasons why people become leaders is not from what skills they seem to possess, but rather from what extremely superficial impression they make on others through hardly perceptible physical signals – what we call today “charisma.”
People often interpret complex ideas that cannot be simplified into a media-friendly statement as symptomatic of a confused mind.
Both risk detection and risk avoidance are not mediated in the “thinking” part of the brain but largely in the emotional one.
By “watching” your risks, are you effectively reducing them or are you giving yourself the feeling that you are doing your duty?
Learning from history does not come naturally to us humans, a fact that is so visible in the endless repetition of identically configured booms and busts in modern markets.
It is a platitude that children learn only from their own mistakes; they will cease to touch a burning stove only when they are themselves burned; no possible warning by others can lead to developing the smallest form of cautiousness.
When you look at the past, the past will always be deterministic, since only one single observation took place.
Psychologists call this overestimation of what one knew at the time of the event due to subsequent information the hindsight bias, the “I knew it all along” effect.
A mistake is not something to be determined after the fact, but in light of the information available up to that point.
A more vicious effect of such hindsight bias is that those who are very good at predicting the past will think of themselves as good at predicting the future.
For an idea to have survived so long across so many cycles is indicative of its relative fitness. Noise, or at least some noise, was filtered out.
It takes a huge investment in introspection to learn that the thirty or more hours spent “studying” the news last month neither had any predictive ability during your activities of that month nor did it impact your current knowledge of the world.
Prices did not rationally reflect the long-term value of securities and were overshooting in either direction.
People who look too closely at randomness burn out, their emotions drained by the series of pangs they experience.
Science is method and rigor; it can be identified in the simplest of prose writing.
One can make money in the financial markets totally out of randomness.
History teaches us that things that never happened before do happen.
The practice of “financial engineering” came along with massive doses of pseudoscience. Practitioners of these methods measure risks, using the tools of past history as an indication of the future.
The Black Swan problem, as formulated by John Stuart Mill: no amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion.
According to Karl Popper, there are two types of theories:
- Theories that are known to be wrong, as they were tested and adequately rejected (falsified).
- Theories that are not yet known to be wrong, not falsified yet, but remain exposed to being proved wrong.
Why is a theory never right? Because we will never know if all the swans are white.
The more data we have, the more likely we are to drown in it.
The greater the number of businessmen, the greater the likelihood of one of them performing in a stellar manner just by luck.
The mistake of ignoring survivorship bias is chronic even among professionals. Why? Because we are trained to take advantage of the information that is lying in front of our eyes, ignoring the information that we do not see.
Optimistic people certainly take more risks as they are overconfident about the odds; those who win show up among the rich and famous, others fail and disappear from the analyses.
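This is easy to see with a toy Monte Carlo run in the spirit of the book's simulations; the specific setup below (10,000 zero-skill managers, ten years of coin-flip results) is my own assumption for illustration.

```python
import random

# Rough sketch, not Taleb's exact example: every manager has zero skill,
# so each profitable year is nothing but a fair coin flip.
random.seed(42)
managers, years = 10_000, 10

def career(years):
    return [random.random() < 0.5 for _ in range(years)]

# A "stellar" record here means profitable every single year, purely by luck.
stellar = sum(all(career(years)) for _ in range(managers))
print(f"{stellar} of {managers} skill-less managers had a perfect {years}-year record")
# Expected count is about 10_000 * 0.5**10 ≈ 10 -- and only these survivors
# show up in the analyses; the rest disappear from view.
```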
Ergodicity – time will eliminate the annoying effects of randomness.
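Read loosely, this is the law of large numbers in the time dimension: stretch a zero-skill track record long enough and its average drifts toward its true mean. A minimal sketch, with assumed +1/-1 yearly outcomes of my own choosing:

```python
import random

# Loose illustration of the note: short horizons can look brilliant or
# disastrous, but over long horizons the luck washes out toward zero.
random.seed(7)

def mean_return(years):
    # Each year is a +1/-1 coin flip: no skill, only noise.
    return sum(random.choice((1, -1)) for _ in range(years)) / years

for horizon in (10, 1_000, 100_000):
    print(f"{horizon:>7} years: average yearly result = {mean_return(horizon):+.3f}")
```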
Conventional economists do not have this luxury as they observe the past and make lengthy and mathematical comments, then bicker with each other about them.
The fact that your mind cannot retain and use everything you know at once is the cause of such biases.
We think with our emotions and there is no way around it.
People overvalue their knowledge and underestimate the probability of their being wrong.
Beliefs are said to be path dependent if the sequence of ideas is such that the first one dominates.
Economics is a narrative discipline, and explanations are easy to fit retrospectively.
A slightly random schedule prevents us from optimizing and being exceedingly efficient, particularly in the wrong things.