Book 2 of the Random series.
I agree with all he writes, but I wish there were more concrete advice on how to use it in daily life, or even how to make money off it, since he did. You have to read all 300 pages, and there's only about a paragraph of what I was actually looking for.
- Black Swan: an outlier, defined by its rarity, extreme impact, and retrospective predictability.
- A small number of Black Swans explain almost everything in our world.
- Low predictability and large impact.
- Central idea: our blindness with respect to randomness, particularly the large deviations.
- What is surprising is not the magnitude of our forecast errors, but our absence of awareness of it.
- Focus on antiknowledge, or what we do not know.
- Contrary to social-science wisdom, almost no discovery, no technologies of note, came from design and planning -- they were just Black Swans.
- The strategy is, then, to tinker as much as possible and try to collect as many Black Swan opportunities as you can.
- This is a book about uncertainty; to this author, the rare event equals uncertainty.
- People in the classroom have not faced many true situations of decision making under uncertainty.
- You need a story to displace a story. Metaphors and stories are far more potent than ideas. Ideas come and go, stories stay.
- Not relying on the beastly method of collecting "corroborating evidence" -- he calls this overload of examples naive empiricism -- a succession of anecdotes selected to fit a story does not constitute evidence.
- Our world is dominated by the extreme, the unknown, and the very improbable and all the while we spend our time engaged in small talk, focusing on the known, and the repeated. The future will be increasingly less predictable, while both human nature and social "science" seem to conspire to hide the idea from us.
- How we humans deal with knowledge -- our preference for the anecdotal over the empirical.
- History is opaque. You see what comes out, not the script that produces events, the generator of history.
- Triplet of opacity (as it concerns humans about history): 1. the illusion of understanding; 2. retrospective distortion; 3. overvaluation of factual info and the handicap of authoritative and learned people.
- This retrospective plausibility causes a discounting of the rarity and conceivability of the event. Minds are wonderful explanation machines, but incapable of accepting the idea of unpredictability. These events were unexplainable, but intelligent people thought they were capable of providing convincing explanations for them -- after the fact.
- History and societies do not crawl. They make jumps. We are just great machines for looking backward, and that humans are great at self-delusion.
- Manifestation of Platonicity, the desire to cut reality into crisp shapes. [Clusters and herding] Categorizing always produces reduction in true complexity.
- Not only are some scientific results useless in real life, because they underestimate the impact of the highly improbable, but that many of them may be actually creating Black Swans.
- Platonic fold is where our representation of reality ceases to apply--but we do not know it.
- Most traders were just "picking pennies in front of a steamroller," exposing themselves to the high-impact rare event yet sleeping like babies, unaware of it.
- Mine was the only job you could do if you thought of yourself as risk-hating, risk-aware, and highly ignorant.
- Advice from a 2nd-yr Wharton student: get a profession that is "scalable," one in which you are not paid by the hour and thus not subject to the limitations of the amount of your labor.
- Bad advice according to NNT. He recommends picking a profession that is not scalable; it's predictable. A scalable profession is only good if you are successful: it's very competitive, with a few giants and many dwarves. The giants are created by Black Swans.
- Art De Vany -- studied wild uncertainty in the movies. Much of what we ascribe to skills is an after-the-fact attribution.
- In Extremistan, inequalities are such that one single observation can disproportionately impact the aggregate, or the total. Almost all social matters are from Extremistan. Social quantities are informational, not physical.
- In Extremistan, always be suspicious of the knowledge you derive from data. This is the world of type 2 randomness: wealth, income, book sales, etc. It is where most Black Swans happen.
- In Mediocristan, height, weight, bell curve. Predictable.
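The Mediocristan/Extremistan split can be sketched numerically. A minimal sketch, assuming a Gaussian for heights and a Pareto draw with tail exponent alpha = 1.1 as an illustrative stand-in for wealth (both distributions and all parameters here are my assumptions, not figures from the book):

```python
import random

random.seed(42)

def max_share(sample):
    """Fraction of the total contributed by the single largest observation."""
    return max(sample) / sum(sample)

# Mediocristan: heights, weights -- thin-tailed, bell-curve-like.
heights = [random.gauss(170, 10) for _ in range(10_000)]

# Extremistan: wealth, book sales -- heavy-tailed. A Pareto draw with a
# low tail exponent (alpha = 1.1, an arbitrary illustrative choice).
wealth = [random.paretovariate(1.1) for _ in range(10_000)]

print(f"Mediocristan max share: {max_share(heights):.5f}")  # vanishing
print(f"Extremistan  max share: {max_share(wealth):.5f}")   # can dominate
```

In the thin-tailed sample, the largest observation contributes a vanishing share of the total; in the heavy-tailed one, a single observation can dominate the aggregate, which is the defining trait of Extremistan.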
- The turkey problem, or the Problem of Induction: something works in the past until, unexpectedly, it no longer does. We derive, solely from past data, conclusions about the properties of the pattern and project them into the future. The turkey is fed for 1,000 days; on day 1,001, its head is chopped off.
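The turkey's inference reduces to a toy calculation -- a hedged sketch, with the feed units and the size of the final shock chosen arbitrarily for illustration:

```python
# The turkey's data: 1,000 days of steady feeding.
history = [1] * 1000                          # 1 unit of feed per day, every day
naive_forecast = sum(history) / len(history)  # past average -> 1.0

# Day 1,001 from the turkey's point of view vs. what actually happens.
day_1001_predicted = naive_forecast   # "tomorrow will look like yesterday"
day_1001_actual = -1000               # the axe: one event outweighing the whole record

print(day_1001_predicted, day_1001_actual)  # 1.0 -1000
```

Every additional day of data made the turkey's forecast feel more solid, yet none of it contained the one observation that mattered.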
- Black Swan problem in its original form: How can we know the future, given knowledge of the past; how can we figure out the properties of the unknown based on the known?
- Pyrrhonian skeptics taught themselves to systematically doubt everything, and thus attain a level of serenity.
- Black Swan problem -- central difficulty of generalizing from available info, or of learning from the past, the known, and the seen.
- Domain specificity of our reactions -- depend on the context in which the matter is presented, or the domain.
- Simple confusion of absence of evidence (of the benefits of mothers' milk) with evidence of absence of the benefits.
- Negative empiricism: as applies to cancer detection -- the finding of a single malignant tumor proves you have cancer, but the absence of such finding cannot allow you to say with certainty that you are cancer-free. We can get closer to the truth by negative instances, not by verification. It is misleading to build a general rule from observed facts.
- Popper, who promoted this idea of one-sided semiskepticism. Empirical decision makers who hold that uncertainty is our discipline, and that understanding how to act under conditions of incomplete info is the highest and most urgent human pursuit.
- Popper introduced the mechanism of conjectures and refutations: formulate a bold conjecture and you start looking for the observation that would prove you wrong.
- Our natural tendency to look only for corroboration - corroboration error, or confirmation bias. Disconfirming instances are far more powerful in establishing truth.
- Need to find what experimenters said no to.
- It is the search for their own weaknesses that makes them good chess players.
- The central problem of knowledge is that there is no such animal as corroborative evidence.
- The modern world, being Extremistan, is dominated by rare--very rare-- events. We need to withhold judgement for longer than we are inclined to.
- Narrative fallacy - our vulnerability to overinterpretation and our predilection for compact stories over raw truths. Our limited ability to look at sequences of facts without weaving an explanation into them, forcing a logical link, an arrow of relationship, upon them.
- The problem of induction concerned what could be inferred about the unknown, what lies outside our info set; the narrative fallacy concerns the seen, what lies within the info set.
- Nylon stocking experiment: post hoc rationalization. Women supplied a backfit, post hoc explanation for their choice. We are better at explaining than understanding.
- Our biological dependence on a story. Hard to avoid interpretation. A higher concentration of dopamine appears to lower skepticism and result in greater vulnerability to pattern detection.
- Deeper reason for our inclination to narrate: 1. info is costly to obtain; 2. info is costly to store; 3. info is costly to manipulate and retrieve. Not a capacity problem but an indexing, retrieval one. Same condition that makes us simplify pushes us to think the world is less random than it actually is.
- Our tendency to perceive--to impose--narrativity and causality are symptoms of dimension reduction. We tend to remember those facts from our past that fit a narrative, while neglecting others that do not appear to play a causal role in that narrative.
- Memory is dynamic, more a self-serving dynamic revision machine. We pull memories along causative lines. Far too many possible ways to interpret past events.
- The Black Swans we imagine, discuss, and worry about do not resemble those likely to be Black Swans. False Black Swans are narrated, present in the current discourse, and likely to be heard about on TV; the real ones no one talks about, since they escape models.
- We learn from repetition -- at the expense of events that have not happened before. Rare events are underestimated before their occurrence, and overestimated after (for a while).
- Thinking and reasoning: System 1 -- experiential, effortless, automatic, fast, "intuition"; produces shortcuts. System 2 -- cogitative, what we normally call thinking; effortful, slow, logical, serial. Most of our mistakes in reasoning come from using System 1.
- The world is more nonlinear than we think. Nonlinear relationships can vary in ways that words cannot do justice to. Linear progression is not the norm.
- Some people are like the turkey, exposed to a major blowup without being aware of it, while others play reverse turkey, prepared for big events that might surprise others. We have a marked preference for making a little bit of income at a time. Property of Extremistan to look less risky, in the short run, than it really is.
- He engaged in a strategy he called "bleed": you lose steadily, daily, for a long time, except when some event takes place for which you are paid disproportionately well.
- Another fallacy: silent evidence -- what events use to conceal their own randomness. To understand success and analyze what causes it, we need to study the traits present in failures too.
- A life saved is a statistic; a person hurt is an anecdote. Statistics are invisible; anecdotes are salient.
- That we got here by accident does not mean that we should continue to take the same risks.
- Reference point argument: do not compute odds from the vantage point of the winning gambler, but from that of all those who started in the cohort.
- We are explanation-seeking animals but there may not be a visible because.
- Whenever your survival is in play, don't immediately look for causes and effects. Be suspicious of the "because."
- A nerd is simply someone who thinks exceedingly inside the box. Ever wondered why many of these straight-A students end up going nowhere in life?
- Ludic fallacy - the attributes of the uncertainty we face in real life have little connection to the sterilized ones we encounter in exams and games.
- Prediction, not narration, is the real test of our understanding of the world.
- Epistemic arrogance - we think we know a little bit more than we do. Scandal of prediction - we are very bad at it. We are simply not wise enough to be trusted with knowledge.
- Epistemic arrogance bears a double effect: we overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states ... ingrained tendency to underestimate outliers, Black Swans.
- One main effect of info: impediment to knowledge. Additional knowledge of minutiae can be useless. The problem is that our ideas are sticky; once we produce a theory, we are not likely to change our minds (belief perseverance).
- Pony handicapping experiment. More info did not lead to increase in their accuracy, but their confidence went up.
- It's a good idea to question the error rate of an expert's procedure. Do not question his procedure, only his confidence.
- Things that move, and therefore require knowledge do not usually have experts. Professions that deal with the future and base their studies on the nonrepeatable past have an expert problem. Things that move are often Black Swan-prone.
- You cannot ignore self-delusion. The problem with experts is that they do not know what they do not know. Lack of knowledge and delusion about the quality of your knowledge come together; also makes you satisfied with your knowledge.
- The problem with prediction: we are living in Extremistan, not Mediocristan. We are good at predicting the ordinary, but not the irregular. What matters is not how often you are right, but how large your cumulative errors are. Economic forecasters tend to fall closer to one another than to the resulting outcome (cluster effect).
- Expert excuses for prediction errors: invoke the outlier, outside the system, outside the scope of your science; the "almost right" defense, retrospectively. When right, experts attribute it to their own depth of understanding and expertise; when wrong, the situation was to blame.
- We humans are victims of asymmetry in perception of random events. We attribute our successes to our skills, and our failures to external events, to randomness.
- Hedgehog and the fox: most prediction failures come from hedgehogs who are mentally married to a single big Black Swan event, a big bet that is not likely to play out. Rather be a fox, with an open mind. I know history will be dominated by an improbable event, I just don't know what that event will be.
- Plans fail because of tunneling, the neglect of sources of uncertainty outside the plan itself. Consider the track records of builders, paper writers, and contractors. The unexpected almost always pushes in a single direction: higher costs and a longer time to completion.
- Researchers studied how students estimate their project completion time. Optimists promised 26 days; pessimists, 47 days. Average actual time: 56 days.
- Also the nerd effect -- it stems from the mental elimination of off-model risks, i.e. focusing on what you know; viewing the world from within a model. Most delays and cost overruns arise from unexpected elements that did not enter into the plan.
- Anchoring - you lower your anxiety about uncertainty by producing a number.
- The longer you wait, the longer you will be expected to wait.
- Corporate and government projections have an additional easy-to-spot flaw: they do not attach a possible error rate to their scenarios. Three fallacies: 1. variability matters; 2. forecasts degrade as the projection period lengthens; 3. the random character of the variables being forecast is misunderstood.
- Almost all the inventions we see are the product of serendipity.
- Fat tail - technical term for Black Swan.
- Prediction requires knowing about technologies that will be discovered in the future. But that very knowledge would almost automatically allow us to start developing those technologies right away.
- Poincare: nonlinearities, small effects that can lead to severe consequences; about the limits that nonlinearities put on forecasting.
- Montaigne, vulnerability of human knowledge.
- To me utopia is an epistemocracy. Society governed from the basis of the awareness of ignorance, not knowledge.
- The assertive idiot has more followers than the introspective wise person. Psychopaths rally followers.
- In practice, randomness is incomplete info. Opacity.
- Maximize the serendipity around you. Barbell strategy for investing: 85-90% in extremely safe instruments like T-bills; other 10-15% in extremely speculative bets. Taking maximum exposure to positive Black Swans while remaining paranoid about the negative ones.
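The barbell split above is simple enough to write out. A minimal sketch; the 90/10 split and the dollar amounts are illustrative choices within the 85-90% range the note mentions, not a recommendation:

```python
def barbell(portfolio_value, safe_fraction=0.9):
    """Split capital per the barbell: ~85-90% in near-riskless
    instruments (e.g. T-bills), the rest in highly speculative bets."""
    safe = portfolio_value * safe_fraction
    speculative = portfolio_value - safe
    return safe, speculative

def worst_case_loss(portfolio_value, safe_fraction=0.9):
    """If every speculative bet goes to zero and the safe side holds,
    the maximum loss is capped at the speculative slice."""
    _, speculative = barbell(portfolio_value, safe_fraction)
    return speculative

safe, spec = barbell(100_000)
print(safe, spec, worst_case_loss(100_000))  # 90000.0 10000.0 10000.0
```

The design point: the downside is bounded by construction (at most the speculative slice), while the upside from a positive Black Swan on the speculative side is unbounded.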
- Invest in preparedness, not in prediction. Seize any opportunity, or anything that looks like opportunity. Work hard, not in grunt work, but in chasing such opportunities and maximizing exposure to them.
- Asymmetry: Put yourself in situations where favorable consequences are much larger than unfavorable ones.
- Matthew effect: people take from the poor to give to the rich. "cumulative advantage".
- Standard deviation is meaningless outside of Mediocristan. Same as sigma; variance is the square of sigma.
- With the Gaussian bell curve, the odds decrease dramatically fast as you move away from the average. But the Gaussian bell curve is not ubiquitous in real life.
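The speed of that decrease can be made concrete with the standard normal's tail probability, using only the standard library:

```python
from math import erfc, sqrt

def gauss_tail(k):
    """P(X > k * sigma) for a standard normal variable X."""
    return 0.5 * erfc(k / sqrt(2))

# Each extra sigma collapses the odds far faster than the last one did.
for k in (1, 2, 4, 10):
    print(f"P(X > {k} sigma) = {gauss_tail(k):.3g}")
```

Under a Gaussian, a 10-sigma event is effectively impossible (odds on the order of 10^-24); in Extremistan, where tails follow scalable laws instead, such "impossible" deviations keep showing up.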
- One can almost always ferret out predecessors for any thought. You can always find someone who worked on part of your argument and use his contribution as backup. The scientific association with a big idea goes to the one who connects the dots -- those who derive the consequences and seize the importance of an idea, seeing its real value, win the day.
- Nature's geometry is not Euclid's -- triangles, squares, circles.
- Fractal is the word Mandelbrot coined to describe the geometry of the rough and broken. Fractality is the repetition of geometric patterns at different scales, revealing smaller and smaller versions of themselves. No qualitative change when an object changes size. Large resembles the small.
- Fractal Geometry of Nature, Mandelbrot.
- fractal has numerical or statistical measures preserved across scales, ratio is the same.
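That scale-preserved ratio can be checked directly with a Pareto survival function; the tail exponent alpha = 1.5 here is an arbitrary illustrative choice, not an exponent from the book:

```python
def pareto_survival(x, alpha=1.5, x_min=1.0):
    """P(X > x) for a Pareto distribution -- a scalable, fractal law."""
    return (x / x_min) ** -alpha

# The ratio of exceedance odds is the same at every scale:
for x in (10, 1_000, 100_000):
    ratio = pareto_survival(2 * x) / pareto_survival(x)
    print(f"P(X > {2 * x:>6}) / P(X > {x:>6}) = {ratio:.4f}")  # always 2^-1.5 ~ 0.3536
```

Doubling the threshold always cuts the odds by the same factor (2^-alpha), no matter how far out you are -- the large resembles the small. A Gaussian has no such property: its ratio collapses toward zero as the threshold grows.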
- The distribution is scalable and fractal (he didn't know the exact exponent, though).
- Precise models get humbled by reality. It is about opacity, the incompleteness of info, the invisibility of the generator of the world.
- Fractal randomness a way to reduce these surprises, to make swans possible, to make them gray.
- It is contagion that determines the fate of a theory in social science, not its validity.
- Domain dependence of skepticism. Irritated by those who exercise their skepticism against religion but not against economics, social scientists, and phony statisticians.
- Having plenty of data will not provide confirmation, but a single instance can disconfirm.
- Rule: I am very aggressive when I can gain exposure to positive Black Swans -- when a failure would be a small amount--and very conservative when I am under threat from a negative Black Swan.
- Refusing to catch trains. Missing a train is only painful if you run after it. You stand above the rat race and the pecking order, not outside of it, if you do so by choice.