Black Box Thinking, M. Syed

Sunday, November 8, 2020

I own this book and took these notes to further my own learning. Taking notes, publishing them, and re-reading them allows me to flatten my forgetting curve. If you enjoy these notes, I highly encourage you to do the same: buy this book here and take your own notes.


Part I: The Logic of Failure

  • Medical errors in the USA are equivalent to two 747 crashes every day. Today, preventable medical error kills more people than traffic accidents. The difference is that we would never tolerate that degree of preventable harm in any other field
  • Medical errors are due to lack of resources and to mistakes: doctors have to make quick decisions and can't afford to test all the alternative treatments
  • Human beings do not like failure; we even create face-saving excuses before attempting anything
  • When investigating deaths during operations, you can find patterns… the problem is that these patterns are bound to repeat over time because healthcare never collects data on how accidents happen. By contrast, the aviation industry has collected an enormous amount of data from failures and is now one of the safest industries in the world
  • Mistakes happen in airlines and hospitals because engineers or nurses are too intimidated to warn the person above them in the hierarchy… in psychology, it is now well known that social hierarchies inhibit assertiveness… people talk to those in authority with “mitigated language”… NASA even created a program to train junior crews in communication assertiveness, while senior crews were taught to listen
  • Airline crash investigations showed that attention is a scarce resource: if you focus on one thing, you lose awareness of other things and of the passage of time
  • Abraham Wald and survivorship bias: in order to learn from a mistake, you need to consider not only the data you have but also the data you don't have
  • Progress can only occur when there is a culture and attitude of learning from mistakes, something that simply does not exist in the medical system, where we protect doctors

Part II: Cognitive Dissonance

  • Type 1 error: diagnosing a tumor that isn't there (error of commission). Type 2 error: failing to diagnose one that is (error of omission)
  • Cognitive dissonance: most of us think of ourselves as rational and smart, and being proved wrong is a very uncomfortable sensation. That's why, when faced with new evidence, one can either add new elements to the cognition causing the psychological dissonance (rationalization) or avoid circumstances and contradictory information likely to increase the magnitude of the dissonance (confirmation bias)
  • Cognitive dissonance is a normal psychological trait that can be very harmful in certain core professions (judges, police investigators, politicians, doctors, Tony Blair with the invasion of Iraq…)
  • Cognitive dissonance and self-justification are psychological traits that work against progress through learning from failure
  • A UC Berkeley study showed that the winning stocks investors sold went on to outperform the losing stocks they held onto by 3.4%… in other words, people hold losers too long because they can't bring themselves to admit they have made a mistake
  • Memory is very malleable, especially over time. Our brain does not replay videos; it assembles fragments of moments, and these fragments shift a lot over time. What we think we remember about specific moments (take 9/11, for example) is very likely to differ from reality

Part III: Confronting Complexity

  • Evolution is a process of testing through failure, called natural selection
  • Companies go bankrupt every year; J. Schumpeter called this “creative destruction”
  • The biologists at Unilever took a testing approach (trial and error) and got better results than with a linear, top-down approach
  • Experiments showed that groups focused on producing a large quantity of work ended up better than those focusing on quality over quantity
  • Mental shortcuts form the basis of the narrative fallacy. The narrative fallacy describes our limited ability to look at sequences of facts without weaving an explanation into them. Our brain readily forces a logical link to build a reasoning. The narrative makes facts more easily remembered; it helps them make more sense…
  • The problem: sometimes our brain is so eager to provide an explanation that we are capable of explaining opposite outcomes without noticing our own inconsistency
  • “Scared Straight” program: show youngsters a glimpse of prison life to shock them and nudge them into changing their behavior… a great example of failed analysis due to selection bias (questionnaires sent to parents…)… cognitive dissonance: the creators were persuaded of its efficacy because it sounded so intuitive
  • Today, a lot of government policy changes have not actually been validated through proper testing and analysis, which is very scary…

Part IV: Small Steps and Giant Leaps

  • Break down a big goal into small parts and then improve on each of them
  • Create marginal gains and accumulate them to achieve bigger goals
  • Formula 1 pit stops
  • Nathan's hot dog contest…
  • Dyson's creative process begins with frustration with a problem, then ideation and critique, then prototyping again and again…
  • Dyson was not the first to invent this technology… what matters most is then putting the idea into development and building the sales and production machine (a perfect manufacturing process over just the right idea)

Part V: The Blame Game

  • Management is all about finding the right spot between a “blame culture” and “anything goes”… blaming can destroy a company's culture and prevent future innovation and bold ideas

Part VI: Creating a Growth Culture

  • Evangelists for failure: Beckham, MJ, Dyson. The most important quality is the willingness to try and not be afraid to fail in order to progress
  • Try, fail and learn
  • Redefine failure, celebrate it
  • In Japan, failure is heavily stigmatized, which could explain why there is so little entrepreneurship compared to the USA