As the title suggests, some of this is about the safety culture in the airline industry, which is very open to analysing errors, using the black box. Compares this to the previous safety culture in the medical industry, which was less open and honest about mistakes; this has changed over time, in part by learning from the aviation industry and also by learning about human factors in relation to significant adverse events. Fair bit about how systems contribute to iatrogenic errors.
Without knowing the outcome of treatments, doctors wouldn't improve their skills; this relates in particular to psychiatrists, for whom the outcome of an intervention may only become known 20 years down the line. Equates this to swinging at golf balls in the dark and trying to improve without any feedback. Information needs to be shared back freely so learning can occur; punishing people too much means they won't report errors.
When we are confronted with evidence that challenges our deeply held beliefs, we are more likely to reframe the evidence than to alter our beliefs. This is cognitive dissonance. Contains high-profile examples of this involving George Bush and Tony Blair.
Take many small steps to make improvements to things. The key to creativity is often combining learning from more than one industry. Mentions a lot about small randomised controlled trials facilitating learning; Google in particular runs many of these, even on things such as the exact shade of blue used in its links, repeating them in an iterative process to improve profits (a rough sketch of one such trial follows below).
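To make the iterative-trial idea concrete, here is a minimal Python sketch of one round of an A/B test on click-through rate, decided with a simple two-proportion z-test. All the numbers, rates, and function names are illustrative assumptions, not details from the book or from Google.

```python
# One round of the iterative A/B testing loop described above.
# Visitor counts and click rates below are invented for illustration.
import math
import random

def simulate_clicks(n_visitors: int, true_rate: float) -> int:
    """Simulate how many of n_visitors click, given an underlying click rate."""
    return sum(random.random() < true_rate for _ in range(n_visitors))

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """Z statistic for the difference between two click-through proportions."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Randomly split traffic between the current shade of blue (A) and a
# candidate shade (B), then compare click-through rates.
random.seed(42)
n = 10_000                             # visitors per variant (assumed)
clicks_a = simulate_clicks(n, 0.030)   # current design
clicks_b = simulate_clicks(n, 0.033)   # candidate design

z = two_proportion_z(clicks_a, n, clicks_b, n)
print(f"A: {clicks_a / n:.3%}  B: {clicks_b / n:.3%}  z = {z:.2f}")
# If |z| > 1.96 the difference is significant at the 5% level: adopt the
# better variant as the new baseline, then test the next small change.
```

The point of the sketch is the loop, not the statistics: each trial is small, the winner becomes the new baseline, and the process repeats, which is exactly the "many small steps" approach the chapter advocates.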
People are quick to blame others, often genuinely believing the blame is deserved despite lacking full information; scapegoating and the blame game are very prevalent.