Logic Of Failure
R**S
"On S'engage Et Puis On Voit!"
Napoleon said "On s'engage et puis on voit!" Loosely translated, that means "One jumps into the fray, then figures out what to do next," a common human approach to planning. This discussion (page 161) takes on the adaptability of thought and cautions decision makers about the risks of overplanning in a dynamic, multivariate system. Using examples from Napoleon as well as more concrete ones, such as the quotation about soccer strategy (also on page 161), Dietrich Dörner, the brilliant German behavioral psychologist (University of Bamberg), has created a masterwork on decision-making skills in complex systems. I find it highly complementary to Perrow's work, and I also highly recommend Perrow's equally brilliant "Normal Accidents."

A strength of this work is that Dörner draws examples from so many areas, including his own computer simulations, which show the near-universal applicability of his concepts. One of Dörner's main themes is the failure to think in temporal configurations (page 198): in other words, humans are good at dealing with problems they currently have, but avoid dealing with and tend to ignore problems they don't have (page 189). Potential outcomes of decisions are not foreseen, sometimes with tragic consequences. In one computer simulation (page 18) Dörner had a group of hypereducated academics attempt to manage farmland in Africa: they failed miserably. In this experiment Dörner observed that the decision makers had "acted without prior analysis of the situation; failed to anticipate side effects and long-term repercussions; assumed the absence of immediately negative effects meant that correct measures had been taken; and let overinvolvement in 'projects' blind them to emerging needs and changes in the situation." (How many governmental bodies the world over does this remind you of?)

I am a safety professional, and I am especially interested in time-critical decision-making skills. Dörner's treatment of the Chernobyl accident is the most insightful summation I have seen. He makes the point that the entire accident was due to human failings, and points out the lack of risk analysis (and managerial pressure) and a fundamental lack of appreciation for the reactivity instability at low power levels (and, more importantly, how grossly the operators underestimated the danger posed by changes in production levels, page 30). Dörner's grasp here meshes the psychology and engineering disciplines (engineers like stasis; any change in reactivity increases hazards). Another vital point Dörner makes is that the Chernobyl operators knowingly violated safety regulations, but that violations are normally positively reinforced (i.e., you normally "get away with it," page 31). The discussion of operating techniques on pages 33 and 34 is insightful: the operators were running the Chernobyl Four reactor intuitively rather than analytically. While there is room for experiential decision making in complex systems, analysis of future potential problems is vital.

In most complex situations the problems are intransparent (page 37): not all the information we would like to see is available. Dörner's explanation of the interactions among complexity, intransparence, internal dynamics (and developmental tendencies), and incomplete (or incorrect) understanding of the system involved shows many potential pitfalls in dynamic decision making. One of the most important of all the decision-making criteria Dörner discusses is the importance of setting well-defined goals.
He is especially critical of negative goal setting (the intention to avoid something) and has chosen a perfect illustrative quote from Georg Christoph Lichtenberg on page 50: "Whether things will be better if they are different I do not know, but that they will have to be different if they are to become better, that I do know." A bigger problem regarding goals occurs when "we don't even know that we don't understand," a situation that is alarmingly common in upper management charged with supervising technical matters (page 60).

Fortunately, Dörner does have some practical solutions to these problems, most of them in chapter six, "Planning." One of the basics (page 154) is the three-step model in any planning decision (condition element, action element, and result element) and how these steps fit into large, dynamic systems. This is extremely well formulated and should be required reading for every politician and engineer. These concepts are discussed in conjunction with "reverse planning" (page 155), in which plans are constructed backwards from the goal. I have always found this a very useful method of planning or design, but Dörner finds that it is rare. Dörner argues that in extremely complex systems (Apollo 13 is a perfect example) intermediate goals are sometimes required because the decision trees are enormous. This sometimes relies on history and analogies (what has happened in similar situations before), but it may be necessary to stabilize a situation to enable further critical actions. This leads back to the quote that titles this review: "adaptability of thought" (my term) is vital to actions taken in extremely complex situations. Rigid operating procedures and historical precedents may not always work: a full understanding of the choices being made is vital, although no one person is likely to have this understanding. For this reason Dörner recommends a "redundancy of potential command" (page 161), which is to say a group of highly trained leaders able to carry out leadership tasks within their areas of specialty (again, NASA during Apollo 13), reporting within a clear leadership structure that values their input. Dörner then points out that nonexperts may hold key answers (page 168), though he notes that experts should be in charge, as they best understand the thought processes applicable in a given scenario (pages 190-193). This ultimately argues for more oversight by technicians and less by politicians: I believe (and I am guessing Dörner would concur) that we need more inter- and intra-industry safety monitoring, and fewer congressional investigations and less grandstanding.

This is a superb book; I recommend it highly to any safety professional as mandatory reading, and to the general public for an interesting discussion of decision-making skills.
A**N
A very relevant read
In this book Dr. Dorner does a fantastic job of identifying and illustrating the complexities of life's interactions. What at first may seem simple rarely is, due to a significant number of often-overlooked issues. I believe that what makes this book an indispensable aid in the process of decision making and leadership are the insights that Dr. Dorner draws from the many hypothetical scenarios he creates to display the 'Logic of Failure.'

This is a great read and a must for anyone in leadership, but it has relevance for everyone, because life is about decision making!
T**E
A book to make you go UMMMM
I was torn rating this book. My score went up and down with each turning page. It has moments where it will turn on a light in your head. In other chapters, it loses its way and your attention. The book is old, around 25 years, but it does a good job of defining failure through how people's minds work. His data comes primarily from his own simulations and experiments, with a few things from the real world mixed in to spice up his points. I would have liked to see more real-world examples and a broader data set.

Overall, he has some thoughts that will change how you view your team, organization, or company. You will see the little things trip you up even before the dust settles. For that alone, it is worth the read.
A**Z
I gave it to my 15-year-old son to read.
Great book, written in 1989 by the West German professor Dorner. It is a good introduction to system dynamics and systems thinking. The book gives several mental models of how people "attack" complex problems and why they often fail. Dorner describes several psychological experiments that help differentiate "bad decision makers" from "good decision makers," as well as mental traps that lead to failures. Based on this analysis he presents his model of decomposition and planning. Near the end of the book Dorner quotes Clausewitz: "...War is not an infinite mass of minor events... War consists rather of single, great, decisive actions, each of which needs to be handled individually." Such "strategic thinking" requires a far greater expenditure of mental energy, argues Dorner.

Dorner ends his book with the thought that "...If we cannot form a picture of a temporal configuration, we cannot adjust our thinking and actions to take that temporal pattern into account... We human beings are creatures of the present. But the world today must learn to think in temporal configurations."

The book is easy to read and comprehend. I gave it to my 15-year-old son to read. Hopefully he will take my advice and read it earlier in his life.
A**3
Something for the Office and Home
After having worked in several large multinational companies, it is always interesting to watch how projects unfold, and in particular how MBAs or MBA wannabes with their SharePoint programs can slowly grind a promising venture into the ground. It is always illuminating how tracking meetings, attendance, Gantt charts, and all the other tools can lead to bad outcomes. What I found valuable in the book is that the author has organized the things that we (or groups) do that lead to failure. It is these patterns, learned at an early age, that make it difficult for us to avoid the traps in both our personal and business lives. I would recommend this as one of those books that we should go back and re-read every couple of years.
F**D
Fascinating read
Love the way this looks at success through the window/mirror of failure.
D**N
So much potential
Interesting quantitative findings from the experiments. It talks for too long about HIV, then skips over it in the end. A better summary of the conclusions would have been a benefit. Indeed, a rolling summary of findings was missing throughout the book... to which end, it isn't really a book; it is a collection of experiment reports with an introduction that is left hanging.
J**B
A good book, though with many ups and downs
If I had to say whether the book is worth it or not, the answer would be clear: yes.

That said, it should be noted that the book ranges from trivialities, or well-known material that can be found in popularizers of the Peter Senge type, to flashes of brilliance, as when the author describes the mechanisms that lead us astray when diagnosing systems, or our differing capacities for dealing with matters of space versus matters of time.

The book contains a multitude of gems that appear throughout, and their presence alone would be enough to justify its purchase. At the same time, anyone familiar with systems models will find much that is already well known, and may even question the merits of simulation processes, using an argument the author himself gives on page 88: "If we have no idea how the variables in a system influence one another, we cannot take these influences into account."

This is precisely the weak point of simulation processes: they are useful for forecasting the development of variables whose relationships are already known. If those relationships are not known (and this happens often in complex systems), process simulation is not of much help.
G**D
I can't think of anybody who should not read this ...
I can't think of anybody who should not read this book. I can certainly think of many who should have, and didn't.