When innovation and safety are paired, the subject is likely to be something like parachutes or air bags or highway design. But there's a much more fundamental connection between thinking innovatively and being safety-minded.
I recently finished reading a book on the hazards of our modern technologies, including how and why things sometimes go tragically awry. The book, Inviting Disaster: Lessons from the Edge of Technology (New York: Harper Business, 2002), was given to me by its author, technology writer James Chiles. (You may have seen some of his work on The History Channel.) It's a fascinating read and an outstanding piece of journalism that recounts such misadventures as Three Mile Island, the Challenger space shuttle explosion, and the crash of the Air France Concorde. Chiles not only does a masterful job of explaining the workings of complex machines; he also probes the very human foibles that led to such dramatic failures.
Much of what Chiles explains has a familiar ring. I've long used incidents such as the Challenger disaster to explain counterproductive patterns of thought and behavior. Still, as I pored over example after tragic example, I was struck by just how consistently the same core human failings recur. More than that, I came to the vivid realization that many of the attitudes and practices that maintain safe and smooth operations are the same ones that promote successful innovation, and vice versa.
This is more than a little counterintuitive. Innovation is usually associated with taking risks, while keeping a complex system operating reliably would seem to be the ultimate exercise in maintaining the status quo. One might logically argue that the control room of a nuclear reactor or the cockpit of a supersonic jet is the last place we want someone getting creative. Yet that’s exactly what’s required when things go wrong: the ability to creatively interpret, if not anticipate, novel events.
The notion that troubleshooting involves creative thinking comes as no surprise to your auto mechanic or to anyone who's ever written software. It requires the ability to analyze and sort out some very complex cause-and-effect relationships in order to link the invisible origins of a problem to symptoms that are often far removed. As Chiles notes, it requires imagining scenarios and unobservable connections deep inside the system.
There are many parallels. Innovation, for example, requires overcoming hidden assumptions that limit the range of possibilities we consider, and it’s that same tendency to make unexamined assumptions that often leads to problems with our machines. We assume the machine won’t (or even can’t) fail. We assume that it will always operate and be operated as intended. We assume we’ll have sufficient time to fix a problem when one arises. We may even be dismissive of warnings about problems, assuming that since it hasn’t happened yet, it isn’t likely to. Such assumptions have contributed to horrific failures.
Design is increasingly recognized as an important dimension of successful innovation, and it's just as important when it comes to safety. It's not just a matter of designing systems that are reliable, but of designing them so they won't confuse or mislead intelligent, well-intentioned operators. One of the hard-learned lessons of innovation is that products and services need to be adapted to the way consumers will use them, not the other way around. Many catastrophes have occurred because of this same kind of mismatch.
Disaster planning, like innovation, is about imagining scenarios and "what ifs." It's about dreaming up new possibilities, figuring out what might bring them about, and devising ways to avert adverse consequences. An unforeseen accident may represent any number of failures, and one of them is surely a failure of imagination.
When we fail to think innovatively, it can have many negative consequences for businesses and careers and investments and economies. It can also be downright dangerous.