In another happy accident, I ended up with a bunch of podcasts on failure to listen to in the same week. (Success!) Here are a few quotes I particularly enjoyed.
The engineers said “oh it’s going to have two buttons. That’s the whole interface. Take Me To The Moon, that’s one button, and Take Me Home is the other button” [but] by the time they landed on the moon it was a very rich interactive system …
The ultimate goal of new technology should not be full automation. Rather, the ultimate goal should be complete cooperation with the human: trusted, transparent, collaboration … we’ve learned that [full autonomy] is dangerous, it’s failure-prone, it’s brittle, it’s not going to get us to where we need to go.
And NASA has had some high-profile failures. In another episode in the same series of programmes, Faster, Better, Cheaper, the presenter, Kevin Fong, concludes:
In complex systems, failure is inevitable. It needs to be learned from but more importantly it needs to become a conscious part of everything that you do.
That fits nicely with Richard Cook's paper, How Complex Systems Fail, from which I'll extract this gem:
… all practitioner actions are actually gambles, that is, acts that take place in the face of uncertain outcomes. The degree of uncertainty may change from moment to moment. That practitioner actions are gambles appears clear after accidents; in general, post hoc analysis regards these gambles as poor ones. But the converse: that successful outcomes are also the result of gambles; is not widely appreciated.
In the TED Radio Hour podcast, Failure is an Option, Astro Teller of X, Google's "moonshot factory", takes Fong's suggestion to heart. His approach is to encourage failure: to deliberately seek out the weak points in any idea and abort when they're discovered:
… I’ve reframed what I think of as real failure. I think of real failure as the point at which you know what you’re working on is the wrong thing to be working on or that you’re working on it in the wrong way. You can’t call the work up to the moment where you figure it out that you’re doing the wrong thing failing. That’s called learning.
He elaborates in his full TED talk, When A Project Fails, Should The Workers Get A Bonus?:
If there’s an Achilles heel in one of our projects we want to know it right now not way down the road … Enthusiastic skepticism is not the enemy of boundless optimism. It’s optimism’s perfect partner.
And that’s music to this tester’s ears.