09 March, 2014

3-3-7 improving systems: Exponentiation of error

Although "automation" is a great strategy, it has some drawbacks. One of the major drawbacks is what I call "exponentiation of error".

We know that the more efficient automation becomes, the less human intervention there is. But imagine that, for some reason, an error happens within a system lacking the human factor; imagine how many faulty products it would produce before someone discovers it and shuts the system down or fixes it.

And imagine if one error leads to a bigger one, and that one to a bigger one still. How big and devastating would the final error be?
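To put rough numbers on it: suppose an unattended line turns out 1,000 items an hour, a fault starts by spoiling 1% of them, and the damage doubles every hour until someone notices. Here is a minimal sketch of that compounding (all figures are hypothetical, chosen only to illustrate the idea):

# Hypothetical illustration of "exponentiation of error": an unattended
# production line keeps running after a fault, and each hour the defect
# rate compounds instead of staying flat.

ITEMS_PER_HOUR = 1000        # assumed throughput of the automated line
INITIAL_DEFECT_RATE = 0.01   # 1% of items are faulty when the error first appears
GROWTH_PER_HOUR = 2.0        # assumed factor by which the error compounds each hour
HOURS_UNNOTICED = 8          # time until a human discovers the problem

defect_rate = INITIAL_DEFECT_RATE
total_faulty = 0
for hour in range(1, HOURS_UNNOTICED + 1):
    faulty_this_hour = ITEMS_PER_HOUR * min(defect_rate, 1.0)
    total_faulty += faulty_this_hour
    print(f"hour {hour}: defect rate {min(defect_rate, 1.0):.0%}, "
          f"faulty items so far {total_faulty:.0f}")
    defect_rate *= GROWTH_PER_HOUR   # one error feeding a bigger one

# With these made-up numbers, by the eighth hour the line is producing
# nothing but faulty items; the later the error is caught, the worse it gets.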

So the more complex and automated a system is, the more probable it is for an error to happen, and the more important it is for a human expert to intervene to stop the error and fix it.


That's why there will always be a pilot and a co-pilot in every plane carrying people, no matter how automated and efficient it is.
