WHEN real change is needed in response to a crisis, anyone accused of “rearranging the deck-chairs on the Titanic” surely has some soul-searching to do. That derisive phrase describes effort being applied to an endeavour that is doomed to fail.
If your next action had the potential to cause harm to someone else, would you act differently? Setting aside the unusual scenario in which a hero sacrifices life or well-being to save others, we should hope the answer is yes. If your answer is no, meaning you would not change your behaviour, then reverse the question: would you prefer that someone else act differently, if it meant that you would not be harmed?
Consider the damage that would result under the following scenarios:
* Road warning signs removed for extended cleaning and maintenance;
* A lighthouse taken offline for days to change its lightbulbs;
* An emergency telephone number rendered unavailable during system upgrades.
Within the past few weeks, you may have encountered news stories featuring the harmful after-effects of actions or inaction, for example:
* Consumers and regulators duped by misleading engine test results, after German-made cars were found to detect and defeat the very tests designed to measure them;
* A victim of abuse who was not rescued, despite a report having been made to the authorities.
A friend whose job involved updating flight-control software for helicopters vividly described the extra attention to detail required when submitting changes: he had to accompany the pilot when his software changes were first flown. That policy forged a strong safety culture, serving as a check that high standards were maintained and that the lives of others were not unduly put at risk.
A similar procedure reportedly existed in Roman times: the engineers who built a bridge had to stand beneath it when it was first used, so any structural failure would bring harm to the builder himself.
It is easy to understand why quality control, whether in a software environment, a hardware context, or society at large, improves when decision-makers are themselves affected by the delivery of their product or service. The lesson? Shared risk makes work more meaningful and increases accountability, especially when lives are at stake.
To share your views, contact the author at: www.datashore.net or via The Voice.