In a previous workplace we had a system with the potential for failure, as many systems do.
Two solutions were proposed:
- Reduce the potential for failure. (Build a more robust system.)
- Provide a way to recover from that failure. (a.k.a. Reboot.)
At first, rebooting was considered a last resort, there only to protect the company. Even having that option available was contentious, since using it would be quite disruptive. Nonetheless, it was left in as a last line of defence.
As the system progressed, people in the company became used to having that option available. It became common to reboot the system every 24 hours rather than spend the time fixing the aspects of the system that would have made it more robust.
Is this behaviour just called "falling to the lowest common denominator"?
Simplifying this, the rather obvious argument can be summarised as:
If one brings a tool, then one is more likely to use that tool than look for a better tool.
(Note that this isn't the same as "If you have a hammer, everything looks like a nail," because it's not that they don't know better; it's that laziness or pressure guides them away from the better solution.)
Is there a philosophical law/razor that describes and unifies these behaviours?
(Or, conversely, one stating that if we remove the opportunity, we effectively force people to consider "better" options.)
UPDATE
To expand on this: the razor I'm thinking of has to do with avoiding offering "easier" false solutions in place of true solutions.
You can also see this theoretical "razor" in people's attitude toward credit card debt. For example, a person is more likely to think "I need it" if they have a credit card than if they don't. The human failing is one part, but not providing the credit card in the first place would have avoided the false "solution" entirely, forcing them to save and so sparing them the additional fees and the drag on their finances.