It's a piece that lists very good examples of normalization of deviance in organizations.
It happened to me personally as well. I regularly rent a Tesla, and once I took a date on a trip, driving us through the city with her riding shotgun. She said, "Look at the orange line on the screen. You're driving too close to the parked cars on our right."
I answered, "It always does that; the proximity sensor on these Teslas is way too nervous." She looked out of the window and said, "No, you really are way too close to the parked cars!"
This (and several other examples in the article) is why warnings, alarms, and procedures should be well designed and justified. It's very easy (as in the case of the ventilator alarm) for alarms to become noisy to the point of uselessness, so they get ignored or disabled. The same goes for procedures: many are written down but lack a good justification. Sometimes the procedure itself is not useful; sometimes it is useful but the reason is never communicated to those responsible for following it; and worse, sometimes it is useful but not for the reason anyone thinks it is!
This means, IMO, that any organisation with a healthy culture will also have a means to review and remove alarms and procedures that are found not to be worthwhile. This substantially increases the chances that the rest of them are respected and followed.
I had totally normalized the proximity warning.