When an oil worker told investigators on July 23 that an alarm to warn of explosive gas on the Transocean rig in the Gulf of Mexico had been intentionally disabled months before, it struck many people as reckless.
Reckless, maybe, but not unusual. On Tuesday, the National Transportation Safety Board said that a crash last year on the Washington subway system that killed nine people had happened partly because train dispatchers had been ignoring 9,000 alarms per week. Air traffic controllers, nuclear plant operators, nurses in intensive-care units and others do the same.
Mark R. Rosekind, a psychologist who is a member of the National Transportation Safety Board, said the cases had something in common. “The volume of alarms desensitizes people,” he said. “They learn to ignore them.”
James P. Keller Jr., vice president of the ECRI Institute, formerly the Emergency Care Research Institute, has a name for it: “alarm fatigue.” In a recent Web seminar for health care professionals, he asked participants if their hospital colleagues had become desensitized to any important alarms in the last two years. Three-quarters said yes. “This suggests it’s a pretty pervasive problem,” he said.
The Deepwater Horizon disaster was BP’s second in recent years. An explosion at BP’s refinery in Texas City, Tex., in March 2005 killed 15 people and injured 180. Technicians were supposed to check that all alarms worked before starting up the chemical processing unit, but a supervisor told a technician to stop checking because there was not enough time, according to a report by the Chemical Safety Board, a federal investigative agency. The checks stopped, and one alarm turned out not to be working, leading a control room operator to make a wrong decision in the hours before the explosion.
In August 1997, a Korean Air jumbo jet hit the jungle four miles short of the runway in Guam because of pilot error, even though the Federal Aviation Administration had installed a system in the control tower to prevent such accidents — a “minimum safe altitude warning” alarm, to tell controllers that a plane on approach was too low. After the crash, investigators found that controllers thought the alarm had sounded too often, so they persuaded a technician to prevent it from sounding under normal circumstances.
In New York in 1980, control room technicians at the Indian Point 2 nuclear reactor ignored alarms indicating that there was water in the basement of the containment building; by the time they discovered the problem — by seeing it — 100,000 gallons of Hudson River water had leaked into the building, and the hot reactor vessel was sitting in it. Such a situation, engineers thought, risked cracking the vessel.
In each case, the alarms were installed because what they monitored could not easily be watched by a person. The common problem was that the humans did not trust the systems set up to assist them.
On the oil rig and in the Guam control tower, the operators were annoyed by false alarms, which sometimes went off in the middle of the night. At the refinery and the reactor, the operators simply did not believe that the alarms would tell them anything very important.
In other words, the alarms conveyed no more urgency to these operators than the drone of a nagging spouse — or maybe the shepherd boy in Aesop’s fable, who cried “Wolf!”
Mr. Rosekind, a former operations specialist at NASA, said even astronauts were vulnerable. “There’s so much information overload,” he said. “If that alarm doesn’t have meaning for that user, that operator, they’re going to start ignoring it. It doesn’t matter what environment you’re in.”
In separate interviews, he and Mr. Keller, who is a biomedical engineer, said the alarms had to be smarter, crying wolf less often. Of course, it would help if users acted a little smarter, too, as if they understood why the devices are there in the first place.