How Software Killed Four People

As we automate more with AI, we should remember that software has already been responsible for the deaths of at least four people.

Why it matters: In 1986, Ray Cox arrived at the hospital for his ninth radiation treatment for a tumor on his back.

  • He was a patient at the East Texas Cancer Center, a modern facility that used the Therac-25, a software-controlled machine.

  • This device provided both electron therapy for superficial tumors and X-ray therapy for deeper ones.

Between the lines: Ray Cox's treatment was intended to deliver a specific dose of electron radiation to his tumor.

  • However, a race condition in the software, triggered by the operator rapidly editing the treatment settings, caused the machine to fire its high-current beam, intended only for X-ray mode, directly into a small area of his back.
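Investigations of the accidents traced the overdoses to a race condition in the control software: a safety check read shared state, the operator's rapid edits changed that state, and the beam fired on the basis of the stale check. The snippet below is a minimal illustrative sketch of that class of bug (a time-of-check to time-of-use race), not the actual Therac-25 code; the machine model and all names are hypothetical.

```python
# Illustrative sketch only: NOT the Therac-25's actual code.
# It shows the bug class behind the accidents: a safety check is
# made once, the operator edits shared state afterward, and the
# beam fires on the stale result (time-of-check to time-of-use).
from dataclasses import dataclass

@dataclass
class Machine:
    mode: str = "xray"          # "xray" requires the target in place
    target_in_place: bool = True
    fired_unsafe: bool = False

    def safety_check(self) -> bool:
        # Runs once, against the state at this instant.
        return self.mode == "electron" or self.target_in_place

    def fire(self, check_passed: bool) -> None:
        # Trusts the earlier check instead of re-validating now.
        if check_passed and self.mode == "xray" and not self.target_in_place:
            self.fired_unsafe = True  # full-power beam, no target

m = Machine(mode="electron", target_in_place=False)
ok = m.safety_check()   # passes: electron mode needs no target
m.mode = "xray"         # operator edits the mode AFTER the check
m.fire(ok)              # fires on the stale result
assert m.fired_unsafe   # the unsafe condition is reached
```

A safer design re-validates every interlock atomically at the moment of firing rather than trusting a check made earlier, so no operator edit can slip in between check and action.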

Context: Cox suddenly experienced a burning sensation, which he described as an "electric shock."

  • The pain was intense, and he cried out in agony.

  • The machine briefly displayed a cryptic error message, "Malfunction 54," a code the operator's documentation did not explain.

  • Still, the operator cleared it and proceeded with the treatment, unaware that a catastrophic overdose had just occurred.

Human in the loop: The Therac-25 relied on trained technicians at its controls, an arrangement much like the "human in the loop" safeguards now proposed to oversee AI behavior.

  • However, the human controller proved to be an inadequate fail-safe.

Zoom in: This incident was not an isolated case.

  • Between 1985 and 1987, the Therac-25 was involved in at least six accidents that delivered massive overdoses of radiation.

  • These incidents led to severe injuries, amputations, and ultimately, the deaths of multiple patients.

  • Among them was Ray Cox, who died months later from complications of the overdose.

What's next: Companies are implementing AI to automate workflows, often without adequate preparation.

  • Although many of these applications do not carry the life-or-death stakes of a medical device, mistakes can still cause real harm.

  • Therefore, it is essential to establish a solid foundation in training, planning, and policy before adopting AI, even for small pilot programs.

Go deeper: To better understand how to prepare for AI adoption, visit Todd Moses & Company.