When health tech goes terribly wrong

Therac-25 is a name that often comes up in discussions of ethics and software testing, and it is easy to see why. The Therac-25 was a radiation therapy machine that was set to revolutionise the way radiation treatment was done: thanks to process automation, it could treat more patients faster, and staff could spend more time talking to and assisting patients (McQuaid, 2012, pp. 459-470). However, many assumptions were made, less testing was conducted as a result, and faulty software was released. In addition, because feedback rarely reached the manufacturer and the FDA, and no rules were in place for the government to intervene as it should, action was not taken until several patients had died (Tavani, 2013, p. 120; ComputingCases.org, n.d.).

Image source: (McQuaid, 2012, pp. 459-470)

A selection of the design decisions that contributed to the accidents:

  • A faulty microswitch (Tavani, 2013, p. 120)
  • Safety interlocks moved from hardware into software (Tavani, 2013, p. 120)
  • Serious software error: newly entered radiation doses were ignored and the old doses were used instead (Tavani, 2013, p. 120)
  • Serious software error: the electron beam could be turned on without the device that spreads the beam being in place (ComputingCases.org, n.d.)
  • Serious software error: a subtle bug known as a «race condition» meant that a fast typist could accidentally make the machine enter high-power mode and fire with the X-ray target out of position (McQuaid, 2012, pp. 459-470); a simplified sketch of this pattern follows the list.
  • Numerous software fixes had to be applied (ComputingCases.org, n.d.)
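
To make the «race condition» bullet above concrete, here is a minimal sketch of the pattern in Python. The names (`beam_mode`, `target_in_position`, `setup_hardware`) are hypothetical, and the actual Therac-25 software was custom low-level code, not Python; the sketch only illustrates the general hazard: a set-up routine reads the requested mode once and then spends time reconfiguring hardware, while a fast operator edits the mode in parallel, so the machine can fire with a mode that no longer matches the hardware state.

```python
import threading
import time

# Hypothetical, simplified sketch of the kind of race condition
# described above -- not the actual Therac-25 code. Two pieces of
# shared state, no locking.
beam_mode = "electron"       # mode the operator has requested
target_in_position = False   # X-ray target placement (hardware state)


def setup_hardware():
    """Configure the hardware for the mode requested right now."""
    snapshot = beam_mode     # read the shared mode once...
    time.sleep(0.1)          # ...then spend time moving hardware
    global target_in_position
    # The hardware ends up matching the (now possibly stale) snapshot.
    target_in_position = (snapshot == "xray")


def operator_edit():
    """A fast typist changes the mode while set-up is still running."""
    global beam_mode
    beam_mode = "xray"


def fire_beam():
    """Fire using the current mode, trusting the hardware to match."""
    if beam_mode == "xray" and not target_in_position:
        print("HAZARD: high-power beam fired with target out of position")
    else:
        print("Beam fired with consistent mode and hardware state")


setup = threading.Thread(target=setup_hardware)
setup.start()
time.sleep(0.01)   # let set-up take its snapshot first
operator_edit()    # the edit lands while the hardware is still moving
setup.join()
fire_beam()        # prints the HAZARD line
```

The straightforward defence is to treat the mode and the hardware configuration as one unit, for example by holding a lock across both the edit and the set-up, or by re-verifying the whole configuration immediately before firing; the independent hardware interlocks on earlier Therac models provided exactly that kind of final check.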

Organisational structures that contributed to the accidents (ComputingCases.org, n.d.):

  • Users were not required to report injuries, either to officials or to the manufacturer, so the threshold for what was important enough to report was left to each individual. In 1986, only 51% of problems were reported to manufacturers, as hospitals mainly dealt with the issues themselves. Hence, the officials (the FDA) did not know that problems were recurring, or that many clients were facing them, until two deaths began to shed light on the issue.
  • The FDA did not have the authority to recall the Therac-25, only to recommend a recall. The strongest action it could take was to publish an article in a medical bulletin declaring the machine defective!

After yet another death, the FDA still could do little more than recommend against using the Therac-25. It should have had the authority to take clear action whenever there is any doubt, however small, that something might harm users.

  • The software and technology were said to already be in use in various other existing products, so they were regarded as pre-approved and as having passed FDA testing procedures. Even so, the safety mechanisms had been moved from hardware locks into software; this was a major change and should arguably not have been treated as previously approved. Some minimal testing should be done on any new product, regardless.
  • The programmer who developed the operating system for both the previous Therac system and the Therac-25 had no formal training, and he was responsible for the «race condition» bug never being weeded out (McQuaid, 2012, pp. 459-470).
  • A lack of procedures, both for the FDA to follow up and verify before approving, and for the manufacturer's internal training and testing.

Ethical responsibilities:

The organisations: I think the FDA should have tested the finished product itself, regardless of any pre-approved parts of the technology and software. There is an ethical responsibility to look closely at the complete, finished product and make sure it is safe. AECL, the manufacturer of the Therac-25, should have tested better as well; for example, the software bug that ignored newly entered radiation doses should have shown up in any kind of software testing. The public relies on both the FDA and the manufacturer to have thoroughly tested the safety of the products they allow. Procedures should be followed, and everyone involved should have the training and qualifications their work requires.

The individuals involved: Hospital workers have to trust that the tools they are given are safe once the FDA has declared them approved, and the same goes for the integrity of a large manufacturing company. However, once the workers discovered minor errors, and later more serious ones, they should have taken more responsibility for making sure the problems were noted further up the hierarchy. In addition, surely someone at the FDA or AECL must have thought the tests were not good enough. Someone approved this, and someone else could have been a whistle-blower. Blowing the whistle can take a serious toll on one's workplace relations, psyche and career, but should that stop us from speaking up when we see that things are potentially dangerous? (Tavani, 2013, p. 117)

Summary: Many things went wrong in this case, on many levels; the Swiss cheese model comes to mind! If just one of the factors had been different, disaster could have been prevented: the FDA could have found the errors with more testing, every part of the system could have been inspected more closely if users had reported all their minor concerns, and the manufacturer could have shown more integrity in the product it built and sold. I hope, though, that this has been a big lesson learned for many industries and organisations.

Bibliography

Further reading: https://pdfs.semanticscholar.org/14cb/8d71d9de3df98e6c333551d9bd534f71d75a.pdf

ComputingCases.org (n.d.) A history of the introduction and shut down of Therac-25 [Online]. Available from: http://www.computingcases.org/case_materials/therac/case_history/Case%20History.html (Accessed: 04.06.2019). Edit 16.06.2019: the page seems to be down at the moment; a cached version is available here: https://webcache.googleusercontent.com/search?q=cache:PKZFHdTuX1kJ:https://computingcases.org/case_materials/therac/case_history/Case%2520History.html+&cd=1&hl=no&ct=clnk&gl=no

McQuaid, P. A. (2012) ‘Software disasters – understanding the past, to improve the future’, Journal of Software: Evolution & Process, 24 (5), pp. 459-470.

Tavani, H. T. (2013) Ethics and technology: Ethical issues in an age of information and communication technology. 4th Edition. John Wiley & Sons, Inc.

Image source: ar130405, Pixabay licence (free usage).