Ethical Problems with Modern Technology

By Eric Douglas Nielsen, Michigan State University

Every day, the scope of our lives broadens as new technologies are created. Many different technologies are available to us through our computers, and computers are present in every aspect of our lives. Yet surprisingly few people truly understand how computers work, or pause to think about the changes computers cause in our lives and in our society. Computers perform many ordinary, everyday functions, and they also perform many life-critical tasks: we bank by computer, we shop by computer, and we even rely on computers for medical treatment. Computer programmers and computer scientists make all of this possible.

Questions that a computer programmer may deal with include the following: Are ethics integrated into the computer science industry? What role do codes of conduct really play for programmers? How are these codes implemented and applied? What is being done to limit people from programming or developing anything they wish? Do most programmers ignore the coding standards and codes of ethics set forth by professional computing associations? Because computers are so prevalent in our everyday lives, it is important to understand how programmers view ethics, and what role ethics plays in their daily work.

The American Heritage Dictionary defines ethics as "The rules or standards governing the conduct of a person or the members of a profession" (1). Ethics is also "The science of human duty; the body of rules drawn from this science; a particular system of principles and rules concerning duty, whether true or false" (2). In general, ethics are the rules that govern actions and decisions in a particular profession. Dr. Walter Maner, a professor of computer science at Bowling Green State University, is considered by some to have coined the term "computer ethics"; Maner defines computer ethics as "an academic field in its own right with unique ethical issues that would not have existed if computer technology had not been invented" (3). Computers rely on programming, which is limitless: if one can dream it, one can design the program and implement it. Ethical questions about the uses and applications of electronic computers have been raised since their inception approximately sixty years ago. The Therac-25 case is a good example through which we can observe important and very relevant computer-based issues.

The Therac-25 case is a classic computer-software-related failure, and it is one of the most widely cited software-related accidents. It shows the problems and consequences that result from deficiencies in a software program. Programs always have a set of rules and specifications that need to be met. If the program specifications are met and the programmer considers the relevant ethical standards, the development can be called disciplined. Otherwise, the consequences of undisciplined software development can be devastating to all involved, and they raise important questions about the ethics of computer programming.

The Therac-25 was originally developed beginning in 1976 by Atomic Energy of Canada Limited (AECL) and CGR, a French company: "The purpose of the project was to build a linear accelerator to deliver X-rays and electron beams for the treatment of cancer" (5). The Therac-25 was a multi-million-dollar computerized radiation therapy machine, with its first commercial version available in 1982. Before this, there were several other models of the Therac.

In general, most computers have two main components, hardware and software, which the machine relies upon to run and execute programs and commands. A machine like the Therac-25 is considered software intensive if it relies heavily on the software component to carry out its purpose, rather than relying on both the hardware and the software together. In other words, a machine or system in which the critical control is mediated through the software is considered software intensive. The Therac-25 was considered more compact, more versatile, and easier to use than the previous models. Even though the new model offered great benefits, its software was flawed.

With earlier models such as the Therac-20, there were hardware and software interlocks, which in general kept the machine running safely. These prevented the patient from being burned or given an overdose; instead, a fuse blew in the machine. A single person developed the software for the Therac-20 and Therac-25, both of which were modeled after an earlier model, the Therac-6. The Therac-25 software had more responsibility for maintaining safe operation than the software in the previous models. The Therac-20 had independent protective circuits that monitored the operation and kept the patient safe. The computer's immense ability to control and monitor the hardware was a great asset for the system, but the cost of these safety mechanisms and interlocks was very high, so they were removed from the next model, the Therac-25. Nancy Leveson, a professor at the University of Washington, has suggested that this practice of cutting costs is "becoming more common as companies decide that hardware interlocks and backups are not worth the expense, or they put more faith on software than on hardware reliability" (5). In the Therac case, tremendous responsibility was put on the programmer to produce a safe software program, and the programmer failed to do so.
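To make the contrast concrete, here is a minimal sketch in Python, with entirely hypothetical names (the real Therac-25 control program was custom code, not Python), of the difference between trusting the software alone and requiring an independent hardware interlock to agree before the beam fires:

    # A minimal sketch, assuming hypothetical function names;
    # this is not the actual Therac-25 code.

    def read_interlock_circuit() -> bool:
        """Stub standing in for an independent protective circuit --
        the kind of hardware check the Therac-20 had and the
        Therac-25 removed."""
        return True

    def enable_beam() -> None:
        print("beam enabled")

    def fire_beam(software_checks_passed: bool) -> None:
        # Defense in depth: the software check AND the independent
        # hardware interlock must both agree before the beam fires.
        if not software_checks_passed:
            raise RuntimeError("software check failed; beam not enabled")
        if not read_interlock_circuit():
            raise RuntimeError("hardware interlock open; beam not enabled")
        enable_beam()

    fire_beam(software_checks_passed=True)  # prints "beam enabled"

With the Therac-25 design, only the first of those two checks existed, so a single software fault was enough to defeat the machine's safety.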

The Therac-25 was designed to aim a precise beam of radiation at a patient to treat tumors and cancerous growths. The machine had two settings: a low-energy electron mode and a high-energy X-ray mode. The low-energy mode was a "200-rad mode" that delivered electron beams at varying energy levels, while the X-ray mode ran at a "25 million electron volt capacity." The X-rays were used to reach deeper tissue in the body while sparing the outer layers. In the low-energy mode the beam was aimed directly at the patient, whereas in the high-energy X-ray mode the beam was first put through a thick tungsten target, which converted it into X-rays. The machine was "controlled through a terminal hooked up to an old VAX mainframe so that a technician could run it from another room," so as not to expose the operator to excess radiation (7).
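The physical hazard can be stated as a single safety invariant: the machine must never deliver X-ray-level power unless the tungsten target is in the beam path. Below is a hedged sketch of such a consistency check, using hypothetical names rather than anything from the real system:

    # Illustrative only: names and structure are hypothetical.
    from enum import Enum

    class Mode(Enum):
        ELECTRON = "electron"  # low-power beam aimed directly at the patient
        XRAY = "xray"          # full-power beam, converted to X-rays by the target

    class Turntable(Enum):
        ELECTRON_POSITION = 0
        TARGET_POSITION = 1

    def assert_beam_safe(mode: Mode, turntable: Turntable) -> None:
        # The core Therac-25 hazard: full power without the target in place.
        if mode is Mode.XRAY and turntable is not Turntable.TARGET_POSITION:
            raise RuntimeError("UNSAFE: X-ray power selected without target in beam path")

    assert_beam_safe(Mode.XRAY, Turntable.TARGET_POSITION)  # passes silently

In the accidents, the software allowed exactly this invariant to be violated: the full-power beam fired while the turntable was out of position.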

Altogether, eleven Therac-25 machines were installed in treatment centers: five in the USA and six in Canada. In most cases the procedure went as planned, with no complications; the machine provided the radiation needed to treat cancerous tumors. But between June 1985 and January 1987, there were six known accidents involving massive overdoses of radiation, which resulted in deaths and serious injuries. Mr. Rawlinson of the Ontario Cancer Institute described them as "the worst series of radiation accidents in the 35-year history of medical accelerators" (6). The cause of these accidents was human error, which plays a growing role in accidents throughout society. Mark Eby, of the University of Guelph (Canada), has suggested that "with an increase in advancement everyday, people are looking for a faster way to make products and software. (These products are) created at an alarming rate to keep up with the consumer and sometimes (important aspects) are over looked" (7). When the Therac-25 was engineered, not all possible outcomes were considered before it was put on the market; in particular, the software overlooked the possibility of common operator input errors. The deaths could have been prevented, or at least made far less likely, if the program had been designed better and tested properly.
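One of the input errors Leveson and Turner document was a race condition: if the operator edited the treatment data quickly, a concurrent setup task could act on a stale snapshot of the inputs. The toy Python sketch below, which is hypothetical and greatly simplified compared with the real machine's tasks, shows how an unsynchronized shared record can silently lose a fast edit:

    # Hypothetical, greatly simplified illustration of a lost-update race.
    import threading
    import time

    shared_input = {"mode": "xray"}  # shared, unsynchronized treatment record

    def setup_task() -> None:
        mode = shared_input["mode"]              # snapshot taken here...
        time.sleep(0.1)                          # ...slow hardware setup runs...
        print(f"machine configured for {mode}")  # ...stale if edited meanwhile

    worker = threading.Thread(target=setup_task)
    worker.start()
    time.sleep(0.05)                   # the setup task has taken its snapshot
    shared_input["mode"] = "electron"  # the operator's quick edit arrives too late
    worker.join()                      # prints "machine configured for xray":
                                       # the edit was silently lost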

Several different factors played a part in the failure of the Therac-25. A major problem was the individual programmer, whose way of working was severely flawed. The programmer who developed the Therac-25 software took his work seriously, but also personally: he preferred to work alone and in isolation rather than collaborate with peers who might have caught the errors in his code. This way of working was arrogant and irresponsible, on the part of both the individual programmer and the company he worked for. While developing and testing the code, the programmer did not think through all the different situations that could arise, and he did not handle errors properly. The program handled errors by silent failure: the error output was not informative, and the whole system could fail and shut down if a single part of the software failed.
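The difference between silent failure and responsible error handling is easy to show in miniature. In the hedged Python sketch below (the names are hypothetical, and deliver() simply simulates a fault), the first handler swallows the cause the way the Therac-25 did, while the second fails safe and explains itself:

    # Illustrative sketch; deliver() and shutdown_beam() are stand-ins.

    def deliver(dose: float) -> None:
        raise IOError("dosimetry readback mismatch")  # simulated fault

    def shutdown_beam() -> None:
        print("beam disabled")

    def treat_silently(dose: float) -> None:
        try:
            deliver(dose)
        except Exception:
            print("MALFUNCTION")  # no cause, no severity, no guidance

    def treat_responsibly(dose: float) -> None:
        try:
            deliver(dose)
        except Exception as exc:
            shutdown_beam()  # fail safe FIRST, then report
            raise RuntimeError(
                f"dose delivery aborted at {dose} rad: {exc}; "
                "beam disabled, do not resume until the fault is diagnosed"
            ) from exc

    treat_silently(200.0)   # prints only "MALFUNCTION"
    # treat_responsibly(200.0) would disable the beam and raise a clear error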

Another problem was that the technicians were trained to think that the Therac-25 could not fail, and that it was virtually impossible to give an overdose of radiation because of "the many safety mechanisms" (8). This way of thinking was totally wrong; programs should always handle errors in an appropriate and safe manner. The technicians were taught that if an error occurred, it did not matter and did not jeopardize the safety of the patient. Furthermore, the software never went through any meaningful debugging or testing. This was an enormous problem: not all possible outcomes were thought through or handled in the software. These design-process failures violated good software engineering principles, and general "good" engineering practice as well.

Good software design practices were completely ignored. The Therac-25 relied almost totally on software for safe operation; it had no hardware safety interlocks to help ensure safe operation. In general, the overall design of the system was unsafe. The design of a program should be kept simple and easy for other programmers to understand. In software development, quality assurance practices and standards must always be established for the program to abide by. When errors occur, specific information about the cause of the error should always be output. The software should be subjected to extensive testing and formal analysis before it is deemed safe for general use. All of these problems could have been corrected to ensure that patients were treated properly and that the machine operated safely. Safe operation is necessary in all machines, especially life-critical ones.
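Leveson and Turner describe one defect that extensive testing should have caught: a one-byte flag was incremented rather than set, so on every 256th pass it wrapped around to zero and a safety check was silently skipped. Here is a hedged, simplified rendering of that bug in Python, together with the kind of boundary test that exposes it:

    # Simplified, hypothetical rendering of the documented one-byte
    # wraparound defect; not the actual Therac-25 code.

    def flawed_flag_update(flag: int) -> int:
        return (flag + 1) % 256  # 8-bit increment wraps to 0 on the 256th pass

    def safety_check_enabled(flag: int) -> bool:
        return flag != 0         # the check is skipped whenever the flag is zero

    def test_check_never_disabled() -> None:
        flag = 0
        for step in range(1, 300):  # run past the 8-bit boundary
            flag = flawed_flag_update(flag)
            assert safety_check_enabled(flag), f"check skipped at step {step}"

    try:
        test_check_never_disabled()
    except AssertionError as err:
        print(err)  # "check skipped at step 256" -- the test has found the bug

A test this simple, run past the boundary of the counter, would have revealed the defect before any patient was ever treated.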

The Therac-25 case is not the only example of the intersection of ethics and computer programming; there are many other examples of the dire consequences of neglecting aspects of a program. This case is a real-life example of what can happen when computer ethics are ignored. All professionals have to take ethics into consideration, especially when dealing with a life-critical system like the Therac-25. But computer programmers carry a larger responsibility, one that extends beyond computer-related medical technologies. More and more people bank online today, and online banking relies heavily on computers. Imagine a situation in which everyone who banks online suddenly loses all the information in their accounts: if the computer system malfunctions or is corrupted, there will be confusion, anger, and mayhem. The computer systems that banks use need to be secure and always running properly, and when computer scientists develop the code for banking systems, they need to verify that all possible complications and outcomes are handled properly. A banking system cannot directly harm people if it malfunctions, but it can create many problems that directly disrupt people's way of life. This is just one hypothetical example of the fact that in every application of computer programming, certain standards, principles, and ethical decisions need to be considered and applied.
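"Handling all possible outcomes" in banking code usually starts with atomicity: a transfer must either complete entirely or leave both balances untouched. Here is a minimal sketch using Python's standard sqlite3 module; the table layout and account numbers are hypothetical:

    # Minimal sketch; schema and data are hypothetical.
    import sqlite3

    def transfer(conn: sqlite3.Connection, src: int, dst: int, amount: int) -> None:
        with conn:  # one transaction: commits on success, rolls back on any error
            cur = conn.execute(
                "UPDATE accounts SET balance = balance - ? "
                "WHERE id = ? AND balance >= ?",
                (amount, src, amount))
            if cur.rowcount != 1:  # debit refused: unknown account or overdraft
                raise ValueError("insufficient funds or unknown account")
            conn.execute(
                "UPDATE accounts SET balance = balance + ? WHERE id = ?",
                (amount, dst))

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 0)])
    transfer(conn, 1, 2, 40)
    print(conn.execute("SELECT id, balance FROM accounts").fetchall())
    # [(1, 60), (2, 40)] -- and a failed transfer would have changed nothing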

Imagine another hypothetical situation, this one involving online shopping. More and more people rely on the Internet and their computers to shop. Suppose an online store's servers malfunction or crash: a person ordering a single unit of a product for their business might instead be shipped 1,000 units. In this case the consumer could always return the extra products and receive a refund, but as always with computers, certain steps need to be taken to avoid such complications and mix-ups. Those steps come into play when the program is being developed: standards critical to the proper and safe operation of the system need to be met.
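One of those unglamorous steps is input validation: reject an implausible quantity before the order system ever acts on it. A hedged sketch, with a hypothetical per-order limit:

    # Illustrative sketch; the limit is an assumption for illustration.
    MAX_QUANTITY = 100  # hypothetical per-order business limit

    def validate_quantity(raw: str) -> int:
        """Parse and sanity-check a quantity typed into an order form."""
        try:
            qty = int(raw)
        except ValueError:
            raise ValueError(f"quantity {raw!r} is not a whole number")
        if not 1 <= qty <= MAX_QUANTITY:
            raise ValueError(f"quantity {qty} is outside the allowed range 1-{MAX_QUANTITY}")
        return qty

    print(validate_quantity("1"))      # 1
    # validate_quantity("1000") raises: 1,000 units never reach the warehouse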

The programmer in the Therac-25 case used computer technology inappropriately, without thinking of all the possible consequences. Whenever there was a problem with the machine, an error message was displayed, but it basically always stated the same thing: that a "MALFUNCTION" had occurred. The way the programmer handled errors was unethical, especially in code used in a life-critical system.

Why was a single programmer allowed to handle errors this way? What kind of ethical considerations, if any, entered this programmer's head? All professional programmers know that there are proper ways to handle errors, especially when dealing with a machine that involves human life. Did the programmer not consider some set of "commandments for programming"? The Computer Ethics Institute's Ten Commandments of Computer Ethics describes the "acceptable" uses of technology, specifically computers:

1. Thou shalt not use a computer to harm other people.

2. Thou shalt not interfere with other people's computer work.

3. Thou shalt not snoop around in other people's computer files.

4. Thou shalt not use a computer to steal.

5. Thou shalt not use a computer to bear false witness.

6. Thou shalt not copy or use proprietary software for which you have not paid.

7. Thou shalt not use other people's computer resources without authorization or proper compensation.

8. Thou shalt not appropriate other people's intellectual output.

9. Thou shalt think about the social consequences of the program you are designing.

10. Thou shalt always use a computer in ways that insure consideration and respect for your fellow humans. (9).

If the programmer of the Therac machine had followed the commandments, and especially paid attention to numbers 9 and 10, most likely none of the patients would have been injured or killed. The programmer clearly did not think about all the possible outcomes for the machine. If the programmer had consulted with other professionals to help develop the program code, the machine most likely would have been safe. The commandments provided above are just one example of how programmers and researchers approach and understand computer ethics.

There are many different perspectives on the issue of computer ethics. According to James Moor, whose essay is published online by The Research Center on Computing & Society at Southern Connecticut State University, there is a problem with defining computer ethics: "A typical problem in computer ethics arises because there is a policy vacuum about how computer technology should be used. Computers provide us with new capabilities and these in turn give us new choices for action" (4). Usually there are no policies that describe conduct with computers, and when there are, the policies are inadequate. One of the main tasks of people interested in designing computer-use policies is to determine what society should do in such cases. But this task is very difficult, in the sense that a policy needs to "include consideration of both personal and social" circumstances (4). As an engineering major interested in computer science, I spend a great deal of time interacting with computers and thinking about the policies, programming, and protocols that regulate my use of them, and I can understand the difficulties described above in setting computer-use policies.

I have one programming class, CSE 231, which is designed as "an introduction to programming." I have created many programs spanning a wide array of applications and uses, from games to manipulating data to rewriting data files. I have just stepped through the door of computer programming and still have much to learn. If an inexperienced programmer like me can create programs that manipulate data and files, what can a professional or experienced programmer do? People can create viruses and programs that will wipe out the whole hard drive of a personal computer, shut down servers, affect the business operations of a multi-billion-dollar corporation, and even disrupt or destroy our way of life.

The government regulates many aspects of our lives, but the newest technologies, which we rely upon to live, work, and shop, are not regulated in a way that guarantees benefits to the larger society. Our society needs to regulate technology, specifically computer software development, and continue to apply our society's ethics and morals to the different aspects of our lives. The computer science industry also needs to be held to a higher standard, but only to a certain degree: if the government is going to regulate a discipline, the regulations must be both reasonable and supportive of society's goals. Regulations made by bodies that are not familiar with a discipline can cause more harm than good.

Today, anyone can become a professional programmer: people with a degree in computer science, and even people who are self-taught. Even though there are many professional societies, programmers are not required to join one or to follow any society's ethical principles. The world would be a much better and safer place if all programmers were required to join a professional society and follow its rules and standards for creating programs. Then all programmers would make and implement ethical decisions in all programs, from online shopping applications to medical and life-critical devices. With new laws and regulations, could the Therac-25 incident have been avoided entirely? Could such laws prevent future accidents? Could we prevent all failures of computer code?


Works Cited

  1. The American Heritage Dictionary of the English Language. 4th ed. Houghton Mifflin, 2000.
  2. Webster's Revised Unabridged Dictionary. Simon and Schuster, 1995.
  3. Maner, Walter. "Unique Ethical Problems in Information Technology." Science and Engineering Ethics 2.2 (1996): 147-154. (Online).
  4. Moor, James. "What Is Computer Ethics?" Metaphilosophy 16.4 (1985): 266-275. (Online).
  5. Leveson, Nancy, and Clark Turner. "An Investigation of the Therac-25 Accidents." IEEE Computer 26.7 (July 1993): 18-41.
  6. Rawlinson, J.A. "Report on the Therac-25." OCTRF/OCI Physicists Meeting, Kingston, Ont., Canada, May 7, 1987.
  7. Eby, Mark. "Therac-25: The Treatment that Killed." University of Guelph. http://www.uoguelph.ca/~meby/ (accessed 4/5/03).
  8. Felciano, Ramon M. "Human Error: Designing for Error in Medical Information Systems." Stanford University. http://www.smi.stanford.edu/people/felciano/research/humanerror/humanerrortalk.html#RTFToC18 (accessed 4/5/03).
  9. Computer Ethics Institute. "Ten Commandments of Computer Ethics." Computer Professionals for Social Responsibility. http://www.cpsr.org/program/ethics/cei.html (accessed 4/5/03).
  10. Healy, Mike, and Jennifer Iles. "The Establishment and Enforcement of Codes." Journal of Business Ethics (Aug. 2002).