Ethical Issues in Technology


Kristin Shrader-Frechette

The US government projects cost increases for drugs of between 10 and 14 percent per year through at least 2011. US consumers claim that pharmaceutical technologies are too expensive, given that Canadians pay $34 for the same dose of the cancer drug Tamoxifen that costs US citizens $240. They also say that because the US National Institutes of Health, using citizens' tax dollars, funded most research on Tamoxifen and other drugs, citizens typically "pay twice" for pharmaceuticals. Yet the drug industry makes greater profits than any other contemporary enterprise. In response, the pharmaceutical companies claim that it cost nearly four times as much to bring a new drug to market in 2003 as in 1990. But the US Food and Drug Administration says that, since 1989, only 15 percent of new drugs have provided remedies superior to those already on the market. It says most of the cost of new drugs pays for advertising, and for less desirable products, not for research (Greider, 2003; see Huffington, 2003).

Who is right in the pharmaceutical technology battle? Ethicists have a variety of answers, not only to this problem but also to other technological problems, such as World Trade Organization agreements that override national environmental and safety standards (see Wallach and Sforza, 1999). In part as a result of such trade agreements, the World Health Organization says, for example, that the application of chemical technologies kills at least 40,000 persons worldwide every year. Most of these deaths result from the use, in developing nations, of pesticides that are banned in the developed world. As the drug and pesticide cases illustrate, the design and employment of technologies often raise troubling ethical issues, questions about right and wrong.

Most of the ethical issues concerning technology focus on questions of risk. Some of these questions include whether persons have been informed adequately about technological dangers, whether they have consented to them, whether the risks are equitably distributed, and whether those who impose risks have compensated the people threatened by them.

Technological risks can be divided into two main types, societal and individual. Societal risks (such as those from underground gasoline storage tanks) are largely involuntarily imposed. Individual risks (such as those from using a regulator to engage in scuba diving) are largely voluntarily chosen. Societal risks often raise greater ethical questions than individual risks because their potential victims typically have less choice regarding whether or not to accept them. For example, people choose whether to become scuba divers and what kind of breathing equipment to use. Usually, however, they have less choice over whether to allow a gas station to be built near them.

Much ethical debate focuses on whether technological risks ought to be evaluated by members of the technical community (Cooke, 1992; see Sunstein, 2002) or by the laypersons who are most likely to be their victims (Freudenburg, 1988). Scientists and engineers often treat the assessment of technological risks as the paternalistic prerogative of experts, in part because they claim that the risk definitions of irrational, ignorant, or risk-averse laypersons could impede social and technological progress (Douglas and Wildavsky, 1982; Sunstein, 2002). Many moral philosophers argue, in response, that evaluations of technology are not only matters of scientifically defensible outcomes but also matters of just procedures, because they affect public welfare (Cranor, 1992; Shrader-Frechette, 1991, 2002). Also, because even scientists and engineers have well-known prejudices in defining and estimating technological risks – such as overconfidence bias, which yields overly optimistic estimates of risk (Cooke, 1992; Kahneman, Slovic, and Tversky, 1982) – ethicists claim that we need democratic, as well as technical, evaluations of technology.

Other ethical controversies concern what level of technological safety is safe enough. Utilitarian philosophers, who emphasize maximizing overall welfare, typically argue that we can serve the greater good by accepting low levels of risk and by not forcing industry to spend money to avoid unlikely hazards, such as nuclear core melts or chemical explosions. Harsanyi (1975), for example, argues that "worst cases" of technological risk rarely occur. He claims that forcing industry to avoid worst cases is too conservative, impedes social progress, and overemphasizes small probabilities of harm. Egalitarian philosophers, who emphasize the equal rights of all persons to protection from harm, maintain that people deserve protection even from unlikely technological threats. Shrader-Frechette (1991, 1993), for example, argues that because the probabilities associated with technological risks are often uncertain, the public deserves protection from them even when those probabilities are small, since their size is dwarfed by potentially catastrophic consequences such as global warming or toxic leaks (see Rawls, 1971). Some egalitarian philosophers also claim that fairness and equal treatment require technology assessors and decision-makers to reverse the burden of proof and place it on those who impose technological risks rather than on those likely to be their victims. They say that because causal chains of harm are difficult to prove – and because risk victims are less able than risk imposers to bear the costs of faulty technological evaluations – those who design, implement, apply, or benefit from a technology should bear its greatest risks (Cranor, 1992).
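
The disagreement over acceptable risk levels can be made concrete with a stylized decision problem. The figures below are purely illustrative assumptions, not estimates drawn from the risk literature; they simply show how an expected-utility (broadly utilitarian) rule and a maximin (worst-case) rule can rank the same technology differently:

\[
% Illustrative assumption: a benefit of 100 welfare units with probability 0.999,
% and a catastrophe of -50,000 units with probability 0.001.
\mathrm{EU}(\text{accept}) = 0.999(100) + 0.001(-50{,}000) = 99.9 - 50 = 49.9 > 0
\]
\[
% Maximin compares worst outcomes rather than probability-weighted averages.
\min(\text{accept}) = -50{,}000 \qquad \min(\text{reject}) = 0
\]

On these assumed numbers, an expected-utility rule accepts the technology while a maximin rule rejects it. If the catastrophe probability is itself uncertain (say, anywhere between 0.001 and 0.01), the expected value ranges from about +49.9 to about -401, which illustrates why uncertain probabilities combined with catastrophic stakes are taken to favor the more protective choice.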

Still other ethical issues regarding technology address the criteria under which it is acceptable to impose some hazard (for example, chemical effluents) on workers or on the public. One important criterion for risk imposition is the equity of distribution of the risks and benefits associated with an activity. For example, Parfit (1983) argues that temporal differences among people and generations are not a relevant basis for discriminating against them with respect to risk. He and others maintain that a technological risk is less acceptable to the degree that it imposes costs on the future but awards benefits in the present. Commercial nuclear fission, for example, benefits mainly present generations, whereas its risks and costs – because of radioactive waste – will be borne primarily by members of future generations.
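
The force of Parfit's point is easiest to see in the standard economic practice of discounting future costs and benefits, which he criticizes. The discount rates and harm values below are illustrative assumptions chosen only to display the arithmetic:

\[
% Standard exponential-discounting formula: present value PV of a future value FV
% occurring t years hence, at annual discount rate r.
PV = \frac{FV}{(1+r)^{t}}
\]
\[
% Assumed figures: a harm valued at 1,000,000 units, 500 years in the future, discounted at 5 percent.
PV = \frac{1{,}000{,}000}{(1.05)^{500}} \approx 2.5 \times 10^{-5}
\]

Even at a 3 percent rate over only 100 years, the same harm counts for roughly 52,000 units, about one-twentieth of its undiscounted value. On such accounting, almost any present benefit outweighs even enormous harms to future people, which is precisely the treatment of temporal position that Parfit and other egalitarians reject as morally irrelevant.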

On the one hand, many economists evaluating technology follow the utilitarian philosophy. They question notions of distributive equity and argue that a bloody loaf of bread – earned through dirty or risky technologies – is better than none at all, because such technologies bring tax and employment benefits. On the other hand, egalitarian philosophers evaluating technology argue for "geographical equity" (Shrader-Frechette, 1993, 1995) and "environmental justice" (Bullard, 1993). They maintain that technological risks should be distributed equally across generations, regions, and nations. Otherwise, they claim, economically and socially disenfranchised persons will bear disproportionate burdens of technological risks. Economically, educationally, or socially disenfranchised persons also are less likely than others to be able to give genuinely free informed consent to technological and workplace risks (MacLean, 1986; Rescher, 1983; Bullard, 1993; Shrader-Frechette, 2002). Chemical facilities and hazardous waste dumps, for example, tend to be located in areas where income, education, and political power are lowest.

To such equity and consent arguments, some utilitarian ethicists have responded that no instances of distribution or consent are perfect. They claim that the greater good is achieved by risk-for-money trade-offs, as when workers accept jobs in dangerous technologies or when citizens accept the tax benefits of a hazardous technology in their community. Egalitarians like MacLean (1986) claim, however, that some values (like bodily health and environmental security) ought not to be traded for financial compensation. Gewirth (1982) also argues, for example, that persons have a moral and legal right not to be caused to have cancer. Such ethical debates over risk imposition and trade-offs generally focus on opposed views about rights, paternalism, human dignity, equal treatment, and adequate compensation for technological risk (Thomson, 1986).

Another aspect of the consent and compensation debate over technological risk concerns liability. Current US laws, for example, excuse nuclear power plant licensees from 99 percent of their liability for accidents, even when they intentionally violate safety laws. Technologists justify this liability limit on grounds of economic efficiency and the necessity of promoting essential, but dangerous, technologies. A number of ethicists argue that these exclusions violate citizens' rights to due process and to equal protection (Shrader-Frechette, 1991, 1993).

As this discussion of equity, consent, and compensation reveals, the ethical issues associated with technology may be just as important as the scientific and safety issues. Once we understand the magnitude of these ethical issues, we are forced to ask about a technology not only "how safe is safe enough?" but also "how safe is equitable enough?", "how safe is voluntary enough?", and "how safe is compensated enough?"


Bibliography

Bullard, R. D. (1993). Confronting Environmental Racism. Boston, MA: South End Press.

Cooke, R. (1992). Experts in Uncertainty: Opinion and Subjective Probability in Science. New York: Oxford University Press.

Cranor, C. (1992). Regulating Toxic Substances: A Philosophy of Science and the Law. New York: Oxford University Press.

Douglas, M. and Wildavsky, A. (1982). Risk and Culture. Berkeley: University of California Press.

Freudenburg, W. (1988). Perceived risk, real risk: Social science and the art of probabilistic risk assessment. Science, 242, 44–49.

Gewirth, A. (1982). Human Rights. Chicago: University of Chicago Press.

Greider, K. (2003). The Big Fix. New York: Public Affairs.

Harsanyi, J. (1975). Can the maximin principle serve as a basis for morality? American Political Science Review, 69, 594–605.

Huffington, A. (2003). Pigs at the Trough. New York: Crown.

Kahneman, D., Slovic, P., and Tversky, A. (eds.) (1982). Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.

Kates, R. and National Academy of Engineering (eds.) (1986). Hazards: Technology and Fairness. Washington, DC: National Academy Press.

Kneese, A. V., Ben-David, S., and Schulze, W. D. (1982). The ethical foundations of benefit-cost analysis. In D. MacLean and P. Brown (eds.), Energy and the Future. Totowa, NJ: Rowman and Littlefield, 59–74.

MacLean, D. (ed.) (1986). Values at Risk. Totowa, NJ: Rowman and Allanheld.

National Research Council (1983). Risk Assessment in the Federal Government: Managing the Process. Washington, DC: National Academy Press.

National Research Council (1993). Issues in Risk Assessment. Washington, DC: National Academy Press.

Parfit, D. (1983). The further future: The discount rate. In D. MacLean and P. Brown (eds.), Energy and the Future. Totowa, NJ: Rowman and Littlefield, 31–37.

Porter, A. L., Rossini, F. A., Carpenter, S. R., and Roper, A. T. (1980). A Guidebook for Technology Assessment and Impact Analysis. New York: North-Holland.

Rawls, J. (1971). A Theory of Justice. Cambridge, MA: Harvard University Press.

Rescher, N. (1983). Risk: A Philosophical Introduction. Washington, DC: University Press of America.

Sagoff, M. (1988). The Economy of the Earth. Cambridge: Cambridge University Press.

Shrader-Frechette, K. (1991). Risk and Rationality: Philosophical Foundations for Populist Reforms. Berkeley: University of California Press.

Shrader-Frechette, K. (1993). Burying Uncertainty: Risk and the Case Against Geological Disposal of Nuclear Waste. Berkeley: University of California Press.

Shrader-Frechette, K. (1995). Technology assessment. In W. T. Reich (ed.), Encyclopedia of Bioethics. New York: Macmillan.

Shrader-Frechette, K. (2002). Environmental Justice: Creating Equality, Reclaiming Democracy. New York: Oxford University Press.

Srinivasan, M. (ed.) (1982). Technology Assessment and Development. New York: Praeger.

Sunstein, C. (2002). Risk and Reason. New York: Cambridge University Press.

Thomson, J. J. (1986). Rights, Restitution, and Risk. Cambridge, MA: Harvard University Press.

Unger, S. H. (1994). Controlling Technology: Ethics and the Responsible Engineer. New York: John Wiley.

Wallach, L. and Sforza, M. (1999). Whose Trade Organization? Washington, DC: Public Citizen.

Winner, L. (1977). Autonomous Technology: Technics Out of Control as a Theme in Political Thought. Cambridge, MA: MIT Press.
