Moral outsourcing refers to placing responsibility for ethical decision-making onto external entities, often algorithms. The term is most common in discussions of computer science and algorithmic fairness,[1] but it can describe any situation in which a person appeals to outside agents to absolve themselves of responsibility for their actions. In this context, moral outsourcing specifically refers to the tendency of society to blame technology, rather than its creators or users, for any harm it may cause.[2]
The term "moral outsourcing" was coined by Dr. Rumman Chowdhury, a data scientist who works on the overlap between artificial intelligence and social issues.[1] Chowdhury used the term to describe looming fears of a so-called "Fourth Industrial Revolution" following the rise of artificial intelligence.
Technologists often use moral outsourcing to distance themselves from their part in building harmful products. In her TED Talk, Chowdhury gives the example of a creator excusing their work by saying they were simply doing their job.[1] This is a case of moral outsourcing: not taking ownership of the consequences of creation.
Applied to AI, moral outsourcing allows creators to decide when the machine counts as human and when it is merely a computer, shifting the blame and responsibility for moral failings off the technologists and onto the technology. Conversations about AI, bias, and their impacts require accountability to bring change, and it is difficult to address biased systems if their creators use moral outsourcing to avoid taking any responsibility for the issue.[2]
One example of moral outsourcing is the anger directed at machines for “taking jobs away from humans,” rather than at the companies that deploy the technology and jeopardize those jobs in the first place.[1]
The term "moral outsourcing" builds on the concept of outsourcing: enlisting an external operation to complete specific work for another organization. In the case of moral outsourcing, the work of resolving moral dilemmas, or of making choices according to an ethical code, is handed to another entity.[3]
In the medical field, AI is increasingly involved in decisions about which patients to treat, and how to treat them.[4] The doctor's responsibility to make informed decisions about what is best for their patients is outsourced to an algorithm. Sympathy is also noted to be an important part of medical practice, an aspect that artificial intelligence conspicuously lacks.[5] This form of moral outsourcing is a major concern in the medical community.
Another field of technology in which moral outsourcing is frequently brought up is autonomous vehicles.[6] California Polytechnic State University professor Keith Abney proposed an example scenario: "Suppose we have some [troublemaking] teenagers, and they see an autonomous vehicle, they drive right at it. They know the autonomous vehicle will swerve off the road and go off a cliff, but should it?"[6] The decision of whether to sacrifice the autonomous vehicle (and any passengers inside) or the oncoming vehicle will be written into the algorithms defining the car's behavior. In the case of moral outsourcing, the responsibility for any damage caused by an accident may be attributed to the autonomous vehicle itself, rather than to the creators who wrote the protocol the vehicle uses to "decide" what to do.[1]
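The point that the "decision" is authored in advance can be made concrete with a short sketch. The following Python fragment is purely hypothetical (the scenario fields, function names, and maneuver strings are invented for illustration, not drawn from any real vehicle's software); it shows that every branch of such a protocol is a human-authored moral choice made long before the car encounters the situation.

```python
# Hypothetical sketch of a collision-response protocol. Each branch
# encodes a moral judgment chosen by the developers, not by the vehicle.

from dataclasses import dataclass

@dataclass
class Scenario:
    oncoming_collision: bool   # another vehicle is driving straight at us
    swerve_path_safe: bool     # swerving leads somewhere survivable
    passengers_on_board: int

def collision_response(s: Scenario) -> str:
    """Return a maneuver string for the given scenario."""
    if not s.oncoming_collision:
        return "continue"
    if s.swerve_path_safe:
        # Leaving the lane is judged acceptable when the exit is safe.
        return "swerve"
    # Abney's cliff case: swerving would sacrifice the passengers.
    # Whatever is returned here was decided by the protocol's authors.
    return "brake_and_hold_lane"
```

Run against Abney's scenario (an oncoming vehicle, no safe swerve path), the function returns the string its authors chose; attributing that outcome to "the car" rather than to the people who wrote the branch is the moral outsourcing the article describes.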
Moral outsourcing is also used to delegate the consequences of predictive policing algorithms to the technology, rather than to its creators or the police. Predictive policing raises many ethical concerns because it results in the over-policing of low-income and minority communities.[7] In the context of moral outsourcing, the positive feedback loop that sends disproportionate police forces into minority communities is attributed to the algorithm and the data fed into it, rather than to the users and creators of the predictive policing technology.
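The feedback loop can be illustrated with a toy simulation. This sketch is hypothetical (the numbers, the winner-take-all patrol allocation, and the detection rate are invented for illustration and do not model any real deployed system): patrols are sent where past records are highest, extra patrols surface extra recorded incidents there, and the next round reads that as confirmation, so an initial skew in the data compounds regardless of the true underlying crime rate.

```python
# Toy model of a predictive-policing feedback loop. Two areas start with
# unequal *recorded* incidents; the "model" sends patrols wherever the
# record is highest, and patrol presence inflates that area's record.

def simulate(recorded, rounds=5, patrols=100, detection_rate=0.1):
    """Return area 0's share of recorded incidents after each round."""
    shares = []
    for _ in range(rounds):
        # Allocate all patrols to the area with the highest record.
        target = recorded.index(max(recorded))
        # More patrols record more incidents in the targeted area,
        # which the next round treats as evidence of more crime.
        recorded[target] += detection_rate * patrols
        shares.append(recorded[0] / sum(recorded))
    return shares

# Equal policing would keep the 60/40 split; the loop instead widens it.
shares = simulate([60, 40])
```

Each round, area 0's share of the recorded incidents grows, even though nothing in the model says area 0 actually has more crime. Blaming "the algorithm" for this drift, rather than the people who chose the allocation rule and the training data, is the moral outsourcing described above.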
Moral outsourcing is also commonly seen in appeals to religion to justify discrimination or harm. In his book What It Means to Be Moral, sociologist Phil Zuckerman contests the popular religious notion that morality comes from God. Religion is often cited as the foundation for a moral stance without any tangible relation between the religious beliefs and the personal stance.[8] In these cases, religious individuals "outsource" their personal beliefs and opinions by claiming that they follow from their religious identification. This occurs where religion is cited as a basis for political beliefs,[9] medical beliefs,[10] and, in extreme cases, as an excuse for violence.[11]
Moral outsourcing can also be seen in the business world, in the manufacturing of goods and the avoidance of environmental responsibility. Some companies in the United States move their production to foreign countries with more relaxed environmental policies to avoid the pollution laws that exist in the US. A study published in the Harvard Business Review found that "in countries with tight environmental regulation, companies have 29% lower domestic emissions on average. On the other hand, such a tightening in regulation results in 43% higher emissions abroad."[12] The consequences of higher pollution rates are then attributed to the loose regulations in those countries, rather than to the companies that deliberately moved there to avoid strict pollution policy.
Chowdhury has a prominent voice in the discussions about the intersection of ethics and AI. Her ideas have been included in The Atlantic,[13] Forbes,[3] MIT Technology Review,[14] and the Harvard Business Review.[15]
Original source: https://en.wikipedia.org/wiki/Moral_outsourcing