Sally Applin quoted in The Washington Post, March 29, 2022
Hed: The military wants AI to replace human decision-making in battle
Dek: The development of a medical triage program raises a question: When lives are at stake, should artificial intelligence be involved?
Author: Pranshu Verma
“AI is great at counting things,” Sally A. Applin, a research fellow and consultant who studies the intersection between people, algorithms and ethics, said in reference to the DARPA program. “But I think it could set a [bad] precedent by which the decision for someone’s life is put in the hands of a machine.”
Meanwhile, Applin, an anthropologist focused on AI ethics, said that as the program takes shape, it will be important to watch for whether DARPA’s algorithm perpetuates biased decision-making, as has happened in many cases, such as when health-care algorithms prioritized White patients over Black patients for care.
“We know there’s bias in AI; we know that programmers can’t foresee every situation; we know that AI is not social; we know AI is not cultural,” she said. “It can’t think about this stuff.”
And in cases where the algorithm makes recommendations that lead to death, it poses a number of problems for the military and a soldier’s loved ones. “Some people want retribution. Some people prefer to know that the person has regret,” she said. “AI has none of that.”
8 April 2022