
Another demographic which could be subject to unfair targeting by lethal autonomous weapons is people of colour, particularly since facial recognition technologies are significantly less accurate when trying to identify people of colour.

A study by the US National Institute of Standards and Technology found that Asian and African American people were 10 to 100 times more likely to be misidentified by facial recognition technologies, depending on the particular algorithm and type of search.

In everyday life, misidentification can subject innocent people to police scrutiny and/or erroneous charges. In situations where lethal autonomous weapons are in use, whether on or off the battlefield, misidentification could equal a death sentence.

This piece by Joy Buolamwini of the Algorithmic Justice League about the success of the Safe Face Pledge, a “strategic initiative meant to set explicit red lines for unacceptable use [of facial recognition technologies]; raise awareness about the weaponisation of FRTs; broaden the conversation about the harms of FRTs; and challenge companies to make actionable, measurable commitments beyond stating their AI ethics principles”, is definitely worth a read. https://medium.com/…/announcing-the-sunset-of-the-safe-face…


Issue #7 with lethal autonomous weapons that can identify, select and apply force to a target without human approval is the risk of unjustly targeting certain individuals at a disproportionate rate, which also leads to an increase in civilian casualties.

International humanitarian law prohibits warring parties from targeting individuals on the basis of their race, colour, sex, religion, gender identity, etc., but in many settings, these are the sorts of simplistic criteria which lethal autonomous weapons will use to pick out potential targets.

One plausible outcome is military-aged civilian men being denied the protection against attack afforded to non-male civilians, simply because they fit the generic description of an enemy combatant and happened to be in the wrong place at the wrong time.

In an earlier episode of the FLI podcast, Ray Acheson, Director of the Women's International League for Peace and Freedom Disarmament Program, talks about this topic in quite some detail. In this short clip, she highlights another group that may suffer from unfair targeting: the trans community. https://futureoflife.org/2019/04/02/fli-podcast-why-ban-lethal-autonomous-weapons/

“We struggled for 70-odd years to contain nuclear weapons and prevent them from falling into the wrong hands. In large quantities, [lethal autonomous weapons] would be as lethal, much cheaper, much easier to proliferate.” - Stuart Russell, Professor of Computer Science at UC Berkeley. Listen to the full episode here: https://futureoflife.org/2021/02/25/stuart-russell-and-zachary-kallenborn-on-drone-swarms-and-the-riskiest-aspects-of-lethal-autonomous-weapons/
In the most recent episode of the Future of Life Podcast, Zachary Kallenborn, an expert on drone swarms and weapons of mass destruction, explains how error risks around targeting become more complex as you scale up lethal autonomous weapons - “If you have 10,000 drones, even a really small error rate may end up resulting in all sorts of error." You can listen to the episode in full or read the transcript here: https://futureoflife.org/2021/02/25/stuart-russell-and-zachary-kallenborn-on-drone-swarms-and-the-riskiest-aspects-of-lethal-autonomous-weapons/