MEASURING AUTONOMOUS WEAPON SYSTEMS AGAINST INTERNATIONAL HUMANITARIAN LAW RULES
Published in: Journal of Law & Cyber Warfare, July 2016, Vol. 5(1), pp. 66-146
Format: Article
Language: English
Summary: In this paper, I argue that Autonomous Weapon Systems (AWS) will not be able to comply with important rules of international humanitarian law (IHL) such as distinction, proportionality, and military necessity. Currently, it is impossible to develop a lethal robot that can comply with IHL rules because IHL terms such as "civilians", "combatants", and "direct participation in hostilities" have no precise definitions that can be encoded into a machine. Further, in many instances, applying these rules requires human judgement, which machines do not possess. Even in cases where AWS may comply with some rules of IHL, I argue that AWS still violate the right to dignity, which demands that the decision to use force against another human be taken by a fellow human. To this end, I note that the issue is not about comparing whether robots can perform better than humans, but whether they should be allowed to act as combatants in the first place.
ISSN: 2578-6245; 2578-6229