A working definition of “lethal autonomous weapons systems” (LAWS), suitable for negotiation and treaty language, may be drawn from the way the Ottawa Convention banning antipersonnel landmines defines those weapons:

“‘Anti-personnel mine’ means a mine designed to be exploded by the presence, proximity or contact of a person and that will incapacitate, injure or kill one or more persons.”

This definition can be parsed into two halves. The second half describes the weapon’s lethal effects on persons. The first half is more interesting: it describes the mine as being “designed to be exploded by the presence, proximity or contact of a person.” Rather than being triggered by designated and accountable operators, mines are triggered by their victims. If autonomy is defined broadly as acting without human control, guidance or assistance, mines should therefore be considered LAWS. Legacy mines are extremely simple in comparison with robots and artificial intelligence, but there is no reason more advanced landmine systems could not incorporate artificial intelligence to discriminate targets more accurately and perhaps tailor responses to varied situations. Surely these would be of interest as potential LAWS.
Read the full report by Mark A. Gubrud here.