The UK should realign its definition of autonomous weapons to be the ‘same, or similar, as that used by the rest of the world’, according to recommendations by the House of Lords Select Committee on Artificial Intelligence.

It warned that without an agreed definition the UK could find itself ‘stumbling through a semantic haze into dangerous territory’.

The wide-ranging report, ‘AI in the UK: ready, willing and able?’, collected evidence over 10 months from more than 200 witnesses including government officials, academics and companies and explored the use of AI across sectors ranging from healthcare to transport.

It found that the definition of autonomous weapons varied significantly between countries and organisations.

The most up-to-date UK definition, provided by the Ministry of Defence (MOD), states that an autonomous weapons system must be ‘capable of understanding higher-level intent and direction’.

This differs from an automated system, which is programmed to follow a set of rules and whose outcome can therefore always be predicted.


Expert witness Noel Sharkey, professor of AI and robotics at the University of Sheffield, argued that requiring an autonomous weapons system to be ‘aware and show intention’, as stated in MOD guidance, sets the bar so high that the definition is effectively meaningless.

“By saying that to be considered autonomous a weapon needs to have intention and show awareness, they give the impression that such weapons are unlikely to exist for a long time, if ever,” Sharkey told Army Technology.

“This is picking up on a non-technical meaning of the word ‘autonomous’, related to free will. In the robotics community, it is just a term for a robot with sensors controlled only by a computer programme.”

This high standard was found to be out of step with definitions used by most other governments. Countries including the US, Norway and France focused on the level of human involvement, rather than the capability for ‘understanding higher-level intent and direction’.

The report found that this ambiguity limits the extent to which the UK can meaningfully participate in international debates on autonomous weapons, such as the meeting of the UN Convention on Certain Conventional Weapons Group of Governmental Experts in November 2017. Of the 86 countries that participated in the convention, 22, including Brazil, Uganda and Iraq, supported a prohibition on fully autonomous weapons.

“At the UN most states use some form of the definition that an autonomous weapon is one that, once launched, will operate without further human intervention, or a weapons system that is autonomous in the critical functions of target selection and engagement,” said Sharkey.

“The Lords agreed that the heart of the matter is the level of human control of weapons systems.”

Secretary of State for Digital, Culture, Media and Sport Matt Hancock, who appeared as a witness before the committee, agreed that there is no internationally agreed definition of autonomous weapons systems, but said he believes existing international laws are adequate.

“We think that the existing provisions of international humanitarian law are sufficient to regulate the use of weapons systems that might be developed in the future,” said Hancock.

“Of course, having a strong system and developing it internationally within the UN Convention on Certain Conventional Weapons is the right way to discuss the issue.

“Progress was made in Geneva by the group of government experts just last month. It is an important area that we have to get right.”

Expert witnesses generally agreed that the key factor was the level of human oversight of autonomous weapons, but it also emerged that the ‘human-in-the-loop’ model can mean many things with regard to semi-autonomous weapons.

The report concluded that the government should convene a panel of military and AI experts to agree a revised definition within eight months of the publication of the report.

“There is a great opportunity for the UK to take international leadership on the issue of autonomous weapons systems at UN treaty meetings when it is unhampered by a thoroughly faulty definition,” said Sharkey.

“We need the UK to focus on a positive obligation to keep weapons under meaningful human control to hold the respect of the international community of nations.”