Army Technology interviewed Lord Robert Lisvane, Chair of the House of Lords inquiry into AI-enabled autonomous weapon systems (AWS), which published a report on 1 December recommending that the UK government seek public support for wider use of AWS in defence.

The report, ‘Proceed with Caution: Artificial Intelligence in Weapon Systems’, also recommended that meaningful human oversight be embedded at every stage of development and deployment, and that artificial intelligence be prohibited in nuclear command, control and communications.

“Although AI enabled intelligence gathering may feed into political and military decisions made, particularly in a period of heightened tension, we were very clear that the escalatory risks of consigning decisions to artificial intelligence in the nuclear sphere were not acceptable.”

MoD procurement needs a ‘complete reset’ for AI-enabled AWS

Lord Lisvane said the Committee’s evidence indicates that procurement for AI-enabled AWS needs a ‘complete reset’ to keep pace with constantly evolving capabilities and requirements.

The report is explicit that the Ministry of Defence needs the capacity, either in-house or on contract, to assess bids ‘on tap’ and to impose meaningful conditions during through-life support of the systems. One upshot for defence economics, Lord Lisvane notes, is that with the rate of adaptation so much faster, “gold plating may actually not be very easy to do”, implying that prestige assets may become an alien concept to the sector.

“If you’re looking at AI-enabled AWS, it’s such a long way from procuring a fast jet or a new frigate. Whereas the timeline from operational requirement to tendering, contract, trials and deployment could be as long as thirty years, we’re now talking in terms of a period of procurement, possibly going really quite rapidly, to potential deployment which could be measured in months.”


Potential for reduction in collateral damage

Witnesses at the inquiry offered evidence that AWS, with appropriate controls, may reduce casualties normally defined as collateral damage, offering commanders fires that are more selective and precise. But as Lord Lisvane notes, this should be weighed in the balance: “If you’re dealing with an AWS and not with, let us say, a human-piloted aircraft, an attack might be pressed home with more fervour and determination in the AWS mode than it would [with] pilots who are worrying about being shot down.”

Central to the report’s findings is the recommendation that the government fully define AI-enabled autonomous weapon systems, so that legislators can use this definition to enact regulation.

The Committee offered two levels of definition. ‘Fully’ autonomous weapon systems are systems that, once activated, can identify, select and engage targets with lethal force without further intervention by an operator, while ‘partially’ autonomous weapon systems are defined as systems featuring varying degrees of decision-making autonomy in critical functions such as identification, classification, interception and engagement.

“We’re pretty confident that our definition is a robust one, and resilient, at least against foreseeable change,” said Lord Lisvane, crediting its practical worth to the avoidance of technical detail. “We’ve simply said, ‘If the system exhibits these behaviours, it will fall into the definition of either fully or partially autonomous.’” He outlines the benefit of a stable definition as laying the foundation for international agreement on AI-enabled AWS, but adds that revision is also a consideration. “If circumstances meant that something wholly unforeseen was developed, then you’d need also, I think, to have agreement on how a definition should be amended.”