The US Office of Naval Research (ONR) has offered a $7.5m grant to researchers from universities including Tufts, Rensselaer Polytechnic Institute, Brown, Yale and Georgetown to explore the development of robots with autonomous moral reasoning ability.

ONR cognitive science programme director Paul Bello was quoted by Defense One as saying that even though today’s unmanned systems are ‘dumb’ in comparison to a human counterpart, progress is being made to incorporate more automation at a faster pace.

"As researchers, we are playing catch-up trying to figure out the ethical and legal implications." Bello said. "We do not want to be caught similarly flat-footed in any kind of military domain where lives are at stake."

"Semi-autonomous robots will not be able to choose and engage particular targets or specific target groups until they are selected by an authorised human operator."

The US Department of Defense (DoD) has, however, prohibited the use of lethal, fully autonomous robots, while researchers say that semi-autonomous robots will not be able to choose and engage particular targets or specific target groups unless those targets have been selected by an authorised human operator.

"Even if such systems aren’t armed, they may still be forced to make moral decisions," Bello added. He also noted that in a catastrophic scenario, the machine might have to decide who to evacuate or treat first.

While the envisioned systems could see extensive use in first-response and search-and-rescue missions, and even in the medical domain, Bello added that the idea of in-theatre robots remains under consideration.

Some advanced drones, such as BAE Systems’ Taranis and Northrop Grumman’s X-47B, already have some self-direction programmed into them.

According to Wendell Wallach, author of ‘Moral Machines: Teaching Robots Right From Wrong’, some types of morality are simpler and easier to code than others.

"There’s operational morality, functional morality, and full moral agency," Wallach said.
