
98 | August/September 2018 | Unmanned Systems Technology

PS | ‘Weaponised’ AI

Artificial intelligence (AI) is fundamental to the future of unmanned vehicle systems, but worries persist over its use for lethal purposes (writes Peter Donaldson). The latest manifestation of this anxiety is Google’s case of employee-driven cold feet about the US Department of Defense’s Project Maven. Formally known as the Algorithmic Warfare Cross-Functional Team, its purpose is to find terrorist targets automatically in imagery generated by UAVs.

In April 2017, spurred by its inability to cope with the flood of UAV imagery and impressed by commercial AI technology developed in universities and deployed by the likes of Google and Amazon, the DoD turned to the private sector to adapt the technology for military purposes. Doing so would also allow the DoD to benefit from a stream of private investment in the technology, estimated at $36 billion for the previous year.

The first phase of Project Maven focused on computer vision and deep learning using biologically inspired neural networks. The aim was to autonomously identify 38 classes of objects representing the kinds of things the US military needs to detect, and to deploy an initial capability by the end of 2017. The goal of the ongoing second phase is to generate actionable intelligence and decision-quality insights at speed, according to the DoD.

Before they can generate such insights, however, algorithms have to be trained, and data labelling is a key part of that process. Here, people analyse sample sets of imagery data and apply tags, or labels, that tell the algorithms what kinds of objects are depicted in the images – people, vehicles, animals, buildings, weapon systems and so on. Once trained, the algorithms will flag up objects of interest in new imagery data for further attention from analysts.
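That label-then-flag workflow can be sketched very loosely in code. This is not Project Maven’s pipeline: the toy feature vectors, the two class names and the nearest-centroid ‘model’ below are illustrative stand-ins for the deep neural networks and 38 object classes the DoD describes, chosen only to show the shape of supervised training on analyst-labelled data.

```python
# Illustrative only: analyst-labelled samples as (feature vector, label) pairs.
# In practice the features would come from a deep network, not be hand-written.
LABELLED_SAMPLES = [
    ((0.9, 0.1), "vehicle"),
    ((0.8, 0.2), "vehicle"),
    ((0.1, 0.9), "building"),
    ((0.2, 0.8), "building"),
]

def train(samples):
    """'Training': compute one mean feature vector (centroid) per label."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in acc)
            for label, acc in sums.items()}

def flag(model, features):
    """Inference: tag a new image chip with the nearest learned class,
    so a human analyst can review the flagged object."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(model, key=lambda label: dist(model[label]))

model = train(LABELLED_SAMPLES)
print(flag(model, (0.85, 0.15)))  # a vehicle-like chip -> "vehicle"
```

The point of the sketch is the division of labour the article describes: humans supply the labels once, and the trained model then triages new imagery, leaving analysts to confirm what it flags.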
Lt Col Drew Cukor, leader of the Maven team in the DoD, said people and computers would work symbiotically to increase weapon systems’ ability to detect objects. He expressed the hope that analysts – a scarce resource – will eventually be able to do twice or even three times as much work as they could without Maven technology. He also emphasised that while AI is ready to help computer vision systems recognise objects, it is not ready for more demanding and critical tasks such as selecting targets in combat, and won’t be ready any time soon. Instead, AI will complement the human operator, he said.

However, the direction of travel seems clear to more than 3000 Google employees, who petitioned CEO Sundar Pichai over concerns about their involvement in developing ‘warfare technology’ and about what they referred to as “biased and weaponised AI”. Google responded by effectively pulling out of the project, announcing in June 2018 that it would not renew its Maven contract with the DoD when it expires in March 2019.

Whether this marks the beginning of a sea change in attitudes to military AI development, though, remains to be seen.

Now, here’s a thing