
The vehicle relies on simultaneous localisation and mapping (SLAM) software for navigation. As the name suggests, the SLAM algorithm – currently under development – creates a map of its surroundings and works out the vehicle’s location within it based on data from a suite of sensors, in this case a combination of stereoscopic cameras, Lidar and an INU for dead reckoning.

Selecting sensors

The sensors fitted to the development vehicle have been chosen to keep development costs focused on innovation, but with an understanding of what the military-spec equivalents will be and with the adoption of the UK MoD’s Generic Vehicle Architecture (GVA) standard for power, data and control interfaces. Other reference standards include Mil-Std 810 for hardware and Stanag 4586 for system design.

“We can do a lot of what we want to do using less expensive sensors than those for the military, and in any case, by the time we come to integrate the defence-standard sensors the state of the art will have moved on by two or three years,” Austen says. “We know we can get a defence-standard system, and the software and interface standards we are using mean they will be compatible.”

Supacat showed us a recording of the view from the vehicle’s sensors, combined and compressed into a representation of the terrain in colour-coded range bands – yellow shows obstacles far away, green nearer and blue very close – which is good enough for training both a human teleoperator and the AI software (a minimal sketch of this kind of banding appears below). That massively reduces the amount of data that the vehicle’s onboard network and data links have to handle, while providing enough information to negotiate the terrain safely and avoid obstacles, Austen says.

He adds that this approach will enable Supacat to understand how the vehicle runs in this environment and to push the limits of its capabilities on a range of terrain types. “We have a series of courses that we know the baseline manned ATMP should be able to handle without a problem,” he says. “We also have areas that are out of bounds, and with a remotely operated vehicle we can afford to risk rolling it over to improve our understanding of its performance.”

Autonomy architecture

The developmental autonomy software runs on a rugged Nvidia Jetson TX2, which the manufacturer describes as a Mini-ITX board with CUDA cores that allow the user to configure their own software architecture. A set of AI tools and workflows supports efficient experimentation with neural networks, freeing up system resources for other parallel tasks.

Autonomy sits at the top of the vehicle’s software architecture. Lower down the hierarchy are the functional modules that enable local control by an onboard driver, and they interface with the same system that enables teleoperation. The idea is that it makes no difference to the system whether the human operator is manipulating controls aboard the vehicle or doing so via a radio link.

Downstream, the teleoperation module interfaces with the motion control module, which commands the drivetrain actuation module and the SLAM module. The latter is divided into localisation and mapping sub-modules (the second sketch below illustrates this layering).

The next layer of functionality resides in the sensor interface unit. It includes the proximity sensors, the inertial reference unit and the global reference unit (GNSS) that support the localisation module, plus the vision suite and Lidar that support mapping, along with the control system algorithms that command the drivetrain’s actuation system.
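A minimal sketch can illustrate the colour-coded range banding described above. This is not Supacat’s code: the band thresholds, the choice of Python and NumPy, and all names are assumptions for illustration, showing how a per-pixel range image might be collapsed into three bands that are far cheaper to send over the vehicle’s network and data links than raw stereo or Lidar depth.

# Illustrative sketch only: band edges and names are assumptions, not Supacat's values.
import numpy as np

# Hypothetical band edges in metres: below 5 m is blue (very close),
# 5-20 m is green (nearer), beyond 20 m is yellow (far away).
BAND_EDGES_M = [5.0, 20.0]
BAND_COLOURS = np.array([
    [0, 0, 255],       # blue: very close
    [0, 255, 0],       # green: nearer
    [255, 255, 0],     # yellow: far away
], dtype=np.uint8)

def band_depth_map(depth_m: np.ndarray) -> np.ndarray:
    """Collapse a per-pixel range image (metres) into a three-band colour map.

    Each pixel keeps only its band index, so the result is much smaller to
    transmit than raw depth data while still showing which obstacles are
    close enough to matter.
    """
    band_idx = np.digitize(depth_m, BAND_EDGES_M)   # 0, 1 or 2 per pixel
    return BAND_COLOURS[band_idx]                   # H x W x 3 colour image

# Example: a synthetic 2 x 3 patch of ranges in metres.
depth = np.array([[2.0, 8.0, 30.0],
                  [4.5, 18.0, 50.0]])
print(band_depth_map(depth))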
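The layering described above, in which an onboard driver, a teleoperator or the autonomy layer all feed the same command interface, with motion control commanding drivetrain actuation and a SLAM module split into localisation and mapping sub-modules, can be sketched in a similarly hedged way. All class and method names below are invented for illustration; the article does not describe Supacat’s actual interfaces.

# Illustrative sketch only: every name here is invented, not Supacat's software.
from abc import ABC, abstractmethod

class CommandSource(ABC):
    """Onboard controls, a radio teleoperation link or the autonomy software
    all produce the same demand, so the layers below cannot tell them apart."""
    @abstractmethod
    def demand(self) -> tuple[float, float]:
        """Return a (speed, steering) demand."""

class OnboardDriver(CommandSource):
    def demand(self): return (1.0, 0.0)      # stub: read local controls

class Teleoperator(CommandSource):
    def demand(self): return (1.0, 0.1)      # stub: read radio-link packet

class SlamModule:
    """Split into localisation and mapping sub-modules, each fed by its own sensors."""
    def localise(self): ...                  # INU, GNSS and proximity sensors
    def update_map(self): ...                # stereo cameras and Lidar

class AutonomyLayer(CommandSource):
    def __init__(self, slam): self.slam = slam
    def demand(self):
        pose = self.slam.localise()          # where is the vehicle on the map?
        return self.plan_step(pose)
    def plan_step(self, pose): return (0.5, 0.0)   # stub planner

class MotionControl:
    """Turns a demand into drivetrain actuation commands."""
    def apply(self, demand): print("actuate drivetrain:", demand)

# The vehicle loop is identical whichever CommandSource is plugged in.
source = Teleoperator()
MotionControl().apply(source.demand())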
The foundations on which the company is building the vehicle’s autonomy are path planning, obstacle categorisation, sensor modelling and the SLAM algorithms. The path-planning function will rely on available data to collaborate with and follow other vehicles, requesting any extra data it needs from the mission planning tool or asking for human assistance. Meanwhile, its path monitoring function will use GNSS waypoints if satellite navigation is available, or checkpoints otherwise (a sketch of this fallback follows below). It will also check the desired path for terrain or gradients that might cause it problems. The obstacle categorisation software will rely on neural network-defined …

(Image caption: The ATMP’s motor controllers have isolated 12 V logic and provide energy-efficient control of the asynchronous induction motors.)
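The waypoint/checkpoint fallback and the gradient check described above can be sketched as follows. The gradient limit and every function name here are assumptions for illustration, not details published by Supacat.

# Illustrative sketch only: limits and names are assumptions.
MAX_GRADIENT = 0.45   # hypothetical slope limit (rise/run) before asking for help

def next_reference(gnss_fix, waypoints, checkpoints):
    """Prefer GNSS waypoints when a fix is available; otherwise fall back to checkpoints."""
    return waypoints.pop(0) if gnss_fix and waypoints else checkpoints.pop(0)

def check_segment(start_elev_m, end_elev_m, length_m):
    """Return True if a path segment's gradient is within the assumed limit."""
    gradient = abs(end_elev_m - start_elev_m) / max(length_m, 1e-6)
    return gradient <= MAX_GRADIENT

# Example: a 10 m segment climbing 6 m exceeds the assumed limit, so the
# planner would request extra data or human assistance.
print(check_segment(100.0, 106.0, 10.0))   # False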
