
There are also a number of passive global alignment modules, developed by lead computer vision scientist Dongjiang Xu. These help to adjust for long-term errors in the maps being generated, as well as in the position of the ABI unit (and hence the vehicle) in an indoor space.

For obstacle detection and avoidance, a dynamics model of the vehicle chassis must also be input so that the ABI can calculate the boundaries of the vehicle's trajectory. "That can be done when the ABI and vehicle are first set up, using the GUI software," Xu says. "It typically involves entering common vehicle parameters such as wheel-to-wheel distance, the location of the ABI unit relative to the vehicle's centre or pivot point, the rotor wheelbase and so on."

Brains of the brain

The ABI unit uses multiple sensors and filters before communicating its final position readings to an internal state estimator. The control software then determines and sends commands to the vehicle's motor controller, with the customer typically taking over at the motor control level.

"We can work with customers to take control of the whole vehicle-side stack, including actuation," adds Sanchez. "For further security, a customer can set limits on which areas of the indoor map the vehicle is allowed to go to. They can also set hard output limits on the maximum acceleration permitted in any direction, as well as on speed of movement.

"Of course, if customers are unfamiliar with setting up that portion of robotic vehicle systems, we also offer custom integration assistance to handle the configuration of motors, motor drives and so on."

The four pairs of stereo cameras are connected to an Nvidia Jetson TX2 board via an FPGA that provides rapid front-end processing of the visual data; the IMU is also installed on this board. As it happens, the TX2 forms the mid-section of a considerable group of computer boards needed to provide the ABI's extensive autonomy and GNSS-denied navigation.
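The article does not publish Vtrus' software, but the safety limits Sanchez describes, restricting the vehicle to allowed map areas and hard-capping speed and acceleration in any direction, can be sketched roughly as follows. All function and parameter names here are hypothetical illustrations, not Vtrus' API:

```python
import math

def clamp_command(vx, vy, ax, ay, max_speed, max_accel):
    """Scale a commanded velocity/acceleration vector down to hard limits.

    Scaling the whole vector (rather than clipping each axis) preserves
    the commanded direction while enforcing the cap "in any direction".
    """
    speed = math.hypot(vx, vy)
    if speed > max_speed:
        scale = max_speed / speed
        vx, vy = vx * scale, vy * scale
    accel = math.hypot(ax, ay)
    if accel > max_accel:
        scale = max_accel / accel
        ax, ay = ax * scale, ay * scale
    return vx, vy, ax, ay

def in_allowed_area(x, y, allowed_zones):
    """Check a target waypoint against operator-defined rectangular map regions.

    Each zone is a hypothetical (x0, y0, x1, y1) axis-aligned rectangle.
    """
    return any(x0 <= x <= x1 and y0 <= y <= y1
               for (x0, y0, x1, y1) in allowed_zones)
```

A real implementation would apply such checks continuously along the planned path rather than only at the waypoint, but the clamping principle is the same.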
Two UART ports on the TX2 provide comms with a Pixhawk Mini flight control unit on one side, and a Teensy 4.0 microcontroller on the other. The TX2 also comes with a USB 3.0 hub for connecting further external computers and sensors, as well as an additional UART port specifically for debugging.

The TX2 is important for many reasons, but most crucial is its GPU speed and power, which are vital for quickly processing computer vision tasks, 3D area reconstructions and path planning. Sanchez adds, "The ability to do this inside the ABI is paramount for safety, and to ensure seamless autonomy. Some areas inside industrial facilities could cause a loss of signal, which would severely hamper any remotely processed tasks, such as via cloud or edge computing."

Compared with the TX2, the Pixhawk flight controller is less vital to operations – in fact, it is only used as a low-level rate and attitude controller for multi-rotor UAV applications. Its autopilot functionality is never used; instead it simply obeys high-level commands such as the desired pitch, roll or acceleration. For UGV applications, the Pixhawk can be bypassed altogether.

"The Teensy 4.0 board serves as a dedicated low-level microcontroller for handling time-sensitive comms, such as I2C or 1-Wire communication, and protocols such as PWM or some addressable LED languages," says Vtrus' CTO and lead electrical designer Jonathan Lenoff. "It also offers flexibility for low-level interfaces via digital and analogue I/Os."

This board also enables straightforward implementation of simple interfaces such as SBUS, PWM or PPM, which are widely used for controlling various robotic actuators – stepper motor drivers, brushless DC motor ESCs and servos, for example. It can also be programmed using Arduino software, to offer a more familiar programming route for many engineers, surveyors and other potential end-users.

Naturally, the various functions and comms in the ABI head are grouped into discrete algorithmic blocks.
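The PWM interfaces mentioned above conventionally encode an actuator command as a pulse width of roughly 1,000-2,000 µs, with 1,500 µs as neutral; this is the scheme most hobby-grade servos and ESCs expect. As a minimal sketch (the function name and the normalised [-1, 1] command range are assumptions for illustration, not Vtrus' interface):

```python
def command_to_pwm_us(cmd, min_us=1000.0, max_us=2000.0):
    """Map a normalised command in [-1, 1] to a servo/ESC pulse width in µs.

    -1 maps to min_us, 0 to the midpoint (neutral), +1 to max_us;
    out-of-range inputs are clamped to the hard limits first.
    """
    cmd = max(-1.0, min(1.0, cmd))
    return min_us + (cmd + 1.0) * 0.5 * (max_us - min_us)
```

SBUS and PPM carry the same kind of channel values, just serialised differently (SBUS as an inverted serial frame of many channels, PPM as a pulse train), which is why a microcontroller like the Teensy can generate all three from one command representation.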
At the front end, all the sensor data – visual, inertial, depth, Lidar and any added external sensors – are gathered into one block for feeding into the core estimator block for positioning and localisation, and also into the customer's vehicle dynamics model. The core estimator feeds its calculations directly to the optimisation engine for obstacle avoidance and collision-free path planning. That engine then sends its pathing and movement commands to the microcontroller or the Pixhawk, depending on whether the ABI is being used on a UGV or a UAV.

In addition to a variety of widely used interface ports, Vtrus can incorporate a number of lesser-known interfaces on the ABI.

June/July 2020 | Unmanned Systems Technology
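The final dispatch step described above – the optimisation engine routing its output to the Pixhawk for a UAV or to the microcontroller for a UGV – can be sketched as follows. The differential-drive mixing shown for the UGV case is standard kinematics using the wheel-to-wheel distance entered at setup; all names and the command format are hypothetical, not Vtrus' actual interfaces:

```python
def wheel_speeds(v, omega, track_width):
    """Standard differential-drive mixing: linear speed v (m/s) and yaw
    rate omega (rad/s) into (left, right) wheel speeds, using the
    wheel-to-wheel distance (track width, m)."""
    return v - omega * track_width / 2.0, v + omega * track_width / 2.0

def route_command(platform, cmd, track_width=0.4):
    """Route the planner's output to the platform's low-level controller.

    UAV: forward high-level attitude/thrust targets to the Pixhawk.
    UGV: bypass the Pixhawk and send mixed wheel commands to the
    microcontroller instead.
    """
    if platform == "UAV":
        return ("pixhawk", {k: cmd[k] for k in ("pitch", "roll", "thrust")})
    left, right = wheel_speeds(cmd["v"], cmd["omega"], track_width)
    return ("microcontroller", {"left": left, "right": right})
```

The point of the split is that only the multi-rotor case needs a dedicated rate/attitude loop; for a ground vehicle the planner's output maps almost directly onto wheel commands.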
