
and software terms) and offer it as an open-architecture solution inside a single enclosure. Developers with no robotics programming skills can simply connect the unit to their vehicle chassis, and start creating indoor waypoint routes and assigning mission tasks.

The ABI unit is compatible with UGVs with a differential drive (for tank-steer or skid-steering) as well as multi-rotor UAVs with up to eight motors. It weighs a little under 780 g, measures about 20 x 15 x 6 cm, and consumes a maximum of 85 W of power through a voltage input range of 12-17 V DC. It comes with various I/O ports including HDMI, USB 3.0 and 2.0, I2C, UART and SPI, as well as 32 Gbytes of onboard storage and room for additional SD cards.

All this adds up to an open architecture that enables end-users to build a vehicle chassis optimised for whatever operating endurance, speed or payload capacity (within reason) they want their system to have, rather than being limited to currently available autonomous indoor inspection systems.

As standard, it can detect and classify objects up to 5 m away, and also contains firmware for integrating an optional 2D Lidar system weighing roughly 100 g, which will extend the object detection range to 40 m. All it needs from the customer is a battery pack, a vehicle body frame, and wheel or propeller motors.

"Out of the box, the ABI comes with all the Vtrus software necessary for perception, planning and control," lead software engineer Jeston Furqueron explains. "To downlink and operate using real-time 3D mapping visualisation, and engage in 3D mission waypoint planning, we've produced a GUI software program we call Vtrus Universe, which the end-user can install and operate on a handheld tablet computer."

Sensor architecture

The ABI's localisation system is based on a hybrid combination of dense and sparse SLAM (simultaneous localisation and mapping) methods using visual inertial data from multi-view, time-synchronised and IMU-calibrated cameras.
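The differential-drive compatibility mentioned above comes down to a simple kinematic mapping: a commanded forward speed and turn rate are converted into independent left and right wheel speeds. The function below is an illustrative sketch of that mapping, not Vtrus code; the function name and parameters are hypothetical.

```python
def diff_drive_wheel_speeds(v, omega, track_width):
    """Convert a body velocity command into wheel speeds for a
    differential-drive (tank-steer / skid-steer) chassis.

    v           -- forward speed of the vehicle body (m/s)
    omega       -- yaw rate, positive counter-clockwise (rad/s)
    track_width -- lateral distance between the wheel centres (m)
    """
    # Turning adds speed to the outer wheel and subtracts it
    # from the inner wheel, in equal measure.
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right
```

With zero forward speed the two wheels spin in opposite directions, producing the on-the-spot "tank turn" that skid-steer platforms are known for.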
“We’ve integrated a 100 Hz IMU from Bosch into our state estimation and aiding software, which outputs a final set of readings for position, velocity and orientation at about 30 Hz,” says Sanchez.

The IMU incorporates an accelerometer and a gyroscope, both three-axis and 16-bit. The accelerometer outputs data at a resolution of 0.09 milli-g and the gyroscope at 0.004 °/s; the overall system typically draws a current of 5.15 mA.

Sanchez continues, “Our system relies on visible light, but we have integrated a laser depth sensor as well. There is also an optional top-mountable Lidar module, so we can still have some level of performance in dark conditions, although we recommend a minimum of 60 lux of brightness in the areas and directions the cameras are pointing.”

For the visual aspect of SLAM data

Vtrus Autonomy Brain Implant | Digest
Unmanned Systems Technology | June/July 2020

[Image caption: The Autonomy Brain Implant is designed for indoor vehicles without needing autopilot, SLAM, or mission planning technologies]
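Treating the resolutions quoted above (0.09 milli-g, 0.004 °/s) as per-count scale factors for the 16-bit sensor outputs, which is an assumption on my part since the article only calls them "resolutions", the conversion from raw register counts to physical units is a straightforward multiplication, sketched here:

```python
# Assumed per-LSB scale factors, taken from the resolutions
# quoted in the article (not confirmed as register scaling).
ACCEL_MILLI_G_PER_LSB = 0.09
GYRO_DEG_S_PER_LSB = 0.004

def raw_to_physical(raw_accel, raw_gyro):
    """Convert signed 16-bit IMU counts to physical units.

    raw_accel -- (x, y, z) accelerometer counts
    raw_gyro  -- (x, y, z) gyroscope counts
    Returns accelerations in g and angular rates in deg/s.
    """
    accel_g = tuple(c * ACCEL_MILLI_G_PER_LSB / 1000.0 for c in raw_accel)
    gyro_dps = tuple(c * GYRO_DEG_S_PER_LSB for c in raw_gyro)
    return accel_g, gyro_dps
```

At these scale factors, a raw accelerometer count of 1000 corresponds to 0.09 g, and a raw gyro count of 250 corresponds to 1 °/s.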
