
… look at the door areas and also to the front, rear and sides. Naturally, the more object-dense an area (such as a campus with a lot of pedestrians and other vehicles), the higher the burden on the sensors in detecting and tracking everything moving in the surroundings and on the road ahead.

Depending on the difficulty of the area to be navigated on a routine basis, Auve Tech will switch between different Lidar units for the upper mounting spot.

"The longer the range we have, the smarter and more gently we can program our manoeuvres to be in response to high-speed obstacles far away," Mossov says. "We started with Velodyne's Puck Lidars but we've begun adopting Hesai's products too.

"They both have similar ranges and densities. The important thing for us is being able to run analytics on the point clouds, to recognise which clusters of points correspond to a moving object and then interpolate what its velocity is, even when far away, when the picture given by the Lidar becomes imprecise because you have fewer points to work with."

Sensor intelligence

As mentioned, the Iseauto is currently rated to SAE Level 4 autonomy, in that if an issue is detected it is capable of parking itself safely. This is helped to an extent by its application as a low-speed vehicle running on closed, predetermined routes. If a problem occurs, the electric motor can be shut off and the vehicle will come to a halt in about 2.5 m.

"When we go to more complicated routes though, the latest design of the Iseauto will be key," Mossov adds. "We've duplicated every vital system, so if for instance one computer dies, another one takes over. We expect that in most cases we'll be required to program it specifically to pull over slowly and park when that happens, rather than continue its journey. And if both systems fail, we can have a teleoperator remotely take over and steer it to a safe location."

The company has also worked over the past year on fusing the data streams from the cameras and the Lidars. The former's ability to sense colour, combined with the latter's depth and shape sensing, considerably enhances the main computer's ability to recognise and classify objects, to tag and box them individually for tracking, and to calculate the trajectories they might take (assuming the object is a pedestrian, cyclist or similar).

Notably, the vision AI so far does not need to recognise street signs: working in pre-mapped areas means the vehicle can be pre-programmed with rules on where to stop, where to give way, and how to comply with speed limits and other regulations.

On the rare occasion that a traffic light is encountered, a low-duty remote comms beacon can be installed between the traffic light and the vehicle, at far lower cost and set-up time than developing recognition of traffic light colours and the correct behaviours.

[Caption: Localisation comes primarily from 3D models produced and checked in real time against pre-established models of known areas, using Lidars at the top corners, while a Lidar on the front bumper serves as a safeguard against frontal collisions, signalling emergency braking if needed.]
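The point-cloud analytics Mossov describes, grouping Lidar returns into clusters and inferring each cluster's velocity across successive frames, might be sketched as below. This is an illustrative outline rather than Auve Tech's pipeline: the naive distance-threshold clustering and nearest-centroid matching stand in for whatever production methods are used, and scans are assumed to arrive as Nx3 NumPy arrays at a known frame interval.

```python
import numpy as np

def cluster_points(points, eps=0.7):
    """Naive single-linkage clustering: greedily grow clusters of points
    closer than eps metres. Stands in for a real method such as DBSCAN."""
    unassigned = set(range(len(points)))
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            idx = frontier.pop()
            near = [j for j in unassigned
                    if np.linalg.norm(points[j] - points[idx]) < eps]
            for j in near:
                unassigned.remove(j)
                cluster.append(j)
                frontier.append(j)
        clusters.append(points[cluster])
    return clusters

def track_velocities(prev_centroids, points_now, dt):
    """Match each current cluster centroid to the nearest previous one
    and estimate velocity as displacement over the frame interval dt."""
    if not prev_centroids:
        return []
    tracks = []
    for cl in cluster_points(points_now):
        c = cl.mean(axis=0)
        nearest = min(prev_centroids, key=lambda p: np.linalg.norm(c - p))
        tracks.append((c, (c - nearest) / dt))  # (position, velocity m/s)
    return tracks
```

Note how a distant object contributes only a handful of points, so its centroid, and hence its estimated velocity, grows noisier with range, which is exactly why longer-range units matter.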
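Mossov's description of duplicated vital systems maps onto a classic hot-standby pattern. The sketch below shows the supervisory logic only, with invented names and an assumed 0.5 s heartbeat timeout: if the primary computer stops heartbeating the standby takes over and pulls over to park, and if both go silent, control passes to a teleoperator.

```python
import time

HEARTBEAT_TIMEOUT_S = 0.5  # assumed value, not from the article

class FailoverSupervisor:
    """Minimal hot-standby logic: if the primary computer stops
    heartbeating, the standby takes over and commands a slow pull-over;
    if both fall silent, control is handed to a teleoperator."""
    def __init__(self):
        self.last_beat = {"primary": time.monotonic(),
                          "standby": time.monotonic()}

    def heartbeat(self, unit):
        # Each computer calls this periodically while healthy.
        self.last_beat[unit] = time.monotonic()

    def decide(self):
        now = time.monotonic()
        alive = {u for u, t in self.last_beat.items()
                 if now - t < HEARTBEAT_TIMEOUT_S}
        if "primary" in alive:
            return "primary_drives"
        if "standby" in alive:
            return "standby_pulls_over_and_parks"
        return "teleoperator_takes_over"
```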
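Camera-Lidar fusion of the kind the article outlines typically starts by projecting Lidar points into the camera image, so that depth can be attached to colour-based detections. A minimal sketch follows, assuming a calibrated pinhole camera; the intrinsic matrix and the camera-from-Lidar transform are placeholders that would come from calibration, not values from Auve Tech.

```python
import numpy as np

# Assumed pinhole intrinsics; real values come from camera calibration.
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])

def project_to_image(points_lidar, T_cam_lidar):
    """Transform Nx3 Lidar points into the camera frame (4x4 transform)
    and project them with the pinhole model; returns pixels and depths."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]     # keep points in front
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    return uv, pts_cam[:, 2]

def box_depth(uv, depths, box):
    """Median Lidar depth of points falling inside a camera detection
    box (x1, y1, x2, y2), attaching range to a classified object."""
    x1, y1, x2, y2 = box
    mask = ((uv[:, 0] >= x1) & (uv[:, 0] <= x2) &
            (uv[:, 1] >= y1) & (uv[:, 1] <= y2))
    return float(np.median(depths[mask])) if mask.any() else None
```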
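Because the vehicle operates on pre-mapped routes, behaviours such as stopping, giving way, and obeying speed limits can live in a rule table keyed to map segments rather than in a sign-recognition network. The segment names and values below are purely hypothetical illustrations of the idea, not Auve Tech's map format.

```python
# Hypothetical rule table for a pre-mapped route.
ROUTE_RULES = {
    "segment_campus_gate": {"speed_limit_kph": 10, "action": "give_way"},
    "segment_main_path":   {"speed_limit_kph": 25, "action": None},
    "segment_bus_stop_3":  {"speed_limit_kph": 5,  "action": "stop"},
}

def rules_for(segment_id):
    """Look up behaviour for the current map segment instead of
    recognising street signs at runtime."""
    return ROUTE_RULES.get(segment_id,
                           {"speed_limit_kph": 15, "action": None})
```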
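The article gives no detail on the traffic light beacon's protocol, so the following is only a guess at the shape of such a link: a low-rate broadcast of the light's state that the vehicle polls, with a missing message treated conservatively upstream as "stop and wait". The UDP port and JSON payload are invented for illustration.

```python
import json
import socket

def listen_for_light_state(port=47808, timeout_s=1.0):
    """Receive the traffic light's broadcast state over UDP, returning
    e.g. {'light_id': 'crossing_1', 'state': 'red', 'ttl_s': 12}."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout_s)
    sock.bind(("", port))
    try:
        payload, _addr = sock.recvfrom(256)
        return json.loads(payload)
    except socket.timeout:
        return None  # caller should treat silence as "stop and wait"
    finally:
        sock.close()
```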
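Finally, the localisation approach in the caption, checking a live 3D model in real time against pre-established models of known areas, amounts to scoring candidate poses by how well the live scan fits the stored map. A minimal 2D version is sketched below under that assumption; a real system would use a k-d tree or full ICP/NDT registration rather than the brute-force nearest-neighbour search shown.

```python
import numpy as np

def pose_fit_error(scan, ref_map, pose):
    """Score a candidate 2D pose (x, y, yaw) by transforming the live
    Nx2 scan and measuring mean distance to the nearest points of the
    Mx2 reference map: the core of checking a live model against a
    pre-built one. Lower is better."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    moved = scan @ R.T + np.array([x, y])
    # Brute-force nearest neighbour; fine for a sketch only.
    dists = np.linalg.norm(moved[:, None, :] - ref_map[None, :, :], axis=2)
    return dists.min(axis=1).mean()
```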
