Uncrewed Systems Technology 049 - April/May 2023

ran the Ottobots autonomously while some of their key AI computing units for localisation and object classification ran on the cloud in real time,” Vijay says. “5G-powered cloud computing will therefore be rolled out as a strategic update as the infrastructure is made ready. It’s because of that kind of support that we have Verizon and others as key customers now.”

Sensor layout

Although much of Ottonomy’s past expertise in autonomous cars has gone into producing the Ottobot, the UGV uses a highly contextual form of navigation, rather than focusing on GNSS localisation first and immediate object detection second, as is more typical of the kinds of robotaxis Vijay and Korupolu used to work on.

“To understand the context of where they are and the densities of different kinds of objects – including people – they might encounter, our UGVs need enough sensors for a realistic and detailed representation of their environment,” Vijay says. “So at the top and front of them we have a 360° Ouster 3D Lidar to generate a point cloud around it.”

While a single Lidar maximises real-time point cloud coverage, it can leave shadow areas that only additional Lidars could cover, which would drive up the cost of each unit considerably. Numerous small ultrasonic sensors are therefore installed around the body to cover these shadow areas and to improve detection of glass surfaces, as Lidars and cameras alike can become ‘confused’ by them – particularly glass doors – owing to their refraction and transparency.

“We also have seven or eight cameras covering the UGV’s entire FoV. The actual number depends on how we alter our modular design for different uses and cost cases, and we’ll have both regular HD cameras and depth cameras,” Vijay explains.
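The division of labour across the sensor suite – Lidar for precise geometry, cameras for semantics, ultrasonics for glass and shadow zones – can be illustrated with a minimal fusion sketch. This is purely illustrative: the class names, thresholds and decision logic below are assumptions, not Ottonomy's actual software.

```python
# Illustrative sketch of fusing the three sensor modalities described
# in the article. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    source: str      # "lidar", "camera" or "ultrasonic"
    range_m: float   # distance to the detected object in metres
    label: str       # semantic class ("person", "glass", "unknown", ...)

def fuse(detections: list[Detection], stop_range_m: float = 1.0) -> dict:
    """Merge modality-specific detections into one navigation decision.

    - Lidar returns dominate the range estimate (best geometric detail).
    - Camera detections supply the semantic labels used for context.
    - Any close ultrasonic return forces a stop even with no Lidar hit,
      covering glass surfaces and Lidar shadow areas.
    """
    nearest = min((d.range_m for d in detections), default=float("inf"))
    labels = {d.label for d in detections if d.source == "camera"}
    ultrasonic_close = any(
        d.source == "ultrasonic" and d.range_m < stop_range_m
        for d in detections
    )
    return {
        "stop": nearest < stop_range_m or ultrasonic_close,
        "nearest_m": nearest,
        "labels": labels,
    }
```

The key design point mirrored from the article is that the ultrasonic check is a separate veto path: a glass door invisible to the Lidar and cameras still triggers a stop.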
“The 3D Lidar primarily gives us geometric information on objects and obstacles, to a better resolution and detail than any other sensor type, while the cameras identify semantic information that is key to contextual navigation, helping with specific identifications and classifications. The ultrasonic sensors are essentially there to detect things the Lidars and cameras can miss.

“Those aren’t the only failsafe sensors on board: we also have ‘cliff detection’ sensors, which detect not just cliffs [despite the terminology] but also small objects nearby, stairs, and other edges or precipices in front of the robot. That provides another layer of safety, because autonomous perception systems can still incorrectly report that it’s safe to move forwards just because there don’t happen to be any obstacles detected ahead.”

In addition to the sensors fitted around the body, some indicator lights, the four electrically driven wheels and a battery pack sit below. The storage containers can be selected modularly for different requirements.

The Ottobot 2.0’s 4WD enables it to rotate on the spot and sidestep as needed, in order to navigate through crowds without bumping into people. A long-term contract has been signed with Ouster for the supply of the Ottobot’s 3D Lidar units.
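The layered failsafe Vijay describes – a downward-looking cliff sensor that can veto forward motion even when the main perception stack reports a clear path – can be sketched as a simple gate. Function names, the expected floor distance and the tolerance are illustrative assumptions, not details from Ottonomy.

```python
# Hypothetical sketch of the cliff-detection failsafe: forward motion
# is allowed only when both the perception verdict and the floor check
# agree. Names and thresholds are illustrative assumptions.

def safe_to_advance(perception_clear: bool,
                    cliff_range_m: float,
                    expected_floor_m: float = 0.15,
                    tolerance_m: float = 0.05) -> bool:
    """Gate forward motion behind two independent layers.

    perception_clear : verdict of the Lidar/camera obstacle pipeline.
    cliff_range_m    : downward-looking range reading. A reading much
                       longer than the expected floor distance implies
                       a drop-off (stairs, kerb edge); a much shorter
                       reading implies a small object under the nose.
    """
    floor_present = abs(cliff_range_m - expected_floor_m) <= tolerance_m
    return perception_clear and floor_present
```

Because the two inputs are combined with a logical AND, a perception false negative (“nothing detected ahead, so it must be safe”) cannot on its own drive the robot over an edge, which is the extra safety layer the article describes.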
