
But as and when the Iseauto progresses towards open-road applications, the company will devote resources to incorporating the necessary recognition and response actions.

The company estimates that around 80% of the tools and imagery initially used for training the Iseauto's computer vision systems were taken from open-source packages. More recently, the team has focused on refining its in-house object recognition, classification and avoidance capabilities using data and imagery captured across the Iseauto's many trials and projects.

"It is a better, more context-appropriate approach," Mossov says. "The main reason we needed so much open-source data to begin with was because we were focused on optimising the way we used Lidar for so much of our r&d so far – the cameras were almost not used in a practical sense at all in the early years.

"Now though, we've gone from attempting 100% of our long-distance object recognition with Lidar, to 80% with Lidar and 20% with cameras. As we scale up our systems, we're aiming for 50% of the data used by the taxis' sense & avoid AI to be visual, with laser point clouds forming the other 50%."

The importance of weighting both equally stems not only from limited computing power but also from redundancy: if the Lidar breaks down, the taxis cannot be left without 80-100% of their obstacle avoidance capability. The cameras therefore need to be able to take over fully so that the vehicle can safely complete its final manoeuvres and park (a simplified sketch of this fallback logic follows at the end of this section).

"Working with the Lidars made for simpler and easier r&d at first, because they output a ready-made 3D understanding of the local area," Mossov adds. "Cameras are doubtless much less pricey per unit than Lidars, but being able to sense objects, velocities and distances with cameras means you pay a very high price in AI development costs.

"But having paid those costs, we might save money and offer lower prices in the long run if we decide to release new versions of the Iseauto that use cameras in place of some of the Lidars."

Although stereo cameras might have provided easier access to vision-based depth information, such systems were considered superfluous given the depth information already gained through the Lidars.

Electric powertrain

The powertrain has received the least attention of any subsystem on the Iseauto, because the team has not had a single issue with it since the first prototype made its maiden trial journey.

"Even the maintenance and supply chains of the motor haven't been a problem," Mossov says. "The only way we'll stop using it or switch to any other powertrain systems is if we decide to change to a different voltage architecture from those we use.

"The battery-electric version of the Iseauto uses a 300 V architecture, while the hydrogen version uses a 48 V bus. But we wouldn't mind finding something like a 96 or 200 V drive as a middle ground between the two, so long as it doesn't lead to any noticeable loss of efficiency or performance in the battery or hydrogen powertrains."

The Iseauto's electric drive systems come from Meidensha Corporation,

(Image caption: Numerous fisheye cameras provide a 360° FoV around the body for identifying and tracking potential obstacles)
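The fallback behaviour Mossov describes – cameras taking over fully if the Lidar stream drops out – can be illustrated with a minimal sketch. This is a hypothetical, simplified example, not Auve Tech's actual software; the Detection type, the fused_range function and the 50/50 weighting are assumptions made purely for illustration.

```python
# Minimal sketch (assumptions, not Auve Tech's code): confidence-weighted fusion of
# lidar and camera range estimates, degrading gracefully if one sensor drops out.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Detection:
    distance_m: float   # estimated range to the nearest obstacle
    confidence: float   # sensor-specific confidence, 0.0 .. 1.0


def fused_range(lidar: Optional[Detection],
                camera: Optional[Detection],
                lidar_weight: float = 0.5) -> Optional[float]:
    """Blend lidar and camera estimates; fall back to whichever sensor survives."""
    if lidar is None and camera is None:
        return None                      # no perception left: caller must bring the vehicle to a stop
    if lidar is None:
        return camera.distance_m         # lidar failure: camera takes over fully
    if camera is None:
        return lidar.distance_m          # camera failure: lidar takes over fully
    w = lidar_weight * lidar.confidence
    v = (1.0 - lidar_weight) * camera.confidence
    if w + v <= 0.0:
        return min(lidar.distance_m, camera.distance_m)  # both unreliable: take the closer, more conservative estimate
    return (w * lidar.distance_m + v * camera.distance_m) / (w + v)


# Example: the lidar has failed, so the camera estimate is used on its own
print(fused_range(None, Detection(distance_m=12.4, confidence=0.8)))  # -> 12.4
```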
