Uncrewed Systems Technology 048 | Kodiak Driver | 5G focus | Tiburon USV | Skypersonic Skycopter and Skyrover | CES 2023 | Limbach L 2400 DX and L 550 EFG | NXInnovation NX 100 Enviro | Solar power focus | Protegimus Protection
Dossier | Kodiak Driver

accumulates for them through their testing and commercial deliveries. Wendel lauds the robustness and reliability of both Lidars, saying, “When you look at a laser’s specs, people often ask first how far it can scan, then whether there’s another that can see 20 m further, but those things don’t make for a robust and reliable AV. ‘Guaranteed range’ does: how far can I fully trust that real-world objects are reported by the laser and not missed?

“We’ve therefore tested our Lidars extensively to know that these ones will report data with an accuracy and integrity at the 300 m distances imperative to our safe driving and obstacle avoidance.”

The Hesai Lidar models are rotating 360° units, with one on either side to cover the blind spots created by the truck. Relying on a single unit on top of the cab could render the truck partially blind to nearby vehicles. In Kodiak’s configuration, any vehicles or motorcyclists next to the truck can be detected by the Hesai units, ensuring the truck can avoid excessively risky movements. The Luminar Lidar in the CenterPod has a longer range.

In the 300 m ahead of the truck, all three Lidars, the three forward-looking cameras and the two forward-facing radars provide overlapping views, ensuring the redundancy of object detection and lane counting in that safety- and localisation-critical ‘cone’.

“The cameras are the first source of data in lane detection, but the Lidars are also important,” Wendel says. “Traffic lanes are highly reflective surfaces, and often even have little reflectors between them that pop out even more to the Lidars. In general, our perception processing software treats all the data equally, so there isn’t really any prioritisation of one data feed over another.

“The radars, from ZF, are key to perception, but not the localisation aspect of it: they are ‘4D’ systems.
Most automotive radars have a very flat view of the world, with just 2D distances and velocity, whereas ours also measure elevation over ground.

“That helps us to see a lot of important obstacles that other AVs miss, for instance cars that have stopped under bridges. The radars can tell physically where the car ends and where the bridge begins. Without that distinction, the truck would have to slow down and lose time at every bridge.”

Wendel adds that the radars also have an effective range of 300 m, and that their ability to discern velocity is particularly critical for ensuring safety at the rear and sides, from where other vehicles might be approaching.

Kodiak Vision

The camera images and Lidar models are run through a deep neural network to extract key properties such as lane markings and road boundaries. These directly inform calculations regarding the truck’s trajectories, as well as specific updates to Sparse Maps to ensure any changes in roads are fed to other trucks.

“The information gleaned through that neural network lets the truck know, for instance, if its exit is upcoming, or if it needs to change lane for a merge later down the road,” Wendel says. “There are convolutional elements, but it isn’t a purely convolutional neural network [CNN]; it has a more advanced architecture than that. There are actually quite a few standardised architectures for image-based machine learning and AI that can’t just be called CNNs any more, like transformer networks,

February/March 2023 | Uncrewed Systems Technology

(Image caption: Kodiak Vision takes the raw data from the onboard sensors and fuses them via a deep neural network to identify vehicles, lane markings, road boundaries and other properties that inform the truck’s movement calculations.)
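The redundancy of the forward ‘cone’ described above rests on a simple idea: an object is trusted when more than one independent sensor reports it. The following is a minimal sketch of that cross-checking idea only, not Kodiak’s actual perception software; all function names, sensor labels and thresholds here are invented for illustration.

```python
# Illustrative sketch: a k-of-n cross-check over overlapping forward sensors.
# Names and thresholds are hypothetical, not Kodiak's implementation.

def confirmed_detections(sensor_reports, min_sources=2):
    """Return object IDs reported by at least `min_sources` independent sensors.

    sensor_reports: dict mapping sensor name -> set of detected object IDs
    (assumes upstream association has already given matching IDs to the
    same physical object across sensors).
    """
    counts = {}
    for objects in sensor_reports.values():
        for obj in objects:
            counts[obj] = counts.get(obj, 0) + 1
    return {obj for obj, n in counts.items() if n >= min_sources}

# Example: three overlapping forward sensors in the 300 m cone
reports = {
    "camera_front": {"car_1", "truck_2"},
    "lidar_long_range": {"car_1", "debris_3"},
    "radar_front": {"car_1", "truck_2"},
}
print(sorted(confirmed_detections(reports)))  # ['car_1', 'truck_2']
```

With a two-source threshold, an object seen by only one sensor (here `debris_3`) is not immediately confirmed, which is the redundancy benefit the overlapping camera, Lidar and radar views provide.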
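Wendel’s car-under-a-bridge example turns on the elevation channel of a 4D radar return (range, azimuth, elevation, velocity). As a toy illustration only, and not ZF’s or Kodiak’s actual processing, a filter on that channel might separate roadway obstacles from overhead structure like this; the clearance value and field names are assumptions for the sketch.

```python
# Illustrative sketch: using a 4D radar's elevation measurement to separate
# a stopped car from the bridge above it. Values and names are hypothetical.

BRIDGE_MIN_ELEVATION_M = 4.5  # assumed clearance; returns above this are overhead

def ground_level_obstacles(returns):
    """Keep only radar returns low enough to be on the roadway.

    returns: list of dicts with at least an 'elevation_m' key.
    """
    return [r for r in returns if r["elevation_m"] < BRIDGE_MIN_ELEVATION_M]

# Example: two returns at the same range, only one of which matters for braking
scene = [
    {"id": "stopped_car", "range_m": 180.0, "elevation_m": 0.8, "velocity_mps": 0.0},
    {"id": "bridge_deck", "range_m": 180.0, "elevation_m": 6.0, "velocity_mps": 0.0},
]
print([r["id"] for r in ground_level_obstacles(scene)])  # ['stopped_car']
```

A radar with no elevation channel would see two stationary returns at 180 m and could not make this distinction, which is why, as the article notes, such a truck would have to slow at every bridge.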