Dossier | Keybotic Keyper

“…developed our own AHRS, but we’ve found that Advanced Navigation’s system outputs the heading so consistently and easily that it removed a lot of headaches for us.”

The Orientus MEMS device weighs 25 g and consumes 0.5 W. Its magnetic heading readings are accurate to 0.8°, with 0.2° accuracy in roll and pitch, a 3°/hour bias instability and a 1000 Hz update rate.

“We also need to perceive and digitally reconstruct the surrounding environment to calculate the necessary movements for reducing stumbles and impacts,” Tome explains. “Fortunately, just 10 m of forward optical range is sufficient for a good safety margin at the speeds the Keyper travels.”

Using Lidar to perceive and model the robot’s surroundings would have been cost-prohibitive, so Intel RealSense D455 stereo cameras are used instead. The D455 combines the company’s D450 camera module with its D4 vision processor board, along with a Bosch BMI055 IMU and a USB-C 3.1 (Gen 1) connector for straightforward power and data interfacing. Its stereo cameras provide an FoV of 87° × 58°, a frame rate of up to 90 fps and a resolution of up to 1280 × 720 in the output depth footage. It also integrates a 1 MP RGB sensor with a global shutter and a 90° × 65° FoV.

“We use five of these, covering 360° around the robot and creating a representation of the world for the Keyper so that it knows where to step and how to avoid bumping into obstacles,” Tome says.

The array comprises one D455 pointing left, one right, one rearwards and two forwards. The second forward camera tilts slightly downwards, for awareness and resolution of obstacles or physical inconsistencies, such as cracks or uneven stairs, that could otherwise threaten the Keyper’s stable forward movement. It also provides perception redundancy in the direction the Keyper most often travels.
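As a rough back-of-the-envelope check (not from the article), the figures above imply a useful pixel footprint at the 10 m range Tome mentions. The pinhole-camera approximation below is an illustrative sketch, not the Keyper's actual perception pipeline:

```python
import math

def pixel_footprint_m(hfov_deg: float, width_px: int, range_m: float) -> float:
    """Approximate lateral size of one pixel at a given range, assuming a
    simple pinhole model and a small-angle approximation per pixel."""
    pixel_angle_rad = math.radians(hfov_deg) / width_px
    return range_m * pixel_angle_rad

# D455 depth stream per the article: 87 deg horizontal FoV, 1280 px wide
footprint = pixel_footprint_m(hfov_deg=87, width_px=1280, range_m=10.0)
print(f"~{footprint * 1000:.1f} mm per pixel at 10 m")
```

At roughly 12 mm per pixel at 10 m, cracks and step edges of a few centimetres remain several pixels wide, which is consistent with the article's claim that 10 m of forward optical range gives an adequate safety margin.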
Navigation intelligence

Tome notes, however, that performing localisation using vision alone is extremely challenging, because the resolution of range measurements and the availability of distinct visual features are often too unreliable for triangulating position within a known area.

“That’s the thing about stereo cameras: they’re depth estimators. They triangulate depths and distances based on visual markers and the perceived differences between those from one ‘eye’ to the other,” Tome says. “They don’t know in a tactile way how far away or apart objects are, so in lieu of computationally intensive optimisations to localise without consistently available visual markers, we’ve mounted an optional Lidar on the Keyper’s back to detect precisely where every point of every surrounding surface is, regardless of how featureless a given room or corridor might be.

“We weren’t always sure we would opt for Lidar. Tesla, among others, weren’t wrong in the past to think relying on Lidars would make autonomy too expensive…

(Image caption: An Orientus AHRS from Advanced Navigation gives attitude readings so that the UGV always understands where it is facing)

October/November 2023 | Uncrewed Systems Technology
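The triangulation Tome describes follows the standard rectified-stereo relation Z = f·B/d, and its first-order error model shows why depth estimates degrade quadratically with range. The 95 mm baseline below is an assumption taken from the D455's published spec rather than from the article, and the pixel focal length is derived from the 87° FoV and 1280 px width quoted above:

```python
import math

BASELINE_M = 0.095   # assumed D455 stereo baseline (spec-sheet value, not from the article)
WIDTH_PX = 1280      # depth stream width, per the article
HFOV_DEG = 87.0      # horizontal field of view, per the article

# Pinhole focal length in pixels: f = (W / 2) / tan(HFoV / 2)
FOCAL_PX = (WIDTH_PX / 2) / math.tan(math.radians(HFOV_DEG) / 2)

def depth_from_disparity(disparity_px: float) -> float:
    """Triangulated depth Z = f * B / d for a rectified stereo pair."""
    return FOCAL_PX * BASELINE_M / disparity_px

def depth_error(depth_m: float, disparity_err_px: float = 1.0) -> float:
    """First-order depth uncertainty: dZ = Z**2 / (f * B) * dd.
    Error grows with the square of range, so distant, feature-poor
    surfaces yield unreliable depths from stereo alone."""
    return depth_m ** 2 / (FOCAL_PX * BASELINE_M) * disparity_err_px

print(f"disparity at 10 m: {FOCAL_PX * BASELINE_M / 10.0:.2f} px")
print(f"depth error at 10 m for a 1 px mismatch: {depth_error(10.0):.2f} m")
```

Under these assumptions, a surface at 10 m produces only about 6 px of disparity, and a single-pixel matching error shifts the depth estimate by roughly 1.5 m, which illustrates why a featureless corridor defeats vision-only localisation and motivates the optional Lidar.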