
Simulators

Ray-tracing sensor tests

Adapting to varying light levels allows motion blur to be tested virtually

Simulation firm rFpro has used ray tracing to develop software that can accurately reproduce environments to test sensors in autonomous vehicles (writes Nick Flaherty).

The company develops high-fidelity software for driver-in-the-loop (DIL) simulators, and six years ago started extending its technology to testing driverless cars. It has written a simulation engine from the ground up using ray tracing that traces all the beams that fall on a sensor. These beams can be visible light for a camera, infrared for a Lidar or RF for a radar sensor. Using ray tracing allows artefacts such as motion blurring to be tested accurately in a virtual environment.

The simulation engine builds up environments, for example an underground parking garage or an urban tunnel at night. All the beams, or rays, in the environment are tracked, including those from external lights and from the vehicle, to recreate what is received by the sensor.

“We have spent 16 years creating immersive, real-time, high-bandwidth, low-latency simulation technology for human vision, which is what DIL simulators are all about,” said Matt Daley, operations director at rFpro. “Until now, the fidelity of simulation in the most challenging lighting situations hasn’t been high enough to replace real-world data. Our ray-tracing technology is a physically modelled simulation solution that has been developed specifically for sensor systems to accurately replicate the way they see the world.

“It needs to be engineering-accurate. You have to do things as physically accurately as possible. As soon as you move away from a perfectly lit daytime scene with lots of other light sources, and other vehicles, you have to be able to calculate how the light bounces around the environment. That is why ray tracing is needed for high-fidelity sensor simulation.

“Ray tracing is established in the graphics industry, but it has been focused on making things look good to human eyes. We believe this is the first engine written from the ground up for sensors in autonomous systems.

“It’s all about the physics of electromagnetic waves from a source reflecting off materials and arriving at a sensor. It’s about how you trace the path.”

The model of the sensor is a key element in the simulation. A camera sensor with a rolling shutter, for example, can use three capture periods, at 2, 5 and 10 ms, then process that data to give an HDR image. These timings can also change from frame to frame as the sensor adapts to the different light levels while the vehicle moves around. That needs to be included in the model to achieve accurate motion blur in the simulated sensor.

“With rolling-shutter sensors, every single line of the chip is being sampled at a slightly different time, so we don’t get straight edges,” said Daley. “That is fundamentally built into the way the sensor models are coupled with the ray tracing. What we have done is develop the ray tracer alongside the sensor APIs that allow the models to be integrated.”
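The core idea, tracing rays from the sensor into the scene and summing the light that arrives from every source, can be illustrated with a short toy example. This is a minimal sketch, not rFpro’s engine: the single-sphere scene, the two point lights (a street lamp and another vehicle’s headlight) and the Lambertian reflectance values are all invented for illustration, and only direct illumination is computed where a real engine would also follow secondary bounces.

# A minimal sketch, not rFpro's engine: it traces one primary ray per
# pixel from a pinhole camera and sums direct Lambertian lighting from
# several sources. All geometry and light values are invented.
import numpy as np

def intersect_sphere(origin, direction, centre, radius):
    """Return distance along a normalised ray to a sphere hit, or None."""
    oc = origin - centre
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def radiance(point, normal, lights, albedo=0.7):
    """Sum Lambertian contributions from every light source."""
    total = 0.0
    for light_pos, intensity in lights:
        to_light = light_pos - point
        dist2 = np.dot(to_light, to_light)
        to_light /= np.sqrt(dist2)
        # cosine falloff and inverse-square attenuation
        total += albedo * intensity * max(np.dot(normal, to_light), 0.0) / dist2
    return total

# Scene: one sphere (a 'vehicle'), two lights (street lamp + headlight)
sphere_c, sphere_r = np.array([0.0, 0.0, 5.0]), 1.0
lights = [(np.array([4.0, 4.0, 3.0]), 60.0),   # external street lamp
          (np.array([-2.0, 0.5, 1.0]), 25.0)]  # another vehicle's headlight

w, h = 32, 24
image = np.zeros((h, w))
eye = np.array([0.0, 0.0, 0.0])
for y in range(h):
    for x in range(w):
        # one primary ray per pixel through a simple pinhole camera
        d = np.array([(x - w / 2) / w, (y - h / 2) / h, 1.0])
        d /= np.linalg.norm(d)
        t = intersect_sphere(eye, d, sphere_c, sphere_r)
        if t is not None:
            hit = eye + t * d
            n = (hit - sphere_c) / sphere_r
            image[y, x] = radiance(hit, n, lights)

print(f"peak pixel value: {image.max():.4f}")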

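The rolling-shutter, multi-exposure behaviour Daley describes can be sketched in the same spirit. In the toy model below, each row of the chip starts its capture slightly later than the one above it, three exposures using the 2, 5 and 10 ms periods mentioned above are integrated, and a simple merge rule, assumed here rather than taken from rFpro, keeps the longest unsaturated exposure. A bright bar moving across the frame comes out with the slanted edges and motion blur the article refers to; scene_radiance() stands in for the ray tracer, and the row readout interval and saturation level are assumptions.

# A hedged sketch of a rolling-shutter, multi-exposure sensor model.
# scene_radiance() stands in for the ray tracer; the 2/5/10 ms capture
# periods come from the article, everything else (row readout interval,
# saturation level, merge rule) is assumed for illustration.
import numpy as np

ROWS, COLS = 8, 16
ROW_READOUT = 0.0005                # 0.5 ms between row starts (assumed)
EXPOSURES = [0.002, 0.005, 0.010]   # capture periods from the article
FULL_WELL = 1.0                     # saturation level, arbitrary units (assumed)

def scene_radiance(row, col, t):
    """Stand-in for the ray tracer: a bright vertical bar moving right."""
    bar_left = 2.0 + 400.0 * t      # bar sweeps across the frame over time
    return 150.0 if bar_left <= col < bar_left + 3 else 5.0

def expose(row, col, t_start, duration, steps=20):
    """Integrate radiance over one capture period (numeric quadrature)."""
    ts = np.linspace(t_start, t_start + duration, steps)
    e = np.mean([scene_radiance(row, col, t) for t in ts]) * duration
    return min(e, FULL_WELL)        # clip at sensor saturation

hdr = np.zeros((ROWS, COLS))
for r in range(ROWS):
    t_row = r * ROW_READOUT         # each row samples later: rolling shutter
    for c in range(COLS):
        # Merge rule (assumed): keep the longest exposure that did not
        # saturate, normalised back to radiance by its duration.
        for dur in sorted(EXPOSURES, reverse=True):
            e = expose(r, c, t_row, dur)
            if e < FULL_WELL:
                hdr[r, c] = e / dur
                break
        else:
            hdr[r, c] = FULL_WELL / min(EXPOSURES)

# The bar's edge lands on a different column in each row, reproducing
# the slanted edges Daley describes.
for r in range(ROWS):
    print("".join("#" if v > 50 else "." for v in hdr[r]))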