
could develop virtual models of cars, cyclists, pedestrians, even dogs that might run out in front of a vehicle.

Unfortunately, simulators built on games engines were perhaps too flexible. They can be configured in so many different ways that the same environment could produce different results for the same scenario, depending on the optimisations chosen by the engineer.

Games engines also take short cuts to reduce the computing load so that they can deliver a real-time response on an Xbox or PlayStation. Effects such as snow or rain are therefore modelled for computing efficiency, not accuracy. If the scenario concerns the effect of raindrops on the performance of a Lidar sensor at different ranges and frequencies, and with different levels of rainfall, a games engine is not sufficient.

That has meant developing physics engines within these simulation environments to model the exact impact of the environment on a given technology – whether that is a Lidar in the rain, a camera at dusk blinded by the sun, or a millimetre-wave radar sensor that is extremely sensitive to the metal in a highway guardrail or at the entrance to a tunnel. All these scenarios can generate false positives and false negatives, and the simulation engine has to show that its output links back to the underlying physics of the phenomena it is representing. It is the edge cases, which happen only rarely, that need to be fully tested to ensure the vehicle operates safely.

Simulators allow evaluation of certain conditions such as the sun at a low elevation, which can confuse sensors (Courtesy of Ansys)

This is for only one car, the ‘ego’ vehicle, as it moves through a virtual world testing out its sensors. The model of that vehicle can be built up from a behavioural model or from cycle-accurate models of its components.

Increasingly though, simulation environments can also support multiple vehicles with this cycle-accurate modelling. They can examine how sensors in other vehicles affect the original design. For example, the laser output of Lidar sensors mounted at different heights on other vehicles could perturb the sensors in the ego vehicle.

Simulation tools allow the performance of different sensors to be configured (Courtesy of AImotive)

The simulation environment can go even further. The next stage is for it to support digital twins of multiple vehicles, to test how the AI systems that control them interact in the virtual environment of a highway. Again, simulation allows the edge and corner cases involving multiple vehicles to be identified and thoroughly examined.

Another challenge is modelling urban environments. Most of the simulation systems have modelled highways as
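To make the rain-on-Lidar example above concrete, the sketch below shows how a scenario sweep might estimate detection range as rainfall increases. It is purely illustrative: the power-law attenuation coefficients, the simplified range equation, and the function names are assumptions made for this sketch, not the physics models used by any of the tools mentioned in the article.

```python
# Illustrative sketch only: a minimal parameter sweep showing how a scenario
# engine might estimate the effect of rainfall on Lidar detection range.
# The attenuation model, coefficients and function names are assumptions made
# for this example, not the physics models of any tool named in the article.

def rain_attenuation_db_per_km(rain_mm_per_hr: float) -> float:
    """Power-law attenuation of near-infrared light in rain (dB/km).
    The coefficients are placeholder values chosen for illustration."""
    a, b = 1.1, 0.67                     # assumed empirical coefficients
    return a * rain_mm_per_hr ** b

def max_detection_range_m(clear_range_m: float, rain_mm_per_hr: float) -> float:
    """Range at which two-way rain attenuation pulls the returned signal down
    to the detection threshold that was just met at clear_range_m in dry air."""
    alpha = rain_attenuation_db_per_km(rain_mm_per_hr) / 1000.0  # dB per metre
    r = clear_range_m
    for _ in range(50):                  # fixed-point iteration on r = R0 * 10^(-alpha*r/10)
        loss_db = 2.0 * alpha * r                        # two-way path loss in dB
        r = clear_range_m * 10 ** (-loss_db / 20.0)      # 1/r^2 spreading -> divide by 20
    return r

if __name__ == "__main__":
    for rate in (0, 2, 10, 25, 50):      # mm/h: dry through heavy rain
        print(f"{rate:>3} mm/h -> max range ~{max_detection_range_m(200.0, rate):6.1f} m")
```

A production physics engine would also model backscatter from the droplets, beam divergence and receiver noise in far more detail; the point of the sketch is only to show the kind of parameter sweep a scenario engine automates across rainfall levels, ranges and sensor settings.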
