
… needed, rather than fully accurate models. The level of fidelity required in the V2I model then depends on the use case and the environment, which links back to the scenarios used for modelling.

At the digital city level, these models are huge but they already exist. Developers of smart city software already have detailed models of infrastructure, traffic flows and roadside units, along with maps to identify where a car can drive – all based on real-world, infrastructure-based sensors. These tools can then be used to generate traffic that behaves realistically, to test the interaction of the digital twin model of a vehicle with the environment, complete with the different wireless links.

Standardisation

The sheer complexity of the simulation and verification environments highlights the challenge going forward. So far, many companies have developed their own tools, but the complexity is growing at such a rate that this is no longer viable. Standardisation is therefore needed; the debate is over the level at which to apply it. There can be standard interfaces between tools at the application programming interface (API) level, but these may not be able to cope with the varying levels of fidelity of the different models. Working groups around the world are looking at how this can be achieved.

There is also a need to standardise the scenarios – how they are set up and what they test. At the moment, many unmanned-system developers are independently building the same scenarios, seeing them as core intellectual property built up over time. Scenario definition is critical, as are the interfaces to the scenario databases, so that vehicle developers can share scenarios to improve the quality of the ML algorithms (a minimal sketch of such a shareable scenario record appears at the end of this section).

At the validation stage, there is value in having a shared set of data. Ride-sharing companies say they test a set of 2 million scenarios with a closed-loop improvement cycle, which drives down development costs as well as potential liability costs.

There is also the regulatory environment. Singapore, for example, is a key testbed for this technology, and has built a $75 million digital twin of itself called Virtual Singapore. This will allow regulators to test vehicles and their levels of performance in the virtual environment before they allow cars into particular areas. Scenarios based around the models of these areas would be a key part of the initial permission. No regulator has a set of test cases yet, but that is expected to come. This is highlighted by the fact that the Euro NCAP car safety programme is including combined virtual and physical testing of systems in its certification process for 2020.

Hardware in the loop

There is a lot of continuity in the transition from a high-level model simulation, through a detailed model in the loop, to hardware in the loop (HIL). The hardware can be a physical controller that replaces its model in the simulation, or even the entire vehicle. For example, a development team can disconnect the sensors in the vehicle and feed the scenarios directly into the vehicle bus, achieving full proving-ground testing without leaving the lab. In this case the scenario definition provides the sensor input, the trajectory of the vehicle and a robot 'driver' in the seat of the car or in the controller. The robot driver allows the same scenario to be replayed on a physical testing ground with the same control data, which provides consistency throughout the test process. Integrating the hardware into the simulation environment is key.
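To make the earlier point about scenario definition concrete, the following is a minimal, tool-neutral sketch in Python of what a shareable scenario record might contain. The field names, the map reference and the pass criteria are illustrative assumptions, not taken from any published scenario standard or vendor format.

    # Hypothetical scenario record; all field names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class ActorState:
        actor_id: str
        position_m: tuple      # (x, y) in the map frame
        speed_mps: float
        heading_deg: float

    @dataclass
    class Scenario:
        scenario_id: str
        map_ref: str           # reference into a shared map/model database
        weather: str
        initial_states: list   # list of ActorState
        events: list           # (time_s, actor_id, action) triggers
        pass_criteria: dict    # e.g. {"min_clearance_m": 1.5}

    cut_in = Scenario(
        scenario_id="highway_cut_in_001",
        map_ref="virtual_singapore/expressway_sector_4",  # illustrative
        weather="rain_light",
        initial_states=[
            ActorState("ego", (0.0, 0.0), 22.0, 0.0),
            ActorState("car_1", (30.0, 3.5), 20.0, 0.0),
        ],
        events=[(2.0, "car_1", "lane_change_left")],
        pass_criteria={"min_clearance_m": 1.5, "max_decel_mps2": 6.0},
    )

Serialised to JSON, records like this need only a stable schema and versioning for developers to exchange them between tools – which is essentially the interface problem the working groups are addressing.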
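On the HIL side, feeding scenarios 'directly into the vehicle bus' amounts to encoding simulated sensor outputs as bus frames. The sketch below uses the open-source python-can library with a virtual bus standing in for the test rig; the frame ID and payload layout for the simulated radar target are invented for illustration, not any vendor's actual message definition.

    import struct
    import time
    import can  # open-source python-can package

    # Hypothetical frame layout: ID 0x300 carries one simulated radar
    # target as two little-endian floats (range m, relative speed m/s).
    RADAR_TARGET_ID = 0x300

    def inject_radar_target(bus: can.BusABC,
                            range_m: float,
                            rel_speed_mps: float) -> None:
        payload = struct.pack("<ff", range_m, rel_speed_mps)
        msg = can.Message(arbitration_id=RADAR_TARGET_ID,
                          data=payload,
                          is_extended_id=False)
        bus.send(msg)

    # A virtual bus stands in for the rig here; on real hardware this
    # would be a physical CAN interface wired to the vehicle controller.
    with can.Bus(interface="virtual", channel="hil_demo") as bus:
        # Replay a scripted approach: target closing from 80 m at 5 m/s.
        for step in range(10):
            inject_radar_target(bus,
                                range_m=80.0 - 5.0 * 0.05 * step,
                                rel_speed_mps=-5.0)
            time.sleep(0.05)  # 20 Hz sensor update rate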
The simulation environment and scenarios are used to test and validate the systems being put into the cars in a process that is safe and repeatable. This approach is being used to simulate perception systems, mapping, localisation and the sensors around the vehicle – also called the 'safety force field'. It simulates the camera, Lidar and radar inputs and feeds them into the actual hardware running the sensor fusion and control software.

(Image: Detecting vehicles and lanes in the reference example about designing a visual perception system. Courtesy of MathWorks)
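That loop can be sketched with a simple software stand-in for the fusion hardware. The noise models and the inverse-variance fusion step below are illustrative assumptions; on a real rig the fuse() placeholder would be replaced by the actual hardware running the developer's own sensor fusion software.

    import random

    def simulate_radar(true_range_m: float, noise_sd_m: float = 0.3) -> float:
        # Crude stand-in for a radar model: ground truth plus Gaussian noise.
        return random.gauss(true_range_m, noise_sd_m)

    def simulate_lidar(true_range_m: float, noise_sd_m: float = 0.05) -> float:
        return random.gauss(true_range_m, noise_sd_m)

    def fuse(radar_m: float, lidar_m: float) -> float:
        # Placeholder for the device under test: on a HIL rig these inputs
        # would be streamed to the real fusion/control hardware instead.
        # Inverse-variance weighting of the two measurements.
        w_radar, w_lidar = 1 / 0.3**2, 1 / 0.05**2
        return (w_radar * radar_m + w_lidar * lidar_m) / (w_radar + w_lidar)

    # Ground truth from the scenario: obstacle closing at 5 m/s, 20 Hz steps.
    for step in range(5):
        truth = 60.0 - 5.0 * 0.05 * step
        estimate = fuse(simulate_radar(truth), simulate_lidar(truth))
        print(f"t={0.05 * step:.2f}s truth={truth:.2f}m fused={estimate:.2f}m")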
