Unmanned Systems Technology 026 | Tecdron TC800-FF | Propellers | USVs | AUVSI 2019 part 1 | Robby Moto UAVE | Singular Aircraft FlyOx | Teledyne SeaRaptor | Simulation & Testing | Ocean Business 2019 report

This validates the systems that will go out on the road up to the various levels of autonomous operation, including full autonomy on any road at Level 5 with redundancy through a second hardware controller. For Level 5 operation, simulation plays a huge role: it allows the developer to jump to the most dangerous intersections in the world and create a suite of the craziest environments in which to test the system, something that simply isn't possible in the real world.

One open platform is being developed around the GPUs used in many unmanned vehicle designs. It has two hardware controllers based on production GPUs, housed in a data centre and linked to a server that provides the simulation environment. It uses software tools from the computer gaming world to create the virtual maps and environments, allowing developers to generate random trees and roads with potholes, and to change the time of day or the weather, providing a wide range of scenarios for testing the responses of the controller. These can be linked to the vehicle models, which have many thousands of parameters.

This allows 100 nodes in the data centre to run automated tests in different environments 24 hours a day using real-world data, rather than having 100 cars driving around with safety drivers. The platform supports regression testing, so that when new code is added to address a failure, the previous tests are re-run to ensure that the responses have improved. Again, this is a significant data management challenge, tracking all the environments, the sensor sources, the test vectors and the results. The platform is also scalable, so controllers based on the latest GPUs can be dropped in to replace the current hardware, and all the same tests run with the same scenarios in the same environments.

However, the data challenge is huge. Four 4 MP camera sensors, at 4 bytes per pixel and 30 frames per second, generate around 2 Gbytes of data every second. At 60 mph, the simulation therefore generates 120 Gbytes per mile.
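The arithmetic behind those figures can be checked with a short back-of-envelope calculation; the constant names below are illustrative, not from any particular tool.

```python
# Back-of-envelope storage calculation for simulated camera data,
# following the figures quoted in the text.

BYTES_PER_PIXEL = 4
PIXELS_PER_CAMERA = 4_000_000    # one 4 MP sensor
CAMERAS = 4
FRAMES_PER_SECOND = 30

bytes_per_second = CAMERAS * PIXELS_PER_CAMERA * BYTES_PER_PIXEL * FRAMES_PER_SECOND
gb_per_second = bytes_per_second / 1e9   # 1.92 GB/s, "around 2 Gbytes" per second

# At 60 mph the vehicle covers one mile per minute, i.e. 60 seconds per mile.
SECONDS_PER_MILE_AT_60MPH = 60
gb_per_mile = gb_per_second * SECONDS_PER_MILE_AT_60MPH  # ~115 GB; the article
                                                         # rounds 2 GB/s to get 120

print(f"{gb_per_second:.2f} GB/s, {gb_per_mile:.1f} GB/mile")
```

The exact product is 1.92 Gbytes per second; the article's 120 Gbytes per mile comes from rounding that up to 2 Gbytes per second before multiplying.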
The 100 nodes running for 24 hours would generate 17.28 Pbytes – 17.28 million Gbytes – each day, just for the image sensors, let alone radar and Lidar. A million miles of simulation, which is only a small part of a validation suite, would require 120 Pbytes of storage for just one vehicle.

Hardware in the loop for UAVs

HIL can also be used in the development and testing of UAV systems, in a number of ways. Simulation of an airborne platform is usually carried out in exactly the same way as for a ground vehicle, but the differences come with the hardware in the loop. This can be used for testing the end controller as well as for developing the flight control software, rather than relying on a purely software simulator, which is especially helpful when customising an autopilot for a specific UAV platform.

One HIL approach is to use the autopilot hardware both as the simulation environment and as the HIL test system. A simulator can run on an autopilot board, providing 200-250 mission parameters, such as GNSS positioning data, altitude and speed, for any type of unmanned aircraft, both rotary and fixed-wing. The system also simulates an electrical model of the sensors, the output to the servos and the switches – essentially everything connected to the autopilot.

The output from the simulation autopilot is then fed via a custom interface board to an identical autopilot board running the software under test, with links to the ground control system (GCS). This allows developers to adjust the simulation environment to explore different scenarios. The same processor is used for flight control as for the simulator, so there is consistency. The only thing outside this combined dual-autopilot system is control of the mission, and that is addressed using a 3D representation built around a third-party model. This gives an engineer additional feedback on the orientation of the simulated craft, showing how it is 'flying' via a PC connected to the GCS.
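The closed loop between the simulator autopilot and the autopilot under test can be sketched in miniature as below. All class names, the trivial aircraft dynamics and the proportional altitude-hold law are illustrative assumptions standing in for the real flight control software and the 200-250 mission parameters the actual system provides.

```python
from dataclasses import dataclass

@dataclass
class MissionParameters:
    """Tiny illustrative subset of the mission state fed to the autopilot under test."""
    lat_deg: float
    lon_deg: float
    altitude_m: float
    airspeed_mps: float

class SimulatorAutopilot:
    """Stands in for the autopilot board that models the aircraft, sensors and servos."""
    def __init__(self):
        self.state = MissionParameters(40.0, -3.7, 100.0, 22.0)

    def step(self, servo_commands: dict[str, float]) -> MissionParameters:
        # Deliberately crude dynamics: climb rate proportional to elevator command.
        self.state.altitude_m += 0.5 * servo_commands.get("elevator", 0.0)
        return self.state

class AutopilotUnderTest:
    """Stands in for the identical board running the flight control software under test."""
    def __init__(self, target_altitude_m: float):
        self.target = target_altitude_m

    def control(self, params: MissionParameters) -> dict[str, float]:
        # Proportional altitude hold as a stand-in for the real control laws.
        error = self.target - params.altitude_m
        return {"elevator": max(-1.0, min(1.0, 0.1 * error))}

# Close the loop; in the real system this exchange runs over the custom interface board.
sim = SimulatorAutopilot()
ap = AutopilotUnderTest(target_altitude_m=120.0)
commands = {"elevator": 0.0}
for _ in range(200):
    commands = ap.control(sim.step(commands))
print(f"final altitude: {sim.state.altitude_m:.1f} m")
```

Even this toy loop shows the point of the architecture: the software under test never knows it is being fed a simulation, so the same binary can later fly the real aircraft.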
This then allows the engineer to change the scenario, for example by introducing noise or offsets on the sensors, or forcing the servos into a certain position.

June/July 2019 | Unmanned Systems Technology

(Figure: Two autopilots are combined in a hardware-in-the-loop simulation system for developing software for UAVs. Courtesy of UAV Navigation)
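Those scenario perturbations amount to fault injection on the simulated signals. A minimal sketch, with hypothetical names rather than the vendor's actual GCS controls, might look like this:

```python
import random
from dataclasses import dataclass
from typing import Optional

# Illustrative fault injection for HIL testing: Gaussian noise and a fixed
# offset on a sensor reading, or a servo forced to a stuck position.

@dataclass
class FaultProfile:
    noise_std: float = 0.0              # standard deviation of added Gaussian noise
    offset: float = 0.0                 # constant bias added to the true value
    stuck_value: Optional[float] = None # if set, the output is forced to this value

def apply_fault(true_value: float, fault: FaultProfile, rng: random.Random) -> float:
    """Return the value the autopilot under test actually sees."""
    if fault.stuck_value is not None:
        return fault.stuck_value
    return true_value + fault.offset + rng.gauss(0.0, fault.noise_std)

rng = random.Random(42)
baro = FaultProfile(noise_std=0.5, offset=-2.0)  # noisy, biased barometric altitude
servo = FaultProfile(stuck_value=0.3)            # servo jammed at 0.3 of full travel

print(apply_fault(100.0, baro, rng))   # ~98 m reported instead of the true 100 m
print(apply_fault(0.0, servo, rng))    # always 0.3, whatever is commanded
```

Running a suite of such profiles against the autopilot under test is how the engineer checks that the control laws degrade gracefully rather than failing outright.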
