Unmanned Systems Technology 009 | Simulation and testing
In testing a video sensor, for example, some HIL systems replay recorded footage of real situations to the sensor, combined with 3D models of roads and buildings. Testing a Lidar laser ranging system of the kind used on driverless cars, however, presents a major challenge, as it generates millions of data points every second. Real-time data can be captured from a Lidar at the same time as a video feed, but such data is limited to a fixed set of scenarios. Generating a simulated Lidar feed from a 3D model allows test engineers to explore different scenarios, but it is still no easy task.

Synchronising this simulated feed with the video feed to the image sensors, with other sensors such as radar, and with data from the vehicle such as throttle and braking, battery temperature and current drain presents a further challenge, as the sensor fusion algorithms in the ECU have to bring all this together in the right time frame.

With tens of millions of lines of code in a driverless car, the aim is to find as many defects as possible in the fusion of ultrasonic, radar, Lidar and image sensor data. This has to be combined with mapping and position information from GPS satellite navigation and inertial sensors, as well as data arriving over wireless links. The advanced sensors used by autonomous systems are hard to simulate – radar simulation and beam forming, for example, are issues that many developers are still trying to solve.

The bar for autonomous vehicles is also much higher than for traditional cars. There are widespread concerns about trust, so far more testing is wanted, and HIL is the only conceivable way to do it, say the suppliers. Handling this testing in the lab is still more cost-effective than driving test vehicles around a track or a test area such as the empty city being built in New Mexico by Google’s parent company Alphabet, as many more scenarios can be covered.
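The synchronisation problem described above can be illustrated with a minimal sketch. Everything here is hypothetical – the `Sample`, `nearest` and `fuse` names are illustrative, not any vendor's API – but it shows the basic task a HIL rig or fusion stage faces: aligning a simulated Lidar sweep stream, a video frame stream and vehicle-bus records onto one timeline, and dropping samples that have no partner within a tolerance.

```python
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Sample:
    t: float      # timestamp in seconds on a common clock
    data: object  # payload: Lidar point cloud, video frame, CAN record...

def nearest(stream: List[Sample], t: float, tol: float) -> Optional[Sample]:
    """Return the sample in a time-sorted stream closest to t, within tol."""
    if not stream:
        return None
    ts = [s.t for s in stream]
    i = bisect_left(ts, t)
    candidates = [stream[j] for j in (i - 1, i) if 0 <= j < len(stream)]
    best = min(candidates, key=lambda s: abs(s.t - t))
    return best if abs(best.t - t) <= tol else None

def fuse(lidar: List[Sample], video: List[Sample], vehicle: List[Sample],
         tol: float = 0.005):
    """Align video frames and vehicle data to the Lidar clock, discarding
    sweeps that have no partner sample close enough in time."""
    fused = []
    for sweep in lidar:
        frame = nearest(video, sweep.t, tol)
        state = nearest(vehicle, sweep.t, tol)
        if frame is not None and state is not None:
            fused.append((sweep, frame, state))
    return fused
```

A real fusion stage would of course interpolate vehicle state rather than simply discard unmatched sweeps, but the tolerance check captures why clock alignment between simulated feeds matters: a few milliseconds of skew and the ECU is fusing a point cloud against the wrong frame.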
There are different ways to develop HIL systems, from a modular system built around the PXI standard to fully customised test rigs, but the HIL system is not just about the hardware – or even the software. The aim is to provide the same stimulus to the ECU that it would experience in the field, and this can be done in a number of ways, from video screens in front of the image sensors to fully interactive 3D models.

PXI is an open specification designed for rugged applications such as industrial automation and test. It combines the PCI electrical bus found in PCs with CompactPCI cards, specialised synchronisation buses and software. Launched in 1998, it is an open industry standard overseen by the PXI Systems Alliance (PXISA), a group of more than 70 companies that ensure interoperability and maintain the specification.

August/September 2016 | Unmanned Systems Technology

(Caption: Devices for testing in the loop are connected to the simulation environment via test rigs, depending on the system under test. Courtesy of dSPACE)
(Caption: A HIL system for UAVs, unmanned boats or driverless cars can use standard PXI cards to build up a customised simulation environment. Courtesy of National Instruments)
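The closed loop that a HIL rig implements – read the ECU's actuator outputs, advance a plant and environment model, regenerate the sensor stimulus, all within a fixed tick – can be sketched as below. This is a minimal scheduler sketch under stated assumptions: `read_ecu`, `step_model` and `write_stimulus` are stand-ins for whatever I/O the real rig provides (for example PXI modules), not real driver calls, and a production rig would use a hard real-time executive rather than `time.sleep`.

```python
import time

class HilLoop:
    """Minimal fixed-step HIL scheduler sketch: each tick, read the ECU's
    actuator outputs, advance the vehicle/world model, and regenerate the
    sensor stimulus so the ECU sees the same closed loop it would in the
    field. The three callables are hypothetical stand-ins for rig I/O."""

    def __init__(self, read_ecu, step_model, write_stimulus, dt=0.001):
        self.read_ecu = read_ecu
        self.step_model = step_model
        self.write_stimulus = write_stimulus
        self.dt = dt            # tick period in seconds
        self.overruns = 0       # ticks whose work took longer than dt

    def run(self, n_ticks):
        for _ in range(n_ticks):
            start = time.perf_counter()
            actuators = self.read_ecu()                  # throttle, brake, steering...
            world = self.step_model(actuators, self.dt)  # vehicle + environment state
            self.write_stimulus(world)                   # Lidar/radar/video feeds out
            elapsed = time.perf_counter() - start
            if elapsed > self.dt:
                self.overruns += 1                       # deadline miss: record it
            else:
                time.sleep(self.dt - elapsed)            # pad out to the fixed tick
```

The overrun counter reflects the constraint the article keeps returning to: the stimulus is only valid if every feed is regenerated inside the tick, which is why simulating millions of Lidar points per second in real time is so hard.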