Unmanned Systems Technology 003 | UAV Solutions Talon 120 | Cable harnesses | Austro Engine AE50R and AE300 | Autonomous mining | AUVSI 2015 show report | Transponders | Space systems

Scientists want to make the main decisions about what kind of data or picture to take from a particular area. “These are key scientific decisions that will not be left to the onboard autonomy, so the autonomy is all about where the rover can go,” says Joudrier.

The algorithms behind the navigation system were developed almost 20 years ago at the CNET research laboratories in France, and Airbus has taken them and developed its own implementation for the ExoMars rover. “For ExoMars, autonomous navigation is one of the key technologies, and we have been allocating special resources such as a dedicated coprocessor for image processing. This is very important as it allows us to offload the main processor, and allows the rover to process images while it is on the move,” says Joudrier.

At the heart of the rover’s autonomy is a stereo navigation camera on a 2 m mast, which takes images of the terrain in front of the rover while it is stationary. The rover’s image processing subsystem then analyses these images to generate a 3D model of the next 6 m of observable terrain. This model is analysed to find the safest route for the rover, producing a ‘navigation map’ that shows where it is safe to drive.

The navigation system then plans a safe path, about 2 m long, across the navigation map in a direction that should bring the rover closer to its final target. The rover then drives along the planned path, its trajectory control system compensating for any small obstacles, such as rocks or slopes, that push it off the path. Once it reaches the end of the path, it stops and repeats the process of navigation and path-following until it reaches its target.

The image processing algorithms run on the rover’s 96 MHz LEON2 coprocessor, and their execution speed on this target hardware is critical to the system-level performance of the rover, which has to produce a result in less than 20 s.
The LEON2 is a variant of the Sparc RISC architecture developed by Sun Microsystems and commonly used a decade ago. Sun was later bought by Oracle, and Sparc has since disappeared as a commercial processor, except in the space industry, which uses versions of it built on semiconductor fabrication processes that allow plenty of redundancy to be included in the design of the processor cores and memories to protect against strikes by alpha particles and cosmic rays. Such strikes can alter data in the processor and memory, disrupting the operation of the system, so processors such as the LEON2 are carefully built to mitigate these effects.

There are two later versions of the coprocessor, LEON3 and LEON4, with different implementations, but all the verification of the code for both the navigation system and the main processor has been done on the LEON2.

The prototype image processing system is based on a Pender Electronic Design GR-CPCI-XC4V development board. This includes a 70 MHz LEON2 processor, 256 Mbytes of SDRAM and version 4.6.2 of an open source real-time operating system called RTEMS (Real-Time Executive for Multiprocessor Systems). All this makes the board suitable for benchmarking the execution speed of the perception algorithms: although it does not exactly match the specifications of the rover’s image coprocessor module, its design is similar enough that the execution time results can be scaled analytically based on the specification differences between the two platforms.

The perception system, implemented in the C programming language, has

Summer 2015 | Unmanned Systems Technology

[Photo: The ExoMars rover prototype, developed by Airbus, being tested in the desert]

RkJQdWJsaXNoZXIy MjI2Mzk4