
Platform one | February/March 2016 | Unmanned Systems Technology

NVIDIA has announced the latest version of its in-car computer system, which provides supercomputer-level performance for sensors, control systems and ‘deep learning’, a form of artificial intelligence (AI), in driverless vehicles. The Drive PX2 is being used by Volvo, which plans to have 100 XC90 cars available to lease to the public in Gothenburg by 2018. The vehicles will be able to operate autonomously in the city and semi-autonomously elsewhere as part of Volvo’s Drive Me programme.

The Drive PX2 will provide 8 teraflops of processing power, equivalent to 150 MacBook Pros, to process the inputs of 12 video cameras plus Lidar, radar and ultrasonic sensors, and fuse them to detect and identify objects accurately, determine where the car is relative to the world around it, and then calculate its optimal path for safe travel.

NVIDIA is also supplying DriveWorks, a suite of software tools, libraries and modules that enables sensor calibration, data acquisition, synchronisation and recording, and then processes the streams of sensor data through a pipeline of algorithms running on all of the Drive PX2’s processors. Software modules are included for every aspect of the autonomous driving system, from object detection, classification and segmentation to map localisation and path planning.

For map localisation and path planning, the system compares real-time situational awareness with a known high-definition map, enabling it to plan a safe route and drive precisely along it, adjusting to changing circumstances. The Drive PX2 will also perform other critical functions, such as stitching the camera inputs to create a complete surround view of the car.
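The pipeline stages described above (calibration, acquisition, synchronisation, recording, then algorithmic processing) can be sketched in outline. The snippet below is a hypothetical illustration of the synchronisation step only, not NVIDIA's actual DriveWorks API; all names and the 33 ms camera-frame window are invented for clarity, and real systems rely on hardware timestamping.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One timestamped reading from a camera, Lidar, radar or ultrasonic sensor."""
    sensor: str
    timestamp_us: int
    payload: bytes = b""

def synchronise(samples, window_us=33_000):
    """Group samples whose timestamps fall within one camera frame (~33 ms).

    Hypothetical stand-in for the synchronisation stage of a
    DriveWorks-style pipeline: each returned group holds the readings
    that a fusion algorithm would treat as simultaneous.
    """
    samples = sorted(samples, key=lambda s: s.timestamp_us)
    groups, current, start = [], [], None
    for s in samples:
        if start is None:
            start = s.timestamp_us
        if s.timestamp_us - start <= window_us:
            current.append(s)
        else:
            groups.append(current)
            current, start = [s], s.timestamp_us
    if current:
        groups.append(current)
    return groups
```

A fusion stage would then consume each group as one coherent snapshot of the car's surroundings before object detection and path planning run on it.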
The Drive PX2 board uses two Tegra X1 processors, each with four ARM A57 and four A53 general-purpose 64-bit processor cores (with 2 Mbytes of level 2 cache for the A57s and 512 kbytes for the A53s) and 256 graphics processing unit (GPU) cores, built on a 16 nm silicon process.

The board has the processing power to run deep learning algorithms in real time. These are a form of AI that uses neural networks to recognise and categorise new objects and anticipate potential threats. The board can process 24 trillion deep learning operations a second, ten times the performance of the first-generation Drive PX board.

NVIDIA has also developed a software tool called Digits to speed up the development, training and visualisation of the neural networks used in deep learning algorithms. “Using Digits, in less than four hours we achieved over 96% accuracy using Ruhr University Bochum’s traffic sign database,” said Matthias Rudolph, director of Architecture Driver Assistance Systems at Audi in Germany. BMW, Daimler and Ford are also using this technology.

“Deep learning on Digits has allowed for a 30 times enhancement in training pedestrian-detection algorithms, which are being further tested and developed as we move them onto Drive PX,” said Dragos Maciuca, technical director of the Ford Research and Innovation Center in the US.

The experience with Digits has also led to NVIDIA’s own neural network, called DriveNet. This has 37 million neurons, with nine inception layers taking in signals and three convolutional layers bringing the data together. Running data through the network once takes 40 billion operations, which is why the PX2 board is necessary.

The PX2 board will be generally available in the fourth quarter of 2016, but will also be available to early-access development partners in the second quarter.

Hardware with AI horsepower | Driverless vehicles
[Image caption: NVIDIA’s PX2 board will provide 8 teraflops of processing power to fuse a range of driverless vehicle sensor inputs]
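The two figures the article quotes allow a back-of-envelope check on why the PX2 is needed for DriveNet: at 24 trillion deep learning operations a second, with 40 billion operations per forward pass, the board could in theory run about 600 passes a second. This assumes the entire deep learning budget is devoted to DriveNet alone, which a real system sharing the board with other workloads would not do.

```python
# Back-of-envelope throughput estimate from the figures quoted in the
# article (assumption: all 24 TOPS are devoted to DriveNet alone).
DEEP_LEARNING_OPS_PER_SEC = 24e12   # Drive PX2: 24 trillion ops/s
OPS_PER_DRIVENET_PASS = 40e9        # one DriveNet forward pass: 40 billion ops

passes_per_second = DEEP_LEARNING_OPS_PER_SEC / OPS_PER_DRIVENET_PASS
print(passes_per_second)  # 600.0 theoretical peak forward passes per second
```

Even a fraction of that theoretical rate comfortably covers the 30 to 60 frames a second that the car's cameras deliver.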

RkJQdWJsaXNoZXIy MjI2Mzk4