Unmanned Systems Technology 013
Platform one

Unmanned Systems Technology's consultants

Dr Donough Wilson
Dr Donough Wilson is innovation lead at VIVID/futureVision, which specialises in game-changing thinking for defence, homeland security, and both manned and unmanned aviation innovations. He was the first to propose the automatic tracking and satellite download of airliner black box data, technology that is now being adopted. His defence innovations include the automatic cockpit vision system that protects military aircrew from asymmetric high-energy laser attack. As a pilot, he has more than 3000 hours of flying experience in both military and civil environments, and is currently a flying instructor and a flight test examiner.

Paul Weighell
Paul has been involved with electronics, computer design and programming since 1966. He has worked in the real-time and failsafe data acquisition and automation industry, using mainframes, minis, micros and cloud-based hardware, on applications as diverse as defence, Siberian gas pipeline control, UK nuclear power, robotics, the Thames Barrier, Formula One and automated financial trading systems.

Ian Williams-Wynn
Ian has been involved with unmanned and autonomous systems for more than 20 years. He started his career in the military, working with early prototype unmanned systems and exploiting imagery from a range of unmanned systems from global suppliers. He has also been involved in ground-breaking research, including novel power and propulsion systems, sensor technologies, communications, avionics and physical platforms. His experience covers a broad spectrum of domains, from space, air, maritime and ground, in both defence and civil applications, including, more recently, connected autonomous cars.

April/May 2017 | Unmanned Systems Technology

Nvidia has launched a second-generation module that will bring machine learning and artificial intelligence to a wide range of autonomous systems, from UAVs to submarines (writes Nick Flaherty).
The Jetson TX2 is a credit card-sized platform that enables AI computing in small systems. It uses a new graphics processing unit (GPU) to provide twice the performance of the previous system for a similar power budget, allowing neural networks to run locally.

The heart of the card is the Pascal GPU, which has 256 of the latest graphics cores alongside two of Nvidia's 64-bit custom Denver processor cores and four ARM Cortex-A57 cores. The 50 x 87 mm module supports faster video links and higher bandwidth memory to deliver the performance needed for image recognition and classification algorithms. It can handle up to six cameras and has a memory bandwidth of 58.3 Gbyte/s.

The Jetson TX2 and its predecessor, the TX1, are both programmed using the JetPack 3.0 software development kit for AI computing. This supports TensorRT 1.0, a high-performance neural network inference engine for production deployment of deep learning applications; cuDNN 5.1, a GPU-accelerated library of primitives for deep neural networks; and VisionWorks 1.6, a software development package for computer vision and image processing.

"Engineers started with computers being able to see, using predefined algorithms that were coded in software," said Eddie Seymour at Nvidia. "That moved on to machine learning, which was an extension of computer vision with the same ethos and coding.

"For a machine to learn, though, it needs to think more like a brain, so we moved from machine learning to deep learning, and that has really revolutionised the area. Deep learning and neural nets are a completely different approach to programming."

Neural nets are trained to recognise and classify images, and then run on GPU processors such as Pascal in autonomous designs. "GPUs are extremely parallel devices, and engineers saw that they could use that to build deep learning, which is a way of recognising images without traditional programming," Seymour said.
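The distinction Seymour draws can be sketched in a few lines: instead of hand-coded rules, a trained network maps inputs to class probabilities through learned weights. The toy two-layer network below is purely illustrative (the weight values, layer sizes and the classify function are invented for this sketch, not Nvidia's or TensorRT's); a real deployment would load weights produced by training and run the same arithmetic, massively parallelised, on the GPU.

```python
import math

def relu(xs):
    # Rectified linear activation: negative sums are clipped to zero.
    return [max(0.0, x) for x in xs]

def dense(inputs, weights, biases):
    # Each output is a weighted sum of all inputs plus a bias --
    # the learned parameters replace hand-written detection rules.
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def softmax(scores):
    # Convert raw scores into probabilities that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy "trained" parameters: 3 inputs -> 2 hidden units -> 2 classes.
# These values are illustrative only, not real trained weights.
W1 = [[0.5, -0.2, 0.1],
      [0.3,  0.8, -0.5]]
b1 = [0.1, -0.1]
W2 = [[ 1.0, -1.0],
      [-1.0,  1.0]]
b2 = [0.0, 0.0]

def classify(pixels):
    hidden = relu(dense(pixels, W1, b1))
    return softmax(dense(hidden, W2, b2))

probs = classify([0.9, 0.2, 0.4])  # probabilities for two classes
```

Every weighted sum here is independent of the others, which is why the highly parallel GPU architecture Seymour describes suits this workload so well.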
"Now we have to go to the deployment mode, and we are now able to deploy these deep neural networks for particular functions."

The module is aimed at smaller autonomous systems. For AI in driverless cars, Nvidia is developing the PX3 card, which uses the next-generation Xavier GPU processor and will be available at the end of this year.

The Jetson TX2 developer kit, which includes the carrier board and Jetson TX2 module, is available now, while the card module will be available in the middle of this year. The Jetson TX1 kit is still available.

AI focus, page 32

Artificial intelligence
AI branches out
Nvidia has kept the credit card format of its GPU accelerator card but doubled the performance for AI in UAVs