Thermal imaging sensors | Focus: Image processing

While computer hardware is constantly advancing, to the benefit of nearly every type of unmanned system, a few key innovations stand out for thermal sensor developers. Perhaps the most critical are SWaP-optimised processors that support parallel computing, in which many calculations or processes are carried out simultaneously so that large tasks can be broken down into smaller ones. This approach is particularly useful in fields where size and power constraints prevent processor frequency from being scaled up.

Intel's Myriad vision processing units (VPUs), for example, are among those being used in new IR camera cores. They contain multiple parallel programmable cores based on the SHAVE (Streaming Hybrid Architecture Vector Engine) architecture from Movidius, which was originally developed for game physics but has been slightly modified for machine vision applications.

The SHAVE architecture greatly enhances the ability of thermal sensor developers (particularly in autonomous mobility and aerial monitoring) to divide vision operations into multiple smaller components that can be processed in parallel at very low power – potentially 20 Gflops or 180 MHz at 300 mW. SHAVE cores also help with machine learning, as artificial neural networks run very quickly on parallel computing architectures – to the point that some thermal cameras now feature a second Myriad VPU on which end-users can install and train their own neural network software. By bundling this architecture with embedded memory, a very compact, low-power thermal camera capable of intelligent object detection and classification can be created for a range of unmanned vehicles. Such cameras could also be applied during or after the manufacture of unmanned vehicles and subsystems to inspect for structural flaws.
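The parallel decomposition described above can be illustrated with a minimal sketch. The function names and the hot-pixel threshold are illustrative only, and the example uses CPU threads via Python's `concurrent.futures` rather than SHAVE cores, but the principle – splitting one vision operation over a frame into smaller pieces that run simultaneously – is the same.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def flag_hot_pixels(tile, threshold):
    """Flag pixels hotter than a threshold within one image tile."""
    return tile > threshold

def detect_hot_regions(frame, threshold=310.0, n_tiles=4):
    """Split a thermal frame into horizontal strips and process them in parallel."""
    strips = np.array_split(frame, n_tiles, axis=0)
    with ThreadPoolExecutor(max_workers=n_tiles) as pool:
        results = list(pool.map(lambda s: flag_hot_pixels(s, threshold), strips))
    # Reassemble the per-strip results into one full-frame mask
    return np.concatenate(results, axis=0)

# Simulated radiometric frame in kelvin, with one hot 20 x 20 pixel object
frame = np.full((480, 640), 295.0)
frame[100:120, 200:220] = 330.0
mask = detect_hot_regions(frame)
print(int(mask.sum()))  # -> 400 hot pixels
```

Each strip is independent of the others, which is what lets the work scale across however many cores the processor exposes.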
The latest Qualcomm Snapdragon SoC (system-on-chip) solutions have also been rapidly adopted by thermal image processing companies, as they provide advanced graphics and pixel-manipulation capabilities at low power consumption. Developing custom circuitry and integrating the right processing units can be time-consuming and costly, spurring demand for such lightweight, low-power SoCs. In addition to the faster speeds and smaller sizes of SoCs coming to market, systems such as the Snapdragon are highly integrated combinations of multiple CPU cores, GPUs, modems, and digital and image signal processors. They also include large numbers of I/Os for mission-critical ancillary inputs such as GNSS, video, imagery and audio, as well as AI software such as artificial neural networks and machine learning.

The Zynq SoCs from Xilinx are enjoying similar popularity, as they are built around a combination of an ARM processor and an FPGA. They enable broad software integration via the former and hardware integration via the latter, supporting a range of thermal data analytics and machine vision while integrating variations of CPUs, DSPs and other units on a single device.

Different kinds of thermal camera can even be combined, using AI-based image processing algorithms to autonomously output actionable information from mapping data gathered within a given region but across different parts of the electromagnetic spectrum. Significant enhancements can thus be added to the thermal data before it reaches either the autopilot or the user's visual display. A combination of LWIR/MWIR with NIR and visible-light cameras therefore produces a system that overcomes the limitations of each band. It also opens up a range of new analytical capabilities, and the image outputs of the thermal sensors can be enhanced through software at the same time.
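The idea of combining bands so that each compensates for the other's limitations can be sketched in a few lines. This is a simple weighted pixel-level fusion under the assumption that the two frames are already co-registered at the same resolution; the function names, the weighting scheme and the `alpha` value are all illustrative, not any particular vendor's algorithm.

```python
import numpy as np

def normalise(img):
    """Scale an image to the [0, 1] range, regardless of its bit depth."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

def fuse_bands(visible_gray, lwir, alpha=0.6):
    """Weighted fusion of a visible-light luminance frame with an LWIR frame.

    alpha weights the visible band; (1 - alpha) weights the thermal band.
    Both inputs must already be co-registered at the same resolution.
    """
    return alpha * normalise(visible_gray) + (1 - alpha) * normalise(lwir)

# Stand-in data: an 8-bit visible frame and a 16-bit LWIR frame
vis = np.random.default_rng(0).integers(0, 256, (480, 640))
lwir = np.random.default_rng(1).integers(0, 65536, (480, 640))
fused = fuse_bands(vis, lwir)
print(fused.shape)  # (480, 640)
```

Real systems use far more sophisticated fusion (edge-preserving, multi-scale or learned), but even this sketch shows why both frames must first be brought into a common geometry and value range.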
While the visual display must typically output 8-bit, 256-colour images, this standard omits significant information, even in the visible spectrum. However, thermal information from IR and NIR overlays can be reworked algorithmically and inserted into colour maps for a deeper understanding of environmental, agricultural or other data. For example, conservationists could fly a UAV over large areas with an EO camera to map out wilderness and jungle, while using a LWIR camera to sense the number and distribution of otherwise camouflaged animals in the region. NIR and machine vision algorithms would help determine the species.

New SoCs are opening up avenues for thermal imaging analytics, without taking up undue power or weight (Courtesy of Sightline Applications)

Unmanned Systems Technology | February/March 2020
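The 8-bit colour-mapping step described above can be sketched as follows. High-bit-depth thermal counts are compressed to 256 palette indices, which are then looked up in a colour table; the ramp values in the example palette are made up for illustration and do not reproduce any specific vendor's palette.

```python
import numpy as np

def thermal_to_palette(frame, levels=256):
    """Compress a high-bit-depth thermal frame to 8-bit palette indices
    suitable for a 256-colour display lookup table."""
    f = frame.astype(np.float64)
    lo, hi = f.min(), f.max()
    if hi == lo:
        return np.zeros(frame.shape, dtype=np.uint8)
    idx = (f - lo) / (hi - lo) * (levels - 1)
    return idx.astype(np.uint8)

# Illustrative hot-metal-style palette: black -> red -> yellow -> white
palette = np.zeros((256, 3), dtype=np.uint8)
palette[:, 0] = np.clip(np.arange(256) * 2, 0, 255)          # red ramps up first
palette[:, 1] = np.clip((np.arange(256) - 128) * 2, 0, 255)  # then green
palette[:, 2] = np.clip((np.arange(256) - 192) * 4, 0, 255)  # blue last, giving white

frame = np.linspace(2800, 3200, 480 * 640).reshape(480, 640)  # raw sensor counts
indices = thermal_to_palette(frame)
rgb = palette[indices]  # final 8-bit colour image
print(rgb.shape, rgb.dtype)  # (480, 640, 3) uint8
```

Linear scaling between the frame's minimum and maximum is the simplest choice; production cameras often substitute histogram equalisation or plateau equalisation here so that small hot objects are not washed out by scene extremes.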