Vision sensors | Focus

…systems, fusing multiple sensors into compact and lightweight systems.

A key factor for airborne systems is the ability to combine daylight imaging sensors, hyper-spectral imaging sensors, and SWIR and LWIR sensors, along with the electronics, optics and vision system programs. This is the way to achieve minimum power consumption and weight, whether the system is used for control of the UAV or as a payload.

One application here has been to combine two hyper-spectral sensors for precision agricultural monitoring. One camera sees from deep UV to visible light, from 400 to 600 nm, and the second from blue to IR (600-1000 nm). Coupled with this is an intelligent data acquisition board controlled by a processor running a customised embedded Linux operating system. Other mission sensors, such as a GPS, motion sensors and radiance sensors that measure the spectrum of the incoming light, can also be integrated into a complete data capture system, either for control or for the payload.

Low power (so that a smaller, lighter battery can be used) and low mass are key design requirements, as every extra 100 g reduces flight time by a minute on many UAVs. To help achieve this, one company has designed an acquisition board measuring 95 x 95 mm that hosts a field-programmable gate array (FPGA), a microchip that can be programmed by the user after manufacture; the FPGA implements the control algorithms in hardware rather than software. This is then integrated with the OEM image sensors. The company designs its own daylight cameras and optics, while the IR cameras come from a third-party supplier. The FPGA board then connects to a standard PC board in the Q7 format that can run the latest single-, dual- or quad-core processors.

Most of the pre-processing is handled in the FPGA, such as balancing multiple cameras to run synchronously in frame lock, as well as specific features such as false colouring and synchronising all the image channels from the sensors.

A key processing algorithm on the FPGA is 'decubing' the hyper-spectral image data. An ordinary camera carries out RGB colour processing, but a hyper-spectral sensor has to handle hundreds more 'colours', so the image becomes a cube, with the x and y axes giving the spatial resolution and the z axis the colours. This is handled in real time in the FPGA, which takes position data from the GPS and inertial navigation systems, time-stamps the data and streams it to an onboard solid-state drive (SSD) or to a modem for wireless transfer to a ground station; a simplified code sketch of this data flow appears at the end of this article. The Q7 board then handles image processing and control, file management for the SSD, and system management.

Integration

Compared with driverless road vehicles, this kind of integration is relatively new for UAVs. For road vehicles, the integration of the vision sensor algorithms has already moved to more dedicated programmable devices that are being used for advanced driver assistance systems (ADAS). These processors provide functions such as lane departure warning, advanced cruise control, traffic sign recognition, pedestrian and object detection, forward collision warning, and preventing reversing if there is an obstruction behind the vehicle. These are all key algorithms for using the data from a vision sensor to control a self-driving car, and it is here that the architecture of the controller is vital.
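To illustrate why the controller architecture matters, the sketch below shows one hypothetical way such ADAS functions could share a single stream of camera data: each frame is fanned out to independent detection modules and their warnings are collected by a supervising controller. It is a minimal Python mock-up for illustration only; the names (Frame, AdasController, lane_departure and so on) are assumptions, not a description of any particular ADAS processor.

```python
from typing import Callable, List, Optional


class Frame:
    """Stand-in for one synchronised set of camera images (details omitted)."""
    pass


# Each ADAS function is modelled as a callable that inspects a frame and
# returns a warning string, or None if nothing needs the driver's attention.
AdasFunction = Callable[[Frame], Optional[str]]


def lane_departure(frame: Frame) -> Optional[str]:
    return None   # placeholder: a real system would track lane markings


def traffic_sign_recognition(frame: Frame) -> Optional[str]:
    return None   # placeholder: a real system would classify detected signs


def forward_collision_warning(frame: Frame) -> Optional[str]:
    return None   # placeholder: a real system would estimate time-to-collision


class AdasController:
    """Fans each frame out to every registered function and collects warnings."""

    def __init__(self, functions: List[AdasFunction]) -> None:
        self.functions = functions

    def process(self, frame: Frame) -> List[str]:
        warnings = []
        for fn in self.functions:
            result = fn(frame)
            if result is not None:
                warnings.append(result)
        return warnings


controller = AdasController([lane_departure,
                             traffic_sign_recognition,
                             forward_collision_warning])
print(controller.process(Frame()))   # -> [] until real detectors are plugged in
```

In a real vehicle each placeholder would be a hardware-accelerated algorithm sharing the same sensor data, which is why the choice of dedicated programmable devices for this work matters.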
The camera applications are now so complex, with up to eight image sensors running in parallel as well as other sensors such as radar and even Lidar laser detection systems, that it is important to understand which functions are required for controlling the vehicle. For that you need the hardware …

[Image: The image processing used for advanced driver assistance systems (ADAS) is being extended into the control of driverless cars using the same data and algorithms (Courtesy of Toshiba)]
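As a rough illustration of the 'decubing' and time-stamping flow described earlier, the sketch below assembles scan lines from a hyper-spectral sensor into an x-y-band cube, tags each line with a GPS fix and a timestamp, and stacks the result ready for streaming to an SSD or modem. It is a hedged Python mock-up, not the company's FPGA implementation; the names HyperspectralCube, GpsFix and add_line are hypothetical, and the 100-band count is an illustrative assumption.

```python
import time
from dataclasses import dataclass, field

import numpy as np


@dataclass
class GpsFix:
    """Position report from the GPS / inertial navigation system (hypothetical fields)."""
    lat: float
    lon: float
    alt_m: float


@dataclass
class HyperspectralCube:
    """Image cube: x and y are the spatial axes, z indexes the spectral bands ('colours')."""
    bands_nm: np.ndarray                         # centre wavelength of each band
    frames: list = field(default_factory=list)   # one (y, bands) line per scan position
    fixes: list = field(default_factory=list)    # timestamp and GPS fix per scan line

    def add_line(self, line: np.ndarray, fix: GpsFix) -> None:
        """'Decubing' step: append one spatial line of spectra, time-stamped and geo-tagged."""
        assert line.shape[1] == self.bands_nm.size
        self.frames.append(line)
        self.fixes.append((time.time(), fix))

    def to_array(self) -> np.ndarray:
        """Stack the scan lines into the full x * y * bands cube for streaming to storage."""
        return np.stack(self.frames, axis=0)


# Example: a sensor covering 600-1000 nm, split here into 100 bands, scanning 64-pixel lines.
cube = HyperspectralCube(bands_nm=np.linspace(600, 1000, 100))
for _ in range(10):
    scan_line = np.random.rand(64, 100)          # stand-in for real sensor data
    cube.add_line(scan_line, GpsFix(52.07, -1.02, 120.0))

data = cube.to_array()                           # shape (10, 64, 100): x, y, bands
print(data.shape)
```

On the real system this work is done in FPGA logic rather than software precisely so that the cube can be built, geo-tagged and streamed in real time within the power and mass budget described above.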