Unmanned Systems Technology 021 | Robot Aviation FX450 l Imaging Sensors focus l UAVs Insight l Liquid-Piston X-Mini l Riptide l Eurosatory 2018 show report l Zipline l Electric Motors focus l ASTS show report

Focus | Imaging

One of these chips uses a quad-core 1.2 GHz ARM Cortex-A53 processor with digital signal processing extensions and a floating-point unit, combined with hardware for real-time 360º de-warping and lens distortion correction. It accepts up to 800 MP/s of data with multi-exposure HDR as well as LED flicker mitigation.

The aerial realm

UAV imaging systems can be split into two types with converging requirements. The increasing resolution of imaging sensors is allowing collision detection to be performed using visual data rather than Lidar or radar sensors, reducing the cost and power requirements. This needs sophisticated image processing algorithms to identify objects in the images.

At the same time, there is a growing need for tracking algorithms for cameras operating in a gimbal in a UAV's payload. Using the video feed from the camera, an operator on the ground can select an object in the image and have the camera track it autonomously. This needs image identification and tracking algorithms as well as a real-time, low-latency interface to the gimbal with closed-loop feedback, and it has to be performed before the video is compressed for sending over a radio link.

Processing the image at the camera output is important in UAV applications. Uncompressed, full pixel depth digital video is the best source for most processing functions, especially when tracking objects that are just a few pixels in size. Doing the encoding on the same processor reduces the number of encode-decode cycles to the bare minimum, reducing the power requirement. Putting the object identification in the same processing thread is also beneficial, as it helps to eliminate false positives and improves the situational awareness of the operator. This is even more important when the UAV is linked to multiple ground vehicles in an autonomous swarm, as the detection has to be done onboard.
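The operator-selected tracking described above can be illustrated with a minimal sketch. This is not any vendor's actual algorithm; it is a simple template-matching tracker, written here as an assumed illustration, that searches a small window around the object's last known position and returns both the new position and the pixel offset a gimbal controller would drive toward zero.

```python
import numpy as np

def track_template(frame, template, last_xy, search=8):
    """Find the template's best match near last_xy using sum of squared
    differences (SSD). Returns the new top-left position and the pixel
    error relative to the previous position."""
    th, tw = template.shape
    x0, y0 = last_xy
    best, best_xy = np.inf, last_xy
    # Exhaustive search over a small window; real trackers would use
    # pyramids or correlation filters to keep latency low.
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = x0 + dx, y0 + dy
            if x < 0 or y < 0 or y + th > frame.shape[0] or x + tw > frame.shape[1]:
                continue
            patch = frame[y:y + th, x:x + tw]
            ssd = np.sum((patch.astype(np.float32) - template) ** 2)
            if ssd < best:
                best, best_xy = ssd, (x, y)
    return best_xy, (best_xy[0] - x0, best_xy[1] - y0)
```

Running on uncompressed frames before encoding, as the article notes, matters here: compression artefacts would corrupt the SSD comparison for objects only a few pixels in size.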
Putting the processors on the camera side of the gimbal's slip ring makes the data bandwidth management and system connectivity much easier, and that has an impact on the processing architecture. Current digital signal processors can process a 1080p EO and an SD IR camera feed, carrying out stabilisation, detection, tracking, enhancement, on-screen display and encoding, all at 30 fps on a card measuring 26 x 38 mm with a power consumption of 2.5 W.

The cameras themselves are also improving, along with the lens technology. The first 100 MP sensors, with an array of 11664 x 8750 pixels at a 3.76 µm pitch that use backside illumination (BI), are now being used in payload cameras to provide high light sensitivity and extend the dynamic range to 83 dB, taking advantage of the technology improvement from automotive designs. Camera makers have also worked on the lenses, allowing quieter zoom on lenses with focal lengths from 35 to 150 mm so that the noise of the zoom does not show up on the video feed.

Improvements in the technology are also helping to create smaller EO and IR sensors for smaller UAVs. The reduction in size and weight of sensors has allowed a 32 g UAV, for example, to use both thermal and traditional camera imaging systems. It can fly up to 2 km at speeds of over 21 kph with a mission time of 25 minutes.

August/September 2018 | Unmanned Systems Technology

The Black Hornet weighs just 32 g and can operate with EO and IR camera sensors (Courtesy of FLIR)
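The 100 MP sensor figures quoted above are internally consistent, which a few lines of arithmetic confirm. The snippet below works only from numbers in the text (array size, pixel pitch, dynamic range); the implied sensor dimensions and signal-to-noise ratio are derived values, not manufacturer specifications.

```python
import math

H_PIX, V_PIX = 11664, 8750   # quoted array size for the 100 MP sensor
PITCH_UM = 3.76              # quoted pixel pitch in micrometres
DR_DB = 83                   # quoted dynamic range

# Total pixel count: should come out close to 100 MP.
megapixels = H_PIX * V_PIX / 1e6

# Implied active-area dimensions from pitch x pixel count.
width_mm = H_PIX * PITCH_UM / 1000
height_mm = V_PIX * PITCH_UM / 1000

# Dynamic range in dB is 20*log10(max signal / noise floor);
# inverting 83 dB gives the implied signal-to-noise ratio.
snr = 10 ** (DR_DB / 20)
```

The arithmetic gives roughly 102 MP over a 43.9 x 32.9 mm active area, with an implied full-scale-to-noise ratio of about 14,000:1.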

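The closed-loop gimbal feedback mentioned earlier can be sketched as a proportional controller that converts the tracker's pixel error into a slew-rate command. The function name, gain and field-of-view parameters below are illustrative assumptions, not values from any particular gimbal interface.

```python
def gimbal_rate_cmd(err_px, fov_deg=60.0, width_px=1920, kp=2.0, max_rate=90.0):
    """Convert a tracker's horizontal pixel error into a pan-rate
    command in deg/s. Proportional-only sketch: scale the pixel error
    to degrees using the lens field of view (assumed parameters), apply
    a gain, then clamp to the gimbal's slew-rate limit."""
    deg_per_px = fov_deg / width_px
    rate = kp * err_px * deg_per_px
    return max(-max_rate, min(max_rate, rate))
```

A real loop would add integral and derivative terms and account for the radio-link-free, low-latency path to the gimbal that the article stresses; this sketch only shows the core pixel-to-rate conversion.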