Behind the scenes

Nick Flaherty reports on developments underpinning the emergence of more powerful, lighter and cheaper sensors

Imaging sensors are becoming an essential part of the design of autonomous systems, especially in cars and trucks. The move to driverless cars is pushing up performance while driving down weight and cost for sensors operating not only in the visible spectrum but also in the near-infrared (IR), for low-light conditions and at night.

All of this is pushing the design of sensor architectures in different directions. Some vision sensor system makers are using single sensors; others are using combinations of them. Some are using localised processing to reduce the amount of data that has to be carried around the vehicle, at the cost of more processing power at the sensor, while others are using centralised processing boards. In addition, demonstration driverless cars have been developed that use only cameras rather than a combination of cameras, radar and Lidar sensors, highlighting the potential of the technology.

Some demonstration driverless cars use only cameras, rather than radar and Lidar sensors as well (Courtesy of Ambarella)

The technology for imaging sensors has been moving away from more expensive charge-coupled devices (CCDs) to sensors built on the same CMOS technology used to make most silicon chips these days.

CCD versus CMOS

In a CCD sensor, the charge generated in a pixel when it is hit by photons is transferred through a very limited number of output nodes (often only one) to be converted into a voltage, then buffered and sent off-chip as an analogue signal. One advantage here is that all of the pixel can be devoted to light capture, and the output's uniformity is high, making it a good choice where image quality matters, for cameras in the payload of a UAV, for example.

By contrast, a CMOS sensor has pixels that include their own charge-to-voltage conversion, and the sensor often also includes amplifiers, noise correction and digitisation circuits so that the chip outputs digital bits. These extra functions increase the design complexity and reduce the area available to capture photons, but that is compensated for by a lower cost of manufacturing. With each pixel doing its own conversion, however, the uniformity is lower.
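The effect of that readout difference can be illustrated with a toy numerical model. The sketch below is not taken from any particular sensor: the resolution, conversion gain and the roughly 1% pixel-to-pixel gain spread are assumed figures chosen purely for illustration. It images a perfectly flat scene through a CCD-style single output node and through CMOS-style per-pixel conversion, and reports the resulting photo-response non-uniformity in each case.

import numpy as np

rng = np.random.default_rng(42)

H, W = 480, 640                       # assumed sensor resolution
flat_field = np.full((H, W), 1000.0)  # identical charge (electrons) in every pixel

# CCD-style readout: every pixel's charge passes through one output node,
# so a single conversion gain applies to the whole frame.
ccd_gain = 0.05                       # volts per electron (assumed)
ccd_frame = flat_field * ccd_gain

# CMOS-style readout: each pixel has its own conversion circuit, so each
# pixel sees a slightly different gain (assumed ~1% spread here).
cmos_gain = rng.normal(loc=0.05, scale=0.0005, size=(H, W))
cmos_frame = flat_field * cmos_gain

def non_uniformity_percent(frame):
    # Spread of the response to a flat scene, as a percentage of the mean.
    return 100.0 * frame.std() / frame.mean()

print(f"CCD  flat-field non-uniformity: {non_uniformity_percent(ccd_frame):.3f} %")
print(f"CMOS flat-field non-uniformity: {non_uniformity_percent(cmos_frame):.3f} %")

Run against the same flat field, the CCD-style model shows essentially zero spread while the CMOS-style model shows the fixed-pattern variation that real devices have to correct for on-chip.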
