
Vision of the future

Growing demands on image sensor capabilities are driving advances in their technologies, materials and manufacturing. Nick Flaherty reports

Image sensor design for unmanned systems is being driven by two key capabilities that provide the detection accuracy needed for safe operation, and this is leading to a divergence in sensor design. High dynamic range (HDR) and LED flicker mitigation (LFM) are central to sensor designs, particularly for driverless cars. Applications range from near-infrared (NIR) sensing for lane detection to HDR at 140 dB and above to cope with difficult lighting and accurately detect LED signage. These are different requirements from those of image sensors inside a car, which need more sensitivity to monitor drivers’ eyes for tiredness.

Increasing HDR in an image sensor is a complex balance in the design and manufacturing of the pixel structures in the sensor array; the shutter design also has an impact. A rolling shutter that reads the data from an array of photodiodes row by row consumes less power than a global shutter that captures all the data in one go.

These factors are driving the design of image sensors for driverless vehicles operating at Level 3 and Level 4 autonomy. At the same time, the sensors are getting larger, from arrays of 8 MP to 12 MP and higher. That creates data management challenges owing to the large amounts of data coming off the sensor. One way to address them is to add more processing on the sensor itself, an approach being boosted by new manufacturing techniques.

Combining high dynamic range and LED flicker mitigation helps unmanned systems identify signs in challenging environments such as tunnels (Courtesy of Omnivision)
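To put the figures quoted above in context, here is a minimal sketch of the underlying arithmetic: 140 dB of dynamic range corresponds to roughly a 10^7 ratio between the brightest and darkest signals a pixel must resolve, and moving from an 8 MP to a 12 MP array adds about 50% more raw pixel data per frame. The 12-bit output and 30 frame/s rate used below are assumptions chosen for illustration, not figures from the article.

```python
# Illustrative arithmetic only: dynamic range in dB and raw sensor data rates.
# The 140 dB, 8 MP and 12 MP figures come from the article; the bit depth and
# frame rate are assumptions for the example.
import math

def dynamic_range_db(max_signal: float, noise_floor: float) -> float:
    """Dynamic range expressed in decibels (20 * log10 of the signal ratio)."""
    return 20 * math.log10(max_signal / noise_floor)

# 140 dB corresponds to a brightest-to-darkest signal ratio of 10**7,
# i.e. the pixel must resolve detail across seven orders of magnitude.
ratio_140db = 10 ** (140 / 20)
print(f"140 dB signal ratio: {ratio_140db:.0e}")  # 1e+07

def raw_data_rate_gbps(megapixels: float, bits_per_pixel: int, fps: int) -> float:
    """Uncompressed pixel data rate in gigabits per second."""
    return megapixels * 1e6 * bits_per_pixel * fps / 1e9

# Assumed 12-bit output at 30 frame/s, to show why moving from 8 MP to
# 12 MP arrays strains the downstream data links and processing.
for mp in (8, 12):
    print(f"{mp} MP sensor: {raw_data_rate_gbps(mp, 12, 30):.1f} Gbit/s raw")
```

Under these assumptions the raw output rises from about 2.9 Gbit/s at 8 MP to about 4.3 Gbit/s at 12 MP, which is the data management pressure that on-sensor processing is intended to relieve.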
