Unmanned Systems Technology 002 | Scion SA-400 | Commercial UAV Show report | Vision sensors | Danielson Trident I Security and safety systems | MIRA MACE | Additive manufacturing | Marine UUVs
Focus | Vision sensors

capability and make the most intelligent assignment of the functions to the hardware architecture. This is a key step towards having a controller for a self-driving car with sensor fusion that combines the data from all the sensors – not just the image sensors – to make decisions. Having a processor that can handle other information alongside the image data is a key step in the gradual introduction of autonomous functions.

One chip designer expects to have 237 customer designs for ADAS by 2017 using a family of vision processors based on the MIPS processor architecture. This sensor data processing capability drove the definition of a new family of processor cores, the M51xx, which will sit at the heart of the next generation of devices for customers such as General Motors, Volvo and Honda.

The processing architecture is not necessarily fixed, though. One supplier, which launched its first automotive image processor in 2004 around the architecture from chip designer ARM, has changed its approach with its fourth-generation design. The hardware-based architecture of the latest device moves from ARM to ten dedicated media processor engines (MPEs) supported by a custom-designed very long instruction word (VLIW) coprocessor core called MeP, running applications such as traffic signal detection and lane departure warning. These are optimised for transferring the large amounts of data generated by the vision systems.

Architectures

The MPE media processor cores are themselves supported by 14 hardware accelerators for processing the image data, and the company now supplies the application programming interface and libraries that developers use to program the devices as camera functions become more standardised.
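The sensor-fusion idea described above – weighting the readings from several sensors, not just the image sensors, into a single decision – can be sketched in a few lines. This is a minimal, hypothetical illustration only: the sensor types, readings and noise figures are invented, and production ADAS controllers use far more elaborate techniques such as Kalman filtering.

```python
# Minimal sketch of fusing two range estimates for one tracked object.
# Inverse-variance weighting: the sensor with lower measurement noise
# contributes more to the fused value. All numbers here are invented.

def fuse_estimates(measurements):
    """Fuse (value, variance) pairs into one (value, variance) estimate."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    fused_variance = 1.0 / total  # always smaller than any single input
    return fused_value, fused_variance

# Hypothetical camera range estimate: 42.0 m with variance 4.0 (noisier)
# Hypothetical radar range estimate:  40.0 m with variance 1.0 (more certain)
fused, fused_var = fuse_estimates([(42.0, 4.0), (40.0, 1.0)])
# fused is pulled towards the radar reading, and the fused variance is
# lower than either input's, reflecting the benefit of combining sensors.
```

For independent Gaussian estimates this weighting is the minimum-variance linear combination, which is why the fused uncertainty is always lower than that of the best single sensor.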
Moving from one processor architecture to another has traditionally been regarded as a difficult task, but by separating the control code from the image recognition algorithms running on the MPE coprocessors, it is easier to move the control tasks to the MeP using a standard C++ compiler.

However, the ARM architecture remains a key one for automotive systems, and another third-generation image system combines two 500 MHz digital signal processing cores with a fully programmable vision acceleration core alongside two 200 MHz ARM Cortex-M4 cores, along with an image signal processor.

Another, relatively new entrant into this technology is using two ARM Cortex-A15 processor cores with a graphics core and four of its own image processing cores, as well as a separate video decoding core. There is also a core dedicated to viewpoint conversion of the video, where six channels can each support a high-resolution camera so that the combined output can produce high-definition surround-view images. The part started sampling at the end of 2014 and is set for mass production in 2016.

The challenge for vision sensors in autonomous vehicles has been to bring all the systems together and make them work as one, and that is the direction in which the market is heading. Combining electronics and physics allows more focus on detecting light and turning it into digital information. This is helping OEMs build better systems, and it allows the different types of optics to be integrated with the thermo-mechanical elements into a small housing, along with the corrections needed in the optical system to see what the customer wants to see.

Acknowledgements

The author would like to thank Austin Richards at FLIR Commercial Systems, Marco van Hout at 3D-One and Klaus Neuenhuskes at Toshiba for their input into this article, plus contributions from Sensors Unlimited, Headwall Photonics, Xenics, congatec, Renesas and Mobileye.
Spring 2015 | Unmanned Systems Technology

[Figure] The architecture of the latest advanced driver assistance chip is a key step towards handling video in driverless cars (Courtesy of Toshiba)