Image sensing

Insects inspire sensor tech

(Image caption: Tests show the system needs fewer than 10,000 pixels to control a small UAV)

UK company Opteran is developing a new, event-driven approach to optical sensing for UAVs (writes Nick Flaherty). It has developed a lightweight, low-cost, silicon-based alternative to deep learning that uses optical flow techniques similar to the way insects see. The technology is driven by studies of insect brains carried out as part of the Green Brain and Brains on Board research projects, which have led to software that mimics tasks such as seeing, object sensing, obstacle avoidance, navigation and decision-making. In a recent trial, the researchers were able to control a sub-250 g UAV with complete onboard autonomy, using fewer than 10,000 pixels from a single low-resolution panoramic camera.

Autonomous control of mobile systems needs the flexibility to deal with novel environments and scenarios. The honey bee, an extremely well-studied animal with a brain of only 1 million neurons, exhibits sophisticated learning and navigation abilities through highly efficient neural processes. Bees can navigate reliably over several kilometres in 3D space, learning the features that will enable them to return to their hive, and they can optimise the distances travelled on routes from the hive to multiple forage patches, almost certainly without possessing a mental map. This is in marked contrast to current deep learning algorithms.

Weighing about 30 g and integrating Opteran technology that draws less than 1 W of power, the Opteran Development Kit will allow the technology to be built into a wide variety of applications needing real-time autonomous decision-making.
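To illustrate the kind of optical flow processing the article alludes to, here is a minimal sketch, not Opteran's implementation, of the classic bee-inspired flow-balancing strategy: steer away from the side of the image where apparent motion is faster. The choice of OpenCV's Farneback flow, the frame source and the steering gain are all assumptions made for illustration.

# A minimal sketch, not Opteran's method: bee-inspired "flow balancing"
# on low-resolution grayscale frames.
import cv2
import numpy as np

STEER_GAIN = 0.5  # assumed proportional gain; tune per platform

def steering_from_flow(prev_gray, curr_gray):
    """Return a steering command in [-1, 1] from two grayscale frames.

    Negative steers left, positive steers right: the agent turns away
    from the side where apparent image motion is faster, which keeps
    it centred between obstacles, as bees do in a corridor.
    """
    # Dense Farneback optical flow (pyr_scale, levels, winsize,
    # iterations, poly_n, poly_sigma, flags).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 2, 15, 2, 5, 1.1, 0)
    mag = np.linalg.norm(flow, axis=2)      # per-pixel flow speed
    half = mag.shape[1] // 2
    left = float(mag[:, :half].mean())
    right = float(mag[:, half:].mean())

    # Faster flow on one side implies nearer obstacles on that side.
    imbalance = (left - right) / (left + right + 1e-6)
    return float(np.clip(STEER_GAIN * imbalance, -1.0, 1.0))

On a 100 x 100 panoramic strip this uses under 10,000 pixels per frame, the scale the trial reports, although Opteran's own processing is event-driven rather than frame-based.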
