Issue 57 Uncrewed Systems Technology Aug/Sept 2024 Schiebel Camcopter | UTM | Bedrock AUV | Transponders | UAVs Insight | Swiss-Mile UGV | Avadi Engines | Xponential military report | Xponential commercial part 2 report

Platform one

Robotic applications: Neuromorphic vision tracks images at high speed

Researchers in the US have developed a high-speed, neuromorphic vision sensor for robotic applications, writes Nick Flaherty.

A team led by University of Maryland computer scientists developed the Artificial Microsaccade-Enhanced Event Camera (AMI-EV), which tracks events in an image even at high speeds.

"Event cameras are a relatively new technology, better at tracking moving objects than traditional cameras, but today's event cameras struggle to capture sharp, blur-free images when there is a lot of motion involved," said researcher Botao He. "It's a big problem because robots and many other technologies, such as self-driving cars, rely on accurate and timely images to react correctly to a changing environment."

Microsaccades are small, quick eye movements that occur involuntarily when a person tries to focus, allowing the human eye to maintain accurate focus on an object and its visual textures over time.

"We figured that, just as our eyes need those tiny movements to stay focused, a camera could use a similar principle to capture clear, accurate images without motion-caused blurring," He said.

The team replicated microsaccades by inserting a rotating prism inside the AMI-EV to redirect the light beams captured by the lens. The continuous rotation of the prism simulated the movements that occur naturally within a human eye, allowing the camera to stabilise the textures of a recorded object just as a human would. The team then developed software to compensate for the prism's movement, consolidating stable images from the shifting light.

"When you're working with robots, replace the eyes with a camera and the brain with a computer. Better cameras mean better perception and reactions for robots," said Prof Yiannis Aloimonos, director of the Computer Vision Laboratory at the University of Maryland Institute for Advanced Computer Studies (UMIACS).

Event-driven cameras have advantages over classical vision systems, including better performance in extreme lighting, low latency and low power consumption. In testing, the AMI-EV captured and displayed movement accurately in a variety of contexts, including rapid identification of moving shapes. It could capture motion at tens of thousands of frames per second, outperforming most typical commercial cameras.
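The compensation step the article describes can be illustrated in outline: because the prism's rotation is deliberate and known, the image shift it injects at any instant can be subtracted from each event's coordinates before the events are accumulated into a stable frame. The sketch below is illustrative only, not the AMI-EV team's actual pipeline; the circular-shift model, the function name and all parameters are assumptions for the example.

```python
# Illustrative sketch of prism-motion compensation for an event camera.
# Assumption: the rotating prism shifts the image along a circle of known
# radius at a known rotation rate, so the shift at time t can be removed.
import math
import numpy as np

def stabilise_events(events, prism_hz, radius_px, width, height):
    """Accumulate events into a frame after removing the prism-induced shift.

    events:    iterable of (t_seconds, x, y, polarity) tuples
    prism_hz:  prism rotation rate in revolutions per second (assumed known)
    radius_px: radius in pixels of the circular shift the prism induces
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for t, x, y, polarity in events:
        phase = 2.0 * math.pi * prism_hz * t
        # Deliberate, known shift injected by the prism at time t
        dx = radius_px * math.cos(phase)
        dy = radius_px * math.sin(phase)
        xs = int(round(x - dx))  # "de-rotate" the event back
        ys = int(round(y - dy))
        if 0 <= xs < width and 0 <= ys < height:
            frame[ys, xs] += 1 if polarity else -1
    return frame
```

With this model, events generated by a static scene point, smeared into a circle by the prism, all collapse back onto a single pixel once the known shift is subtracted, which is the stabilising effect the software compensation aims for.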
