
Platform one

Sensing

Algo enables precise object recognition without boosting bandwidth

An algorithm can double the resolution of a radar system without replacing the equipment, writes Nick Flaherty.

The algorithm, developed by researchers at the Daegu Gyeongbuk Institute of Science and Technology (DGIST) in Korea, enables precise object recognition using existing hardware specifications, without the need for bandwidth expansion. The approach allows objects outside the vehicle to be identified precisely with lower-cost hardware, or gives autonomous vehicles more resolution from the same radar.

Frequency-modulated continuous-wave (FMCW) radar systems for automotive and aerospace applications require resolution-enhancement technologies to improve object recognition. The answer typically involves boosting the bandwidth or using ultra-high-resolution algorithms, but both add cost and system complexity.

The range resolution of an FMCW radar is determined by its bandwidth, the size of which is proportional to the observation window (OW). If the OW is sufficiently large, multiple sinusoids with small frequency differences can be estimated well. However, if the OW is insufficient, multiple sinusoids with similar frequencies may be incorrectly assumed to be a single frequency.

The research team discovered that additional information embedded in the envelope of the radar signal could be used. The algorithm analyses the features of the received signals from two targets – the beats of the signal envelope – to improve the resolution of target detection using the same bandwidth, achieving nearly twice the resolution through signal processing on existing radar hardware.

The algorithm is based on the fact that the envelope of the beat signal contains information about the difference between the two frequencies, which is used to estimate frequencies missed because the observation window is too small. Specifically, it combines the Fast Fourier Transform (FFT) result that was incorrectly estimated as a single target with the FFT of the envelope of the beat signal.
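For reference, these are the standard FMCW relations behind the article, together with the two-tone identity that gives the beat envelope its difference-frequency content. This is background maths only, not equations taken from the DGIST work; B is the sweep bandwidth, T_c the chirp duration and c the speed of light.

```latex
% Range resolution and beat frequency of an FMCW chirp (standard relations)
\Delta R = \frac{c}{2B}, \qquad f_b = \frac{2BR}{c\,T_c}

% Two targets produce two beat tones; their sum factors as
\cos(2\pi f_1 t) + \cos(2\pi f_2 t)
  = 2\cos\!\Bigl(2\pi\,\tfrac{f_1+f_2}{2}\,t\Bigr)\,
    \cos\!\Bigl(2\pi\,\tfrac{f_1-f_2}{2}\,t\Bigr)
```

The envelope of the two-target beat signal therefore oscillates with its fundamental at |f1 - f2| – exactly the separation that is lost when the plain FFT of a short observation window merges the two tones into one peak.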
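The short sketch below illustrates that property. It is a minimal demonstration, not the DGIST algorithm or code: the sample rate, tone frequencies and record length are assumptions chosen purely to make the effect easy to see, and the envelope is taken with a Hilbert transform for convenience.

```python
# Minimal sketch: the spectrum of the envelope of a two-tone beat signal
# peaks at the difference of the two frequencies. All values are assumed.
import numpy as np
from scipy.signal import hilbert

fs = 1.0e6                           # sample rate, Hz (assumed)
t = np.arange(0.0, 10e-3, 1.0 / fs)  # 10 ms record (assumed)

f1, f2 = 100.0e3, 101.2e3            # beat frequencies of two nearby targets
x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# Envelope via the analytic signal, then its spectrum with the DC term removed.
env = np.abs(hilbert(x))
env_spec = np.abs(np.fft.rfft(env - env.mean()))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

f_diff = freqs[np.argmax(env_spec)]
print(f"envelope spectrum peaks at {f_diff:.0f} Hz; true separation {abs(f1 - f2):.0f} Hz")
```

In the published work, this difference-frequency information is combined with the single, merged FFT peak of a short observation window to separate two targets that would otherwise be reported as one.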
Sensing

Image controller synchronises data from seven visual sensors

eYs3D Microelectronics, a subsidiary of Etron, has launched a multi-sensor image controller for autonomous systems, writes Nick Flaherty.

The eSP936 supports the synchronous processing of data from up to seven visual sensors to provide high image-recognition accuracy, and it can be integrated with multi-modal visual language models in automated guided vehicles, autonomous mobile robots and UAVs.

The controller processes multiple 2D images at high speed and generates 3D depth maps for precise environmental recognition. It has a built-in DRAM chip and wide-angle image de-warping technology, enabling high-precision environmental perception and multi-view, 3D depth-map generation. It also features high-performance data compression to reduce latency, while MIPI+USB simultaneous processing ensures high-quality 2D image and 3D depth-map output, improving image-recognition accuracy.

The eSP936 and the XINK-ll edge spatial computing platform are used in the YX9170 sensor-fusion system. This integrates dual-depth sensors, supporting high-definition images up to 1280 x 720 resolution, and synchronises four RGB cameras for enhanced perception range and recognition accuracy. A highlight of the YX9170 is intelligent sensing: its embedded AI algorithms enable real-time, multi-object recognition from the synchronised images, achieving a 30% reduction in system computational load and latency.

The eSP936 is also used in the YX9670 navigation system for autonomous vehicles. As well as dual-depth sensors and quad RGB cameras giving a 278° panoramic field of view, a monochrome camera provides a 145° overhead view alongside a thermal imaging sensor, and an attitude and heading reference system (AHRS) is used for vehicle coordination. The YX9670's embedded AI algorithms enable real-time panoramic object recognition, navigation direction analysis and multi-target tracking.

A range of control systems based on the eSP936 controller (Image courtesy of Etron)
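As context for the depth-map generation the eSP936 performs in silicon, the sketch below shows the generic stereo pipeline – disparity from a synchronised left/right pair, then depth from disparity – using OpenCV on a host machine. It is purely illustrative: this is not the eSP936 interface, and the file names, focal length and baseline are assumed values.

```python
# Generic stereo depth sketch (illustrative only; not the eSP936 API).
import cv2
import numpy as np

# Synchronised left/right frames from a depth-sensor pair (file names assumed).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching gives per-pixel disparity (pixel shift between the two views);
# OpenCV returns fixed-point values scaled by 16.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Depth from disparity: depth = f * B / d, with focal length f in pixels and
# baseline B in metres taken from the stereo calibration (values assumed).
f_px, baseline_m = 700.0, 0.06
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = f_px * baseline_m / disparity[valid]
```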
