Uncrewed Systems Technology 047 | Aergility ATLIS | AI focus | Clevon 1 UGV | Geospatial insight | Intergeo 2022 report | AUSA 2022 report | Infinity fuel cell | BeeX A.IKANBILIS | Propellers focus | Phoenix Wings Orca

Platform one

Trimble has teamed up with Exyn Technologies to develop an autonomous robotic system for surveying complex environments where existing positioning technologies won't work (writes Nick Flaherty).

The system uses the Spot robot dog from Boston Dynamics with the Trimble X7 3D laser scanner and the ExynPak machine learning platform. The combination allows Spot to move fully autonomously inside complex and dynamic construction sites to capture consistent and precise surveying data.

"Integrating autonomous surveying technology into a construction workflow can improve operational efficiency and transparency throughout a build, while also transforming worker safety for potentially hazardous data collection," said Aviad Almagor, vice-president of technology innovation at Trimble.

A tool called ExynAI can sense and avoid obstacles, dynamically adapting to the changing environment of a site. The ExynPak uses a Velodyne Lidar sensor on a gimbal with two Chameleon 5 MP cameras from FLIR, mounted on Spot, for Level 4 autonomous operation without the need for satellite navigation. That avoids the need for an operator, or for the robot to learn about its environment beforehand.

A surveyor defines a 3D volume for a mission, and the integrated robotic system handles the complexities of self-navigation without needing a map, GPS or wireless infrastructure. The Trimble X7 provides the 3D laser scanning to capture the state of the environment.

The captured data can be uploaded to the Trimble Connect collaboration platform to be shared, including comparison with Building Information Models and previous scans to monitor quality and progress. This creates a detailed map with minimal human intervention and risk.
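The mission-volume workflow described here can be sketched in code. The ExynAI planner is proprietary, so every name and rule below is a hypothetical illustration: a surveyor-defined axis-aligned box bounds the mission, and candidate waypoints are kept only if they lie inside that volume and clear of sensed obstacle points.

```python
from dataclasses import dataclass

# Hypothetical sketch only: the real ExynAI mission planner is proprietary.
# This models the article's "surveyor defines a 3D volume" step as an
# axis-aligned box plus a simple obstacle-clearance filter.

@dataclass
class Box:
    """Axis-aligned 3D mission volume defined by the surveyor (metres)."""
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

    def contains(self, p):
        # Point is inside when every coordinate sits between the corners
        return all(l <= c <= h for l, c, h in zip(self.lo, p, self.hi))

def admissible_waypoints(volume, candidates, obstacles, clearance=0.5):
    """Keep candidate waypoints that stay inside the mission volume and
    at least `clearance` metres from every sensed obstacle point."""
    def clear(p):
        return all(
            sum((a - b) ** 2 for a, b in zip(p, o)) ** 0.5 >= clearance
            for o in obstacles
        )
    return [p for p in candidates if volume.contains(p) and clear(p)]
```

For example, with a 10 x 10 x 3 m volume and one obstacle at (5, 5, 1.1), a waypoint at (2, 2, 1) is admissible, while one outside the box or within half a metre of the obstacle is rejected.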
Ground vehicles: Spot the dog's survey skill
Photo caption: Spot can autonomously survey complex, GNSS-denied construction sites

December/January 2023 | Uncrewed Systems Technology

Seabed safety: AI system detects UXBs

Researchers at the University of Bath in the UK have developed a machine learning (ML) technique to detect unexploded bombs on the seabed (writes Nick Flaherty).

The system uses large, unlabelled survey datasets to aid automatic classification of synthetic aperture sonar (SAS) data. The researchers simulated this data to train an AI framework that could be used by vehicles such as the REMUS 620 (see page 16) to autonomously detect munitions that have been discarded.

Autonomous underwater vehicles equipped with SAS can survey large areas at centimetre resolution, but that generates a lot of data, which calls for an automated approach to detecting and classifying unexploded munitions.

The ML model encodes a representation of SAS images from which new SAS views can be generated. This requires the model to learn the physics and content of the images without human labels, an approach known as self-supervised learning.

Ray tracing, a more accurate graphics technique, was used to generate realistic images, and noise was then added to match the statistics of real SAS images. These were 18,000 images of the Skagerrak UXO dumpsite taken using the HISAS 1030 sonar system. The pre-trained model can then be fine-tuned to perform classification on a small set of labelled examples, with 700 training images and 2500 test images.

A 250 kg, 1.8 m-long bomb was used to demonstrate the accuracy of the system, which performed better than a traditional self-supervised approach and than systems with no pre-training.
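The noise-injection step, where clean ray-traced renders are degraded to match real SAS statistics, can be sketched as follows. The article does not state the Bath team's actual noise model, so this is an illustrative assumption: multiplicative Rayleigh speckle, a common approximation for sonar amplitude imagery.

```python
import random

# Illustrative sketch only: the article does not give the exact noise model.
# Multiplicative Rayleigh speckle is a common choice for sonar amplitude
# images, so it is used here as a stand-in.

def add_speckle(image, sigma=0.5, seed=0):
    """Multiply each clean ray-traced pixel by a Rayleigh-distributed
    speckle sample, mimicking the statistics of real SAS images.

    `image` is a 2D list of non-negative pixel amplitudes; a fixed `seed`
    makes the corruption reproducible across training runs."""
    rng = random.Random(seed)

    def rayleigh():
        # A Rayleigh variate is the magnitude of two independent
        # zero-mean Gaussians with standard deviation sigma
        return (rng.gauss(0, sigma) ** 2 + rng.gauss(0, sigma) ** 2) ** 0.5

    return [[px * rayleigh() for px in row] for row in image]
```

In a pipeline like the one described, each of the 18,000 ray-traced renders would pass through a step of this kind before being used for self-supervised pre-training.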
