
… system. The latter is used to track multiple uncrewed underwater vehicles and the new D010 diver tracking beacon. “The system features depth and geo-fence alarms, as well as a diver-distress alert cord on each beacon, helping to drastically improve topside situational awareness and diver safety,” says Rachael Reader.

Robosys promoted its new AI autopilot for small survey USVs in the 3-6 m size range. Most of these have electric drives with their own communication protocols, which traditional commercial off-the-shelf autopilots were not designed to work with, according to Nigel Lee. “We have taken our Voyager AI software and integrated it with a hardware autopilot capable of working with analogue and digital steering and propulsion systems,” he says.

He adds that Robosys can configure the autopilot for any vessel currently on the water with the latest technology, and it is not limited to smaller vessels. For example, the team is currently integrating the system into ACUA Ocean’s 13.5 m, hydrogen fuel-cell-powered USV, which uses a steerable electric drive from RAD Propulsion. It is also compatible with TQ bus and TorqLink, as well as ESC and CAN bus control systems.

The autopilot has to ingest data from the vessel’s navigation systems and sensors to work the steering and propulsion systems. This is where Robosys’ Voyager AI software comes in, with its ability to make the same decisions as a human.

“That is what we are doing, particularly when we follow the COLREGS – the rules of the road (RoRs) – at sea. It’s very much about machine learning and a rules-based architecture, particularly when it comes to the COLREGS.

“If you just try to use a neural network or learning at the edge through unsupervised learning, there is the potential to learn errors because you will learn from mistakes that are made, as opposed to applying the rules correctly, which is what I do as an officer of the watch at sea. So we use a mixture of rules-based and machine-learning techniques to react correctly within a dynamically changing environment.”

In addition, Robosys details the RoRs in force to deliver Explainable AI (XAI)-based COLREGS compliance. Robosys has also integrated the Sea AI Sentry computer vision perception system into its Voyager AI, in conjunction with a software version of its USV AutoPilot, for a 30 m tug in India.

Advanced Navigation discussed its AI-enabled inertial navigation systems (INS), including fibre-optic gyroscopes (FOGs) and micro-electro-mechanical systems (MEMS) gyroscopes. The company’s AI fusion neural network helps to reduce drift to a minimum, so the INS keeps track of the host vehicle’s movements for longer in GNSS-denied environments. “What the AI is doing is tracking and removing the sensor errors far more accurately,” says Xavier Orr.

He says traditional navigation systems rely on an algorithm developed in the 1960s, known as the Kalman filter, for guidance and navigation. However, it has limitations in correcting certain errors in the process of determining an object’s position, such as biases, scale-factor errors, instabilities and noise.

“The Kalman filter will have a predefined level of uncertainty associated with each one of those pieces of information. As things change within that platform and the sensors have different levels of performance, you’re not modelling that fully,” he says.
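To make Orr’s point about predefined uncertainties concrete, the minimal one-dimensional Kalman filter sketched below (in Python, with illustrative values rather than anything from Advanced Navigation) fixes its process and measurement noise variances, Q and R, at design time; if the sensor’s behaviour changes mid-run, the filter keeps weighting new measurements exactly as before.

# Minimal 1-D Kalman filter sketch illustrating the "predefined uncertainty"
# Xavier Orr describes: Q and R below are fixed at design time, so if the
# real sensor's bias or noise level changes in flight, the filter keeps
# weighting measurements as if nothing had changed. All values here are
# illustrative assumptions, not Advanced Navigation parameters.

import numpy as np

Q = 0.01   # process-noise variance, chosen (and then frozen) by the designer
R = 0.25   # measurement-noise variance, also predefined

def kalman_1d(measurements, x0=0.0, p0=1.0):
    """Estimate a slowly varying quantity (e.g. a gyro bias) from noisy samples."""
    x, p = x0, p0                     # state estimate and its uncertainty
    estimates = []
    for z in measurements:
        p = p + Q                     # predict: uncertainty grows by the fixed Q
        k = p / (p + R)               # gain: weighting set by the fixed Q and R
        x = x + k * (z - x)           # update with the new measurement
        p = (1.0 - k) * p             # shrink uncertainty after the update
        estimates.append(x)
    return np.array(estimates)

# Example: a gyro bias that shifts mid-run; the fixed Q/R cannot adapt to it.
rng = np.random.default_rng(0)
true_bias = np.concatenate([np.full(200, 0.5), np.full(200, 1.5)])
samples = true_bias + rng.normal(0.0, np.sqrt(R), true_bias.size)
print(kalman_1d(samples)[[0, 199, 220, 399]])

An adaptive, learning-based fusion stage, by contrast, can re-estimate those error terms as conditions change, which is the gap Advanced Navigation says its AI fusion neural network addresses.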
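As a rough illustration of the blend of deterministic rules and machine learning that Lee describes, the sketch below (an assumption-laden Python example, not Robosys’ Voyager AI code) keeps the COLREGS logic as fixed, explainable rules, while a machine-learning perception stage would supply the tracks those rules act on; the encounter classes, rule subset and bearing thresholds are simplified for illustration.

# Illustrative sketch of blending machine-learning perception with
# deterministic, rules-based COLREGS logic, in the spirit of Robosys'
# description. The class names, rule subset and bearing thresholds are
# assumptions for illustration only, not Robosys' Voyager AI implementation.

from dataclasses import dataclass

@dataclass
class Track:
    relative_bearing_deg: float   # bearing of the other vessel from own bow, -180..180
    closing: bool                 # true if range is decreasing (risk of collision)

def classify_encounter(track: Track) -> str:
    """Rules-based stage: map a perceived track to a COLREGS encounter type."""
    b = track.relative_bearing_deg
    if not track.closing:
        return "no_risk"
    if abs(b) <= 6.0:
        return "head_on"              # Rule 14: both vessels alter course to starboard
    if 6.0 < b <= 112.5:
        return "crossing_give_way"    # Rule 15: contact on our starboard side, we give way
    if -112.5 <= b < -6.0:
        return "crossing_stand_on"    # Rule 17: we stand on, monitoring the other vessel
    return "being_overtaken"          # Rule 13: the overtaking vessel keeps clear

ACTIONS = {
    "no_risk": "maintain course and speed",
    "head_on": "alter course to starboard",
    "crossing_give_way": "alter course to starboard and/or slow down",
    "crossing_stand_on": "stand on, be ready to take avoiding action",
    "being_overtaken": "maintain course and speed; overtaking vessel keeps clear",
}

# The machine-learning side (a detector/tracker) would supply Track objects
# from camera, radar or AIS data; the rule table above stays fixed.
print(ACTIONS[classify_encounter(Track(relative_bearing_deg=40.0, closing=True))])

Keeping the rules deterministic is what makes the behaviour auditable against the RoRs in force, while the learned components are confined to perception.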
“Advanced Navigation’s AI fusion …

Caption: Blueprint Subsea’s Oculus M-Series multi-beam imaging sonars measure 125 mm by 122 mm by 62 mm, weigh 980 g in air (360 g in water), and have an operating temperature range of -5 to 5 °C

Caption: Applying AI that constructs models of the vehicles’ dynamics enables Advanced Navigation’s MEMS inertial sensors to perform like FOGs, while improving the latter by a comparable margin
