level edge computing network, which leans heavily on Nvidia Jetson AGXs. “By and large, the computers are overspecced, because it’s important to have sufficient computing resources in case one or even several fail, so we always have enough processing power as back-up.”

Selection of cables and connectors drew heavily on Phaneuf’s and Scott’s experience of making defence-grade UUVs and USVs, with their choices largely being a compromise between reliability, performance and price.

Machine learning with IBM

The aforementioned system of camera stations created a unique database of more than a million nautical images, which (along with some open-source imagery) was used to develop the AI Captain’s computer vision for detecting and categorising ships, offshore structures and potential hazards. An IBM Power AC922 server, powered by IBM Power9 CPUs and Nvidia V100 Tensor Core GPUs, was used to handle the machine learning and other AI processes needed for the object classification.

(Image caption: The machine learning was run on an IBM Power AC922 server, powered by IBM CPUs and Nvidia GPUs, while a fintech app called ODM was used for complex decision-making.)

“The AI responses to those detections are based not just on COLREGs but more on our lengthy experience of handling sea vessels, and on what the Mayflower actually needs to do, especially regarding how the metadata suggests a given ship, once detected and classified, is going to move,” Phaneuf says.

“Estimates of inertia, for example, play a big part in the AI decision-making: how our ship manoeuvres around a container vessel is very different from how it should respond to a jet ski. Once the AI Captain has an object’s vector, potential range of movement, and an idea of how quickly or suddenly it might move, it can calculate a Closest Point of Approach [CPA], which essentially marks a no-go zone around each foreign object for safety.” (A simplified sketch of this calculation appears at the end of this section.)

After the CPAs are calculated, the Mayflower can estimate how best to manoeuvre around objects in compliance with COLREGs. The limited number of ship classifications under these regulations restricted their primacy in the development of the obstacle avoidance system; instead, the team says, computer vision and AI classification are better placed to predict the expected actions of other ships and to inform how best to interact with them.

Scott offers the following to illustrate the problem. “COLREGs often say you need to perform ‘action X’ until you are ‘past and clear’ of the other ship, but what does ‘past and clear’ actually mean? It is understood to mean that you are at a point where you think the other captain thinks they are safe; then you can reacquire your track.

“But that’s a value judgement, so we need to make sure our AI makes correct value judgements. That means it needs to know what each ship is and what its capabilities are. That takes up far more processing power than actually executing the COLREGs.”

Although CNNs were used extensively in developing the computer vision AI, the Mayflower team also investigated reinforcement learning (RL) early
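To make the CPA idea concrete, the following is a minimal sketch of how a closest point of approach, and the resulting no-go-zone check, could be computed from a detected object’s position and velocity vector. It is not the Mayflower team’s implementation: the function names and the class-dependent safety radii are illustrative assumptions, standing in for the inertia- and class-aware margins the article describes.

```python
import math

# Hypothetical class-dependent safety radii (metres); the article only notes that
# a container vessel and a jet ski warrant very different responses.
SAFETY_RADIUS_M = {"container_ship": 1000.0, "jet_ski": 100.0, "unknown": 500.0}

def closest_point_of_approach(own_pos, own_vel, tgt_pos, tgt_vel):
    """Return (time_to_cpa_s, cpa_distance_m) for two constant-velocity tracks.

    Positions are in metres in a local x = east, y = north frame; velocities in m/s.
    """
    # Work in the frame of our own ship: relative position and relative velocity.
    rx, ry = tgt_pos[0] - own_pos[0], tgt_pos[1] - own_pos[1]
    vx, vy = tgt_vel[0] - own_vel[0], tgt_vel[1] - own_vel[1]

    speed_sq = vx * vx + vy * vy
    if speed_sq < 1e-9:                        # no relative motion: the CPA is now
        return 0.0, math.hypot(rx, ry)

    # Time at which the relative range is minimised; clamp to the future.
    t_cpa = max(0.0, -(rx * vx + ry * vy) / speed_sq)
    d_cpa = math.hypot(rx + vx * t_cpa, ry + vy * t_cpa)
    return t_cpa, d_cpa

def violates_no_go_zone(own_pos, own_vel, tgt_pos, tgt_vel, tgt_class="unknown"):
    """True if the predicted CPA falls inside the target's class-dependent no-go zone."""
    _, d_cpa = closest_point_of_approach(own_pos, own_vel, tgt_pos, tgt_vel)
    return d_cpa < SAFETY_RADIUS_M.get(tgt_class, SAFETY_RADIUS_M["unknown"])

if __name__ == "__main__":
    # Own ship heading east at 3 m/s; a container ship about 2 km to the north-east
    # heading south-west at roughly 8 m/s. Prints True (CPA inside the 1000 m zone).
    print(violates_no_go_zone((0, 0), (3, 0), (1500, 1500), (-5.7, -5.7),
                              tgt_class="container_ship"))
```

In this sketch the classifier’s output only changes the radius of the no-go zone; a fuller treatment would also widen the zone with the target’s estimated inertia and manoeuvring envelope, as Phaneuf describes.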
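By contrast, the COLREGs themselves encode quite compactly once a contact has been classified and its CPA checked, which is Scott’s point about where the processing power actually goes. The sketch below is a heavily simplified, hypothetical encoding of the crossing give-way rule (COLREGs Rule 15); it is not the team’s ODM rule set, and the 112.5° starboard arc is simply the conventional sidelight arc used here as an assumption.

```python
import math

def relative_bearing_deg(own_heading_deg, own_pos, tgt_pos):
    """Bearing to the target relative to our bow, in [0, 360). x = east, y = north."""
    abs_bearing = math.degrees(math.atan2(tgt_pos[0] - own_pos[0],
                                          tgt_pos[1] - own_pos[1])) % 360.0
    return (abs_bearing - own_heading_deg) % 360.0

def crossing_give_way(own_heading_deg, own_pos, tgt_pos):
    """Simplified COLREGs Rule 15: in a crossing situation, the vessel that has
    the other on her starboard side keeps out of the way."""
    rb = relative_bearing_deg(own_heading_deg, own_pos, tgt_pos)
    return 0.0 < rb < 112.5      # target within our starboard arc

# e.g. crossing_give_way(0.0, (0, 0), (800, 800)) -> True
# (target bears 045 degrees relative, on our starboard bow, so we give way)
```

Deciding *when* such a rule stops applying, i.e. when the other vessel is ‘past and clear’, is exactly the value judgement Scott describes, and it is the classification and behaviour-prediction layers, not the rule itself, that carry that load.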