Issue 39 Unmanned Systems Technology August/September 2021 Maritime Robotics Mariner l Simulation tools focus l MRS MR-10 and MR-20 l UAVs insight l HFE International GenPod l Exotec Skypod l Autopilots focus l Aquaai Mazu

Uncertainty for software

Researchers in Japan and Canada have developed a formal technique to automatically convert traditional autonomous control software into models that can safely handle uncertainty (writes Nick Flaherty).

The researchers, at the National Institute of Informatics (NII) in Japan and the University of Waterloo in Canada, developed the automated method to convert traditional control loops and generate formulas that represent the degree of uncertainty the controller software can tolerate. The level of uncertainty in the system then determines the safety margins. For example, if the sensor of an autonomous vehicle can misperceive the positions of other cars by up to 1 m, it should operate with a safety margin of at least 1 m.

Automated conversion of a traditional control loop can be complex, as developers need to verify that safety is guaranteed for every possible behaviour while accounting for differences between the true values and the data from sensors, where the degree of uncertainty is difficult to estimate. For example, the perception uncertainty depends on the situation in which the controller system is deployed, such as whether the weather is foggy.

“Controller systems are crucial because most software systems’ usefulness is due to their interactions with external environments,” said Tsutomu Kobayashi at the NII. “This research aims to help developers apply formal modelling approaches to realistic software by addressing the inevitable problem of controller systems regarding the gap between perception and reality. Developers can therefore focus on the essence of controller behaviour.

“We believe that the method is valuable and can be extended in various ways. We will continue working towards the systematic and easy application of rigorous mathematical methods to ensure a safe environment for everyone.”

The method consists of two steps, starting with uncertainty injection.
This transforms the input model of a traditional controller into an intermediate model of an uncertainty-aware controller. The behaviour of the intermediate model is the same as that of the input model, which is unsafe. Variables for perceived values, which can differ from the true values, are introduced in place of the input model’s assumption that the controller can refer to the true values when it determines an action. Safe operation is still not guaranteed, however, as the behaviour is the same as that of the input controller. For example, an autonomous vehicle might misperceive the distance to another car as 5 m when it is only 4 m, resulting in a collision.

The second step converts the intermediate model into one that is uncertainty-aware and safe. The behaviour of the resulting controller is updated so that it operates safely even under uncertainty. For instance, in some situations an autonomous vehicle might be unsure whether it should cruise or brake owing to perceptual uncertainty about the distance between itself and the car ahead. This second step exhaustively lists such uncertain cases so that they are considered separately, and updates them so that safety is guaranteed even under perception uncertainty.

The approach also generates the tolerable limit of uncertainty as a formula. Developers can use the formula to choose appropriate sensors from a catalogue, and to analyse how the uncertainty will propagate when the controller is combined with other components.

[Figure: The input model is transformed to calculate the degree of uncertainty the generated controller can tolerate]
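The two-step conversion can be illustrated with a minimal Python sketch of a distance-keeping controller. This is not the researchers’ actual formalism, which works on formal models rather than code; all names here (Action, EPS, SAFE and the three controller functions) are illustrative assumptions, with a hypothetical 1 m perception-error bound.

```python
from enum import Enum

# Hypothetical sketch of the two-step conversion for a simple
# distance-keeping controller. All names and thresholds are
# illustrative, not taken from the researchers' tool.

class Action(Enum):
    CRUISE = 1
    BRAKE = 2

EPS = 1.0   # assumed worst-case perception error, in metres
SAFE = 4.0  # minimum true distance at which cruising is safe

def original_controller(true_dist: float) -> Action:
    # Input model: decides on the true distance, which a deployed
    # controller cannot actually observe.
    return Action.CRUISE if true_dist >= SAFE else Action.BRAKE

def injected_controller(perceived_dist: float) -> Action:
    # Step 1, uncertainty injection: the same rule applied to a
    # perceived value. Behaviour is unchanged, so it is still unsafe:
    # a perceived 5 m gap could be a true 4 m gap.
    return Action.CRUISE if perceived_dist >= SAFE else Action.BRAKE

def safe_controller(perceived_dist: float) -> Action:
    # Step 2: the uncertain band [SAFE, SAFE + EPS), where the true
    # distance may lie on either side of the threshold, is listed
    # separately and resolved conservatively by braking.
    if perceived_dist >= SAFE + EPS:
        return Action.CRUISE  # safe even if the sensor over-reads by EPS
    return Action.BRAKE       # uncertain or unsafe: brake
```

With a perceived distance of 4.5 m, the injected controller cruises while the safe controller brakes. The condition `perceived_dist >= SAFE + EPS` plays the role of the tolerance formula the method generates: a developer could check a sensor catalogue for devices whose worst-case error stays within EPS.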
