Insight | Security and safety systems

infrastructure, including interactions with traditional as well as electric engines and the creation of cooperative vehicles.

The connected vehicle brings with it the potential for severe consequences should security systems be unable to mitigate any cyber threats, so it is essential that vehicle components can ‘trust’ each other and that only the driver or autonomous driving system can give the car instructions, with hackers unable to inject their own commands. Identity and credential management is vital in the fight against these potential security attacks; a rough sketch of such a command check appears below. The MIRA-Plextek-Intercede group has been formed to ensure that best practice across the whole spectrum of security is defined and in place early in the manufacturing process, with the aim of protecting road users.

Adding security into the mix also increases the issue of fragmentation in the development of the safety-critical software. Rather than safety cases that cover all eventualities across all platforms, the safety cases vary between the different standards. Automotive equipment developers are working to the ISO 26262 standard, while aircraft suppliers are working to the RTCA DO-178B and C standards. The F2541 standard from ASTM International in the US covers maritime systems, with versions for surface and underwater craft, and deals with autonomy and control, comms, system interfaces and data formats.

DO-178C clarifies the boundaries between the concepts of High Level Requirements, Low Level Requirements and Derived Requirements in DO-178B, and gives a better definition of the exit/entry criteria between the system requirements and the software requirements and design. This is the key safety and security standard for software in UAVs, but it also impacts on other standards such as DO-330 for software development tools, DO-331 for the development and handling of software models, DO-332 for object-oriented software development and DO-333, which handles formal methods.

Formal methods are mathematically based techniques for specifying, developing and verifying the software aspects of digital systems. The mathematical basis of these methods consists of formal logic, discrete mathematics and computer-readable languages, and their use is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analyses can contribute to establishing the correctness and robustness of a design.

As the software becomes more complex, so the cost of fully verifying and testing autonomous systems becomes an issue. This is leading to new software architectures that reduce development cost, such as the use of agents – small pieces of code that ideally can be fully verified and developed using formal methods to be as safe and secure as possible. The agent then monitors the rest of the system software to watch out for aberrant behaviour or attacks; a simple sketch of such a monitor is shown below. However, there is still a lot of discussion about what to include in the agent, and how to partition functionality between the agent and the rest of the system software.

This is largely driven by the experience of the PC and business software market, where the cost of protecting against attacks has become significant. Autonomous systems simply cannot afford the same problems as the PC cyber security sector, where current approaches are insufficiently robust and costly to develop and maintain.
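To make the agent concept more concrete, here is a minimal sketch of a monitoring agent that checks each control cycle against a small set of invariants and hands over to a pre-verified fallback when one is violated. It is not drawn from the MIRA programme or any specific product; the state fields, limits and the enter_safe_mode hook are illustrative assumptions.

```python
# Illustrative sketch of a monitoring agent: a small, separately verifiable piece
# of code that watches the rest of the system for aberrant behaviour.
# All limits and names here are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class VehicleState:
    altitude_m: float       # reported altitude
    airspeed_ms: float      # reported airspeed
    command_rate_hz: float  # rate of incoming control commands

# Safety invariants the agent enforces: things the system must never do.
MAX_ALTITUDE_M = 120.0
MAX_AIRSPEED_MS = 30.0
MAX_COMMAND_RATE_HZ = 50.0  # a flood of commands may indicate an injection attack

def invariants_hold(state: VehicleState) -> bool:
    """Return True only if every monitored invariant is satisfied."""
    return (0.0 <= state.altitude_m <= MAX_ALTITUDE_M
            and 0.0 <= state.airspeed_ms <= MAX_AIRSPEED_MS
            and state.command_rate_hz <= MAX_COMMAND_RATE_HZ)

def monitor_step(state: VehicleState, enter_safe_mode) -> None:
    """One monitoring cycle: if any invariant is violated, hand control
    to a pre-verified fallback (for example loiter, return home or land)."""
    if not invariants_hold(state):
        enter_safe_mode()

if __name__ == "__main__":
    # Example: a command flood trips the monitor.
    monitor_step(VehicleState(80.0, 22.0, 400.0),
                 enter_safe_mode=lambda: print("fallback: entering safe mode"))
```

Because the agent is small and its checks are simple, it is the kind of component that could plausibly be verified with formal methods, while the much larger body of system software it watches is not.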
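The identity and credential management point above – that only the driver or the autonomous driving system should be able to issue commands, with injected commands rejected – can be illustrated in a similar spirit. The sketch below assumes a pre-provisioned shared key and uses an HMAC plus a message counter to reject forged or replayed commands; it is not the MIRA-Plextek-Intercede approach, and a real system would use managed credentials and protected key storage.

```python
# Illustrative sketch: authenticating commands so that injected or replayed
# messages are rejected. Key handling is simplified; names are assumptions.

import hashlib
import hmac

SHARED_KEY = b"provisioned-per-vehicle-secret"  # in practice, a managed credential

def sign_command(command: bytes, counter: int, key: bytes = SHARED_KEY) -> bytes:
    """Produce a tag over the command and a monotonically increasing counter."""
    msg = counter.to_bytes(8, "big") + command
    return hmac.new(key, msg, hashlib.sha256).digest()

def accept_command(command: bytes, counter: int, tag: bytes,
                   last_counter: int, key: bytes = SHARED_KEY) -> bool:
    """Accept only commands with a valid tag and a fresh counter (no replay)."""
    expected = hmac.new(key, counter.to_bytes(8, "big") + command,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag) and counter > last_counter

if __name__ == "__main__":
    tag = sign_command(b"SET_SPEED 30", counter=41)
    print(accept_command(b"SET_SPEED 30", 41, tag, last_counter=40))  # True
    print(accept_command(b"SET_SPEED 99", 41, tag, last_counter=40))  # False: tampered
    print(accept_command(b"SET_SPEED 30", 41, tag, last_counter=41))  # False: replay
```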
This is why companies such as Intercede are being included in the initiative at this relatively early stage, to provide security from the beginning of a design.

Other developments

As a result there are other projects elsewhere looking at different ways to implement both safety and security in autonomous systems, moving from purely deterministic assessments to probabilistic analysis. The industry consensus is that it is increasingly impractical to fully test and verify all possible safety cases, so there is a move towards developing a consistent set of behaviours which, for example, would define what a system must do, what it must never do, and what happens when it breaks; a simple sketch of such a behaviour set follows below. At the same time there is a move to using agents to monitor the software in an autonomous system for aberrant behaviour.

Mapping the safety requirements of a UAV onto the requirements of a formal development process (Courtesy of University of Liverpool)
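As a rough illustration of such a behaviour set – what the system must do, what it must never do and what happens when it breaks – the sketch below expresses the rules as data that a monitor could check. The specific rules and names are assumptions for illustration, not taken from any of the projects mentioned.

```python
# Illustrative sketch: a consistent set of behaviours expressed as data that a
# monitoring component can check. The rules here are assumptions only.

from dataclasses import dataclass, field
from typing import List

@dataclass
class BehaviourSpec:
    must_do: List[str] = field(default_factory=list)        # obligations
    must_never_do: List[str] = field(default_factory=list)  # prohibitions
    on_failure: str = "land immediately"                    # what happens when it breaks

UAV_BEHAVIOURS = BehaviourSpec(
    must_do=["maintain geofence", "respond to operator heartbeat"],
    must_never_do=["exceed 120 m altitude", "enter exclusion zone"],
    on_failure="return to home, then land",
)

def check(observed_actions: List[str], spec: BehaviourSpec) -> str:
    """Return the failure behaviour if any prohibition is observed or any
    obligation is missing; otherwise report the system as conforming."""
    violated = [a for a in observed_actions if a in spec.must_never_do]
    missing = [a for a in spec.must_do if a not in observed_actions]
    if violated or missing:
        return spec.on_failure
    return "conforming"

if __name__ == "__main__":
    print(check(["maintain geofence", "respond to operator heartbeat"], UAV_BEHAVIOURS))
    print(check(["maintain geofence", "exceed 120 m altitude"], UAV_BEHAVIOURS))
```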