AI accuracy is reduced. It is also less efficient in processing, so more hardware resources are needed, driving up power consumption and cost. There is therefore a constant battle for AI frameworks to keep up with technology development in driverless vehicles.

These frameworks have been developed over the past decade for ADAS systems, which use forward-looking cameras to capture the vehicle in front, the lane markings on the road and any potential obstructions. The ADAS system supports a driver, issuing a warning if the vehicle is veering off the road or triggering the emergency braking system to prevent a collision. This is a more tightly bounded problem than fully autonomous operation, so it can make up only a part of the overall solution. ADAS systems also rely on a driver in the vehicle, so they have a lower safety requirement.

Moving to full autonomy requires much more redundancy in the system design, including in the AI framework. For a driverless vehicle, the latest AI framework consists of 5 million lines of code to ensure that the system can be certified to the highest level of safety, ASIL-D, as part of the ISO 26262 safety process.

Running this framework requires a high-performance processor to handle the inference, and the next-generation devices will handle 2 TOPS, or 2 million mega-operations per second. Devices with that level of performance will be in production in vehicles in 2025; however, the exact performance requirement depends on the sensor architecture of the vehicle and the mix of image sensors, Lidar and radar.

The development and simulation tools are key, supporting multiple DNNs with the ability to change sensor configurations and positions, and to add other road users. For example, the tools can reconstruct a 3D scene from the recorded data and add human and synthetic data, then generate the images needed for training the framework. The tools can take a video input, create a 360° scene, and extract other cars and people to leave a driveable scene. That allows the focus vehicle and other road users to move through the scene according to the laws of physics, and allows testing of the software stack to be scaled across edge cases, creating a massive number of scenarios.

The tools can also simulate all the other software in the car, including the displays and the infotainment system, to test how data moves around the vehicle using the actual software that will run in it. This can ensure that data from key components, particularly sensors, is not blocked or delayed by other applications.

However, the evolution of the AI framework development tools from Level 2 and Level 3 ADAS applications to fully autonomous operation at Level 4 and Level 5 is pushing for more hardware independence. That would allow all the investment in the complex development of the framework to be moved from one manufacturer's chips to another's without compromising performance. It avoids being locked into a particular software stack and supply chain, and gives the flexibility to use different chips for different vehicle designs, from a budget platform to high-end luxury versions, all using the same safety-certified AI framework.

This hardware independence relies on the fact that the underlying processing architecture for most autonomous vehicle applications is based on MAC (multiply-accumulate) units. The MAC units can be connected in different ways with different amounts of memory, but the underlying architecture is similar.
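As a minimal illustration of why this MAC-based commonality makes hardware portability plausible, consider a naive convolution reduced to its multiply-accumulate operations. This is an illustrative sketch only, not any vendor's implementation, and the unit check at the end simply restates the throughput figure quoted above.

    # Illustrative sketch: a 2-D convolution layer reduced to its
    # multiply-accumulate (MAC) operations. Because most DNN layers
    # decompose into chains of MACs like this, the same trained network
    # can in principle be mapped onto different vendors' MAC arrays.

    def conv2d_macs(image, kernel):
        """Naive 2-D convolution; every output pixel is a chain of MACs."""
        kh, kw = len(kernel), len(kernel[0])
        oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
        out = [[0.0] * ow for _ in range(oh)]
        macs = 0
        for y in range(oh):
            for x in range(ow):
                acc = 0.0
                for i in range(kh):
                    for j in range(kw):
                        acc += image[y + i][x + j] * kernel[i][j]  # one MAC
                        macs += 1
                out[y][x] = acc
        return out, macs

    image = [[1.0] * 8 for _ in range(8)]
    kernel = [[0.1] * 3 for _ in range(3)]
    _, macs = conv2d_macs(image, kernel)
    print(f"{macs} MACs for one 8x8 single-channel layer")  # 324 MACs

    # Unit check on the figure quoted in the article:
    # 2 TOPS = 2e12 operations/s = 2 million mega-operations per second.
    assert 2e12 / 1e6 == 2_000_000

In hardware, those inner loops are unrolled across thousands of parallel MAC units; the differences between chips lie in how the units are connected and fed from memory, not in the arithmetic itself.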
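Returning to the simulation tools described above, the reconstruct-extract-repopulate-render workflow can be pictured as a pipeline. A minimal sketch follows; every function and class name in it is a hypothetical stand-in for illustration, not any real tool's API.

    # Hypothetical sketch of the scene-reconstruction workflow:
    # reconstruct a 3D scene from recorded footage, strip out other road
    # users, re-populate with synthetic actors, and render training images.
    import random
    from dataclasses import dataclass, field

    @dataclass
    class Scene:
        background: str                      # stands in for reconstructed 3D geometry
        actors: list = field(default_factory=list)

    def reconstruct_scene(video):
        # Step 1: rebuild a 360-degree static scene from recorded footage.
        return Scene(f"3d-mesh-of({video})", ["recorded-car", "recorded-pedestrian"])

    def extract_actors(scene):
        # Step 2: remove other cars and people, leaving a driveable scene.
        return Scene(scene.background, [])

    def add_synthetic_actors(scene, rng):
        # Step 3: re-populate with physically simulated actors to create
        # a new edge-case variant of the same recording.
        return Scene(scene.background,
                     [f"synthetic-actor-{i}" for i in range(rng.randint(1, 5))])

    def render_training_images(scene):
        # Step 4: generate the camera frames used to train the DNNs.
        return [f"frame({scene.background}, {a})" for a in scene.actors]

    # Scale one recording into a massive number of scenarios.
    rng = random.Random(0)
    empty = extract_actors(reconstruct_scene("drive_log.mp4"))
    dataset = [render_training_images(add_synthetic_actors(empty, rng))
               for _ in range(1000)]
    print(len(dataset), "scenarios generated from one recording")

In practice each step is a substantial computer-vision and physics-simulation problem in its own right; the point of the sketch is only the shape of the pipeline, and how one recorded drive multiplies into many edge-case scenarios.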
As the workload evolves, so does the AI, and the processing requirement for the AI needs to keep up as the system requirements change. For example, there are older, tested frameworks running on new hardware, new frameworks running on older hardware in existing vehicles, and new frameworks running on new hardware in new vehicles.

This complexity requires more analysis of the system specification, looking at the workloads and benchmarking them on different hardware implementations. It is more than just running a particular network on the fastest possible chip. Software tools can help design a network and explore its architecture to see which layers within it are causing efficiency problems; a sketch of this kind of per-layer analysis follows below. This is a different approach from using the digital signal processing tools that manage the MAC units; instead, these tools optimise the neural network first, with the data, and then map it to the MACs in the hardware.

Then there is the challenge of decoupling the training data design from the execution of the network. The data problem is in designing the workloads, having networks for

[Image: The aiWare hardware for autonomous driving is tightly coupled to the AI framework and analysis tools (Courtesy of aiMotive)]
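As flagged above, here is a minimal sketch of that per-layer benchmarking idea: run the network stage by stage, time each layer, and rank them to see where the efficiency problems lie. The layers here are toy stand-ins, not a real framework's API, and the timings are purely illustrative.

    # Minimal sketch of per-layer benchmarking: time each stage of a
    # network to find which layers cause efficiency problems before
    # mapping the network onto a particular chip's MAC array.
    import time

    def profile_layers(layers, x, repeats=100):
        """Run each layer in sequence, time it, and rank slowest first."""
        report = []
        for name, fn in layers:
            start = time.perf_counter()
            for _ in range(repeats):
                y = fn(x)
            elapsed = (time.perf_counter() - start) / repeats
            report.append((name, elapsed))
            x = y                      # feed this layer's output to the next
        return sorted(report, key=lambda r: r[1], reverse=True)

    # Toy layers standing in for convolution / activation / pooling stages.
    layers = [
        ("conv1", lambda v: [sum(v[i:i + 3]) for i in range(len(v) - 2)]),
        ("relu1", lambda v: [max(0.0, a) for a in v]),
        ("pool1", lambda v: v[::2]),
    ]
    for name, seconds in profile_layers(layers, [0.5] * 1024):
        print(f"{name}: {seconds * 1e6:.1f} microseconds per pass")

Repeating this kind of measurement across candidate hardware implementations is what turns the system specification into an evidence-based choice of chip, rather than simply picking the device with the highest headline TOPS figure.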