
the required computing platforms are housed. Unfortunately for the Tier 1s and car makers, this means becoming an IT expert in the Internet of Things (IoT) and the edge/macro edge.

Using edge storage and data transfer as a service provides a cost-effective way of managing more than 100 TB, all engineered to withstand harsh, mobile environments and enabling in-vehicle data ingestion at up to 7 Gbit/s. A portable, rackable system, secured in the trunk of the car, records AV data, and a PCIe adapter can be used for an external PCIe port on the recording system or with self-encrypting drives. The mobile data-storage system is built with industry-standard AES 256-bit hardware encryption and key management in a rugged, lockable transport case for secure data transport. It is also agnostic to data-logger form factors and architectures.

Current research and development for AD vehicle technology covers use cases such as rapid prototyping, real-time vehicle testing, calibration and logging, and many more. Each test vehicle is, in effect, a mobile data centre, generating around 100 TB of data a day from the multiple sensors inside it. This data must be fed into a data lake, from which it can be extracted and used for AI and ML modelling and training. The end result is AI/ML model code that can be used in real life to assist drivers by taking on more driving responsibilities.

The problem is that there can be 10, 20 or more test vehicles, each needing to store logged data from multiple devices. Each device can store its own data (a distributed concept) or use a central storage system. Either method costs money. Just putting in-vehicle data storage in place can cost up to $200,000 per vehicle, which means a 50-vehicle fleet needs capital expenditure of up to $10m just to store the data, let alone use it.

Centralising the data can be done in two ways: move the data across a network or move the data-storage drives. Once in the same data centre as the AI training systems, the data has to be made available at high speed to keep the GPUs busy. Networking the data is costly and, unless the automobile manufacturer spends a great deal of money, it is slow. A fleet of 20 vehicles could each arrive at the edge depot daily with 100 TB of data: that's 2 petabytes (PB) to upload per day. It is cheaper, and perhaps faster, to move the drives containing the data than to send the bytes across a network link.
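A rough, back-of-the-envelope calculation makes the point. The Python sketch below compares uploading a day's fleet data over the network with physically moving the drives; the fleet size, per-vehicle volume and in-vehicle ingest rate come from the figures above, while the 10 Gbit/s depot uplink is an assumed value chosen purely for illustration.

# Back-of-the-envelope comparison: network upload vs. moving drives.
# Fleet size, per-vehicle volume and ingest rate are the article's figures;
# the depot uplink speed is an assumption for illustration only.

FLEET_SIZE = 20          # test vehicles returning each day
TB_PER_VEHICLE = 100     # logged data per vehicle per day, TB
INGEST_GBITS = 7         # in-vehicle ingestion rate, Gbit/s
UPLINK_GBITS = 10        # assumed shared depot uplink, Gbit/s

daily_tb = FLEET_SIZE * TB_PER_VEHICLE    # 2,000 TB = 2 PB per day
daily_bits = daily_tb * 1e12 * 8          # total bits to move

upload_hours = daily_bits / (UPLINK_GBITS * 1e9) / 3600
print(f"Fleet data per day: {daily_tb / 1000:.1f} PB")
print(f"Upload time over a {UPLINK_GBITS} Gbit/s link: "
      f"{upload_hours:,.0f} h (~{upload_hours / 24:.1f} days)")

# Swapping out drive cartridges, by contrast, takes minutes per vehicle
# plus transport time to the data centre, which is why moving the drives
# can beat moving the bytes.

On those assumptions, 2 PB works out at roughly 444 hours of continuous transfer over a 10 Gbit/s link, which is why the article favours moving the media.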
There are five problem areas in this data pipeline: the in-vehicle storage; data movement from vehicle to data centre; data-centre storage; feeding data fast to the GPUs; and the AI training itself.

AI training

Each test vehicle has a disk drive array in its trunk, which takes data streamed from sensors to data loggers and stores it centrally. When a test car returns to its depot, the drives, housed in data cartridges, can be removed and physically transported to a receiver unit in the AI/ML training data centre.

Once the data arrives at the data centre, it can be stored in an object storage system and also in the cloud for longer-term retention. Active file management (AFM) software feeds the data from the object storage system into a parallel file system and non-volatile flash storage, where it is accessible to the GPUs for model training.

In-vehicle data storage

Storing data during the life of the vehicle is another challenge. Code data tends to be stored in non-volatile NOR flash memory, which stores one bit per cell for reliability and for speed of writing and recalling the data. This enables instant-on performance and fast system responsiveness in automotive applications, but it also limits the capacity of a memory device.

Current automotive NOR flash devices have a capacity of up to 2 Gbit and read throughput of up to 400 MB/s: 166 MB/s at a 166 MHz single transfer rate, or 400 MB/s using a 200 MHz clock in double data rate (DDR) mode with data strobe. The NOR structure enables a 75.8 ns initial access time and a 2.5 ns subsequent access time. Using four SPI communication bus interfaces, a quad SPI-NOR flash delivers fast

A test vehicle can generate terabytes of data (Image courtesy of University of Limerick)
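As a sanity check on those throughput figures, the short Python sketch below computes peak read bandwidth from clock rate, bus width and transfer mode. The 8-bit (octal) bus width is an assumption inferred from the numbers quoted above rather than something stated in the article, and the quad-SPI clock rates are illustrative values only.

# Peak SPI-NOR read bandwidth = clock x transfers-per-clock x bus width / 8.
# Bus widths and the quad-SPI clock rates below are assumptions for
# illustration; the article does not state them explicitly.

def nor_read_bandwidth_mb_s(clock_mhz: float, bus_bits: int, ddr: bool) -> float:
    """Peak read throughput in MB/s for an SPI-NOR interface."""
    transfers_per_clock = 2 if ddr else 1
    return clock_mhz * transfers_per_clock * bus_bits / 8

# Matches the quoted figures if an 8-bit (octal) bus is assumed:
print(nor_read_bandwidth_mb_s(166, 8, ddr=False))  # 166.0 MB/s, single transfer rate
print(nor_read_bandwidth_mb_s(200, 8, ddr=True))   # 400.0 MB/s, DDR with data strobe

# The same formula applied to a quad (4-bit) SPI-NOR device:
print(nor_read_bandwidth_mb_s(133, 4, ddr=False))  # 66.5 MB/s
print(nor_read_bandwidth_mb_s(133, 4, ddr=True))   # 133.0 MB/s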
