Focus | Video systems reliability

for sending packets and much lower latency.

The final element for low latency comes from the efficiency of encoding and decoding the stream, then transferring the video to the screen. This is down to the design of the software running on the client, usually a tablet computer, and how the data is fed from memory through the processor to the display (a simplified receive pipeline is sketched at the end of this article).

All of this has been combined to demonstrate a UAV being remotely piloted using a standard mobile phone. An operator in Orlando, Florida, controlled a UAV inspecting a factory in Turin, Italy, over a distance of 4800 miles (UST 45, August/September 2022). The end-to-end signal latency was 68 ms, allowing safe piloting via the video feed alone.

The radio controller of the UAV was connected to a PC running software that optimised the packets over a dedicated VPN link via the mobile phone (a minimal packet-relay sketch is given at the end of this article). Previous flights had relied on a more sophisticated, non-mobile internet connection, but the demonstration showed that an internet link via a mobile phone in the vicinity of the UAV is all that is needed to pilot the vehicle remotely from virtually anywhere.

The technology is also being used for the next generation of missions to Mars. A 15-day test on Mount Etna, an active volcano in Italy whose landscape is similar to Martian geology, allowed both a UAV and a UGV rover to be controlled by personnel in Houston, Texas, in real time without the need for data from a GNSS satellite network, which of course is not available on Mars.

AI accelerators

There are several options for AI processing of video content for uncrewed systems, whether in the air or on the ground. The latest GPU chip runs popular image-processing AI frameworks as well as the ROS 2 robot operating system and Yocto Linux. The chip provides up to 275 TOPS – eight times the processing power of the previous GPU chip – in a module with the same pinout.

The production systems are supported by a software stack with development tools and libraries for computer vision, and tools to accelerate model development with pre-trained models. More than 350 camera and sensor options are available for the module to support challenging indoor/outdoor lighting conditions, as well as capabilities such as Lidar for mapping, localisation and navigation in robotics and autonomous machines.

On-chip AI

The AI boards are popular for UGV designs, where power consumption is less of an issue, but developers also want to use AI on UAVs. That is driving more interest in low-power AI chips.

The leading contender for this is an AI chip with a proprietary architecture built around three types of block – control, memory and compute – which are assigned to the various layers of the neural network framework and are designed to keep the data on-chip. The chip delivers up to 26 TOPS at a power consumption of 6-7 W, which makes it suitable for UAV applications (a short efficiency calculation appears at the end of this article).

The chip is used on a board called Jupiter alongside a mainstream video controller chip. The board measures 50 x 50 mm and is set to be part of Beresheet 2, the Israeli lunar mission.

Beresheet 2 is a three-craft mission to the Moon: one orbiter and two landers. The orbiter is planned to launch from Earth carrying the two landers, arrive at the Moon, release the landers safely at different times and then orbit the Moon for several years. The landers will touch down at different sites on the Moon, and each spacecraft will perform at least one scientific mission.
The Jupiter board will be used

[Image: The Orin Nano GPU card for AI applications has a thermal profile of 7 to 10 W (Courtesy of Nvidia)]

[Image: The Jupiter card uses a Hailo-8 AI processor for video processing for the Beresheet 2 Moon mission (Courtesy of SpaceIL/Maris-Tech)]
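The earlier point about encode/decode efficiency, and how frames move from memory through the processor to the display, can be illustrated with a short receive-and-display loop. The sketch below is not the software used in the demonstrations; it is a minimal illustration that assumes an OpenCV build with GStreamer support and a hypothetical H.264 RTP stream arriving on UDP port 5600, with the sink configured to drop stale frames so only the newest one reaches the screen.

```python
import cv2

# Hypothetical low-latency receive pipeline: no buffering at the sink,
# stale frames are dropped so the newest frame always reaches the display.
PIPELINE = (
    'udpsrc port=5600 caps="application/x-rtp,media=video,'
    'encoding-name=H264,payload=96" ! '
    "rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! "
    "video/x-raw,format=BGR ! "
    "appsink sync=false max-buffers=1 drop=true"
)

cap = cv2.VideoCapture(PIPELINE, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Could not open the video stream")

while True:
    ok, frame = cap.read()          # blocks only until the next decoded frame
    if not ok:
        break
    cv2.imshow("UAV video", frame)  # hand the frame straight to the display
    if cv2.waitKey(1) == 27:        # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The sync=false, max-buffers=1 and drop=true settings on the sink are what keep the glass-to-glass delay down: if decoding or display falls behind, old frames are discarded rather than queued.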
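The control path in the Orlando-to-Turin flight ran from the radio controller into a PC, which pushed the packets over a dedicated VPN link through the mobile phone. The details of that software were not published; the sketch below is only a generic stand-in for the forwarding step, assuming the controller's packets arrive on a local UDP port and the VPN presents the UAV-side endpoint as an ordinary IP address (both the port and the address here are hypothetical).

```python
import socket

LOCAL_RC_PORT = 14550                  # hypothetical port the radio controller sends to
REMOTE_ENDPOINT = ("10.8.0.2", 14550)  # hypothetical UAV-side address on the VPN tunnel

# Listen for control packets from the radio controller on the local machine.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("0.0.0.0", LOCAL_RC_PORT))

# Separate socket for sending into the VPN tunnel.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    packet, _ = rx.recvfrom(2048)       # one control packet from the RC unit
    tx.sendto(packet, REMOTE_ENDPOINT)  # forward it unchanged over the VPN
```

A production relay would add packet prioritisation, loss monitoring and a return path for telemetry; this loop shows only the basic tunnelled forwarding.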
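The efficiency case for the low-power chip follows directly from the figures quoted above: up to 26 TOPS for 6-7 W works out at roughly 3.7-4.3 TOPS per watt. The few lines below simply reproduce that arithmetic.

```python
# Efficiency of the low-power AI chip, using the figures quoted in the article.
tops = 26.0                  # peak throughput, trillions of operations per second
power_range_w = (6.0, 7.0)   # quoted power consumption in watts

for watts in power_range_w:
    print(f"{tops} TOPS at {watts} W -> {tops / watts:.1f} TOPS/W")
```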