Issue 40 | Unmanned Systems Technology | October/November 2021
ANYbotics ANYmal D | AI systems focus | Aquatic Drones Phoenix 5 | Space vehicles insight | Sky Eye Rapier X-25 | FlyingBasket FB3 | GCS focus | AUVSI Xponential 2021
Focus | Ground control stations

The most obvious solution for a swarm composed of many different sizes of unmanned vehicle is to use edge computing, with mission analytics and AI algorithms being run on the GCS using incoming telemetry from the smaller UAVs. The resulting command outputs can then be sent to the smaller craft without requiring them to run anything computationally intensive.

In the future, the edge computing approach could be taken a step further, towards collaborative networked operations. In this instance, the most powerful computers on board the largest unmanned vehicles in a swarm would receive real-time telemetry from the other vehicles, handle all the necessary computational 'heavy lifting', and broadcast control outputs to their subordinate craft. Formations, manoeuvres and other tactical adjustments would therefore be calculated ad hoc within the swarm. By then, the GCS would probably be used for nothing more than mission pre-planning and post-mission data analytics; having an operator on the loop would only be necessary for regulatory compliance.

Of course, cloud computing is already available for remotely running mission profiles, AI behaviours, post-processing and analytics. With a cloud-based mission management dashboard and data link, almost any platform – including tablets and smartphones with limited storage and processing power – can be used for deploying and controlling unmanned systems.

Moving processing duties to the cloud also opens up the possibility of providing data analytics services from there. That would decouple an organisation's ability to exploit its survey data, or to carry out predictive maintenance on its vehicles, from its in-house computing hardware and software expertise. Tailoring a GCS to access cloud-based services potentially means being able to cherry-pick specialised autonomy or analytical features.
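The edge-computing pattern described above – lightweight vehicles streaming telemetry to a more capable node, which runs the control calculations and broadcasts commands back – can be sketched in a few lines. This is purely an illustrative sketch, not any vendor's actual API: the `Telemetry`, `Command` and `formation_commands` names, and the simple proportional formation-keeping controller, are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch of edge computing for a swarm: small UAVs only
# report telemetry; the GCS (or a lead vehicle) computes steering
# commands on their behalf. All names here are illustrative.

@dataclass
class Telemetry:
    vehicle_id: str
    x: float   # east position, m
    y: float   # north position, m

@dataclass
class Command:
    vehicle_id: str
    vx: float  # commanded east velocity, m/s
    vy: float  # commanded north velocity, m/s

def formation_commands(reports, slots, gain=0.5):
    """Proportional controller run at the edge node: steer each
    vehicle toward its assigned formation slot."""
    commands = []
    for t in reports:
        sx, sy = slots[t.vehicle_id]
        commands.append(Command(t.vehicle_id,
                                gain * (sx - t.x),
                                gain * (sy - t.y)))
    return commands

# Example: two UAVs being steered into a line-abreast formation.
reports = [Telemetry("uav1", 0.0, 0.0), Telemetry("uav2", 10.0, 4.0)]
slots = {"uav1": (0.0, 10.0), "uav2": (10.0, 10.0)}
cmds = formation_commands(reports, slots)  # uav1 -> vx=0.0, vy=5.0
```

In the collaborative networked variant described above, the same function would simply run on the largest vehicle in the swarm rather than on the GCS, with the command list broadcast over the inter-vehicle data link.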
For example, agricultural survey UAVs could count fruit, track livestock or carry out hyperspectral analyses. Similarly advanced techniques are increasingly available for energy and utilities inspections, first-responder missions and other key UAS markets.

In addition to adopting different cloud- or drive-based features as plug-ins for GCS software, developers are also offering increasingly broad ways for end-users to configure their GCS interfaces as they like. At the moment, this largely means straightforward adjustments such as moving menus or data bars around, or changing the colour coding or appearance of alerts, vehicle icons, geo-fences and so on.

However, future advances in system interoperability and compatibility can be expected to make GCS software compatible with augmented and virtual reality headsets. At that point, operators will be able to interact with 3D projections of map topographies that display swarms of unmanned assets, with real-time updates on position, altitude and heading.

Using such visualisation for a networked common GCS interface would enable several mission operators and analysts equipped with AR/VR headsets to collaborate in planning a mission, all of them adjusting waypoints and autonomous responses to account for different contingencies. The current alternative is having several people squeeze in front of a single computer screen, or sit at different screens while adjusting the same GCS planning interface, creating the potential for conflicting edits.

After launch, multiple operators using an AR/VR-based GCS interface could then scrutinise a shared, common operating picture. They would also be able to access different streams of incoming or logged data without interfering with what the others are seeing, all while keeping their hands reasonably free.
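The collaborative planning scenario above – several operators adjusting the same set of waypoints – needs some rule for resolving concurrent edits. One minimal approach is a shared plan store that keeps the newest edit per waypoint (last-writer-wins). The sketch below is an assumption-laden illustration of that idea, not a real GCS interface; the `SharedMissionPlan` class and all its names are hypothetical.

```python
import time

# Illustrative last-writer-wins store for collaborative waypoint
# editing. Each waypoint records the timestamp and operator of the
# edit that currently defines it. All names are assumptions.

class SharedMissionPlan:
    def __init__(self):
        # waypoint_id -> (timestamp, (lat, lon, alt_m), operator)
        self._waypoints = {}

    def set_waypoint(self, waypoint_id, position, operator, timestamp=None):
        ts = time.time() if timestamp is None else timestamp
        current = self._waypoints.get(waypoint_id)
        # Only accept the edit if it is at least as new as the stored one.
        if current is None or ts >= current[0]:
            self._waypoints[waypoint_id] = (ts, position, operator)

    def waypoint(self, waypoint_id):
        ts, position, operator = self._waypoints[waypoint_id]
        return position, operator

# Two operators edit the same waypoint; the newer edit wins.
plan = SharedMissionPlan()
plan.set_waypoint("wp1", (55.68, 12.57, 120.0), "operator_a", timestamp=1.0)
plan.set_waypoint("wp1", (55.69, 12.58, 100.0), "operator_b", timestamp=2.0)
```

A production system would more likely use operational transforms or per-field versioning so that two operators editing different attributes of the same waypoint do not clobber each other, but the timestamp rule shows the core conflict-resolution step in miniature.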
Advances in gaming hardware and graphics engines are also worth following, as their adoption stands to greatly improve the handling, quality and intuitiveness of GCSs over the next several years.

Acknowledgements

The author would like to thank Frank Severinsen and Klaus Aude at UXV Technologies, Niklas Nyroth and Paul Holmstedt at Robot Aviation, Ivan Stamatovski at Easy Aerial, Konrad Cioch at Aerobits, Javier Espuch and Elise Nolan at Embention, Barry Alexander at Aquiline Drones, Kraettli Epperson at Vigilant Aerospace Systems, Miguel Angel de Frutos at UAV Navigation, and Steve Jacobson at Autonodyne for their help with researching this article.

Collaborative planning and customisation of GCS interfaces is critical to being able to tailor information displays and alerts for each mission (Courtesy of UAV Navigation)