
Platform one | Unmanned Systems Technology | October/November 2020

Airborne vehicles

Lidar's roadmap to UAVs

YellowScan has used a laser Lidar sensor developed for driverless cars to create an entry-level mapping system for UAVs (writes Nick Flaherty).

Its Mapper Lidar uses the Horizon laser scanner from Livox, developed for Level 3 and 4 ground vehicles, to provide a high-density point cloud from a UAV. YellowScan has added an IMU from Applanix to support 3D operation in the air, as well as a lithium-ion battery and a power management unit.

This gives a point-cloud density of 900 points/m² when flying at a height of 30 m and a speed of 5 m/s, and 190 points/m² at 70 m and 10 m/s, with a field of view of 81.7° horizontally and 25.1° vertically.

The Mapper works with a post-processed kinematic system on the ground that improves the accuracy of the GPS system in the UAV to under 1 cm. The data is post-processed to generate a corrected and precise trajectory. The smoothed best estimate of the trajectory has an accuracy of 2 cm horizontally (x-y) and 3 cm vertically (z).

Cyril Jayet, technical support engineer at YellowScan, said, "The accuracy goes from 1.9 cm at 35 m to 5.3 cm at 150 m, with a precision of 1.5 to 5 cm."

The whole Mapper weighs 1.4 kg including the battery, compared with 1.1 kg for the Horizon Lidar scanner alone in a ground vehicle. The battery supports 90 minutes of operation.

The Mapper Lidar provides a high-density point cloud with high-accuracy laser points

Airborne vehicles

Study goes with the flow

Zenotech and the University of Dundee have collaborated on a project to model the airflow in cities to help autonomous aircraft avoid obstacles (writes Nick Flaherty).

As part of a Defence and Security Accelerator (DASA) project in the UK, Zenotech performed extremely large-scale airflow modelling to characterise the aerodynamic environment around buildings and cityscapes, which is different from the winds experienced by conventional aircraft. The data will provide information for operating autonomous UAVs effectively and safely.

The University of Dundee worked on a prototype that integrated Zenotech's airflow simulation with its immersive visualisation interface, 3DVisLab, which was developed to improve situational awareness when flying UAVs in complex environments.

"The outcomes of this research will help to drive forward the next generation of UAVs as well as develop our products for new and innovative applications," said James Sharpe, Zenotech's lead for big data and security.

Helen Mullender, the project's manager at DASA, said, "The work being funded is to mature autonomous systems with the capability to operate on demand, in all conditions that might be encountered. Military operations are undertaken in all kinds of challenging environments, so including autonomous systems in these operations will demand their ability to operate effectively and efficiently, regardless of the environment."

Dr Kieran Baxter at the university's 3DVisLab department said, "It is important for us to understand how new generations of data will inform not only autonomous systems but also the human operators behind them."

The project characterised the airflow around city buildings to help UAVs avoid obstacles
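The quoted point-cloud densities follow from the scanner's point rate, the ground swath width at a given altitude, and the flight speed. The sketch below shows that relationship; the 240,000 points/s rate assumed for the Livox Horizon and the assumption of a uniform spread across the swath are not stated in the article, so treat the results as rough estimates rather than the manufacturer's calculation.

```python
import math

def swath_width(altitude_m: float, hfov_deg: float) -> float:
    """Ground swath width for a nadir-pointing scanner with the given horizontal FOV."""
    return 2.0 * altitude_m * math.tan(math.radians(hfov_deg / 2.0))

def point_density(points_per_s: float, speed_ms: float,
                  altitude_m: float, hfov_deg: float) -> float:
    """Nominal density (points/m^2): points emitted per second divided by the
    ground area swept per second (flight speed x swath width)."""
    return points_per_s / (speed_ms * swath_width(altitude_m, hfov_deg))

# Assumed scanner point rate (not given in the article); the 81.7 deg horizontal
# field of view is the figure quoted for the Mapper.
RATE_PTS_PER_S = 240_000
HFOV_DEG = 81.7

print(point_density(RATE_PTS_PER_S, 5.0, 30.0, HFOV_DEG))   # ~925 points/m^2 vs. ~900 quoted
print(point_density(RATE_PTS_PER_S, 10.0, 70.0, HFOV_DEG))  # ~198 points/m^2 vs. ~190 quoted
```

Under these assumptions the estimates land close to the 900 and 190 points/m² figures in the article, which suggests the quoted densities are essentially point rate divided by area covered per second.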
