14 Platform one October/November 2019 | Unmanned Systems Technology

Sensors

Lidar takes low-power view

A European start-up is developing an integrated Lidar laser ranging sensor subsystem that uses a new approach to analysing the high-volume point clouds produced by such sensors (writes Nick Flaherty).

Instead of using a convolutional neural network (CNN) to implement machine learning on Lidar data, the Outsight system is based on a multi-spectral laser using the shortwave infrared spectrum, coupled with a dedicated low-power processor rather than a large array of graphics processing units.

Outsight is a spin-out of Dibotics, and uses an 'augmented Lidar' processing chip from Dibotics. The chip can work with a wide range of Lidar sensors, but Outsight is working with solid-state Lidar sensor maker XenomatiX to combine the chip and sensor.

The resulting subsystem has a range of 200 m and produces frames of laser points at 50 Hz. Its field of view is 30 x 10º and it has a resolution of 0.2º.

As with mainstream Lidar systems, it detects road markings, lanes and traffic signs, but the multi-spectral sensor also detects vegetation at the side of the road and the state of the road surface. That includes road hazards such as black ice, snow and debris at a range of hundreds of metres, which are notoriously difficult for laser sensors to detect.

It is hard to prove that CNNs meet the functional safety requirements of the ISO 26262 standard, and they can mis-classify objects. The Outsight sensor and processing identifies, or classifies, the points in the cloud of Lidar data, and can flag up hazards quickly, as there is no need to wait for several frames before the system can make a decision.

The embedded chip also has an 'ego-motion' mode that understands on a frame-by-frame basis how the vehicle is moving, and it can work with a reference map to provide localisation without the need for satellite navigation. Another mode creates a moving 3D map around the vehicle, allowing a virtual frame from the sensor to be created by integrating hundreds of the sensor frames.

Because the classification in the embedded software is deterministic, it is easier to demonstrate a high level of safety and ISO 26262 compliance. The output can also then be used by a CNN machine learning network.

Aerial vehicles

Cargo carrier is in range

A US company is developing a hybrid vertical take-off UAV that can carry a payload of 100 to 225 kg over a range of 500 km (writes Nick Flaherty).

Elroy Air's Chaparral uses battery power and six rotors for vertical take-off, and a gasoline turbine powering a seventh rotor for horizontal flight to achieve that range using a conventional 10 m wing.

The cargo would be loaded autonomously and carried in a pod underneath the aircraft. A grasping mechanism grabs the pod, which is winched up against the body of the fuselage for transport. The system is currently being tested ahead of commercial operations next year.

The Chaparral is designed to transport 100-225 kg