Unmanned Systems Technology 025 | iXblue DriX I Maintenance I UGVs I IDEX 2019 I Planck Aero Shearwater I Sky Power hybrid system I Delph Dynamics RH4 I GCSs I StreetDrone Twizy I Oceanology Americas 2019
Testing

As regards testing, Twining says, “We are test-heavy – we have indoor testing, a boat for offshore testing and an outdoor test range, and we test multiple times a week. You may think you understand everything about a situation, but there is no substitute for testing to address the nuances that are not obvious but are really important.”

The Shearwater has been tested at night by the US Marine Corps at its Autonomous Testing Ground, using the standard camera module. Near-infrared (NIR) lamps on the truck provide enough lighting for the standard landing camera to operate effectively at night and still detect the marker for landing, as the camera’s CMOS sensor is also sensitive at NIR wavelengths.

Extending the platform

Planck Aero is extending the platform in two ways. The first is using the TX2 during a mission for tracking, geo-location and advanced computer vision. Planck has also developed its own imaging algorithms. Most of the object detection and tracking is done by machine learning, with some algorithmic vision for detecting individual objects such as people, vehicles, boats or even whales.

Once an object has been identified using a proprietary deep neural network (DNN), a sensor fusion function provides geo-location information and closed-loop control to follow a single object, or to track multiple objects for as long as they are in the field of view. Following multiple identified objects requires input from the operator on which ones to follow.

A key advantage of the Planck approach is that the DNN is field-trainable. Most DNN systems are pre-trained and are not updated during a mission. Planck, however, has developed a way to pull in new data to improve the algorithms being used. This happens not quite in real time but at the point of need, and requires some user input on the training dataset via the GCS, for example to confirm what a particular object is.
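The detect-then-track pipeline described above pairs per-frame DNN detections with persistent tracks. Planck’s actual fusion and tracking logic is proprietary; the sketch below shows only the generic association step, assuming greedy nearest-neighbour matching (`Track`, `associate` and the 50-pixel gate are illustrative names and values, not Planck’s):

```python
import math

class Track:
    """A tracked object: a persistent ID plus its latest observed position."""
    def __init__(self, tid, pos):
        self.tid, self.pos = tid, pos

def associate(tracks, detections, gate=50.0):
    """Greedily match each existing track to its nearest new detection.

    Detections left unmatched (or farther than `gate` pixels from any
    track) spawn new tracks, so objects entering the field of view are
    picked up automatically.
    """
    unmatched = list(detections)
    for tr in tracks:
        if not unmatched:
            break
        best = min(unmatched, key=lambda d: math.dist(tr.pos, d))
        if math.dist(tr.pos, best) <= gate:
            tr.pos = best          # update the track with the new position
            unmatched.remove(best)
    next_id = max((t.tid for t in tracks), default=-1) + 1
    for d in unmatched:            # anything left over becomes a new track
        tracks.append(Track(next_id, d))
        next_id += 1
    return tracks
```

An operator-selected track ID would then drive the follow behaviour. A fielded system would normally add a motion model (e.g. a Kalman filter) so that association uses predicted positions rather than the last raw observation, which matters in the cluttered, multi-object scenes Twining mentions.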
“We use a lot of algorithmic filtering, especially with multi-object tracking in cluttered environments,” says Twining.

The geo-location comes from the sensor fusion. “Because we have all the state information of the sensors, we can generate geo-location with terrain data, so we can detect things [and have the information about where they are] in the real world,” he says. “That means if the data is being shared on ATAK, any images show up in the correct position [on the other screens].”

This is achieved by working out the position of the tracked object relative to the UAV’s body frame, using the sensors to determine the orientation of the UAV. This is then combined with GNSS satellite positioning data to give an accurate position of the object on the ground. All the filtering and control systems can be fed with other sensor data, including RTK corrections for higher levels of accuracy where available. Centimetre-level geo-location isn’t necessary, however. “Most of our customers don’t need centimetre accuracy – an accuracy of metres is sufficient,” Twining says.

Mobile tethered system

The other advantage of the landing system is somewhat counterintuitive. Instead of using the marker to land the UAV, it can be used to keep it in a precise position in the air on the end of a tether connected to a moving vehicle or boat. “Using the vision system for precision station-keeping means it is GPS-optional and allows the UAV to stay over a moving craft or vehicle, despite waves or bumpy roads,” says Twining.

Power and data are carried over the tether from the comms module, and the imaging system uses the QR code to keep the UAV on station at a height of up to 80 m. Beyond that height the weight of the tether and the wind become too much of an issue.
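The body-frame-to-ground geo-location step described above can be sketched as a ray intersection: rotate the camera’s line of sight from the body frame into a world frame using the UAV’s attitude, then find where that ray meets the terrain. This is a minimal sketch assuming a flat-terrain model and a local NED frame; the function names and interfaces are our own for illustration, not Planck’s:

```python
import math

def rot_body_to_ned(roll, pitch, yaw):
    """ZYX (yaw-pitch-roll) rotation matrix taking body-frame vectors to NED."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def geolocate(uav_ned, attitude_rad, los_body, terrain_down=0.0):
    """Intersect the camera line of sight with flat terrain.

    uav_ned: UAV position (north, east, down); down is negative above ground.
    attitude_rad: (roll, pitch, yaw) in radians.
    los_body: line-of-sight vector to the object, in the body frame.
    Returns the (north, east) ground position of the object, or None if the
    ray never reaches the ground.
    """
    R = rot_body_to_ned(*attitude_rad)
    d = [sum(R[i][j] * los_body[j] for j in range(3)) for i in range(3)]
    if d[2] <= 1e-9:
        return None  # ray points level or upwards
    t = (terrain_down - uav_ned[2]) / d[2]  # scale to reach the terrain plane
    return uav_ned[0] + t * d[0], uav_ned[1] + t * d[1]
```

Replacing the flat `terrain_down` plane with a terrain elevation model, and the plain GNSS position with an RTK-corrected one, gives the accuracy options the article mentions without changing the structure of the calculation.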
Knowing the position of the UAV accurately on the end of the tether, from the geo-location sensor fusion, then provides the same object-location capability detailed above, while the tether provides a mission time limited only by the power available from the truck or boat. This can come from the vehicle’s battery or even a generator, providing a day-long mission capability. “We are seeing a strong demand for this,” Twining says.

Unmanned Systems Technology | April/May 2019

The Shearwater’s controller is based around the Nvidia TX2 module, which will also provide tracking, geo-location and advanced computer vision
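At its core, the visual station-keeping used on the tethered system is a feedback loop that drives the marker’s offset in the camera image back to the centre of the frame. The sketch below is a minimal PD controller with illustrative gains, not Planck’s implementation; `station_keep_step` and its pixel-error interface are assumptions for the example:

```python
def station_keep_step(err_x, err_y, prev_err, dt, kp=0.8, kd=0.2):
    """One PD step: marker offset in the image -> lateral velocity commands.

    err_x, err_y: current offset of the QR marker from the image centre.
    prev_err: (err_x, err_y) from the previous frame, for the derivative term.
    Returns (vx, vy) velocity commands that re-centre the UAV over the
    moving platform, with no dependence on GPS.
    """
    dx = (err_x - prev_err[0]) / dt  # error rate, damps overshoot
    dy = (err_y - prev_err[1]) / dt
    vx = kp * err_x + kd * dx
    vy = kp * err_y + kd * dy
    return vx, vy
```

Because the error is measured directly against the marker on the moving vehicle, the loop automatically tracks waves or road bumps; in practice a scale factor from pixels to metres (from the known marker size and camera model) and an integral term for steady wind would be added.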