Unmanned Systems Technology 017 | AAC HAMR UAV | Autopilots | Airborne surveillance | Primoco 500 two-stroke | Faro ScanBot UGV | Transponders | Intergeo, CUAV Expo and CUAV Show reports

“[By keeping] our system relatively simple, we hope to avoid excessive hardware or power requirements for our algorithms to run, and to remove the cognitive overload from operators, who may already be dealing with a considerable amount of information when conducting their UAV missions.

“If you want to enable a craft to recognise a new type of object the system has not been trained on previously, a few videos and photos [which can be taken by the UAV’s operator in the field] that distinguish the object are enough for our training algorithm to run and add this new object class in real time.”

The algorithms are based on unsupervised (self-organising) feature representation models and adaptive coding of high-level features, enabling the system to build decision rules that are as accurate and computationally efficient as possible. The user interface was programmed in C++ and Java.

“It’s not a rehash of TensorFlow, Caffe or other libraries, networks or frameworks for deep learning and machine intelligence – our software is based on our scientists’ original research,” Lysyuk said.

A common approach in similar situations is to have much of the work carried out in the cloud or at the UAV’s ground station. Molfar’s approach, however, is to do the processing on the UAV itself. “When you consider the limited bandwidth UAVs have for control – for video transmission and so on, as well as security – it’s crucial that our system doesn’t communicate raw data. It just transmits condensed information at the point of object detection – its coordinates and the image, for example.”

Molfar has a prototype computing board that is designed as a testbed for its software. It runs from a 5 V DC power input through a USB interface, and is based on a Raspberry Pi 3 to highlight the system’s minimal SWaP and processing power requirements.
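Molfar has not published its training algorithm, but the general idea described above – adding a new object class from a handful of field examples without retraining the whole model – can be illustrated with a nearest-centroid classifier over fixed feature embeddings. This is a sketch only: the `embed` function below is a stand-in for whatever feature representation model the real system uses, and all names are hypothetical.

```python
import numpy as np

def embed(image: np.ndarray) -> np.ndarray:
    """Placeholder feature extractor (assumed, not Molfar's method).

    Here we simply flatten and L2-normalise the pixels; a real system
    would use a learned representation model.
    """
    v = image.astype(np.float64).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

class FewShotClassifier:
    """Nearest-centroid classifier: a new class needs only a few examples."""

    def __init__(self):
        self.centroids = {}  # class name -> mean embedding

    def add_class(self, name: str, examples: list) -> None:
        # A handful of photos or video frames is enough to form a centroid,
        # so a class can be added in the field without retraining.
        vecs = np.stack([embed(img) for img in examples])
        self.centroids[name] = vecs.mean(axis=0)

    def classify(self, image: np.ndarray) -> str:
        # Return the class whose centroid is most similar to the query.
        v = embed(image)
        return max(self.centroids, key=lambda c: float(self.centroids[c] @ v))
```

Because adding a class is just an averaging step over a few embeddings, it runs in real time even on modest hardware – consistent with the low-SWaP goal described above.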
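The bandwidth argument above amounts to a protocol decision: downlink a small, structured detection report (object class, coordinates, a cropped thumbnail) instead of raw video. A minimal sketch of such a message follows; the field names and JSON encoding are illustrative assumptions, not Molfar's actual format.

```python
import base64
import json
from dataclasses import dataclass, asdict

@dataclass
class DetectionReport:
    """Condensed downlink message (field names are hypothetical)."""
    object_class: str
    latitude: float
    longitude: float
    timestamp_s: float
    thumbnail_jpeg: bytes  # small crop of the detected object, not raw video

    def to_json(self) -> str:
        d = asdict(self)
        # Base64-encode the JPEG bytes so the report stays plain JSON text.
        d["thumbnail_jpeg"] = base64.b64encode(self.thumbnail_jpeg).decode("ascii")
        return json.dumps(d)

# Placeholder JPEG bytes stand in for a real thumbnail crop.
report = DetectionReport("vehicle", 50.45, 30.52, 1512345678.0,
                         b"\xff\xd8...\xff\xd9")
msg = report.to_json()
```

A report like this is a few hundred bytes (plus the thumbnail) per detection, versus the megabits per second a raw video feed would consume on the control link.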
Worthington Sharpe displayed its Wing GCS ground control station, which it says is a fully integrated solution for complex UAV missions. Sam Worthington told us, “The central feature is the patented Wing control device. This looks and behaves like a computer mouse, but the upper part moves to enable pitch and roll control, while twin paddle grips provide yaw control, and the scroll wheel can be toggled to control the throttle.

“It still acts as a mouse for precise interaction through the GCS’ computer screen, but now you can do that and control the flight of the UAV all with one hand.”

The other hand is free to operate the full-sized keyboard or secondary joystick for mission analysis and for controlling the camera gimbal during flight. Alternatively, on two-person missions, the pilot can operate the UAV via a conventional transmitter while the engineer uses the Wing’s 3D functions to control the gimbal.

The computer in the GCS can also be tailored to the user’s requirements. “We could fit a Lenovo workstation for increased data processing, for example. The GCS is modular – we can put in multiple screens, offer a hand-luggage-sized version, and add a tripod mounting,” Worthington said.

UAV technology company A-Techsyn showcased its PilotUS autopilot unit, for UAVs ranging in

December/January 2018 | Unmanned Systems Technology
