
The first element of ITRI's system is the remotely operated autonomous drone (ROAD), which manages the communications connection. It has three control channels – the LTE cellular network, 2.4 GHz remote control and 915 MHz long-range telemetry. Disconnection recovery and emergency handover between the control channels ensure that there is always a connection, and the UAV can fly away from LTE dead zones; if all the connections fail, there is an emergency landing capability. A multi-flight fleet management system integrates UAV flight data and three streaming cameras, allowing a user to select a particular UAV to control while simultaneously displaying the positions of all the UAVs on a map.

[Image caption: The ITRI institute has overcome LTE 4G latency to allow control of multiple UAVs]

The second component is Tribrid real-time streaming, a heterogeneous multi-video streaming platform that provides hybrid data streaming from three cameras over the LTE link. This gives a first-person view, 30x HD optical zoom and a thermal camera for inspection and night vision.

The third element is autonomous patrolling software called Super Range, which allows the ROAD to operate autonomously on periodic patrolling tasks in extended areas over a month. It uses a recharging system that includes an automatic stabilised landing stage, a charging safety monitoring subsystem and support for power sources such as a direct methanol fuel cell power module, a regular power line or dc sources from solar cells or other generators. There is also a buffering system to provide a continuous and stable charging supply.
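ITRI has not published the handover logic between the ROAD's three control channels, but the behaviour described above – prefer one link, recover dropped connections and fall back to an emergency landing when every channel is lost – can be illustrated with a minimal sketch. The channel names, priorities and health checks below are assumptions for illustration, not ITRI's implementation.

```python
# Hypothetical sketch of priority-based control-channel failover for a UAV
# such as ITRI's ROAD. Channel names, priorities and health checks are
# illustrative assumptions, not the actual implementation.

from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class ControlChannel:
    name: str
    priority: int                    # lower value = preferred link
    is_healthy: Callable[[], bool]   # link-quality check supplied by the radio driver


class FailoverManager:
    def __init__(self, channels: list[ControlChannel]):
        # Keep channels ordered by preference, e.g. LTE, 2.4 GHz RC, 915 MHz telemetry
        self.channels = sorted(channels, key=lambda c: c.priority)
        self.active: Optional[ControlChannel] = None

    def select_channel(self) -> Optional[ControlChannel]:
        """Pick the highest-priority healthy link; return None if every link is down."""
        for channel in self.channels:
            if channel.is_healthy():
                if channel is not self.active:
                    print(f"Handover to {channel.name}")
                self.active = channel
                return channel
        self.active = None
        return None

    def tick(self, trigger_emergency_landing: Callable[[], None]) -> None:
        """Called periodically by the flight controller."""
        if self.select_channel() is None:
            # All three control channels have failed: land safely
            trigger_emergency_landing()


# Example wiring (the health checks would come from the radio drivers):
# manager = FailoverManager([
#     ControlChannel("LTE", 0, lte_link_ok),
#     ControlChannel("2.4 GHz RC", 1, rc_link_ok),
#     ControlChannel("915 MHz telemetry", 2, telemetry_link_ok),
# ])
```

In a real system the health checks would combine signal strength, packet loss and heartbeat timeouts, and the LTE channel would also drive the 'fly away from dead zones' behaviour, which is omitted here.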
Software developed by NASA for controlling unmanned space systems has been adopted for driverless cars. Its Visual Environment for Remote and Virtual Exploration (VERVE) is a 3D graphical interface for visualising data and remote environments in real time. VERVE has been used to remotely operate and manage NASA's K10 planetary rover in the Canadian Arctic and the Arizona desert, as well as a lunar rover prototype for the Resource Prospector mission concept. Nissan is now using VERVE, which is available as open-source software, for an intelligent mobility system that allows autonomous and driverless cars to co-exist with human drivers.

"VERVE provides the foundation for human-robot teaming in Nissan's new Seamless Autonomous Mobility [SAM] system," said Terry Fong of the NASA Ames research laboratory. "It allows humans to provide assistance to autonomous vehicles in unpredictable and difficult situations when the vehicles cannot solve the problem themselves."

A Nissan 'mobility manager' at CES used the SAM system to remotely help self-driving cars navigate around construction machinery and other difficult road obstacles using V2I (vehicle-to-infrastructure) wireless links. While vehicle sensors such as Lidar, cameras and radar can tell the car where obstacles are and what state a traffic light is in, and can even recognise some hand gestures, Nissan said human judgement is required to understand what other drivers and pedestrians are doing and to decide on the appropriate course of action.

With SAM, the autonomous vehicle brings itself to a safe stop and requests help from the command centre. The request is routed to the first available mobility manager, who uses vehicle images and sensor data streamed over the wireless network to assess the situation, decide on the correct action and create a safe path around the obstruction. The mobility manager then 'paints' a virtual lane for the vehicle to drive along. When safely past the obstruction, the car is released to continue by itself along the designated route and resumes autonomous operation. The virtual lane is then shared with other vehicles facing the same situation (a simple sketch of this request-and-assist loop appears below).

XDynamics has developed a UAV called Evolve that provides 1080p full HD video at 60 fps with a transmission latency below 10 ms. This effectively zero-latency transmission allows film-makers to monitor framing and composition in real time. The Evolve is a quadcopter with a monocoque chassis constructed from carbon fibre, integrating the body and the chassis into a single unit. A 4k camera with a 21 mm f/2.8 lens and a 12.4 MP Sony CMOS image sensor is mounted on a three-axis gimbal stabiliser, and records 4k video at 24 fps, 2.7k at 60 fps, 1080p at 120 fps and 720p at 240 fps. The camera is linked to the Connex Mini Embedded module from Amimon.
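Nissan has not detailed SAM's software interfaces, but the request-and-assist loop described above – stop safely, ask a remote mobility manager for help over the V2I link, follow the 'painted' virtual lane, then resume autonomy and share the lane – can be sketched as follows. The type and function names here are illustrative assumptions, not Nissan's API.

```python
# Hypothetical sketch of the request-and-assist loop described for Nissan's
# SAM system. Types, names and the command-centre API are illustrative
# assumptions, not Nissan's implementation.

from dataclasses import dataclass


@dataclass
class VirtualLane:
    waypoints: list[tuple[float, float]]   # safe path 'painted' by the mobility manager


class CommandCentre:
    """Stands in for the remote command centre reached over the V2I link."""

    def request_assistance(self, vehicle_id: str, sensor_snapshot: dict) -> VirtualLane:
        # The first available human mobility manager reviews streamed images
        # and sensor data, then returns a safe path around the obstruction.
        raise NotImplementedError


class AutonomousVehicle:
    def __init__(self, vehicle_id: str, command_centre: CommandCentre):
        self.vehicle_id = vehicle_id
        self.command_centre = command_centre

    def handle_unresolvable_obstacle(self, sensor_snapshot: dict) -> None:
        # The vehicle cannot solve the situation itself, so it stops and asks for help
        self.stop_safely()
        lane = self.command_centre.request_assistance(self.vehicle_id, sensor_snapshot)
        self.follow(lane)
        # Once past the obstruction, resume autonomous driving and share the
        # lane so other vehicles in the same situation can reuse it
        self.resume_autonomy()
        self.share_lane(lane)

    # Vehicle-side behaviours, left as stubs in this sketch
    def stop_safely(self) -> None: ...
    def follow(self, lane: VirtualLane) -> None: ...
    def resume_autonomy(self) -> None: ...
    def share_lane(self, lane: VirtualLane) -> None: ...
```

The point reflected in this structure is that the remote human supplies only a safe path; the vehicle continues to drive itself along it before resuming full autonomy.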
