Platform one

Researchers at NASA are also working on natural language control interfaces for autonomous systems (writes Peter Donaldson). Working from the premise that unmanned systems will be easier to use once their command and control systems understand what operators are talking about, Eastern Michigan University graduate linguistics student Erica Mezaros has been working as an intern at the NASA Langley Autonomy Incubator on a system comparable to Apple's Siri.

Mezaros recorded operators' conversations before, during and after flying multicopters, and used a computational linguistics tool called latent semantic analysis (LSA) to determine the degree of relationship between the things the operators said, termed 'utterances'.

She built a matrix of relevant utterances and the terms used in them, arranging the terms as rows and the utterances as columns, and giving each term a value according to how often it was used. Mezaros then used LSA to plot the information on a graph or map, where terms formed clusters related to subjects including visual observations ('want' and 'see', for example), hardware ('work', 'turn'), immediate commands ('whoa', 'now', 'stop'), software ('script', 'run') and data analysis ('check', 'data'). A sketch of this matrix-and-clustering approach in code is given at the end of this item.

She then developed a prototype information provider that uses this set of relationships to infer operators' intentions from their speech, with the aid of Carnegie Mellon University's Sphinx 4 open source speech recognition software. When the information provider recognises a question, it maps the utterance and its immediate context onto the semantic map, works out which cluster it is most closely related to, and provides the relevant information to the user, who then gives feedback for quality control.

The prototype works well but slowly, so Mezaros wants to speed it up, perhaps trading robustness for speed; integrate it with a vehicle control system; provide a way to measure numerically how well it is working; and, later, use gestures to clarify demonstrative language such as 'that drone' and 'not that drone'.
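As a rough illustration of the approach described above, the following is a minimal sketch in Python using scikit-learn. It builds a term-utterance count matrix, reduces it with a truncated SVD (the core of LSA), and classifies a new, speech-recognised utterance by cosine similarity to the known utterances. The sample utterances, cluster labels and the nearest_cluster helper are hypothetical stand-ins; the article does not publish Mezaros' data or code, and her actual pipeline used Sphinx 4 for the speech recognition step.

```python
# Minimal LSA sketch, assuming hypothetical operator utterances.
# None of the data below is from Mezaros' recordings.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical utterances, grouped roughly like the article's clusters
utterances = [
    "I want to see the camera feed",        # visual observations
    "turn it on and check it works",        # hardware
    "whoa stop now",                        # immediate commands
    "run the landing script",               # software
    "check the data from the last flight",  # data analysis
]
labels = ["visual", "hardware", "command", "software", "analysis"]

# Term-utterance count matrix. scikit-learn's convention is
# utterances-as-rows, the transpose of the article's terms-as-rows
# layout; the SVD factors are the same either way.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(utterances)   # shape: (utterances, terms)

# LSA proper: a truncated SVD projects each utterance into a
# low-rank semantic space where related terms cluster together.
lsa = TruncatedSVD(n_components=2, random_state=0)
doc_vecs = lsa.fit_transform(X)            # one point per utterance

def nearest_cluster(text):
    """Project a new utterance into the semantic space and return
    the label of the most similar known utterance."""
    vec = lsa.transform(vectorizer.transform([text]))
    sims = cosine_similarity(vec, doc_vecs)[0]
    return labels[int(np.argmax(sims))]

print(nearest_cluster("stop right now"))   # expected: 'command'
```

On real recordings the matrix would be far larger and the number of SVD components tuned accordingly; the point is only to show how term co-occurrence alone can group 'stop' and 'now' with commands rather than with, say, data analysis.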
Communications

Maiden flight for web UAV

Facebook has made the first test flight of its full-sized unmanned aircraft, Aquila (writes Nick Flaherty). The UAV has a wingspan of 130 ft, and is intended to fly at between 60,000 and 90,000 ft to provide internet access over an area of more than 400 sq km.

[Image caption: Aquila will provide internet access over more than 400 sq km for up to 90 days]

The flight lasted 90 minutes, three times longer than expected, with the craft drawing 2 kW from its batteries to fly at 25 mph. In operation it will draw 5 kW from batteries charged by solar panels to remain in the air for up to 90 days, and will be able to travel at up to 80 mph at the higher altitudes; a rough energy budget for these figures is sketched at the end of this item.

The aim of the test flight was to correlate the aircraft's computational fluid dynamics models (see Focus, page 32) with real-world performance at different altitudes. The flight also exercised the autopilot, which had previously been flown on a quarter-scale model, and it performed as expected. During all manoeuvres it was stable and accurate, and brought the aircraft to the commanded condition in the predicted time. The automatic landing algorithm also performed well, tracking the glide path and the centreline with the expected accuracy.

To keep the weight down to 300 kg, Aquila is launched from a dolly. It is held on the dolly by four straps that are cut by small explosive charges on a signal from the autopilot when take-off speed is reached.

It didn't all go to plan, though: the craft suffered a structural failure just before landing, and Facebook is not revealing what happened.
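To put the quoted power figures in context, here is a back-of-envelope energy budget. Every solar-side number (irradiance above the clouds, cell efficiency, daylight hours, usable wing area) is an illustrative assumption, not a Facebook specification; only the 5 kW cruise draw comes from the article.

```python
# Back-of-envelope energy budget for Aquila's quoted cruise power.
# Solar-side values are illustrative guesses, not published figures.

cruise_power_kw = 5.0     # from the article
hours_per_day = 24.0
daily_energy_kwh = cruise_power_kw * hours_per_day   # 120 kWh/day

# Assumed solar harvest at 60,000+ ft, above the clouds
irradiance_kw_m2 = 1.0    # close to the full solar constant
cell_efficiency = 0.22    # assumed cell efficiency
daylight_hours = 10.0     # assumed average useful sun per day
panel_area_m2 = 80.0      # guess for a 130 ft (about 40 m) span wing

daily_harvest_kwh = (irradiance_kw_m2 * cell_efficiency *
                     daylight_hours * panel_area_m2)
print(f"consumed {daily_energy_kwh:.0f} kWh/day, "
      f"harvested ~{daily_harvest_kwh:.0f} kWh/day")
# consumed 120 kWh/day, harvested ~176 kWh/day
```

The margin between daytime harvest and round-the-clock consumption is what must carry the craft through the night on batteries, which is why the 300 kg weight target and the low cruise power dominate the design.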