
Simultaneous localisation and mapping | Insight

Cooperative search and rescue

Urban search and rescue operations attending a natural or man-made disaster could benefit greatly from teams of autonomous systems that can cooperate effectively using SLAM to find their way around, building their own maps as they go and then merging them into a more comprehensive map when they meet, or get close enough to communicate. To merge their maps accurately, of course, each system needs to know its own pose in relation to the coordinate system of the shared map. With particle filter-based SLAM algorithms running in each system, this approach has been shown to work efficiently with up to six vehicles.

For example, a team in Germany from the Nuremberg Institute of Technology and the Julius Maximilian University of Würzburg has demonstrated a SLAM approach applicable to multiple autonomous robots, which uses 2D Lidar sensors to build a dynamic representation of the environment. The team's SLAM algorithm uses signed distance functions, which are increasingly useful in machine vision because they store, for each 3D 'voxel' (the volumetric equivalent of a pixel), the distance to the imaged surface, enabling the algorithm to estimate the sensor's pose efficiently.

In a successful 'loop closure' experiment carried out by the team, a single autonomous system, a rescue robot named Simon equipped with a Hokuyo UTM-30LX Lidar, found its way around the first floor of a building and back to its starting point. The task was made more difficult because the environment contained few distinctive features and several laser-unfriendly glass surfaces, and because the start and end points were close together.

In a second single-robot experiment designed to test the team's SLAM framework, reference laser frames from a data repository at the University of Freiburg were used, along with an image of the resulting map, to validate the output of the software.

To test its software with multiple robots, the team used a package called the ROS Simple Two Dimensional Robot Simulator, installed on a PC with a quad-core processor, to provide the simulated robots with artificial laser data. Starting at the same time, the four virtual robots, one for each processor core, successfully explored a labyrinth and built a map of their surroundings.

The team's next multi-robot SLAM experiment involved Simon and Georg, two real autonomous robots that explored separate parts of the same building, again using Lidars as their principal sensors. To validate accuracy, both robots' trajectories again contained loops that had to be closed. Both arrived back at their starting points at the same time, and the results confirmed that the drift errors were very small. In April 2015 the team also entered the robots in the RoboCup Rescue German Open competition, and won.

In future work, the plan is to analyse the accuracy of the estimated trajectories using knowledge of the robots' real positions (known as exact ground truth) and to carry out a detailed timing evaluation.

Experiments like those described above strongly suggest that SLAM techniques will enable teams of robots to cooperate with one another and with humans to take on difficult and dangerous tasks, such as searching inside damaged buildings for casualties after major incidents. SLAM also promises greater robustness for military systems that currently rely on satellite navigation systems, which could be attacked with jammers or even anti-satellite weapons in future conflicts.
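A minimal sketch can make the shared-frame requirement concrete: once each robot's pose in the common coordinate frame is known, its local occupancy grid can be transformed cell by cell into the shared map. The grid resolution, the -1 'unobserved' marker and the merge rule used here are illustrative assumptions, not the team's published method.

```python
# Sketch: merging two robots' local occupancy grids into one shared map,
# assuming each robot already knows its pose (x, y, heading) in the shared
# coordinate frame. Resolution, map layout and merge rule are assumptions.
import numpy as np

RESOLUTION = 0.05  # metres per grid cell (assumed)

def cell_to_world(ix, iy, pose):
    """Transform a cell index in a robot's local map into shared-frame coordinates."""
    x, y, theta = pose
    lx, ly = ix * RESOLUTION, iy * RESOLUTION
    wx = x + lx * np.cos(theta) - ly * np.sin(theta)
    wy = y + lx * np.sin(theta) + ly * np.cos(theta)
    return wx, wy

def merge_into_shared(shared_map, local_map, pose):
    """Write one robot's grid (-1 = unobserved, 0..1 = occupancy) into the shared map."""
    for ix, iy in zip(*np.nonzero(local_map >= 0)):        # observed cells only
        wx, wy = cell_to_world(ix, iy, pose)
        gx, gy = int(round(wx / RESOLUTION)), int(round(wy / RESOLUTION))
        if 0 <= gx < shared_map.shape[0] and 0 <= gy < shared_map.shape[1]:
            # Keep the more pessimistic (more occupied) estimate for each cell.
            shared_map[gx, gy] = max(shared_map[gx, gy], local_map[ix, iy])
    return shared_map
```

Calling merge_into_shared once per robot against a shared grid initialised to -1 produces the combined map; any error in a robot's pose estimate shows up directly as misaligned walls, which is why accurate loop closure matters so much.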
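The role of the signed distance function in pose estimation can be illustrated in the same spirit. If every map cell stores the distance to the nearest imaged surface, a candidate sensor pose can be scored by how close the transformed laser endpoints come to zero distance, and the pose with the lowest total error wins. The 2D field and the brute-force search over candidates below are deliberate simplifications for illustration, not the team's actual optimiser.

```python
# Sketch: scoring candidate sensor poses against a signed distance field.
# Scan endpoints transformed by a good pose land where the stored distance
# to the nearest surface is near zero. Field, scan and candidate list are
# assumed inputs; a real matcher would optimise rather than enumerate.
import numpy as np

RESOLUTION = 0.05  # metres per cell (assumed)

def pose_error(sdf, scan_xy, pose):
    """Sum of |distance to nearest surface| at each transformed scan endpoint."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    error = 0.0
    for px, py in scan_xy:                        # endpoints in the sensor frame
        wx, wy = x + c * px - s * py, y + s * px + c * py
        ix, iy = int(wx / RESOLUTION), int(wy / RESOLUTION)
        if 0 <= ix < sdf.shape[0] and 0 <= iy < sdf.shape[1]:
            error += abs(sdf[ix, iy])             # zero if the point lies on a surface
        else:
            error += 1.0                          # penalise points that leave the map
    return error

def best_pose(sdf, scan_xy, candidate_poses):
    """Return the candidate pose whose scan agrees best with the map."""
    return min(candidate_poses, key=lambda p: pose_error(sdf, scan_xy, p))
```

In practice the smooth gradient of the distance field is a large part of what makes this efficient, because it lets a matcher converge on the best pose with gradient-based optimisation rather than an exhaustive search of candidates.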
SLAM will be valuable even when GPS or other navigation systems are available, thanks to its ability to build and update maps and obstacle databases 'on the fly'. At only about 20 years old, SLAM cannot yet be described as fully mature, but it is getting smarter and potentially more useful by the day. As with many of today's advanced technologies, progress in SLAM comes from a deeper understanding of mathematics and logic, expressed in software rather than hardware. Thomas Bayes and Pierre-Simon Laplace would be astonished to see where their logic has led. Probably.

This search and rescue robot has been developed to find its way around complex and dangerous environments (Courtesy of Tohoku University)
