Unmanned Systems Technology 007 | UMEX 2016 report | Navya ARMA | Launch & recovery systems | AIE 225CS | AUVs | Electric motors | Lethal autonomous weapons
ahead of the ARMA to a distance of around 20 m, and the downward-angled rear sensor similarly monitors activity around the back of the vehicle. This arrangement also allows the upward scan of the rear sensor to monitor the space above the front of the vehicle.

The forward sensor can detect pedestrians and objects, including falling objects. A concrete block dropped from an overhead walkway, for example, would be detected by the vehicle, which could then slow to let the block fall in front of it and stop quickly to protect the passengers.

At the front and rear, Lidar sensors from German company SICK monitor the areas ahead of and behind the vehicle. These give a better reading for range but have a narrower field of view, and are positioned close to the ground to detect low obstacles as well as to compare with and confirm the data coming from the Velodyne sensors. The perception range of the sensors is the key factor limiting the speed of the vehicle, so Navya conducts a wide range of its own tests to determine that range, using both the manufacturer's software and its own.

For the ARMA, however, Navya does not use the manufacturer-supplied software; instead it converts the data from the various sensors into its own format. That makes the company independent of the sensor vendors, so each time a more powerful sensor comes onto the market it can be integrated quickly into the vehicle platform, as Navya does not have to wait for software from the supplier.

The raw data from all four Lidar sensors is stored on board and uploaded when the vehicle reaches a wi-fi connection. This helps improve the map the vehicle uses to navigate. The map is held on board the vehicle, as cloud-based maps are not accurate enough to guide the system.
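The article does not describe Navya's internal data format, but the idea of converting each vendor's raw Lidar returns into a single common representation can be sketched as follows. This is a minimal illustration, not Navya's implementation: the `LidarPoint` class, the `from_polar` converter and the raw tuple layout `(range, azimuth, elevation, intensity)` are all assumptions chosen for the example.

```python
import math
from dataclasses import dataclass


@dataclass(frozen=True)
class LidarPoint:
    # Hypothetical vendor-neutral point: Cartesian coordinates in the
    # vehicle frame, plus return intensity and the originating sensor.
    x: float
    y: float
    z: float
    intensity: float
    sensor_id: str


def from_polar(raw_points, sensor_id):
    """Convert raw polar returns (range_m, azimuth_deg, elevation_deg,
    intensity) into the common Cartesian format. Once every sensor's
    driver emits LidarPoint, downstream code never touches vendor data."""
    points = []
    for rng, az_deg, el_deg, intensity in raw_points:
        az = math.radians(az_deg)
        el = math.radians(el_deg)
        points.append(LidarPoint(
            x=rng * math.cos(el) * math.cos(az),  # forward
            y=rng * math.cos(el) * math.sin(az),  # left
            z=rng * math.sin(el),                 # up
            intensity=intensity,
            sensor_id=sensor_id,
        ))
    return points


# A single return 10 m straight ahead at zero elevation maps to x = 10.
pts = from_polar([(10.0, 0.0, 0.0, 0.5)], "velodyne_front")
```

Swapping in a newer sensor then only means writing one new converter to `LidarPoint`, which is the vendor-independence benefit the article describes.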
Also, Navya says, the telecoms network that would provide data from a cloud-based map is not fast enough to deliver map updates, which could cause a conflict between the map and the sensor data.

Cameras

The front and back of the ARMA have two dual-sensor stereoscopic cameras, which are used for obstacle detection, traffic light detection and mapping. A fisheye camera inside the shuttle provides visual feedback to a supervision centre, so that a remote operator can see what is happening.

"We want all the different information to be in one format for analysing the information in the same way," says Sapet. "The output from all the sensors has to be in the same format for the processor to be able to analyse and take the decisions, using Lidar, GNSS and cameras. If the information is not consistent then the vehicle will take appropriate measures and operate in a safe, degraded mode."

Future sensor developments

The cameras are also being used for a simultaneous localisation and mapping (SLAM) system being developed by Navya, which could operate with the map should the GNSS receiver or Lidars fail. The current challenge with the cameras

[Photo caption: The front and rear Lidar sensors provide longer-range sensing and redundancy]
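Sapet's point that inconsistent sensor data triggers a safe, degraded mode can be illustrated with a simple cross-check. This is only a sketch of the general technique, not Navya's logic: the function name, the pairwise range comparison and the 1 m tolerance are all assumptions made for the example.

```python
def sensor_fusion_mode(range_estimates, tolerance_m=1.0):
    """Compare independent range estimates (e.g. from Lidar and the
    stereoscopic cameras) for the same obstacle.

    range_estimates: dict mapping a sensor name to its estimated
    obstacle range in metres. If all estimates agree to within
    tolerance_m, the vehicle stays in nominal operation; any
    disagreement drops it into a safe, degraded mode.
    """
    values = list(range_estimates.values())
    consistent = all(
        abs(a - b) <= tolerance_m
        for i, a in enumerate(values)
        for b in values[i + 1:]
    )
    return "nominal" if consistent else "degraded"


# Agreeing sensors keep the shuttle in nominal mode; a large
# disagreement forces the degraded mode instead.
mode_ok = sensor_fusion_mode({"lidar": 12.0, "camera": 12.4})
mode_bad = sensor_fusion_mode({"lidar": 12.0, "camera": 20.0})
```

In a real vehicle the "appropriate measures" would go beyond a mode flag (reduced speed, a controlled stop, alerting the supervision centre), but the principle of refusing to trust disagreeing sensors is the same.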