To calculate the vehicle’s position continuously, the navigation software needs to know the steering angle, the wheel revolutions and the orientation of the vehicle. For redundancy, multiple wheel revolution sensors, steering angle sensors and orientation sensors are mounted on both the driven and the steered wheels. The vehicles use a patented technology called the Magnet Measurement System (MMS), which detects reference magnets embedded in the road surface to calibrate the calculated vehicle position. The MMS is mounted behind the front wheels and can detect magnets across its full width. A simple sketch of this dead-reckoning-and-correction approach is given at the end of this section.

These pods are already operating at Level 4 on public roads, but research continues to extend their use to all kinds of roads and conditions for fully autonomous operation at Level 5. To do this, 2getthere is part of a €4m project called i-CAVE (Integrated Cooperative Automated Vehicles). The project is led by the Technical University of Eindhoven, and during 2016 it will look at how to platoon the 2getthere pods together to create a ‘virtual train’, keeping pods just 0.3 s apart and using wireless links as the control mechanism in a scheme called Cooperative Adaptive Cruise Control (a sketch of such a controller also appears at the end of this section).

“Platooning is difficult to develop,” says Sjoerd van der Zwaan, chief technology officer of 2getthere. “It has been the subject of research for years, but an affordable and practical solution is not available yet, and that is the aim of our work within this research project.

“Our interest in platooning is that it will enable us to improve performance and capacity, increasing the maximum number of passengers per hour. This will give us an insight into the control algorithms needed to achieve a secure and robust implementation of platooning, as well as an insight into the sensors and technology needed.”

The i-CAVE programme also includes Ford and TomTom, as well as chip maker NXP, systems provider IBM and truck maker DAF. Together they aim to develop self-learning computer vision technologies that give vehicles a better perception of their surroundings, thereby lowering their dependency on highly accurate maps. The programme is also looking to develop computer vision technologies that allow greater automation of map creation, to improve the way highly accurate maps can be used by thousands or millions of vehicles.

Driverless cars

The race to develop self-driving cars will hot up during 2016, with many of the major car makers targeting 2017-20 for commercial launches. That timetable requires system designs to be finalised during 2016 for a commercial launch in 2018, and they will have to comply with safety standards that have yet to be determined.

Tesla has launched its Autopilot feature as a software upgrade to its electric vehicles, allowing them to steer themselves on motorways. Autopilot will continue to be enhanced to support more urban autonomy, with fully autonomous vehicles planned for 2018, although Tesla doesn’t expect these to be approved until 2021.

“Five or six years from now we will be able to achieve true autonomous driving, where you could literally get in the car, go to sleep and wake up at your destination,” says Elon Musk, Tesla’s CEO.

Tesla’s Model S is also being enhanced with the Summon feature in 2018, which will allow the car to park itself and come to a location when summoned by a smartphone app.

Audi is even more ambitious, aiming to have a fully autonomous version of its A8 available by 2017. Apple is also expanding the engineering team for its electric car design, doubling it to 1200 by the end of 2016, and the company has already negotiated time on a test track for automated vehicles.
Masdar City in Abu Dhabi is running an automated mass-transit system that uses routes defined in software (Courtesy of 2getthere)
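As a rough illustration of the navigation approach described above, the sketch below shows dead reckoning from wheel revolutions and steering angle, corrected whenever a reference magnet is detected. It is only a minimal sketch of the general idea, not 2getthere’s MMS implementation: the kinematic bicycle model, the constants and the correction gain are all illustrative assumptions.

```python
import math

# Illustrative constants; real values depend on the vehicle.
WHEEL_CIRCUMFERENCE = 1.9  # metres travelled per wheel revolution
WHEELBASE = 2.7            # metres between front and rear axles

def dead_reckon(x, y, heading, wheel_revs, steer_angle):
    """Advance the pose estimate from wheel revolutions and steering angle
    using a simple kinematic bicycle model (an assumption, not the MMS)."""
    distance = wheel_revs * WHEEL_CIRCUMFERENCE
    heading += (distance / WHEELBASE) * math.tan(steer_angle)
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading

def magnet_correction(x, y, magnet_xy, gain=0.8):
    """Blend the drifting dead-reckoned position towards the surveyed
    position of a reference magnet detected in the road surface."""
    mx, my = magnet_xy
    return x + gain * (mx - x), y + gain * (my - y)

# Example: advance the estimate, then pull it towards a detected magnet.
x, y, heading = dead_reckon(0.0, 0.0, 0.0, wheel_revs=0.5, steer_angle=0.05)
x, y = magnet_correction(x, y, magnet_xy=(0.96, 0.02))
```

Blending rather than overwriting keeps the estimate smooth if a magnet is mis-detected; a production system would typically use a proper filter together with the redundant sensor set described above.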
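The Cooperative Adaptive Cruise Control idea of holding a 0.3 s gap can likewise be sketched as a feedback controller with a feedforward term for the leading pod’s acceleration, received over the wireless link. The gains and standstill gap here are illustrative assumptions, not values from the i-CAVE project.

```python
TIME_GAP = 0.3        # desired headway in seconds, as quoted in the article
STANDSTILL_GAP = 2.0  # metres kept when stationary; illustrative assumption
K_GAP = 0.45          # gain on spacing error; illustrative tuning
K_SPEED = 1.2         # gain on relative speed; illustrative tuning
K_FF = 1.0            # weight on the leader's broadcast acceleration

def cacc_accel(gap, own_speed, leader_speed, leader_accel):
    """Return the follower's commanded acceleration in m/s^2."""
    desired_gap = STANDSTILL_GAP + TIME_GAP * own_speed
    spacing_error = gap - desired_gap
    speed_error = leader_speed - own_speed
    # Feedback on spacing and relative speed, plus feedforward of the
    # leader's acceleration, which is the extra information the wireless
    # link provides over plain adaptive cruise control.
    return K_GAP * spacing_error + K_SPEED * speed_error + K_FF * leader_accel

# Example: follower 8 m behind the leader, both at 10 m/s, leader braking gently.
command = cacc_accel(gap=8.0, own_speed=10.0, leader_speed=10.0, leader_accel=-0.5)
```

The feedforward term is what allows such short headways: the follower reacts to the leader’s braking as soon as it is broadcast, rather than waiting for the gap and relative speed to change.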