Power supply

Power is distributed via a 12 V harness to the individual sensors. “The power limitation isn’t there unless you start talking about heavy server racks,” says Buckner. “We have a 175 A inverter in the vehicle, and we have done studies on how much power the car is taking and how much extra is available.

“There’s about 90 A drawn under normal conditions, so we have 85 A to play with,” he says. “That supports most of the sensors and a 1000 W server rack without voiding the warranty of the vehicle.” At 12 V, that 85 A of headroom works out to just over 1 kW, consistent with the 1000 W figure.

Software

The Nebula is preconfigured with Ubuntu Linux 16.04 LTS running on the Core-i7 processor, with the latest version of the Robot Operating System (ROS) acting as the middleware layer. The middleware is the heart of the system, as it allows software modules that are as independent as possible to work together in a scalable software architecture. ROS uses a publish/subscribe technology called DDS to link the different modules together (see sidebar: Publish/subscribe). The idea is that high-level modules such as collision detection, image recognition or emergency braking can be like apps on a smartphone, easily plugged into the middleware.

At the lowest level, there are separate software programs (called nodes in DDS) for controlling the steering, acceleration and braking that link to the by-wire CAN interface. These modules have an interface to ROS. Similarly, there are ROS drivers for linking to the laser scanners, cameras, radars and the navigation system.

AS created an arbitrator software module within ROS that decides which module to use at which time. There is a defined list of priorities that depend on the sensing data and the status of the car.

“Developing modular software is easier said than done,” says Buckner. “Generally, if you take one piece out, it crumbles. The idea is that you can take out one piece and you lose only the functionality of that module, but the rest of the system still functions.”

The middleware knows which sensor the data is coming from via the port or USB channel, and makes it available in the ROS world. Any node that wants it then subscribes to the data set, or to parts of it such as the tracks or the visualisation data. The apps have to be built to minimise their dependency on other modules and to use only the lower-level data sources. Each node has a standard, defined set of messages in and out, like a typical API.

The modules currently handle:
• Lidar object detection – determining the ground plane and then identifying objects (sketched at the end of this section)
• Highway autopilot – adaptive cruise control, a lane-keeping algorithm, intelligent lane change, a point-of-interest manager and more
• A shuttle module that allows the MKZ to be trained to follow a predetermined route. This has an option to increase the lateral acceleration limits so the vehicle can take curves faster.

These all make use of the underlying nodes for the sensors, steering, acceleration and braking.

AS is working on another module for object classification from the camera images. This uses deep learning neural network algorithms running in the Nebula to identify cars and road signs. It is still at the training and testing stage before being run live. The offline learning uses a System76 Linux PC with the TensorFlow neural network software development kit to train the algorithm on more than 36 Gbytes of image data. The trained network will then be transferred to the Nebula for live testing.
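As one illustration of that offline stage, the minimal TensorFlow/Keras sketch below trains a small image classifier. The article names TensorFlow but not the network architecture, so the layer stack, the input size and the two-class labelling (car versus road sign) are assumptions, and random arrays stand in for the real 36 Gbyte image set.

```python
# Hedged sketch of an offline image-classifier training run in TensorFlow/
# Keras. The CNN shape and class list are illustrative, not AS's network.
import numpy as np
import tensorflow as tf

num_classes = 2  # e.g. car, road sign; the real label set isn't published

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(16, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(num_classes, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Random placeholder data standing in for the real camera-image set
x = np.random.rand(32, 64, 64, 3).astype('float32')
y = np.random.randint(0, num_classes, size=32)
model.fit(x, y, epochs=1, batch_size=8)
# The trained weights would then be moved to the in-car computer for live tests.
```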
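Returning to the middleware, the publish/subscribe pattern can be illustrated with the minimal rclpy sketch below, in which one node publishes a speed-command topic and a second node subscribes to it. rclpy is the Python client for ROS 2, the generation of ROS that runs over DDS; the topic name and message type here are illustrative, not taken from AS’s stack.

```python
# Minimal ROS 2 publish/subscribe sketch (rclpy; DDS is the transport).
# Topic name and message type are illustrative placeholders.
import rclpy
from rclpy.node import Node
from rclpy.executors import SingleThreadedExecutor
from std_msgs.msg import Float32

class SpeedPublisher(Node):
    def __init__(self):
        super().__init__('speed_publisher')
        self.pub = self.create_publisher(Float32, 'vehicle/speed_cmd', 10)
        self.timer = self.create_timer(0.1, self.tick)  # publish at 10 Hz

    def tick(self):
        msg = Float32()
        msg.data = 5.0  # m/s, placeholder command
        self.pub.publish(msg)

class SpeedSubscriber(Node):
    def __init__(self):
        super().__init__('speed_subscriber')
        self.sub = self.create_subscription(
            Float32, 'vehicle/speed_cmd', self.on_msg, 10)

    def on_msg(self, msg):
        self.get_logger().info(f'speed command: {msg.data} m/s')

def main():
    rclpy.init()
    executor = SingleThreadedExecutor()
    executor.add_node(SpeedPublisher())
    executor.add_node(SpeedSubscriber())
    executor.spin()  # the two nodes only know the topic, not each other

if __name__ == '__main__':
    main()
```

The point of the pattern is the decoupling the article describes: the subscriber never references the publisher, only the topic, so either node can be removed or replaced without touching the other.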
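The arbitrator AS describes is, in essence, a priority scheduler over modules. The sketch below shows one plausible shape for it, assuming a fixed priority list; the module names, priorities and trigger conditions are invented for illustration and are not AS’s actual implementation.

```python
# Hedged sketch of a priority arbitrator: the highest-priority module that
# requests control, given the sensing data and car status, gets to command.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Module:
    name: str
    priority: int                            # lower number = higher priority
    wants_control: Callable[[dict], bool]    # inspects sensing data / status
    command: Callable[[dict], dict]          # produces actuator commands

def arbitrate(modules, state) -> Optional[dict]:
    """Hand control to the highest-priority module that requests it."""
    for m in sorted(modules, key=lambda m: m.priority):
        if m.wants_control(state):
            return m.command(state)
    return None  # no module active: fall back to the driver or a safe stop

# Illustrative modules and priority ordering, not AS's actual list:
modules = [
    Module('emergency_brake', 0,
           lambda s: s['obstacle_distance_m'] < 5.0,
           lambda s: {'brake': 1.0, 'throttle': 0.0}),
    Module('highway_autopilot', 1,
           lambda s: s['on_highway'],
           lambda s: {'steer': -0.1 * s['lane_center_error'], 'throttle': 0.3}),
]
cmd = arbitrate(modules, {'obstacle_distance_m': 3.2, 'on_highway': True,
                          'lane_center_error': 0.0})
print(cmd)  # emergency_brake wins: {'brake': 1.0, 'throttle': 0.0}
```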
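The lidar object-detection pipeline from the list above (find the ground plane, then identify objects) might look like the following sketch. It assumes flat ground at a known height rather than fitting the plane, and uses DBSCAN clustering as one plausible way to group the remaining returns; neither choice is confirmed by the article.

```python
# Hedged sketch of the lidar pipeline: drop ground returns, then cluster
# what remains into objects. A real system would fit the plane (e.g. RANSAC).
import numpy as np
from sklearn.cluster import DBSCAN

def detect_objects(points, ground_z=0.0, ground_tol=0.2, eps=0.5, min_pts=5):
    """points: (N, 3) array of x, y, z lidar returns in the vehicle frame."""
    above = points[points[:, 2] > ground_z + ground_tol]  # remove ground hits
    if len(above) == 0:
        return []
    # Cluster in the horizontal plane; label -1 marks unclustered noise
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(above[:, :2])
    return [above[labels == k] for k in set(labels) if k != -1]

# Synthetic cloud: flat ground plus one box-like object two metres ahead
rng = np.random.default_rng(0)
ground = np.column_stack([rng.uniform(-10, 10, 500),
                          rng.uniform(-10, 10, 500),
                          rng.normal(0.0, 0.02, 500)])
box = rng.normal([2.0, 0.0, 0.8], 0.1, size=(50, 3))
objects = detect_objects(np.vstack([ground, box]))
print(len(objects), 'object(s) found')  # expect 1
```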
The next step is to offer the modules as standalone software to allow customers to get up and running quickly. Every customer receives the throttle, brake and steering wheel command modules. “We are finding that every customer has to build their own speed controllers, so we are releasing a product to allow customers to control the vehicle with a speed and curvature command,” says Buckner. The next module in development would then take in a series of waypoints and output speed and curvature commands. “Then you have a module on top with a customer’s path planning algorithm to produce a series of waypoints using what is seen in the lane model,” he says.

[Image: The Nebula compute engine uses an Intel Core-i7 processor and Nvidia graphics card]
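The speed-and-curvature command interface Buckner mentions can be sketched with a kinematic bicycle model, which maps a curvature demand to a steering-wheel angle. The wheelbase, steering ratio and proportional speed controller below are representative assumptions for a mid-size sedan, not published MKZ parameters.

```python
# Hedged sketch of a speed-and-curvature interface, assuming a kinematic
# bicycle model; the constants are representative, not MKZ specifications.
import math

WHEELBASE_M = 2.85       # assumption
STEERING_RATIO = 14.8    # hand-wheel turns per road-wheel turn, assumption

def curvature_to_steering_wheel(curvature):
    """curvature = 1/radius (1/m); returns hand-wheel angle in radians."""
    road_wheel = math.atan(WHEELBASE_M * curvature)  # bicycle-model geometry
    return road_wheel * STEERING_RATIO

def speed_to_pedals(target_mps, current_mps, kp=0.3):
    """Proportional speed control split across throttle and brake (0..1)."""
    u = kp * (target_mps - current_mps)
    return (min(u, 1.0), 0.0) if u >= 0 else (0.0, min(-u, 1.0))

# 15 m/s on a 50 m radius curve, currently doing 12 m/s
wheel = curvature_to_steering_wheel(1 / 50.0)
throttle, brake = speed_to_pedals(15.0, 12.0)
print(f'wheel {math.degrees(wheel):.1f} deg, '
      f'throttle {throttle:.2f}, brake {brake:.2f}')
```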
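For the waypoint module in development, pure pursuit is one common way to turn a series of waypoints into a curvature command; the article does not name the algorithm AS uses, so the sketch below is illustrative only.

```python
# Pure-pursuit sketch: steer toward the first waypoint at or beyond a
# lookahead distance. Waypoints are (x, y) in the vehicle frame, x forward.
import math

def pure_pursuit_curvature(waypoints, lookahead_m=8.0):
    """Return the curvature of the arc through the chosen lookahead point."""
    for x, y in waypoints:
        d = math.hypot(x, y)
        if d >= lookahead_m:
            return 2.0 * y / (d * d)  # standard pure-pursuit relation
    # All points closer than the lookahead: aim at the last one
    x, y = waypoints[-1]
    d = math.hypot(x, y)
    return 2.0 * y / (d * d)

path = [(2.0, 0.1), (5.0, 0.4), (9.0, 1.2), (14.0, 2.8)]
kappa = pure_pursuit_curvature(path)
print(f'curvature {kappa:.4f} 1/m (radius {1/kappa:.1f} m)')
```

Paired with the speed-and-curvature interface above, this gives exactly the layering Buckner describes: a path planner emits waypoints, this layer emits speed and curvature, and the lowest layer emits steering, throttle and brake.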