
expensive, and the technologies available are still prototypes so they’re not up to automotive specification yet.”

Instead, the mixed operations sensor suite has six calibrated radar-camera pairs providing 360º coverage as the primary detection sensors, complemented by a forward-facing Lidar and a short-range ultrasonic array. The automotive-grade radar is used for object detection and also generates a velocity vector for each object that feeds into the tracking software. The radar then cues the image processing software to concentrate on the precise area of the camera’s image in which it detected something, handing the object over for classification.

“There is a set of predefined ‘actors’ that we characterise for an ODD, which changes for each deployment,” van der Zwaan says. In a deployment where the shuttle is only used with other motorised vehicles, the actors form a subset of the larger set used for mixed-traffic deployments, typically broadened to include motorcycles, cyclists, pedestrians, cars, trucks and buses, he adds. Each actor has its own motion model attached. Coupled with the radar’s detection, the camera’s classification triggers certain expectations.

“If we detect a car in front of us we have a velocity vector, and then we can make a prediction about what the world should look like within the next few seconds,” he says. “We feed that to a tracking system that fuses information coming from the radar-camera pairs on the object level into a unified state of the world, and then we try to predict the near-future state.”

Based on the next predicted state of the surroundings, decision-making software in the vehicle then decides on its immediate course of action. It will consider whether it should continue through its next set of waypoints, brake, emergency brake or deviate from the planned route, for example by making an evasive manoeuvre.
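The pipeline described above can be illustrated with a minimal sketch: a constant-velocity motion model projects each tracked actor a few seconds ahead, and a simple arbiter picks between continuing, braking and emergency braking. All class names, fields and thresholds here are illustrative assumptions, not 2getthere’s actual software interfaces.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """Hypothetical fused radar-camera track for one detected actor."""
    kind: str    # classification from the camera ("car", "cyclist", ...)
    x: float     # distance ahead along the shuttle's path, metres
    y: float     # lateral offset from the path, metres
    vx: float    # radar velocity vector, longitudinal component, m/s
    vy: float    # lateral component, m/s

def predict(track: Track, horizon_s: float) -> Track:
    """Constant-velocity motion model: project the track a few seconds ahead."""
    return Track(track.kind,
                 track.x + track.vx * horizon_s,
                 track.y + track.vy * horizon_s,
                 track.vx, track.vy)

def decide(tracks: list, shuttle_speed: float, horizon_s: float = 3.0) -> str:
    """Choose an action from the predicted near-future state of the world."""
    action = "continue"
    own_x = shuttle_speed * horizon_s        # shuttle's own predicted travel
    for t in tracks:
        p = predict(t, horizon_s)
        in_lane = abs(p.y) < 1.5             # illustrative lane half-width, metres
        if in_lane and p.x < own_x:          # predicted to end up in our swept path
            if t.x < 5.0:                    # already dangerously close now
                return "emergency_brake"
            action = "brake"
    return action
```

In practice each actor class would carry a richer motion model (a cyclist can turn far more sharply than a truck), but the structure — predict per actor, then arbitrate the most severe required action — is the same.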
Perception and behaviour

The decision-making system not only relies on the perception system’s input, it also puts that input into context using a system based on localisation and connectivity that provides information on traffic beyond the sensor system’s view. It also relies on connectivity to the fleet management system and to other suitably equipped vehicles for platooning and for communicating with smart traffic lights.

“Consolidating all that information results in situational awareness that feeds into the decision-making, ultimately determining the vehicle’s behaviour,” van der Zwaan says. “Based on those decisions, the motion control system generates the set points for the actuators to execute the plan.” He emphasises that this detection,

October/November 2020 | Unmanned Systems Technology

(Photo caption: The charging system includes these deployable pads, which are lowered into contact with charging plates installed at selected stops for opportunity charging.)

(Photo caption: A horizontal bar at wheel level senses the vehicle’s position with respect to magnets embedded in the ground on established routes, supporting localisation and precise positioning for charging.)
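The consolidation step van der Zwaan describes — merging on-board perception with information arriving over connectivity, then deriving behaviour — can be sketched as below. The dictionary keys, function names and thresholds are hypothetical, chosen only to show the shape of the idea.

```python
from typing import Optional

def situational_awareness(sensor_objects: list,
                          v2x_objects: list,
                          traffic_light: Optional[str]) -> dict:
    """Merge on-board perception with connectivity inputs into one world state.

    Connectivity extends awareness beyond the sensors' view; objects seen by
    both sources are de-duplicated by id, with on-board detections preferred.
    """
    by_id = {o["id"]: o for o in sensor_objects}
    for o in v2x_objects:
        by_id.setdefault(o["id"], o)
    return {"objects": list(by_id.values()), "traffic_light": traffic_light}

def behaviour(world: dict) -> str:
    """Derive the vehicle's behaviour from the consolidated world state."""
    if world["traffic_light"] == "red":
        return "stop_at_line"
    if any(o["range_m"] < 10.0 for o in world["objects"]):
        return "brake"
    return "follow_waypoints"
```

The chosen behaviour would then be handed to the motion control system, which generates the actuator set points that execute the plan.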
