Unmanned Systems Technology 013 | AutonomouStuff Lincoln MKZ
…the segmentation step; and then these features are extracted from the overall point cloud.

There are four steps handled by the fusion engine. In the Association step, the identified segments are matched with their corresponding objects, and the position and motion of each object between scans are updated using an Interacting Multiple Model (IMM) filter. In the Track Maintenance step, the engine checks the plausibility that certain segments belong to the same object, based on the object's orientation and yaw rate. A decision is then made to merge or split objects according to their behaviour or appearance. For example, two objects following each other at the same velocity and distance could be a truck with a trailer, so they can be merged into one object. The Classification step applies a 'track before detect' mechanism: an object is tracked for as many scans as necessary until it is clearly identified, at which point it receives a label such as car, truck, bike or pedestrian. In the fourth step, the objects from the fusion engine are integrated into the pathfinder software.

Cameras

The AS research platform uses two visible-light cameras; infrared is not needed because the laser sensors operate well at night. A Mobileye 660 single-lens camera is mounted in the front windscreen to provide lane modelling information as well as object tracking and classification. The lane modelling is used in highway driving to determine lane quality, for example how clear the solid, double-solid or dotted lines on the road are. It uses the reflectivity of the lane marker to calibrate how well the camera is working, on a scale from zero to three.

The camera includes the EyeQ2 image processing chip, which provides real-time processing for detecting lanes, vehicles and pedestrians, and measures the dynamic distances between the vehicle and objects on the road. This object data is fed back to the compute engine via the CAN bus.

There are some limitations with this system. For example, an exit from the highway appears as a turn in the road. "Not all of our automation software operates on a predefined map, so we are doing everything in real time, and that can affect how the highway automation works," Hambrick says. "You can use additional camera algorithms to detect or pre-map the road, and tag it to compensate for this, and we supplement the output with our own algorithm to produce a reference image."

The MKZ also uses cameras from Point Grey, with image processing…

Key suppliers
Laser scanners: Velodyne; Ibeo
Camera: Mobileye (now Intel)
Cameras: Point Grey
Radar: Delphi
GPS: Novatel
IMU: KVH
Compute engine: Neousys Technology/in-house
Storage systems: Quantum
Operating systems: ROS; Ubuntu Linux
CAN by-wire interface module: Dataspeed

(Image: a camera from Mobileye handles lane detection on the highway)

"We get the raw detection data out of the radar, which needs to be filtered and processed, and this gives a lot of reflectivity data"
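Returning to the fusion-engine steps: the article names an Interacting Multiple Model filter for the Association step but gives no detail. The following is a heavily simplified Python sketch of the core IMM idea only — mixing two assumed motion regimes (steady cruising versus manoeuvring) according to how well each explains the latest scan. The transition matrix, the scalar velocity estimates and the likelihood values are all illustrative assumptions; a real IMM mixes full Kalman filter states.

```python
import numpy as np

# Assumed probability of switching motion model between scans.
TRANSITION = np.array([[0.95, 0.05],
                       [0.05, 0.95]])

def imm_update(estimates, likelihoods, model_probs):
    """One simplified IMM cycle over scalar velocity estimates.

    Predict the model probabilities through the transition matrix,
    reweight them by how well each model explained the new scan,
    and return the probability-weighted fused estimate.
    """
    predicted = TRANSITION.T @ model_probs      # model mixing step
    posterior = predicted * likelihoods         # per-model Bayes update
    posterior /= posterior.sum()                # renormalise
    fused = float(np.dot(posterior, estimates))
    return fused, posterior

# Example: the cruising model explained the scan better, so the
# fused velocity leans towards its estimate.
vel, probs = imm_update(np.array([14.8, 16.2]),   # per-model velocity, m/s
                        np.array([0.7, 0.3]),     # per-model scan likelihood
                        np.array([0.5, 0.5]))     # prior model probabilities
```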
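The Track Maintenance merge decision can be illustrated directly from the article's truck-with-trailer example: two tracks moving with matched speed, matched yaw rate and a short, steady gap are treated as one object. This is a minimal sketch of such a heuristic; the Track fields and all thresholds are illustrative guesses, not AutonomouStuff's actual logic.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """Hypothetical track state produced by the association step."""
    position: float   # distance along the lane, m
    velocity: float   # m/s
    yaw_rate: float   # rad/s

def should_merge(a: Track, b: Track,
                 max_dv: float = 0.5,     # velocity tolerance, m/s
                 max_gap: float = 3.0,    # spacing tolerance, m
                 max_dyaw: float = 0.05   # yaw-rate tolerance, rad/s
                 ) -> bool:
    """True when two tracks move as one rigid unit, e.g. a truck
    towing a trailer. Thresholds here are illustrative only."""
    return (abs(a.velocity - b.velocity) <= max_dv
            and abs(a.position - b.position) <= max_gap
            and abs(a.yaw_rate - b.yaw_rate) <= max_dyaw)
```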
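The 'track before detect' mechanism in the Classification step can likewise be sketched: keep accumulating class evidence over scans and withhold the label until one class is decisive. The per-scan scores are assumed inputs from the fusion engine, and the 0.9 threshold is an illustrative choice.

```python
from collections import defaultdict
from typing import Optional

class TrackBeforeDetectClassifier:
    """Defer labelling until enough scans agree — a sketch of the
    'track before detect' idea, not the fusion engine's actual code."""

    def __init__(self, threshold: float = 0.9):
        self.threshold = threshold
        self.evidence = defaultdict(float)  # running score per label
        self.scans = 0

    def update(self, scan_scores: dict) -> Optional[str]:
        """Fold one scan's scores (e.g. {'car': 0.6, 'truck': 0.4})
        into the running evidence; return a label only once the mean
        evidence for the best class crosses the threshold."""
        self.scans += 1
        for label, score in scan_scores.items():
            self.evidence[label] += score
        best = max(self.evidence, key=self.evidence.get)
        if self.evidence[best] / self.scans >= self.threshold:
            return best   # clearly identified: car, truck, bike, pedestrian
        return None       # not decisive yet — keep tracking
```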
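Finally, the zero-to-three lane-quality scale derived from lane-marker reflectivity could look something like the sketch below. The band edges are assumptions for illustration, not Mobileye's actual calibration.

```python
def lane_quality(reflectivity: float) -> int:
    """Map a normalised lane-marker reflectivity reading (0.0-1.0)
    onto the zero-to-three quality scale described in the article."""
    if reflectivity < 0.25:
        return 0   # marking effectively invisible
    if reflectivity < 0.50:
        return 1   # faded or unreliable
    if reflectivity < 0.75:
        return 2   # usable
    return 3       # crisp solid, double-solid or dotted marking
```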