use the vehicle in the traditional manner. Mechanically they are all separate, but it is the software algorithm that pulls all the data together for the control system.

The algorithms that control MACE 3 are implemented in hardware, in a field-programmable gate array (FPGA) in the VIM, rather than in software, to achieve the necessary latency and response times. The worst-case latency is the time it takes for an image to be captured at a camera, relayed to the controller in MACE 3 and then transmitted over the comms link to a remote operator. Reducing that latency has been one of the MIRA development team's key aims, and the reduction has come primarily from upgrading the comms system and tweaking the algorithm, bringing the end-to-end latency down to 150 ms.

This is critical for the remotely controlled mode, to give the operator as much time as possible to respond to the images coming from the vehicle, but it also determines how much of that time is available for the FPGA to detect obstacles, as opposed to transmitting and receiving the data. In remote operation, the aim is for the latency to be low enough that control of the vehicle is more or less seamless to the human eye, given some initial practice for the operator to get used to driving it remotely.

"The pinch point on most of these platforms comes down to comms and the data link – this is invariably the weak point and where it can get expensive"

A lot of this expertise in reducing the latency for remote operation came from the development of the Panama platform, where operators took some time to adapt to the control system because the higher latency caused a lag between moving the steering wheel and Panama responding.

(Caption: An earlier version of MACE has been used in autonomous operation for monitoring pipelines)

One way to reduce the latency is to use specialist, fast-response cameras, but on MACE 3 the approach is to use standard, commercially available cameras with standard-definition (SD) sensors with a resolution of 640 x 480 pixels and basic digital processing, rather than a high-definition (HD) sensor with a typical resolution of 1920 x 1080 pixels. Using SD reduces the bandwidth required for the comms link, and also reduces the bandwidth required in the autonomous control unit, allowing a faster response.

The cameras are used to detect an object as soon as it comes into the field of view, and there are three at the front of the vehicle to provide a wide angle of view. In remote control mode the views they give appear on a three-way screen, and the same cameras are used in self-driving mode. The cameras scan the ground ahead at a range of about 20 m, so that the control system doesn't get confused, allowing the vehicle to stop in response to cars and people that may be crossing its path.

The overall control algorithm is relatively simple: as soon as an object crosses the field of view, the vehicle prepares to stop. It reacts over a much shorter range, which is adjustable and typically about 10 m when the vehicle is driving along, bringing it to a stop within 5 m of the object.

MACE 3 also uses lidar laser ranging as part of its suite of sensors, with a 32-scan system from Velodyne. This uses a high-power laser diode to generate a beam that is separated by a
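To put the SD-versus-HD choice described above in perspective, the sketch below works through the raw numbers. The resolutions (640 x 480 and 1920 x 1080) and the 150 ms end-to-end target come from the article; the frame rate, bit depth and link rate are illustrative assumptions, not MIRA figures.

```python
# Back-of-the-envelope comparison of SD vs HD video bandwidth and the
# per-frame serialisation delay it implies on the comms link.
# Resolutions and the 150 ms target are from the article; frame rate,
# bit depth and link rate below are illustrative assumptions only.

SD = 640 * 480          # pixels per SD frame (from the article)
HD = 1920 * 1080        # pixels per HD frame (from the article)
FPS = 25                # assumed frame rate
BITS_PER_PIXEL = 12     # assumed colour depth after basic processing
LINK_MBPS = 100         # assumed usable comms-link rate, Mbit/s

def raw_rate_mbps(pixels: int) -> float:
    """Uncompressed video rate in Mbit/s."""
    return pixels * FPS * BITS_PER_PIXEL / 1e6

def frame_delay_ms(pixels: int) -> float:
    """Time to serialise one uncompressed frame onto the link, in ms."""
    return pixels * BITS_PER_PIXEL / (LINK_MBPS * 1e6) * 1e3

for name, pixels in (("SD", SD), ("HD", HD)):
    print(f"{name}: {raw_rate_mbps(pixels):6.1f} Mbit/s raw, "
          f"{frame_delay_ms(pixels):6.1f} ms per frame on the link")

# HD carries 6.75x the pixels of SD, so every stage that touches the
# raw stream (camera readout, FPGA processing, the radio link) has
# 6.75x less data to move at SD resolution.
print(f"HD/SD pixel ratio: {HD / SD:.2f}x")
```

On these assumed numbers, an uncompressed HD frame alone would take longer than the whole 150 ms budget to serialise onto the link, while an SD frame fits comfortably, leaving time for FPGA obstacle detection and operator reaction.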
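The stopping behaviour described above also lends itself to a minimal sketch. The 20 m camera range, the adjustable reaction range of around 10 m and the 5 m stopping envelope are from the article; everything else (the class structure, names and behaviour labels) is hypothetical, since the article gives only the broad outline.

```python
# Minimal sketch of the threshold-based stopping logic described in the
# article. The 20 m detection range, ~10 m reaction range and 5 m stop
# envelope come from the text; structure and names are assumptions.

from dataclasses import dataclass
from typing import Optional

DETECTION_RANGE_M = 20.0   # cameras scan the ground ahead to ~20 m
STOP_ENVELOPE_M = 5.0      # vehicle should halt within 5 m of object

@dataclass
class StopController:
    reaction_range_m: float = 10.0  # adjustable, ~10 m while driving

    def decide(self, object_range_m: Optional[float]) -> str:
        """Return the commanded behaviour for the nearest detected object."""
        if object_range_m is None or object_range_m > DETECTION_RANGE_M:
            return "cruise"               # nothing in the field of view
        if object_range_m > self.reaction_range_m:
            return "prepare_to_stop"      # object seen, pre-arm braking
        # Inside the reaction range: brake so the vehicle is stationary
        # at least STOP_ENVELOPE_M short of the object.
        return "brake_to_stop"

controller = StopController()
for r in (25.0, 15.0, 8.0):
    print(f"object at {r:4.1f} m -> {controller.decide(r)}")
```

Keeping the decision to a pair of range thresholds mirrors the article's point that the algorithm is deliberately simple, which is what makes it practical to run in the FPGA at low latency.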