Unmanned Systems Technology 005 | Selex ES Falco UAV | Sense and avoid systems | RCV Engines DF70 | DSEI show report | Fuel cells | CUAV Expo, InterDrone and CUAV Show reports | SLAM
Sense and avoid systems | Focus

performance manoeuvres, so regulators have been working with the UAS community to determine the minimum level of aircraft performance for TCAS II. This matters for other airspace users, as helicopters and other aircraft are also being equipped with TCAS II.

Recent flight tests have been run to create encounters that trigger CA manoeuvres automatically to order, to measure the response of the Ikhana platform and to set expectations for how a UAV should respond. The response is expected to require the ability to change altitude at rates of about 500 ft/minute as the lower limit. That is viable for larger platforms such as Ikhana but will certainly be a challenge for smaller aircraft and other UAV systems. The speed at which a UAV can respond is a key consideration.

The replacement for TCAS is called ACAS-X, with the ACAS-Xa version intended as a drop-in replacement for TCAS II. Another version, ACAS-Xu, is being defined specifically for unmanned aircraft by supporting multiple sensor inputs; in the longer term it is expected to take into account aircraft with lower performance and all airspace users, whether or not they carry a transponder. ACAS-Xu may also incorporate support for non-cooperative air-to-air radar and, later, infrared electro-optic sensors. The ACAS-X standards are planned to start coming into use from 2020.

One of the challenges with passive non-cooperative sensors such as camera systems is that they do not give a direct measure of range, which is one of the primary pieces of information used in TCAS and ACAS-X.

The SC-228 detect-and-avoid working group is also developing a standard for the transition to controlled airspace up to Class A at 18,000 ft. This addresses the larger platforms that can reach Class A airspace, and is due at the end of 2016. The next step is to include lower-altitude operation, probably by 2020, and to fit in with the Xu standard.
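The 500 ft/minute lower limit translates directly into how quickly a platform can open vertical separation during a CA manoeuvre. A minimal sketch of that arithmetic, using an illustrative 300 ft separation target (an assumption for the example, not a figure from the TCAS II standard):

```python
def time_to_separation(required_sep_ft: float, climb_rate_fpm: float) -> float:
    """Seconds needed to open a given vertical separation at a steady climb rate."""
    return required_sep_ft / climb_rate_fpm * 60.0

# At the 500 ft/minute lower limit cited above, opening 300 ft of
# vertical separation (illustrative figure) takes:
t = time_to_separation(300, 500)
print(round(t))  # 36 seconds
```

That half-minute or more of manoeuvre time is why the speed at which a UAV can respond, not just whether it can climb at all, is a key consideration.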
Passive visual sensors

Passive sense and avoid using image sensors and machine vision algorithms is seen as the next step, but the challenge is the processing power. Graphics processing units (GPUs) are increasingly common in mainstream processors and can be used to implement elegant, efficient machine vision algorithms, and more processing power can be traded for lower power consumption and smaller size. However, GPU-based systems and the software implementations of the algorithms do not currently have the standards approvals, such as DO-178C, that are needed for avionics equipment.

Part of the challenge in testing these systems is pitting the performance of the machine vision system against a pilot's eyesight, to determine the point at which a pilot picks up a potential collision. Vision systems are consistently beating pilots by two to three miles, with the machine vision picking up potential collision risks without knowing which direction another craft will come from. With test craft cruising at around 120 knots, that is enough for a few minutes in which to instigate collision avoidance.

Vision systems also face some physical challenges. First there is the positioning of the camera sensors, and whether a field of view is required all around the aircraft, above and below. A view below the aircraft is often used for landing, but it suffers from the 'below the horizon' challenge of separating out the background imagery, and fusing the data from multiple cameras around the aircraft adds to the complexity of the CA decision-making. The current regulations, which expect to replicate the view a pilot would have, would require one vision system at the front with a 180º field of view.

There is also the challenge of where to place the cameras on the airframe. In some trials a single array has been seen as more efficient: it carries a weight penalty, but is preferable to needing mounting holes for separate cameras around the airframe.
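The camera-placement trade-off above comes down to how many fixed sensors are needed to tile a given arc. A rough sketch, assuming 60º-FOV camera modules and 5º of overlap at each seam for image stitching (both figures are assumptions for illustration, not from the trials described):

```python
import math

def cameras_needed(coverage_deg: float, camera_fov_deg: float,
                   overlap_deg: float = 0.0) -> int:
    """Minimum number of fixed cameras to tile a horizontal arc,
    allowing some overlap at each seam for stitching."""
    effective_fov = camera_fov_deg - overlap_deg
    return math.ceil(coverage_deg / effective_fov)

# Assumed module: 60 deg FOV, 5 deg stitching overlap per seam.
print(cameras_needed(180, 60, 5))  # 4 cameras for the forward 180 deg arc
print(cameras_needed(360, 60, 5))  # 7 cameras for all-round coverage
```

Each extra camera adds a feed that must be fused into the CA decision-making, which is why a single front-mounted array can end up preferable despite its weight penalty.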
Other solutions use a turret with a camera that can scan a field of view, using a technique called 'step stare' image gathering rather than a fixed lens.

In other work, a European research project is using a standard 5 megapixel CMOS camera with a high-quality lens and a standard dual-core processor to track moving objects in the sky. Those determined to be aircraft are tracked, but the system can also be used to track birds that may be on a collision course. The system will detect objects up to two miles away, giving up to 50 s for

Unmanned Systems Technology | Dec 2015/Jan 2016

The ScanEagle UAV is being used to test small radar systems (shown in red) for sense and avoid applications (Courtesy of Insitu)
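The two-mile detection range and the roughly 50 s of warning time quoted for the research project can be cross-checked with a simple closing-speed calculation. A minimal sketch, assuming the article's miles are statute miles and taking the 120 kt cruise figure quoted earlier as the closing speed (both are assumptions for illustration):

```python
MILE_M = 1609.34             # statute mile in metres (assumption: the
                             # article's "two miles" read as statute miles)
KT_TO_MS = 1852.0 / 3600.0   # knots to metres per second

def reaction_time_s(detection_range_m: float, closing_speed_ms: float) -> float:
    """Seconds from first detection until a head-on intruder arrives."""
    return detection_range_m / closing_speed_ms

# Detection at 2 miles against ~120 kt of closure:
t = reaction_time_s(2 * MILE_M, 120 * KT_TO_MS)
print(round(t))  # 52 s, consistent with the 'up to 50 s' figure
```

Note that against another 120 kt aircraft head-on the closure doubles and the warning time halves, which is why detection range matters so much for passive systems.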