82 | Summer 2015 | Unmanned Systems Technology

Mind-controlled systems

As the latest unmanned systems and their payloads become ever more complex, so too do their control systems (writes Stewart Mitchell). In an attempt to remove the complex electro-mechanical interface between pilot and UAS, research is being carried out by UAV manufacturer Tekever, based in Portugal's capital Lisbon, to develop a system by which a pilot can control a UAS in 3D space using just their brain.

The project, dubbed Brainflight, explores a specific type of brain-computer interface (BCI) that allows a user wearing an electroencephalography (EEG) cap, fitted with an array of electrodes and connected to a decoder and control unit, to control a UAV with no external stimulation. The approach, known as operant BCI, takes advantage of the brain's ability to learn how to use new tools through operant conditioning – where an individual makes an association between a particular behaviour and a consequence – using trial and error.

Put simply, the EEG's electrodes measure brain activity, and the decoder adapts the output values of the brain's signals to appropriate levels of control over the UAS. The decoder is designed so that it transforms the ratio between the different brain waves into a signal with a distribution as close to neutral activity as possible, making the difficulty of reaching any target unbiased – in other words, turning an unmanned system to the right should be as difficult or easy as turning it to the left.

The decoder's resistance to external interference is key: electrical noise and overt subject movements (moving the head, chewing, moving the eyes) must be ignored to ensure a good dynamic range and a smooth distribution of signals. Once the brain has detected the relationship between the wave patterns and the change in the UAS, the user then has to learn to control that relationship between a particular brain activity and the result in the real world.
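Tekever has not published the internals of the Brainflight decoder, but the behaviour described above – mapping a ratio of brain-wave band powers onto a zero-centred control signal whose distribution stays close to neutral – can be sketched generically. In this illustration the band choices (alpha vs beta), window length, and normalisation scheme are all assumptions, not details from the article:

```python
import numpy as np

def band_power(window, fs, f_lo, f_hi):
    """Mean spectral power of `window` in the [f_lo, f_hi] Hz band."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].mean()

class RatioDecoder:
    """Hypothetical operant-BCI decoder: maps the ratio of two EEG band
    powers onto a zero-centred command, so steering right is as easy as
    steering left. A running mean/std over recent windows keeps the
    output distribution close to neutral, as the article describes."""

    def __init__(self, fs=256, history=50):
        self.fs = fs            # sample rate of the EEG channel (Hz)
        self.history = history  # number of recent windows for the baseline
        self.ratios = []

    def update(self, window):
        # Alpha (8-12 Hz) vs beta (13-30 Hz) is one common choice;
        # the bands Brainflight actually uses are not published.
        alpha = band_power(window, self.fs, 8, 12)
        beta = band_power(window, self.fs, 13, 30)
        ratio = np.log(alpha / (beta + 1e-12) + 1e-12)

        # Normalise against the user's own recent activity, so the
        # neutral point tracks their baseline rather than a fixed value.
        self.ratios.append(ratio)
        self.ratios = self.ratios[-self.history:]
        mu = np.mean(self.ratios)
        sigma = np.std(self.ratios) + 1e-12

        # Zero-centred, clipped command: -1 (full left) .. +1 (full right).
        return float(np.clip((ratio - mu) / (3.0 * sigma), -1.0, 1.0))
```

Because the baseline is taken from the user's own recent windows, the very first window always decodes to the neutral command, and sustained shifts in the band ratio are what produce a turn – consistent with the trial-and-error learning the article describes.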
The user will eventually be able to control a UAS subconsciously, as Tekever's business development manager Rob Whitehouse explains. "Once the individual has learnt the process behind producing the type of signal needed for the software to recognise their intentions, they will then go from a state of being consciously incompetent to unconsciously competent, and the unmanned system is then able to be controlled completely and safely by the brain, similar to the process of learning how to drive a car," he says. Because the operant BCI approach exploits this natural learning process, it should allow faster and more accurate BCI control.

The technology is set to be implemented in larger unmanned systems and, potentially, cargo and transport aircraft, so that they too can be controlled without the need for onboard personnel, but there are many regulatory hurdles to be overcome before it can be used commercially.

While it may be some time before we see brain-controlled unmanned systems flying around our airspace, the technology has the potential to revolutionise this market, and it also paves the way for artificial intelligence algorithms for future autonomous systems. Currently, a lot of decision-making software is based on several layers of 'if' statements, as in autonomous road cars, which rely on thousands of inputs from onboard sensors to make a 'decision'. The technology being developed by Tekever, however, could allow software developers to further their understanding of how decisions are made in the brain and how they transfer into subconscious activity. This could then be captured in code and algorithms, eventually leading to systems that learn from experience – as we do as humans.