Platform one

UAV software
Clever swarming indoors
DARPA plans to test new algorithms to enable small UAVs to find their way through a labyrinth of rooms

US defence technology research lab DARPA is looking to test completely new algorithms for fast-moving, small UAVs, with a view to extending the technology into ground and marine systems.

The Fast Lightweight Autonomy (FLA) programme aims to create a new class of algorithms to enable small unmanned aerial vehicles to quickly navigate a labyrinth of rooms, stairways and corridors or other obstacle-filled environments, both individually and as a swarm.

The programme aims to develop and demonstrate autonomous UAVs small enough to fit through an open window and able to fly at speeds of up to 20 m/s (45 mph) while navigating complex indoor spaces independently of communication with outside operators or sensors, and without relying on GPS waypoints.

The algorithms could enhance unmanned system capabilities by reducing the amount of processing power and communications needed for low-level tasks such as navigating around obstacles in a cluttered environment. The initial focus is on UAVs, but the algorithms could be applied to autonomous ground, marine and underwater systems, which could be especially useful in areas where GPS doesn’t work well.

“Urban and disaster relief operations would be obvious key beneficiaries, but applications for this technology could extend to a wide variety of missions using small and large unmanned systems linked together with manned platforms as a system of systems,” said Stefanie Tompkins, director of DARPA’s Defense Sciences Office.

The performance of the algorithms will be evaluated on a DARPA testbed, a small six-rotor UAV called VeloHex. This will carry a Gigabyte mini-PC with an Intel Core i7 processor and two lidar laser scanners from Hokuyo – its UTM-30LX on top, with a 30 m range, and its UST-2 on the bottom, with a 5 m range – as well as four mvBlueFOX USB cameras. The VeloHex will have a total weight of 2100 g and a flight time of 8-10 minutes from its 4400 mAh lithium-polymer battery.
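As a rough illustration of the kind of low-level obstacle-avoidance processing the FLA algorithms are meant to handle onboard, the short Python sketch below steers towards the widest free gap in a single planar lidar scan. It is a minimal sketch under assumed parameters – the scan format, clearance threshold and steering rule are illustrative only, and are not part of DARPA’s algorithms or Hokuyo’s software.

# Illustrative sketch only: a simple reactive obstacle-avoidance step of the
# kind the FLA algorithms would run onboard, using one planar lidar scan.
# The scan format, ranges and thresholds here are assumptions, not DARPA's design.

import math

def pick_heading(ranges, fov_deg=270.0, max_range=30.0, clearance=1.5):
    """Return a steering angle (radians, 0 = straight ahead) towards the
    widest gap in a single planar lidar scan.

    ranges    : range readings in metres, evenly spread over fov_deg
    clearance : minimum range (m) for a beam to count as 'free'
    """
    n = len(ranges)
    step = math.radians(fov_deg) / (n - 1)
    start = -math.radians(fov_deg) / 2.0

    # Mark each beam as free or blocked, clamping to the sensor's max range
    free = [min(r, max_range) > clearance for r in ranges]

    # Find the widest contiguous run of free beams
    best_start, best_len, run_start = None, 0, None
    for i, ok in enumerate(free + [False]):       # sentinel closes the last run
        if ok and run_start is None:
            run_start = i
        elif not ok and run_start is not None:
            if i - run_start > best_len:
                best_start, best_len = run_start, i - run_start
            run_start = None

    if best_start is None:
        return None                               # fully blocked: stop and hover

    # Steer towards the centre of the widest gap
    centre = best_start + best_len / 2.0
    return start + centre * step

# Example: a scan with an obstacle just off the nose
scan = [10.0] * 200 + [0.8] * 40 + [10.0] * 200
angle = pick_heading(scan)
print(f"steer {math.degrees(angle):+.1f} degrees" if angle is not None else "blocked")

A full navigation stack would of course fuse the lidar scans and camera imagery with the vehicle’s state estimate rather than react to a single scan, but the example shows the sort of per-scan decision the onboard processor has to make at speed.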
Driverless cars
Super-chip hits the road

A multicore graphics chip previously used in supercomputer designs is now being used for image processing in driverless car systems.

The Tegra X1 processor from NVIDIA aims to help self-driving cars advance from the realm of research into the mass market with an automotive-grade version of the same GPU that powers the world’s ten most energy-efficient supercomputers. The key has been lower power consumption, allowing more powerful applications to run within the same thermal limits.

The Tegra X1 has eight ARM CPU cores and 256 graphics cores using NVIDIA’s latest Maxwell architecture, and will drive camera-based ADAS (advanced driver assistance system) applications such as pedestrian detection, blind-spot monitoring, lane-departure warning and street sign recognition. In the DRIVE PX autopilot platform, two of the X1 chips can process video from up to 12 onboard cameras to provide a seamless 360° view around the car and true self-parking.

“Mobile supercomputing will be central to tomorrow’s car,” said Jen-Hsun Huang, CEO and co-founder of NVIDIA. “With vast arrays of cameras and displays, cars of the future will see and increasingly understand their surroundings. Whether finding their way back to you from a parking spot or using situational awareness to keep out of harm’s way, future cars will do many amazing, seemingly intelligent things. Advances in computer vision, deep learning and graphics have finally put this dream within reach.”
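To give a feel for the workload, the Python sketch below shows one way such a platform might spread a dozen camera feeds across two processors and run a simple lane-departure check on each frame’s lane-marking offsets. The camera naming, round-robin scheduling and 0.3 m margin are assumptions for illustration only, not NVIDIA’s DRIVE PX software.

# Illustrative sketch only: splitting 12 camera feeds across two processors
# and flagging lane departure from lane-marking offsets. The layout, margins
# and scheduling are assumptions, not NVIDIA's implementation.

from dataclasses import dataclass

CAMERAS = [f"cam_{i}" for i in range(12)]          # up to 12 onboard cameras

def assign_to_chips(cameras, n_chips=2):
    """Round-robin the camera feeds across the two Tegra X1 chips."""
    return {chip: cameras[chip::n_chips] for chip in range(n_chips)}

@dataclass
class LaneObservation:
    camera: str
    left_offset_m: float    # distance from vehicle centre to left lane marking
    right_offset_m: float   # distance from vehicle centre to right lane marking

def lane_departure_warning(obs, min_margin_m=0.3):
    """Warn when the vehicle drifts too close to either lane marking."""
    if obs.left_offset_m < min_margin_m:
        return f"{obs.camera}: drifting left ({obs.left_offset_m:.2f} m margin)"
    if obs.right_offset_m < min_margin_m:
        return f"{obs.camera}: drifting right ({obs.right_offset_m:.2f} m margin)"
    return None

schedule = assign_to_chips(CAMERAS)
for chip, feeds in schedule.items():
    print(f"Tegra X1 #{chip} handles {feeds}")

# One simulated observation from the front camera
warning = lane_departure_warning(LaneObservation("cam_0", 0.25, 1.40))
print(warning or "in lane")

In practice the heavy lifting – detecting the lane markings, pedestrians and signs in the first place – is exactly the camera-based vision workload the 256 Maxwell graphics cores are intended to accelerate.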