Platform one

Municipal vehicles
Siding with autonomy

Siemens Mobility is leading a three-year project to develop the technologies for a fully autonomous tram depot (writes Nick Flaherty). The project, called AStriD (Autonomous Tram in Depot), is based at Verkehrsbetrieb Potsdam (ViP), in Germany. Its aim is to be able to move trams around autonomously and handle functions such as running them through a washing bay and onto a siding.

Siemens will develop the autonomous tram, which will be integrated into the data and system landscape via a data hub provided by partner Codewerk, and localised and tracked using a Mapillary digital map.

ViP will provide the tram and depot infrastructure as well as access to the required data, systems and facilities, and will evaluate the results from the point of view of a depot operator. The Institute for Information Processing Technology at the Karlsruhe Institute of Technology is identifying the data necessary for automation.

Codewerk will handle the cloud and edge components for integrating the data from all the systems, while Mapillary will provide the project with a cloud-based online platform for the collaborative collection and provision of street images and relevant information. The data will be analysed using AI and processed into digital maps.

“Maps are no longer needed just for humans to get from A to B, but for autonomous vehicles across the board,” said Peter Kontschieder, director of research at Mapillary. “That’s where our expertise in street-level imagery understanding comes in. Through computer vision and street imagery, we will teach the tram to recognise and understand its surroundings.”

(Image caption: Siemens’ part in the AStriD project is to develop the autonomous tram for integration into the wider system)

Machine learning
Boost for neural networks

Nvidia has launched a pin-compatible module to boost the machine learning performance of unmanned systems (writes Nick Flaherty). The Jetson Xavier NX is the size of a credit card (70 x 45 mm) and is pin-compatible with the Jetson Nano module launched last year.

The NX module provides up to 14 TOPS (tera operations per second) for machine learning frameworks at 10 W of power consumption, or 21 TOPS at 15 W. This allows the module to run multiple neural networks in parallel and process data from multiple high-resolution sensors simultaneously.

The CPU has six of Nvidia’s custom 64-bit Carmel cores with 6 Mbytes of L2 cache and 4 Mbytes of L3 cache to minimise off-chip data transfers and reduce power consumption, alongside 8 Gbytes of 128-bit LPDDR4x memory on the board operating at 51.2 Gbytes/s. The Volta graphics processing unit has 384 CUDA cores and 48 Tensor machine learning accelerator cores.

There are also two 4K30 video encode channels and two 4K60 decode channels, with support for up to six CSI cameras (36 via virtual channels) using 12 lanes (3 x 4 or 6 x 2) of MIPI CSI-2 interconnect.

The Xavier NX is supported by Nvidia’s JetPack software development kit, a complete machine learning software stack. It also supports all the major machine learning frameworks, including TensorFlow, PyTorch, MxNet and Caffe.
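As a rough illustration of what running multiple neural networks in parallel can look like on such a module, the sketch below executes two stock torchvision models on separate CUDA streams with PyTorch, one of the frameworks JetPack supports. This is not Nvidia sample code: the model choices, dummy camera frames and stream-based scheduling are illustrative assumptions, and it presumes a CUDA-capable device such as the Xavier NX with PyTorch and torchvision installed.

# Minimal sketch, not Nvidia reference code: two networks run concurrently
# on a CUDA device, standing in for the multi-network, multi-sensor
# workloads described above. Models and input sizes are placeholders.
import torch
import torchvision.models as models

device = torch.device("cuda")

# Two independent networks, e.g. one per camera feed; stock torchvision
# models are used here purely as stand-ins (pretrained weights download
# on first use).
net_a = models.mobilenet_v2(pretrained=True).eval().to(device)
net_b = models.resnet18(pretrained=True).eval().to(device)

# Separate CUDA streams let the GPU overlap the two workloads where its
# resources allow.
stream_a = torch.cuda.Stream()
stream_b = torch.cuda.Stream()

# Dummy frames standing in for two camera inputs (batch, channels, H, W).
frame_a = torch.randn(1, 3, 224, 224, device=device)
frame_b = torch.randn(1, 3, 224, 224, device=device)

with torch.no_grad():
    with torch.cuda.stream(stream_a):
        out_a = net_a(frame_a)
    with torch.cuda.stream(stream_b):
        out_b = net_b(frame_b)

# Wait for both streams to finish before reading the results on the host.
torch.cuda.synchronize()
print(out_a.shape, out_b.shape)

In a deployed pipeline the heavier models would typically be optimised through TensorRT, which also ships as part of JetPack, to get closer to the quoted TOPS figures within the 10 W or 15 W power budgets.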