Issue 40 | Unmanned Systems Technology | October/November 2021 | ANYbotics ANYmal D | AI systems focus | Aquatic Drones Phoenix 5 | Space vehicles insight | Sky Eye Rapier X-25 | FlyingBasket FB3 | GCS focus | AUVSI Xponential 2021

nearby that enables us to test for radiated and conducted emissions, and carry out external electrostatic discharge tests, and we've aimed for immunity from EM and electric fields of up to 6 GHz.

Fankhauser adds, "We've tried to design for EMI protection from the beginning. We have lots of power electronics engineers in-house, and the robot contains high-speed data buses, DC-DC converters and high-power drives, so as much as customers may want it, it's critical for us as well that the ANYmal produces minimal emissions and tolerates those from outside sources.

"It took 4-5 years to devise the testing procedures, software and jigs we use. Legged robotics is uncharted territory, so the whole thing has been – and continues to be – a learning experience."

Locomotion and balance

The ANYmal D's main computer fuses a variety of sensor inputs to calculate the speed and placement of the steps needed to maintain its motion and balance along its mission path. These inputs include position, velocity, acceleration and force information from all 12 joints. The feet are spherical moulded rubber parts, chosen for grip and impact damping.

"There are no sensors in the feet," Fankhauser notes. "Some people find that surprising, but we can calculate the timing and force with which the feet contact the ground by measuring the forces exerted on the motors by the springs.

"Obviously we also have MEMS accelerometers and gyroscopes in the main body to provide inertial data on acceleration and angular rate. Sensing the body's inertia, combined with the various data from the joints, allows it to stand and walk with balance, and to react to physical disturbances."

Of course, a better way to protect against physical impacts is not to get hit in the first place.
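The sensorless contact detection Fankhauser describes can be sketched in a few lines: in a series-elastic actuator, the torque passing through the joint spring follows from the deflection between the motor side and the joint side, so a foot strike shows up as a jump in spring torque. The function names, spring constant and threshold below are illustrative assumptions, not ANYbotics values.

```python
# Sketch of contact detection in a series-elastic actuator (SEA): foot-ground
# contact is inferred from the forces the springs exert on the motors, with no
# sensors in the feet. Constants here are assumed for demonstration only.

SPRING_STIFFNESS_NM_PER_RAD = 120.0   # assumed torsional spring constant
CONTACT_TORQUE_THRESHOLD_NM = 8.0     # assumed torque indicating ground contact

def spring_torque(motor_angle_rad: float, joint_angle_rad: float) -> float:
    """Torque transmitted through the series spring (Hooke's law)."""
    return SPRING_STIFFNESS_NM_PER_RAD * (motor_angle_rad - joint_angle_rad)

def foot_in_contact(leg_joint_readings):
    """Flag a foot as in contact when the summed magnitude of spring torques
    across that leg's (motor_angle, joint_angle) pairs exceeds a threshold."""
    total = sum(abs(spring_torque(m, j)) for m, j in leg_joint_readings)
    return total > CONTACT_TORQUE_THRESHOLD_NM

# Example: the knee spring deflects by 0.1 rad (12 Nm), so contact is detected.
readings = [(0.50, 0.50), (1.20, 1.10), (0.30, 0.29)]
print(foot_in_contact(readings))  # True
```

In a real controller this estimate would be fused with the IMU and joint kinematics mentioned above, and filtered against gait-phase expectations to reject impacts that are not footfalls.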
The sensors described so far are enough for 'blind' movement and balance, but visual feedback is also highly useful for such calculations, given that many organisms, including humans, use visual data for balancing. The aforementioned forward- and rear-facing sensors, which are set into the front and back 'faces' of the ANYmal, come into play here. They are identical, each with a wide-angle camera and two Intel RealSense near-field 3D stereo cameras. Another two RealSense cameras are mounted on the robot's sides, about halfway between its legs – one pointing left, one right – for a total of eight cameras: six RealSenses and two wide-angle systems.

The RealSense cameras update the robot's perception of the surrounding floor and objects in real time, so that the main computer can dynamically calculate where to step next to avoid collisions or stumbling, amid the other calculations it is running for navigation, localisation and overall autonomy.

The main computer is a six-core 8th Gen Intel i7 CPU with 16 Gbytes of RAM. A second, identical CPU runs the inspection and survey algorithms, as well as any custom behaviours the end-user requests or wants to write through the developer's API. Ubuntu 20.04 is used as the main CPU's operating system, with ROS Noetic as the middleware between the two computers, sensors and actuators. The latter was selected partly for being a widely used software system that many of ANYbotics' customers are familiar with, and it is emerging as a de facto standard.

"The locomotion stack is a very complex piece of software, with

[Image caption: Submergence testing was key to determining that the ANYmal D would resist the ingress of water and any other fluids in industrial environments]
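The "where to step next" calculation fed by the RealSense cameras can be illustrated with a toy foothold selector: depth data is fused into a local elevation map, and candidate cells are scored by terrain roughness and by how far they deviate from the gait's nominal footstep location. The grid, cost weights and function names below are assumptions for demonstration, not ANYbotics' planner.

```python
# Illustrative foothold selection on a local elevation map (metres). A flat,
# nearby cell wins; rough or distant cells are penalised. Weights are assumed.

def roughness(height_map, r, c):
    """Height spread among a cell's in-bounds neighbours; flat ground scores 0."""
    rows, cols = len(height_map), len(height_map[0])
    neighbours = [height_map[r + dr][c + dc]
                  for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                  if 0 <= r + dr < rows and 0 <= c + dc < cols]
    return max(neighbours) - min(neighbours)

def best_foothold(height_map, nominal_rc, w_rough=5.0, w_dist=1.0):
    """Choose the cell minimising weighted roughness plus Manhattan distance
    from the nominal footstep location dictated by the gait."""
    rows, cols = len(height_map), len(height_map[0])
    def cost(rc):
        d = abs(rc[0] - nominal_rc[0]) + abs(rc[1] - nominal_rc[1])
        return w_rough * roughness(height_map, *rc) + w_dist * d
    return min(((r, c) for r in range(rows) for c in range(cols)), key=cost)

# A 3x3 patch where the nominal cell (1, 1) borders a 0.3 m ledge, so the
# planner shifts the step one cell onto flat ground.
patch = [[0.0, 0.0, 0.3],
         [0.0, 0.0, 0.3],
         [0.0, 0.0, 0.3]]
print(best_foothold(patch, (1, 1)))  # (1, 0)
```

The real system runs this kind of search continuously and in concert with the balance, localisation and navigation loops, rather than as an isolated per-step query.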
