there are pockets of space where the XDBOT will need to linger in order to provide especially thorough spraying.

Teleoperation from a separate room is possible via the UGV's camera and wi-fi link, but it is preferable to have the operator standing and watching a short distance away, for greater situational awareness in public spaces. While trials of the UGV in this mode have been conducted with the operator following at some distance behind it, the company is also developing a follow-me mode. In this, the vehicle would use its perception sensors to lock on to a cleaner working several paces ahead of it, and spray while following their path and approximate speed.

A full remote operation mode (in which the operator controls both the arm and the wheels) is also available for rare cases where human judgement is needed. This is the most labour-intensive mode, however, and Transforma does not expect it to be widely used.

Lastly, full autonomy naturally entails traction, steering and spraying all being handled by the XDBOT's embedded software. While Transforma is putting the finishing touches to this mode, it anticipates full autonomy being the most productive and widely used mode, especially on unchanging daily routes where furniture is fixed or corridors will not be blocked off. An initial survey and programming round would be needed to embed an approximate map of the building in the UGV's internal storage; after that, it can automatically cover its assigned floorspace up to three times a day.

Sensor architecture

For fully autonomous operations indoors (where GNSS is unreliable), two sets of perception sensors are integrated into the XDBOT. The sprayer mounts a small rail with a pair of Intel Realsense D435i cameras (integrated for stereoscopic vision) above the nozzle for a first-person view. Embedded machine vision is used to recognise the surfaces where people are most likely to have left infectious handprints.

To date, the feature recognition has been trained primarily for hospitals, but Transforma and Hand Plus anticipate that slight training updates will be needed for each mission location, as objects such as doorknobs and elevator buttons will differ from building to building.

The Realsense D435i integrates an IMU for fusing time-stamped inertial data with its vision data. It has a vertical field of view (FoV) of roughly 58°, a horizontal FoV of about 87° and a diagonal FoV of about 95°.

To sense and avoid furniture, people and other objects, a Slamtec RPLidar-A3 is installed on the front. It has a detection distance of up to 25 m, a typical scan rate of 15 Hz, a sample rate of 16 kHz and an angular resolution of 0.3375°. Transforma has trained the software for it to improve its ability to recognise when objects and people pose potential obstructions.

As of the second prototype, another Realsense system has been installed next to the Lidar to provide complementary obstacle detection data via the embedded sensor fusion, adding superior range and colour information as a result.

Spraying and sanitising

The sprayer assembly's nozzle can be moved and pointed to dispense liquid disinfectant from up to 2.3 m above the ground down to floor level. The horizontal reach of the aerosol droplets (which are typically 40 microns in size) extends up to 2 m from the nozzle, although 0.8 m is recommended for optimal coverage. The cone of the spray widens to 0.3 m in diameter at 1.2 m from the nozzle.
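Those published figures allow a rough estimate of the spray coverage at other stand-off distances, assuming the cone spreads linearly from the nozzle tip. The following is a minimal back-of-envelope sketch in Python; the linear-spread assumption is ours, not Transforma's.

```python
import math

# Published spray geometry: the cone is 0.3 m wide at 1.2 m from the nozzle.
CONE_DIAMETER_M = 0.3
AT_DISTANCE_M = 1.2

# Half-angle of the spray cone, assuming it spreads linearly from the nozzle tip.
half_angle_rad = math.atan((CONE_DIAMETER_M / 2) / AT_DISTANCE_M)
print(f"Cone half-angle: {math.degrees(half_angle_rad):.1f} deg")   # ~7.1 deg

def spray_diameter(distance_m: float) -> float:
    """Estimated spray diameter at a given stand-off distance from the nozzle."""
    return 2 * distance_m * math.tan(half_angle_rad)

# Coverage at the recommended 0.8 m stand-off and at the 2 m maximum reach.
for d in (0.8, 2.0):
    print(f"At {d} m: spray roughly {spray_diameter(d):.2f} m wide")
```

Under that assumption, the swathe works out to roughly 0.2 m wide at the recommended 0.8 m stand-off, and about 0.5 m at the 2 m maximum reach.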
In the first prototype, the spray nozzle was installed and steered on a robotic arm from Universal Robots – its UR5 model, which has three axes of rotation, a reach of 850 mm and a weight of 20.6 kg.

"In the second version though, we've designed and built the mechanics of the sprayer's aim and movement ourselves, and we've simplified it a lot," Prof Chen says. "We realised that three degrees of freedom was actually too many, so by making a simple pan-tilt gimbal ourselves we could make the XDBOT less expensive to produce and simpler to operate."

In addition to full autonomy, the XDBOT has a semi-autonomous mode, in which the sprayer moves automatically and a nearby operator steers the UGV via the GCS.
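Transforma has not published the control scheme for the pan-tilt gimbal Prof Chen describes, but aiming a two-axis nozzle mount reduces to two angle calculations. The sketch below is a hypothetical illustration only: the frame convention, default pivot height and function names are our assumptions, not the XDBOT's actual software.

```python
import math

def pan_tilt_to_target(target_xyz, nozzle_xyz=(0.0, 0.0, 1.0)):
    """
    Aim a simple two-axis (pan/tilt) nozzle mount at a target point.

    Assumed conventions for illustration: x forward, y left, z up (metres),
    angles returned in degrees, nozzle pivot 1.0 m above the ground by default.
    """
    dx = target_xyz[0] - nozzle_xyz[0]
    dy = target_xyz[1] - nozzle_xyz[1]
    dz = target_xyz[2] - nozzle_xyz[2]

    pan = math.degrees(math.atan2(dy, dx))                    # rotation about the vertical axis
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation above the horizontal
    return pan, tilt

# Example: a door handle 0.8 m ahead, slightly to the right, about 1.1 m off the floor.
print(pan_tilt_to_target((0.8, -0.1, 1.1)))
```

Because the spray cone itself provides lateral coverage, two rotational axes are enough to sweep the nozzle across walls, fixtures and floor, which fits the simplification Prof Chen describes.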
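Returning to the sensor suite described under Sensor architecture, both devices have openly documented interfaces. The fragment below is a minimal polling sketch using Intel's pyrealsense2 bindings for the D435i and the open-source rplidar Python package for the A3; the serial port, stream resolutions and data rates are assumptions for illustration, and Transforma's actual software stack has not been disclosed.

```python
import pyrealsense2 as rs
from rplidar import RPLidar

# --- Realsense D435i: depth plus on-board IMU streams ---
pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
cfg.enable_stream(rs.stream.gyro, rs.format.motion_xyz32f, 200)
cfg.enable_stream(rs.stream.accel, rs.format.motion_xyz32f, 250)
pipe.start(cfg)

frames = pipe.wait_for_frames()
depth = frames.get_depth_frame()
gyro = frames.first_or_default(rs.stream.gyro)
if depth and gyro:
    centre_range = depth.get_distance(320, 240)               # metres to the image centre
    rates = gyro.as_motion_frame().get_motion_data()           # angular rates about x, y, z
    print(f"centre range {centre_range:.2f} m, gyro z {rates.z:.3f} rad/s")
pipe.stop()

# --- RPLidar A3: one 360-degree scan for a simple obstacle check ---
lidar = RPLidar('/dev/ttyUSB0', baudrate=256000)               # assumed port; A3 serial link runs at 256 kbaud
for scan in lidar.iter_scans():
    # each measurement is a (quality, angle_deg, distance_mm) tuple
    close = [m for m in scan if 0 < m[2] < 1000]               # returns closer than 1 m
    print(f"{len(close)} returns within 1 m")
    break
lidar.stop()
lidar.disconnect()
```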