360° Monitoring for Robots Using Time-of-Flight Sensors
Thomas Maier, Birgit Hasenberger
BECOM Systems GmbH
{thomas.maier,birgit.hasenberger}@becom-group.com
Abstract. In this paper, we present a system based on multiple Time-of-Flight (ToF) 3D sensors paired with a central processing hub for integration into robots or mobile machines. This system can produce a 360° view from the robot's perspective and enables tasks ranging from navigation and obstacle avoidance to human-robot collaboration.
1. Introduction
Today's e-commerce growth and the paradigms of Industry 4.0 in the manufacturing space present new challenges for robotic systems [1]. In order to increase the mobility and autonomy of such systems, they need to gather and interpret as much information as possible from their surroundings. Approaches for collision avoidance with 1D time-of-flight sensors have been explored [2]; the use of several high-resolution sensors would enable applications like human pose estimation and gesture recognition as well as automation tasks such as handling of goods. Close collaboration and additional functionalities are made possible with the setup proposed in this paper, which consists of intrinsic 3D Time-of-Flight sensors that can cover 360° around a robot's arm or chassis.
2. System architecture
The multi-ToF platform (https://www.becom-group.com/goto/multi-tof-platform) consists of a central processing module based on an NVIDIA Tegra TX2 processor (the hub) and multiple camera modules that work in parallel (the frontends). This platform architecture allows for the integration of various sensor and camera types.
The ToF sensor frontend for the platform is a compact module designed for close-range detection. Table 1 summarises the technical data and performance of two frontend variants.

Figure 1. Image of a frontend including descriptions of its components (left) and a ring of four frontends (right).

                      QVGA frontend     VGA frontend
Resolution            304 px x 240 px   640 px x 480 px
Field of view         110° x 82°        110° x 82°
Distance range        0.1–1.5 m         0.1–2.0 m
Operating wavelength  850 nm            940 nm
Frame rate            40 Hz             30 Hz
Table 1. Frontend specifications.

The frontend is connected to the hub via FPD-Link III, using a cable that also provides the power supply. Four frontends can be arranged as a ring to allow 360° coverage, as sketched below.
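As an illustration of the ring geometry, the short Python sketch below derives the nominal extrinsic rotations and the angular overlap of adjacent frontends. It assumes the four modules are mounted at 90° yaw intervals and uses the 110° horizontal field of view from Table 1; the exact mounting angles are not specified here and are an assumption for this sketch.

import numpy as np

# Nominal ring layout: four frontends at 90 deg yaw steps (assumed spacing;
# the text only states that four frontends form a ring).
HORIZONTAL_FOV_DEG = 110.0          # horizontal field of view from Table 1
YAW_STEPS_DEG = [0.0, 90.0, 180.0, 270.0]

def yaw_to_rotation(yaw_deg):
    # Rotation matrix about the vertical (z) axis for a given yaw angle.
    yaw = np.deg2rad(yaw_deg)
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Extrinsic rotations mapping each frontend frame into a common robot frame.
ring_rotations = {i: yaw_to_rotation(yaw) for i, yaw in enumerate(YAW_STEPS_DEG)}

# Angular overlap between neighbouring frontends:
# 110 deg FOV - 90 deg spacing = 20 deg shared by each adjacent pair.
overlap_deg = HORIZONTAL_FOV_DEG - 360.0 / len(YAW_STEPS_DEG)
print("Adjacent frontends overlap by about %.0f deg" % overlap_deg)

Under these assumptions, each adjacent pair of frontends shares roughly 20° of overlap, which leaves margin for registering the individual point clouds into one 360° view.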
The hub controls the frontends, performs calibration and correction operations on the incoming data, and ultimately calculates depth maps or point clouds. The data is transmitted from the hub via a Gigabit Ethernet connection, and ROS is supported for gathering the individual data streams; a minimal subscriber sketch follows the list below. The following operations are performed on the hub:
1. Synchronization and triggering
2. Acquisition and depth map calculation
3. Corrections (temperature, FPPN, distance offset, intrinsic and extrinsic)
4. Filtering (spatial and temporal)
5. 3D point cloud calculation
6. Registration and transformation
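As a sketch of how the per-frontend streams might be gathered on the ROS side and combined in the spirit of step 6, the following Python node subscribes to four point cloud topics and merges them into one 360° cloud in a common base frame. The topic names, frame name, node name, and the yaw-only extrinsics are illustrative assumptions rather than the platform's documented interface; translations between the frontends are omitted for brevity.

# Minimal ROS (Python) sketch: gather the four per-frontend point clouds
# and merge them into one 360 deg cloud in an assumed robot base frame.
import numpy as np
import rospy
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2 as pc2
from std_msgs.msg import Header

def yaw_rotation(yaw_deg):
    # Yaw-only extrinsic rotation (assumption; translations omitted).
    yaw = np.deg2rad(yaw_deg)
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

EXTRINSICS = {i: yaw_rotation(i * 90.0) for i in range(4)}
latest = {}  # most recent cloud per frontend, as an (N, 3) array

def make_callback(idx):
    def callback(msg):
        pts = np.array(list(pc2.read_points(msg, field_names=("x", "y", "z"),
                                            skip_nans=True)))
        if pts.size:
            # Transform from the frontend frame into the common base frame.
            latest[idx] = pts @ EXTRINSICS[idx].T
        else:
            latest[idx] = np.empty((0, 3))
    return callback

rospy.init_node("tof_ring_merger")
for i in range(4):
    # Hypothetical topic names for the four frontend streams.
    rospy.Subscriber("/tof_frontend_%d/points" % i, PointCloud2,
                     make_callback(i), queue_size=1)
merged_pub = rospy.Publisher("/tof_ring/points_merged", PointCloud2, queue_size=1)

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    if len(latest) == 4:
        header = Header(stamp=rospy.Time.now(), frame_id="base_link")
        cloud = pc2.create_cloud_xyz32(header, np.vstack(list(latest.values())))
        merged_pub.publish(cloud)
    rate.sleep()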