Figure 2. Structure of application modules
The system design fuses the information acquired from the various sensors: (1) Stationary laser scanning is used for large-scale mapping of the workspace environment. (2) A 3D Time-of-Flight (TOF) sensor is used for a rough dynamic characterization of the scene; alternatively, an additional stereo camera system with a wider FoV can be used. (3) For high-resolution localization and inspection within the workspace of the robot (in order to achieve the required 0.05 mm resolution at specific parts of the scene), an active stereo system with an additional pattern projection unit is used. (4) A laser speckle projector is additionally mounted on one of the Pan-Tilt Units (PTUs) used for sensor pointing, to enhance texture-less regions for high-resolution stereo analysis. (5) All components of the sensor system will be integrated into one unit, which allows quick installation and setup in the production environment.
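To make the module structure of Figure 2 more concrete, the following Python sketch outlines one possible way to group the sensors listed above; all identifiers are illustrative assumptions and not part of the actual implementation.

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import List, Optional


    class Role(Enum):
        LARGE_SCALE_MAPPING = auto()    # (1) stationary laser scanner
        DYNAMIC_SCENE = auto()          # (2) 3D TOF sensor or wide-FoV stereo pair
        HIGH_RES_INSPECTION = auto()    # (3) active stereo with pattern projection
        TEXTURE_ENHANCEMENT = auto()    # (4) laser speckle projector on a PTU


    @dataclass
    class SensorModule:
        name: str
        role: Role
        resolution_mm: Optional[float] = None   # target resolution where relevant


    # (5) all components bundled into one unit for quick installation and setup
    SENSOR_UNIT: List[SensorModule] = [
        SensorModule("laser_scanner", Role.LARGE_SCALE_MAPPING),
        SensorModule("tof_camera", Role.DYNAMIC_SCENE),
        SensorModule("active_stereo", Role.HIGH_RES_INSPECTION, resolution_mm=0.05),
        SensorModule("speckle_projector", Role.TEXTURE_ENHANCEMENT),
    ]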
Such a system, together with other sensory data, can also be exploited for safety assurance [16]. Safety is one of the most important factors when considering human-robot collaboration in industrial applications. Various safety features such as vision, proximity sensors [33], laser detectors, touch and collision sensors, force/torque sensors, and emergency stops can be exploited to achieve this goal. At the same time, numerous standards and guidelines covering design, robot manufacturing, installation, and final deployment have been issued to increase system safety [15, 1]. To provide safety, we need to cope with sensor failures, sensor occlusion, and dynamic environments. To achieve this goal, we plan to build a hybrid safety system, which combines multiple safety features and sensors in multiple layers, in both a serial and a parallel manner. This way, the failure of one sensor will not necessarily compromise safety as long as other components continue to function.
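As a rough illustration of this layered scheme, the Python sketch below (all names are hypothetical, not the system's actual interface) combines redundant sensors within a layer in parallel and evaluates the layers serially, so that a single failed or occluded sensor does not disable the safety function.

    from typing import Callable, List, Optional

    # Each check returns True (hazard), False (clear), or None (sensor failed/occluded).
    SafetyCheck = Callable[[], Optional[bool]]


    def layer_detects_hazard(checks: List[SafetyCheck]) -> Optional[bool]:
        """Parallel combination: hazard if any healthy sensor in the layer reports one."""
        readings = [check() for check in checks]
        healthy = [r for r in readings if r is not None]
        if not healthy:
            return None          # the whole layer is unavailable
        return any(healthy)


    def system_requests_stop(layers: List[List[SafetyCheck]]) -> bool:
        """Serial combination across layers; fail-safe stop if every layer is down."""
        any_layer_healthy = False
        for layer in layers:
            result = layer_detects_hazard(layer)
            if result is True:
                return True       # at least one functioning sensor sees a hazard
            if result is False:
                any_layer_healthy = True
        # If no layer could provide a reading at all, stop as a precaution.
        return not any_layer_healthy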