were interpreted and conclusions drawn. Visual feedback during teach-in was requested. If possible, information should be projected onto the workpiece surface. This would require additional projection technology, as proposed by AssistMe.
VI. PROPOSAL FOR FINAL EVALUATION
An additional projection technology would enable spatial
augmented reality methods.
A. Robot C
Spatial augmented reality interfaces are proposed and implemented as a tangible user interface. Physical interaction with only the product to be processed might further minimize programming effort and be an easily perceived means of interaction. A tangible marble is used for teach-in of process points and the sequence of their processing. Therefore, a 3D camera is integrated with a projector to detect marbles [21] positioned on top of screws to acquire spatial process points, as well as taps onto projected buttons to confirm their order or other interactions with the programming system.
Fig. 11 - Tangible User Interface
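A minimal sketch of how marble candidates could be found in the integrated camera's image, assuming an OpenCV Hough circle transform as the detector; the helper name detect_marbles and the thresholds are illustrative assumptions, not the detection pipeline of [21].

# Minimal sketch: detecting tangible marbles in a camera image (assumed approach).
import cv2
import numpy as np

def detect_marbles(image_bgr, min_radius_px=8, max_radius_px=25):
    """Return (x, y, r) image coordinates of candidate marbles."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # suppress sensor noise before circle detection
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=2 * max_radius_px,
        param1=100, param2=30,
        minRadius=min_radius_px, maxRadius=max_radius_px)
    if circles is None:
        return []
    return [tuple(c) for c in np.round(circles[0]).astype(int)]

Detected marble centers would then be intersected with the known screw positions (or looked up in the registered depth data) to obtain the spatial process points.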
B. Robot D
Robot D is controlled via a 2D interface as depicted in Fig. 13. Process points are entered by tapping onto a 2D representation of the processed object. A machine vision algorithm determines the spatial region of the tapped point and thereby derives both the 3D process points and their sequence from the tapping order. Fig. 14 shows the technology applied to a bin-picking process where one of several objects in the 3D sensor's field of view can be selected in a 2D representation of that data. The same technology is applied to the selection of regions and process points on a single object in the sensor's field of view. The order of the process points can also be entered implicitly.
Fig. 12 - Tangible User Interface system setup
Fig. 13 - Define process points in 2D
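A minimal sketch of how a 2D tap could be mapped to a 3D process point, assuming a depth image registered to the tapped view and a pinhole camera model with intrinsics fx, fy, cx, cy; the paper's actual machine vision algorithm for determining the spatial region is not reproduced here, and all numeric values are placeholders.

# Minimal sketch: back-projecting a 2D tap into a 3D process point (assumed pinhole model).
import numpy as np

def tap_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth depth_m [m] into camera coordinates."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Taps collected in order define both the process points and their sequence.
taps = [(412, 230), (520, 318)]            # pixel coordinates of the taps
depth = np.full((480, 640), 0.85)          # placeholder depth image [m]
process_points = [tap_to_3d(u, v, depth[v, u], 570.0, 570.0, 320.0, 240.0)
                  for (u, v) in taps]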
C. Robot E
Robot E is programmed by positioning an externally tracked device (Fig. 15), or an extension such as a stick attached to it, at the process point. Once calibrated, the precise position of the stick's tip mounted on the externally tracked device can be calculated in real time. Process points and their order are programmed by tapping the screws in question in sequence.
Fig. 14 - 2D tap-based process point selection
(https://www.youtube.com/watch?v=nrhXEqG014o)
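A minimal sketch of how the stick-tip position could be computed from the tracked device pose, assuming the tip offset in the device frame is known from a prior calibration (e.g. a pivot calibration); the function name and the numeric values are illustrative assumptions.

# Minimal sketch: stick-tip position from an externally tracked device pose.
import numpy as np

def tip_position(R_world_device, t_world_device, tip_offset_device):
    """Transform the calibrated tip offset from the device frame into world coordinates."""
    return R_world_device @ tip_offset_device + t_world_device

# Example: device pose from the tracking system and a 12 cm stick along its z-axis.
R = np.eye(3)                              # device orientation in the world frame
t = np.array([0.40, 0.10, 0.25])           # device position [m]
offset = np.array([0.0, 0.0, 0.12])        # calibrated tip offset [m]
print(tip_position(R, t, offset))          # 3D process point touched by the tip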
VII. CONCLUSION
The presented study demonstrated that non-intermediated (direct manual) interaction with the robot can increase the experience of the robot's capabilities (usability,