backside of the tablet, or an additional AR marker is placed at the desired position, see Figure 2. The additional marker can be removed after the system has taken over its position.

Fig. 2. Screenshot of position input via an additional marker in the image. The position of the marker defines the position of the new sticky note. The pane on the right shows pre-defined tags connected with the marker.
While the measuring tip might be the more intuitive way to define a position, AR markers have an advantage: they can be combined with note templates, e.g., different markers for different users or different types of notes.
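To illustrate how such marker-based templates could be wired up, the following minimal Python sketch uses OpenCV's ArUco module (assuming a build with cv2.aruco, version 4.7 or later) together with a hypothetical mapping from marker IDs to note templates; all names, IDs, and fields are illustrative, not the system's actual schema.

```python
import cv2

# Hypothetical mapping from marker IDs to note templates
# (different markers for different users / note types).
NOTE_TEMPLATES = {
    17: {"user": "electrician", "type": "defect", "tags": ["electronics"]},
    42: {"user": "mechanic", "type": "todo", "tags": ["mechanics"]},
}

def detect_new_notes(frame):
    """Detect AR markers in a camera frame and return
    (template, marker corners) pairs for new sticky notes."""
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(aruco_dict)
    corners, ids, _ = detector.detectMarkers(frame)
    notes = []
    if ids is not None:
        for marker_id, quad in zip(ids.flatten(), corners):
            template = NOTE_TEMPLATES.get(int(marker_id))
            if template is not None:  # only markers bound to a template
                notes.append((template, quad.reshape(4, 2)))
    return notes
```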
Fig. 3. Screenshot with two sticky notes in the image (flash icons on the left). The top left note is selected and the pane on the right displays the corresponding information.
B. Human-Machine Interaction
The basis of the human-machine interaction is a live view of the tablet’s camera in which the sticky notes are overlaid in an Augmented Reality fashion. When touching a note, a form with additional information pops up, see Figure 3. From a technological viewpoint this is an HTML overlay, reflecting the client-server architecture of the software.
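One plausible way to drive such an overlay, sketched below in Python with OpenCV, under the assumption that the server knows the camera pose and intrinsics from the tracking step: each note's stored 3D position is projected into the live image, and the client places the HTML form at the resulting pixel coordinates. Function and parameter names are assumptions for illustration, not the actual API.

```python
import numpy as np
import cv2

def note_screen_position(note_xyz, rvec, tvec, K, dist_coeffs):
    """Project a note's 3D anchor (world coordinates) into the current
    camera image. rvec/tvec: camera pose from the tracking step;
    K/dist_coeffs: camera intrinsics. Returns the pixel at which the
    client should anchor the HTML overlay element."""
    pts, _ = cv2.projectPoints(
        np.float32([note_xyz]), rvec, tvec, K, dist_coeffs)
    u, v = pts.reshape(2)
    return int(round(u)), int(round(v))
```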
Besides the standard information for a sticky note, such as name, type, and description, an additional tagging system has been implemented to provide a flat hierarchy of the notes. One or more pre-defined tags can be assigned to each note, which allows easy-to-use filtering, e.g., showing only electronics-related notes in the view. Examples of the tags can also be seen in Figures 2 and 3 on the bottom right.
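A flat tag filter of this kind reduces to a simple set intersection; the sketch below assumes notes are plain dictionaries with a `tags` list, which is an illustrative structure rather than the actual data model.

```python
def filter_notes(notes, selected_tags):
    """Keep notes carrying at least one of the selected pre-defined
    tags, e.g. only electronics-related notes."""
    selected = set(selected_tags)
    return [note for note in notes if selected & set(note["tags"])]

notes = [
    {"name": "loose cable", "tags": ["electronics"]},
    {"name": "worn bearing", "tags": ["mechanics"]},
]
print(filter_notes(notes, ["electronics"]))  # -> the 'loose cable' note only
```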
Furthermore, the system supports different user roles, which differ in the amount of information they can edit and/or which sticky notes they can see.
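Such a role model can be expressed as per-role sets of editable fields and visible tags; the following sketch is one possible encoding, with all role and field names invented for illustration.

```python
# Roles differ in which fields they may edit and which notes they see
# (all names here are assumptions, not the system's actual schema).
ROLES = {
    "viewer":   {"editable": set(), "visible_tags": {"mechanics"}},
    "engineer": {"editable": {"description", "tags"},
                 "visible_tags": {"mechanics", "electronics"}},
    "admin":    {"editable": {"name", "type", "description", "tags"},
                 "visible_tags": None},  # None -> sees all notes
}

def visible_notes(notes, role):
    """Return only the sticky notes the given role is allowed to see."""
    allowed = ROLES[role]["visible_tags"]
    if allowed is None:
        return list(notes)
    return [n for n in notes if allowed & set(n["tags"])]

def may_edit(role, field):
    """Check whether the given role may edit a particular note field."""
    return field in ROLES[role]["editable"]
```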
IV. TECHNICAL EVALUATION

The system is currently in an evaluation phase. Tests regarding positioning showed accuracies of ±0.5 cm in a volume of 5 m × 5 m × 5 m.
The hardware used is a standard Microsoft Surface tablet with the internal camera set to a resolution of 1280 × 720 px, yielding a maximum achievable frame rate of 25 fps for the overlay. The limiting factor for the frame rate, however, is the tablet camera itself; with an external industrial camera, frame rates of up to 100 fps have been achieved.
For the registration step and the movable parts, the marker size and camera-to-marker distance have also been evaluated: for a reasonable marker size of 7 cm, the largest usable distance to the marker is 2 m. At greater distances the detection, and therefore the visualization, becomes unstable.
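The reported 2 m limit for a 7 cm marker is consistent with a simple pinhole-camera estimate of the marker's footprint in the image; the focal length assumed below (roughly 1000 px for a 1280 × 720 camera) is a guess, since the actual intrinsics are not reported.

```python
def marker_side_px(marker_size_m, distance_m, focal_px):
    """Pinhole model: projected side length ~ f * size / distance."""
    return focal_px * marker_size_m / distance_m

# Assumed focal length of ~1000 px (illustrative only).
for d in (1.0, 2.0, 3.0):
    print(f"{d:.0f} m -> {marker_side_px(0.07, d, 1000.0):.0f} px")
# 1 m -> 70 px, 2 m -> 35 px, 3 m -> 23 px: beyond ~2 m the marker
# spans only a few dozen pixels, so detection becomes unreliable.
```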
V. CONCLUSION AND FUTURE WORK
In this paper we presented an industrial-grade AR-based documentation system which extends the current state of the art by integrating additional tracking modalities and, furthermore, allows the user to edit the information in an intuitive way. The proposed system is currently at technology readiness level 7. The next steps are extensive field tests. Although the system can be used stand-alone, the ideal synergy is achieved together with a PLM system. Currently the import/export functions support only one such system; however, this will be extended in the future.
Furthermore, of vital interest is the evaluation of the time savings gained by using our system. For this, a detailed study will be set up together with industrial partners.
From an application point of view, a possible extension could be the replacement of the PLM input with a Virtual Reality (VR) system. This would allow one user to add information to a model in VR which is then immediately shown to a different user on the real object. This could be the basis for numerous multi-user scenarios, e.g., use as a remote maintenance system. The advantage over a traditional 2D camera-assisted remote maintenance system would be the exact 3D positioning of the information on the object of interest.
ACKNOWLEDGMENT
This research is funded by the project SIAM (FFG,
849971) and by the European Union in cooperation with
the State of Upper Austria within the project Investition in
Wachstum und Beschäftigung (IWB).