Web-Books
in the Austria-Forum
Proceedings of the OAGM&ARW Joint Workshop - Vision, Automation and Robotics
Page 50

performance expectancy).

[Fig. 15 – pointer with external tracking]

The outcomes of a previous user study [22] led to a technical revision of the HRI mechanisms of the first robot prototype by incorporating the workers' feedback. In the current study the same workers tested the HRI mechanisms of the revised robot, and the findings were compared with those for the previous version. It seems unlikely that the results can be explained by practice effects, given the period of one year between the studies and the completely different interaction methods. The findings of the current study drove the last technical revision of the system (robots C, D and E), which will feature improvements in ergonomics and will be evaluated in a final evaluation in May/June 2017. Collaboration can be improved by adding visual feedback on the robot and the workpiece during teaching (to reduce the burden of switching attention between the robot and the touch panel).

[15] and [16] introduce the notion of Spatial Augmented Reality (SAR) and describe it as the enhancement or aggregation of several Augmented Reality (AR) technologies. One formulation [17] might be a depth-camera/projector-based system that projects (correctly distorted) information onto three-dimensional objects instead of flat screens (Figure 3); it may be used for the projection of buttons. Applied robotics does not yet make extensive use of SAR methods. [18] introduces a projection-based safeguard system for robotic workspaces, especially collaboratively used ones. [19] gives an overview of Tangible User Interfaces (TUI), which denote interfaces that can be manipulated physically, have an equivalent in the digital world, and represent a means of interactive control. The project proposes a combination of TUI and SAR methods. Hand-guided positioning of the robot might be uncomfortable or time-consuming due to inappropriate input modalities (friction-afflicted robot drives, unintuitive touch screens, …).
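The depth-camera/projector idea behind SAR systems such as [17] can be sketched with the standard pinhole model: given a 3D surface point measured by the depth camera (in camera coordinates) and a calibrated projector (intrinsics K, extrinsics R, t relative to the camera), the projector pixel that illuminates that point follows directly, which is what makes the projected content appear "correctly distorted" on the object. This is a minimal illustrative sketch, not the authors' implementation; the helper name and all calibration values are made up.

```python
import numpy as np

def project_to_projector(p_cam, K, R, t):
    """Map a 3D point from depth-camera coordinates to projector pixel
    coordinates via the pinhole model: x ~ K (R p + t).
    Drawing content at this pixel makes it land on the surface point,
    i.e. the image is pre-distorted for the 3D surface."""
    p_proj = R @ p_cam + t          # transform into the projector frame
    x = K @ p_proj                  # pinhole projection (homogeneous)
    return x[:2] / x[2]             # normalize to pixel coordinates

# Hypothetical calibration (illustrative values, not from the paper):
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])   # projector intrinsics
R = np.eye(3)                             # projector aligned with camera
t = np.array([0.1, 0.0, 0.0])             # 10 cm baseline along x

# A surface point 1 m in front of the depth camera:
u, v = project_to_projector(np.array([0.0, 0.0, 1.0]), K, R, t)
# → (740.0, 360.0)
```

In a real system K, R and t would come from projector-camera calibration, and the depth camera would supply p_cam per pixel, so arbitrary content (e.g. virtual buttons) can be rendered onto the work piece.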
These were the motivations for the implementation of the technologies integrated in robots C, D and E, which will be evaluated in the final evaluation of AssistMe. The new HRI mechanisms of robots C, D and E will be based on the paradigm of joint/shared attention, which describes the shared focus of two individuals on an object. Joint/shared attention is realized when one individual alerts another to an object by verbal or non-verbal means such as eye gaze or pointing (gestures). The application of this paradigm will result in gesture-based HRI mechanisms for robot C. This design decision will shift human-robot interaction towards the dynamics of human-human or human-animal interaction. We therefore expect that this approach will help to increase perceived safety and overall acceptance, and to ease the transition to working with newly introduced robots.

ACKNOWLEDGMENT

This research is funded by the projects AssistMe (FFG, 848653) and SIAM (FFG, 849971) and by the European Union in cooperation with the State of Upper Austria within the project "Investition in Wachstum und Beschäftigung" (IWB).

REFERENCES

[1] A. Weiss, R. Buchner, M. Tscheligi and H. Fischer, "Exploring human-robot cooperation possibilities for semiconductor manufacturing," in Collaboration Technologies and Systems (CTS), 2011 International Conference on, 2011.
[2] D. Wurhofer, T. Meneweger, V. Fuchsberger and M. Tscheligi, "Deploying Robots in a Production Environment: A Study on Temporal Transitions of Workers' Experiences," in Human-Computer Interaction -- INTERACT 2015, Springer, 2015, pp. 203-220.
[3] R. Buchner, N. Mirnig, A. Weiss and M. Tscheligi, "Evaluating in real life robotic environment: Bringing together research and practice," in RO-MAN, 2012 IEEE, 2012.
[4] S. Griffiths, L. Voss and F. Rohrbein, "Industry-Academia Collaborations in Robotics: Comparing Asia, Europe and North America," in Robotics and Automation (ICRA), 2014 IEEE International Conference on, 2014.
[5] A. Weiss, R. Bernhaupt and M. Tscheligi, "The USUS evaluation framework for user-centered HRI," New Frontiers in Human-Robot Interaction, vol. 2, pp. 89-110, 2011.
[6] G. Biggs and B. MacDonald, "A survey of robot programming systems," in Proceedings of the Australasian Conference on Robotics and Automation, 2003.
[7] http://www.kuka-robotics.com/en/products/industrial_robots/sensitiv/lbr_iiwa_7_r800/start.htm.
[8] http://www.mrk-systeme.de/index.html.
[9] https://en.wikipedia.org/wiki/Universal_Robots.
[10] ISO 10218-1:2011, Robots and robotic devices -- Safety requirements for industrial robots -- Part 1: Robots.
[11] ISO 10218-2:2011, Robots and robotic devices -- Safety requirements for industrial robots -- Part 2: Robot systems and integration.
[12] ISO/TS 15066:2016, Robots and robotic devices -- Collaborative robots.
[13] M. Bovenzi, "Health effects of mechanical vibration," G Ital Med Lav Ergon, vol. 27, no. 1, pp. 58-64, 2005.
[14] A. Huber, A. Weiss, J. Minichberger and M. Ikeda, "First Application of Robot Teaching in an Existing Industry 4.0 Environment: Does It Really Work?," Societies, 2016.
[15] O. Bimber and R. Raskar, Spatial Augmented Reality: Merging Real and Virtual Worlds, CRC Press, 2005.
[16] R. Raskar, G. Welch and H. Fuchs, "Spatially augmented reality," in First IEEE Workshop on Augmented Reality (IWAR'98), 1998.
[17] K. Tsuboi, Y. Oyamada, M. Sugimoto and H. Saito, "3D object surface tracking using partial shape templates trained from a depth camera for spatial augmented reality environments," in Proceedings of the Fourteenth Australasian User Interface Conference - Volume 139, 2013.
[18] C. Vogel, M. Poggendorf, C. Walter and N. Elkmann, "Towards safe physical human-robot collaboration: A projection-based safety system," in Intelligent Robots and Systems (IROS), 2011 IEEE/RSJ International Conference on, 2011.
[19] H. Ishii, Tangible User Interfaces, CRC Press, 2007.
[20] C. Harrison, H. Benko and A. D. Wilson, "OmniTouch: wearable multitouch interaction everywhere," in Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, 2011.
[21] D. Bishop, "Marble answering machine," Royal College of Art, Interaction Design, 1992.
[22] G. Ebenhofer, M. Ikeda, A. Huber and A. Weiss, "User-centered Assistive Robotics for Production - The AssistMe Project," ÖAGM/ARW 2016, University of Applied Sciences, Wels, Austria.
Title
Proceedings of the OAGM&ARW Joint Workshop
Subtitle
Vision, Automation and Robotics
Authors
Peter M. Roth
Markus Vincze
Wilfried Kubinger
Andreas Müller
Bernhard Blaschitz
Svorad Stolc
Publisher
Verlag der Technischen Universität Graz
Place
Vienna
Date
2017
Language
English
License
CC BY 4.0
ISBN
978-3-85125-524-9
Dimensions
21.0 x 29.7 cm
Pages
188
Keywords
Conference proceedings
Categories
International
Conference proceedings

Table of Contents

  1. Preface v
  2. Workshop Organization vi
  3. Program Committee OAGM vii
  4. Program Committee ARW viii
  5. Awards 2016 ix
  6. Index of Authors x
  7. Keynote Talks
  8. Austrian Robotics Workshop 4
  9. OAGM Workshop 86