Page 158 in Proceedings - OAGM & ARW Joint Workshop 2016 on "Computer Vision and Robotics"
origin for ray-tracing is defined at the robot hull front, based on the robot's corrected pose. The local map is used instead of the current measurements alone because it also contains obstacles remembered from before, even if they are already inside the sensor's blind zone. The robot turns to face the user and moves towards the sitting position up to a distance given by the closest projected measurement within an angle around the detected user, considering a safety margin. Fig. 2 illustrates this.
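The distance computation described above can be sketched as follows. The scan representation, the angular window around the user, and the margin value are assumptions for illustration; the paper does not state its exact parameters:

```python
import math

def distance_to_user(scan_ranges, scan_angles, user_bearing,
                     window=math.radians(15.0), safety_margin=0.35):
    """Return how far the robot may advance towards the detected user.

    scan_ranges/scan_angles: the virtual scan obtained by ray tracing on
    the local map (hypothetical flat-list representation).
    user_bearing: direction of the detected user in the robot frame.
    window and safety_margin are assumed values, not the paper's.
    """
    # Closest projected measurement within the angular window around the user.
    d = min((r for r, a in zip(scan_ranges, scan_angles)
             if abs(a - user_bearing) <= window), default=float("inf"))
    if math.isinf(d):
        return 0.0  # nothing measured in the window: do not move blindly
    # Keep the safety margin m between the final position and the obstacle.
    return max(0.0, d - safety_margin)
```

With the notation of Fig. 2, the returned value is d - m, clamped at zero so the robot never plans a backwards move here.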
Figure 2. Obtaining the distance to move towards the user. Left: ray tracing on the local map. The current virtual scan when the head is looking up is depicted in blue, while the scan obtained from ray tracing is shown in yellow. The ray-tracing scan is considered to be obtained from the frontal part of the robot (excluding the bumper, which is taken into account separately). The obstacle in front of the robot is not detected by the head sensor but is present in the remembered local map and therefore in the scan obtained by ray tracing. Right: simplified diagram of how the distance to move is computed. The minimum distance from ray tracing is denoted d and m is the safety margin.
For enhanced flexibility, reliability, and better adaptation to different users, after this process is executed the robot asks whether to come even closer. When there is a positive answer, the robot moves 15 cm more, completely blind now and trusting the user's input, and then the question and subsequent movement can be repeated up to three times. The user can reply by voice, gestures, or touch-screen commands. The robot remembers whether it moved closer and, if so, moves backwards again before starting a new navigation task.
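The confirmation loop above can be expressed as a short sketch. The callbacks `ask_user` and `move_forward` are hypothetical names standing in for the robot's dialogue and motion interfaces:

```python
APPROACH_STEP = 0.15   # 15 cm per confirmed step, as stated in the text
MAX_REPEATS = 3        # question and movement repeated up to three times

def come_closer_dialogue(ask_user, move_forward):
    """Ask-and-advance loop for closing in on the user.

    ask_user() -> bool: True on a positive answer (voice, gesture, or
    touch screen). move_forward(d): drives the robot d metres, blindly.
    Returns the total distance moved, so the robot can back up by the
    same amount before starting a new navigation task.
    """
    moved = 0.0
    for _ in range(MAX_REPEATS):
        if not ask_user():
            break
        move_forward(APPROACH_STEP)  # blind motion, trusting the user
        moved += APPROACH_STEP
    return moved
```

Returning the accumulated distance makes the later backwards move trivial: the caller simply commands the negated total before the next navigation task.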
The described method for approaching the user was developed for the call robot function, but it can also be applied in other tasks and contexts. For instance, we also incorporated it at the end of a fitness function scenario so that the robot comes closer to the user for further interaction after exercising.
4.5. Pick up object
For this function, in the first place, navigation to a goal obtained from the user's pointing gesture is needed. Using autonomous navigation in this way provides much wider applicability and can be useful for picking up objects not directly inside the robot's field of view. The static map of the environment is also used for checking whether a detected object is too close to walls or furniture, since that implies a risk of collision and picking up the object is then not possible. Once the precomputed pose is reached, fine positioning with respect to the detected object is performed, based on discrete motion commands with sufficient accuracy. More details about the pick-up algorithms can be found in [6].
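The wall/furniture clearance check on the static map might look like the following sketch. The grid representation, the clearance radius, and the function name are assumptions, not the paper's exact method:

```python
import math

def object_reachable(static_map, resolution, obj_row, obj_col,
                     clearance_m=0.25):
    """Check whether a detected object is far enough from occupied cells
    of the static map to be picked up without risk of collision.

    static_map: 2D list of bools, True where a cell is occupied.
    resolution: metres per grid cell.
    clearance_m: assumed minimum distance to walls or furniture.
    """
    cells = int(math.ceil(clearance_m / resolution))
    rows, cols = len(static_map), len(static_map[0])
    # Reject the pick-up if any occupied cell lies within the clearance radius.
    for r in range(max(0, obj_row - cells), min(rows, obj_row + cells + 1)):
        for c in range(max(0, obj_col - cells), min(cols, obj_col + cells + 1)):
            dist = math.hypot(r - obj_row, c - obj_col) * resolution
            if static_map[r][c] and dist <= clearance_m:
                return False
    return True
```

Only a square neighbourhood of the object is scanned, so the check stays cheap even on large static maps.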
- Title: Proceedings
- Subtitle: OAGM & ARW Joint Workshop 2016 on "Computer Vision and Robotics"
- Authors: Peter M. Roth, Kurt Niel
- Publisher: Verlag der Technischen Universität Graz
- Place: Wels
- Date: 2017
- Language: English
- License: CC BY 4.0
- ISBN: 978-3-85125-527-0
- Dimensions: 21.0 x 29.7 cm
- Pages: 248
- Keywords: conference proceedings
- Categories: international, conference proceedings