Proceedings of the OAGM&ARW Joint Workshop - Vision, Automation and Robotics
Page 77
Fig. 3: Experimental setup with two Intel® RealSense R200 structured light cameras in 90° alignment to an ABB IRB120.

algorithm. $R_z(\alpha)$ performs a roll, pitch, or yaw rotation about the angle $\alpha$ according to the joint rotation axis. The new DH transformation matrix ${}^{n-1}T_n \in \mathbb{R}^{4\times 4}$ of Equation (4) is saved after the rotations have been performed. By iterating these equations, every rotation $R_z(\alpha)$ of a joint $n$ is passed on to the following joints. This guarantees that all joints remain coupled to each other.

IV. SETUP & CAMERA ALIGNMENT

Two identical structured light cameras (Intel® RealSense R200) are used in the setup (cf. Figure 3) to avoid occlusions and to obtain a denser point cloud representation of the robot, as shown in Figure 2c. The cameras are positioned at 90° to each other. This angle has been chosen because it minimizes the illumination disturbance caused by the projected structured light. Each camera is placed 65 cm above the robot with a downward pitch angle of 30° to obtain a wider view. The extrinsic camera calibration is performed through a plane calibration: the table on which the robot is placed is detected with the outlier detection method Random Sample Consensus (RANSAC) to obtain the model coefficients of the plane $A_{xy}$. With these model coefficients, the dihedral angle between the plane normal and the camera image normal can be derived from

$$\cos(\varphi) = \frac{\vec{n}_1 \cdot \vec{n}_2}{|\vec{n}_1| \cdot |\vec{n}_2|} \qquad (7)$$

where $\vec{n}_1 = (a_1, b_1, c_1)$ is the normal vector of the plane $A_{xy}$ in z direction and $\vec{n}_2 = (a_2, b_2, c_2)$ is the normal vector of the camera image plane $A_{yz}$ along the x direction, with the plane coefficients $a_i$, $b_i$, $c_i$ for $i = 1, 2$. The cameras are aligned by the rotation with $\varphi$ from Equation (7) (plus the camera pitch angle) and the known translation from the robot's base.
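Since the paper relies on the Point Cloud Library for registration, the plane calibration above can be illustrated with PCL as well. The following is a minimal sketch under assumptions, not the authors' implementation: it fits the table plane with RANSAC to obtain the coefficients of $A_{xy}$ and evaluates Equation (7) for the dihedral angle $\varphi$. The function names, the 1 cm inlier threshold, and the Eigen types are illustrative choices.

```cpp
// Sketch: RANSAC plane fit of the table and dihedral angle via Equation (7).
// Assumes PCL with Eigen; thresholds and names are illustrative, not from the paper.
#include <algorithm>
#include <cmath>
#include <Eigen/Dense>
#include <pcl/ModelCoefficients.h>
#include <pcl/PointIndices.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/sample_consensus/method_types.h>
#include <pcl/sample_consensus/model_types.h>
#include <pcl/segmentation/sac_segmentation.h>

// Fit the table plane A_xy with RANSAC and return its coefficients (a, b, c, d).
// Assumes the table dominates the cloud, so a plane model is always found.
Eigen::Vector4f fitTablePlane(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& cloud)
{
  pcl::ModelCoefficients coefficients;
  pcl::PointIndices inliers;

  pcl::SACSegmentation<pcl::PointXYZ> seg;
  seg.setOptimizeCoefficients(true);   // refine the plane model on the inliers
  seg.setModelType(pcl::SACMODEL_PLANE);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setDistanceThreshold(0.01);      // 1 cm inlier threshold (assumption)
  seg.setInputCloud(cloud);
  seg.segment(inliers, coefficients);

  return Eigen::Vector4f(coefficients.values[0], coefficients.values[1],
                         coefficients.values[2], coefficients.values[3]);
}

// Equation (7): cos(phi) = (n1 . n2) / (|n1| * |n2|), returned in radians.
double dihedralAngle(const Eigen::Vector3f& n1, const Eigen::Vector3f& n2)
{
  const double c = n1.dot(n2) / (n1.norm() * n2.norm());
  return std::acos(std::max(-1.0, std::min(1.0, c)));  // clamp against rounding
}
```

The angle from dihedralAngle, together with the 30° camera pitch and the known translation from the robot's base, would then parameterize the extrinsic transformation of each camera.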
V. EVALUATION & RESULTS

The test system used to evaluate the proposed approach consists of a personal computer with an Intel® Core™ i5-3470 @ 3.20 GHz, 4096 MB RAM, and a GeForce GT 630, running Linux Ubuntu 16.04 (64 bit).

TABLE IV: The parameters used for the Iterative Closest Point algorithm

  Max. Correspondence Distance   0.003 m
  Max. Iterations                100
  Transformation Epsilon         1e-8 m
  Euclidean Fitness Epsilon      5e-4 m

So far, the structured light cameras and the robot motion communication have been implemented successfully in ROS. The cameras and the robot are launched as ROS nodes so that they can communicate with each other. The point cloud models of the robot's links are generated from CAD files and coupled together via the DH convention, so that they depend on each other and a rotation of joint one, for instance, affects the other joints (cf. Equations (2) to (4)). A visualization is implemented to display the model together with the captured depth image, as shown in Figure 1. The joint positions and alignments of the implemented model are observable and controllable.

Since the ICP algorithm needs an initial guess of where to start the matching, an initial robot position for program start has been chosen as shown in Figure 2a; otherwise a correct estimation of the position would hardly be possible. In the first experiment, the built-in ICP algorithm from the PCL has been tested with the parameters from Table IV and the structured light cameras, with moderate results. While for the initial pose (start pose) reasonably accurate joint angles within ±0.5° have been measured, the deviation increased up to ±5° during motion. These evaluation results were obtained for slow motion tasks (≤ 1°/s). For faster movements, the ICP algorithm is not able to finish the required number of
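A minimal sketch of how PCL's built-in ICP can be configured with the Table IV parameters is given below. The CAD-based model cloud, the captured scene cloud, and the initial guess derived from the start pose of Figure 2a are assumed to be provided by the surrounding system; the function and variable names are illustrative, not the authors' code.

```cpp
// Sketch: PCL's built-in Iterative Closest Point with the parameters of Table IV.
// Model/scene clouds and the initial guess are assumed inputs; names are illustrative.
#include <Eigen/Dense>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/icp.h>

Eigen::Matrix4f registerModelToScene(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& model,  // link model generated from CAD
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& scene,  // captured structured light data
    const Eigen::Matrix4f& initial_guess)              // e.g. from the start pose (Fig. 2a)
{
  pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
  icp.setInputSource(model);
  icp.setInputTarget(scene);

  // Parameters of Table IV
  icp.setMaxCorrespondenceDistance(0.003);  // 0.003 m
  icp.setMaximumIterations(100);
  icp.setTransformationEpsilon(1e-8);
  icp.setEuclideanFitnessEpsilon(5e-4);

  pcl::PointCloud<pcl::PointXYZ> aligned;
  icp.align(aligned, initial_guess);        // start the matching from the initial guess

  if (!icp.hasConverged()) {
    return initial_guess;                   // keep the previous estimate on failure
  }
  return icp.getFinalTransformation();
}
```

The returned transformation only refines the supplied initial guess, which reflects the paper's remark that without the chosen start pose a correct estimation would hardly be possible.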
Title
Proceedings of the OAGM&ARW Joint Workshop
Subtitle
Vision, Automation and Robotics
Authors
Peter M. Roth
Markus Vincze
Wilfried Kubinger
Andreas Müller
Bernhard Blaschitz
Svorad Stolc
Publisher
Verlag der Technischen Universität Graz
Place
Wien
Date
2017
Language
English
License
CC BY 4.0
ISBN
978-3-85125-524-9
Dimensions
21.0 x 29.7 cm
Pages
188
Keywords
Conference proceedings
Categories
International
Conference proceedings

Table of Contents

  1. Preface v
  2. Workshop Organization vi
  3. Program Committee OAGM vii
  4. Program Committee ARW viii
  5. Awards 2016 ix
  6. Index of Authors x
  7. Keynote Talks
  8. Austrian Robotics Workshop 4
  9. OAGM Workshop 86