Joint Austrian Computer Vision and Robotics Workshop 2020
Page 15
[Figure 2. Visualization of the correct alignment of vector $\vec{u}_2$; labels: "object parallel with X axis of camera coordinate system", $\vec{u}_{2\perp}$ normal to $\vec{u}_1$, $\vec{u}_2$, $\vec{u}_1$.]

…tor with the smallest eigenvalue (i.e., the smallest variance). As the form of the shape is symmetrical, the mean of the points in the grabbing area estimates the origin of the coordinate system shown in Figure 1. Therefore, the mean of the PCA can be used as the translational component of the transformation matrix.

$\vec{t} = (\mu_x, \mu_y, \mu_z)^T$  (2)

The rotation matrix has to be assembled from three orthonormal vectors. The first vector has already been found: the eigenvector of the PCA with the smallest eigenvalue, which forms the Z vector pictured in Figure 1. The second vector can be obtained by leveraging knowledge about the environment of the industrial grasping use case. As the target object is located at a target location that is parallel to the ground, the rotation around the Z axis can be neglected. That is why the second vector can be aligned with the Y axis of the camera coordinate system. But since the first vector found with the PCA could be rotated around the Y axis of the object coordinate system, the second vector has to be projected orthogonally to the first. This is done with Equation (3), and the process is visualized in Figure 2.

$\vec{u}_{2\parallel} = (\vec{u}_2^T \cdot \vec{u}_1)\,\vec{u}_1$
$\vec{u}_{2\perp} = \vec{u}_2 - \vec{u}_{2\parallel}$  (3)

The third vector can then be calculated using the cross product of $\vec{u}_1$ and $\vec{u}_2$. The resulting rotation matrix is constructed using Equation (4).

$R = \begin{pmatrix} u_{3,x} & u_{3,y} & u_{3,z} \\ u_{2,x} & u_{2,y} & u_{2,z} \\ u_{1,x} & u_{1,y} & u_{1,z} \end{pmatrix}$  (4)

After calculating the rotation and translation components of the object, a transformation matrix can be formulated using Equation (5).

[Figure 3. Visualization of the test setup; labels: camera optical axis, distance mark at distance d, angles α and β around the camera.]

$T^{\text{camera}}_{\text{obj}} = \begin{pmatrix} R & \vec{t} \end{pmatrix}$  (5)

The transformation matrix can then be used to express the grasping point in the world coordinate system, which is used for motion planning of the robot arm.
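The pose construction of Equations (2)–(5) can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the function name, the assumption that the input is an N×3 point array in camera coordinates, and the hard-coded camera Y axis are all choices made here for the sketch.

```python
import numpy as np

def object_pose_from_points(points):
    """Sketch of Eqs. (2)-(5): pose of the object from grabbing-area points.

    points: (N, 3) array of 3D points in the camera coordinate system.
    Returns the 3x4 transformation matrix [R | t] of Eq. (5).
    """
    # Eq. (2): the mean of the points is the translational component.
    t = points.mean(axis=0)

    # PCA via eigen-decomposition of the covariance matrix.
    cov = np.cov((points - t).T)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

    # First vector: eigenvector with the smallest eigenvalue (object Z).
    u1 = eigvecs[:, 0]

    # Second vector: camera Y axis, projected orthogonally to u1 (Eq. (3)).
    y_cam = np.array([0.0, 1.0, 0.0])
    u2_par = (y_cam @ u1) * u1
    u2 = y_cam - u2_par
    u2 /= np.linalg.norm(u2)

    # Third vector: cross product of u1 and u2, as in the text.
    # (For a right-handed frame one may need the opposite order, u2 x u1.)
    u3 = np.cross(u1, u2)

    # Eq. (4): rows of R are u3, u2, u1.
    R = np.vstack([u3, u2, u1])

    # Eq. (5): T = [R | t]
    return np.hstack([R, t.reshape(3, 1)])
```

On a nearly planar point cloud, the row built from $\vec{u}_1$ aligns with the plane normal, mirroring the role of the smallest-eigenvalue eigenvector in the text.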
Having calculated the transformation between the camera coordinate system and the object's coordinate system, one can calculate the object's world position as follows:

$T^{\text{world}}_{\text{obj}} = T^{\text{world}}_{\text{camera}} \cdot T^{\text{camera}}_{\text{obj}}$  (6)

In the following chapter, the performance of the proposed approach is discussed.

4. Results

The test setup consisted of the 3D printed model of the target object shown in Figure 1 and an Intel RealSense D435¹. The RGB-D camera has been set up at a defined location on a table, and the 3D printed model has been placed in front of it, as can be seen in Figure 3.

To measure the error of the PCA-based approach, a metric had to be defined. For this, the Euclidean distance between the ground truth vector and the estimated plane normal vector of the PCA is used. Usu-

¹ https://www.intelrealsense.com/depth-camera-d435/
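The chaining in Equation (6) and the Euclidean-distance error metric of the Results section can be sketched as follows; the helper names and the use of 4×4 homogeneous matrices are assumptions for this sketch, not from the paper.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack R (3x3) and t (3,) into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def object_world_pose(T_world_camera, T_camera_obj):
    """Eq. (6): compose camera->world with the object pose in the camera."""
    return T_world_camera @ T_camera_obj

def normal_error(n_gt, n_est):
    """Error metric: Euclidean distance between the ground-truth vector
    and the estimated plane normal vector of the PCA."""
    return np.linalg.norm(np.asarray(n_gt, dtype=float)
                          - np.asarray(n_est, dtype=float))
```

For unit normals this distance is zero for a perfect estimate and grows monotonically with the angle between the two vectors, which makes it a simple scalar error for the test setup of Figure 3.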
Title: Joint Austrian Computer Vision and Robotics Workshop 2020
Editor: Graz University of Technology
Location: Graz
Date: 2020
Language: English
License: CC BY 4.0
ISBN: 978-3-85125-752-6
Size: 21.0 x 29.7 cm
Pages: 188
Categories: Computer Science, Engineering