Proceedings of the OAGM&ARW Joint Workshop - Vision, Automation and Robotics

Text of the Page - 71 -
three-step procedure was used, visualised in Figure 4. Firstly, the robot moves the plug at high velocity to the approach position, which is within a 0.1 meter radius from the charging port. The second step is to reduce the velocity to 10% of the maximum robot joint speed and move to the final alignment position. In this pose, the connector plug and the charging port are fully aligned by their Z-axes and just a few millimeters away from the contact point. The last step is to move at just 2% of the maximum speed along the Z-axis and perform the plug-in motion. During this move, the forces and torques exerted on the end effector of the robot are monitored. In case the forces exceed a given threshold, the system is halted to prevent any damage.

Fig. 4. Three-step plug-in procedure plan. Firstly, the robot moves the connector plug to the approach position, which lies approximately 0.1 meter away from the charging port. The second move aligns the Z-axes of the charging port and the plug and brings the plug to within a few millimeters of the port. The final movement performs the plug-in motion along the Z-axis.

E. Unplugging

After the vehicle is charged fully or to the desired battery level, the robot has to disconnect the charger. Under the assumption that there were no position changes during the charging process, the unplugging procedure was simplified to follow the recorded waypoints of the plug-in procedure in inverse order. First, the robot returns to the approach position and then to the stand-by position, where it is docked while waiting for the next task. The stand-by position ensures an unobstructed view of the parked vehicle for the vision sensor.

III. EXPERIMENTS AND RESULTS

A. Experiment Setup

At the current stage, testing was limited to the lab environment. The experimental setup consists of a UR10 robot arm, a vision sensor containing stereo cameras, and a charging port holder with interchangeable charging ports. The charging port holder has variable height, position and angle to simulate various imperfect parking positions and differences in charging port locations on the vehicle. Two types of charging ports, Type 1 and Type 2, have been used, as previously seen in Figure 3. The connector plug is attached to the end-effector of the robot using a custom 3D-printed attachment, shown in Figure 5. The charging cable is also attached to simulate the realistic weight exerted on the robot during operation. The whole experimental setup is shown in Figure 6.

Fig. 5. Custom 3D-printed connector plug holder attached to the end-effector of the UR10 robot.

The final goal was to locate the charging port using the vision sensor and estimate its pose. The pose is then transformed into the coordinate system of the robot, and the end point of the connector plug is aligned with and plugged into the charging port. After a brief pause to simulate the charging process, the unplugging movement is performed and the robot moves back to the stand-by position. Results of each part of the process are discussed separately, followed by the final evaluation of the whole system.

Fig. 6. The whole experiment setup. On the left, the charging port holder can be seen. The robot is holding the connector plug, and the vision sensor made up of two stereo cameras is on the right-hand side.
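The overall pipeline described above (estimate the charging-port pose with the vision sensor, transform it into the robot's coordinate system, then execute the staged, force-monitored insertion) can be illustrated in code. The following is only a minimal sketch of that flow, not the authors' implementation: the robot interface (robot.move_to, robot.read_wrench, robot.halt), the helper names, and the 30 N force threshold are assumptions; only the 0.1 m approach distance and the 10%/2% speed scaling are taken from the text.

```python
import numpy as np

# Assumed abort threshold on the end-effector force; the paper only states
# that "a given threshold" is used, so this value is illustrative.
FORCE_LIMIT_N = 30.0


def to_base_frame(T_base_cam, T_cam_port):
    """Map the charging-port pose from the camera frame into the robot base
    frame by chaining 4x4 homogeneous transforms."""
    return T_base_cam @ T_cam_port


def offset_along_z(T, d):
    """Return pose T shifted by d metres along its own Z-axis."""
    out = T.copy()
    out[:3, 3] += T[:3, 2] * d
    return out


def interpolate_along_z(T_from, T_to, steps):
    """Straight-line Cartesian interpolation between two poses that share
    the same orientation (the final insertion move)."""
    for i in range(1, steps + 1):
        a = i / steps
        T = T_from.copy()
        T[:3, 3] = (1 - a) * T_from[:3, 3] + a * T_to[:3, 3]
        yield T


def plug_in(robot, T_base_port):
    """Three-step plug-in procedure as described in the text (cf. Fig. 4)."""
    # Step 1: fast move to the approach position ~0.1 m in front of the port.
    approach = offset_along_z(T_base_port, -0.10)
    robot.move_to(approach, speed_scale=1.0)

    # Step 2: at 10% of the maximum joint speed, align the Z-axes and stop
    # a few millimetres before the contact point.
    align = offset_along_z(T_base_port, -0.003)
    robot.move_to(align, speed_scale=0.10)

    # Step 3: insert along the Z-axis at 2% speed while monitoring the
    # forces and torques exerted on the end effector.
    for pose in interpolate_along_z(align, T_base_port, steps=50):
        robot.move_to(pose, speed_scale=0.02)
        force, _torque = robot.read_wrench()
        if np.linalg.norm(force) > FORCE_LIMIT_N:
            robot.halt()  # stop immediately to prevent damage
            raise RuntimeError("contact force threshold exceeded")
```

Unplugging would then simply replay the recorded waypoints (alignment and approach poses) in reverse order at the same reduced speeds, as described in Section E.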
B. Template Matching

Template matching for the Type 1 and Type 2 charging ports, as well as the connector plug (Type 2), has worked well under various illumination conditions and at angles of up to 45° relative to the viewing angle of the camera. The matching confidence score for good alignment was over 95%. The recognition time on the full camera image varied between 300 ms and 800 ms. By narrowing down the search area, for example by identifying the darker-than-average regions in the image, the recognition time can be reduced to under 150 ms. The results can be seen in Figure 7. Recognition under low illumination or overexposure succeeded as long as the edges of the socket or plug structure were still visible.
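The text does not specify how the template matching is implemented. The sketch below shows one plausible realisation using OpenCV's normalised cross-correlation, including the darker-than-average pre-filter used to narrow the search area; the function name find_port, the padding logic, and the 0.95 confidence cut-off (chosen to mirror the reported >95% scores) are assumptions rather than the authors' code.

```python
import cv2
import numpy as np


def find_port(image_gray, template_gray, min_confidence=0.95):
    """Locate a charging-port template in a uint8 grayscale camera image.

    The search is restricted to darker-than-average regions (the socket
    appears as a dark cavity), which shrinks the area passed to the
    template matcher and speeds up recognition.
    Returns ((x, y) of the best match's top-left corner, score) or None.
    """
    th, tw = template_gray.shape

    # Pre-filter: bounding box of pixels darker than the image mean,
    # padded so the template still fits inside the cropped region.
    ys, xs = np.nonzero(image_gray < image_gray.mean())
    if xs.size == 0:
        return None
    x0 = max(0, int(xs.min()) - tw)
    x1 = min(image_gray.shape[1], int(xs.max()) + tw)
    y0 = max(0, int(ys.min()) - th)
    y1 = min(image_gray.shape[0], int(ys.max()) + th)
    roi = image_gray[y0:y1, x0:x1]
    if roi.shape[0] < th or roi.shape[1] < tw:
        return None

    # Normalised cross-correlation; the peak score approaches 1.0 for a
    # well-aligned match.
    result = cv2.matchTemplate(roi, template_gray, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    if score < min_confidence:
        return None

    # Translate the match position back into full-image coordinates.
    return (top_left[0] + x0, top_left[1] + y0), score
```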
Title
Proceedings of the OAGM&ARW Joint Workshop
Subtitle
Vision, Automation and Robotics
Authors
Peter M. Roth
Markus Vincze
Wilfried Kubinger
Andreas Müller
Bernhard Blaschitz
Svorad Stolc
Publisher
Verlag der Technischen Universität Graz
Location
Wien
Date
2017
Language
English
License
CC BY 4.0
ISBN
978-3-85125-524-9
Size
21.0 x 29.7 cm
Pages
188
Keywords
Tagungsband (conference proceedings)
Categories
International
Tagungsbände (conference proceedings)

Table of contents

  1. Preface v
  2. Workshop Organization vi
  3. Program Committee OAGM vii
  4. Program Committee ARW viii
  5. Awards 2016 ix
  6. Index of Authors x
  7. Keynote Talks
  8. Austrian Robotics Workshop 4
  9. OAGM Workshop 86