Proceedings of the OAGM&ARW Joint Workshop - Vision, Automation and Robotics
Page 69

A comparison of the time taken to charge a vehicle using different charging systems is shown in Figure 1.

Fig. 1. Driving distance and charging time comparison of different charging systems [22].

C. Related Research

Automated charging has been well researched, especially for mobile robots. Typically, there is a custom-made charging station, which is localized by the robot either using direct communication or using computer vision based methods. These methods are normally based on having special markers on the charging station, which are localised in order for the robot to correctly align itself and approach the station. Removing the markers would impede the operation [12] [19] [18] [14].

Another concept, developed specifically for the detection of charging ports on EVs, was based on adding an array of RFID tags to the car. Reading the RFID signals allows finding the exact position and orientation of the charging port and plugging in the charger automatically [16]. However, this still requires modification of the vehicle and would not support non-adapted cars.

Fig. 2. CAD model of the robotic charging station concept.

D. Method Presented in This Work

We present a conductive robot-based automated charging method for EVs and PHEVs which does not require any modifications to existing vehicles. First, we present a quick eye-to-hand calibration procedure to calibrate the vision sensor and the robot to work in the same coordinate system. It estimates both the placement of the vision sensor in relation to the robot base and the transformation between the end-effector and the plug. Then we use shape-based matching and triangulation to locate and identify the charging port of the car and guide the robot, holding a charging cable, to precisely plug in the charger. Once the car is fully charged, the robot automatically unplugs from the vehicle, which is then ready to be driven away. A visualisation of the concept robotic charging station is shown in Figure 2.

This paper is organized as follows. We explain the proposed method in Section II. Then we provide our test setup, experiments and results in Section III, followed by conclusions and future work in Section IV.

II. METHOD

A. Detection of the Charging Port

The majority of car charging ports are manufactured from texture-less black plastic, making it difficult to obtain good features in the camera image. Similarly, measurements made using time-of-flight cameras, which rely on the projection of infrared (IR) light, are noisy and inaccurate due to IR absorption by the material. As an alternative solution, a stereo-camera setup was used as the vision sensor.

Fig. 3. Input images, simplified template models and automatically created shape-based templates for matching. The type 2 socket is shown in column a), the type 1 socket in b) and the type 2 connector plug in c). Green circles define the area of interest for the model creation and the red outline defines the created shape model.

The first step in the detection procedure is to find the location of the charging port in the stereo images using shape-based template matching. Models were created for two types of charging ports as well as for the power plug connector, the latter to be used later for eye-to-hand calibration. Figure 3 shows the camera images and the simplified model images, which are used to automatically generate the shape-based templates subsequently used for matching.
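As a rough illustration of this shape-based matching step (the following paragraph describes the actual Halcon-based implementation used in the paper), the sketch below substitutes a much simpler edge-based correlation in Python/OpenCV. The file names, Canny thresholds and rotation range are illustrative assumptions, not values from the paper:

# Simplified stand-in for shape-based template matching: match on edge maps
# rather than raw intensities, over a small range of template rotations.
# All file names and parameters are illustrative placeholders.
import cv2
import numpy as np

def make_shape_template(model_img, low=50, high=150):
    """Build an edge-based template from a simplified model image."""
    return cv2.Canny(model_img, low, high)

def find_shape(search_img, template_edges, angles=range(-10, 11, 2)):
    """Return (x, y, angle, score) of the best match in the search image.

    Halcon's shape-based matching additionally handles scale, occlusion and
    sub-pixel refinement; this loop only covers translation and rotation.
    """
    search_edges = cv2.Canny(search_img, 50, 150)
    h, w = template_edges.shape
    best = (0, 0, 0, -1.0)
    for angle in angles:
        # Rotate the template edges around their centre and correlate.
        rot = cv2.getRotationMatrix2D((w / 2, h / 2), float(angle), 1.0)
        rotated = cv2.warpAffine(template_edges, rot, (w, h))
        res = cv2.matchTemplate(search_edges, rotated, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(res)
        if score > best[3]:
            best = (loc[0] + w // 2, loc[1] + h // 2, angle, score)
    return best

# Hypothetical usage with a simplified type 2 socket model and one stereo image.
model = cv2.imread("type2_socket_model.png", cv2.IMREAD_GRAYSCALE)
left = cv2.imread("stereo_left.png", cv2.IMREAD_GRAYSCALE)
x, y, angle, score = find_shape(left, make_shape_template(model))
print(f"charging port centre at ({x}, {y}), rotation {angle} deg, score {score:.2f}")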
Template matching was performed using the Halcon machine vision software, which has proven to perform well under the given conditions of low-contrast input images [2]. Matching results in a 2D affine transformation matrix defining the template location in the image. By taking the x and y coordinates of the corresponding object points in the images from each of the stereo
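The sentence breaks off at the page boundary, but Section I.D already names the operation that follows: triangulation of the matched 2D locations into a 3D position of the charging port. Below is a minimal sketch of linear triangulation, assuming OpenCV and made-up calibration values; the projection matrices, baseline and pixel coordinates are illustrative, not taken from the paper:

# Recover a 3D point from the matched (x, y) locations of the charging port
# centre in the left and right stereo images, given calibrated projection
# matrices P1 and P2 (3x4). All numbers are illustrative placeholders.
import cv2
import numpy as np

P1 = np.array([[1000.0, 0.0, 640.0, 0.0],
               [0.0, 1000.0, 360.0, 0.0],
               [0.0, 0.0, 1.0, 0.0]])
P2 = np.array([[1000.0, 0.0, 640.0, -120000.0],  # fx * (-120 mm baseline)
               [0.0, 1000.0, 360.0, 0.0],
               [0.0, 0.0, 1.0, 0.0]])

# Matched image coordinates of the same feature in the left and right images,
# as 2xN arrays (here a single point, e.g. the detected port centre).
pts_left = np.array([[512.0], [380.0]])
pts_right = np.array([[470.0], [380.0]])

# Linear (DLT) triangulation; the result is homogeneous, shape (4, N).
points_h = cv2.triangulatePoints(P1, P2, pts_left, pts_right)
points_3d = (points_h[:3] / points_h[3]).T  # Euclidean coordinates, shape (N, 3)
print("triangulated port position (camera frame, mm):", points_3d[0])

With the assumed 42-pixel disparity, fx = 1000 px and 120 mm baseline, this yields a depth of roughly 2.9 m, illustrating how the matched template coordinates translate into a metric target position for the robot.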
Proceedings of the OAGM&ARW Joint Workshop Vision, Automation and Robotics
Title
Proceedings of the OAGM&ARW Joint Workshop
Subtitle
Vision, Automation and Robotics
Authors
Peter M. Roth
Markus Vincze
Wilfried Kubinger
Andreas Müller
Bernhard Blaschitz
Svorad Stolc
Publisher
Verlag der Technischen Universität Graz
Place
Wien
Date
2017
Language
English
License
CC BY 4.0
ISBN
978-3-85125-524-9
Dimensions
21.0 x 29.7 cm
Pages
188
Keywords
Conference proceedings
Categories
International
Conference proceedings

Table of Contents

  1. Preface v
  2. Workshop Organization vi
  3. Program Committee OAGM vii
  4. Program Committee ARW viii
  5. Awards 2016 ix
  6. Index of Authors x
  7. Keynote Talks
  8. Austrian Robotics Workshop 4
  9. OAGM Workshop 86