Proceedings of the OAGM&ARW Joint Workshop - Vision, Automation and Robotics
Page 118

Table II: Accuracy on the dataset evaluated using different image colour spaces. Peak Signal to Noise Ratio (PSNR) is used as the similarity measure. The evaluated colour spaces are Hue, Saturation, Value (HSV), Red, Green, Blue (RGB), YCbCr, Grayscale and CIE-Lab. Using HSV shows the highest accuracy.

Colour space   Accuracy
HSV            1
RGB            0.7742
YCbCr          0.7419
Grayscale      0.7419
CIE-Lab        0.7188

…al. [10]. Nevertheless, using this colour space the accuracy drops to 0.7419. With the Grayscale colour space the accuracy also drops to 0.7419. This is of further interest since the applied grayscale conversion algorithm simply takes the Y component of YCbCr and omits the colour channels. This procedure is common in photo-editing software such as Photoshop (http://www.adobe.com/at/products/photoshop.html) or GIMP (https://www.gimp.org). A closer look at the results reveals that YCbCr and Grayscale produce their wrong estimations on the same models. Therefore, the CbCr colour encoding adds no benefit over using the Y channel alone in this application. The CIE-Lab colour space was also evaluated since it approximates human vision; unfortunately, in this application the accuracy dropped to 0.7188.

D. Comparison with state-of-the-art

A comparison with the state of the art is difficult, since the algorithms are usually embedded in a certain application scenario which is not always exchangeable. Nevertheless, the evaluation of the appearance quality part in the publication of Alexiadis et al. [8] has been adapted to our set-up: the parameter value for which the reconstructed views are most similar to the ground-truth key-frames is chosen as the best value. As in Alexiadis et al., the similarity measure is SSIM and the colour space is HSV. When run on the dataset, the accuracy is 0.4062. This is not a fair comparison, since the appearance evaluation is only a part of the whole framework of Alexiadis et al., and it only confirms that the set-ups of both approaches cannot be intermixed.

V. DISCUSSION

This section discusses the applicability of the approach as well as considerations on the runtime.

A. Applicability

The proposed approach relies on an interesting property of the reconstruction principle: changes in the parameter value lead to mainly distinct local deviations in the model. PSNR as the chosen similarity measure has to be sensitive to these deviations. To visualize this, a metric Multidimensional Scaling (MDS) [11] algorithm is utilized. An MDS algorithm tries to position each object in a multi-dimensional space such that the between-object distances are preserved as well as possible. This gives more insight into the working principle of the proposed approach, since it illustrates which images are similar from the point of view of PSNR.

Fig. 3. Multidimensional scaling layout of all frontal views for parameter-type ColourCorrection on Model 0093, using Peak Signal to Noise Ratio (PSNR) as similarity measure and the Hue, Saturation, Value colour space. Top right is the best choice; bottom left and right show deviations on the pine, top left on the right cheek. The farther the images are from each other, the more they differ in the sense of PSNR. The images form clusters according to local deviations in the reconstruction.

In Figure 3, all frontal view reconstructions over the whole parameter value range for ColourCorrection of a specific model (0093 in the dataset) are laid out with an MDS algorithm. To create the necessary distance matrix for the algorithm, the similarities in Mα were converted to distances.
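A minimal sketch of this layout step, assuming Python with NumPy and scikit-learn: the helper name mds_layout is hypothetical, and the similarity-to-distance conversion shown (d = max(s) − s) is an assumption, since the page only states that the similarities in Mα were converted to distances.

```python
import numpy as np
from sklearn.manifold import MDS

def mds_layout(similarities):
    """2D metric MDS layout from a square matrix of pairwise similarities.

    similarities: (n, n) array of PSNR-based similarities between the
    reconstructed views (the matrix M_alpha); larger means more similar.
    Returns an (n, 2) array of coordinates for plotting the views.
    """
    s = np.asarray(similarities, dtype=float)
    # Assumed similarity-to-distance conversion: any monotone decreasing
    # mapping would do for a sketch; here d = max(s) - s.
    d = s.max() - s
    np.fill_diagonal(d, 0.0)      # each view has zero distance to itself
    d = 0.5 * (d + d.T)           # enforce symmetry of the distance matrix

    # Metric MDS on the precomputed dissimilarity matrix.
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    return mds.fit_transform(d)
```

Each view's thumbnail can then be drawn at its returned 2D coordinate, which is how a layout such as the one in Fig. 3 can be produced.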
In the layout of Figure 3, the optimal reconstruction is on the top right. Bottom left and bottom right show deviations on the pine, whereas the top left deviates on the left cheek. It can be seen that images with similar deviations are clustered together. However, the increasing visual effect of the parameter values, mentioned in Section III, is not visible in the layout: on the one hand because MDS is a form of non-linear dimensionality reduction, and on the other hand because PSNR as the underlying measure does not fully reflect human visual perception.

Figure 4 also depicts an MDS layout for a whole parameter value range (parameter-type SmoothnessTerm on Model 0098 in the dataset). On the bottom right is the rare case of a completely failed reconstruction, which has a high distance to the other images. One can see that the case of a global deviation is handled well, as long as it does not occur in the majority of the images.

The dependency on distinct local deviations can limit the generality of the approach. However, especially in the area of human 3D reconstruction there should be a wide range of possible applications. Furthermore, our approach is not dependent on a certain reconstruction principle.
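To make the similarity measure used throughout this evaluation concrete, here is a minimal sketch of PSNR computed in HSV colour space, assuming Python with NumPy and OpenCV; the function name psnr_hsv and the averaging of the squared error over all three HSV channels are assumptions, since the exact aggregation is not specified on this page.

```python
import numpy as np
import cv2

def psnr_hsv(img_a, img_b, peak=255.0):
    """PSNR between two 8-bit BGR images of identical shape, computed
    after conversion to HSV. Higher values mean more similar views."""
    hsv_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2HSV).astype(np.float64)
    hsv_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2HSV).astype(np.float64)
    # Note: OpenCV stores H in [0, 179] for 8-bit images, so using a
    # single peak value of 255 for all three channels is a simplification.
    mse = np.mean((hsv_a - hsv_b) ** 2)
    if mse == 0.0:                     # identical images
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)
```

For the Alexiadis-style baseline of the comparison section, the analogous computation would use SSIM instead of PSNR and compare the reconstructed views against the ground-truth key-frames.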
Proceedings of the OAGM&ARW Joint Workshop - Vision, Automation and Robotics
Title
Proceedings of the OAGM&ARW Joint Workshop
Subtitle
Vision, Automation and Robotics
Authors
Peter M. Roth
Markus Vincze
Wilfried Kubinger
Andreas Müller
Bernhard Blaschitz
Svorad Stolc
Publisher
Verlag der Technischen Universität Graz
Place
Wien
Date
2017
Language
English
License
CC BY 4.0
ISBN
978-3-85125-524-9
Dimensions
21.0 x 29.7 cm
Pages
188
Keywords
Conference proceedings
Categories
International
Conference proceedings

Table of Contents

  1. Preface v
  2. Workshop Organization vi
  3. Program Committee OAGM vii
  4. Program Committee ARW viii
  5. Awards 2016 ix
  6. Index of Authors x
  7. Keynote Talks
  8. Austrian Robotics Workshop 4
  9. OAGM Workshop 86