Proceedings of the OAGM&ARW Joint Workshop - Vision, Automation and Robotics

Page 21
[Fig. 9: Scenario 2 – Comparison of estimated yaw angles; plotted over the estimated distance with VO in m, with curves for loops 1–3.]

… of the yaw angle, as is necessary for agricultural vehicles. On the given laptop, the pose estimation takes 0.349 s per stereo pair on average, which equates to approximately three pose estimations per second.

V. CONCLUSION AND FUTURE WORK

We developed a Visual Odometry (VO) system that is based on a stereo-camera pair and is capable of estimating the position and orientation of an agricultural vehicle in GPS-obstructed environments. We deployed our system on a small truck and carried out measurements for two different logging-road scenarios. The results show that an insufficient distribution of features can lead to an ill-conditioned pose estimation, and hence to an inaccurately estimated distance and pitch angle. Due to the incremental concatenation of relative motions, this results in a growing position error. However, our robust VO system is highly capable of estimating the orientation (yaw angle) with acceptable accuracy in unstructured environments. This is especially evident in the first scenario, in the dense forest, where GPS signal quality is poor.

Future work includes improving the distribution of features and hence the pose estimation. Uniformly distributed features could be achieved by using different detector parameters for the upper and lower halves of the image (see the first sketch below). Another goal is to reduce the computing time of the VO to facilitate an online system. This can mainly be done by parallelizing repeatable tasks such as feature detection and description (second sketch below). Furthermore, the next steps include incorporating the IMU data that were recorded simultaneously during our measurements. We plan to use a data fusion algorithm such as a Kalman filter to improve the overall accuracy by combining the VO with the IMU data (third sketch below).
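The per-half detector idea can be made concrete with a short sketch. The paper cites OpenCV and the KAZE/AKAZE feature family, so the sketch below assumes OpenCV's AKAZE detector; the two threshold values are illustrative placeholders, not the authors' settings:

```python
import cv2
import numpy as np

def detect_per_half(gray, thr_upper=1e-3, thr_lower=1e-4):
    """Run AKAZE with a separate detection threshold on each image half.

    A lower threshold in the (typically low-texture) half admits more
    keypoints there, evening out the spatial distribution. The threshold
    values are hypothetical, not tuned values.
    """
    h = gray.shape[0] // 2
    keypoints, descriptors = [], []
    for y0, roi, thr in ((0, gray[:h], thr_upper), (h, gray[h:], thr_lower)):
        akaze = cv2.AKAZE_create(threshold=thr)
        kps, desc = akaze.detectAndCompute(roi, None)
        for kp in kps:
            kp.pt = (kp.pt[0], kp.pt[1] + y0)  # shift back to full-image coords
        keypoints.extend(kps)
        if desc is not None:
            descriptors.append(desc)
    return keypoints, (np.vstack(descriptors) if descriptors else None)
```

Lowering the threshold in the weaker half admits more (weaker) responses there, which evens out the distribution at the cost of some extra matching work.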
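For the planned parallelization, the two images of a stereo pair are a natural unit of concurrency. A minimal sketch using Python's standard thread pool follows; it illustrates the idea only and is not the authors' implementation. OpenCV releases Python's GIL inside its native calls, so two threads can give a real speedup on a multi-core laptop:

```python
import cv2
from concurrent.futures import ThreadPoolExecutor

def _detect(img):
    # One detector per call: OpenCV detector objects are not guaranteed
    # to be safe to share across threads.
    return cv2.AKAZE_create().detectAndCompute(img, None)

def detect_stereo_pair(pool, left, right):
    """Detect and describe features in both stereo images concurrently."""
    future_left = pool.submit(_detect, left)
    future_right = pool.submit(_detect, right)
    return future_left.result(), future_right.result()

# Usage:
# pool = ThreadPoolExecutor(max_workers=2)
# (kp_l, des_l), (kp_r, des_r) = detect_stereo_pair(pool, left_gray, right_gray)
```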
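Finally, a scalar Kalman filter over the yaw angle illustrates the intended VO/IMU fusion: the gyro yaw rate drives the prediction, and the VO yaw serves as the measurement update. The noise parameters q and r below are hypothetical placeholders; a real system would identify them from the sensors:

```python
import math

class YawKalman:
    """Minimal 1-D Kalman filter: IMU yaw rate predicts, VO yaw corrects.

    q and r are hypothetical noise parameters, not identified values.
    """
    def __init__(self, yaw0=0.0, p0=1.0, q=0.01, r=0.05):
        self.x = yaw0   # yaw estimate [rad]
        self.p = p0     # estimate variance
        self.q = q      # process noise growth per second (gyro drift)
        self.r = r      # variance of the VO yaw measurement

    def predict(self, yaw_rate, dt):
        self.x += yaw_rate * dt          # integrate the gyro
        self.p += self.q * dt            # uncertainty grows between VO fixes

    def update(self, yaw_vo):
        # Wrap the innovation to (-pi, pi] so angle wrap-around fuses correctly.
        innov = math.atan2(math.sin(yaw_vo - self.x), math.cos(yaw_vo - self.x))
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * innov
        self.p *= 1.0 - k
```

At roughly three VO poses per second against a much faster IMU, the filter would run many predict steps per update, which is exactly the regime where such a fusion pays off.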
Proceedings of the OAGM&ARW Joint Workshop - Vision, Automation and Robotics
Title
Proceedings of the OAGM&ARW Joint Workshop
Subtitle
Vision, Automation and Robotics
Authors
Peter M. Roth
Markus Vincze
Wilfried Kubinger
Andreas Müller
Bernhard Blaschitz
Svorad Stolc
Publisher
Verlag der Technischen Universität Graz
Place
Wien
Date
2017
Language
English
License
CC BY 4.0
ISBN
978-3-85125-524-9
Dimensions
21.0 x 29.7 cm
Pages
188
Keywords
Conference proceedings
Categories
International
Conference proceedings

Table of Contents

  1. Preface v
  2. Workshop Organization vi
  3. Program Committee OAGM vii
  4. Program Committee ARW viii
  5. Awards 2016 ix
  6. Index of Authors x
  7. Keynote Talks
  8. Austrian Robotics Workshop 4
  9. OAGM Workshop 86