Proceedings - OAGM & ARW Joint Workshop 2016 on "Computer Vision and Robotics"
Page 97

Real-time Tracking of Rigid Objects Using Depth Data

Sharath Chandra Akkaladevi1,2, Martin Ankerl1, Gerald Fritz1, Andreas Pichler1
1 Department of Robotics and Assistive Systems, Profactor GmbH, Im Stadtgut A2, 4407 Steyr-Gleink, Austria
{firstname.lastname}@profactor.at
2 Institute of Networked and Embedded Systems, Alpen-Adria-Universität Klagenfurt, Austria

Abstract

In this paper, a robust, real-time object tracking approach is presented. The approach relies only on depth data to track objects in a dynamic environment and uses random-forest-based learning to deal with problems such as object occlusion and clutter. We show that the relation between object motion and the corresponding change in its 3D point cloud data can be learned using only six random forests. A framework that unites object pose estimation and object pose tracking to efficiently track objects in 3D space is presented. The approach is robust against occlusions while tracking objects and is capable of real-time performance at 1.7 ms per frame. The experimental evaluations demonstrate the performance of the approach in terms of robustness, accuracy and speed, and compare it quantitatively with the state of the art.

1. Introduction

Object tracking has been widely researched in the vision community over the recent past, and many methods have been proposed in the literature to track objects [6]. Until the last decade, these methods mainly considered 2D image data as input (in some cases stereo vision) and served applications such as surveillance, military use, security and industrial automation. However, 2D image data captures only a two-dimensional projection of the 3D scene and is sensitive to illumination changes. With the recent development of RGB-D devices such as the Kinect, researchers all over the world are exploiting depth data for object recognition and tracking [7].
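The abstract's core idea, learning the relation between object motion and the resulting change in the 3D point cloud with one random forest per degree of freedom, can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the feature vectors and training targets below are random placeholders standing in for real depth-change descriptors and ground-truth pose increments.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical sketch: six independent regressors, one per DOF
# (tx, ty, tz, rx, ry, rz), each mapping a feature vector that
# describes the change between consecutive depth frames to the
# corresponding pose increment.
rng = np.random.default_rng(0)
n_samples, n_features = 500, 32

X = rng.normal(size=(n_samples, n_features))  # placeholder depth-change features
Y = rng.normal(size=(n_samples, 6))           # placeholder 6-DOF pose deltas

forests = [RandomForestRegressor(n_estimators=20, random_state=0)
           for _ in range(6)]
for dof, forest in enumerate(forests):
    forest.fit(X, Y[:, dof])                  # each DOF is learned separately

# At tracking time, predict the 6-DOF pose increment for a new frame:
x_new = rng.normal(size=(1, n_features))
delta_pose = np.array([f.predict(x_new)[0] for f in forests])
print(delta_pose.shape)  # (6,)
```

Splitting the 6-DOF regression into six scalar forests keeps each model small, which is one plausible way to reach per-frame timings in the low-millisecond range.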
Tracking can be defined as the problem of estimating the trajectory (6 DOF: 3 translation and 3 rotation parameters) of an object in 3D space as it moves around a scene. Though there has been a lot of work on tracking humans using RGB-D devices [8], little work has been done on tracking objects in industrial settings, which often have real-time requirements. Object tracking in general is a challenging problem: abrupt object motions, object-to-object occlusions, clutter, camera motion and noisy sensor data all make it difficult. When considering applications in industrial settings, designing a successful tracking algorithm becomes even harder, because of the higher levels of robustness, accuracy and speed required; moreover, industrial objects tend to have little texture. In this paper, we describe an approach for real-time tracking of objects [12] that aims to answer these challenges. The main contribution of this paper is the extended evaluation of the work in [12] and its comparison with state-of-the-art approaches.
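The 6-DOF trajectory described above is conventionally represented as a sequence of rigid transforms, with each tracking step composing a small pose increment onto the current pose. A minimal numpy illustration of that bookkeeping (an assumed convention, not code from the paper):

```python
import numpy as np

def pose_matrix(tx, ty, tz, rx, ry, rz):
    """4x4 rigid transform from 3 translations and 3 rotations (radians, Rz*Ry*Rx order)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [tx, ty, tz]
    return T

# Tracking accumulates per-frame 6-DOF increments into a global pose:
pose = np.eye(4)
for delta in [(0.01, 0, 0, 0, 0, 0.02), (0.0, 0.01, 0, 0.01, 0, 0)]:
    pose = pose @ pose_matrix(*delta)

print(np.round(pose[:3, 3], 3))  # accumulated object translation
```

Composing increments this way keeps the rotation part orthonormal (each factor is a proper rotation), which matters when updates are applied hundreds of times per second.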
Proceedings - OAGM & ARW Joint Workshop 2016 on "Computer Vision and Robotics"

Title: Proceedings
Subtitle: OAGM & ARW Joint Workshop 2016 on "Computer Vision and Robotics"
Editors: Peter M. Roth, Kurt Niel
Publisher: Verlag der Technischen Universität Graz
Location: Wels
Date: 2017
Language: English
License: CC BY 4.0
ISBN: 978-3-85125-527-0
Size: 21.0 x 29.7 cm
Pages: 248
Keywords: Tagungsband (conference proceedings)
Categories: International, Tagungsbände (conference proceedings)

Table of contents

  1. Learning / Recognition 24
  2. Signal & Image Processing / Filters 43
  3. Geometry / Sensor Fusion 45
  4. Tracking / Detection 85
  5. Vision for Robotics I 95
  6. Vision for Robotics II 127
  7. Poster OAGM & ARW 167
  8. Task Planning 191
  9. Robotic Arm 207