Page 220 in Proceedings - OAGM & ARW Joint Workshop 2016 on "Computer Vision and Robotics"

where $u_i = u(t_i)$ and $t_1, \ldots, t_N$ is a sequence of consecutive time steps in the interval $[t_0, t_f]$. A variation of the controls $u_i$ leads to a variation of the cost functional

\[ \delta\hat{J} = \sum_{i=1}^{N} \frac{\partial\hat{J}}{\partial u_i}\, \delta u_i . \]

On the other hand, the variation $\delta\hat{J}$ can be expressed by Equation (9), which, after discretisation, results in

\[ \delta\hat{J} = \sum_{i=1}^{N} H_{u,i}\, \Delta t_i\, \delta u_i , \]

where $H_{u,i}$ is the evaluation of $H_u$ at $t = t_i$. Hence, the gradient of the discretised functional may be identified as

\[ \frac{\partial\hat{J}}{\partial u_i} = H_{u,i}\, \Delta t_i , \]

in which $\Delta t_i = t_i - t_{i-1}$. To walk in the direction of the negative gradient, a small number $\kappa > 0$ has to be chosen, which gives the increment

\[ \delta u_i = -\kappa\, H_{u,i}^{T}\, \Delta t_i . \tag{11} \]

If $\kappa$ is sufficiently small, the updated control $u_i + \delta u_i$ will always reduce the cost functional $J$. However, finding a number $\kappa$ such that $J$ is reduced may require several simulations of the system equations. For that purpose, the increments given by Equation (11) are considered as functions of $\kappa$. After solving the equations of motion with $u + \delta u$ as input, the objective function $J$ ultimately becomes a function of $\kappa$ as well. By means of a line search algorithm, one may find a number $\kappa$ in a predefined interval $[0, \kappa_{\max}]$ which minimizes $J$.

4.2. Application of a Quasi-Newton Method

It is well known that the convergence of the gradient method is rather slow, especially near the optimal solution. Hence, a Newton method provides an alternative approach to finding the minimum of the cost functional $J$. The basic idea is the following: the optimal discretised control $\hat{u} = (u_1^T, u_2^T, \ldots, u_N^T)^T$ is characterised by a zero gradient, i.e. by the equations

\[ \nabla\hat{J} = \left[ \frac{\partial\hat{J}}{\partial u_1}, \cdots, \frac{\partial\hat{J}}{\partial u_N} \right]^T = 0 , \]

which can be solved for $\hat{u}$ by Newton's method. However, the Hessian $H$ is required for that purpose. To avoid the full computation of $H$, which would be extremely time-consuming, several quasi-Newton methods have been developed. They all approximate the Hessian by using the gradients of successive Newton iterations. For example, the Hessian can be estimated efficiently by the well-known Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm (cf. [10]). Even its inverse can be obtained efficiently by applying the Sherman-Morrison formula (cf. [11]).

We compute an approximation $\tilde{H}^{-1}$ of the inverse of the Hessian from the BFGS algorithm. Then, an increment $\delta\hat{u}$ of the discretised control signal is given by

\[ \begin{bmatrix} \delta u_1 \\ \delta u_2 \\ \vdots \\ \delta u_N \end{bmatrix} = -\tilde{H}^{-1} \nabla\hat{J} . \tag{12} \]
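The following is a minimal numerical sketch, not taken from the paper, of how the two update rules could be realised with NumPy/SciPy: a gradient step whose step length $\kappa$ is chosen by a bounded line search on $[0, \kappa_{\max}]$ (Equation (11)), followed by one quasi-Newton step with a BFGS approximation of the inverse Hessian (Equation (12)). The functions cost and H_u and the dimensions N, m, dt are illustrative placeholders, not the system equations of the paper.

```python
# Sketch only: stand-in cost functional and H_u; just the update
# formulas (11) and (12) follow the text above.
import numpy as np
from scipy.optimize import minimize_scalar

N, m = 50, 2                        # number of time steps, control dimension
dt = np.full(N, 0.02)               # step sizes Delta t_i
u = np.zeros((N, m))                # discretised controls u_i = u(t_i)

def cost(u):
    # Stand-in cost functional J(u) (placeholder quadratic objective).
    return 0.5 * np.sum(dt[:, None] * (u - 1.0) ** 2)

def H_u(u):
    # Stand-in for H_u evaluated at every t_i (here simply dJ/du per unit time).
    return u - 1.0

def gradient(u):
    # Gradient of the discretised functional: dJ/du_i = H_{u,i} * Delta t_i.
    return H_u(u) * dt[:, None]

# Gradient step: delta u_i = -kappa * H_{u,i}^T * Delta t_i, with kappa chosen
# by a line search on [0, kappa_max] that minimizes J (Equation (11)).
kappa_max = 10.0
g = gradient(u)
trial = lambda kappa: u - kappa * g
res = minimize_scalar(lambda k: cost(trial(k)),
                      bounds=(0.0, kappa_max), method="bounded")
u = trial(res.x)

# One quasi-Newton step: delta u_hat = -H_tilde^{-1} * grad J (Equation (12)),
# with the inverse Hessian approximated by the BFGS update (cf. [10], [11]).
H_inv = np.eye(N * m)               # initial inverse-Hessian estimate
g_old = gradient(u).ravel()
du = -H_inv @ g_old
u_new = u + du.reshape(N, m)
s = du                              # step in the stacked control variables
y = gradient(u_new).ravel() - g_old # change of the gradient
rho = 1.0 / (y @ s)
I = np.eye(N * m)
H_inv = (I - rho * np.outer(s, y)) @ H_inv @ (I - rho * np.outer(y, s)) \
        + rho * np.outer(s, s)
u = u_new
print("cost after one gradient and one BFGS step:", cost(u))
```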
Proceedings - OAGM & ARW Joint Workshop 2016 on "Computer Vision and Robotics"

Title: Proceedings
Subtitle: OAGM & ARW Joint Workshop 2016 on "Computer Vision and Robotics"
Authors: Peter M. Roth, Kurt Niel
Publisher: Verlag der Technischen Universität Graz
Location: Wels
Date: 2017
Language: English
License: CC BY 4.0
ISBN: 978-3-85125-527-0
Size: 21.0 x 29.7 cm
Pages: 248
Keywords: Tagungsband (conference proceedings)
Categories: International, Tagungsbände (conference proceedings)

Table of contents

  1. Learning / Recognition (p. 24)
  2. Signal & Image Processing / Filters (p. 43)
  3. Geometry / Sensor Fusion (p. 45)
  4. Tracking / Detection (p. 85)
  5. Vision for Robotics I (p. 95)
  6. Vision for Robotics II (p. 127)
  7. Poster OAGM & ARW (p. 167)
  8. Task Planning (p. 191)
  9. Robotic Arm (p. 207)