Joint Austrian Computer Vision and Robotics Workshop 2020
Page 88
Image acquisition. Acquisition of a set of SAR images of the area of interest, in the optimal case from ascending and descending orbital direction (cf. Figure 1). In case images are gathered from one orbital direction only, the stereo intersection angle has to be reasonably large (i.e., larger than 10°). After import, each image consists of complex-valued pixels plus the according sensor model (i.e., the cocircular geometry based on the Range and Doppler equations [2,10]).

SAR delay correction. Adjustment of the sensor models, specifically the SAR-internal delays in range direction, for the following effects (cf. [5]): (1) ionospheric signal propagation delay caused by electrons; (2) tropospheric signal propagation delay caused by air conditions, e.g., water vapor; (3) solid earth tides caused by the gravity of moon and sun; and (4) plate tectonics, i.e., continental drift. For each image the range correction grid is updated, whereby the underlying information is gathered from weather and GPS services.

Point extraction. Metal objects appear as points or rather bright blobs on a dark background (cf. Figure 2). For detection, the image is upsampled based on complex FFT oversampling with a factor of 2. Then a matched filter is applied on the amplitude to localize blobs using a spike-shaped template kernel (cf. [14]). The results are thresholded and the best-matching 2D blob locations are retrieved by subpixel interpolation [6,12].

Matching of points. For each stereo pair, epipolar-rectified images are generated using a coarse digital elevation model based on the method [13], also transferring the extracted points. This method undistorts the images in range direction and thus increases their geometric and radiometric similarity. The points are then matched by means of normalized cross-correlation (the kernel size depends on the resolution of the input images). The resulting homologous points are then transformed back into the input images.

Retrieval of 3D coordinates. GCPs are calculated by a multi-image least-squares spatial intersection of SAR range circles, yielding a point cloud. Due to the overdetermination, incorrect points can be detected and rejected.

3. Results and Conclusion

The presented workflow was applied on a multitude of multi-beam scenes distributed over the whole globe, acquired with various imaging modes (i.e., Stripmap, Spotlight, HS Spotlight, Staring Spotlight [3]). Reference coordinates of metal poles were measured in-situ with differential GPS with an absolute 3D accuracy of ±5 cm, such that inaccuracies of the cadastre system do not propagate into the evaluation.

Figure 2. Roundabout traffic as perceived from an airborne digital camera (top) and from the SAR satellite (bottom). The bright blobs in the SAR amplitude correspond mainly to light poles. An exemplary pole is highlighted together with its extracted 3D location.

Table 1 gives exemplary 3D accuracies (defined as root mean square (rms) values) as can be expected from the proposed methodology. In planimetry around 15 cm are achieved and in height around 20 cm, which are impressive numbers taking into account the altitude of the satellite's orbit at 514 km.

        East [m]   North [m]   Height [m]
rms       0.14       0.14        0.21
mean      0.04       0.09        0.07
std       0.13       0.10        0.20
min      -0.38      -0.26       -0.44
max       0.32       0.26        0.40

Table 1. 3D accuracy evaluation w.r.t. in-situ measurements, given in meters, based on 26 reference points and two opposite-orbit Staring Spotlight images.
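As a small illustration of how the figures in Table 1 can be reproduced, the following sketch computes the per-axis rms, mean, standard deviation, minimum, and maximum of coordinate residuals between SAR-derived GCPs and in-situ reference coordinates. The function name, the (N, 3) East/North/Height input layout, and the output formatting are illustrative assumptions; the paper does not describe an implementation.

import numpy as np

def accuracy_table(gcp_enh, ref_enh):
    """Per-axis accuracy statistics (East/North/Height) between estimated
    GCPs and reference points, in the style of Table 1.

    gcp_enh, ref_enh: (N, 3) arrays of local East/North/Height coordinates
    in meters (hypothetical input layout, not taken from the paper).
    """
    diff = np.asarray(gcp_enh) - np.asarray(ref_enh)   # residuals in meters
    stats = {
        "rms":  np.sqrt(np.mean(diff ** 2, axis=0)),
        "mean": diff.mean(axis=0),
        "std":  diff.std(axis=0, ddof=1),               # sample standard deviation
        "min":  diff.min(axis=0),
        "max":  diff.max(axis=0),
    }
    rows = [f"{'':6s} {'East [m]':>9s} {'North [m]':>10s} {'Height [m]':>11s}"]
    for name, (east, north, height) in stats.items():
        rows.append(f"{name:6s} {east:9.2f} {north:10.2f} {height:11.2f}")
    return "\n".join(rows)

Applied to the 26 reference points mentioned above, such a routine yields the kind of rms/mean/std/min/max rows shown in Table 1.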
Future work will deal with automatic transfer of those SAR-based GCPs to optical images by means of multi-modal image matching. The most promising recent works use deep learning to tackle this ill-posed issue, for instance [7,8,1].
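To make the point-extraction step of the workflow above more concrete, here is a minimal sketch of complex FFT oversampling, matched filtering of the amplitude with a spike-shaped template, thresholding, and parabolic subpixel refinement. Function names, the kernel size, and the detection threshold are assumptions chosen for illustration; the paper describes this step only in prose and refers to [14] and [6,12] for the details.

import numpy as np
from scipy.signal import fftconvolve

def oversample_complex(slc, factor=2):
    """Upsample a complex SAR image by zero-padding its 2D spectrum (FFT oversampling)."""
    rows, cols = slc.shape
    spec = np.fft.fftshift(np.fft.fft2(slc))
    out = np.zeros((rows * factor, cols * factor), dtype=complex)
    r0, c0 = (rows * (factor - 1)) // 2, (cols * (factor - 1)) // 2
    out[r0:r0 + rows, c0:c0 + cols] = spec
    return np.fft.ifft2(np.fft.ifftshift(out)) * factor ** 2

def spike_template(size=9, sigma=1.0):
    """Zero-mean, spike-shaped kernel: a narrow peak on a flat background."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return k - k.mean()

def extract_points(slc, factor=2, threshold=5.0):
    """Detect bright blobs in the SAR amplitude and return subpixel peak
    locations in the coordinate frame of the original (non-oversampled) image."""
    amp = np.abs(oversample_complex(slc, factor))
    resp = fftconvolve(amp, spike_template(), mode="same")   # matched filter on amplitude
    score = (resp - resp.mean()) / resp.std()
    points = []
    for r, c in zip(*np.where(score > threshold)):
        win = score[r - 1:r + 2, c - 1:c + 2]
        if win.shape != (3, 3) or score[r, c] < win.max():
            continue                                          # keep only local maxima away from borders
        # Parabolic subpixel interpolation around the peak, row and column separately.
        dr = 0.5 * (score[r - 1, c] - score[r + 1, c]) / (
            score[r - 1, c] - 2 * score[r, c] + score[r + 1, c])
        dc = 0.5 * (score[r, c - 1] - score[r, c + 1]) / (
            score[r, c - 1] - 2 * score[r, c] + score[r, c + 1])
        points.append(((r + dr) / factor, (c + dc) / factor))
    return points

In the workflow described above, detections of this kind are then matched across epipolar-rectified stereo images and intersected by least squares to obtain the 3D GCP coordinates.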
Title: Joint Austrian Computer Vision and Robotics Workshop 2020
Publisher: Graz University of Technology
Place: Graz
Date: 2020
Language: English
License: CC BY 4.0
ISBN: 978-3-85125-752-6
Dimensions: 21.0 x 29.7 cm
Pages: 188
Categories: Computer Science, Technology