Fig. 3. Drone vision-based vehicle following; the vehicle is marked with a 39×39 cm AprilTag. Experiment: 3 min, mean speed 7.91 km/h, top speed 13.35 km/h.
b.1) Gimbal Camera Landing Platform Tracking
The drone's GPS measurements, the gimbal's current orientation, and the camera's pose relative to the marker are combined to estimate the marker positions in world coordinates.
During specific tasks, these position estimates are used to command the gimbal to point at the marker mounted on top of a landing platform. This approach is used during the vehicle following, package delivery and landing tasks.
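To illustrate, a minimal Python sketch of this transform chain follows. It is not the actual implementation: the frame conventions, function names, and the assumption that the camera and gimbal frames coincide are ours.

```python
import numpy as np

def marker_world_position(p_drone_w, R_body_to_world, R_gimbal_to_body, t_marker_in_cam):
    """Compose the drone's GPS-based world pose, the gimbal's current
    orientation and the camera-relative marker translation into a marker
    position in world coordinates."""
    # Assumes the camera frame coincides with the gimbal frame and that
    # lever-arm offsets between GPS antenna, gimbal and camera are negligible.
    return p_drone_w + R_body_to_world @ (R_gimbal_to_body @ t_marker_in_cam)

def gimbal_pointing_angles(p_drone_w, R_body_to_world, p_marker_w):
    """Pan/tilt angles (rad) that make the gimbal point at a marker estimate."""
    d = R_body_to_world.T @ (p_marker_w - p_drone_w)  # direction in body frame
    pan = np.arctan2(d[1], d[0])
    tilt = np.arctan2(-d[2], np.hypot(d[0], d[1]))
    return pan, tilt
```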
b.2) Vehicle Speed Estimation
The marker relative pose estimates are computed at around 25 fps at a resolution of 1280×720 px. These estimates are stored in a queue of 20 elements. The vehicle speed is estimated for every linear coordinate by linear regression over the elements of the queue, which incurs negligible computational cost.
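A minimal Python sketch of such an estimator is given below; the class and method names are hypothetical. It assumes timestamped position samples and estimates the per-axis speed as the slope of a least-squares line fitted over the 20-element queue:

```python
from collections import deque
import numpy as np

class SpeedEstimator:
    """Estimates per-axis vehicle speed from a short queue of pose samples."""
    def __init__(self, maxlen=20):
        self.samples = deque(maxlen=maxlen)  # entries: (t, x, y, z)

    def add_pose(self, t, position):
        """Store a timestamped (x, y, z) marker position estimate."""
        self.samples.append((t, *position))

    def speed(self):
        """Per-axis speed (m/s) as the linear-regression slope over the queue."""
        if len(self.samples) < 2:
            return np.zeros(3)
        data = np.asarray(self.samples)
        t = data[:, 0]
        # np.polyfit with degree 1 returns (slope, intercept); the slope of
        # each coordinate over time is the speed along that axis.
        return np.array([np.polyfit(t, data[:, i], 1)[0] for i in (1, 2, 3)])
```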
b.3) Navigation Control Algorithm
The flight behavior of our drone was characterized by performing speed-command step-response identification tests. A rough controller parameter tuning was calculated from the resulting model and was later improved experimentally.
We use a feedback controller based on the PID architecture for the three linear coordinates and the yaw heading. To improve its performance, the controller uses both position and speed references. The measurement feedback consists of the position and velocity provided by the autopilot telemetry, obtained by fusing GPS data with IMU and magnetometer data.
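As an illustration, the following Python sketch shows one way such an axis-decoupled PID loop with both position and speed references could be structured; the gains, names, and the use of the velocity error in place of a differentiated position error are our assumptions, not the paper's implementation:

```python
class AxisPID:
    """PID-style loop for one linear coordinate (or yaw), tracking both a
    position and a speed reference, as described above."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0

    def command(self, pos_ref, vel_ref, pos_meas, vel_meas):
        """Speed command from the fused telemetry feedback (position and
        velocity from GPS/IMU/magnetometer fusion)."""
        e = pos_ref - pos_meas
        self.integral += e * self.dt
        # Using the velocity error as the derivative term avoids numerically
        # differentiating the noisy position measurement (an assumption).
        e_dot = vel_ref - vel_meas
        return vel_ref + self.kp * e + self.ki * self.integral + self.kd * e_dot
```

One such controller would run for each linear coordinate and one for the yaw heading.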
III. EXPERIMENTAL RESULTS
a) Package delivery mission
We succeeded in performing a fully autonomous mission in which the drone takes off from the truck, follows a predefined GPS flight trajectory, looks for the delivery position, proceeds to land and drop the package, takes off again, flies back to the distribution truck, follows it for a while, and lands on the static vehicle. This mission is summarized in our video1.
b) Vehicle following experiment
The drone's task is to follow the vehicle, which is marked with a landing platform, at a constant distance of 2.5 m behind and above it. The vehicle speed estimate (see Sec. b.2) is used as the speed reference for the controller.
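A hypothetical Python sketch of the reference generation is shown below; the paper does not detail this step, so the function name, the stationary-vehicle fallback, and the use of the same 2.5 m offset horizontally and vertically are assumptions:

```python
import numpy as np

def following_references(p_vehicle_w, v_vehicle_w, offset=2.5):
    """Position and speed references for following a vehicle from behind
    and above (assumes the same offset horizontally and vertically)."""
    speed = np.linalg.norm(v_vehicle_w[:2])
    # Offset against the direction of travel; hover directly above when
    # the vehicle is (nearly) stationary.
    heading = v_vehicle_w[:2] / speed if speed > 0.1 else np.zeros(2)
    pos_ref = p_vehicle_w + np.array([-offset * heading[0],
                                      -offset * heading[1],
                                      offset])
    return pos_ref, v_vehicle_w  # speed reference = estimated vehicle speed
```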
The vehicle following experiment lasted 3 min, during which the drone performed the task successfully at all times. Pictures of this experiment are shown in Fig. 3, and the logged trajectories and speeds of the drone and the vehicle are plotted in Fig. 4. Overall, during this experiment, the mean and top vehicle speeds were 7.91 km/h and 13.35 km/h, and the root mean square errors (RMSE) of the position and speed tracking were 0.37 m and 1.34 km/h, respectively.
1 Package delivery demo: https://youtu.be/bxM6dls2wuo
Fig. 4. Vehicle following experiment of 3 min duration. The plots show the drone (red) and vehicle (blue) 3D positions and speeds over time.
IV. SUMMARY
In this paper we presented a fully autonomous drone that, using only on-board processing, is able to perform coarse GPS-based navigation as well as precise vision-based vehicle following and landing (see Figs. 1 & 3). Our fully autonomous package delivery flight demonstration, carried out in collaboration with SFL Technologies, was reported by local newspapers2,3. In future work we plan to use this system as a first step towards performing autonomous landing on a moving vehicle.
ACKNOWLEDGMENTS
The authors thank SFL Technologies for providing the
testing environment and the electric vehicle ELI [7].
REFERENCES
[1] A. Agha-mohammadi, “Confidence-aware occupancy grid mapping: A planning-oriented representation of environment,” IROS 2016 Workshop.
[2] A. Borowczyk, D.-T. Nguyen, A. P.-V. Nguyen, D. Q. Nguyen, D. Saussié, and J. L. Ny, “Autonomous landing of a multirotor micro air vehicle on a high velocity ground vehicle,” arXiv preprint, 2016.
[3] S. Daftry, S. Zeng, A. Khan, D. Dey, N. Melik-Barkhudarov, J. A. Bagnell, and M. Hebert, “Robust monocular flight in cluttered outdoor environments,” IROS 2016 Workshop, arXiv:1604.04779, 2016.
[4] R. D’Andrea, “Guest editorial: Can drones deliver?” IEEE Transactions on Automation Science and Engineering, vol. 11, no. 3, 2014.
[5] E. Olson, “AprilTag: A robust and flexible visual fiducial system,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA). IEEE, May 2011, pp. 3400–3407.
[6] A. Sankalp, D. Geetesh, J. Sezal, M. Daniel, A. Greg, Y. Song, and S. Sebastian, “Autonomous semantic exploration using unmanned aerial vehicles,” IEEE IROS 2016 Workshop, 2016.
[7] SFL Technologies, “E-Mobility electric powered vehicle ELI,” http://www.sfl-technologies.com/spektrum/e-mobility/, accessed: 2017-03-14.
2 ELI roll-out demo - Mein Bezirk - http://bit.ly/2hott8t
3 ELI roll-out demo - Kleine Zeitung - http://bit.ly/2it6N2P