Package Delivery Experiments with a Camera Drone
Jesús Pestana1 and Michael Maurer1 and Daniel Muschick2 and Devesh Adlakha1
and Horst Bischof1 and Friedrich Fraundorfer1
Abstract—The ongoing efforts to integrate robotics into
logistics systems are affecting the production workflow at
all stages, from the transportation and handling of parts
inside storage and production facilities to the final product
distribution. In this paper we address the problem of delivering
a package by means of a multirotor drone. We describe a fully
autonomous package delivery flight demonstration prepared in
collaboration with an industrial partner. All computations are
performed in real time on board the drone. A gimbal camera
is used for the vision-based localization, by means of
fiducial markers, of the delivery position and of the landing
platform on a pickup truck. The demonstration consists of
the fully autonomous execution of the following tasks: the
drone takes off from the truck, looks for the delivery position,
proceeds to land and drop the package, flies back to the
distribution truck and follows it, and the flight finishes with
a landing on the static vehicle. The experiments
focus on the performance of the vision-based truck following.
I. INTRODUCTION
In this paper we present a fully autonomous drone that,
using only on-board processing, is able to perform coarse
navigation using GPS, precise vision-based vehicle following,
and landing on static platforms (see Figs. 1 & 3). We used
our system to perform a fully autonomous package deliv-
ery flight demonstration in collaboration with an industrial
partner. The main technical challenges of this work
are the navigation control, the real-time vision-based pose
estimation of the vehicle and of the landing positions, and
their integration with the navigation control. To obtain the
localization precision required for the vehicle following and
landing tasks, we use visual fiducial markers.
Drones are a topic of active, ongoing research. These
aerial platforms are well suited for integration into
logistics systems, for instance for the transportation of
goods. Package delivery by means of an autonomous drone
can significantly reduce distribution costs: a succinct
feasibility analysis by D’Andrea [4] estimated the operating
cost at 10 cents for a 2 kg payload and a 10 km range.
The main challenges faced by real-world drone package
delivery are highlighted by the following selection of recent
research works: an obstacle mapping method that encodes,
at the cell level, the occupancy value and its variance [1];
testing modern deep-learning-based object detection algorithms
on board drones [6]; trajectory planning for navigation
in cluttered environments [3]; and landing on vehicles moving
on straight roads at speeds of up to 40 km/h [2].
1Institute for Computer Graphics and Vision, ICG - TU Graz
{pestana,maurer,bischof,fraundorfer}@icg.tugraz.at
2Institute of Automation and Control, Graz University of Technology
daniel.muschick@bioenergy2020.eu
Fig. 1. Illustration of an autonomous vision-based controlled drone landing
on a marked delivery position in order to deliver a package.
Fig. 2. DJI M100 quadrotor equipped with an Nvidia Jetson TK1 on-board
computer (DJI Manifold), autopilot, GPS module, DJI Zenmuse X3 gimbal
camera (1280×720 px), and an electromagnet to carry a 100 g package.
II. SYSTEM OVERVIEW
a) Hardware Setup
Our drone is equipped as shown in Fig. 2. For the exper-
imental tests, the E-Mobility electric powered pickup truck
“ELI” from SFL Technologies [7] was used, see Fig. 3. The
delivery position and the landing platform are each tagged with a
39×39 cm AprilTag fiducial marker of the 36h11 family [5]. With
this approach, the relative pose of the gimbal camera with respect
to the landing platform can be estimated at a distance of 3.5 m
with an accuracy of around 3 cm.
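The paper does not detail the detection pipeline beyond its use of
the AprilTag library [5]; the following is a minimal sketch of such
a marker-to-camera pose estimate, assuming the pupil-apriltags
Python bindings and placeholder camera intrinsics (the Zenmuse X3
calibration values are not given here).

```python
# Minimal sketch of AprilTag-based relative pose estimation.
# Assumes the pupil-apriltags bindings; the intrinsics below are
# illustrative placeholders, not the Zenmuse X3 calibration.
import cv2
from pupil_apriltags import Detector

TAG_SIZE_M = 0.39                                # 39x39 cm marker, as in the paper
CAMERA_PARAMS = (1000.0, 1000.0, 640.0, 360.0)   # fx, fy, cx, cy (placeholders)

detector = Detector(families="tag36h11")

def marker_pose(frame_bgr):
    """Return (R, t) of the marker in the camera frame, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    detections = detector.detect(
        gray,
        estimate_tag_pose=True,
        camera_params=CAMERA_PARAMS,
        tag_size=TAG_SIZE_M,
    )
    if not detections:
        return None
    d = detections[0]          # assume a single marker per platform
    return d.pose_R, d.pose_t  # 3x3 rotation, 3x1 translation in metres
```

The returned rotation and translation express the marker in the
camera frame; chaining them with the gimbal and drone state yields
the relative pose needed for following and landing.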
b) Software Setup
The inter-module communication is achieved by means of
the Robot Operating System (ROS). Since our experimental
results focus on the car following performance, only the main
modules related to this task are explained: the gimbal-camera
landing-platform tracking, the vehicle speed estimation, and
the control algorithm.
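The paper names the vehicle speed estimation module without
describing its internals. As one plausible realization, the sketch
below differentiates consecutive marker positions and smooths the
result with a first-order low-pass filter; the class name, frame
convention, and filter gain are illustrative assumptions.

```python
# Hypothetical sketch of a vehicle speed estimator: finite
# differences of consecutive marker positions, smoothed with a
# first-order low-pass filter. Not the paper's implementation.
import numpy as np

class SpeedEstimator:
    def __init__(self, alpha=0.3):
        self.alpha = alpha          # low-pass gain, illustrative value
        self.prev_pos = None
        self.prev_t = None
        self.velocity = np.zeros(3)

    def update(self, position_m, stamp_s):
        """Feed a marker position (world frame, metres) and timestamp;
        return the current filtered velocity estimate in m/s."""
        position_m = np.asarray(position_m, dtype=float)
        if self.prev_pos is not None:
            dt = stamp_s - self.prev_t
            if dt > 0.0:
                raw = (position_m - self.prev_pos) / dt
                self.velocity = (1.0 - self.alpha) * self.velocity \
                                + self.alpha * raw
        self.prev_pos = position_m
        self.prev_t = stamp_s
        return self.velocity
```

In a ROS setup such an estimator would typically run in the marker
tracking node, fed by the stamped poses published by the detector.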