Gesture Controlled Drone: Ian Walter and Monette Khadr
Abstract—The aim of this project is to develop a system that uses hand gestures to control the flight of a drone. In this system, the drone's absolute position is not monitored or recorded. Instead, the drone is told to move relative to its current position based on the detected motion of the user. In order to enable fully autonomous flight, an extended Kalman filter (EKF) based procedure is used to control and adjust all six degrees of freedom (DoF) of the drone. The EKF uses the readings of the accelerometer and gyroscope sensors pre-mounted on the drone as well as a supplementary optical flow sensor and a time-of-flight (ToF) sensor. The estimator uses an extended aerodynamic model of the drone, where the sensor measurements are used to observe the full 3D airspeed. To detect the motion of the user, a near-field sensor measures the disturbance of an electric field caused by conductive objects, such as a finger. Finally, to combine these systems, code is developed on a RaspberryPi to facilitate communication from the sensor to the drone and to convert the input X, Y, Z sensor values into values compatible with the drone system.
I. INTRODUCTION

Drones, also known as unmanned aerial vehicles (UAVs), are being heavily deployed in a wide range of commercial and recreational applications [1]. Drones can basically be viewed as special flying robots that perform multiple functions, such as capturing data and sensing their environment. There are two broad types of drones: fixed-wing and multirotor. In this work, the open-source Crazyflie 2.0 is used, which is a quadcopter (i.e., a drone with four rotors) [2]. Most commercially available drones come with designated controllers or software applications running on the user's hand-held device. In both cases, commands with detailed movement information are sent over wireless channels, via either Wi-Fi or Bluetooth. The Crazyflie weighs only 27 grams and is 9.2 cm in length and width. Communication with the Crazyflie is either via Bluetooth or through the Crazyradio, a long-range open USB radio dongle based on the nRF24LU1+ from Nordic Semiconductor.

This work presents an attempt to add new control dimensions by allowing more degrees of freedom (DoF) to control the drone. Instead of using predesignated buttons, users move their fingers, and these motions are translated by the sensor into digital commands. For autonomous operation, state estimation is a fundamental requirement for these vehicles. Inspired by the work done in [3], this work adopts a quadrocopter state estimation strategy that uses range measurements to localize the quadrocopter. Closed-loop control of the quadrocopter is developed using a 2-D positioning sensor and time-of-flight (ToF) sensor measurements, fused with a dynamic model of the quadrocopter and with measurements from on-board accelerometers and rate gyroscopes. The state estimator, controller, and trajectory generator all run on-board the Crazyflie's microcontroller. The goal is to fuse all sensor information arriving at variable rates, and for this purpose an extended Kalman filter (EKF) is considered. An advantage of using a Kalman filter is that it allows simple modifications to support sensor fusion by including additional measurement equations in the update step [4].

The contribution of this work is twofold: (1) demonstrating that closed loop control of the drone is necessary for agile and controllable flight maneuvers, and (2) creating a framework that translates hand movements into drone trajectories. The remainder of the paper is organized as follows: the system model is presented in Section II, with details on the utilized sensors. Section III discusses the drone's inner state estimation problem and the proposed EKF. The system analysis is presented in Section IV, and finally the paper concludes in Section V.

II. SYSTEM MODEL

The system model of the proposed gesture controlled drone system is depicted in Fig. 1. The system consists mainly of the hand movement sensor, the drone, and the RaspberryPi, where all the processing and timing control is performed. The mathematical models of the fundamental components of the system, i.e., the sensor and the drone, are discussed in the forthcoming subsections.

Fig. 1: System model detailing the main functional components of the sensor to drone system (Start Program, Quantify Movement, Translate X, Y, Z into Roll, Yaw, Pitch, Send Control to Crazyflie, driven by a sampling clock and the EKF).

A. Sensor

The Skywriter HAT near-field sensor operates by detecting fluctuations in a self-generated electric field caused by the introduction of conductive objects such as fingers, as can be seen in Fig. 2 [5]. From the Skywriter datasheet, the sensor has a sampling frequency of 200 Hz, a spatial resolution of 150 dpi, and a detection range of 0 to 15 cm. However, from empirical testing, the detection range is consistently less than 3 cm in any direction from the center of the sensor.
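To make the reading of this sensor on the RaspberryPi concrete, the short sketch below collects (x, y, z) samples and averages them on demand. It is an illustrative sketch rather than the authors' code, and it assumes the Pimoroni skywriter Python library, whose skywriter.move() callback is expected to report normalized positions in [0, 1]; the buffer and take_average() helper are hypothetical names introduced here only for illustration.

import signal
import threading

import skywriter  # Pimoroni Skywriter HAT driver (assumed API)

_lock = threading.Lock()
_samples = []  # (x, y, z) tuples collected between drone actuations


@skywriter.move()
def on_move(x, y, z):
    """Called by the Skywriter driver whenever a new hand position is sensed."""
    with _lock:
        _samples.append((x, y, z))


def take_average():
    """Return the mean (x, y, z) of the accumulated samples and reset the buffer."""
    with _lock:
        if not _samples:
            return None
        n = len(_samples)
        avg = tuple(sum(axis) / n for axis in zip(*_samples))
        _samples.clear()
        return avg


if __name__ == "__main__":
    signal.pause()  # keep the process alive so the callbacks keep firing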
B. The Drone

The dynamic equations of the quadcopter are derived under the following hypotheses [6]:
• The quadcopter is a rigid body that cannot be deformed; thus it is possible to use the well-known dynamic equations of a rigid body, e.g., through the Newton-Euler approach.
• The quadcopter is perfectly symmetrical in its geometry, mass, and propulsion system. Hence, the inertia matrix about this symmetry is diagonal.
• The quadcopter is time-invariant and its mass is constant.

Fig. 3: Inertial frame and Body-fixed frame showing the Crazyflie.

The rotation matrix R from the body-fixed frame to the inertial frame (Fig. 3), parameterized by the roll φ, pitch θ, and yaw ψ Euler angles, is

R = \begin{bmatrix}
\cos\psi\cos\theta & \cos\psi\sin\theta\sin\phi - \sin\psi\cos\phi & \cos\psi\sin\theta\cos\phi + \sin\psi\sin\phi \\
\sin\psi\cos\theta & \sin\psi\sin\theta\sin\phi + \cos\psi\cos\phi & \sin\psi\sin\theta\cos\phi - \cos\psi\sin\phi \\
-\sin\theta & \cos\theta\sin\phi & \cos\theta\cos\phi
\end{bmatrix} \quad (1)
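As a quick, illustrative check of this convention (not part of the paper's implementation), Eq. (1) can be written out with numpy; since R is orthonormal, the inverse R^{-1} that appears later in Eq. (6) is simply the transpose.

import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Body-to-inertial rotation matrix of Eq. (1) for Z-Y-X Euler angles."""
    cph, sph = np.cos(roll), np.sin(roll)
    cth, sth = np.cos(pitch), np.sin(pitch)
    cps, sps = np.cos(yaw), np.sin(yaw)
    return np.array([
        [cps * cth, cps * sth * sph - sps * cph, cps * sth * cph + sps * sph],
        [sps * cth, sps * sth * sph + cps * cph, sps * sth * cph - cps * sph],
        [-sth,      cth * sph,                   cth * cph],
    ])

R = rotation_matrix(0.1, -0.2, 0.3)
assert np.allclose(R @ R.T, np.eye(3))  # orthonormal, so R^{-1} = R^T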
C. Additional Sensors

In order to allow stable drone flight, we added an expansion board that carries two additional sensors: the VL53L0x time-of-flight (ToF) sensor and the PMW3901 optical flow sensor. The ToF sensor is a laser-ranging sensor that measures the distance of the drone from the ground; it contains a 940 nm invisible laser that can measure distances up to 4 m at a maximum rate of 50 Hz. The optical flow sensor, on the other hand, uses a low-resolution camera to measure movements in the x and y coordinates relative to the ground. It requires a lens that enables far-field tracking, and it captures 121 frames per second while reporting measurements at a rate of 100 Hz. The term optical flow refers to a flow of two-dimensional images in which certain features, such as patterns or pixel intensities, are tracked in time. The PMW3901 requires an SPI interface, while the VL53L0x requires I2C connectivity; the expansion PCB handles these connectivity constraints and allows direct communication with the drone.

Fig. 5: The VL53L0x and PMW3901 sensors integrated within the same PCB mounted on the bottom of the Crazyflie.

... with η_gyro assumed to be zero-mean additive white Gaussian noise (AWGN). The accelerometer measurements are also assumed to be corrupted by AWGN, η_acc, and can be expressed as

z_{acc} = R^{-1}(\ddot{x} - g) + \eta_{acc} = \frac{1}{m}(f + f_a) + \eta_{acc} \quad (6)

The gyroscope measurements can be used directly as an estimate of the drone's angular velocity, while the accelerometer readings can be used to estimate the airspeed force f_a. However, as can be observed from Eqs. (5) and (6), these readings are noisy. As a result, we incorporated the optical flow sensor and the ToF sensor, detailed in Section II.C, in order to improve the estimation reliability. Even though the drone dynamics are highly non-linear, insight can be gained from analysing the linearised system constituted by the error dynamics, which makes using an EKF a viable solution.

[Block-diagram figure: the 6 DoF IMU (500 Hz, I2C), the flow sensor (SPI), and the laser sensor (I2C) feed, through an accumulator (100 Hz), the extended Kalman filter, which outputs the position, velocity, and orientation.]
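To illustrate the modularity noted in [4], the generic sketch below (not the on-board firmware) shows that each sensor in the diagram above only has to supply a measurement model h(x), its Jacobian H, and a noise covariance; fusing an additional sensor then amounts to calling one more update step at that sensor's own rate.

import numpy as np

def ekf_update(x, P, z, h, H, R_meas):
    """One EKF measurement update for a single sensor.

    x      : state estimate, shape (n,)
    P      : state covariance, shape (n, n)
    z      : measurement, shape (m,)
    h      : callable returning the predicted measurement h(x), shape (m,)
    H      : Jacobian of h evaluated at x, shape (m, n)
    R_meas : measurement noise covariance, shape (m, m)
    """
    y = z - h(x)                            # innovation
    S = H @ P @ H.T + R_meas                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Example (hypothetical state layout): with x = [px, py, pz, vx, vy, vz], a ToF
# height reading uses h = lambda x: x[2:3] and H = [[0, 0, 1, 0, 0, 0]].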
Fig. 7: Curves showing the non-averaged readings of the Skywriter sensor in the x, y, and z directions for three different actions, each repeated three times and marked as instances on the time axis (measured sensor value versus time in seconds; panel (c) shows the down motion).

In order to overcome the impact of the noise, a modification to the code is made: the samples collected between drone actuations are averaged, which results in a minimal impact from noise but also in a lower spatial resolution, as can be seen throughout Fig. 8. Measurements are accumulated at a maximum frequency of 10 kHz, as can be seen in Fig. 9. These measurements are accumulated for at most 0.05 seconds before being taken in and averaged by the main code loop. Based on these values, there should be at most 500 samples between each loop iteration. However, due to the Crazyflie's motion application program interface (API) implementation, there is additional time during which the program is stalled before the drone begins moving. Although this additional time varies with the distance values fed into the function, the typical additional time is significantly less than 0.05 seconds, while the maximum theoretical time would

Fig. 8: Curves showing the averaged readings of the Skywriter sensor in the x, y, and z directions for three different actions, each repeated three times and marked as instances on the time axis (measured sensor value versus time in seconds; panel (c) shows the down motion).

Fig. 9: Segment of code responsible for interfacing with the Skywriter API and accumulating measured positions from the user.
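To show how the averaged values could drive the drone, the sketch below maps an averaged (x, y, z) reading to a relative motion command. It is illustrative rather than the code shown in Fig. 9; it assumes the cflib Python API (SyncCrazyflie and MotionCommander with its blocking move_distance call), and the skywriter_buffer module and scaling constants are hypothetical names introduced only for this example.

import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander

from skywriter_buffer import take_average  # hypothetical helper from the earlier sketch

URI = 'radio://0/80/2M'   # example Crazyradio URI, adjust to the actual setup
SCALE = 0.2               # metres of travel per unit of averaged sensor deflection
LOOP_PERIOD = 0.05        # seconds between drone actuations

def to_relative_motion(avg):
    """Map an averaged (x, y, z) reading in [0, 1] to a relative displacement in metres."""
    x, y, z = avg
    # Hypothetical mapping: the centre of the sensor (0.5) means "do not move" on that axis.
    return ((x - 0.5) * SCALE, (y - 0.5) * SCALE, (z - 0.5) * SCALE)

if __name__ == '__main__':
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie()) as scf:
        with MotionCommander(scf) as mc:       # takes off on enter, lands on exit
            for _ in range(100):
                avg = take_average()           # averaged Skywriter sample
                if avg is not None:
                    dx, dy, dz = to_relative_motion(avg)
                    mc.move_distance(dx, dy, dz)  # blocking relative move
                time.sleep(LOOP_PERIOD)

A blocking call such as move_distance does not return until the commanded displacement has been flown, which is consistent with the stall time discussed above.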
REFERENCES

[1] Kathiravan Natarajan, Truong-Huy D. Nguyen, and Mutlu Mete. Hand Gesture Controlled Drones: An Open Source Library. CoRR, abs/1803.10344, 2018.
[2] Crazyflie 2.0. https://store.bitcraze.io/products/crazyflie-2-0.
[3] Mark W. Mueller, Michael Hamer, and Raffaello D'Andrea. Fusing ultra-wideband range measurements with accelerometers and rate gyroscopes for quadrocopter state estimation. In 2015 IEEE International Conference on Robotics and Automation (ICRA), pages 1730-1736, May 2015.
[4] Marcus Greiff. Modelling and Control of the Crazyflie Quadrotor for Aggressive and Autonomous Flight by Optical Flow Driven State Estimation. Master's thesis, 2017.
[5] MGC3130 data sheet. Microchip Technology Inc.
[6] Carlos Luis and Jerome Le Ny. Design of a Trajectory Tracking Controller for a Nanoquadcopter. CoRR, abs/1608.05786, 2016.