Gesture Controlled Drone

Ian Walter and Monette Khadr


University at Albany - State University of New York
Email: [email protected] and [email protected]

Abstract—The aim of this project is to develop a system that uses hand gestures to control the flight of a drone. In this system, the drone's absolute position is not monitored or recorded; instead, the drone is commanded to move relative to its current position based on the detected motion of the user. To enable fully autonomous flight, an extended Kalman filter (EKF) based procedure is used to control and adjust all six degrees of freedom (DoF) of the drone. The EKF uses the readings of the accelerometer and gyroscope pre-mounted on the drone, as well as a supplementary optical flow sensor and a time-of-flight (ToF) sensor. The estimator uses an extended aerodynamic model of the drone, in which the sensor measurements are used to observe the full 3D airspeed. To detect the motion of the user, a near-field sensor measures the disturbance of an electric field caused by conductive objects such as a finger. Finally, to combine these systems, code is developed on a Raspberry Pi to facilitate communication from the sensor to the drone and to convert the input X, Y, Z sensor values into values compatible with the drone system.

I. INTRODUCTION

Drones, also known as unmanned aerial vehicles (UAVs), are being heavily deployed in a wide range of commercial and recreational applications [1]. Drones can be viewed as special flying robots that perform multiple functions, such as capturing data and sensing their environment. There are two broad types of drones: fixed-wing and multirotor. In this work, the open-source Crazyflie 2.0 is used, which is a quadcopter (i.e., a drone with four rotors) [2]. Most commercially available drones come with dedicated controllers or software applications running on the user's handheld device. In both cases, commands with detailed movement information are sent over wireless channels, via Wi-Fi or Bluetooth. The Crazyflie weighs only 27 grams and is 9.2 cm in length and width. Communication with the Crazyflie is either via Bluetooth or via the Crazyradio, a long-range open USB radio dongle based on the nRF24LU1+ from Nordic Semiconductor.

This work presents an attempt to add new control dimensions by allowing more degrees of freedom (DoF) in controlling the drone. Instead of using predesignated buttons, users move their fingers, and these movements are translated by the sensor into digital commands. For autonomous operation, state estimation is a fundamental requirement for these vehicles. This work adopts a quadrocopter state estimation strategy that uses range measurements to localize the quadrocopter, inspired by the work done in [3]. Closed-loop control of the quadrocopter is developed using a 2-D positioning sensor and time-of-flight (ToF) sensor measurements, fused with a dynamic model of the quadrocopter and with measurements from the on-board accelerometer and rate gyroscopes. The state estimator, controller, and trajectory generator all run on board the Crazyflie's microcontroller. The goal is to fuse sensor information arriving at variable rates, and for this purpose an extended Kalman filter (EKF) is considered. An advantage of a Kalman filter is that it can be extended for sensor fusion simply by including additional measurement equations in the update step [4].

The contribution of this work is twofold: (1) demonstrating that closed-loop control of the drone is necessary for agile and controllable drone flight maneuvers, and (2) creating a framework that translates hand movements into drone trajectories. The remainder of the paper is organized as follows: the system model is presented in Section II, with details on the sensors used. Section III discusses the drone's inner state estimation problem and the proposed EKF. System analysis is presented in Section IV, and the paper concludes in Section V.

II. SYSTEM MODEL

The system model of the proposed gesture-controlled drone system is depicted in Fig. 1. The system consists mainly of the hand-movement sensor, the drone, and the Raspberry Pi, where all the processing and timing control is performed. The mathematical models of the fundamental components of the system, i.e., the sensor and the drone, are discussed in the following subsections.

Fig. 1: System model detailing the main functional components of the sensor-to-drone system: start program, quantify movement, translate X, Y, Z into roll, yaw, pitch, and send control to the Crazyflie, with a sampling clock and the EKF in the loop.
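As a rough illustration of the "translate X, Y, Z into roll, yaw, pitch" block of Fig. 1, the following Python sketch maps a quantified (X, Y, Z) sensor displacement into a clamped relative motion step. The gain and limit constants, and the mapping itself, are illustrative assumptions rather than the parameters of the implemented system.

    # Hypothetical translation block of Fig. 1: sensor deltas -> motion step.
    GAIN = 2.0        # sensor units to metres (assumed)
    MAX_STEP_M = 0.3  # clamp each relative move to 30 cm (assumed)

    def clamp(value, limit):
        return max(-limit, min(limit, value))

    def translate(dx, dy, dz):
        """Map sensor deltas to forward/left/up steps for the drone.

        Forward/backward steps are realised by the drone as pitch and
        lateral steps as roll, consistent with the labels in Fig. 1.
        """
        return (clamp(GAIN * dx, MAX_STEP_M),   # forward (pitch)
                clamp(GAIN * dy, MAX_STEP_M),   # left (roll)
                clamp(GAIN * dz, MAX_STEP_M))   # up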
A. Sensor

The Skywriter HAT near-field sensor operates by detecting fluctuations in a self-generated electric field caused by the introduction of conductive objects such as fingers, as shown in Fig. 2 [5]. According to the Skywriter datasheet, the sensor has a sampling frequency of 200 Hz, a spatial resolution of 150 dpi, and a detection range of 0 to 15 cm. However, empirical testing shows that the detection range is consistently less than 3 cm in any direction from the center of the device.
Therefore, the dynamic model for the movement of the user's finger is based on the following assumptions:

• The range of detection is 2 cm from the center of the device in any direction.
• The device operates consistently at 200 Hz.
• The movement of the user's finger is slower than 200 Hz × 4 cm = 8 m/s.

Fig. 2: Diagram describing how a conductive material interfaces with the Skywriter HAT.

Due to the complexity of communicating with the Crazyflie drone, the rate at which data can be sent is much lower than the rate at which the sensor samples. To deal with this, and to reduce the effect of noise in the sensor, the Raspberry Pi sends processed data at a rate of approximately 10 Hz. The sensor outputs information directly in X, Y, Z coordinates, so the instantaneous velocity of the user's movement can be modeled as

$v_x = \frac{\Delta d_x}{t}, \qquad v_y = \frac{\Delta d_y}{t}, \qquad v_z = \frac{\Delta d_z}{t}$

where $\Delta d$ in each direction is the most recent X, Y, Z measurement minus the value used for the previous transmission to the drone.
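A minimal Python sketch of this velocity computation is shown below. It assumes the Pimoroni skywriter library, whose move() event decorator reports normalized (x, y, z) positions, and uses the roughly 10 Hz transmission period stated above; the variable names are our own.

    import time
    import skywriter  # Pimoroni Skywriter HAT library (assumed available)

    SEND_PERIOD = 0.1            # ~10 Hz transmissions to the drone
    latest = [0.0, 0.0, 0.0]     # most recent X, Y, Z sensor reading
    last_sent = [0.0, 0.0, 0.0]  # X, Y, Z used for the previous transmission

    @skywriter.move()
    def on_move(x, y, z):
        # Called by the library at the sensor rate (up to 200 Hz).
        latest[0], latest[1], latest[2] = x, y, z

    while True:
        time.sleep(SEND_PERIOD)
        t = SEND_PERIOD
        # v = (most recent measurement - previously transmitted value) / t
        vx, vy, vz = [(latest[i] - last_sent[i]) / t for i in range(3)]
        last_sent[:] = latest
        # vx, vy, vz are then quantified and sent to the drone (Fig. 1).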

Fig. 3: Inertial frame and body-fixed frame of the Crazyflie.

Fig. 4: Hardware setup with the Skywriter sensor connected directly to the GPIO pins of the Raspberry Pi (Crazyflie 2.0, Raspberry Pi, Skywriter HAT, and Crazyradio PA).

B. The Drone

The dynamic equations of the quadcopter are derived under the following hypotheses [6]:

• The quadcopter is a rigid body that cannot be deformed; thus the well-known dynamic equations of a rigid body, e.g., the Newton-Euler approach, can be used.
• The quadcopter is perfectly symmetrical in its geometry, mass, and propulsion system; hence the inertia matrix about this symmetry is diagonal.
• The quadcopter is time-invariant and its mass is constant.

According to the Newton-Euler equations:

$\begin{bmatrix} F_b \\ \tau \end{bmatrix} = \begin{bmatrix} m & 0 \\ 0 & I \end{bmatrix} \begin{bmatrix} a \\ \alpha \end{bmatrix} + \begin{bmatrix} \omega \times m v \\ \omega \times I \omega \end{bmatrix} \qquad (2)$

where $F_b$ is the total body force, $\tau$ is the total torque, $m$ is the mass, $I$ is the moment of inertia, $a$ is the linear acceleration, $\alpha$ is the angular acceleration, $\omega$ is the angular velocity, and $v$ is the linear velocity. $F_b$ is the summation of the forces caused by the rotation of the rotors along the z-axis, such that $F_b = [0\ 0\ f]^T$, with superscript $T$ denoting the transpose operation. Thus, $f = \sum_{i=1}^{4} f_i$, given that $f_i = c_T w_i^2$, where $c_T$ is the proportionality constant. Similarly, the torque can be expressed as $\tau_i = \pm c_Q w_i^2$, and a relationship between propeller speeds and the generated thrusts and moments, due to body symmetry, can be defined as

$\begin{bmatrix} f \\ \tau_1 \\ \tau_2 \\ \tau_3 \end{bmatrix} = \begin{bmatrix} c_T & c_T & c_T & c_T \\ 0 & d\,c_T & 0 & -d\,c_T \\ -d\,c_T & 0 & d\,c_T & 0 \\ -c_Q & c_Q & -c_Q & c_Q \end{bmatrix} \begin{bmatrix} w_1^2 \\ w_2^2 \\ w_3^2 \\ w_4^2 \end{bmatrix} \qquad (3)$

with $d$ denoting the distance from the quadcopter's center to each rotor. The idea is to translate the Newton-Euler equations into the body frame by defining a rigid transformation matrix from the inertial frame to the body-fixed frame. In this case $F_e = R F_b - mg$, where $F_e$ is the total inertial force, $R$ is the transformation matrix given in Eq. (1), and $g$ is the gravity. Practically, due to air dynamics, the force generated by a propeller translating with respect to the free stream will typically differ significantly from the static thrust force $f$; this deviation is a function of the quadrocopter's relative airspeed.

$R = \begin{bmatrix} \cos\theta\cos\psi & \cos\theta\sin\psi & -\sin\theta \\ \sin\phi\sin\theta\cos\psi - \cos\phi\sin\psi & \sin\phi\sin\theta\sin\psi + \cos\phi\cos\psi & \sin\phi\cos\theta \\ \cos\phi\sin\theta\cos\psi + \sin\phi\sin\psi & \cos\phi\sin\theta\sin\psi - \sin\phi\cos\psi & \cos\phi\cos\theta \end{bmatrix} \qquad (1)$

with $\phi$, $\theta$, and $\psi$ denoting the roll, pitch, and yaw angles, respectively.
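To make Eqs. (1) and (3) concrete, the sketch below builds the rotation matrix and evaluates the thrust/torque mixing for a given set of motor speeds. The numeric values of c_T, c_Q, d, and the motor speeds are placeholders for illustration, not identified Crazyflie parameters.

    import numpy as np

    def rotation_matrix(phi, theta, psi):
        """Rotation matrix of Eq. (1) for roll, pitch, yaw (radians)."""
        cph, sph = np.cos(phi), np.sin(phi)
        cth, sth = np.cos(theta), np.sin(theta)
        cps, sps = np.cos(psi), np.sin(psi)
        return np.array([
            [cth * cps, cth * sps, -sth],
            [sph * sth * cps - cph * sps, sph * sth * sps + cph * cps, sph * cth],
            [cph * sth * cps + sph * sps, cph * sth * sps - sph * cps, cph * cth],
        ])

    # Placeholder propeller constants (illustrative only).
    c_T, c_Q, d = 1e-8, 1e-10, 0.046  # thrust coeff., torque coeff., arm length [m]

    MIXER = np.array([                 # matrix of Eq. (3)
        [c_T,       c_T,      c_T,      c_T],
        [0.0,       d * c_T,  0.0,     -d * c_T],
        [-d * c_T,  0.0,      d * c_T,  0.0],
        [-c_Q,      c_Q,     -c_Q,      c_Q],
    ])

    w = np.array([2.0e4, 2.1e4, 2.0e4, 2.1e4])  # motor speeds w_i (assumed)
    f, tau1, tau2, tau3 = MIXER @ w**2          # thrust and torques, Eq. (3)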

C. Additional Sensors

In order to allow stable drone flight, we added an expansion board that contains two additional sensors: the VL53L0x time-of-flight (ToF) sensor and the PMW3901 optical flow sensor. The ToF sensor is a laser-ranging sensor that measures the distance of the drone from the ground. It contains a 940 nm invisible laser that can measure distances of up to 4 m at a maximum rate of 50 Hz. The optical flow sensor, on the other hand, uses a low-resolution camera to measure movements in the x and y coordinates relative to the ground. This sensor requires a lens that enables its far-field tracking capability. It has a frame rate of 121 frames per second and an output rate of 100 Hz. The term optical flow refers to a flow of two-dimensional images in which certain features, such as patterns or pixel intensities, are tracked over time. The PMW3901 requires an SPI interface, while the VL53L0x requires I2C connectivity; the PCB handles these connectivity constraints and allows direct communication with the drone.

Fig. 5: The VL53L0x and PMW3901 sensors (with its lens) integrated on the same PCB, mounted on the bottom of the Crazyflie.
III. INNER STATE ESTIMATION


State estimation of the drone can be accomplished in many ways; however, a number of factors need to be taken into consideration when formulating real-time compliant and complexity-constrained algorithms. The challenge lies in fusing the information arriving from different sensors at variable rates, and for this purpose an EKF is considered. The Crazyflie has a pre-mounted inertial measurement unit (IMU) which includes a 3-axis gyroscope (MPU-9250), a 3-axis accelerometer (MPU-9250), a 3-axis magnetometer (MPU-9250), and a high-precision pressure sensor (LPS25H). For this work, we used the readings from the gyroscope and the accelerometer. The slowest rate at which the IMU sensor data is fetched is $f_s = 500$ Hz.

As previously mentioned, the airspeed component has an effect on the drone's flight. To account for the airspeed, a vector $f_a$ is introduced into the Newton-Euler equations to capture the aerodynamic effects acting on the body frame of the drone. Hence, the forces can be re-expressed as [3]

$m\ddot{x} = R(f + f_a) + mg \qquad (4)$

where $\ddot{x}$ is the second derivative of the drone's position in an inertial reference frame. The measurement of the gyroscope can be modelled as

$z_{\mathrm{gyro}} = \omega + \eta_{\mathrm{gyro}} \qquad (5)$

where $\eta_{\mathrm{gyro}}$ is assumed to be zero-mean additive white Gaussian noise (AWGN). The accelerometer measurements are also assumed to be corrupted by AWGN, $\eta_{\mathrm{acc}}$, and can be expressed as

$z_{\mathrm{acc}} = R^{-1}(\ddot{x} - g) + \eta_{\mathrm{acc}} = \frac{1}{m}(f + f_a) + \eta_{\mathrm{acc}} \qquad (6)$

The gyroscope measurements can be used directly as an estimate of the drone's angular velocity, while the accelerometer readings can be used to estimate the airspeed force $f_a$. However, as can be observed from Eqs. (5) and (6), these readings are noisy. As a result, we incorporated the optical flow and ToF sensors, detailed in Section II-C, in order to improve the estimation reliability. Even though the drone dynamics are highly non-linear, insight can be gained from analysing the linearised system constituted by the error dynamics, which makes an EKF a viable solution.

Fig. 6: EKF block diagram: the 6-DoF IMU (500 Hz) feeds an accumulator whose output (100 Hz) is fused with the flow sensor (SPI) and the laser sensor (I2C) by the extended Kalman filter, which outputs position, velocity, and orientation. The accumulator averages the last 5 IMU measurements, as the prediction loop is slower than the IMU loop; however, the IMU information is still required externally at a higher rate for body-rate control.

As the translational dynamics of the quadcopter are in essence a triple integrator, stability can only be guaranteed if the measurement equation contains information on the zeroth-order states, that is, translation and attitude. The IMU provides only second-order derivative information, which causes the EKF to diverge quickly, in a quadratic fashion, in the positional states. Using the first-order derivative information obtained from the optical flow and ToF sensors results in a slower divergence, appearing as a linear drift in the positional estimates. In addition, the estimator contains a reference rotation matrix, $\hat{R}$, in which all orientations are expressed. The estimator aims to find the stochastic state $\xi = (x, p, \delta)$, denoting the drone's position, velocity, and orientation, respectively. Figure 6 illustrates the sensors used and the EKF system block diagram. The IMU operates at a rate of 500 Hz, and the accumulator averages the samples so that its output rate is 100 Hz, matching that of the other two sensors. For the flow sensor, a driver is used to sample the accumulated pixel counts, rotate them into the body frame, and run digital signal processing, including filtering, on the measurements. Due to size and compactness limitations, the rate at which the driver runs is limited to the aforementioned 100 Hz.
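The following Python sketch illustrates the structure described above: a 100 Hz prediction step driven by averaged IMU samples, with measurement updates from a ranging sensor. It is a schematic of the fusion logic under simplifying assumptions (a single vertical channel, a linear constant-velocity model, and hand-picked noise covariances), not the Crazyflie firmware implementation.

    import numpy as np

    DT = 0.01  # 100 Hz prediction period (5 IMU samples at 500 Hz per step)

    class SimpleEkf:
        """Schematic filter over [height, vertical velocity]; ToF measures height."""
        def __init__(self):
            self.x = np.zeros(2)                         # state estimate
            self.P = np.eye(2)                           # state covariance
            self.F = np.array([[1.0, DT], [0.0, 1.0]])   # constant-velocity model
            self.Q = np.diag([1e-4, 1e-2])               # process noise (assumed)
            self.H = np.array([[1.0, 0.0]])              # ToF observes height
            self.R = np.array([[4e-4]])                  # ToF noise variance (assumed)

        def predict(self, accel_z):
            # The averaged accelerometer sample drives the velocity as an input.
            self.x = self.F @ self.x + np.array([0.0, DT * accel_z])
            self.P = self.F @ self.P @ self.F.T + self.Q

        def update_tof(self, z_range):
            y = z_range - self.H @ self.x               # innovation
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
            self.x = self.x + (K @ y).ravel()
            self.P = (np.eye(2) - K @ self.H) @ self.P

    ekf = SimpleEkf()
    imu_buffer = []                    # accumulator for the last 5 IMU samples

    def on_imu_sample(accel_z):        # called at 500 Hz
        imu_buffer.append(accel_z)
        if len(imu_buffer) == 5:       # 500 Hz -> 100 Hz, as in Fig. 6
            ekf.predict(np.mean(imu_buffer))
            imu_buffer.clear()

    def on_tof_sample(z_range):        # called at up to 50 Hz
        ekf.update_tof(z_range)

Because the sensors report at different rates, each measurement simply triggers its own update whenever it arrives, which is the Kalman-filter property the paper relies on for sensor fusion.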
IV. ANALYSIS

Exhaustive experiments were performed to verify the stability of the drone's flight and its responsiveness to precoded trajectories. With the aid of the closed-loop control, the drone was tested on linear, ramp-like, and circular motions. A figure-8 trajectory was also performed, and the drone repeatedly landed back at its point of origin. Because the drone works well and consistently once feedback through the optical sensor is introduced into the system, the focus of the analysis is the behaviour of the hand-motion sensor. Three primary issues were identified, each of which can be seen in Fig. 7, which displays the x, y, and z coordinates of the user's hand as it moves over the Skywriter sensor in one of the three axis directions. The detection of motion is indicated by changes in the x, y, and z lines; when a line is flat, no change is detected and the coordinates are repeated. Each sub-figure displays three repetitions of the same motion in one primary direction. The first of the three issues is that in each instance there should be only one axis with a major change: in Fig. 7(a), x should change from 0 to 1; in Fig. 7(b), y should go from 0 to 1; and in Fig. 7(c), z should go from 1 to 0. As can be seen in these graphs, rarely is there only one axis changing for each instance of motion, and in some cases the wrong axis shows much more movement than the axis that should be changing. Secondly, the sensor picks up radically different values even when the motions are as identical as a human can reproduce; in some cases, no change at all is detected, as in instance 1 in Fig. 7(b). Lastly, the motion that is picked up by the sensor is very noisy, as in instance 1 in Fig. 7(a).

Fig. 7: Non-averaged readings (measured sensor value vs. time) of the Skywriter sensor in the x, y, and z directions for three different actions, each repeated three times and marked as instances on the time axis: (a) forward motion, (b) right motion, (c) down motion.

In order to overcome the impact of the noise, a modification to the code was made: the samples collected between drone actuations are averaged, resulting in a minimal impact from noise but also in a lower spatial resolution, as can be seen throughout Fig. 8. Measurements are accumulated at a maximum frequency of 10 kHz, as can be seen in Fig. 9, and are accumulated for at most 0.05 seconds before being read and averaged by the main code loop. Based on these values, there should be at most 500 samples per loop. However, due to the Crazyflie's motion application program interface (API) implementation, there is additional time during which the program stalls before the drone begins moving. Although this additional time varies with the distance values fed into the function, the typical additional time is significantly less than 0.05 seconds, while the maximum theoretical time would be 0.5 seconds. Therefore, the typical number of samples accumulated per loop is well under 1000.

Fig. 9: Segment of code responsible for interfacing with the Skywriter API and accumulating measured positions from the user.
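Since the code in Fig. 9 is reproduced only as an image, a Python sketch consistent with the description above is given here: positions are polled and accumulated for at most 0.05 s, averaged, and the result handed to the Crazyflie motion API. The polling interval, the read_position() placeholder, and the use of cflib's MotionCommander are our assumptions.

    import time

    POLL_PERIOD = 1e-4   # 10 kHz maximum accumulation rate
    WINDOW = 0.05        # accumulate for at most 0.05 s per loop

    def read_position():
        """Placeholder for the Skywriter API read; returns (x, y, z)."""
        raise NotImplementedError

    def accumulate_window():
        samples = []
        deadline = time.time() + WINDOW
        while time.time() < deadline:
            samples.append(read_position())
            time.sleep(POLL_PERIOD)
        # Average each axis over the window to suppress sensor noise.
        n = len(samples)
        return tuple(sum(s[i] for s in samples) / n for i in range(3))

    # Main loop sketch: mc is assumed to be a started
    # cflib.positioning.motion_commander.MotionCommander instance.
    def control_loop(mc, scale=0.5):
        prev = accumulate_window()
        while True:
            cur = accumulate_window()
            dx, dy, dz = (scale * (c - p) for c, p in zip(cur, prev))
            # The motion API blocks here, adding the stall time noted above.
            mc.move_distance(dx, dy, dz)
            prev = cur

With WINDOW = 0.05 s and a 10 kHz poll rate, the buffer holds at most 500 samples per loop, matching the count derived above.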
Because the sensor is relatively cheap, there is no calibration that can be done to adjust the measurements to varying environments, and because the user cannot access the raw electrode measurements, there is little that can be done to improve its performance and sensitivity. With no dynamic calibration, not only does slightly changing the environment have an impact on the sensor's output, the sensor also struggles with sustained distortions of the surrounding electromagnetic emissions. This is exemplified when the user holds his or her hand in exactly the same position near the sensor: after around a second, the measurements output by the device vary wildly without any motion from the user. This makes slow or precise movements almost impossible to track. All of these issues could have been minimized or avoided altogether by using another sensor, such as an optical flow sensor similar to the one used by the drone for stabilization. Bitcraze sells such a sensor for $40 specifically for this kind of use. The comparable price and the potential increase in accuracy and performance make this kind of device a more suitable option for further studies.

Fig. 8: Averaged readings (measured sensor value vs. time) of the Skywriter sensor in the x, y, and z directions for three different actions, each repeated three times and marked as instances on the time axis: (a) forward motion, (b) right motion, (c) down motion.

V. CONCLUSION

In this work, a system that controls the motion of a drone based on the user's hand gestures was developed. The system consists of a Raspberry Pi, a near-field sensor for hand-motion detection, and a Crazyflie drone. In order to optimize the drone's flight and to provide streamlined operation, the drone's pre-mounted sensors, as well as additional sensors, were used together with an EKF. Future work includes using a different hand-motion sensor for a more seamless operation.

REFERENCES

[1] Kathiravan Natarajan, Truong-Huy D. Nguyen, and Mutlu Mete. Hand Gesture Controlled Drones: An Open Source Library. CoRR, abs/1803.10344, 2018.
[2] Crazyflie 2.0. https://store.bitcraze.io/products/crazyflie-2-0.
[3] Mark W. Mueller, Michael Hamer, and Raffaello D'Andrea. Fusing ultra-wideband range measurements with accelerometers and rate gyroscopes for quadrocopter state estimation. In 2015 IEEE International Conference on Robotics and Automation (ICRA), pages 1730-1736, May 2015.
[4] Marcus Greiff. Modelling and Control of the Crazyflie Quadrotor for Aggressive and Autonomous Flight by Optical Flow Driven State Estimation. Master's thesis, 2017.
[5] MGC3130 data sheet.
[6] Carlos Luis and Jerome Le Ny. Design of a Trajectory Tracking Controller for a Nanoquadcopter. CoRR, abs/1608.05786, 2016.
