
DEVELOPMENT OF AN OBSTACLE

AVOIDANCE QUADCOPTER

PROJECT REPORT

SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE AWARD


OF THE DEGREE OF

BACHELOR OF TECHNOLOGY
Mechanical Engineering

SUBMITTED BY

Aryaman Sharma - 20203034


Aditya Jain - 20203010
Ayush Kumar - 20203043
Shishir Acharya - 20133018

Mechanical Engineering Department

MOTILAL NEHRU NATIONAL INSTITUTE OF TECHNOLOGY ALLAHABAD

PRAYAGRAJ - 211004, INDIA

May 2024

Candidate's Declaration

We hereby certify that the work presented in the project report entitled
“Development of an Obstacle Avoidance Quadcopter”, in partial fulfillment of the requirements
for the award of the degree of Bachelor of Technology in Mechanical Engineering at MOTILAL
NEHRU NATIONAL INSTITUTE OF TECHNOLOGY ALLAHABAD, is an authentic record
of our work carried out from August 2023 to May 2024 under the supervision of
Prof. Mukul Shukla. The matter embodied in this report has not been submitted to any other
University or Institute for the award of any degree.

Signature of the Students

Aryaman Sharma – 20203034

Aditya Jain – 20203010

Ayush Kumar – 20203043

Shishir Acharya – 20133018

This is to certify that the above statement made by the candidates is correct to the best of my

knowledge.

Date: Signature of Supervisor

Place: Prof. Mukul Shukla

Acknowledgments

We extend our sincere thanks to MNNIT Allahabad for the essential resources and support
provided for the successful execution of this study. The Mechanical Workshop at MNNIT
Allahabad deserves special recognition for offering the necessary manufacturing facilities,
forming the foundation for our analysis and modeling.

Our gratitude extends to Prof. Mukul Shukla, our mentor and guide, whose expertise and
insights played a pivotal role in shaping the direction of this study, ensuring its rigor and quality.
Throughout the research process, Dr. Shukla's constant support and encouragement served as a
motivating force.

Special appreciation is also reserved for Dr. J. C. Mohanta for his invaluable assistance in
providing access to the historical dataset of solar radiation readings at the MNNIT campus,
significantly enhancing the quality of our analysis.

Furthermore, we express our thanks to the members of our panel for their unwavering
perseverance, ongoing inspiration, and consistent oversight, all of which contributed to the
success of this work. Professor K. N. Pandey, Head of the Department, deserves
acknowledgement for facilitating the necessary facilities.

We would like to express our heartfelt gratitude to Robotics Club MNNIT and AeroClub
MNNIT for their wonderful support. Both clubs provided the necessary components and a
conducive, productive workspace. Their outstanding help was critical to the effective completion
of our project.

In our attempt to acknowledge all contributors to this research, we acknowledge that inadvertent
omissions may occur, and we apologize for any oversights. Our deep appreciation goes to
everyone involved in this study, as each has played a crucial role in its successful completion.
Aryaman Sharma – 20203034

Aditya Jain – 20203010

Ayush Kumar – 20203043

Shishir Acharya – 20133018

Abstract
Recent technological breakthroughs in autonomous systems have had a significant influence on
several sectors, one notable application being obstacle-avoidance drone systems. This report
describes the design and implementation of a drone-based obstacle avoidance system,
emphasizing its importance in enhancing mobility and safety for people with physical
limitations. The major goal of the obstacle avoidance drone project is to enable accurate and
dynamic movement in flight, ensuring the drone avoids objects as it navigates through airspace.
The obstacle avoidance algorithm is built on the integration of ultrasonic sensors, inspired by
improvements in mobile robot systems and autonomous quadcopters. This work seeks to address
not just technical issues related to drone navigation, but also to contribute to the larger objective
of improving the quality of life for people with physical limitations. The idea draws on a variety
of sources, including a mobile robot system designed to assist the physically impaired, equipped
with ultrasonic range finders for obstacle identification and mapping. The report includes a
thorough examination of ultrasonic sensors, assessing their performance and addressing their
limits. Based on this, we offer a simple yet effective technique for object recognition and
collision avoidance in an autonomous flying quadcopter. The study, conducted on a self-built
quadcopter, demonstrates the overall viability of the technique. The vehicle's movement is
directed by real-time data gathered from the attached sensors, allowing the system to detect
obstacles and determine collision-free courses independently. Particular attention is given to the
sensor-controller interface and its significance in controlling the movement of the robot. The
project's multidisciplinary character, combining robotics, sensor technology, and aircraft
engineering, places it at the forefront of technological innovation, suggesting a future in which
obstacle avoidance drones redefine aerial navigation norms, boosting safety and accessibility in
a variety of contexts.

Table of Contents

Candidate's Declaration

Acknowledgments

Abstract

List of Figures

Chapter 1 Introduction

1.1 Objective

1.2 Methodology

1.3 Motivation

Chapter 2 Literature Review

Chapter 3 Project Work

3.1 Functioning and Implementation of Key Components

3.2 System Design

3.3 Circuit Design

3.4 Circuit Description

3.5 UR Code Overview

3.6 Integration of APM with Arduino

3.7 Path Planning Using MAVProxy

Chapter 4 Results and Discussion

4.1 Simulation & Testing

4.2 Hardware Integration & Implementation

Chapter 5 Conclusions

References

List of Figures

Fig. 3.1.1 Pin diagram of a UR sensor

Fig. 3.1.1 (a) Echolocation and reflection; (b) Timing pulse diagram

Fig. 3.1.2 Mission Planner

Fig. 3.2 Block diagram of the entire assembly

Fig. 3.2.2 (a) Arduino Uno; (b) Pin diagram of an ATmega328

Fig. 3.2.3 ArduPilot Mega (APM 2.8)

Fig. 3.2.4 Electronic speed controller

Fig. 3.2.5 BLDC motor

Fig. 3.4 Circuit diagram of the UR sensor array assembly

Fig. 3.4.1 (a) Pin connections; (b) Pin connections

Fig. 3.4.2 Pin diagram (alert algorithm)

Fig. 3.4.3 Pin diagram (power distribution)

Fig. 3.5.1 Code flowchart

Fig. 3.7 Flight path simulation

Fig. 4.1 (a) Arduino IDE; (b) Obstacle detection response

Fig. 4.2 (a) Array stand design in Solid Edge; (b) 3D-printed array stand; (c) Drone assembly

Chapter 1

Introduction

1.1 Objective

The integration of unmanned aerial vehicles (UAVs), or drones, has emerged as a pivotal frontier
in the rapidly evolving landscape of robotics and automation. Drones, also known as flying
robots, are capable of performing a wide range of tasks autonomously or via remote control,
having a significant impact on industries such as imaging, transportation, and geographic
information systems. As technological advancements propel the development of these aerial
systems, the implementation of obstacle avoidance mechanisms is a critical aspect that requires
attention. This project is motivated by the need to improve the safety and efficiency of drone
operations by implementing obstacle avoidance systems. The goal is to create a drone
automation solution that incorporates ultrasonic sensors, based on previous work in the field.

The significance of drone automation lies in its ability to perform specific tasks without constant
human supervision, particularly in environments where human intervention may be difficult or
impractical. Because the project focuses on the integration of ultrasonic sensors for obstacle
avoidance, it is consistent with the larger goal of developing intelligent and autonomous robotic
systems capable of navigating complex and dynamic environments. Building on this foundation,
our project aims to add to the existing body of knowledge by proposing an obstacle avoidance
system for drones that uses ultrasonic sensors. The emphasis is on local information processing,
to improve real-time decision-making and reduce the potential delays associated with round-trip
communication between the drone and a central server. The evaluation of sensor detection
performance is an important aspect of our work, because it ensures the dependability and
effectiveness of the proposed obstacle avoidance solution.

1.2 Methodology

Drone automation is the development and deployment of systems that enable unmanned aerial
vehicles (UAVs) to operate autonomously or semi-autonomously. Drone automation methods
vary with the application and the complexity of the tasks the drone is intended to perform.
In this project we examine drone automation approaches with an emphasis on quadcopters.

1. Obstacle Detection Technologies:


● Ultrasonic Sensors: Critical for drone obstacle detection, these sensors use ultrasonic
waves to calculate obstacle distances based on wave return time.
● Camera-Based Systems: Drones with cameras leverage computer vision algorithms for
obstacle detection, object recognition, and precise navigation.
● Light Detection and Ranging (Lidar) and Radar Systems: Utilizing laser or radio waves,
Lidar offers detailed 3D mapping, while radar excels in various weather conditions, both
contributing to effective drone automation.
2. Navigation and Control Systems:
● GPS (Global Positioning System): Vital for drone navigation, GPS provides real-time
position data, enabling precise waypoint navigation and mission planning.
● Inertial Measurement Unit (IMU): Combining accelerometers and gyroscopes, IMUs
provide orientation and velocity information for stable flight and control.
● Flight Controllers: Specialized systems interpret sensor data to maintain drone stability
and execute desired maneuvers, ensuring effective control.
3. Autonomous Navigation Algorithms:
● Path Planning Algorithms: Determine optimal drone paths, employing techniques like
Rapidly Exploring Random Trees (RRT) and Genetic Algorithms.
● Obstacle Avoidance Algorithms: Employ proximity-based and machine learning
approaches for real-time decisions, enhancing obstacle avoidance capabilities.
● Integration into Quadcopters: Quadcopters with four rotors in a square configuration
integrate obstacle detection technologies seamlessly. The flight controller interprets
sensor data, adjusting motor speeds for stable flight and obstacle avoidance. Rigorous
testing and calibration optimize system performance.

In summary, drone automation approaches comprise sensors, guidance and control systems,
communication protocols, and complex algorithms. When these approaches are applied to
quadcopters, they offer accurate, autonomous flying with obstacle-avoiding capabilities, making
quadcopters adaptable instruments for a wide range of uses, from aerial photography to search
and rescue operations.

1.3 Motivation

Our project is motivated by the need to improve the capabilities of unmanned aerial vehicles
(UAVs) by developing a unique obstacle and collision avoidance system. Our proposed system
stands out as a cost-effective and efficient option by incorporating a variety of low-cost sensors,
including infrared and ultrasonic technology. In contrast to existing methodologies, our strategy
prioritizes implementation simplicity, minimizing mathematical intricacy, lowering
computational load, and cutting total development and maintenance expenses.

The desire to improve autonomous flight capabilities while assuring strong collision avoidance is
one of the key motivations driving our study. Our method reduces collision risks by carefully
managing the drone's distance from nearby objects, particularly walls and humans,
demonstrating its promise for real-world applications. The ultrasonic sensors provide accurate
distance measurements that allow the drone to identify and maneuver around obstacles in its
flight path. This not only improves the safety and efficiency of UAV operations, but also
establishes our project as an important addition to the expanding landscape of unmanned aerial
vehicles.

We understand the inherent benefits of ultrasonic sensors, such as their affordability and
lightweight design, but we also recognize their drawbacks, such as their limited range and
vulnerability to interference from outside sound sources. These challenges highlight how
difficult it is to create drones that can avoid obstacles and collisions. Our project also acts as
a springboard, solving present issues and laying the groundwork for upcoming developments in
UAV technology. We see tremendous potential to further refine obstacle and collision avoidance
systems as technology advances, making UAVs more dependable and adaptable for a wide range
of applications.

Chapter 2

Literature Review

Research on obstacle and collision avoidance for drones reveals a lively field of study,
driven by the increasing prevalence of drones and the need to improve their operating safety.
Diverse strategies have been investigated, demonstrating a variety of approaches to reducing
collision hazards during drone operations. Sensor-based and vision-based solutions emerge as
common threads, using devices such as infrared and ultrasonic sensors for real-time obstacle
identification. Furthermore, machine learning and map-based approaches offer intriguing
opportunities for future growth.

Arne Devos, Emad Ebeid, and Poramate Manoonpong presented work that used simulation to
demonstrate the preliminary performance of an adaptable obstacle avoidance control system. The
study created a drone model and controller in C++ using the V-REP simulation environment,
establishing the framework for successful collision avoidance methods [1]. Jawad N. Yasin and
Sherif A. S. Mohamed conducted a thorough analysis of the literature on collision avoidance
systems and tactics used in unmanned vehicles [2]. Their paper provides an excellent overview
of the current state of collision prevention research.

Meng Guanglei and Pan Haibing suggested a method for auxiliary obstacle assistance based on
ultrasonic sensors. This method calculates distances between a quad-rotor drone and obstacles
from the return time of the acoustic echo. Based on this information, the flight controller
orchestrates controlled, slow-motion maneuvers to avoid detected impediments [3]. A
collision-free indoor navigation algorithm for teleoperated multirotor Unmanned Aerial Vehicles
(UAVs) was contributed by Marcin Odelga, Paolo Stegagno, and Heinrich H. Bülthoff. The
algorithm actively tracks and responds to obstacles in the robot's immediate surroundings using
an RGB-D camera and a Bin-Occupancy filter.

While this research demonstrates advancements in obstacle and collision avoidance, difficulties
remain. Accurate location tracking, sensor range restrictions, and susceptibility to false positives
all remain opportunities for development. The incorporation of ultrasonic sensors, as
demonstrated by Meng Guanglei and Pan Haibing's work, shows potential for improved drone
safety and efficiency. As the field progresses, exploration of hybrid techniques and refinement of
sensor technologies will be critical for creating secure and dependable obstacle and collision
prevention systems for drones.

Chapter 3

Project Work

3.1 Functioning and Implementation of Key Components

3.1.1 UR sensor

Ultrasonic sensors, referred to in this report as UR sensors, play an important role in modern
object detection systems. Operating on the principle of emitting and receiving ultrasonic waves,
they are critical in scenarios requiring precise distance measurement and effective obstacle
avoidance.

UR sensors begin their operation by producing ultrasonic waves with frequencies ranging from
20 to 200 kHz. This frequency spectrum keeps the sound waves inaudible to the human ear. The
sensor's emitter component is in charge of producing these waves, which then propagate through
the surrounding medium, typically air. The frequency chosen must balance factors such as
wavelength, power consumption, and the ability to penetrate obstacles.

The Ultrasonic Sensor [HC-SR04], a popular model known for its dependability, plays an
important role in object detection systems.

Operational Characteristics:

The HC-SR04 emits high-frequency sound waves at 40 kHz when triggered by a pulse input
from the microcontroller. Its TTL operation ensures a seamless interface with the
microcontroller, making it an adaptable choice for a wide range of electronic systems.
The module has a ranging accuracy of up to 3 mm, making it a dependable solution for object
detection and obstacle avoidance.

The HC-SR04 exposes four connections: 5 V supply, ground, trigger pulse input, and echo pulse
output. These connections facilitate integration into electronic systems by providing a simple
interface.

Fig. 3.1.1 Pin Diagram of a UR sensor

Echolocation and Reflection

An ultrasonic sensor employs echolocation: it emits high-frequency sound waves and interprets
the echoes produced when these waves strike obstacles. A burst of high-frequency sound waves
is emitted by the sensor and travels through the surrounding medium, typically air. Because the
speed of sound in the medium is known, the sensor can precisely calculate distances based on
the time it takes for the waves to travel. When the emitted sound waves strike an object, they
are reflected back to the sensor.

Fig.3.1.1 (a) Echolocation and reflection

The ultrasonic sensor includes a receiver component that detects the echoes, or reflected waves,
and is designed to measure the time it takes for the emitted waves to travel to and from the
object. To start a measurement, the HC-SR04's Trig pin is held high for at least 10 µs, after
which a sonic burst of 8 pulses at 40 kHz is transmitted.

Fig.3.1.1 (b) Timing pulse diagram

The signal then hits the surface and returns to be captured by the HC-SR04's receiver via the
Echo pin. The Echo pin is set high when the burst is transmitted and goes low when the echo is
received, so the width of the Echo pulse equals the round-trip travel time. The sensor can
therefore calculate the distance to the object by precisely measuring the time elapsed between
the emission of the sound wave and the reception of its echo. The following formula is used in
this calculation:

distance (cm) = duration (µs) × 0.0343 / 2

where 0.0343 cm/µs is the speed of sound in air and the division by 2 accounts for the round
trip. The distance data obtained from the time-of-flight measurement is interpreted by the
sensor's internal electronics.
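As an illustrative sketch of this conversion (plain Python rather than the report's Arduino code):

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # speed of sound in air, cm/µs

def echo_to_distance_cm(echo_duration_us: float) -> float:
    """Convert an HC-SR04 echo pulse width (µs) to a distance (cm).

    The pulse width covers the round trip, so divide by 2.
    """
    return echo_duration_us * SPEED_OF_SOUND_CM_PER_US / 2

# An echo pulse of 1000 µs corresponds to roughly 17.15 cm.
print(echo_to_distance_cm(1000))
```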

3.1.2 Mission Planner and MAVProxy

Mission Planner is a ground control station software for planning, executing, and monitoring
quadcopter flight paths using ArduPilot firmware. It provides a graphical interface for users to
interact with their drones, rovers, boats, and other robotic platforms, enabling both beginners and
advanced users to manage their vehicles effectively.

Connect to your quadcopter via USB or telemetry (e.g., 915MHz radio), then perform sensor
calibration (accelerometer, compass) and configure flight modes and fail-safes. In the "Flight
Plan" tab, define waypoints and mission commands (e.g., takeoff, waypoint navigation, RTL),
then upload the mission to the flight controller. Arm the quadcopter, set the mode to "Auto" to
initiate the mission, and use real-time telemetry for monitoring parameters like GPS coordinates,
altitude, and battery status, allowing for in-flight adjustments. Mission Planner facilitates precise
and autonomous quadcopter missions.

Fig 3.1.2 Mission Planner

MAVProxy is a command-line ground control station (GCS) software that communicates with
autonomous vehicles via the MAVLink protocol. It provides real-time telemetry, mission
planning, and vehicle control by routing MAVLink messages.

When used with Mission Planner, MAVProxy serves as an intermediary, forwarding MAVLink
telemetry data from the vehicle to Mission Planner. To use them together, connect to the vehicle
with MAVProxy and configure it to output telemetry data to Mission Planner. Mission Planner
can then be connected using the same UDP endpoint, allowing it to receive real-time data and
send commands for mission planning and vehicle control, utilizing MAVProxy's data routing
capabilities for comprehensive management.

3.2 System Design

A high-level view of the quadcopter drone assembly reveals a basic power flow that starts with
the battery and proceeds through the voltage regulator to the APM flight controller and the
Electronic Speed Controllers.

The APM, Arduino, and UR sensors communicate constantly, coordinating the drone's flight
path based on real-time obstacle detection data. This collaborative interaction provides precise
navigation and obstacle avoidance throughout the drone's operation.

Fig.3.2 Block Diagram of the entire assembly.

Component Overview:

3.2.1 UR sensor

As previously stated, we are employing the Ultrasonic Sensor [HC-SR04], a widely used model
that is well known for providing accurate measurements and readings and is widely available on
the market.

3.2.2 Arduino uno

The Arduino Uno is a microcontroller board based on the ATmega328P. Six of its fourteen
digital input/output pins support PWM operation, and there are six analog inputs in total.
Among the key components are a reset button, a power jack, an ICSP header, a USB connector,
and a 16 MHz quartz crystal.

The Arduino Uno's central processor is the ATMEL ATmega328P, a member of the megaAVR
family. The flight control system communicates continuously with the Arduino unit, which
collects data from the UR sensors and sends trigger commands to them.

Fig.3.2.2(a) Arduino Uno

Fig.3.2.2(b) Pin diagram of an ATMega328

Key Parameters:

● Microcontroller: ATmega328P

● Digital I/O Pins: 14 (6 PWM)

● Analog Input Pins: 6

● Clock Speed: 16 MHz

● USB Connection: Yes

● Power Jack: Yes

● ICSP Header: Yes

● Reset Button: Yes

3.2.3 Ardupilot Mega

ArduPilot Mega (APM) is an open-source autopilot system used for controlling a variety of
robotic vehicles, including drones, rovers, and boats. It offers autonomous navigation and vehicle
control capabilities through a combination of hardware and software components.

Fig 3.2.3 ArduPilot Mega (APM 2.8)

Key parameters:
● Microcontroller: ATmega2560

● Digital I/O Pins: 54 (15 PWM)

● Analog Input Pins: 16

● Clock Speed: 16 MHz

● USB Connection: Yes

● Power Jack: Yes

● ICSP Header: Yes

● Reset Button: Yes

● Sensors:

○ Accelerometer

○ Gyroscope

○ Magnetometer

○ Barometer

3.2.4 ESC

The Electronic Speed Controller, or ESC, is a critical component of drones that regulates the
speed of each motor. It interprets flight controller signals and adjusts motor speeds to control the
drone's movement. To power the motors, ESCs convert direct current (DC) power from the
battery to three-phase alternating current (AC), ensuring accurate and responsive control during
flight maneuvers. They play an important role in stabilizing the quadcopter and directly affect its
overall performance, making them essential for smooth and responsive drone operation.

Fig.3.2.4 Electronic speed controller

3.2.5 BLDC Motor

BLDC (brushless DC) motors provide high dependability and efficiency in quadcopters.
Electronic commutation, as an alternative to traditional brushes, provides precise control,
minimal maintenance, and an ideal power-to-weight ratio, all of which are critical for responsive
and nimble drone flight. Each motor provides a maximum thrust of 840 g, with a KV rating of
935 RPM/V.

Fig.3.2.5 BLDC motor

3.3 Circuit Design

The circuit design incorporates two critical circuits: the Ultrasonic Sensor Array Circuit and the
Drone Maneuvering Circuit, both of which play critical roles in obstacle avoidance and precise
drone maneuvering. The seamless integration of these circuits serves as the quadcopter's
backbone, balancing safety measures with precise operational control.

24
3.3.1 Ultrasonic Sensor Array Circuit:

The Ultrasonic Sensor Array Circuit, which is designed to perceive the drone's surroundings, is
at the heart of obstacle avoidance. At its core is the Arduino Uno, a microcontroller that
orchestrates data from six ultrasonic sensors for strategic drone positioning. The ultrasonic
sensors act as the quadcopter's eyes, continuously measuring distances to potential obstacles.

When the Arduino Uno receives distance readings, it uses a proximity algorithm to determine
how close obstacles are. If an obstacle is detected within the threshold distance, the circuit
activates a response mechanism: it sounds an alarm, warning operators of potential hazards, and
lights an LED bulb. This section of the circuit design enables real-time communication and
responsiveness, laying the groundwork for an efficient obstacle avoidance system.

3.3.2 Drone Maneuvering Circuit:

The Drone Maneuvering Circuit, a network of components dedicated to controlling the
quadcopter's pitch, yaw, and roll, works in tandem with the obstacle avoidance system. At its
heart is the APM flight controller, an autopilot system that governs the quadcopter's movements
with precision.

The flight controller, serving as the quadcopter's brain, receives sensor input and converts it into
commands for the Electronic Speed Controllers (ESCs). Each ESC controls the current flow to
its motor, adjusting the speed of the four rotors located at the drone's four corners. Propellers
connected to these motors generate the lift required for aerial navigation. The interaction of the
flight controller, ESCs, motors, and propellers creates a control system that allows the
quadcopter to maintain stability, execute smooth maneuvers, and respond quickly to external
factors.

Both circuits are integrated through meticulous planning and precise connectivity. With its
ultrasonic sensor array, the obstacle avoidance system integrates seamlessly with the drone
maneuvering circuit, resulting in a comprehensive system that prioritizes both safety and control.
The drone maneuvering circuit ensures that every movement is executed with finesse as the
ultrasonic sensors diligently guide the drone through its surroundings, promising a flight
experience that is both secure and agile.

3.4 Circuit Description

The current flow in the circuit is orchestrated to ensure optimal functionality. The APM flight
controller, which serves as the power source for the UR sensor array circuit, supplies energy to
the Arduino, allowing for efficient power distribution. This power is then circulated on the
breadboard, which serves as a convenient hub for simplified component connections. The
breadboard acts as a central node, allowing energy to be transferred seamlessly to the various
circuit components. The power is then transferred to critical components such as the ultrasonic
sensors, LED bulbs, and sound alarms, allowing them to operate precisely. This systematic
current flow not only improves the circuit's reliability but also ensures that each component
receives the power it requires for seamless integration and cooperative operation within the
quadcopter system.

Fig. 3.4 Circuit diagram of the UR sensor array assembly

3.4.1 The pin connections for the ultrasonic sensors

Fig. 3.4.1(a) pin connections

a. UR sensor 1 trig pin to digital pin 13
b. UR sensor 1 echo pin to digital pin 12
c. UR sensor 2 trig pin to digital pin 11
d. UR sensor 2 echo pin to digital pin 10
e. UR sensor 3 trig pin to digital pin 9
f. UR sensor 3 echo pin to digital pin 8
g. UR sensor 4 trig pin to digital pin 7
h. UR sensor 4 echo pin to digital pin 6
i. UR sensor 5 trig pin to digital pin 5
j. UR sensor 5 echo pin to digital pin 4
k. UR sensor 6 trig pin to digital pin 3
l. UR sensor 6 echo pin to digital pin 2

Fig. 3.4.1(b) pin connections

Because the trigger and echo signals are simple high/low pulses for all of the ultrasonic sensors,
we did not need to distinguish between PWM pins and ordinary digital pins here. There are also
too few analog pins to accommodate all six sensors, and no reason to run the sensors on analog
input/output, so all connections are made on the digital pins. One connection is made for each
ultrasonic sensor's trigger pin and one for each echo pin, providing for both sending and
receiving signals.
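The wiring listed above (Fig. 3.4.1) can be captured in a small table; the following is a hypothetical Python mapping for reference, with pin numbers taken from the list:

```python
# Sensor number -> (trigger pin, echo pin), as wired in Fig. 3.4.1.
SENSOR_PINS = {
    1: (13, 12),
    2: (11, 10),
    3: (9, 8),
    4: (7, 6),
    5: (5, 4),
    6: (3, 2),
}

# Sanity check: six sensors, every digital pin 2-13 used exactly once.
used = [p for pins in SENSOR_PINS.values() for p in pins]
assert sorted(used) == list(range(2, 14))
```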

3.4.2 The alert algorithm circuit

a. The LED positive pin is connected to analog pin A3

b. The sound alarm positive pin is connected to analog pin A4

Fig. 3.4.2 pin diagram (alert algorithm)

The analog pins are connected to the LED and sound alarm pins, so we can even control the
intensity of their operation. This makes more precise alert schemes possible, for example:

● Suppose we add a dimming mechanism to the LED alert. If the object is not very close
but is inside the alert zone, we can keep the LED intensity at 100/255 and increase it as
the distance decreases, up to the maximum value of 255.

● We can change the frequency at which the alarm sounds based on the distance between
the obstacle and the edge of the alert zone.
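The first scheme above can be sketched as follows (illustrative Python; the 100 cm alert zone is an assumption, chosen to match the 100 cm threshold used in the code's algorithm):

```python
def led_intensity(distance_cm: float, alert_zone_cm: float = 100.0) -> int:
    """Scale LED brightness (0-255) with obstacle proximity.

    Outside the alert zone the LED is off; at the zone edge it starts
    at 100/255 and rises linearly to 255 as the obstacle closes in.
    """
    if distance_cm >= alert_zone_cm:
        return 0
    return round(100 + (255 - 100) * (1 - distance_cm / alert_zone_cm))

print(led_intensity(150))  # outside the zone: 0
print(led_intensity(50))   # halfway into the zone: 178
print(led_intensity(0))    # touching: 255
```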

3.4.3 The power distribution circuit

a. From the flight controller's power output to the Vin pin on the Arduino

b. The 5V pin from the Arduino to the breadboard's positive shorted pins

c. The GND pin from the Arduino to the breadboard's negative shorted pins

Fig. 3.4.3 pin diagram (power distribution)

The power is passed from the flight controller to the Arduino, and from there to the breadboard's
positive shorted pins, giving us access to 60 positive connection pins. Similarly, the GND pin of
the Arduino is connected to the breadboard's negative shorted pins, giving access to 60 pins for
the ground terminal. The breadboard simplifies the connections for all of the other components
by increasing the number of pins available.

3.5 UR Code Overview

The code for the UR sensor array circuit is summarized by the flowchart and algorithm below.
3.5.1 The Flowchart

Fig.3.5.1 Code Flowchart

3.5.2 The Algorithm

a. Initialization:
   i. Define the trigger and echo pins for each ultrasonic sensor as constants (trigPin1 to trigPin6 and echoPin1 to echoPin6).
   ii. Declare variables (duration, distance, firstSensor, secondSensor, ..., sixthSensor) for storing sensor readings.
   iii. Set up the Arduino pins for triggers and echoes using pinMode.
   iv. Initialize Serial communication for debugging purposes.

b. Main Setup:
   i. Implement the setup() function.
   ii. Set trigger pins as OUTPUT and echo pins as INPUT using pinMode.
   iii. Begin Serial communication at a baud rate of 9600.

c. Main Loop (void loop()):
   i. For each ultrasonic sensor (from 1 to 6):
      1. Call the sonarSensor() function with the corresponding trigger and echo pins.
      2. Store the obtained distance in the variables firstSensor to sixthSensor.
      3. Print the sensor readings on the Serial monitor.

d. Obstacle Detection and Response:
   i. For each sensor (from 1 to 6):
      1. If the distance measured by the sensor is less than 100 cm:
         - Activate an LED (connected to pin A3) using analogWrite to set its brightness to the maximum (255).
         - Activate a second LED (connected to pin A4) in the same way.
      2. Else, turn off both LEDs by setting their brightness to 0.

e. sonarSensor() Function:
   i. Define the sonarSensor function with parameters trigPin and echoPin.
   ii. Inside the function:
      1. Send a short pulse to the trigger pin to initiate ultrasonic signal transmission.
      2. Measure the time it takes for the signal to bounce back using pulseIn on the echo pin.
      3. Calculate the distance using the formula (duration * 0.0343) / 2.
      4. Store the distance in the global variable distance.

Note: Ensure the LEDs (A3 and A4) are connected to the appropriate pins on the Arduino board.

End Algorithm

This algorithm outlines the steps the code takes to read distance values from six ultrasonic sensors, display them on the Serial monitor, and activate LEDs based on obstacle proximity. The sonarSensor() function encapsulates the distance-measurement logic, keeping the code structure modular.
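The distance computation in step e follows the standard HC-SR04 timing model: sound travels at roughly 343 m/s (0.0343 cm/µs) and the echo covers the out-and-back path, hence the division by two. Together with the 100 cm threshold of step d, it can be checked with a small Python model (pin handling such as pinMode, pulseIn, and analogWrite is omitted, since it exists only on the Arduino; the echo durations below are simulated values):

```python
def sonar_distance_cm(duration_us):
    """Convert an HC-SR04 echo pulse width (microseconds) to centimetres.

    Same formula as step e of the algorithm: (duration * 0.0343) / 2,
    halved because the pulse covers the out-and-back path.
    """
    return (duration_us * 0.0343) / 2

def led_outputs(distances_cm, threshold_cm=100.0):
    """Step d: both LEDs at full brightness (255) if any sensor reports an
    obstacle closer than the threshold, otherwise off (0)."""
    on = any(d < threshold_cm for d in distances_cm)
    return (255, 255) if on else (0, 0)

# Six simulated echo durations in microseconds, one per sensor
readings = [sonar_distance_cm(us) for us in (2000, 11662, 8000, 15000, 3000, 9000)]
print([round(d, 1) for d in readings])  # first sensor sees ~34.3 cm
print(led_outputs(readings))            # -> (255, 255): obstacle inside 100 cm
```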

3.6 Integration of APM with Arduino (Drone Reaction)

Here, the input provided by the sensors is used to maneuver the drone accordingly, with the help of the ArduPilot Mega flight controller.
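The reaction logic itself lives in the flight-controller integration code. As a hedged illustration only, its core decision — identify the triggered sensor and command a velocity away from it — might look like the following Python sketch; the sensor-to-direction mapping, the 100 cm trigger distance, and the 0.5 m/s evasion speed are assumptions for illustration, not the report's exact values:

```python
# Hypothetical mapping from the six UR sensors to the unit vector
# (forward, right) pointing AWAY from the obstacle each one faces.
EVADE_DIRECTION = {
    0: (-1, 0),   # front sensor -> back off
    1: (1, 0),    # rear sensor -> move forward
    2: (0, -1),   # right sensor -> slide left
    3: (0, 1),    # left sensor -> slide right
    4: (-1, -1),  # front-right diagonal -> back and left
    5: (-1, 1),   # front-left diagonal -> back and right
}

def evasive_velocity(distances_cm, trigger_cm=100.0, speed=0.5):
    """Return a (forward, right) velocity command in m/s, steering away
    from the closest sensor that reports an obstacle inside trigger_cm;
    (0.0, 0.0) means no evasive action is needed."""
    triggered = [(d, i) for i, d in enumerate(distances_cm) if d < trigger_cm]
    if not triggered:
        return (0.0, 0.0)
    _, closest = min(triggered)           # react to the nearest obstacle
    fwd, right = EVADE_DIRECTION[closest]
    return (fwd * speed, right * speed)

print(evasive_velocity([80, 200, 200, 200, 200, 200]))  # front blocked -> back off
print(evasive_velocity([200] * 6))                      # all clear -> no action
```

In the real system this velocity would be translated into motor-speed adjustments by the flight controller's attitude and position loops, rather than applied directly.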

3.7 Path Planning using MAV proxy

Here, Mission Planner was used to simulate the flight path defined by the loaded Python script, with the help of MAVProxy. The flight path is a simple square.

● Connect to the vehicle via MAVProxy using a command like mavproxy.py --master=udp:127.0.0.1:14550.
● MAVProxy routes MAVLink messages between the vehicle and Mission Planner.
● Use the output add command in MAVProxy to forward telemetry to Mission Planner (output add 127.0.0.1:14550).
● Launch Mission Planner and connect to the vehicle using the same telemetry settings (udp://127.0.0.1:14550).
● Mission Planner receives real-time data and provides a graphical interface for mission planning and vehicle control.
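The square path itself reduces to four GPS waypoints around the takeoff point. The following Python sketch shows the offset arithmetic such a mission script could use (the home coordinates and 20 m side length are illustrative, and the actual script would send these waypoints through MAVLink rather than print them); it relies on the common flat-earth approximation of about 111,320 m per degree of latitude:

```python
import math

def square_waypoints(home_lat, home_lon, side_m):
    """Return the four corner (lat, lon) pairs of a square of side side_m
    with its first corner at home, using a flat-earth approximation."""
    dlat = side_m / 111320.0                                       # metres -> degrees of latitude
    dlon = side_m / (111320.0 * math.cos(math.radians(home_lat)))  # longitude degrees shrink with latitude
    return [
        (home_lat,        home_lon),         # corner 1 (home)
        (home_lat + dlat, home_lon),         # corner 2: north
        (home_lat + dlat, home_lon + dlon),  # corner 3: north-east
        (home_lat,        home_lon + dlon),  # corner 4: east
    ]

# Illustrative home position and a 20 m square
for lat, lon in square_waypoints(25.4920, 81.8639, 20.0):
    print(f"{lat:.6f}, {lon:.6f}")
```

The flat-earth approximation is accurate to well under a centimetre over a 20 m square, which is ample for waypoint navigation at this scale.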

Fig. 3.7 Flight path simulation

Chapter 4

Results and Discussion

4.1 Simulation & Testing

4.1.1 UR sensor simulation

The code was written entirely in the Arduino IDE. The main functions used were analogWrite, digitalWrite, pulseIn, pinMode, setup(), and loop().

Fig. 4.1(a) Arduino IDE

The sonar sensor function was written inside the code to calculate the distances obtained
from the ultrasonic sensor.

Before uploading the code to the hardware, it was written and uploaded to Tinkercad
simulation software. This allowed us to ensure that all connections were safe and secure and
did not pose a danger to either the components or the environment. The simulation of the UR
sensor replicates a crucial situation in which the distance measurement falls below the 150
cm threshold, leading to the activation of an alarm.

Fig. 4.1(b) Obstacle detection response

As the simulation ran, we observed that the LED lit up and the alarm sounded whenever an object moved closer than 150 cm to any of the ultrasonic sensors, giving an immediate warning when the environment becomes hazardous for the drone. All of the components continued to operate efficiently and without issues, so we conclude that the simulation ran smoothly.

4.1.2 Mission Planner Simulation

We successfully simulated a precise flight path for our drone using Mission Planner. By
programming the path with a Python script developed in PyCharm IDE, we utilized
MAVProxy to load and execute the script within Mission Planner. This integration enabled
the drone to accurately follow the designated trajectory. With this accomplishment, we are
now confident in our ability to define and implement complex path geometries for our drone,
ensuring precise and reliable autonomous navigation.

4.2 Hardware Integration & Implementation

We replicated the Tinkercad simulation in hardware by building the full circuit, using the Arduino Uno microcontroller to drive the UR sensor array, receive its data, and interpret it according to the Arduino code. In the testing stage, we imitated obstacles the drone may encounter and cross-checked the obstacle detection range in terms of both distance and coverage area.

Fig. 4.2(a) Array stand design in Solid Edge

The entire UR sensor array structure is supported by a custom stand, designed in Solid Edge and 3D printed on an Ultimaker 2+. The stand is printed in PLA Pro+ (2.85 mm filament, density 1.22 g/cm³), and its strength is sufficient to support the weight of the entire structure.

Fig. 4.2(b) 3D printed array stand


The quadcopter frame used in the assembly is the S500 2015 edition model, weighing around 0.5 kg. The frame is made from glass fiber and polyamide nylon, which makes it tough and durable.

Fig. 4.2(c) Drone assembly

The UR array stand is attached to the quadcopter frame with metal screws at the hinges provided on the frame.

As discussed above, the drone uses a 2200 mAh LiPo battery with a 12 V output, connected to the Pixhawk flight controller via a voltage stabilizer. The basic power flow has already been described earlier.

Upon successful hardware integration, our focus shifted to implementing the obstacle detection system. The detection array effectively identifies obstacles within the specified observation range. Data from the ultrasonic range (UR) sensors is transmitted to the ArduPilot flight controller for real-time adjustment of flight parameters; the drone's motor speeds then adapt accordingly, enabling automated evasion and avoidance of obstacles.

While detection accuracy remains suboptimal, ongoing modifications are aimed at enhancing this

aspect. Our subsequent endeavors will involve integrating path planning software directly with

the flight controller to facilitate controlled flight simulations with predefined trajectories.

Chapter 5

Conclusions and Future Work

5.1 Conclusion

The test results confirm the system's operational competence, demonstrating its ability to identify obstacles within the specified range. The proposed system combines cost-effective sensors based on several technologies, such as infrared and ultrasonic, to provide a more efficient and inexpensive alternative to conventional approaches. Compared with existing techniques, the presented methodology stands out for its ease of application, decreased mathematical complexity, reduced computing burden, and lower development and maintenance expenses.

The integration of ultrasonic sensors into UAVs has the potential to transform safety and efficiency in a variety of applications. These sensors, known for their accuracy in measuring distances, improve obstacle identification and avoidance during drone navigation. Despite their low weight and low cost, ultrasonic sensors have drawbacks, most notably their limited range and sensitivity to external acoustic interference.

With the hardware components and their implementation successfully evaluated, attention now

shifts to integrating path planning with the obstacle detection array. While the current array

design, though not perfect, exhibits adequate detection capabilities, future studies will explore

design features for potential improvements. Simulations conducted on Mission Planner provide a

foundation for conducting complex path planning simulations. The successful integration of

these two components will result in a fully automated flight path planning system for the drone,

including obstacle avoidance. Our primary focus will be on increasing detection accuracy and

refining flight parameters, marking a significant step towards enhanced autonomous drone

operation.

References:

[1] Rahman, M. F., Wisnu, Sasongko, R. A. Obstacle's Contour Detection using Ultrasonic Sensors. Advance in Aerospace Science and Technology in Indonesia, Vol.

[2] Borenstein, J. and Koren, Y. Obstacle Avoidance with Ultrasonic Sensors, IEEE Journal of Robotics and Automation, 1988, pp. 213-218.

[3] Rawikara, S. S. Design and Simulation of an Obstacle Avoidance System for Waypoint Following Missions on Unmanned Aircraft (Desain dan Simulasi Sistem Obstacle Avoidance untuk Misi Waypoint Following pada Pesawat Udara Nirawak), Final Project, Aeronautics and Astronautics Study Program, FTMD ITB, 2013.

[4] Sasongko, R. A., Sembiring, J., Muhammad, H. Design of Obstacle Avoidance Algorithm for Waypoint Following Control System, Regional Conference in Mechanical and Aerospace Technology, Manila, Philippines, 2011.

[5] Sasongko, R. A., Rawikara, S. S., Tampubolon, H. J. UAV Obstacle Avoidance Algorithm Based on Ellipsoid Geometry, Journal of Intelligent and Robotic Systems, 2017.

[6] Morgan, E. J. HC-SR04 Ultrasonic Sensor, Nov. 16, 2014.

[7] Pounds, Modelling and Control of a Quad-Rotor Robot.

[8] Roberts, Quadrotor Using Minimal Sensing for Autonomous Indoor Flight, EMAV 2007.

[9] Sfeir, J., Saad, M. An Improved Artificial Potential Field Approach to Real-Time Mobile Robot Path Planning in an Unknown Environment, 2011.

[10] Khatib, O. "Real-time obstacle avoidance for manipulators and mobile robots", International Journal of Robotic Research, vol. 5, no. 1, 1986, pp. 90-98.

[11] Khosla, P., Volpe, R. "Superquadric artificial potentials for obstacle avoidance and approach", Proceedings of the IEEE International Conference on Robotics and Automation, Philadelphia, PA, 1988.
