
A
Mini Project Report
on
“Surveillance in Autonomous Drone Navigation”

SUBMITTED BY:

Vatsal Shukla (2208410100065)

SUBMITTED TO:
Dr. Anurag Sewak
(CSED)

Dr. Mainejar Yadav
(CSED)

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING


RAJKIYA ENGINEERING COLLEGE, SONBHADRA
Abstract

The goal is to build an autonomous system capable of navigating small-scale areas without the use of GPS technology. The project draws inspiration from recent advances in Semantic Segmentation and deep Reinforcement Learning, which have proven very successful for the autonomous navigation of cars; we aim to extend these techniques to drones. Navigation involves obstacle avoidance and a decision-based system that classifies the input camera feed into navigation commands. We are currently using ROS for interfacing.
Declaration

I hereby declare that the project titled “Surveillance in Autonomous Drone Navigation”, which is being submitted as the Mini Project of B.Tech. 5th Semester (Computer Science & Engineering) to the Department of Computer Science & Engineering, Rajkiya Engineering College, Sonbhadra, is an authentic record of my genuine work done under the guidance and mentorship of Dr. Anurag Sewak and Dr. Mainejar Yadav, Assistant Professors, Computer Science & Engineering Department, Rajkiya Engineering College, Sonbhadra. I have not plagiarized the contents of this report, and I have not submitted this work for the award of any other degree/certificate.

Date:
Place:

Vatsal Shukla
(Name of the Student)
Certificate

This is to certify that the Mini Project titled “Surveillance in Autonomous Drone Navigation” submitted by Vatsal Shukla has been carried out under my guidance and that this project work has not been submitted elsewhere for any degree. This project report is approved in fulfilment of the submission requirement for the Mini Project in B.Tech 5th Semester in Computer Science & Engineering at Rajkiya Engineering College, Sonbhadra.

Date:

(Dr. Anurag Sewak / Dr. Mainejar Yadav)

Computer Science & Engg. Department,

Rajkiya Engineering College, Sonbhadra


Acknowledgement

We would like to express our deepest gratitude to all those who contributed to the
development and successful completion of this research on surveillance in autonomous drone
navigation. Without the support, expertise, and collaboration of numerous individuals and
organizations, this work would not have been possible.

First and foremost, we would like to extend our sincere thanks to our primary advisor, Dr. Anurag Sewak, whose insightful guidance, critical feedback, and unwavering support throughout this research have been invaluable. His expertise in autonomous systems and drone technologies provided a solid foundation for this project and greatly influenced its direction and depth.

We are also grateful to Dr. Mainejar Yadav, whose assistance in refining the theoretical framework and navigating the complexities of the algorithms has significantly contributed to the overall quality of this research. His input on both the technical and practical aspects of autonomous navigation helped bring the project to fruition.

Our heartfelt appreciation goes to the Computer Science Department for providing the
necessary infrastructure, funding, and resources that were crucial to the success of this
research. The access to state-of-the-art drone hardware, simulation environments, and
computational facilities played an essential role in the implementation and testing phases of
our work.

Vatsal Shukla
Table of Contents

1. Introduction
   1.1 Application Context
   1.2 System Objectives

2. Requirements Analysis
   2.1 System Feasibility
   2.2 Functional and Non-functional Requirements

3. System Design
   3.1 Design Approach and System Model
   3.2 Back-end Design

4. System Requirements
   4.1 Tools and Technologies used

5. Limitations and Scope

6. Conclusion
Application Context

The application context of surveillance in autonomous drone navigation can be diverse and wide-ranging, as it intersects with various industries and fields. Below are some possible application contexts in which autonomous drones equipped with surveillance capabilities could be utilized:

1. Security and Surveillance

● Border Patrol and Coast Guard Operations: Autonomous drones can be deployed
along borders or coastlines to monitor illegal activities such as smuggling, human
trafficking, or unauthorized crossings. Drones equipped with surveillance sensors
(cameras, thermal imaging, radar) can patrol vast areas with minimal human
intervention, transmitting real-time data to control centers for decision-making.

2. Disaster Management and Search-and-Rescue (SAR)

● Natural Disaster Response: After natural disasters such as earthquakes, floods, hurricanes, or wildfires, autonomous drones can be deployed to survey affected areas. These drones can map the extent of the damage, identify safe paths for rescue teams, and locate survivors through thermal imaging and other sensors.

3. Environmental Monitoring and Conservation

● Wildlife Monitoring: Autonomous drones can be employed to monitor wildlife populations and track animal movements, especially in remote or protected areas. This can help with conservation efforts, monitor poaching activities, or gather data on habitat conditions.
● Forest and Land Monitoring: Drones are increasingly used to monitor forests and
agricultural land for signs of deforestation, illegal logging, or environmental
degradation. Autonomous drones can be programmed to follow predefined flight
paths and regularly inspect large areas, providing detailed imagery and data to
conservationists or government agencies.
System Objective

The system objective for an autonomous drone navigation system with surveillance
capabilities typically revolves around optimizing the operational efficiency, safety,
and effectiveness of the drone in completing surveillance tasks. The specific
objectives can vary depending on the application, but broadly speaking, the system
objectives can be defined as follows:

1. Autonomous Navigation and Path Planning

● Objective: To develop an autonomous navigation system that allows the drone to navigate efficiently through complex environments, avoiding obstacles while following predefined or dynamic paths.
● Key Requirements:
○ Safe, obstacle-avoiding flight in real-time (using sensors like LiDAR,
vision cameras, or ultrasonic sensors).
○ Efficient path planning algorithms for navigating dynamic environments.
○ Capability to adapt to changes in the environment, such as moving
obstacles or sudden terrain shifts.
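The real-time obstacle-avoidance requirement above can be illustrated with a minimal reactive decision rule. The three-sensor layout and the 2 m safety margin below are assumptions for illustration only; a real system fuses LiDAR and camera data and runs a far richer planner:

```python
SAFE_DISTANCE = 2.0  # metres; assumed safety margin, not a design figure

def avoid(left: float, centre: float, right: float) -> str:
    """Pick a coarse steering command from three forward range readings (metres)."""
    if centre >= SAFE_DISTANCE:
        return "forward"      # path ahead is clear
    if left >= right and left >= SAFE_DISTANCE:
        return "turn_left"    # more clearance on the left
    if right > left and right >= SAFE_DISTANCE:
        return "turn_right"   # more clearance on the right
    return "hover"            # boxed in: hold position and let the planner re-route
```

In practice such a rule runs inside the flight-control loop as a safety layer, with the path planner overriding it whenever a safe global route exists.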

2. Real-time Surveillance and Monitoring

● Objective: To equip the drone with the ability to perform surveillance in real-time,
capturing high-quality visual, infrared, or multispectral data to support monitoring
tasks.
● Key Requirements:
○ Integration of high-resolution cameras, thermal sensors, or other
surveillance sensors for monitoring purposes.
○ Real-time data streaming capabilities to send surveillance footage and
sensor data to ground stations or cloud servers for immediate analysis.
○ Automated recognition of objects, individuals, or behaviors relevant to the
surveillance mission (e.g., anomaly detection or object tracking).
System Feasibility

The system feasibility of an autonomous drone navigation and surveillance system depends on a comprehensive assessment of various technical, economic, operational, and regulatory factors. This assessment ensures that the system can be effectively developed, deployed, and operated in real-world environments. Here’s a breakdown of the key aspects of system feasibility:

● Technology Maturity:
○ The key technologies (e.g., autonomous navigation, machine learning, sensor
integration, data communication, and real-time processing) are well-
established and continue to evolve. Many components, such as GPS-based
navigation, LiDAR, and high-resolution cameras, are already in use in
consumer drones, and their integration into more advanced surveillance
systems is feasible.
○ Technologies like SLAM (Simultaneous Localization and Mapping) for GPS-
denied environments, AI-based anomaly detection, and edge computing for
real-time decision-making are actively researched and commercially available.

● Hardware Capabilities:
○ Sensors: High-quality cameras, infrared sensors, LiDAR, and multispectral
cameras are already available for integration into drones. The miniaturization
of these sensors has made it possible to pack them into relatively small drone
platforms.
○ Autonomy: Advances in AI and machine learning, particularly in computer
vision and robotic navigation algorithms, allow for real-time decision-making,
object detection, and obstacle avoidance. Autonomous systems are
increasingly capable of operating in complex, dynamic environments without
direct human control.
○ Energy Efficiency: While battery technology is continually improving, there
are still limitations to the endurance of drones. However, advancements in
battery technology (e.g., lithium-polymer and energy-dense batteries) are
making longer flight times possible, and solar-powered drones are an area of
ongoing research.

Functional and Non-functional Requirements


Functional Requirements

Functional requirements describe the specific functions and behaviors that the autonomous
drone navigation and surveillance system must be able to perform to fulfill its intended
purpose. These are the core capabilities that ensure the system can achieve its objectives in
real-world operations.

1. Autonomous Navigation

● The system must enable the drone to autonomously plan and execute flight paths,
avoiding obstacles and maintaining stability.
● The drone should be able to dynamically adjust its flight path based on real-time
environmental data (e.g., obstacles, terrain, air traffic).
● Path planning algorithms should support predefined routes as well as the ability to
adapt to unexpected conditions (e.g., emergency landing, real-time re-routing).
● The drone must detect and avoid static and dynamic obstacles (e.g., trees, buildings,
moving vehicles) using sensors (e.g., LiDAR, vision cameras, radar).

2. Real-Time Data Collection and Transmission

● The drone must be equipped with high-resolution cameras, infrared/thermal sensors, and other appropriate surveillance equipment to capture relevant data (e.g., video, images, temperature signatures, multispectral data).
● The system should allow real-time transmission of collected data to a ground control
station or cloud-based system for analysis (live video streaming, sensor data,
telemetry).
● The system should support real-time object detection and tracking, allowing it to
identify and follow specific targets (e.g., vehicles, people, animals).
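The real-time transmission requirement implies a compact wire format for telemetry. A sketch using a fixed-size binary packet; the field layout below is an assumption for illustration, not a standard such as MAVLink:

```python
import struct

# Network byte order: uint32 drone id, doubles for timestamp/lat/lon/alt,
# uint8 battery percentage. Layout is hypothetical.
TELEMETRY_FMT = "!IddddB"

def encode(drone_id, ts, lat, lon, alt, battery):
    """Pack one telemetry sample into a fixed-size binary packet."""
    return struct.pack(TELEMETRY_FMT, drone_id, ts, lat, lon, alt, battery)

def decode(packet):
    """Unpack a packet back into its fields."""
    return struct.unpack(TELEMETRY_FMT, packet)
```

A fixed binary layout keeps per-sample overhead constant, which matters on bandwidth-limited drone radio links.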

Non-Functional Requirements

Non-functional requirements describe the qualities and attributes that the system should have.
These requirements help ensure the system is efficient, reliable, secure, and scalable. They
define how well the system performs its functions under various conditions.

Performance

● The system should provide real-time performance, ensuring low-latency communication between the drone and the control station, especially for critical surveillance tasks such as live video streaming.
● The onboard processing should enable real-time decision-making, with minimal
delays between data capture and action (e.g., obstacle avoidance, anomaly detection).
● The system should handle large amounts of data (e.g., high-resolution video, sensor
data) without affecting drone performance or stability.
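The low-latency requirement becomes testable once each processing stage is measured against an explicit budget. A sketch of that pattern; the 50 ms figure is an assumed budget, not a requirement from this report:

```python
import time

LATENCY_BUDGET_MS = 50.0  # assumed per-frame processing budget

def within_budget(process, frame):
    """Run one processing step, returning (result, latency in ms, budget met?)."""
    t0 = time.perf_counter()
    result = process(frame)
    latency_ms = (time.perf_counter() - t0) * 1000.0
    return result, latency_ms, latency_ms <= LATENCY_BUDGET_MS
```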
Design Approach and System Model
The design approach for an autonomous drone navigation and surveillance system should be
modular, scalable, and robust, enabling efficient operation in dynamic environments. The
system must combine various technologies, such as autonomous navigation, sensor
integration, real-time data processing, and communication frameworks, all working
seamlessly together. The design approach should follow a layered architecture that addresses
different aspects of the system's functionality.

1. Modular and Layered Architecture

The overall architecture should be layered to separate different concerns such as navigation,
perception, communication, and decision-making. This allows for better scalability, flexibility, and
maintainability.

Drone Hardware

● Sensors: The drone is equipped with various sensors, such as LiDAR, RGB cameras,
thermal cameras, ultrasonic sensors, and IMUs (Inertial Measurement Units). These
sensors allow for precise navigation, obstacle detection, and surveillance data
collection.
● Flight Control System: The onboard flight controller manages the physical movement
and stabilization of the drone. This subsystem interacts directly with the hardware and
ensures stable flight, including altitude control, speed regulation, and yaw/pitch
control.

The system model provides a high-level overview of the components and their
interactions. The main components of the autonomous drone navigation and
surveillance system are:

Navigation System:
● Global Path Planning: Determines optimal routes using GPS and pre-mapped waypoints.
● Local Path Planning: Uses algorithms like RRT or A* search to dynamically adjust the path
in real-time.
● Obstacle Avoidance: Real-time adjustments using sensor data to avoid dynamic and static
obstacles.
● SLAM: Enables navigation in GPS-denied areas by creating maps of the environment and
localizing the drone in it.
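The local path-planning step can be sketched as A* search over a 2-D occupancy grid. This is an illustrative simplification: onboard planners operate in continuous space under kinematic constraints, and the unit step cost here is an assumption:

```python
import heapq

def astar(grid, start, goal):
    """A* on a grid of 0 (free) / 1 (obstacle) cells; returns a path or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]               # (f, g, node, path)
    best_g = {}
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if best_g.get(node, float("inf")) <= g:
            continue                                          # stale entry
        best_g[node] = g
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None  # goal unreachable
```

Because the Manhattan heuristic never overestimates on a 4-connected grid, the first time the goal is popped the path is optimal.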

Control and Decision-Making System:

● Autonomous Flight Control: Adjusts drone behavior based on navigation and environment
data.
● Anomaly Detection: Machine learning models flag suspicious activities based on the
surveillance data, triggering alerts or actions.
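For illustration, the flag-and-alert pattern behind anomaly detection can be reduced to a statistical rule on one telemetry channel; the deployed system would use the machine learning models described above. The 3-sigma threshold is an assumed default:

```python
import statistics

def flag_anomalies(readings, threshold=3.0):
    """Return indices of readings more than `threshold` std-devs from the mean."""
    mean = statistics.fmean(readings)
    sd = statistics.pstdev(readings)
    if sd == 0:
        return []  # a constant channel has nothing to flag
    return [i for i, x in enumerate(readings) if abs(x - mean) / sd > threshold]
```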
Back-end Design

The backend design is responsible for managing, processing, and storing the data collected by
the drones while ensuring that the system functions efficiently, securely, and reliably. It
encompasses various components such as data storage, server-side logic, cloud infrastructure,
communication protocols, and security measures. Below is a detailed breakdown of the
backend design, focusing on its architecture, key components, and considerations.

Database and Storage

● Data Storage: Different types of data need to be stored, including:
○ Flight logs: Mission information, telemetry, drone status.
○ Sensor data: Raw data from cameras, LiDAR, thermal sensors.
○ Video and Image Data: High-resolution footage and snapshots taken during
the mission.
○ Surveillance Data: Processed data such as detected objects, anomaly events,
and alerts.
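A minimal sketch of the flight-log store using SQLite; the schema is an assumption, and a production backend would more likely use a time-series or object store for bulky sensor and video data:

```python
import sqlite3

def init_db(path=":memory:"):
    """Create the flight-log table (hypothetical schema) and return a connection."""
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS flight_logs (
            id       INTEGER PRIMARY KEY,
            drone_id INTEGER NOT NULL,
            ts       REAL    NOT NULL,   -- Unix timestamp
            lat      REAL, lon REAL, alt REAL,
            battery  INTEGER,            -- percent
            status   TEXT
        )""")
    return conn

conn = init_db()
conn.execute(
    "INSERT INTO flight_logs (drone_id, ts, lat, lon, alt, battery, status) "
    "VALUES (?, ?, ?, ?, ?, ?, ?)",
    (1, 1700000000.0, 24.68, 83.06, 120.0, 90, "on_mission"))
row = conn.execute("SELECT drone_id, status FROM flight_logs").fetchone()
```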

APIs and Communication

● RESTful APIs: The backend will expose REST APIs for communication between the
drone, ground control, and other systems. These APIs allow operators to monitor
drones in real-time, control flight paths, retrieve historical data, or trigger specific
actions (e.g., starting/stopping surveillance).
○ Examples: GET /drone/{id}/status, POST /drone/{id}/command, GET
/mission/{id}/data.
○ API requests must be authenticated and authorized to ensure security.
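The example endpoints above can be sketched as a small route-dispatch table. The handlers and the in-memory data store are hypothetical; a real backend would sit behind a web framework (e.g., FastAPI or Flask) with the authentication noted above:

```python
import re

DRONES = {7: {"status": "on_mission", "battery": 88}}  # stand-in data store

ROUTES = [
    # GET /drone/{id}/status -> current status string
    (re.compile(r"^GET /drone/(\d+)/status$"),
     lambda m: DRONES.get(int(m.group(1)), {}).get("status", "unknown")),
    # POST /drone/{id}/command -> acknowledge a queued command
    (re.compile(r"^POST /drone/(\d+)/command$"),
     lambda m: f"command queued for drone {m.group(1)}"),
]

def handle(request_line: str) -> str:
    """Dispatch a 'METHOD /path' request line to its handler."""
    for pattern, handler in ROUTES:
        m = pattern.match(request_line)
        if m:
            return handler(m)
    return "404 not found"
```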

Security

● Data Encryption: All sensitive data (e.g., flight logs, surveillance data) should be
encrypted both in transit (using TLS/SSL) and at rest (using AES-256 encryption).
This ensures data integrity and privacy.
● Firewall and VPN: To prevent unauthorized access, the backend system should be
protected by firewalls and VPNs. Secure access control policies should be
implemented to restrict access to the backend infrastructure.
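For encryption in transit, Python's standard `ssl` module can enforce a modern TLS floor on drone-to-backend links. A sketch, with certificate provisioning omitted:

```python
import ssl

def make_client_context():
    """TLS client context: verifies server certificates, refuses legacy protocols."""
    ctx = ssl.create_default_context()            # CERT_REQUIRED by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0/1.1 and SSL
    return ctx

ctx = make_client_context()
```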
System Requirements

Functional Requirements

Functional requirements describe the specific behaviors, actions, and interactions that the system must support. They define what the system will do.

1. Autonomous Navigation and Control


○ Path Planning: The system should allow drones to autonomously
plan and navigate between specified waypoints without human
intervention.
○ Obstacle Avoidance: The drone must detect and avoid obstacles in
real-time using onboard sensors (e.g., LiDAR, ultrasonic sensors,
cameras).
○ GPS Navigation: The drone should be able to navigate using GPS
coordinates, with real-time position updates.
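GPS waypoint navigation needs distances between coordinate pairs; the standard tool is the haversine formula. A sketch, approximating the Earth as a sphere of radius 6371 km:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates (degrees)."""
    R = 6371000.0  # mean Earth radius in metres (spherical approximation)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))
```

One degree of longitude at the equator comes out to roughly 111 km, a useful sanity check when wiring this into waypoint logic.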

2. Surveillance and Data Collection


○ Real-time Video Streaming: The drone must stream video footage
(live or recorded) to the ground control system during surveillance
missions.
○ Image/Video Capture: The drone should capture images and videos
from cameras (e.g., RGB, thermal) for surveillance purposes.
○ Object Detection and Recognition: The system must detect and
identify objects of interest (e.g., people, vehicles) using computer
vision algorithms (e.g., CNN, YOLO).
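Detectors such as YOLO emit many overlapping candidate boxes, and a standard post-processing step is non-maximum suppression (NMS). A minimal sketch with boxes given as (x1, y1, x2, y2, score); the 0.5 IoU threshold is a common default, not a fixed choice:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2, ...)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda box: (box[2] - box[0]) * (box[3] - box[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def nms(boxes, iou_thresh=0.5):
    """Keep the highest-scoring box from each cluster of overlapping detections."""
    kept = []
    for box in sorted(boxes, key=lambda b: b[4], reverse=True):
        if all(iou(box, k) < iou_thresh for k in kept):
            kept.append(box)
    return kept
```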

Technologies used:
LiDAR (Light Detection and Ranging): For 3D mapping, obstacle avoidance, and terrain
modeling.

● Example: Velodyne LiDAR, Riegl VUX-1.

Cameras:

● RGB Cameras: Standard high-resolution cameras for general surveillance.
○ Example: GoPro, Sony Alpha series.
● Thermal Cameras: For surveillance in low-light or night conditions.
○ Example: FLIR Vue TZ20.
Limitations and Scope

Limitations

The limitations highlight areas where the system may face challenges due to
technological, regulatory, or environmental factors.
1. Regulatory and Legal Limitations

● Airspace Regulations: Drones are subject to strict airspace regulations, which vary by
region. For example, in many countries, drones must remain within a specific altitude
limit (e.g., 400 feet in the U.S.) and are prohibited from flying in restricted airspace
such as near airports or government buildings.
○ FAA (Federal Aviation Administration) in the U.S., EASA (European Union
Aviation Safety Agency) in Europe, and other local aviation authorities impose
these regulations.
● Privacy Concerns: Surveillance systems using drones can raise privacy issues,
especially when collecting video or images of private properties or people without
their consent.
○ Compliance with privacy laws (e.g., GDPR, CCPA) is required.

2. Environmental Limitations

● Weather Conditions: Drones are highly sensitive to weather conditions like rain, strong
winds, and extreme temperatures. High winds can affect flight stability, while heavy
rain can damage sensors and make visibility poor for surveillance.
○ Some drones are designed to be weather-resistant, but extreme weather events
(e.g., thunderstorms, blizzards) could still ground the drone.

Scope
The scope defines the goals, boundaries, and expected outcomes of the system. It identifies
what the system will accomplish and its intended usage scenarios.

1. Functional Scope

● Autonomous Navigation: The system will support autonomous flight based on pre-defined
waypoints and dynamic path planning. It will feature obstacle avoidance, return-to-home
functionality, and flight stability in various conditions.
● Object Detection: The system will implement machine learning algorithms (e.g., YOLO,
ResNet) to automatically detect objects of interest (e.g., vehicles, people, wildlife) in the video
footage.
● Data Collection and Storage: The system will store surveillance data (video footage,
telemetry, sensor data) for later analysis or real-time processing.
Conclusion

The development of an Autonomous Drone Navigation and Surveillance System offers
significant advancements in a wide range of industries, from security and disaster
management to environmental monitoring and infrastructure inspection. By leveraging
state-of-the-art technologies such as machine learning, real-time data processing, high-
precision sensors, and autonomous flight capabilities, this system can enhance
operational efficiency, safety, and situational awareness in various applications.

The future of autonomous drones in navigation and surveillance is filled with
potential. As advancements in hardware, software, and communication protocols
continue to evolve, these drones will become more reliable, efficient, and capable of
handling a wider range of tasks with higher precision and safety. The system’s
scalability, along with integration into broader ecosystems (e.g., IoT, 5G), will further
improve its applicability in both current and emerging industries.

In conclusion, while the Autonomous Drone Navigation and Surveillance System is
poised to revolutionize industries by providing faster, safer, and more efficient
solutions, its full potential will depend on overcoming technological, regulatory, and
operational challenges. Ongoing research, innovation, and collaboration across
industries will be key to unlocking the next generation of drone capabilities and
ensuring they can operate effectively in real-world scenarios.
