RASPBERRY PI-BASED SURVEILLANCE DRONE
Final Project Report
Submitted in partial fulfillment of the requirements for the award of the degree of
BACHELOR OF ENGINEERING
In
COMPUTER SCIENCE & ENGINEERING
Submitted by
(Affiliated to Visvesvaraya Technological University, Belagavi)
This is to certify that the project work has been carried out by the student of "Raja Rajeswari
College of Engineering" in partial fulfillment of the requirements of the seventh semester of
Bachelor of Engineering in Computer Science & Engineering of Visvesvaraya Technological
University, Belagavi, during the year 2023-24. It is certified that all corrections and suggestions
indicated during internal assessment have been incorporated in the report deposited in the
department library. The project report has been approved as it satisfies the academic
requirements in respect of the project work prescribed for the seventh semester.
External Viva-Voce
Examiners: Signature:
1. 1.
2. 2.
ABSTRACT
Surveillance drones, or Unmanned Aerial Vehicles (UAVs), have emerged as pivotal tools in
diverse fields, ranging from security and law enforcement to environmental monitoring and
agriculture. This paper explores the multifaceted applications of surveillance drones and their impact
on enhancing security, efficiency, and data acquisition. The primary objective is to provide an
overview of the technological advancements, operational capabilities, and ethical considerations
associated with the deployment of surveillance drones. The paper delves into the use of drones in
security and law enforcement, emphasizing their role in monitoring public spaces, events, and
borders. Additionally, it highlights their application in search and rescue operations, environmental
monitoring, and infrastructure inspection, showcasing the versatility of this technology across
various domains.
Key technological aspects, such as sensor capabilities, communication systems, and autonomy,
are discussed to underscore the sophistication that surveillance drones bring to data collection and
analysis. Furthermore, the paper addresses the potential societal concerns related to privacy,
security, and ethical use, emphasizing the need for robust regulations and responsible practices in
drone deployment.
TABLE OF CONTENTS
1 INTRODUCTION
1.1 Existing System
1.2 Proposed System
1.3 Motivation
1.4 Objectives of the work
1.5 Key features and overall scope of the work
1.6 Organization of the project report
2 LITERATURE SURVEY
3 SYSTEM ANALYSIS
4 SYSTEM DESIGN
5 SYSTEM IMPLEMENTATION
5.1 Module
5.2 Module description
5.3 Charts / Functions / Code
6 SYSTEM TESTING
6.1 Testing
CONCLUSION
REFERENCES
LIST OF FIGURES
CHAPTER 1
INTRODUCTION
Surveillance drones, also known as Unmanned Aerial Vehicles (UAVs), represent a
transformative leap in the field of aerial technology, revolutionizing the way we monitor, gather
information, and ensure security across diverse sectors. These unmanned vehicles equipped with advanced
sensors, cameras, and communication systems have rapidly evolved from military applications to a wide
range of civilian uses, leaving an indelible mark on fields such as law enforcement, agriculture,
environmental monitoring, and infrastructure inspection.
The advent of surveillance drones has ushered in a new era where remote sensing and data acquisition can
be conducted with unprecedented efficiency and precision. Unlike traditional methods of aerial
surveillance, which often involve significant costs, risks, and limitations, drones offer a flexible, cost-
effective, and scalable solution. This technology provides a bird's-eye view of areas that were previously
difficult to access or monitor, allowing for real-time data collection and analysis.
In this era of rapid technological advancement, surveillance drones have found applications in enhancing
public safety, securing borders, managing traffic, monitoring critical infrastructure, and even aiding in
search and rescue missions. The ability of these unmanned vehicles to navigate challenging terrains and
capture high-resolution imagery has proven invaluable in various scenarios, ranging from disaster response
to agricultural management.
3. Power Systems:
Battery Technology: Lithium-polymer and lithium-ion batteries power many drones, offering a
balance between weight and energy density.
Hybrid Systems: Some drones use a combination of batteries and internal combustion engines
for longer flight durations.
Communication Systems: Utilize robust and secure communication systems, incorporating both
short-range (Wi-Fi, RF) and long-range (satellite communication) options for flexible and
extended operations. Implement advanced encryption protocols to ensure secure data transmission.
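As an illustration of the encryption requirement above, the following minimal Python sketch (assuming the third-party cryptography package; the packet fields and the key-provisioning step are placeholders, not the deployed design) encrypts a telemetry packet before transmission and decrypts it at the ground station:

# Minimal sketch: symmetric encryption of telemetry packets before transmission.
# The packet format and key handling are illustrative assumptions.
from cryptography.fernet import Fernet
import json

# In practice the key would be provisioned securely on both drone and ground station.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_telemetry(lat, lon, alt_m, battery_pct):
    packet = json.dumps({"lat": lat, "lon": lon, "alt": alt_m, "bat": battery_pct})
    return cipher.encrypt(packet.encode("utf-8"))      # opaque ciphertext bytes

def decrypt_telemetry(ciphertext):
    return json.loads(cipher.decrypt(ciphertext).decode("utf-8"))

if __name__ == "__main__":
    token = encrypt_telemetry(12.9716, 77.5946, 50.0, 87)
    print(decrypt_telemetry(token))                    # recovered telemetry dictionary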
Navigation and Autonomy Enhancements: Integrate advanced GPS modules for precise location
tracking and navigation. Implement computer vision algorithms for improved autonomous flight
capabilities, obstacle avoidance, and dynamic path planning. Explore AI-based navigation for
adaptive decision-making during missions.
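A minimal sketch of the waypoint-navigation logic described above, assuming plain latitude/longitude waypoints and an illustrative 5 m acceptance radius (not tuned values for this drone):

# Minimal sketch of GPS waypoint tracking: great-circle (haversine) distance from the
# drone's current fix to the next waypoint, used to decide when to advance the mission.
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    # Distance in metres between two latitude/longitude points.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def waypoint_reached(current, target, radius_m=5.0):
    return haversine_m(current[0], current[1], target[0], target[1]) <= radius_m

# Example: advance through a pre-defined route (coordinates are illustrative)
route = [(12.9716, 77.5946), (12.9720, 77.5950)]
position = (12.97195, 77.59498)
if waypoint_reached(position, route[1]):
    print("Waypoint reached - proceed to next leg")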
Power Efficiency: Research and implement cutting-edge battery technologies or alternative power
sources to extend flight durations. Consider energy-efficient propulsion systems and aerodynamic
designs to optimize power consumption.
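For a rough sense of how battery capacity translates into endurance, the following back-of-the-envelope estimate uses assumed figures (4S LiPo, 5200 mAh, 80% usable capacity, roughly 180 W average draw), not measured values for this platform:

# Rough flight-time estimate from battery capacity and average power draw.
def estimated_flight_time_min(capacity_mah, voltage_v, avg_power_w, usable_fraction=0.8):
    energy_wh = (capacity_mah / 1000.0) * voltage_v * usable_fraction
    return 60.0 * energy_wh / avg_power_w

print(round(estimated_flight_time_min(5200, 14.8, 180), 1), "minutes")   # about 20.5 min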
Data Processing and Analytics: Include onboard processing capabilities for real-time data
analysis. Explore edge computing for distributed processing and quicker decision-making during
surveillance missions. Implement AI algorithms for object recognition, anomaly detection, and
pattern analysis in captured data.
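As one example of lightweight onboard analytics of the kind described above, the following OpenCV sketch performs background-subtraction motion detection; the camera index and the 500-pixel area threshold are assumptions:

# Minimal sketch of onboard motion detection with OpenCV background subtraction.
import cv2

cap = cv2.VideoCapture(0)                              # onboard camera (assumed index)
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                     # foreground (moving) pixels
    mask = cv2.medianBlur(mask, 5)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:                   # ignore small noise blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("motion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()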
Remote Control and Ground Control Stations: Enhance user interfaces for remote controllers and
ground control stations to provide intuitive controls and comprehensive mission planning
capabilities.
1.3 MOTIVATION
The motivation behind implementing a surveillance drone project stems from a compelling need to
bolster security measures, enhance situational awareness, and respond effectively to emerging
threats in diverse scenarios. By harnessing the capabilities of surveillance drones, the project aims
to revolutionize monitoring practices, providing cost-effective and efficient solutions for
safeguarding critical infrastructure, managing public events, and addressing emergency situations.
The utilization of aerial surveillance not only accelerates response times but also minimizes risks to
personnel, making it an invaluable tool for search and rescue operations, border control, wildlife
conservation, and environmental monitoring. The overarching goal is to leverage advanced
technology to create a more secure and resilient environment while optimizing resource allocation
and minimizing the impact on human and natural resources.
1.4 OBJECTIVES OF THE WORK
The primary objective of this surveillance drone project is to establish a robust and adaptive aerial
monitoring system with the aim of enhancing security, safety, and response capabilities across
various domains. By deploying cutting-edge drone technology equipped with advanced sensors and
imaging devices, the project seeks to provide real-time surveillance of critical infrastructure, public
spaces, and high-risk areas. The overarching goal is to improve situational awareness, enabling
rapid and informed decision-making in emergency situations, search and rescue operations, and
event management. Additionally, the project aims to contribute to wildlife conservation,
environmental monitoring, and agricultural efficiency by leveraging the versatility of surveillance
drones. Through the integration of this technology, the objective is to create a comprehensive and
efficient solution that addresses diverse security challenges while respecting ethical considerations.
2. Autonomous Flight:
Scope: Ability for the drone to fly autonomously based on pre-defined flight paths or mission
objectives.
Functionality: Autonomous takeoff, landing, waypoint navigation, and obstacle avoidance,
enabling hands-free operation and efficient coverage of designated areas.
3. High-resolution Imaging:
Scope: Capture and transmission of high-resolution images for detailed inspection and
surveillance purposes.
Functionality: High-resolution camera with zooming capabilities, image stabilization, and
adjustable exposure settings to capture clear and detailed images even from a distance.
4. Night Vision:
Scope: Enhanced visibility in low-light or nighttime conditions for continuous surveillance
operations.
Functionality: Infrared or thermal imaging technology to detect and capture images in
low-light environments, allowing for 24/7 surveillance capabilities.
5. Geo-fencing:
Scope: Define virtual boundaries or restricted airspace to prevent the drone from entering
unauthorized areas.
Functionality: GPS-based geo-fencing feature that alerts operators or automatically adjusts
the drone's flight path when it approaches designated boundaries or no-fly zones.
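A minimal sketch of such a geo-fence check, using the standard ray-casting point-in-polygon test; the fence vertices and test coordinates are illustrative assumptions:

# Minimal sketch of polygon geo-fencing: the fence is a list of (lat, lon) vertices and
# the drone is flagged when its GPS fix falls outside.
def inside_polygon(lat, lon, polygon):
    # Ray-casting test: returns True if (lat, lon) lies inside the fence polygon.
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        crosses = (lon1 > lon) != (lon2 > lon)
        if crosses and lat < (lat2 - lat1) * (lon - lon1) / (lon2 - lon1) + lat1:
            inside = not inside
    return inside

FENCE = [(12.970, 77.590), (12.970, 77.600), (12.980, 77.600), (12.980, 77.590)]

if not inside_polygon(12.9845, 77.5946, FENCE):
    print("Outside geo-fence: alert operator or adjust flight path")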
CHAPTER 2
LITERATURE SURVEY
2.1 Anuar Bin Ahmad; Ruzairi Bin Abdul Rahim (2023), "Silent Surveillance
Autonomous Drone for Disaster Management and Military Security Using
Artificial Intelligence" (IEEE):
This paper explores the application of deep learning algorithms for drones used in disaster management
and military security, achieving improved accuracy through the integration of advanced convolutional
neural networks (CNNs). The study evaluates performance against traditional methods and presents
promising results in real-world scenarios.
METHODOLOGY:
This systematic review article uses the PRISMA-based flowchart process to identify relevant academic
papers on drone operations. The search was conducted using Scopus and Web of Science databases, with
a search string of "drone," "unmanned aerial vehicle," "uav," "unmanned aircraft system," "uas,"
"remotely piloted aircraft," "surveillance," "monitoring," "inspection," and "smart cities." The search
resulted in 323 records, with 166 records identified in the identification stage. The screening stage
revealed 72 records were irrelevant, leaving 94 records. The eligibility stage involved screening the full
text of all 94 articles, with 51 articles addressing technical and non-technical issues of drone operations.
The review included 43 articles, addressing both technical and non-technical aspects of drone operations.
Merits:
Autonomous operations, advanced AI surveillance.
Demerits:
Privacy concerns, limited payload capacity, high cost, regulatory challenges.
METHODOLOGY:
The proposed RF-based UAV surveillance system consists of drones, a remote controller, an RF sensing
module, and a processing module with a database repository. Each drone is characterized by technical
specifications such as maximum operating range and connectivity. The remote controller sends flight
commands to the target drones and receives their operating status in response. An RF sensing module is
configured to intercept drone-controller communication, assuming the Wi-Fi operating frequency is known.
The intercepted RF signals can be captured by software-defined radio configurable devices and stored in
a local database repository for processing. DroneRF, a large-scale RF dataset, is introduced for drone
detection, classification, and operation mode recognition. The system considers four operation modes of
three different drones (Parrot Bebop, Parrot AR Drone, and DJI Phantom 3) and uses two USRP-2943
RF receivers to collect the lower and upper half of the frequency band signals. The dataset has 227 RF
signal segments, with each segment having two amplitude records.
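The following sketch is not the authors' pipeline, but illustrates the kind of spectral preprocessing an RF-based detector can apply to one amplitude record before classification; the CSV file name and the 64-band feature size are assumptions:

# Minimal sketch: turn one raw RF amplitude record into a coarse spectral feature vector.
import numpy as np

def spectral_features(amplitudes, n_bins=64):
    # Average the magnitude spectrum into n_bins coarse bands.
    spectrum = np.abs(np.fft.rfft(amplitudes - np.mean(amplitudes)))
    bands = np.array_split(spectrum, n_bins)
    return np.array([band.mean() for band in bands])

segment = np.loadtxt("drone_rf_segment.csv", delimiter=",")   # one amplitude record (assumed file)
features = spectral_features(segment)
print(features.shape)   # (64,) coarse spectral signature fed to a classifier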
Merits: Offers a comprehensive overview of various inspection techniques beyond CNNs, providing
a holistic perspective on automated quality control.
Demerits: Less emphasis on the specifics of deep learning and CNNs in drone detection.
This paper conducts a comprehensive comparison of various drone detection techniques, ranging from
traditional image processing methods to state-of-the-art machine learning approaches. The study
evaluates their accuracy, speed, and adaptability across diverse datasets.
METHODOLOGY:
The proposed system is divided into three parts: user control, drone, and surveillance. The user control
part includes a laptop and a transmitting radio telemetry device, while the drone consists of a flight
controller, GPS module, electronic speed controllers, brushless motors, and a LiPo battery. The
surveillance part includes a Raspberry Pi 3B microcomputer, a laptop, a camera module, and an
ultrasonic sensor. The drone is the main part of the system, and commands are sent from the laptop to
the drone. The surveillance part performs live streaming through a Wi-Fi-connected Raspberry Pi
accessed via its IP address, and uses the ultrasonic sensor for obstacle detection. The workflow
describes the system's operation from the ground to the destination and back, with forward and
backward flowcharts: the system moves towards the destination, checks for obstacles, and checks
whether the destination has been reached.
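A minimal sketch of the ultrasonic obstacle-detection step used in such a system, assuming an HC-SR04-style sensor wired to the Raspberry Pi; the BCM pin numbers and the 50 cm threshold are assumptions:

# Minimal sketch: time an ultrasonic echo on Raspberry Pi GPIO to estimate distance.
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24                        # assumed BCM pin numbers
GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def distance_cm():
    GPIO.output(TRIG, True)                # 10 microsecond trigger pulse
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    start = stop = time.time()
    while GPIO.input(ECHO) == 0:           # wait for echo to go high
        start = time.time()
    while GPIO.input(ECHO) == 1:           # wait for echo to go low
        stop = time.time()
    return (stop - start) * 34300 / 2      # speed of sound ~343 m/s

try:
    while True:
        d = distance_cm()
        if d < 50:
            print("Obstacle at %.1f cm - hold position / replan" % d)
        time.sleep(0.2)
finally:
    GPIO.cleanup()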
Merits: Specific emphasis on a crucial aspect of drone detection inspection, providing in-depth
analysis and potential solutions.
Demerits: Dependency on specific datasets; challenges in cross-domain generalization.
Unusual event detection is a crucial research topic in image processing and computer vision, with
applications in smart city transportation management systems. Anomalies are activities or events that deviate
significantly from what is expected or normal, such as traffic collisions, violations, accidents, fights, and
crimes. Detecting traffic anomalies involves multiple types of violations of regulations, but is challenging due
to complex traffic environments, lighting conditions, dynamic weather conditions, lack of high-quality data,
and traffic scene complexity.
Technological advancements have led to rapid growth in video surveillance networks, providing safety and
security in public and private places. However, developing large datasets for traffic surveillance systems is
challenging due to the difficulty, unexpected cost, and laboriousness of sample collection in real-life
situations. This leads to insufficiently labeled anomalous data, including suspicious human activities.
To address these challenges and the increasing demand for public safety and security, a novel benchmark
dataset captured by an aerial drone is introduced, focusing on anomaly detection relevant to road traffic
situations. A comparative study with existing methods is conducted to provide a challenging benchmark for
real-time object detection and anomaly detection in aerial videos.
METHODOLOGY:
Anomaly detection in traffic surveillance videos is primarily achieved using unsupervised and
weakly supervised learning methods due to the limited availability of annotated anomalous
instances. Unsupervised methods require more normal data, while weakly supervised methods
improve learning accuracy. A comprehensive review of modern deep learning techniques for
traffic anomalies was conducted, focusing on various computer-vision-based methods,
frameworks, applicability, implementation details, and limitations. U-Net, Chang et al., and
Yang et al. proposed frameworks for video anomaly detection, including spatial autoencoder
networks, motion autoencoders, and variance-based attention. A hybrid approach was
proposed to integrate space-time trajectories and semantic information of objects for
extracting critical activities and events from drone-based surveillance sequences. Yang et al.
classified safety-related abnormalities into three groups, and Yang et al. proposed a functional
approach to model temporal relations of time-to-collision safety indicators. CADNet, an
architecture based on deep learning for contextual anomaly detection, exploited contextual
information from aerial video surveillance to find point anomalies and contextual anomalies.
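As a minimal illustration of the reconstruction-based approach surveyed above (not any specific author's model), the following Keras sketch trains a small convolutional autoencoder on normal frames and flags frames with high reconstruction error; the frame size, random placeholder data, and threshold are assumptions:

# Minimal sketch: convolutional autoencoder for unsupervised frame anomaly detection.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

def build_autoencoder():
    inp = layers.Input(shape=(64, 64, 1))
    x = layers.Conv2D(16, 3, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(8, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(8, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2D(1, 3, padding="same", activation="sigmoid")(x)
    model = tf.keras.Model(inp, out)
    model.compile(optimizer="adam", loss="mse")
    return model

model = build_autoencoder()
normal_frames = np.random.rand(128, 64, 64, 1).astype("float32")   # placeholder data
model.fit(normal_frames, normal_frames, epochs=1, batch_size=16, verbose=0)

test_frame = np.random.rand(1, 64, 64, 1).astype("float32")
error = float(np.mean((model.predict(test_frame, verbose=0) - test_frame) ** 2))
print("anomalous" if error > 0.02 else "normal", error)             # illustrative threshold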
Merits: Addresses the need for real-time detection, offering insights into the privacy and security of the
system.
Demerits: Trade-off between privacy and accuracy; ethical considerations in deployment.
CHAPTER 3
SYSTEM ANALYSIS
The system analysis for the surveillance drone project involves a meticulous examination of its
components and functionalities to ensure optimal performance and alignment with project
objectives. This encompasses a thorough requirements analysis, identifying specific surveillance
needs such as coverage area, resolution, and environmental considerations. The selection and
evaluation of hardware, including drones, cameras, sensors, and communication systems, are
crucial factors, with an emphasis on weight, battery life, range, and sensor capabilities. The
communication infrastructure is scrutinized to establish secure and reliable data transmission
between the drone and ground control station. Additionally, a detailed assessment of navigation
and control systems is undertaken to guarantee precise and responsive drone movements. This
holistic analysis ensures that the surveillance drone system is well-equipped to meet its intended
purposes efficiently and ethically.
The functional requirements for the surveillance drone project encompass a set of crucial capabilities
aimed at ensuring effective and reliable surveillance operations. Firstly, the drone must feature real-
time video capture capabilities, utilizing onboard cameras to provide live monitoring of the designated
area. It should incorporate GPS navigation for precise positioning and automated missions, allowing
predefined routes and waypoints. An obstacle avoidance system, consisting of sensors and algorithms,
is essential to prevent collisions and ensure safe navigation, particularly in dynamic environments. The
system should facilitate remote control and monitoring, enabling operators to adjust flight parameters
and navigate the drone in real time through a user-friendly interface.
Scalability is crucial, allowing for the seamless integration of additional sensors or upgraded components
to enhance surveillance capabilities as needed.
CHAPTER 4
SYSTEM DESIGN
4.2 INPUT/OUTPUT:
Surveillance data from cameras and sensors. Real-time processed information transmitted to a
ground control station for monitoring and control.
4.4 ALGORITHM
Haar Cascades:
Utilize Haar cascades, a machine learning object detection method, to identify specific objects based
on their features.
OpenCV Library:
Integrate OpenCV, a popular computer vision library, for implementing pre-trained object detection
models or training custom models.
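A minimal sketch of Haar-cascade detection through OpenCV, using the pre-trained full-body cascade bundled with the opencv-python package; the camera index and detection parameters are assumptions:

# Minimal sketch: Haar-cascade person detection on live video with OpenCV.
import cv2

cascade_path = cv2.data.haarcascades + "haarcascade_fullbody.xml"
detector = cv2.CascadeClassifier(cascade_path)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    bodies = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4,
                                       minSize=(40, 80))
    for (x, y, w, h) in bodies:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("haar detections", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()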
Optical Flow:
Use optical flow algorithms (e.g., Lucas-Kanade) to track movement by analyzing the apparent
motion of pixels between consecutive frames.
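A minimal sketch of sparse Lucas-Kanade tracking with OpenCV: corners are found in the first frame and tracked across subsequent frames; the camera index and feature parameters are assumptions:

# Minimal sketch: Lucas-Kanade optical flow tracking of corner features.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100, qualityLevel=0.3, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok or points is None or len(points) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    good_new = new_points[status.flatten() == 1]
    good_old = points[status.flatten() == 1]
    for new, old in zip(good_new, good_old):
        x1, y1 = old.ravel()
        x2, y2 = new.ravel()
        cv2.line(frame, (int(x1), int(y1)), (int(x2), int(y2)), (0, 255, 0), 2)
    cv2.imshow("optical flow", frame)
    prev_gray, points = gray, good_new.reshape(-1, 1, 2)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()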
Kalman Filter:
Implement a Kalman filter to predict the location of moving objects, providing smoother tracking
even in the presence of noise.
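A minimal sketch of a constant-velocity Kalman filter built with cv2.KalmanFilter, here fed synthetic noisy positions in place of real detector output:

# Minimal sketch: constant-velocity Kalman filter smoothing a tracked (x, y) position.
import cv2
import numpy as np

kf = cv2.KalmanFilter(4, 2)                 # state: [x, y, vx, vy], measurement: [x, y]
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)

for t in range(20):
    true_pos = np.array([t * 2.0, t * 1.5])                       # simulated true track
    measurement = (true_pos + np.random.randn(2) * 0.5).astype(np.float32)
    kf.predict()                                                  # a-priori estimate
    estimate = kf.correct(measurement.reshape(2, 1))              # fuse noisy measurement
    print("measured", measurement.round(2), "filtered", estimate[:2].ravel().round(2))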
CHAPTER 5
SYSTEM IMPLEMENTATION
Hardware Implementation:
Raspberry Pi Setup:
Install Raspbian OS on Raspberry Pi. Configure GPIO pins for sensors and components.
Connect Components:
Connect camera module, sensors, motors, and communication module to GPIO pins.
Power System:
Set up lithium-polymer batteries and a power distribution board for stable power.
Frame Assembly:
Build a lightweight and durable frame to house Raspberry Pi and components.
Flight Controller:
Install flight control software (e.g., ArduPilot) and configure it for stabilization.
Communication Setup:
Connect Wi-Fi or radio transceivers for real-time data transmission.
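A minimal bring-up sketch for the steps above, assuming Raspberry Pi OS with the legacy picamera library; the BCM pin number and file path are assumptions:

# Minimal sketch: configure a status pin and capture a still with the camera module.
import time
import RPi.GPIO as GPIO
from picamera import PiCamera

GPIO.setmode(GPIO.BCM)
GPIO.setup(18, GPIO.OUT)                   # status LED / payload trigger pin (assumed)

camera = PiCamera()
camera.resolution = (1920, 1080)

try:
    GPIO.output(18, True)                  # indicate "capturing"
    camera.start_preview()
    time.sleep(2)                          # let exposure and white balance settle
    camera.capture("/home/pi/surveillance_frame.jpg")
finally:
    GPIO.output(18, False)
    camera.close()
    GPIO.cleanup()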
Software Implementation:
Camera Module:
Captures high-resolution images or video footage.
Functions: Provides visual data for surveillance.
Sensor Module:
Integrates sensors (GPS, altitude, orientation).
Functions: Collects telemetry data for accurate navigation.
Dataset Acquisition:
Collects raw data from the camera and sensors.
Functions: Gathers information for further analysis. Public datasets from government agencies and
open data portals provide valuable resources, while web scraping techniques can gather images
from online repositories and search engine APIs.
Platforms and collaborations with organizations allow for contributions from diverse sources,
ensuring a broad representation of surveillance imagery. In-house data collection through camera
installations in controlled environments, or simulations of specific scenarios, adds a controlled
aspect to the dataset.
Image Processing:
Uses computer vision algorithms.
Functions: Enhances image quality, corrects distortions, and prepares images for object detection.
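A minimal sketch of this pre-processing stage with OpenCV (grayscale conversion, denoising, and CLAHE contrast enhancement); the input file name is an assumption:

# Minimal sketch: prepare a captured frame for object detection.
import cv2

frame = cv2.imread("surveillance_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
denoised = cv2.GaussianBlur(gray, (5, 5), 0)

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))   # local contrast boost
enhanced = clahe.apply(denoised)

cv2.imwrite("preprocessed_frame.jpg", enhanced)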
Object Detection:
Applies object detection algorithms (e.g., YOLO, Haar cascades).
Functions: Identifies objects of interest in images.
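A minimal sketch of YOLO-style detection through OpenCV's DNN module, assuming the publicly released YOLOv3 configuration, weights, and class-name files have been downloaded separately; paths and thresholds are assumptions:

# Minimal sketch: YOLOv3 inference via OpenCV DNN with non-maximum suppression.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
classes = open("coco.names").read().splitlines()

frame = cv2.imread("surveillance_frame.jpg")
h, w = frame.shape[:2]
blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())

boxes, confidences, class_ids = [], [], []
for output in outputs:
    for det in output:                     # [cx, cy, bw, bh, objectness, class scores...]
        scores = det[5:]
        class_id = int(np.argmax(scores))
        score = float(scores[class_id])
        if score > 0.5:
            cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            confidences.append(score)
            class_ids.append(class_id)

keep = cv2.dnn.NMSBoxes(boxes, confidences, 0.5, 0.4)   # suppress overlapping boxes
for i in np.array(keep).flatten():
    x, y, bw, bh = boxes[int(i)]
    cv2.rectangle(frame, (x, y), (x + bw, y + bh), (0, 255, 0), 2)
    cv2.putText(frame, classes[class_ids[int(i)]], (x, y - 5),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
cv2.imwrite("detections.jpg", frame)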
Deployment:
Regulatory Compliance:
Adheres to local airspace regulations. Functions: Ensures legal compliance for drone operation.
Privacy Measures:
Implements privacy features. Functions: Protects identifiable information in captured footage.
Security Measures:
Ensures secure data transmission. Functions: Guards against unauthorized access and tampering.
Training and Testing Splits: Divide the dataset into training, validation, and testing
sets for effective model training, parameter tuning, and evaluating system performance.
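A minimal sketch of this split using scikit-learn, with a 70/15/15 partition; the file list and labels are hypothetical placeholders:

# Minimal sketch: stratified train/validation/test split of labelled images.
from sklearn.model_selection import train_test_split

image_paths = [f"frames/img_{i:04d}.jpg" for i in range(1000)]    # placeholder listing
labels = [i % 2 for i in range(1000)]                             # placeholder labels

train_x, temp_x, train_y, temp_y = train_test_split(
    image_paths, labels, test_size=0.30, random_state=42, stratify=labels)
val_x, test_x, val_y, test_y = train_test_split(
    temp_x, temp_y, test_size=0.50, random_state=42, stratify=temp_y)

print(len(train_x), len(val_x), len(test_x))   # 700 150 150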
CHAPTER 6
SYSTEM TESTING
Simulation and HIL Testing:
Validate software algorithms through rigorous simulated testing for accuracy.
Verify hardware components in real-world scenarios using Hardware-in-the-Loop (HIL) testing to
ensure robust functionality.
Assess the system's response in diverse simulated environments to identify and rectify any
discrepancies.
Performance Testing:
Assess system response times to user commands and external stimuli for real-time
efficiency.
Optimize data processing speed and algorithms to enhance overall system performance.
Verify compliance with local airspace regulations and drone operation laws, including privacy
safeguards for captured footage. Conduct thorough testing to identify and address potential
regulatory and privacy concerns.
Deploy the surveillance drone system in real-world scenarios following successful testing.
REFERENCES
[1] W. Liu et al., "SSD: Single Shot MultiBox Detector," in European Conference on Computer
Vision, 2016.
[2] MathWorks, "Convolutional Neural Network," [Online]. Available:
https://it.mathworks.com/discovery/convolutional-neural-network-matlab.html.
[3] S. Boyce, "AI on Raspberry Pi with the Intel Neural Compute Stick," 31 Jan. 2019. [Online].
Available: https://hackaday.com/2019/01/31/ai-on-raspberry-pi-with-the-intel-neuralcompute-stick/.
[4] Raspberry Pi Foundation, "Raspberry Pi," 2019. [Online]. Available:
https://www.raspberrypi.org/.
[5] Wikipedia, "TensorFlow," 2019. [Online]. Available:
https://vi.wikipedia.org/wiki/TensorFlow.
[6] Wikipedia, "Backpropagation," [Online]. Available:
https://en.wikipedia.org/wiki/Backpropagation.
[7] viblo.asia, "Convolutional Neural Networks (Mang no-ron tich chap), Part 1," [Online].
Available: https://viblo.asia/p/mang-no-ron-tich-chap-p1-DZrGNNjPGVB.
[8] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet Classification with Deep
Convolutional Neural Networks," 2012. [Online]. Available:
https://www.cs.toronto.edu/~fritz/absps/imagenet.pdf. [Accessed 3 Mar. 2019].
[9] Mlab.vn, "What is the Raspberry Pi? An introduction to Raspberry Pi 3 applications
(Raspberry Pi la gi)," [Online]. Available:
http://mlab.vn/11025-raspberry-pi-la-gi-gioi-thieu-cac-ung-dung-cua-raspberry-pi-3.html.
[10] Reglisse44, "The Drone Pi," Instructables.com. [Online]. Available:
https://www.instructables.com/id/The-Drone-Pi/.
[11] "TensorFlow Object Detection on the Raspberry Pi," 25 Feb. 2019. [Online]. Available:
https://github.com/EdjeElectronics/TensorFlow-Object-Detection-on-the-Raspberry-Pi.
[12]""TensorFlow-Object-Detection-on-the-Raspberry-Pi," 25/2/2019. [Online]. Available:
https://github.com/EdjeElectronics/TensorFlow-Object-Detection-on-the-Raspberry-Pi.".