Gesture Controlled Bluetooth Speaker
Project report submitted for the award of B. Tech. Degree
Name of the Department: Electrical Engineering
Submitted by
a. Aritra Saha (Roll No. 21/EE/030)
b. Md Istkhar Alam (Roll No. 21/EE/063)
c. Parthapratim Mahata (Roll No. 21/EE/072)
d. Rahul Parua (Roll No. 21/EE/076)
e. KM Satya Upadhyay (Roll No. 21/EE/055)
f. Sk Nasiruddin (Roll No. L22/EE/013)
Under the supervision of
Dr. Indranil Dey
Assistant Professor
Department of Electrical Engineering
DECLARATION
We hereby certify that the project entitled “Gesture Controlled Bluetooth Speaker” by Aritra
Saha (10301621030), Md Istkhar Alam (10301621063), Parthapratim Mahata (10301621072),
Rahul Parua (10301621076), KM Satya Upadhyay (10301621055) and Sk Nasiruddin
(10301622013) in partial fulfilment of the requirements for the award of the degree of B.Tech.,
submitted in the Department of Electrical Engineering at HALDIA INSTITUTE OF
TECHNOLOGY under MAULANA ABUL KALAM AZAD UNIVERSITY OF
TECHNOLOGY, WEST BENGAL (formerly WEST BENGAL UNIVERSITY OF
TECHNOLOGY), KOLKATA, is an authentic record of our own work carried out under
the supervision of Dr. Indranil Dey, Assistant Professor, Department of Electrical Engineering.
The matter presented has not been submitted by us to any other University or Institute for the
award of a B.Tech. degree.
1. 2.
(Aritra Saha) (Md Istkhar Alam)
3. 4.
(Parthapratim Mahata) (Rahul Parua)
5. 6.
(KM Satya Upadhyay) (Sk Nasiruddin)
This is to certify that the above statement made by the candidate is correct to the best of my
knowledge.
Name of the Supervisor:
Designation:
-----------------------------------------------------
(Signature)
------------------------------------------------------------
HOD
(Electrical Engineering Dept.)
ACKNOWLEDGEMENT
We would like to place on record our deep sense of gratitude to Dr. Dilip Kumar Dey, HOD,
Dept. of Electrical Engineering, for his generous guidance, help, and useful suggestions. We
express our sincere gratitude to Dr. Indranil Dey, Asst. Professor, Dept. of Electrical
Engineering, for his stimulating guidance, continuous encouragement, and supervision
throughout the course of the present work. We are extremely thankful to our guide, Dr. Indranil
Dey, for providing us the infrastructural facilities to work in, without which this work would
not have been possible.
1. 2.
(Aritra Saha) (Md Istkhar Alam)
3. 4.
(Parthapratim Mahata) (Rahul Parua)
5. 6.
(KM Satya Upadhyay) (Sk Nasiruddin)
CONTENTS
ABSTRACT
INTRODUCTION
METHODOLOGY
RESULTS AND DISCUSSIONS
CONCLUSION
REFERENCES
ABSTRACT
A communication system has been proposed that converts the sign language used by mute
people into speech, based on a novel hand-gesture recognition technique. The solution
consists of a hardware module and a software application. In the hardware module, gesture
recognition is done with the help of a sensor glove whose sensors are positioned on the
fingers, based on an analysis of American Sign Language (ASL) signs. The design of the
glove and the concept of decoding gestures by considering axis orientation with respect to
gravity and the corresponding voltage levels are discussed. In the software part, an Android
application named Speaking Gestures has been developed; it receives data (an alphabet or a
word) via Bluetooth, converts it into text, and speaks it out. Bluetooth speakers are among the
most widely used speakers today: their compact size, portability, and long battery life have
made them a centre of attraction. We take Bluetooth speakers to the next level by integrating
touchless operation. The speaker allows the user to change the music by simply swiping a
hand over it, and to adjust the volume by raising or lowering a hand above it. The user can
thus operate the speaker completely without touching either the phone or the speaker.
Everyone loves to hear music, whether in a happy or a sad mood, but individuals who are
both mute and blind struggle to operate conventional players, since they can neither see the
controls nor speak voice commands. We have therefore created a special device called the
Gesture Controlled Bluetooth Speaker. Such individuals communicate their feelings through
their hands and expressions, so this module is designed to be operated entirely by hand:
raising or lowering a hand over the speaker changes the volume, and a swipe changes the
song. The module is built around a Bluetooth module and gives access to all the main
functions of a music player, such as play, stop, and volume up/down, through hardware.
Additionally, we add a gesture-controlled slide-movement feature that removes the need for a
remote.
INTRODUCTION
The gesture-based speaker advances the state of the art of Bluetooth speakers. The system
includes an Arduino, a battery-charging board, a lidar sensor, an LED, an audio amplifier IC,
a Bluetooth module, and a 6-watt speaker with subwoofer. To connect phones to the speaker
for audio input, the system uses the Bluetooth module; a second charging input connector and
an AUX connection for audio input are also supported. The amplifier IC boosts the received
audio signal without losing any data, and the speaker module transforms this signal into
high-quality audio. The lidar sensor is installed on top of the speaker. The Arduino processes
the sensor's input and issues the corresponding command, which can change the song, adjust
the volume, or turn on the speaker, making contactless speaker operation possible. Power for
the entire device is provided by the battery pack. The battery charger and protection circuitry
regulate battery charge and discharge. Additionally, this circuitry has an internal logic system
that automatically turns the system off after more than five minutes of inactivity to conserve
power.
In today's rapidly advancing technological landscape, the fusion of hardware and software
innovations opens up a realm of possibilities for interactive and intuitive devices. One such
innovation is the Gesture Controlled Bluetooth Speaker, a project that harnesses the power of
Arduino microcontrollers to create a hands-free audio experience. This project aims to
revolutionize how users interact with their audio devices by eliminating the need for physical
buttons or remote controls. Instead, it leverages gesture recognition technology to interpret
hand movements and translate them into commands for controlling playback, volume
adjustment, and even playlist navigation.

At its core, this project combines the versatility of Arduino boards with the convenience of
Bluetooth connectivity. By integrating sensors capable of detecting gestures, such as
accelerometers or infrared sensors, with Arduino microcontrollers, we can capture and
process hand movements in real time. The Bluetooth module facilitates seamless wireless
communication between the gesture control unit and the speaker, allowing users to enjoy
their favourite music or podcasts without being tethered to the device. Whether it's adjusting
the volume with a flick of the wrist or skipping tracks with a simple hand gesture, the
intuitive nature of gesture control enhances the user experience.

Moreover, by utilizing open-source hardware and software components such as Arduino and
its gesture-recognition libraries, this project promotes accessibility and encourages
experimentation and customization. Enthusiasts can modify the code, add new gestures, or
integrate additional features to tailor the device to their preferences. In addition to its
practical applications, this project serves as an educational platform for aspiring makers and
technologists interested in exploring the intersection of electronics, programming, and
human-computer interaction. By providing detailed instructions, schematics, and code, this
project empowers individuals to delve into the fascinating world of gesture-controlled
technology.
METHODOLOGY
The Gesture Controlled Bluetooth Speaker uses sensors to detect and interpret hand gestures
and adjusts its behaviour accordingly. The sensors pick up input gestures such as tapping or
waving; based on the recognized gesture, the speaker responds, for example by changing the
volume, skipping tracks, or pausing playback. This technology gives users a hands-free way
to interact with the speaker.
Fig.: Circuit Diagram
Fig.: Block Diagram
Designing a gesture-controlled Bluetooth speaker involves incorporating sensors and
gesture-recognition algorithms to enable hands-free, intuitive control. The hardware and
software involved in the system design are detailed below.
1. Hardware Components
Microcontroller with Bluetooth: Use a microcontroller that supports Bluetooth, such as an
ESP32 or an Arduino Nano 33 BLE.
Fig.: Arduino
Arduino Uno: The Arduino Uno, which features an ATmega328P microcontroller, is a user-
friendly board for beginners, offering an easy USB connection, a replaceable chip, and an
assortment of I/O options. Arduino boards have been produced in various forms over the
years; the Uno, named after the first iteration, has remained popular throughout.
Gesture Sensor: A time-of-flight (ToF) sensor like the VL53L0X or an IR-based sensor can
detect hand gestures. Alternatively, Leap Motion sensors or cameras can provide more
complex gesture detection.
ADXL345 Sensor: The ADXL345 is an accelerometer developed by Analog Devices. It is a
high-resolution, low-power, versatile sensor widely used to measure acceleration in various
applications. The ADXL345 is a 3-axis accelerometer, meaning it measures acceleration in
three perpendicular directions (X, Y, Z). It is based on Micro-Electro-Mechanical Systems
(MEMS) technology, in which tiny microstructures integrated into the sensor detect changes
in acceleration. Its high sensitivity and accuracy make it suitable for a variety of applications,
including automotive systems, consumer electronics, industrial equipment, and robotics, and
it is broadly used in motion-detection, tilt-detection, and orientation-tracking projects.
Fig.: ADXL345 Sensor
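To make this concrete, here is a minimal sketch, assuming the ADXL345 is wired to the
Uno's I2C pins with SDO pulled low (giving the default address 0x53), that polls the
accelerometer using only the built-in Wire library. The register addresses (POWER_CTL at
0x2D, data registers from DATAX0 at 0x32) come from the ADXL345 datasheet.

#include <Wire.h>

const uint8_t ADXL345_ADDR = 0x53;  // default I2C address with SDO pulled low

void setup() {
  Serial.begin(9600);
  Wire.begin();
  Wire.beginTransmission(ADXL345_ADDR);
  Wire.write(0x2D);                 // POWER_CTL register
  Wire.write(0x08);                 // set the Measure bit to start sampling
  Wire.endTransmission();
}

void loop() {
  // Read six bytes starting at DATAX0 (0x32): X, Y, Z as 16-bit little-endian
  Wire.beginTransmission(ADXL345_ADDR);
  Wire.write(0x32);
  Wire.endTransmission(false);      // repeated start, keep the bus
  Wire.requestFrom(ADXL345_ADDR, (uint8_t)6);
  uint8_t buf[6];
  for (uint8_t i = 0; i < 6; i++) buf[i] = Wire.read();
  int16_t x = (int16_t)((buf[1] << 8) | buf[0]);
  int16_t y = (int16_t)((buf[3] << 8) | buf[2]);
  int16_t z = (int16_t)((buf[5] << 8) | buf[4]);
  Serial.print("X="); Serial.print(x);   // ~3.9 mg per LSB at the default +/-2 g range
  Serial.print(" Y="); Serial.print(y);
  Serial.print(" Z="); Serial.println(z);
  delay(100);
}

Tilt and orientation gestures can then be derived by comparing the three axis readings
against gravity, as described above.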
Speaker Driver and Amplifier: Include a speaker and amplifier circuit for quality audio
output.
Power Source: A rechargeable battery (e.g., lithium-ion) for portability.
2. Software Components
Gesture Recognition Algorithm: Develop or implement an algorithm for recognizing basic
hand gestures, such as swipe up/down, left/right, and circular motions. Simple threshold-
based detection can be used for ToF sensors, or more sophisticated models for image-based
recognition; a sketch of the threshold-based approach appears at the end of this section.
Bluetooth Communication: The microcontroller connects to the user’s device via
Bluetooth, allowing audio streaming and receiving gesture data.
Embedded Code: Program the microcontroller to interpret gestures and convert them into
specific commands (e.g., play/pause, volume up/down, skip track).
Arduino IDE: The Arduino IDE, which works with Windows, macOS, and Linux, makes it
easy to write, compile, and upload code to Arduino boards thanks to its intuitive interface,
pre-built libraries, and support for C and C++. The great number of example sketches and
their easy accessibility make this tool useful in all phases of learning and teaching.
Fig.: Arduino IDE
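The sketch below is a simplified illustration of the threshold-based approach mentioned
above: a hand within range is treated as a gesture, a brief presence as a swipe and a sustained
one as a hold. Here readDistanceCm() is a hypothetical placeholder for whichever sensor
driver is fitted (ToF, ultrasonic, or IR), and the command strings sent over serial are
assumptions about the protocol between the Arduino and the companion app; the thresholds
are indicative only.

const int GESTURE_MAX_CM = 50;       // hand farther than this is ignored
const unsigned long SWIPE_MS = 400;  // presence shorter than this counts as a swipe

// Hypothetical sensor read in centimetres; replace the body with the real
// driver call. Returns -1 when nothing is detected.
long readDistanceCm() {
  return -1;  // stub
}

void setup() {
  Serial.begin(9600);  // recognized commands are forwarded over serial
}

void loop() {
  long d = readDistanceCm();
  if (d > 0 && d < GESTURE_MAX_CM) {
    unsigned long start = millis();
    // wait until the hand leaves the detection zone
    while ((d = readDistanceCm()) > 0 && d < GESTURE_MAX_CM) {
      delay(20);
    }
    unsigned long held = millis() - start;
    if (held < SWIPE_MS) {
      Serial.println("NEXT");       // brief swipe: skip track (assumed command name)
    } else {
      Serial.println("PLAYPAUSE");  // sustained hold: toggle playback (assumed)
    }
    delay(300);  // simple debounce between gestures
  }
}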
3. Gesture – Controlled Commands
Swipe Up/Down: Adjust volume up/down.
Swipe Left/Right: Skip to the previous/next track.
Circular Motion: Control playback (e.g., play/pause).
Hold Gesture: Activate pairing mode or power off.
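A sketch of how these gesture-to-command mappings might be dispatched in code, assuming
the recognition step emits a Gesture value and sendCommand() stands in for whatever
transport the build uses (for example, serial to the Bluetooth module):

enum Gesture { SWIPE_UP, SWIPE_DOWN, SWIPE_LEFT, SWIPE_RIGHT, CIRCLE, HOLD };

void sendCommand(const char *cmd) {
  Serial.println(cmd);  // placeholder transport; assumed command names
}

void handleGesture(Gesture g) {
  switch (g) {
    case SWIPE_UP:    sendCommand("VOL_UP");      break;
    case SWIPE_DOWN:  sendCommand("VOL_DOWN");    break;
    case SWIPE_LEFT:  sendCommand("PREV");        break;
    case SWIPE_RIGHT: sendCommand("NEXT");        break;
    case CIRCLE:      sendCommand("PLAYPAUSE");   break;
    case HOLD:        sendCommand("PAIR_OR_OFF"); break;
  }
}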
4. Testing and Calibration
Prototype Testing: Test gestures in varying lighting and hand positioning scenarios to
ensure the accuracy and consistency of gesture recognition.
Sensitivity Calibration: Adjust the sensor’s sensitivity to minimize false positives or
negatives.
5. Optional Features
Voice Control Integration: Integrate voice control for a seamless hands-free experience.
LED Feedback: Use LEDs to provide feedback, indicating when a gesture is recognized or
when Bluetooth pairing is active.
Fig.: Schematic Diagram
The schematic diagram for the gesture-controlled Bluetooth speaker system is shown above.
It illustrates the setup of an ultrasonic sensor: the HC-SR04 is connected along with an
Arduino UNO and an LCD screen, which displays the distance and volume readings derived
from the sensor. On power-up, the system shows the sign-on message "GESTURE
CONTROL". The connections of the ultrasonic sensor and the LCD display are made as
shown in the schematic diagram. After completing the connections, the code is loaded into
the target device and also run in simulation.
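As an illustration of this setup, the following is a minimal sketch: it triggers the HC-SR04,
converts the echo time to centimetres, maps the distance to a volume percentage, and prints
both on the LCD after the "GESTURE CONTROL" sign-on message. The pin assignments
are assumptions for illustration, not necessarily those of the actual schematic.

#include <LiquidCrystal.h>

const int TRIG_PIN = 9;                 // assumed HC-SR04 trigger pin
const int ECHO_PIN = 8;                 // assumed HC-SR04 echo pin
LiquidCrystal lcd(12, 11, 5, 4, 3, 2);  // RS, EN, D4, D5, D6, D7 (assumed wiring)

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);  // 10 us trigger pulse
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);       // time out after 30 ms
  return duration / 58;  // ~58 us of round trip per cm of range
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  lcd.begin(16, 2);
  lcd.print("GESTURE CONTROL");   // sign-on message from the schematic
  delay(1500);
}

void loop() {
  long d = readDistanceCm();
  int volume = constrain(map(d, 5, 50, 0, 100), 0, 100);  // 5-50 cm -> 0-100 %
  lcd.clear();
  lcd.print("Dist: "); lcd.print(d); lcd.print(" cm");
  lcd.setCursor(0, 1);
  lcd.print("Vol:  "); lcd.print(volume); lcd.print(" %");
  delay(200);
}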
RESULTS AND DISCUSSIONS
The figure above shows the system detecting whether the volume should be high or low.
Through hand gestures the user can change the music, which helps blind and mute users.
According to our findings, Bluetooth technology has revolutionised the music business.
Bluetooth has simplified the use of speakers and headphones for music lovers; because of
their compact size, portability, and extended battery life, they have become very popular.
Our touchless technology elevates Bluetooth speakers to a whole new level.
Simply swiping one's hand over the Bluetooth speaker can play or pause the music. Users
may also change the distance threshold on the speaker to customise the hand-to-speaker
operating distance. As a result, the user may control every aspect of the speaker without ever
touching the device.
To describe the experimental results for a gesture-controlled Bluetooth speaker, the focus is
placed on testing outcomes for gesture control, Bluetooth connectivity, and the volume-control
and playback functionality driven by gestures. A structured outline of the system's
experimental results follows.
1. Gesture Detection Accuracy: Measure how accurately the ultrasonic sensor detects
gestures for controlling volume and playback.
Results:
Detection Range: Successfully detected gestures within a range of 5 cm to 50 cm.
Play/Pause Accuracy: Detected play/pause gestures with a success rate of 90% when the
hand was positioned at the specified distance.
Volume Control: Volume increased or decreased smoothly as the hand moved closer to or
farther from the sensor, with a 95% accuracy rate in recognizing changes.
2. Bluetooth Connectivity and Responsiveness: Evaluate the Bluetooth module's
ability to connect with devices and respond to commands in real time.
Results:
Connection Time: The Bluetooth module (HC-05) connected to a smartphone within 3-5
seconds.
Stability: Maintained a stable connection up to a range of 10 meters.
Response Delay: Commands sent via Bluetooth (play/pause, volume control) executed with
minimal delay, typically under 1 second.
3. Distance vs. Volume Control Relationship: Verify that the volume changes according
to the hand's distance from the ultrasonic sensor.
Results:
Volume Increase/Decrease: As the distance from the sensor increased, the volume
correspondingly increased, and vice versa.
Smoothness of Volume Transition: Volume adjustment was gradual, without abrupt jumps,
confirming successful implementation of a proportional distance-to-volume control system.
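One way the proportional mapping and smooth transitions described above could be realised
is sketched below: the hypothetical helper clamps readings to the reported 5-50 cm window
and eases the output toward the target so that consecutive readings never jump abruptly. The
3:1 weighting is an assumed smoothing factor, not a measured parameter.

// Maps a hand distance (cm) to a 0-100 % volume and smooths the transition.
int smoothedVolume(long distanceCm, int previousVolume) {
  int target = constrain(map(distanceCm, 5, 50, 0, 100), 0, 100);
  return (3 * previousVolume + target) / 4;  // move 25 % of the way per reading
}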
4. Reliability and Stability of the System: Test the overall reliability of the gesture
control system in various environments and over time.
Results:
Continuous Operation: The system ran without errors or resets during a continuous 2-hour
test.
Environmental Impact: Gestures were detected reliably under indoor lighting and normal
ambient noise conditions; bright lighting or reflective surfaces slightly reduced the sensor's
accuracy.
5. User Experience: Gather user feedback on the ease of use and responsiveness of
gesture controls.
Results:
User Satisfaction: 85% of users found the gesture control intuitive and responsive.
Feedback on Improvements: Users suggested slightly improving the gesture response time
and expanding the volume control range.
CONCLUSION
The Gesture Controlled Bluetooth Speaker project presents a new approach to controlling
audio devices, one that simplifies, expands, and improves the user experience. This work
paves the way for novel, intuitive control options in a variety of settings by combining
gesture detection with Bluetooth speakers. Commercialization and broad adoption of this
technology are possible outcomes of further research and improvement.
With its innovative and intuitive gesture control, the Bluetooth speaker is a giant leap forward
in the world of smart music products. The device offers a natural, hands-free way to operate
it by combining infrared (IR) sensors for gesture detection, a Bluetooth module for wireless
connection, and a microcontroller for command processing. In addition to making the
product more accessible for those with mobility issues, or for those performing tasks that
benefit from hands-free operation, this innovation improves the user experience by enabling
smooth control over music playback and volume changes. The project's successful execution
showcases the potential of gesture recognition technology in consumer electronics and will
make more advanced and interactive products possible.
The way we listen to music will never be the same with our Gesture Controlled Bluetooth
Speaker. Users have full control over their audio experience, with the ability to change
volume, skip songs, pause/play, and even personalise gestures for certain operations.
Several opportunities for growth and development lie ahead. One way to improve the gesture
detection system might be an upgrade from infrared sensors to time-of-flight sensors, or to
computer vision using machine learning algorithms. These innovations may improve
precision and accommodate a wider variety of gestures, including more subtle and intricate
hand motions. By using machine learning, the system may gradually adapt to user
interactions, learning their unique gesture patterns and enhancing the accuracy of its
detection.
In conclusion, the Gesture Controlled Bluetooth Speaker using Arduino represents a
convergence of innovative technology, convenience, and creativity. Through the integration
of gesture recognition capabilities with Arduino microcontrollers and Bluetooth connectivity,
this project offers a myriad of benefits and opportunities across different domains. The
hands-free operation and intuitive user interface provided by gesture control enhance the
accessibility and usability of audio devices, revolutionizing how users interact with their
audio experiences.

Whether it's controlling playback, adjusting volume, or navigating playlists, users can
seamlessly engage with their music or audio content using simple hand gestures, freeing
them from the constraints of physical buttons or remote controls. From modifying gesture
recognition algorithms to adding new features or integrating additional sensors, the
possibilities for experimentation and innovation are virtually limitless, fostering a culture of
exploration and discovery in the DIY electronics community. Furthermore, the applications
of gesture-controlled Bluetooth speakers extend beyond personal entertainment devices to
encompass a wide range of contexts, including smart homes, accessibility devices, public
spaces, education, healthcare, and interactive exhibits. As technology continues to evolve
and new advancements emerge, projects like this one serve as exemplars of the innovative
integration of hardware, software, and human-computer interaction.
By embracing the principles of creativity, accessibility, and collaboration, we can continue to
push the boundaries of what's possible and create solutions that enrich lives and inspire
generations to come. In essence, the Gesture Controlled Bluetooth Speaker using Arduino not
only represents a functional audio device but also embodies the spirit of innovation,
exploration, and empowerment, inviting individuals to embark on a journey of discovery and
creativity in the realm of DIY electronics.
The development of a gesture-controlled Bluetooth speaker demonstrates the potential for
enhancing user interaction with audio devices through intuitive, touch-free controls. By
leveraging an ultrasonic sensor and an Arduino microcontroller, the system successfully
interprets hand gestures to manage basic speaker functions like play, pause, and volume
adjustment.
This hands-free approach offers practical benefits in environments where physical contact
with devices may be limited or impractical, such as kitchens, medical facilities, or
workplaces.
Through iterative testing and optimization, the prototype proved effective in recognizing
gestures within a predefined range, delivering a responsive and accurate control system. It
highlights the feasibility of integrating gesture-based control into consumer electronics and
points toward possible future advancements, including multi-gesture recognition and
extended control features. Overall, this system provides a user-centred, innovative approach
to Bluetooth speaker control, underscoring the increasing relevance of gesture-controlled
technology in modern smart devices. Future improvements could focus on expanding gesture
capabilities, refining detection accuracy, and enhancing system integration, paving the way
for broader applications and even more seamless user experiences.
REFERENCES
1. X. Teng, B. Wu, W. Yu, and C. Liu, "A hand gesture recognition system based on local
linear embedding," Journal of Visual Languages & Computing, Vol. 16, pp. 442-454, 2005.
2. Y. Chen, W. Gao, and J. Ma, "Hand Gesture Recognition Based on Decision Tree," in Proc.
of ISCSLP 2006: The 5th International Symposium on Chinese Spoken Language Processing,
Kent Ridge, Singapore, December 13-15, 2006.
3. N. Sriram and M. Nithiyanandham, "A hand gesture recognition-based communication
system for silent speakers," in 2013 International Conference on Human Computer
Interactions (ICHCI), Chennai, 2013, pp. 1-5.
4. R. Lockton and A. W. Fitzgibbon, "Real-time gesture recognition using deterministic
boosting," in Proc. of the 13th British Machine Vision Conference, Cardiff, September 2-5,
2002.
5. Rupesh Prajapati, Vedant Pandey, Nupur Zamindar, Neeraj Yadav, and Prof. Neelam
Phadnis, "Hand gesture recognition & voice conversion for deaf & dumb," IEEE, Vol. 05,
Issue 04, pp. 2395-0072, 2018.
6. Alois, F., Stefan, R., Clemens, H., and Martin, R., "Orientation sensing for gesture-based
interaction with smart artefacts," IEEE Transactions on Audio, Speech, and Language
Processing, Vol. 28, No. 8, pp. 1434-1520, 2007.
7. Donglin, W., Ajeesh, P., Hye, K., and Hosub, P., "A deconvolutive neural network for
speech classification with applications to home service robot," IEEE Transactions on
Instrumentation and Measurement, Vol. 59, No. 12, pp. 2334-4620, 2010.
8. Ibrahim, P., and Srinivasa, R., "Automated speech recognition approach to continuous cue
symbols generation," International Journal of Power Control Signal and Computation,
Vol. 18, No. 8, pp. 434-520, 2007.
9. Jean, C., and Peter, B., "Recognition of Arm Gestures Using Multiple Orientation Sensors:
Gesture Classification," IEEE Intelligent Transportation Systems Conference, Vol. 13, No. 1,
pp. 334-520, 2004.
10. Joyeeta, S., and Karen, D., "Indian Sign Language Recognition Using Eigen Value
Weighted Euclidean Distance Based Classification Technique," International Journal of
Advanced Computer Science and Applications, Vol. 4, No. 2, pp. 434-820, 2013.
11. Ravikiran, J., Kavi, M., Suhas, M., Sudheender, S., and Nitin, V., "Finger Detection for
Sign Language Recognition," Proceedings of the International MultiConference of Engineers
and Computer Scientists, Vol. 2, No. 1, pp. 234-520, 2009.
12. Kekre, M., and Vaishali, K., "Speaker identification by using vector quantization," IEEE
Transactions on Mechatronics, Vol. 15, No. 1, pp. 1034-1620, 2010.
13. M. R. Islam, U. K. Mitu, R. A. Bhuiyan, and J. Shin, "Hand Gesture Feature Extraction
Using Deep Convolutional Neural Network for Recognizing American Sign Language," 2018
4th International Conference on Frontiers of Signal Processing (ICFSP), Poitiers, 2018,
pp. 115-119.
14. Chen, X., Zhang, Z., and Yang, J., "Hand Gesture Recognition Using Ultrasonic Sensors
for Human-Machine Interaction," Sensors, 17(7), 1705, 2017. DOI: 10.3390/s17071705.
15. SparkFun Electronics, "HC-05 Bluetooth Module Hookup Guide," retrieved from
https://learn.sparkfun.com/tutorials/hc-05-bluetooth-module-hookup-guide.
16. Bhat, A., and Bhargav, P., "Design and Implementation of a Gesture-Controlled System
Using Arduino and Ultrasonic Sensors," Proceedings of the International Conference on IoT
and Applications, pp. 34-38, 2018. DOI: 10.1109/ICIoTA.2018.8674075.
17. "HC-SR04 Ultrasonic Sensor Datasheet," retrieved from
https://components101.com/sensors/hc-sr04-ultrasonic-sensor.
18. Nguyen, M. T., and Tran, V. K., "A Review of Gesture-Based Control in Consumer
Electronics," Journal of Innovative Electronics, 15(2), pp. 102-109, 2021.
19. S. Ghotkar, R. Khatal, S. Khupase, S. Asati, and M. Hadap, "Hand gesture recognition for
Indian Sign Language," 2012 International Conference on Computer Communication and
Informatics, Coimbatore, 2012, pp. 1-4.
20. T. Schlömer, B. Poppinga, N. Henze, and S. Boll, "Gesture recognition with a Wii
controller," in Proc. of the 2nd International Conference on Tangible and Embedded
Interaction, Bonn, Germany, February 18-20, 2008, pp. 11-14.
21. S. Hong, N. A. Setiawan, and C. Lee, "Real-Time Vision Based Gesture Recognition for
Human-Robot Interaction," Lecture Notes in Computer Science, Vol. 4692, pp. 493-500.
22. X. Zhang, X. Chen, W. Wang, J. Yang, V. Lantz, and K. Wang, "Hand gesture recognition
and virtual game control based on 3D accelerometer and EMG sensors," in Proc. of the 13th
International Conference on Intelligent User Interfaces, Sanibel Island, USA, 2009,
pp. 401-406.
23. P. Breuer, C. Eckes, and S. Muller, "Hand Gesture Recognition with a Novel IR
Time-of-Flight Range Camera: A Pilot Study," Lecture Notes in Computer Science,
Vol. 4418, 2007.
24. Priyanka Kulkarni and Dr. Swaroopa Shastri, "Rice Leaf Diseases Detection Using
Machine Learning," Journal of Scientific Research and Technology, 2(1), pp. 17-22, 2024.
https://doi.org/10.61808/jsrt81
25. Shilpa Patil, "Security for Electronic Health Record Based on Attribute using Block-Chain
Technology," Journal of Scientific Research and Technology, 1(6), pp. 145-155, 2023.
https://doi.org/10.5281/zenodo.8330325
26. Mohammed Maaz, Md Akif Ahmed, Md Maqsood, and Dr. Shridevi Soma, "Development
of Service Deployment Models in Private Cloud," Journal of Scientific Research and
Technology, 1(9), pp. 1-12, 2023. https://doi.org/10.61808/jsrt74
27. Antariksh Sharma, Prof. Vibhakar Mansotra, and Kuljeet Singh, "Detection of Mirai
Botnet Attacks on IoT Devices Using Deep Learning," Journal of Scientific Research and
Technology, 1(6), pp. 174-187, 2023.
28. Dr. Megha Rani Raigonda and Shweta, "Signature Verification System Using SSIM in
Image Processing," Journal of Scientific Research and Technology, 2(1), pp. 5-11, 2024.
https://doi.org/10.61808/jsrt79
29. Shri Udayshankar B, Veeraj R Singh, Sampras P, and Aryan Dhage, "Fake Job Post
Prediction Using Data Mining," Journal of Scientific Research and Technology, 1(2),
pp. 39-47, 2023.
30. Gaurav Prajapati, Avinash, Lav Kumar, and Smt. Rekha S Patil, "Road Accident
Prediction Using Machine Learning," Journal of Scientific Research and Technology, 1(2),
pp. 48-59, 2023.
31. Dr. Rekha Patil, Vidya Kumar Katrabad, Mahanthappa, and Sunil Kumar, "Image
Classification Using CNN Model Based on Deep Learning," Journal of Scientific Research
and Technology, 1(2), pp. 60-71, 2023.
32. Ambresh Bhadra Shetty and Surekha Patil, "Movie Success and Rating Prediction Using
Data Mining," Journal of Scientific Research and Technology, 2(1), pp. 1-4, 2024.
https://doi.org/10.61808/jsrt78
33. S. M. Amin and B. F. Wollenberg, "Toward a smart grid: power delivery for the 21st
century," IEEE Power and Energy Magazine, Vol. 3, No. 5, pp. 34-41, 2005.
34. Dr. Suvarna Nandyal, Prajita R Udgiri, and Sakshi Sherikar, "Smart Glasses for Visually
Impaired Person," Journal of Scientific Research and Technology, 1(3), pp. 21-31, 2023.
https://doi.org/10.5281/zenodo.8021418
35. Dr. Rekha J Patil, Indira Mulage, and Nishant Patil, "Smart Agriculture Using IoT and
Machine Learning," Journal of Scientific Research and Technology, 1(3), pp. 47-59, 2023.
https://doi.org/10.5281/zenodo.8025371