
SIXTH SENSE NAVIGATION FOR THE OPERATING SYSTEM

By:
Varun G.S
Krishna Kumar U
Gowtham N.G
Thilak Shenoy

Guide:
Mr. Shyam Karanth

Existing system:
The mouse and the keyboard are the usual sources of input in a computing
environment, which confines the user to sitting in front of the workstation in order to
input into the system. Whether the mouse is an optical mouse or a traditional ball mouse,
it does not give the user the flexibility of interacting with the system from a distance.

Proposed system:
We propose to replace the traditional mouse, which keeps the user stuck in front of the
computer, with a convenient gesture-based system. A simple web camera captures the
gestures and performs mouse operations such as pointing, selection and navigation.
Keyboard input can be implemented in a similar way: the palm serves as an imaginary
keyboard, and the web camera captures the image and records the specific character to be
interpreted. A gesture input system can be used by anyone who does not wish to be tied
down to a desk when using a computer, making it well suited to giving presentations or
web surfing from the couch. Additionally, since the input system does not exert pressure
on the median nerve at the wrist while in use, it may help prevent repetitive stress injuries.

Requirements
Software requirements:

• Windows XP or higher
• Matlab
• Embedded C

Hardware requirements:

• Processor : Intel Pentium IV
• RAM : 1 GB
• Monitor : 14” SVGA Digital Color Monitor
• Hard Disk : 20 GB
• Microcontroller : 89C52
• Web Cam : Bluetooth web camera
Design

The design can be summarized as the following pipeline:

Web camera : scans the image and sends it to the PC.
PC : background subtraction → interpret the gesture → convert to the coordinate
system → mark the mouse pointer location.
Microcontroller : converts the marked point into a PS/2 signal and sends it to the PC.

Methodology:

In this project we implement a simple concept: navigating the mouse by sensing
changes in hand gestures. The gesture is scanned by a simple web camera (preferably a
Bluetooth one) and fed into the system. The PC interprets the image and sends the result to
the microcontroller, which in turn performs the mouse operation.

The web camera does the simple task of scanning the image and sending it to the
computer. The computer locates the hand, interprets the gesture (pointing, selecting,
zoom-in, zoom-out) and converts it into the coordinate system. The mouse pointer location
is noted and a signal is generated for the microcontroller, which takes this location and
converts it into a PS/2 signal notifying the movement of the mouse.
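The conversion step on the microcontroller side can be illustrated with the standard PS/2 mouse movement packet, which is three bytes: a flags byte (button states, sign bits, a bit that is always set) followed by the X and Y deltas. The actual project would implement this on the 89C52 in Embedded C; the Python sketch below only illustrates the packet layout, and the function name and delta values are assumptions for illustration.

```python
# Illustrative sketch (not the project's Embedded C code): pack a pointer
# movement into a standard 3-byte PS/2 mouse movement packet.

def ps2_mouse_packet(dx, dy, left=False, right=False, middle=False):
    """Build the 3-byte PS/2 movement packet for deltas in [-255, 255]."""
    flags = 0x08                      # bit 3 of the first byte is always 1
    flags |= 0x01 if left else 0      # bit 0: left button pressed
    flags |= 0x02 if right else 0     # bit 1: right button pressed
    flags |= 0x04 if middle else 0    # bit 2: middle button pressed
    if dx < 0:
        flags |= 0x10                 # bit 4: X sign bit
    if dy < 0:
        flags |= 0x20                 # bit 5: Y sign bit
    # Deltas travel as 9-bit two's complement: sign bit above, low 8 bits here.
    return bytes([flags, dx & 0xFF, dy & 0xFF])

pkt = ps2_mouse_packet(5, -3, left=True)
print(pkt.hex())                      # '2905fd'
```

A host reading this packet would recombine the sign bits with the low bytes to recover the signed deltas and move the pointer accordingly.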
We predominantly use the following algorithms and technologies to process the
image:

Background Subtraction Algorithm: A video stream of the static background is
obtained and a mean background is computed from the different frames. This background
template is saved and used with every single incoming frame from the video stream to
obtain the foreground. A threshold is then applied to the resulting foreground image to
eliminate noise, at the same time creating a binary image of the foreground.
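The two steps above (mean background template, then thresholded difference) can be sketched as follows. The project itself would do this in Matlab; this pure-Python version works on tiny grayscale frames represented as lists of lists, and the frame sizes, pixel values and threshold of 30 are illustrative assumptions.

```python
# Sketch of the background subtraction step, assuming grayscale frames
# given as nested lists of pixel intensities.

def mean_background(frames):
    """Average several frames of the static background pixel-by-pixel."""
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / len(frames) for x in range(w)]
            for y in range(h)]

def foreground_mask(frame, background, threshold=30):
    """Binary foreground image: 1 where the frame differs enough from the background."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(row, brow)]
            for row, brow in zip(frame, background)]

# Two noisy frames of the static background, then a frame with a bright blob.
bg_frames = [[[10, 10], [10, 10]],
             [[12, 12], [12, 12]]]
bg = mean_background(bg_frames)       # saved background template
frame = [[200, 11], [10, 12]]         # "hand" appears at pixel (0, 0)
print(foreground_mask(frame, bg))     # [[1, 0], [0, 0]]
```

The threshold is what turns the raw difference image into the binary foreground mentioned in the text: small differences from camera noise fall below it, while the hand's pixels exceed it.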

This project also implements Sixth Sense technology, which incorporates the gesture
processing. The Sixth Sense prototype comprises a pocket projector, a mirror and a camera
contained in a pendant-like, wearable device. Both the projector and the camera are connected to
a mobile computing device in the user's pocket. The projector projects visual information,
enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera
recognizes and tracks the user's hand gestures and physical objects using computer-vision-based
techniques. The software processes the video stream captured by the camera and
tracks the locations of the colored markers at the tips of the user's fingers. The movements and
arrangements of these markers are interpreted into gestures that act as interaction instructions for
the projected application interfaces.
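The marker-tracking idea can be sketched as locating the centroid of pixels that match a marker's color in each frame. The simple per-channel color match, the tolerance value and the toy 3×3 frame below are illustrative assumptions; the actual Sixth Sense prototype uses full computer-vision tracking.

```python
# Sketch of colored-marker tracking: find the average position of pixels
# close to a target RGB color in a small frame (nested lists of RGB tuples).

def marker_centroid(frame, target, tol=30):
    """Centroid (x, y) of pixels within `tol` of `target` per channel, or None."""
    hits = [(x, y)
            for y, row in enumerate(frame)
            for x, pixel in enumerate(row)
            if all(abs(c - t) <= tol for c, t in zip(pixel, target))]
    if not hits:
        return None                   # marker not visible in this frame
    return (sum(x for x, _ in hits) / len(hits),
            sum(y for _, y in hits) / len(hits))

RED = (255, 0, 0)
BLACK = (0, 0, 0)
# 3x3 frame with a red fingertip marker covering two adjacent pixels.
frame = [[BLACK, RED, RED],
         [BLACK, BLACK, BLACK],
         [BLACK, BLACK, BLACK]]
print(marker_centroid(frame, RED))    # (1.5, 0.0)
```

Tracking the centroid across successive frames yields the movement of each fingertip, which is what gets interpreted as a gesture.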
