IJRET: International Journal of Research in Engineering and Technology | eISSN: 2319-1163 | pISSN: 2321-7308
Volume: 03, Issue: 06 | Jun-2014 | Available @ http://www.ijret.org
HAND GESTURE RECOGNITION USING ULTRASONIC SENSOR AND
ATMEGA128 MICROCONTROLLER
Nidhi Gupta¹, Ramandeep Singh², Sidharth Bhatia³
¹PG Student, EECE Department, ITM University, Gurgaon, Haryana, India
²Assistant Professor, EECE Department, ITM University, Gurgaon, Haryana, India
³Assistant Professor, EECE Department, ITM University, Gurgaon, Haryana, India
Abstract
The aim of this paper is to present a system that detects hand gesture motions using the principle of the Doppler Effect. Ultrasonic waves are transmitted by a sensor module and reflected by a moving hand. The received waves are frequency shifted due to the Doppler Effect, and these frequency shifts are characterized to determine gestures. Once recognized, the gestures are mapped into commands that control the movement of a small robot. The current work spans only four gestures, mapped to the commands move forward, move backward, move left and move right.
Keywords: Hand Gesture Recognition, Ultrasonic Doppler, Human Computer Interface, Sonar, Presence Detection
1. INTRODUCTION
To recognize hand gestures, the principle of the Doppler Effect is used here. An ultrasonic wave of frequency 22 kHz is generated with the help of an ultrasonic sensor. When a hand is waved near the source of the ultrasound wave, there is a shift in the frequency of the sound wave due to the Doppler Effect. The received signal is then analyzed for characteristic shifts to determine whether the motion was a push, a pull, a clockwise rotation or an anti-clockwise rotation. The results are indicated by the movement of a small robot: the robot moves forwards if the hand motion was a push; backwards if the hand motion was a pull; left if the hand motion was an anti-clockwise rotation; and right if the hand motion was a clockwise rotation (see Table 1).
This project is based on Microsoft's SoundWave [1] and the Acoustic Doppler Sonar (ADS) [2].
The system comprises an ultrasonic transmitter and receiver pair. The transmitter emits an inaudible tone that is reflected by a hand moving in proximity. The reflected tone undergoes a frequency shift due to the Doppler Effect; the amount of shift depends on the velocity of the hand. The receiver captures the frequency-shifted signals, which are used to characterize the movement of the hand in the time domain.
The incoming time-domain signals are buffered [3] and a Fourier transform is applied to them. The result of this operation is a set of magnitude vectors spread equally over the spectral width.
The relationship between the observed frequency $f$ and the emitted frequency $f_0$ is given by

$$f = \frac{c + v_r}{c + v_s}\, f_0 \qquad (1)$$
where
$c$ is the velocity of the waves in air;
$v_r$ is the velocity of the receiver relative to the air (positive if the receiver is moving towards the source, negative otherwise);
$v_s$ is the velocity of the source relative to the air (positive if the source is moving away from the receiver, negative if it is moving towards it).
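For a wave reflected off a moving hand, the hand acts first as a moving receiver and then as a re-emitting source, so for hand speeds much smaller than $c$ the round-trip shift is approximately $\Delta f \approx (2v/c)\,f_0$. As a quick worked check (the 0.5 m/s hand speed is illustrative, not a figure from the paper):

$$\Delta f \approx \frac{2 \times 0.5\ \text{m/s}}{343\ \text{m/s}} \times 22\,000\ \text{Hz} \approx 64\ \text{Hz}$$

so the gesture energy appears within roughly a hundred hertz of the 22 kHz carrier.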
After each FFT vector is computed, it is further processed to determine the bandwidth of the signal, the speed of the gesture and whether motion occurred. The detected motions are then converted to robot commands.
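A minimal sketch of this spectral step in C, assuming a fixed-size sample window; SAMPLES, FS and F_TX are illustrative values, not parameters from the paper, and a real ATmega128 build would favor a fixed-point FFT or the Goertzel algorithm over this floating-point DFT:

```c
#include <math.h>
#include <stdint.h>

#define SAMPLES 128          /* buffered ADC window size (assumed)   */
#define FS      100000.0f    /* sampling rate in Hz (assumed)        */
#define F_TX    22000.0f     /* transmitted ultrasonic tone          */

/* Magnitude spectrum of one window via a plain DFT: the result is a
 * vector of magnitudes spread equally over the spectral width. */
static void dft_magnitude(const int16_t x[SAMPLES], float mag[SAMPLES / 2])
{
    for (int k = 0; k < SAMPLES / 2; k++) {
        float re = 0.0f, im = 0.0f;
        for (int n = 0; n < SAMPLES; n++) {
            float w = 6.2831853f * k * n / SAMPLES;
            re += x[n] * cosf(w);
            im -= x[n] * sinf(w);
        }
        mag[k] = sqrtf(re * re + im * im);
    }
}

/* Doppler shift (Hz) of the strongest bin relative to the carrier:
 * positive means the hand is approaching, negative means receding. */
static float doppler_shift(const float mag[SAMPLES / 2])
{
    int peak = 0;
    for (int k = 1; k < SAMPLES / 2; k++)
        if (mag[k] > mag[peak])
            peak = k;
    return peak * FS / SAMPLES - F_TX;
}
```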
2. BLOCK DIAGRAM
Fig 1: Components in the hardware setup
The hardware configuration for this system is as follows: an ultrasonic module with an on-board transmitter and receiver is connected to the ATMEGA128 microcontroller.
The received analog input is converted to digital values by the microcontroller's built-in ADC (analog-to-digital converter).
The microcontroller (AVR ATMEGA128) triggers emission of the waves at a regular interval. If there is any obstacle (a hand, in this case), the ultrasonic waves are reflected back, frequency shifted. The microcontroller then multiplies the incoming and outgoing signals to compare them in hardware. This multiplication exploits a fundamental property of sine waves: the product of two sine waves contains two phase-shifted components, one at a frequency equal to the difference of the two input frequencies, and the other at a frequency equal to their sum:
๐‘ ๐‘–๐‘› 2๐œ‹๐‘“๐‘Ž ๐‘ฅ ๐‘ ๐‘–๐‘› 2๐œ‹๐‘“๐‘ ๐‘ฅ =
๐‘๐‘œ๐‘  2๐œ‹ ๐‘“๐‘Ž โˆ’ ๐‘“๐‘ ๐‘ฅ + ๐‘๐‘œ๐‘ โก2๐œ‹ ๐‘“๐‘Ž + ๐‘“๐‘ ๐‘ฅ 2 (2)
The robot receives a command from the microcontroller corresponding to each gesture. An LCD is used to display textual information about the analyzed signal.
The gestures and the corresponding functions executed in the current project are as follows:
Table 1: Gesture and function chart

Gesture Motion          | Function
Front Movement          | Move Forward
Back Movement           | Move Backward
Anti-Clockwise Movement | Move Left
Clockwise Movement      | Move Right
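In code, Table 1 reduces to a lookup from recognized gesture to drive command. A sketch in C (drive_motors is a hypothetical helper wrapping the L293 input pins, with +1/-1 denoting each side's direction):

```c
/* Hypothetical helper: sets one side's L293 inputs; +1 forward, -1 reverse. */
void drive_motors(int left, int right);

typedef enum { FRONT, BACK, ANTI_CLOCKWISE, CLOCKWISE } gesture_t;

/* Execute the Table 1 mapping for a recognized gesture. */
void execute_gesture(gesture_t g)
{
    switch (g) {
    case FRONT:          drive_motors(+1, +1); break; /* move forward  */
    case BACK:           drive_motors(-1, -1); break; /* move backward */
    case ANTI_CLOCKWISE: drive_motors(-1, +1); break; /* move left     */
    case CLOCKWISE:      drive_motors(+1, -1); break; /* move right    */
    }
}
```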
3. ALGORITHM
Fig 2: Finite State Machine
The initial state is the Fill state: peaks (local highs above a predefined threshold) in the ADC samples fill the buffer until it is full. In the Check state the local highs are analyzed to determine whether the motion matches one of the gestures predefined in the database; the number of peaks that dropped from HIGH to LOW and the number that rose from LOW to HIGH are counted. The machine transitions from Check to WaitPush if a gesture is present in the collected sample data, and from Check to WaitNone if there was none. FindPeak fills the buffer with a single peak, in contrast to the Fill state, where the buffer is completely filled with local HIGHs. This separate state is needed because the buffer does not have to be completely refilled when no gesture was detected.
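The state machine of Fig 2 maps naturally onto a switch statement. A skeleton in C (the helper predicates are hypothetical, and the transitions out of WaitPush and WaitNone are inferred from the description above rather than stated explicitly in the paper):

```c
typedef enum { FILL, CHECK, WAIT_PUSH, WAIT_NONE, FIND_PEAK } state_t;

/* Hypothetical helpers standing in for the buffer/peak logic. */
int  buffer_full(void);     /* buffer filled with local highs?           */
int  gesture_found(void);   /* peak pattern matches a database gesture?  */
int  peak_captured(void);   /* one new peak captured?                    */
void issue_command(void);   /* send the mapped command to the robot      */

void fsm_step(void)
{
    static state_t state = FILL;           /* initial state is Fill      */
    switch (state) {
    case FILL:                             /* fill buffer with peaks     */
        if (buffer_full()) state = CHECK;
        break;
    case CHECK:                            /* analyze HIGH/LOW peak counts */
        state = gesture_found() ? WAIT_PUSH : WAIT_NONE;
        break;
    case WAIT_PUSH:                        /* a gesture was present      */
        issue_command();
        state = FIND_PEAK;
        break;
    case WAIT_NONE:                        /* no gesture this window     */
        state = FIND_PEAK;
        break;
    case FIND_PEAK:                        /* refill only a single peak  */
        if (peak_captured()) state = CHECK;
        break;
    }
}
```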
4. HARDWARE CONSIDERATION
Fig 3: Schematic diagram for hand gesture controlled
robotic movement
The above circuit comprises the ATMEGA128 microcontroller, an HC-SR04 ultrasonic sensor, two L293 motor-driver ICs and four DC motors. The sensor has an on-board transmitter and receiver. The sensor's TRIG and ECHO pins are connected to the microcontroller's ADC port; the motor drivers are connected to the other input/output ports.
The microcontroller drives the pin connected to the sensor's TRIG pin to logic HIGH at regular intervals, emitting bursts of ultrasonic waves.
If any object obstructs these signals, the ultrasonic waves are reflected back, and the receiver in the ultrasonic module delivers the frequency-shifted signal to the pin connected to the sensor's ECHO pin. The received waves are buffered and stored in the microcontroller, which analyzes the signal for a pattern. If the pattern matches a gesture in the database, the microcontroller determines the command associated with that gesture and issues it as logic HIGH or LOW levels on the pins of the L293D motor driver. These commands correspond to the robot movements listed in Table 1.
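A sketch of the trigger-and-sample loop in AVR C (avr-gcc/avr-libc). The echo is read through ADC channel 0, matching PORTF.0 in Table 2; the TRIG pin choice, clock frequency and timing constants are assumptions, as the paper does not specify them:

```c
#define F_CPU 16000000UL    /* assumed clock; needed by util/delay.h */
#include <avr/io.h>
#include <util/delay.h>
#include <stdint.h>

#define TRIG  PF1           /* assumed TRIG pin on the ADC port */
#define NSAMP 128

static uint16_t buf[NSAMP];              /* window of echo samples */

static void adc_init(void)
{
    ADMUX  = 0;                          /* channel ADC0 (PORTF.0), AREF */
    ADCSRA = _BV(ADEN) | _BV(ADPS2);     /* enable ADC, clk/16 prescaler */
}

static uint16_t adc_read(void)
{
    ADCSRA |= _BV(ADSC);                 /* start one conversion */
    while (ADCSRA & _BV(ADSC)) ;         /* busy-wait until done */
    return ADC;                          /* 10-bit result        */
}

int main(void)
{
    DDRF |= _BV(TRIG);                   /* TRIG as output */
    adc_init();
    for (;;) {
        PORTF |=  _BV(TRIG);             /* emit a burst: 10 us pulse */
        _delay_us(10);
        PORTF &= ~_BV(TRIG);
        for (uint8_t i = 0; i < NSAMP; i++)
            buf[i] = adc_read();         /* buffer the shifted echo   */
        /* ...analyze buf[] for a gesture pattern here... */
        _delay_ms(50);                   /* regular trigger interval  */
    }
}
```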
The microcontroller's processing is also reflected on the LCD: the recognized gestures and the corresponding robot actions are shown on the Liquid Crystal Display at the same time as the commands are sent to the robot. This provides an additional result display as well as a verification mechanism.
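A sketch of the LCD write path using Table 2's pin assignment (PORTE as the 8-bit data bus, PD0 = RS, PD1 = RW, PD2 = EN); HD44780-style timing is assumed and initialization is omitted:

```c
#define F_CPU 16000000UL      /* assumed clock; needed by util/delay.h */
#include <avr/io.h>
#include <util/delay.h>
#include <stdint.h>

/* Write one byte to the LCD; is_data selects character vs. instruction. */
static void lcd_write(uint8_t value, uint8_t is_data)
{
    if (is_data) PORTD |=  _BV(PD0);    /* RS = 1: character data */
    else         PORTD &= ~_BV(PD0);    /* RS = 0: instruction    */
    PORTD &= ~_BV(PD1);                 /* RW = 0: write          */
    PORTE  = value;                     /* byte on the data bus   */
    PORTD |=  _BV(PD2);                 /* pulse EN high...       */
    _delay_us(1);
    PORTD &= ~_BV(PD2);                 /* ...latch on the fall   */
    _delay_us(50);                      /* allow the write time   */
}

static void lcd_print(const char *s)
{
    while (*s)
        lcd_write((uint8_t)*s++, 1);
}

/* e.g. lcd_print("FRONT -> FORWARD"); once a gesture is classified */
```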
Table 2: PIN Description

Pin Number                     | Name                                | I/O | Function
61                             | PORTF.0                             | IN  | Receive input from the sensor
12, 13, 37, 38                 | PORTB.2, PORTB.3, PORTC.2, PORTC.3  | OUT | Enable the motors
10, 11, 14, 15, 35, 36, 39, 40 | PORTB.0, PORTB.1, PORTB.4, PORTB.5, PORTC.0, PORTC.1, PORTC.4, PORTC.5 | OUT | DC motor values
25                             | PORTD.0                             | OUT | LCD RS pin
26                             | PORTD.1                             | OUT | LCD RW pin
27                             | PORTD.2                             | OUT | LCD EN pin
2-9                            | PORTE                               | OUT | LCD DATA
23/24                          | XTAL                                | IN  | Crystal oscillator
64                             | Vcc                                 | IN  | Power source
62                             | ARef                                | IN  | Ground
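The direction-register setup implied by Table 2 can be written down directly; a sketch in AVR C (pin groupings follow the table, other peripheral setup omitted):

```c
#include <avr/io.h>

static void io_init(void)
{
    DDRF &= ~_BV(PF0);                          /* sensor input on PORTF.0 */
    DDRB |= _BV(PB0) | _BV(PB1) | _BV(PB2) |    /* motor enable/value pins */
            _BV(PB3) | _BV(PB4) | _BV(PB5);
    DDRC |= _BV(PC0) | _BV(PC1) | _BV(PC2) |    /* motor enable/value pins */
            _BV(PC3) | _BV(PC4) | _BV(PC5);
    DDRD |= _BV(PD0) | _BV(PD1) | _BV(PD2);     /* LCD RS, RW, EN          */
    DDRE  = 0xFF;                               /* LCD 8-bit data bus      */
}
```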
5. APPLICATION AREAS
Gesture recognition is useful for capturing information from human beings that is not conveyed through speech or other methods. The technology is useful in the following areas:
a. Immersive gaming technology: Gestures may be used to control interactions with a gaming console, giving a more interactive and immersive experience.
b. Sign language interpretation: Gesture recognition can be used to transcribe signs into text, just as speech recognition transcribes speech. This would be greatly helpful for the speech impaired.
c. Control through facial gestures: The technology can be extended to applications demanding even more precision, such as recognizing facial gestures. This helps in situations where users cannot use other input interfaces such as a mouse, a keyboard or even hand gestures, and would additionally be useful in applications such as mood sensing.
d. Alternative computer interfaces: Robust gesture recognition can be used to accomplish common tasks traditionally performed with input devices such as the mouse or keyboard. Gestures, combined with other methodologies such as speech recognition, could control electronic appliances and gadgets completely, or with little need to type or touch. [4] [5] [6]
e. Remote control: With gesture recognition it is possible to use the hand alone as a remote control for various devices. The signal must indicate not only the desired response but also which device is to be controlled. [9] [10] [11]
f. Home appliance control: Gesture recognition technology can be extended to control household appliances. [10]
6. ADVANTAGES & LIMITATIONS
Gesture recognition is very useful for automation. Gestures, a natural language of humans, provide an intuitive and effortless interface for communicating with computers. They reduce our need for devices such as the mouse, keyboard or remote control when interacting with electronic devices. When combined with other advanced user-interface technologies such as voice commands and face recognition, gestures can create a richer user experience that strives to understand the human "language," thereby fueling the next wave of electronic innovation.
The technology is limited in that not all human signs or gestures are recognizable with it. The ultrasonic waves spread out and cannot be used to detect fine gestures such as the victory sign, which is made with two fingers.
7. RESULTS
Gesture recognition using ultrasonic waves was found to be accurate and reliable. The testing methodology comprised movements of a single hand and of multiple hands. Single-hand movement was detected accurately; with multiple hands, movement was not detected accurately. Detection was not influenced by the background area, and noise in the human audible range did not affect detection.
8. CONCLUSIONS & FUTURE SCOPE
Additional gesture recognition opportunities exist in medical
applications where, for health and safety reasons, a nurse or
doctor may not be able to touch a display or track-pad but
still needs to control a system. In other cases, the medical
professional may not be within reach of the display yet still needs to manipulate the content being shown on it.
Appropriate gestures, such as hand swipes or using a finger
as a virtual mouse, are a safer and faster way to control the
device.
Gesture recognition may also be used in automobiles for user control, motivated among other things by the incremental convenience and safety it offers.
Robots trained to recognize gestures could operate more independently in situations of social need, such as rehabilitation or disaster response.
ACKNOWLEDGEMENTS
Several people helped us through this project, notably the lab assistants, who provided hardware and support, and our friends and associates, through numerous code reviews and innumerable discussions.
REFERENCES
[1]. Sidhant Gupta, Dan Morris, Shwetak N. Patel, Desney Tan, "SoundWave: Using the Doppler Effect to Sense Gestures", ACM, 2012.
[2]. Kaustubh Kalgaonkar, Bhiksha Raj, "One-Handed Gesture Recognition Using Ultrasonic Sonar", IEEE, 2009.
[3]. Thomas Schlömer, Benjamin Poppinga, Niels Henze, Susanne Boll, "Gesture Recognition with a Wii Controller", Proceedings of the 2nd International Conference on Tangible and Embedded Interaction, 2008.
[4]. Afshin Sepehri, Yaser Yacoob, Larry S. Davis, "Employing the Hand as an Interface Device", Journal of Multimedia.
[5]. Lars Bretzner, Tony Lindeberg, "Use Your Hand as a 3-D Mouse ...", Proc. 5th European Conference on Computer Vision (H. Burkhardt and B. Neumann, eds.), June 1998.
[6]. Matthew Turk, Mathias Kölsch, "Perceptual Interfaces", UCSB Technical Report 2003-33, University of California, Santa Barbara.
[7]. M. Porta, "Vision-Based User Interfaces: Methods and Applications", International Journal of Human-Computer Studies, 2002.
[8]. K. Henriksen, J. Sporring, K. Hornbæk, "Virtual Trackballs Revisited", IEEE Transactions on Visualization and Computer Graphics, 2004.
[9]. Jun-Hyeong Do, Jin-Woo Jung, Sung Hoon Jung, Hyoyoung Jang, Zeungnam Bien, "Advanced Soft Remote Control System Using Hand Gesture", Mexican International Conference on Artificial Intelligence, 2006.
[10]. K. Ouchi, N. Esaka, Y. Tamura, M. Hirahara, M. Doi, "Magic Wand: An Intuitive Gesture Remote Control for Home Appliances", International Conference on Active Media Technology (AMT), 2005.
[11]. Lars Bretzner, Ivan Laptev, Tony Lindeberg, Sören Lenman, Yngve Sundblad, "A Prototype System for Computer Vision Based Human Computer Interaction", 2001.
BIOGRAPHIES
Mrs. Nidhi Gupta is an M.Tech (Embedded Systems) student at ITM University. She holds a Bachelor's degree in Information Technology from the University of Delhi and has six years of experience in the telecom industry.

Ramandeep Singh is an Assistant Professor in the EECE Department of ITM University, Gurgaon, where he is pursuing a Ph.D. in embedded systems. He completed an M.E. in embedded systems in 2009 and is a B.Tech. graduate of GGSIPU, Delhi. His core research areas are low-power embedded systems, robotics, FPGA-based embedded systems and SCADA. Prior to joining ITM University he worked with NXP Semiconductors, Bangalore. He has various publications in international journals and conferences.

Mr. Sidharth Bhatia's expertise is in the domain of computer vision and applications of robotics. He has published three research papers in the field of computer vision in international journals. He received his M.Tech (Embedded Systems) from SRM-IST, Chennai and his B.Tech from SRM-IST, Modinagar.