Gesture Recognition Using Accelerometer
- Last Updated: 05 January 2014
This ECE project discusses gesture recognition using an accelerometer. Hand gestures provide an attractive alternative to cumbersome interface devices for human-computer interaction. The aim of the project is to sense the movement of a user's hand and to recognize the gestures using a gesture recognition algorithm. A simple inertial sensor, an accelerometer, is used to obtain the dynamic/static profile of the movement of the user's hand; it reports the acceleration sensed by the device along each of the three axes. An Arduino UNO R3 board handles serial communication of this accelerometer data with the computer. Based on the accelerometer values, hand movements are detected and classified into previously trained gestures. Discrete hidden Markov models form the core of the gesture recognition system. Four test gestures were defined and used to evaluate the performance of the application, and an average recognition rate of over 70 percent was achieved. The processing of data and recognition of gestures was done in MATLAB.
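The serial link mentioned above has to be read and parsed on the host side before any recognition can take place. As an illustration only, the sketch below (a minimal Python sketch, not the project's actual MATLAB code) assumes the Arduino streams one comma-separated `ax,ay,az` line per sample; the wire format and the port name are assumptions, not details from the project.

```python
def parse_accel_line(line):
    """Parse one 'ax,ay,az' line (assumed wire format) into an
    (ax, ay, az) tuple of raw integer ADC readings, or None if
    the line is malformed or was cut short by the serial read."""
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None
    try:
        return tuple(int(p) for p in parts)
    except ValueError:
        return None

# With hardware attached, this would typically be fed from pyserial:
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 9600)  # hypothetical port name
#   sample = parse_accel_line(port.readline().decode("ascii", "ignore"))
```

Returning `None` for malformed lines matters in practice, because the first `readline()` after opening the port often catches a partial line.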
As computation plays an increasingly important role in enhancing the quality of life, more and more research has been directed towards natural human-computer interaction. In a smart environment, people usually want to use the most natural and convenient means to express their intentions and interact with their surroundings. Button pressing, often used in remote control panels, is the most traditional way of giving commands to household appliances. Such operation, however, is not natural and is sometimes even inconvenient, especially for elderly or visually impaired people who cannot distinguish the buttons on the device. In this regard, gesture-based interaction offers an alternative in a smart environment. Most previous work on gesture recognition has been based on computer vision techniques. However, the performance of such vision-based approaches depends strongly on lighting conditions and camera facing angles, which greatly restricts their application in smart environments. Suppose you are watching a movie in your home theater with the lights dimmed. If you intend to change the TV volume with a gesture, it turns out to be rather difficult to recognize the gesture accurately under such poor lighting conditions using a camera-based system. In addition, it is uncomfortable and inconvenient to always have to face the camera directly to complete a gesture.
Gesture recognition from accelerometer data is an emerging technique for gesture-based interaction that suits the requirements of ubiquitous computing environments well. With the rapid development of MEMS (Micro-Electro-Mechanical Systems) technology, people can wear or carry one or more accelerometer-equipped devices in daily life, for example the Apple iPhone or the Nintendo Wiimote. These wireless-enabled mobile/wearable devices provide new possibilities for interacting with a wide range of applications, such as home appliances and mixed reality. The first step of an accelerometer-based gesture recognition system is to capture the time series of a gesture motion. Most accelerometers can now capture three-axis acceleration data; such 3D accelerometers convey more motion information than 2D accelerometers and have been embedded in several commercial products such as the iPhone and the Wiimote.
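Because discrete hidden Markov models operate on symbol sequences, the captured three-axis time series must first be quantized into a finite alphabet. One common approach, sketched below in Python with a hypothetical six-direction codebook (the project's actual codebook is not specified), maps each sample to its nearest codeword:

```python
import math

# Hypothetical codebook of 3-axis acceleration prototypes (in g):
# one codeword per dominant direction of motion.
CODEBOOK = [
    ( 1.0,  0.0,  0.0),   # symbol 0: +x
    (-1.0,  0.0,  0.0),   # symbol 1: -x
    ( 0.0,  1.0,  0.0),   # symbol 2: +y
    ( 0.0, -1.0,  0.0),   # symbol 3: -y
    ( 0.0,  0.0,  1.0),   # symbol 4: +z
    ( 0.0,  0.0, -1.0),   # symbol 5: -z
]

def quantize(sample):
    """Map one (ax, ay, az) sample to the index of the nearest codeword
    (Euclidean distance)."""
    return min(range(len(CODEBOOK)),
               key=lambda i: math.dist(sample, CODEBOOK[i]))

def to_symbols(samples):
    """Turn a raw 3-axis time series into a discrete observation sequence."""
    return [quantize(s) for s in samples]
```

A trained system would normally learn the codebook from data (e.g. with k-means) rather than fix it by hand; the fixed directions here are only for illustration.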
Human interface devices are a key area in the modern electronics era. Motion interfaces are rapidly becoming a key function in many consumer electronics devices, including smartphones, tablets, gaming consoles, and smart TVs, as they provide an intuitive way for consumers to interact with electronic devices by tracking motion in free space and delivering these motions as input commands. In the field of human-computer interaction (HCI), gesture recognition is becoming increasingly important as an interface method. In particular, interpretation of human hand and body gestures can help achieve the ease and comfort desired for HCI. In this project we propose an accelerometer-based wireless embedded system that can accurately detect predefined dynamic gestures. We treat a gesture as a stochastic process: the analysis is handled entirely by the system, and the final results are transmitted wirelessly to the computer, which identifies the correct gesture from the predefined set of gestures.
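Treating a gesture as a stochastic process leads naturally to the classification rule used with discrete hidden Markov models: score the observed symbol sequence against each trained gesture model with the forward algorithm and pick the most likely one. A minimal sketch (Python rather than the project's MATLAB, with toy single-state model parameters invented for illustration):

```python
def forward_likelihood(obs, pi, A, B):
    """P(obs | model) via the forward algorithm for a discrete HMM.
    pi[i]: initial state probability, A[i][j]: transition probability,
    B[i][k]: probability of emitting symbol k from state i."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

def classify(obs, models):
    """Return the name of the gesture model with the highest likelihood."""
    return max(models, key=lambda name: forward_likelihood(obs, *models[name]))

# Toy example: two hypothetical single-state gesture models over a
# two-symbol alphabet; "flick" favours symbol 0, "wave" favours symbol 1.
MODELS = {
    "flick": ([1.0], [[1.0]], [[0.9, 0.1]]),
    "wave":  ([1.0], [[1.0]], [[0.1, 0.9]]),
}
```

Real gesture models would have several states and be trained with Baum-Welch; for long sequences the likelihoods should be computed in log space to avoid underflow.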
The gesture-based wireless 3D accelerometer is a breakthrough technology: it takes gestures as input via an accelerometer and feeds them into the computer. Various sensors capable of detecting motion in free space have been commercially available for several decades and have been used in automobiles, aircraft, and ships. Their size, power consumption, and price, however, prevented their mass adoption in consumer electronics until the past few years. Now, a simple inertial sensor such as an accelerometer can be used to obtain the dynamic or static acceleration profile of a movement. The gesture-based 3D accelerometer can be treated as a new-age input device: it feels more natural, provides the user with better ease of use, and lets application user interfaces be redesigned to take advantage of the free hand movement the device makes possible.