Advancements in Gesture Recognition Technology

IOSR Journal of VLSI and Signal Processing (IOSR-JVSP)
Volume 4, Issue 4, Ver. I (Jul.-Aug. 2014), PP 01-07
e-ISSN: 2319-4200, p-ISSN: 2319-4197

Poluka Srilatha and Tiruveedhula Saranya
Department of Electronics and Communication Engineering, Sree Vidyanikethan Engineering College (Autonomous), affiliated to JNTU Anantapur, Sree Sainath Nagar, Tirupathi, India.

Abstract: Gesture recognition is an emerging technology that can change the way we work with machines in daily life. A gesture is a sign made by a human being, originating from the face, the hands or any other part of the body. Gestures can be captured by scanning or video methods and processed into signals a machine can interpret. Conventional GUI-based interfaces take their text input from the mouse and keyboard; in the new system, gestures are the input, so no mechanical element is needed for communication between man and machine. When we move a hand in front of the computer screen, the cursor moves accordingly, which makes work easier. With growing technological knowledge, the concept has gradually expanded into voice and speech recognition and position-recognition models, which have enriched the domain and introduced some very sophisticated means of human-computer interaction. Finger tracking is one such advanced innovation: the use of the hands and their various positions to launch a computer application, with the aim of minimising the use of the keyboard and mouse. Non-touch interaction, such as giving input to computers with the eyes, is another major breakthrough in the domain, and can certainly be regarded as a ray of hope for disabled people and for people busy with multitasking.
Keywords: Algorithm, AllSee, GUI, HMI, sign language.

I. Introduction
Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a richer bridge between machines and humans than primitive interfaces or even GUIs (graphical user interfaces), which still limit the majority of input to keyboard and mouse. Gesture recognition enables humans to communicate with the machine (HMI) and interact naturally without any mechanical devices. Using gesture recognition, it is possible to point a finger at the computer screen so that the cursor moves accordingly (a minimal sketch of this idea follows the outline below). This could potentially make conventional input devices such as the mouse, keyboard and even touch screens redundant. Gesture recognition can be conducted with techniques from computer vision and image processing, and the literature includes ongoing work in the computer vision field on capturing gestures, or more general human pose and movements, with cameras connected to a computer.

II. Headings
1 Types of gestures
1.1 Online gestures
1.2 Offline gestures
2 Various types of gestures recognised by computer
2.1 Sign language recognition
2.2 For socially assistive robotics
2.3 Control through facial gestures
2.4 Alternative computer interfaces
2.5 Immersive game technology
2.6 Remote control
3 Brief history of gesture recognition technology
4 Input devices
4.1 Wired gloves
4.2 Depth-aware camera
4.3 Stereo camera
4.4 Controller-based gestures
4.5 Single camera
5 A basic working of the gesture recognition system
6 Challenges
7 Conclusion
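To make the introduction's pointing idea concrete, here is a minimal sketch that moves the operating-system cursor with the index fingertip seen by a webcam. It is not the authors' implementation; it assumes the third-party MediaPipe Hands and PyAutoGUI libraries, and the mirroring and single-hand settings are choices made for this illustration only.

```python
import cv2
import mediapipe as mp
import pyautogui

# Sketch only: map the index fingertip in the webcam image to the cursor.
screen_w, screen_h = pyautogui.size()
hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)  # mirror so the motion feels natural
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # Landmark 8 is the index fingertip; coordinates are normalised to [0, 1].
        tip = results.multi_hand_landmarks[0].landmark[8]
        # Keep away from exact screen corners, which trigger PyAutoGUI's fail-safe.
        x = min(max(tip.x, 0.01), 0.99)
        y = min(max(tip.y, 0.01), 0.99)
        pyautogui.moveTo(int(x * screen_w), int(y * screen_h))
    cv2.imshow("gesture cursor", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break
cap.release()
```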

III. Figures
Figure 2.3: Controlling computer through facial gestures
Figure 2.4: Alternate computer interface
Figure 2.5: Gestures to control interactions within video games
Figure 2.6: Remote control with the wave of a hand
Figure 3.1: First touch screen device
Figure 3.2: First personal computer
Figure 3.3: The Essential Reality P5 Glove
Figure 3.4: Apple iPhone
Figure 3.5: Multi-touch product by Microsoft
Figure 3.6: Application of openFrameworks
Figure 4.1: Wired gloves
Figure 4.2: Depth-aware camera
Figure 4.4: Controller-based gestures
Figure 5.1: Working of the gesture recognition system
Figure 5.2: Types of algorithms

IV. Types of gestures
In computer interfaces, two types of gestures are distinguished. Online gestures can be regarded as direct manipulations, like scaling and rotating. In contrast, offline gestures are usually processed after the interaction is finished; e.g. a circle is drawn to activate a context menu.
1.1 Online gestures: Direct manipulation gestures, used for example to scale or rotate a tangible object.
1.2 Offline gestures: Gestures that are processed after the user's interaction with the object, such as the gesture that activates a menu.
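To make the online/offline distinction concrete, the sketch below classifies a finished stroke as a circle, the offline gesture mentioned above, by checking how tightly the sampled points hug their mean radius. This is an illustrative test of our own, not an algorithm from the paper; the sample-count and tolerance thresholds are arbitrary choices.

```python
import math

def is_circle(points, tolerance=0.25):
    """Offline-gesture check, run once the stroke is complete.

    points -- list of (x, y) samples from the finished stroke.
    Returns True when every sample lies close to the stroke's mean
    radius, a crude but serviceable circle test.
    """
    if len(points) < 8:
        return False
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    return all(abs(r - mean_r) / mean_r < tolerance for r in radii)

# An online gesture, by contrast, would be handled inside the sampling loop,
# e.g. rescaling an object on every new touch event rather than at the end.
stroke = [(math.cos(t / 10) * 50 + 100, math.sin(t / 10) * 50 + 100)
          for t in range(63)]
if is_circle(stroke):
    print("circle drawn: activate context menu")
```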

V. Various types of gestures recognised by computer
Gesture recognition is useful for processing information from humans that is not conveyed through speech or typing. Various types of gestures can be identified by computers.
2.1 Sign language recognition: Just as speech recognition can transcribe speech to text, certain types of gesture recognition software can transcribe the symbols represented through sign language into text.
2.2 For socially assistive robotics: By using appropriate sensors (accelerometers and gyroscopes) worn on the body of a patient and reading the values from those sensors, robots can assist in patient rehabilitation; stroke rehabilitation is the best example (see the sketch after this section).
2.3 Control through facial gestures: Controlling a computer through facial gestures is a useful application of gesture recognition for users who may not physically be able to use a mouse or keyboard. Eye tracking in particular may be of use for controlling cursor motion or focusing on elements of a display.
Fig 2.3: Controlling computer through facial gestures.
2.4 Alternative computer interfaces: Forgoing the traditional keyboard and mouse setup, strong gesture recognition could allow users to accomplish frequent or common tasks by making hand or face gestures to a camera.
Fig 2.4: Alternate computer interface.
2.5 Immersive game technology: Gestures can be used to control interactions within video games, making the player's experience more interactive and immersive.
Fig 2.5: Gestures to control interactions within video games.
2.6 Remote control: Through gesture recognition, "remote control with the wave of a hand" becomes possible for various devices. The signal must indicate not only the desired response but also which device is to be controlled.
Fig 2.6: Remote control with the wave of a hand.
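As a hedged sketch of the wearable-sensor idea in 2.2, the snippet below counts arm-raise repetitions from a stream of vertical-axis accelerometer readings. The sensor stream here is synthetic and the thresholds are illustrative placeholders, not clinical values or the sensor API of any particular robot.

```python
# Sketch for section 2.2: counting rehabilitation repetitions from a
# body-worn accelerometer. The data stream and thresholds are illustrative.

def count_repetitions(samples, high=8.0, low=2.0):
    """Count arm raises from vertical-axis acceleration (m/s^2).

    A repetition is one excursion above `high` followed by a return
    below `low` (simple hysteresis so noise is not double-counted).
    """
    reps = 0
    raised = False
    for a_z in samples:
        if not raised and a_z > high:
            raised = True
        elif raised and a_z < low:
            raised = False
            reps += 1
    return reps

# Synthetic readings standing in for a live sensor stream: two raises.
stream = [1.0, 3.5, 9.2, 9.8, 4.0, 1.2, 0.8, 8.7, 9.1, 1.5]
print(count_repetitions(stream))  # -> 2
```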

VI. Brief History Of Gesture Recognition Technology
1977: Accutouch. The first true touch screen device, in the form of a curved glass touch screen sensor.
Fig 3.1: First touch screen device.
1983: Hewlett-Packard 150. The first personal computer featuring a touch-sensitive screen, allowing users to position the cursor and select on-screen buttons.
Fig 3.2: First personal computer.
2001: The Essential Reality P5 Glove. Black & White, a game by Lionhead Studios, is controlled by a special glove that can translate physical gestures into movement on the screen. This is likely the first commercial controller for gestural interfaces.
Fig 3.3: The Essential Reality P5 Glove.
2007: Apple announces the iPhone. Apple receives a patent for a touch device using a gestural interface.
Fig 3.4: Apple iPhone.
Microsoft Surface. Microsoft announces a multi-touch product that combines software and hardware to offer image manipulation through hand gestures and physical objects.
Fig 3.5: Multi-touch product by Microsoft.
2008/9: openFrameworks. New programming platforms such as openFrameworks create simple tools for developing highly interactive interfaces that can be easily and intuitively triggered and manipulated. A fun example of an openFrameworks application is Sniff, an interactive storefront window display of an animated dog that follows passers-by, discerns their behaviour and engages them in play.
Fig 3.6: Application of openFrameworks.

VII. Input devices
The ability to track a person's movements and determine what gestures they may be performing can be achieved through various tools. Although a large amount of research has been done on image- and video-based gesture recognition, the tools and environments used vary between implementations.
4.1 Wired gloves: These can provide input to the computer about the position and rotation of the hands, using magnetic or inertial tracking devices. Some gloves can also detect finger bending with a high degree of accuracy, or even provide haptic feedback to the user (a simulation of the sense of touch). The first commercially available hand-tracking glove was the Data Glove, which could detect hand position, movement and finger bending.
Fig 4.1: Wired gloves.
4.2 Depth-aware camera: Using specialised cameras such as structured-light or time-of-flight cameras, one can generate a depth map of what is seen through the camera at short range, and use this data to approximate a 3D representation of the scene. Their short-range capabilities make these cameras effective for detecting hand gestures.
Fig 4.2: Depth-aware camera.
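As a minimal sketch of how such a depth map helps at short range, the snippet below keeps only pixels within a near band of distances and takes the largest remaining blob as the hand candidate. It assumes the depth frame is already available as a NumPy array of millimetre values (camera acquisition APIs vary and are not shown), and the band limits are illustrative.

```python
import numpy as np
import cv2

def segment_hand(depth_mm, near=200, far=600):
    """Keep pixels between `near` and `far` millimetres and return the
    largest connected blob, which at short range is usually the hand."""
    mask = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)  # hand outline candidate

# Synthetic frame standing in for a real depth camera: background at 2 m,
# one hand-sized patch at 40 cm.
frame = np.full((240, 320), 2000, dtype=np.uint16)
frame[80:160, 120:200] = 400
print("hand found" if segment_hand(frame) is not None else "no hand")
```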

4.3 Stereo camera: Using two cameras whose spatial relation to one another is known, a 3D representation can be approximated from the cameras' output. In combination with direct motion measurement (6D-Vision), gestures can be detected directly.
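A hedged sketch of that stereo pipeline: given a rectified left/right pair, OpenCV's block matcher produces a disparity map, and depth then follows from depth = f * B / disparity. The image file names and the calibration values f and B below are placeholders for illustration, not values from the paper.

```python
import numpy as np
import cv2

# left.png / right.png stand in for a rectified grayscale stereo pair.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
# compute() returns fixed-point disparities with 4 fractional bits.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

f, B = 700.0, 0.06  # assumed focal length (pixels) and baseline (metres)
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * B / disparity[valid]  # metres; nearer -> larger disparity

# A gesture segmenter could now threshold `depth` exactly as with a
# depth-aware camera, e.g. keeping only points closer than half a metre.
```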

4.4 Controller-based gestures: These controllers act as an extension of the body, so that when gestures are performed some of their motion can be conveniently captured by software. Mouse gestures are one such example, where the motion of the mouse is correlated to a symbol being drawn by a person's hand.
Fig 4.4: Controller-based gestures.
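To illustrate the mouse-gesture example in 4.4, the sketch below reduces a recorded pointer path to a string of compass directions and looks it up in a small gesture table. The gesture-to-action mapping and the jitter threshold are invented for this illustration.

```python
# Sketch for section 4.4: recognising a mouse gesture by quantising the
# pointer's motion into compass directions. The gesture table is illustrative.

DIRECTIONS = {(1, 0): "R", (-1, 0): "L", (0, 1): "D", (0, -1): "U"}
GESTURES = {"RL": "back", "LR": "forward", "DR": "close-tab"}  # assumed actions

def encode(path, min_step=20):
    """Turn a list of (x, y) pointer samples into a direction string,
    collapsing repeats so 'RRRR' becomes 'R'."""
    out = []
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) < min_step and abs(dy) < min_step:
            continue  # ignore jitter below the step threshold
        if abs(dx) >= abs(dy):
            d = DIRECTIONS[(1 if dx > 0 else -1, 0)]
        else:
            d = DIRECTIONS[(0, 1 if dy > 0 else -1)]
        if not out or out[-1] != d:
            out.append(d)
    return "".join(out)

# A down-then-right stroke maps to the assumed "close-tab" action.
stroke = [(100, 100), (100, 160), (100, 220), (160, 220), (220, 220)]
print(GESTURES.get(encode(stroke), "unrecognised"))  # -> close-tab
```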

4.5 Single camera: A standard 2D camera can be used for gesture recognition where the resources or environment would not be convenient for other forms of image-based recognition. It was earlier thought that a single camera might not be as effective as stereo or depth-aware cameras, but some companies are challenging this view.

VIII. A Basic Working Of The Gesture Recognition System
Fig 5.1: Working of the gesture recognition system.
Algorithm:
Fig 5.2: Types of algorithms.

IX. Challenges
There are many challenges associated with the accuracy and usefulness of gesture recognition software. Image-based gesture recognition is limited by the equipment used and by image noise: images or video may not be captured under consistent lighting or in the same location, and items in the background or distinct features of the users may make recognition more difficult. The variety of implementations for image-based gesture recognition may also raise issues for the viability of the technology in general usage.

X. Conclusion
Gesture recognition technology is relatively robust and accurate, and a trade-off can be maintained between speed and accuracy. Non-touch interaction can certainly be regarded as a ray of hope for disabled people and for people busy with multitasking.