International Journal of Research in Computer and Communication Technology, Vol 2, Issue 12, December- 2013


Design of Virtual Sense Technology for System Interface

Mr. Chetan Dhule, Prof. T. H. Nagrare
Computer Science & Engineering Department, G. H. Raisoni College of Engineering
Email Id: chetandhule123@gmail.com

ABSTRACT
A gesture-based human-computer interface can make computers and devices easier to use, for example by letting people control Windows applications by moving their hands through the air. Existing solutions have relied on gesture recognition algorithms that need exotic hardware, often involving elaborate setups limited to the research lab. The gesture recognition algorithms used so far are not practical or responsive enough for real-world use, partly because of the inadequate data on which the image processing is applied, and because they require artificial neural network (ANN) training, which slows the whole process down and reduces accuracy. The method we propose controls the motion of the mouse in Windows in real time, following the motion of the hand and fingers by calculating the change in RGB pixel values from a video stream, without any ANN training, to obtain the exact sequence of hand and finger motion.

Keywords: computer vision, gesture recognition, speech, human-computer interaction

1. INTRODUCTION
Existing solutions have relied on gesture recognition algorithms that need exotic hardware, often involving elaborate setups limited to the research lab, and they are neither practical nor responsive enough for real-world use, partly because of the inadequate data on which the image processing is applied. Because these methods depend on ANN training, the whole recognition process is slow and accuracy suffers.
The main objective of the proposed method is to control the motion of the mouse in Windows in real time, according to the motion of the hand and fingers, by calculating the change in RGB pixel values from a video stream, without any ANN training.

2. PROBLEM DEFINITION
Unfortunately, most existing solutions suffer from several shortcomings. Some of the hardware used for processing gestures has required users to wear obtrusive sensors and stand near multiple carefully calibrated cameras. Most cameras used so far rely on color data and are therefore sensitive to environmental factors such as dynamic backgrounds and lighting conditions. The algorithms that determine gestures from the data returned by the hardware have been unreliable when tested on a wide variety of users, and gestures have generally been limited to basic hand tracking. Since the time the computer needs to recognize a gesture is usually longer than the time needed to display its result, there is always a lag that limits the practical application of such interfaces. Finally, there have not been any collaborative workspaces or environments that allow users to freely use gestures for tasks such as controlling the motion and events of the mouse.

3. OBJECTIVES
The objective is a gesture interface that avoids exotic hardware and ANN training altogether: the mouse pointer in Windows is driven in real time by the motion of the hand and fingers, obtained by calculating the change in RGB pixel values from a video stream.

www.ijrcct.org Page 1454

4. LITERATURE REVIEW
The processing of hand gestures has been explored extensively in the existing literature. Some of the earlier work, by Freeman and Weissman [1], used a video camera and computer-vision template-matching algorithms to detect a user's hand from across a room and let the user control a television set. A user could show an open hand, and an on-screen hand icon would appear that could be used to adjust various graphical controls, such as a volume slider; a control was activated when the user covered it for a fixed amount of time. The authors discovered that users enjoyed this alternative to the physical remote control and that the feedback of the on-screen hand was effective in assisting the user. However, users found it tiring to hold their hand up for long periods to activate the different controls; this user fatigue, common to gesture-based interfaces, has been called "gorilla arm". Other approaches have relied on multiple cameras to produce a 3D image from which hand motion can be detected and tracked [2][4]. These systems required an elaborate installation process that had to be completed carefully, since calibration parameters such as the distance between the cameras were important in the triangulation algorithms used. The algorithms were also computationally expensive, since a large amount of video data had to be processed in real time, and stereo matching typically fails on scenes with little or no texture. Ultimately, such systems are not usable outside their special lab environments.
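The triangulation step that such multi-camera systems [2][4] depend on reduces, for a rectified stereo pair, to depth from disparity. A minimal sketch; the focal length and baseline values in the comment are illustrative assumptions, not parameters from the cited systems:

```python
# Depth from stereo disparity: the triangulation underlying the
# multi-camera hand trackers discussed above.
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Z = f * B / d for a rectified stereo camera pair.

    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two cameras in metres
    disparity_px -- horizontal shift of the same point between images
    """
    if disparity_px <= 0:
        # Zero disparity is what stereo matching produces on texture-less
        # scenes -- one reason these systems fail outside the lab.
        raise ValueError("point has no measurable disparity")
    return focal_px * baseline_m / disparity_px

# A point seen 40 px apart by cameras 0.1 m apart with f = 800 px lies
# at Z = 800 * 0.1 / 40 = 2.0 m from the rig.
```

Note how sensitive Z is to the baseline: this is why the inter-camera distance had to be calibrated so carefully in those installations.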
In [3], Mistry presented the SixthSense wearable gestural interface, which used a camera and projector worn on the user's chest to let the user zoom in on projected maps (among other activities) using two-handed gestures. For the camera to detect the user's hands, the user had to wear brightly colored markers on their index fingers and thumbs. The regular webcam worn by the user was also sensitive to environmental conditions such as bright sunlight or darkness, which made distinguishing the colored markers much more difficult, if not impossible. Wilson and Oliver [5] aimed to create a Minority Report-like environment that they called GWindows. The user could move an on-screen cursor on a Microsoft Windows desktop by pointing with their hand, and trigger actions with voice commands such as "close" and "scroll" to affect the underlying application windows. They concluded that users preferred interacting with hand gestures over voice commands and that desktop workspaces designed for gesture interaction were worth pursuing further. When considering collaborative online workspaces, several commercial and academic web-based collaboration solutions have existed for some time. However, interaction with other users in these environments is usually limited to basic sharing of media files, rather than full real-time collaboration on entire web-based applications and their data between users on distinctly deployed domains. Cristian Gadea and Bogdan Ionescu [6] built finger-based gesture control of a collaborative online workspace called UC-IC, in which the application runs inside a web browser that determines the latest hand gesture; but the system needs continuous high-speed internet connectivity, which cannot always be provided everywhere and at all times, particularly in India.
Besides this, it needs training to recognize gestures, which slows the system down. The methods in [7, 8, 9] are likewise based on gesture recognition algorithms that need ANN training, which makes the whole process slow and reduces accuracy: every time a gesture is to be recognized, ANN training is needed, and this takes so much time that the system cannot match its output speed to the actual motion of the mouse pointer.

5. SYSTEM ARCHITECTURE
In this system we use different preprocessing techniques and feature extraction as a tool for recognizing the pixel values and coordinates of the RGB colors, by tracking in real time the change in pixel position of the different color stickers attached to the user's fingers. The updated values are then sent to the PC to drive the motion of the mouse.
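The sticker-tracking idea above can be illustrated with a minimal sketch: scan a frame for pixels matching the marker color and take their centroid as the finger position. The nested-list frame representation and the threshold values are simplifying assumptions for illustration; a real implementation would read webcam frames and tune the thresholds to the chosen marker.

```python
# Sketch of color-sticker tracking: a frame is a list of rows, each row a
# list of (R, G, B) tuples. Pixels that look like a red marker are
# collected and their centroid taken as the finger coordinate.
def find_marker(frame, r_min=150, g_max=100, b_max=100):
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            # Illustrative threshold for a red sticker.
            if r >= r_min and g <= g_max and b <= b_max:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # marker not visible in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A 3x3 black frame with one red pixel at column 2, row 1:
frame = [[(0, 0, 0)] * 3 for _ in range(3)]
frame[1][2] = (200, 50, 50)
```

Tracking the centroid rather than a single matching pixel makes the position estimate less sensitive to sensor noise on individual pixels.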

Fig. 1: Block diagram of the different phases of the system: Video Capturing, Image Processing, Pixel Extraction, Color Detection, Controlling Position of Mouse Pointer.

Video Capturing: A continuous video stream is given as input to our system through the laptop camera.

Image Processing: Image segmentation is done in two phases:
1. Skin Detection Model: to detect the hand and fingers in the image. Capturing user input virtually is the main aim of this module. The user moves a finger in front of the camera's capture area; this motion is captured by the camera and processed by the system frame by frame. Once the finger coordinates are calculated, they are used to operate the cursor position.
2. Approximate Median Model: for subtraction of the background. It has been observed that using both methods together gives much better segmentation for further processing.

Pixel Extraction: In this phase we obtain the pixel sequence from the image, without any ANN training, to get the exact sequence of hand and finger motion.

Color Detection: In this phase we extract the positions of the RGB colors from the pixel sequence, detecting the motion of the hand and fingers by calculating the change in RGB pixel values.

Controlling Position of Mouse Pointer: Signals are sent to the system to control mouse pointer motion and mouse events, giving the PC an appropriate command to move the mouse pointer according to the motion of the user's fingers or hand.

6. METHODOLOGY
i. Hand position tracking and mouse control
ii. Laser pointer detection
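The approximate median background model named above is commonly implemented by nudging each background pixel one grey level toward the current frame, which converges on the running median; pixels far from the model are flagged as foreground (the moving hand). A minimal single-channel sketch; the update rule and threshold are standard choices, not values specified by the paper:

```python
# Approximate median background subtraction on single-channel frames
# represented as nested lists of grey levels.
def update_background(bg, frame):
    # Move each background estimate one level toward the current frame;
    # over many frames this approximates the per-pixel median.
    return [[b + 1 if f > b else b - 1 if f < b else b
             for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def foreground_mask(bg, frame, thresh=30):
    # A pixel far from the background model is foreground (moving hand).
    return [[abs(f - b) > thresh for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]
```

The one-level-per-frame update is what makes the model cheap enough for real-time use: no frame history has to be stored, unlike a true median filter.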

iii. Hand gesture based auto image grabbing (virtual zoom in/out)
iv. Camera processing and image capturing
v. Object-based bricks game
vi. Virtual playing of drums by holding drum sticks in hand
vii. Virtual Sense for file handling: the system uses the virtual sense technology to copy a file from one system to another within a local area network (LAN)/Wi-Fi. The user makes an action of picking up the file to be copied, moves it to the system where the file should be copied, and releases it over that system.

7. RESULTS AND DISCUSSION
The software has provision to control all mouse clicking events using a color marker. After several experiments, it was observed that a red color marker is more effective than other color markers.

Fig. 2: Graphical user interface of the application.
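The pointer-control step described above amounts to mapping the detected marker coordinates from camera space to screen space, usually with some smoothing to hide pixel jitter before the result is handed to an OS call that moves the pointer. A minimal sketch; the resolutions and smoothing factor are illustrative assumptions:

```python
# Map a marker centroid from camera coordinates to screen coordinates.
# Resolutions below are illustrative, not the paper's.
CAM_W, CAM_H = 640, 480
SCREEN_W, SCREEN_H = 1920, 1080

def to_screen(cam_x, cam_y):
    # Mirror x so on-screen motion matches the user's hand, as a webcam
    # image is horizontally flipped relative to the user.
    sx = (CAM_W - 1 - cam_x) * SCREEN_W / CAM_W
    sy = cam_y * SCREEN_H / CAM_H
    return sx, sy

def smooth(prev, new, alpha=0.5):
    # Exponential smoothing damps frame-to-frame jitter in the centroid.
    return prev + alpha * (new - prev)
```

In a full system, `to_screen` would be called once per frame on the centroid from the color-detection phase, and the smoothed result passed to the operating system's cursor-positioning API.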

Fig. 3: Start camera.
Fig. 4: Set the marker color.
Fig. 5: Control motion and clicking events of the mouse with the color marker set earlier.

8. CONCLUSION
Using a red color marker gives a significant increase in accuracy and user-friendliness: the accuracy of the red-marker system is higher than when other color markers are used individually. The problem of changing lighting conditions in color-based recognition has been addressed in this work by providing a button to set the marker color at the starting phase of the application. Recognition speed is still a problem: the speed of the controlled mouse motion is not yet 100% for some of the gestures and needs to be improved. All mouse movements and key actions have been mapped and are working well under the given circumstances.

This application can be very useful for people who want to control a computer without actually touching the system, or without a wireless mouse, which always needs a surface to operate on. As future scope, the application can be improved to work with mobile phones and play stations. Other modes of human-computer interaction, such as voice recognition, facial expression, and eye gaze, can also be combined to make the system more robust and flexible.

ACKNOWLEDGMENT
We thank the subjects participating in our experiments.

9. REFERENCES
[1] W. T. Freeman and C. D. Weissman, "Television Control by Hand Gestures", in Proc. of Int. Workshop on Automatic Face and Gesture Recognition. IEEE Computer Society, 1995, pp. 179-183.
[2] Z. Jun, Z. Fangwen, W. Jiaqi, Y. Zhengpeng, and C. Jinbo, "3D Hand Gesture Analysis Based on Multi-Criterion in Multi-Camera System", in ICAL 2008: IEEE Int. Conf. on Automation and Logistics. IEEE Computer Society, September 2008, pp. 2342-2346.
[3] P. Mistry and P. Maes, "SixthSense: A Wearable Gestural Interface", in ACM SIGGRAPH ASIA 2009 Sketches. New York, NY, USA: ACM, 2009.
[4] A. Utsumi, T. Miyasato, and F. Kishino, "Multi-Camera Hand Pose Recognition System Using Skeleton Image", in RO-MAN '95: Proc. of 4th IEEE Int. Workshop on Robot and Human Communication. IEEE Computer Society, July 1995, pp. 219-224.
[5] A. Wilson and N. Oliver, "GWindows: Robust Stereo Vision for Gesture-Based Control of Windows", in ICMI '03: Proc. of 5th Int. Conf. on Multimodal Interfaces. New York, NY, USA: ACM, 2003, pp. 211-218.
[6] C. Gadea, B. Ionescu, D. Ionescu, S. Islam, and B. Solomon (University of Ottawa, Mgestyk Technologies), "Finger-Based Gesture Control of a Collaborative Online Workspace", in 7th IEEE International Symposium on Applied Computational Intelligence and Informatics, May 24-26, 2012, Timisoara, Romania.
[7] M. Ganasekera, "Computer Vision Based Hand Movement Capturing System", in The 8th International Conference on Computer Science & Education (ICCSE 2013), April 26-28, 2013, Colombo, Sri Lanka.
[8] F. Lamberti, "Endowing Existing Desktop Applications with Customizable Body Gesture-Based Interfaces", in IEEE Int'l Conference on Consumer Electronics (ICCE), 978-1-4673-1363-6, 2013.
[9] A. Agrawal, R. Raj, and S. Porwal, "Vision-based Multimodal Human-Computer Interaction Using Hand and Head Gestures", in Proc. of 2013 IEEE Conference on Information and Communication Technologies (ICT 2013).