Real Time Hand Gesture Tracking for Network Centric Application

Chukwuemeka Chijioke Obasi 1*, Christiana Chikodi Okezie 2, Ken Akpado 2, Chukwu Nnaemeka Paul 3, Asogwa Chukwudi Samuel 1, Akuma Chukwuemeka Kingsley 4, Oyeleke D. Oluseun 4

1. Electrical Electronics Engineering Department, Air Force Institute of Technology, Nigeria
2. Electronics and Computer Engineering Department, Nnamdi Azikiwe University, Nigeria
3. Federal Polytechnic Ekowe, Bayelsa State, Nigeria
4. Aircraft Engineering Department, Air Force Institute of Technology, Nigeria

Abstract

This paper focuses on real-time hand gesture tracking for network-centric applications. In it, human hand gestures were acquired using a Kinect depth camera. The user posed in front of the camera, about 2 meters away, with the camera mounted about 80 cm above ground level. The acquired image is processed to extract the vertical and horizontal coordinates of the right-hand joints, which are transmitted over a network medium. The received information is then classified and assigned to a subroutine that performs a defined task.

Keywords: Real Time; Gesture; Network; Tracking; Human Machine Interaction (HMI)

1. Introduction

Human Machine Interface (HMI) is a growing technology. The technology supports the interaction of humans and machines for effective communication, operation and control. According to Sanjay (2011), some achievements of this technology are in the areas of voice synthesis, face recognition, computer vision and gesture recognition. Sanjay stated that gestures are a non-verbal form of communication, that a person can make multiple gestures at a time, and that since human gestures are perceived through vision, they are a subject of great interest for computer vision researchers. Dong-Ik et al. (2012) noted that over the past few years, gesture recognition has been largely used in the gaming and entertainment market; now it is becoming a commonplace technology, enabling humans and machines to interface more easily in the home, the automobile and at work. The authors added that gestures are already applied in some TV and lighting controls. Before the advent of intensive research in non-verbal human-machine interaction and computer vision (Sanjay, 2011), network applications relied on some form of hardwired or wireless technology. Today's network and internet technologies can be exploited in certain human-machine interaction applications.

2. Related Works

The related works include, but are not limited to, the following:

a. A Gesture Based Interface for Human Robot Interaction (Waldherr, Romero & Thrun, 2000). This paper aimed at developing and evaluating a vision-based interface capable of recognizing both pose and motion gestures. The robot could identify a person and follow his hand gestures. It could recognize and interpret the following gesture commands made by hand: Follow, Stop, Pick and Drop. The robot could follow a human guide, stop, pick up rubbish and drop it in a refuse bin at a known spot in a map already designed for the robot to follow. The robot could distinguish between static and motion gestures.

b. Real Time Robotic Hand Control Using Hand Gestures (Raheja, Shyam, Rajsekhar & Prasad, 2012). This research paper aimed at detecting hand gestures for real-time control of a robot hand; the authors were able to detect hand gestures with an accuracy of 95% when the hand was kept in front of the camera for one second.
c. Application for Gesture Based Control of the Pioneer Robot with Manipulator (Lapuskin, 2012). With the aim of creating a system that would allow a human to use gestures to control a robot, the author, Lapuskin, in his

master's degree thesis, presented a work that used Java to develop a GUI that achieved gesture recognition and encoding of the gesture information into a Wi-Fi signal for transmission. The system could detect and interpret an "Ok" signal made by hand gesture, a "Go there" signal made by pointing the index finger, a "Come here" signal made by hand gesture and a "Turn around" signal made by drawing a circle in the air. The robot could respond to all these gestures by driving its manipulator.

d. Gesture Recognition and Mimicking in a Humanoid Robot (Begley, 2008). Here, Begley was able to reprogram ISAC (Intelligent Soft Arm Control), resident in Vanderbilt University's Featheringill Hall, to repeat any hand gesture made in front of it. The robot is fitted with two Sony XC-999 color CCD (Charge-Coupled Device) cameras as its eyes.

e. Gesture Based PC Interface with Kinect Sensor (Samet, 2012). This is a master's thesis by Samet Erap with the aim of using the abilities of the Kinect Xbox 360 sensor to develop an application that allows operating the Windows 7 operating system in a touch-less, click-less manner. The thesis achieved a software system that could switch from one slide to another during a presentation, control a Pioneer robot and reliably collect information about the position of a human body in real time.

f. Hand Gesture Recognition Using Computer Vision (Lockton). Lockton, in his thesis, worked towards interpreting the American one-handed sign language alphabet and numbers using hand gestures captured by camera. He was able to make the computer display the alphabet and numbers as he made the gestures in front of the camera. A stereographic system with a 3D depth camera and a lighting setup suited to different skin colors and background colors was used. The human hand is identified by a color marker worn on the hand, and the RGB color of the captured hand was calibrated using a color model.

g. Gesture Interface for Robot Control (Florian, Ken & Sumudu, 2010). Florian et al. presented an innovative way to control a robot arm by using a gesture interface. Their aim was to offer a control method that needs only one hand and is a simple alternative to the joystick for direct control of an arm that can move in 3D. The result of a comparative study of hand gesture, haptic and gamepad control of the robot showed that the gesture interface performed better than the gamepad (joystick) because it frees up the hand for other tasks and costs less.

In the field of human-machine interfaces, computer vision, and in particular hand gesture recognition, has been greatly exploited. Although this area of research is still very new, numerous efforts have been made by many researchers and a lot has been achieved so far, though the possibilities remain inexhaustible. The review presented here shows a huge application area for hand gesture recognition in the control of robots. Raheja et al. presented a work bordering on the real-time tracking of gesture information for robotic control. This work and the others dwell on standalone applications of hand gesture tracking; the network application of the tracked information seems not to have been considered by any author. This paper therefore focuses on that aspect: tracking human hand gesture information in real time and deploying the tracked data over a network for a given application.

3. Methodology

In the development of this system, a modular methodology was adopted. Figure 1 shows the block diagram of the system being implemented.
The figure shows the point where the hand gesture is captured using the depth camera (Kinect), connected to a personal computer (PC) through an interface. Gesture processing software running on the PC detects and processes the gesture information and encrypts it. The encrypted information is transmitted over a network medium to the receiving PC, where it is decrypted and classified for a corresponding action or control. This block diagram typifies the real-time hand gesture tracking for network-centric application. From the block diagram, the following major modules can be identified:

a. Gesture Recognition module
b. Network module
c. Application module

[Figure 1: System Block Diagram - human gesture -> image capture/filter -> PC interface -> sender PC software (a. gesture processing, b. data encryption) -> network medium -> data decryption -> receiving PC -> destination application]
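The "Data Encryption" and "Data Decryption" blocks in Figure 1 are not detailed in the paper. The following is a minimal C# sketch of that step, assuming AES with a pre-shared key and IV and a two-float (x, y) payload; both the cipher and the payload format are illustrative assumptions, not the authors' implementation.

using System;
using System.IO;
using System.Security.Cryptography;

static class GestureCrypto
{
    // Serializes one joint's (x, y) pixel coordinates and encrypts them.
    // key must be 16, 24 or 32 bytes and iv 16 bytes (the AES block size).
    public static byte[] Encrypt(float x, float y, byte[] key, byte[] iv)
    {
        using (Aes aes = Aes.Create())
        using (ICryptoTransform enc = aes.CreateEncryptor(key, iv))
        using (MemoryStream ms = new MemoryStream())
        {
            using (CryptoStream cs = new CryptoStream(ms, enc, CryptoStreamMode.Write))
            using (BinaryWriter bw = new BinaryWriter(cs))
            {
                bw.Write(x);
                bw.Write(y);
            }
            return ms.ToArray(); // ciphertext ready for the network module
        }
    }

    // Reverses the steps on the receiving PC: decrypt, then read x and y back.
    public static float[] Decrypt(byte[] cipher, byte[] key, byte[] iv)
    {
        using (Aes aes = Aes.Create())
        using (ICryptoTransform dec = aes.CreateDecryptor(key, iv))
        using (MemoryStream ms = new MemoryStream(cipher))
        using (CryptoStream cs = new CryptoStream(ms, dec, CryptoStreamMode.Read))
        using (BinaryReader br = new BinaryReader(cs))
        {
            return new float[] { br.ReadSingle(), br.ReadSingle() };
        }
    }
}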

3.1 Development of the Gesture Recognition Module

To design the gesture detection software, the basic gesture algorithm was adopted. The following steps are involved in the gesture detection application:

a. The image of the user in front of the camera was detected;
b. The x, y coordinates of each of the user's twenty (20) skeletal joints were identified;
c. If any change in joint location was noticed, the new location of the joint was tracked and the joint distance was computed using Pythagoras' theorem:

d = sqrt((x2 - x1)^2 + (y2 - y1)^2)    (1)

where d is the distance between two joints;

d. The computed value was assigned a predefined task, according to the required angular movement of the robotic arm.

The flow chart in Figure 2 implements this algorithm.

[Figure 2: Flow chart of the gesture recognition subsystem - start -> user action -> Kinect sensor -> identify joint -> calculate joint distance and depth -> joint match? -> score gesture]
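To make equation (1) concrete, the sketch below computes the distance between two tracked joints in C#, the language the paper reports using. The Joint2D type is a stand-in introduced here for illustration; the actual Kinect SDK exposes joint positions through its own skeleton types.

using System;

struct Joint2D
{
    public double X, Y; // pixel coordinates in the x-y plane
    public Joint2D(double x, double y) { X = x; Y = y; }
}

static class GestureMath
{
    // Equation (1): d = sqrt((x2 - x1)^2 + (y2 - y1)^2)
    public static double JointDistance(Joint2D a, Joint2D b)
    {
        double dx = b.X - a.X;
        double dy = b.Y - a.Y;
        return Math.Sqrt(dx * dx + dy * dy);
    }
}

For example, applying equation (1) to the rest-position palm (420, 400) and wrist (420, 360) reported later in Table 1 gives d = sqrt(0^2 + 40^2) = 40 pixels.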

3.2 Development of the Network Module

In developing this module, socket programming using TCP/IP was implemented in C#. The implementation has two parts: the server application and the client application. Figure 3 shows the flow sequence of the implementation.

[Figure 3: Flow sequence of client-server interaction - server: create socket, bind, listen, accept connection, read/write data, close socket, accept another connection or close; client: create socket, connect, read/write data, close socket]

The flow chart describes the following algorithm:

Step 1: The server and client each create a stream socket s with the socket() call.
Step 2: The server (optionally the client) binds socket s to a local address with the bind() call.
Step 3: The server uses the listen() call to alert the TCP/IP machine of its willingness to accept connections.
Step 4: The client connects socket s to a foreign host with the connect() call.
Step 5: The server accepts the connection and receives a second socket, ns.
Steps 6 and 7: The server reads and writes data on socket ns, and the client reads and writes data on socket s, using the send() and recv() calls, until all data has been exchanged.
Step 8: The server closes socket ns with the close() call and returns to Step 5 to accept another connection. The client closes socket s with the close() call, ending the TCP/IP session.
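A minimal C# sketch of this eight-step flow, using .NET's TcpListener and TcpClient wrappers over the socket calls named above, is given below. The port number (9050) and the plain-text payload are assumptions made for illustration; the paper states only that C# socket programming over TCP/IP was used.

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class GestureServer
{
    public static void Run()
    {
        TcpListener listener = new TcpListener(IPAddress.Any, 9050); // socket() + bind()
        listener.Start();                                            // listen()
        while (true)                                                 // back to Step 5
        {
            using (TcpClient client = listener.AcceptTcpClient())    // accept(): socket ns
            using (NetworkStream ns = client.GetStream())
            {
                byte[] buffer = new byte[256];
                int n = ns.Read(buffer, 0, buffer.Length);           // recv()
                Console.WriteLine("Received: " + Encoding.ASCII.GetString(buffer, 0, n));
            }                                                        // close() on ns
        }
    }
}

class GestureClient
{
    public static void Send(string host, string jointData)
    {
        using (TcpClient client = new TcpClient(host, 9050))         // socket() + connect()
        using (NetworkStream s = client.GetStream())
        {
            byte[] payload = Encoding.ASCII.GetBytes(jointData);
            s.Write(payload, 0, payload.Length);                     // send()
        }                                                            // close(): session ends
    }
}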

3.3 Development of the Application Module

The application module was used for the scoring and classification of the actions to be controlled by the gesture. This was achieved by converting all the joint coordinate locations into values. The controlled action is such that when a particular joint value is received, a subroutine defining that action is triggered.
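The dispatch step can be sketched as follows. The gesture codes, the classification rule and the example actions are illustrative assumptions; the paper states only that each received joint value triggers a subroutine defining an action.

using System;
using System.Collections.Generic;

static class ActionDispatcher
{
    // Each classified gesture code maps to the subroutine for a defined task.
    static readonly Dictionary<string, Action> Tasks = new Dictionary<string, Action>
    {
        { "HAND_RAISED",  () => Console.WriteLine("Running task A") },
        { "HAND_AT_REST", () => Console.WriteLine("Running task B") }
    };

    // Example classification rule (an assumption): image y grows downward,
    // so a palm y value smaller than the shoulder's means the hand is raised.
    public static string Classify(double palmY, double shoulderY)
    {
        return palmY < shoulderY ? "HAND_RAISED" : "HAND_AT_REST";
    }

    public static void Dispatch(string gestureCode)
    {
        Action task;
        if (Tasks.TryGetValue(gestureCode, out task))
            task(); // trigger the subroutine assigned to this joint value
        else
            Console.WriteLine("Unclassified gesture: " + gestureCode);
    }
}

With the rest-position values in Table 1 (palm y = 400, shoulder y = 217), Classify returns HAND_AT_REST, since the palm sits below the shoulder in image coordinates.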

4. Results and Discussion

Each module defined above was implemented, tested and integrated. The information received over the network showed the following results. The gesture subsystem was implemented in C#, using Visual Studio 2010. During this test, the Kinect sensor was connected to a PC running Visual Studio 2010 and the Kinect SDK. The code was able to track the dynamic positions of the following joints in the x-y plane: shoulder, elbow, wrist and palm. Figure 4 shows the software interface.

[Figure 4: Joint Tracking Results - the software interface showing the tracked right-hand joints]

These results were only for the right hand; other joints were not tracked. Different positions of these joints produced different results. At the rest position of the right hand, the joint values were as follows:

Table 1: Pixel values of joints at rest position

Joint      X (pixels)   Y (pixels)
Palm       420          400
Wrist      420          360
Elbow      420          300
Shoulder   420          217

All x values are the same, while the y values change from joint to joint.

5. Conclusion

This paper looked at how hand gesture information can be tracked in real time and transmitted over a network medium for the remote control of a task. The system used a Kinect camera to extract the hand gesture of a person posing about 2 meters away from the camera. The results show that gesture information was extracted as the x, y coordinates of the joints of the hand. These values are stored and assigned a desired task when needed. Hand gesture recognition can therefore be used effectively for the remote control of certain operations as the need arises.

References

Begley, S. (2008). Gesture Recognition and Mimicking in a Humanoid Robot. M.Sc. Thesis, Graduate School of Vanderbilt University. Retrieved from: http://etd.library.vanderbilt.edu/available/etd-03272008-144850/unrestricted/

Dong-Ik, K. et al. (2012). Gesture Recognition: Enabling Natural Interactions with Electronics. Texas Instruments. Retrieved from: http://www.ti.com/lit/wp/spry199/

Florian, N., Ken, T. and Sumudu, M. (2010). Gesture Based Interface for Robot Control. Retrieved from: http://www.academia.edu/772641

Lapuskin, M. (2012). Application for Gesture Based Control of the Pioneer Robot with Manipulator. Master's Thesis, Tallinn University of Technology.

Lockton, R. Hand Gesture Recognition Using Computer Vision. Project Report, Balliol College, Oxford University. Retrieved from: http://www.microsoft.com/en-us/um/people/awf/bmvc02/

Raheja, J., Shyam, R., Rajsekhar, G. and Prasad, P. (2012). Real-Time Robotic Hand Control Using Hand Gestures. In: Robotic Systems - Applications, Control and Programming. InTech. Retrieved from: http://www.intechopen.com/books/robotic-systems-applications-control-and-programming/real-time-robotic-hand-control-using-hand-gestures

Samet, E. (2012). Gesture Based PC Interface with Kinect Sensor. Master's Thesis, Tallinn University of Technology.

Sanjay, M. (2011). A Study on Hand Gesture Recognition Technique. Master's Thesis, National Institute of Technology, Rourkela, Orissa 769008, India.

Waldherr, S., Romero, R. and Thrun, S. (2000). A Gesture Based Interface for Human-Robot Interaction. Kluwer Academic Publishers, Netherlands.