Integration of Hand Gesture and Multi Touch Gesture with Glove Type Device

Hayato Takahashi, Yuhki Kitazono
Advanced School of Creative Engineering, National Institute of Technology, Kitakyushu College, Kitakyushu, Japan
kitazono@kct.ac.jp

2016 4th Intl Conf on Applied Computing and Information Technology / 3rd Intl Conf on Computational Science/Intelligence and Applied Informatics / 1st Intl Conf on Big Data, Cloud Computing, Data Science & Engineering

Abstract: In this study, we aimed to integrate multi-touch gestures with hand gestures to create a new input method. Multi-touch gestures and hand gestures have been studied separately; integrating them yields a new way of interacting with a computer. We built a large-scale multi-touch screen based on the FTIR method, which detects touched points with infrared light, and a hand gesture recognition device equipped with two sensors, a pressure sensor and a bend sensor, and we integrated the two input streams. The user inputs coordinates by touching and simultaneously inputs actions by performing hand gestures: for instance, the user can select a target by touching it and then act on it by clicking the pressure sensor or bending the bend sensor. We demonstrated the input method with a GUI application.

Keywords: multi-touch; FTIR display; sensor; hand gesture

I. INTRODUCTION

In recent years, the multi-touch display has become a major way of interacting with computers. Multi-touch interaction lets the user interact more intuitively than a mouse or keyboard does. Intuitive as it is, however, multi-touch interaction has difficulty providing a wide range of operating commands. A touch gesture is identified only by the number of fingers and the tracks they trace, so only a few gestures achieve practical performance while remaining intuitive. Representative touch gestures are swipe, double tap, rotate, pinch, and hold, all supported natively by general multi-touch devices such as smartphones; Fig. 1 illustrates them. As Shima et al. proposed [1], a gesture can also be used as a trigger that starts a specific input mode, not only to perform an action.

In general, gesture patterns are designed to be performed with one hand, because an input device such as a smartphone is too small to operate with two hands. A large device such as a tablet or an interactive tabletop, however, has enough display area for both hands, so interacting with a combination of both hands is effective. As an effective two-handed gesture pattern for a large multi-touch display, Kuribara et al. proposed "HandyScope" [2]: the user first taps with two fingers of one hand, then passes one finger of the other hand between those two fingers, and a menu bar is displayed between the fingers.

Fig. 1. Hand Gesture.

HandyScope is a study focused on how to display a menu; along the same lines, Lepinski et al. designed multi-touch marking menus [3]. General gesture patterns such as swipe or pinch are identified by the track and number of touching fingers. As the gesture set grows, however, the patterns become complicated, because each new gesture must avoid conflicts with the existing ones even though gestures are distinguished only by the number and track of the touch points.
To expand the gesture vocabulary, several studies have identified touch interaction by other factors, as in TapSense [4], which distinguishes the part of the finger that touches the surface, such as the nail or the knuckle. As another approach, Boring et al. proposed Fat Thumb [5], which distinguishes touches by the contact size of the finger. These studies expanded the gesture vocabulary; even so, commonly used operations such as copy and paste still have to be selected from a displayed menu. In other words, multi-touch interaction has no shortcut commands of the kind keyboard interaction provides: it is a very intuitive way to operate, but not as efficient as a keyboard.

In addition to multi-touch interaction, contactless interaction, the Natural User Interface (NUI), has been studied in recent years [6]. Typical NUI input methods are hand gestures and body language. For instance, Sato et al. propose recognizing a specific hand gesture with a high-speed camera [7]: the system obtains the input angle of the hand by recognizing the virtual straight line between the fingertips of the thumb and index finger and the center of the circle those fingers form. As another instance, Lee proposes a method that combines hand gestures with augmented reality [8]: the user's fingertips are tracked with a camera, the user can draw lines in the air, and the track drawn by the finger is mapped onto the captured image. Even though hand gestures are useful for certain interactions, they are difficult to use as a general input method like a mouse or keyboard: the user has to keep the hands raised while operating, and pointing accuracy is lower than with existing input methods.

Given this situation, we propose an interface that integrates a touch display with hand gestures. On this interface, the user interacts by touch and hand gesture simultaneously: operational information from the hand gesture is integrated with information about the touch points. With this additional information, the touch operation vocabulary can be expanded without becoming complicated. In this study, we built a large-screen multi-touch display based on the FTIR method [9] and a glove-type device for hand gesture recognition, and we developed a GUI application that demonstrates the advantages of the input method.

II. SYSTEM

The system, whose aim is to integrate touch information with hand gesture input, consists of a multi-touch display and a hand gesture recognition device. Touch input and hand gestures are performed simultaneously, so the system suits a large screen used as a tabletop device; we therefore built a 400 × 400 mm multi-touch display based on the FTIR method. Hand gesture information is integrated with the touch information, so gestures must be performable while touching. For this reason, hand gestures are recognized with a sensor device attached to the hand: in this study, a pressure sensor and a bend sensor, with which the system detects a finger's bending state and click input.

A. FTIR Multi-touch Display

We constructed a multi-touch display using the frustrated total internal reflection (FTIR) technique, which is commonly used for tabletop systems [10]. An FTIR multi-touch display is basically composed of an acrylic plate, infrared LEDs, and an infrared camera. When light traveling through a medium meets an interface to a medium with a lower refractive index (here, acrylic to air) at an angle of incidence beyond the critical angle, it is reflected back completely; this is total internal reflection (TIR). When infrared rays are injected from the side of the acrylic plate, the difference between the refractive indices of acrylic and air causes total reflection at the surfaces, so the rays stay trapped inside the acrylic. Where something such as a finger touches the plate, however, total internal reflection no longer occurs, and the infrared rays diffuse out into the air. By capturing these diffused rays with an infrared camera on the opposite side, the coordinates of the touch points can be detected. That is the principle of the FTIR multi-touch display: Fig. 2 shows it graphically, and Fig. 3 is an actual image of how the camera captures the diffused infrared light.

Fig. 2. Principle of FTIR Display.

Fig. 3. Infrared Ray Diffused by the Finger.
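
As a worked example of the TIR condition (the paper gives no numbers; the refractive index of about 1.49 for acrylic is our assumption), the critical angle at the acrylic-air interface is

    θ_c = arcsin(n_air / n_acrylic) = arcsin(1.00 / 1.49) ≈ 42°,

so infrared light injected from the plate's edge travels nearly parallel to the surfaces, strikes them far beyond the critical angle, and stays trapped until a touching fingertip frustrates the reflection locally.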

In this study, we use the FTIR display as a multi-touch input interface. To serve as a touch display rather than just a touch plate, it needs the operating screen projected onto it, but the acrylic is highly transparent, so the projector's light would pass straight through the plate. We therefore put tracing paper on the acrylic plate as a projection surface. The screen image projects well onto tracing paper; however, if the paper is placed directly on the FTIR display, it prevents total internal reflection. One solution is a 1 mm sheet of silicone rubber placed between the acrylic plate and the tracing paper. Its rough texture and the very minute bubbles it contains mean that, with the rubber in contact with the acrylic, infrared light is diffused only where a finger presses, so the touching positions can still be determined accurately. Furthermore, the surface of the tracing paper is smooth, which makes it suitable to swipe on. Fig. 4 illustrates the configuration of the projection screen: to adjust the size of the projected image, the optical path is lengthened by reflecting the projector's light off a mirror.

Fig. 4. Configuration of Display and Projector.

The touch points' coordinates are obtained by taking an infrared image with a camera on the side opposite the touch surface. Fig. 5 shows the image captured from behind the screen while it is touched by five fingers. The infrared image indicates only the shape and position of the diffused light, so it must be processed into coordinate data on the screen: the system has to correlate positions in the image with coordinates on the touch screen, and remove background objects and shadows from the captured image. In this study we used the open source software TBeta [11], which provides all the functions this system needs. It converts touch positions into screen coordinates, removes shadows and background by computing the difference between a stored background image and the captured image, and sends the resulting touch coordinates over TCP communication. The application window of TBeta is shown in Fig. 6: the upper left image is the captured infrared image, and the upper right is the processed image. The touching position of each finger is detected by calculating the centroid of its white blob; Fig. 7 indicates the centroids of the fingers.

Fig. 5. Touched Screen.

Fig. 6. Window of TBeta.

Fig. 7. Indicated Centroid of the Finger.

Depending on the surroundings, ambient light may be captured by the infrared camera. Even in such a situation, TBeta can eliminate the ambient light with its Subtract Background function, which removes objects that do not move, i.e. the background, from the capture targets. Fig. 8 shows ambient light being erroneously detected, and Fig. 9 shows the state after the Subtract Background function has run. Using TBeta, we obtained the positions of the fingers without influence from ambient light or from the positional relation between the camera and the touch display.

Fig. 8. Erroneous Detection of Ambient Light.

Fig. 9. After Subtract Background Function Worked.
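
For illustration only (this is not TBeta's code, and the threshold and blob-size values are arbitrary assumptions), a minimal Python/OpenCV sketch of the same pipeline, background subtraction followed by blob-centroid extraction, could look like this:

    import cv2

    def touch_points(frame_gray, background_gray, thresh=40, min_area=30):
        """Return the image-space centroids of diffused-IR blobs."""
        # Remove static content (ambient light, fixed shadows) by differencing
        # against a stored background frame, like TBeta's Subtract Background.
        diff = cv2.subtract(frame_gray, background_gray)
        _, binary = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        points = []
        for c in contours:
            m = cv2.moments(c)
            if m["m00"] >= min_area:  # skip specks of sensor noise
                points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        return points

Mapping these image-space centroids to screen coordinates additionally requires the camera-to-screen calibration that TBeta performs during setup.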

B. Hand Gesture Recognition Device

Hand gesture recognition devices have been studied to date [12][13][14]. In this study, because we aim to integrate hand gestures with touch information, the hand gestures must be performable while the user is touching the screen. During touch interaction the shape of the user's hand is unsettled, so common hand gestures that rely on the relative positions of the fingers are unsuitable for this purpose. For this reason, we recognize hand gestures from each finger's state: bent or extended, and clicking. The user's hand gestures are recognized with sensors mounted on the hand; Fig. 10 shows a schematic of the sensor mounting on the hand gesture recognition device.

Fig. 10. Hand Gesture Recognition Device.

The bend sensor's resistance changes depending on its curving condition: about 32 kΩ when straight, about 53 kΩ when bent to 45 degrees, and about 74 kΩ at 90 degrees. Since the bend sensor reports its curving condition, mounting it along the ring finger makes it possible to detect the bending and stretching state of that finger. The pressure sensor is a thin, sheet-like sensor with a rounded pressure-sensing section, fitted to the side of the index finger in Fig. 10; its resistance changes from 10 MΩ down to around 5 kΩ depending on the pressure exerted on the sensing section. Mounted on the side of the index finger, it enables a click-like operation to be performed while touching the display. Fig. 11 shows how the bend sensor attached to the ring finger bends along it, and Fig. 12 shows the pressure-sensing section being clicked.

Fig. 11. Bending the Ring Finger.

Fig. 12. Clicking the Pressure Sensor.

By measuring and processing the resistance values of these sensors with an Arduino, the click gesture and the finger-bending gesture are detected. The detected clicking and bending status is sent to the application through serial communication, where it is integrated with the touch information.
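
The paper does not describe the wiring or the serial message format, so the following host-side Python sketch is only an illustration: it assumes each sensor forms a voltage divider with a hypothetical 47 kΩ fixed resistor, and that the Arduino prints one "bend_adc,pressure_adc" pair of 10-bit readings per line.

    import serial  # pyserial

    R_FIXED = 47_000.0  # assumed fixed divider resistor (not from the paper)

    def adc_to_resistance(adc):
        """Invert the divider: adc = 1023 * R_FIXED / (R_sensor + R_FIXED)."""
        adc = max(int(adc), 1)  # guard against division by zero
        return R_FIXED * (1023.0 / adc - 1.0)

    def classify(bend_adc, pressure_adc):
        bent = adc_to_resistance(bend_adc) > 45_000  # between ~32 kΩ straight and ~53 kΩ at 45 degrees
        clicked = adc_to_resistance(pressure_adc) < 100_000  # far below the ~10 MΩ unpressed value
        return bent, clicked

    with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port:
        while True:
            line = port.readline().decode(errors="ignore").strip()
            if line:
                bend_adc, pressure_adc = line.split(",")
                print(classify(bend_adc, pressure_adc))

The thresholds sit between the resistance values quoted above, so ordinary sensor noise should not flip the detected state.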

III. GUI APPLICATION

We developed an application that uses the touch coordinates together with the click and bending information. Its graphical user interface is shown in Fig. 13. Circles floating around the screen can be selected by touching, and a touched circle follows the finger. The selected circle's size becomes larger when the ring finger, the one wearing the bend sensor, is extended, and smaller when that finger is bent. When the pressure sensor is clicked with the thumb, the selected circle is eliminated. On a normal touch display these operations generally require complicated interaction: changing the size of a circle would need a slider widget displayed on screen, and eliminating a circle would need a menu from which to select a delete operation. On this interface, they are performed by hand gesture.

Fig. 13. Interaction on the Application: (a) Touching the Blue Circle; (b) Blue Circle Follows the Finger; (c) The Circle Becomes Smaller; (d) The Circle Becomes Bigger; (e) Clicking the Pressure Sensor; (f) The Circle Was Eliminated.

Fig. 13 shows the interactions on the application, with the application screen and the recognition device mounted on the user's hand both in view. In Fig. 13 (a) the user touches the blue circle, which becomes the operating target while the finger stays on it; when the user slides the finger, the circle follows it, as shown in Fig. 13 (b). The user changes the size of the touched circle by bending or extending the ring finger on which the bend sensor is mounted: bending the finger shrinks the circle (Fig. 13 (c)), and extending it enlarges the circle (Fig. 13 (d)). The user eliminates the selected circle by pressing the pressure sensor with the thumb while touching the circle: in Fig. 13 (e) the user presses the sensor, and a moment later the touched blue circle is eliminated, as shown in Fig. 13 (f). This application demonstrates the features of the interface.
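
The paper does not publish the application's code, so the following is only a minimal sketch of the integration logic, with handler names of our own choosing: touch events would come from TBeta's TCP stream, and gesture states from the serial reader sketched above.

    class Circle:
        def __init__(self, x, y, r=40.0):
            self.x, self.y, self.r = x, y, r

    class CircleApp:
        """Merges touch coordinates with glove gesture states, as in the demo."""
        def __init__(self, circles):
            self.circles = circles
            self.selected = None

        def on_touch_down(self, x, y):
            # Select the circle under the finger, if any.
            self.selected = next(
                (c for c in self.circles
                 if (c.x - x) ** 2 + (c.y - y) ** 2 <= c.r ** 2), None)

        def on_touch_move(self, x, y):
            if self.selected:  # a touched circle follows the finger
                self.selected.x, self.selected.y = x, y

        def on_touch_up(self):
            self.selected = None

        def on_gesture(self, bent, clicked):
            # Called whenever the glove reports a new (bent, clicked) state.
            if self.selected is None:
                return
            self.selected.r *= 0.98 if bent else 1.02  # bend shrinks, extend grows
            if clicked:  # thumb click on the pressure sensor deletes the circle
                self.circles.remove(self.selected)
                self.selected = None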

IV. CONCLUSION

In this study, we proposed an interface that integrates touch gestures and hand gestures, and we developed a GUI application that uses the combined input. Compared to existing input methods that use either touching or hand gestures alone, the characteristic of this input method is that the drawbacks of the touch display and of hand gestures compensate for each other.

A future task is to recognize more gesture patterns and so make the interface more practical. The gesture patterns recognized in this study are only the click and the bending state of one finger, but more patterns could be recognized by adding sensors: for instance, a pressure sensor on the nail of the index finger that the user presses with the middle finger while touching. Furthermore, since touch input is performed with one hand, the other hand is generally free, so performing hand gestures with the free hand is both possible and effective: the user performs a touch gesture with the right hand while simultaneously performing a complicated hand gesture with the left. Beyond expanding each gesture set, combining touch gestures with hand gestures will multiply the number of patterns. In this study each gesture was recognized individually, but combination gestures, such as a four-finger tap while clicking the pressure sensor or a two-finger swipe while bending the ring finger, may also be available.

Both the hand gesture patterns and the touch gesture patterns have room to expand, and by combining them the gesture vocabulary of this interface can grow almost without limit. This interface is a prototype of a new kind of interface that combines touch gestures with hand gestures; we will continue to improve both the touch side and the gesture side and work toward making the interface practical.

REFERENCES

[1] K. Shima, H. Hakoda, T. Kurihara, B. Shizuki, and J. Tanaka, "Range Selection Using Three Points Touch as Start-up Gesture", IPSJ SIG Technical Report, 2014-HCI-159, pp. 1-8, 2014.
[2] T. Kuribara, Y. Mita, K. Onishi, B. Shizuki, and J. Tanaka, "HandyScope: A Remote Control Technique Using Circular Widget on Tabletops", Proceedings of the 16th International Conference on Human-Computer Interaction (HCI International 2014), LNCS 8511, pp. 69-80, Heraklion, Crete, Greece, June 22-27, 2014.
[3] G. J. Lepinski, T. Grossman, and G. Fitzmaurice, "The design and evaluation of multitouch marking menus", Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '10, New York, NY, USA, ACM, pp. 2233-2242, 2010.
[4] C. Harrison, J. Schwarz, and S. E. Hudson, "TapSense: enhancing finger interaction on touch surfaces", Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST '11, New York, NY, USA, ACM, pp. 627-636, 2011.
[5] S. Boring, D. Ledo, X. A. Chen, N. Marquardt, A. Tang, and S. Greenberg, "The fat thumb: using the thumb's contact size for single-handed mobile interaction", Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI '12, New York, NY, USA, ACM, pp. 39-48, 2012.
[6] D. Wigdor and D. Wixon, "Brave NUI World", pp. 9-15, Morgan Kaufmann, 2011.
[7] T. Sato, K. Fukuchi, and H. Koike, "OHAJIKI Interface: Flicking Gesture Recognition with a High-Speed Camera", IPSJ SIG Technical Report, 2014-HCI-159, pp. 1-8, 2014.
[8] U. Lee and J. Tanaka, "Finger Controller: Natural User Interaction Using Finger Gestures", Human-Computer Interaction Part IV (HCII 2013), LNCS 8007, pp. 281-290, 2013.
[9] J. Y. Han, "Low-cost multi-touch sensing through frustrated total internal reflection", Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST), pp. 115-118, 2005.
[10] S. Suto and S. Shibusawa, "A tabletop system using infrared image recognition for multi-user identification", INTERACT 2013, 2013.
[11] TBeta: http://ccv.nuigroup.com/
[12] L. Dipietro, A. M. Sabatini, and P. Dario, "A Survey of Glove-Based Systems and Their Applications", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 38, no. 4, pp. 461-482, July 2008.
[13] P. Kumar, J. Verma, and S. Prasad, "Hand Data Glove: A Wearable Real-Time Device for Human-Computer Interaction", International Journal of Advanced Science and Technology, vol. 43, June 2012.
[14] Y. D. Kataware and U. L. Bombale, "A Wearable Wireless Device for Effective Human Computer Interaction", International Journal of Computer Applications, vol. 99, no. 9, pp. 9-14, 2014.
Each of hand gesture pattern and touch gesture pattern has room for expanding, and by combining these gestures, the gesture pattern on this interface will become unlimited. This interface is a prototype of new interface which combines touch gesture with hand gesture. Therefore we will continue to improve each of touch and gesture interface, and try to make this interface practical one. REFERENCES [1] K. Shima, H. Hakoda, T. Kurihara, B. Sizuki, and J. Tanaka, Range Selection Using Three Points Touch as Start-up Gesture", IPSJ SIG Technical Report, 2014-HCI-159, pp.1 8, 2014. [2] T. Kuribara, Y. Mita, K. Onishi, B. Shizuki and J. Tanaka, "HandyScope: A Remote Control Technique Using Circular Widget on Tabletops", Proceedings of 16th International Conference on Human-Computer Interaction (HCI International 2014), LNCS 8511, pp.69-80, Heraklion, Crete, Greece, June 22 27, 2014. [3] G. J. Lepinski, T.Grossman, and G. Fitzmaurice, "The design and evaluation of multitouch marking menus", Proceedings of the SIGCHI Conference on Hu- man Factors in Computing Systems, CHI 10, New York, NY, USA, ACM, pp. 2233 2242, 2010. [4] C. Harrison, J. Schwarz, and S. E. Hudson, "TapSense: enhancing finger interaction on touch surfaces", Proceed- ings of the 24th annual ACM symposium on User interface software and technology, UIST 11, New York, NY, USA, ACM, pp. 627 636, 2011. [5] S. Boring, D. Ledo, X. A. Chen, N. Marquardt, A. Tang, and S. Greenberg, "The fat thumb: using the thumb s contact size for singlehanded mobile interaction", Pro- ceedings of the 14th international conference on Human-computer interaction with mobile devices and services, MobileHCI 12, New York, NY, USA, ACM, pp. 39 48, 2012. [6] D. Wigdor and D. Wixon, "Brave Nui World", pp. 9 15. Morgan Kaufmann,2011 [7] T. Sato, K. Fukuchi, and H. Koike, OHAJIKI Interface: Flicking Gesture Recognition with a High-Speed Camera", IPSJ SIG Technical Report, 2014-HCI-159, pp. 1 8, 2014. [8] U. Lee and J. Tanaka, " Finger Controller: Natural User Interaction Using Finger Gestures, Human-Computer Interaction Part IV( HCII 2013), LNCS 8007, pp. 281 290, 2013. [9] J. Y. Han, "Low-cost multi-touch sensing through frustrated total internal reflection", Proc. of the 18th Annual ACM Symposium on User Interface Software and Technology(UIST), pp.115 118, 2005. [10] S. Suto and S. Shibusawa, "A tabletop system using infrared image recognition for multi-user identification", INTERACT2013, 2013. [11] TBeta : http://ccv.nuigroup.com/ [12] L. Dipietro, A. M. Sabatini, and Paolo Dario, A Survey of Glove-Based Systems and Their Applications IEEE transactions on systems, man, and cybernetics part c: applications and reviews, vol. 38, no. 4, pp. 461 482, July 2008,. [13] P. Kumar, J. Verma and S. Prasad, "Hand Data Glove: A Wearable Real- Time Device for Human-Computer Interaction", International Journal of Advanced Science and Technology, Vol. 43, June, 2012. [14] Y. D. Kataware and U. L. Bombale, "A Wearable Wireless Device for Effective Human Computer Interaction." International Journal of Computer Applications 99.9, pp. 9 14, 2014. 86