Integration of Hand Gesture and Multi Touch Gesture with Glove Type Device
2016 4th Intl Conf on Applied Computing and Information Technology / 3rd Intl Conf on Computational Science/Intelligence and Applied Informatics / 1st Intl Conf on Big Data, Cloud Computing, Data Science & Engineering

Hayato Takahashi, Yuhki Kitazono
Advanced School of Creative Engineering
National Institute of Technology, Kitakyushu College
Kitakyushu, Japan

Abstract — In this study, we aimed to integrate multi-touch gestures with hand gestures to create a new input method. Multi-touch gestures and hand gestures have been studied separately; however, they can be integrated, yielding a new way of interacting with a computer. We built a large-scale multi-touch screen using the FTIR method, which uses infrared light to detect touched points, and a hand gesture recognition device equipped with a pressure sensor and a bend sensor. Finally, we integrated the two input streams to interact with the computer. The user inputs coordinates by touching, and simultaneously inputs actions by performing hand gestures. For instance, the user can select a target by touching it, and then act on the target by clicking the pressure sensor or bending the bend sensor. We demonstrated the input method on a GUI application.

Keywords — multi-touch; FTIR display; sensor; hand gesture

I. INTRODUCTION

In recent years, the multi-touch display has become a major way of interacting with computers. Multi-touch interaction lets the user interact more intuitively than with a mouse or keyboard. However, though multi-touch interaction is intuitive, it has difficulty providing a wide variety of operating commands. This is because a touch gesture is identified only by the number of fingers and their tracks, so only a few gestures achieve practical performance while remaining intuitive. Representative touch gestures are swipe, double tap, rotate, pinch, and hold. A typical multi-touch device such as a smartphone supports these gestures natively. Fig. 1 shows these touch gestures. As Shima et al.
proposed [1], a gesture can also be used as a trigger to start a specific input mode, not only to perform an action. In general, gesture patterns are designed to be performed with one hand, because an input device such as a smartphone is too small for two-handed interaction. However, since a large device such as a tablet or an interactive tabletop has enough display area for two hands, it is effective to interact with a combination of both hands. As an effective two-handed gesture pattern for a large multi-touch display, Kuribara et al. proposed a gesture named "HandyScope" [2]. HandyScope is designed to be performed with both hands: the user first taps with two fingers of one hand, then passes one finger of the other hand between those two fingers, and a menu bar is displayed between the two fingers and the one finger.

Fig. 1. Hand Gesture.

HandyScope is a study focused on how to display a menu. In addition, Lepinski et al. designed multi-touch marking menus [3]. A general gesture pattern such as swipe or pinch is identified by the track and number of touching fingers. However, the gesture set becomes complicated as it grows, because new gesture patterns must avoid conflicts with existing ones while gestures are distinguished only by the number and track of the touch points. To expand the gesture vocabulary, there have been studies that identify touch interaction by other factors, as in TapSense [4], which distinguishes the touching object, such as a nail or a knuckle. As another approach, Boring et al. proposed Fat Thumb [5], which distinguishes touches by the contact size of the finger. These studies expanded the gesture vocabulary; however, even commonly used operations such as copy and paste still need to be selected from a displayed menu. In other words, multi-touch interaction has no shortcut commands like keyboard interaction.
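Because gestures are distinguished only by the number and track of the touch points, a recognizer for the basic gesture set reduces to a small amount of track geometry. The following is a minimal illustrative sketch, not the paper's (or any framework's) implementation; the function name and pixel thresholds are hypothetical. It separates a two-finger pinch/spread from a swipe by comparing the change in inter-finger distance against the common translation of the fingers:

```python
import math

def classify_two_finger(track_a, track_b, move_thresh=30.0):
    """Classify a two-finger gesture from two tracks of (x, y) points.

    A pinch/spread changes the distance between the fingers; a swipe
    translates both fingers while keeping that distance roughly fixed.
    Thresholds are illustrative, in pixels.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # Change in finger separation over the whole track.
    spread_change = dist(track_a[-1], track_b[-1]) - dist(track_a[0], track_b[0])

    # Common translation: how far the midpoint between the fingers moved.
    mid_start = ((track_a[0][0] + track_b[0][0]) / 2,
                 (track_a[0][1] + track_b[0][1]) / 2)
    mid_end = ((track_a[-1][0] + track_b[-1][0]) / 2,
               (track_a[-1][1] + track_b[-1][1]) / 2)
    translation = dist(mid_start, mid_end)

    if abs(spread_change) > max(translation, move_thresh):
        return "pinch" if spread_change < 0 else "spread"
    if translation > move_thresh:
        return "swipe"
    return "hold"

# Two fingers moving toward each other -> pinch.
a = [(100, 100), (120, 100), (140, 100)]
b = [(300, 100), (280, 100), (260, 100)]
print(classify_two_finger(a, b))  # pinch
```

The conflict problem discussed above is visible even here: every new gesture needs another branch in the same decision structure, and the thresholds must keep the existing branches unambiguous.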
Multi-touch interaction is a very intuitive way to operate, but it is not as efficient as a keyboard. In addition to multi-touch interaction, contactless interaction, the NUI (Natural User Interface), has been studied in recent years [6]. Typical input methods of an NUI are hand gestures
and body language. For instance, Sato et al. proposed a hand gesture recognition method using a high-speed camera [7]. The system obtains the input angle of the hand by recognizing the virtual straight line between the fingertips of the thumb and index finger and the center of the circle formed by these fingers. As another instance, Lee proposed a method combining hand gestures and augmented reality [8]. In that system the user's fingertips are tracked with a camera, and the user can draw lines in the air: the track of the finger is acquired, and the line drawn by the finger is mapped onto the image captured by the camera. Even though hand gestures are useful for certain interactions, they are difficult to use as a general input method like a mouse or keyboard. Among the reasons are that the user has to keep the hands raised while operating, and that pointing accuracy is lower than with existing input methods. Based on this situation, we propose an interface that integrates a touch display with hand gestures. On this interface, the user can interact by touch and hand gesture simultaneously: operational information from the hand gesture is integrated with information about the touch points. By including such information, the touch operation vocabulary can be expanded without becoming complicated. In this study, we made a large-screen multi-touch display based on the FTIR method [9] and a glove-type device for hand gesture recognition. Additionally, we developed a GUI application to demonstrate the advantages of the input method.

II. SYSTEM

This system, which aims to integrate touch information with hand gesture input, consists of a multi-touch display and a hand gesture recognition device. Touch input and hand gestures are performed simultaneously, so the system is suited to a large screen. We therefore built the system as a tabletop device, with a large multi-touch display based on the FTIR method.
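The FTIR method used for the display relies on total internal reflection inside the acrylic plate. As a quick numeric check of the underlying physics (the refractive index of acrylic, about 1.49, is an assumed typical value for PMMA, not stated in the paper), Snell's law gives the critical angle beyond which infrared light stays confined in the plate:

```python
import math

# Total internal reflection occurs for incidence angles beyond the
# critical angle: theta_c = arcsin(n_air / n_acrylic).
n_acrylic = 1.49  # assumed typical value for PMMA (acrylic)
n_air = 1.00

theta_c = math.degrees(math.asin(n_air / n_acrylic))
print(f"critical angle ~ {theta_c:.1f} degrees")  # ~42.2 degrees
```

Light injected from the edge of the plate at shallow angles comfortably exceeds this critical angle at the top and bottom surfaces, which is why the infrared light bounces along inside the acrylic until a fingertip frustrates the reflection.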
Hand gesture information is integrated with the touch information, so gestures need to be performed while touching. For this reason, hand gestures are recognized with a sensor device attached to the hand. In this study, we use a pressure sensor and a bend sensor for gesture recognition. Using these sensors, the system detects the finger's bending condition and click input.

A. FTIR Multi-touch Display

We constructed a multi-touch display using the frustrated total internal reflection (FTIR) technique. FTIR multi-touch displays are widely used in tabletop systems [10]. An FTIR multi-touch display is basically composed of an acrylic plate, infrared LEDs, and an infrared camera. When light traveling in a medium meets an interface with a medium of lower refractive index, in this case acrylic to air, at an angle beyond the critical angle, it is not refracted out but undergoes total internal reflection (TIR). When infrared rays are injected from the side of the acrylic plate, total reflection occurs at the interfaces owing to the difference between the refractive indices of acrylic and air, and the infrared rays remain confined within the acrylic. However, total internal reflection does not occur at interfaces touched by something like a finger, and there the infrared rays diffuse into the air. By capturing these diffused infrared rays with an infrared camera from the opposite side, the coordinates of the touch point can be detected. That is the principle of the FTIR multi-touch display. Fig. 2 shows the principle graphically, and Fig. 3 is an actual image showing how the camera captures the diffused infrared light.

Fig. 2. Principle of FTIR Display.
Fig. 3. Infrared Ray Diffused by the Finger.
Fig. 4. Configuration of Display and Projector.

In this study, we use the FTIR display as a multi-touch input interface. To use it as a touch display and not just a touch plate, it is necessary to project the operating screen onto it; however, the acrylic is highly transparent, so the projector's light would pass straight through the plate. We therefore put tracing paper on the acrylic plate as a projection surface. The screen image projects well on tracing paper; however, if the tracing paper is placed directly on the FTIR display, total internal reflection is prevented. A 1 mm sheet of silicone rubber placed between the acrylic plate and the tracing paper solves this problem. The rubber has a rough texture and contains very minute bubbles; thanks to these, with the silicone rubber in contact with the acrylic plate, infrared light is diffused only where a finger presses. The touched positions can therefore be determined accurately. Furthermore, since the surface of the tracing paper is smooth, it is comfortable to swipe on. Fig. 4 illustrates the configuration of the projection screen. To adjust the size of the projection, the optical path is extended by reflecting the projector's light with a mirror.

The touch points' coordinates are obtained by capturing an infrared image with the camera on the opposite side of the touch surface. Fig. 5 shows the image captured from the opposite side of a touch screen being touched by five fingers. The infrared image indicates only the shapes and positions of the diffused light; it is therefore necessary to process this information and convert it into coordinate data on the screen.
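The core of this conversion is finding the centroid of each bright blob in the thresholded infrared frame. The following is a simplified stand-in for what the tracking software does, not its actual implementation; the function name and threshold are hypothetical, and the image is a plain 2D list of brightness values:

```python
def touch_centroids(image, thresh=128):
    """Find touch points in an infrared frame as centroids of bright blobs.

    `image` is a 2D list of brightness values (0-255); pixels above
    `thresh` are treated as diffused infrared light. Blobs are grouped
    with a 4-connected flood fill; each blob's centroid is one touch point.
    """
    h, w = len(image), len(image[0])
    visited = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if image[y][x] > thresh and not visited[y][x]:
                # Flood-fill one blob, collecting its pixel coordinates.
                stack, pixels = [(y, x)], []
                visited[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] > thresh and not visited[ny][nx]):
                            visited[ny][nx] = True
                            stack.append((ny, nx))
                xs = [p[1] for p in pixels]
                ys = [p[0] for p in pixels]
                centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids

# A 6x6 frame with two bright 2x2 blobs -> two touch points.
frame = [[0] * 6 for _ in range(6)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2), (4, 4), (4, 5), (5, 4), (5, 5)]:
    frame[y][x] = 255
print(touch_centroids(frame))  # [(1.5, 1.5), (4.5, 4.5)]
```

A real pipeline would additionally map these image-space centroids through a calibration homography into screen coordinates, which is one of the jobs the tracking software handles.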
To obtain the coordinates, the system must correlate positions in the image with coordinates on the touch screen, and remove background objects and shadows from the captured image. In this study, we used the open-source software TBeta [11]. This software includes all the functions this system needs: it converts touch positions into coordinate data on the screen, and it removes shadows and the background from the captured image by computing the difference between a background image and the captured image. It sends the input coordinate information obtained by processing the infrared image over TCP communication. The application window of TBeta is shown in Fig. 6; the upper-left image is the captured infrared image, and the upper-right image is the processed image. The touching position of each finger is detected by calculating the centroid of its white blob; in Fig. 7, the centroids of the fingers are indicated. Depending on the surroundings, ambient light may be captured by the infrared camera, as shown in Fig. 8. Even in such a situation, TBeta can eliminate ambient light with its Subtract Background function: while the function is active, non-moving objects, i.e., the background, are excluded from the capture target. The situation in which ambient light is erroneously detected is shown in Fig. 8, and the state after the Subtract Background function has been applied is shown in Fig. 9. Using TBeta, we could obtain the positions of the fingers regardless of ambient light or the positional relation between the camera and the touch display.

Fig. 5. Touched Screen.
Fig. 6. Window of TBeta.
Fig. 7. Indicated Centroid of the Finger.
Fig. 8. Erroneous Detection of Ambient Light.
Fig. 9. After Subtract Background Function Worked.

B. Hand Gesture Recognition Device

Hand gesture recognition devices have been studied to date [12][13][14]. In this study, because we aim to integrate hand gestures with touch information, hand gestures need to be performed while the user is touching the screen. During touch interaction the shape of the user's hand is unsettled, so general hand gestures that rely on the relative positions of the fingers are not suitable for this purpose. For this reason, we recognize hand gestures based on each finger's state: bending or extending, and clicking. The user's hand gestures are recognized with sensors mounted on the hand; the sensor mounting layout of the hand gesture recognition device is shown in Fig. 10.

Fig. 10. Hand Gesture Recognition Device.

The bend sensor's resistance changes depending on its curvature: about 32 kΩ when straight, about 53 kΩ when bent to 45 degrees, and about 74 kΩ at 90 degrees. Since the bend sensor reports its curving condition, mounting it along the ring finger makes it possible to detect the bending and stretching state of that finger. The pressure sensor is a thin sheet-like sensor with a rounded pressure-sensing section, fitted to the side of the index finger in Fig. 10. Its resistance changes from about 10 MΩ down to around 5 kΩ depending on the pressure exerted on its pressure-sensing section. By mounting it on the side of the index finger and pressing it, an operation like clicking can be performed while touching the display. Fig. 11 shows how the bend sensor attached to the finger bends along with it, and Fig. 12 shows the pressure-sensing section being clicked.

Fig. 11. Bending the Ring Finger.
Fig. 12. Clicking the Pressure Sensor.

By measuring and processing the resistance values of these sensors with an Arduino, the click gesture and the finger-bending gesture are detected. The detected hand gesture operations are sent to the application over serial communication: the clicking and bending status information is sent to the application and integrated with the touch information.

Fig. 13. Interaction on the Application: (a) Touching the Blue Circle; (b) Blue Circle Follows the Finger; (c) The Circle Becomes Smaller; (d) The Circle Becomes Bigger; (e) Clicking the Pressure Sensor; (f) The Circle Was Eliminated.
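The host-side integration can be sketched as follows. This is a hypothetical illustration, not the authors' code: it assumes the resistance values have already been decoded from the serial stream, the bend thresholds follow the paper's measurements (about 32 kΩ straight, 53 kΩ at 45 degrees, 74 kΩ at 90 degrees), and the click threshold, all names, and the circle representation are assumptions:

```python
# Midpoint between the "straight" and "bent to 45 degrees" readings
# separates extended from bent (values from the paper's measurements).
BEND_THRESH_OHMS = (32_000 + 53_000) / 2
# The pressure sensor drops from ~10 Mohm toward ~5 kohm when pressed;
# this click threshold is an assumption, not a measured value.
CLICK_THRESH_OHMS = 100_000

def classify_glove(bend_ohms, pressure_ohms):
    """Turn raw sensor resistances into a discrete gesture state."""
    return {
        "ring_finger": "bent" if bend_ohms > BEND_THRESH_OHMS else "extended",
        "click": pressure_ohms < CLICK_THRESH_OHMS,
    }

def integrate(touch_point, glove_state, circle):
    """Apply the glove gesture to the circle selected by the touch point.

    Mirrors the demo application's behavior: the touched circle follows
    the finger, bending shrinks it, extending grows it, clicking deletes it.
    """
    if touch_point is None or circle is None:
        return circle
    circle["x"], circle["y"] = touch_point   # circle follows the finger
    if glove_state["click"]:
        return None                          # click deletes the circle
    if glove_state["ring_finger"] == "bent":
        circle["r"] = max(5, circle["r"] - 2)  # bending shrinks it
    else:
        circle["r"] += 2                       # extending grows it
    return circle

circle = {"x": 0, "y": 0, "r": 20}
state = classify_glove(bend_ohms=70_000, pressure_ohms=9_000_000)
print(integrate((120, 80), state, circle))  # {'x': 120, 'y': 80, 'r': 18}
```

The key design point is that the touch stream contributes only a coordinate (which object is selected and where it goes), while the glove stream contributes the action; neither channel needs a larger gesture vocabulary to express the combination.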
III. GUI APPLICATION

We developed an application that uses touch coordinates, click information, and bending information for its operation. The graphical user interface of the application is shown in Fig. 13. Circles floating on the screen can be selected by touching, and a touched circle follows the finger. The selected circle grows when the ring finger, to which the bend sensor is attached, is extended, and shrinks when the finger is bent. When the pressure sensor is clicked with the thumb, the selected circle is eliminated. On a normal touch display these operations generally require more involved interaction; on this interface they can be performed by hand gesture. For example, changing the size of the circle would need a slider widget displayed on screen, and eliminating the circle would need a menu bar from which to select the delete operation.

Fig. 13 shows the interactions on the application; the application screen and the recognition device mounted on the user's hand are both visible in the figure. The circles float on the screen, and the user interacts with them by touching and performing hand gestures. In Fig. 13(a), the user touches the blue circle; the circle becomes the operating target while it is touched, and when the user slides the finger, the circle follows it, as shown in Fig. 13(b). The user changes the size of the touched circle by bending or extending the ring finger on which the bend sensor is mounted: when the ring finger is bent, the circle shrinks, as shown in Fig. 13(c), and when it is extended, the circle grows, as shown in Fig. 13(d). The user eliminates the selected circle by pressing the pressure sensor with the thumb while touching the circle. In Fig. 13(e) the user is pressing the pressure sensor, and a moment later the touched blue circle is eliminated, as shown in Fig. 13(f). This application demonstrates the features of the interface.

IV. CONCLUSION

In this study, we proposed an interface that integrates touch gestures and hand gestures, and we developed a GUI application that uses both kinds of input information. Compared with existing input methods that use either touch or hand gesture alone, the characteristic of this input method is that the drawbacks of the touch display and of hand gestures compensate for each other. A future task is to recognize more gesture patterns, to make the interface more practical. The gesture patterns recognized in this study are only the click and the bending condition of a finger; however, more gesture patterns could be recognized by adding sensors. For instance, a pressure sensor could be added on the nail of the index finger, which the user presses with the middle finger while touching. Furthermore, considering that touch input is performed by one hand, the other hand is generally unused; it is therefore possible and effective to perform a hand gesture with the other hand: the user performs a touch gesture with the right hand and simultaneously performs a complicated hand gesture with the left. In addition to expanding these gestures, combinations of touch gestures and hand gestures will multiply the gesture vocabulary. In this study each gesture was recognized individually; however, combination gestures such as a four-finger tap with a click of the pressure sensor, or a two-finger swipe while bending the ring finger, may be possible. Both the hand gesture patterns and the touch gesture patterns have room for expansion, and by combining them the gesture vocabulary of this interface grows without practical limit. This interface is a prototype of a new interface combining touch gestures with hand gestures; we will therefore continue to improve both the touch side and the gesture side and work to make the interface practical.

REFERENCES

[1] K. Shima, H. Hakoda, T. Kurihara, B. Shizuki, and J. Tanaka, "Range Selection Using Three Points Touch as Start-up Gesture", IPSJ SIG Technical Report, 2014-HCI-159, pp. 1-8.
[2] T. Kuribara, Y. Mita, K. Onishi, B. Shizuki, and J. Tanaka, "HandyScope: A Remote Control Technique Using Circular Widget on Tabletops", Proceedings of the 16th International Conference on Human-Computer Interaction (HCI International 2014), LNCS 8511, pp. 69-80, Heraklion, Crete, Greece, June 22-27.
[3] G. J. Lepinski, T. Grossman, and G. Fitzmaurice, "The design and evaluation of multitouch marking menus", Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '10, New York, NY, USA, ACM.
[4] C. Harrison, J. Schwarz, and S. E. Hudson, "TapSense: enhancing finger interaction on touch surfaces", Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST '11, New York, NY, USA, ACM.
[5] S. Boring, D. Ledo, X. A. Chen, N. Marquardt, A. Tang, and S. Greenberg, "The fat thumb: using the thumb's contact size for single-handed mobile interaction", Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI '12, New York, NY, USA, ACM.
[6] D. Wigdor and D. Wixon, "Brave NUI World", Morgan Kaufmann, 2011.
[7] T. Sato, K. Fukuchi, and H. Koike, "OHAJIKI Interface: Flicking Gesture Recognition with a High-Speed Camera", IPSJ SIG Technical Report, 2014-HCI-159, pp. 1-8.
[8] U. Lee and J. Tanaka, "Finger Controller: Natural User Interaction Using Finger Gestures", Human-Computer Interaction Part IV (HCII 2013), LNCS 8007.
[9] J. Y. Han, "Low-cost multi-touch sensing through frustrated total internal reflection", Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST), 2005.
[10] S. Suto and S. Shibusawa, "A tabletop system using infrared image recognition for multi-user identification", INTERACT 2013.
[11] TBeta:
[12] L. Dipietro, A. M. Sabatini, and P. Dario, "A Survey of Glove-Based Systems and Their Applications", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 38, no. 4, July 2008.
[13] P. Kumar, J. Verma, and S. Prasad, "Hand Data Glove: A Wearable Real-Time Device for Human-Computer Interaction", International Journal of Advanced Science and Technology, vol. 43, June.
[14] Y. D. Kataware and U. L. Bombale, "A Wearable Wireless Device for Effective Human Computer Interaction", International Journal of Computer Applications 99.9, pp. 9-14.
More informationDevelopment of Video Chat System Based on Space Sharing and Haptic Communication
Sensors and Materials, Vol. 30, No. 7 (2018) 1427 1435 MYU Tokyo 1427 S & M 1597 Development of Video Chat System Based on Space Sharing and Haptic Communication Takahiro Hayashi 1* and Keisuke Suzuki
More informationMonoTouch: Single Capacitive Touch Sensor that Differentiates Touch Gestures
MonoTouch: Single Capacitive Touch Sensor that Differentiates Touch Gestures Ryosuke Takada University of Tsukuba 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8573, Japan rtakada@iplab.cs.tsukuba.ac.jp Buntarou
More informationGesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS
Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS Abstract Over the years from entertainment to gaming market,
More informationWhat was the first gestural interface?
stanford hci group / cs247 Human-Computer Interaction Design Studio What was the first gestural interface? 15 January 2013 http://cs247.stanford.edu Theremin Myron Krueger 1 Myron Krueger There were things
More information3D Interactions with a Passive Deformable Haptic Glove
3D Interactions with a Passive Deformable Haptic Glove Thuong N. Hoang Wearable Computer Lab University of South Australia 1 Mawson Lakes Blvd Mawson Lakes, SA 5010, Australia ngocthuong@gmail.com Ross
More informationMixed Reality Approach and the Applications using Projection Head Mounted Display
Mixed Reality Approach and the Applications using Projection Head Mounted Display Ryugo KIJIMA, Takeo OJIKA Faculty of Engineering, Gifu University 1-1 Yanagido, GifuCity, Gifu 501-11 Japan phone: +81-58-293-2759,
More informationNTT DOCOMO Technical Journal. 1. Introduction. 2. Process of Popularizing Glasses-Type Devices
Wearable Device Cloud Service Intelligent Glass This article presents an overview of Intelligent Glass exhibited at CEATEC JAPAN 2013. Google Glass * 1 has brought high expectations for glasses-type devices,
More informationInteractive Coffee Tables: Interfacing TV within an Intuitive, Fun and Shared Experience
Interactive Coffee Tables: Interfacing TV within an Intuitive, Fun and Shared Experience Radu-Daniel Vatavu and Stefan-Gheorghe Pentiuc University Stefan cel Mare of Suceava, Department of Computer Science,
More informationProjection Based HCI (Human Computer Interface) System using Image Processing
GRD Journals- Global Research and Development Journal for Volume 1 Issue 5 April 2016 ISSN: 2455-5703 Projection Based HCI (Human Computer Interface) System using Image Processing Pankaj Dhome Sagar Dhakane
More informationHumera Syed 1, M. S. Khatib 2 1,2
A Hand Gesture Recognition Approach towards Shoulder Wearable Computing Humera Syed 1, M. S. Khatib 2 1,2 CSE, A.C.E.T/ R.T.M.N.U, India ABSTRACT: Human Computer Interaction needs computer systems and
More informationInteraction Technique for a Pen-Based Interface Using Finger Motions
Interaction Technique for a Pen-Based Interface Using Finger Motions Yu Suzuki, Kazuo Misue, and Jiro Tanaka 1-1-1 Tennodai, Tsukuba, Ibaraki, 305-8573, Japan {suzuki,misue,jiro}@iplab.cs.tsukuba.ac.jp
More informationTACTUS: A Hardware and Software Testbed for Research in Multi-Touch Interaction
TACTUS: A Hardware and Software Testbed for Research in Multi-Touch Interaction Paul Varcholik, Joseph J. Laviola Jr., Denise Nicholson Institute for Simulation & Training University of Central Florida
More informationGesture Recognition with Real World Environment using Kinect: A Review
Gesture Recognition with Real World Environment using Kinect: A Review Prakash S. Sawai 1, Prof. V. K. Shandilya 2 P.G. Student, Department of Computer Science & Engineering, Sipna COET, Amravati, Maharashtra,
More informationInfrared Touch Screen Sensor
Infrared Touch Screen Sensor Umesh Jagtap 1, Abhay Chopde 2, Rucha Karanje 3, Tejas Latne 4 1, 2, 3, 4 Vishwakarma Institute of Technology, Department of Electronics Engineering, Pune, India Abstract:
More informationDevelopment a File Transfer Application by Handover for 3D Video Communication System in Synchronized AR Space
Development a File Transfer Application by Handover for 3D Video Communication System in Synchronized AR Space Yuki Fujibayashi and Hiroki Imamura Department of Information Systems Science, Graduate School
More informationTransporters: Vision & Touch Transitive Widgets for Capacitive Screens
Transporters: Vision & Touch Transitive Widgets for Capacitive Screens Florian Heller heller@cs.rwth-aachen.de Simon Voelker voelker@cs.rwth-aachen.de Chat Wacharamanotham chat@cs.rwth-aachen.de Jan Borchers
More informationGeo-Located Content in Virtual and Augmented Reality
Technical Disclosure Commons Defensive Publications Series October 02, 2017 Geo-Located Content in Virtual and Augmented Reality Thomas Anglaret Follow this and additional works at: http://www.tdcommons.org/dpubs_series
More informationMultitouch Finger Registration and Its Applications
Multitouch Finger Registration and Its Applications Oscar Kin-Chung Au City University of Hong Kong kincau@cityu.edu.hk Chiew-Lan Tai Hong Kong University of Science & Technology taicl@cse.ust.hk ABSTRACT
More informationVisual Touchpad: A Two-handed Gestural Input Device
Visual Touchpad: A Two-handed Gestural Input Device Shahzad Malik, Joe Laszlo Department of Computer Science University of Toronto smalik jflaszlo @ dgp.toronto.edu http://www.dgp.toronto.edu ABSTRACT
More informationHand Data Glove: A Wearable Real-Time Device for Human- Computer Interaction
Hand Data Glove: A Wearable Real-Time Device for Human- Computer Interaction Piyush Kumar 1, Jyoti Verma 2 and Shitala Prasad 3 1 Department of Information Technology, Indian Institute of Information Technology,
More informationFinger Posture and Shear Force Measurement using Fingernail Sensors: Initial Experimentation
Proceedings of the 1 IEEE International Conference on Robotics & Automation Seoul, Korea? May 16, 1 Finger Posture and Shear Force Measurement using Fingernail Sensors: Initial Experimentation Stephen
More informationInvestigating Gestures on Elastic Tabletops
Investigating Gestures on Elastic Tabletops Dietrich Kammer Thomas Gründer Chair of Media Design Chair of Media Design Technische Universität DresdenTechnische Universität Dresden 01062 Dresden, Germany
More informationThe Control of Avatar Motion Using Hand Gesture
The Control of Avatar Motion Using Hand Gesture ChanSu Lee, SangWon Ghyme, ChanJong Park Human Computing Dept. VR Team Electronics and Telecommunications Research Institute 305-350, 161 Kajang-dong, Yusong-gu,
More informationComparison of Haptic and Non-Speech Audio Feedback
Comparison of Haptic and Non-Speech Audio Feedback Cagatay Goncu 1 and Kim Marriott 1 Monash University, Mebourne, Australia, cagatay.goncu@monash.edu, kim.marriott@monash.edu Abstract. We report a usability
More informationDepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface
DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface Hrvoje Benko and Andrew D. Wilson Microsoft Research One Microsoft Way Redmond, WA 98052, USA
More informationExpanding Touch Input Vocabulary by Using Consecutive Distant Taps
Expanding Touch Input Vocabulary by Using Consecutive Distant Taps Seongkook Heo, Jiseong Gu, Geehyuk Lee Department of Computer Science, KAIST Daejeon, 305-701, South Korea seongkook@kaist.ac.kr, jiseong.gu@kaist.ac.kr,
More informationCOMPARATIVE STUDY AND ANALYSIS FOR GESTURE RECOGNITION METHODOLOGIES
http:// COMPARATIVE STUDY AND ANALYSIS FOR GESTURE RECOGNITION METHODOLOGIES Rafiqul Z. Khan 1, Noor A. Ibraheem 2 1 Department of Computer Science, A.M.U. Aligarh, India 2 Department of Computer Science,
More informationAnalysing Different Approaches to Remote Interaction Applicable in Computer Assisted Education
47 Analysing Different Approaches to Remote Interaction Applicable in Computer Assisted Education Alena Kovarova Abstract: Interaction takes an important role in education. When it is remote, it can bring
More informationFrom Room Instrumentation to Device Instrumentation: Assessing an Inertial Measurement Unit for Spatial Awareness
From Room Instrumentation to Device Instrumentation: Assessing an Inertial Measurement Unit for Spatial Awareness Alaa Azazi, Teddy Seyed, Frank Maurer University of Calgary, Department of Computer Science
More informationQS Spiral: Visualizing Periodic Quantified Self Data
Downloaded from orbit.dtu.dk on: May 12, 2018 QS Spiral: Visualizing Periodic Quantified Self Data Larsen, Jakob Eg; Cuttone, Andrea; Jørgensen, Sune Lehmann Published in: Proceedings of CHI 2013 Workshop
More informationOmni-Directional Catadioptric Acquisition System
Technical Disclosure Commons Defensive Publications Series December 18, 2017 Omni-Directional Catadioptric Acquisition System Andreas Nowatzyk Andrew I. Russell Follow this and additional works at: http://www.tdcommons.org/dpubs_series
More informationA Kinect-based 3D hand-gesture interface for 3D databases
A Kinect-based 3D hand-gesture interface for 3D databases Abstract. The use of natural interfaces improves significantly aspects related to human-computer interaction and consequently the productivity
More informationIllusion of Surface Changes induced by Tactile and Visual Touch Feedback
Illusion of Surface Changes induced by Tactile and Visual Touch Feedback Katrin Wolf University of Stuttgart Pfaffenwaldring 5a 70569 Stuttgart Germany katrin.wolf@vis.uni-stuttgart.de Second Author VP
More informationInternational Journal of Computer Engineering and Applications, Volume XII, Issue IV, April 18, ISSN
International Journal of Computer Engineering and Applications, Volume XII, Issue IV, April 18, www.ijcea.com ISSN 2321-3469 AUGMENTED REALITY FOR HELPING THE SPECIALLY ABLED PERSONS ABSTRACT Saniya Zahoor
More informationUbiBeam: An Interactive Projector-Camera System for Domestic Deployment
UbiBeam: An Interactive Projector-Camera System for Domestic Deployment Jan Gugenheimer, Pascal Knierim, Julian Seifert, Enrico Rukzio {jan.gugenheimer, pascal.knierim, julian.seifert3, enrico.rukzio}@uni-ulm.de
More informationMobile Multi-Display Environments
Jens Grubert and Matthias Kranz (Editors) Mobile Multi-Display Environments Advances in Embedded Interactive Systems Technical Report Winter 2016 Volume 4, Issue 2. ISSN: 2198-9494 Mobile Multi-Display
More informationA SURVEY ON GESTURE RECOGNITION TECHNOLOGY
A SURVEY ON GESTURE RECOGNITION TECHNOLOGY Deeba Kazim 1, Mohd Faisal 2 1 MCA Student, Integral University, Lucknow (India) 2 Assistant Professor, Integral University, Lucknow (india) ABSTRACT Gesture
More informationExTouch: Spatially-aware embodied manipulation of actuated objects mediated by augmented reality
ExTouch: Spatially-aware embodied manipulation of actuated objects mediated by augmented reality The MIT Faculty has made this article openly available. Please share how this access benefits you. Your
More informationSocial Viewing in Cinematic Virtual Reality: Challenges and Opportunities
Social Viewing in Cinematic Virtual Reality: Challenges and Opportunities Sylvia Rothe 1, Mario Montagud 2, Christian Mai 1, Daniel Buschek 1 and Heinrich Hußmann 1 1 Ludwig Maximilian University of Munich,
More informationSTRUCTURE SENSOR QUICK START GUIDE
STRUCTURE SENSOR 1 TABLE OF CONTENTS WELCOME TO YOUR NEW STRUCTURE SENSOR 2 WHAT S INCLUDED IN THE BOX 2 CHARGING YOUR STRUCTURE SENSOR 3 CONNECTING YOUR STRUCTURE SENSOR TO YOUR IPAD 4 Attaching Structure
More informationSketchpad Ivan Sutherland (1962)
Sketchpad Ivan Sutherland (1962) 7 Viewable on Click here https://www.youtube.com/watch?v=yb3saviitti 8 Sketchpad: Direct Manipulation Direct manipulation features: Visibility of objects Incremental action
More informationShopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction
Shopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction Minghao Cai 1(B), Soh Masuko 2, and Jiro Tanaka 1 1 Waseda University, Kitakyushu, Japan mhcai@toki.waseda.jp, jiro@aoni.waseda.jp
More informationTwo-Handed Interactive Menu: An Application of Asymmetric Bimanual Gestures and Depth Based Selection Techniques
Two-Handed Interactive Menu: An Application of Asymmetric Bimanual Gestures and Depth Based Selection Techniques Hani Karam and Jiro Tanaka Department of Computer Science, University of Tsukuba, Tennodai,
More informationDevelopment of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane
Journal of Communication and Computer 13 (2016) 329-337 doi:10.17265/1548-7709/2016.07.002 D DAVID PUBLISHING Development of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane
More informationMudpad: Fluid Haptics for Multitouch Surfaces
Mudpad: Fluid Haptics for Multitouch Surfaces Yvonne Jansen RWTH Aachen University 52056 Aachen, Germany yvonne@cs.rwth-aachen.de Abstract In this paper, we present an active haptic multitouch input device.
More informationTouch Interfaces. Jeff Avery
Touch Interfaces Jeff Avery Touch Interfaces In this course, we have mostly discussed the development of web interfaces, with the assumption that the standard input devices (e.g., mouse, keyboards) are
More informationEnhancing Traffic Visualizations for Mobile Devices (Mingle)
Enhancing Traffic Visualizations for Mobile Devices (Mingle) Ken Knudsen Computer Science Department University of Maryland, College Park ken@cs.umd.edu ABSTRACT Current media for disseminating traffic
More informationMultimodal Interaction Concepts for Mobile Augmented Reality Applications
Multimodal Interaction Concepts for Mobile Augmented Reality Applications Wolfgang Hürst and Casper van Wezel Utrecht University, PO Box 80.089, 3508 TB Utrecht, The Netherlands huerst@cs.uu.nl, cawezel@students.cs.uu.nl
More informationDevelopment of excavator training simulator using leap motion controller
Journal of Physics: Conference Series PAPER OPEN ACCESS Development of excavator training simulator using leap motion controller To cite this article: F Fahmi et al 2018 J. Phys.: Conf. Ser. 978 012034
More informationHAND-SHAPED INTERFACE FOR INTUITIVE HUMAN- ROBOT COMMUNICATION THROUGH HAPTIC MEDIA
HAND-SHAPED INTERFACE FOR INTUITIVE HUMAN- ROBOT COMMUNICATION THROUGH HAPTIC MEDIA RIKU HIKIJI AND SHUJI HASHIMOTO Department of Applied Physics, School of Science and Engineering, Waseda University 3-4-1
More informationHow to Create a Touchless Slider for Human Interface Applications
How to Create a Touchless Slider for Human Interface Applications By Steve Gerber, Director of Human Interface Products Silicon Laboratories Inc., Austin, TX Introduction Imagine being able to control
More informationHUMAN COMPUTER INTERFACE
HUMAN COMPUTER INTERFACE TARUNIM SHARMA Department of Computer Science Maharaja Surajmal Institute C-4, Janakpuri, New Delhi, India ABSTRACT-- The intention of this paper is to provide an overview on the
More informationVein and Fingerprint Identification Multi Biometric System: A Novel Approach
Vein and Fingerprint Identification Multi Biometric System: A Novel Approach Hatim A. Aboalsamh Abstract In this paper, a compact system that consists of a Biometrics technology CMOS fingerprint sensor
More information