Double-side Multi-touch Input for Mobile Devices


Double-side multi-touch input enables more manipulation methods.

Erh-li (Early) Shen (b94502024@ntu.edu.tw), Jane Yung-jen Hsu (yjhsu@csie.ntu.edu.tw), Sung-sheng (Daniel) Tsai (b94901075@ntu.edu.tw), Chi-wen (Euro) Chen (b93501005@csie.ntu.edu.tw), and Hao-Hua Chu (hchu@csie.ntu.edu.tw)
National Taiwan University, 1, Sec. 4, Roosevelt Rd.

Copyright is held by the author/owner(s). CHI 2009, April 4-9, 2009, Boston, MA, USA. ACM 978-1-60558-246-7/09/04.

Abstract
We present a new mobile interaction model, called double-side multi-touch, based on a mobile device that receives simultaneous multi-touch input from both the front and the back of the device. This new double-side multi-touch mobile interaction model enables intuitive finger gestures for manipulating 3D objects and user interfaces on a 2D screen.

Keywords
Double-side Multi-touch, Finger Touch Gesture, Mobile Interaction

ACM Classification Keywords
H5.2 [User Interfaces]: Input devices and strategies, Interaction styles.

Introduction
A recent trend in smart phones is the move toward larger, higher-resolution screens that give users and applications more working screen space. To accommodate this trend, many smart phones, such as the Apple iPhone and the HTC Touch Diamond, have replaced physical keyboards and keypads with stylus or direct finger input on a touch screen.

Figure 1. The grab gesture. The transparency of the iPod in the photo is adjusted to show the relative position of the fingers. Two colored circles on screen indicate the positions of the touch points.

Recently, HybridTouch [7] added a single-touch pad to the rear side of a small-screen device to enable interactive use of the back of the device. In this interaction model, users perform simple controls, such as scrolling, from the device's rear side with their non-dominant hand. This leaves the dominant hand free to hold a stylus and interact with the main front-side screen. LucidTouch [8] also enables users to operate a touch screen from the device's rear side. Its goal is to address the "fat finger" problem, where a finger occludes small targets on a mobile device's screen. By putting the touch surface on the device's rear side, users can see targets on the main screen without obstruction. To give visual feedback about the position of a rear-side finger, a transparent interface is used: a rear-side camera captures the finger's position and displays the finger's shadow on the front screen.

Rather than extending the device's interaction surface or solving the finger-occlusion problem, our work aims at providing a simple and intuitive interaction method for manipulating 3D objects on a mobile device. Our proposed interaction model, called double-side multi-touch, is based on a mobile device that supports simultaneous multi-touch input from both the front and back sides of the device. To demonstrate the use of the double-side multi-touch interaction model, we explore the design space of touch gestures for manipulating 3D objects on a mobile device.

Double-side Multi-touch Interaction
Although multi-touch input enables scrolling and zooming of a document, manipulation is constrained to two dimensions over the horizontal and vertical planes of a mobile device's 2D touch screen. By adding touch input from the device's back side, the degrees of freedom available for manipulation can be extended to a pseudo three dimensions. This 3D space concept follows the under-table interaction work [9], in which the virtual 3D space shown in the device's display is a fixed volume sandwiched between the front and back surfaces of the device. We extended this table-based 3D concept to a double-side multi-touch interaction model on a mobile device, and then created a set of touch gestures for manipulating 3D objects.

In traditional single-side touch interaction, a 3D object is manipulated by touching only one face of the object. In contrast, manipulating a real-world object in physical 3D space involves a rich set of multi-finger actions such as grabbing, waving, pushing, or flipping the target object. If the object is soft or elastic, it can also be stretched or kneaded into different shapes. Our double-side multi-touch interaction model provides double-side, multi-finger touch gestures similar to those used to manipulate 3D objects in the physical world. Each of these gestures is described below.

Grab
In physical space, grabbing an object (e.g., a coin or card) by hand involves at least two fingers applying opposing pressure on two or more surfaces of the target object. When mapping this physical grab action onto the double-side multi-touch input device, the grab gesture is performed by two fingers simultaneously touching the target object: one top finger touches the target object on the device's front-side screen, and the bottom finger touches the target object on the device's rear surface. Figure 1 illustrates this grab gesture.
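As a rough illustration of how such a gesture might be detected, the following minimal Python sketch (not from the paper's prototype; all type and field names are illustrative assumptions) treats a target as grabbed when front- and back-side touch points fall inside its bounds at the same time.

```python
from dataclasses import dataclass

@dataclass
class Touch:
    x: float
    y: float
    side: str  # "front" or "back"; hypothetical labels

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, t: Touch) -> bool:
        return (self.x <= t.x <= self.x + self.w
                and self.y <= t.y <= self.y + self.h)

def is_grabbed(target: Rect, touches: list) -> bool:
    # A target counts as grabbed when at least one front-side and one
    # back-side touch point lie inside its bounds simultaneously.
    front = any(t.side == "front" and target.contains(t) for t in touches)
    back = any(t.side == "back" and target.contains(t) for t in touches)
    return front and back
```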

Figure 2. The drag gesture involves moving the front-side and back-side fingers together simultaneously in the same direction.

Figure 3. The push-up gesture involves the back-side finger touching the target object.

Drag (x and y axes)
In physical space, dragging a small object by hand involves first grabbing the object with two fingers and then pulling it by moving both fingers together in one target direction. When mapping this physical drag action onto the double-side multi-touch input device, the drag gesture follows a two-step process. (1) The target object must first be grabbed by two fingers simultaneously touching it from the front and back sides of the device. (2) The target object is then pulled with both fingers sliding together in the same direction over the horizontal and/or vertical planes. Figure 2 shows this drag gesture.

Push toward/away (z axis)
Pushing an object up or down requires fingers touching the target object. Touching the object on the device's front-side screen pushes the object down and away from the user, while touching the object on the device's back pushes the object up and toward the user. Figure 3 illustrates this push-toward gesture.

Flip right/left/up/down (x and y axes)
In physical space, flipping an object, such as a coin, by hand involves first grabbing the object with two fingers and then flipping it by sliding the two fingers in opposite directions. When mapping this physical flip action onto the double-side multi-touch input device, the flip gesture follows a two-step process. (1) The target object must first be grabbed by two fingers, as described previously. (2) The target object is then flipped by sliding the two fingers in opposite directions. A left flip involves sliding the top finger to the left on the device's front-side screen while sliding the bottom finger to the right on the device's back-side surface. Similarly, a right/up/down flip involves sliding the top finger to the right/up/down on the device's front-side screen while sliding the bottom finger to the left/down/up on the device's back-side surface. Figure 4 illustrates this left-flip gesture.

Figure 4. The flip-left gesture involves one front-side finger moving to the left and the back-side finger moving to the right.

Stretch (x, y and z axes)
In physical space, stretching a soft or elastic object by hand involves first grabbing the object at several places and then applying varying forces at those places to stretch or knead it. When mapping this physical stretch action onto the double-side multi-touch input device, the stretch gesture follows a two-step process. (1) The target object must first be grabbed by two or more sets of fingers. (2) The target object is then stretched by the fingers sliding in different directions. Figure 5 shows this stretch-wide gesture: four fingers are involved; the first two grab the right side of the target object and slide together to the right, while the other two grab the left side and slide together to the left.
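To make the distinctions among these gestures concrete, here is a hedged Python sketch of one possible per-frame classifier. The use of displacement vectors, the threshold value, and all names are our own assumptions, not details from the paper.

```python
import math

def classify(front_delta, back_delta, min_move=5.0):
    """Classify one frame of finger motion on a grabbed target object.

    front_delta / back_delta: (dx, dy) displacement of the front- and
    back-side finger on the target, or None if that side is untouched.
    Screen coordinates: +x is right, +y is down.
    """
    if front_delta is None and back_delta is None:
        return "none"
    if front_delta is None:
        return "push-toward"   # only the back-side finger touches: push up
    if back_delta is None:
        return "push-away"     # only the front-side finger touches: push down
    fx, fy = front_delta
    bx, by = back_delta
    # Both fingers must move to leave the grab state.
    if math.hypot(fx, fy) < min_move or math.hypot(bx, by) < min_move:
        return "grab"
    if fx * bx + fy * by > 0:
        return "drag"          # fingers slide in the same direction
    # Opposite motion: the front finger's dominant axis names the flip.
    if abs(fx) >= abs(fy):
        return "flip-left" if fx < 0 else "flip-right"
    return "flip-up" if fy < 0 else "flip-down"
```

The stretch gesture would extend this scheme to two or more such finger pairs, each classified independently and moving apart.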

Figure 5. The stretch gesture involves two grabbing actions and two hands moving in different directions.

Prototype
A double-side multi-touch device prototype was created, as shown in Figure 6, by attaching two iPod touch devices back to back. The rear iPod touch serves as the back-side multi-touch pad. Its touch input signals, including the locations of the touch points, are transmitted to the front iPod touch through an ad-hoc WiFi connection (a sketch of this relay appears after the user study below). Note that the current iPod touch supports at most five simultaneous touch points; when a sixth finger touches the screen, all touch events are cancelled. We found that five touch points are sufficient to implement our double-side, multi-finger touch gestures.

Figure 6. Prototype of our double-side multi-touch input mobile device. The locations of back-side touch points are displayed to provide visual feedback.

Our preliminary prototype focused on 3D object manipulation through double-side gestures. In the next prototype, we will try to improve the precision of back-side pointing through visual feedback. We propose that by displaying cursors mapped to the back-side touch points, users can point at targets more accurately. With this kind of visual feedback, we can separate the cursor-moving and selecting actions between the rear- and front-side fingers, respectively: the back-side finger moves the cursor to the target, and the front-side finger then taps the screen to complete the selection.

Preliminary user study
In an exploratory study, we invited three users to test our double-side multi-touch device and touch gestures. As the users performed a set of 3D object manipulations, including dragging, flipping, and pushing, we were interested in answering two questions: (1) How intuitive or easy to learn are the double-side multi-touch gestures? (2) How does the double-side multi-touch interface compare with the existing single-side multi-touch interface? Note that a more extensive study with more subjects will be needed to obtain valid answers. However, our preliminary results suggest that all three users found our double-side multi-touch gestures easy to learn. In trials, the time needed to learn the double-side multi-touch interface was no longer than the time needed to learn the existing single-side multi-touch interface. Our preliminary results also suggest that the time needed to perform a set of 3D object manipulations using our double-side multi-touch gestures was no longer than with the corresponding gestures on the existing single-side multi-touch interface.
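The sketch referenced in the Prototype section above: a minimal Python illustration of relaying back-side touch points to the front device. The actual prototype ran on iPod touch hardware; the UDP transport, JSON message format, address, and port here are purely illustrative assumptions.

```python
import json
import socket

FRONT_DEVICE = ("192.168.1.2", 9000)   # assumed ad-hoc WiFi address and port

def send_back_touches(touches):
    """On the rear device: serialize back-side touch points, e.g.
    [(120.0, 305.5), ...], and send them to the front device."""
    msg = json.dumps({"side": "back", "points": touches}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, FRONT_DEVICE)

def receive_back_touches(port=9000):
    """On the front device: yield decoded back-side touch frames, which
    can then be mirrored on screen as feedback cursors."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        while True:
            data, _ = sock.recvfrom(4096)
            yield json.loads(data)["points"]
```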

Related Work
Unimanual and bimanual interaction
Forlines and Wigdor [4] conducted an experiment to compare unimanual and bimanual direct-touch and mouse input. Their results show that users benefit from direct-touch input in bimanual tasks. A study by Moscovich and Hughes [5] reveals that two-handed multi-touch manipulation is better than one-handed multi-touch in object manipulation tasks, but only when there is a clear correspondence between fingers and control points.

Precision pointing using touch input
In addition to two well-known techniques, Zoom-Pointing and Take-Off, the high-precision touch screen interaction project [1] proposes two complementary methods: Cross-Keys and Precision-Handle. The former uses virtual keys to move a cursor with crosshairs, while the latter amplifies finger movement with an analog handle. Their work improves pointing interaction to the pixel level but has difficulty when targets are near the edge of the screen. Benko et al. [3] developed a technique called Dual Finger Selection that enhances the precision of selection on a multi-touch screen. The technique achieves pixel-level targeting by dynamically adjusting the control-display ratio with a secondary finger while the primary finger moves the cursor (see the sketch at the end of this section).

Back-side interaction
The idea of back-side touch was first introduced in the Gummi project [6], which showed the possibility of using back-side touch to interact with a physically bendable device. Under-table interaction [9] combines two touch surfaces in one table. Since users cannot see the touch points on the underside of the table, it provides visual feedback on the topside screen to show those touch points, which improves touch-point precision on the underside. HybridTouch [7] expands the interaction surface of a mobile device to its back side by attaching a single-touch pad there, enabling the non-dominant hand to perform document scrolling on the back-side touch pad. LucidTouch [8] develops pseudo-transparency for a mobile device's back-side interaction: a camera extended from the device's back side captures the locations of the fingers operating there, which are then shown on the front screen. Wobbrock et al. [11] also conducted a series of experiments to compare the performance of the index finger and thumb on the front and rear sides of a mobile device. In contrast, our work emphasizes finger gestures that involve simultaneously touching both sides of a mobile device. These double-side multi-touch gestures are suitable for manipulating 3D objects or interfaces on a mobile device.
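The sketch referenced in the precision-pointing discussion above: a minimal Python illustration of the control-display ratio idea behind Dual Finger Selection [3]. The mapping function and all parameters are our own assumptions, not Benko et al.'s implementation.

```python
def cd_ratio(secondary_distance, base=1.0, max_slowdown=10.0):
    """Map the distance between primary and secondary fingers to a
    control-display ratio: the wider the spread, the slower the cursor."""
    return min(base + secondary_distance / 100.0, max_slowdown)

def update_cursor(cursor, finger_delta, secondary_distance=None):
    """Move the cursor by the primary finger's displacement, scaled
    down when a secondary finger is on the screen."""
    ratio = (cd_ratio(secondary_distance)
             if secondary_distance is not None else 1.0)
    return (cursor[0] + finger_delta[0] / ratio,
            cursor[1] + finger_delta[1] / ratio)
```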

Future Work and Conclusion
The double-side multi-touch input device provides new possibilities for interfaces and for the manipulation of 3D objects on mobile devices. We have proposed several touch gestures that simulate how we use our fingers to interact with objects in the physical world, including grabbing, dragging, pushing, flipping, and stretching, and we have created a preliminary prototype implementing these finger gestures. In future work, we will conduct a more extensive user study to evaluate the double-side multi-touch interaction model and, based on user feedback, improve and fine-tune it.

We are also interested in applying a physics engine to simulate what happens when force is exerted on the surfaces of an object. A soft-body object would bend, and if it is elastic, its shape would recover when the exerted forces disappear or weaken. A rigid-body object could be torn apart or broken into pieces by grabbing two edges and pulling them in opposite directions. We are further interested in mobile applications, e.g., games, that can utilize the features of double-side multi-touch interaction. In all, we believe that double-side multi-touch interaction is promising for future development.

Acknowledgements
We would like to thank Shih-Yen Liu for his valuable comments regarding our idea.

References
[1] Albinsson, P.-A. and Zhai, S. High precision touch screen interaction. In Proceedings of CHI 2003, pp. 105-112. ACM Press, New York, NY, USA, 2003.
[2] Balakrishnan, R. and Hinckley, K. Symmetric bimanual interaction. In Proceedings of CHI 2000, pp. 33-40. ACM Press, New York, NY, USA, 2000.
[3] Benko, H., Wilson, A., and Baudisch, P. Precise selection techniques for multi-touch screens. In Proceedings of CHI 2006, Montreal, Canada, April 2006, pp. 1263-1272.
[4] Forlines, C., Wigdor, D., Shen, C., and Balakrishnan, R. Direct-touch vs. mouse input for tabletop displays. In Proceedings of CHI 2007.
[5] Moscovich, T. and Hughes, J. Indirect mappings of multi-touch input using one and two hands. In Proceedings of CHI 2008, Florence, Italy, April 5-10, 2008.
[6] Schwesig, C., Poupyrev, I., and Mori, E. Gummi: a bendable computer. In Proceedings of CHI 2004, pp. 263-270. ACM, 2004.
[7] Sugimoto, M. and Hiroki, K. HybridTouch: an intuitive manipulation technique for PDAs using their front and rear surfaces. In Proceedings of MobileHCI 2006, pp. 137-140.
[8] Wigdor, D., Forlines, C., Baudisch, P., Barnwell, J., and Shen, C. LucidTouch: a see-through mobile device. In Proceedings of UIST 2007, Newport, Rhode Island, October 7-10, 2007, pp. 269-278.
[9] Wigdor, D., Leigh, D., Forlines, C., Shipman, S., Barnwell, J., Balakrishnan, R., and Shen, C. Under the table interaction. In Proceedings of UIST 2006, pp. 259-268.
[10] Wilson, A. D., Izadi, S., Hilliges, O., Garcia-Mendoza, A., and Kirk, D. Bringing physics to the surface. In Proceedings of UIST 2008, Monterey, CA, USA, October 19-22, 2008.
[11] Wobbrock, J. O., Myers, B. A., and Aung, H. H. The performance of hand postures in front- and back-of-device interaction for mobile computing. International Journal of Human-Computer Studies 66(12), pp. 857-875, 2008.