ForceTap: Extending the Input Vocabulary of Mobile Touch Screens by adding Tap Gestures


Seongkook Heo and Geehyuk Lee
Department of Computer Science, KAIST, Daejeon, South Korea
{leodic, ...}

ABSTRACT

We introduce an interaction technique that increases the touch screen input vocabulary by distinguishing a strong tap from a gentle tap without the use of additional hardware. We have designed and validated an algorithm that detects different types of screen touches by combining data from the built-in accelerometer with position data from the touch screen. The proposed technique allows a touch screen input to contain not only the position of a finger contact, but also its type, i.e., whether the contact is a Tap or a ForceTap. To verify the feasibility of the proposed technique, we implemented our detection algorithm and tested it in experiments covering single-handed, two-handed, immersive, and on-the-move usage. Based on the experimental results, we investigate the advantages of using two types of touch inputs and discuss emerging issues. Finally, we suggest a design guideline for applying the proposed technique to touch screen applications and present possible application scenarios.

Author Keywords
Touch screen, mobile devices, tap input, force sensing, single-handed interaction, built-in accelerometer, pressure input

ACM Classification Keywords
H5.2. User Interfaces: Input Devices and Strategies, Interaction Styles

General Terms
Design, Human Factors

INTRODUCTION

Touch screens have gained popularity with multi-functional mobile devices, as they enable the convergence of input and output and the design of flexible and intuitive user interfaces. One crucial limitation of current touch screens is that they detect only a touch position and a binary touch state, and fail to distinguish touches made with different forces. In the real world, we naturally apply different forces to a target object when pressing, touching, or tapping it. Current mobile touch devices, however, do not discriminate between these operations.

In this paper, we present an input technique for touch screens that discriminates a gentle tap (Tap) from a strong tap (ForceTap) by utilizing the accelerometer data available on many mobile devices. When a user taps on a touch screen, the device momentarily moves in the direction of the force that the finger applies to the screen. This movement can be measured by monitoring the acceleration of the device along the direction perpendicular to the touch screen. As shown in Figure 1, finger taps with different force levels cause different acceleration patterns along the z-axis, the axis normal to the touch screen. By analyzing these patterns, a mobile device can distinguish the meanings of touch inputs at the same touch point when the applied forces differ.

In the following, we give a brief review of previous research that attempted to enrich touch screen interactions by using additional information beyond a touch point and a binary touch state, including studies that explored the advantages of using acceleration data in conjunction with touch screen input.
Subsequently, we present the results of an experiment to study the acceleration patterns made by tapping with a finger and the design of an algorithm to classify a touch input as a Tap or a ForceTap.

Figure 1. Z-axis accelerations (blue curves) and touch events (yellow bars) caused by (a) Taps and (b) ForceTaps.

We then present the results of a series of experiments to verify the feasibility of the proposed technique in scenarios of single-handed, two-handed, immersive, and on-the-move usage. We also discuss the features of the proposed interaction technique based on the experimental results and participant feedback.

RELATED WORK

Using pressure on a touch screen has been explored, especially for use with a stylus. Ramos et al. designed widgets controlled by pressure and found that continuous visual feedback is essential for maintaining a constant force level [19]. Ramos and Balakrishnan adopted pressure input to control high-precision parameters [18] and to improve selection performance with a pen stroke [17]. Davidson et al. [4] introduced a pressure-based 2-D object arrangement technique that estimates the pressure applied to a display from the contact size of a finger; it helps users handle layered objects on a tabletop display by using the pressure level to tilt them.

Pressure inputs have also been investigated to enrich the limited input options of mobile devices. Clarkson et al. [2] attached pressure sensors to a mobile phone's keypad, enabling the buttons to mediate continuous pressure inputs. The authors classified the possible usages of pressure inputs into three categories: direct specification, interactions that can be previewed, and affective inputs. Using pressure input on the touch screens of mobile devices was first introduced by Miyaki and Rekimoto [15]. A pressure sensor attached to the back of a mobile device measured changing pressure levels, which were used to control single-handed zooming or scrolling; the direction of zooming or scrolling was controlled by a thumb gesture on the screen. Stewart et al. [25] investigated the characteristics of pressure input on mobile devices through a series of experiments. The authors compared four kinds of feedback and found that visual feedback was the most effective for reducing errors and that multimodal feedback did not show strong effects on error rates. Heo and Lee [6] presented a method to augment touch screen gestures by sensing the normal and tangential forces applied to a touch screen.

There have also been approaches that enrich input without adding hardware. MicroRolls [22] by Roudaut et al. expanded the input vocabulary of a touch screen by detecting a finger roll, a movement without finger slippage. The authors found that the movement amplitude of thumb roll gestures on a touch screen is limited, so they could distinguish the roll of a thumb from a drag or swipe gesture. Roth and Turner [23] introduced an input method called Bezel Swipe that utilizes a dragging gesture starting from the bezel of a touch screen. Since Bezel Swipe uses a gesture that is new to mobile devices, it could be applied without conflicting with traditional touch gestures.

Expressive Typing by Iwasaki et al. [10] and Sensor Synaesthesia [7] are particularly relevant to our project. Expressive Typing is a method of detecting typing force using a built-in accelerometer for more expressive input. Iwasaki et al. [10] found a relationship between typing velocity and the corresponding acceleration measured by the built-in accelerometer, and showed the validity of using an accelerometer embedded in a portable device for estimating typing velocity.
Hinckley and Song [7] presented a method of combining touch and motion input and suggested possible interaction styles in two categories: touch-enhanced motion and motion-enhanced touch.

COMBINING TAP INPUT WITH TOUCH SCREEN

Tap Gesture

Ronkainen et al. [20] describe a tap gesture, which is stronger than a touch gesture, as a small gesture that is commonly accepted, easy to detect, and in itself provides immediate haptic feedback. These advantages have encouraged many attempts to use tap gestures [8, 9, 11, 12, 20, 24]. A tap input, however, suffers from a small input vocabulary and the possibility of false detections. To expand the input vocabulary of a tap input and reduce false detections, Ronkainen et al. [20] used combinations of tap gestures: tap and double-tap. Hudson et al. [9] used wiggle, twist, tap, and flick gestures between two whack gestures for higher expressiveness.

Tap Gestures on a Touch Screen

Tap gestures on a touch screen have advantages over tapping on a touch-insensitive surface. First, a tap gesture performed on a touch screen carries two-dimensional position data that a tap on a touch-insensitive surface lacks. Second, combining a tap input with a touch screen input can prevent false detections of a tap. As mobile devices are mostly used in situations of high mobility, accelerometer-only tap detection tends to register incidental taps caused by the body movement of the user or by unintended hits [20]. We therefore use a finger tap on the touch screen as a trigger signal, so that only intended tap gestures are detected. In addition, the direction of the acceleration is constrained to the direction normal to the touch screen, which makes tapping easier to detect and prevents false detection of taps caused by an impact on the side of the device. We also use high-pass-filtered acceleration to eliminate the effects of gravity and of acceleration caused by human body movements.

TAP DISCRIMINATION BY ACCELERATION PATTERNS

We observed the acceleration signal patterns of taps on a touch screen in three different conditions, designed a tap classification algorithm, and verified the algorithm by experiment.

Acceleration Patterns

Before designing an algorithm to distinguish the two types of taps, we conducted a simple experiment to observe the acceleration patterns of finger taps in three conditions: stationary (device placed on a foam cushion), single-handed use, and two-handed use. We logged touch events and acceleration data along the z-axis (perpendicular to the screen) at 100 Hz on an Apple iPhone 3GS.

Figure 2(a) shows the acceleration data acquired while moving the phone and tapping on the screen. The large drift in the signal was due to the phone's movement and had to be eliminated first. For this purpose, the acceleration data were passed through a high-pass filter with a cut-off frequency of 5 Hz; the filtered data are shown in Figure 2(b).

Figure 2. Accelerations caused by tapping the screen while moving the device: (a) before and (b) after high-pass filtering.

Figure 3 shows acceleration data and touch events in the three conditions: (a) stationary, where the device is placed on a foam cushion and hit with the index finger; (b) single-handed, where the device is held in a hand and hit with the thumb of the holding hand; and (c) two-handed, where the device is held in one hand and hit with the index finger of the other hand. The touch events of iOS appeared about 20 ms after an acceleration peak. As the graphs show, similar screen taps resulted in different acceleration patterns under different placement conditions. The oscillation of the device was relatively slower when the device was held in one hand than when it was placed on a foam cushion. In the single-handed condition we observed a negative acceleration before the acceleration peak, as shown in Figure 3(b), meaning that the device moved upward before the finger hit the screen. In the two-handed condition, the oscillation of the device lasted longer than in the single-handed condition.

Figure 3. High-pass-filtered z-axis accelerations and touch events caused by tapping in the (a) stationary, (b) single-handed, and (c) two-handed conditions.

Tap Discrimination Algorithm

Algorithm Design

Figure 4 shows high-pass-filtered z-axis accelerations for Taps and ForceTaps. In both cases, an acceleration peak appears just before the beginning of the touch event. Peak acceleration values for Taps were smaller than those for ForceTaps, but the difference was not large enough for easy discrimination; we needed a better discriminator than the peak acceleration value.

Figure 4. High-pass-filtered z-axis accelerations caused by (a) Taps and (b) ForceTaps.

One possible reason why the peak acceleration value fails as a discriminator is that what one controls when tapping is not a force but an impulse on the screen. Since an impulse is (force) × (time), the same impulse can result in a large or a small force depending on the collision time over which the impulse is delivered to the screen. This collision time is quite unpredictable, because it is small for a hard target but large for a soft target. In this respect, an impulse may be a better discriminator between the two types of taps than the peak acceleration value. Since we did not need a physically correct impulse but a discriminator that is conceptually similar to an impulse and convenient to compute, we defined the discriminator D as the sum of the absolute values of all accelerometer samples within a time window around the touch event, and called it a pseudo-impulse.
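As a concrete illustration, the following is a minimal Python sketch of this detection chain: a high-pass filter (5 Hz cut-off at the 100 Hz sampling rate), the pseudo-impulse D, and a threshold comparison. The window timing and the threshold value of 1.6 are the numbers reported in the following sections; the first-order filter realization and all function and variable names are our own illustration, not the authors' implementation.

```python
import math

SAMPLE_RATE = 100.0   # accelerometer sampling rate (Hz), as in the paper
CUTOFF_HZ = 5.0       # high-pass cut-off frequency (Hz), as in the paper
WINDOW_PRE_S = 0.05   # window starts 50 ms before the touch-begin event
WINDOW_LEN_S = 0.15   # window lasts 150 ms in total
THRESHOLD = 1.6       # discriminator threshold derived later in the paper

def high_pass(samples, rate=SAMPLE_RATE, cutoff=CUTOFF_HZ):
    """First-order RC high-pass filter; removes gravity and slow drift.

    The paper specifies only the cut-off frequency, so a simple
    discrete RC filter is assumed here.
    """
    rc = 1.0 / (2.0 * math.pi * cutoff)
    dt = 1.0 / rate
    alpha = rc / (rc + dt)
    out = [0.0]
    for i in range(1, len(samples)):
        out.append(alpha * (out[-1] + samples[i] - samples[i - 1]))
    return out

def pseudo_impulse(filtered_z, touch_begin_s, rate=SAMPLE_RATE):
    """Pseudo-impulse D: sum of |a_z| over the window around touch-begin."""
    start = max(0, int((touch_begin_s - WINDOW_PRE_S) * rate))
    end = min(len(filtered_z), start + int(WINDOW_LEN_S * rate))
    return sum(abs(a) for a in filtered_z[start:end])

def classify_tap(raw_z, touch_begin_s, threshold=THRESHOLD):
    """Return 'ForceTap' if D exceeds the threshold, otherwise 'Tap'."""
    d = pseudo_impulse(high_pass(raw_z), touch_begin_s)
    return "ForceTap" if d > threshold else "Tap"
```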

Another reason for the unreliability of the peak acceleration value may be aliasing due to the low sampling frequency of the accelerometer. The acceleration peak caused by a finger tap is very brief and may be missed by slow sampling. Acceleration changes after the peak are slower than during the peak, so the aliasing problem is also reduced when the pseudo-impulse is used instead of the peak acceleration value.

As mentioned before, the acceleration data are first passed through a high-pass filter with a cut-off frequency of 5 Hz to eliminate slow drifts due to device movements. The pseudo-impulse is then computed from the filtered data in a time window that starts 50 ms before a touch-begin event and lasts for 150 ms. Since the discriminator can be computed only after this window ends, tap classification is completed about 120 ms after a touch-begin event, a delay comparable to the minimum time delay that a human can notice [3, 14]. Tap classification is done by comparing the discriminator D with a predefined threshold.

Algorithm Validation

To verify the effectiveness of the proposed algorithm, we conducted a simple experiment with 14 participants (22 to 34 years old, average age 25.6; eight males and six females). Touch events and acceleration data were logged at 100 Hz on an Apple iPhone 3GS. Participants were asked to tap on the touch screen gently and strongly, 20 times each. We calculated the values of the two discriminators, the peak acceleration value P and the pseudo-impulse D. The results are shown as histograms in Figure 5, with the values for Taps and ForceTaps presented as yellow and red bars. We compared the two discriminators by the ratio of overlapping Tap and ForceTap samples over all samples: the overlap ratio dropped from 18.2% with the peak acceleration to 11.4% with the pseudo-impulse, a reduction of 37.3%. Since a lower overlap ratio means better discrimination, we conclude that the pseudo-impulse D is a better discriminator than the peak acceleration value.

Figure 5. Histograms of the peak accelerations (P) and pseudo-impulses (D) for Taps and ForceTaps.

Visual Feedback Design

Since the threshold for the discriminator D may not be obvious to users, we speculated that visual feedback would be necessary to help users learn the threshold. We designed visual feedback based on the metaphor of a ripple made by a finger: the size of a ripple is proportional to the force applied to the surface of water. As shown in Figure 6, two concentric circles, one gray and one red, appear after a touch. The gray circle indicates the level of force applied by the tap; it grows from a point to a circle whose radius is proportional to the pseudo-impulse D of the tap. The threshold for D is represented by the red circle. We expected that users might learn the threshold better with this visual feedback.

Figure 6. The visual feedback design: (a) for a Tap and (b) for a ForceTap.
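To make the mapping concrete, here is a small sketch of how the feedback geometry could be computed from D. The paper states only that the gray circle's radius is proportional to D and that the red circle marks the threshold; the scale factor, the animation parameter, and all names are illustrative assumptions.

```python
THRESHOLD = 1.6         # discriminator threshold from the paper
PX_PER_UNIT_D = 40.0    # illustrative pixels-per-D scale (not from the paper)

def ripple_radii(d, progress):
    """Radii (pixels) of the two feedback circles for a tap with pseudo-impulse d.

    progress in [0, 1] animates the gray circle growing from a point to
    its final size; the red circle is a fixed ring marking the threshold.
    """
    progress = min(max(progress, 0.0), 1.0)
    gray = d * PX_PER_UNIT_D * progress   # proportional to the tap's D
    red = THRESHOLD * PX_PER_UNIT_D       # ForceTap threshold ring
    return gray, red
```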
FORCES ON A TOUCH SCREEN

In order to determine an appropriate threshold for the discriminator D, it was necessary to study the distribution of D during normal use of current touch screen applications: the threshold should be set so that a ForceTap is safely discriminated from normal touch operations, i.e., Tap operations. We implemented four simple applications: Sketch, Photo Viewer, Web Browser, and Notes. Their user interfaces are similar to those of the corresponding applications on an iPhone, except for Sketch, which has no iPhone counterpart. These applications were chosen to cover various interaction styles of a touch screen interface:

Sketch: Drag, Touch
Photo Viewer: Drag, Flick
Web Browser: Touch
Notes: Type (quick and consecutive touches in a short time)

Five graduate students (three male, two female, 23 to 34 years old, average age 27) participated in this experiment. All participants had experience with touch-screen-enabled mobile devices. No information about the purpose of the experiment was provided, and participants were asked to use the four applications freely.

The results are shown in Table 1. The mean and standard deviation of D for the Sketch and Photo Viewer applications are higher than those for the Web Browser and Notes applications. We observed that people tend to tap harder when drawing a dot than when selecting a web link or typing a letter, which results in the highest standard deviation for the Sketch application. Based on these results, we set the threshold of the discriminator D to 1.6, which correctly classifies over 95% of touch inputs in all applications except Sketch.

Table 1. Average impulse values (mean and standard deviation of D) measured during the use of the four general touch screen applications.
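One plausible way to derive such a threshold from logged usage data is to take a high percentile of the observed D values, so that at most a target fraction of ordinary touches would be misclassified as ForceTaps. This is a reading of the procedure, not code from the paper; the paper reports only the chosen threshold of 1.6 and its resulting classification rate of over 95%.

```python
def threshold_from_usage(d_values, keep_fraction=0.95):
    """Pick a ForceTap threshold so that keep_fraction of the D values
    logged during normal use fall below it.

    d_values: pseudo-impulses recorded while users operated ordinary
    applications (here, Sketch, Photo Viewer, Web Browser, and Notes).
    """
    ordered = sorted(d_values)
    index = min(len(ordered) - 1, int(keep_fraction * len(ordered)))
    return ordered[index]

# Hypothetical usage: pool the logs from all four applications.
# threshold = threshold_from_usage(sketch_d + photo_d + web_d + notes_d)
```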
USER STUDY

The goal of the user study was to verify the feasibility of ForceTap, i.e., whether users can use the two kinds of taps effectively, in diverse conditions: (1) single-handed use, (2) two-handed use, (3) immersive use, and (4) walking. We also investigated the effect of visual feedback of a tap's strength in the first two cases.

Apparatus and Environmental Settings

For the experiment we used an Apple iPhone 3GS with iOS 4.0. The sampling frequency of the accelerometer was set to 100 Hz. All applications used in the experiment were developed in Objective-C with the iOS SDK 4.0. In experiments 1, 2, and 3, participants sat on a chair with their arms resting on a table. In experiment 4, participants performed the tasks while walking in a corridor. The hand postures for single-handed and two-handed use are shown in Figure 7. In order to obtain qualitative feedback, we interviewed participants after they finished all blocks.

Participants

Fourteen participants (eight male, six female, average age 25.6) aged 22 to 34 were recruited for experiments 1 to 3. All participants were right-handed. All but one were experienced users of mobile devices with a touch screen, and twelve were currently using such devices. For experiment 4, five participants (four male, one female, average age 24.8) were recruited, all right-handed; three of them also took part in experiments 1 to 3. All were experienced users of touch-screen-enabled mobile devices. All participants in the two groups were paid 5,000 KRW, about 4.3 US dollars.

Experiment 1: Single-handed Use

Participants were asked to perform a Tap or a ForceTap with the thumb of the hand holding the device, in response to an instruction on the screen. After a participant performed a Tap or a ForceTap, the classification result was shown at the center of the screen. We designed the experiment to be between-subjects in order to study both the effect of visual feedback and the effect of learning over blocks. Half of the participants completed the task without visual feedback and the other half with visual feedback. The experiment consisted of five blocks, each of 20 trials. An instruction about the experiment was given before starting, and a short warm-up session was provided. The dependent variables were Selection Time and Error Rate.

Experiment 2: Two-handed Use

The task for experiment 2 was exactly the same as in experiment 1, except that it was performed with two hands: participants held the device in one hand and operated it with the index finger of the other. Participants were given a short warm-up session to learn the tap threshold for two-handed use. The experiment consisted of three blocks, each of 20 trials.

Experiment 3: Use in an Immersive Environment

When users are immersed in a situation such as playing a game or watching a movie, it may be difficult to perform precise operations. Using two different types of taps may be difficult in an immersive environment, so we thought it necessary to check the feasibility of ForceTap there. We developed a simple shooting game application for the test.

Figure 7. (a) One-handed and (b) two-handed use cases.

The application provides two shooting methods: shooting a gun and firing a cannon (Figure 8). A ForceTap is mapped to the heavier action, firing the cannon, and a Tap to shooting the gun. The task for experiment 3 was similar to clay pigeon shooting. As the game starts, birds and lions appear and move across the screen along predefined paths. Birds can only be caught with the gun, while lions require the cannon. When a bird or a lion appeared on screen, participants were asked to shoot it with the correct weapon; the game finished when they had caught 10 animals. Participants were asked to repeat the task three times.

Figure 8. A screenshot of the game application developed for the immersive environment experiment: (a) shooting a gun and (b) firing a cannon.

Experiment 4: Use While Walking

The goal of this experiment was to check the feasibility of ForceTap in a situation of high mobility. Participants were asked to complete a task similar to that of experiment 1 while walking along a corridor. The experiment consisted of three blocks, each of 20 trials. In an informal pilot study, we observed that the acceleration values of both Taps and ForceTaps were higher while participants walked than while they sat, so we increased the discriminator threshold to 3.4 to reduce classification errors.

Results

Error rates and their standard errors in experiment 1 are shown in Figure 9. In the last block the error rate decreased to 4.29% (with visual feedback) and 2.86% (without visual feedback). Contrary to our expectation, visual feedback had no significant effect on error rates.

Figure 9. Mean error rates and standard errors in experiment 1.

While visual feedback made no significant difference in error rates, it did make a difference in the pseudo-impulse distributions of the two types of screen taps, as shown in Figure 10, which presents the mean and standard deviation of the pseudo-impulse values of Taps and ForceTaps with and without visual feedback. The distributions of the pseudo-impulse differ significantly depending on the presence of visual feedback (paired t-test, p<0.01 for Tap, p<0.05 for ForceTap). We also observed that participants applied more strength for a ForceTap and less strength for a Tap when visual feedback was not provided. The number of false detections of ForceTaps was almost twice that of Taps (see Table 2(a)).

Figure 10. Mean and standard deviation of pseudo-impulse values in experiment 1.

Figure 11. Mean error rates and their standard errors in the single-handed and two-handed conditions (experiments 1 and 2).

After experiment 1 we asked participants about the difficulties of using a Tap and a ForceTap, and about how they felt about the recognition. Most participants were satisfied with the recognition rate.

Two participants commented that a ForceTap required too much force and wished for a lower threshold.

Figure 11 summarizes the mean error rates and standard errors for the single-handed and two-handed cases. As shown in the graph, the error rates for two-handed use were higher than for single-handed use regardless of visual feedback, and visual feedback caused no significant difference in error rates. As in the single-handed case, more false detections of ForceTaps occurred than of Taps (Table 2(b)). Three out of eight participants answered that two-handed use was easier. Other participants said: "It was easier to control force in a single-handed posture, due to the movable range of the thumb," "Using a two-handed posture led me to apply more force than needed," and "I had to tap stronger in the two-handed posture."

Table 2. Confusion matrices for Tap / ForceTap detection in the (a) single-handed, (b) two-handed, and (c) immersive cases. The numbers in parentheses indicate the percentage of each case.

(a) Single-handed   Recognized as Tap   Recognized as ForceTap
Input Tap           660 (95.4%)         32 (4.6%)
Input ForceTap      62 (8.8%)           646 (91.2%)

(b) Two-handed      Recognized as Tap   Recognized as ForceTap
Input Tap           414 (93.5%)         29 (6.5%)
Input ForceTap      47 (11.8%)          350 (88.2%)

(c) Immersive       Recognized as Tap   Recognized as ForceTap
Input Tap           272 (95.8%)         12 (4.2%)
Input ForceTap      13 (8.1%)           148 (91.9%)

The rate of correctly recognized inputs in the immersive environment test was 94.38%, similar to that of experiment 1, so we conclude that using both a Tap and a ForceTap in an immersive environment is feasible. All participants said that they did not experience difficulties using a Tap and a ForceTap in the immersive environment. One participant commented, "It was a little stressful to apply two different types of touch inputs in the previous experiments, but I totally forgot the stress when playing the game." There was also negative feedback, such as "it was not easy to change the input type when I needed to attack different kinds of targets consecutively in a short time," and "when I get excited playing a game, it will not be easy to touch with a weak force. I'm worried that I will only be able to trigger the cannon."

The average rate of misrecognized inputs over the three blocks of the walking test was 3.61%, similar to the error rate of the first experiment. Participants commented that it was not difficult to use the two types of tap. A higher discriminator threshold seemed necessary not only because walking is a situation of high mobility, but also because the arm is more active. To change the threshold adaptively with the mobility condition, we may need to monitor the variance of the low-pass-filtered acceleration amplitude.

DISCUSSION

Effect of the Visual Feedback

We obtained an interesting result regarding visual feedback. Contrary to our expectation, the error rate did not differ significantly with the presence of visual feedback: participants were able to learn the proposed technique without it. One explanation is that the difference between a Tap and a ForceTap is clear. Participants did not think of themselves as applying two different levels of force for two different inputs; instead, they seemed to understand the two inputs as a no-force input and a force input. In fact, a Tap does not require any force on a capacitive touch screen.
On the other hand, the effect of visual feedback did show in the pseudo-impulse distributions. The average pseudo-impulse value for a Tap was significantly lower, and the average pseudo-impulse value for a ForceTap significantly higher, without visual feedback. We assume that participants performed the two types of tap more carefully when no visual feedback was given. With visual feedback, participants could see the force level they had exerted relative to the detection threshold; they could learn the detection threshold more precisely and apply only the required force. We expect that visual feedback would therefore reduce the physical demand, and we are planning a future study on this topic.

Variable Detection Threshold

We received some comments on the difficulty of performing a ForceTap in the bottom area of the touch screen. There is a physiological constraint arising from the posture of holding the device and the shape of the thumb. According to the experimental results of Karlson et al. [13], it is difficult for users to reach the edges of a touch screen, especially the top-left and bottom-right corners, on a PDA-sized mobile device. A ForceTap requires a larger finger movement than a Tap, so users may have more difficulty performing a ForceTap at such locations.

We conducted a pilot study with five participants (mean age 25.8, all male) to observe the acceleration patterns at various positions on the screen. We divided the screen area into 24 zones and asked participants to tap in each zone gently and strongly. The acceleration values were highest around the screen center and relatively high in the upper-left area, and lowest in the bottom area and in the upper-right area.

This phenomenon seems to be due to the shape of the hand. We modified the discrimination algorithm to use an adaptive discrimination threshold that depends on the location on the screen, as shown in Figure 12(a), where bright areas have a high detection threshold and dark areas a low one.

To check the effectiveness of the modified algorithm, a small user study was conducted. Six participants tapped targets that appeared at random positions on the screen. Targets were shown as icons: a round one representing the Tap and a star-shaped one representing the ForceTap. To avoid ordering effects, participants were divided into two groups and counterbalanced. Figure 12(b, c) shows scatter plots of the detection errors. With the static threshold algorithm, errors tended to occur at locations where the accelerometer values from the pilot test were low. With the variable threshold algorithm, error rates decreased: mean error rates were 11.67% for the static threshold and 5.83% for the variable threshold. We also measured the mean error distance between a target and the corresponding touch position for Taps and ForceTaps.

Figure 12. (a) Threshold map and errors for the (b) static threshold and (c) variable threshold algorithms.

Because the threshold map for the variable threshold algorithm was derived from the single-handed use case, the algorithm may not work as well in the two-handed case. However, detecting the holding condition and applying a different threshold map for each case may solve this problem. As shown in Figure 3, there is a clear difference between the single-handed and two-handed cases in the acceleration pattern after the peak, even when the peak amplitudes are almost the same: the amplitude of the oscillation after the peak decreases much faster in the single-handed case. Using this difference, it should be possible to determine in which condition the device is being used.

Use on a Table

Mobile devices are sometimes used while lying on a table, so we briefly investigated the possibility of using tap inputs in on-table cases. Figure 13 shows the histogram of the pseudo-impulse for Tap and ForceTap operations while the device is on a table. The pseudo-impulses of Taps were very low in this condition, and the pseudo-impulses of ForceTaps were also low compared to the handheld case. It is possible to determine whether the device is on a table by monitoring the stability of the unfiltered acceleration value, and an algorithm may then lower the detection threshold for the on-table condition. However, we can expect the error rate to increase in on-table cases, because the overlap between the Tap and ForceTap distributions is 18.3%, higher than in the handheld case.

Figure 13. Histogram of the pseudo-impulse for Taps and ForceTaps in the on-table condition.
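The discussion above suggests several context cues for choosing the detection threshold: the tap's screen position, whether the device is handheld or resting on a table, and the user's mobility. The sketch below combines them into one threshold-selection routine. Only the seated (1.6) and walking (3.4) thresholds and the general shape of the threshold map come from the paper; the map values, the variance cut-offs, the on-table threshold, and the zone layout are illustrative assumptions. The paper suggests low-pass-filtered variance for mobility and unfiltered stability for the on-table case; for brevity the sketch uses the raw signal's variance for both.

```python
import statistics

SEATED_THRESHOLD = 1.6       # from the paper (seated, handheld)
WALKING_THRESHOLD = 3.4      # from the paper (walking)
ON_TABLE_THRESHOLD = 0.8     # assumption: lowered for the on-table case
ZONE_COLS, ZONE_ROWS = 4, 6  # assumption: the 24 zones as a 4 x 6 grid

# Illustrative per-zone scale factors: >1 near the screen center and
# upper-left (strong measured accelerations), <1 near the bottom and
# upper-right (weak measured accelerations). The values are invented.
ZONE_SCALE = [
    [1.0, 1.1, 1.0, 0.8],
    [1.1, 1.2, 1.1, 0.9],
    [1.0, 1.2, 1.2, 1.0],
    [0.9, 1.1, 1.1, 0.9],
    [0.8, 0.9, 0.9, 0.8],
    [0.7, 0.8, 0.8, 0.7],
]

def device_context(raw_z_history):
    """Classify the device context from recent z-axis acceleration.

    A very stable signal suggests the device is on a table; a very
    unstable one suggests walking. The cut-offs are assumptions.
    """
    var = statistics.pvariance(raw_z_history)
    if var < 0.001:
        return "table"
    if var > 0.5:
        return "walking"
    return "handheld"

def threshold_for(x, y, screen_w, screen_h, raw_z_history):
    """Pick a ForceTap threshold for a touch at (x, y) in the current context."""
    context = device_context(raw_z_history)
    if context == "table":
        return ON_TABLE_THRESHOLD
    base = WALKING_THRESHOLD if context == "walking" else SEATED_THRESHOLD
    col = min(ZONE_COLS - 1, int(x / screen_w * ZONE_COLS))
    row = min(ZONE_ROWS - 1, int(y / screen_h * ZONE_ROWS))
    return base * ZONE_SCALE[row][col]
```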
APPLICATION SCENARIO

Discriminating touch inputs by force level adds a new input dimension to the touch screens of mobile devices and can be adopted in various applications. We suggest a design guideline for using this technique in mobile applications and present sample application scenarios.

Design Guideline

Several properties of the ForceTap input should be considered when designing mobile applications:

Require stronger force: Unlike traditional touch inputs, a ForceTap requires users to apply a stronger force, which can increase fatigue. Therefore, when mapping Tap and ForceTap to functions, we recommend mapping Tap to frequently used functions and ForceTap to less frequently used ones.

Lower precision: As confirmed by the experiment with the variable threshold algorithm, a ForceTap is less precise in position than a Tap. Because of this, a ForceTap will work better for an area input than for a point input.

Metaphor: There is a clear difference in movement speed between a Tap and a ForceTap, and the two gestures may be thought of as lighter and heavier gestures. If the target mappings have physical metaphors, using this difference will help users figure out the mappings easily, just as we mapped the ForceTap to a heavier cannon and the Tap to a lighter gun.

Applications Using ForceTap as an Alternative Touch Input

Small Target Selection

There have been many attempts to solve the fat finger problem in the use of touch screens. Most of them suggest solutions that require no additional hardware [16, 21, 26], using only the existing touch screen input. A problem with such techniques is that the actions they use to trigger target magnification or precise control often conflict with existing gestures; in fact, it is not easy to avoid such conflicts when relying solely on binary touch events. ForceTap is a gesture that has not yet been used on capacitive touch screens, so it does not conflict with traditional touch screen interactions. Consequently, the ForceTap gesture can be used to trigger target magnification for small targets even in legacy mobile applications that use various touch screen inputs. As a ForceTap requires no complex combination or time-consuming gesture, the magnification can be invoked in the time of a single tap.

Display Context Menu

While it is common to call up a context menu in a desktop environment by clicking the right mouse button, it is difficult to do the equivalent on touch screen mobile devices because of their limited input vocabulary: simple variations or combinations of touches too easily conflict with other functions. Using ForceTap as an alternative touch input is a good way to display a context menu. Since a ForceTap was not used in previous touch screen interfaces, confusion with other touch inputs can be avoided. As a ForceTap requires a stronger force than a Tap, it suits a less frequently used function such as displaying a context menu, rather than selecting or manipulating a target.

Applications Using ForceTap as an Expressive Touch Input

Playing Musical Instruments

We received many comments about using ForceTap for playing musical instruments. Playing a musical instrument is an expressive behavior, and it is natural to exert different levels of force on the instrument. Many musical applications have been developed for mobile devices, and some estimate the tapping force using an accelerometer [1, 5]. Tapping with different velocities is natural when playing an instrument such as a piano or a drum.

Force-sensitive Games

Using physical gestures is not new to the game industry. The Nintendo Wii, Sony PlayStation Move, and Microsoft Project Natal aim to provide more realistic experiences by supporting physical motion gestures as input. There have also been many attempts to provide such experiences on mobile devices by detecting gravity-induced reaction forces: many racing games use an accelerometer to sense the balance of the device to steer a car, and some games are controlled by tilting the device. Unlike gravity and balance, tap-force detection is not yet used in mobile games. Sports games require various power controls; for instance, in a golf game, tapping force could be used to control the swing power, and a tap gesture can be intuitively adopted for this purpose. According to subjective feedback from a participant, tapping is less precise than touching the screen. This is an inevitable feature, as movement accuracy decreases as movement speed increases; it can, however, also be a fun factor for some games.
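As a usage illustration, the sketch below dispatches the classifier's output according to the mappings suggested above: a Tap performs the default, frequent action, while a ForceTap invokes a secondary, less frequent one such as a context menu. It reuses the classify_tap sketch from the algorithm section; the handler names and the `ui` object are hypothetical, not an API from the paper.

```python
def on_touch_begin(raw_z, touch_time_s, x, y, ui):
    """Route a touch to a default or secondary action by tap type.

    raw_z: recent z-axis accelerometer samples; classify_tap is the
    earlier sketch; `ui` stands for a hypothetical application object
    that provides the two handlers.
    """
    kind = classify_tap(raw_z, touch_time_s)
    if kind == "ForceTap":
        ui.show_context_menu(x, y)   # less frequent, heavier action
    else:
        ui.select_item(x, y)         # frequent, default action
```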
CONCLUSION AND FUTURE WORK

We introduced and explored ForceTap, an interaction technique that enriches the input vocabulary of a mobile device's touch screen by combining the screen input with the tapping velocity detected by a built-in accelerometer. We studied the acceleration patterns created by finger taps during single-handed use, designed an algorithm to discriminate a Tap from a ForceTap using a pseudo-impulse instead of a peak acceleration value, and verified the algorithm's advantage with user tests. Through a series of experiments, we demonstrated the feasibility of ForceTap in diverse conditions. We found that the technique can be learned and used well without visual feedback, that ForceTap can be used with two hands while maintaining a reliable detection rate, and that it is feasible in an immersive environment as well as while walking. After the experiments, we interviewed the participants and obtained valuable feedback on the features of the ForceTap technique. Based on the experimental results and an analysis of the feedback, we summarized guidelines for using the ForceTap technique in mobile applications.

We are planning to investigate better algorithms for detecting tapping velocity and more useful ways of using it. More precise detection of the tapping velocity will enable more expressive operations on touch-screen-enabled mobile devices.

ACKNOWLEDGMENTS

This work was in part supported by the IT R&D program of MKE/KEIT [Core UI technologies for improving Smart TV UX].

REFERENCES

1. Apple GarageBand for iPad.
2. Clarkson, E. C., Patel, S. N., Pierce, J. S., and Abowd, G. D. Exploring continuous pressure input for mobile phones. GVU Technical Report GIT-GVU-06-20.

3. Dabrowski, J. R. and Munson, E. V. Is 100 milliseconds too fast? Ext. Abstracts CHI 2001, ACM Press (2001).
4. Davidson, P. L. and Han, J. Y. Extending 2D object arrangement with pressure-sensitive layering cues. In Proc. UIST 2008, ACM Press (2008).
5. Greatapps. TapForce, 2009.
6. Heo, S. and Lee, G. Force Gestures: augmented touch screen gestures using normal and tangential force. Ext. Abstracts CHI 2011, ACM Press (2011).
7. Hinckley, K. and Song, H. Sensor synaesthesia: touch in motion, and motion in touch. In Proc. CHI 2011, ACM Press (2011).
8. Hu, C., Tung, K., and Lau, L. Music Wall: a tangible user interface using tapping as an interactive technique. In Proc. APCHI 2008, LNCS.
9. Hudson, S. E., Harrison, C., Harrison, B. L., and LaMarca, A. Whack gestures: inexact and inattentive interaction with mobile devices. In Proc. TEI 2010, ACM Press (2010).
10. Iwasaki, K., Miyaki, T., and Rekimoto, J. Expressive typing: a new way to sense typing pressure and its applications. Ext. Abstracts CHI 2009, ACM Press (2009).
11. Linjama, J., Häkkilä, J., and Ronkainen, S. Gesture interfaces for mobile devices - minimalist approach for haptic interaction. In CHI Workshop: Hands on Haptics: Exploring Non-Visual Visualisation Using the Sense of Touch.
12. Linjama, J. and Kaaresoja, T. Novel, minimalist haptic gesture interaction for mobile devices. In Proc. NordiCHI 2004, ACM Press (2004).
13. Karlson, A. K., Bederson, B. B., and Contreras-Vidal, J. L. Understanding one-handed use of mobile devices. In Handbook of Research on User Interface Design and Evaluation for Mobile Technology, IGI Global.
14. Miller, R. B. Response time in man-computer conversational transactions. In AFIPS 1968: Proc. of the Dec. 9-11, 1968, Joint Computer Conference (Fall, part I), ACM Press (1968).
15. Miyaki, T. and Rekimoto, J. GraspZoom: zooming and scrolling control model for single-handed mobile interaction. In Proc. MobileHCI 2009, ACM Press (2009).
16. Olwal, A., Feiner, S., and Heyman, S. Rubbing and tapping for precise and rapid selection on touch-screen displays. In Proc. CHI 2008, ACM Press (2008).
17. Ramos, G. A. and Balakrishnan, R. Pressure marks. In Proc. CHI 2007, ACM Press (2007).
18. Ramos, G. and Balakrishnan, R. Zliding: fluid zooming and sliding for high precision parameter manipulation. In Proc. UIST 2005, ACM Press (2005).
19. Ramos, G., Boulos, M., and Balakrishnan, R. Pressure widgets. In Proc. CHI 2004, ACM Press (2004).
20. Ronkainen, S., Häkkilä, J., Kaleva, S., Colley, A., and Linjama, J. Tap input as an embedded interaction method for mobile devices. In Proc. TEI 2007, ACM Press (2007).
21. Roudaut, A., Huot, S., and Lecolinet, E. TapTap and MagStick: improving one-handed target acquisition on small touch-screens. In Proc. AVI 2008, ACM Press (2008).
22. Roudaut, A., Lecolinet, E., and Guiard, Y. MicroRolls: expanding touch-screen input vocabulary by distinguishing rolls vs. slides of the thumb. In Proc. CHI 2009, ACM Press (2009).
23. Roth, V. and Turner, T. Bezel swipe: conflict-free scrolling and multiple selection on mobile touch screen devices. In Proc. CHI 2009, ACM Press (2009).
24. Schmidt, D., Chehimi, F., Rukzio, E., and Gellersen, H. PhoneTouch: a technique for direct phone interaction on surfaces. In Proc. UIST 2010, ACM Press (2010).
25. Stewart, C., Rohs, M., Kratz, S., and Essl, G. Characteristics of pressure-based input for mobile devices. In Proc. CHI 2010, ACM Press (2010).
26. Vogel, D. and Baudisch, P. Shift: a technique for operating pen-based interfaces using touch. In Proc. CHI 2007, ACM Press (2007).


More information

Haptic messaging. Katariina Tiitinen

Haptic messaging. Katariina Tiitinen Haptic messaging Katariina Tiitinen 13.12.2012 Contents Introduction User expectations for haptic mobile communication Hapticons Example: CheekTouch Introduction Multiple senses are used in face-to-face

More information

A novel click-free interaction technique for large-screen interfaces

A novel click-free interaction technique for large-screen interfaces A novel click-free interaction technique for large-screen interfaces Takaomi Hisamatsu, Buntarou Shizuki, Shin Takahashi, Jiro Tanaka Department of Computer Science Graduate School of Systems and Information

More information

VolGrab: Realizing 3D View Navigation by Aerial Hand Gestures

VolGrab: Realizing 3D View Navigation by Aerial Hand Gestures VolGrab: Realizing 3D View Navigation by Aerial Hand Gestures Figure 1: Operation of VolGrab Shun Sekiguchi Saitama University 255 Shimo-Okubo, Sakura-ku, Saitama City, 338-8570, Japan sekiguchi@is.ics.saitama-u.ac.jp

More information

Haptic Feedback on Mobile Touch Screens

Haptic Feedback on Mobile Touch Screens Haptic Feedback on Mobile Touch Screens Applications and Applicability 12.11.2008 Sebastian Müller Haptic Communication and Interaction in Mobile Context University of Tampere Outline Motivation ( technologies

More information

Table of Contents. Display + Touch + People = Interactive Experience. Displays. Touch Interfaces. Touch Technology. People. Examples.

Table of Contents. Display + Touch + People = Interactive Experience. Displays. Touch Interfaces. Touch Technology. People. Examples. Table of Contents Display + Touch + People = Interactive Experience 3 Displays 5 Touch Interfaces 7 Touch Technology 10 People 14 Examples 17 Summary 22 Additional Information 23 3 Display + Touch + People

More information

Enabling Cursor Control Using on Pinch Gesture Recognition

Enabling Cursor Control Using on Pinch Gesture Recognition Enabling Cursor Control Using on Pinch Gesture Recognition Benjamin Baldus Debra Lauterbach Juan Lizarraga October 5, 2007 Abstract In this project we expect to develop a machine-user interface based on

More information

Aerospace Sensor Suite

Aerospace Sensor Suite Aerospace Sensor Suite ECE 1778 Creative Applications for Mobile Devices Final Report prepared for Dr. Jonathon Rose April 12 th 2011 Word count: 2351 + 490 (Apper Context) Jin Hyouk (Paul) Choi: 998495640

More information

Haptic control in a virtual environment

Haptic control in a virtual environment Haptic control in a virtual environment Gerard de Ruig (0555781) Lourens Visscher (0554498) Lydia van Well (0566644) September 10, 2010 Introduction With modern technological advancements it is entirely

More information

Measure simulated forces of impact on a human head, and test if forces are reduced by wearing a protective headgear.

Measure simulated forces of impact on a human head, and test if forces are reduced by wearing a protective headgear. PocketLab Science Fair Kit: Preventing Concussions and Head Injuries This STEM Science Fair Kit lets you be a scientist and simulate real world accidents and injuries with a crash test style dummy head.

More information

arxiv: v1 [cs.hc] 14 Jan 2015

arxiv: v1 [cs.hc] 14 Jan 2015 Expanding the Vocabulary of Multitouch Input using Magnetic Fingerprints Halim Çağrı Ateş cagri@cse.unr.edu Ilias Apostolopoulous ilapost@cse.unr.edu Computer Science and Engineering University of Nevada

More information

Laboratory 1: Motion in One Dimension

Laboratory 1: Motion in One Dimension Phys 131L Spring 2018 Laboratory 1: Motion in One Dimension Classical physics describes the motion of objects with the fundamental goal of tracking the position of an object as time passes. The simplest

More information

Exploring Multi-touch Contact Size for Z-Axis Movement in 3D Environments

Exploring Multi-touch Contact Size for Z-Axis Movement in 3D Environments Exploring Multi-touch Contact Size for Z-Axis Movement in 3D Environments Sarah Buchanan Holderness* Jared Bott Pamela Wisniewski Joseph J. LaViola Jr. University of Central Florida Abstract In this paper

More information

Heads up interaction: glasgow university multimodal research. Eve Hoggan

Heads up interaction: glasgow university multimodal research. Eve Hoggan Heads up interaction: glasgow university multimodal research Eve Hoggan www.tactons.org multimodal interaction Multimodal Interaction Group Key area of work is Multimodality A more human way to work Not

More information

Gesture-based interaction via finger tracking for mobile augmented reality

Gesture-based interaction via finger tracking for mobile augmented reality Multimed Tools Appl (2013) 62:233 258 DOI 10.1007/s11042-011-0983-y Gesture-based interaction via finger tracking for mobile augmented reality Wolfgang Hürst & Casper van Wezel Published online: 18 January

More information

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Sowmya Somanath Department of Computer Science, University of Calgary, Canada. ssomanat@ucalgary.ca Ehud Sharlin Department of Computer

More information

Bimanual Input for Multiscale Navigation with Pressure and Touch Gestures

Bimanual Input for Multiscale Navigation with Pressure and Touch Gestures Bimanual Input for Multiscale Navigation with Pressure and Touch Gestures Sebastien Pelurson and Laurence Nigay Univ. Grenoble Alpes, LIG, CNRS F-38000 Grenoble, France {sebastien.pelurson, laurence.nigay}@imag.fr

More information

Overview. The Game Idea

Overview. The Game Idea Page 1 of 19 Overview Even though GameMaker:Studio is easy to use, getting the hang of it can be a bit difficult at first, especially if you have had no prior experience of programming. This tutorial is

More information

Jitter Analysis Techniques Using an Agilent Infiniium Oscilloscope

Jitter Analysis Techniques Using an Agilent Infiniium Oscilloscope Jitter Analysis Techniques Using an Agilent Infiniium Oscilloscope Product Note Table of Contents Introduction........................ 1 Jitter Fundamentals................. 1 Jitter Measurement Techniques......

More information

EECS 4441 / CSE5351 Human-Computer Interaction. Topic #1 Historical Perspective

EECS 4441 / CSE5351 Human-Computer Interaction. Topic #1 Historical Perspective EECS 4441 / CSE5351 Human-Computer Interaction Topic #1 Historical Perspective I. Scott MacKenzie York University, Canada 1 Significant Event Timeline 2 1 Significant Event Timeline 3 As We May Think Vannevar

More information

Abstract. Keywords: Multi Touch, Collaboration, Gestures, Accelerometer, Virtual Prototyping. 1. Introduction

Abstract. Keywords: Multi Touch, Collaboration, Gestures, Accelerometer, Virtual Prototyping. 1. Introduction Creating a Collaborative Multi Touch Computer Aided Design Program Cole Anagnost, Thomas Niedzielski, Desirée Velázquez, Prasad Ramanahally, Stephen Gilbert Iowa State University { someguy tomn deveri

More information

HandMark Menus: Rapid Command Selection and Large Command Sets on Multi-Touch Displays

HandMark Menus: Rapid Command Selection and Large Command Sets on Multi-Touch Displays HandMark Menus: Rapid Command Selection and Large Command Sets on Multi-Touch Displays Md. Sami Uddin 1, Carl Gutwin 1, and Benjamin Lafreniere 2 1 Computer Science, University of Saskatchewan 2 Autodesk

More information

1 Sketching. Introduction

1 Sketching. Introduction 1 Sketching Introduction Sketching is arguably one of the more difficult techniques to master in NX, but it is well-worth the effort. A single sketch can capture a tremendous amount of design intent, and

More information

DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface

DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface Hrvoje Benko and Andrew D. Wilson Microsoft Research One Microsoft Way Redmond, WA 98052, USA

More information

Virtual I.V. System overview. Directions for Use.

Virtual I.V. System overview. Directions for Use. System overview 37 System Overview Virtual I.V. 6.1 Software Overview The Virtual I.V. Self-Directed Learning System software consists of two distinct parts: (1) The basic menus screens, which present

More information

Using Hands and Feet to Navigate and Manipulate Spatial Data

Using Hands and Feet to Navigate and Manipulate Spatial Data Using Hands and Feet to Navigate and Manipulate Spatial Data Johannes Schöning Institute for Geoinformatics University of Münster Weseler Str. 253 48151 Münster, Germany j.schoening@uni-muenster.de Florian

More information

Running an HCI Experiment in Multiple Parallel Universes

Running an HCI Experiment in Multiple Parallel Universes Author manuscript, published in "ACM CHI Conference on Human Factors in Computing Systems (alt.chi) (2014)" Running an HCI Experiment in Multiple Parallel Universes Univ. Paris Sud, CNRS, Univ. Paris Sud,

More information

GESTURE RECOGNITION SOLUTION FOR PRESENTATION CONTROL

GESTURE RECOGNITION SOLUTION FOR PRESENTATION CONTROL GESTURE RECOGNITION SOLUTION FOR PRESENTATION CONTROL Darko Martinovikj Nevena Ackovska Faculty of Computer Science and Engineering Skopje, R. Macedonia ABSTRACT Despite the fact that there are different

More information

LAB 1 Linear Motion and Freefall

LAB 1 Linear Motion and Freefall Cabrillo College Physics 10L Name LAB 1 Linear Motion and Freefall Read Hewitt Chapter 3 What to learn and explore A bat can fly around in the dark without bumping into things by sensing the echoes of

More information

Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces

Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces Katrin Wolf Telekom Innovation Laboratories TU Berlin, Germany katrin.wolf@acm.org Peter Bennett Interaction and Graphics

More information

Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback

Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback Jung Wook Park HCI Institute Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA, USA, 15213 jungwoop@andrew.cmu.edu

More information

An exploration of pen tail gestures for interactions

An exploration of pen tail gestures for interactions Available online at www.sciencedirect.com Int. J. Human-Computer Studies 71 (2012) 551 569 www.elsevier.com/locate/ijhcs An exploration of pen tail gestures for interactions Feng Tian a,d,n, Fei Lu a,

More information

PERFORMANCE IN A HAPTIC ENVIRONMENT ABSTRACT

PERFORMANCE IN A HAPTIC ENVIRONMENT ABSTRACT PERFORMANCE IN A HAPTIC ENVIRONMENT Michael V. Doran,William Owen, and Brian Holbert University of South Alabama School of Computer and Information Sciences Mobile, Alabama 36688 (334) 460-6390 doran@cis.usouthal.edu,

More information

CSE 165: 3D User Interaction. Lecture #7: Input Devices Part 2

CSE 165: 3D User Interaction. Lecture #7: Input Devices Part 2 CSE 165: 3D User Interaction Lecture #7: Input Devices Part 2 2 Announcements Homework Assignment #2 Due tomorrow at 2pm Sony Move check out Homework discussion Monday at 6pm Input Devices CSE 165 -Winter

More information

Artex: Artificial Textures from Everyday Surfaces for Touchscreens

Artex: Artificial Textures from Everyday Surfaces for Touchscreens Artex: Artificial Textures from Everyday Surfaces for Touchscreens Andrew Crossan, John Williamson and Stephen Brewster Glasgow Interactive Systems Group Department of Computing Science University of Glasgow

More information

Classic3D and Single3D: Two unimanual techniques for constrained 3D manipulations on tablet PCs

Classic3D and Single3D: Two unimanual techniques for constrained 3D manipulations on tablet PCs Classic3D and Single3D: Two unimanual techniques for constrained 3D manipulations on tablet PCs Siju Wu, Aylen Ricca, Amine Chellali, Samir Otmane To cite this version: Siju Wu, Aylen Ricca, Amine Chellali,

More information

Comparison of Phone-based Distal Pointing Techniques for Point-Select Tasks

Comparison of Phone-based Distal Pointing Techniques for Point-Select Tasks Comparison of Phone-based Distal Pointing Techniques for Point-Select Tasks Mohit Jain 1, Andy Cockburn 2 and Sriganesh Madhvanath 3 1 IBM Research, Bangalore, India mohitjain@in.ibm.com 2 University of

More information

Automated Terrestrial EMI Emitter Detection, Classification, and Localization 1

Automated Terrestrial EMI Emitter Detection, Classification, and Localization 1 Automated Terrestrial EMI Emitter Detection, Classification, and Localization 1 Richard Stottler James Ong Chris Gioia Stottler Henke Associates, Inc., San Mateo, CA 94402 Chris Bowman, PhD Data Fusion

More information

Shift: A Technique for Operating Pen-Based Interfaces Using Touch

Shift: A Technique for Operating Pen-Based Interfaces Using Touch Shift: A Technique for Operating Pen-Based Interfaces Using Touch Daniel Vogel Department of Computer Science University of Toronto dvogel@.dgp.toronto.edu Patrick Baudisch Microsoft Research Redmond,

More information

CONCEPTS EXPLAINED CONCEPTS (IN ORDER)

CONCEPTS EXPLAINED CONCEPTS (IN ORDER) CONCEPTS EXPLAINED This reference is a companion to the Tutorials for the purpose of providing deeper explanations of concepts related to game designing and building. This reference will be updated with

More information

Sensing Human Activities With Resonant Tuning

Sensing Human Activities With Resonant Tuning Sensing Human Activities With Resonant Tuning Ivan Poupyrev 1 ivan.poupyrev@disneyresearch.com Zhiquan Yeo 1, 2 zhiquan@disneyresearch.com Josh Griffin 1 joshdgriffin@disneyresearch.com Scott Hudson 2

More information

EECS 4441 Human-Computer Interaction

EECS 4441 Human-Computer Interaction EECS 4441 Human-Computer Interaction Topic #1:Historical Perspective I. Scott MacKenzie York University, Canada Significant Event Timeline Significant Event Timeline As We May Think Vannevar Bush (1945)

More information

Touch Interfaces. Jeff Avery

Touch Interfaces. Jeff Avery Touch Interfaces Jeff Avery Touch Interfaces In this course, we have mostly discussed the development of web interfaces, with the assumption that the standard input devices (e.g., mouse, keyboards) are

More information

Lab 1. Motion in a Straight Line

Lab 1. Motion in a Straight Line Lab 1. Motion in a Straight Line Goals To understand how position, velocity, and acceleration are related. To understand how to interpret the signed (+, ) of velocity and acceleration. To understand how

More information

SCOUT Mobile User Guide 3.0

SCOUT Mobile User Guide 3.0 SCOUT Mobile User Guide 3.0 Android Guide 3864 - SCOUT February 2017 SCOUT Mobile Table of Contents Supported Devices...1 Multiple Manufacturers...1 The Three Tabs of SCOUT TM Mobile 3.0...1 SCOUT...1

More information

The Effect of Display Type and Video Game Type on Visual Fatigue and Mental Workload

The Effect of Display Type and Video Game Type on Visual Fatigue and Mental Workload Proceedings of the 2010 International Conference on Industrial Engineering and Operations Management Dhaka, Bangladesh, January 9 10, 2010 The Effect of Display Type and Video Game Type on Visual Fatigue

More information

A Technique for Touch Force Sensing using a Waterproof Device s Built-in Barometer

A Technique for Touch Force Sensing using a Waterproof Device s Built-in Barometer Late-Breaking Work B C Figure 1: Device conditions. a) non-tape condition. b) with-tape condition. A Technique for Touch Force Sensing using a Waterproof Device s Built-in Barometer Ryosuke Takada Ibaraki,

More information

Virtual Grasping Using a Data Glove

Virtual Grasping Using a Data Glove Virtual Grasping Using a Data Glove By: Rachel Smith Supervised By: Dr. Kay Robbins 3/25/2005 University of Texas at San Antonio Motivation Navigation in 3D worlds is awkward using traditional mouse Direct

More information

Haptic Feedback Design for a Virtual Button Along Force-Displacement Curves

Haptic Feedback Design for a Virtual Button Along Force-Displacement Curves Haptic Feedback Design for a Virtual Button Along Force-Displacement Curves Sunjun Kim and Geehyuk Lee Department of Computer Science, KAIST Daejeon 305-701, Republic of Korea {kuaa.net, geehyuk}@gmail.com

More information

Computer Tools for Data Acquisition

Computer Tools for Data Acquisition Computer Tools for Data Acquisition Introduction to Capstone You will be using a computer to assist in taking and analyzing data throughout this course. The software, called Capstone, is made specifically

More information

Facilitation of Affection by Tactile Feedback of False Heartbeat

Facilitation of Affection by Tactile Feedback of False Heartbeat Facilitation of Affection by Tactile Feedback of False Heartbeat Narihiro Nishimura n-nishimura@kaji-lab.jp Asuka Ishi asuka@kaji-lab.jp Michi Sato michi@kaji-lab.jp Shogo Fukushima shogo@kaji-lab.jp Hiroyuki

More information

Exploring Surround Haptics Displays

Exploring Surround Haptics Displays Exploring Surround Haptics Displays Ali Israr Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh, PA 15213 USA israr@disneyresearch.com Ivan Poupyrev Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh,

More information