Using Wiimote for 2D and 3D Pointing Tasks: Gesture Performance Evaluation
Georgios Kouroupetroglou 1,2, Alexandros Pino 2, Athanasios Balmpakakis 1, Dimitrios Chalastanis 1, Vasileios Golematis 1, Nikolaos Ioannou 1, and Ioannis Koutsoumpas 1

1 Department of Informatics and Telecommunications, 2 Accessibility Unit for Students with Disabilities, National and Kapodistrian University of Athens, Panepistimiopolis, GR-15784, Athens, Greece, koupe@di.uoa.gr

Abstract. We present two studies that comparatively evaluate the performance of gesture-based 2D and 3D pointing tasks. In both, a Wiimote controller and a standard mouse were used by six participants. For the 3D experiments we introduce a novel configuration analogous to the ISO standard methodology. We examine the pointing devices' conformance to Fitts' law and measure eight extra parameters that describe the cursor movement trajectory more accurately. For the 2D tasks using the Wiimote, Throughput is 41.2% lower than with the mouse, target re-entry is almost the same, and the missed-click count is three times higher. For the 3D tasks using the Wiimote, Throughput is 56.1% lower than with the mouse, target re-entry is increased by almost 50%, and the missed-click count is sixteen times higher.

Keywords: Fitts' law, 3D pointing, Gesture User Interface, Wiimote.

1 Introduction

Nowadays, low-cost hand-held devices, introduced along with widespread game platforms/consoles, can also be used as input devices for general-purpose Personal Computers. Thus, in recent years there has been growing research interest in the domain of device-based gesture user interaction. Nintendo's Wii Remote Control (known as the Wiimote) is a typical example of these devices. Most of them incorporate accelerometer sensors. Accelerometer-based recognition of dynamic gestures has been investigated mainly by applying Hidden Markov Models (HMM) [1-2], and its usability has been evaluated in comparison with other modalities [3].
E. Efthimiou, G. Kouroupetroglou, S.-E. Fotinea (Eds.): GW 2011, LNAI 7206, Springer-Verlag Berlin Heidelberg 2012

Gesture recognition for the Wiimote using either its 3-axis accelerometer or its high-resolution high-speed IR camera [7] has been developed by applying various methods and techniques, such as simple pattern recognition approaches [4], HMM [5], Dynamic Time Warping combined with HMM [10], or Slow Feature Analysis and parametric bootstrap [6]. GesText is an accelerometer-based Wiimote gestural text-entry system [9]. The Wiimote can be utilized to uncover the user's cultural background by analyzing
his patterns of gestural expressivity in a model based on cultural dimensions [8]. SignWiiver, a gesture recognition system that lets the user perform gestures with a Wiimote, uses a language built around the movement parameter of natural sign languages [11]. Usability evaluation based on gesture recognition has also demonstrated the applicability of the Wiimote as a musical controller [25].

The point-and-click metaphor (usually referred to as pointing) constitutes a fundamental task for most two-dimensional (2D) and three-dimensional (3D) Graphical User Interfaces (GUIs), enabling users to perform an object selection operation. Moreover, typing, resizing, dragging, scrolling, and other GUI operations require pointing. In order to develop better pointing techniques we need to understand human pointing behavior and motor control. Fitts' law [12] can be used to: a) model the way users perform target selection, b) measure the user's performance, and c) compare the user's performance across various input devices or the change in performance over time. Fitts' law has been applied to three-dimensional pointing tasks [13] as well as to the design of gesture-based pointing interactions [14-15], including the Wiimote [16-17]. The most common evaluation measures of Fitts' law are speed, accuracy, and Throughput [18].

In this paper we present two experiments to comparatively evaluate the performance of gesture-based 2D and 3D pointing tasks. Beyond testing Fitts' law, we measure the following eight extra parameters that describe the real cursor movement trajectory more accurately: missed clicks (MCL), target re-entry (TRE), task axis crossing (TAC), movement direction change (MDC), orthogonal direction change (ODC), movement variability (MV), movement error (ME), and movement offset (MO). For the 3D experiments we introduce a novel configuration analogous to the ISO standard [19] methodology.
2 Methodology

Fitts [12] proposed a model for the tradeoff between accuracy and speed in human motor movements. The model, commonly known as Fitts' law, is based on Shannon's information theory: Fitts proposed to quantify a movement task's difficulty in "bits". Specifically, he introduced the Index of Difficulty (ID):

ID = log2(2D / W)    (1)

where D and W are the target's distance and width respectively, analogous to signal and noise in Shannon's original research on electronic communications systems. The following expression for ID is more commonly used today, as it improves the information-theoretic analogy [18]:

ID = log2(D / W + 1)    (2)

Because D and W are both measures of distance, the term in the parentheses is unitless; "bits" emerges from the choice of base 2 for the logarithm. Fitts' law is
often used to build a prediction model with the Movement Time (MT) to complete point-select tasks as the dependent variable:

MT = a + b · ID    (3)

The intercept (a) and slope (b) coefficients in the prediction equation are determined through empirical tests, typically using linear regression. In order to evaluate the Wiimote's conformance to Fitts' law as an input device, we designed and implemented a novel software application for our experiments that covers both 2D and 3D gesture-based user interaction. Our methodology is based on the ISO standard [19-20], which describes a standardized procedure to evaluate the performance, comfort, and effort in using computer pointing devices; this procedure makes it possible to interpret the experimental results and to undertake comparisons between studies.

For the 2D case, in each multi-directional test, 16 circular targets are arranged in an equidistant layout (Fig. 1). The task begins with a click on the centre of the first target; the participant must then move the cursor directly to the opposite target and click on it, and so on clockwise. The target to be clicked is highlighted each time. Each test block ends when all targets have been selected (16 trials), and 5 blocks are run with different combinations of target width and circle radius (5 different Indexes of Difficulty), giving a total of 80 trials per user.

Fig. 1. Screenshot of the 2D pointing task

For the 3D case, 8 spherical targets are placed at the corners of a 3-dimensional cube (Fig. 2). Each task begins with a click on the centre of a target. The participant must then move the cursor directly to the target that lies opposite with respect to the centre of the cube and click on it. After a successful trial the cursor teleports to another target, which becomes the beginning of the next route. The next target is highlighted each time.
Each test block ends when all 8 equidistant diagonal routes that connect the 8 targets have been successfully completed (8 trials), and 5 blocks are run for different target radii (in total 5 different Indexes of Difficulty), giving a total of 40 trials per user.

Fig. 2. Screenshot of the 3D pointing task

Fitts proposed to quantify the human rate of information processing in aimed movements using bits per second as the unit. Fitts called the measure "index of performance"; today it is more commonly called Throughput (TP, in bits/s). Although different methods of calculating Throughput exist in the literature, the preferred method is the one proposed by Fitts in 1954 [12]. The calculation involves a direct division of means: dividing ID_e (bits) by the mean Movement Time, MT (seconds), computed over a block of trials:

TP = ID_e / MT    (4)

The subscript e in ID_e reflects a small but important adjustment, which Fitts endorsed in a follow-up paper [22]. The adjustment for accuracy involves first computing the effective target width as

W_e = 4.133 × SD_x    (5)

where SD_x is the observed standard deviation in a participant's selection coordinates over repeated trials with a particular D-W condition. Computed in this manner, W_e includes the spatial variability, or accuracy, in responses. In essence, it captures what a participant actually did, rather than what he or she was asked to do. This adjustment necessitates a similar adjustment to ID, yielding an effective Index of Difficulty:

ID_e = log2(D / W_e + 1)    (6)
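The adjusted computation of Eqs. (4)-(6) is straightforward to implement. A minimal sketch in Python (not the authors' LabVIEW code; the input data below are hypothetical):

```python
import math
import statistics

def effective_throughput(D, selection_coords, movement_times_ms):
    """Throughput with the accuracy adjustment of Eqs. (4)-(6).
    D: nominal target distance; selection_coords: per-trial selection
    coordinates along the movement axis for one D-W condition;
    movement_times_ms: per-trial movement times in milliseconds."""
    sd_x = statistics.stdev(selection_coords)    # SDx over repeated trials
    w_e = 4.133 * sd_x                           # effective width, Eq. (5)
    id_e = math.log2(D / w_e + 1)                # effective ID, Eq. (6)
    mt_s = statistics.mean(movement_times_ms) / 1000.0
    return id_e / mt_s                           # TP in bits/s, Eq. (4)

# Hypothetical block of trials: D = 256 px, selection endpoints scattered
# around the target, movement times around 650 ms
tp = effective_throughput(256, [250, 260, 255, 248, 252], [600, 700, 650])
```

Note that doubling every movement time halves TP while leaving ID_e unchanged, which is why TP captures speed and accuracy in a single figure.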
Calculated using the adjustment for accuracy, TP is a human performance measure that embeds both the speed and the accuracy of responses. TP is most useful as a dependent variable in factorial experiments using pointing devices or pointing techniques as independent variables.

Additionally, based on the theory proposed by MacKenzie et al. [21], we measure the following extra parameters of the real cursor movement trajectory monitored by our application:

Missed clicks (MCL): a missed click occurs when an input device button click is registered outside the target.

Target re-entry (TRE): a target re-entry occurs when the pointer enters the target region, leaves it, and enters it again. If this behaviour is recorded twice in a sequence of ten trials, TRE is reported as 0.2 per trial. A task with one target re-entry is shown in Figure 3.

Fig. 3. Target Re-Entry

Task axis crossing (TAC): the task axis is defined as the straight line from the starting point to the target centre (see Figure 4). A task axis crossing occurs when the cursor crosses this line, as it does once in Figure 5.

Fig. 4. A perfect target-selection task

Fig. 5. Task Axis Crossing

Movement direction change (MDC): an MDC occurs when the tangent to the cursor path becomes parallel to the task axis. In the trajectory of Figure 6, three MDCs are logged.

Fig. 6. Movement Direction Change

Orthogonal direction change (ODC): direction changes that occur along the axis orthogonal to the task axis, as happens twice in Figure 7.
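The discrete event counts can be derived directly from the sampled pointer path. A minimal sketch under the definitions above (the function names and the sampled-path representation are our own, not the authors' implementation):

```python
import math

def target_reentries(path, center, radius):
    """TRE: count of re-entries into the target region after the
    first (expected) entry. path is a list of (x, y) samples."""
    entries, inside = 0, False
    for x, y in path:
        now = math.hypot(x - center[0], y - center[1]) <= radius
        if now and not inside:
            entries += 1
        inside = now
    return max(0, entries - 1)

def task_axis_crossings(distances):
    """TAC: sign changes of the signed perpendicular distance of each
    sample to the task axis (the line from start point to target centre)."""
    crossings, prev = 0, None
    for d in distances:
        if d == 0:
            continue          # samples on the axis do not change the side
        side = d > 0
        if prev is not None and side != prev:
            crossings += 1
        prev = side
    return crossings
```

For example, a path that enters the target, overshoots out of it, and comes back registers one re-entry; a distance sequence that alternates sign three times registers three axis crossings.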
Fig. 7. Orthogonal Direction Change

The five measures above characterize the pointer path by logging discrete events. Three continuous measures complete the set of calculations:

Movement variability (MV) is a continuous measure computed from the x-y coordinates of the pointer during a movement task. It represents the extent to which the sample cursor points lie in a straight line along an axis parallel to the task axis. Consider Figure 8, which shows a simple left-to-right target selection task and the path of the pointer with five sample points. Assuming the task axis is y = 0, y_i is the distance from a sample point to the task axis, and ȳ is the mean distance of the sample points from the task axis. Movement variability is computed as the standard deviation of the distances of the sample points from the mean:

MV = sqrt( Σ(y_i − ȳ)² / (n − 1) )    (7)

In a perfectly executed trial, MV = 0.

Movement error (ME) is the average deviation of the sample points from the task axis, irrespective of whether the points are above or below the axis. Assuming the task axis is y = 0 as in Figure 8, then

ME = Σ|y_i| / n    (8)

In an ideal task, ME = 0.

Movement offset (MO) is the mean deviation of the sample points from the task axis. Unlike movement error, this measure depends on whether the points are above or below the axis:

MO = ȳ    (9)

Movement offset represents the tendency of the pointer to veer left or right of the task axis during a movement. In an ideal task, MO = 0.

Fig. 8. Sample coordinates of pointer motion
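Equations (7)-(9) map directly onto code. A small sketch (our own helper, assuming the per-sample perpendicular distances y_i to the task axis have already been extracted):

```python
import math

def trajectory_measures(y):
    """MV, ME, MO of Eqs. (7)-(9); y holds the perpendicular distances
    y_i of the sampled pointer positions to the task axis (axis = y 0)."""
    n = len(y)
    y_bar = sum(y) / n
    mv = math.sqrt(sum((yi - y_bar) ** 2 for yi in y) / (n - 1))  # Eq. (7)
    me = sum(abs(yi) for yi in y) / n                             # Eq. (8)
    mo = y_bar                                                    # Eq. (9)
    return mv, me, mo

# A perfectly straight trial gives MV = ME = MO = 0; a path that is
# symmetric about the axis gives MO = 0 but non-zero MV and ME.
mv, me, mo = trajectory_measures([0.0, 1.0, -1.0, 2.0, -2.0])
```

The three measures thus separate wobble (MV), overall deviation (ME), and systematic veer to one side (MO).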
These extra parameters have previously been applied in a 2D setup for Brain Computer Interface cursor measures for motion-impaired and able-bodied users [23].

The experimental application was developed as a Virtual Instrument in the LabVIEW (Laboratory Virtual Instrumentation Engineering Workbench) graphical programming environment by National Instruments [24]. We tested the Wiimote (Figure 9a) as a gesture input device, getting real-time data from both its high-resolution IR camera (in combination with the IR LED array illuminator shown in Figure 9b) and its 3-axis accelerometer. The Wiimote was treated as an HID (Human Interface Device) compliant device connected to a regular PC over Bluetooth. The computer used was a Core 2 Duo 1.8 GHz laptop with 3 GB of RAM and an NVIDIA GTS250 graphics card, running MS-Windows 7 Professional and LabVIEW. It was connected to an external 24" TFT monitor with 1280x800 pixels resolution, which was used as the main display for the experiments.

Six male participants, students of the Department of Informatics, University of Athens, volunteered for the study. Their mean age was 25 years (SD 4.9). All of them had normal or corrected vision and were right-handed. They also reported an average of three hours of daily mouse usage. None of the participants had any previous experience with the Wiimote.

(a) (b) Fig. 9. (a) The Wii Remote Control, (b) 4-LED infrared light source (illuminator)
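As detailed in the Conclusions, for the 3D tests the cursor position was derived from a pair of IR lights: their mean image position gives the x-y coordinates, and their apparent separation gives a depth cue. A hypothetical sketch of such a mapping (not the authors' implementation; `cursor_from_ir_pair` and the `ref_separation` calibration constant are our own assumptions, with depth taken as inversely proportional to apparent separation):

```python
import math

def cursor_from_ir_pair(p1, p2, ref_separation):
    """p1, p2: image coordinates of two visible IR LED blobs reported by
    the Wiimote camera. ref_separation: the pair's apparent separation
    (image units) at a reference depth. Returns (x, y, z), where z > 1
    means the controller is farther away than the reference depth."""
    sep = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    x = (p1[0] + p2[0]) / 2          # mean x of the pair
    y = (p1[1] + p2[1]) / 2          # mean y of the pair
    z = ref_separation / sep         # apparent size shrinks with distance
    return x, y, z
```

For instance, when the observed separation is half the reference separation, the estimated depth doubles.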
3 Results

Fig. 10. Measurements of Movement Time (mean values for all trials) as a function of Index of Difficulty for all the participants in 2D (a) and 3D (b) experiments using the Wiimote and the mouse. For the 2D experiments the fitted lines are: mouse MT = 193.62·ID + 255.18 (R² = 0.6869) and Wiimote MT = 444.72·ID − 33.762 (R² = 0.7253); for the 3D experiments: mouse slope 248.91 ms/bit (R² = 0.1786) and Wiimote slope 1426.8 ms/bit (R² = 0.3968).

During the experiments we discovered that even the slightest amount of sunlight in the room interfered with the Wiimote's IR camera, making it impossible to get
decent results when running in IR mode, i.e., in the 3D tests. We had to move to a very dark room (artificial light was not a problem) and run the trials again. In both the 2D and 3D experiments users were instructed not to stop on erroneous clicks, and audio feedback was given in that case. Visual and audio feedback was also given on successful clicks. Each task was explained and demonstrated to the participants, and a warm-up set of trials was given. A 100 Hz sampling rate was used for cursor trajectory data acquisition.

Measurements of Movement Time (mean values for all trials) as a function of Index of Difficulty for all the participants in the 2D (a) and 3D (b) experiments using the Wiimote and the mouse are presented in Fig. 10. After the statistical analysis of all data from all users, the results for the additional cursor movement parameters are presented in Table 1.

Table 1. Calculated parameters of the cursor trajectory generated by the two gesture input devices in 2D and 3D experiments

            TP     MCL    TRE    TAC    MDC     ODC    MV     ME     MO
2D  mouse   5.05   0.05   0.12   1.48   6.04    0.91   0.32   0.39   0.04
    Wiimote 2.97   0.14   0.11   1.56   18.06   2.48   0.55   0.79   0.22
3D  mouse   1.71   0.06   0.10
    Wiimote 0.75   0.96   1.46

(TP: throughput; MCL: missed clicks; TRE: target re-entry; TAC: task axis crossing; MDC: movement direction change; ODC: orthogonal direction change; MV: movement variability; ME: movement error; MO: movement offset)

4 Conclusions

The Throughput calculations for the mouse are consistent with other studies (reported values range from 5 to 5.9 bits/s for 2D tasks). For the 2D tasks using the Wiimote, Throughput is 41.2% lower than with the mouse, target re-entry is almost the same, and the missed-click count is 3 times higher. For the 3D tasks using the Wiimote, Throughput is 56.1% lower than with the mouse, target re-entry is more than 14 times higher, and the missed-click count is 16 times higher. Furthermore, Fig.
10 shows that the correlation coefficient of the fitted line (R²), which reflects the reliability of the linear relationship between MT and ID values and, therefore, the compliance with Fitts' law, is generally slightly higher for the Wiimote controller than for the mouse, and significantly lower for the 3D than for the 2D experiments.
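The fitted lines and R² values of Fig. 10 come from ordinary least squares over the (ID, MT) pairs. A sketch using the standard closed-form formulas (our own helper, not the authors' LabVIEW code):

```python
def fitts_fit(ids, mts):
    """Least-squares fit of MT = a + b*ID (Eq. 3) plus the R^2
    goodness-of-fit reported in Fig. 10."""
    n = len(ids)
    mx = sum(ids) / n
    my = sum(mts) / n
    sxx = sum((x - mx) ** 2 for x in ids)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ids, mts))
    b = sxy / sxx                  # slope (ms per bit)
    a = my - b * mx                # intercept (ms)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(ids, mts))
    ss_tot = sum((y - my) ** 2 for y in mts)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical perfectly linear data: intercept 100 ms, slope 200 ms/bit
a, b, r2 = fitts_fit([1.0, 2.0, 3.0], [300.0, 500.0, 700.0])
```

R² = 1 only when every (ID, MT) pair lies exactly on the fitted line; the low 3D values in Fig. 10 therefore indicate a weak linear relationship, not merely noisy timing.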
We must note that for the 2D experiments we used the mouse in the standard way, dragging it on a Goldtouch fabric pad and clicking with the left mouse button; as far as the Wiimote is concerned, for the 2D tests we acquired cursor movement coordinates from the device's accelerometers, taking into account only the normalized x- and y-axis data. For the 3D experiments the difference was that we used the mouse's scroll wheel to move along the z-axis, scrolling up to go into the screen and scrolling down to come out of it; regarding the Wiimote, for the 3D tests we considered only the IR camera data for movement on all axes, calculating each time the distance difference between a pair of lights that the IR camera was seeing for z-axis movement, and the mean x-y coordinates of the same pair of lights for the other two axes respectively. The algorithm of our application chose a pair of visible lights, among the four available, every 10 ms and changed pairs when one or both of them were no longer visible.

We conclude that the Wiimote proved to be a much slower and harder-to-use input device than the mouse for both 2D and 3D pointing tasks. The 3D tests show a strong weakness of both the mouse and the Wiimote as effective 3D pointing devices, which is partly explained by the fact that no user had previous experience of using these devices for such tasks. Future work may include involving more users in the experiments, involving disabled users to measure their gesture abilities, research on how performance changes over time (i.e., familiarization with the Wiimote and performance improvement), the introduction of new trajectory measures for 3D tasks (also in spherical coordinates), and the construction of a larger experimental IR LED grid in order to test Wiimote gesture-based interaction again, anticipating more accurate results.

References

1.
Hofmann, F.G., Heyer, P., Hommel, G.: Velocity Profile Based Recognition of Dynamic Gestures with Discrete Hidden Markov Models. In: Wachsmuth, I., Fröhlich, M. (eds.) GW 1997. LNCS (LNAI), vol. 1371. Springer, Heidelberg (1998)
2. Mantyjarvi, J., Kela, J., Korpipaa, P., Kallio, S.: Enabling fast and effortless customization in accelerometer based gesture interaction. In: MUM 2004. ACM Press (2004)
3. Mantyjarvi, J., Kela, J., Korpipaa, P., Kallio, S., Savino, G., Jozzo, L., Marca, D.: Accelerometer-based gesture control for a design environment. Personal and Ubiquitous Computing 10(5) (2006)
4. Kratz, S., Rohs, M.: A $3 Gesture Recognizer: Simple Gesture Recognition for Devices Equipped with 3D Acceleration Sensors. In: International Conference on Intelligent User Interfaces (IUI 2010). ACM Press (2010)
5. Schlomer, T., Poppinga, B., Henze, N., Boll, S.: Gesture recognition with a Wii controller. In: TEI 2008: Tangible and Embedded Interaction Conference. ACM Press (2008)
6. Koch, P., Konen, W., Hein, K.: Gesture Recognition on Few Training Data using Slow Feature Analysis and Parametric Bootstrap. In: International Joint Conference on Neural Networks, Barcelona, pp. 1-8 (2010)
7. Lee, J.C.: Hacking the Nintendo Wii remote. IEEE Pervasive Computing 7(3) (2008)
8. Rehm, M., Bee, N., Andre, E.: Wave Like an Egyptian - Accelerometer Based Gesture Recognition for Culture Specific Interactions. In: HCI 2008: Culture, Creativity, Interaction (2008)
9. Jones, E., Alexander, J., Andreou, A., Irani, P., Subramanian, S.: GesText: Accelerometer-Based Gestural Text-Entry Systems. In: CHI 2010, Atlanta, Georgia, USA (2010)
10. Leong, T., Lai, J., Pong, P., Panza, J., Hong, J.: Wii Want to Write: An Accelerometer Based Gesture Recognition System. In: Intern. Conf. on Recent and Emerging Advanced Technologies in Engineering, Malaysia, pp. 4-7 (2009)
11. Malmestig, P., Sundberg, S.: SignWiiver - implementation of sign language technology. University of Göteborg (2008)
12. Fitts, P.M.: The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology 47(6) (1954); reprinted in Journal of Experimental Psychology: General 121(3) (1992)
13. Murata, A., Iwase, H.: Extending Fitts' law to a three-dimensional pointing task. Human Movement Science 20 (2001)
14. Chen, R., Wu, F.-G., Chen, K.: Extension of Fitts' Law for the design of the gesture pointing interaction. In: 3rd World Conference on Design Research - IASDR 2009, Korea (2009)
15. Foehrenbach, S., König, W., Gerken, J., Reiterer, H.: Natural Interaction with Hand Gestures and Tactile Feedback for large, high-res Displays. In: MITH 2008: Workshop on Multimodal Interaction Through Haptic Feedback, Napoli, Italy (2008)
16. Fikkert, W., van der Vet, P., Nijholt, A.: Hand-held device evaluation in gesture interfaces. In: 8th International Gesture Workshop - GW 2009 (2009)
17. McArthur, V., Castellucci, S.J., MacKenzie, I.S.: An empirical comparison of Wiimote gun attachments for pointing tasks. In: ACM SIGCHI Symposium on Engineering Interactive Computing Systems - EICS 2009. ACM, New York (2009)
18. MacKenzie, I.S.: Movement time prediction in human-computer interfaces. In: Baecker, R.M., Buxton, W.A.S., Grudin, J., Greenberg, S. (eds.) Readings in Human-Computer Interaction, 2nd edn. Kaufmann, San Francisco (1995)
19. ISO: Ergonomic requirements for office work with visual display terminals (VDTs) - Part 9: Requirements for non-keyboard input devices. Technical Report (2000)
20. Soukoreff, R.W., MacKenzie, I.S.: Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts' law research in HCI. International Journal of Human-Computer Studies 61(6) (2004)
21. MacKenzie, I.S., Kauppinen, T., Silfverberg, M.: Accuracy measures for evaluating computer pointing devices. In: ACM Conference on Human Factors in Computing Systems - CHI 2001. ACM, New York (2001)
22. Fitts, P.M., Peterson, J.R.: Information capacity of discrete motor responses. Journal of Experimental Psychology 67 (1964)
23. Pino, A., Kalogeros, E., Salemis, I., Kouroupetroglou, G.: Brain Computer Interface Cursor Measures for Motion-impaired and Able-bodied Users. In: 10th International Conference on Human-Computer Interaction, vol. 4. Lawrence Erlbaum Associates, Mahwah (2003)
24. National Instruments: The LabVIEW Environment
25. Kiefer, C., Collins, N., Fitzpatrick, G.: Evaluating the Wiimote as a Musical Controller. In: International Computer Music Conference - ICMC 2008 (2008)
More informationMensch-Maschine-Interaktion 1. Chapter 9 (June 28th, 2012, 9am-12pm): Basic HCI Models
Mensch-Maschine-Interaktion 1 Chapter 9 (June 28th, 2012, 9am-12pm): Basic HCI Models 1 Overview Introduction Basic HCI Principles (1) Basic HCI Principles (2) User Research & Requirements Designing Interactive
More informationHaptic Feedback in Remote Pointing
Haptic Feedback in Remote Pointing Laurens R. Krol Department of Industrial Design Eindhoven University of Technology Den Dolech 2, 5600MB Eindhoven, The Netherlands l.r.krol@student.tue.nl Dzmitry Aliakseyeu
More informationReaching Movements to Augmented and Graphic Objects in Virtual Environments
Reaching Movements to Augmented and Graphic Objects in Virtual Environments Andrea H. Mason, Masuma A. Walji, Elaine J. Lee and Christine L. MacKenzie School of Kinesiology Simon Fraser University Burnaby,
More informationMultimodal Interaction Concepts for Mobile Augmented Reality Applications
Multimodal Interaction Concepts for Mobile Augmented Reality Applications Wolfgang Hürst and Casper van Wezel Utrecht University, PO Box 80.089, 3508 TB Utrecht, The Netherlands huerst@cs.uu.nl, cawezel@students.cs.uu.nl
More informationE90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright
E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7
More informationExploring Surround Haptics Displays
Exploring Surround Haptics Displays Ali Israr Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh, PA 15213 USA israr@disneyresearch.com Ivan Poupyrev Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh,
More informationFeelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces
Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces Katrin Wolf Telekom Innovation Laboratories TU Berlin, Germany katrin.wolf@acm.org Peter Bennett Interaction and Graphics
More informationHaptic presentation of 3D objects in virtual reality for the visually disabled
Haptic presentation of 3D objects in virtual reality for the visually disabled M Moranski, A Materka Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, Lodz, POLAND marcin.moranski@p.lodz.pl,
More informationhttp://uu.diva-portal.org This is an author produced version of a paper published in Proceedings of the 23rd Australian Computer-Human Interaction Conference (OzCHI '11). This paper has been peer-reviewed
More informationREBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL
World Automation Congress 2010 TSI Press. REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL SEIJI YAMADA *1 AND KAZUKI KOBAYASHI *2 *1 National Institute of Informatics / The Graduate University for Advanced
More informationE X P E R I M E N T 12
E X P E R I M E N T 12 Mirrors and Lenses Produced by the Physics Staff at Collin College Copyright Collin College Physics Department. All Rights Reserved. University Physics II, Exp 12: Mirrors and Lenses
More informationMULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT
MULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT F. TIECHE, C. FACCHINETTI and H. HUGLI Institute of Microtechnology, University of Neuchâtel, Rue de Tivoli 28, CH-2003
More informationProjection Based HCI (Human Computer Interface) System using Image Processing
GRD Journals- Global Research and Development Journal for Volume 1 Issue 5 April 2016 ISSN: 2455-5703 Projection Based HCI (Human Computer Interface) System using Image Processing Pankaj Dhome Sagar Dhakane
More informationDrumtastic: Haptic Guidance for Polyrhythmic Drumming Practice
Drumtastic: Haptic Guidance for Polyrhythmic Drumming Practice ABSTRACT W e present Drumtastic, an application where the user interacts with two Novint Falcon haptic devices to play virtual drums. The
More informationRESNA Gaze Tracking System for Enhanced Human-Computer Interaction
RESNA Gaze Tracking System for Enhanced Human-Computer Interaction Journal: Manuscript ID: Submission Type: Topic Area: RESNA 2008 Annual Conference RESNA-SDC-063-2008 Student Design Competition Computer
More informationCSE 165: 3D User Interaction. Lecture #7: Input Devices Part 2
CSE 165: 3D User Interaction Lecture #7: Input Devices Part 2 2 Announcements Homework Assignment #2 Due tomorrow at 2pm Sony Move check out Homework discussion Monday at 6pm Input Devices CSE 165 -Winter
More informationLAB Week 7: Data Acquisition
LAB Week 7: Data Acquisition Wright State University: Mechanical Engineering ME 3600L Section 01 Report and experiment by: Nicholas Smith Experiment performed on February 23, 2015 Due: March 16, 2015 Instructor:
More informationThe Effect of Frequency Shifting on Audio-Tactile Conversion for Enriching Musical Experience
The Effect of Frequency Shifting on Audio-Tactile Conversion for Enriching Musical Experience Ryuta Okazaki 1,2, Hidenori Kuribayashi 3, Hiroyuki Kajimioto 1,4 1 The University of Electro-Communications,
More informationMicrosoft Scrolling Strip Prototype: Technical Description
Microsoft Scrolling Strip Prototype: Technical Description Primary features implemented in prototype Ken Hinckley 7/24/00 We have done at least some preliminary usability testing on all of the features
More informationDistance Estimation and Localization of Sound Sources in Reverberant Conditions using Deep Neural Networks
Distance Estimation and Localization of Sound Sources in Reverberant Conditions using Deep Neural Networks Mariam Yiwere 1 and Eun Joo Rhee 2 1 Department of Computer Engineering, Hanbat National University,
More informationLaboratory Experiment #1 Introduction to Spectral Analysis
J.B.Francis College of Engineering Mechanical Engineering Department 22-403 Laboratory Experiment #1 Introduction to Spectral Analysis Introduction The quantification of electrical energy can be accomplished
More informationPrediction and Correction Algorithm for a Gesture Controlled Robotic Arm
Prediction and Correction Algorithm for a Gesture Controlled Robotic Arm Pushkar Shukla 1, Shehjar Safaya 2, Utkarsh Sharma 3 B.Tech, College of Engineering Roorkee, Roorkee, India 1 B.Tech, College of
More informationINTERACTION AND SOCIAL ISSUES IN A HUMAN-CENTERED REACTIVE ENVIRONMENT
INTERACTION AND SOCIAL ISSUES IN A HUMAN-CENTERED REACTIVE ENVIRONMENT TAYSHENG JENG, CHIA-HSUN LEE, CHI CHEN, YU-PIN MA Department of Architecture, National Cheng Kung University No. 1, University Road,
More informationA SURVEY ON GESTURE RECOGNITION TECHNOLOGY
A SURVEY ON GESTURE RECOGNITION TECHNOLOGY Deeba Kazim 1, Mohd Faisal 2 1 MCA Student, Integral University, Lucknow (India) 2 Assistant Professor, Integral University, Lucknow (india) ABSTRACT Gesture
More informationFuzzy-Heuristic Robot Navigation in a Simulated Environment
Fuzzy-Heuristic Robot Navigation in a Simulated Environment S. K. Deshpande, M. Blumenstein and B. Verma School of Information Technology, Griffith University-Gold Coast, PMB 50, GCMC, Bundall, QLD 9726,
More informationDevelopment of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane
Journal of Communication and Computer 13 (2016) 329-337 doi:10.17265/1548-7709/2016.07.002 D DAVID PUBLISHING Development of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane
More informationIntegrated Driving Aware System in the Real-World: Sensing, Computing and Feedback
Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback Jung Wook Park HCI Institute Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA, USA, 15213 jungwoop@andrew.cmu.edu
More informationDirect Manipulation. and Instrumental Interaction. CS Direct Manipulation
Direct Manipulation and Instrumental Interaction 1 Review: Interaction vs. Interface What s the difference between user interaction and user interface? Interface refers to what the system presents to the
More informationThe Effect of Haptic Feedback on Basic Social Interaction within Shared Virtual Environments
The Effect of Haptic Feedback on Basic Social Interaction within Shared Virtual Environments Elias Giannopoulos 1, Victor Eslava 2, María Oyarzabal 2, Teresa Hierro 2, Laura González 2, Manuel Ferre 2,
More informationRunning an HCI Experiment in Multiple Parallel Universes
Author manuscript, published in "ACM CHI Conference on Human Factors in Computing Systems (alt.chi) (2014)" Running an HCI Experiment in Multiple Parallel Universes Univ. Paris Sud, CNRS, Univ. Paris Sud,
More informationOnline Game Quality Assessment Research Paper
Online Game Quality Assessment Research Paper Luca Venturelli C00164522 Abstract This paper describes an objective model for measuring online games quality of experience. The proposed model is in line
More informationSafe and Efficient Autonomous Navigation in the Presence of Humans at Control Level
Safe and Efficient Autonomous Navigation in the Presence of Humans at Control Level Klaus Buchegger 1, George Todoran 1, and Markus Bader 1 Vienna University of Technology, Karlsplatz 13, Vienna 1040,
More informationAbstract. Keywords: Multi Touch, Collaboration, Gestures, Accelerometer, Virtual Prototyping. 1. Introduction
Creating a Collaborative Multi Touch Computer Aided Design Program Cole Anagnost, Thomas Niedzielski, Desirée Velázquez, Prasad Ramanahally, Stephen Gilbert Iowa State University { someguy tomn deveri
More informationTowards the Design of Effective Freehand Gestural Interaction for Interactive TV
Towards the Design of Effective Freehand Gestural Interaction for Interactive TV Gang Ren a,*, Wenbin Li b and Eamonn O Neill c a School of Digital Arts, Xiamen University of Technology, No. 600 Ligong
More informationA Multimodal Locomotion User Interface for Immersive Geospatial Information Systems
F. Steinicke, G. Bruder, H. Frenz 289 A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems Frank Steinicke 1, Gerd Bruder 1, Harald Frenz 2 1 Institute of Computer Science,
More informationUsing Google SketchUp
Using Google SketchUp Opening sketchup 1. From the program menu click on the SketchUp 8 folder and select 3. From the Template Selection select Architectural Design Millimeters. 2. The Welcome to SketchUp
More informationHELPING THE DESIGN OF MIXED SYSTEMS
HELPING THE DESIGN OF MIXED SYSTEMS Céline Coutrix Grenoble Informatics Laboratory (LIG) University of Grenoble 1, France Abstract Several interaction paradigms are considered in pervasive computing environments.
More informationHumanoid robot. Honda's ASIMO, an example of a humanoid robot
Humanoid robot Honda's ASIMO, an example of a humanoid robot A humanoid robot is a robot with its overall appearance based on that of the human body, allowing interaction with made-for-human tools or environments.
More informationGaze-controlled Driving
Gaze-controlled Driving Martin Tall John Paulin Hansen IT University of Copenhagen IT University of Copenhagen 2300 Copenhagen, Denmark 2300 Copenhagen, Denmark info@martintall.com paulin@itu.dk Alexandre
More informationZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field
ZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field Figure 1 Zero-thickness visual hull sensing with ZeroTouch. Copyright is held by the author/owner(s). CHI 2011, May 7 12, 2011, Vancouver, BC,
More informationMarkerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces
Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Huidong Bai The HIT Lab NZ, University of Canterbury, Christchurch, 8041 New Zealand huidong.bai@pg.canterbury.ac.nz Lei
More informationVEHICLE LICENSE PLATE DETECTION ALGORITHM BASED ON STATISTICAL CHARACTERISTICS IN HSI COLOR MODEL
VEHICLE LICENSE PLATE DETECTION ALGORITHM BASED ON STATISTICAL CHARACTERISTICS IN HSI COLOR MODEL Instructor : Dr. K. R. Rao Presented by: Prasanna Venkatesh Palani (1000660520) prasannaven.palani@mavs.uta.edu
More informationIntegration of Hand Gesture and Multi Touch Gesture with Glove Type Device
2016 4th Intl Conf on Applied Computing and Information Technology/3rd Intl Conf on Computational Science/Intelligence and Applied Informatics/1st Intl Conf on Big Data, Cloud Computing, Data Science &
More informationENHANCED HUMAN-AGENT INTERACTION: AUGMENTING INTERACTION MODELS WITH EMBODIED AGENTS BY SERAFIN BENTO. MASTER OF SCIENCE in INFORMATION SYSTEMS
BY SERAFIN BENTO MASTER OF SCIENCE in INFORMATION SYSTEMS Edmonton, Alberta September, 2015 ABSTRACT The popularity of software agents demands for more comprehensive HAI design processes. The outcome of
More informationComparison of Haptic and Non-Speech Audio Feedback
Comparison of Haptic and Non-Speech Audio Feedback Cagatay Goncu 1 and Kim Marriott 1 Monash University, Mebourne, Australia, cagatay.goncu@monash.edu, kim.marriott@monash.edu Abstract. We report a usability
More informationDirection-of-Arrival Estimation Using a Microphone Array with the Multichannel Cross-Correlation Method
Direction-of-Arrival Estimation Using a Microphone Array with the Multichannel Cross-Correlation Method Udo Klein, Member, IEEE, and TrInh Qu6c VO School of Electrical Engineering, International University,
More informationEvaluating Visual/Motor Co-location in Fish-Tank Virtual Reality
Evaluating Visual/Motor Co-location in Fish-Tank Virtual Reality Robert J. Teather, Robert S. Allison, Wolfgang Stuerzlinger Department of Computer Science & Engineering York University Toronto, Canada
More informationInvestigating the Electromechanical Coupling in Piezoelectric Actuator Drive Motor Under Heavy Load
Investigating the Electromechanical Coupling in Piezoelectric Actuator Drive Motor Under Heavy Load Tiberiu-Gabriel Zsurzsan, Michael A.E. Andersen, Zhe Zhang, Nils A. Andersen DTU Electrical Engineering
More informationTED TED. τfac τpt. A intensity. B intensity A facilitation voltage Vfac. A direction voltage Vright. A output current Iout. Vfac. Vright. Vleft.
Real-Time Analog VLSI Sensors for 2-D Direction of Motion Rainer A. Deutschmann ;2, Charles M. Higgins 2 and Christof Koch 2 Technische Universitat, Munchen 2 California Institute of Technology Pasadena,
More informationStep Response of RC Circuits
EE 233 Laboratory-1 Step Response of RC Circuits 1 Objectives Measure the internal resistance of a signal source (eg an arbitrary waveform generator) Measure the output waveform of simple RC circuits excited
More informationGestureCommander: Continuous Touch-based Gesture Prediction
GestureCommander: Continuous Touch-based Gesture Prediction George Lucchese george lucchese@tamu.edu Jimmy Ho jimmyho@tamu.edu Tracy Hammond hammond@cs.tamu.edu Martin Field martin.field@gmail.com Ricardo
More informationAgilEye Manual Version 2.0 February 28, 2007
AgilEye Manual Version 2.0 February 28, 2007 1717 Louisiana NE Suite 202 Albuquerque, NM 87110 (505) 268-4742 support@agiloptics.com 2 (505) 268-4742 v. 2.0 February 07, 2007 3 Introduction AgilEye Wavefront
More informationThe Representational Effect in Complex Systems: A Distributed Representation Approach
1 The Representational Effect in Complex Systems: A Distributed Representation Approach Johnny Chuah (chuah.5@osu.edu) The Ohio State University 204 Lazenby Hall, 1827 Neil Avenue, Columbus, OH 43210,
More informationExpression of 2DOF Fingertip Traction with 1DOF Lateral Skin Stretch
Expression of 2DOF Fingertip Traction with 1DOF Lateral Skin Stretch Vibol Yem 1, Mai Shibahara 2, Katsunari Sato 2, Hiroyuki Kajimoto 1 1 The University of Electro-Communications, Tokyo, Japan 2 Nara
More informationCharacterization of Train-Track Interactions based on Axle Box Acceleration Measurements for Normal Track and Turnout Passages
Porto, Portugal, 30 June - 2 July 2014 A. Cunha, E. Caetano, P. Ribeiro, G. Müller (eds.) ISSN: 2311-9020; ISBN: 978-972-752-165-4 Characterization of Train-Track Interactions based on Axle Box Acceleration
More informationEECS 4441 Human-Computer Interaction
EECS 4441 Human-Computer Interaction Topic #1:Historical Perspective I. Scott MacKenzie York University, Canada Significant Event Timeline Significant Event Timeline As We May Think Vannevar Bush (1945)
More informationSKF TKTI. Thermal Camera Software. Instructions for use
SKF TKTI Thermal Camera Software Instructions for use Table of contents 1. Introduction...4 1.1 Installing and starting the Software... 5 2. Usage Notes...6 3. Image Properties...7 3.1 Loading images
More informationHaptic Camera Manipulation: Extending the Camera In Hand Metaphor
Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Joan De Boeck, Karin Coninx Expertise Center for Digital Media Limburgs Universitair Centrum Wetenschapspark 2, B-3590 Diepenbeek, Belgium
More informationSpace Mouse - Hand movement and gesture recognition using Leap Motion Controller
International Journal of Scientific and Research Publications, Volume 7, Issue 12, December 2017 322 Space Mouse - Hand movement and gesture recognition using Leap Motion Controller Nifal M.N.M, Logine.T,
More informationEECS 4441 / CSE5351 Human-Computer Interaction. Topic #1 Historical Perspective
EECS 4441 / CSE5351 Human-Computer Interaction Topic #1 Historical Perspective I. Scott MacKenzie York University, Canada 1 Significant Event Timeline 2 1 Significant Event Timeline 3 As We May Think Vannevar
More informationDevelopment of a telepresence agent
Author: Chung-Chen Tsai, Yeh-Liang Hsu (2001-04-06); recommended: Yeh-Liang Hsu (2001-04-06); last updated: Yeh-Liang Hsu (2004-03-23). Note: This paper was first presented at. The revised paper was presented
More informationPaint with Your Voice: An Interactive, Sonic Installation
Paint with Your Voice: An Interactive, Sonic Installation Benjamin Böhm 1 benboehm86@gmail.com Julian Hermann 1 julian.hermann@img.fh-mainz.de Tim Rizzo 1 tim.rizzo@img.fh-mainz.de Anja Stöffler 1 anja.stoeffler@img.fh-mainz.de
More information