3D Interactions with a Passive Deformable Haptic Glove


Thuong N. Hoang, Wearable Computer Lab, University of South Australia, 1 Mawson Lakes Blvd, Mawson Lakes, SA 5010, Australia. ngocthuong@gmail.com
Ross T. Smith, Wearable Computer Lab, University of South Australia, 1 Mawson Lakes Blvd, Mawson Lakes, SA 5010, Australia. ross.smith@unisa.edu.au
Bruce H. Thomas, Wearable Computer Lab, University of South Australia, 1 Mawson Lakes Blvd, Mawson Lakes, SA 5010, Australia. bruce.thomas@unisa.edu.au

Abstract
This paper explores enhancing mobile immersive augmented reality manipulations by providing a sense of computer-captured touch through a passive deformable haptic glove that responds to objects in the physical environment. The glove extends our existing pinch glove design with a Digital Foam sensor placed under the palm of the hand. The novel glove input device supports a range of touch-activated, precise, direct manipulation modeling techniques with tactile feedback, including hole cutting, trench cutting, and chamfer creation. A user evaluation study comparing an image plane approach to our passive deformable haptic glove showed that the glove improves a user's task performance time, decreases error rate and erroneous hand movements, and reduces fatigue.

Author Keywords
Passive Haptics, Augmented Reality, Pinch Gloves, Input Device, Interaction Technique.

ACM Classification Keywords
H.5.2 [Information Interfaces and Presentation]: Graphical User Interfaces - Input Devices and Strategies; [Computer Graphics]: Methodology and Techniques - Interaction Techniques.

Extended Abstracts of the IEEE International Symposium on Mixed and Augmented Reality 2013 (Science and Technology Proceedings), 1-4 October 2013, Adelaide, SA, Australia. © 2013 IEEE.

Figure 1. The new Passive Deformable Haptic glove.
Figure 2. The original ultrasonic glove used as the platform to build the PDH glove.

Introduction
We have been investigating new precise methods for 3D data manipulation in both augmented and virtual worlds with an ultrasonic glove [1] (see Figure 2). This paper explores the use of a passive deformable haptic (PDH) glove (see Figure 1) that captures data when a user touches physical objects and applies a force, in order to develop touch-based mobile augmented reality (AR) interaction techniques. Our goal is to improve precision and accuracy and to reduce fatigue by leveraging the benefits of a deformable material. Deformable materials physically support users' hands and provide sensory feedback on real-world objects. PDHs are non-rigid objects that change shape when users apply a force, and they provide a repelling, spring-like force feedback [2-4]. We are interested in attaching a PDH device to a user's palm as a one-dimensional distance sensor that provides tactile feedback to the user, improving their spatial understanding and control of depth manipulations. In particular, we wish to support the interactive creation of virtual features that are cut into, or extrude from, the surface of a physical object, with or without prior knowledge of the physical environment. Without known geometric dimensions of physical objects, virtual object registration is achieved through direct touch and global 6DOF tracking of the user.

Our PDH glove uses the Digital Foam sensor [4] as the PDH material. The Digital Foam sensor employs conductive foam whose resistance changes when it is deformed. Previously, Digital Foam has been applied as a covering for physical objects to allow clay-like interactions. Our new device attaches it to a glove worn on the user's hand. This glove-mounted sensor allows the user to perform touch-based interactions on a multitude of physical objects and surfaces, converting a stationary device into a mobile input device. A full description of the PDH glove is found in [5]. (A sketch of how such a foam reading could be mapped to a depth value is given after the related work.)

Related Work
Glove-based technologies capture real-time finger movements and gestures with high degrees of freedom. Immersion CyberGloves use bend sensors to measure joint angles and capture the finger pose. Pinch glove designs use fabric switches attached to the fingertips for command entry [6]. Piekarski and Thomas [7] extended the pinch gloves with an additional switch in the palm for menu control. Hoang and Thomas developed an ultrasonic glove-based input device for distance-based manipulation techniques [1]. Passive haptics have been employed in virtual and augmented environments to assist with realism and improve immersion [8]. A study by Viciana-Abad et al. [2] demonstrated that passive haptic feedback improves task performance with reduced errors: a table-mounted sheet of soft foam rubber supported pointing gestures made with the fingers or a stylus, while in the non-haptic condition participants performed the task by stretching out their hands in mid-air. Kohli [9] explored a deployable substrate used with an AR system for military training. That work warps the augmented models so that the physical and virtual systems do not align exactly, extending the deployable substrate to a range of virtual content. These passive haptic examples have not attached the soft materials directly to the user's body to enhance the interaction experience.
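As a rough sketch of how the glove's single depth channel could be read, the following Python snippet maps a conductive-foam reading to a cutting depth. It is illustrative only: the serial port name, foam thickness, assumed linear response after per-user calibration, and the scale factor are assumptions, not details from the paper.

```python
# Illustrative only: reading a conductive-foam deformation value over serial and
# mapping it to a cutting depth. Port name, foam thickness, and the linear
# response model are assumptions, not the authors' implementation.
import serial  # pyserial

FOAM_THICKNESS_MM = 20.0  # assumed undeformed foam thickness
SCALE = 1.0               # scale factor trading manipulation range for resolution

def read_adc(port: serial.Serial) -> int:
    """Read one newline-terminated ADC sample from the sensor board."""
    return int(port.readline().strip())

def deformation_mm(raw: int, rest_raw: int) -> float:
    """Millimetres of foam compression, assuming a roughly linear response
    between the untouched reading (rest_raw) and full compression."""
    frac = max(0.0, min(1.0, (rest_raw - raw) / max(1, rest_raw)))
    return frac * FOAM_THICKNESS_MM

def cutting_depth_mm(raw: int, rest_raw: int) -> float:
    """Scaled depth value used to drive the virtual cutter."""
    return SCALE * deformation_mm(raw, rest_raw)

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:  # assumed port
        rest = read_adc(port)  # calibrate with the foam untouched
        while True:
            print(f"depth: {cutting_depth_mm(read_adc(port), rest):.1f} mm")
```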

Figure 3. Transformation matrix of the PDH technique. The delta distance is controlled by the Passive Deformable Haptic glove to provide greater manipulation control.
Figure 4. Door cut-out model added to the wall, with the user (artificially overlaid) performing the task.

Passive Deformable Haptic Glove Techniques
We present a set of interaction techniques enabled by the PDH glove, which provides an additional 1DOF of depth information. Placing the foam on the palm facilitates direct manipulation with tactile feedback and PDH support to reduce hand movements. We use existing tracking techniques, outlined in Figure 3, to determine the 6DOF pose of the glove. Combining the user's position transformation matrix, T_head, with the relative position of the hand, T_head-hand, determines the location of the glove. The Digital Foam sensor provides tracking of the delta distance, and an additional orientation sensor can provide tracking of the rotation matrix R. We placed a fiducial marker on the glove to calculate the relative position of the glove to the user, using the image plane technique. (Sketches of this pose composition and of a cutting operation are given after the technique descriptions below.) Our three techniques (hole cutting, trench cutting, and chamfer) perform cutting and carving operations on existing models using a variety of cutter models, such as a cylinder, prism, or plane. The techniques operate with and without existing models, and extrusion can be performed by inverting the cutting operations.

Dwell Activation
The PDH glove enables a natural and intuitive activation mechanism for modeless interaction. The user starts an operation by slightly depressing the Digital Foam sensor against the physical surface. The operation is committed using a dwell technique, in which the user maintains a constant deformation of the sensor for a period of time. The user aborts simply by removing their hand from the surface, which resets and cancels the dwell-time activation.

Hole Cutting
The PDH glove supports cutting a predefined shape to a given depth in an existing object. The glove enables a modeless hole-cutting technique: the user simply walks up to the surface and 'punches' the device at the desired location, and the initial deformation of the foam triggers the process. In addition to refining existing models, cutter models can also be created as stand-alone virtual objects in an unprepared outdoor AR environment. For example, the user can create virtual doorways on a building site, as shown in Figure 4. Adding an orientation sensor to the PDH glove would enable cutting at different angles.

Chamfer
The user can manipulate virtual objects with a chamfer (also known as a bevel) operation by pressing the foam against the corner edges of objects (see Figure 5 for the resulting image). The deformation of the Digital Foam determines the depth of the chamfer. With an orientation sensor attached to the glove, the user can cut chamfers at different angles and with different profiles (round, square). When a model of the physical object exists, the technique cuts directly into the model. Without an existing virtual model, the chamfer technique loads the prism cutter model at the hand location, changing the appearance so that the edges look as if they have been chamfered away.

Trench Cutting
The user can move their hand across the physical surface to carve trenches into a virtually aligned surface. The user can create trench waves of different depths by varying the pressure (see Figure 7).
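A minimal sketch of the pose composition outlined in Figure 3, assuming all poses are expressed as 4x4 homogeneous matrices and that the foam offset acts along the glove's local -z axis (an assumption; the axis convention is not stated in the paper). T_head and T_head_hand stand in for the tracker and fiducial-marker poses.

```python
# Sketch of the Figure 3 pose composition with placeholder 4x4 matrices.
import numpy as np

def translation(x: float, y: float, z: float) -> np.ndarray:
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def cutter_pose(T_head: np.ndarray,
                T_head_hand: np.ndarray,
                delta_distance_mm: float,
                R_palm: np.ndarray = np.eye(4)) -> np.ndarray:
    """World pose of the cutter tip.

    T_head            : world pose of the user's head (global 6DOF tracking)
    T_head_hand       : pose of the glove relative to the head (fiducial marker)
    delta_distance_mm : foam compression, pushed along the assumed palm -z axis
    R_palm            : optional orientation from an added sensor (identity if absent)
    """
    T_glove = T_head @ T_head_hand                                  # glove in world space
    T_offset = translation(0.0, 0.0, -delta_distance_mm / 1000.0)   # metres
    return T_glove @ R_palm @ T_offset

# Example with made-up values: head 1.6 m up, hand 0.4 m in front of the head,
# foam compressed by 12 mm.
pose = cutter_pose(translation(0, 1.6, 0), translation(0, 0, -0.4), 12.0)
print(pose[:3, 3])  # cutter tip position in world coordinates
```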
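For the cutting operations themselves, one plausible realisation is a mesh boolean difference between the touched object's model and a cutter solid placed at the glove pose and sunk to the foam-controlled depth. The sketch below uses the trimesh library (a boolean backend such as Blender or manifold3d must be installed); the wall and cutter dimensions are made-up example values, not the authors' implementation.

```python
# Illustrative hole-cutting sketch: subtract a cylinder cutter, placed at the
# glove pose and sunk to the foam-controlled depth, from the surface model.
import numpy as np
import trimesh

def cut_hole(surface: trimesh.Trimesh,
             cutter_pose: np.ndarray,
             radius_m: float,
             depth_m: float) -> trimesh.Trimesh:
    """Return the surface mesh with a cylindrical hole cut at cutter_pose.
    Extrusion could be obtained by swapping the difference for a union."""
    cutter = trimesh.creation.cylinder(radius=radius_m, height=depth_m)
    sink = np.eye(4)
    sink[2, 3] = -depth_m / 2.0          # sink the cutter into the surface
    cutter.apply_transform(cutter_pose @ sink)
    return surface.difference(cutter)    # requires a boolean engine

if __name__ == "__main__":
    wall = trimesh.creation.box(extents=(2.0, 2.0, 0.1))  # placeholder wall model
    pose = np.eye(4)                                       # glove pose on the wall
    result = cut_hole(wall, pose, radius_m=0.05, depth_m=0.05)
    print(result.is_watertight, len(result.faces))
```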

Figure 5. The chamfer technique is performed by directly touching the surface edges during manipulation.
Figure 6. Mean (SD) time values for the two techniques (in seconds).
Figure 7. Trench cutting techniques on a solid surface.

A practical example of this technique is a spinning clay wheel used to mold various organic shapes. Variable-depth trenches can be carved out to create complex shapes, with cutter objects of different shapes and sizes. Coupled with an orientation sensor mounted on the back of the glove, the user can carve different trench shapes by treading the hand at various angles on the surface to model organic forms.

User Evaluation
We compared the PDH glove with an existing image plane technique, one of the common techniques for mobile AR systems, on a hole-cutting task. Our hypotheses were that the PDH glove improves task performance in terms of:
H1. Reduced overall completion and depth time.
H2. Reduced number of failed attempts.
H3. Reduced erroneous hand movement.
H4. Reduced fatigue.
There were two independent variables: technique (PDH foam glove technique or existing image plane technique [7] (IPT)) and depth of task (5mm, 15mm, or 25mm). This is a 2x3 repeated measures design.

Design
The participant was required to complete hole-cutting tasks in three steps: 1) positioning the virtual cylinder on the surface, 2) moving the cylinder down to the required depth, and 3) holding for two seconds (dwell-time activation) to complete the task. We measured the time to complete step 1 as homing time, step 2 as depth time, and the total time to complete the task. The tasks were performed on a flat surface at the torso height of a seated participant (see Figure 8). The three separate task locations were placed both close to the body and at full arm's reach. Both the PDH and IPT techniques employed the OptiTrack 6DOF tracking system on the surface, to remove the imprecise nature of the IPT technique. The end of step 1 was signaled in the IPT condition with a keyboard press, to remove errors and speed considerations with novice users. In the second step, the IPT condition used the OptiTrack, with participants adjusting their hand position up and down in mid-air, and the PDH condition used the Digital Foam. Both conditions supported comparable sub-millimeter resolution tracking. Participants were required to hold the foam depression (PDH condition) or their hand position in mid-air (IPT condition) for two seconds within 2mm of the required depth to complete the task, referred to as the dwell time. The dwell time was used to enable modeless interaction. In the IPT condition, the participant's hand was not supported by any means at any time. We recorded the number of times the timer reset, when the hand was out of the depth range, as an indication of failed attempts. We captured hand location through the OptiTrack and Digital Foam as erroneous hand movement, based on variance in the vertical axis only.
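A minimal sketch of the dwell-time completion logic described above: the operation commits once the depth has stayed within a tolerance band of the target for the hold period, and each time the depth leaves the band the timer resets and a failed attempt is counted. The 2-second hold and 2mm tolerance match the study description; the class itself is illustrative, not the authors' code.

```python
# Illustrative dwell-time completion logic (hold within tolerance for hold_s).
import time

class DwellActivation:
    def __init__(self, target_mm: float, tolerance_mm: float = 2.0,
                 hold_s: float = 2.0):
        self.target_mm = target_mm
        self.tolerance_mm = tolerance_mm
        self.hold_s = hold_s
        self.failed_attempts = 0
        self._entered_at = None  # time the depth entered the tolerance band

    def update(self, depth_mm: float, now: float = None) -> bool:
        """Feed the current depth; returns True once the dwell completes."""
        now = time.monotonic() if now is None else now
        in_band = abs(depth_mm - self.target_mm) <= self.tolerance_mm
        if in_band:
            if self._entered_at is None:
                self._entered_at = now
            return now - self._entered_at >= self.hold_s
        if self._entered_at is not None:
            self.failed_attempts += 1  # left the band: timer reset counts as a failure
            self._entered_at = None
        return False
```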

Table 1. Mean (SD) number of failed attempts across techniques, by depth.
  5mm: (2.68)   15mm: (2.49)   25mm: (2.16)
Table 2. Mean (SD) of hand movement during the successful dwell time (mm), measured by OptiTrack and Digital Foam, across the two techniques, by depth.
  5mm: (0.0369)   15mm: (0.0402)   25mm: (0.0530)
Table 3. Mean (SD) of hand movement during the dwell time between the two techniques (mm), measured by OptiTrack and Digital Foam.
  PDH: (0.0313)   IPT: (0.0408)

Participants completed a nine-task block, repeated twice for each condition, in randomized order with rests in between. Afterwards, participants answered a questionnaire regarding the level of fatigue felt, how easy and intuitive the techniques were to use and to reach the task goal, and their perceived precision. Responses were recorded on a visual analogue scale.

Results
There were 20 participants (18 males, 2 females, mean age years, SD 5.19) from the University of South Australia and the general public. We performed a two-way repeated measures ANOVA over the measures related to the hypotheses: time (total, homing, depth), failed attempts, and mean erroneous hand movement during the final two-second dwell time. For each measure we analysed the effect of technique, the effect of depth, and the interaction between technique and depth (see the analysis sketch below). Mauchly's test for sphericity was not violated. Table 4 outlines the mean and SD of the results across all conditions (two techniques x three depths).

TIME
Figure 6 charts the results of the time analysis. For total time, the PDH technique was significantly faster, F(1,19)=8.90, p<0.01. For homing time, no significant effect was found across all tests. There was a significant effect of technique on depth time, F(1,19)=12.05, p<0.01. Overall, H1 was supported. There was no significant effect of depth, nor a significant interaction between technique and depth.

ERRORS
The PDH had a significantly lower mean number of failed attempts (SD 1.14) than the IPT (SD 3.03), F(1,19)=10.56, p<0.01. H2 was supported. There was a significant effect of task depth, F(2,38)=12.54, p<0.001 (see Table 1). Post-hoc analysis (with Tukey adjustments) showed a significant effect (p<0.001) between depths 5mm-15mm and 5mm-25mm only. Participants found the 5mm task more difficult, with more restarts. There was a significant interaction between technique and depth, F(2,38)=4.87, p<0.05.

HAND MOVEMENT DURING SUCCESSFUL DWELL TIME
The PDH (recorded by Digital Foam) had a significant advantage over the IPT (recorded by OptiTrack) for mean hand movement during the successful dwell time, F(1,19)=46.44, p<0.001 (see Table 3). H3 was supported. There was a significant effect of task depth, F(2,38)=50.71, p<0.01 (see Table 2). Post-hoc analysis (with Tukey adjustments) showed a significant effect (p<0.001) between all depth pairs. There was a significant interaction between technique and depth, F(2,38)=15.20, p<0.001. The deepest task assisted participants in steadying their hands.

QUESTIONNAIRE
A Wilcoxon signed-rank test showed a significant positive effect for the PDH for reduced arm fatigue, ease of reaching the task goal, and perceived precision in completing the task. H4 was supported.
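For readers wanting to reproduce this style of analysis on their own logs, a 2x3 repeated-measures ANOVA of the kind reported here can be run with statsmodels. The CSV layout and column names below are assumptions about how such data might be organised, not the authors' files.

```python
# Sketch of the 2 (technique) x 3 (depth) repeated-measures analysis using
# statsmodels. Assumes one row per participant x technique x depth repetition,
# with one column per dependent measure; the column names are hypothetical.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

data = pd.read_csv("study_log.csv")  # columns: participant, technique, depth, + measures

for measure in ["total_time", "homing_time", "depth_time",
                "failed_attempts", "hand_movement"]:
    fit = AnovaRM(data, depvar=measure, subject="participant",
                  within=["technique", "depth"],
                  aggregate_func="mean").fit()
    print(measure)
    print(fit.anova_table)  # F and p for technique, depth, and their interaction
```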

Conclusion
We have presented the Passive Deformable Haptic glove with a Digital Foam sensor to support precise, direct-touch manipulation modeling techniques. Our technique allows arbitrary physical objects to be modified with virtual information through direct touch, without prior knowledge of the physical geometry. The tactile feedback provided by the glove is demonstrated with interaction techniques (including hole cutting, trench cutting, and chamfer) with modeless activation and dwell-time completion. The results of our study showed that the PDH significantly improved task completion time, decreased the error rate and erroneous hand movement, and reduced fatigue. There was no significant difference between the PDH and IPT homing times, an indication that existing techniques can be used with the PDH glove with no reduced effect. There was no significant difference in task time between the depth levels; therefore, our PDH glove techniques are applicable across the range of depths supported by the Digital Foam sensor.

One limitation is that the amount of pressure required to depress the foam may be too soft or too hard for some users. Changing the size and density of the foam material for each user can help overcome this. The thickness of the foam sensor also limits the range of manipulation distance; a thicker Digital Foam sensor with lower density can be used to increase the range of movement, and a scaling factor can be applied to the mapping. There is a trade-off between range and resolution, depending on the requirements of the task.

In the future we would like to extend the sensor on our PDH glove to other parts of the user's hand, such as the fingertips and the edge of the hand, and to explore the combination of multiple sensors for more complex interactions. We would also like to explore other AR display technologies, such as projectors in spatial AR. Finally, we would like to explore combining our PDH sensor with our ultrasonic glove sensor, to provide a range of distance sensors to support the development of new interaction techniques.

Table 4. Results across all techniques, mean (SD).
  Total time (s): 6.79 (2.93)
  Homing time (s): 3.09 (0.89)
  Depth time (s): 3.70 (2.60)
  Failed attempts: 2.36 (2.48)
  Hand movement during successful dwell time (OptiTrack & Digital Foam) (mm): (0.0458)
Figure 8. Study set-up.

References
[1] T. N. Hoang and B. H. Thomas, "Distance-based modeling and manipulation techniques using ultrasonic gloves," in ISMAR 2012.
[2] R. Viciana-Abad, A. R. Lecuona, and M. Poyade, "The influence of passive haptic feedback and difference interaction metaphors on presence and task performance," Presence: Teleoperators and Virtual Environments, vol. 19.
[3] F. Vogt, T. Chen, R. Hoskinson, and S. Fels, "A malleable surface touch interface," ACM SIGGRAPH.
[4] R. T. Smith, B. H. Thomas, and W. Piekarski, "Digital foam interaction techniques for 3D modeling," in VRST 2008.
[5] T. N. Hoang, R. T. Smith, and B. H. Thomas, "Passive Deformable Haptic Glove to Support 3D Interactions in Mobile Augmented Reality Environments," ISMAR.
[6] D. Bowman, C. Wingrave, J. Campbell, and V. Ly, "Using pinch gloves for both natural and abstract interaction techniques in virtual environments," in Proc. HCI International 2001.
[7] B. H. Thomas and W. Piekarski, "Glove Based User Interaction Techniques for AR in an Outdoor Environment," Virtual Reality, vol. 6.
[8] K. Hinckley, R. Pausch, J. C. Goble, and N. F. Kassell, "Passive real-world interface props for neurosurgical visualization," in Proc. SIGCHI Conference on Human Factors in Computing Systems: Celebrating Interdependence.
[9] L. Kohli, "Redirected touching: Warping space to remap passive haptics," in 3DUI 2010.
