Ergonomic Design and Evaluation of a Free-Hand Pointing Technique for a Stereoscopic Desktop Virtual Environment


Ronald Meyer a, Jennifer Bützler a, Jeronimo Dzaack b and Christopher M. Schlick a

a Chair and Institute of Industrial Engineering and Ergonomics, RWTH Aachen University, Bergdriesch 27, Aachen, GERMANY; b ATLAS ELEKTRONIK GmbH, Sebaldsbrücker Heerstr. 235, Bremen, GERMANY

Using a virtual hand model as a pointing technique in a stereoscopic desktop environment gives users natural control over entities in the virtual scene with a low risk of visual conflicts, because the hand is operated at an offset from the display volume. Besides direct control over the virtual hand, the visualization of the model plays an important role: a visually complex model can occlude virtual entities of interest in the scene. A Fitts' pointing task is used to investigate distantly controlled virtual hand models in a stereoscopic desktop environment. In a factorial design we compare hand models whose visual information is reduced step by step, down to a point cloud model, all implemented with the Leap Motion Controller. Results show that the graphical representation of a hand model must not be reduced too far: movement times with a hand model of interconnected lines are significantly shorter than with a point cloud model.

Practitioner Summary: Different visual representations of virtual hand models implemented with the Leap Motion Controller are investigated in a stereoscopic desktop environment. The representations successively reduce the visual information of the virtual hand down to a point cloud model. Results show that a hand model visualized with interconnected lines performs significantly better than a point cloud model.
Keywords: 3D User Interface, Natural User Interaction, Fitts' Pointing Task, Freehand Interaction, Desktop VR

1 Introduction

The variety of applications using virtual and augmented reality technology in industry and training is increasing as the cost of the corresponding equipment falls and its availability on the mass market grows. Immersive virtual reality hardware for desktop environments is not only intended for digital gaming but also for commercial applications, especially where spatial relations between entities in three dimensions are of importance. The advantages of three-dimensional visualization and the resulting depth perception through view-channel disparity are apparent, for example, in realistically scaled representations of spatial sensor data. Technologies that display spatial data stereoscopically and allow interaction with it can therefore be useful in various areas such as magnetic resonance imaging, visualization of radar data for air traffic control, or the mapping of bathymetric data and sonar imagery in a three-dimensional geographic mapping environment. Scenarios for the application of underwater imagery in a stereoscopic desktop environment have been developed by Meyer et al. (2014). Spatial interaction requires precise and robust methods for navigation and manipulation in three rotatory and three translational degrees of freedom. Input devices offering three translational degrees of freedom are available on the market as 3D mice, data gloves or the recently released Leap Motion Controller. The Leap Motion Controller's output parameters support control and visualization of a virtual hand representation whose motion behavior and scale precisely match the user's real hand, facilitating a natural user experience and gestural interaction (Apostolellis et al., 2014).
Usually a free-moving 6-DOF input method such as hand tracking, in conjunction with a virtual hand model, provides speed and short learning times, but trades these off against desktop devices like 6-DOF mice, which provide comfort, precise trajectory control and coordination, as noted by Bowman et al. (2006).

Hence, a virtual hand model gives the user direct control over virtual objects in a virtual environment and qualifies for quick interaction with a scene. Our hypothesis is that visualizing the kinematics of a hand model, i.e. interconnected lines between the finger joints, is sufficient to produce the visual effect of the user's own hand in a virtual system, which we ascribe to the mirror neuron system of the human brain when perceiving this model.

1.1 Virtual Hand Models

Teather and Stuerzlinger (2011) classified input methods for virtual environments into two broad categories: virtual hand or depth cursors, and ray-based techniques. Ray-based pointing in virtual environments is considered intuitive but lacks precision when users must handle small objects. Furthermore, ray-cast pointing impedes reliable selection of partly occluded objects (Bowman et al., 2001). Since virtual environments are meant to give the user a sense of presence, i.e. of being part of the virtual environment, a virtual representation of the user's hand increases that feeling. Yuan and Steed (2010) found evidence for an immersive connection between the user and a virtual hand by conducting the rubber hand illusion experiment (Botvinick & Cohen, 1998) in a fully immersive virtual environment, where a haptic stimulus is applied to the subject's real hand while the same stimulus is applied to the virtual hand in parallel. Accordingly, the use of a virtual hand for interaction in virtual worlds has received considerable research attention over the past years. The complexity of the visual representation of the virtual hand technique has increased along with more detailed capabilities for tracking hand and arm motion.
A frequently used tracking device is the data glove, the predominant way of tracking hand motion for virtual environments (Bowman, 2005); it works with inertial tracking of the user's motion. Most data gloves operate without visual tracking, making them robust against occlusion of tracked hand positions. However, the discomfort of wearing a data glove disqualifies it for working environments where the user does not need their hands solely for interaction with the virtual environment. A visual tracking system may confront the user with occlusion but offers higher comfort. The Leap Motion Controller, a visual hand tracking system equipped with an infrared stereo camera, is capable of tracking a model of the human hand with sub-millimeter accuracy (Weichert et al., 2013).

1.2 3D Stereoscopic Visualizations of Spatial Data

Stereoscopic visualization in desktop environments allows the perception of virtual depth for the analysis and cognition of spatial data sets. Wittmann et al. (2011) conducted an experiment comparing traditional air traffic controller workplaces with ones equipped with stereoscopic visualization. Seven air traffic controllers from Deutsche Flugsicherung GmbH participated in the study, as well as nine non-professional subjects. Results indicated a slightly, but not significantly, better outcome with the stereoscopic visualization for the group of air traffic controllers, whereas the group of non-professionals showed a preference and better performance for the stereoscopic visualization. Kockro et al. (2013) compared a stereoscopic display system for planning and navigating surgery on patient data previously recorded through magnetic resonance imaging with a classical standard display system. Results show a clear preference for the stereoscopic visualization, which also includes a virtual tool rack for interacting with the data.
The improved depth perception aids the understanding of manipulations and of the surgical strategy while reducing guesswork (e.g. guessing the position of organs) during surgery, and thus promises better surgical performance. Stereoscopic displays therefore have benefits in perceiving depth information and provide a better understanding of spatial relationships. Consumer stereoscopic displays require passive or active glasses. Display systems with passive light filters work with orthogonally polarized light and corresponding light filters, which are integrated into the odd and even pixel lines of the display unit; the glasses carry the matching light filter per eye. Their advantages are low weight and battery-free operation. Active shutter glasses work by actively occluding each eye channel at a frequency of 60 hertz per eye, synchronized to a display that alternates the disparity channels at 120 hertz. They are heavier due to

included batteries and shutter technology, e.g. an infrared receiver that synchronizes the glasses with the refresh rate of the display. Both systems suffer from the vergence–accommodation conflict: the eyes converge on the virtual object of interest while accommodation remains on the actual display surface. This results in visual discomfort after a period of usage (Bracco et al., 2013). The symptoms of discomfort increase with techniques where real entities, e.g. the human hand, enter the stereoscopic display volume. Bruder et al. (2013) used a Fitts' pointing task to evaluate pointing performance in a setting where participants pointed at targets visualized in mid-air within the display volume of a stereoscopic tabletop display. Two selection conditions were investigated: selecting objects with the participant's real hand, and indirect selection with a distantly controlled hand image. In the tabletop setting the direct selection method was faster than the offset approach, but less precise than the distant selection method. Our present work aims at reducing the visual conflict in stereoscopic desktop environments through the application of distantly controlled hand models. While visualizations and sensors for virtual hand models in manipulation tasks have become more and more sophisticated over the past years, we argue that minimalistic model representations such as kinematic or point cloud models are sufficient for natural user interaction in virtual environments while keeping object occlusion to a minimum during interaction with virtual entities. Therefore, the research described in this paper focuses on evaluating pointing precision using freehand-controlled virtual hand models with three different graphical representations implemented with the Leap Motion Controller.
2 Method

An empirical study was conducted to investigate the pointing performance of differently visualized hand models used as a pointing technique for stereoscopic desktop environments. Virtual hand models were implemented in three states of decreasing visual complexity. Data was analyzed with respect to movement time in trivariate pointing movements. The pointing movement is based on a Fitts' pointing task.

2.1 Design

A repeated-measures within-participant full factorial design was used for the experiment. The type of hand model was the independent variable with three levels (M1, M2, M3). All three hand models are based on a skeletal representation and differ in their graphical representation with decreasing visibility (cf. figure 1).

Figure 1. Independent variables M1, M2 and M3 with decreasing visibility factors.

Finger tips and finger joints are designed as spherical objects. The index finger tip was reactive to collisions with active objects in the 3D scene. The movement time (MT) in milliseconds was analyzed as the dependent variable.

Target objects were designed as spheres of different sizes, varying in their three-dimensional position. Since target objects were located at trivariate positions, the Shannon formulation of Fitts' law was used as a model to determine an index of difficulty for three-dimensional objects:

MT = a + b * ID, where ID = log2(D/W + 1)    (1)

MT is the movement time; the factors a and b are empirically determined constants for a given pointing technique; D is the distance to the target; W is the target width; and ID is the index of difficulty measured in bits (MacKenzie, 1992; Teather & Stuerzlinger, 2011). The targets were aligned along circles around an initial object, with the circles lying on different depth levels (cf. figure 2) and the initial sphere in the middle serving as the starting object. All target objects were located in the upper half of the Cartesian space. The factor W (target width) used to calculate the index of difficulty of each target sphere is listed in table 1. Teather and Stuerzlinger (2011) validated the approach of using spherical targets rather than cylinder-shaped targets in three-dimensional environments, since spheres are the more natural 3D extension of 2D circles.

Figure 2. Schematic visualization of all target objects in the scene: five target objects on each of three depth layers along the circumference of the initial target.
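As an illustration, the Shannon index of difficulty and the visual angle of a target follow directly from the quantities above. The sketch below (Python; the target distance and width are made-up values, not taken from the study) shows both calculations, using the 750 mm viewing distance to which the setup was calibrated:

```python
import math

def fitts_id(distance_mm: float, width_mm: float) -> float:
    """Shannon formulation: ID = log2(D/W + 1), in bits."""
    return math.log2(distance_mm / width_mm + 1)

def size_in_arcmin(size_mm: float, viewing_distance_mm: float) -> float:
    """Visual angle subtended by a target of the given size, in arcminutes."""
    angle_rad = 2 * math.atan(size_mm / (2 * viewing_distance_mm))
    return math.degrees(angle_rad) * 60

# Hypothetical target: 140 mm from the start sphere, 20 mm wide.
print(round(fitts_id(140, 20), 2))          # 3.0 bits
# The largest sphere (20.5 mm) at 750 mm viewing distance subtends ~94 arcmin.
print(round(size_in_arcmin(20.5, 750), 1))  # 94.0
```

The second function is how perceived sizes such as those in table 1 can be derived from physical sizes and viewing distance.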

Table 1. Target sizes of the 15 IDs: actual size and size as perceived by the participant.

ID (bits):       3.0   3.2   3.3   3.3   3.4   3.4   3.4   3.4   3.6   3.6   3.6   3.7   3.8   4.0   4.1
Size (mm):      20.5  17.7  15.3  15.9  14.3  13.4  12.8  13.7  10.6  10.3  10.3   9.0   8.4   6.8   5.9
Size (arcmin):  94.2  81.3  69.9  72.8  65.6  61.4  58.5  62.8  48.5  47.1  47.1  41.4  38.5  31.4  27.1

The initial sphere, located in the middle of the interaction volume, and one target sphere were visible at the same time. The participants began the pointing movement with their hand lying on a hand contour drawn to the right of the interaction volume. Their task was to move the hand into the interaction volume, hit the initial target object in the stereoscopic display volume with the tip of the index finger, and then guide the index finger's tip to the target sphere. Active spheres were indicated by a bright green colour; inactive spheres were kept grey and translucent. Time was measured between hitting the two spheres.

2.2 Participants

A total of n = 14 right-handed participants took part in the experiment, ten male and four female, aged from 22 to 36 years (M = 27.2; SD = 5.8). The participants were unpaid volunteers who fulfilled the criteria of stereoscopic vision and validated visual acuity. One participant had to be removed from the sample due to strabismus, which led to difficulties in stereo vision during the main experiment and thus produced a high error rate in pointing at the virtual targets.

2.3 Apparatus

Hardware

The experiment was conducted on a 27-inch stereoscopic display with polarized light filters integrated on alternating horizontal pixel lines. The participants wore low-weight passive light filter glasses to separate the disparate images for each eye.
Participants wearing corrective glasses were provided with a clip-on light filter extension mounted on their own glasses. The viewing distance was calibrated to 750 mm before every permutation during the main experiment. A high-end laptop with dedicated NVIDIA K3000M graphics hardware served as the experimental platform, attached to an ASUS VG27AH display. At the beginning of the experiment the display height was adjusted to the participant's eye height to guarantee a perpendicular view of the display.

Software

The experimental system was implemented using VSG Open Inventor 9.4.NET in a Visual Studio .NET 4.5 programming environment. Standard spherical objects from the Open Inventor framework were used, as well as the standard collision model provided by the framework. There were no additional virtual objects in the scene except a blue gradient used as background.

2.4 Procedure

The participants began the experiment with a short survey concerning their experience with free-hand input methods and virtual environments, and a measurement of their visual acuity. Afterwards, participants completed a standardized figural spatial cognition test and a motoric test of the right hand using the SCHUHFRIED Vienna Test System series for fine motor skills. The pre-testing phase concluded with a simulator sickness questionnaire (SSQ) (Kennedy et al., 1993) to record the participants' well-being before the main experiment, which included wearing passive stereoscopic glasses. The participants were accommodated to the stereoscopic environment in a 2-minute experimental

phase by using one of the hand models before the main pointing task. The hand models were used in permutated order, with an SSQ and a Borg RPE (Borg, 1998) filled in after each set of pointing movements. Pointing tasks to each target object were performed three times; movement time was recorded, aggregated per ID and analyzed with an ANOVA with the significance level set to α =

3 Results

The motoric test for fine motor skills and the spatial cognition test were used to validate the sample with respect to fine motor and spatial cognition skills. The collected data from the main experiment was cleaned of outliers within participants based on movement time using box plot criteria: values with MT < Q1 − 1.5 × IQR or MT > Q3 + 1.5 × IQR were marked as outliers (Tukey, 1977).

3.1 Descriptive Analysis

Figure 3 shows a regression on the means for each of the hand models over the ID range (given in bits). A first interpretation of the movement times indicates the best performance for M2 (M = ; SD = 310.5), medium performance for M1 (M = ; SD = 244.7) and the worst performance for M3 (M = ; SD = 375.3).

Figure 3. Regression on the mean values of the three hand model levels.

Participants who performed well in the spatial cognition test showed shorter movement times in the Fitts' pointing task. The simulator sickness questionnaire did not indicate changes in visual discomfort or well-being during the experiment. Results of the Borg RPE indicate a slight, but not excessive, increase on the exertion scale.

Figure 4. Regression of movement time on ID for each hand model.
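The box-plot outlier criterion described above can be sketched in a few lines of Python. This is a minimal stdlib version with invented sample data; the exact quartile method used in the study is not stated:

```python
from statistics import quantiles

def remove_outliers(movement_times):
    """Keep only movement times inside Tukey's fences
    [Q1 - 1.5*IQR, Q3 + 1.5*IQR] (Tukey, 1977)."""
    q1, _, q3 = quantiles(movement_times, n=4)  # exclusive-method quartiles
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [mt for mt in movement_times if lo <= mt <= hi]

# A lone 3200 ms trial falls outside the fences and is dropped.
print(remove_outliers([810, 770, 930, 1010, 880, 3200, 950, 840]))
```

Applying the fences per participant, as in the study, simply means running this filter on each participant's movement times separately.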

Figure 4 shows the regression of movement time on the indices of difficulty for each of the hand models, including all movement times. The regression indicates that movement time differs slightly between the three levels, with increased variance in movement time.

3.2 Analysis of Variance

A Kolmogorov–Smirnov test indicated no significant deviations from a normal distribution. Mauchly's test indicated that the assumption of sphericity had not been violated, χ²(2) = 0.755. We found a significant main effect of hand model on movement time (F = 16.31; p = 0.000). A post-hoc pairwise comparison between the hand models showed a significant difference in movement time (MT) between M2 and M3 (t = 5.689). Movement times for M2 were significantly shorter than movement times for M3.

4 Discussion

The results show that movement times differ significantly between hand models M2 and M3. The poorer movement times for M3 may result from the lower recognizability of the model, as reported by some of the participants after the experiment. In contrast, the representation of a point cloud with interconnected lines allowed convenient and quick interaction when hitting the target spheres. Model M1 still produced higher movement times than hand model M2. The inferior movement time score of M3 can also be attributed to acclimatization issues that occurred while using this model. As expected, the standard deviation of the mean of each hand model increased as the visual information of the hand model was reduced. Given its reduced movement time, model M2 offers a balance between visual reduction and practical use.
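The post-hoc pairwise comparison reported in the results is, in essence, a paired t-test over per-participant movement times. A minimal sketch (Python; the movement times below are illustrative, not the study's measurements):

```python
import math
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic for two within-participant conditions:
    t = mean(d) / (sd(d) / sqrt(n)), with d the per-participant differences."""
    d = [x - y for x, y in zip(a, b)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

# Hypothetical mean MTs (ms) per participant for models M2 and M3.
mt_m2 = [1180, 1240, 1100, 1320]
mt_m3 = [1310, 1450, 1230, 1490]
print(round(paired_t(mt_m2, mt_m3), 3))  # -8.356: M3 consistently slower
```

A negative t here means the first condition (M2) is faster; the significance would then be read off a t distribution with n − 1 degrees of freedom.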
Observations during the main experiment indicated a different movement trajectory of the participants' hand motion when M3 was used as the hand model: participants appeared to have difficulties estimating the depth position of their virtual hand, as they frequently corrected its depth position before being able to hit the virtual targets. A recording of the trajectories was not part of the experimental design.

5 Conclusion and Future Work

We conducted an experiment using differently modeled virtual hands with a skeletal representation as a pointing technique in a stereoscopic desktop environment. A thin-line skeletal representation worked best in our approach, showing significantly better movement times than models with further reduced visual information. We observed participants correcting the depth position of their virtual hand when using the point cloud hand model; this depth correction apparently cost movement time. As no trajectories were recorded during the experiment, this could be considered for a future experiment. Depth corrections for pointing movements in stereoscopic display settings were analyzed by Song et al. (2014). Trajectories in pointing movements could be considered as an additional factor in a future study and serve further analysis of each hand model visualization. Since the standard Shannon formulation of Fitts' law was used for the design of the pointing task in a three-dimensional environment, a redesign of the task could consider a different calculation of the indices of difficulty that takes movement angles into account, as done by Vetter et al. (2011) for targets on large touch screens in 2D environments. For this, target positions and their angles must enter the ID calculation, which requires determining the target positions in spherical coordinates.
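Determining target positions in spherical coordinates, as such an angle-aware ID calculation would require, is a direct conversion from the Cartesian target positions. A possible helper (Python; the coordinate convention — azimuth in the x–y plane from +x, elevation toward +z — is our assumption, not the study's):

```python
import math

def to_spherical(x, y, z):
    """Return (radius, azimuth, elevation) for a Cartesian target position.
    Azimuth is measured in the x-y plane from +x; elevation from that plane
    toward +z, so targets in the upper half-space have elevation > 0."""
    r = math.sqrt(x * x + y * y + z * z)
    azimuth = math.atan2(y, x)
    elevation = math.asin(z / r) if r else 0.0
    return r, azimuth, elevation

# A target straight above the initial sphere lies at 90 degrees elevation.
print(to_spherical(0.0, 0.0, 1.0))  # (1.0, 0.0, 1.5707963267948966)
```

The movement angles between successive targets then follow from differences in azimuth and elevation.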
Effects of different target depths in the viewing direction were registered in this study but not analyzed; they could be featured in future work.

References

Apostolellis, P., Bortz, B., Peng, M., Polys, N., & Hoegh, A. (2014). Poster: Exploring the integrality and separability of the Leap Motion Controller for direct manipulation 3D interaction. In 3D User Interfaces (3DUI), 2014 IEEE Symposium on. IEEE.
Borg, G., Maibaum, S., Braun, M., & Jagomast, K. K. Borg's perceived exertion and pain scales. Deutsche Zeitschrift für Sportmedizin, 52(9).
Botvinick, M., & Cohen, J. (1998). Rubber hands 'feel' touch that eyes see. Nature, 391(6669).
Bowman, D. A., Chen, J., Wingrave, C. A., Lucas, J., Ray, A., Polys, N. F., et al. (2006). New directions in 3D user interfaces. The International Journal of Virtual Reality, 5(2).
Bowman, D. A., Kruijff, E., LaViola Jr, J. J., & Poupyrev, I. (2001). An introduction to 3-D user interface design. Presence: Teleoperators and Virtual Environments, 10(1).
Bracco, F., Chiorri, C., Glowinski, D., Hosseini Nia, B., & Vercelli, G. (2013). Investigating visual discomfort with 3D displays: The stereoscopic discomfort scale. In CHI '13 Extended Abstracts on Human Factors in Computing Systems. New York, NY, USA: ACM.
Bruder, G., Steinicke, F., & Stürzlinger, W. (2013). Effects of visual conflicts on 3D selection task performance in stereoscopic display environments. In Proceedings of the IEEE Symposium on 3D User Interfaces (3DUI). IEEE Press.
Grossman, T., & Balakrishnan, R. (2004). Pointing at trivariate targets in 3D environments. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, NY, USA: ACM.
Kennedy, R. S., Lane, N. E., Berbaum, K. S., & Lilienthal, M. G. (1993). Simulator Sickness Questionnaire: An enhanced method for quantifying simulator sickness. The International Journal of Aviation Psychology, 3(3), 203–220.
Kockro, R. A., Reisch, R., Serra, L., Goh, L. C., Lee, E., & Stadie, A. T. (2013). Image-guided neurosurgery with 3-dimensional multimodal imaging data on a stereoscopic monitor. Neurosurgery, 72, A78–A88.
Lambooij, M., Fortuin, M., Heynderickx, I., & IJsselsteijn, W. (2009). Visual discomfort and visual fatigue of stereoscopic displays: A review. Journal of Imaging Science and Technology, 53(3).
MacKenzie, I. S. (1992). Fitts' law as a research and design tool in human-computer interaction. Human-Computer Interaction, 7(1).
Meyer, R., Bützler, J., Dzaack, J., & Schlick, C. M. (2014). Development of interaction concepts for touchless human-computer interaction with geographic information systems. In Human-Computer Interaction: Advanced Interaction Modalities and Techniques. Springer.
Song, Y., Sun, Y., Zeng, J., & Wang, F. (2014). Automatic correction of hand pointing in stereoscopic depth. Scientific Reports, 4.
Teather, R. J., & Stuerzlinger, W. (2011). Pointing at 3D targets in a stereo head-tracked virtual environment. In 3D User Interfaces (3DUI), 2011 IEEE Symposium on (87–94). IEEE.
Tukey, J. W. (1977). Exploratory data analysis. Reading, MA: Addison-Wesley.
Vetter, S., Bützler, J., Jochems, N., & Schlick, C. M. (2011). Fitts' law in bivariate pointing on large touch screens: Age-differentiated analysis of motion angle effects on movement times and error rates. In Universal Access in Human-Computer Interaction: Users Diversity. Springer.
Weichert, F., Bachmann, D., Rudak, B., & Fisseler, D. (2013). Analysis of the accuracy and robustness of the Leap Motion Controller. Sensors, 13(5).
Wittmann, D., Baier, A., Neujahr, H., Petermeier, B., Sandl, P., Vernaleken, C., & Vogelmeier, L. (2011). Development and evaluation of stereoscopic situation displays for air traffic control. Universitätsbibliothek Ilmenau.
Yuan, Y., & Steed, A. (2010). Is the rubber hand illusion induced by immersive virtual reality? In Virtual Reality Conference (VR), 2010 IEEE (95–102).


More information

Issues and Challenges of 3D User Interfaces: Effects of Distraction

Issues and Challenges of 3D User Interfaces: Effects of Distraction Issues and Challenges of 3D User Interfaces: Effects of Distraction Leslie Klein kleinl@in.tum.de In time critical tasks like when driving a car or in emergency management, 3D user interfaces provide an

More information

Welcome, Introduction, and Roadmap Joseph J. LaViola Jr.

Welcome, Introduction, and Roadmap Joseph J. LaViola Jr. Welcome, Introduction, and Roadmap Joseph J. LaViola Jr. Welcome, Introduction, & Roadmap 3D UIs 101 3D UIs 201 User Studies and 3D UIs Guidelines for Developing 3D UIs Video Games: 3D UIs for the Masses

More information

Using Simulation to Design Control Strategies for Robotic No-Scar Surgery

Using Simulation to Design Control Strategies for Robotic No-Scar Surgery Using Simulation to Design Control Strategies for Robotic No-Scar Surgery Antonio DE DONNO 1, Florent NAGEOTTE, Philippe ZANNE, Laurent GOFFIN and Michel de MATHELIN LSIIT, University of Strasbourg/CNRS,

More information

Evaluating Touch Gestures for Scrolling on Notebook Computers

Evaluating Touch Gestures for Scrolling on Notebook Computers Evaluating Touch Gestures for Scrolling on Notebook Computers Kevin Arthur Synaptics, Inc. 3120 Scott Blvd. Santa Clara, CA 95054 USA karthur@synaptics.com Nada Matic Synaptics, Inc. 3120 Scott Blvd. Santa

More information

EXPERIMENTAL BILATERAL CONTROL TELEMANIPULATION USING A VIRTUAL EXOSKELETON

EXPERIMENTAL BILATERAL CONTROL TELEMANIPULATION USING A VIRTUAL EXOSKELETON EXPERIMENTAL BILATERAL CONTROL TELEMANIPULATION USING A VIRTUAL EXOSKELETON Josep Amat 1, Alícia Casals 2, Manel Frigola 2, Enric Martín 2 1Robotics Institute. (IRI) UPC / CSIC Llorens Artigas 4-6, 2a

More information

DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface

DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface Hrvoje Benko and Andrew D. Wilson Microsoft Research One Microsoft Way Redmond, WA 98052, USA

More information

VEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu

More information

Haptics CS327A

Haptics CS327A Haptics CS327A - 217 hap tic adjective relating to the sense of touch or to the perception and manipulation of objects using the senses of touch and proprioception 1 2 Slave Master 3 Courtesy of Walischmiller

More information

Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces

Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Huidong Bai The HIT Lab NZ, University of Canterbury, Christchurch, 8041 New Zealand huidong.bai@pg.canterbury.ac.nz Lei

More information

Comparison of Wrap Around Screens and HMDs on a Driver s Response to an Unexpected Pedestrian Crossing Using Simulator Vehicle Parameters

Comparison of Wrap Around Screens and HMDs on a Driver s Response to an Unexpected Pedestrian Crossing Using Simulator Vehicle Parameters University of Iowa Iowa Research Online Driving Assessment Conference 2017 Driving Assessment Conference Jun 28th, 12:00 AM Comparison of Wrap Around Screens and HMDs on a Driver s Response to an Unexpected

More information

Navigating the Space: Evaluating a 3D-Input Device in Placement and Docking Tasks

Navigating the Space: Evaluating a 3D-Input Device in Placement and Docking Tasks Navigating the Space: Evaluating a 3D-Input Device in Placement and Docking Tasks Elke Mattheiss Johann Schrammel Manfred Tscheligi CURE Center for Usability CURE Center for Usability ICT&S, University

More information

Localized Space Display

Localized Space Display Localized Space Display EE 267 Virtual Reality, Stanford University Vincent Chen & Jason Ginsberg {vschen, jasong2}@stanford.edu 1 Abstract Current virtual reality systems require expensive head-mounted

More information

A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems

A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems F. Steinicke, G. Bruder, H. Frenz 289 A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems Frank Steinicke 1, Gerd Bruder 1, Harald Frenz 2 1 Institute of Computer Science,

More information

Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine)

Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine) Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine) Presentation Working in a virtual world Interaction principles Interaction examples Why VR in the First Place? Direct perception

More information

The Effect of Haptic Feedback on Basic Social Interaction within Shared Virtual Environments

The Effect of Haptic Feedback on Basic Social Interaction within Shared Virtual Environments The Effect of Haptic Feedback on Basic Social Interaction within Shared Virtual Environments Elias Giannopoulos 1, Victor Eslava 2, María Oyarzabal 2, Teresa Hierro 2, Laura González 2, Manuel Ferre 2,

More information

EyeScope: A 3D Interaction Technique for Accurate Object Selection in Immersive Environments

EyeScope: A 3D Interaction Technique for Accurate Object Selection in Immersive Environments EyeScope: A 3D Interaction Technique for Accurate Object Selection in Immersive Environments Cleber S. Ughini 1, Fausto R. Blanco 1, Francisco M. Pinto 1, Carla M.D.S. Freitas 1, Luciana P. Nedel 1 1 Instituto

More information

Optical Marionette: Graphical Manipulation of Human s Walking Direction

Optical Marionette: Graphical Manipulation of Human s Walking Direction Optical Marionette: Graphical Manipulation of Human s Walking Direction Akira Ishii, Ippei Suzuki, Shinji Sakamoto, Keita Kanai Kazuki Takazawa, Hiraku Doi, Yoichi Ochiai (Digital Nature Group, University

More information

Improving Depth Perception in Medical AR

Improving Depth Perception in Medical AR Improving Depth Perception in Medical AR A Virtual Vision Panel to the Inside of the Patient Christoph Bichlmeier 1, Tobias Sielhorst 1, Sandro M. Heining 2, Nassir Navab 1 1 Chair for Computer Aided Medical

More information

Immersive Guided Tours for Virtual Tourism through 3D City Models

Immersive Guided Tours for Virtual Tourism through 3D City Models Immersive Guided Tours for Virtual Tourism through 3D City Models Rüdiger Beimler, Gerd Bruder, Frank Steinicke Immersive Media Group (IMG) Department of Computer Science University of Würzburg E-Mail:

More information

Head-Movement Evaluation for First-Person Games

Head-Movement Evaluation for First-Person Games Head-Movement Evaluation for First-Person Games Paulo G. de Barros Computer Science Department Worcester Polytechnic Institute 100 Institute Road. Worcester, MA 01609 USA pgb@wpi.edu Robert W. Lindeman

More information

Enhancing Fish Tank VR

Enhancing Fish Tank VR Enhancing Fish Tank VR Jurriaan D. Mulder, Robert van Liere Center for Mathematics and Computer Science CWI Amsterdam, the Netherlands mullie robertl @cwi.nl Abstract Fish tank VR systems provide head

More information

Perception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision

Perception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision 11-25-2013 Perception Vision Read: AIMA Chapter 24 & Chapter 25.3 HW#8 due today visual aural haptic & tactile vestibular (balance: equilibrium, acceleration, and orientation wrt gravity) olfactory taste

More information

Using Real Objects for Interaction Tasks in Immersive Virtual Environments

Using Real Objects for Interaction Tasks in Immersive Virtual Environments Using Objects for Interaction Tasks in Immersive Virtual Environments Andy Boud, Dr. VR Solutions Pty. Ltd. andyb@vrsolutions.com.au Abstract. The use of immersive virtual environments for industrial applications

More information

CSC Stereography Course I. What is Stereoscopic Photography?... 3 A. Binocular Vision Depth perception due to stereopsis

CSC Stereography Course I. What is Stereoscopic Photography?... 3 A. Binocular Vision Depth perception due to stereopsis CSC Stereography Course 101... 3 I. What is Stereoscopic Photography?... 3 A. Binocular Vision... 3 1. Depth perception due to stereopsis... 3 2. Concept was understood hundreds of years ago... 3 3. Stereo

More information

3D display is imperfect, the contents stereoscopic video are not compatible, and viewing of the limitations of the environment make people feel

3D display is imperfect, the contents stereoscopic video are not compatible, and viewing of the limitations of the environment make people feel 3rd International Conference on Multimedia Technology ICMT 2013) Evaluation of visual comfort for stereoscopic video based on region segmentation Shigang Wang Xiaoyu Wang Yuanzhi Lv Abstract In order to

More information

Running an HCI Experiment in Multiple Parallel Universes

Running an HCI Experiment in Multiple Parallel Universes Author manuscript, published in "ACM CHI Conference on Human Factors in Computing Systems (alt.chi) (2014)" Running an HCI Experiment in Multiple Parallel Universes Univ. Paris Sud, CNRS, Univ. Paris Sud,

More information

Quantitative Comparison of Interaction with Shutter Glasses and Autostereoscopic Displays

Quantitative Comparison of Interaction with Shutter Glasses and Autostereoscopic Displays Quantitative Comparison of Interaction with Shutter Glasses and Autostereoscopic Displays Z.Y. Alpaslan, S.-C. Yeh, A.A. Rizzo, and A.A. Sawchuk University of Southern California, Integrated Media Systems

More information

Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI

Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI Marcelo Mortensen Wanderley Nicola Orio Outline Human-Computer Interaction (HCI) Existing Research in HCI Interactive Computer

More information

Touch Perception and Emotional Appraisal for a Virtual Agent

Touch Perception and Emotional Appraisal for a Virtual Agent Touch Perception and Emotional Appraisal for a Virtual Agent Nhung Nguyen, Ipke Wachsmuth, Stefan Kopp Faculty of Technology University of Bielefeld 33594 Bielefeld Germany {nnguyen, ipke, skopp}@techfak.uni-bielefeld.de

More information

Capability for Collision Avoidance of Different User Avatars in Virtual Reality

Capability for Collision Avoidance of Different User Avatars in Virtual Reality Capability for Collision Avoidance of Different User Avatars in Virtual Reality Adrian H. Hoppe, Roland Reeb, Florian van de Camp, and Rainer Stiefelhagen Karlsruhe Institute of Technology (KIT) {adrian.hoppe,rainer.stiefelhagen}@kit.edu,

More information

HUMAN COMPUTER INTERFACE

HUMAN COMPUTER INTERFACE HUMAN COMPUTER INTERFACE TARUNIM SHARMA Department of Computer Science Maharaja Surajmal Institute C-4, Janakpuri, New Delhi, India ABSTRACT-- The intention of this paper is to provide an overview on the

More information

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Sowmya Somanath Department of Computer Science, University of Calgary, Canada. ssomanat@ucalgary.ca Ehud Sharlin Department of Computer

More information

Assessing the Effects of Orientation and Device on (Constrained) 3D Movement Techniques

Assessing the Effects of Orientation and Device on (Constrained) 3D Movement Techniques Assessing the Effects of Orientation and Device on (Constrained) 3D Movement Techniques Robert J. Teather * Wolfgang Stuerzlinger Department of Computer Science & Engineering, York University, Toronto

More information

Effects of Simulation Fidelty on User Experience in Virtual Fear of Public Speaking Training An Experimental Study

Effects of Simulation Fidelty on User Experience in Virtual Fear of Public Speaking Training An Experimental Study Effects of Simulation Fidelty on User Experience in Virtual Fear of Public Speaking Training An Experimental Study Sandra POESCHL a,1 a and Nicola DOERING a TU Ilmenau Abstract. Realistic models in virtual

More information

The Influence of Visual Illusion on Visually Perceived System and Visually Guided Action System

The Influence of Visual Illusion on Visually Perceived System and Visually Guided Action System The Influence of Visual Illusion on Visually Perceived System and Visually Guided Action System Yu-Hung CHIEN*, Chien-Hsiung CHEN** * Graduate School of Design, National Taiwan University of Science and

More information

Stereoscopic Augmented Reality System for Computer Assisted Surgery

Stereoscopic Augmented Reality System for Computer Assisted Surgery Marc Liévin and Erwin Keeve Research center c a e s a r, Center of Advanced European Studies and Research, Surgical Simulation and Navigation Group, Friedensplatz 16, 53111 Bonn, Germany. A first architecture

More information

INTERACTIVE 3D VIRTUAL HYDRAULICS Using virtual reality environments in teaching and research of fluid power systems and components

INTERACTIVE 3D VIRTUAL HYDRAULICS Using virtual reality environments in teaching and research of fluid power systems and components INTERACTIVE 3D VIRTUAL HYDRAULICS Using virtual reality environments in teaching and research of fluid power systems and components L. Pauniaho, M. Hyvonen, R. Erkkila, J. Vilenius, K. T. Koskinen and

More information

Introduction to Virtual Reality (based on a talk by Bill Mark)

Introduction to Virtual Reality (based on a talk by Bill Mark) Introduction to Virtual Reality (based on a talk by Bill Mark) I will talk about... Why do we want Virtual Reality? What is needed for a VR system? Examples of VR systems Research problems in VR Most Computers

More information

Virtual Reality Calendar Tour Guide

Virtual Reality Calendar Tour Guide Technical Disclosure Commons Defensive Publications Series October 02, 2017 Virtual Reality Calendar Tour Guide Walter Ianneo Follow this and additional works at: http://www.tdcommons.org/dpubs_series

More information

Chapter 1 Virtual World Fundamentals

Chapter 1 Virtual World Fundamentals Chapter 1 Virtual World Fundamentals 1.0 What Is A Virtual World? {Definition} Virtual: to exist in effect, though not in actual fact. You are probably familiar with arcade games such as pinball and target

More information

Evaluating Effect of Sense of Ownership and Sense of Agency on Body Representation Change of Human Upper Limb

Evaluating Effect of Sense of Ownership and Sense of Agency on Body Representation Change of Human Upper Limb Evaluating Effect of Sense of Ownership and Sense of Agency on Body Representation Change of Human Upper Limb Shunsuke Hamasaki, Qi An, Wen Wen, Yusuke Tamura, Hiroshi Yamakawa, Atsushi Yamashita, Hajime

More information

Reinventing movies How do we tell stories in VR? Diego Gutierrez Graphics & Imaging Lab Universidad de Zaragoza

Reinventing movies How do we tell stories in VR? Diego Gutierrez Graphics & Imaging Lab Universidad de Zaragoza Reinventing movies How do we tell stories in VR? Diego Gutierrez Graphics & Imaging Lab Universidad de Zaragoza Computer Graphics Computational Imaging Virtual Reality Joint work with: A. Serrano, J. Ruiz-Borau

More information

Regan Mandryk. Depth and Space Perception

Regan Mandryk. Depth and Space Perception Depth and Space Perception Regan Mandryk Disclaimer Many of these slides include animated gifs or movies that may not be viewed on your computer system. They should run on the latest downloads of Quick

More information

The Human Visual System!

The Human Visual System! an engineering-focused introduction to! The Human Visual System! EE367/CS448I: Computational Imaging and Display! stanford.edu/class/ee367! Lecture 2! Gordon Wetzstein! Stanford University! nautilus eye,

More information

CSE 165: 3D User Interaction. Lecture #14: 3D UI Design

CSE 165: 3D User Interaction. Lecture #14: 3D UI Design CSE 165: 3D User Interaction Lecture #14: 3D UI Design 2 Announcements Homework 3 due tomorrow 2pm Monday: midterm discussion Next Thursday: midterm exam 3D UI Design Strategies 3 4 Thus far 3DUI hardware

More information

COMS W4172 Design Principles

COMS W4172 Design Principles COMS W4172 Design Principles Steven Feiner Department of Computer Science Columbia University New York, NY 10027 www.cs.columbia.edu/graphics/courses/csw4172 January 25, 2018 1 2D & 3D UIs: What s the

More information

Geo-Located Content in Virtual and Augmented Reality

Geo-Located Content in Virtual and Augmented Reality Technical Disclosure Commons Defensive Publications Series October 02, 2017 Geo-Located Content in Virtual and Augmented Reality Thomas Anglaret Follow this and additional works at: http://www.tdcommons.org/dpubs_series

More information

THE WII REMOTE AS AN INPUT DEVICE FOR 3D INTERACTION IN IMMERSIVE HEAD-MOUNTED DISPLAY VIRTUAL REALITY

THE WII REMOTE AS AN INPUT DEVICE FOR 3D INTERACTION IN IMMERSIVE HEAD-MOUNTED DISPLAY VIRTUAL REALITY IADIS International Conference Gaming 2008 THE WII REMOTE AS AN INPUT DEVICE FOR 3D INTERACTION IN IMMERSIVE HEAD-MOUNTED DISPLAY VIRTUAL REALITY Yang-Wai Chow School of Computer Science and Software Engineering

More information

Haptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces

Haptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces In Usability Evaluation and Interface Design: Cognitive Engineering, Intelligent Agents and Virtual Reality (Vol. 1 of the Proceedings of the 9th International Conference on Human-Computer Interaction),

More information

Out-of-Reach Interactions in VR

Out-of-Reach Interactions in VR Out-of-Reach Interactions in VR Eduardo Augusto de Librio Cordeiro eduardo.augusto.cordeiro@ist.utl.pt Instituto Superior Técnico, Lisboa, Portugal October 2016 Abstract Object selection is a fundamental

More information

HandsIn3D: Supporting Remote Guidance with Immersive Virtual Environments

HandsIn3D: Supporting Remote Guidance with Immersive Virtual Environments HandsIn3D: Supporting Remote Guidance with Immersive Virtual Environments Weidong Huang 1, Leila Alem 1, and Franco Tecchia 2 1 CSIRO, Australia 2 PERCRO - Scuola Superiore Sant Anna, Italy {Tony.Huang,Leila.Alem}@csiro.au,

More information

Behavioural Realism as a metric of Presence

Behavioural Realism as a metric of Presence Behavioural Realism as a metric of Presence (1) Jonathan Freeman jfreem@essex.ac.uk 01206 873786 01206 873590 (2) Department of Psychology, University of Essex, Wivenhoe Park, Colchester, Essex, CO4 3SQ,

More information

Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain

Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain Technical Disclosure Commons Defensive Publications Series October 02, 2017 Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain Adam Glazier Nadav Ashkenazi Matthew

More information

The Control of Avatar Motion Using Hand Gesture

The Control of Avatar Motion Using Hand Gesture The Control of Avatar Motion Using Hand Gesture ChanSu Lee, SangWon Ghyme, ChanJong Park Human Computing Dept. VR Team Electronics and Telecommunications Research Institute 305-350, 161 Kajang-dong, Yusong-gu,

More information

The Eyes Don t Have It: An Empirical Comparison of Head-Based and Eye-Based Selection in Virtual Reality

The Eyes Don t Have It: An Empirical Comparison of Head-Based and Eye-Based Selection in Virtual Reality The Eyes Don t Have It: An Empirical Comparison of Head-Based and Eye-Based Selection in Virtual Reality YuanYuan Qian Carleton University Ottawa, ON Canada heather.qian@carleton.ca ABSTRACT We present

More information

Augmented and Virtual Reality

Augmented and Virtual Reality CS-3120 Human-Computer Interaction Augmented and Virtual Reality Mikko Kytö 7.11.2017 From Real to Virtual [1] Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS

More information

R (2) Controlling System Application with hands by identifying movements through Camera

R (2) Controlling System Application with hands by identifying movements through Camera R (2) N (5) Oral (3) Total (10) Dated Sign Assignment Group: C Problem Definition: Controlling System Application with hands by identifying movements through Camera Prerequisite: 1. Web Cam Connectivity

More information

http://uu.diva-portal.org This is an author produced version of a paper published in Proceedings of the 23rd Australian Computer-Human Interaction Conference (OzCHI '11). This paper has been peer-reviewed

More information

Beyond Visual: Shape, Haptics and Actuation in 3D UI

Beyond Visual: Shape, Haptics and Actuation in 3D UI Beyond Visual: Shape, Haptics and Actuation in 3D UI Ivan Poupyrev Welcome, Introduction, & Roadmap 3D UIs 101 3D UIs 201 User Studies and 3D UIs Guidelines for Developing 3D UIs Video Games: 3D UIs for

More information

3D User Interfaces. Using the Kinect and Beyond. John Murray. John Murray

3D User Interfaces. Using the Kinect and Beyond. John Murray. John Murray Using the Kinect and Beyond // Center for Games and Playable Media // http://games.soe.ucsc.edu John Murray John Murray Expressive Title Here (Arial) Intelligence Studio Introduction to Interfaces User

More information

VR-programming. Fish Tank VR. To drive enhanced virtual reality display setups like. Monitor-based systems Use i.e.

VR-programming. Fish Tank VR. To drive enhanced virtual reality display setups like. Monitor-based systems Use i.e. VR-programming To drive enhanced virtual reality display setups like responsive workbenches walls head-mounted displays boomes domes caves Fish Tank VR Monitor-based systems Use i.e. shutter glasses 3D

More information

Visual Interpretation of Hand Gestures as a Practical Interface Modality

Visual Interpretation of Hand Gestures as a Practical Interface Modality Visual Interpretation of Hand Gestures as a Practical Interface Modality Frederik C. M. Kjeldsen Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate

More information

Stereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays. Habib Abi-Rached Thursday 17 February 2005.

Stereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays. Habib Abi-Rached Thursday 17 February 2005. Stereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays Habib Abi-Rached Thursday 17 February 2005. Objective Mission: Facilitate communication: Bandwidth. Intuitiveness.

More information

Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task

Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, MANUSCRIPT ID 1 Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task Eric D. Ragan, Regis

More information

Enhancing Fish Tank VR

Enhancing Fish Tank VR Enhancing Fish Tank VR Jurriaan D. Mulder, Robert van Liere Center for Mathematics and Computer Science CWI Amsterdam, the Netherlands fmulliejrobertlg@cwi.nl Abstract Fish tank VR systems provide head

More information

pcon.planner PRO Plugin VR-Viewer

pcon.planner PRO Plugin VR-Viewer pcon.planner PRO Plugin VR-Viewer Manual Dokument Version 1.2 Author DRT Date 04/2018 2018 EasternGraphics GmbH 1/10 pcon.planner PRO Plugin VR-Viewer Manual Content 1 Things to Know... 3 2 Technical Tips...

More information

Discrimination of Virtual Haptic Textures Rendered with Different Update Rates

Discrimination of Virtual Haptic Textures Rendered with Different Update Rates Discrimination of Virtual Haptic Textures Rendered with Different Update Rates Seungmoon Choi and Hong Z. Tan Haptic Interface Research Laboratory Purdue University 465 Northwestern Avenue West Lafayette,

More information

Development of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane

Development of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane Journal of Communication and Computer 13 (2016) 329-337 doi:10.17265/1548-7709/2016.07.002 D DAVID PUBLISHING Development of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane

More information

Comparing Input Methods and Cursors for 3D Positioning with Head-Mounted Displays

Comparing Input Methods and Cursors for 3D Positioning with Head-Mounted Displays Comparing Input Methods and Cursors for 3D Positioning with Head-Mounted Displays Junwei Sun School of Interactive Arts and Technology Simon Fraser University junweis@sfu.ca Wolfgang Stuerzlinger School

More information

CSE 190: Virtual Reality Technologies LECTURE #7: VR DISPLAYS

CSE 190: Virtual Reality Technologies LECTURE #7: VR DISPLAYS CSE 190: Virtual Reality Technologies LECTURE #7: VR DISPLAYS Announcements Homework project 2 Due tomorrow May 5 at 2pm To be demonstrated in VR lab B210 Even hour teams start at 2pm Odd hour teams start

More information

Discriminating direction of motion trajectories from angular speed and background information

Discriminating direction of motion trajectories from angular speed and background information Atten Percept Psychophys (2013) 75:1570 1582 DOI 10.3758/s13414-013-0488-z Discriminating direction of motion trajectories from angular speed and background information Zheng Bian & Myron L. Braunstein

More information

NeuroSim - The Prototype of a Neurosurgical Training Simulator

NeuroSim - The Prototype of a Neurosurgical Training Simulator NeuroSim - The Prototype of a Neurosurgical Training Simulator Florian BEIER a,1,stephandiederich a,kirstenschmieder b and Reinhard MÄNNER a,c a Institute for Computational Medicine, University of Heidelberg

More information

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc.

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc. Human Vision and Human-Computer Interaction Much content from Jeff Johnson, UI Wizards, Inc. are these guidelines grounded in perceptual psychology and how can we apply them intelligently? Mach bands:

More information

A Study on Interaction of Gaze Pointer-Based User Interface in Mobile Virtual Reality Environment

A Study on Interaction of Gaze Pointer-Based User Interface in Mobile Virtual Reality Environment S S symmetry Article A Study on Interaction of Gaze Pointer-Based User Interface in Mobile Virtual Reality Environment Mingyu Kim, Jiwon Lee ID, Changyu Jeon and Jinmo Kim * ID Department of Software,

More information

Guidelines for choosing VR Devices from Interaction Techniques

Guidelines for choosing VR Devices from Interaction Techniques Guidelines for choosing VR Devices from Interaction Techniques Jaime Ramírez Computer Science School Technical University of Madrid Campus de Montegancedo. Boadilla del Monte. Madrid Spain http://decoroso.ls.fi.upm.es

More information

Test of pan and zoom tools in visual and non-visual audio haptic environments. Magnusson, Charlotte; Gutierrez, Teresa; Rassmus-Gröhn, Kirsten

Test of pan and zoom tools in visual and non-visual audio haptic environments. Magnusson, Charlotte; Gutierrez, Teresa; Rassmus-Gröhn, Kirsten Test of pan and zoom tools in visual and non-visual audio haptic environments Magnusson, Charlotte; Gutierrez, Teresa; Rassmus-Gröhn, Kirsten Published in: ENACTIVE 07 2007 Link to publication Citation

More information

A reduction of visual fields during changes in the background image such as while driving a car and looking in the rearview mirror

A reduction of visual fields during changes in the background image such as while driving a car and looking in the rearview mirror Original Contribution Kitasato Med J 2012; 42: 138-142 A reduction of visual fields during changes in the background image such as while driving a car and looking in the rearview mirror Tomoya Handa Department

More information

Ubiquitous Computing Summer Episode 16: HCI. Hannes Frey and Peter Sturm University of Trier. Hannes Frey and Peter Sturm, University of Trier 1

Ubiquitous Computing Summer Episode 16: HCI. Hannes Frey and Peter Sturm University of Trier. Hannes Frey and Peter Sturm, University of Trier 1 Episode 16: HCI Hannes Frey and Peter Sturm University of Trier University of Trier 1 Shrinking User Interface Small devices Narrow user interface Only few pixels graphical output No keyboard Mobility

More information