Discrete Rotation During Eye-Blink

Anh Nguyen, Marc Inhelder, and Andreas Kunz
Innovation Center Virtual Reality, ETH Zurich, Zürich, Switzerland
nngoc@ethz.ch

Abstract. Redirection techniques enable users to explore a virtual environment larger than the real physical space by manipulating the mapping between the virtual and real trajectories without breaking immersion. These techniques can be applied continuously over time (using translational, rotational, and curvature gains) or discretely (utilizing change blindness, visual suppression, etc.). While most attention has been devoted to continuous techniques, not much has been done on discrete techniques, particularly those utilizing visual suppression. In this paper, we propose a study to investigate the effect of discrete rotation of the virtual environment during eye-blink. More specifically, we describe our methodology and experiment design for identifying rotation detection thresholds during blinking. We also discuss preliminary results from a pilot study.

Keywords: Redirected walking · Eye-blink · Rotation detection threshold · Visual suppression

© Springer International Publishing AG, part of Springer Nature 2018. L. T. De Paolis and P. Bourdot (Eds.): AVR 2018, LNCS 10850, pp. 183-189, 2018. https://doi.org/10.1007/978-3-319-95270-3_13

1 Introduction

Compared to other methods of navigating in a virtual environment (VE), such as using controllers or walking-in-place, real walking has been shown to have better integrity and provide better immersion [1]. However, a challenge arises when the VE is much larger than the physical space. One solution to this problem is the use of redirection techniques (RDTs). Depending on how these techniques are applied, Suma et al. categorized them into continuous and discrete; they can be further divided into overt and subtle depending on whether they are noticeable or not [2].

Overt continuous RDTs involve the use of metaphors such as seven-league boots [3], flying [1], or virtual elevators and escalators. Subtle continuous RDTs continuously manipulate different aspects of the user's trajectory: translation (users walk faster or slower in the VE than in real life), rotation (users rotate faster or slower in the VE than in real life), and curvature (users walk on a different curvature in the VE than in real life) [4]; these gains are formalized below. When applied within certain thresholds, these manipulations remain unnoticeable and immersion is maintained. Discrete RDTs refer to instantaneous relocation or reorientation of users in the VE. Some examples of overt discrete RDTs are teleportation [5] and portals [6]. Subtle discrete RDTs can be performed when users fail to notice the reorientation or relocation due to change blindness [7], or during visual suppression caused by saccadic eye movement or blinking [8, 9].
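For reference, the continuous manipulations above are commonly expressed as gains, i.e., ratios of virtual to real motion. The summary below follows the notation of Steinicke et al. [10]; it restates textbook definitions rather than anything specific to this study.

```latex
% Standard redirection gains, in the notation of Steinicke et al. [10].
\begin{align}
  g_T &= \frac{T_{\mathrm{virtual}}}{T_{\mathrm{real}}}
      && \text{(translation gain)}\\
  g_R &= \frac{R_{\mathrm{virtual}}}{R_{\mathrm{real}}}
      && \text{(rotation gain)}\\
  g_C &= \frac{1}{r}
      && \text{(curvature gain; $r$ = radius of the real arc walked}\\
      &&& \text{\phantom{(}while the user moves straight in the VE)}
\end{align}
```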

Although overt RDTs offer a higher range of motion and enable users to travel in a much larger VE, it has been shown that subtle RDTs produce fewer breaks in presence [2] and are therefore generally preferred for a more immersive VR experience. Among the subtle RDTs, most attention has been paid to continuous RDTs, including research on detection thresholds and the factors that influence them [10, 11], and on the implementation of these techniques in real walking applications, such as steer-to-center, steer-to-orbit, steer-to-predefined-target [4], and model predictive control [12]. Up to now, research on discrete RDTs, especially those using eyetracker information (e.g., eye movements, blinks, gazes), has been quite limited, probably due to the lack of head-mounted displays (HMDs) with an integrated eyetracker. With the development of new HMDs with affordable integrated eyetrackers, such as the HTC Vive or FOVE, research on subtle discrete RDTs using eyetracker information could become widely applicable in the future.

In this paper, we propose the application of subtle discrete RDTs, more specifically rotation, in real walking during blinking. We first describe our methodology for blink detection and threshold identification. Furthermore, we discuss our experiment design and setup, and the results from a pilot study.

2 Related Work

We blink spontaneously 20-30 times per minute [13] to moisturize our eyes, and each blink lasts about 100-150 ms [14]. During blinking, the eyelids cover the pupils and prevent light and visual inputs from entering the eyes, resulting in a disruption of the image on the retina. Nevertheless, we rarely notice this disruption because our brain suppresses visual information during blinking, so-called visual suppression. Interestingly, because of this suppression, people sometimes fail to notice changes happening to the scene during blinking, such as color changes, target appearance/disappearance, or target displacement [15]. While visual suppression during blinking is undesirable in tasks that require constant monitoring of visual input, such as driving, it offers a new possibility for discrete subtle redirection in the context of redirected walking.

There is, however, a limit to how much redirection can be applied to the scene without the user noticing it. The only study that addresses this question is by Ivleva, in which a blink sensor was created and used with the HTC Vive to identify the detection thresholds for reorientation and repositioning during blinking [9]. While the results from this study cannot be used in a redirected walking application, the author concluded that it could be a potential method. The study also has a few limitations: the users were not performing locomotion, and the scene may have contained reference points that gave users clues about how they had been redirected. In other contexts not related to redirected walking, many studies have been conducted confirming that people do not notice target displacement during blinking. However, to our knowledge, there exists no other study that quantifies this displacement.

3 Methodology

3.1 Blink Detection

Figure 1 shows typical pupil diameter recordings of a participant walking in a VE. It can be seen that during blinking the eyetracker loses track of the eyes and the pupil sizes become zero. However, it is worth noting that the left and right eyes do not open or close at exactly the same time, and there is occasionally spurious noise, as in Fig. 1(b). Since redirection should only be applied during blinking, it is important that blinks are detected reliably and without any false positives. Therefore, in our blink detection algorithm, the following two conditions need to be satisfied for an event to be considered a blink: (i) both eyes' pupil diameters must change from nonzero to zero and remain zero for a certain amount of time; (ii) once the first condition is satisfied, a subsequent step from nonzero to zero is only considered after a predefined amount of time, to eliminate irregular blinks or noise such as in Fig. 1(b).

Fig. 1. Diameter of left and right pupils of a participant during walking
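The paper does not include an implementation, but the two conditions map naturally onto a small state machine. Below is a minimal sketch in Python; the 60 Hz sample rate matches the SMI eyetracker described in Sect. 4, while the two timing parameters (minimum closure duration and refractory period) are illustrative values, since the paper does not specify them.

```python
# Minimal sketch of the two-condition blink detector described above.
# The 60 Hz rate matches the SMI eyetracker in Sect. 4; the two timing
# parameters are illustrative, as the paper does not specify them.

class BlinkDetector:
    def __init__(self, sample_rate_hz=60.0,
                 min_closed_s=0.05,   # condition (i): minimum closure duration
                 refractory_s=0.3):   # condition (ii): dead time after a blink
        self.min_closed = int(min_closed_s * sample_rate_hz)
        self.refractory = int(refractory_s * sample_rate_hz)
        self.closed_samples = 0    # consecutive samples with both pupils at zero
        self.since_blink = 10**9   # samples since the last confirmed blink
        self.suppressed = False    # closure started inside the refractory window
        self.fired = False         # blink already reported for this closure

    def update(self, left_diameter, right_diameter):
        """Feed one eyetracker sample; returns True once per confirmed blink."""
        self.since_blink += 1
        closed = left_diameter == 0.0 and right_diameter == 0.0
        if closed:
            if self.closed_samples == 0:
                # Condition (ii): a fresh nonzero->zero step too soon after the
                # previous blink is treated as noise, as in Fig. 1(b).
                self.suppressed = self.since_blink < self.refractory
            self.closed_samples += 1
            # Condition (i): both pupils must read zero long enough.
            if (not self.suppressed and not self.fired
                    and self.closed_samples >= self.min_closed):
                self.fired = True
                self.since_blink = 0
                return True
        else:
            self.closed_samples = 0
            self.fired = False
        return False
```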

3.2 Threshold Identification

The detection of a stimulus can be modeled by a psychometric curve, where the x-axis represents the stimulus value and the y-axis represents the percentage of correct responses. Threshold identification refers to the process of identifying this psychometric function. The classical method for identifying the whole psychometric function is the constant stimuli method (CSM), in which the whole range of stimuli is presented in random order. However, this method requires a large number of repetitions and is not efficient, since most of the time only certain aspects of the psychometric function, such as the 75% correct response point or the slope, are of interest. In contrast to the CSM, adaptive methods such as staircase methods, Bayesian adaptive methods, etc. select the next stimulus level based on previous responses and do not present the whole range of stimuli. These methods require fewer trials but identify only one point on the psychometric curve and/or the slope. While most existing studies on redirected walking adopt the CSM for threshold identification [8, 10], to reduce experiment time we select the Bayesian adaptive method QUEST, whose details are provided by Watson and Pelli [16].
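QUEST maintains a posterior over the threshold and places each trial at the currently most probable value [16]. As a rough illustration of the mechanics, the sketch below implements a simplified grid-based variant in linear stimulus units (degrees of scene rotation); Watson and Pelli's original formulation works in log units, and the Weibull slope, lapse rate, and prior used here are illustrative, not the values used in the study.

```python
import numpy as np

# Simplified QUEST-style sketch [16]: a grid posterior over the threshold
# (here in degrees of scene rotation) is updated after every yes/no trial,
# and the next stimulus is placed at the posterior mode. The parameters
# (beta, delta, prior) are illustrative only.

class Quest:
    def __init__(self, t_grid=np.linspace(0.1, 15.0, 150),
                 prior_mean=5.0, prior_sd=5.0, beta=3.5, delta=0.01):
        self.t = t_grid                    # candidate thresholds (deg)
        # Gaussian prior over the threshold, kept as an unnormalized log posterior.
        self.log_post = -0.5 * ((t_grid - prior_mean) / prior_sd) ** 2
        self.beta, self.delta = beta, delta

    def p_detect(self, x, t):
        """Weibull psychometric function P(detect | stimulus x, threshold t)."""
        p = (1.0 - self.delta) * (1.0 - np.exp(-((x / t) ** self.beta)))
        return np.clip(p, 1e-9, 1.0 - 1e-9)   # avoid log(0) in the update

    def next_stimulus(self):
        """Place the next trial at the most probable threshold (posterior mode)."""
        return float(self.t[np.argmax(self.log_post)])

    def update(self, x, detected):
        """Bayesian update of the posterior with one observed response."""
        p = self.p_detect(x, self.t)
        self.log_post += np.log(p if detected else 1.0 - p)
```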

4 Experiment Design and Setup

The aim of this study is to identify the detection threshold for scene rotation during blinking. While in other redirected walking threshold studies the participants were informed about the purpose of the study and asked whether they noticed the manipulation, the same design cannot be used in our experiment. If the participants were informed that the scene will be rotated during blinking, they would potentially try to fixate on a reference point and deliberately blink to identify the rotation direction. As a result, the real aim of the study cannot be disclosed. Instead, the participants are given a cover story: they are testing a new system that may contain some technical bugs, and they are encouraged to inform the experimenter whenever such a bug occurs. When a subject reports a bug, the experimenter first makes sure that a scene rotation has just been applied and then verifies that the subject has really noticed the rotation rather than something else. When it is confirmed that the subject has noticed the rotation, it is counted as a correct detection response. Otherwise, when a stimulus has been presented after a blink without the user making any comment, it is counted as a no-detection response. Depending on the type of response, the next stimulus level is selected accordingly. In addition, since there may be an asymmetry in users' ability to detect scene rotations of different directions, we identify thresholds for left and right rotations separately.

In this study, users are required to walk around a maze-like environment (Fig. 2(a)) to search for a target. The maze is much larger than the available tracking space (Fig. 2(b)), and therefore whenever users approach the physical wall, a reset action is performed which reorients them towards the center of the physical space. Once the target has been found, a new scene is randomly generated and loaded. The experiment is completed after the users have been exposed to 40 stimulus values per rotation direction.

Fig. 2. Scene used in the study: (a) user view of the VR scene; (b) top view with real physical space overlay

Our setup consists of an Oculus DK2 head-mounted display (HMD) with an integrated SMI eyetracker providing eyetracking data such as gaze position, pupil diameters, etc. at 60 Hz. An Intersense IS-1200 optical tracking system is attached on top of the HMD and provides 6-DOF position tracking at a rate of 180 Hz. The system is powered by a backpack-mounted laptop, and the application was built with Unity. The environment was optimized to run constantly at the HMD's maximum frame rate of 75 Hz. The available tracking space is 13 m × 6.6 m.
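Putting the pieces together, one run of the protocol can be sketched as follows, reusing the BlinkDetector and Quest sketches above. The names eyetracker, scene.rotate_about_yaw, head_position, and collect_confirmed_report are hypothetical stand-ins for the Unity/SMI plumbing and for the experimenter's confirmed-report judgment; they are not part of any real API.

```python
import random

# Illustrative glue for one experimental run (Sects. 3-4). All engine and
# experimenter-facing calls are hypothetical placeholders.

detector = BlinkDetector()
quests = {"left": Quest(), "right": Quest()}  # separate thresholds per direction
trials = {"left": 0, "right": 0}

while any(n < 40 for n in trials.values()):   # 40 stimulus values per direction
    left_d, right_d = eyetracker.read_pupil_diameters()
    if detector.update(left_d, right_d):
        # Pick a rotation direction that still needs trials.
        direction = random.choice([d for d, n in trials.items() if n < 40])
        angle = quests[direction].next_stimulus()
        signed = angle if direction == "right" else -angle
        # Discrete reorientation: rotate the VE about the user's head
        # while blink suppression masks the jump.
        scene.rotate_about_yaw(signed, pivot=head_position())
        # Counts as "detected" only if the subject reports a bug that the
        # experimenter confirms as the rotation (cover-story protocol).
        detected = collect_confirmed_report()
        quests[direction].update(angle, detected)
        trials[direction] += 1
```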

5 Pilot Study and Preliminary Results

A pilot study was performed to verify the applicability of the proposed experiment protocol and the cover story. Five naive subjects (3 males and 2 females, age range: 20-29), all students from the university, volunteered to participate. The subjects were not informed about the real purpose of the study but were instead told the cover story. The first pilot subject remembered to mention to the experimenter every time he noticed a technical bug, e.g., "the color is weird", "some things seem a bit blurry", or "the scene just glitched". However, the next two subjects were so immersed in the VE that they did not mention anything, even though the scene rotation was increased up to its predefined maximum of 15°. When asked if they had noticed anything, they replied "I sometimes saw the scene jump" and "I have seen it for a while now but forgot to mention it". Since it is crucial that the users' responses are collected in a timely manner, we changed the experiment protocol for the last two pilot subjects and added a training session. In this training session, the subjects were exposed to the same environment, but the scene rotation was always 15°. This ensured that the subjects experienced the stimulus and understood what they should point out during the experiment. Moreover, keywords were assigned to each bug that the subjects discovered in the training session, such as "blur", "jump", "color", etc. This way, during the final study, the subjects only need to use these keywords when they detect a bug and do not have to stop and explain in full sentences what just happened. This adjusted protocol worked well for the last two pilot subjects and will be adopted for the final study.

After the experiment, a series of questions was used to debrief the subjects, to determine the effectiveness of the cover story and whether the subjects had realized that the scene rotations were linked to blinking. When asked if they could guess why the technical bugs occurred, all the subjects recited the cover story and none of them identified that the bugs were associated with their blinks. An average detection threshold could not be obtained from this pilot study due to the limited number of subjects and the varied experiment protocol between subjects. However, it was observed that scene rotations below 5° were on average not detected by the subjects. This estimate is close to the detection threshold during saccadic eye movements found by Bolte and Lappe [8].

6 Conclusion

In this paper, we proposed an experiment design for identifying detection thresholds for scene rotation during blinking. Without being told the true purpose of the study, users were asked to walk around a VE looking for a target and were encouraged to report when they detected technical bugs, i.e., scene manipulations. The pilot study enabled us to refine the experiment design, showed that the cover story was effective, and resulted in a rough estimate of the detection threshold. Further studies with a large enough sample size are required to identify the detection thresholds of not only scene rotation but also displacement during blinking.

References

1. Usoh, M., Arthur, K., Whitton, M.C., Bastos, R., Steed, A., Slater, M., Brooks Jr., F.P.: Walking > walking-in-place > flying, in virtual environments. In: Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 1999, pp. 359-364. ACM Press/Addison-Wesley, New York (1999)
2. Suma, E.A., Bruder, G., Steinicke, F., Krum, D.M., Bolas, M.: A taxonomy for deploying redirection techniques in immersive virtual environments. In: 2012 IEEE Virtual Reality Workshops (VRW), pp. 43-46, March 2012
3. Interrante, V., Ries, B., Anderson, L.: Seven league boots: a new metaphor for augmented locomotion through moderately large scale immersive virtual environments. In: 2007 IEEE Symposium on 3D User Interfaces, March 2007
4. Razzaque, S., Kohn, Z., Whitton, M.C.: Redirected walking. In: Eurographics 2001 - Short Presentations, Geneva, Switzerland, pp. 1-6. Eurographics Association (2001)

5. Bowman, D.A., Koller, D., Hodges, L.F.: Travel in immersive virtual environments: an evaluation of viewpoint motion control techniques. In: Proceedings of the IEEE 1997 Annual International Symposium on Virtual Reality, pp. 45-52, 215, March 1997
6. Freitag, S., Rausch, D., Kuhlen, T.: Reorientation in virtual environments using interactive portals. In: 2014 IEEE Symposium on 3D User Interfaces (3DUI), pp. 119-122, March 2014
7. Suma, E.A., Clark, S., Krum, D., Finkelstein, S., Bolas, M., Warte, Z.: Leveraging change blindness for redirection in virtual environments. In: 2011 IEEE Virtual Reality Conference, pp. 159-166, March 2011
8. Bolte, B., Lappe, M.: Subliminal reorientation and repositioning in immersive virtual environments using saccadic suppression. IEEE Trans. Vis. Comput. Graph. 21, 545-552 (2015)
9. Ivleva, V.: Redirected Walking in Virtual Reality during Eye Blinking. Bachelor's thesis, University of Bremen (2016)
10. Steinicke, F., Bruder, G., Jerald, J., Frenz, H., Lappe, M.: Estimation of detection thresholds for redirected walking techniques. IEEE Trans. Vis. Comput. Graph. 16, 17-27 (2010)
11. Neth, C.T., Souman, J.L., Engel, D., Kloos, U., Bülthoff, H.H., Mohler, B.J.: Velocity-dependent dynamic curvature gain for redirected walking. In: 2011 IEEE Virtual Reality Conference, pp. 151-158. IEEE, New York, March 2011
12. Nescher, T., Huang, Y.-Y., Kunz, A.: Planning redirection techniques for optimal free walking experience using model predictive control. In: 2014 IEEE Symposium on 3D User Interfaces (3DUI), pp. 111-118, March 2014
13. Sun, W.S., Baker, R.S., Chuke, J.C., Rouholiman, B.R., Hasan, S.A., Gaza, W., Stava, M.W., Porter, J.D.: Age-related changes in human blinks. Passive and active changes in eyelid kinematics. Invest. Ophthalmol. Vis. Sci. 38(1), 92-99 (1997)
14. VanderWerf, F., Brassinga, P., Reits, D., Aramideh, M., Ongerboer de Visser, B.: Eyelid movements: behavioral studies of blinking in humans under different stimulus conditions. J. Neurophysiol. 89(5), 2784-2796 (2003)
15. O'Regan, J.K., Deubel, H., Clark, J.J., Rensink, R.A.: Picture changes during blinks: looking without seeing and seeing without looking. Vis. Cogn. 7(1-3), 191-211 (2000)
16. Watson, A.B., Pelli, D.G.: QUEST: a Bayesian adaptive psychometric method. Percept. Psychophys. 33, 113-120 (1983)