HMD calibration and its effects on distance judgments

Scott A. Kuhl, William B. Thompson, and Sarah H. Creem-Regehr, University of Utah

Most head-mounted displays (HMDs) suffer from substantial optical distortion, and vendor-supplied specifications for field of view are often at variance with reality. Unless corrected, such displays do not present perspective-related visual cues in a geometrically correct manner. Distorted geometry has the potential to affect applications of HMDs which depend on precise spatial perception. This paper provides empirical evidence for the degree to which common geometric distortions affect one type of spatial judgment in virtual environments. We show that minification or magnification in the HMD, such as would occur from a misstated HMD field of view, causes significant changes in distance judgments. Incorrectly calibrated pitch and pincushion distortion, however, do not cause statistically significant changes in distance judgments for the degree of distortions examined. While the means for determining the optical distortion of display systems are well known, they are often not used in non-see-through HMDs due to problems in measuring and correcting for distortion. As a result, we also provide practical guidelines for creating geometrically calibrated systems.

Categories and Subject Descriptors: I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Virtual reality

General Terms: Experimentation, Human Factors

Additional Key Words and Phrases: Immersive virtual environment, perception, minification, field of view, pincushion distortion, pitch

1. INTRODUCTION

Virtual environment (VE) systems are computer interfaces that provide users with the sensory experience of being in a simulated space. These systems frequently consist of a head-mounted display (HMD) that allows users to view and move within a computer-generated environment. HMD-based VEs can be used for a variety of applications including training, prototyping, education, entertainment, and research. When a VE is used for a particular application, the overall utility of the system often depends on the ability of the VE to correctly convey the virtual world to the user. For instance, if an architect wants to view a virtual version of a building in a VE, the architect should be able to perceive the same building as the computer is displaying.

Authors' addresses: Scott A. Kuhl and William B. Thompson, School of Computing, University of Utah, 50 South Central Campus Drive, Room 3190, Salt Lake City, Utah; skuhl@cs.utah.edu, thompson@cs.utah.edu. Sarah H. Creem-Regehr, Department of Psychology, University of Utah, 380 South 1530 East, Room 502, Salt Lake City, Utah; sarah.creem@psych.utah.edu.

Fig. 1. The angle of declination θ and eye-height h can be used to calculate absolute distance to objects on the floor.

Uncalibrated HMD systems can distort perspective-related visual cues and may prevent people from properly perceiving the virtual world. We are particularly interested in learning how geometric distortions from uncalibrated HMDs might influence people's ability to make distance judgments in virtual environments. It is well documented that people frequently underestimate absolute distances to targets on the floor in HMD-based VEs at distances of 3 to 15 meters [Witmer and Kline 1998; Knapp 1999; Loomis and Knapp 2003; Thompson et al. 2004; Sahm et al. 2005; Willemsen et al. 2008; Willemsen et al. 2009]. Although different studies have found varying amounts of compression, people typically judge these targets to be 70 to 90% of their intended distance. None of these papers provide detailed information about if and how their HMDs were calibrated before being used for distance judgments. Since calibration procedures are not reported, it is difficult to determine whether differences in calibration might cause some of the differences between laboratories. Therefore, understanding if and how HMD calibration might affect distance judgments can help us compare previous studies and can indicate to researchers which types of distortions can result in significant changes in distance judgments. The present work explores this issue by expanding on previously reported work by Kuhl et al. [2008].

It is also possible that geometric distortions change distance judgments differently over time due to adaptation. If people are able to quickly adapt to geometric distortions in an HMD, careful calibration may not be necessary for some HMD applications. Adaptation is known to occur in HMD-based VEs when a gain is applied to translation [Mohler et al. 2006] or rotation [Kuhl et al. 2008] and subjects are given feedback about the change. It is possible that geometric distortions may produce a similar adaptation. When a geometric distortion is present in an HMD, visual cues can conflict with proprioceptive and vestibular information about the direction the head is pointing. Distortions can also change the size of imagery presented to users and can cause familiar size cues for distance to conflict with other distance cues. These conflicts may cause people to change how they use and integrate distance cues over time.

The angle of declination can provide sufficient information for people to judge the absolute distance to targets on the floor if eye height is known (Figure 1) [Sedgwick 1983].
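For concreteness, the Figure 1 relationship is simply d = h / tan(θ); the sketch below works through it with an assumed eye height (the 1.6 m value is illustrative, not a measurement from the paper) and shows how an error that shifts the gravity-referenced declination also shifts the geometrically predicted distance.

```python
import math

def distance_from_declination(eye_height_m, declination_deg):
    # Distance along the floor to a point whose line of sight lies the given
    # angle below the gravity-referenced horizon (the Figure 1 geometry).
    return eye_height_m / math.tan(math.radians(declination_deg))

h = 1.6                                     # assumed eye height in meters (illustrative)
theta = math.degrees(math.atan2(h, 5.0))    # a target 5 m away lies ~17.7 degrees below the horizon
print(distance_from_declination(h, theta))          # ~5.0 m
# A display error that shifts the gravity-referenced declination by +5.7 degrees
# predicts a nearer target under the same geometry:
print(distance_from_declination(h, theta + 5.7))    # ~3.7 m
```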

There is evidence that manipulating this cue can change judgments of height [Stoper 1999] and distance [Gardner and Mon-Williams 2001; Ooi et al. 2001; Grutzmacher et al. 1997; Andre and Rogers 2006] in real environments. Manipulating the location of the horizon can also change distance judgments within virtual environments [Messing and Durgin 2005]. Since the angle of the horizon can influence the perception of distance, we will focus on measuring and calibrating aspects of an HMD system which can cause changes to the angle of declination. First, virtual worlds can easily become incorrectly pitched up or down if the HMD's orientation sensor is not accurately aligned with the optical axis of the display. Second, the optics in HMDs often introduce distortions which cause straight lines to appear curved. Third, the graphics displayed to the user can become minified or magnified if the field of view of the display is not accurately known. We will discuss each of these distortions in detail and present methods to measure and correct them. We will also compare distance judgments made in a calibrated HMD to judgments made in an incorrectly calibrated HMD.

See-through HMDs are usually carefully calibrated to allow items in the virtual world to properly align with objects seen in the real world [Azuma and Bishop 1994; McGarrity and Tuceryan 1999; Genc et al. 2002; Gilson et al. 2008]. Non-see-through displays are more difficult to calibrate because users cannot simultaneously see and compare a real point in space to the corresponding point in the virtual world. Unlike see-through displays, where poor calibration could be distracting, it can be difficult to recognize incorrect calibration in non-see-through displays without careful measurement.

2. DISTANCE JUDGMENTS IN DISTORTED ENVIRONMENTS

We measured how several geometric distortions in an HMD might influence distance judgments with a series of experiments. All experiments used an NVIS nVisor SX HMD and an IS-900 tracking system. Unless otherwise stated, we used our HMD with the pitch, pincushion distortion, and field of view calibrations (described in Section 3) to produce a geometrically correct display.

2.1 Distance judgment procedure

All of the distance judgments in this paper used a direct blind walking task to measure the perceived distance to targets on the floor. During the entire experiment, subjects wore noise-canceling headphones that allowed them to hear the experimenter, who spoke into a wireless microphone. The subjects could also hear pink noise in the headphones, which was designed to mask any localizable environmental sounds in the lab. The same sound was played in both headphone earpieces. Prior to the experiment, subjects were assisted in walking around blindfolded for several minutes outside the laboratory to familiarize them with walking without vision. Next, the experimenters brought the subjects into the lab blindfolded so that they would not see the lab prior to the experiment. Once in the lab, the subjects removed the blindfold and kept their eyes closed until the experimenter placed the HMD on their head. Then the subjects could open their eyes to see a blank screen in the HMD. The experimenter then assisted the subject in adjusting the HMD to comfortably fit on their head.

Next, we unblanked the screen and allowed subjects to view a virtual hallway (shown in Figure 2(a)). Subjects viewed the target on the floor in front of them for as long as necessary to create a mental image of the space. They were allowed to rotate their heads to view the space but were not allowed to translate their heads or move their heads by bending at the waist. When the subjects felt confident in their mental image of the space, they closed their eyes, verbally stated that they were ready to walk, and walked to the target with their eyes closed. The HMD screens were blanked by the experimenter immediately after the subject stated that they were going to walk toward the target. The subjects stopped walking when they thought that they were standing directly on top of the target, and the experimenter recorded their location with the tracking system. The subjects were not given feedback on their performance. Next, subjects kept their eyes closed and were assisted in walking back to the starting location along an S-shaped path.

We used three different virtual world starting locations, separated by one meter, to make it more difficult for people to determine the target distance simply by looking at the target's location relative to other objects in the hallway. The starting location was at the same location in the real world, and the virtual world was translated to create the three virtual starting locations. The subjects were informed that their starting location might differ over trials but were not informed that their physical starting location was always the same. Targets were displayed at four distances (3, 4, 5, and 6 m), and each target distance was seen from each of the three starting locations. In addition to these 12 trials, we included two practice trials at distances of 3.5 and 5.5 meters at the beginning of the experiment for subjects to familiarize themselves with the blind walking task. No feedback was provided for these practice trials. We also randomly inserted three other trials within the 12 trials, with targets at distances of 3.5, 4.5, and 5.5 m, to make it more difficult for subjects to memorize the distances we were showing them. As a result, only the data from 12 of the 17 trials the subjects performed were included in the analysis. The subjects were not told that some of the trials would not be used. The size, shape, and color of the target were also randomized throughout the experiment.

After the experiment was completed, subjects took off the HMD and answered several questions about their subjective opinion of the virtual environment in a manner that did not bias their responses. In one question, we asked subjects if they noticed anything incorrect, unusual, or distracting in the VE and if they had any suggestions on how we might make the VE more realistic. We were particularly interested in the subjects' responses to this question since we wanted to know if certain geometric distortions were consciously noticeable to subjects.

All of the subjects were between the ages of 18 and 35 and had normal or corrected-to-normal vision as measured with a Snellen eye chart. None of the subjects were stereoblind; all were able to identify a shape in a random dot stereogram viewed with stereoscope glasses. Most of the subjects were recruited from the University of Utah's psychology department subject pool and received credit for their participation.
Other subjects were recruited from the community and were paid ten dollars for participating.
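The trial structure described above (two practice trials, twelve scored trials crossing four distances with three virtual starting locations, and three unscored fillers mixed in) can be summarized as a schedule generator; the sketch below only illustrates that structure, and the exact randomization and the 0/1/2 m start offsets are assumptions rather than details taken from the paper.

```python
import random

def make_trial_schedule(seed=None):
    # 2 practice trials, then 12 scored trials (4 distances x 3 starting
    # locations) with 3 unscored filler distances randomly mixed in.
    rng = random.Random(seed)
    practice = [{"distance_m": d, "scored": False} for d in (3.5, 5.5)]
    scored = [{"distance_m": d, "start_offset_m": s, "scored": True}
              for d in (3, 4, 5, 6) for s in (0.0, 1.0, 2.0)]   # offsets assumed
    fillers = [{"distance_m": d, "scored": False} for d in (3.5, 4.5, 5.5)]
    main = scored + fillers
    rng.shuffle(main)            # fillers end up randomly inserted among the scored trials
    return practice + main       # practice trials always come first

for trial in make_trial_schedule(seed=0):
    print(trial)
```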

Fig. 2. Screenshots of the VE used to measure effects of pitch on distance judgments: (a) baseline, (b) world pitched up 5.7°, (c) world pitched down 5.7°.

2.2 Pitched environment

Tracked head-mounted displays typically use some type of sensor to measure the orientation of the HMD. If the sensor is perfectly aligned with the optical axis of the display, one can simply read the pitch, yaw, and roll values directly from the sensor to determine the HMD orientation. It is often unknown, however, whether the sensor is accurately aligned with the optical axis of the display. We will focus on the calibration of pitch (up and down rotations) because there is evidence [Gardner and Mon-Williams 2001; Ooi et al. 2001; Grutzmacher et al. 1997; Andre and Rogers 2006] that pitching the world with prism glasses in the real world can change distance judgments. Pitching the world does not change the angle between the targets on the floor and the visually indicated horizon. It does, however, change the angle between the targets and the horizontal with respect to gravity (i.e., true horizontal in the real world). Therefore, pitching the world up or down changes the angle of declination (Figure 1) with respect to gravity and could subsequently make objects on the floor appear farther or closer than they actually are.

In this experiment, we used a between-subject experimental design with three visual conditions that all used the same virtual hallway environment. In the baseline visual condition (Figure 2(a)), the virtual world was calibrated using all of the calibrations discussed in this paper. There were two pitch conditions, which pitched the world up or down 5.7° around the eye point (Figures 2(b) and 2(c)). This magnitude of pitch corresponds to several studies which used prism glasses to pitch the world up or down [Grutzmacher et al. 1997; Ooi et al. 2001; Andre and Rogers 2006]. Thirteen subjects (9 male, 4 female) participated in the baseline (no pitch) condition, and twelve participated in each of the world pitched up (7 male, 5 female) and world pitched down (5 male, 7 female) conditions.

Results and discussion. The results of the baseline and pitched conditions are shown in Figure 3.

Fig. 3. Pitching a virtual world up or down does not influence distance judgments measured with direct blind walking.

Subjects walked 80, 84, and 80 percent of the actual target distance in the baseline, pitch up, and pitch down conditions, respectively. A 3(condition) x 4(distance) ANOVA confirmed that there was no significant difference in judged distance due to the pitched environment (F(2, 23) = 0.45, p = 0.51). When asked after the experiment if they noticed anything incorrect, unusual, or distracting in the VE, none of the subjects said that the world appeared to be pitched up or down in any of the conditions.

From a practical perspective, the results of this experiment show that pitch miscalibrations of 5.7° and smaller do not affect distance judgments between 3 and 6 meters in our HMD.

Despite the lack of a statistically significant effect, there are five reasons why pitch calibration should be performed in HMDs. First, the present work focused on specific ranges of pitch and distances, one distance judgment measure, and one hallway virtual environment. More empirical investigation is needed before we can conclude exactly how pitch affects distance judgments. Second, pitch has the potential to affect other spatial judgments such as the perception of height, size, and location. For example, the perception of an object's height may change in a pitched VE much like it does when a real room is pitched around an observer [Stoper 1999]. More work is needed to determine if the same effect occurs in HMD-based VEs. Third, large amounts of pitch may be noticeable and distracting to users. Fourth, it is possible that pitch may affect distance judgments in other HMDs with specifications that differ from the one used in this experiment. Finally, pitch calibration is relatively easy to perform and is described in detail in Section 3.1. Assuming the orientation sensor is firmly mounted to the HMD, the calibration only has to be performed once.

The results of this experiment can also be compared to real world studies which use prism glasses to pitch the environment. Unlike the present work, several studies [Grutzmacher et al. 1997; Gardner and Mon-Williams 2001; Ooi et al. 2001; Andre and Rogers 2006] demonstrate that pitch affects real world distance judgments at distances ranging from 20 centimeters to 20 meters. Two of these real world studies used distances which overlapped with the distances examined in the present study with direct blind walking. Specifically, Andre and Rogers [2006] found that, unlike the present HMD experiment, pitching the real world up with prism glasses affects blind walking distance judgments to objects at distances from 1.5 to 18 meters.

The present work and Andre and Rogers [2006] both found that pitching the world down had no effect on distance judgments. Ooi et al. [2001] and Grutzmacher et al. [1997], however, found that pitching a real environment down can affect distance judgments, especially at distances of 6 meters and greater. There are a couple of possible explanations for why our results differ from previous real world studies. First, people may adapt rapidly to a pitched environment, and this adaptation may occur more quickly in HMD-based virtual environments than in real environments. Second, the weight of the HMD may add additional uncertainty to the ability of people to use proprioceptive and vestibular information to measure pitch. This uncertainty may increase the reliance on the visually indicated horizon when determining the direction of the horizontal.

The result of no difference between pitch manipulation conditions remains a useful finding for two different reasons. First, we have established the consequences of a range of pitch manipulations on one type of distance estimation. Second, we have identified a distinction in the behavioral outcomes of manipulations of pitch in virtual environments and the real world. This may suggest some differences in perceptual mechanisms or in the use of information in real versus virtual environments and is a direction of research for future studies.

2.3 Pincushion distortion

HMDs typically have collimated optics between the display and the viewer to allow the user's eyes to focus as if the screens were farther away than they actually are. These optics can cause pincushion distortion, a radial distortion that occurs when points near the edges of the image are magnified more than those near the center. This distortion causes every line that does not cross the center of the image to be curved inwards toward the center of the image. Figures 4(a) and 4(b) illustrate how a checkerboard image appears in our HMD before and after pincushion distortion correction.

Fig. 4. Simulation of the pincushion distortion seen in our HMD before and after correction: (a) uncorrected, (b) corrected.

When there is not a visible horizon in an environment, linear perspective cues can provide information that can be used to determine the location of the effective horizon. For example, the hallway environment used for the present work has converging lines on the edges of the floor and ceiling that can be used to determine where the effective horizon is. When this imagery has pincushion distortion, these lines are curved and, depending on where people look, may make the effective horizon appear higher or lower in the scene.
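To give a feel for the magnitudes involved, the cubic radial model introduced in Section 3.2 (r_t = r_s + k * r_s^3) says how much farther from the image center a point is pushed as a function of its radius; the sketch below evaluates that displacement for the exaggerated coefficient k = 0.1 used later in this section (an illustration only, not a measurement of our optics).

```python
def radius_ratio(r_norm, k=0.1):
    # Ratio r_t / r_s for the cubic radial model r_t = r_s + k * r_s**3 of
    # Section 3.2, with radii normalized so the center-to-corner distance is 1.
    return 1.0 + k * r_norm ** 2

# With k = 0.1 (the exaggerated condition below), a corner point (r = 1.0) is
# displaced outward by 10% of its radius, a mid-radius point by only 2.5%,
# which is why differential magnification is strongest near the image edges.
for r in (0.25, 0.5, 0.75, 1.0):
    print(f"r = {r:.2f}: {100 * (radius_ratio(r) - 1.0):.2f}% radial displacement")
```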

We used the same experiment procedure described in Section 2.1 to measure the effect of pincushion distortion on distance judgments relative to the baseline condition. We reused the data recorded from the baseline condition conducted in the previous experiment (Section 2.2), which corrected for the pincushion distortion in our display using the methods described in Section 3.2. We compared the results of this baseline condition to two other conditions with intentional pincushion distortion. In the first condition, twelve subjects (7 male, 5 female) made distance judgments with our HMD without any pincushion distortion correction. Section 3.2 provides a detailed discussion of the pincushion distortion present in our HMD. Since some HMDs may have even greater amounts of pincushion distortion than ours, we also examined how exaggerated pincushion distortion affects distance judgments. In this exaggerated condition, the graphics were rendered with the pincushion calibration coefficient k = 0.1 (k is described in Section 3.2). Ten subjects (3 male, 7 female) participated in the exaggerated pincushion distortion condition.

Changing the amount of pincushion distortion correction results in differing black frames around the displayed image (see Figure 4). Since the different frames may also influence distance judgments, all three visual conditions were rendered with the same black frame, which ensured that the edges around the perimeter of the image always appeared straight to the subjects. Since pincushion distortion is somewhat subtle, we have illustrated the visual conditions with a checkerboard image in Figure 5 instead of the hallway actually used in the experiment.

Fig. 5. An illustration of how a checkerboard would appear in the HMD for each of the three visual conditions: (a) baseline (with correction), (b) no correction, (c) exaggerated. The black border is the same in each condition, but the amount of pincushion distortion varies between the images.

As can be seen in the four corners of the images in the figure, each condition was cropped differently by the black frame. For example, the checkerboard squares in the corners of the baseline condition are fully visible but are almost entirely cropped out in the exaggerated distortion condition. The pixels of the scene near the centers of the edges were not affected by the frame. Specifically, the center columns and center rows of the checkerboard are entirely visible in all three conditions in Figure 5. Therefore, both the center horizontal and center vertical fields of view of the display were the same between the three conditions.

Results and discussion. Figure 6 shows the results of the baseline and pincushion distortion conditions.

Fig. 6. Pincushion distortion correction did not significantly change distance judgments.

Subjects walked 80, 86, and 81 percent of the actual target distance in the baseline, uncorrected, and exaggerated conditions, respectively. A 3(condition) x 4(distance) ANOVA showed no effect of pincushion distortion on distance judgments (F(2, 32) = 0.58, p = 0.57). The amount of pincushion distortion present in the exaggerated pincushion distortion condition was likely greater than that found in most HMDs. Therefore, these results suggest that pincushion distortion calibration will not affect distance judgments in most HMDs. When subjects were asked after the experiment if they noticed anything incorrect, unusual, or distracting in the VE, none said that the world looked distorted or that straight lines appeared to be curved. We believe, however, that pincushion distortion is distracting, especially if the user is actively looking for it.
Whereas a small amount of pitch is difficult for an experienced HMD user to notice, a small amount of pincushion distortion is clearly visible when viewing a scene where long straight lines are present.

Although there was no clear effect of the pincushion manipulation on the estimates of distance that we tested, the subjective experience of the distortion suggests that it could have consequences for other spatial behavior. Future work should examine this possibility.

2.4 Minification and magnification

The term field of view (FOV) refers to the horizontal and vertical angles subtended by a display or imaging device. A related term, geometric field of view (gFOV), is the field of view used by the computer to render 3D graphics. In a virtual environment system, the gFOV must match the FOV of the display to prevent unnecessary minification or magnification of the graphics. For example, if the gFOV is larger than the FOV, the final image will be minified. In a real environment, minifying or magnifying lenses can be used to perform the same manipulations.
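One way to reason about this relationship is through the tangents of the half-angles; the sketch below is illustrative only (the experiments specified their conditions directly as image scale factors, and the FOV value here is a placeholder rather than our measured display FOV).

```python
import math

def on_axis_scale(display_fov_deg, geometric_fov_deg):
    # Approximate displayed/intended angular size for small objects near the
    # view direction: < 1 means the image is minified, > 1 magnified.
    return math.tan(math.radians(display_fov_deg) / 2) / \
           math.tan(math.radians(geometric_fov_deg) / 2)

def gfov_for_scale(display_fov_deg, scale):
    # gFOV needed to produce a requested image scaling on a display with the given FOV.
    return 2 * math.degrees(math.atan(math.tan(math.radians(display_fov_deg) / 2) / scale))

fov = 47.0                             # placeholder horizontal display FOV in degrees
print(on_axis_scale(fov, fov))         # 1.0: gFOV matches FOV, no scaling
print(gfov_for_scale(fov, 0.70))       # gFOV wider than the FOV -> 0.70x minification
print(gfov_for_scale(fov, 2.00))       # gFOV narrower than the FOV -> 2.00x magnification
```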

Previous work shows that minification and magnification can change distance judgments in real [Campos et al. 2007] and photographed [Smith 1958b; 1958a; Lumsden 1983; Kraft and Green 1989; Rogers 1995] environments. Other studies have demonstrated that minification increases judgments of distance in HMD-based VEs [Kuhl et al. 2006; 2008] and can counteract the distance compression commonly found in HMDs. Minification should not be thought of as a general solution to distance compression, however, since little is currently known about how it might negatively impact other actions and spatial judgments. The present work expands on the previous minification research in HMDs by examining the effects of magnification on distance judgments.

Minification and magnification change several visual cues that contain absolute distance information. Minification changes three specific cues in a way that can potentially increase perceived distance to objects. First, minification reduces the visual angle between objects in a scene and can therefore decrease the angle of declination and increase perceived distance. Minification does not, however, change the angle of declination if it is measured proprioceptively with head rotations. Second, since minification makes objects subtend a smaller angle on the retina, familiar size cues may also make objects appear more distant. Finally, minification causes binocular convergence to indicate that objects are more distant. This cue, however, can only provide accurate distance information for objects within a few meters. Magnification changes all of these cues in the opposite way and can potentially decrease perceived distance.

We used the same experiment design described in Section 2.1 to compare minification and magnification to the baseline condition used in the previous two experiments. Twelve subjects (7 male, 5 female) participated in a minification condition which scaled the graphics by a factor of 0.70 relative to the baseline condition. With this amount of minification, the visual angle between the targets and the visually indicated horizon is decreased by approximately the same amount as the visual angle between the targets and the horizon with respect to gravity is when the world is pitched up by 5.7°. Two other conditions in our experiment examined how differing amounts of magnification affect distance judgments. In one magnification condition, twelve subjects (8 male, 4 female) viewed imagery which was scaled by a factor of 1/0.70 (1.43) relative to the baseline condition. Ten subjects (7 male, 3 female) participated in another magnification condition in which imagery was scaled by a factor of two relative to the baseline condition. Figure 7 illustrates the differences between the baseline, minified, and magnified conditions.

Results and discussion. On average, subjects judged objects to be 63.1, 76.3, 80.0, and 98.0 percent of the actual distance in the 2.00x magnification, 1.43x magnification, 1.00x baseline, and 0.70x minification conditions, respectively. Minification and magnification had a significant effect on judged distances according to a 4(condition) x 4(distance) ANOVA (F(3,43) = 9.76, p < 0.05). Post-hoc analysis showed a significant difference between the baseline condition and both the 0.70x minification (F(1,23) = 8.06, p < 0.01) and 2.00x magnification (F(1,21) = 5.21, p < 0.05) conditions.
There was no significant change, however, between the baseline and 1.43x magnification conditions (F(1,23) = 0.48, p = 0.50).

Fig. 7. Screenshots of the VE used to measure effects of minification and magnification on distance judgments: (a) baseline, (b) minified (0.70x), (c) magnified (1.43x), (d) magnified (2.00x).

The results of the minification condition are similar to those of Kuhl et al. [2006], which also found that minification increases judged distances to targets on the floor in hallway environments. Unlike previous work, the present work also demonstrates that magnification has the opposite effect on distance judgments. Therefore, researchers who use HMDs for distance perception research must be concerned about unintentional minification and magnification in their displays.

Magnification, however, appears to have a smaller impact on distance judgments than minification. Although the scaling factor between the minification and baseline conditions was the same as that between the baseline and 1.43x magnification conditions, the 1.43x magnification condition did not significantly change distance judgments. An asymmetric effect of minification and magnification on distance judgments has also been reported in the picture perception literature [Smith 1958b; Kraft and Green 1989; Rogers 1995]. Specifically, in picture perception the effects of magnification are often more accurately predicted from the geometry of the distortion than the effects of minification. In our experiments, one might predict that distances should be judged as 56% and 114% of intended in the 1.43x magnification and 0.70x minification conditions, respectively, based on the geometry and the results of the baseline condition. When we compute the percent error of our results relative to these predictions, we find an effect which is contrary to the picture perception findings: the effect of minification on distance judgments in our study is somewhat more accurately predicted by the geometry than it is for magnification.
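The quoted predictions can be reproduced by dividing the baseline ratio by the image scale factor, which is our reading of how they were derived (80%/1.43 ≈ 56% and 80%/0.70 ≈ 114%); the sketch below also computes the percent error against the observed ratios and extends the same rule to the 2.00x condition, a prediction the text does not itself quote.

```python
# Geometric predictions: baseline walked-distance ratio divided by the image
# scale factor, compared against the observed ratios reported above.
baseline_ratio = 0.80
observed = {0.70: 0.980, 1.43: 0.763, 2.00: 0.631}      # measured fraction of target distance

for scale, measured in sorted(observed.items()):
    predicted = baseline_ratio / scale                  # 0.70x -> 114%, 1.43x -> 56%
    error = 100 * (measured - predicted) / predicted
    print(f"{scale:.2f}x: predicted {predicted:.0%}, observed {measured:.1%}, error {error:+.0f}%")
```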

Fig. 8. Minification and magnification significantly changed distance judgments measured by direct blind walking.

Minification and magnification are related to pincushion distortion, since pincushion distortion varies the amount of minification/magnification radially across the screen. Unlike the minification/magnification results, pincushion distortion did not affect distance judgments. There are at least two possible explanations for these differences. First, pincushion distortion causes most straight lines to appear curved and may decrease the usefulness of linear perspective cues, such as a pair of converging lines indicating the location of the horizon. Minification and magnification, however, preserve straight lines. Second, if familiar size cues are used to determine distance, pincushion distortion causes the retinal size of an object to change depending on its location on the screen. It may be possible that people average out the retinal size of the object if they frequently rotate their head.

2.5 Investigating adaptation

People may adapt to the geometric distortions discussed in the present work since these distortions can put visual cues for distance in conflict with other visual cues or with vestibular and proprioceptive information. Over time, people may change which cues they rely on to make distance judgments due to this conflict. To examine whether this adaptation occurs, we added a five-minute viewing period and four extra blind walking trials immediately after the initial seventeen blind walking trials. These extra trials allowed us to compare the average distance walked prior to the viewing period to the average distance walked after the viewing period.

During the viewing period, subjects stood in the virtual hallway and were encouraged to look around. They were instructed not to translate their head through space. Any geometric distortions which occurred during the trials prior to the viewing period were also present during the viewing period.

The subject and experimenter also had a conversation during the five-minute period. The conversation typically included a variety of topics unrelated to the experiment, such as the subject's major and hobbies. After the viewing period, the subject performed four direct blind walking trials to randomized targets at 3, 4, 5, and 6 meters.

We performed a 2(pre-/post-viewing period) x 4(target distance) ANOVA on each of the visual conditions in the present work to assess whether there was a difference between the two sets of trials. There was no effect of the pre-/post-viewing period variable, providing no evidence that adaptation occurred in any of the conditions: baseline (F(1,12) = 0.04, p = 0.85), 0.70x minified (F(1,11) = 0.88, p = 0.37), 1.43x magnified (F(1,11) = 0.50, p = 0.50), 2.00x magnified (F(1,9) = 0.11, p = 0.75), world pitched up (F(1,11) = 1.00, p = 0.34), world pitched down (F(1,11) = 0.01, p = 0.91), no pincushion distortion correction (F(1,11) = 0.43, p = 0.52), and exaggerated pincushion distortion (F(1,9) = 0.62, p = 0.45). The graphs in Figure 9 show results which are representative of those seen in all of the visual conditions.

Fig. 9. Blind walking judgments before and after a five-minute viewing period for (a) the baseline and (b) the 0.70x minification conditions.

We also examined the data to determine if the percent walked (i.e., (distance walked)/(target distance)) changed over time within the trials prior to the viewing period. No strong trends were apparent in any of the visual conditions.

It is important to note that the subjects in the present work were given no feedback about their distance judgments, and they never translated within the space with their eyes open. If subjects were given feedback about the accuracy of their judgments, other work [Richardson and Waller 2005; Mohler et al. 2006] suggests that there would be a strong adaptation effect in all of the conditions. Although minification and magnification do change distance judgments relative to the baseline condition, we were unable to demonstrate that people adapt to the geometric distortions and eventually revert to making distance judgments comparable to the baseline condition. Therefore, the effects of minification and magnification on distance judgments appear to stay the same even after a five-minute viewing period. It is possible, however, that adaptation occurs but is too small to be measured or too slow to happen in five minutes. More work is needed to fully understand if and how minification and magnification change distance judgments over time.
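The percent-walked measure and the pre/post comparison are simple to express in code; the sketch below uses made-up trial values purely to show the data layout, not numbers from the experiment.

```python
import statistics

def percent_walked(trials):
    # trials: iterable of (distance_walked_m, target_distance_m) pairs
    return [walked / target for walked, target in trials]

# Hypothetical trial records for one subject (not data from the experiment):
pre_viewing  = [(2.4, 3.0), (3.3, 4.0), (4.0, 5.0), (4.9, 6.0)]
post_viewing = [(2.5, 3.0), (3.2, 4.0), (4.1, 5.0), (4.8, 6.0)]

print("pre-viewing mean :", round(statistics.mean(percent_walked(pre_viewing)), 3))
print("post-viewing mean:", round(statistics.mean(percent_walked(post_viewing)), 3))
# The analysis above compares such pre/post values with a 2 x 4 ANOVA within
# each visual condition; none of the comparisons reached significance.
```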

3. HMD CALIBRATION

3.1 Pitch calibration

Background. In an HMD where pitch is correctly calibrated, the line of sight to any location in the virtual world will align with the line of sight to the corresponding location in the real world. In the context of distance perception, there is particular interest in ensuring that the virtual horizon appears at the correct location (Figure 1). We developed a calibration procedure where users compare a real marker to another marker displayed in the virtual environment. The pitch of the virtual environment is then adjusted until the two markers are in alignment.

If the optical axis of the HMD is aligned with the orientation sensor, no pitch calibration is necessary, and the pitch information indicated by the orientation sensor can be directly used to orient the virtual camera used to render the scene. Orientation sensors, however, are usually mounted so that they only approximately align with the optical axes. Therefore, the goal of pitch calibration is to measure the relative difference in pitch between the optical axis and the orientation sensor so that we can compensate for any misalignment. There has been little information published on this topic for non-see-through HMD systems.

In most mid- and high-end HMDs, the line of sight to a displayed image point is almost entirely a function of the orientation of the optical axis of the display and the image plane location of the displayed point relative to the center of the display. If a static image is displayed on such an HMD, rotating the display by a small amount will cause the apparent location of any given point in the displayed image to rotate relative to the real world by exactly the same amount. More surprising is what happens if the display is lifted slightly on a user's head while maintaining its orientation. In this case, the image will appear not to move, though the portion of the image that is visible may vary due to occlusion by the exit pupil aperture.

This has an important consequence for pitch calibration. (Pitch is the angle of the line of sight relative to the horizontal plane or, equivalently, relative to up.) In order to render a virtual world location such that the pitch of the line of sight to that location is correct relative to horizontal in both the real and virtual worlds, it is necessary to know the pitch of the optical axis of the display relative to real-world horizontal. For example, consider a virtual world location that is intended to be seen 30° below horizontal, viewed by a user wearing an HMD that is pitched down by 5° on his or her head. If the user tips his or her head down by 10°, the location of interest needs to be rendered such that it appears 15° (30° - 10° - 5°) from the optical axis of the display.

Since the frame of reference used by orientation trackers is rarely aligned with the optical axis of the display, an orientation calibration procedure is needed. One common calibration procedure starts every HMD session by asking the user to hold his or her head level with eyes closed and then treats whatever value is returned by the orientation sensor as indicating that the user is looking horizontally. This procedure is fundamentally in error because it ignores all optical properties of the HMD. In the example above, this calibration procedure would cause orientation to be presumed to be level when the optical axis of the display is pitched down by 5°.

When the user later pitches his or her head down 10°, the target would be rendered such that it appeared 20° from the optical axis of the display. The apparent line of sight relative to the real world of a rendered point depends only on the orientation of the optical axis with respect to the real-world frame of reference and the image location of the point. In this situation, the location of interest would appear 5° lower relative to horizontal than it should. Instead, tracker orientation should be calibrated so that the angle of the optical axis of the display relative to real-world horizontal can be determined, independent of how a user might be wearing the HMD.

Calibrating orientation based on how users wear an HMD when attempting to hold their head level has an additional disadvantage, since there can be large variability in how people level their head or wear the display. To measure this variability, we asked the subjects who participated in our distance judgment experiments (Section 2) to perform this procedure after they put on the HMD and before the virtual hallway was displayed. Subjects were told to close their eyes, stand up straight, imagine that they were standing in front of a mirror, and point their head straight ahead as if they were looking at their own eyes in the imaginary mirror in front of them. Next, we recorded the pitch indicated by the orientation sensor but did not use the value to render the graphics. For twelve subjects in one of our visual conditions, we found that the orientation sensor was pitched an average of 1° (relative to level as determined by our pitch calibration procedure discussed below). While the average was nearly accurate, there was a 22° range (standard deviation of 6.7°) in pitch across subjects. This variability could be caused by subjects pointing their head inaccurately or by the HMD sitting differently on subjects' heads. Although we found that a 5.7° pitched world does not affect distance judgments (Section 2.2), greater amounts of pitch may affect distance judgments or other indications of space perception. This experiment demonstrates that relying on this procedure can introduce a significant and unnecessary between-subject variable.

Calibration procedure. We developed a new calibration procedure that does not depend on subjects' ability to look straight ahead when asked and makes no assumptions about how the HMD is sitting on a subject's head. Instead, it relies on the assumption that people can match a horizon rendered in the virtual world with the real world horizon. When the real and virtual horizons are aligned, we can conclude that the pitch of the virtual world is correctly calibrated. The amount we need to adjust the values returned by the orientation sensor to align the real and virtual horizons corresponds to the amount of misalignment between the sensor and the optical axis of the display. For example, if the orientation sensor is pitched up 10° relative to the optical axis, the virtual world will appear to be pitched down by 10° prior to calibration, regardless of how the HMD is worn on the user's head. When the user performs our calibration procedure, the virtual world will be pitched until the virtual horizon aligns with the real horizon.
In this case, the virtual world would have to be pitched up 10° or, equivalently, 10° would be subtracted from the pitch value provided by the orientation sensor. In the situation where the orientation sensor is already aligned with the optical axis, the user would notice that the real and virtual horizons were already aligned and would recognize that no additional pitch adjustment is necessary.
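Once the per-eye offset has been measured with the alignment procedure described next, applying it at render time is a single subtraction; the sketch below is a minimal illustration of that adjustment (the function and variable names are ours, not from the system described here).

```python
def corrected_camera_pitch(sensor_pitch_deg, sensor_offset_deg):
    # Pitch used to orient the rendering camera, given the raw tracker pitch and
    # the sensor-vs-optical-axis offset found by the horizon-alignment procedure.
    # A positive offset means the sensor reads pitched up relative to the optical
    # axis, as in the 10-degree example above.
    return sensor_pitch_deg - sensor_offset_deg

# Example from the text: the sensor is pitched up 10 degrees relative to the
# optical axis, so 10 degrees is subtracted from every reading before rendering.
print(corrected_camera_pitch(0.0, 10.0))    # -10.0
print(corrected_camera_pitch(25.0, 10.0))   # 15.0
# Separately measured left- and right-eye offsets (reported below) would be
# applied in the same way, one per eye's virtual camera.
```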

In the first step of our procedure, we measured the subject's eye height and placed a horizontal strip of tape on a wall in the laboratory such that the top of the tape matched the measured eye height. For that subject, the top of the tape would align with the real-world horizon if it were visible. In the HMD, we displayed a colored rectangle that was as tall as the subject's eye height to accurately render the location of the horizon in the virtual world. The virtual rectangle was located at the same position as the real tape so that the bottom of the rectangle matched the intersection of the real wall with the floor and the top of the virtual rectangle aligned with the top of the tape. To verify that the placement of the rectangle was correct, subjects compared the virtual rectangle with the tape while standing as close as possible to the tape. Subjects performed this comparison by repeatedly raising and lowering the HMD onto their head. At short distances, the top of the tape and the top of the virtual rectangle should appear to line up almost exactly even if the virtual world is incorrectly pitched by a small amount.

Next, subjects stood approximately nine meters directly in front of the tape and the virtual rectangle. By standing at a distance from the tape, the effect of any small errors in the placement of the tape or in the tracking system's height measurements is reduced. Subjects then compared the alignment of the virtual rectangle with the real tape by repeatedly lifting and lowering the HMD onto their head. Each subject performed four matching trials for each eye. For half of the trials, the virtual world was initially pitched up by more than 5°. For the other trials, it was pitched down. Subjects then indicated to the experimenter whether the virtual environment had to be pitched up or down to align the top of the virtual rectangle with the top of the real horizontal strip of tape. The experimenter followed the instructions given by the subject and pressed buttons on the computer which pitched the world up or down in quarter-degree increments. When the subjects were satisfied with the alignment, the experimenter recorded the amount of pitch adjustment that needed to be applied to the orientation sensor to produce the alignment.

During this calibration procedure, subjects were told to keep the HMD as level as possible as they raised and lowered it. It is not critical, however, that the HMD be perfectly level during this procedure as long as there is no minification, magnification, or pincushion distortion in the display. Therefore, we performed this pitch calibration procedure after measuring the field of view of our HMD (Section 3.3) and correcting for pincushion distortion (Section 3.2).

To understand why this procedure works regardless of the pitch of the HMD on the subject's head, consider the situation where the subject is comparing the virtual rectangle and the real tape with the HMD pitched up ten degrees. The HMD pitch causes the orientation sensor, optical axis, and screens to be pitched up by the same amount. The pitched orientation sensor causes the virtual camera to be pitched up by ten degrees, which pitches the virtual world down on the screens by ten degrees. The fact that the horizon appears lower on the screens is compensated by the fact that the screens appear higher since the HMD is pitched up.
As a result, the virtual rectangle should appear stationary relative to the real tape regardless of the HMD's orientation on the subject's head. Therefore, even if a subject pitches the HMD during the alignment procedure, it still provides an accurate measure of the misalignment between the sensor and the optical axis.
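The cancellation argument can be made concrete with a few lines of arithmetic; the sketch below assumes an idealized display in which a rendered point is seen at the optical-axis pitch plus its on-screen elevation, and is only an illustration of the reasoning above.

```python
def perceived_horizon_elevation(hmd_pitch_deg, sensor_misalignment_deg):
    # Idealized model: the tracker reports the optical-axis pitch plus a fixed
    # misalignment, the virtual horizon is rendered at (0 - reported pitch)
    # relative to the optical axis, and a screen point is seen in the real world
    # at (optical-axis pitch + its on-screen elevation).
    reported_pitch = hmd_pitch_deg + sensor_misalignment_deg
    on_screen_elevation = 0.0 - reported_pitch
    return hmd_pitch_deg + on_screen_elevation

# However the HMD sits on the head, the apparent error equals the misalignment:
for hmd_pitch in (-10.0, 0.0, 10.0):
    print(hmd_pitch, perceived_horizon_elevation(hmd_pitch, sensor_misalignment_deg=10.0))
# Prints -10.0 in every case: the virtual horizon appears 10 degrees too low,
# which is exactly the quantity the alignment procedure measures.
```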

Six lab members performed this pitch calibration procedure with our HMD. All lab members had normal vision without eyeglasses, since eyeglasses can introduce small amounts of pitch. We found that the pitch indicated by the orientation sensor had to be decreased (the virtual world pitched up) for both eyes in order for the virtual world to be level. The median values of the per-subject averages were 1.5° and 2.2° for the left and right eyes, respectively. A paired t-test showed a statistically significant difference between the left and right eyes (t(5) = 3.57, p < 0.02). The difference between the left and right eyes indicates that the two optical axes are not perfectly aligned in our display. One-sample t-tests showed that both the left (t(5) = 4.60, p < 0.01) and right (t(5) = 6.66, p < 0.01) eye results differed significantly from zero. This result shows that both the left and right eye optical axes are misaligned relative to the orientation sensor. Now that we have measured the misalignment in each eye, we can use these numbers to adjust the pitch values indicated by the sensor before orienting the virtual cameras used to render the left and right eyes. The same amount of pitch compensation can be used for every user without performing a per-subject leveling procedure.

3.2 Pincushion distortion correction

Several methods can be used to correct for pincushion distortion [Tsai 1987; Weng et al. 1992; Robinett and Rolland 1992; Bax 2004; Watson and Hodges 1995]. For example, one real-time method uses a textured mesh that is warped to compensate for the distortion [Watson and Hodges 1995; Bax 2004]. We used a fragment program, instead of a textured mesh, to correct for the distortion in real time. The first step in our method is to render a normal image of the scene to a texture. In our implementation, we render to a 1280x1024 rectangular texture that is the same size as one of the screens in our HMD. We apply this texture with a fragment program to a quad which covers the entire screen. The fragment program is effectively run for every pixel on the screen and corrects for the pincushion distortion using the following steps:

(1) The program looks up the screen coordinate the fragment program is being run on. Next, we calculate the distance between the current screen coordinate and the center of the screen. We normalize this distance such that the distance from the center of the screen to the corner is 1. We also record which direction the point is from the center of the screen.

(2) Using the radial distance from the center of the screen (r_s), we calculate the radial distance we should use to look up into the texture (r_t): r_t = r_s + k * r_s^3. This equation is an approximation of pincushion distortion, where k is a number that represents the amount of distortion. Next, we reverse the normalization procedure on r_t and use it, along with the direction recorded in step 1, to calculate the texture coordinate of the pixel we want to display.

(3) Finally, we look up the color of the pixel in the texture at that texture coordinate and use that color at the current position on the screen. If we look up a pixel that falls outside of the texture, the pixel is displayed as black.
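A minimal CPU-side sketch of steps (1)-(3), re-sampling an already-rendered image with NumPy; the actual implementation is a GPU fragment program, and the k value used here is purely illustrative rather than our HMD's measured coefficient.

```python
import numpy as np

def correct_pincushion(rendered, k):
    # For every output pixel: normalize its offset from the image center so the
    # center-to-corner distance is 1 (step 1), push the radius outward with
    # r_t = r_s + k * r_s**3 (step 2), and copy the color found at that source
    # location, leaving pixels whose lookup falls off the image black (step 3).
    h, w = rendered.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    norm = np.hypot(cx, cy)                        # center-to-corner distance
    ys, xs = np.mgrid[0:h, 0:w]
    dx, dy = (xs - cx) / norm, (ys - cy) / norm    # normalized offset from center
    r_s = np.hypot(dx, dy)
    with np.errstate(invalid="ignore", divide="ignore"):
        gain = np.where(r_s > 0, (r_s + k * r_s ** 3) / r_s, 1.0)   # r_t / r_s
    src_x = np.rint(cx + dx * gain * norm).astype(int)
    src_y = np.rint(cy + dy * gain * norm).astype(int)
    inside = (src_x >= 0) & (src_x < w) & (src_y >= 0) & (src_y < h)
    out = np.zeros_like(rendered)                  # off-texture lookups stay black
    out[inside] = rendered[src_y[inside], src_x[inside]]
    return out

# Usage on a synthetic checkerboard the size of one HMD screen:
checker = (np.indices((1024, 1280)).sum(axis=0) // 64 % 2 * 255).astype(np.uint8)
corrected = correct_pincushion(checker, k=0.06)
```

The same inverse mapping carries over directly to a per-fragment shader, with the array indexing replaced by a texture lookup.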


I R UNDERGRADUATE REPORT. Hardware and Design Factors for the Implementation of Virtual Reality as a Training Tool. by Walter Miranda Advisor: UNDERGRADUATE REPORT Hardware and Design Factors for the Implementation of Virtual Reality as a Training Tool by Walter Miranda Advisor: UG 2006-10 I R INSTITUTE FOR SYSTEMS RESEARCH ISR develops, applies

More information

NREM 345 Week 2, Material covered this week contributes to the accomplishment of the following course goal:

NREM 345 Week 2, Material covered this week contributes to the accomplishment of the following course goal: NREM 345 Week 2, 2010 Reading assignment: Chapter. 4 and Sec. 5.1 to 5.2.4 Material covered this week contributes to the accomplishment of the following course goal: Goal 1: Develop the understanding and

More information

MOTION PARALLAX AND ABSOLUTE DISTANCE. Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673

MOTION PARALLAX AND ABSOLUTE DISTANCE. Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673 MOTION PARALLAX AND ABSOLUTE DISTANCE by Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673 Bureau of Medicine and Surgery, Navy Department Research

More information

Determination of Focal Length of A Converging Lens and Mirror

Determination of Focal Length of A Converging Lens and Mirror Physics 41 Determination of Focal Length of A Converging Lens and Mirror Objective: Apply the thin-lens equation and the mirror equation to determine the focal length of a converging (biconvex) lens and

More information

GROUPING BASED ON PHENOMENAL PROXIMITY

GROUPING BASED ON PHENOMENAL PROXIMITY Journal of Experimental Psychology 1964, Vol. 67, No. 6, 531-538 GROUPING BASED ON PHENOMENAL PROXIMITY IRVIN ROCK AND LEONARD BROSGOLE l Yeshiva University The question was raised whether the Gestalt

More information

Elucidating Factors that can Facilitate Veridical Spatial Perception in Immersive Virtual Environments

Elucidating Factors that can Facilitate Veridical Spatial Perception in Immersive Virtual Environments Elucidating Factors that can Facilitate Veridical Spatial Perception in Immersive Virtual Environments Victoria Interrante 1, Brian Ries 1, Jason Lindquist 1, and Lee Anderson 2 1 Department of Computer

More information

P202/219 Laboratory IUPUI Physics Department THIN LENSES

P202/219 Laboratory IUPUI Physics Department THIN LENSES THIN LENSES OBJECTIVE To verify the thin lens equation, m = h i /h o = d i /d o. d o d i f, and the magnification equations THEORY In the above equations, d o is the distance between the object and the

More information

NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS

NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS Xianjun Sam Zheng, George W. McConkie, and Benjamin Schaeffer Beckman Institute, University of Illinois at Urbana Champaign This present

More information

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc.

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc. Human Vision and Human-Computer Interaction Much content from Jeff Johnson, UI Wizards, Inc. are these guidelines grounded in perceptual psychology and how can we apply them intelligently? Mach bands:

More information

The principles of CCTV design in VideoCAD

The principles of CCTV design in VideoCAD The principles of CCTV design in VideoCAD 1 The principles of CCTV design in VideoCAD Part VI Lens distortion in CCTV design Edition for VideoCAD 8 Professional S. Utochkin In the first article of this

More information

The Influence of Restricted Viewing Conditions on Egocentric Distance Perception: Implications for Real and Virtual Environments

The Influence of Restricted Viewing Conditions on Egocentric Distance Perception: Implications for Real and Virtual Environments The Influence of Restricted Viewing Conditions on Egocentric Distance Perception: Implications for Real and Virtual Environments Sarah H. Creem-Regehr 1, Peter Willemsen 2, Amy A. Gooch 2, and William

More information

Adding Content and Adjusting Layers

Adding Content and Adjusting Layers 56 The Official Photodex Guide to ProShow Figure 3.10 Slide 3 uses reversed duplicates of one picture on two separate layers to create mirrored sets of frames and candles. (Notice that the Window Display

More information

Estimating distances and traveled distances in virtual and real environments

Estimating distances and traveled distances in virtual and real environments University of Iowa Iowa Research Online Theses and Dissertations Fall 2011 Estimating distances and traveled distances in virtual and real environments Tien Dat Nguyen University of Iowa Copyright 2011

More information

ECEN 4606, UNDERGRADUATE OPTICS LAB

ECEN 4606, UNDERGRADUATE OPTICS LAB ECEN 4606, UNDERGRADUATE OPTICS LAB Lab 2: Imaging 1 the Telescope Original Version: Prof. McLeod SUMMARY: In this lab you will become familiar with the use of one or more lenses to create images of distant

More information

Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming)

Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming) Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming) Purpose: The purpose of this lab is to introduce students to some of the properties of thin lenses and mirrors.

More information

Be aware that there is no universal notation for the various quantities.

Be aware that there is no universal notation for the various quantities. Fourier Optics v2.4 Ray tracing is limited in its ability to describe optics because it ignores the wave properties of light. Diffraction is needed to explain image spatial resolution and contrast and

More information

Effects of Visual and Proprioceptive Information in Visuo-Motor Calibration During a Closed-Loop Physical Reach Task in Immersive Virtual Environments

Effects of Visual and Proprioceptive Information in Visuo-Motor Calibration During a Closed-Loop Physical Reach Task in Immersive Virtual Environments Effects of Visual and Proprioceptive Information in Visuo-Motor Calibration During a Closed-Loop Physical Reach Task in Immersive Virtual Environments Elham Ebrahimi, Bliss Altenhoff, Leah Hartman, J.

More information

The introduction and background in the previous chapters provided context in

The introduction and background in the previous chapters provided context in Chapter 3 3. Eye Tracking Instrumentation 3.1 Overview The introduction and background in the previous chapters provided context in which eye tracking systems have been used to study how people look at

More information

The key to a fisheye is the relationship between latitude ø of the 3D vector and radius on the 2D fisheye image, namely a linear one where

The key to a fisheye is the relationship between latitude ø of the 3D vector and radius on the 2D fisheye image, namely a linear one where Fisheye mathematics Fisheye image y 3D world y 1 r P θ θ -1 1 x ø x (x,y,z) -1 z Any point P in a linear (mathematical) fisheye defines an angle of longitude and latitude and therefore a 3D vector into

More information

Optical Marionette: Graphical Manipulation of Human s Walking Direction

Optical Marionette: Graphical Manipulation of Human s Walking Direction Optical Marionette: Graphical Manipulation of Human s Walking Direction Akira Ishii, Ippei Suzuki, Shinji Sakamoto, Keita Kanai Kazuki Takazawa, Hiraku Doi, Yoichi Ochiai (Digital Nature Group, University

More information

Dual-fisheye Lens Stitching for 360-degree Imaging & Video. Tuan Ho, PhD. Student Electrical Engineering Dept., UT Arlington

Dual-fisheye Lens Stitching for 360-degree Imaging & Video. Tuan Ho, PhD. Student Electrical Engineering Dept., UT Arlington Dual-fisheye Lens Stitching for 360-degree Imaging & Video Tuan Ho, PhD. Student Electrical Engineering Dept., UT Arlington Introduction 360-degree imaging: the process of taking multiple photographs and

More information

Chapter 34 Geometric Optics

Chapter 34 Geometric Optics Chapter 34 Geometric Optics Lecture by Dr. Hebin Li Goals of Chapter 34 To see how plane and curved mirrors form images To learn how lenses form images To understand how a simple image system works Reflection

More information

Analyzing the Effect of a Virtual Avatarʼs Geometric and Motion Fidelity on Ego-Centric Spatial Perception in Immersive Virtual Environments

Analyzing the Effect of a Virtual Avatarʼs Geometric and Motion Fidelity on Ego-Centric Spatial Perception in Immersive Virtual Environments Analyzing the Effect of a Virtual Avatarʼs Geometric and Motion Fidelity on Ego-Centric Spatial Perception in Immersive Virtual Environments Brian Ries *, Victoria Interrante, Michael Kaeding, and Lane

More information

Chapter 34. Images. Copyright 2014 John Wiley & Sons, Inc. All rights reserved.

Chapter 34. Images. Copyright 2014 John Wiley & Sons, Inc. All rights reserved. Chapter 34 Images Copyright 34-1 Images and Plane Mirrors Learning Objectives 34.01 Distinguish virtual images from real images. 34.02 Explain the common roadway mirage. 34.03 Sketch a ray diagram for

More information

TenMarks Curriculum Alignment Guide: EngageNY/Eureka Math, Grade 7

TenMarks Curriculum Alignment Guide: EngageNY/Eureka Math, Grade 7 EngageNY Module 1: Ratios and Proportional Relationships Topic A: Proportional Relationships Lesson 1 Lesson 2 Lesson 3 Understand equivalent ratios, rate, and unit rate related to a Understand proportional

More information

Basic Optics System OS-8515C

Basic Optics System OS-8515C 40 50 30 60 20 70 10 80 0 90 80 10 20 70 T 30 60 40 50 50 40 60 30 70 20 80 90 90 80 BASIC OPTICS RAY TABLE 10 0 10 70 20 60 50 40 30 Instruction Manual with Experiment Guide and Teachers Notes 012-09900B

More information

doi: /

doi: / doi: 10.1117/12.872287 Coarse Integral Volumetric Imaging with Flat Screen and Wide Viewing Angle Shimpei Sawada* and Hideki Kakeya University of Tsukuba 1-1-1 Tennoudai, Tsukuba 305-8573, JAPAN ABSTRACT

More information

Refraction, Lenses, and Prisms

Refraction, Lenses, and Prisms CHAPTER 16 14 SECTION Sound and Light Refraction, Lenses, and Prisms KEY IDEAS As you read this section, keep these questions in mind: What happens to light when it passes from one medium to another? How

More information

Object Perception. 23 August PSY Object & Scene 1

Object Perception. 23 August PSY Object & Scene 1 Object Perception Perceiving an object involves many cognitive processes, including recognition (memory), attention, learning, expertise. The first step is feature extraction, the second is feature grouping

More information

Physics 208 Spring 2008 Lab 2: Lenses and the eye

Physics 208 Spring 2008 Lab 2: Lenses and the eye Name Section Physics 208 Spring 2008 Lab 2: Lenses and the eye Your TA will use this sheet to score your lab. It is to be turned in at the end of lab. You must use complete sentences and clearly explain

More information

Collision judgment when viewing minified images through a HMD visual field expander

Collision judgment when viewing minified images through a HMD visual field expander Collision judgment when viewing minified images through a HMD visual field expander Gang Luo, Lee Lichtenstein, Eli Peli Schepens Eye Research Institute Department of Ophthalmology, Harvard Medical School,

More information

MADE EASY a step-by-step guide

MADE EASY a step-by-step guide Perspective MADE EASY a step-by-step guide Coming soon! June 2015 ROBBIE LEE One-Point Perspective Let s start with one of the simplest, yet most useful approaches to perspective drawing: one-point perspective.

More information

Topic 6 - Optics Depth of Field and Circle Of Confusion

Topic 6 - Optics Depth of Field and Circle Of Confusion Topic 6 - Optics Depth of Field and Circle Of Confusion Learning Outcomes In this lesson, we will learn all about depth of field and a concept known as the Circle of Confusion. By the end of this lesson,

More information

EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1

EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1 EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1 Abstract Navigation is an essential part of many military and civilian

More information

Thin Lenses * OpenStax

Thin Lenses * OpenStax OpenStax-CNX module: m58530 Thin Lenses * OpenStax This work is produced by OpenStax-CNX and licensed under the Creative Commons Attribution License 4.0 By the end of this section, you will be able to:

More information

Improving distance perception in virtual reality

Improving distance perception in virtual reality Graduate Theses and Dissertations Graduate College 2015 Improving distance perception in virtual reality Zachary Daniel Siegel Iowa State University Follow this and additional works at: http://lib.dr.iastate.edu/etd

More information

MULTIPLE SENSORS LENSLETS FOR SECURE DOCUMENT SCANNERS

MULTIPLE SENSORS LENSLETS FOR SECURE DOCUMENT SCANNERS INFOTEH-JAHORINA Vol. 10, Ref. E-VI-11, p. 892-896, March 2011. MULTIPLE SENSORS LENSLETS FOR SECURE DOCUMENT SCANNERS Jelena Cvetković, Aleksej Makarov, Sasa Vujić, Vlatacom d.o.o. Beograd Abstract -

More information

Towards Quantifying Depth and Size Perception in 3D Virtual Environments

Towards Quantifying Depth and Size Perception in 3D Virtual Environments -1- Towards Quantifying Depth and Size Perception in 3D Virtual Environments Jannick P. Rolland*, Christina A. Burbeck, William Gibson*, and Dan Ariely Departments of *Computer Science, CB 3175, and Psychology,

More information

Graphing Techniques. Figure 1. c 2011 Advanced Instructional Systems, Inc. and the University of North Carolina 1

Graphing Techniques. Figure 1. c 2011 Advanced Instructional Systems, Inc. and the University of North Carolina 1 Graphing Techniques The construction of graphs is a very important technique in experimental physics. Graphs provide a compact and efficient way of displaying the functional relationship between two experimental

More information

Application of 3D Terrain Representation System for Highway Landscape Design

Application of 3D Terrain Representation System for Highway Landscape Design Application of 3D Terrain Representation System for Highway Landscape Design Koji Makanae Miyagi University, Japan Nashwan Dawood Teesside University, UK Abstract In recent years, mixed or/and augmented

More information

Table of Contents DSM II. Lenses and Mirrors (Grades 5 6) Place your order by calling us toll-free

Table of Contents DSM II. Lenses and Mirrors (Grades 5 6) Place your order by calling us toll-free DSM II Lenses and Mirrors (Grades 5 6) Table of Contents Actual page size: 8.5" x 11" Philosophy and Structure Overview 1 Overview Chart 2 Materials List 3 Schedule of Activities 4 Preparing for the Activities

More information

Photographing Long Scenes with Multiviewpoint

Photographing Long Scenes with Multiviewpoint Photographing Long Scenes with Multiviewpoint Panoramas A. Agarwala, M. Agrawala, M. Cohen, D. Salesin, R. Szeliski Presenter: Stacy Hsueh Discussant: VasilyVolkov Motivation Want an image that shows an

More information

Properties of Structured Light

Properties of Structured Light Properties of Structured Light Gaussian Beams Structured light sources using lasers as the illumination source are governed by theories of Gaussian beams. Unlike incoherent sources, coherent laser sources

More information

Person s Optics Test KEY SSSS

Person s Optics Test KEY SSSS Person s Optics Test KEY SSSS 2017-18 Competitors Names: School Name: All questions are worth one point unless otherwise stated. Show ALL WORK or you may not receive credit. Include correct units whenever

More information

MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question.

MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. Exam Name MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. 1) A plane mirror is placed on the level bottom of a swimming pool that holds water (n =

More information

Learning relative directions between landmarks in a desktop virtual environment

Learning relative directions between landmarks in a desktop virtual environment Spatial Cognition and Computation 1: 131 144, 1999. 2000 Kluwer Academic Publishers. Printed in the Netherlands. Learning relative directions between landmarks in a desktop virtual environment WILLIAM

More information

GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS

GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS Equipment and accessories: an optical bench with a scale, an incandescent lamp, matte, a set of

More information

30 Lenses. Lenses change the paths of light.

30 Lenses. Lenses change the paths of light. Lenses change the paths of light. A light ray bends as it enters glass and bends again as it leaves. Light passing through glass of a certain shape can form an image that appears larger, smaller, closer,

More information

Where should the fisherman aim? The fish is not moving.

Where should the fisherman aim? The fish is not moving. Where should the fisherman aim? The fish is not moving. When a wave hits a boundary it can Reflect Refract Reflect and Refract Be Absorbed Refraction The change in speed and direction of a wave Due to

More information

Exploring 3D in Flash

Exploring 3D in Flash 1 Exploring 3D in Flash We live in a three-dimensional world. Objects and spaces have width, height, and depth. Various specialized immersive technologies such as special helmets, gloves, and 3D monitors

More information

Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task

Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, MANUSCRIPT ID 1 Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task Eric D. Ragan, Regis

More information

Scene layout from ground contact, occlusion, and motion parallax

Scene layout from ground contact, occlusion, and motion parallax VISUAL COGNITION, 2007, 15 (1), 4868 Scene layout from ground contact, occlusion, and motion parallax Rui Ni and Myron L. Braunstein University of California, Irvine, CA, USA George J. Andersen University

More information

Chapter 36. Image Formation

Chapter 36. Image Formation Chapter 36 Image Formation Image of Formation Images can result when light rays encounter flat or curved surfaces between two media. Images can be formed either by reflection or refraction due to these

More information

Perceiving binocular depth with reference to a common surface

Perceiving binocular depth with reference to a common surface Perception, 2000, volume 29, pages 1313 ^ 1334 DOI:10.1068/p3113 Perceiving binocular depth with reference to a common surface Zijiang J He Department of Psychological and Brain Sciences, University of

More information

Parity and Plane Mirrors. Invert Image flip about a horizontal line. Revert Image flip about a vertical line.

Parity and Plane Mirrors. Invert Image flip about a horizontal line. Revert Image flip about a vertical line. Optical Systems 37 Parity and Plane Mirrors In addition to bending or folding the light path, reflection from a plane mirror introduces a parity change in the image. Invert Image flip about a horizontal

More information

TRAFFIC SIGN DETECTION AND IDENTIFICATION.

TRAFFIC SIGN DETECTION AND IDENTIFICATION. TRAFFIC SIGN DETECTION AND IDENTIFICATION Vaughan W. Inman 1 & Brian H. Philips 2 1 SAIC, McLean, Virginia, USA 2 Federal Highway Administration, McLean, Virginia, USA Email: vaughan.inman.ctr@dot.gov

More information

Optics Laboratory Spring Semester 2017 University of Portland

Optics Laboratory Spring Semester 2017 University of Portland Optics Laboratory Spring Semester 2017 University of Portland Laser Safety Warning: The HeNe laser can cause permanent damage to your vision. Never look directly into the laser tube or at a reflection

More information

Readings: Hecht, Chapter 24

Readings: Hecht, Chapter 24 5. GEOMETRIC OPTICS Readings: Hecht, Chapter 24 Introduction In this lab you will measure the index of refraction of glass using Snell s Law, study the application of the laws of geometric optics to systems

More information

12.1. Human Perception of Light. Perceiving Light

12.1. Human Perception of Light. Perceiving Light 12.1 Human Perception of Light Here is a summary of what you will learn in this section: Focussing of light in your eye is accomplished by the cornea, the lens, and the fluids contained in your eye. Light

More information

Basics of Photogrammetry Note#6

Basics of Photogrammetry Note#6 Basics of Photogrammetry Note#6 Photogrammetry Art and science of making accurate measurements by means of aerial photography Analog: visual and manual analysis of aerial photographs in hard-copy format

More information

Psychophysics of night vision device halo

Psychophysics of night vision device halo University of Wollongong Research Online Faculty of Health and Behavioural Sciences - Papers (Archive) Faculty of Science, Medicine and Health 2009 Psychophysics of night vision device halo Robert S Allison

More information

28 Thin Lenses: Ray Tracing

28 Thin Lenses: Ray Tracing 28 Thin Lenses: Ray Tracing A lens is a piece of transparent material whose surfaces have been shaped so that, when the lens is in another transparent material (call it medium 0), light traveling in medium

More information

Consumer Behavior when Zooming and Cropping Personal Photographs and its Implications for Digital Image Resolution

Consumer Behavior when Zooming and Cropping Personal Photographs and its Implications for Digital Image Resolution Consumer Behavior when Zooming and Cropping Personal Photographs and its Implications for Digital Image Michael E. Miller and Jerry Muszak Eastman Kodak Company Rochester, New York USA Abstract This paper

More information

Robert B.Hallock Draft revised April 11, 2006 finalpaper2.doc

Robert B.Hallock Draft revised April 11, 2006 finalpaper2.doc How to Optimize the Sharpness of Your Photographic Prints: Part II - Practical Limits to Sharpness in Photography and a Useful Chart to Deteremine the Optimal f-stop. Robert B.Hallock hallock@physics.umass.edu

More information

Panoramic imaging. Ixyzϕθλt. 45 degrees FOV (normal view)

Panoramic imaging. Ixyzϕθλt. 45 degrees FOV (normal view) Camera projections Recall the plenoptic function: Panoramic imaging Ixyzϕθλt (,,,,,, ) At any point xyz,, in space, there is a full sphere of possible incidence directions ϕ, θ, covered by 0 ϕ 2π, 0 θ

More information

The ground dominance effect in the perception of 3-D layout

The ground dominance effect in the perception of 3-D layout Perception & Psychophysics 2005, 67 (5), 802-815 The ground dominance effect in the perception of 3-D layout ZHENG BIAN and MYRON L. BRAUNSTEIN University of California, Irvine, California and GEORGE J.

More information

Imaging Optics Fundamentals

Imaging Optics Fundamentals Imaging Optics Fundamentals Gregory Hollows Director, Machine Vision Solutions Edmund Optics Why Are We Here? Topics for Discussion Fundamental Parameters of your system Field of View Working Distance

More information

Perception of room size and the ability of self localization in a virtual environment. Loudspeaker experiment

Perception of room size and the ability of self localization in a virtual environment. Loudspeaker experiment Perception of room size and the ability of self localization in a virtual environment. Loudspeaker experiment Marko Horvat University of Zagreb Faculty of Electrical Engineering and Computing, Zagreb,

More information

EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES 4.2 AIM 4.1 INTRODUCTION

EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES 4.2 AIM 4.1 INTRODUCTION EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES Structure 4.1 Introduction 4.2 Aim 4.3 What is Parallax? 4.4 Locating Images 4.5 Investigations with Real Images Focal Length of a Concave Mirror Focal

More information

In addition to one-point perespective, another common perspective

In addition to one-point perespective, another common perspective CHAPTR 5 Two-Point Perspective In addition to one-point perespective, another common perspective drawing technique is two-point perspective, illustrated in Figure 5.1. Unless otherwise stated, we will

More information

7 th grade Math Standards Priority Standard (Bold) Supporting Standard (Regular)

7 th grade Math Standards Priority Standard (Bold) Supporting Standard (Regular) 7 th grade Math Standards Priority Standard (Bold) Supporting Standard (Regular) Unit #1 7.NS.1 Apply and extend previous understandings of addition and subtraction to add and subtract rational numbers;

More information

.VP CREATING AN INVENTED ONE POINT PERSPECTIVE SPACE

.VP CREATING AN INVENTED ONE POINT PERSPECTIVE SPACE PAGE ONE Organize an invented 1 point perspective drawing in the following order: 1 Establish an eye level 2 Establish a Center Line Vision eye level vision Remember that the vanishing point () in one

More information