Towards Quantifying Depth and Size Perception in 3D Virtual Environments

Jannick P. Rolland*, Christina A. Burbeck, William Gibson*, and Dan Ariely
Departments of *Computer Science, CB 3175, and Psychology, CB 3270, University of North Carolina, Chapel Hill NC 27599, U.S.A.

Abstract

With the fast advance of real-time computer graphics, head-mounted displays (HMDs) have become popular tools for 3D visualization. One of the most promising and challenging future uses of HMDs, however, is in applications where virtual environments enhance rather than replace real environments. The difficulty comes from the fact that the relative registration of real and virtual environments is a difficult task to perform from both a computational and a perceptual point of view, the two being tightly correlated. Given a computational model for stereoscopic viewing, a better understanding of depth and size perception in virtual environments is called for, as well as of the relationships between the perceived depths and sizes of real and virtual objects sharing a common visual space. This paper first discusses the important parameters within the model that need to be carefully evaluated and set for accurate calibration of the system. We then describe the experimental paradigm used for assessing depth perception of two generic objects, in this case a cube and a cylinder. Finally, experimental results on perceived depth of real and virtual generic objects are presented.

1. Introduction

The advance of 3D visualization techniques presents us with the problem of integrating simulated 3D environments with real 3D environments that we would like to experience simultaneously. If a 3D image is viewed directly on a CRT display, both the display and the image are seen simultaneously. If a 3D image is viewed through a see-through HMD, the intermingling of the real and simulated 3D worlds is even more complex. To provide a seamless integration of the real and virtual images, the absolute depths and sizes of the objects in the two images must correspond appropriately. Then, and only then, will the subject's overall percept be consistent across the 3D space.

Theoretically, see-through can be accomplished using an optical or a video see-through HMD. If video is used to combine real and virtual information, however, the real world is often replaced by a limited field of view projection of the real world and a new set of problems unfolds. Moreover, there are situations where awareness of the real world is desirable or necessary. For those situations, the optical see-through HMD is superior to both the opaque and the video see-through HMD because useful information can be provided to the user in the visual context of the real world. The main potential drawback of optical see-through HMDs is that virtual objects cannot in principle occlude real objects. But it is necessary in general to distinguish between physical and perceptual properties, and some solutions to the occlusion problem may be found once more basic perceptual properties have been quantified.

In his book on binocular vision, R. W. Reading speaks of visual space as opposed to physical space to clearly distinguish between how things may appear given different human-made visual stimuli and how they are in the physical world (Reading, 1983). He defines visual space and physical space as follows: visual space "is made of appearances and scaled off in terms of apparent relationships of objects in the field of view" and physical space "is that form which can be measured with rulers and goniometers." In some cases, the exact nature of the visual space is not very relevant for operating in a physical space. Because humans have the ability to adapt, that is, they can learn to adjust motor responses to what is seen, what is important is that there exists a fixed relationship between the visual and physical worlds. The most common example of this adaptation process is the adjustments people make when getting a new prescription of eyeglasses. When virtual and physical worlds are superimposed and made to interact, however, the intermingling of the real and simulated worlds renders normal adaptations of this type useless. A fixed relationship between the two worlds is insufficient to provide a seamless integration of the real and simulated images; rather, the absolute depths and sizes of the objects, real and virtual, respectively, must correspond exactly. Only then will the observer's overall percept be consistent across the 3D space.

Most perceptual psychophysics studies in virtual environments have been done in human factors studies of Air Force pilots. The use of virtual displays has been investigated as a potential means to create more effective displays, given the task of detecting visual targets. Two types of display, Head-Up Displays (HUDs) and HMDs, have been used to display symbols during flight runs. The use of virtual displays to visualize symbology was thought to be a means of relieving the pilot of the need to constantly shift his attention between the cockpit instruments and the environment outside the window of the aircraft, therefore increasing the pilot's chances of detecting several targets in his or her environment. Haines and colleagues have shown, however, that despite the fact that pilots are supposed to have access to symbology and physical target information at the same time, the symbology most often captures the pilot's attention and targets are missed (Haines et al., 1980). This simple example suggests that superimposing virtual and real information in an effective manner is difficult to achieve. Moreover, effectiveness needs to be measured in terms of the task to be performed.

In more recent years, HMDs have become more and more popular in aircraft to visualize simulated scenes for flight training. Several articles by Stanley Roscoe were published in the Human Factors literature that describe some of the problems found with virtual displays for flight simulation (Roscoe, 1948, 1979, 1984, 1985, 1987, 1991; Roscoe et al., 1966). His main finding is that pilots have a tendency to overestimate the subjective distance to virtual objects in collimated displays. Pilots also reported a decrease in apparent size of virtual objects. More specifically, he found that imaged objects such as airport runways appear smaller and farther away than objects subtending the same visual angle seen by direct vision. The decrease in apparent size of objects would then be the cause of the depth overestimation, because objects of given familiar physical sizes appear farther away as their apparent size decreases. Roscoe attributed this discrepancy to inappropriate accommodation.
In such displays, the optical images of the two displays, where the simulated images are drawn, are formed at optical infinity. From a purely optical point of view, this would mean that the pilots are accommodating (focusing) at optical infinity to form sharp images of the displays on their retinas. Research has shown, however, that pilots' eyes do not automatically focus at optical infinity (Leibowitz, 1975; Roscoe, 1976; Benel, 1979; Simonelli, 1980; Randle, 1980); rather, they tend to focus at their resting point of accommodation, sometimes referred to as their dark focus, which can lie anywhere between 1 m and optical infinity, depending on the individual. This issue of inappropriate accommodation causing a decrease in apparent size may not be, however, the correct or the only explanation for the increase in perceived distance of virtual objects. Other researchers claim that the coordinated change in accommodation, vergence, and pupil diameter that occurs when looking at nearby objects may explain the phenomena (Enright, 1989; Lockhard and Wolbarsh, 1989). Other studies have also reported that optically generated displays that present synthetic images of the outside world have the characteristic of producing systematic errors in size and depth judgments (Palmer and Cronn, 1973; Sheehy and Wilkinson, 1989; Hadani, 1991).

We have experienced in our own laboratory some discrepancies between the theoretical modeling of some environments and what was actually perceived in the HMD. An example is the case of navigation in a walkthrough experiment using an opaque HMD (no direct view of the physical environment is then available) with about 90 degrees binocular FOV. Several subjects reported that the floor of the virtual room they walked on seemed lower than they expected and behaved as if they had to take a step down to move forward, while no steps were present or visible. It proved difficult to navigate comfortably in this environment and to get a good idea of object positions, room width, length, and depth. This percept may be the product of inaccurate calibration, since this effect was not reported systematically by all subjects and, moreover, no interpupillary distance adjustments were available on the system. The resulting distorted space, however, could also be due, at least in part, to the relationship of the objects within the virtual environment itself, as well as to the objects' characteristics such as the types of texture, illumination, and color used.

We propose in this paper to, first of all, review the evaluation of important parameters within the computational model used to generate the stereo images for the two eyes and to identify potential sources of inaccuracies in displaying the stereo images that would result in some mislocation of the objects in virtual space. We shall then describe the experimental setup and the calibration procedure performed to test our computational model. Finally, we shall describe the experimental paradigm used to assess perceived depth of generic objects (real or virtual) in real and virtual environments and present the experimental results.

2. Definitions

Aperture: An opening or hole through which radiation or matter may pass.
Aperture stop: A physical constraint, often a lens retainer, that limits the diameter of the axial light bundle allowed to pass through a lens.
Entrance pupil: In a lens or other optical system, the image of the aperture stop as seen from the object space.
Exit pupil: In a lens or other optical system, the image of the aperture stop as seen from the image space.
Pupil: 1. In the eye, the opening in the iris that permits light to pass and be focused on the retina. 2. In a lens, the image of the aperture stop as seen from object and image space.
Chief ray: Any ray that passes through the center of the entrance pupil, the pupil, or the exit pupil.
Field of view: The maximum area that can be seen through a lens or an optical instrument.
Focal point: The point on the optical axis of a lens to which an incident bundle of parallel light rays will converge.
Principal planes: The two conjugate planes in an optical system of unit positive linear magnification.
Principal points: The intersections of the principal planes with the optical axis of a lens.

Nodal points: The intersections with the optical axis of the two conjugate planes of unit positive angular magnification.

3. Computational Model

Generating accurate stereoscopic images for an HMD is a difficult task because of the multiple potential sources of error that can occur at several stages of the computation. The basic idea behind the computational model assumes that if the virtual images of the display screens are centered on the eyes of the subject and the center pixels of the two display screens are turned on, the lines joining the centers of perspective chosen for the eyes to those bright pixels will be parallel and the subject will see a point of light at infinity. Any 3D point of light at a finite distance can be rendered by turning on the appropriate pixels on each display such that the lines of sight of the eyes passing through those pixels converge at that distance (a numerical sketch of this projection step is given at the end of section 3.3). This concept, very simple in nature, is actually far from simple to implement with accuracy in an HMD (as we have ourselves discovered). A more complete description of the computational model used in HMDs can be found in (Robinett and Rolland, 1992). A computational model that details the appropriate transformation matrices in the most general case, where the eyes may be decentered with respect to the virtual image of the optics and the optical axes of the viewer may be non-parallel, is under preparation (Robinett and Holloway, 1993).

Sources of inaccuracy in the depth location of 3D virtual objects include ignoring the optical system used to magnify the images, underestimating or overestimating the field of view (FOV), inaccurately specifying the center of perspective, and finally ignoring the variations in interpupillary distance (IPD) between subjects. The approach of tweaking the software parameters until things look about right is sometimes all one needs to get a subjective percept of depth. Such an approach, however, is far too subjective for many virtual environments applications (Bajura et al., 1992). Rather, an accurate calibration becomes absolutely necessary. Even though this is especially true for cases where virtual and real environments meet and interact, as with see-through HMDs, it will benefit opaque HMDs as well.

Since the sense of vision is our main emphasis here, an HMD must be thought of as a four-component system: the display screens, the optical magnifiers, the eyes of the viewer, and the head-tracking device. For the display software to generate the correct set of stereo pairs to be imaged on the retinas, the software must model the hardware components involved in creating the images presented to the eyes. This includes the characteristics of the frame buffer where the images are rendered, the display-screen parameters, some specifications of the imaging optics, and the position of the optics with respect to the eyes of the wearer. An important point to note is that the spatial relationship of these components relative to each other is most important to the calibration of the system. If more than the sense of vision is considered, additional components could be used for audio, touch, manipulation, force feedback, and smell. Even though we shall only speak of the sense of vision, some of the issues addressed here may be generalized to systems addressing other senses as well.

3.1 Image Formation

We shall give a brief description of the image-forming process in order to understand how it can affect the measurements made on the images, such as those of our 3D visual system.
An optical system can be described most generally by its principal points, its nodal points, and its focal length; a good review of the basics of geometrical optics can be found in (Longhurst, 1973). Let's first consider a monocular system. Given the position of the display screen with respect to the first (object) principal point of the optics, the position and size of the virtual image can be determined with respect to the second (image) principal point. The display screens and their corresponding virtual images are said to be conjugates of one another. In the case of a binocular system, one would apply the same argument to each eye.

We shall point out that this general description of optical imaging only applies to the imaging process between two conjugate planes. Moreover, a plane in the image space can only be the conjugate of a single plane in object space. This means that if one is to image multiple planes in object space onto one plane in image space, only objects within one plane in object space will be in sharp focus at the image plane. Other objects will be out of focus, and each point on those objects will be imaged as a blurry spot at the image plane. In this case, the location of the entrance and exit pupils of the optical system needs to be known, since the center of the blurry spot corresponds to the intercept of the chief ray with the image plane, rather than the intercept of the corresponding nodal ray with the image plane. A concrete application of this point will be encountered when we discuss the choice of the eyepoint location for the computational model.

3.2 Display Screens Alignment and Positioning

We shall assume in the following discussion that 1. the eyes of the subject are centered on the optical axes of the optics and 2. the eyes are reduced to one physical point that overlaps the theoretical center of projection used to calculate the stereo projections for the left and right images. A discussion of the effect of the mislocation of the eyes is given in section 3.3. Given the above assumptions, in order for the images to be easily fused and computational depths to correlate accurately with the displayed depths, the display screens must be 1. centered on the optical axes of the optics, which will also maximize the monocular FOVs; 2. kept from unwanted rotations around the optical axis; and 3. magnified equally through the optics.

Any lateral misalignment of the display screens with respect to the optical axis will cause an error in depth location unless compensated for by the software. An existing lateral misalignment may be due to the fact that the displays are physically too large to be mounted centered on the optics, as in the VPL and Flight Helmets (Vision Research), or may simply be the result of mechanical limitations in mounting the displays centered on the optics with the precision of a fraction of a pixel. In the case of our experiments, we built an optical bench prototype HMD using off-the-shelf optical components in order to have complete access to all the parameters of the system and to eliminate the lateral misalignments that are often found in commercially available systems. Even though we used 3-inch-diagonal LCD displays, we adopted a geometry for the system that allows the displays to be centered on the optics within residual mechanical and assembly errors. A picture of the setup is shown in Figure 3. The exact amount of residual lateral misalignment has proven difficult to estimate in number of pixels, and a calibration technique derived from our computational model is described in detail in section 4.2.

A vertical misalignment of the displays with respect to the optics will either 1. shift the objects vertically within the FOV if the amount of offset is equal for the two eyes or 2. affect the ability to fuse the images due to induced dipvergence of the eyes of the wearer. While the eyes have a wide latitude in fusing images with various degrees of lateral shift, the slightest difference in vertical position causes eye strain, possibly headaches, and if the discrepancy is more than 0.34 degree of visual angle the time needed to fuse the images increases exponentially (Burton and Home, 1980). In the particular case of our system, the vertical dimension of the display area was 40 mm, and an angle of 0.34 degree at the eyes between the two displays corresponds to a vertical displacement of one display with respect to the other of about 0.5 mm, which corresponds to roughly 6 addressable vertical lines out of a total of 442 addressable lines available.

An unwanted rotation of the display screens around the optical axis can prevent the fusion of the two images for severe angles. In any case, it introduces distortion of the 3D objects being formed through stereo fusion that may be beyond tolerances. Finally, the positioning of the display screens with respect to the optics must be the same for the two eyes since it determines the magnification of the monocular images. A difference in magnification for the two eyes will cause a systematic error in depth location for the objects being viewed. A complete model of computational errors in HMDs and their effect on object size and depth location is under way (Holloway, doctoral dissertation in progress).

3.3 Eyepoint

In computer graphics, the center of perspective projection is often referred to as the eyepoint since it is either assumed to be the eye location of the viewer or the location of the video cameras (acting as two eyes separated by an intraocular distance) used for the acquisition of stereo images. Even though the simplest optical system, known as the pinhole camera (Barrett and Swindell, 1981), reduces to a single point that serves as the center of projection or eyepoint, this is generally not the case for most optical systems. For the purpose of accurate calibration of HMDs, the eyepoint for the perspective projection used in the computational model must be defined in terms of a precise location within the eye, a standard eye being about 25 mm in diameter (Levine, 1985). There is much controversy in the literature about which point within the eye must be chosen for the center of projection. Let's look at the meaning of such a point from a computer graphics point of view: searching for the existence of a center of projection within the eyes comes from the assumption that stimulating a region of the retina produces a perception of light that is projected out into the surrounding environment through a center of projection. This means that there exists a point-to-point relationship between a point object in the surrounding environment and a point image on the retina. According to this purely geometrical projection, the centers of projection for the object and image points are the first and second nodal points of the eye, respectively, as we and others have claimed (Robinett and Rolland, 1991; Deering, 1992). The controversy arises from the fact that not all points in the field of view are in focus at the same time, as has been pointed out by Ogle (1962). In this case, any point in the surrounding environment that is out of focus corresponds to a blurry spot on the retina. The important point is that the center of the blurry spot is determined by the corresponding chief ray passing through the eye. That is, the fact that the nodal points correspond to the optical center of projection of an optical system is only true for precisely conjugate planes. For this reason, Ogle suggests that choosing the entrance pupil of the eye as the center of projection in object space is more realistic.
This subtle point is related to the fact that the nodal point of an optical system such as the eye is a useful geometrical construct for describing the imaging properties of conjugate planes in an optical system, but has little to do with the way the retina is being stimulated by light coming from the surrounding environment at different depths. Note that the first nodal point of the human eye is only about 3.5 mm behind the entrance pupil in a standard eye (Ogle, 1962). However, the precise localization of the center of projection may have some impact on the value given to the FOV.
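To make the projection step of the computational model concrete, the following minimal Python sketch (not part of the actual display software) maps a 3D point to a pixel on each display for an idealized configuration: collimated, on-axis optics with the displays at the focal planes, and eyepoints separated by the computational baseline. The focal length, pixel pitch, and screen-center values are illustrative assumptions rather than measured parameters of our prototype.

```python
def project_point(X, Y, Z, baseline=0.064, f=0.045, pixel_pitch=9.05e-5,
                  center=(296, 220)):
    """Map a 3D point (metres; x to the right, y up, z forward from the
    midpoint between the eyes) to (column, row) pixel coordinates on the
    left and right displays.

    Assumes collimated, on-axis magnifiers (displays at the focal planes),
    eyepoints lying on the optical axes and separated by `baseline`, and
    square pixels of size `pixel_pitch`.  All default values are
    illustrative assumptions, not measured parameters of the prototype.
    """
    pixels = []
    for eye_x in (-baseline / 2.0, +baseline / 2.0):   # left eye, then right eye
        tx = (X - eye_x) / Z                           # tangent of the horizontal angle
        ty = Y / Z                                     # tangent of the vertical angle
        # With the display at the focal plane, a direction of tangent t maps
        # to a display point f*t away from the display centre.
        col = center[0] + (f * tx) / pixel_pitch
        row = center[1] - (f * ty) / pixel_pitch       # rows increase downward
        pixels.append((col, row))
    return pixels                                      # [(left col, row), (right col, row)]

# A point 1 m straight ahead lands about 16 pixels nasal of centre on each
# display; the left/right column difference is the disparity that encodes depth.
left_px, right_px = project_point(0.0, 0.0, 1.0)
```

Any error in the baseline, focal length, or screen centering enters this mapping directly, which is why the parameters discussed in this section must be known or calibrated rather than guessed.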

3.4 Field of View (FOV)

The vertical and horizontal FOVs of an optical system working at finite conjugates are generally specified by 1. the vertical (V) and horizontal (H) dimensions of the display screens, respectively; 2. the focal length (f) of the magnifier; 3. the distance (D) of the display screens with respect to the first principal plane of the optics; and 4. the location (L) of the entrance pupils of the eyes with respect to the virtual images of the display screens formed by the optical system. One may combine D and f to calculate the position D' of the images of the displays formed through the optics with respect to the second principal plane, and thus the magnification m of the optical system for this set of conjugate planes. The FOV is then most generally given by

FOV = 2Θ = 2 tan⁻¹(m y / L),  (1)

where y equals V/2 or H/2 for the vertical and horizontal FOVs, respectively. We shall, however, describe two optical configurations that are exceptions to this rule: one where the FOV is independent of the eye location and another where the FOV is independent of the display screens' location with respect to the optics, in other words, independent of the system's optical magnification.

Figure 1. Model of a lens by its principal planes P and P' and its focal points F and F'. For the display screen at F, the virtual image of the display is at infinity and subtends a visual angle of 2Θ, independent of the position of the eye; the eye location can be anywhere along the optical axis on the right-hand side of the optical system.

The FOV subtended at the eyes is independent of the position of the eyes with respect to the optics, as shown in Fig. 1, when L becomes infinite. The images are then referred to as collimated images, the magnification m becomes infinite as well, and one characterizes the size of an object in the FOV by its angular subtense at the eyepoint. In this particular case, the FOV depends solely on the size of the display and the focal length of the optical system. If we denote the focal length of the optics PF as f, the FOV is now given by

FOV = 2Θ = 2 tan⁻¹(y / f).  (2)
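Equations (1) and (2) are simple enough to check numerically. The short Python sketch below evaluates both; the 50 mm focal length used in the example is an assumed value for illustration, not the focal length of our magnifier.

```python
import math

def fov_general(y, m, L):
    """Eq. (1): full FOV (radians) for half screen dimension y, optical
    magnification m, and distance L from the eye's entrance pupil to the
    virtual image of the screen."""
    return 2.0 * math.atan(m * y / L)

def fov_collimated(y, f):
    """Eq. (2): full FOV (radians) when the screen sits at the focal plane
    (collimated image), or when the eye sits at the focal point F'."""
    return 2.0 * math.atan(y / f)

# Illustrative numbers only (the focal length below is an assumption, not the
# value of our magnifier): a 40 mm tall display behind a 50 mm focal-length
# lens, viewed collimated, subtends a vertical FOV of about 44 degrees.
V = 0.040
print(math.degrees(fov_collimated(V / 2.0, 0.050)))
```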

We shall note, however, that given a physical dimension for the lens and a size for the pupil of the eye, only a limited range of eyepoint locations will produce an unvignetted FOV. Rays of light emitted by the display screens are said to be vignetted by the lens if they would intercept the lens beyond its outer edge. As the eyes get farther away from the lens, the effective rays from a point on the screens to the pupil of the eye intercept the lens closer to its outer edge and will eventually be vignetted and then completely occluded.

The other exception to the rule is the case where the FOV is independent of the position of the display screens with respect to the optics. This occurs when the eyepoints are located at the focal points of the lens, as shown in Fig. 2. In this case, the FOV can also be calculated using Eq. 2.

Figure 2. Model of a lens by its principal planes P and P' and its focal points F and F'. For the eye position at F', the FOV is given by 2Θ, independent of the position of the display screen with respect to the lens.

The computed images are rendered into a frame buffer before being scanned out onto the display screens. The default assumption of the graphics library is that the frame buffer (512x640 pixels in our case) maps exactly to the vertical and horizontal dimensions of the screens. In practice, this is often not the case, and the exact overscan of the frame buffer must be measured. What we mean by overscan, in this case, is the region of the frame buffer that does not get drawn onto the display area. This overscan can be accounted for by defining a sub-viewport within the frame buffer that corresponds to the display capability of the screens. The FOV can then be specified as the FOV corresponding to this sub-viewport, which indeed corresponds to the screen size. One common definition of the FOV is to specify the vertical FOV and a pixel aspect ratio instead of the horizontal FOV. The pixel aspect ratio can be calculated using the physical dimensions of the screens and the vertical and horizontal numbers of scan lines of the frame buffer that are actually visible on the screens.

3.5 Interpupillary Distance

We may want to distinguish between three types of distances: the human subject's interpupillary distance (IPD); the lateral separation of the optical axes of the monocular optical systems, which we shall refer to as the optics baseline; and the lateral separation of the computational eyepoints, which we shall refer to as the computational baseline since it is used to compute object positions and shapes on the screens. In most systems, the optics baseline is fixed, the subject's IPD is ignored, and the computational baseline is set to a small value of about 62 mm to allow most users to fuse the images. This still creates problems for people with small IPDs, since they must work harder at fusing the images, and it creates shifts in depth perception for almost everyone, since none of the parameters actually match. Again, this may be acceptable for opaque systems but is unacceptable for see-through HMDs. In a calibrated setup, the optics baseline and the computational baseline should be set to the IPD of the subject. Moreover, depending on whether or not the images are collimated, the computational model may have to assume a certain distance of the eyes of the subject with respect to the monocular virtual images of the display screens for the FOV to be correctly specified, as described in section 3.4. In the case of collimated images only, we have seen that the FOV subtended at the eyes is independent of the eyes' location. This can also be said of any point in the FOV. When the two eyepoints are combined to determine the depth location of an object, however, simple geometry shows that virtual objects follow the eyes; that is, they theoretically occupy the same depth as measured from the eyes regardless of where the latter reside. In the case of a lateral displacement of the eyes with respect to the optical axes, however, the 3D virtual objects will be laterally displaced by the same amount.

4. Experimental Setup and Calibration

4.1 Experimental Setup

All studies described here were carried out on an optical bench prototype HMD that we designed and built using off-the-shelf optical components. The layout of the optical setup for one eye is given in Fig. 3, while the setup is shown in Fig. 4. We strongly believed that building our own setup was necessary in order to have complete access to and control over all the different parameters of the system. The displays used for the study were Memorex LCD color displays, while the computer graphics were generated by Pixel-Planes 5, a massively parallel graphics engine developed at UNC-CH under the direction of Henry Fuchs and John Poulton (Fuchs et al., 1989). In addition, the geometry adopted for the layout of the optical system allowed us to eliminate the lateral misalignments that are often found in commercially available systems. Even though the displays used were about 3 inches diagonal, this adopted geometry allowed the displays to be centered on the optics with minimal residual mechanical and assembly errors. The exact amount of residual lateral misalignment has proven difficult to estimate in number of pixels, and a calibration technique derived from our computational model is described below. The vertical and horizontal dimensions of the display screens used were 40 mm and mm, respectively, and the focal length of the magnifier mm. The numbers of vertical and horizontal addressable scan lines were measured to be 441 x 593, respectively. For collimated images, which is the case we are presenting in this paper, the vertical FOV is degrees as calculated using Eq. 2. A pixel aspect ratio of 1.02 can be calculated from the physical dimensions of the display area and the numbers of vertical and horizontal addressable lines.
We shall point out that neglecting the overscan of the frame buffer would result in an error of about 5 degrees in the specified FOV, which would create objects that look larger and closer than theoretically intended. This could indeed be detected by the calibration procedure described below.
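The effect of neglecting the overscan can be illustrated with the scan-line counts given above. In the sketch below, the focal length is an assumed, illustrative value (the prototype's value is not restated here), so the resulting error is of the same order as, but not necessarily identical to, the figure quoted above.

```python
import math

# Scan-line counts from section 4.1; the focal length F is an assumed,
# illustrative value (the prototype's value is not restated here).
FB_LINES_V  = 512      # vertical lines rendered into the frame buffer
VIS_LINES_V = 441      # vertical lines actually drawn on the display
DISPLAY_V   = 0.040    # vertical size of the display area (m)
F           = 0.055    # assumed focal length of the collimating magnifier (m)

def vfov_deg(half_height_m, f):
    """Vertical FOV (degrees) subtended by a screen of the given half height
    placed at the focal plane of a magnifier of focal length f."""
    return math.degrees(2.0 * math.atan(half_height_m / f))

# FOV physically subtended by the 40 mm of display area that is visible.
fov_display = vfov_deg(DISPLAY_V / 2.0, F)

# FOV that must be assigned to the *full* frame buffer so that its visible
# 441-line sub-viewport ends up subtending exactly fov_display: the frame
# buffer is effectively 512/441 times taller than the display.
fov_frame_buffer = vfov_deg(DISPLAY_V * FB_LINES_V / VIS_LINES_V / 2.0, F)

# Neglecting overscan amounts to assigning fov_display to the whole frame
# buffer; the visible part of the scene is then angularly stretched by roughly
# this difference, so objects look larger and closer than intended.
print(f"{fov_frame_buffer - fov_display:.1f} degree error if overscan is ignored")
```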

Figure 3. Optical layout of the magnifier for one eye.

A nice feature of the optical setup was the ability to adjust the optics and computational baselines to match the subject's IPD down to 60 mm. All of our observers had IPDs between 61 mm and 68 mm, measured with a pupillometer. Any asymmetry between the two eyes was taken into account.

4.2 Calibration Technique

Once the FOV has been set, and the optics and computational baselines have been matched to the IPD of the subject, a quick calibration of the system can be performed to determine whether any computational offsets of the images being displayed are necessary to account for unwanted lateral displacements of the displays with respect to the optical axis. If the system has been carefully aligned and the parameters within the computational model correctly set, the magnitude of the offsets should be very small, on the order of a few pixels.

The calibration consists of looking monocularly at a symmetrical pattern drawn on the displays in such a way that it projects at a very large distance if seen binocularly (greater than or equal to 200 m is recommended; 6 m, which is often referred to as optical infinity, proved to be insufficient). A symmetrical pattern is recommended since the symmetry will be used in the alignment process described now. While the subject is checking the alignment of his/her eyes with the optical axes of the magnifiers, one may draw on a sheet of paper two crosses separated by the IPD of the observer, as well as the mid-point between the two crosses. As the observer is looking through the displays monocularly, one positions the two crosses at any depth in space such that, while looking with the right eye, say, the right cross falls in the exact center of the virtual pattern displayed to the right eye. As the observer switches eyes, the center of the left virtual pattern and the left cross should exactly superimpose. If the superimposition for each eye does not hold, offsets for the images being displayed must be set in the software until a perfect match occurs. This is best accomplished using some systematic procedure to offset the images.

Figure 4. Bench prototype optical setup designed and built by J.P. Rolland.

Once the calibration at infinity is completed, one can check for consistency by looking at the virtual pattern, which is now projected at a nearby distance. The two physical crosses are now positioned at the same depth from the eyepoints as the virtual pattern. While alternately closing one eye, the observer should perceive the center of the virtual pattern to fall exactly on the mid-point between the two crosses. This calibration technique was derived directly from the basic concept of the computational model of stereo images illustrated in Fig. 5. Note that the calibration procedure never used both eyes simultaneously, to avoid confusing calibration issues with perception issues; the two eyes are only used sequentially. This consistency check will only be strictly valid, however, if the optical distortion of the optical system has been compensated for (Robinett and Rolland, 1990; Rolland and Hopkins, 1993). The calibration using the optical infinity condition, on the other hand, is always valid regardless of optical distortion, as long as the center of the symmetrical pattern coincides with the center of optical distortion to within a few pixels, which is the case in our experiment.

A variant of the above technique could also be used to verify the FOV of the system (Ellis, 1993). It consists of comparing the size of a virtual object with that of an identical real object, again under monocular viewing. Caution must be exercised, however, if optical distortion is present in the display, since the monocular images will appear larger in the case of pincushion distortion, which often limits HMD systems. This method, however, can be used for distortion-free systems.
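A small calculation makes clear why a projection distance of at least 200 m, rather than 6 m, is recommended for the infinity calibration. The sketch below computes the per-eye lateral screen offset needed to project a fused binocular pattern at a given distance; the IPD, focal length, and pixel pitch used are illustrative assumptions, and the code is not the actual calibration software.

```python
import math

def screen_offset_pixels(distance_m, ipd=0.064, f=0.045, pixel_pitch=9.0e-5):
    """Lateral offset (pixels, per eye, toward the nose) of the pattern centre
    needed so that the binocularly fused pattern projects at `distance_m`.

    Assumes collimated, on-axis optics with the displays at the focal planes;
    ipd, f and pixel_pitch are illustrative assumptions, not measured values.
    """
    half_convergence = math.atan((ipd / 2.0) / distance_m)  # per-eye convergence angle
    return f * math.tan(half_convergence) / pixel_pitch

# Why >= 200 m is recommended for the "infinity" calibration while 6 m is not:
print(screen_offset_pixels(200.0))   # ~0.08 pixel: indistinguishable from infinity
print(screen_offset_pixels(6.0))     # ~2.7 pixels: a clearly visible residual offset
```

With these assumed values, a pattern "at infinity" rendered for a 200 m distance requires well under one pixel of offset, so any offset found by the procedure can be attributed to display misalignment, whereas at 6 m the pattern itself already carries a few pixels of disparity that would contaminate the measurement.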

Figure 5. Computational model of stereo pairs and its direct application to calibration (an object at 2 m with its center marked by a cross located at IPD/2; the diagram shows the virtual image seen by each eye and the location of the cross on each screen).

5. Experimental Studies

In our day-to-day experience in the real world, looking at nearby objects triggers two processes that are put into action almost simultaneously: the eyes converge by rotating inward, and the crystalline lenses accommodate to the location of the object. Accommodation is needed to bring the details of the objects into focus. It may seem natural that the actions of accommodation and convergence are definite, and that for any specific accommodation there must be an exactly corresponding convergence. This is not, however, how vision works in most HMDs. In HMDs, one needs to distinguish between the monocular and stereo aspects of vision. With respect to each eye separately, the virtual images of the two LCD displays mounted on the head are formed by the optics at a definite plane in space. The position of that plane depends solely on the distance of the LCD displays to the optical viewers. That distance usually remains fixed once it has been set to a desired value.

A particular setting is that of infinity viewing, or zero accommodation, where the displays are positioned at the focal plane of the lenses. The monocular virtual images are then said to be collimated. Such a setting is often preferred not only because it sets the eyes to a relaxed state of accommodation but also because what is perceived in the HMD is then less dependent on the exact position of the eyes of the wearer behind the optics, as shown in sections 3.3 and 3.4. Indeed, most stereoscopes are so adjusted that the viewing lenses act as collimators (McKay, 1948).

With respect to the two eyes combined, vision in HMDs is similar to our vision in the real world in that convergence is driven by the location of the 3D objects perceived from the disparate monocular images. The fact that accommodation and convergence are divorced from one another in most HMD settings brings up the question of whether or not the distance at which the monocular images are optically formed has any impact on the perceived depths and sizes of virtual objects embedded in the physical environment and/or creates distortions in the depths and sizes of virtual spaces. This is especially important in the case of see-through HMDs, where the observer has access to both the simulated and the physical world simultaneously. As mentioned in section 1, cases of discrepancies in distance and size judgments of virtual objects have been reported by Roscoe and others. We started investigating this issue further by studying the case of generic non-overlapping objects. To study the impact of the uncoupling between convergence and accommodation, we considered the most extreme case of uncoupling, that is, when 3D real and virtual objects are placed at nearby distances while the 2D virtual monocular images are optically collimated.

Paradigm. We asked the subjects to judge the proximity in depth of a virtual object displayed via our see-through HMD with respect to that of a physical object presented simultaneously. The two objects were laterally separated by 110 mm so that, given the subject's viewpoint and the range of distances the virtual object occupied, no overlap of the two objects ever occurred during the experiment. Moreover, such a lateral separation prevented the subjects from basing their judgments upon an edge of the object; the contrast of such edges is very dependent upon the shape of the objects and the configuration of the lighting model. Generic objects of different shapes were chosen for comparison, in our case a cube and a cylinder, to prevent the subjects from assessing depth solely on the basis of the perceived size of one object relative to the other. For example, if the two objects were of similar shapes but of different sizes, one might be biased to always see a small cube farther away than a larger one. On the other hand, if two cubes of identical size were chosen, then depth assessment could be done solely upon the assessment of their respective sizes. The virtual and real objects were both white.
Shading in the virtual environment closely matched the shading of the physical objects. We considered three conditions for the objects: both objects real, both virtual, or one real and the other virtual. These three conditions are referred to as real/real, virtual/virtual, and real/virtual. One of the objects was always fixed in location, and the other was moved in depth between trials. The screens displaying the virtual objects and the real objects were blanked for a few seconds between trials. No other objects were ever perceivable within the field of view.

Task. Subjects performed a two-alternative forced-choice task. They were asked to answer (via a key press) whether the object on the right (the one translated in depth between trials) was closer or farther than the object on the left.

Method. We used a method of constant stimuli (MOC) to present the stimuli in any of the three conditions. The depth values presented were chosen before each run to capture the whole psychometric function for each subject and experiment, so that at least 4 points fell along the steepest part of the psychometric curve. Each experiment consisted of 31 blocks of 10 trials, each block corresponding to a random sequence of ten possible depths around a mean value. Subjects used a chin rest to maintain a constant viewing distance. Viewing duration was unlimited and terminated on a decision key press. The subjects used a two-button device for entering their responses. The data were analyzed using probit analysis on the last 30 blocks; the first block was always discarded. The resulting psychometric function was plotted for each subject after each experiment to ensure that at least 4 points lay on the linear part of the psychometric function. Points on the shoulders of the psychometric function were used as anchor points for the subject.

Subjects. Data were collected from three observers at the two viewing distances of 0.8 m and 1.2 m. The subjects were 3 volunteers, highly interested in the outcomes of these studies. Two were students: one was a graduate psychology major with no experience in HMDs, while the other was a computer science major who was working on the HMD team. The third subject was a faculty member working with the HMD team as well. All observers were tested for stereoscopic vision using a standard test such as Julesz random-dot stereograms.

We define veridicality of perceived depth as the difference between the nominal depth specified in the computational model and the measured value. We shall denote it PSE since it represents the departure from the expected point of subjective equality. Precision of perceived depth is defined as the smallest difference in perceived depth that can be detected reliably at the 84% level.

Results. Results are shown in Figures 6 and 7 for veridicality and precision of perceived depth, respectively. Figures 6a, 6b, and 6c show the veridicality of perceived depth for the three basic conditions, real/real, virtual/virtual, and real/virtual, respectively. Similarly, Figures 7a, 7b, and 7c show the precision of perceived depth, plotted this time on a log scale, for the three basic conditions. Each of the three graphs shows the performance of each of the three subjects separately. A summary plot of perceived depth is shown in Figures 6d and 7d, for veridicality and precision of perceived depth, respectively, where the performances per condition were averaged over the three subjects. Figures 6a, 6b, and 6c show that there is on average no mean error in perceived depth if both objects are real or both are virtual, even though the variance associated with the mean values in the virtual/virtual condition is significantly higher. The upward shift of the data points in Figures 6c and 6d means that there is a shift in perceived depth for virtual objects as compared to real objects. A positive PSE, in this case, means that the virtual objects are perceived farther away than the real objects.
Figure 7d, which summarizes Figures 7a, 7b, and 7c, shows a significant increase in discrimination thresholds for the two conditions virtual/virtual and real/virtual as compared to the real/real condition.
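For readers who wish to reproduce the analysis on their own data, the sketch below shows one way to extract the two summary quantities defined above, the PSE and the 84% discrimination threshold, by fitting a cumulative Gaussian to the proportion of "farther" responses. It is a simplified least-squares stand-in for the probit analysis we used, and the data in the example are made up purely for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical data for one subject and one condition: depth offsets of the
# movable object relative to the fixed one (m) and the proportion of "farther"
# responses pooled over the analysed blocks.  These numbers are made up for
# illustration; they are not the measurements reported in Figures 6 and 7.
depth_offset = np.array([-0.05, -0.03, -0.02, -0.01, 0.00, 0.01, 0.02, 0.03, 0.05, 0.08])
p_farther    = np.array([ 0.03,  0.10,  0.20,  0.33, 0.47, 0.63, 0.77, 0.87, 0.97, 1.00])

def psychometric(x, pse, sigma):
    """Cumulative-Gaussian psychometric function."""
    return norm.cdf(x, loc=pse, scale=sigma)

(pse, sigma), _ = curve_fit(psychometric, depth_offset, p_farther, p0=(0.0, 0.02))

# pse  : departure from the expected point of subjective equality (veridicality).
# sigma: offset needed to go from 50% to 84% "farther" responses, i.e. the
#        discrimination threshold (precision) plotted in Figure 7.
print(f"PSE = {pse * 1000:.1f} mm, 84% threshold = {sigma * 1000:.1f} mm")
```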

Figure 6. The veridicality of perceived depth, graphed for the three conditions (a) REAL/REAL, (b) VIRTUAL/VIRTUAL, and (c) REAL/VIRTUAL, plotted as PSE (m) versus distance in depth (m) at the two depths 0.8 m and 1.2 m for the three subjects. Panel (d) summarizes the results for the three conditions after averaging the data over the three subjects. The shifts of the data in (c) and (d) mean that virtual objects are seen farther away than real objects.

Figure 7. The precision of perceived depth, graphed for the three conditions (a) REAL/REAL, (b) VIRTUAL/VIRTUAL, and (c) REAL/VIRTUAL, plotted as discrimination threshold (m) versus distance in depth (m) at the two depths 0.8 m and 1.2 m for the three subjects. Panel (d) summarizes the results for the three conditions after averaging the data over the three subjects. These figures show that depth discrimination thresholds are elevated from 2 mm in condition (a) to roughly 15 mm in conditions (b) and (c). Moreover, no systematic elevation of thresholds with depth was observed.

6. Discussion

The results on veridicality of perceived depth indicate that virtual objects are seen farther away than real objects when both objects are presented at the same depth in visual space. Similar findings were reported by Roscoe and others for objects familiar to the subjects, such as airport runways. One fundamental difference between this study and others is that two generic objects, here a cube and a cylinder, were chosen as stimuli, rather than more familiar objects. The question is whether this finding is purely a perceptual phenomenon or whether it can be predicted totally or in part by taking into account parameters of our system that we have not yet discussed.

One parameter that we have not yet discussed is optical distortion. It is indeed the case that at the time of this first set of studies, optical predistortion of the display had not yet been implemented in our laboratory. Even though the optical distortion should not have affected the calibration of the system, as explained earlier, it did affect the placement in depth of the virtual objects, since the objects were laterally separated by 110 mm and fell away from the center of distortion in at least one of the two displays. An illustration of the amount of distortion caused by the optics is shown in Fig. 9. Because we know the exact layout and parameters of the optical system, we can estimate the effect of residual distortion on the depth assigned to objects. An exact calculation of the impact of optical distortion on the results reported in Figures 6 and 7 was carried out using an optical ray-tracing program available in-house. Calculations show that pincushion distortion of the individual monocular images has the effect of bringing the 3D virtual objects created from the two disparate virtual monocular images closer to the eyes. The calculated values were about 11 mm and 7 mm for the three subjects at the viewing distances of 0.8 m and 1.2 m, respectively. This means that we underestimated the amount of the shifts in depth of the virtual objects with respect to the real objects. New data are now being acquired using predistorted images (Robinett and Rolland, 1992; Rolland and Hopkins, 1993).

Another more subtle, yet relevant, parameter that could have biased our results is illumination. It is indeed the case that illumination in real environments is often punctual (it comes from a point source of light), while illumination in virtual environments is more likely to be directional (it is not spatially localized). Moreover, the spectral range of the illumination can differ in subtle ways. We noticed, for example, that the illumination in the virtual environment was slightly bluish, while the illumination of the real objects was slightly more reddish. This is of importance since it is commonly known that the binocular observation of a red light and a blue light located at the same physical depth will often cause the red one to appear closer than the blue, a phenomenon known as chromostereopsis (Vos, 1960). Experimental data reported by Vos show that the maximum amplitude of the effect over 4 observers was 4 mm at a 0.8 m viewing distance. Moreover, half of the observers reported that the red object (a slit in Vos's case) was perceived behind the blue slit.
Given the very slight difference in the spectral range of our illumination sources in the real and virtual environments, respectively, and the measurements reported by Vos in saturated environments, we are fairly confident in claiming that if chromostereopsis explains part of our findings, it is surely a very small, if not negligible, part. Experiments are under way to confirm our beliefs.

An elevation in the variance of the measure of veridicality of perceived depth of virtual objects was recorded. It can be interpreted as a decrease in the strength of the percept of depth for virtual objects in the context of that specific experiment. Related to this effect is the need for anchor points on the psychometric functions. The use of anchor points was mentioned in section 5. In their absence, the subjects had a tendency to drift toward larger differences between perceived and physical depth, and to be more inconsistent across repeated measures. This is indeed a point of interest because it suggests that the percept of


More information

10.2 Images Formed by Lenses SUMMARY. Refraction in Lenses. Section 10.1 Questions

10.2 Images Formed by Lenses SUMMARY. Refraction in Lenses. Section 10.1 Questions 10.2 SUMMARY Refraction in Lenses Converging lenses bring parallel rays together after they are refracted. Diverging lenses cause parallel rays to move apart after they are refracted. Rays are refracted

More information

UNITY VIA PROGRESSIVE LENSES TECHNICAL WHITE PAPER

UNITY VIA PROGRESSIVE LENSES TECHNICAL WHITE PAPER UNITY VIA PROGRESSIVE LENSES TECHNICAL WHITE PAPER UNITY VIA PROGRESSIVE LENSES TECHNICAL WHITE PAPER CONTENTS Introduction...3 Unity Via...5 Unity Via Plus, Unity Via Mobile, and Unity Via Wrap...5 Unity

More information

Topic 6 - Optics Depth of Field and Circle Of Confusion

Topic 6 - Optics Depth of Field and Circle Of Confusion Topic 6 - Optics Depth of Field and Circle Of Confusion Learning Outcomes In this lesson, we will learn all about depth of field and a concept known as the Circle of Confusion. By the end of this lesson,

More information

Chapter 34 Geometric Optics (also known as Ray Optics) by C.-R. Hu

Chapter 34 Geometric Optics (also known as Ray Optics) by C.-R. Hu Chapter 34 Geometric Optics (also known as Ray Optics) by C.-R. Hu 1. Principles of image formation by mirrors (1a) When all length scales of objects, gaps, and holes are much larger than the wavelength

More information

ECEN 4606, UNDERGRADUATE OPTICS LAB

ECEN 4606, UNDERGRADUATE OPTICS LAB ECEN 4606, UNDERGRADUATE OPTICS LAB Lab 3: Imaging 2 the Microscope Original Version: Professor McLeod SUMMARY: In this lab you will become familiar with the use of one or more lenses to create highly

More information

How to Optimize the Sharpness of Your Photographic Prints: Part I - Your Eye and its Ability to Resolve Fine Detail

How to Optimize the Sharpness of Your Photographic Prints: Part I - Your Eye and its Ability to Resolve Fine Detail How to Optimize the Sharpness of Your Photographic Prints: Part I - Your Eye and its Ability to Resolve Fine Detail Robert B.Hallock hallock@physics.umass.edu Draft revised April 11, 2006 finalpaper1.doc

More information

I-I. S/Scientific Report No. I. Duane C. Brown. C-!3 P.O0. Box 1226 Melbourne, Florida

I-I. S/Scientific Report No. I. Duane C. Brown. C-!3 P.O0. Box 1226 Melbourne, Florida S AFCRL.-63-481 LOCATION AND DETERMINATION OF THE LOCATION OF THE ENTRANCE PUPIL -0 (CENTER OF PROJECTION) I- ~OF PC-1000 CAMERA IN OBJECT SPACE S Ronald G. Davis Duane C. Brown - L INSTRUMENT CORPORATION

More information

Speed and Image Brightness uniformity of telecentric lenses

Speed and Image Brightness uniformity of telecentric lenses Specialist Article Published by: elektronikpraxis.de Issue: 11 / 2013 Speed and Image Brightness uniformity of telecentric lenses Author: Dr.-Ing. Claudia Brückner, Optics Developer, Vision & Control GmbH

More information

GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS

GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS Equipment and accessories: an optical bench with a scale, an incandescent lamp, matte, a set of

More information

HMD calibration and its effects on distance judgments

HMD calibration and its effects on distance judgments HMD calibration and its effects on distance judgments Scott A. Kuhl, William B. Thompson and Sarah H. Creem-Regehr University of Utah Most head-mounted displays (HMDs) suffer from substantial optical distortion,

More information

Lecture 8. Human Information Processing (1) CENG 412-Human Factors in Engineering May

Lecture 8. Human Information Processing (1) CENG 412-Human Factors in Engineering May Lecture 8. Human Information Processing (1) CENG 412-Human Factors in Engineering May 30 2009 1 Outline Visual Sensory systems Reading Wickens pp. 61-91 2 Today s story: Textbook page 61. List the vision-related

More information

FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION

FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION Revised November 15, 2017 INTRODUCTION The simplest and most commonly described examples of diffraction and interference from two-dimensional apertures

More information

This experiment is under development and thus we appreciate any and all comments as we design an interesting and achievable set of goals.

This experiment is under development and thus we appreciate any and all comments as we design an interesting and achievable set of goals. Experiment 7 Geometrical Optics You will be introduced to ray optics and image formation in this experiment. We will use the optical rail, lenses, and the camera body to quantify image formation and magnification;

More information

Perception: From Biology to Psychology

Perception: From Biology to Psychology Perception: From Biology to Psychology What do you see? Perception is a process of meaning-making because we attach meanings to sensations. That is exactly what happened in perceiving the Dalmatian Patterns

More information

TESTING VISUAL TELESCOPIC DEVICES

TESTING VISUAL TELESCOPIC DEVICES TESTING VISUAL TELESCOPIC DEVICES About Wells Research Joined TRIOPTICS mid 2012. Currently 8 employees Product line compliments TRIOPTICS, with little overlap Entry level products, generally less expensive

More information

AP Physics Problems -- Waves and Light

AP Physics Problems -- Waves and Light AP Physics Problems -- Waves and Light 1. 1974-3 (Geometric Optics) An object 1.0 cm high is placed 4 cm away from a converging lens having a focal length of 3 cm. a. Sketch a principal ray diagram for

More information

Geometric optics & aberrations

Geometric optics & aberrations Geometric optics & aberrations Department of Astrophysical Sciences University AST 542 http://www.northerneye.co.uk/ Outline Introduction: Optics in astronomy Basics of geometric optics Paraxial approximation

More information

Encoding and Code Wheel Proposal for TCUT1800X01

Encoding and Code Wheel Proposal for TCUT1800X01 VISHAY SEMICONDUCTORS www.vishay.com Optical Sensors By Sascha Kuhn INTRODUCTION AND BASIC OPERATION The TCUT18X1 is a 4-channel optical transmissive sensor designed for incremental and absolute encoder

More information

Module 2. Lecture-1. Understanding basic principles of perception including depth and its representation.

Module 2. Lecture-1. Understanding basic principles of perception including depth and its representation. Module 2 Lecture-1 Understanding basic principles of perception including depth and its representation. Initially let us take the reference of Gestalt law in order to have an understanding of the basic

More information

APPLICATIONS FOR TELECENTRIC LIGHTING

APPLICATIONS FOR TELECENTRIC LIGHTING APPLICATIONS FOR TELECENTRIC LIGHTING Telecentric lenses used in combination with telecentric lighting provide the most accurate results for measurement of object shapes and geometries. They make attributes

More information

Bias errors in PIV: the pixel locking effect revisited.

Bias errors in PIV: the pixel locking effect revisited. Bias errors in PIV: the pixel locking effect revisited. E.F.J. Overmars 1, N.G.W. Warncke, C. Poelma and J. Westerweel 1: Laboratory for Aero & Hydrodynamics, University of Technology, Delft, The Netherlands,

More information

13. Optical Instruments*

13. Optical Instruments* 13. Optical Instruments* Objective: Here what you have been learning about thin lenses is applied to make a telescope. In the process you encounter general optical instrument design concepts. The learning

More information

doi: /

doi: / doi: 10.1117/12.872287 Coarse Integral Volumetric Imaging with Flat Screen and Wide Viewing Angle Shimpei Sawada* and Hideki Kakeya University of Tsukuba 1-1-1 Tennoudai, Tsukuba 305-8573, JAPAN ABSTRACT

More information

Regan Mandryk. Depth and Space Perception

Regan Mandryk. Depth and Space Perception Depth and Space Perception Regan Mandryk Disclaimer Many of these slides include animated gifs or movies that may not be viewed on your computer system. They should run on the latest downloads of Quick

More information

PHYSICS 289 Experiment 8 Fall Geometric Optics II Thin Lenses

PHYSICS 289 Experiment 8 Fall Geometric Optics II Thin Lenses PHYSICS 289 Experiment 8 Fall 2005 Geometric Optics II Thin Lenses Please look at the chapter on lenses in your text before this lab experiment. Please submit a short lab report which includes answers

More information

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc.

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc. Human Vision and Human-Computer Interaction Much content from Jeff Johnson, UI Wizards, Inc. are these guidelines grounded in perceptual psychology and how can we apply them intelligently? Mach bands:

More information

IMAGE SENSOR SOLUTIONS. KAC-96-1/5" Lens Kit. KODAK KAC-96-1/5" Lens Kit. for use with the KODAK CMOS Image Sensors. November 2004 Revision 2

IMAGE SENSOR SOLUTIONS. KAC-96-1/5 Lens Kit. KODAK KAC-96-1/5 Lens Kit. for use with the KODAK CMOS Image Sensors. November 2004 Revision 2 KODAK for use with the KODAK CMOS Image Sensors November 2004 Revision 2 1.1 Introduction Choosing the right lens is a critical aspect of designing an imaging system. Typically the trade off between image

More information

The diffraction of light

The diffraction of light 7 The diffraction of light 7.1 Introduction As introduced in Chapter 6, the reciprocal lattice is the basis upon which the geometry of X-ray and electron diffraction patterns can be most easily understood

More information

Perceived depth is enhanced with parallax scanning

Perceived depth is enhanced with parallax scanning Perceived Depth is Enhanced with Parallax Scanning March 1, 1999 Dennis Proffitt & Tom Banton Department of Psychology University of Virginia Perceived depth is enhanced with parallax scanning Background

More information

Virtual Reality I. Visual Imaging in the Electronic Age. Donald P. Greenberg November 9, 2017 Lecture #21

Virtual Reality I. Visual Imaging in the Electronic Age. Donald P. Greenberg November 9, 2017 Lecture #21 Virtual Reality I Visual Imaging in the Electronic Age Donald P. Greenberg November 9, 2017 Lecture #21 1968: Ivan Sutherland 1990s: HMDs, Henry Fuchs 2013: Google Glass History of Virtual Reality 2016:

More information

NREM 345 Week 2, Material covered this week contributes to the accomplishment of the following course goal:

NREM 345 Week 2, Material covered this week contributes to the accomplishment of the following course goal: NREM 345 Week 2, 2010 Reading assignment: Chapter. 4 and Sec. 5.1 to 5.2.4 Material covered this week contributes to the accomplishment of the following course goal: Goal 1: Develop the understanding and

More information

Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design

Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design Outline Chapter 1: Introduction Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design 1 Overview: Integration of optical systems Key steps

More information

Experiment 1: Fraunhofer Diffraction of Light by a Single Slit

Experiment 1: Fraunhofer Diffraction of Light by a Single Slit Experiment 1: Fraunhofer Diffraction of Light by a Single Slit Purpose 1. To understand the theory of Fraunhofer diffraction of light at a single slit and at a circular aperture; 2. To learn how to measure

More information

MOTION PARALLAX AND ABSOLUTE DISTANCE. Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673

MOTION PARALLAX AND ABSOLUTE DISTANCE. Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673 MOTION PARALLAX AND ABSOLUTE DISTANCE by Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673 Bureau of Medicine and Surgery, Navy Department Research

More information

EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES 4.2 AIM 4.1 INTRODUCTION

EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES 4.2 AIM 4.1 INTRODUCTION EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES Structure 4.1 Introduction 4.2 Aim 4.3 What is Parallax? 4.4 Locating Images 4.5 Investigations with Real Images Focal Length of a Concave Mirror Focal

More information

This document is a preview generated by EVS

This document is a preview generated by EVS INTERNATIONAL STANDARD ISO 17850 First edition 2015-07-01 Photography Digital cameras Geometric distortion (GD) measurements Photographie Caméras numériques Mesurages de distorsion géométrique (DG) Reference

More information

Image Formation by Lenses

Image Formation by Lenses Image Formation by Lenses Bởi: OpenStaxCollege Lenses are found in a huge array of optical instruments, ranging from a simple magnifying glass to the eye to a camera s zoom lens. In this section, we will

More information

Physics 2020 Lab 8 Lenses

Physics 2020 Lab 8 Lenses Physics 2020 Lab 8 Lenses Name Section Introduction. In this lab, you will study converging lenses. There are a number of different types of converging lenses, but all of them are thicker in the middle

More information

On Cosine-fourth and Vignetting Effects in Real Lenses*

On Cosine-fourth and Vignetting Effects in Real Lenses* On Cosine-fourth and Vignetting Effects in Real Lenses* Manoj Aggarwal Hong Hua Narendra Ahuja University of Illinois at Urbana-Champaign 405 N. Mathews Ave, Urbana, IL 61801, USA { manoj,honghua,ahuja}@vision.ai.uiuc.edu

More information

Physics 3340 Spring Fourier Optics

Physics 3340 Spring Fourier Optics Physics 3340 Spring 011 Purpose Fourier Optics In this experiment we will show how the Fraunhofer diffraction pattern or spatial Fourier transform of an object can be observed within an optical system.

More information

Physics 208 Spring 2008 Lab 2: Lenses and the eye

Physics 208 Spring 2008 Lab 2: Lenses and the eye Name Section Physics 208 Spring 2008 Lab 2: Lenses and the eye Your TA will use this sheet to score your lab. It is to be turned in at the end of lab. You must use complete sentences and clearly explain

More information

Mirrors and Lenses. Images can be formed by reflection from mirrors. Images can be formed by refraction through lenses.

Mirrors and Lenses. Images can be formed by reflection from mirrors. Images can be formed by refraction through lenses. Mirrors and Lenses Images can be formed by reflection from mirrors. Images can be formed by refraction through lenses. Notation for Mirrors and Lenses The object distance is the distance from the object

More information

Radial Polarization Converter With LC Driver USER MANUAL

Radial Polarization Converter With LC Driver USER MANUAL ARCoptix Radial Polarization Converter With LC Driver USER MANUAL Arcoptix S.A Ch. Trois-portes 18 2000 Neuchâtel Switzerland Mail: info@arcoptix.com Tel: ++41 32 731 04 66 Principle of the radial polarization

More information

AgilEye Manual Version 2.0 February 28, 2007

AgilEye Manual Version 2.0 February 28, 2007 AgilEye Manual Version 2.0 February 28, 2007 1717 Louisiana NE Suite 202 Albuquerque, NM 87110 (505) 268-4742 support@agiloptics.com 2 (505) 268-4742 v. 2.0 February 07, 2007 3 Introduction AgilEye Wavefront

More information

Chapter 8. The Telescope. 8.1 Purpose. 8.2 Introduction A Brief History of the Early Telescope

Chapter 8. The Telescope. 8.1 Purpose. 8.2 Introduction A Brief History of the Early Telescope Chapter 8 The Telescope 8.1 Purpose In this lab, you will measure the focal lengths of two lenses and use them to construct a simple telescope which inverts the image like the one developed by Johannes

More information

Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming)

Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming) Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming) Purpose: The purpose of this lab is to introduce students to some of the properties of thin lenses and mirrors.

More information

October 7, Peter Cheimets Smithsonian Astrophysical Observatory 60 Garden Street, MS 5 Cambridge, MA Dear Peter:

October 7, Peter Cheimets Smithsonian Astrophysical Observatory 60 Garden Street, MS 5 Cambridge, MA Dear Peter: October 7, 1997 Peter Cheimets Smithsonian Astrophysical Observatory 60 Garden Street, MS 5 Cambridge, MA 02138 Dear Peter: This is the report on all of the HIREX analysis done to date, with corrections

More information

Accuracy Estimation of Microwave Holography from Planar Near-Field Measurements

Accuracy Estimation of Microwave Holography from Planar Near-Field Measurements Accuracy Estimation of Microwave Holography from Planar Near-Field Measurements Christopher A. Rose Microwave Instrumentation Technologies River Green Parkway, Suite Duluth, GA 9 Abstract Microwave holography

More information

PHYSICS. Chapter 35 Lecture FOR SCIENTISTS AND ENGINEERS A STRATEGIC APPROACH 4/E RANDALL D. KNIGHT

PHYSICS. Chapter 35 Lecture FOR SCIENTISTS AND ENGINEERS A STRATEGIC APPROACH 4/E RANDALL D. KNIGHT PHYSICS FOR SCIENTISTS AND ENGINEERS A STRATEGIC APPROACH 4/E Chapter 35 Lecture RANDALL D. KNIGHT Chapter 35 Optical Instruments IN THIS CHAPTER, you will learn about some common optical instruments and

More information

Geometric Optics. Find the focal lengths of lenses and mirrors; Draw and understand ray diagrams; and Build a simple telescope

Geometric Optics. Find the focal lengths of lenses and mirrors; Draw and understand ray diagrams; and Build a simple telescope Geometric Optics I. OBJECTIVES Galileo is known for his many wondrous astronomical discoveries. Many of these discoveries shook the foundations of Astronomy and forced scientists and philosophers alike

More information

Phys 531 Lecture 9 30 September 2004 Ray Optics II. + 1 s i. = 1 f

Phys 531 Lecture 9 30 September 2004 Ray Optics II. + 1 s i. = 1 f Phys 531 Lecture 9 30 September 2004 Ray Optics II Last time, developed idea of ray optics approximation to wave theory Introduced paraxial approximation: rays with θ 1 Will continue to use Started disussing

More information

Basic Optics System OS-8515C

Basic Optics System OS-8515C 40 50 30 60 20 70 10 80 0 90 80 10 20 70 T 30 60 40 50 50 40 60 30 70 20 80 90 90 80 BASIC OPTICS RAY TABLE 10 0 10 70 20 60 50 40 30 Instruction Manual with Experiment Guide and Teachers Notes 012-09900B

More information

Active Aperture Control and Sensor Modulation for Flexible Imaging

Active Aperture Control and Sensor Modulation for Flexible Imaging Active Aperture Control and Sensor Modulation for Flexible Imaging Chunyu Gao and Narendra Ahuja Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL,

More information

STEM Spectrum Imaging Tutorial

STEM Spectrum Imaging Tutorial STEM Spectrum Imaging Tutorial Gatan, Inc. 5933 Coronado Lane, Pleasanton, CA 94588 Tel: (925) 463-0200 Fax: (925) 463-0204 April 2001 Contents 1 Introduction 1.1 What is Spectrum Imaging? 2 Hardware 3

More information

Psychophysics of night vision device halo

Psychophysics of night vision device halo University of Wollongong Research Online Faculty of Health and Behavioural Sciences - Papers (Archive) Faculty of Science, Medicine and Health 2009 Psychophysics of night vision device halo Robert S Allison

More information

THE RELATIVE IMPORTANCE OF PICTORIAL AND NONPICTORIAL DISTANCE CUES FOR DRIVER VISION. Michael J. Flannagan Michael Sivak Julie K.

THE RELATIVE IMPORTANCE OF PICTORIAL AND NONPICTORIAL DISTANCE CUES FOR DRIVER VISION. Michael J. Flannagan Michael Sivak Julie K. THE RELATIVE IMPORTANCE OF PICTORIAL AND NONPICTORIAL DISTANCE CUES FOR DRIVER VISION Michael J. Flannagan Michael Sivak Julie K. Simpson The University of Michigan Transportation Research Institute Ann

More information

Chapter 18 Optical Elements

Chapter 18 Optical Elements Chapter 18 Optical Elements GOALS When you have mastered the content of this chapter, you will be able to achieve the following goals: Definitions Define each of the following terms and use it in an operational

More information

Cameras. Steve Rotenberg CSE168: Rendering Algorithms UCSD, Spring 2017

Cameras. Steve Rotenberg CSE168: Rendering Algorithms UCSD, Spring 2017 Cameras Steve Rotenberg CSE168: Rendering Algorithms UCSD, Spring 2017 Camera Focus Camera Focus So far, we have been simulating pinhole cameras with perfect focus Often times, we want to simulate more

More information

Imaging Instruments (part I)

Imaging Instruments (part I) Imaging Instruments (part I) Principal Planes and Focal Lengths (Effective, Back, Front) Multi-element systems Pupils & Windows; Apertures & Stops the Numerical Aperture and f/# Single-Lens Camera Human

More information

Optoliner NV. Calibration Standard for Sighting & Imaging Devices West San Bernardino Road West Covina, California 91790

Optoliner NV. Calibration Standard for Sighting & Imaging Devices West San Bernardino Road West Covina, California 91790 Calibration Standard for Sighting & Imaging Devices 2223 West San Bernardino Road West Covina, California 91790 Phone: (626) 962-5181 Fax: (626) 962-5188 www.davidsonoptronics.com sales@davidsonoptronics.com

More information

5.0 NEXT-GENERATION INSTRUMENT CONCEPTS

5.0 NEXT-GENERATION INSTRUMENT CONCEPTS 5.0 NEXT-GENERATION INSTRUMENT CONCEPTS Studies of the potential next-generation earth radiation budget instrument, PERSEPHONE, as described in Chapter 2.0, require the use of a radiative model of the

More information

Simple Figures and Perceptions in Depth (2): Stereo Capture

Simple Figures and Perceptions in Depth (2): Stereo Capture 59 JSL, Volume 2 (2006), 59 69 Simple Figures and Perceptions in Depth (2): Stereo Capture Kazuo OHYA Following previous paper the purpose of this paper is to collect and publish some useful simple stimuli

More information

Determination of Focal Length of A Converging Lens and Mirror

Determination of Focal Length of A Converging Lens and Mirror Physics 41 Determination of Focal Length of A Converging Lens and Mirror Objective: Apply the thin-lens equation and the mirror equation to determine the focal length of a converging (biconvex) lens and

More information

Applications of Optics

Applications of Optics Nicholas J. Giordano www.cengage.com/physics/giordano Chapter 26 Applications of Optics Marilyn Akins, PhD Broome Community College Applications of Optics Many devices are based on the principles of optics

More information

Abstract shape: a shape that is derived from a visual source, but is so transformed that it bears little visual resemblance to that source.

Abstract shape: a shape that is derived from a visual source, but is so transformed that it bears little visual resemblance to that source. Glossary of Terms Abstract shape: a shape that is derived from a visual source, but is so transformed that it bears little visual resemblance to that source. Accent: 1)The least prominent shape or object

More information

Practice Problems (Geometrical Optics)

Practice Problems (Geometrical Optics) 1 Practice Problems (Geometrical Optics) 1. A convex glass lens (refractive index = 3/2) has a focal length of 8 cm when placed in air. What is the focal length of the lens when it is immersed in water

More information

Chapter 34 Geometric Optics

Chapter 34 Geometric Optics Chapter 34 Geometric Optics Lecture by Dr. Hebin Li Goals of Chapter 34 To see how plane and curved mirrors form images To learn how lenses form images To understand how a simple image system works Reflection

More information

Experiments on the locus of induced motion

Experiments on the locus of induced motion Perception & Psychophysics 1977, Vol. 21 (2). 157 161 Experiments on the locus of induced motion JOHN N. BASSILI Scarborough College, University of Toronto, West Hill, Ontario MIC la4, Canada and JAMES

More information

ECEN. Spectroscopy. Lab 8. copy. constituents HOMEWORK PR. Figure. 1. Layout of. of the

ECEN. Spectroscopy. Lab 8. copy. constituents HOMEWORK PR. Figure. 1. Layout of. of the ECEN 4606 Lab 8 Spectroscopy SUMMARY: ROBLEM 1: Pedrotti 3 12-10. In this lab, you will design, build and test an optical spectrum analyzer and use it for both absorption and emission spectroscopy. The

More information

Laboratory experiment aberrations

Laboratory experiment aberrations Laboratory experiment aberrations Obligatory laboratory experiment on course in Optical design, SK2330/SK3330, KTH. Date Name Pass Objective This laboratory experiment is intended to demonstrate the most

More information

The introduction and background in the previous chapters provided context in

The introduction and background in the previous chapters provided context in Chapter 3 3. Eye Tracking Instrumentation 3.1 Overview The introduction and background in the previous chapters provided context in which eye tracking systems have been used to study how people look at

More information