A Virtual Reality approach to progressive lenses simulation


Jose Antonio Rodríguez Celaya¹, Pere Brunet Crosa¹, Norberto Ezquerra², J. E. Palomar³

¹ Departament de Llenguatges i Sistemes Informàtics, Universitat Politècnica de Catalunya {jcelaya,
² College of Computing, Georgia Institute of Technology norberto@cc.gatech.edu
³ R+D+i Department, Lens Division, Industrias de Optica S.A. joan.palomar@indo.es

Abstract

Progressive lenses are lenses that allow objects at any distance to be brought into focus. Their main problem is the appearance of marginal zones that present aberrations, in which vision is defective. The aim of our research is to study the perceptual effect of these distortions using Virtual Reality (VR) techniques. In this paper we describe the different phases and goals of our research, and we present a lens simulator that will help us obtain a correct perception of the aberrations produced by these marginal zones.

1. Introduction

Presbyopia is the optical condition in which the eye's accommodation power irreversibly decreases due to age. It appears in people from around 40 years old, and older people suffer from it to a greater or lesser degree. One solution for presbyopia correction is the use of progressive lenses. A progressive lens has 3 zones: one for far vision (the upper one), one for reading or near vision (the lower one), and a progressive corridor between them (figure 1). Lens power varies continuously from the far vision zone to the near vision zone, allowing clear vision at any distance. The main problem of these lenses is the appearance of marginal zones that present aberrations, in which vision is defective.

Figure 1. Progressive lens design.

In section 2 we describe the goals and the different phases that make up our research. In section 3 we review relevant previous work in the field of optical system simulation and the methods used to model such systems.

In section 4 we present the main techniques and algorithms used in the development of our lens simulator. Section 5 shows some images of the results obtained with the developed system. Finally, in section 6 we present our conclusions and the future work of this research.

2. The use of VR in progressive lens simulation

As previously mentioned, the main interest of our research is to study the perceptual effect of the aberrations produced by progressive lenses using VR techniques. To achieve this goal we need a lens simulator that permits a correct perception of the lens aberrations. We decided to develop two different simulators that generate pictures of a static model:

- A flat display simulator, which allows perceiving how a scene is seen through a particular lens model. This is the result we present in this article.
- A second simulator, similar to the previous one but using a passive stereo VR system for visualizing the scenes. This is ongoing research and will be presented in a future paper.

The most important task at this stage is the usability test, which will let us find out whether users have the same perception with real lenses and with the simulation. The test will be done with N users, always with the same scene. For each user, the following tests will be performed:

1. The user watches the real scene with real lenses.
2. The user watches the simulated scene on the display, using lenses that allow focusing at the display's distance.
3. The user watches the scene in a VR system, using lenses that allow focusing at the display's distance together with passive stereo glasses.

Another goal of these tests is to compare the two simulations and decide which one is better: the flat display simulation, or the simulation in the VR system, which has convergence and accommodation problems. The simulator and/or lens models will be improved until the usability test shows that perception in the simulator is acceptably close to real perception.
In a second stage both simulators should admit dynamic models with movement. The usability test would be the same as the one in the first stage. The final goal, if the usability tests yield positive results, is to use the simulator for testing non-existing lenses and for tuning their parameters.

The present paper focuses on the design and implementation of the flat display simulator, which allows a correct perception of a lens's aberrations. After a discussion of previous work, the algorithm is presented in section 4 and the results are discussed in section 5.

3. Previous work

One of the biggest problems the optical industry faces when providing solutions to particular patients is the difficulty of knowing the effect of a particular lens on a particular patient. In recent years this need has led to the development of optical system simulators that generate distorted images from real lens data, from modeled lenses [1], or from a particular patient's eye model, allowing one to see an image equal to the one seen through that optical system.

One of the most interesting projects in this area is the one developed by John Bastian et al. in collaboration with Sola Optical [2]. In this project, a lens simulator for an immersive VR system, in this case a CAVE, was developed. The simulator takes as input a lens model built from a vector map that represents the distortion of light rays through the lens. To simulate the distortion originated by the lens, a technique called the Partitioned Blur Algorithm (PBA) is used which, by means of a convolution, applies a Gaussian blur to each pixel in the image. To achieve this blur, a technique based on the circle of least confusion, developed by Potmesil and Chakravarty [3], is used. This technique uses blur discs whose size depends on the distance of the object to which the pixel belongs.

The major inconvenience of this technique comes from the lens modeling. By using a vector map, it is assumed that the distortion at each point is exactly a circle whose color does not take neighboring pixels into account but is simply degraded from its center to its border. This makes object colors interfere with one another. Moreover, since these vector maps are in fact a simplification of ray tracing, they do not provide depth of field information, so all parts of a rendered image are equally in focus.

Another interesting project is the one developed by the computer graphics group at the University of California, Berkeley. This project, called OPTICAL (OPtic and Topography Involving the Cornea And Lens) and led by Brian Barsky [4], has as its primary goal a realistic simulation of vision through an optical system. This system can be a lens, a real patient's eye, or both together. The simulation produces an accurate image of what a particular patient sees [5]. The starting point is the data obtained by a Hartmann-Shack device [6][7], which gives an accurate measurement of the wavefront aberrations of an optical system. Wavefronts, unlike ray tracing, provide depth of field information. A way of representing those wavefronts is through PSFs (Point Spread Functions), which are two-dimensional energy histograms [5]. A PSF describes how deformed and blurred a point at a particular distance and direction is seen through an optical system. PSFs are used as focal two-dimensional filters, and the color of each pixel in the final image is calculated by a convolution. The result is a good approximation of the real image seen through an optical system, which is especially useful for simulating the real vision of patients with vision defects [3]. One of the main limitations of this approach is its use of discrete depth values, which can result in images in which focus differences can be perceived. It is also a slow algorithm, not suitable for real-time simulations.

4. Algorithms for lens simulation

For the development of our lens simulator we chose Barsky's approach, using PSFs to model the lenses we want to simulate. In this section we describe the process followed to achieve this simulation.

4.1. Modeling a lens

A lens is modeled from a three-dimensional PSF matrix, oriented according to the viewer's coordinate system. Imagine we are looking at an object through a lens at a particular distance and direction. The direction is modeled by 2 angles, horizontal and vertical, measured from the optical axis. The PSF for a particular distance and direction models how deformed and blurred we see the object. The ideal lens model would store a specific PSF for each possible direction and distance. As this is not possible, a lens model is approximated as a three-dimensional matrix of PSFs (figure 2).

Figure 2. A lens model. Each point corresponds to a PSF; there are N×N×M PSFs in total, indexed by the two direction angles and by depth.
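To make the data layout concrete, here is a minimal sketch of such an N×N×M PSF grid in Python with NumPy. The class, its names and the dimensions are ours, chosen only for illustration; the paper does not prescribe an implementation.

```python
import numpy as np

class LensModel:
    """Hypothetical container for the N x N x M grid of PSFs described
    above: N x N viewing directions, M depth planes, each PSF an
    s x s normalized energy histogram."""

    def __init__(self, n_dirs, n_depths, psf_size=80):
        # psfs[i, j, k] is the PSF for horizontal direction i,
        # vertical direction j and depth plane k.
        self.psfs = np.zeros((n_dirs, n_dirs, n_depths, psf_size, psf_size))

    def set_psf(self, i, j, k, psf):
        # Normalize so every stored PSF sums to 1, as the algorithm requires.
        self.psfs[i, j, k] = psf / psf.sum()

# A 3x3 grid of directions with 2 depth planes, matching the
# test lens described in section 5.
model = LensModel(n_dirs=3, n_depths=2)
model.set_psf(1, 1, 0, np.ones((80, 80)))  # uniform blur disc on the optical axis
```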

4.2. Preprocess

The PSFs we use in our simulator are two-dimensional arrays of 80×80 values, normalized so that all the values in a given PSF sum to 1. Ideally there should be as many PSFs, in x and y, as pixels on the screen. As mentioned before, this is not possible due to the huge amount of data required. Since we have fewer values than needed, an approximation to the real PSF has to be computed for each pixel. This can be done by applying trilinear interpolation. Even so, the interpolation is a costly process, because an 80×80 value matrix (PSF) would have to be calculated for each pixel. A way to reduce these calculations is to convert the PSFs into pixel-based PSFs: we determine how many pixels are affected by a PSF and create a PSF with one value per affected pixel. The resulting PSFs are much smaller but perceptually identical to the previous ones. Thanks to the PSF normalization, a 0 value does not contribute to the final color, so we can compute the maximal PSF size in pixels (over all given PSFs, see figure 2) and convert all the initial PSFs to these pixel-sized versions, making the computations easier.

To calculate the new PSFs we adjust the field of view to the sum of the angles between PSFs along the x axis. This way we can use formula (1) to deduce the number of pixels in a PSF:

    pix_x = (x · w) / (2d · tan(α))    (1)

where x is the size in millimeters of the PSF along the x axis (equal to that along y), w is the horizontal resolution of the screen in pixels, α is the angle from the optical axis to the PSF at the edge of the screen (half the screen) along the x axis, and d is the distance at which the image is projected (figure 3). This preprocess has to be done only once, when loading a lens model or when a parameter such as the projection distance changes.

Figure 3. Calculating the size in pixels of a PSF. For simplicity we use square screens (width = height).
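Formula (1) is straightforward to evaluate. The following sketch shows it in code; the function name and the sample numbers are ours, for illustration only:

```python
import math

def psf_size_in_pixels(x_mm, w_pixels, alpha_deg, d_mm):
    """Pixel footprint of a PSF following formula (1):
    pix_x = (x * w) / (2 * d * tan(alpha)),
    with x the PSF size in mm, w the horizontal screen resolution in
    pixels, alpha the half field of view, and d the projection
    distance in mm."""
    return (x_mm * w_pixels) / (2.0 * d_mm * math.tan(math.radians(alpha_deg)))

# Illustrative values (not from the paper): a 10 mm PSF on a
# 1024-pixel-wide screen viewed at 500 mm with a 20 degree half-FOV
# spans roughly 28 pixels.
pix = psf_size_in_pixels(10.0, 1024, 20.0, 500.0)
```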
The FOV of the screen is adjusted to 2α.

4.3. Calculating PSF values for each pixel on the screen

To render an image we need to calculate the PSF for every pixel on the screen. This must be repeated each time an image is rendered, so in a real-time simulation it would be done every frame. As previously mentioned, this is achieved by trilinear interpolation. To perform the interpolation it is necessary to know the 8 PSFs enclosing the point P(x, y, z) for which the PSF is being calculated (figure 4). Choosing these 8 PSFs requires the depth of the scene object corresponding to the current pixel, which we read from the depth buffer. With the x, y (screen coordinates) and depth of the current pixel we thus know the 8 surrounding PSFs and can trilinearly interpolate between them, obtaining a new PSF as the result.
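The trilinear interpolation between the 8 surrounding PSFs can be sketched as follows, assuming the PSF grid of figure 2 is stored as a NumPy array of shape (N, N, M, s, s); the function and variable names are ours:

```python
import numpy as np

def interpolate_psf(grid, u, v, w):
    """Trilinearly interpolate a PSF at fractional grid coordinates
    (u, v, w) from the 8 PSFs at the vertices of the enclosing cell."""
    i0, j0, k0 = int(u), int(v), int(w)
    i1 = min(i0 + 1, grid.shape[0] - 1)
    j1 = min(j0 + 1, grid.shape[1] - 1)
    k1 = min(k0 + 1, grid.shape[2] - 1)
    fu, fv, fw = u - i0, v - j0, w - k0
    psf = np.zeros_like(grid[0, 0, 0])
    # Weighted sum of the 8 corner PSFs; the weights sum to 1, so a grid
    # of normalized PSFs yields a normalized interpolated PSF.
    for i, wi in ((i0, 1 - fu), (i1, fu)):
        for j, wj in ((j0, 1 - fv), (j1, fv)):
            for k, wk in ((k0, 1 - fw), (k1, fw)):
                psf += wi * wj * wk * grid[i, j, k]
    return psf

# A toy 2x2x2 grid of 3x3 PSFs, each normalized to sum to 1.
grid = np.random.rand(2, 2, 2, 3, 3)
grid /= grid.sum(axis=(3, 4), keepdims=True)
psf = interpolate_psf(grid, 0.5, 0.5, 0.5)
```

Because the interpolation weights sum to 1, the result of interpolating normalized PSFs is itself normalized, which is what the convolution of section 4.4 relies on.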

Figure 4. White circles are points at which we have PSF data; P is the point for which we want to calculate the PSF. The algorithm computes the PSF from the 8 PSFs at the vertices of the box containing P, using trilinear interpolation.

4.4. Convolution

Once the PSF corresponding to the current pixel is known, we only have to perform a convolution to find out how the surrounding pixels affect the final color of the pixel being processed. The convolution follows formula (2):

    I(x, y) = Σ_{i=x−Δ}^{x+Δ} Σ_{j=y−Δ}^{y+Δ} color(i, j) · PSF(x − i, y − j)    (2)

where Δ is half the number of pixels of a PSF and x, y are the screen coordinates of the computed pixel. After performing these calculations for every pixel on the screen, we obtain a blurred scene, which is the simulation of the original scene seen through the modeled lens. The top-level algorithm of the blurring process is:

For each pixel on screen {
    Get the depth value from the depth buffer
    With (x, y, depth), choose the 8 surrounding PSFs (see figure 4)
    Trilinearly interpolate the 8 PSFs at (x, y, depth)
    Apply the convolution (formula 2) using the PSF values and the color
    values of the pixels close to (x, y)
}

5. Results

For testing our simulator we decided to use a very simple lens model. The PSFs used correspond to a 5 diopter lens and were calculated for an object at optical infinity (from 5 meters) in 9 directions differing by 20°. They form a 3×3 matrix whose center corresponds to the optical axis, with the other directions forming 20° angles with the axis (figure 5).

Figure 5. Lens model used in the test.

For this simulation, since at least one more PSF plane is needed for the interpolation, we decided to set a perfect vision PSF plane (PSFs with a very high value at the central point) at a near distance. This distance can be modified by the application. The results are shown in the pictures (figures 6-10).
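The per-pixel convolution of formula (2) in section 4.4 can be sketched as follows: a naive CPU illustration on a grayscale image, using the paper's notation (in the full algorithm each pixel would get its own interpolated PSF). The function name and sample data are ours:

```python
import numpy as np

def blur_pixel(color, psf, x, y):
    """Formula (2): I(x, y) = sum over i in [x-D, x+D], j in [y-D, y+D]
    of color(i, j) * PSF(x - i, y - j), where D is half the PSF size
    in pixels and psf has shape (2D+1, 2D+1)."""
    d = psf.shape[0] // 2
    acc = 0.0
    for i in range(x - d, x + d + 1):
        for j in range(y - d, y + d + 1):
            if 0 <= i < color.shape[0] and 0 <= j < color.shape[1]:
                # Shift (x - i, y - j) from [-D, D] into array indices [0, 2D].
                acc += color[i, j] * psf[x - i + d, y - j + d]
    return acc

# A "perfect vision" PSF (all energy at the central point, as used for
# the near plane above) leaves the pixel value unchanged.
identity_psf = np.zeros((5, 5))
identity_psf[2, 2] = 1.0
image = np.arange(100, dtype=float).reshape(10, 10)
value = blur_pixel(image, identity_psf, 4, 7)
```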

Figure 6. Original and blurred engine model.
Figure 7. Two cubes: original and blurred image.

Figure 8. Interior of a ship: original and blurred image.
Figure 9. Depth values of the engine and cubes. Green values indicate depths between the two planes (one at 1 m and the other at 5 m); red values indicate depths greater than that of the 2nd depth plane (5 m); blue values indicate depths < 1 m.

Figure 10. Lens simulator interface.

6. Conclusions and future work

The developed simulator allows us to see the behavior of different lens models and to try different parameters in the simulation, but some tasks remain:

- Try other lens types, e.g. progressive lenses, with more depth planes.
- Implement the effects of magnification, which will improve the quality of the simulation.
- Adapt the simulator to generate stereoscopic images, allowing us to simulate lenses in VR devices.
- Perform a usability test as described in section 2, which will determine how accurate the final images of our simulator are.

In a second stage, we will try to implement the convolution described in section 4.4 on graphics hardware, which could allow real-time simulations. Finally, another usability test will be performed on the final simulator to determine the advantages and disadvantages of using VR techniques.

7. Acknowledgements

This work has been developed in the framework of the INDO-UPC agreement for the study of the usability of VR tools in the improvement of visual defects. It has been partially funded by Industrias de Optica S.A., by CDTI FIT, by PROFIT, and by the TIN research project TIN C. The authors would like to thank Javier Vegas, Enric Fontdecaba and Juan C. Dürsteler from INDO for their help and cooperation.

References

[1] Heidrich, W., Slusallek, P., and Seidel, H. An image-based model for realistic lens

systems in interactive computer graphics. Graphics Interface '97, pp. 68-75, 1997.
[2] J. Bastian, A. van den Hengel, K. Hawick, F. Vaughan. Modelling the perceptual effects of lens distortion via an immersive virtual reality system. Technical report DHPC 086, Department of Computer Science, University of Adelaide.
[3] Potmesil, M., and Chakravarty, I. A lens and aperture camera model for synthetic image generation. In SIGGRAPH '81 conf. proc., Computer Graphics, ACM Press, 1981.
[4] D. García, B. A. Barsky, S. A. Klein. The OPTICAL project at UC Berkeley: simulating visual acuity. Medicine Meets Virtual Reality 6 (Art, Science, Technology: Healthcare (r)evolution), San Diego, January 28-31, 1998.
[5] D. García. CwhatUC: Software Tools for Predicting, Visualizing and Simulating Corneal Visual Acuity. PhD thesis, Computer Science Division, University of California, Berkeley, CA, May.
[6] B. A. Barsky, D. Garcia, S. A. Klein. Computer Simulation of Vision-Based Synthetic Images Using Hartmann-Shack-Derived Wavefront Aberrations. Association for Research in Vision and Ophthalmology, Fort Lauderdale, Florida, 29 April - 4 May. Abstract in Investigative Ophthalmology & Visual Science, Vol. 42, No. 4, March 15, 2001, pp. S162.
[7] B. A. Barsky, D. Garcia, S. A. Klein, W. M. Yu, B. P. Chen, S. S. Dalal. RAYS (Render As You See): Vision-Realistic Rendering Using Hartmann-Shack Wavefront Aberrations. Internal report, OPTICAL project, UC Berkeley, March.
[8] P. Artal, J. Santamaría, J. Bescós. Retrieval of wave aberration of human eyes from actual point-spread-function data. J. Opt. Soc. Am. A, 5.
[9] B. A. Barsky. Vision-Realistic Rendering: Simulation of the Scanned Foveal Image from Wavefront Data of Human Subjects. First Symposium on Applied Perception in Graphics and Visualization, co-located with ACM SIGGRAPH, Los Angeles, 7-8 August 2004.
[10] B. A. Barsky, B. P. Chen, A. C. Berg, M. Moutet, D. Garcia, S. A. Klein. Incorporating Camera Models, Ocular Models, and Actual Patient Eye Data for Photo-Realistic and Vision-Realistic Rendering. Abstract in the Fifth International Conference on Mathematical Methods for Curves and Surfaces, June 29 - July 4, 2000, Oslo, Norway.
[11] J. Loos, Ph. Slusallek, H.-P. Seidel. Using Wavefront Tracing for the Visualization and Optimization of Progressive Lenses. Proc. of Eurographics, Computer Graphics Forum, Vol. 17, No. 3.


More information

Robert B.Hallock Draft revised April 11, 2006 finalpaper2.doc

Robert B.Hallock Draft revised April 11, 2006 finalpaper2.doc How to Optimize the Sharpness of Your Photographic Prints: Part II - Practical Limits to Sharpness in Photography and a Useful Chart to Deteremine the Optimal f-stop. Robert B.Hallock hallock@physics.umass.edu

More information

Chapter 34 Geometric Optics

Chapter 34 Geometric Optics Chapter 34 Geometric Optics Lecture by Dr. Hebin Li Goals of Chapter 34 To see how plane and curved mirrors form images To learn how lenses form images To understand how a simple image system works Reflection

More information

Applied Optics. , Physics Department (Room #36-401) , ,

Applied Optics. , Physics Department (Room #36-401) , , Applied Optics Professor, Physics Department (Room #36-401) 2290-0923, 019-539-0923, shsong@hanyang.ac.kr Office Hours Mondays 15:00-16:30, Wednesdays 15:00-16:30 TA (Ph.D. student, Room #36-415) 2290-0921,

More information

Perspective. Announcement: CS4450/5450. CS 4620 Lecture 3. Will be MW 8:40 9:55 How many can make the new time?

Perspective. Announcement: CS4450/5450. CS 4620 Lecture 3. Will be MW 8:40 9:55 How many can make the new time? Perspective CS 4620 Lecture 3 1 2 Announcement: CS4450/5450 Will be MW 8:40 9:55 How many can make the new time? 3 4 History of projection Ancient times: Greeks wrote about laws of perspective Renaissance:

More information

Performance Factors. Technical Assistance. Fundamental Optics

Performance Factors.   Technical Assistance. Fundamental Optics Performance Factors After paraxial formulas have been used to select values for component focal length(s) and diameter(s), the final step is to select actual lenses. As in any engineering problem, this

More information

CHAPTER 33 ABERRATION CURVES IN LENS DESIGN

CHAPTER 33 ABERRATION CURVES IN LENS DESIGN CHAPTER 33 ABERRATION CURVES IN LENS DESIGN Donald C. O Shea Georgia Institute of Technology Center for Optical Science and Engineering and School of Physics Atlanta, Georgia Michael E. Harrigan Eastman

More information

Virtual Reality I. Visual Imaging in the Electronic Age. Donald P. Greenberg November 9, 2017 Lecture #21

Virtual Reality I. Visual Imaging in the Electronic Age. Donald P. Greenberg November 9, 2017 Lecture #21 Virtual Reality I Visual Imaging in the Electronic Age Donald P. Greenberg November 9, 2017 Lecture #21 1968: Ivan Sutherland 1990s: HMDs, Henry Fuchs 2013: Google Glass History of Virtual Reality 2016:

More information

The Human Visual System!

The Human Visual System! an engineering-focused introduction to! The Human Visual System! EE367/CS448I: Computational Imaging and Display! stanford.edu/class/ee367! Lecture 2! Gordon Wetzstein! Stanford University! nautilus eye,

More information

Physics 208 Spring 2008 Lab 2: Lenses and the eye

Physics 208 Spring 2008 Lab 2: Lenses and the eye Name Section Physics 208 Spring 2008 Lab 2: Lenses and the eye Your TA will use this sheet to score your lab. It is to be turned in at the end of lab. You must use complete sentences and clearly explain

More information

Angular motion point spread function model considering aberrations and defocus effects

Angular motion point spread function model considering aberrations and defocus effects 1856 J. Opt. Soc. Am. A/ Vol. 23, No. 8/ August 2006 I. Klapp and Y. Yitzhaky Angular motion point spread function model considering aberrations and defocus effects Iftach Klapp and Yitzhak Yitzhaky Department

More information

Transferring wavefront measurements to ablation profiles. Michael Mrochen PhD Swiss Federal Institut of Technology, Zurich IROC Zurich

Transferring wavefront measurements to ablation profiles. Michael Mrochen PhD Swiss Federal Institut of Technology, Zurich IROC Zurich Transferring wavefront measurements to ablation profiles Michael Mrochen PhD Swiss Federal Institut of Technology, Zurich IROC Zurich corneal ablation Calculation laser spot positions Centration Calculation

More information

Subjective Image Quality Metrics from The Wave Aberration

Subjective Image Quality Metrics from The Wave Aberration Subjective Image Quality Metrics from The Wave Aberration David R. Williams William G. Allyn Professor of Medical Optics Center For Visual Science University of Rochester Commercial Relationship: Bausch

More information

Implementation of Adaptive Coded Aperture Imaging using a Digital Micro-Mirror Device for Defocus Deblurring

Implementation of Adaptive Coded Aperture Imaging using a Digital Micro-Mirror Device for Defocus Deblurring Implementation of Adaptive Coded Aperture Imaging using a Digital Micro-Mirror Device for Defocus Deblurring Ashill Chiranjan and Bernardt Duvenhage Defence, Peace, Safety and Security Council for Scientific

More information

Criteria for Optical Systems: Optical Path Difference How do we determine the quality of a lens system? Several criteria used in optical design

Criteria for Optical Systems: Optical Path Difference How do we determine the quality of a lens system? Several criteria used in optical design Criteria for Optical Systems: Optical Path Difference How do we determine the quality of a lens system? Several criteria used in optical design Computer Aided Design Several CAD tools use Ray Tracing (see

More information

Analysis of retinal images for retinal projection type super multiview 3D head-mounted display

Analysis of retinal images for retinal projection type super multiview 3D head-mounted display https://doi.org/10.2352/issn.2470-1173.2017.5.sd&a-376 2017, Society for Imaging Science and Technology Analysis of retinal images for retinal projection type super multiview 3D head-mounted display Takashi

More information

6.A44 Computational Photography

6.A44 Computational Photography Add date: Friday 6.A44 Computational Photography Depth of Field Frédo Durand We allow for some tolerance What happens when we close the aperture by two stop? Aperture diameter is divided by two is doubled

More information

Overview. Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image

Overview. Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image Camera & Color Overview Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image Book: Hartley 6.1, Szeliski 2.1.5, 2.2, 2.3 The trip

More information

COPYRIGHTED MATERIAL OVERVIEW 1

COPYRIGHTED MATERIAL OVERVIEW 1 OVERVIEW 1 In normal experience, our eyes are constantly in motion, roving over and around objects and through ever-changing environments. Through this constant scanning, we build up experiential data,

More information

Image Formation and Capture

Image Formation and Capture Figure credits: B. Curless, E. Hecht, W.J. Smith, B.K.P. Horn, A. Theuwissen, and J. Malik Image Formation and Capture COS 429: Computer Vision Image Formation and Capture Real world Optics Sensor Devices

More information

Reading: Lenses and Mirrors; Applications Key concepts: Focal points and lengths; real images; virtual images; magnification; angular magnification.

Reading: Lenses and Mirrors; Applications Key concepts: Focal points and lengths; real images; virtual images; magnification; angular magnification. Reading: Lenses and Mirrors; Applications Key concepts: Focal points and lengths; real images; virtual images; magnification; angular magnification. 1.! Questions about objects and images. Can a virtual

More information

Head Mounted Display Optics II!

Head Mounted Display Optics II! ! Head Mounted Display Optics II! Gordon Wetzstein! Stanford University! EE 267 Virtual Reality! Lecture 8! stanford.edu/class/ee267/!! Lecture Overview! focus cues & the vergence-accommodation conflict!

More information

History of projection. Perspective. History of projection. Plane projection in drawing

History of projection. Perspective. History of projection. Plane projection in drawing History of projection Ancient times: Greeks wrote about laws of perspective Renaissance: perspective is adopted by artists Perspective CS 4620 Lecture 3 Duccio c. 1308 1 2 History of projection Plane projection

More information

Analysis of Hartmann testing techniques for large-sized optics

Analysis of Hartmann testing techniques for large-sized optics Analysis of Hartmann testing techniques for large-sized optics Nadezhda D. Tolstoba St.-Petersburg State Institute of Fine Mechanics and Optics (Technical University) Sablinskaya ul.,14, St.-Petersburg,

More information

Notes from Lens Lecture with Graham Reed

Notes from Lens Lecture with Graham Reed Notes from Lens Lecture with Graham Reed Light is refracted when in travels between different substances, air to glass for example. Light of different wave lengths are refracted by different amounts. Wave

More information

Customized Correction of Wavefront Aberrations in Abnormal Human Eyes by Using a Phase Plate and a Customized Contact Lens

Customized Correction of Wavefront Aberrations in Abnormal Human Eyes by Using a Phase Plate and a Customized Contact Lens Journal of the Korean Physical Society, Vol. 49, No. 1, July 2006, pp. 121 125 Customized Correction of Wavefront Aberrations in Abnormal Human Eyes by Using a Phase Plate and a Customized Contact Lens

More information

Visual Perception. Readings and References. Forming an image. Pinhole camera. Readings. Other References. CSE 457, Autumn 2004 Computer Graphics

Visual Perception. Readings and References. Forming an image. Pinhole camera. Readings. Other References. CSE 457, Autumn 2004 Computer Graphics Readings and References Visual Perception CSE 457, Autumn Computer Graphics Readings Sections 1.4-1.5, Interactive Computer Graphics, Angel Other References Foundations of Vision, Brian Wandell, pp. 45-50

More information

3D Viewing. Introduction to Computer Graphics Torsten Möller / Manfred Klaffenböck. Machiraju/Zhang/Möller

3D Viewing. Introduction to Computer Graphics Torsten Möller / Manfred Klaffenböck. Machiraju/Zhang/Möller 3D Viewing Introduction to Computer Graphics Torsten Möller / Manfred Klaffenböck Machiraju/Zhang/Möller Reading Chapter 5 of Angel Chapter 13 of Hughes, van Dam, Chapter 7 of Shirley+Marschner Machiraju/Zhang/Möller

More information

General Imaging System

General Imaging System General Imaging System Lecture Slides ME 4060 Machine Vision and Vision-based Control Chapter 5 Image Sensing and Acquisition By Dr. Debao Zhou 1 2 Light, Color, and Electromagnetic Spectrum Penetrate

More information

GEOMETRICAL OPTICS AND OPTICAL DESIGN

GEOMETRICAL OPTICS AND OPTICAL DESIGN GEOMETRICAL OPTICS AND OPTICAL DESIGN Pantazis Mouroulis Associate Professor Center for Imaging Science Rochester Institute of Technology John Macdonald Senior Lecturer Physics Department University of

More information

Vision and Color. Reading. The lensmaker s formula. Lenses. Brian Curless CSEP 557 Autumn Good resources:

Vision and Color. Reading. The lensmaker s formula. Lenses. Brian Curless CSEP 557 Autumn Good resources: Reading Good resources: Vision and Color Brian Curless CSEP 557 Autumn 2017 Glassner, Principles of Digital Image Synthesis, pp. 5-32. Palmer, Vision Science: Photons to Phenomenology. Wandell. Foundations

More information

Applications of Optics

Applications of Optics Nicholas J. Giordano www.cengage.com/physics/giordano Chapter 26 Applications of Optics Marilyn Akins, PhD Broome Community College Applications of Optics Many devices are based on the principles of optics

More information

Vision and Color. Reading. Optics, cont d. Lenses. d d f. Brian Curless CSEP 557 Fall Good resources:

Vision and Color. Reading. Optics, cont d. Lenses. d d f. Brian Curless CSEP 557 Fall Good resources: Reading Good resources: Vision and Color Brian Curless CSEP 557 Fall 2016 Glassner, Principles of Digital Image Synthesis, pp. 5-32. Palmer, Vision Science: Photons to Phenomenology. Wandell. Foundations

More information

Vision and Color. Brian Curless CSEP 557 Fall 2016

Vision and Color. Brian Curless CSEP 557 Fall 2016 Vision and Color Brian Curless CSEP 557 Fall 2016 1 Reading Good resources: Glassner, Principles of Digital Image Synthesis, pp. 5-32. Palmer, Vision Science: Photons to Phenomenology. Wandell. Foundations

More information

La photographie numérique. Frank NIELSEN Lundi 7 Juin 2010

La photographie numérique. Frank NIELSEN Lundi 7 Juin 2010 La photographie numérique Frank NIELSEN Lundi 7 Juin 2010 1 Le Monde digital Key benefits of the analog2digital paradigm shift? Dissociate contents from support : binarize Universal player (CPU, Turing

More information

Coded Computational Photography!

Coded Computational Photography! Coded Computational Photography! EE367/CS448I: Computational Imaging and Display! stanford.edu/class/ee367! Lecture 9! Gordon Wetzstein! Stanford University! Coded Computational Photography - Overview!!

More information

Pablo Artal. Adaptive Optics visual simulator ( and depth of focus) LABORATORIO DE OPTICA UNIVERSIDAD DE MURCIA, SPAIN

Pablo Artal. Adaptive Optics visual simulator ( and depth of focus) LABORATORIO DE OPTICA UNIVERSIDAD DE MURCIA, SPAIN Adaptive Optics visual simulator ( and depth of focus) Pablo Artal LABORATORIO DE OPTICA UNIVERSIDAD DE MURCIA, SPAIN 8th International Wavefront Congress, Santa Fe, USA, February New LO UM building! Diego

More information

Optical Systems: Pinhole Camera Pinhole camera: simple hole in a box: Called Camera Obscura Aristotle discussed, Al-Hazen analyzed in Book of Optics

Optical Systems: Pinhole Camera Pinhole camera: simple hole in a box: Called Camera Obscura Aristotle discussed, Al-Hazen analyzed in Book of Optics Optical Systems: Pinhole Camera Pinhole camera: simple hole in a box: Called Camera Obscura Aristotle discussed, Al-Hazen analyzed in Book of Optics 1011CE Restricts rays: acts as a single lens: inverts

More information

IMAGE FORMATION. Light source properties. Sensor characteristics Surface. Surface reflectance properties. Optics

IMAGE FORMATION. Light source properties. Sensor characteristics Surface. Surface reflectance properties. Optics IMAGE FORMATION Light source properties Sensor characteristics Surface Exposure shape Optics Surface reflectance properties ANALOG IMAGES An image can be understood as a 2D light intensity function f(x,y)

More information

Lecture 19 (Geometric Optics I Plane and Spherical Optics) Physics Spring 2018 Douglas Fields

Lecture 19 (Geometric Optics I Plane and Spherical Optics) Physics Spring 2018 Douglas Fields Lecture 19 (Geometric Optics I Plane and Spherical Optics) Physics 262-01 Spring 2018 Douglas Fields Optics -Wikipedia Optics is the branch of physics which involves the behavior and properties of light,

More information

Omni-Directional Catadioptric Acquisition System

Omni-Directional Catadioptric Acquisition System Technical Disclosure Commons Defensive Publications Series December 18, 2017 Omni-Directional Catadioptric Acquisition System Andreas Nowatzyk Andrew I. Russell Follow this and additional works at: http://www.tdcommons.org/dpubs_series

More information

Enhancing Fish Tank VR

Enhancing Fish Tank VR Enhancing Fish Tank VR Jurriaan D. Mulder, Robert van Liere Center for Mathematics and Computer Science CWI Amsterdam, the Netherlands mullie robertl @cwi.nl Abstract Fish tank VR systems provide head

More information

Lab 11: Lenses and Ray Tracing

Lab 11: Lenses and Ray Tracing Name: Lab 11: Lenses and Ray Tracing Group Members: Date: TA s Name: Materials: Ray box, two different converging lenses, one diverging lens, screen, lighted object, three stands, meter stick, two letter

More information

GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS

GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS GEOMETRICAL OPTICS Practical 1. Part I. BASIC ELEMENTS AND METHODS FOR CHARACTERIZATION OF OPTICAL SYSTEMS Equipment and accessories: an optical bench with a scale, an incandescent lamp, matte, a set of

More information

Big League Cryogenics and Vacuum The LHC at CERN

Big League Cryogenics and Vacuum The LHC at CERN Big League Cryogenics and Vacuum The LHC at CERN A typical astronomical instrument must maintain about one cubic meter at a pressure of

More information

Dr. Todd Satogata (ODU/Jefferson Lab) Monday, April

Dr. Todd Satogata (ODU/Jefferson Lab)  Monday, April University Physics 227N/232N Mirrors and Lenses Homework Optics 2 due Friday AM Quiz Friday Optional review session next Monday (Apr 28) Bring Homework Notebooks to Final for Grading Dr. Todd Satogata

More information

E X P E R I M E N T 12

E X P E R I M E N T 12 E X P E R I M E N T 12 Mirrors and Lenses Produced by the Physics Staff at Collin College Copyright Collin College Physics Department. All Rights Reserved. University Physics II, Exp 12: Mirrors and Lenses

More information

VC 11/12 T2 Image Formation

VC 11/12 T2 Image Formation VC 11/12 T2 Image Formation Mestrado em Ciência de Computadores Mestrado Integrado em Engenharia de Redes e Sistemas Informáticos Miguel Tavares Coimbra Outline Computer Vision? The Human Visual System

More information

ON THE CREATION OF PANORAMIC IMAGES FROM IMAGE SEQUENCES

ON THE CREATION OF PANORAMIC IMAGES FROM IMAGE SEQUENCES ON THE CREATION OF PANORAMIC IMAGES FROM IMAGE SEQUENCES Petteri PÖNTINEN Helsinki University of Technology, Institute of Photogrammetry and Remote Sensing, Finland petteri.pontinen@hut.fi KEY WORDS: Cocentricity,

More information

PHYSICS FOR THE IB DIPLOMA CAMBRIDGE UNIVERSITY PRESS

PHYSICS FOR THE IB DIPLOMA CAMBRIDGE UNIVERSITY PRESS Option C Imaging C Introduction to imaging Learning objectives In this section we discuss the formation of images by lenses and mirrors. We will learn how to construct images graphically as well as algebraically.

More information

Optical Components for Laser Applications. Günter Toesko - Laserseminar BLZ im Dezember

Optical Components for Laser Applications. Günter Toesko - Laserseminar BLZ im Dezember Günter Toesko - Laserseminar BLZ im Dezember 2009 1 Aberrations An optical aberration is a distortion in the image formed by an optical system compared to the original. It can arise for a number of reasons

More information

Perspective. Cornell CS4620/5620 Fall 2012 Lecture Kavita Bala 1 (with previous instructors James/Marschner)

Perspective. Cornell CS4620/5620 Fall 2012 Lecture Kavita Bala 1 (with previous instructors James/Marschner) CS4620/5620: Lecture 6 Perspective 1 Announcements HW 1 out Due in two weeks (Mon 9/17) Due right before class Turn it in online AND in class (preferably) 2 Transforming normal vectors Transforming surface

More information

MIT CSAIL Advances in Computer Vision Fall Problem Set 6: Anaglyph Camera Obscura

MIT CSAIL Advances in Computer Vision Fall Problem Set 6: Anaglyph Camera Obscura MIT CSAIL 6.869 Advances in Computer Vision Fall 2013 Problem Set 6: Anaglyph Camera Obscura Posted: Tuesday, October 8, 2013 Due: Thursday, October 17, 2013 You should submit a hard copy of your work

More information

Using Optics to Optimize Your Machine Vision Application

Using Optics to Optimize Your Machine Vision Application Expert Guide Using Optics to Optimize Your Machine Vision Application Introduction The lens is responsible for creating sufficient image quality to enable the vision system to extract the desired information

More information