The eye, displays and visual effects

Week 2, IAT 814, Lyn Bartram

Visible light and surfaces

Perception is about understanding patterns of light. Visible light constitutes a very small part of the electromagnetic spectrum: humans can perceive light only in the range of 400 to 700 nanometers. At wavelengths shorter than 400 nm are ultraviolet light and X-rays; at wavelengths longer than 700 nm are infrared light, microwaves, and radio waves.

Surface perception is our primary interface with objects in the world (Gibson): the ambient optical array, optical flow, the world as an information display. How do we get similar information from dots on a screen?

The ambient optical array

The ambient optical array is a term describing the array of light that arrives from all directions at some designated point in the environment. Simulating the appearance of the bundle of rays that would pass through a glass rectangle is one of the goals of computer graphics.

Textured surfaces and texture gradients

Surface texture is one of the fundamental visual properties of an object: it helps us see where an object is and what shape it has (orientation, shape, and spatial layout). The texture gradient of the ground is important in space perception, and even subtle texture is needed for 3D. Texture can also be used to carry information, but subtle texturing exceeds the pixel capacities of most displays.

The paint model of surfaces

Surfaces in nature are endlessly varied and complex. Microtextures give irregular patterns of reflection, so the amount and color of reflected light can vary with both the illumination angle and the viewing angle. The paint model is a simple model that approximates many common materials.

Shape from shading: a simplified shading model may be embedded in our visual systems, which would explain why these models work so well. (a) Lambertian shading only. (b) Lambertian shading with specular and ambient shading. (c) Lambertian shading with specular, ambient, and cast shadows.
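The Lambertian, specular, and ambient terms named in (a)–(c) can be combined in a few lines. This is a minimal sketch, not the lecture's own code; the `shade` helper, its constants, and the Phong-style specular term are illustrative choices:

```python
import math

def shade(normal, light_dir, view_dir, base_color,
          ambient=0.1, specular_strength=0.5, shininess=32):
    """Lambertian diffuse + Phong-style specular + ambient (an approximation)."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def norm(v):
        m = math.sqrt(dot(v, v))
        return tuple(x / m for x in v)

    n, l, v = norm(normal), norm(light_dir), norm(view_dir)
    diffuse = max(dot(n, l), 0.0)                       # Lambert's cosine law
    # Reflect the light direction about the normal: r = 2(n.l)n - l
    r = tuple(2 * dot(n, l) * nx - lx for nx, lx in zip(n, l))
    spec = specular_strength * max(dot(r, v), 0.0) ** shininess
    return tuple(min(c * (ambient + diffuse) + spec, 1.0) for c in base_color)

# Light directly above a horizontal red surface, viewer also directly above:
print(shade((0, 0, 1), (0, 0, 1), (0, 0, 1), (0.8, 0.2, 0.2)))
```

The specular term is added on top of the tinted diffuse and ambient terms, which is why highlights take the colour of the illuminant rather than the surface, as the glossy-leaves example on the next slide shows.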

Glossy leaves: the highlights are the colour of the illuminant. Specular light reveals the details of surface structure, but it depends on viewpoint. (The eye and visual effects, IAT814, 12.01.2009)

How the eye works

The human eye, like a camera, contains the equivalents of a lens, an aperture (the pupil), and a film (the retina). The lens focuses a small, inverted picture of the world onto the retina. The iris performs the function of a variable aperture, helping the eye to adjust to different lighting conditions. Some people find it difficult to understand how we can see the world properly when the image is upside down. The right way to think about this is to adopt a computational perspective: we do not perceive what is on the retina; instead, our brains compute a percept based on sensory information. Inversion of the image is the least of the brain's computational problems.

The machinery; perception of 3D visual space; perceptual features.

The visual pathway: visual scene → photoreceptors (rods and cones) → other retinal cells → retinal ganglion cells (P and M pathways) → optic nerve → LGN → primary visual cortex → secondary visual cortex ("brain pixels").

Receptive fields

A neuron's receptive field is the area of the retina that results in the neuron being stimulated. It can respond to a pattern, such as edges and lines (http://psych.hanover.edu/krantz/receptive/), and is used to explain a variety of brightness and contrast effects. When excited, the cell emits messages at a greater rate; when inhibited, it fires less.

Visual angle and field of view (FOV)

The visual angle is the angle subtended by an object on the eye of an observer. With normal vision we can distinguish an object that subtends 1/60 of a degree (1 arc minute) at a distance of 20 feet. The FOV is about 150° per eye.
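The visual angle can be computed directly with the standard small-triangle formula; this is a quick sketch (the 20-foot viewing distance is from the slide, converted to roughly 6.1 m):

```python
import math

def visual_angle_arcmin(size, distance):
    """Visual angle subtended by an object of the given size at the given
    distance (same units for both), in arc minutes."""
    return math.degrees(2 * math.atan(size / (2 * distance))) * 60

# Invert it: how large is a 1-arc-minute target at 20 feet (~6.1 m)?
theta = math.radians(1 / 60)                      # 1 arc minute
size_mm = 2 * 6.1 * math.tan(theta / 2) * 1000    # roughly 1.8 mm
print(round(size_mm, 2), round(visual_angle_arcmin(size_mm / 1000, 6.1), 2))
```

So "normal vision" resolution at 20 feet corresponds to a target of a couple of millimetres, which matches the scale of Snellen-chart letter strokes.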

Depth of focus

The eye contains a compound lens. Diopters define the focal length of a lens, and the range of diopters is the range over which the lenses in the eye can adapt. Oculomotor control changes the shape of the lenses; this flexibility diminishes with age, reducing focal capacity. Depth of focus is the range over which objects are in focus when the eye (lens) is adjusted for a particular distance. With the eye focused at infinity, objects from about 3 m to infinity are in focus; focused at 50 cm (a monitor), objects from about 7 cm in front to 10 cm behind are in focus. This is important for VR displays, where depth-of-focus effects are hard to model.

Depth of focus: augmented reality (AR)

Augmented-reality systems involve superimposing visual imagery on the real world so that people can see a computer-graphics-enhanced view of the world. It is easier to perceive both the virtual and the real when they share the same depth of focus, so the focal plane of the virtual imagery should be set at the real depth. Different focal depths enforce perceptual distinction and enable selective attention (HUDs), but cause problems with distance estimation.
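The numbers above follow from the reciprocal relationship between distance and diopters. A sketch, assuming an illustrative depth-of-focus tolerance of ±0.3 D (that tolerance is not from the slide; it is chosen because it roughly reproduces the slide's figures):

```python
def in_focus_range(focus_distance_m, tolerance_diopters=0.3):
    """Near and far limits of acceptable focus when the eye is accommodated
    at focus_distance_m. Diopters = 1 / distance in metres; the +/-0.3 D
    tolerance is an assumed, illustrative value."""
    power = 1.0 / focus_distance_m
    near = 1.0 / (power + tolerance_diopters)
    far_power = power - tolerance_diopters
    far = float("inf") if far_power <= 0 else 1.0 / far_power
    return near, far

near, far = in_focus_range(0.5)               # focused on a monitor at 50 cm
print(round(50 - near * 100, 1), round(far * 100 - 50, 1))  # cm in front / behind
near_inf, far_inf = in_focus_range(1e9)       # focused (effectively) at infinity
print(round(near_inf, 1), far_inf)            # roughly 3.3 m out to infinity
```

With this tolerance the model gives about 6.5 cm in front and 8.8 cm behind at 50 cm, and about 3.3 m to infinity for distant focus, close to the 7 cm / 10 cm and 3 m figures quoted above.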

Depth of focus: virtual reality (VR)

Virtual-reality (VR) displays block out the real world, unlike the see-through augmented-reality displays discussed previously. Ideally, objects on which the user fixates should be in sharp focus, while objects farther away or nearer should be blurred to the appropriate extents. Helmet-mounted displays (HMDs) set the screen at 2 m, giving a depth of focus of roughly 1.2 to 6 m, so image blur is needed to model out-of-focus effects. Depth-of-focus simulation is difficult and computationally expensive, and the large pixels and low resolution of VR displays prevent effective image blur.

Depth of focus: flat displays

Simulating depth of focus using a flat-screen display is a major technical problem. It has two parts: simulating optical blur, and simulating the optical distance of the virtual object. There is also the problem of knowing what the user is looking at, so that the object of attention can be made sharp while other objects are displayed as though out of focus; this calls for eye tracking for adaptive focus rendering.
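How much blur does a focus mismatch produce? A common small-angle approximation says the retinal blur circle grows with pupil size and defocus in diopters. A sketch (the 4 mm pupil is an assumed, typical indoor value, not a figure from the lecture):

```python
import math

def blur_circle_arcmin(pupil_diameter_mm, focus_distance_m, object_distance_m):
    """Approximate angular size of the retinal blur circle for an object off
    the focus plane, using the small-angle rule
    blur (radians) ~ pupil diameter (metres) x defocus (diopters)."""
    defocus = abs(1 / focus_distance_m - 1 / object_distance_m)
    return math.degrees((pupil_diameter_mm / 1000) * defocus) * 60

# HMD screen fixed at 2 m, but a virtual object rendered as if at 0.5 m:
print(round(blur_circle_arcmin(4, 2.0, 0.5), 1))  # ~20 arc minutes of blur
```

Twenty arc minutes is far coarser than foveal acuity, which is why near virtual objects on a fixed-focus HMD screen should look noticeably soft if blur were modelled faithfully.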

Chromatic aberration

The human eye is not corrected for chromatic aberration, which means that different wavelengths of light are focused at different distances within the eye. Short-wavelength blue light is refracted more than long-wavelength red light. The chromatic aberration of the eye can give rise to strong illusory depth effects (chromostereopsis).

Visual acuities

Visual acuities are measurements of our ability to see detail. They give us an idea of the ultimate limits on the information densities that we can perceive. Most acuity measurements suggest that we can resolve detail, such as the presence of two distinct lines, down to about 1 arc minute (1 degree = 60 arc minutes = 3600 arc seconds). This is in rough agreement with the spacing of receptors in the center of the fovea.

Point acuity (1 arc minute): the ability to resolve two distinct point targets.
Grating acuity (1-2 arc minutes): the ability to distinguish a pattern of bright and dark bars from a uniform grey patch.
Letter acuity (5 arc minutes): the ability to resolve a letter. The Snellen eye chart is a standard way of measuring this ability; 20/20 vision means that a 5-arc-minute target can be seen 90% of the time.
Stereo acuity (10 arc seconds): the ability to resolve objects in depth, measured as the difference between two angles for a just-detectable depth difference.
Vernier acuity (10 arc seconds): the ability to see whether two line segments are collinear.
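These limits translate directly into display requirements: 1-arc-minute point acuity corresponds to about 60 pixels per degree of visual angle. A sketch for checking a display against that limit (the 110-ppi monitor and 24-inch viewing distance are hypothetical values):

```python
import math

def pixels_per_degree(ppi, viewing_distance_inches):
    """Number of display pixels that fall within one degree of visual angle."""
    one_degree_inches = 2 * viewing_distance_inches * math.tan(math.radians(0.5))
    return ppi * one_degree_inches

# A hypothetical 110-ppi desktop monitor viewed from 24 inches:
ppd = pixels_per_degree(110, 24)
print(round(ppd))  # about 46: short of the ~60 px/deg that 1-arc-minute
                   # point acuity implies, so individual pixels remain resolvable
```

Moving the same monitor farther away, or raising its pixel density, pushes it above 60 px/deg, at which point finer pixels stop adding perceivable detail at the fovea.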

Acuity distribution over the visual field

If we look directly ahead and hold our arms straight out to either side, we can just see both hands when we wiggle our fingers. This tells us that both eyes together provide a visual field of a bit more than 180 degrees. The fact that we cannot see our fingers until they move also tells us that motion sensitivity in the periphery is better than static sensitivity. Binocular viewing improves acuity by about 7% as compared with monocular viewing, within a roughly triangular region of binocular overlap in which both eyes receive input. Acuity outside of the fovea drops rapidly: we can only resolve about one-tenth the detail at 10 degrees from the fovea.
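The falloff is often summarized with a simple hyperbolic model. This sketch is illustrative only: the constant is fitted here so that acuity at 10 degrees is exactly one tenth of foveal acuity, as stated above, and is not a published estimate:

```python
def relative_acuity(eccentricity_deg, e2=10 / 9):
    """Hyperbolic falloff of acuity with eccentricity, relative to the fovea.
    e2 (the eccentricity at which acuity halves) is chosen so that acuity at
    10 degrees is one tenth of foveal acuity; real estimates vary."""
    return 1.0 / (1.0 + eccentricity_deg / e2)

print(relative_acuity(0), round(relative_acuity(10), 3))  # 1.0 at the fovea
```

Under this model, acuity already halves barely one degree away from the fovea, which is why only a tiny central patch of a display is ever seen at full resolution.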

Brain pixels

Brain pixels are the image units used by the brain to process space. Retinal ganglion cells are neurons that send information from the eyeball up the optic nerve to the cortex; each one pools information from many rod and cone receptors. In the fovea, a single ganglion cell may be devoted to a single cone; in the far periphery, each ganglion cell receives information from thousands of rods and cones. One nerve fiber (axon) comes from each ganglion cell, and there are about a million axons in each optic nerve.

P and M pathways

Ganglion cells project from the retina to the lateral geniculate nucleus (LGN) along two pathways: the parvocellular (P) pathway, whose small, slow-conducting neurons carry colour and detailed shape; and the magnocellular (M) pathway, whose big, fast neurons carry gross shape, luminance, and motion.

Visual efficiency of displays

How many brain pixels are stimulated by a display? Two types of inefficiency occur when we view displays. At the fovea there are many brain pixels for each screen pixel, so higher-resolution screens would definitely help foveal vision. Off to the side, however, the situation is reversed: there are many more screen pixels than brain pixels. We are, in a sense, wasting information, because the brain cannot appreciate the detail, and we could easily get away with fewer pixels.

Brain pixels and the optimal display

Even though a conventional monitor covers only about 5-10% of our visual field when viewed normally, it stimulates almost 50% of brain pixels. Thus, even if we could have very high-resolution, large screens, we would not be getting much more information into the brain, given a single viewing position. Computer screens are currently about the right size for most tasks, though large screens certainly have their uses in supporting many viewers. A better match of screen pixels to brain pixels in the periphery is possible: see "Dual-Resolution Stereoscopic Display with Scene-Adaptive Fovea Boundary", G. Godin, J.-F. Lalonde, L. Borgeat, NRC Ottawa.

Human spatial acuity

The spatial modulation sensitivity function measures the sensitivity of vision to the range of contrast that can be detected, and how this varies with spatial frequency. Spatial frequency is a measure of how rapidly a property changes in space. A commonly used form of visual stimulus consists of vertical bars whose lightness varies according to a sinusoidal function; in this simple case, the spatial frequency of the stimulus is just the frequency of the function used to generate the pattern. In general, stimuli with fine detail, including sharp edges, have high spatial frequency, while stimuli whose properties change more slowly in space have low spatial frequency. This sensitivity is summarized by the spatial contrast sensitivity function.
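The sinusoidal-bar stimulus described above is easy to generate. A minimal sketch (the width, cycle count, and contrast values are arbitrary examples):

```python
import math

def sinusoidal_grating(width, cycles, contrast=1.0):
    """One row of a vertical sinusoidal grating: luminance values in [0, 1]
    with `cycles` full light/dark periods across `width` samples."""
    return [0.5 + 0.5 * contrast * math.sin(2 * math.pi * cycles * x / width)
            for x in range(width)]

row = sinusoidal_grating(width=360, cycles=3, contrast=0.8)
lo, hi = min(row), max(row)
print(round(lo, 3), round(hi, 3))       # luminance swings between 0.1 and 0.9
print(round((hi - lo) / (hi + lo), 3))  # Michelson contrast recovers 0.8
```

Sweeping `cycles` (spatial frequency) and lowering `contrast` until the bars are no longer distinguishable from uniform grey is exactly how the contrast sensitivity function is measured.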

Spatial resolution sensitivity

Spatial sensitivity falls off at both high and low frequencies; we are most sensitive at around 2-3 cycles per degree. We don't see low-frequency variation, such as non-uniformity in monitors. Sensitivity also varies with age. Most tests of visual acuity, such as letter or point acuity, are tests of high-frequency resolution, but it is increasingly apparent that low-frequency resolution is extremely important: pilot performance measures (Ginsburg, 1982) and icon discrimination (Queen, 2007).

Spatial frequency filtering

The visual system maintains a set of scales that we associate with distance. If we see an object thought to have great size, say a building, but that takes up little space on the retina (i.e., it looks very small), we immediately perceive it as being far away rather than perceiving it as a miniature building. The perception of scale is actually based on the encoding of visual spatial frequency (Schyns & Oliva, 1994). This is interesting because you can encode images in specific spatial frequencies (Schyns & Oliva, 1999).

Spatial frequency filtering (continued)

This phenomenon is based on our inability to perceive high-frequency information from greater distances: if the image has no distinctive low-frequency component, it simply disappears when viewed from a distance (Matt Queen, http://www.boxesandarrows.com/view/icon_analysis). Where does this apply? Consider icon design.
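The "disappearing" effect can be imitated with a crude low-pass filter: averaging neighbouring samples, which is roughly what increased viewing distance does to fine detail. A sketch (the moving-average filter is an illustrative stand-in, not a model of the eye's optics):

```python
def low_pass(signal, radius=2):
    """Moving-average (box) filter: a crude stand-in for viewing from a
    distance, where high-spatial-frequency detail is no longer resolvable."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

# A fine black/white alternation (high frequency only) averages toward
# featureless mid-grey, i.e. the pattern "disappears" at a distance:
fine = [0, 1] * 8
blurred = low_pass(fine, radius=2)
print([round(v, 2) for v in blurred[4:12]])  # values hover around 0.5
```

An icon whose identity rests only on such fine strokes reads as a grey blob when small or distant; one with a distinctive low-frequency silhouette survives the filtering.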

Aliasing

Aliasing can cause all kinds of unwanted effects: patterns that should be invisible, because they are beyond the resolving power of the human eye, can become all too visible. Anti-aliasing consists of computing the average of the light pattern that is represented by each pixel. Proper anti-aliasing can be a more cost-effective solution than simply increasing the number of pixels in the display. Aliasing can sometimes be useful, for example for judging horizontality or small misalignments (vernier acuity). (Figure: anti-aliasing of an input pattern through a pixel matrix to an output pattern.)
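"Computing the average of the light pattern represented by each pixel" can be sketched as supersampling with a box filter; this one-dimensional example is illustrative (function names and sample counts are my own):

```python
def antialias_row(scene, pixels, samples_per_pixel=8):
    """Box-filter anti-aliasing: each output pixel is the average of several
    point samples of the underlying continuous pattern.
    `scene` maps a position in [0, 1) to a luminance in [0, 1]."""
    out = []
    for p in range(pixels):
        xs = ((p + (s + 0.5) / samples_per_pixel) / pixels
              for s in range(samples_per_pixel))
        out.append(sum(scene(x) for x in xs) / samples_per_pixel)
    return out

# A hard black-to-white edge at x = 0.5 lands mid-pixel and becomes grey:
edge = lambda x: 1.0 if x >= 0.5 else 0.0
print(antialias_row(edge, pixels=5))  # [0.0, 0.0, 0.5, 1.0, 1.0]
```

The half-grey pixel is the point of the technique: instead of the edge snapping to a pixel boundary (and crawling as the scene moves), its sub-pixel position is encoded in the pixel's average brightness.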

Temporal requirements

50-Hz flicker is about the limit of resolution that most of us can perceive, hence the 50-75-Hz refresh rate of the typical monitor. Temporal aliasing artifacts are common in computer graphics and movies, such as the reversing wagon wheel. They are pronounced when the image update rate is low: some visualization systems update objects only about 10 times per second even though the screen is refreshed at 60 Hz or better. A poor example of adaptive rendering! Motion blur can be used to compensate, but it is expensive to compute.
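The reversing wagon wheel is ordinary sampling aliasing in time: the frame rate folds the true repetition rate into a narrow band, and a negative folded rate looks like backwards motion. A minimal sketch (function name and example rates are my own):

```python
def perceived_rate_hz(true_rate_hz, frame_rate_hz):
    """Apparent repetition rate of a periodic pattern sampled at frame_rate_hz.
    Sampling folds the true rate into (-frame_rate/2, frame_rate/2]; a
    negative result is the classic backwards-turning wagon wheel."""
    alias = true_rate_hz % frame_rate_hz
    if alias > frame_rate_hz / 2:
        alias -= frame_rate_hz
    return alias

# Spokes passing 23 times per second, filmed at 24 frames per second,
# appear to drift slowly backwards:
print(perceived_rate_hz(23.0, 24.0))  # -1.0
```

The same arithmetic explains why a 10-updates-per-second object on a 60 Hz screen judders: any motion component above 5 Hz in the object's update stream is folded into a spurious low-frequency one.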