Panoramic imaging


Camera projections

Recall the plenoptic function I(x, y, z, ϕ, θ, λ, t). At any point (x, y, z) in space, there is a full sphere of possible incidence directions (ϕ, θ), covered by 0 ≤ ϕ < 2π, 0 ≤ θ ≤ π. A regular camera captures the incident rays from some region around a forward direction and projects these directions onto rectilinear image coordinates (u, v) in the image plane by a linear perspective projection, or something reasonably close. This type of projection limits the field of view to strictly less than 180 degrees. Most cameras in fact have a field of view (FOV) of only around 45 degrees; a lens with a field of view of 90 degrees is considered a very wide angle lens.

The planar perspective projection is in fact unsuitable for wide angles. Objects at the image edges are projected in a very oblique direction towards the plane, and their proportions are heavily distorted. A projection with a FOV close to 180 degrees spends most of its pixels on objects at the extreme edge of the view, whereas the important parts of the image are most probably in the middle.

Figure 1: Planar perspective projections with increasing field of view. The scene from above (the small black object in the middle marks the camera position for the following views), then 45 degrees FOV (normal view), 90 degrees FOV (very wide angle view), 150 degrees FOV (extreme), 170 degrees FOV (strange), 175 degrees FOV (useless).

For very wide angle lenses, the planar projection is abandoned in favor of the fisheye projection. A fisheye projection is a projection through a projection reference point, just like the planar perspective projection, but it is performed against a sphere instead of a plane. The projection reference point is at the center of the sphere. The surface of the sphere is then mapped to a planar

image, often so that the angle of incidence is mapped linearly to the radial distance from the image center. Fisheye projections can have a very large field of view, 180 degrees or even more. The projection as such lends itself to a full 360 degrees field of view, even though it is difficult to design actual lenses with a FOV significantly larger than 180 degrees.

Figure 2: A fisheye projection with 180 degrees FOV

Panorama mappings

There are of course infinitely many ways of mapping from (ϕ, θ) to (u, v) when recording an image. Planar perspective projection and fisheye projection are just two examples. The planar perspective projection happens to be practical for most applications, and when it is inappropriate, the fisheye mapping will often do the job. Other mappings can be useful, though. In particular, it is sometimes desirable to capture image data in every possible direction from one single point of view. An image with a full 360 degrees field of view is called a panoramic image, or a panorama.

Recording and storing such images presents a mapping problem. All image recording and storage media that exist today are essentially flat. Digital images are perhaps not flat in the normal sense of the word, but they are two-dimensional data structures with an equidistant rectilinear mapping of (u, v) image position to pixel coordinates, which really implies a flat image. The mapping for a panorama can be chosen in a number of different ways; the problem is equivalent to the cartographic problem of making a flat map of the whole Earth.

Perhaps the most straightforward mapping is to map (ϕ, θ) directly onto (u, v). This is called a spherical mapping. Another option is to exclude the top and bottom parts of the sphere and project the remainder against a cylinder. Such an image is called a cylindrical mapping. A third possibility is to map the environment sphere onto the six faces of a cube, and store the panorama as six square images. Quite logically, this mapping is called a cubical mapping. Each of these three mappings has its benefits.
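The three mappings can be summarized by how a direction (ϕ, θ) or a direction vector ends up in image coordinates. A minimal sketch in Python (my own illustration; the function and face names are not from the text):

```python
import math

def spherical_map(phi, theta, width, height):
    """Spherical (equirectangular) mapping: (phi, theta) -> (u, v).
    phi in [0, 2*pi) maps linearly to u, theta in [0, pi] to v."""
    u = phi / (2.0 * math.pi) * width
    v = theta / math.pi * height
    return u, v

def cylindrical_map(phi, theta, width, height, theta_min, theta_max):
    """Cylindrical mapping: phi maps linearly to u; the vertical image
    coordinate comes from projecting the direction onto a unit cylinder,
    with the band [theta_min, theta_max] covering the image height
    (directions near the poles are excluded, as the text notes)."""
    u = phi / (2.0 * math.pi) * width
    y = math.tan(math.pi / 2.0 - theta)          # height on the cylinder
    y_top = math.tan(math.pi / 2.0 - theta_min)
    y_bot = math.tan(math.pi / 2.0 - theta_max)
    v = (y_top - y) / (y_top - y_bot) * height
    return u, v

def cube_face(x, y, z):
    """Cubical mapping, face-selection step: a direction vector (x, y, z)
    projects onto the face of the unit cube whose axis has the largest
    absolute component."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return 'right' if x > 0 else 'left'
    if ay >= ax and ay >= az:
        return 'up' if y > 0 else 'down'
    return 'front' if z > 0 else 'back'
```

Within each cube face, the remaining step is an ordinary planar perspective projection with a 90 degree field of view, which is what makes the cubical mapping so regular.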

Figure 3: Common panorama mappings. From top to bottom: spherical, cylindrical, cubical.

Other mappings exist. One relatively convenient way of storing a panorama would be to store two 180 degree fisheye images, each covering one hemisphere. It is also possible to use a fisheye mapping extended to a 360 degrees field of view. Such mappings are irregular and highly non-uniform, but they are sometimes used for environment maps, and recently also for image based lighting. The light probe mapping described and used by Debevec et al. is in fact a 360 degree fisheye projection, and the dual paraboloid environment mapping described by Heidrich and Seidel is a kind of dual fisheye projection.

Panorama remappings

One of the advantages of digital images over traditional image media is that they can easily be non-uniformly spatially resampled (warped) in a totally arbitrary manner. It is therefore possible to remap panoramas from one particular projection to any other, but some caution is advisable, because the resampling might throw away significant amounts of data. Different panorama mappings have different sampling densities, and because there is simply no way of making a perfectly uniform mapping from the surface of a sphere to a plane, the sampling density will always vary somewhat over the panoramic image.

Converting a panorama from one mapping to another means resampling an image from one irregular, curvilinear coordinate system to another, which is a lot more tricky than resampling between regular, rectilinear coordinate systems. When dealing with panoramas, it is even more important than with regular images to use a higher resolution for the input image than for the final output image, and to avoid repeated remapping operations. Using software with a good resampling algorithm can make a big difference; commercial image editing tools are often insufficient. A very high quality resampling package is Panorama Tools by Helmut Dersch. It is free, and it is designed for tricky panoramic remappings.
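A remapping like the ones described above is usually done by inverse mapping: for each output pixel, compute the corresponding direction on the sphere, then sample the input panorama there. A bare-bones sketch (my own illustration), assuming a spherical (equirectangular) input and a 360 degree angular fisheye output, with nearest-neighbour sampling for brevity; a real remapper should use a proper reconstruction filter, which is exactly where the resampling quality mentioned above comes in:

```python
import math

def remap_spherical_to_fisheye(src, size):
    """Remap a spherical panorama (a list of rows of pixels, covering
    phi in [0, 2*pi) horizontally and theta in [0, pi] vertically) to a
    size-by-size angular fisheye image. Each output pixel is inverse
    mapped to a direction and the source is sampled there with
    nearest-neighbour lookup. Pixels outside the fisheye circle are
    left as None."""
    h_src = len(src)
    w_src = len(src[0])
    out = [[None] * size for _ in range(size)]
    for j in range(size):
        for i in range(size):
            # pixel position relative to the image centre, radius 1 = rim
            x = (i + 0.5) / size * 2.0 - 1.0
            y = (j + 0.5) / size * 2.0 - 1.0
            r = math.hypot(x, y)
            if r > 1.0:
                continue
            # angular fisheye: radius maps linearly to the angle from
            # the projection axis (0 at the centre, pi at the rim)
            theta = r * math.pi
            phi = math.atan2(y, x) % (2.0 * math.pi)
            u = min(int(phi / (2.0 * math.pi) * w_src), w_src - 1)
            v = min(int(theta / math.pi * h_src), h_src - 1)
            out[j][i] = src[v][u]
    return out
```

Note how the sampling density varies: near the centre of the fisheye, many output pixels read from a small region of the source, while near the rim one source row is stretched around the whole circle.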
Panorama acquisition

Panoramic images are not a new invention. Panorama cameras have been available since at least the beginning of the 20th century, even though they have never had a big market. There are several different principles for acquiring a panoramic image, and which one is best depends heavily on the application. Each of the methods presented below has its own merits.

Scanning panorama cameras

Classic panorama cameras, and some modern digital versions as well, operate by the scanning slit principle. The shutter opening of the camera is a vertical slit, the camera rotates on a tripod during the exposure, and the film is fed past the slit at a speed that matches the translation of the scene as a result of the camera rotation. The width of the slit can be adjusted to change the aperture. Scanning slit panoramic cameras use standard film and capture a 360 degree panorama horizontally on a wide film strip. Their vertical field of view, however, is comparable to that of a regular camera and therefore significantly less than 180 degrees. The panoramas obtained in this manner are cylindrical panoramas. Scanning slit cameras can also capture panoramas with less than 360 degrees horizontal field of view, simply by stopping before one full turn is completed.

One-shot, single view panorama cameras

It is possible to design optical systems that directly capture the full environment sphere. A fisheye lens with a 360 degrees field of view is impossible to build, but by using curved mirrors a 360 degrees field of view can be achieved.

The simplest setup uses a reflective sphere. An image of a reflective sphere contains direct reflections from all (or at least most) incidence directions. Such images can be used straight off as panoramas, but their bad sampling uniformity makes it advisable to remap them before storage. A reflective sphere image may be remapped to a 360 degree fisheye projection by a resampling in the radial direction.

Figure 4: An image of a reflective sphere. Background masked black for clarity.

One-shot, multi-view panorama cameras

Another way of designing a camera that captures a panorama in a single shot is to use multiple lenses and acquire images from all lenses simultaneously, either by using multiple cameras, or by using mirrors or prisms to combine images from several lenses in a single frame. Two 180 degree fisheye images are enough to cover an entire sphere, but many fisheye lenses have problems with the image quality towards the edges, with significant defocus, color aberration, flare and light falloff. For better image quality, four fisheye lenses in a tetrahedral configuration can be used, but that of course requires twice as much hardware. A cubical panorama can be acquired by six cameras facing front, back, left, right, up and down, each with a 90 degree field of view. A lens with a 90 degrees FOV is an unusually wide angle lens which is not readily available as inexpensive standard equipment, but such lenses do exist.

A problem which becomes apparent with single-shot panoramas is that the photographer cannot hold the camera, or even be near it, without being in clear view in the shot. This can be solved either by remote control or a timer release. The camera mount will still present a problem, though. Sometimes it is acceptable to leave it in the image; other times some manual retouching of the image might be required. Scanning or multi-shot panorama acquisition methods make it easier to keep the photographer and the camera mount out of view.
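The radial resampling from a reflective sphere image to a 360 degree fisheye, mentioned above, follows from the reflection geometry. A small sketch of the radial relation (my own illustration, assuming an ideal mirror sphere viewed from far away, i.e. orthographically):

```python
import math

def mirrorball_radius_to_angle(r):
    """For an ideal mirror sphere viewed orthographically, a point at
    normalized image radius r (0 = centre, 1 = rim) has a surface
    normal tilted by asin(r) from the viewing axis, so the reflected
    ray arrives from an angle theta = 2*asin(r) away from the viewing
    direction. The rim (r = 1) reflects light from directly behind
    the sphere, which is why one image covers (almost) all directions."""
    return 2.0 * math.asin(min(max(r, 0.0), 1.0))

def mirrorball_to_fisheye_radius(r):
    """Radial resampling step: in an angular fisheye, equal angles map
    to equal radii, so normalize the angle by pi."""
    return mirrorball_radius_to_angle(r) / math.pi
```

The non-uniformity is visible in this relation: the outer rim of the mirror-ball image, where r changes slowly with angle, has to supply almost half of the fisheye's angular range.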
Panoramic movie cameras

One-shot panorama cameras (single- or multi-view) can of course capture movies instead of still images. Panoramic film or video sequences have found some applications, and a couple of panorama video systems are on the market.

Multi-shot panoramas

Instead of acquiring several images at once, a single camera can be used to capture each part of a panorama in sequence. A 180 degree fisheye lens ideally requires only two shots, even though three or four might be needed to better avoid edge problems. A fisheye lens makes the method quite convenient, but even a regular planar projection lens with a smaller field of view can be used, along with a good tripod mount, to capture a dozen or more images in different directions. It takes some cutting and pasting to combine the images into a panorama, but it requires no special camera equipment. Furthermore, the image quality is excellent, because the method uses many high quality source images, each with a very uniform sampling. The recombination of several images into a panorama is called stitching. Software tools exist to assist panorama photographers in stitching, and even make the process more or less automatic.

Figure 5: Unfinished stitched panorama (6 images placed, each with 45 degrees FOV)

One disadvantage of stitching, apart from it being cumbersome, is that the images are not taken simultaneously. If the scene or the light conditions change while the images are acquired, there will be problems stitching them together into a consistent panorama. Effects from strong light sources, such as glare and lens flare, will also present a problem, since glare and flare will only appear in images where a light source is visible, and not in adjacent images in the sequence.

Panorama viewers

A panorama can be viewed in two quite different ways. Either the entire panorama is displayed on a surface that encloses the viewer and lets him or her look around freely; this method is suitable for large audiences, but it requires expensive display equipment and a lot of open space.
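As a footnote to the stitching method above: the number of shots needed for one full 360 degree row follows from the per-image field of view and the overlap the stitching software needs to align adjacent images. The function below is my own illustration, not from the text:

```python
import math

def shots_for_panorama(horizontal_fov_deg, overlap_fraction):
    """Number of shots in one full 360-degree row, when each image
    covers horizontal_fov_deg and adjacent images overlap by the given
    fraction (stitching needs some overlap to match them up)."""
    effective = horizontal_fov_deg * (1.0 - overlap_fraction)
    return math.ceil(360.0 / effective)

# e.g. a 45 degree FOV lens (as in the stitched panorama of Figure 5)
# with a quarter-frame overlap:
n = shots_for_panorama(45.0, 0.25)  # -> 11 shots for one row
```

This is for a single horizontal row; covering the poles as well takes additional rows or extra up/down shots.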
A more common method is to use an interactive computer program to remap part of the panorama to a planar perspective projection with a smaller field of view, giving the user control over the direction of view, and possibly also over the field of view. This gives the viewer an impression of being in a scene and looking around with a camera, which is often enough to invoke a sense of immersion. Remapping a panoramic projection to a planar perspective projection is not a simple operation, and an image contains a lot of data; it is in fact only recently that computers have become able to handle that kind of heavy computation with reasonable speed and sufficient image quality. The recent development of inexpensive 3D graphics accelerators also presents an alternative mapping method: the panorama can be placed as a texture on the inside of a three-dimensional

shape that matches the panorama projection (a sphere, a cylinder or a cube), and the viewing can then be taken care of by hardware accelerated rendering of the view from a virtual camera placed at the center. Cubical panoramas are particularly suitable for this, because their projection fits nicely into today's polygon and texture rendering pipelines.

Commercial applications

There are actually not that many commercial applications for panoramic images, but now that computers are able to handle them more easily and with high quality, we will probably see more of them in the near future. A few computer games using panoramic images have been released over the years, the most recent one being Myst III: Exile. In Exile, the panorama viewer is enhanced to simulate glare, and parts of the scene are moving, which gives a very immersive and compelling effect.

For quite a few years now, Apple Computer has been one of the leaders in the development of panoramic imaging with their software QuickTime VR. They deserve a mention for providing reasonably open standards, while some others have been trying to market their software as secret magic. A particularly sad mix of marketing hype, prohibitively expensive software licensing, and repeated and dubious attacks on free software developers comes from the company IPIX. Their products are not bad, but they have made a lot of enemies over the years. As panoramic imaging becomes more commonplace, it will be easier for companies to make a living selling their products, but right now, commercial panoramic imaging is a struggle between only a few actors for shares of a too small market.

Stefan Gustavson


Telescopes and their configurations. Quick review at the GO level

Telescopes and their configurations Quick review at the GO level Refraction & Reflection Light travels slower in denser material Speed depends on wavelength Image Formation real Focal Length (f) : Distance

Chapter 34 Geometric Optics (also known as Ray Optics) by C.-R. Hu

Chapter 34 Geometric Optics (also known as Ray Optics) by C.-R. Hu 1. Principles of image formation by mirrors (1a) When all length scales of objects, gaps, and holes are much larger than the wavelength

Desktop - Photogrammetry and its Link to Web Publishing

Desktop - Photogrammetry and its Link to Web Publishing Günter Pomaska FH Bielefeld, University of Applied Sciences Bielefeld, Germany, email gp@imagefact.de Key words: Photogrammetry, image refinement,

Classical Viewing. Ed Angel Professor of Computer Science, Electrical and Computer Engineering, and Media Arts University of New Mexico

Classical Viewing Ed Angel Professor of Computer Science, Electrical and Computer Engineering, and Media Arts University of New Mexico 1 Objectives Introduce the classical views Compare and contrast image

Image acquisition. In both cases, the digital sensing element is one of the following: Line array Area array. Single sensor

Image acquisition Digital images are acquired by direct digital acquisition (digital still/video cameras), or scanning material acquired as analog signals (slides, photographs, etc.). In both cases, the

Evaluating Commercial Scanners for Astronomical Images. The underlying technology of the scanners: Pixel sizes:

Evaluating Commercial Scanners for Astronomical Images Robert J. Simcoe Associate Harvard College Observatory rjsimcoe@cfa.harvard.edu Introduction: Many organizations have expressed interest in using

The Formation of an Aerial Image, part 3

T h e L i t h o g r a p h y T u t o r (July 1993) The Formation of an Aerial Image, part 3 Chris A. Mack, FINLE Technologies, Austin, Texas In the last two issues, we described how a projection system

Beam Profiling. Introduction. What is Beam Profiling? by Michael Scaggs. Haas Laser Technologies, Inc.

Beam Profiling by Michael Scaggs Haas Laser Technologies, Inc. Introduction Lasers are ubiquitous in industry today. Carbon Dioxide, Nd:YAG, Excimer and Fiber lasers are used in many industries and a myriad

CGS 3034 Lecture 4 Camera Animation

CGS 3034 Lecture 4 Camera Animation Introduction to Computer Aided Animation Instructor: Brent Rossen Overview Depth of Field and Gobo Lighting The Imperfect Camera Maya s Perfect Camera Imperfecting:

La photographie numérique. Frank NIELSEN Lundi 7 Juin 2010

La photographie numérique Frank NIELSEN Lundi 7 Juin 2010 1 Le Monde digital Key benefits of the analog2digital paradigm shift? Dissociate contents from support : binarize Universal player (CPU, Turing

Geometry of Aerial Photographs

Geometry of Aerial Photographs Aerial Cameras Aerial cameras must be (details in lectures): Geometrically stable Have fast and efficient shutters Have high geometric and optical quality lenses They can

Basic principles of photography. David Capel 346B IST

Basic principles of photography David Capel 346B IST Latin Camera Obscura = Dark Room Light passing through a small hole produces an inverted image on the opposite wall Safely observing the solar eclipse

Optical design of a high resolution vision lens

Optical design of a high resolution vision lens Paul Claassen, optical designer, paul.claassen@sioux.eu Marnix Tas, optical specialist, marnix.tas@sioux.eu Prof L.Beckmann, l.beckmann@hccnet.nl Summary:

Unsharp Masking. Contrast control and increased sharpness in B&W. by Ralph W. Lambrecht

Unsharp Masking Contrast control and increased sharpness in B&W by Ralph W. Lambrecht An unsharp mask is a faint positive, made by contact printing a. The unsharp mask and the are printed together after

In literary texts, we speak of the contributing parts as words, phrases, sentences, paragraphs, and chapters. In film, there are:

READING FILMS CRITICALLY Films, like literary texts, can be decoded or read to uncover multiple levels of meaning. While cinema uses language to communicate meaning, it also adds visual imagery, movement

3D Scanning Guide. 0. Login. I. Startup

3D Scanning Guide UTSOA has a Konica Minolta Vivid 910 3D non-contact digitizing system. This scanner is located in the digital fabrication section of the technology lab in Sutton Hall 1.102. It is free

A taste for landscapes

A taste for landscapes NEPG workshop October 2012 Colin White 1 Main ingredients 1. Light 2. Composition 3. Kit 4. Post production 2 Light Hue, direction, dynamic range Time of day - around sunrise or

arxiv:physics/ v1 [physics.optics] 12 May 2006

Quantitative and Qualitative Study of Gaussian Beam Visualization Techniques J. Magnes, D. Odera, J. Hartke, M. Fountain, L. Florence, and V. Davis Department of Physics, U.S. Military Academy, West Point,

Chapter 29/30. Wave Fronts and Rays. Refraction of Sound. Dispersion in a Prism. Index of Refraction. Refraction and Lenses

Chapter 29/30 Refraction and Lenses Refraction Refraction the bending of waves as they pass from one medium into another. Caused by a change in the average speed of light. Analogy A car that drives off

Cameras. Digital Visual Effects, Spring 2008 Yung-Yu Chuang 2008/2/26. with slides by Fredo Durand, Brian Curless, Steve Seitz and Alexei Efros

Cameras Digital Visual Effects, Spring 2008 Yung-Yu Chuang 2008/2/26 with slides by Fredo Durand, Brian Curless, Steve Seitz and Alexei Efros Camera trial #1 scene film Put a piece of film in front of

Panorama Photogrammetry for Architectural Applications

Panorama Photogrammetry for Architectural Applications Thomas Luhmann University of Applied Sciences ldenburg Institute for Applied Photogrammetry and Geoinformatics fener Str. 16, D-26121 ldenburg, Germany

Physics 3340 Spring Fourier Optics

Physics 3340 Spring 011 Purpose Fourier Optics In this experiment we will show how the Fraunhofer diffraction pattern or spatial Fourier transform of an object can be observed within an optical system.