Panoramic imaging

Camera projections

Recall the plenoptic function: I(x, y, z, ϕ, θ, λ, t). At any point (x, y, z) in space, there is a full sphere of possible incidence directions (ϕ, θ), covered by 0 ≤ ϕ ≤ 2π, 0 ≤ θ ≤ π. A regular camera captures the incident rays from some region around a forward direction, and projects these directions onto rectilinear image coordinates (u, v) in the image plane by a linear perspective projection, or something reasonably close. This type of projection limits the field of view to strictly less than 180 degrees. Most cameras in fact have a field of view (FOV) of only around 45 degrees, and a lens with a field of view of 90 degrees is considered a very wide angle lens.

The planar perspective projection is in fact unsuitable for wide angles. Objects at the image edges are projected in a very oblique direction towards the plane and have their proportions heavily distorted. A projection with a FOV close to 180 degrees spends most of its pixels on objects at the extreme edge of the view, whereas the important parts of the image are most probably in the middle.

Figure 1: Planar perspective projections with increasing field of view. The scene from above (the small black object in the middle marks the camera position for the following views), followed by views at 45 degrees FOV (normal view), 90 degrees FOV (very wide angle view), 150 degrees FOV (extreme), 170 degrees FOV (strange) and 175 degrees FOV (useless).
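To make the field-of-view limit concrete, here is a minimal sketch (my own illustration, not part of the original text) of the planar perspective projection: a ray at angle θ from the optical axis lands at distance f·tan(θ) from the image center, which grows without bound as θ approaches 90 degrees. The function name and the unit focal length are assumptions made for the example.

```python
import math

def perspective_project(theta_off_axis, f=1.0):
    """Distance from the image center at which a ray making angle
    theta_off_axis (radians) with the optical axis hits a planar image
    at focal length f. Undefined at 90 degrees and beyond."""
    if theta_off_axis >= math.pi / 2:
        raise ValueError("planar perspective projection cannot reach a 180 degree FOV")
    return f * math.tan(theta_off_axis)

# The image half-width explodes as the half-angle nears 90 degrees:
for half_fov_deg in (22.5, 45.0, 75.0, 85.0, 87.5):   # 45, 90, 150, 170, 175 degrees full FOV
    r = perspective_project(math.radians(half_fov_deg))
    print(f"{2 * half_fov_deg:5.1f} degrees FOV -> image half-width {r:6.2f} focal lengths")
```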

For very wide angle lenses, the planar projection is abandoned in favor of the fisheye projection. A fisheye projection is a projection through a projection reference point, just like the planar perspective projection, but it is performed against a sphere instead of a plane. The projection reference point is at the center of the sphere. The surface of the sphere is then mapped to a planar image, often so that the angle of incidence is mapped linearly to the radial distance from the image center. Fisheye projections can have a very large field of view, 180 degrees or even more. The projection as such lends itself to a full 360 degrees field of view, even though it is difficult to design actual lenses with a FOV significantly larger than 180 degrees.

Figure 2: A fisheye projection with 180 degrees FOV
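A minimal sketch of the equidistant fisheye mapping just described (an illustration only, not code from the text; the function name, the choice of +z as the optical axis and the normalized image radius are my own assumptions):

```python
import math

def fisheye_equidistant(direction, fov_deg=180.0):
    """Map a 3D unit direction (x, y, z), with +z as the optical axis, to
    normalized fisheye image coordinates (u, v). The angle from the axis
    maps linearly to the image radius, with r = 1 at the edge of the FOV."""
    x, y, z = direction
    theta = math.acos(max(-1.0, min(1.0, z)))   # angle of incidence, measured from the axis
    phi = math.atan2(y, x)                      # azimuth around the axis
    r = theta / math.radians(fov_deg / 2.0)     # linear in theta: the equidistant property
    return r * math.cos(phi), r * math.sin(phi)

print(fisheye_equidistant((0.0, 0.0, 1.0)))     # along the axis -> the image center (0, 0)
print(fisheye_equidistant((1.0, 0.0, 0.0)))     # 90 degrees off-axis -> on the image rim
```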

Panorama mappings

There are of course infinitely many ways of mapping from (ϕ, θ) to (u, v) when recording an image. Planar perspective projection and fisheye projection are just two examples. The planar perspective projection happens to be practical for most applications, and when it is inappropriate, the fisheye mapping will often do the job. Other mappings can be useful, though. In particular, it is sometimes desirable to capture image data in every possible direction from one single point of view. An image with a full 360 degrees field of view is called a panoramic image, or a panorama.

Recording and storing such images presents a mapping problem. All image recording and storage media that exist today are essentially flat. Digital images are perhaps not flat in the normal sense of the word, but they are two-dimensional data structures with an equidistant rectilinear mapping of (u, v) image position to pixel coordinates, which really implies a flat image. The mapping for a panorama can be chosen in a number of different ways. The problem is equivalent to the cartographic problem of making a flat map of the whole Earth.

Perhaps the most straightforward mapping is to map (ϕ, θ) directly onto (u, v). This is called a spherical mapping. Another option is to exclude the top and bottom parts of the sphere and project the remainder against a cylinder. Such an image is called a cylindrical mapping. A third possibility is to map the environment sphere onto the six faces of a cube and store the panorama as six square images. Quite logically, this mapping is called a cubical mapping. Each of these three mappings has its benefits.

Figure 3: Common panorama mappings. From top to bottom: spherical, cylindrical, cubical.
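For concreteness, here is an illustrative sketch of the spherical mapping and of one possible cubical mapping (my own code, not from the text; cube face orientation conventions vary between systems, so the details are assumptions):

```python
import math

def spherical_map(phi, theta, width, height):
    """Spherical mapping: phi in [0, 2*pi) and theta in [0, pi] are scaled
    directly to pixel coordinates (u, v) of a width x height image."""
    return phi / (2.0 * math.pi) * width, theta / math.pi * height

def cube_face(direction):
    """Cubical mapping: choose the face from the dominant component of a unit
    direction vector, then project onto that face with a 90 degree FOV.
    Returns (face, s, t) with s, t in [-1, 1]; one consistent convention of many."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return ("+x" if x > 0 else "-x"), -z / x, -y / ax
    if ay >= az:
        return ("+y" if y > 0 else "-y"), x / ay, z / y
    return ("+z" if z > 0 else "-z"), x / z, -y / az

print(spherical_map(math.pi, math.pi / 2.0, 2048, 1024))  # maps to the center of a 2048x1024 image
print(cube_face((0.2, -0.3, 1.0)))                        # mostly forward -> the "+z" face
```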

Other mappings exist. One relatively convenient way of storing a panorama would be to store two 180 degree fisheye images, each covering one hemisphere. It is also possible to use a fisheye mapping extended to a 360 degrees field of view. Such mappings are irregular and highly non-uniform, but they are sometimes used for environment maps, and recently also for image based lighting. The light probe mapping described and used by Debevec et al. [http://www.debevec.org] is in fact a 360 degree fisheye projection, and the dual paraboloid environment mapping described by Heidrich and Seidel [http://www.cs.ubc.ca/~heidrich/papers/index.html] is a kind of dual fisheye projection.

Panorama remappings

One of the advantages of digital images over traditional image media is that they can easily be non-uniformly spatially resampled (warped) in a totally arbitrary manner. It is therefore possible to remap panoramas from one particular projection to any other projection, but some caution is advisable, because the resampling might throw away significant amounts of data. Different panorama mappings have different sampling densities, and because there is simply no way of making a perfectly uniform mapping from the surface of a sphere to a plane, the sampling density will always vary somewhat over the panoramic image. Converting a panorama from one mapping to another means resampling an image from one irregular, curvilinear coordinate system to another, which is a lot trickier than resampling between regular, rectilinear coordinate systems. When dealing with panoramas, it is even more important than with regular images to use a higher resolution for the input image than for the final output image, and to avoid repeated remapping operations. Using software with a good resampling algorithm can make a big difference; commercial image editing tools are often insufficient. A very high quality resampling package is Panorama Tools by Helmut Dersch [http://www.fh-furtwangen.de/~dersch/]. It is free, and it is designed for tricky panoramic remappings.
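Every such remapping follows the same skeleton: for each destination pixel, invert the destination mapping to obtain a direction, push that direction through the source mapping, and resample. The sketch below is my own nearest-neighbour illustration; the callback names are assumptions, and a serious tool such as Panorama Tools uses far better resampling filters, as noted above.

```python
def remap_panorama(src, src_w, src_h, dst_w, dst_h,
                   dst_to_direction, direction_to_src_uv):
    """Generic panorama remapping skeleton. `src` is a row-major list of pixels;
    the two callbacks define the destination and source projections."""
    dst = [None] * (dst_w * dst_h)
    for j in range(dst_h):
        for i in range(dst_w):
            d = dst_to_direction(i, j, dst_w, dst_h)       # inverse of the output mapping
            u, v = direction_to_src_uv(d, src_w, src_h)    # forward input mapping
            ui = min(max(int(round(u)), 0), src_w - 1)     # nearest neighbour for brevity;
            vi = min(max(int(round(v)), 0), src_h - 1)     # real tools filter properly here
            dst[j * dst_w + i] = src[vi * src_w + ui]
    return dst
```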
Panorama acquisition

Panoramic images are not a new invention. Panorama cameras have been available since at least the beginning of the 20th century, even though they have never had a big market. There are several different principles for acquiring a panorama image; which one is best depends heavily on the application. Each of the methods presented below has its own merits.

Scanning panorama cameras

Classic panorama cameras, and some modern digital versions as well, operate by the scanning slit principle. The shutter opening of the camera is a vertical slit, the camera rotates on a tripod during the exposure, and the film is fed past the slit at a speed that matches the translation of the scene caused by the camera rotation. The width of the slit can be adjusted to change the aperture. Scanning slit panoramic cameras use standard film and capture a 360 degree panorama horizontally on a wide film strip. Their vertical field of view, however, is comparable to that of a regular camera and therefore significantly less than 180 degrees. The panoramas obtained in this manner are cylindrical panoramas. Scanning slit cameras can also capture panoramas with less than 360 degrees horizontal field of view, simply by stopping before one full turn is completed.

One-shot, single view panorama cameras

It is possible to design optical systems that directly capture the full environment sphere. A fisheye lens with a 360 degree field of view is impossible to build, but a 360 degree field of view can be achieved by using curved mirrors.

The simplest setup uses a reflective sphere. An image of a reflective sphere contains direct reflections from all (or at least most) incidence directions. Such images can be used straight off as panoramas, but their poor sampling uniformity makes it advisable to remap them before storage. A reflective sphere image may be remapped to a 360 degree fisheye projection by a resampling in the radial direction.

Figure 4: An image of a reflective sphere. Background masked black for clarity.
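The radial resampling can be written down explicitly if the sphere is assumed to be ideal and viewed from far away (a near-orthographic view): a scene direction at angle θ from the viewing axis is reflected toward the camera from a point at normalized image radius sin(θ/2). The sketch below is my own illustration under that assumption, not part of the original text.

```python
import math

def mirror_ball_radius(theta):
    """Normalized mirror-ball image radius (0 at the center, 1 at the silhouette)
    where a ray arriving at angle theta from the viewing axis is reflected toward
    a distant camera: the reflected ray turns twice the surface-normal angle,
    so r = sin(theta / 2). Assumes an ideal sphere and an orthographic view."""
    return math.sin(theta / 2.0)

def fisheye_to_mirror_ball(r_fisheye):
    """Radial part of the remap from a 360 degree equidistant fisheye, where
    radius is linear in theta (r = 1 at theta = 180 degrees), to the mirror ball."""
    return mirror_ball_radius(r_fisheye * math.pi)

print(fisheye_to_mirror_ball(0.5))   # 90 degrees off-axis -> about 0.707 on the ball image
```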

One-shot, multi-view panorama cameras

Another way of designing a camera that captures a panorama in a single shot is to use multiple lenses and acquire images from all lenses simultaneously, either by using multiple cameras, or by using mirrors or prisms to combine images from several lenses in a single frame. Two 180 degree fisheye images are enough to cover an entire sphere, but many fisheye lenses have problems with image quality towards the edges, with significant defocus, color aberration, flare and light falloff. For better image quality, four fisheye lenses in a tetrahedral configuration can be used, but that of course requires twice as much hardware. A cubical panorama can be acquired by six cameras facing front, back, left, right, up and down, each with a 90 degree field of view. A 90 degree FOV requires an unusually wide angle lens which is not readily available as inexpensive standard equipment, but such lenses do exist.

A problem which becomes apparent with single-shot panoramas is that the photographer cannot hold the camera, or even be near it, without being in clear view in the shot. This can be solved either by remote control or a timer release. The camera mount will still present a problem, though. Sometimes it is acceptable to leave it in the image; other times some manual retouching of the image might be required. Scanning or multi-shot panorama acquisition methods make it easier to keep the photographer and the camera mount out of view.

Panoramic movie cameras

One-shot panorama cameras (single- or multi-view) can of course capture movies instead of still images. Panoramic film or video sequences have found some applications, and a couple of panorama video systems are on the market.

Multi-shot panoramas

Instead of acquiring several images at once, a single camera can be used to capture each part of a panorama in sequence. A 180 degree fisheye lens ideally requires only two shots, even though three or four might be needed to better avoid edge problems. A fisheye lens makes the method quite convenient, but even a regular planar projection lens with a smaller field of view can be used, along with a good tripod mount, to capture a dozen or more images in different directions. It takes some cutting and pasting to combine the images into a panorama, but it requires no special camera equipment. Furthermore, the image quality is excellent, because the method uses many high quality source images, each with a very uniform sampling. The recombination of several images into a panorama is called stitching. Software tools exist to assist panorama photographers in stitching, and even make the process more or less automatic.

Figure 5: Unfinished stitched panorama (6 images placed, each with 45 degrees FOV)

One disadvantage of stitching, apart from it being cumbersome, is that the images are not taken simultaneously. If the scene changes or the lighting conditions change while the images are acquired, there will be problems stitching them together into a consistent panorama. Effects from strong light sources, such as glare and lens flare, will also present a problem, since glare and flare will only appear in images where a light source is visible, and not in adjacent images in the sequence.

Panorama viewers

A panorama can be viewed in two quite different ways. Either the entire panorama is displayed on a surface that encloses the viewer and lets him or her look around freely. This method is suitable for large audiences, but it requires expensive display equipment and a lot of open space. A more common method is to use an interactive computer program to remap part of the panorama to a planar perspective projection with a smaller field of view, giving the user control over the direction of view, and possibly also the field of view. This gives the viewer an impression of being in a scene and looking around with a camera, which is often enough to invoke a sense of immersion.

Remapping a panoramic projection to a planar perspective projection is not a simple operation, and an image contains a lot of data. It is in fact only recently that computers have become able to handle that kind of heavy computation with reasonable speed and sufficient image quality.
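Such a viewer is, in effect, one more remapping performed per frame: build the view ray for each screen pixel from the chosen viewing direction and field of view, convert it to (ϕ, θ), and sample the panorama. Below is a compact sketch for a spherical (equirectangular) panorama; the function names, parameters and axis conventions are my own assumptions.

```python
import math

def view_ray(i, j, width, height, yaw, pitch, fov_deg):
    """Direction of the ray through screen pixel (i, j) for a pinhole view with
    the given horizontal FOV, rotated by pitch (about x) and yaw (about y)."""
    f = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    x, y, z = i - width / 2.0, j - height / 2.0, f            # camera space, +z forward
    y, z = (y * math.cos(pitch) - z * math.sin(pitch),        # rotate about the x axis
            y * math.sin(pitch) + z * math.cos(pitch))
    x, z = (x * math.cos(yaw) + z * math.sin(yaw),            # rotate about the y axis
            -x * math.sin(yaw) + z * math.cos(yaw))
    return x, y, z

def sample_spherical(direction, pano_w, pano_h):
    """Convert a direction to (phi, theta) and then to panorama pixel coordinates."""
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    phi = math.atan2(x, z) % (2.0 * math.pi)                  # azimuth
    theta = math.acos(max(-1.0, min(1.0, y / length)))        # angle from the vertical axis
    return phi / (2.0 * math.pi) * pano_w, theta / math.pi * pano_h

ray = view_ray(320, 240, 640, 480, yaw=0.0, pitch=0.0, fov_deg=60.0)
print(sample_spherical(ray, 4096, 2048))    # forward at zero yaw/pitch -> (0.0, 1024.0)
```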

The recent development of inexpensive 3-D graphics accelerators also presents an alternative mapping method: the panorama can be placed as a texture on the inside of a three-dimensional shape that matches the panorama projection (a sphere, a cylinder or a cube), and the viewing can then be taken care of by hardware accelerated rendering of the view from a virtual camera placed at the center. Cubical panoramas are particularly suitable for this, because their projection fits nicely into today's polygon and texture rendering pipelines.

Commercial applications

There are actually not that many commercial applications for panoramic images, but now that computers are able to handle them more easily and with high quality, we will probably see more of them in the near future. A few computer games using panoramic images have been released over the years, the most recent one being Myst III: Exile. In Exile, the panorama viewer is enhanced to simulate glare, and parts of the scene are moving, which gives a very immersive and compelling effect.

For quite a few years now, Apple Computer [www.apple.com] has been one of the leaders in the development of panoramic imaging with its software QuickTime VR. They deserve a mention for providing reasonably open standards, while some others have been trying to market their software as secret magic. A particularly sad mix of marketing hype, prohibitively expensive software licensing, and repeated and dubious attacks on free software developers comes from the company IPIX [www.ipix.com]. Their products are not bad, but they have made a lot of enemies over the years. As panoramic imaging becomes more commonplace, it will be easier for companies to make a living selling their products, but right now, commercial panoramic imaging is a struggle between only a few actors for shares of a too-small market.

Stefan Gustavson, 2002-01-22