Capturing Omni-Directional Stereoscopic Spherical Projections with a Single Camera


Paul Bourke
iVEC @ University of Western Australia, 35 Stirling Hwy, Crawley, WA 6009, Australia.
paul.bourke@uwa.edu.au

Abstract: This paper discusses the photographic capture of omni-directional stereoscopic spherical projections, that is, a means of creating stereoscopic spherical projections that can be experienced by a single viewer without the need for head tracking, or by a larger audience all of whom may be looking in different directions. We illustrate a means of photographically capturing such stereoscopic image pairs using a single camera and fisheye lens. The principles behind this technique have previously been applied to computer generated stereoscopic spherical projections [1], a simpler situation owing to the greater flexibility of virtual cameras. Applications of these omni-directional stereoscopic spherical projections include general virtual reality environments, the increasing number of stereo capable digital planetariums, and a personal hemispherical immersive projection system known as the iDome [2].

Keywords: Stereoscopy, panorama, fisheye, planetarium, iDome, immersion, virtual reality.

I. INTRODUCTION

High resolution cylindrical panoramic images are traditionally captured using a camera that rotates about its nodal point. Ideally the camera consists of a narrow slit that exposes the film (or sensor) in a continuous fashion [3]. When a more traditional still camera is used, thin vertical sections from a large number of individual frames are laid side by side and either simply blended or stitched together to form the panorama [4]. In this discrete frame case there is a trade-off between the number of images captured and the parallax errors that can occur if the optics do not rotate exactly about the correct axis, a problem that generally worsens as parts of the scene get closer to the camera. The capture of such cylindrical panoramic images is now widespread; the capability is often built into modern commodity cameras, and free software solutions are generally available.

Cylindrical panoramic stereoscopic pairs can be captured by extending the process to two slit cameras separated by the intended interocular distance and rotating about a common center [5, 6], see Figure 1. The slits so captured may be discrete or continuous; for example, various cameras have been designed to expose a roll of film as the camera rotates, and more recently digital versions have become available [3].

It has additionally been realised [7] that one needs only a single camera rotating about a point some distance from the camera's nodal point. Two narrow vertical sections are extracted from each image; these can be considered identical to slits from two cameras rotating about a circle of a different radius, see Figure 1. This obviously results in a lower cost camera arrangement, but it also simplifies the timing, colour, and optical calibration required when using two cameras with inevitable slight manufacturing differences. It has the further benefit that the final interocular separation can be chosen, within reason, in post production by choosing the distance of the extracted slits from the center of the image frames.

Figure 1. Dual camera capture (left) vs single camera capture (right), showing the cameras at two positions (A and B) as they rotate. In both cases either a slit camera is employed or, as shown, a standard camera is used and a narrow slit of pixels is extracted from each frame to form the left and right panoramas.

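For the cylindrical case just described, the slit extraction is straightforward to sketch in code. The following is a minimal illustration, not part of the paper: the frame list, slit offset, and slit width are hypothetical parameters, and the frames are assumed to have been captured at uniform angular steps and loaded as NumPy arrays.

    import numpy as np

    def stereo_slit_panoramas(frames, offset_px, slit_px):
        """Assemble left and right eye cylindrical panoramas from frames
        shot by a single camera rotating about a point offset from its
        nodal point.

        frames    : list of HxWx3 arrays, one per uniform rotation step
        offset_px : distance of each extracted slit from the image centre
        slit_px   : width of each extracted slit in pixels
        """
        cx = frames[0].shape[1] // 2
        left, right = [], []
        for f in frames:
            # One slit either side of centre; which slit feeds which eye
            # depends on the direction of rotation.
            left.append(f[:, cx + offset_px : cx + offset_px + slit_px])
            right.append(f[:, cx - offset_px - slit_px : cx - offset_px])
        # Simple abutment; in practice adjacent slits overlap and are
        # blended, as discussed later in the paper.
        return np.hstack(left), np.hstack(right)
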
Cylindrical panoramic stereoscopic images captured in this way have a further interesting characteristic: they can be projected onto a cylindrical display [8, 9] and multiple observers can each look at different parts of the panoramic image with acceptable stereoscopic depth perception. This often seems impossible to those working with virtual reality environments or familiar with planar stereoscopic projection, where stereoscopic imagery that even partially surrounds the viewer requires head tracking for acceptable viewing. This unexpected benefit arises because the stereoscopic depth perception is only strictly correct for an observer looking at a limited portion of the panoramic image; the error increases towards the left and right of that center of view. These errors are not observed in practice because the viewer is generally wearing glasses of some sort that obscure all but a relatively narrow field of view. In an immersive viewing environment the observer still benefits from the peripheral vision offered by the panoramic image, since imagery can generally be seen outside the active area of the glasses. Indeed, this matches our real world experience: in our far peripheral field we see neither stereo nor high resolution imagery, and even colour perception is limited. Similarly, for a standard flat display surface, HMD, or other flat VR display, the stereoscopic effect can be observed as one freely rotates horizontally within the panoramic cylinder [10]. Such panoramic images are therefore referred to as omni-directional [11, 12] stereoscopic panoramic images.

It is the semi-automatic [14] capture of these images that is the subject of this paper. It extends the previous work of capturing omni-directional cylindrical stereoscopic image pairs for interactive presentation, among others, in a cylindrical display environment that surrounds a number of observers. In the case of omni-directional spherical stereoscopic image pairs the intended display environment is the iDome or stereoscopically enabled planetarium domes. In both these cases an omni-directional stereoscopic fisheye pair is extracted from the spherical panoramic pairs; generally this can readily be achieved in real time with current graphics hardware. Such a fisheye stereo pair provides acceptable stereoscopic viewing by an audience all possibly looking in different directions, in contrast to fisheye stereo pairs captured simply from two offset cameras with fisheye lenses.

This technique can be further extended to full spherical panoramic pairs by replacing the lens of the camera with a 180 degree (or greater) circular fisheye lens. Such a lens captures 180 degrees vertically, and the vertical slits that would be used to form a cylindrical panorama now become lune (crescent moon) shaped. In order to construct the spherical panorama for each eye, see Figure 5, these lune slices need to be stacked or blended together appropriately. The same trade-off occurs as in the standard cylindrical stereoscopic panoramic image: the narrower the slits the smaller the parallax errors, but the more involved and time consuming the capture process becomes. In the case of computer generated stereoscopic panorama pairs the slits essentially become vanishingly small and the process is continuous with no parallax errors.

II. IMPLEMENTATION DETAILS

The geometry for calculating the effective eye separation (2R₀) given a rotation radius (R) of the camera's nodal point and the angle (φ) of the strip extracted from the fisheye is shown in Figure 2.

Figure 2. Geometry for computing the effective eye separation of a single offset camera. Note that the camera is now perpendicular to the circle of rotation, in contrast to the more traditional dual camera arrangement.

The relationship is simply

R₀ = R sin(φ)     (1)
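
As a worked check of equation (1), using the numbers from the example in section III: the rig radius itself is not stated in the paper, so the value computed below is inferred, and the snippet is illustrative rather than from the original work.

    import math

    # From section III: target interocular separation 2*R0 = 6.5 cm,
    # strips extracted phi = 15 degrees from the fisheye centre.
    R0 = 0.065 / 2
    phi = math.radians(15)

    # Inverting equation (1), R0 = R*sin(phi), gives the rotation radius
    # required of the camera's nodal point (an inferred value).
    R = R0 / math.sin(phi)
    print(f"required nodal point offset R = {100*R:.1f} cm")  # ~12.6 cm
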
Controlling the distance to zero parallax is simply a matter of sliding the two images horizontally with respect to each other, wrapping them across the left and right edges. Note that while this process of offsetting images horizontally to control the zero parallax distance is also employed in standard stereoscopic photography [13], in that case it is only strictly correct at a single offset. In contrast, cylindrical and spherical stereoscopic panoramic images can be offset by any amount (subject to other ease-of-viewing constraints) and thus zero parallax can be located at any chosen depth. It should be noted that there is only one value for the effective eye separation and zero parallax distance if a sense of correct depth is required. By convention we prepare the images with zero parallax at infinity; the offset operation is handled by the playback/presentation software given a previously chosen offset angle. This has the benefit of keeping all such images in a standard format, rather than each having a different zero parallax distance built into the image pair.

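Since the panoramas wrap a full 360 degrees, this offset operation reduces to a circular shift of one image relative to the other. A minimal sketch follows (not from the paper), assuming equirectangular panoramas stored as NumPy arrays and a hypothetical offset angle.

    import numpy as np

    def apply_zero_parallax_offset(panorama, offset_deg):
        """Circularly shift one eye's panorama to move the zero parallax
        distance; the panorama spans 360 degrees horizontally, so pixels
        shifted off one edge wrap around to the other."""
        w = panorama.shape[1]
        shift_px = int(round(offset_deg / 360.0 * w))
        return np.roll(panorama, shift_px, axis=1)
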
Extracting the two lune shaped slits (see Figure 5) from each fisheye image requires an understanding of how each point in the fisheye image maps to the longitude and latitude of a spherical projection. An ideal equiangular fisheye lens has a linear relationship between the normalised radius r (on the range 0 to 1) of a point on the fisheye image and the angular distance ψ (on the range 0 to π/2) from the central line of sight:

r = 2ψ/π     (2)

Such lenses have been manufactured and are often called true-theta lenses, but various nonlinear relationships are much more common. The most common nonlinear relationship is a sinusoidal one, where the constant α is unique to each lens type:

r = sin(αψ) / sin(απ/2)     (3)

The above applies to lenses such as the ones used here, namely the Sigma 7.8mm fisheye (α = ½) and the Sunex 185 degree fisheye. Still other fisheye lenses are supplied with calibration curves, typically low order polynomials in ψ. For the procedure outlined here it is critical to have a good estimate of this radial function in order to achieve good blending at the upper and lower extremes of the image, where the nonlinearities have the greatest effect. If the radial relationship is less precise then finer camera stepping angles and narrower slits are required for acceptable blending.

The angular width of each lune slice depends on the number of images captured and the degree of overlap used for blending the strips together. The minimum width is the stepping angle of the camera. In the implementation here a 50% overlap is employed, in which case the angular width is twice the stepping angle of the camera. A sketch of the mapping used to extract these slices follows.

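The sketch below (not from the paper) resamples one lune of a centred, square circular fisheye image into the corresponding rectangular strip of the spherical (equirectangular) projection, assuming the sinusoidal model of equation (3) and unoptimised nearest-neighbour sampling; the strip bounds and output size are hypothetical parameters.

    import numpy as np

    def extract_strip(fisheye, alpha, lon0_deg, lon1_deg, out_w, out_h):
        """Resample one lune of a centred circular fisheye image into the
        corresponding rectangular strip of an equirectangular projection.

        fisheye            : NxNx3 array, fisheye circle filling the square
        alpha              : lens constant of equation (3)
        lon0_deg, lon1_deg : strip longitude range about the optical axis
        out_w, out_h       : output strip size in pixels
        """
        n = fisheye.shape[0]
        strip = np.zeros((out_h, out_w, 3), dtype=fisheye.dtype)
        lons = np.radians(np.linspace(lon0_deg, lon1_deg, out_w))
        lats = np.radians(np.linspace(90.0, -90.0, out_h))  # top = zenith
        for j, lat in enumerate(lats):
            for i, lon in enumerate(lons):
                # Unit view direction in camera coordinates:
                # z along the optical axis, x to the right, y up.
                x = np.sin(lon) * np.cos(lat)
                y = np.sin(lat)
                z = np.cos(lon) * np.cos(lat)
                psi = np.arccos(np.clip(z, -1.0, 1.0))  # angle off axis
                theta = np.arctan2(y, x)                # image plane azimuth
                r = np.sin(alpha * psi) / np.sin(alpha * np.pi / 2)  # eq (3)
                u = int(round(n / 2 + r * (n / 2 - 1) * np.cos(theta)))
                v = int(round(n / 2 - r * (n / 2 - 1) * np.sin(theta)))
                if 0 <= u < n and 0 <= v < n:
                    strip[j, i] = fisheye[v, u]
        return strip
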
III. EXAMPLE

To illustrate the stages of the process an example will now be presented. The lens employed is a Sigma 180 degree (horizontally and vertically) circular fisheye mounted on a Canon EOS 5D MkII. The camera is mounted on a modified Gigapan [15] mount as shown in Figure 3; this was chosen since it provides suitably precise stepping in the horizontal plane and automatic shutter triggering.

Figure 3. Photograph of the Canon EOS 5D MkII camera, Sigma 8mm circular fisheye lens, and modified Gigapan mount.

Figure 4 shows three representative fisheye images from the set of 72 captured (5 degree steps). The exact number of shots chosen is not critical; however, if too few are captured then parallax errors may result, and if too many are chosen the capture time grows and with it the chance of changes in the scene during the capture process. As is the norm with such capture, the automatic settings on the camera are disabled: aperture and exposure are set manually so they stay constant throughout. The Gigapan mount is set to capture one frame every 2 seconds, so the total capture time for this example is a little under two and a half minutes.

Figure 4. Three representative fisheye images from the set of 72 captured.

Figure 5 shows a single frame with the two extracted lune shaped slits highlighted. In this case each slit is extracted 15 degrees from the center view direction; this was chosen to give an interocular distance of 6.5 cm, the average human interocular distance. The width of each extracted slit is 10 degrees (2 × 360/72), and since the camera is rotated in 5 degree (360/72) steps there is a 50% overlap between adjacent slices for the image blending.

Figure 5. The left and right lune shaped strips in fisheye projection that will become rectangular strips in the final spherical projection.

Figure 6 shows the two lune slices converted into the two rectangular strip sections of a spherical panorama. It is these strips, from each of the 72 camera positions, that are blended together to form the final left and right eye spherical stereoscopic panoramas shown in Figure 8.

Figure 6. Slices as sections of a spherical projection arising from the slices shown in Figure 5.

The blending of each pair of overlapping strips was performed using a sinusoidally weighted contribution from each image, see Figure 7. If w is the weight applied to a pixel in one image then (1 - w) is the weight applied to the corresponding pixel in the adjacent image. Various blending schemes were explored with a range of overlap widths; the conclusion was that choosing an overlap anywhere in the range from 10% to 50% made very little difference to the final result, as long as the camera rotation steps were precise. A sketch of this weighting appears after the figure captions below.

Figure 7. Sinusoidal weighted blending function used to combine two overlapping longitudinal strips.

Figure 8. The final spherical stereoscopic panoramic images for each eye, with some example positive parallax points. Zero parallax is set to a position on the floor (left most arrow set); different parallax separations are illustrated with the other arrows.

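A minimal sketch of the sinusoidal weighting described above (not from the paper; it assumes the two strips have already been resampled so their overlap regions align pixel for pixel):

    import numpy as np

    def blend_overlap(strip_a, strip_b):
        """Blend two aligned overlap regions with a sinusoidal weight.

        The weight w falls smoothly from 1 to 0 across the overlap width,
        and the adjacent strip receives the complementary weight (1 - w).
        strip_a is taken to lie on the left, strip_b on the right; both
        are HxWx3 arrays covering the same overlap region.
        """
        h, w, _ = strip_a.shape
        t = np.linspace(0.0, 1.0, w)               # 0 at left edge, 1 at right
        wgt = (0.5 + 0.5 * np.cos(np.pi * t))      # sinusoidal falloff, 1 -> 0
        wgt = wgt[None, :, None]                   # broadcast over rows, channels
        out = wgt * strip_a.astype(float) + (1.0 - wgt) * strip_b.astype(float)
        return out.astype(strip_a.dtype)
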
IV. CONCLUSION

We have demonstrated a viable method of capturing high resolution omni-directional stereoscopic spherical projections of a static scene using a single camera and a circular fisheye lens. Omni-directional is taken to mean that one obtains an acceptable stereoscopic 3D view when looking at the image pairs in any horizontal direction; this applies whether the images are projected correctly onto a flat screen or a cylinder, or converted to fisheye projections and presented on a hemispherical surface. The image capture requires a full 180 degree fisheye lens (at least vertically), an accurate means of rotating the camera in small increments of a few degrees, and knowledge of the angular radial dependence of the fisheye lens. The principle of omni-directional cylindrical panoramic images has previously been explored with cylindrical displays; the omni-directional spherical panoramic images presented here have been tested in stereo within the iDome display as well as on flat stereoscopic walls.

ACKNOWLEDGMENT

The author would like to acknowledge Peter Morse and Volker Kuchelmeister for valuable discussions on motorised rigs and lens options, Akos Bruz for engineering support, Martin Ford of Redshift West Pty Ltd, and the iVEC Small Business Technology Assistance Scheme.

REFERENCES

[1] P.D. Bourke. Omni-Directional Stereoscopic Fisheye Images for Immersive Hemispherical Dome Environments. Proceedings of Computer Games & Allied Technology 09 (CGAT09), Research Publishing Services, ISBN 978-981-08-3165-3, pp 136-143, 2008.
[2] P.D. Bourke. iDome: Immersive gaming with the Unity game engine. Proceedings of Computer Games & Allied Technology 09 (CGAT09), Research Publishing Services, ISBN 978-981-08-3165-3, pp 265-272.
[3] RoundShot. Web reference: http://www.roundshot.ch/.
[4] H.C. Huang, Y.P. Hung. Panoramic stereo imaging system with automatic disparity warping and seaming. Graphical Models and Image Processing, 60 (3), pp 196-208, May 1998.
[5] P.D. Bourke. Synthetic Stereoscopic Panoramic Images. Lecture Notes in Computer Science (LNCS), Springer, ISBN 978-3-540-46304-7, Volume 4270, pp 147-155, 2006.
[6] S. Peleg, M. Ben-Ezra, Y. Pritch. Omnistereo: Panoramic Stereo Imaging. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 3, pp 279-290, March 2001.
[7] S. Peleg, M. Ben-Ezra. Stereo panorama with a single camera. IEEE Conference on Computer Vision and Pattern Recognition, pp 395-401, Ft. Collins, Colorado, June 1999.
[8] AVIE: Advanced Visualisation and Interaction Environment. Web reference: http://www.icinema.unsw.edu.au/projects/infra_avie.html.
[9] R. Hayashi, F. Nakaizumi, H. Yano, H. Iwata. Development of a full surround stereo spherical immersive display using multiple projectors. Transactions of the Virtual Reality Society of Japan, Vol 10 (2), pp 163-172, 2005.
[10] N.A. Bolhassan, M. Cohen, W.L. Martens. VR4U2C: A Multiuser Multiperspective Panorama and Turnorama Browser Using QuickTime VR and Java Featuring Multimonitor and Stereographic Display. Transactions of the Virtual Reality Society of Japan, Vol 9, No 1, pp 69-78, 2004.
[11] H. Ishiguro, M. Yamamoto, S. Tsuji. Omni-Directional Stereo. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(2), pp 257-262, 1992.
[12] S. Peleg, M. Ben-Ezra, Y. Pritch. Omnistereo: Panoramic Stereo Imaging. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 3, March 2001.
[13] G. Jones, D. Lee, N. Holliman, D. Ezra. Controlling Perceived Depth in Stereoscopic Images. Proc. SPIE 4297A, 2001.
[14] H.Y. Shum, L.W. He. Rendering with Concentric Mosaics. Proceedings of SIGGRAPH 1999, pp 299-306.
[15] Gigapan. Web reference: http://www.gigapan.org/.