Realistic Visual Environment for Immersive Projection Display System

Hasup Lee
Center for Education and Research of Symbiotic, Safe and Secure System Design, Keio University, Yokohama, Japan
hasups@sdm.keio.ac.jp

Yoshisuke Tateyama, Tetsuro Ogi
Graduate School of System Design and Management, Keio University, Yokohama, Japan
tateyama@sdm.keio.ac.jp, ogi@sdm.keio.ac.jp

Abstract - People feel more immersed and act more realistically when the background of a virtual reality system is more similar to the real world. If we surround users in the CAVE with real images, i.e. panoramic images, we can create a more realistic environment. We create panoramic images of a real environment using a digital camera and a panoramic tripod head and apply them to the K-CAVE system to increase the user's immersion. We formulate the panoramic image photographing procedure for efficiency and use an SCTP data connection for reliability in the K-CAVE system.

Keywords - Panoramic Imaging; Immersive Projection Display; CAVE; SCTP; Virtual Reality

I. INTRODUCTION

One of the goals of virtual reality technology is to present the same environment as the real world and to make users feel the same way as they do there. Because of limited resources, we cannot reproduce every feature of the real world; instead, we capture key features such as human avatars, object models, and so on. People feel more immersed and act more realistically when the background of the virtual reality system is more similar to the real world.

In virtual reality research, the CAVE [1] is a widely used display system for user immersion. The projectors and screens of the CAVE give users a strong sense of immersion by surrounding them with artificial objects, avatars, backgrounds, and so on. The CAVE can use real images or image-based rendering objects to feel more realistic. A real image-based background can be shown just as users would see it in the real world, and a panoramic image is well suited for this because it surrounds the user in all 360º directions. If a panoramic image representation that surrounds users with real images is applied to the CAVE, we can create a more immersive environment. There are several studies on panoramic imaging, such as QuickTime VR [2], and they are well surveyed in [3].

In this paper, we develop a panoramic image representation for the CAVE to increase the user's immersion. First, we shoot background photos of the real world using a panoramic tripod head, a digital camera, and a tripod. Then we stitch them into a panoramic image for the background. Finally, we build a panoramic image representation for the CAVE using these images. We also use SCTP [4] data transfer for a reliable data connection at runtime.

Figure 1. Equirectangular panorama image format (Courtesy of NASA)

Figure 2. Cubic panorama image format version of Figure 1

II. PANORAMA IMAGE REPRESENTATION

A. Panorama image format

There are two main panorama image formats [5] that show a 360º field of view in both the vertical and horizontal directions. One is the equirectangular format shown in Figure 1 and the other is the cubic format shown in Figure 2. The equirectangular format, also called the spherical format, consists of a single image in equirectangular projection with a 2:1 aspect ratio. The cubic format is made of six face images of a cube that surrounds the user. The earth map image in Figure 1 is from NASA, and Figure 2 is made from Figure 1. In Figure 2, we can see Antarctica and the Arctic Ocean without distortion.

B. Panorama image representation in the CAVE

We use the equirectangular format to produce the image representation. Because the background images are placed and displayed in three dimensions in the CAVE, images in the cubic format are distorted at the boundaries between faces. We construct a virtual sphere that contains the cube of CAVE screens, as shown in Figure 3. Then we project the equirectangular image onto the inner face of the virtual sphere. This projection is implemented using the texture mapping technique.

C. Data transfer over SCTP in the CAVE

In our K-CAVE system [6], we use 4 screens and 8 projectors, i.e. left- and right-eye projectors for each screen. There are 4 rendering PC servers controlled by one master server. We developed data transfer between the master and rendering servers using the SCTP protocol [4]. We had already implemented an SCTP test bed on a UNIX system.

A host is multihomed if it can be addressed by multiple IP addresses, as is the case when the host has multiple network interfaces [4][7]. The current transport protocols, TCP and UDP, are ignorant of multihoming; TCP allows binding to only one network address at each end of a connection. When TCP was designed, network interfaces were expensive components, so multihoming was beyond the scope of research. Increasing economic feasibility and the desire for networked applications to be fault tolerant at an end-to-end level have brought multihoming within the purview of the transport layer.

SCTP is a new transport protocol in the Internet [4][7]. It is a connection-oriented protocol and provides reliable data communication between applications. One of its important features is multihoming. SCTP establishes a logical connection, called an association, prior to data communication. The endpoints of an association may have multiple addresses, i.e., there can be multiple data paths in an association. One of the paths is called the primary path and the others are secondary paths. Data is transmitted on the primary path. If the primary path becomes unavailable due to errors, one of the secondary paths is selected as the new primary path.

For a reliable data connection in the future, we use the SCTP protocol for data transfer. The update rate is not critical yet, because we send only one panoramic image at this stage. The final structure of our system is shown in Figure 4. The server module reads panoramic images of the real environment and transfers them to the renderer modules over SCTP. The server and renderer modules produce the images for each screen using the OpenCABIN library [6]. Users see stereo images in the CAVE through polarization filter glasses.
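As an illustration of the projection in Section II.B, the following Python sketch maps 3D directions on the virtual sphere to equirectangular texture coordinates and generates per-vertex UVs for a latitude/longitude tessellation of the sphere. The tessellation resolution and sphere radius are illustrative assumptions; the actual system performs this texture mapping through the OpenCABIN library, which is not shown here.

```python
import math

def equirect_uv(direction):
    """Map a 3D view direction on the virtual sphere to (u, v) texture
    coordinates in an equirectangular (2:1 spherical) panorama.
    u follows longitude, v runs from the south pole to the north pole."""
    x, y, z = direction
    r = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(y, x)              # longitude in (-pi, pi]
    lat = math.asin(z / r)              # latitude in [-pi/2, pi/2]
    u = (lon + math.pi) / (2.0 * math.pi)
    v = (lat + math.pi / 2.0) / math.pi
    return u, v

def sphere_mesh_uvs(n_lat=32, n_lon=64, radius=10.0):
    """Generate vertices and matching equirectangular UVs for the inner face
    of the virtual sphere that encloses the CAVE screens (assumed mesh size)."""
    vertices, uvs = [], []
    for i in range(n_lat + 1):
        lat = -math.pi / 2.0 + math.pi * i / n_lat
        for j in range(n_lon + 1):
            lon = -math.pi + 2.0 * math.pi * j / n_lon
            x = radius * math.cos(lat) * math.cos(lon)
            y = radius * math.cos(lat) * math.sin(lon)
            z = radius * math.sin(lat)
            vertices.append((x, y, z))
            uvs.append(equirect_uv((x, y, z)))
    return vertices, uvs
```

For the data transfer in Section II.C, the sketch below shows a minimal one-to-one style SCTP sender, assuming a Linux host with kernel SCTP support (socket.IPPROTO_SCTP is only defined on such platforms). The host address, port, and length-prefix framing are hypothetical, not the paper's actual protocol, and configuring additional addresses for multihoming (sctp_bindx) would require a helper library such as pysctp.

```python
import socket
import struct

def send_panorama(path, host="192.168.0.10", port=5000):
    """Send one panoramic image file to a rendering server, length-prefixed."""
    with open(path, "rb") as f:
        payload = f.read()
    # SOCK_STREAM with IPPROTO_SCTP gives a one-to-one style SCTP association.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM, socket.IPPROTO_SCTP)
    try:
        sock.connect((host, port))                     # establish the association
        sock.sendall(struct.pack("!Q", len(payload)))  # 8-byte big-endian length header
        sock.sendall(payload)                          # image data on the primary path
    finally:
        sock.close()
```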
Figure 3. Virtual sphere that contains the CAVE

Figure 4. System architecture

III. PANORAMA IMAGE PHOTOGRAPHING

A. Panorama photographing

We use a digital camera and a panoramic tripod head for photographing, because digital cameras are widespread and a panoramic tripod head is much cheaper than a dedicated panoramic photographing system. When we take pictures for a panorama, we must consider that they will be stitched together later. We rotate the camera about its no-parallax point, also called the nodal point, to eliminate parallax between images. A panoramic tripod head, shown in Figure 5, is used to rotate the camera about the no-parallax point.

Figure 5. Photographing using a panoramic tripod head

While taking the pictures, some conditions must be kept in mind. The exposure value, the f-number, the focal length, and the shutter speed must be fixed across all images; if these values are not fixed, the resulting panorama is mottled. We can keep these values fixed by using the camera's manual mode. A large f-number is preferred to obtain generally sharp images.

To get the images, we must move a tripod, which is not light to carry, and take dozens of pictures. If we pre-calculate how many images we must take and in which directions, we can photograph more efficiently.

As shown in Figure 6, for a CCD width w (mm) and a lens focal length f (mm), the horizontal field of view fv (degrees) satisfies

\tan\frac{fv}{2} = \frac{w/2}{f}.

Figure 6. Camera diagram

If we want an overlap of l (%) between adjacent images, we need n images to fill the horizontal direction, where

n = \frac{360}{fv\,(1-l)}.

Using a similar calculation with the CCD height h (mm), we obtain the number k of images needed to fill the vertical direction. The unit tilt angle d between adjacent images in the vertical direction is

d = \frac{360}{m},

where m is the smallest multiple of 2 with m \geq k; m must be a multiple of 2 because the + and - tilt rows are symmetric. We take photos at tilt angles \pm d, \pm 2d, \pm 3d, and so on up to \pm 90º. The number n_\theta of images needed at tilt angle \theta, where \theta is a multiple of \pm d, is

n_\theta = \frac{360\cos\theta}{fv\,(1-l)}.

The directions of photographing are shown in Figure 7. If a photo contains no features that can serve as stitching anchors, as in playgrounds, woods, the sea, and so on, we must adjust the photographing angle slightly. In our results, we lowered one photographing angle for this reason.

Figure 7. Direction of photographing

IV. RESULTS

Our photographing setup is shown in Figure 5. We take images using a Nikon D70 digital camera with a Nikon 18-70mm DX lens. As the panoramic tripod head, a Fanotec NN3-II is used. We take 12 images at 0º, 8 at -30º and +45º, and 1 at +/-90º, as shown in Figure 9. We take pictures at -30º instead of -45º because the -45º views contain no anchors for stitching. When the environment is a wide area of similar texture, such as a square or a playground, some adjustment of the photographing angles is needed for efficient stitching. We stitch these images into a panoramic image using Autodesk Stitcher Unlimited 2009. The final panoramic image is shown in Figure 10; its resolution is 3600 x 1800.

The K-CAVE [6] is a CAVE-type display system at Keio University. It consists of 4 screens, 8 projectors, 8 Linux-based PCs, a magnetic position sensor, and a joystick. Stereo viewing is achieved with circular polarization filters, and all experiments are developed using the OpenCABIN library [6]. We implemented the panoramic image representation in the K-CAVE as shown in Figure 8.
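To make the pre-calculation concrete, the sketch below works through the formulas of Section III in Python and prints a shooting plan. The sensor dimensions, focal length, overlap fraction, and the use of a ceiling to round frame counts up are illustrative assumptions and are not claimed to reproduce the exact counts used in the experiment.

```python
import math

def field_of_view(sensor_size_mm, focal_length_mm):
    """Field of view in degrees from tan(fv / 2) = (size / 2) / f."""
    return 2.0 * math.degrees(math.atan(sensor_size_mm / (2.0 * focal_length_mm)))

def shooting_plan(w_mm, h_mm, f_mm, overlap):
    """Return a {tilt_degrees: shots} plan; overlap is the fraction l (e.g. 0.3)."""
    fv_h = field_of_view(w_mm, f_mm)                     # horizontal FOV
    fv_v = field_of_view(h_mm, f_mm)                     # vertical FOV
    k = math.ceil(360.0 / (fv_v * (1.0 - overlap)))      # frames around a vertical circle
    m = k if k % 2 == 0 else k + 1                       # smallest multiple of 2 with m >= k
    d = 360.0 / m                                        # tilt step between rows

    tilts = [0.0]
    step = d
    while step < 90.0 - 1e-9:
        tilts += [step, -step]
        step += d
    tilts += [90.0, -90.0]                               # zenith and nadir rows

    plan = {}
    for tilt in tilts:
        n_theta = math.ceil(360.0 * math.cos(math.radians(tilt)) / (fv_h * (1.0 - overlap)))
        plan[tilt] = max(n_theta, 1)                     # at +/-90 deg a single shot suffices
    return plan

if __name__ == "__main__":
    # Illustrative values only: 18 mm focal length on an APS-C size sensor
    # (roughly 23.7 mm x 15.6 mm) with 30 % overlap between adjacent frames.
    for tilt, shots in sorted(shooting_plan(23.7, 15.6, 18.0, overlap=0.3).items()):
        print(f"tilt {tilt:+6.1f} deg: {shots} shots")
```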

Figure 8. Panoramic image representation for the K-CAVE

V. CONCLUSIONS

We created panoramic images of a real environment using a digital camera and a panoramic tripod head and applied them to the K-CAVE system with SCTP data transfer to increase the user's immersion. Our contributions are as follows. We take photos for the panoramic image representation efficiently by formulating the calculations for panoramic photographing. We construct a panoramic representation for the CAVE to improve the user's immersion, and we use SCTP data transfer for reliable data communication.

VI. FUTURE WORK

We plan to extend the panoramic image representation to stereo. There are several studies on stereo panoramas with a 360º field of view in the horizontal direction only; we will extend them to all directions, both vertical and horizontal.

ACKNOWLEDGMENT

This work was supported by the G-COE (Center of Education and Research of Symbiotic, Safe and Secure System Design) program at Keio University.

REFERENCES

[1] C. Cruz-Neira, D. Sandin, T. DeFanti, R. Kenyon, and J. Hart, "The CAVE: Audio Visual Experience Automatic Virtual Environment," Communications of the ACM, vol. 35, no. 6, pp. 64-72, 1992.
[2] S. Chen, "QuickTime VR - An Image-Based Approach to Virtual Environment Navigation," Proceedings of SIGGRAPH '95, pp. 29-38, 1995.
[3] D. Gledhill, G. Tian, D. Taylor, and D. Clarke, "Panoramic imaging - a review," Computers & Graphics, vol. 27, no. 3, pp. 435-445, 2003.
[4] R. Stewart, "Stream Control Transmission Protocol," RFC 4960, 2007. Retrieved December 16, 2009, from http://tools.ietf.org/rfc/rfc4960.txt
[5] "Panorama formats," (n.d.). Retrieved December 16, 2009, from http://wiki.panotools.org/panorama_formats
[6] Y. Tateyama, S. Oonuki, S. Sato, and T. Ogi, "K-CAVE demonstration: Seismic information visualization system using the OpenCABIN library," Proceedings of ICAT 2008, pp. 363-364, 2008.
[7] J. Iyengar, K. Shah, P. Amer, and R. Stewart, "Concurrent multipath transfer using SCTP multihoming," Proceedings of SPECTS 2004, pp. 25-29, 2004.

Figure 9. Raw images of each tilt degree

Figure 10. Stitched panoramic image