High Dynamic Range Imaging


IMAGE BASED RENDERING, PART 1

Mihai Aldén (mihal915@student.liu.se)
Fredrik Salomonsson (fresa516@student.liu.se)

Tuesday 7th September, 2010

Abstract

This report describes the implementation of several HDR imaging techniques for capturing HDR images and displaying them on regular displays using different tone mapping methods. The project is part of the course TNM083 Image Based Rendering. The main purpose of this part of the project is to produce HDR panoramic images that can be used to produce photorealistic renderings of synthetic objects. The project resulted in a simple image processing program that, given a set of differently exposed images and the camera response curve, can produce HDR images, create panoramas and perform some basic image processing operations.

1 Introduction

Today, one byte is used for each color channel to represent the color intensity of the captured or synthesized scene, which means that each channel can only take 256 different values. In most scenes this is inadequate: such images cover a dynamic range of about 1:100, while a common outdoor scene can have a dynamic range orders of magnitude larger. To represent the full dynamic range of a scene, more than eight bits per channel are therefore needed. Images that can represent the entire dynamic range of a real-world scene are called High Dynamic Range (HDR) images.

One of the major issues with HDR imaging is that few capture devices can actually record the full dynamic range of a real-world scene. It is therefore quite common to use a series of differently exposed Low Dynamic Range (LDR) images that together cover the full dynamic range in order to produce the HDR image. These LDR images can be captured using a regular camera. Another considerable issue with HDR images is storage, since these images contain much more information per pixel than LDR images. The two major file formats used today are OpenEXR [2] and RGBE [3].

There are currently no commercial devices that can display HDR images, since HDR imaging is a relatively new area and few consumers have heard of or seen the technique. It is therefore necessary to prepare these images for display on regular devices using different tone mapping methods.

2 Method

2.1 Creating an HDR image from a set of LDR images

The first step is to capture a set of differently exposed LDR images, keeping the camera and the scene stationary. Knowing the exposure settings of each image (shutter speed, aperture, etc.) and that the scene is the same in all images, it is possible to recover the response curve of the camera from the image data.

Theory

The response curve describes the mapping from photometric units E to pixel values X. For most cameras this is a highly non-linear function X = f(E). From the response curve, a range of good values can be selected using a suitable weight function W(X). Good values are those that are neither completely black nor completely white, that are large enough to have reasonable precision, and that lie in the range where the slope of the response curve is large enough to give good accuracy. The weight function is constructed so that good values have a high weight (close to one), bad values a lower weight (close to zero), and completely black or white pixels zero weight, so that they do not influence the final value at all. The response curve and the weight function are illustrated in figure 1.

Figure 1: The response curve and weighting function.

By mapping the pixel values through the inverse of the response function, it is possible to recover the photometric exposure:

    E = f^-1(X)    (1)
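As an illustration of equation 1 and the weighting in figure 1, the sketch below uses a simple hat-shaped W(X) and, since the measured response curve itself is not reproduced here, assumes a gamma-style response f as a stand-in. The function names and the gamma value are our assumptions, not part of the report's program.

```python
import numpy as np

def weight(X):
    """Hat-shaped weight W(X) for 8-bit pixel values X in [0, 255]:
    1 for mid-range values, falling to 0 for completely black or
    completely white pixels so they never influence the result."""
    X = np.asarray(X, dtype=np.float64)
    return 1.0 - np.abs((X - 127.5) / 127.5)

def linearize(X, gamma=2.2):
    """Recover relative photometric exposure E = f^-1(X) (equation 1),
    here assuming a gamma-style response X = f(E) = 255 * E**(1/gamma)."""
    return (np.asarray(X, dtype=np.float64) / 255.0) ** gamma
```

Clipped pixels get weight(0) == weight(255) == 0, so in the weighted merge of equation 3 they drop out entirely.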

Different shutter times and adaptive settings influence the exposure values in image i by a scaling factor s_i, so the true intensity values can be recovered by scaling the photometric exposure with this factor:

    I_i = s_i E_i    (2)

The final value for each pixel is a weighted sum of the linearized and exposure-corrected input values:

    I = Σ_i W(X_i) s_i f^-1(X_i) / Σ_i W(X_i)    (3)

To calculate the response function we use the HDR Shop program developed by Paul Debevec. For more information on how to calculate the response function, please see [1].

2.2 Storing HDR images

To store the HDR images we chose the RGBE file format for its simplicity. RGBE is very compact and flexible because it can be stored in many standard 8-bit file formats by encoding the E channel in the alpha channel. In the RGBE format, 8 bits are used for the mantissa of each color channel, and a common 8-bit exponent E is stored as an extra channel, which gives 32 bits per pixel.

2.3 Tone mapping

In order to display HDR images on a regular LDR display, the HDR input must first be mapped to LDR output using some tone mapping method. Tone mapping methods can be split into two categories: global and local. Global methods use a monotonic function to map HDR values to LDR values. The properties of the function can depend on some local or global statistics of the HDR data, but the exact same function is applied to all pixels. For more information on local methods, please see [4]. We chose to implement a global method commonly referred to as the S-curve [4], because it is computationally inexpensive and mimics how the human visual system adapts to high dynamic ranges:

    I = I^n / (I^n + σ^n)    (4)

This function is shown in figure 2.
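A minimal sketch of the S-curve of equation 4; the function name and default parameter values are ours, and the input is assumed to be non-negative HDR intensity.

```python
import numpy as np

def s_curve(I, sigma=0.05, n=1.0):
    """Global S-curve tone mapper (equation 4): I^n / (I^n + sigma^n).
    sigma shifts overall brightness, the exponent n controls contrast;
    the output lies in [0, 1) and can be quantized to 8 bits."""
    I = np.asarray(I, dtype=np.float64)
    In = I ** n
    return In / (In + sigma ** n)

# an intensity exactly equal to sigma maps to the display mid-point 0.5
mid = s_curve(0.05, sigma=0.05)
```

Because the function is applied identically to every pixel, it is a global method in the sense described above.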
Figure 2: The S-Curve.

2.4 Panoramic images

A common way to capture omnidirectional panoramic images is to photograph a reflective sphere. This captures the incident light at a desired point in the scene, where the sphere is placed. By capturing HDR images of the reflective sphere we are able to calculate the correct radiance and later use that information to relight synthetic objects.

Figure 3: Reflected light rays on a sphere surface.
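The geometry of figure 3 (the surface normal tilted by θ/2 sends the reflected view ray out in direction θ, so the sphere sees the whole environment) can be checked numerically with the standard reflection formula. This sketch is our illustration, not code from the report.

```python
import math

def reflect(d, n):
    """Reflect direction d about the unit normal n: r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# An orthographic view ray looking down -z hits the sphere where the
# normal is tilted theta/2 = 30 degrees from +z; the reflected ray then
# points into the environment direction theta = 60 degrees.
half = math.radians(30.0)
n = (math.sin(half), 0.0, math.cos(half))   # tilted surface normal
r = reflect((0.0, 0.0, -1.0), n)            # incoming view ray
theta = math.degrees(math.acos(r[2]))       # angle of r from +z
```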

Figure 4: Side view of the reflective sphere, image property of [5].

Angular mapping

A reflective sphere reflects the entire environment in a single view on its surface. The reason for this can be explained with some basic trigonometry, see figure 3. Before the captured image can be used as a panorama we first need to remap it through several steps. Pixel coordinates (s, t) in the image are first mapped to the unit circle using the following equations:

    u = 2(s / I_width - 0.5)     (5a)
    v = 2(t / I_height - 0.5)    (5b)

This ensures that we only sample the reflected image on the sphere and not the background. We assume that the reflected image is centered and that the captured images are uniformly cropped around the sphere. Then the polar coordinate system can be used to find the azimuth angle φ:

    r = sqrt(u^2 + v^2)    (6a)
    φ = atan(v / u)        (6b)

See figure 5. In order to describe the reflected direction from the surface of the sphere into the environment, we also need to find the elevation angle θ. In figure 4 the azimuth angle φ = 0, which gives:

    r = sin(θ / 2)    (7a)
    θ = 2 asin(r)     (7b)

Figure 5: Front view of the reflective sphere, image property of [5].

Unfortunately the sampling of the sphere is non-uniform: the outermost pixels in the image of the reflective sphere each cover a large solid angle, which makes it difficult to get good data with sharp features for those directions. It is therefore necessary to remap the sphere image to a uniform sampling.
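Equations 5-7 can be collected into one small routine; the function name and the convention of returning None for background pixels are our choices, not the report's.

```python
import math

def pixel_to_angles(s, t, width, height):
    """Map pixel (s, t) in the mirror-sphere image to the azimuth phi
    and elevation theta of the reflected direction (equations 5-7)."""
    u = 2.0 * (s / width - 0.5)      # equation 5a
    v = 2.0 * (t / height - 0.5)     # equation 5b
    r = math.hypot(u, v)             # equation 6a
    if r > 1.0:
        return None                  # outside the sphere: background
    phi = math.atan2(v, u)           # equation 6b, quadrant-safe
    theta = 2.0 * math.asin(r)       # equation 7b
    return phi, theta
```

Using atan2 instead of a plain atan(v/u) avoids division by zero at u = 0 and resolves the correct quadrant for φ.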

This is done by replacing θ with the radial distance r from the center of the sphere multiplied by π, which gives a linear sampling for the backward directions:

    θ = π r    (8)

Using these angles it is now possible to go to cartesian (world-space) coordinates:

    x = sin θ cos φ    (9a)
    y = sin θ sin φ    (9b)
    z = cos θ          (9c)

and the inverse mapping from cartesian to angular coordinates is:

    r = acos(-z) / (2π sqrt(x^2 + y^2))    (10a)
    u = 1/2 - r y                          (10b)
    v = r x                                (10c)

Equation 8 solves the uniformity problem by avoiding undersampling of the edges, but due to the spherical geometry very few rays are actually traced backwards, and the light information at the edges of the sphere is mixed together from many different reflections. This results in a singularity that can be observed in the center of the image. The only way to remove this artifact is to photograph the sphere once more, 90° apart, and combine the two images to cover the missing light samples. Therefore, if the image is to be viewed directly in 2D, it is preferable to shift the singularity to the edges by rotating the coordinate system 90°. This can be achieved by modifying φ to span [-π, π] instead of [0, 2π].

Latitude-Longitude mapping

The angular map is then mapped into a latitude-longitude map, which is a more convenient form for relighting a synthetic scene. The latitude-longitude map stores the angular map's azimuth angle along the horizontal axis and its elevation along the vertical axis, flattening the spherical image into a rectangular area, so indexing it is quite simple. The top edge of the map corresponds to the top pole of the sphere, and the bottom edge corresponds to the bottom pole.
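The non-uniformity that motivates equation 8 can be quantified by comparing the two θ(r) mappings: under equation 7b the angular step per pixel diverges towards the rim, while under equation 8 it is a constant π. A small numerical check (our illustration, not code from the report):

```python
import math

def theta_sphere(r):
    """Elevation angle at radius r in the raw mirror-sphere image (eq. 7b)."""
    return 2.0 * math.asin(r)

def theta_uniform(r):
    """Elevation angle at radius r after the uniform remapping (eq. 8)."""
    return math.pi * r

# angular range covered by one small radial step near the rim:
dr = 1e-3
rate_sphere = (theta_sphere(0.999) - theta_sphere(0.998)) / dr
rate_uniform = (theta_uniform(0.999) - theta_uniform(0.998)) / dr
# rate_sphere is roughly an order of magnitude larger than rate_uniform,
# i.e. one rim pixel of the raw image has to represent many directions.
```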
To map the (u, v) coordinates of the latitude-longitude image to cartesian (world-space) coordinates, we first remap them to angles that account for the previously stated rotation:

    φ = π u    (11a)
    θ = π v    (11b)

Then the angles are used to create the cartesian coordinates:

    x = sin φ sin θ    (12a)
    y = cos θ          (12b)
    z = cos φ sin θ    (12c)

These are just the spherical coordinates, slightly modified to take into consideration that the up vector of the reflective sphere is on the y-axis and that spherical coordinates are defined inside a sphere while the reflective image is seen from the outside.

Implementation

For each sample in the Lat-Long image we need to find the corresponding sample in the angular map. We do this by converting the Lat-Long (u, v) coordinates to cartesian coordinates using equations 11 and 12; then equation 10 is used to calculate angular (u, v) coordinates, which are remapped to floating-point pixel indices. The final pixel value is calculated using bilinear interpolation in the angular map. If the sampling is not done from the final image (Lat-Long) to the input image (angular map), there is a great risk that not all pixels will be set.
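The bilinear lookup used in the resampling step can be sketched as follows; this is a scalar-image version with our own assumption that the sample position stays inside the image, since the report does not specify its border handling.

```python
import numpy as np

def bilinear(img, x, y):
    """Sample a 2D array at the floating-point pixel position (x, y),
    with 0 <= x <= width-1 and 0 <= y <= height-1, by bilinearly
    interpolating the four surrounding pixels."""
    h, w = img.shape
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bot

img = np.array([[0.0, 1.0],
                [2.0, 3.0]])
```

Sampling at the center (0.5, 0.5) of this 2x2 example averages all four pixels, which is exactly the behavior needed when the floating-point indices computed from equation 10 fall between pixels.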

Cube map

Cube maps are today a standard part of the OpenGL API and are typically used to create reflections from an environment. Together with HDR data, cube maps have become very common in real-time rendering applications, because they make very realistic lighting of objects cheap on any modern GPU. We have chosen to produce cube maps by resampling the latitude-longitude map. To do this we first create six images and map each image, in 3D space, to a side of the unit cube. Then, for each pixel in each image, the direction vector from the center of the coordinate system through that particular pixel is calculated and used to sample the Lat-Long image, see figure 6. The sample positions in the Lat-Long image will almost never fall on exact pixel coordinates, so a bilinear interpolation is necessary to get the final pixel value.

Figure 6: Constructing the cube map.

3 Results

All images were captured using a Canon EOS 20D camera with a Canon EF 200mm f/2.8 L lens. The camera was mounted on a tripod and aimed towards a reflective sphere in order to capture the surrounding environment. Figure 7 shows different images of the same scene, captured 1/2 f-stop apart.

Figure 7: LDR image set.

3.1 The response curve

Figure 8 shows the response curve of the camera we used, calculated using HDR Shop.

Figure 8: Canon EOS 20D response curve.
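The cube-map construction of section 2 (figure 6) boils down to a per-pixel direction lookup. In this sketch the six-face layout and the latitude-longitude conventions (azimuth measured from +z, top edge = top pole) are our assumptions, not taken from the report.

```python
import math

def cubeface_dir(face, u, v):
    """Direction through the point (u, v) in [-1, 1]^2 on one face of
    the unit cube; face names follow an OpenGL-style +x..-z layout."""
    dirs = {
        '+x': (1.0, -v, -u), '-x': (-1.0, -v, u),
        '+y': (u, 1.0, v),   '-y': (u, -1.0, -v),
        '+z': (u, -v, 1.0),  '-z': (-u, -v, -1.0),
    }
    x, y, z = dirs[face]
    n = math.sqrt(x * x + y * y + z * z)
    return x / n, y / n, z / n

def dir_to_latlong(x, y, z, width, height):
    """Floating-point pixel position in the latitude-longitude image for
    a unit direction: azimuth along the horizontal axis, elevation along
    the vertical axis (top edge = top pole)."""
    phi = math.atan2(x, z)                      # azimuth in (-pi, pi]
    theta = math.acos(max(-1.0, min(1.0, y)))   # 0 at the top pole
    s = (phi / (2.0 * math.pi) + 0.5) * (width - 1)
    t = theta / math.pi * (height - 1)
    return s, t
```

Each cube-map texel is then filled by bilinear interpolation at (s, t), exactly as in the angular-to-Lat-Long resampling.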

3.2 Tone mapping

Figure 9 demonstrates the S-curve tone mapping method with different σ values, and figure 10 demonstrates different exponent values.

Figure 9: Varying intensity. A: σ = 0.01, B: σ = 0.05 and C: σ = 0.2.

Figure 10: Varying contrast. A: n = 0.9, B: n = 1.4 and C: n =.

3.3 Latitude-longitude images

Below are some latitude-longitude images captured at different locations around the LiU campus area in Norrköping.

Figure 11: Kåkenhus.

Figure 12: Motala ström.

Figure 13: Täppan.

3.4 Cube map

Figure 14 shows the six different views of the cube map.

Figure 14: Cube map.

4 Discussion

As the results demonstrate, it is currently possible to capture high dynamic range images using consumer-level cameras. However, because several images are captured within a small time interval, the scene needs to be static during that time. This requirement excludes a vast majority of everyday scenes. Because it is not always possible to control every aspect of an outdoor scene, some artifacts will appear in the final HDR image; in our images they can be noticed as blurred trails behind the clouds. This limitation is one of the major obstacles to making HDR imaging mainstream. Some artifacts in the resulting images are due to irregularities and scratches on the surface of the reflective sphere. The S-curve tone mapping method works quite well for most images, but it has its limits.

If more time were given, we would like to include support for several file formats, such as the Canon raw format CR2 and the HDR format OpenEXR, and to add some post-processing methods to compensate for small camera movements and moving objects (ghosting). We would also like to be able to overlap two images of the same sphere, as described in [4], to remove the singularity effect and the reflection of the camera.

References

[1] Paul E. Debevec and Jitendra Malik. Recovering High Dynamic Range Radiance Maps from Photographs. University of California at Berkeley.

[2] Florian Kainz and Rod Bogart. OpenEXR, Technical Introduction. Industrial Light and Magic.

[3] Greg Ward Larson. Graphics Gems II, Chapter 11.5: Real Pixels. Morgan Kaufmann.

[4] Erik Reinhard, Greg Ward, Sumanta Pattanaik, and Paul Debevec. High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting. Morgan Kaufmann.

[5] Jonas Unger, Stefan Gustavson, and Joakim Löw. Light Probes, Panoramas and Image Warping. Linköping University, TNM083 Image Based Rendering.


ISSN: (Online) Volume 2, Issue 2, February 2014 International Journal of Advance Research in Computer Science and Management Studies ISSN: 2321-7782 (Online) Volume 2, Issue 2, February 2014 International Journal of Advance Research in Computer Science and Management Studies Research Article / Paper / Case Study Available online at:

More information

This talk is oriented toward artists.

This talk is oriented toward artists. Hello, My name is Sébastien Lagarde, I am a graphics programmer at Unity and with my two artist co-workers Sébastien Lachambre and Cyril Jover, we have tried to setup an easy method to capture accurate

More information

High Performance Imaging Using Large Camera Arrays

High Performance Imaging Using Large Camera Arrays High Performance Imaging Using Large Camera Arrays Presentation of the original paper by Bennett Wilburn, Neel Joshi, Vaibhav Vaish, Eino-Ville Talvala, Emilio Antunez, Adam Barth, Andrew Adams, Mark Horowitz,

More information

Image stitching. Image stitching. Video summarization. Applications of image stitching. Stitching = alignment + blending. geometrical registration

Image stitching. Image stitching. Video summarization. Applications of image stitching. Stitching = alignment + blending. geometrical registration Image stitching Stitching = alignment + blending Image stitching geometrical registration photometric registration Digital Visual Effects, Spring 2006 Yung-Yu Chuang 2005/3/22 with slides by Richard Szeliski,

More information

MIT CSAIL Advances in Computer Vision Fall Problem Set 6: Anaglyph Camera Obscura

MIT CSAIL Advances in Computer Vision Fall Problem Set 6: Anaglyph Camera Obscura MIT CSAIL 6.869 Advances in Computer Vision Fall 2013 Problem Set 6: Anaglyph Camera Obscura Posted: Tuesday, October 8, 2013 Due: Thursday, October 17, 2013 You should submit a hard copy of your work

More information

This histogram represents the +½ stop exposure from the bracket illustrated on the first page.

This histogram represents the +½ stop exposure from the bracket illustrated on the first page. Washtenaw Community College Digital M edia Arts Photo http://courses.wccnet.edu/~donw Don W erthm ann GM300BB 973-3586 donw@wccnet.edu Exposure Strategies for Digital Capture Regardless of the media choice

More information

VU Rendering SS Unit 8: Tone Reproduction

VU Rendering SS Unit 8: Tone Reproduction VU Rendering SS 2012 Unit 8: Tone Reproduction Overview 1. The Problem Image Synthesis Pipeline Different Image Types Human visual system Tone mapping Chromatic Adaptation 2. Tone Reproduction Linear methods

More information

25/02/2017. C = L max L min. L max C 10. = log 10. = log 2 C 2. Cornell Box: need for tone-mapping in graphics. Dynamic range

25/02/2017. C = L max L min. L max C 10. = log 10. = log 2 C 2. Cornell Box: need for tone-mapping in graphics. Dynamic range Cornell Box: need for tone-mapping in graphics High dynamic range and tone mapping Advanced Graphics Rafał Mantiuk Computer Laboratory, University of Cambridge Rendering Photograph 2 Real-world scenes

More information

Removing Temporal Stationary Blur in Route Panoramas

Removing Temporal Stationary Blur in Route Panoramas Removing Temporal Stationary Blur in Route Panoramas Jiang Yu Zheng and Min Shi Indiana University Purdue University Indianapolis jzheng@cs.iupui.edu Abstract The Route Panorama is a continuous, compact

More information

HDR videos acquisition

HDR videos acquisition HDR videos acquisition dr. Francesco Banterle francesco.banterle@isti.cnr.it How to capture? Videos are challenging: We need to capture multiple frames at different exposure times and everything moves

More information

High Dynamic Range (HDR) photography is a combination of a specialized image capture technique and image processing.

High Dynamic Range (HDR) photography is a combination of a specialized image capture technique and image processing. Introduction High Dynamic Range (HDR) photography is a combination of a specialized image capture technique and image processing. Photomatix Pro's HDR imaging processes combine several Low Dynamic Range

More information

High dynamic range and tone mapping Advanced Graphics

High dynamic range and tone mapping Advanced Graphics High dynamic range and tone mapping Advanced Graphics Rafał Mantiuk Computer Laboratory, University of Cambridge Cornell Box: need for tone-mapping in graphics Rendering Photograph 2 Real-world scenes

More information

The accuracy of lighting simulations depends on the physically based modeling

The accuracy of lighting simulations depends on the physically based modeling Evalution of High Dynamic Range Image-Based Sky Models in Lighting Simulation Mehlika Inanici Abstract High Dynamic Range (HDR) Photography is used to capture 180 o images of the sky dome and provide data

More information

Lecture 22: Cameras & Lenses III. Computer Graphics and Imaging UC Berkeley CS184/284A, Spring 2017

Lecture 22: Cameras & Lenses III. Computer Graphics and Imaging UC Berkeley CS184/284A, Spring 2017 Lecture 22: Cameras & Lenses III Computer Graphics and Imaging UC Berkeley, Spring 2017 F-Number For Lens vs. Photo A lens s F-Number is the maximum for that lens E.g. 50 mm F/1.4 is a high-quality telephoto

More information

High-Resolution Interactive Panoramas with MPEG-4

High-Resolution Interactive Panoramas with MPEG-4 High-Resolution Interactive Panoramas with MPEG-4 Peter Eisert, Yong Guo, Anke Riechers, Jürgen Rurainsky Fraunhofer Institute for Telecommunications, Heinrich-Hertz-Institute Image Processing Department

More information

UNIT-3. Ans: Arrays of two point sources with equal amplitude and opposite phase:

UNIT-3. Ans: Arrays of two point sources with equal amplitude and opposite phase: `` UNIT-3 1. Derive the field components and draw the field pattern for two point source with spacing of λ/2 and fed with current of equal n magnitude but out of phase by 180 0? Ans: Arrays of two point

More information

International Journal of Advance Engineering and Research Development. Asses the Performance of Tone Mapped Operator compressing HDR Images

International Journal of Advance Engineering and Research Development. Asses the Performance of Tone Mapped Operator compressing HDR Images Scientific Journal of Impact Factor (SJIF): 4.72 International Journal of Advance Engineering and Research Development Volume 4, Issue 9, September -2017 e-issn (O): 2348-4470 p-issn (P): 2348-6406 Asses

More information

POST-PRODUCTION/IMAGE MANIPULATION

POST-PRODUCTION/IMAGE MANIPULATION 6 POST-PRODUCTION/IMAGE MANIPULATION IMAGE COMPRESSION/FILE FORMATS FOR POST-PRODUCTION Florian Kainz, Piotr Stanczyk This section focuses on how digital images are stored. It discusses the basics of still-image

More information

MATH 255 Applied Honors Calculus III Winter Homework 1. Table 1: 11.1:8 t x y

MATH 255 Applied Honors Calculus III Winter Homework 1. Table 1: 11.1:8 t x y MATH 255 Applied Honors Calculus III Winter 2 Homework Section., pg. 692: 8, 24, 43. Section.2, pg. 72:, 2 (no graph required), 32, 4. Section.3, pg. 73: 4, 2, 54, 8. Section.4, pg. 79: 6, 35, 46. Solutions.:

More information

Omnidirectional High Dynamic Range Imaging with a Moving Camera

Omnidirectional High Dynamic Range Imaging with a Moving Camera Omnidirectional High Dynamic Range Imaging with a Moving Camera by Fanping Zhou Thesis submitted to the Faculty of Graduate and Postdoctoral Studies in partial fulfillment of the requirements for the M.A.Sc.

More information

of the whole circumference.

of the whole circumference. TRIGONOMETRY WEEK 13 ARC LENGTH AND AREAS OF SECTORS If the complete circumference of a circle can be calculated using C = 2πr then the length of an arc, (a portion of the circumference) can be found by

More information

Tonemapping and bilateral filtering

Tonemapping and bilateral filtering Tonemapping and bilateral filtering http://graphics.cs.cmu.edu/courses/15-463 15-463, 15-663, 15-862 Computational Photography Fall 2018, Lecture 6 Course announcements Homework 2 is out. - Due September

More information

Practice Problems: Calculus in Polar Coordinates

Practice Problems: Calculus in Polar Coordinates Practice Problems: Calculus in Polar Coordinates Answers. For these problems, I want to convert from polar form parametrized Cartesian form, then differentiate and take the ratio y over x to get the slope,

More information

ESCI Cloud Physics and Precipitation Processes Lesson 10 - Weather Radar Dr. DeCaria

ESCI Cloud Physics and Precipitation Processes Lesson 10 - Weather Radar Dr. DeCaria ESCI 340 - Cloud Physics and Precipitation Processes Lesson 10 - Weather Radar Dr. DeCaria References: A Short Course in Cloud Physics, 3rd ed., Rogers and Yau, Ch. 11 Radar Principles The components of

More information

6.1 - Introduction to Periodic Functions

6.1 - Introduction to Periodic Functions 6.1 - Introduction to Periodic Functions Periodic Functions: Period, Midline, and Amplitude In general: A function f is periodic if its values repeat at regular intervals. Graphically, this means that

More information

Figure 1 HDR image fusion example

Figure 1 HDR image fusion example TN-0903 Date: 10/06/09 Using image fusion to capture high-dynamic range (hdr) scenes High dynamic range (HDR) refers to the ability to distinguish details in scenes containing both very bright and relatively

More information

Image Registration for Multi-exposure High Dynamic Range Image Acquisition

Image Registration for Multi-exposure High Dynamic Range Image Acquisition Image Registration for Multi-exposure High Dynamic Range Image Acquisition Anna Tomaszewska Szczecin University of Technology atomaszewska@wi.ps.pl Radoslaw Mantiuk Szczecin University of Technology rmantiuk@wi.ps.pl

More information

HDR Darkroom 2 User Manual

HDR Darkroom 2 User Manual HDR Darkroom 2 User Manual Everimaging Ltd. 1 / 22 www.everimaging.com Cotent: 1. Introduction... 3 1.1 A Brief Introduction to HDR Photography... 3 1.2 Introduction to HDR Darkroom 2... 5 2. HDR Darkroom

More information

Overview of Simulation of Video-Camera Effects for Robotic Systems in R3-COP

Overview of Simulation of Video-Camera Effects for Robotic Systems in R3-COP Overview of Simulation of Video-Camera Effects for Robotic Systems in R3-COP Michal Kučiš, Pavel Zemčík, Olivier Zendel, Wolfgang Herzner To cite this version: Michal Kučiš, Pavel Zemčík, Olivier Zendel,

More information

Panoramic Image Mosaics

Panoramic Image Mosaics Panoramic Image Mosaics Image Stitching Computer Vision CSE 576, Spring 2008 Richard Szeliski Microsoft Research Full screen panoramas (cubic): http://www.panoramas.dk/ Mars: http://www.panoramas.dk/fullscreen3/f2_mars97.html

More information

Evaluation of 3C sensor coupling using ambient noise measurements Summary

Evaluation of 3C sensor coupling using ambient noise measurements Summary Evaluation of 3C sensor coupling using ambient noise measurements Howard Watt, John Gibson, Bruce Mattocks, Mark Cartwright, Roy Burnett, and Shuki Ronen Veritas Geophysical Corporation Summary Good vector

More information

6.869 Advances in Computer Vision Spring 2010, A. Torralba

6.869 Advances in Computer Vision Spring 2010, A. Torralba 6.869 Advances in Computer Vision Spring 2010, A. Torralba Due date: Wednesday, Feb 17, 2010 Problem set 1 You need to submit a report with brief descriptions of what you did. The most important part is

More information

Time-Lapse Panoramas for the Egyptian Heritage

Time-Lapse Panoramas for the Egyptian Heritage Time-Lapse Panoramas for the Egyptian Heritage Mohammad NABIL Anas SAID CULTNAT, Bibliotheca Alexandrina While laser scanning and Photogrammetry has become commonly-used methods for recording historical

More information

High Dynamic Range Texture Compression

High Dynamic Range Texture Compression High Dynamic Range Texture Compression Kimmo Roimela Tomi Aarnio Joonas Ita ranta Nokia Research Center Figure 1: Encoding extreme colors. Left to right: original (48 bpp), our method (8 bpp), DXTC-encoded

More information

Histograms& Light Meters HOW THEY WORK TOGETHER

Histograms& Light Meters HOW THEY WORK TOGETHER Histograms& Light Meters HOW THEY WORK TOGETHER WHAT IS A HISTOGRAM? Frequency* 0 Darker to Lighter Steps 255 Shadow Midtones Highlights Figure 1 Anatomy of a Photographic Histogram *Frequency indicates

More information

Multi-Path Fading Channel

Multi-Path Fading Channel Instructor: Prof. Dr. Noor M. Khan Department of Electronic Engineering, Muhammad Ali Jinnah University, Islamabad Campus, Islamabad, PAKISTAN Ph: +9 (51) 111-878787, Ext. 19 (Office), 186 (Lab) Fax: +9

More information

Quintic Hardware Tutorial Camera Set-Up

Quintic Hardware Tutorial Camera Set-Up Quintic Hardware Tutorial Camera Set-Up 1 All Quintic Live High-Speed cameras are specifically designed to meet a wide range of needs including coaching, performance analysis and research. Quintic LIVE

More information

Picture Style Editor Ver Instruction Manual

Picture Style Editor Ver Instruction Manual ENGLISH Picture Style File Creating Software Picture Style Editor Ver. 1.18 Instruction Manual Content of this Instruction Manual PSE stands for Picture Style Editor. In this manual, the windows used in

More information

PARALLEL ALGORITHMS FOR HISTOGRAM-BASED IMAGE REGISTRATION. Benjamin Guthier, Stephan Kopf, Matthias Wichtlhuber, Wolfgang Effelsberg

PARALLEL ALGORITHMS FOR HISTOGRAM-BASED IMAGE REGISTRATION. Benjamin Guthier, Stephan Kopf, Matthias Wichtlhuber, Wolfgang Effelsberg This is a preliminary version of an article published by Benjamin Guthier, Stephan Kopf, Matthias Wichtlhuber, and Wolfgang Effelsberg. Parallel algorithms for histogram-based image registration. Proc.

More information

A Novel Hybrid Exposure Fusion Using Boosting Laplacian Pyramid

A Novel Hybrid Exposure Fusion Using Boosting Laplacian Pyramid A Novel Hybrid Exposure Fusion Using Boosting Laplacian Pyramid S.Abdulrahaman M.Tech (DECS) G.Pullaiah College of Engineering & Technology, Nandikotkur Road, Kurnool, A.P-518452. Abstract: THE DYNAMIC

More information

DETERMINING LENS VIGNETTING WITH HDR TECHNIQUES

DETERMINING LENS VIGNETTING WITH HDR TECHNIQUES Национален Комитет по Осветление Bulgarian National Committee on Illumination XII National Conference on Lighting Light 2007 10 12 June 2007, Varna, Bulgaria DETERMINING LENS VIGNETTING WITH HDR TECHNIQUES

More information

Picture Style Editor Ver Instruction Manual

Picture Style Editor Ver Instruction Manual ENGLISH Picture Style File Creating Software Picture Style Editor Ver. 1.15 Instruction Manual Content of this Instruction Manual PSE stands for Picture Style Editor. indicates the selection procedure

More information

Channel. Muhammad Ali Jinnah University, Islamabad Campus, Pakistan. Multi-Path Fading. Dr. Noor M Khan EE, MAJU

Channel. Muhammad Ali Jinnah University, Islamabad Campus, Pakistan. Multi-Path Fading. Dr. Noor M Khan EE, MAJU Instructor: Prof. Dr. Noor M. Khan Department of Electronic Engineering, Muhammad Ali Jinnah University, Islamabad Campus, Islamabad, PAKISTAN Ph: +9 (51) 111-878787, Ext. 19 (Office), 186 (Lab) Fax: +9

More information

Aperture. The lens opening that allows more, or less light onto the sensor formed by a diaphragm inside the actual lens.

Aperture. The lens opening that allows more, or less light onto the sensor formed by a diaphragm inside the actual lens. PHOTOGRAPHY TERMS: AE - Auto Exposure. When the camera is set to this mode, it will automatically set all the required modes for the light conditions. I.e. Shutter speed, aperture and white balance. The

More information

The ultimate camera. Computational Photography. Creating the ultimate camera. The ultimate camera. What does it do?

The ultimate camera. Computational Photography. Creating the ultimate camera. The ultimate camera. What does it do? Computational Photography The ultimate camera What does it do? Image from Durand & Freeman s MIT Course on Computational Photography Today s reading Szeliski Chapter 9 The ultimate camera Infinite resolution

More information