Dual-fisheye Lens Stitching for 360-degree Imaging & Video. Tuan Ho, PhD Student, Electrical Engineering Dept., UT Arlington


Introduction. 360-degree imaging: the process of taking multiple photographs and stitching them together to create a 360x180-degree image. Imaging systems: Catadioptric (lens + mirror): no stitching seams, but limited field of view (cannot produce 360x180 images). Polydioptric: multiple wide-view lenses in the same rig. Examples: Columbia U's catadioptric camera, GoPro's Odyssey 360 rig (16 cameras), Facebook's Surround 360 rig (17 cameras).

Motivation. Polydioptric cameras are becoming increasingly popular for 360-degree imaging and video, but most professional polydioptric optical systems are bulky and very expensive ($15,000 ~ $30,000 for a complete system), so they are not for the masses. Samsung Gear 360 camera: very compact, affordable (around $300/camera), and uses two fisheye lenses. Affordable optics and simpler hardware; the complexity is shifted to the stitching algorithm.

Challenges. Little overlap between the two fisheye images taken by the Gear 360. Misalignment between the two lenses in the camera. Fisheye light drop-off: intensity decreases moving away from the center of the image.

Challenges (cont.) Image alignment & stitching using conventional methods, comparing a good amount of overlap against a very limited amount of overlap. Feature extraction & matching (SIFT): with large overlap, inlier features outnumber outliers (thanks to the large overlap); with limited overlap, incorrect feature matches are overwhelming, due to the small overlap area plus extreme fisheye distortion at the boundary. Outlier elimination (RANSAC): with large overlap, inliers are retained correctly (reliable); with limited overlap, inliers are mistaken for outliers and removed (not reliable).

Challenges (cont.) Image alignment & stitching using conventional methods, continued. Warp image: with large overlap, warp the image with the homography estimated from the inliers, straighten the pano, blend the overlap, done; with limited overlap, there are not enough reliable inliers, so the image cannot be warped.
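To illustrate the RANSAC outlier-elimination step discussed above, here is a minimal, self-contained sketch. It uses a simple 2-D translation model on synthetic matches rather than the homography of a real stitching pipeline; the function name and toy data are hypothetical.

```python
import numpy as np

def ransac_translation(src, dst, iters=200, thresh=2.0, seed=0):
    """Toy RANSAC: estimate a 2-D translation between matched points,
    treating matches whose residual exceeds `thresh` as outliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i = rng.integers(len(src))           # minimal sample: one match
        t = dst[i] - src[i]                  # candidate translation
        resid = np.linalg.norm(src + t - dst, axis=1)
        inliers = resid < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit the model on all inliers of the best hypothesis
    t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return t, best_inliers

# 20 correct matches offset by (5, -3), plus 5 gross outliers
rng = np.random.default_rng(1)
src = rng.uniform(0, 100, (25, 2))
dst = src + np.array([5.0, -3.0])
dst[20:] += rng.uniform(50, 100, (5, 2))     # corrupt the last 5 matches
t, inliers = ransac_translation(src, dst)
print(np.round(t, 3), inliers.sum())         # recovers (5, -3) with 20 inliers
```

With many reliable inliers this converges easily; the slide's point is that with tiny overlap the inlier ratio collapses, and this voting scheme starts discarding good matches.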

Gear360 Stitching Method. An unconventional stitching method, specialized for the camera: it exploits individual lens characteristics and the relative difference between the two lenses mounted on the same system. Two-step alignment: a Gear360-specific alignment, mostly computed offline, followed by a heuristic (and adaptive) method for refined alignment, computed online.

Our Approach. Intensity compensation: correct the light fall-off away from the center. Unwarping: derive a geometric transformation to unwarp the Gear360 fisheye images. Alignment: two-step alignment. Blending: ramp-function blending (simple & fast). Pipeline: Intensity Compensation → Fisheye Unwarping → Two-step Alignment → Blending, with the intensity profile and lens-misalignment information as inputs.

Intensity Profiling. The intensity of the incident light falls off along the radius (a natural effect of the fisheye lens), so the light drop-off must be compensated. Profiling experiment: measure the intensity along the radius, then fit a polynomial curve to the data.

Intensity Profiling: Result. Assuming optical symmetry of the fisheye lens, compensate the light fall-off using the constructed intensity profile.
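The profiling-and-compensation procedure above can be sketched as follows. The vignetting model, image size, and polynomial degree here are assumptions for illustration, not the measured Gear360 profile.

```python
import numpy as np

# Simulate radial light fall-off on a flat scene (hypothetical model),
# fit a polynomial intensity profile along the radius, and compensate
# by dividing each pixel by the normalized profile value.
h = w = 201
y, x = np.mgrid[:h, :w]
r = np.hypot(x - w // 2, y - h // 2)          # radius from image center
true_falloff = 1.0 - 0.5 * (r / r.max())**2   # assumed vignetting shape
flat = 200.0                                  # uniform scene brightness
img = flat * true_falloff

# "Profiling experiment": mean intensity in thin radial rings
radii = np.arange(0, int(r.max()), 5)
means = np.array([img[(r >= a) & (r < a + 5)].mean() for a in radii])
centers = radii + 2.5                         # ring midpoints

# Polynomial curve fit of intensity vs. radius
coef = np.polyfit(centers, means, deg=4)
profile = np.polyval(coef, r)
profile /= np.polyval(coef, 0.0)              # normalize to 1 at center

compensated = img / np.clip(profile, 1e-3, None)
print(round(float(compensated.mean()), 1))    # near 200: fall-off removed
```

Dividing by the normalized profile restores a nearly flat image, which is exactly the effect shown in the slide's before/after comparison.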

Next (pipeline recap): Intensity Compensation → Fisheye Unwarping → Two-step Alignment → Blending.

Unwarping: Fisheye Optical Model. Fisheye lenses achieve a larger-than-180-degree field of view by bending the incident light. Image courtesy: Pierre Toscani.

Unwarping: Fisheye Geometry Model. 2D to 3D: the x-coordinate maps to yaw, the y-coordinate to pitch. Image courtesy: Paul Bourke.

Unwarping: Derivation. Treat the original 2-D fisheye image in unit polar coordinates. Reconstruct the 3-D surface structure on a unit sphere. Map the 3-D sphere back to 2-D with a projection that maps straight lines to straight lines and is compatible with 360 viewers.

Unwarping: Derivation. x_prj = ρ·cos(θ), y_prj = ρ·sin(θ)
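A sketch of the unwarping map under an assumed equidistant fisheye model (ρ linear in the off-axis angle, consistent with the x_prj = ρ·cos(θ), y_prj = ρ·sin(θ) projection above). The 195° FOV and image sizes are hypothetical, not the Gear360's calibrated values.

```python
import numpy as np

def fisheye_map(out_w, out_h, fish_size, fov_deg=195.0):
    """Fisheye (x, y) sampling coordinates for an equirectangular output
    covering one hemisphere: yaw, pitch in [-90°, +90°]."""
    cx = cy = fish_size / 2.0
    yaw = np.linspace(-np.pi / 2, np.pi / 2, out_w)     # x -> yaw
    pitch = np.linspace(-np.pi / 2, np.pi / 2, out_h)   # y -> pitch
    yaw, pitch = np.meshgrid(yaw, pitch)

    # 2-D -> 3-D: point on the unit sphere; optical axis is +z
    px = np.cos(pitch) * np.sin(yaw)
    py = np.sin(pitch)
    pz = np.cos(pitch) * np.cos(yaw)

    phi = np.arccos(np.clip(pz, -1.0, 1.0))  # angle off the optical axis
    theta = np.arctan2(py, px)               # polar angle in image plane
    rho = phi / np.radians(fov_deg / 2)      # equidistant: rho linear in phi
    r_px = rho * fish_size / 2.0
    return cx + r_px * np.cos(theta), cy + r_px * np.sin(theta)

xs, ys = fisheye_map(361, 181, 1000)
# Pano center (yaw = pitch = 0) samples the fisheye center (500, 500)
print(round(float(xs[90, 180]), 3), round(float(ys[90, 180]), 3))
```

Sampling the fisheye image at these coordinates (e.g. with bilinear interpolation) produces the unwarped 360x180 view shown on the following slides.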

Unwarping: Result. Unwarped output shown in the PTGui viewport and a 360 viewer.

Unwarping: Result. Display on a non-360 viewer vs. the PTGui viewport / 360 viewer.

Unwarp the light-compensated images: original (output of Gear360) vs. light-compensated & unwarped.

Next (pipeline recap): Intensity Compensation → Fisheye Unwarping → Two-step Alignment → Blending.

Blending. First blending version: ramp blending (fast). Across the overlapped region, the weight on the left image ramps down while the weight on the right image ramps up.
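The ramp (linear feathering) blend can be sketched as below: the left image's weight falls from 1 to 0 across the overlap while the right image's rises from 0 to 1, so the weights always sum to 1. The toy images and overlap width are illustrative.

```python
import numpy as np

def ramp_blend(left, right, overlap):
    """left/right: H x W grayscale images sharing `overlap` columns."""
    h, w = left.shape
    out = np.zeros((h, 2 * w - overlap))
    out[:, :w - overlap] = left[:, :w - overlap]    # left-only part
    out[:, w:] = right[:, overlap:]                 # right-only part
    ramp = np.linspace(1.0, 0.0, overlap)           # weight for left image
    out[:, w - overlap:w] = (ramp * left[:, -overlap:]
                             + (1.0 - ramp) * right[:, :overlap])
    return out

left = np.full((4, 8), 100.0)
right = np.full((4, 8), 200.0)
pano = ramp_blend(left, right, overlap=4)
print(pano[0])   # ramps smoothly from 100 to 200 across the overlap
```

This is the "simple & fast" option named in the approach slide; more elaborate blends (e.g. multi-band) would trade speed for fewer visible seams.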

Fisheye Lens Calibration. Study the lens's optical distortion. Goal: if there is any distortion, compensate for it so that straight lines stay straight, midpoints stay in the middle, etc. Calibration process: known patterns (checkerboards) + fisheye images → calibration matrix (used to correct the fisheye distortion). Calibration uses OCamCalib_v3.0 for fisheye lenses with FOV > 180 degrees (OpenCV calibration may not work with fisheye lenses whose FOV > 180 deg).

Calibration Experiments. Use checkerboard patterns of predefined size. Cover all rotational angles of the fisheye lens. Bring the pattern as close to the lens as possible (without creating a visible shadow). All checkerboard corners must be detected in every image; on failure, take a different set of images and start over. The image set on the RHS has all corners detected correctly.

Calibration Results. The affine matrix shows that the lens is accurate (the off-diagonal affine parameters are at the 5th~6th decimal place):
A = [ 1.000028, 0.000062, 0; -0.000006, 1, 0; 1957.061640, 1943.391170, 1 ]
Translation from the image center (3888/2 = 1944): 1957.061640 - 1944 = 13.0616 pixels horizontally; 1943.391170 - 1944 = -0.6088 pixels vertically. Thus, no individual lens compensation is needed.
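The deviation arithmetic above can be checked directly (values taken from the slide; the 3888-pixel width is the fisheye image size):

```python
# Deviation of the calibrated optical center from the image center
w = h = 3888
cx, cy = 1957.061640, 1943.391170   # translation row of the affine matrix
print(round(cx - w / 2, 4), round(cy - h / 2, 4))   # -> 13.0616 -0.6088
```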

Affine Transformation. A geometric transformation changes the coordinates of each image pixel (mapping them to new places) while the image intensities remain intact. Nice features of the affine transformation: it preserves lines, points & planes, hence shapes, and it is fast to derive, needing only two sets of corresponding points (three non-collinear pairs at minimum). Image courtesy: Wikipedia; Mathworks.
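Fitting a 2-D affine transform from corresponding point pairs can be sketched with a least-squares solve; with more pairs than the six unknowns, `np.linalg.lstsq` returns the least-squares fit. The point data and matrix here are hypothetical, not the deck's calibrated values.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine fit: dst ≈ [x, y, 1] @ M, M is 3x2."""
    A = np.hstack([src, np.ones((len(src), 1))])   # [x, y, 1] rows
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M

src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_M = np.array([[1.0, 0.1], [-0.1, 1.0], [5.0, -3.0]])  # shear + shift
dst = np.hstack([src, np.ones((4, 1))]) @ true_M
M = fit_affine(src, dst)
print(np.allclose(M, true_M))   # exact recovery on noise-free points
```

With ~200 control-point pairs as in the lens-alignment step, the same solve simply becomes more overdetermined and averages out selection noise.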

Next (pipeline recap): Intensity Compensation → Fisheye Unwarping → Two-step Alignment → Blending.

Two-Step Alignment. Compensates for the relative misalignment between the two lenses. Step 1: estimate point correspondences for an initial alignment [manual, offline]. Step 2: adaptively align the images to minimize any small remnant discontinuity in the overlapping regions after the first alignment [automatic, online].

Lens Alignment. How much do the images produced by the two lenses differ geometrically? Use checkerboards to find the misalignment patterns in the overlapping regions. Setup: checkerboards are placed in the Gear360 overlap regions, 180° apart, and each test camera is put at the same position.

Misalignment Pattern in Overlapping Regions. Comparing the front-lens/back-lens overlaps of camera #1 and camera #2: the absolute pixel coordinates of a feature (e.g., a checkerboard square) can vary from camera to camera, but the relative position of the same checkerboard squares in the overlapping regions remains similar or the same across different cameras.

Lens Alignment. Unwarp the fisheyes, then arrange the images in a 360x180 pano before extracting control points (so they are loosely aligned).

Lens Alignment. Control-point selection: a set of corresponding pairs (~200 pairs). Solve for an affine matrix [computed offline]. Align: warp one image to the other by applying the affine transformation.

Lens Alignment: WITHOUT alignment.

Lens Alignment: with the proposed alignment, the estimated affine matrix brings both images into vertical alignment.

Refined Alignment. Objects or persons too close to the camera need more than the first alignment method: here, the person is sitting ~1 m from the camera, and the first alignment is already applied.

Refined Alignment. Mismatches may remain after the first alignment, and this misalignment builds up as objects move closer to the camera. Use normalized cross-correlation to find the best match adaptively to the scene and objects, minimizing any misalignment caused by an object's distance to the camera. Correlation review: correlation measures the similarity of two signals; two functions are best matched where their cross-correlation is maximized. Variations in exposure and lens response in the overlapped regions motivate normalized cross-correlation. Image courtesy: wiki.

Refined Alignment. Normalized cross-correlation [J. P. Lewis95] (old but fast & works well):
γ(u,v) = Σ_{x,y} [f(x,y) - f̄_{u,v}] [t(x-u, y-v) - t̄] / { Σ_{x,y} [f(x,y) - f̄_{u,v}]² · Σ_{x,y} [t(x-u, y-v) - t̄]² }^(1/2),
where t̄ is the template mean and f̄_{u,v} is the mean of f under the template positioned at (u,v).

Refined Alignment. Find the best match of each template from the top image within the reference regions of the bottom image. Create 8 pairs of corresponding points, solve for an affine matrix [computed online], and warp the image accordingly using this matrix.
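The template-matching step can be sketched with a brute-force NCC search. Mean removal and normalization make the score robust to exposure differences, which is why the exposure-shifted crop below is still found exactly; the images and sizes are synthetic.

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation score between two equal-size arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return (p * t).sum() / denom if denom > 0 else 0.0

def best_match(image, template):
    """Slide the template over the image; return (row, col) of the NCC peak."""
    th, tw = template.shape
    scores = np.full((image.shape[0] - th + 1, image.shape[1] - tw + 1), -1.0)
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            scores[y, x] = ncc(image[y:y + th, x:x + tw], template)
    return tuple(int(v) for v in
                 np.unravel_index(np.argmax(scores), scores.shape))

rng = np.random.default_rng(0)
image = rng.uniform(0, 255, (40, 40))
template = 0.5 * image[10:18, 22:30] + 20.0   # exposure-shifted crop
print(best_match(image, template))            # -> (10, 22)
```

Lewis's method speeds up exactly this search with running-sum tables and FFTs; the brute-force loop above is only meant to show what is being computed.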

Refined Alignment: WITHOUT refined alignment (the first alignment is already applied).

Refined Alignment: with the proposed refined alignment (on top of the first alignment).

Results. Evaluate the Samsung Note 5's stitching algorithm vs. the proposed method.

Results (Phone vs. Proposed): five example scenes, each shown stitched by the Samsung Note 5 and then by the proposed method, followed by a side-by-side comparison in a 360-degree viewer.

Conclusion. A new method for dual-fisheye (Gear360) stitching is presented; a system is built in OpenCV (C++) & Matlab. The proposed method comprises four steps: fisheye intensity compensation; geometric derivation & fisheye unwarping; two-step image registration that aligns one unwarped image to the other; and blending with a ramp function. Results show that the proposed method produces stitched image quality similar to the Samsung Note 5's in most cases and outperforms it in some others (e.g., a person posing close to the lens boundary).

Conclusion (cont.) One paper accepted for publication in ICASSP 2017. One provisional U.S. patent filed. One proposal to MPEG standardization on fisheye light fall-off at the MPEG meeting in Chengdu, 2016*. *Lens Shading Parameters Metadata for Omnidirectional Video, ISO/IEC JTC1/SC29/WG11 MPEG2016/m39469. Presented by Dr. Budagavi, Samsung Electronics.

Future Work. Improve stitching quality for still images [under development]. Develop a 360-degree video stitching framework [under development]. 360-degree video compression. 360-degree video stabilization.

Improve Stitching Quality. Further align the patterned parts caused by parallax.

Improve Stitching Quality (cont.) Cause of the discontinuity: the current approach solves an overdetermined system of control points for a single warping matrix, which yields an approximate solution in a least-squares sense and cannot move all control points to their desired positions. Solution: solve for an interpolation grid (of the unwarped image's size) by weighted least squares to deform the image. Points in the overlapping area are transformed to their desired positions precisely; points outside the overlapping area are transformed less accurately [minimal impact on stitching quality].
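The limitation described above can be sketched numerically: a single affine fitted by least squares to non-affine control-point displacements leaves residuals, while a per-pixel interpolated deformation reproduces every control point exactly. Here a simple inverse-distance-weighted warp stands in for the weighted-least-squares grid under development; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
src = rng.uniform(0, 100, (12, 2))                     # control points
# non-affine displacement field (hypothetical parallax-like wobble)
dst = src + np.column_stack([np.sin(src[:, 1] / 15),
                             np.cos(src[:, 0] / 15)])

# Global affine fit by least squares: approximate, leaves residuals
A = np.hstack([src, np.ones((12, 1))])
M, *_ = np.linalg.lstsq(A, dst, rcond=None)
affine_err = np.abs(A @ M - dst).max()

def idw_warp(p, src, dst, eps=1e-12):
    """Inverse-distance-weighted warp: interpolates control points exactly."""
    d2 = ((src - p) ** 2).sum(axis=1)
    if d2.min() < eps:                                 # on a control point
        return dst[np.argmin(d2)]
    w = 1.0 / d2
    return (w[:, None] * dst).sum(axis=0) / w.sum()

interp_err = max(np.abs(idw_warp(p, src, dst) - q).max()
                 for p, q in zip(src, dst))
print(affine_err > 1e-4, float(interp_err))            # True 0.0
```

The interpolating warp hits every control point, mirroring the slide's claim that points in the overlap are transformed precisely while other regions are only approximated.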

Improve Stitching Quality (cont.) Preliminary result (no blending yet, seam visible).

Improve Stitching Quality (cont.) The current method vs. the method being developed (without blending).

360-degree video for dual-fisheye lens camera. Please see demo.

Thank You!

References
[Map1] Map Projection: http://www.progonos.com/furuti/mapproj
[Map2] Map Projection: https://en.wikipedia.org/wiki/Map_projection
[J. P. Lewis95] J. P. Lewis, "Fast Normalized Cross-Correlation," Industrial Light & Magic, 1995.