Reprojection of 3D points of Superquadrics Curvature caught by Kinect IR-depth sensor to CCD of RGB camera


Facoltà di Ingegneria
Reprojection of 3D points of Superquadrics Curvature caught by Kinect IR-depth sensor to CCD of RGB camera
Mariolino De Cecco, Nicolo Biasi, Ilya Afanasyev
Trento, 2012
1/20

Content
1. Kinect installation.
2. Kinect calibration.
3. Kinect outputs.
4. MATLAB preprocessing of Kinect data.
5. MATLAB inputs.
6. SQ Curvature software.
7. Tests.
8. Results.
9. Links.
2/20

Kinect installation, 1st attempt (OpenNI backend):
1. Nicolas Burrus software [1]: Kinect RGBDemo v0.6.1 supports the OpenNI/NITE backends [2] and has experimental infrared support with OpenNI (still buggy). I tried this with the OpenNI backend under Windows 64-bit.
2. Installed the SensorKinect drivers under Windows 32-bit [3] and the OpenNI/NITE modules (Win32) [2] according to the instructions [4], and verified that they work properly.
3. Used RGB-D Capture [1] to grab Kinect RGB and IR images with intensity.raw and depth.raw files. The software does not grab IR images! It produces only the RGB image, intensity.raw, depth.raw and a calibration file (calibration.yml, without distortion parameters).
4. calibrate-openni-intrinsics --pattern-size 0.0405 grab1 calibration.yml produces openni_calibration.yml with the intrinsic matrix and distortion coefficients for the Kinect RGB camera.
5. calibrate-openni-depth.exe --pattern-size 0.0405 grab1 produces only partly processed figures and an error message.
3/20

The comparison of Kinect calibration results for:
- Libfreenect backend (under OS Linux / Ubuntu 10.10, 32-bit)
- OpenNI backend (under OS Windows, 32-bit)
23/11/2011
The intrinsic matrices for the Kinect RGB and depth cameras are the same for both backends. The depth camera calibration for the OpenNI backend is NOT available!

The comparison of Kinect calibration results for:
- Libfreenect backend (under OS Linux / Ubuntu 10.10, 32-bit)
- OpenNI backend (under OS Windows, 32-bit)
23/11/2011
There is NO extrinsic mapping between the Kinect depth and RGB cameras for the OpenNI backend, so the reprojection of 3D points from the Kinect depth camera to the RGB image is NOT possible with OpenNI!

Kinect installation, 2nd attempt (Libfreenect backend, Windows):
1. Nicolas Burrus software [1]: Kinect RGBDemo v0.6.1 supports the Libfreenect backend [2] and has RGB and infrared support. I tried this under Windows 64-bit.
2. Installed the Xbox NUI Motor drivers under Windows 32-bit [3] and the OpenNI/NITE modules (Win32) [2] according to the instructions [4], and verified that they work properly.
3. Used RGB-D Capture [1] to grab Kinect RGB and IR images with intensity.raw and depth.raw files. The software does not grab IR images! It produces only the RGB image, intensity.raw, depth.raw and a calibration file (calibration.yml, without distortion parameters).
4. calibrate-openni-intrinsics --pattern-size 0.025 grab1 calibration.yml produces openni_calibration.yml with the intrinsic matrix and distortion coefficients for the Kinect RGB camera.
5. calibrate-openni-depth.exe --pattern-size 0.025 grab1 produces only partly processed figures and an error message.
6/20

Kinect installation, new idea (OpenNI backend):
1. Nicolas Burrus software [1]: Kinect RGBDemo v0.6.1 supports the OpenNI/NITE backends [2] and has experimental infrared support with OpenNI (still buggy). I tried this with the OpenNI backend under Windows 64-bit.
2. Installed the SensorKinect drivers under Windows 32-bit [3] and the OpenNI/NITE modules (Win32) [2] according to the instructions [4], and verified that they work properly.
3. Used RGB-D Capture [1] to grab Kinect RGB and IR images with intensity.raw and depth.raw files. The software does not grab IR images! It produces only the RGB image, intensity.raw, depth.raw and a calibration file (calibration.yml, without distortion parameters).
4. calibrate-openni-intrinsics --pattern-size 0.025 grab1 calibration.yml produces openni_calibration.yml with the intrinsic matrix and distortion coefficients for the Kinect RGB camera.
5. calibrate-openni-depth.exe --pattern-size 0.025 grab1 produces only partly processed figures and an error message.
7/20

5. MATLAB Inputs. Cube_Curvatura_20110925\main_reproject.m
- color.png (480x640 pixels)
- calibdata.mat: matrices of intrinsic parameters for the IR camera (K_ir) and the RGB camera (K_rgb); the extrinsic mapping between the IR (depth) and RGB Kinect cameras (R, T); distortion coefficients (kc_ir, kc_rgb)
- Points.mat (N x 6), where N is the number of points with (x, y, z, r, g, b) information
8/20

6. MATLAB Software. 6.1 Elimination of the ground: a RANSAC search for the ground plane using the plane equation. 9/20
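The ground-elimination step above (a RANSAC search for the dominant plane) can be sketched as follows. This is a minimal pure-Python illustration, not the project's MATLAB code; the inlier threshold `tol` and iteration count `n_iters` are assumed parameters.

```python
import random

def fit_plane(p1, p2, p3):
    """Plane through three points, returned as (a, b, c, d) with ax+by+cz+d=0."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    # Normal vector n = u x v, then normalize it.
    n = [u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0]]
    norm = sum(c * c for c in n) ** 0.5
    n = [c / norm for c in n]          # raises ZeroDivisionError if collinear
    d = -sum(n[i] * p1[i] for i in range(3))
    return n[0], n[1], n[2], d

def ransac_ground(points, n_iters=200, tol=0.01, seed=0):
    """Return the plane with the most inliers and the inlier index list."""
    rng = random.Random(seed)
    best_plane, best_inliers = None, []
    for _ in range(n_iters):
        sample = rng.sample(points, 3)
        try:
            a, b, c, d = fit_plane(*sample)
        except ZeroDivisionError:      # degenerate (collinear) sample
            continue
        inliers = [i for i, p in enumerate(points)
                   if abs(a*p[0] + b*p[1] + c*p[2] + d) < tol]
        if len(inliers) > len(best_inliers):
            best_plane, best_inliers = (a, b, c, d), inliers
    return best_plane, best_inliers
```

Points classified as inliers belong to the ground and are removed before the superquadric fitting stage.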

6.2 3D Points Reprojection. Transformation from the IR (depth) to the RGB camera reference system: P_rgb = R * P_ir + T, where R, T is the extrinsic mapping between the IR and RGB Kinect cameras, and P_rgb, P_ir are points in the RGB and IR camera reference systems. Upper figures: cube reprojection without considering distortion; the distance from the Kinect to the cube is z = 0.5 m. Bottom figures: cube reprojection considering distortion. Bad calibration? 10/20
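In this notation the IR-to-RGB transformation is just a matrix-vector product plus a translation. A minimal sketch; the R and T values in the usage example are illustrative placeholders, not the calibrated values stored in calibdata.mat:

```python
def transform_ir_to_rgb(P_ir, R, T):
    """Apply the extrinsic mapping P_rgb = R * P_ir + T to one 3D point."""
    return [sum(R[i][j] * P_ir[j] for j in range(3)) + T[i] for i in range(3)]

# Illustrative values only: identity rotation and a small horizontal offset
# (of the order of the Kinect IR-RGB baseline, a few centimetres).
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
T = [0.025, 0.0, 0.0]
P_rgb = transform_ir_to_rgb([0.1, 0.2, 0.5], R, T)
```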

6.2 3D Points Reprojection. Upper figures: cube and cylinder reprojection without considering distortion; the distance from the Kinect to the objects is z = 0.6 m. Bottom figures: cube and cylinder reprojection considering distortion. Bad calibration?! 23/11/2011 11/20
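The with/without-distortion comparisons above presuppose a lens-distortion model for the kc coefficients. A sketch of the usual OpenCV/Bouguet model applied to normalized image coordinates; that the project's kc_ir and kc_rgb follow exactly this ordering (kc = [k1, k2, p1, p2, k3]) is an assumption:

```python
def distort(x, y, kc):
    """Apply radial and tangential distortion to normalized image
    coordinates (x, y), with kc = [k1, k2, p1, p2, k3] as in OpenCV/Bouguet."""
    k1, k2, p1, p2, k3 = kc
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd
```

With all coefficients zero the mapping is the identity, which is exactly the "without considering distortion" case on the slides.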

6.2 3D Points Reprojection. Transformation from the IR (depth) to the RGB camera reference system: P_rgb = R * P_ir + T, where R, T is the extrinsic mapping between the IR and RGB Kinect cameras, and P_rgb, P_ir are points in the RGB and IR camera reference systems. Upper figures: cube reprojection with calibration parameters from Nicolo; the distance from the Kinect to the cube is z = 0.5 m. Bad calibration?! Bottom figures: cube reprojection with calibration parameters from Alberto. 12/20

6.2 3D Points Reprojection. Camera Calibration and 3D Reconstruction [9] for the so-called pinhole camera model: the scene view is formed by projecting 3D points onto the image plane using a perspective transformation,
s [u, v, 1]^T = A [R, T] [X, Y, Z, 1]^T, with A = K_rgb,
where:
- (X, Y, Z) are the coordinates of a 3D point in the world coordinate space;
- (u, v) are the coordinates of the projection point in pixels;
- A is the matrix of intrinsic parameters (it does not depend on the scene viewed); it contains the principal point (cx, cy), usually at the image center, and the focal lengths (fx, fy) expressed in pixel-related units;
- [R, T] is the matrix of extrinsic parameters.
Bottom figures: cube reprojection with calibration parameters from Alberto. 13/20
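For a point already expressed in the RGB camera frame (i.e. after applying the extrinsic mapping), the projection through A = K_rgb reduces to u = fx*X/Z + cx, v = fy*Y/Z + cy. A minimal sketch; the K values below are typical Kinect-like intrinsics used only for illustration, not the calibrated matrix:

```python
def project(P_cam, K):
    """Pinhole projection of a 3D point in the camera frame to pixel coordinates."""
    X, Y, Z = P_cam
    u = K[0][0] * X / Z + K[0][2]  # fx * X/Z + cx
    v = K[1][1] * Y / Z + K[1][2]  # fy * Y/Z + cy
    return u, v

# Illustrative intrinsics: fx = fy = 525 and a principal point near the
# center of a 640x480 image; real values come from the calibration step.
K = [[525.0, 0.0, 319.5], [0.0, 525.0, 239.5], [0.0, 0.0, 1.0]]
```

A point on the optical axis, e.g. (0, 0, 1), projects to the principal point (319.5, 239.5), which is a quick sanity check on any K matrix.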

6.3 RANSAC fitting of an SQ to the 3D data points, using the Levenberg-Marquardt algorithm to minimize the distance from the SQ to the 3D points. Red points: outliers. Blue points: inliers. Green points: SQ model. 14/20
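The SQ model being fitted here is the standard superquadric implicit (inside-outside) function. A minimal sketch of the textbook form, with semi-axes (a1, a2, a3) and shape exponents (eps1, eps2); the project's exact Levenberg-Marquardt cost (the distance measure derived from F) is not reproduced here:

```python
def sq_F(p, a, e):
    """Superquadric inside-outside function: F < 1 inside, F = 1 on the
    surface, F > 1 outside.  a = (a1, a2, a3), e = (eps1, eps2)."""
    x, y, z = p
    a1, a2, a3 = a
    e1, e2 = e
    xy = abs(x / a1) ** (2.0 / e2) + abs(y / a2) ** (2.0 / e2)
    return xy ** (e2 / e1) + abs(z / a3) ** (2.0 / e1)
```

With eps1 = eps2 = 1 this reduces to an ellipsoid; small exponents give the box-like shapes needed for the cube. The RANSAC inlier/outlier split on the slide corresponds to thresholding the per-point residual of this function.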

6.4 Object structure creation and reprojection onto the image. Reprojection of the lines between the 3D points of the cube vertices to the CCD of the Kinect RGB camera (figure a) and onto the image with the reprojected 3D points (figure b). The vertex positions were obtained in the previous stage of RANSAC cube pose estimation. Red lines: the cube wireframe. Figure a. Figure b. 15/20

9. References
1. Nicolas Burrus. Kinect RGBDemo: calibrate and visualize Kinect output. http://nicolas.burrus.name/, 2011.
2. OpenNI modules. www.openni.org
3. SensorKinect drivers. GitHub: https://github.com/avin2/sensorkinect
4. How-to: Successfully install Kinect on Windows (OpenNI and NITE). Vangos Pterneas blog: http://studentguru.gr/b/vangos/archive/2011/01/20/how-tosuccessfully-install-kinect-windows-openni-nite.aspx
5. Install OpenKinect for Windows 7 and XP. http://kinect.dashhacks.com/kinect-guides/2011/01/09/installopenkinect-windows-7-and-xp
6. Camera Calibration and 3d Reconstruction. OpenCV (Open Source Computer Vision) v2.1 documentation: http://opencv.willowgarage.com/documentation/cpp/camera_calibration_and_3d_reconstruction.html
16/20