PLazeR: A Planar Laser Rangefinder

Robert Ying (ry2242), Derek Xingzhou He (xh2187), Peiqian Li (pl2521), Minh Trang Nguyen (mnn2108)


Overview & Motivation

Detecting the distance between a sensor and objects in a scene is a useful capability for machine perception and robotics, with applications such as modeling 3D objects, finding target range for military purposes, and serving as a measuring tool in place of a tape measure. This project determines the distance to an object using a laser beam. Most rangefinders use either light or sound as their primary medium, and then use triangulation or time of flight to determine distance. We intend to use a planar laser to build a rangefinder that can simultaneously determine the distance to multiple objects in the scene.

As the image below shows, the distance of an object is related to the height of the projected laser line in the camera image. We can detect the laser line at each column of the image, and then use those detections to generate a planar point cloud.

Architecture

The system takes input data from a USB camera and calculates the distance to the object in question. It can be divided into three main components: software, the software/hardware interface, and hardware. User-space software reads from the camera feed, converts the images to the correct format, and transmits them to the FPGA board, which runs the detection algorithm. After the FPGA processes the image to find the laser line at each column, the result is sent back to the software and converted into a calculated distance. A kernel module driver monitors and controls the transmission.

Software

The software will be written mainly in C, and consists of the following steps.

Reading the image data from the camera
The camera image is fed to the software as three 480 x 640 matrices of pixel values, one per color channel (red, green, and blue). These matrices are transferred to the FPGA board, which performs the image processing and laser detection before feeding the laser-line information back to the software. With the processed image and the differential vector from hardware, the distances from the camera and from the laser can then be calculated using trigonometry.

Reading in the data from hardware
After converting the camera image into scene coordinates using the camera intrinsic matrix, the data received from hardware consists of the image profile and the camera parameters. The image profile comprises two vectors: one of size 640 giving the horizontal distance of the laser beam from the y axis, and one of size 480 giving the height of the laser beam above the x axis. The camera parameters are the horizontal distance from the laser to the camera, the distance from the camera to the wall, and the Gaussian kernel mean and variance.
Solving for the angle θ
To calculate θ, we use the image of the laser beam on a wall or other flat surface. The angle is computed using simple linear interpolation.

Calculating the distance
Given the angle θ from the previous step and the distance from the laser to the camera, the distance to the object follows from basic trigonometry.

Hardware/Software Interface

Data exchanged when calculating distance from the image:

    uint8[640*480*3]   three-channel RGB image data
    bool               r/w flag
    uint8              Gaussian filter bandwidth
    uint8              laser threshold value
    uint16[640]        y-axis differentials
    uint16[480]        x-axis differentials
    bool               ready flag

Timing:

    Kernel module: [set r/w][write 640x480x3 bytes][unset r/w]                  [read differentials][unset ready]
    FPGA:                              [gaussian filter][threshold][average][set ready]

Hardware

The primary purpose of the FPGA in this system is to implement the function

    f : I x σ x η -> L

where I is the image space Z+^(640x480), σ is the bandwidth of a Gaussian kernel, and η is the thresholding value. The output L is the pixel-wise displacement between the calibrated laser line and the detected line position, represented as a vector in Z+^640. We implement this function as a two-dimensional convolution, followed by further operations, between the image and a fixed-size k-by-k kernel. In particular, we only need to buffer k^2 data points for all of these operations.

Step 1: Gaussian Blurring
As the camera is fairly low cost, some preprocessing is needed before the data can be analyzed. In particular, there is nontrivial sampling and measurement error, which appears as a speckled noise pattern on the captured image. In hardware, we can efficiently suppress this noise by convolving the image with a Gaussian kernel. The convolution can be implemented as follows (example given with a 3x3 kernel, though we would likely use a 15x15 or larger window):

In essence, we implement k rolling buffers of length k across the image, so that we can raster-scan a k-by-k matrix to convolve with [1]. This is fairly efficient, as only k^2 multiplications are needed; moreover, these multiplications can be done with integral values only, saving floating-point multipliers.

Step 2: Thresholding
The laser is strong enough to saturate the camera, so both the horizontal and vertical lines appear white in the image feed. We can therefore specify a threshold (its value calibrated and provided per run) and threshold every pixel to isolate the line regions. Each pixel is simply compared with the threshold constant, which is stored in DRAM and provided by the software, producing a black-and-white binary value per pixel.

Step 3: Averaging
For each column j of the binary image B, we determine the line position by computing the average row index of the white pixels:

    L_j = (sum_i i * B_ij) / (sum_i B_ij)

This is straightforward to implement with the arithmetic resources of an FPGA. The resulting array is returned to the software for processing.

[1] http://blog.teledynedalsa.com/2012/05/image-filtering-in-fpgas/

Milestones

Milestone 1
- Get the software prototype working
- Design and test the thresholding and averaging hardware

Milestone 2
- Design and test the Gaussian convolution hardware
- Write the kernel module

Milestone 3
- Complete system integration
- Use the system for actual measurements and report the results

References

How to calculate distance:
http://www.cse.unr.edu/~bebis/cs791e/notes/cameraparameters.pdf
http://www.seattlerobotics.org/encoder/200110/vision.htm

Hardware implementation:
http://www.rgshoup.com/prof/pubs/fpga93.pdf