Vehicle Speed Estimation Based On The Image

SETIT 2007
4th International Conference: Sciences of Electronic, Technologies of Information and Telecommunications
March 25-29, 2007, Tunisia

Vehicle Speed Estimation Based On The Image

Gholam Ali Rezai Rad*, Javad Mohamadi**
*Iran, Tehran, University of Science and Technology, Rezai@iust.ac.ir
**Iran, Tehran, University of Science and Technology, Javadm59i@gmail.com

Abstract: In this paper we propose a novel approach for vehicle speed estimation in the frequency domain by extracting the motion blur parameters. Motion blur results when the camera shutter remains open for an extended period of time while there is relative motion between the camera and the object. The blur parameters are the direction and the length of the motion. There is a direct relation between the motion parameters and the vehicle speed; in other words, as the speed of the vehicle increases, the motion length increases as well. Speed estimation is performed using the camera parameters, the imaging geometry and the extracted blur parameters. Aided by the Radon transform, our proposed method improves the accuracy of speed estimation and of motion blur parameter measurement by 30-50 percent.

Key words: Speed Estimation, Motion Blur, Frequency Domain, Radon Transform.

INTRODUCTION

Various methods for speed estimation have been proposed in recent years. All approaches attempt to increase accuracy and decrease the cost of hardware implementation. Speed estimation methods fall into two classes.

First, active methods: the most popular methods use RADAR (Radio Detection and Ranging) and LIDAR (Laser Infrared Detection and Ranging) devices to detect the speed of a vehicle. A RADAR device bounces a radio signal off a moving vehicle, and the reflected signal is picked up by a receiver. The traffic radar receiver then measures the frequency difference between the original and reflected signals and converts it into the speed of the moving vehicle. A LIDAR device times how long it takes a light pulse to travel from the LIDAR gun to the vehicle and back. Based on this information, LIDAR can quickly find the distance between the gun and the vehicle. By making several measurements and comparing the distance the vehicle traveled between measurements, LIDAR can determine the vehicle's speed accurately.

Second, passive methods [3,4,5]: in these methods, speed information is extracted from a sequence of real-time traffic images taken by a passive camera. Moving edges are extracted, and the resulting edge information is processed to obtain quantitative geometric measurements of the vehicles. Active methods are usually more expensive than passive methods.

Motion blur results when the camera shutter remains open for an extended period of time while there is relative motion between the camera and the object. The blur parameters are the direction and the length of the motion, and there is a direct relation between the motion parameters and the vehicle speed. Blur parameter identification methods can be classified into two types: spatial domain and frequency domain [6,7]. In the spatial domain, the Sobel edge detector is first applied to the image, and the motion length is then extracted with an iterative method. Since for linear motion of a scene the blurring effect occurs mainly in the motion direction, the intensity of high-frequency components along this direction is decreased. To extract the motion direction, one uses the fact that the derivative of the image along the motion direction is suppressed more than the derivatives along other directions.
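As an illustration of this spatial-domain idea (not the method proposed in this paper), the following MATLAB sketch estimates the blur direction by searching for the angle with the smallest directional-derivative energy. It assumes the Image Processing Toolbox; the test image, blur length and blur angle are arbitrary choices for the sketch.

```matlab
% Spatial-domain direction estimation sketch: linear motion blur suppresses
% intensity variation along the motion direction, so the candidate angle with
% the lowest derivative energy is taken as the blur direction.
img = im2double(imread('cameraman.tif'));
psf = fspecial('motion', 25, 30);                  % simulated blur: L = 25 px, 30 degrees
blurred = imfilter(img, psf, 'conv', 'circular');

angles = 0:179;
energy = zeros(size(angles));
for k = 1:numel(angles)
    r = imrotate(blurred, -angles(k), 'bilinear', 'crop');  % bring candidate direction onto the x-axis
    c = r(65:192, 65:192);                                   % central crop avoids rotation border artifacts
    d = diff(c, 1, 2);                                       % derivative along the (rotated) candidate direction
    energy(k) = mean(abs(d(:)));
end
[~, idx] = min(energy);
fprintf('Estimated blur direction: %d degrees\n', angles(idx));
```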
In the frequency domain, parallel dark lines appear in the 2D Fourier spectrum of the captured image when blurring occurs, and there is a direct relation between the distance between these lines (respectively, their orientation) in the Fourier spectrum and the motion length (respectively, the motion direction). In other words, as the speed of the vehicle increases, the motion length increases as well.

In this paper we propose a novel approach for vehicle speed estimation that uses the Radon transform to extract the blur parameters in the frequency domain, based on a single image taken by a still camera. Due to the relative motion between the camera and the moving vehicle, motion blur appears in the image because of the finite exposure time.
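The following MATLAB sketch reproduces this observation (assuming the Image Processing Toolbox; the test image, blur length and angle are arbitrary choices): a synthetically motion-blurred image is generated and its log-magnitude Fourier spectrum is displayed, in which the parallel dark lines are visible.

```matlab
% Visualizing the parallel dark lines produced by uniform linear motion blur.
img = im2double(imread('cameraman.tif'));
psf = fspecial('motion', 30, 45);                    % L = 30 px, phi = 45 degrees
blurred = imfilter(img, psf, 'conv', 'circular');

spec = log(1 + abs(fftshift(fft2(blurred))));        % log-magnitude spectrum, DC centered
figure; imshow(blurred);           title('Motion-blurred image');
figure; imshow(mat2gray(spec));    title('Log Fourier spectrum: note the parallel dark lines');
```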

For any fixed shutter speed, the moving distance of the vehicle is proportional to the amount of blur caused by the imaging process. Thus, if the parameters of the motion blur (the length and orientation of the motion blur) can be identified, it is possible to recover the speed of the moving vehicle from the imaging geometry. Aided by the Radon transform, our proposed method improves the accuracy of speed estimation and of motion blur parameter measurement by 30-50 percent over existing methods such as the method of Huei-Yung Lin and Kun-Jhih Li.

1. Mathematical Model of Linear Motion Blurring and its Attributes

The commonly used linear model for image blur is given by

    g(x, y) = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} h(x, \alpha, y, \beta)\, f(\alpha, \beta)\, d\alpha\, d\beta    (1)

where h(x, α, y, β) is a linear PSF (point spread function), f(x, y) is the ideal image and g(x, y) is the observed image. If we consider the spatially invariant case of uniform linear motion, the PSF h(x, y) is given by

    h(x, y) = \begin{cases} 1/L & \text{if } 0 \le \sqrt{x^2 + y^2} \le L \text{ and } y/x = \tan\varphi \\ 0 & \text{otherwise} \end{cases}    (2)

As seen in equation (2), motion blur depends on two parameters: the motion length L and the motion direction φ. The frequency response of h is a sinc function; consequently, dominant parallel lines corresponding to values very close to zero appear in its frequency response. Figure (1) shows an image affected by motion blur together with its Fourier spectrum; the dominant parallel dark lines are evident in the Fourier spectrum in figure (1-c).

Figure 1: (a) Original image; (b) image with φ = 45°, L = 30; (c) its Fourier spectrum; (d) image with φ = 60°, L = 30; (e) its Fourier spectrum.

2. Motion Blur Parameters Estimation

To use a motion-blurred image for vehicle speed estimation, the blur parameters, namely the moving direction of the vehicle and the length of the motion blur, must be extracted. These blur parameters are used not only for vehicle speed detection but also for image deblurring.

2.1. Motion Direction Estimation

In most cases the moving directions of vehicles are parallel to the horizontal image scanlines. However, if a vehicle is moving uphill or downhill, so that the motion is not along the horizontal direction in the image, then the direction of the motion blur has to be identified. To find the motion direction, we take the 2D Fourier spectrum of the motion-blurred image. A careful analysis of the Fourier spectrum shows that there is a relation between the motion direction and the direction of the dark lines: the angle between the dark lines and the vertical axis of the Fourier spectrum is proportional to the direction of motion φ. The direction of these parallel lines is found by applying the Radon transform to the Fourier spectrum. The Radon transform is somewhat involved, but it can be applied to the Fourier spectrum directly as an image, so there is no need to determine edge points. Figure (2) shows an example of the Fourier spectrum of an image and the corresponding Radon transform. From the analysis of the result obtained with the Radon transform (equation 3), it is concluded that φ = θ (in degrees):

    R(\rho, \theta) = \int_{-\infty}^{+\infty} f(\rho\cos\theta - z\sin\theta,\ \rho\sin\theta + z\cos\theta)\, dz    (3)

This equality is not image dependent, because the direction of the parallel dark lines depends only on the PSF.
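A minimal MATLAB sketch of this direction-estimation step follows (assuming the Image Processing Toolbox; the blur parameters and the windowing step are illustrative choices, not values taken from the paper). The Radon transform of the log spectrum is largest along the projection angle aligned with the bright central ridge between the dark lines, which gives the blur direction.

```matlab
% Direction estimation via the Radon transform of the Fourier spectrum.
img = im2double(imread('cameraman.tif'));
psf = fspecial('motion', 30, 45);                          % simulated blur: L = 30 px, phi = 45 degrees
blurred = imfilter(img, psf, 'conv', 'circular');

% A separable Hann window reduces the axis-aligned cross that image borders
% leave in the spectrum and that could otherwise bias the estimate.
n = size(blurred, 1);
w = 0.5 - 0.5 * cos(2 * pi * (0:n-1)' / (n - 1));
spec = log(1 + abs(fftshift(fft2(blurred .* (w * w')))));

theta = 0:179;
R = radon(spec, theta);                 % Radon transform of the spectrum image
[~, idx] = max(max(R, [], 1));          % projection angle with the strongest response
fprintf('Estimated blur direction: %d degrees\n', theta(idx));
```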

Figure 2: (a), (b) illustrate the Radon transform with L = 40 and φ = 0 and 90 degrees, respectively.

2.2. Motion Length Estimation

To estimate the motion length of the vehicle, i.e. the length of the motion blur, the blurred image is first rectified according to the motion direction of the vehicle to create a new blurred image with a horizontal motion direction. Rectifying the image amounts to rotating it by (-θ) degrees, where θ is the motion direction of the vehicle. The frequency response of the degradation function in the horizontal direction is given by equation (4):

    H(u) = \frac{\sin(\pi u L / N)}{L \sin(\pi u / N)}    (4)

One way to find L is to solve the equation H(u) = 0 for a given u. However, in the proposed method L is found without the need to solve H(u) = 0 [6]. The values of u for which H(u) = 0 correspond to the straight dark lines caused by the motion blur. Looking carefully at these dark lines in the frequency responses of several motion-blurred images shows that increasing the length of the motion blur creates more lines in the frequency response, while decreasing the length of the motion blur decreases the number of lines. Figure (3) illustrates this fact.

Figure 3: Dark lines in the Fourier spectrum: (a) L = 20, d = 3; (b) L = 30, d = 1.

Therefore

    L \propto \frac{1}{d}    (5)

where L is the motion length and d is the distance between the dark lines. At this stage we can calculate the distance between the lines. However, we cannot obtain the motion length simply by measuring the distance between the lines in the Fourier spectrum: although the motion length L and the line distance d are related, they are not equal, and a mathematical relation between them must be established. This relation is obtained as follows. Several images with specific motion lengths are generated, and the line distance is then measured with the proposed algorithm. For each image the relation between L and d is plotted in a d-L space. Then a curve-fitting algorithm is used to fit a polynomial L(d) of degree n to the data. Figure (4) plots the L-d curve of equation (6), which shows the mathematical relation between the motion length L and the line distance d:

    L(d) = p_1 d^5 + p_2 d^4 + p_3 d^3 + p_4 d^2 + p_5 d + p_6    (6)

Figure 4: L-d curve.

The constant values p_i are given in table (1). It is important to note that equation (6) is image independent, because the positions of the parallel dark lines in the frequency response depend only on the zero points of the frequency response of the degradation function introduced by convolving the PSF with the input image.

    p_1 = -3.363e-7    p_2 = 9.8061e-5    p_3 = -0.0103
    p_4 = 0.5105       p_5 = -1.5511      p_6 = 146.0971

Table 1: The constant values for equation (6).
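A minimal MATLAB sketch of this calibration procedure follows. Assumptions: the Image Processing Toolbox and a recent MATLAB release (islocalmin/smoothdata); an arbitrary test image and blur lengths; horizontal blur only, so the dark lines are vertical and their spacing can be read from a 1-D profile of the spectrum; and a degree-2 polynomial is fitted instead of the degree-5 polynomial of equation (6), since only a handful of calibration points are generated here.

```matlab
% Calibrating the relation between blur length L and dark-line spacing d.
img = im2double(imread('cameraman.tif'));
lengths = 15:5:40;                           % known blur lengths used for calibration
spacing = zeros(size(lengths));
for k = 1:numel(lengths)
    psf = fspecial('motion', lengths(k), 0);                 % horizontal blur
    b = imfilter(img, psf, 'conv', 'circular');
    profile = mean(log(1 + abs(fftshift(fft2(b)))), 1);      % collapse the spectrum over rows
    minima = find(islocalmin(smoothdata(profile, 'movmean', 3)));
    spacing(k) = median(diff(minima));                       % measured spacing d between dark lines
end
p = polyfit(spacing, lengths, 2);            % fit L as a polynomial in d
dMeasured = spacing(end);                    % pretend this spacing came from a new image
fprintf('d = %.1f px  ->  estimated L = %.1f px (true L = %d px)\n', ...
        dMeasured, polyval(p, dMeasured), lengths(end));
```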

3. Vehicle Speed Estimation and Experimental Results

The proposed method for vehicle speed estimation is based on a pinhole camera model. We consider the case in which the moving vehicle travels along a direction perpendicular to the optical axis of the camera. As shown in figure (5), the displacement of a moving object can be computed using similar triangles for a fixed camera exposure time.

Figure 5: Pinhole camera model for speed estimation.

We use the similarity of the two triangles AOB and CPB to relate the distance D travelled by the object during a period of time to the blur length L (in pixels). The relation is given by equation (7):

    D = \frac{L \cdot S \cdot Z}{f}    (7)

where Z is the distance from the camera to the moving vehicle and f is the focal length of the camera. If the shutter speed of the camera is T seconds and the pixel size of the CCD in the horizontal direction is S, then the speed V of the moving vehicle can be calculated from equation (8):

    V = \frac{D}{T} = \frac{Z \cdot L \cdot S}{T \cdot f}    (8)

In the above equation, S and f are internal parameters of the camera. S should be taken from the manufacturer's data sheet, and f can be obtained either from the camera settings or from camera calibration. T is given by the camera settings. The distance Z between the moving vehicle and the camera is a constant and should be measured physically. Thus the only unknown parameter is L, which has to be estimated to complete the speed estimation of the moving vehicle. According to equation (8), the correctness of the speed measurement depends on all five parameters.

4. Experimental Results

In the first experiment, an image of a static vehicle in front of a background is used. As shown in figure (6-a), synthetic motion-blurred images with {15, 20, 25, 30, 35, 40} pixels of horizontal blur length are created using MATLAB. Then, with the aid of the proposed algorithm in the frequency domain, the motion lengths are calculated as {15, 20, 29, 36, 41}. The output data show a 3.3% error in the motion lengths, equal to a ±1 pixel error in the worst case.

Figure 6: (a) Vehicle image blurred with L = 30; (b) restored image using the Wiener filter.

The results of calculating the motion length in the frequency domain are better than those obtained in the spatial domain. The deblurred images are restored using the Wiener filter with the best-focused result. If the image restoration is applied to the whole image, some ringing effects [8] appear when the object region is deblurred. The restored image with L = 30 is shown in figure (6-b).

The second experiment is performed for highway vehicle speed estimation. Figure (7) shows a motion-blurred image recorded by a CANON digital camera. The actual speed of the vehicle is approximately 40 km/hr. The camera parameters for the experiment are: focal length 5.8 mm, pixel size 0.004 mm, shutter speed 1/100 seconds, distance from the camera to the vehicle 5.30 m, and motion blur length 29 pixels, from left to right in the scene. Thus the speed of the vehicle is estimated as approximately 38.5 km/hr, which corresponds to less than 3.75% error.

Figure 7: (a) Moving vehicle; (b) restored image.
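As a check on equation (8), the following MATLAB snippet plugs in the camera parameters reported for the second experiment (all values taken from the text above; the result of roughly 38 km/h matches the reported estimate of about 38.5 km/hr up to rounding).

```matlab
% Worked example of equation (8): V = Z * L * S / (T * f).
L = 29;           % motion blur length, pixels
S = 0.004e-3;     % pixel size, m (0.004 mm)
Z = 5.30;         % camera-to-vehicle distance, m
f = 5.8e-3;       % focal length, m (5.8 mm)
T = 1/100;        % shutter (exposure) time, s

V = Z * L * S / (T * f);                       % speed in m/s
fprintf('Estimated speed: %.1f km/h\n', V * 3.6);
% For restoration, the toolbox Wiener filter deconvwnr(blurred, psf, nsr)
% can be used once L and phi are known, as in the first experiment.
```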

5. Comparison with Other Methods and Their Accuracy

Aided by the Radon transform, our proposed method improves the accuracy of speed estimation and of motion blur parameter measurement by 30-50 percent over existing methods such as the Huei-Yung Lin and Kun-Jhih Li method [1, 2]. Table 2 presents a typical comparison between our method, the Huei-Yung Lin and Kun-Jhih Li method [1, 2], and the speed estimation method using uncalibrated cameras [5].

    Errors \ Method                            Uncalibrated Camera   Huei-Yung Lin & Kun-Jhih Li   Proposed
    Motion length estimation error (pixel)     -                     2                             1.65
    Speed estimation, minimum error (%)        15                    7                             5
    Speed estimation, maximum error (%)        25                    15                            8.4

Table 2: A typical comparison between the proposed method and other methods.

6. Conclusion

In this paper a model of motion blur for a moving vehicle in front of a still background is proposed. The Radon transform is used to extract the motion blur parameters from a single image taken by a CCD camera. Aided by the Radon transform, our method improves the accuracy of speed estimation and of motion blur parameter measurement by 30-50 percent over existing methods such as that of Huei-Yung Lin and Kun-Jhih Li [1, 2]. The simulation results show that the error of this method is half that of the Huei-Yung Lin and Kun-Jhih Li method.

REFERENCES

[1] Huei-Yung Lin and Kun-Jhih Li, "Motion Blur Removal and its Application to Vehicle Speed Detection," IEEE International Conference on Image Processing (ICIP), Vol. 5, 24-27 Oct. 2004.

[2] Huei-Yung Lin and Chia-Hong Chang, "Automatic Speed Measurement of Spherical Objects Using an Off-the-Shelf Digital Camera," Proceedings of the 2005 IEEE International Conference on Mechatronics, July 10-12, 2005, Taipei, Taiwan.

[3] D. J. Dailey and L. Li, "An Algorithm to Estimate Vehicle Speed Using Uncalibrated Cameras," IEEE International Conference on Intelligent Transportation Systems (ITSC'99), 5-8 October 1999, Tokyo, Japan.

[4] Todd N. Schoepflin and Daniel J. Dailey, "Algorithms for Calibrating Roadside Traffic Cameras and Estimating Mean Vehicle Speed," IEEE Intelligent Vehicles Symposium, 14-17 June 2004, Parma, Italy.

[5] Mei Yu, Gangyi Jiang, and Bokang Yu, "An Integrative Method for Video-Based Traffic Parameter Extraction in ITS," IEEE Asia-Pacific Conference, 4-6 Dec. 2001.

[6] Mohsen Ebrahimi Moghaddam and Mansour Jamzad, "Finding Point Spread Function of Motion Blur Using Radon Transform and Modeling the Motion Length," Proceedings of the Fourth IEEE International Symposium on Signal Processing and Information Technology, 18-21 Dec. 2004.

[7] Mohsen Ebrahimi Moghaddam and Mansour Jamzad, "Motion Blur Identification in Noisy Images Using Fuzzy Sets," IEEE International Symposium on Signal Processing and Information Technology, 2005.

[8] Stanley J. Reeves, "Fast Image Restoration Without Boundary Artifacts," IEEE Transactions on Image Processing, Vol. 14, No. 10, October 2005.