Determining Crash Data Using Camera Matching Photogrammetric Technique


SAE TECHNICAL PAPER SERIES

2001-01-3313

Determining Crash Data Using Camera Matching Photogrammetric Technique

Stephen Fenton, William Neale, Nathan Rose and Christopher Hughes
Knott Laboratory, Inc.

Reprinted From: ATTCE 2001 Proceedings, Volume 1: Safety (P-367)

Automotive & Transportation Technology Congress & Exhibition
October 1-3, 2001
Barcelona, Spain

400 Commonwealth Drive, Warrendale, PA 15096-0001 U.S.A.
Tel: (724) 776-4841  Fax: (724) 776-5760

The appearance of this ISSN code at the bottom of this page indicates SAE's consent that copies of the paper may be made for personal or internal use of specific clients. This consent is given on the condition, however, that the copier pay an $8.00 per article copy fee through the Copyright Clearance Center, Inc., Operations Center, 222 Rosewood Drive, Danvers, MA 01923, for copying beyond that permitted by Sections 107 or 108 of the U.S. Copyright Law. This consent does not extend to other kinds of copying, such as copying for general distribution, for advertising or promotional purposes, for creating new collective works, or for resale.

SAE routinely stocks printed papers for a period of three years following date of publication. Direct your orders to the SAE Customer Sales and Satisfaction Department. Quantity reprint rates can be obtained from the Customer Sales and Satisfaction Department. To request permission to reprint a technical paper or permission to use copyrighted SAE publications in other works, contact the SAE Publications Group.

All SAE papers, standards, and selected books are abstracted and indexed in the Global Mobility Database. No part of this publication may be reproduced in any form, in an electronic retrieval system or otherwise, without the prior written permission of the publisher.

ISSN 0148-7191
Copyright 2001 SAE International and Messe Düsseldorf.

Positions and opinions advanced in this paper are those of the author(s) and not necessarily those of SAE. The author is solely responsible for the content of the paper. A process is available by which discussions will be printed with the paper if it is published in SAE Transactions. For permission to publish this paper in full or in part, contact the SAE Publications Group.

Persons wishing to submit papers to be considered for presentation or publication through SAE should send the manuscript or a 300-word abstract of a proposed manuscript to: Secretary, Engineering Meetings Board, SAE.

Printed in USA

2001-01-3313

Determining Crash Data Using Camera Matching Photogrammetric Technique

Stephen Fenton, William Neale, Nathan Rose and Christopher Hughes
Knott Laboratory, Inc.

Copyright 2001 SAE International and Messe Düsseldorf.

ABSTRACT

Accident scene photographs contain important information that can be useful in determining how accidents happened. However, dimensions are difficult to gather from photographs. The size of an object in a photograph depends on how far the object is located from the camera: an object in the background looks smaller, and will measure smaller, than an object of the same size in the foreground. This phenomenon is called perspective distortion. Photogrammetry was introduced in the late 1800s as a tool to compensate for perspective distortion and assist in gathering dimensions from photographs. One of the early techniques was to create a transparent miniature of a photograph and place the miniature in the view screen of a camera. The camera was then taken to the scene and moved to the correct position such that the image in the scene matched the image in the view screen. Today, using computer modeling software, a scene can be created in a computer model that matches the actual photograph. Using a technique called camera matching, the camera in the computer can be adjusted to match the photograph. Once properly matched, dimensions within the photograph can be gathered. This technique is useful for gathering dimensional data from crash scene photographs, such as the point of impact and the point of rest of crash vehicles. Once the crash scene dimensions are determined, the accident can be reconstructed using the principles of conservation of momentum and energy.

INTRODUCTION

In the late 1980s, a photogrammetric technique called the reverse projection method was introduced [1, 2]. This method was based on re-establishing the original camera viewpoint by returning to the scene with a transparency placed in a camera's view screen and viewing the scene through the transparency. A Nikon F-3 camera with a detachable view prism was designed so that the transparency could be placed in the view screen. At the scene, one person would position the camera and adjust the focal length so that the transparency matched the scene. A second person would mark the positions of the lost features, such as the tire marks and positions of the vehicles. Once these features were determined, they would be surveyed to determine the necessary crash data. This method of reverse projection was tedious and time-consuming, making it difficult to accomplish at scenes with heavy traffic. Due to the high cost of the camera and the time and safety issues, this process was not often used.

A safer and more cost-effective solution has arisen using computer modeling software. Instead of creating a transparency of the scene, the scene can be recreated using computer modeling software. A virtual camera can be created in the computer model. The scene photograph can be digitized and placed in the virtual camera's view port. In the computer model, the camera can be adjusted so that the modeled scene matches the background image. Once the camera is properly matched, the necessary dimensions can be identified and measured.

DISCUSSION

PROCEDURE

Below is an outline of the steps involved in gathering dimensions from photographs using the camera matching photogrammetric technique.

1. Create a digital model of the scene.
Gather dimensional data of known objects, either by obtaining an aerial photograph of the scene or by performing a scene inspection. Using the scene data, create a three-dimensional digital computer model of the accident scene.

2. Import scene photographs into the computer.

Digitize the photographs. Calibrate the photographs as background images in the computer-modeled scene.

3. Camera match the digital model to the background images.

Create a camera in the computer model for each background image. A guideline for placement is to consider the height of the photographer: place the camera approximately five feet five inches above ground level. Start with a 50 mm lens. Based on the observed discrepancies between the perspective of the photograph and the perspective of the computer scene, adjust the focal length and vanishing point of the camera accordingly. If the objects look stretched when matched to the photograph, increase the focal length and move the vanishing point away. If the objects look squished, decrease the focal length and bring the vanishing point closer.

4. Test the accuracy of the camera position.

Measure other objects in the scene and compare their dimensions to the known dimensions of those objects.

CASE STUDY

To explain the process in greater detail, these authors created an accident scene and photographed the vehicles and the skid marks. The locations of the vehicles and skid marks were surveyed. Our objective was to determine the distances between the skid marks and vehicles using the camera matching photogrammetric technique and compare the results to the survey.

Figure 1. Original Photograph

The first step was to gather dimensions from the scene so that a 3D model of the accident scene could be made. This can be done either by visiting the scene or, in this case, by gathering the data from an aerial photograph. The locations of the lane lines and curbs were measured and documented from the aerial photograph. This information was used to create a three-dimensional model of the accident site in AutoCAD [3]. The data was then imported into a 3D modeling program called 3DStudio Max [4].

Figure 2. CAD Model over Aerial Photo

A virtual camera was then placed in the computer-modeled scene within 3DStudio. The original photograph was digitized and placed as the background in the camera view port. The camera was set at a height of five feet five inches off the roadway surface, and the camera's focal length was set at 50 mm. The camera's location was moved throughout the computer model while attempting to match the three-dimensional model to the photograph in the background. As mentioned previously, there were three dimensions in which the camera could be moved (x, y, z); however, the z dimension was known relatively well because we had a good idea of the photographer's eye height. The camera could also be rotated about the x, y, and z axes. The position and rotational values were relatively easy to adjust; the camera's focal length, however, was the most difficult parameter to adjust correctly.

When selecting the virtual camera settings, it is recommended that a 50 mm lens be selected initially. A 50 mm lens most reasonably represents what the human eye sees. However, the camera that took the photograph may often have had a zoom lens; if you are able to contact the photographer to determine the setting, this can save time. Through trial and error, you will notice that as the camera's focal length is changed from 50 mm to 28 mm, objects in the scene become distorted. In this case, the lane stripes became shorter and did not match up with the lane stripes in the photograph in the background. As the focal length was changed from 50 mm to 80 mm, the lane lines became longer and still did not match up correctly. After modifying the focal length and adjusting the camera's position and rotation, the computer model eventually matched the photographic background. After some practice, this process became easier.
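The trial-and-error adjustment described above can also be reasoned about numerically. The sketch below is only an illustration of the underlying pinhole geometry, not the authors' interactive 3DStudio Max workflow: known 3D scene points are projected into image coordinates as a function of camera position, rotation, and focal length, and a least-squares solver adjusts those parameters until the projections line up with pixel locations picked from the photograph. It assumes numpy and scipy are available, and every image dimension, point coordinate, and parameter value is a hypothetical placeholder.

```python
# Minimal camera-matching sketch (illustrative only; all data are placeholders).
import numpy as np
from scipy.optimize import least_squares

IMG_W, IMG_H = 1600, 1200   # digitized photograph size in pixels (assumed)
FILM_W_MM = 36.0            # 35 mm film frame width; relates focal length to pixels

def rotation(rx, ry, rz):
    """Rotation matrix from Euler angles (radians) about the x, y, and z axes."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(params, pts):
    """Project 3D scene points (feet, z up) to pixels.

    With zero rotation the camera sits at (x, y, z) and looks down the world +x axis,
    so objects farther along +x project smaller (perspective distortion).
    """
    x, y, z, rx, ry, rz, focal_mm = params
    cam = rotation(rx, ry, rz).T @ (pts - np.array([x, y, z])).T
    depth = np.maximum(cam[0], 1e-6)        # guard against stepping behind the scene
    right, up = cam[1], cam[2]
    f_px = focal_mm / FILM_W_MM * IMG_W     # focal length expressed in pixels
    u = IMG_W / 2 + f_px * right / depth
    v = IMG_H / 2 - f_px * up / depth
    return np.column_stack([u, v])

# Known features from the scene model: lane-stripe ends on the ground plane plus two
# elevated points (e.g., tops of sign posts) to stabilize the solution. Placeholders.
scene_pts = np.array([[0, -6, 0], [10, -6, 0], [30, -6, 0], [0, 6, 0],
                      [10, 6, 0], [30, 6, 0], [15, 10, 7], [32, -10, 7]], float)

# Stand-in for pixel locations picked off the photograph: synthesized here from a
# hidden "true" camera so the sketch is self-contained and runnable.
true_camera = np.array([-60.0, 2.0, 5.4, 0.0, 0.04, 0.10, 35.0])
photo_px = project(true_camera, scene_pts)

def residuals(params):
    return (project(params, scene_pts) - photo_px).ravel()

# Start from the paper's guidance: camera at eye height (about 5.4 ft) with a 50 mm lens.
initial = np.array([-40.0, 0.0, 5.4, 0.0, 0.0, 0.0, 50.0])
fit = least_squares(residuals, initial)

print("fitted camera [x, y, z, rx, ry, rz, focal mm]:", np.round(fit.x, 2))
print("true camera   [x, y, z, rx, ry, rz, focal mm]:", true_camera)
```

In practice the correspondences would come from hand-picked features such as lane-stripe corners and curb lines, and the recovered camera would still be checked against other known dimensions in the scene, as step 4 of the procedure describes.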

Figure 3. CAD Model Matched to Photograph

The next step in the process was to import scaled models of the vehicles into the scene and to position the models so that they matched the photograph in the background.

Figure 4. CAD Model with Vehicles Added

Boxes with the same exterior dimensions (length, width and height) as the vehicles were used to represent them. The vehicles were positioned in their correct locations at the points of rest. The tire marks were traced on the roadway so that they matched the photograph in the background. Once the vehicles and tire marks were recreated, the scene could be viewed from the top to determine distances.

Figure 5. Top View of CAD Model

In this case, we were interested in the length of the skid marks and the locations of the vehicles at the points of impact and rest. The photograph clearly showed the points of rest of the vehicles, and the points of impact were determined by making copies of the vehicles and placing them at the end of the tire marks. The red Honda was positioned so that its front tires lined up with the end of the skid marks; this was the red Honda's point of impact position. The black Audi was placed so that the damage to its right side matched the front of the red Honda; this was the black Audi's point of impact position.

Figure 6. Top View with Vehicles Added at Point of Impact

We identified the start of the right skid mark as Pt. A and the end as Pt. B. We identified the start of the left skid mark as Pt. C and the end as Pt. D. The black Audi's left rear tire location was identified as Pt. E and its front left tire as Pt. F. The red Honda's right rear tire was identified as Pt. G, and its front right tire as Pt. H. A dimensional analysis was then performed comparing the actual dimensions gathered in the field to the dimensions determined in the camera matching analysis.
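That comparison reduces to simple arithmetic. The short Python sketch below, offered only as an illustration, recomputes the percent differences for a few of the point pairs reported in the table that follows and shows how a distance would be pulled from top-view coordinates; the two coordinates at the end are hypothetical placeholders, not the paper's data.

```python
# Percent-difference check for the camera-matching results. The paired distances are a
# subset of those reported in the table below (units as reported in the paper).
import math

# point pair: (field-surveyed distance, camera-matched distance)
pairs = {
    "A-B": (59.76, 61.56),
    "A-C": (13.64, 12.91),
    "B-D": (5.13, 5.42),
    "E-F": (8.74, 9.14),
    "G-H": (8.63, 9.34),
}

def percent_difference(field, matched):
    """Absolute difference expressed as a percentage of the field (reference) value."""
    return abs(matched - field) / field * 100.0

diffs = {pair: percent_difference(f, m) for pair, (f, m) in pairs.items()}
for pair, d in diffs.items():
    print(f"{pair}: {d:.2f}%")
print(f"average over this subset: {sum(diffs.values()) / len(diffs):.2f}%")

# In the matched model itself, each camera-matched distance comes from the top-view
# (x, y) coordinates of the identified points, e.g. for two hypothetical points:
pt_a, pt_b = (0.0, 0.0), (59.5, 14.8)                      # placeholder coordinates
print(f"A-B from top view: {math.dist(pt_a, pt_b):.2f}")   # Euclidean distance
```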

These authors found that the dimensions determined through the camera matching process differed from the field dimensions by an average of 2.12%, as summarized in the table below.

Point ID   Field Dimension   Camera Matched Dimension   Percent Difference
A-B        59.76             61.56                      3.01%
A-C        13.64             12.91                      5.35%
A-D        60.12             61.42                      2.16%
A-E        80.58             83.02                      3.03%
A-F        77.08             78.47                      1.80%
A-G        124.09            123.38                     0.57%
A-H        128.32            128.96                     0.50%
B-C        72.63             73.86                      1.69%
B-D        5.13              5.42                       5.65%
B-E        47.86             48.27                      0.86%
B-F        39.54             39.35                      0.48%
B-G        64.63             61.98                      4.10%
B-H        69.51             68.11                      2.01%
C-D        72.61             73.38                      1.06%
C-E        93.83             95.49                      1.77%
C-F        90.58             91.19                      0.67%
C-G        137.17            135.82                     0.98%
C-H        141.60            141.57                     0.02%
D-E        52.86             53.72                      1.63%
D-F        44.49             44.74                      0.56%
D-G        65.35             63.14                      3.38%
D-H        70.75             69.80                      1.34%
E-F        8.74              9.14                       4.58%
E-G        70.39             69.33                      1.51%
E-H        69.43             69.16                      0.39%
F-G        64.73             63.56                      1.81%
F-H        64.67             64.81                      0.22%
G-H        8.63              9.34                       8.23%
Average                                                 2.12%

CONCLUSION

This camera matching photogrammetric technique enables dimensions to be gathered from photographs quickly and safely using typical computer modeling software. The process provides a very descriptive and compelling visual record that can be used to gather important crash data. Based on the case study, the results are well within the levels of accuracy that make this process useful, although the accuracy depends on the ability of the user to place the camera in the correct position with the correct focal length. The same limitation is also inherent in the reverse projection method.

REFERENCES

1. Woolley, R., White, K., Asay, A., and Bready, J.; "Determination of Vehicle Crush from Two Photographs and the Use of 3D Displacement Vectors in Accident Reconstruction"; Society of Automotive Engineers, Paper 910118, 1991.
2. Smith, G., and Allsop, D.; "A Case Comparison of Single-Image Photogrammetry Methods"; Society of Automotive Engineers, Paper 890737, 1989.
3. AutoCAD 2000, Autodesk, San Rafael, California.
4. 3DStudio Max, Discreet, a division of Autodesk, San Rafael, California.

CONTACT

The authors can be reached at the following address:

Knott Laboratory, Inc.
7185 South Tucson Way
Englewood, CO 80112 USA
Phone: (303) 925-1900
Fax: (303) 925-1901
Email: sfenton@knottlab.com
http://www.knottlab.com/