
GMAT Thesis B
UNSW School of Surveying and Spatial Information Systems

Traversing the UNSW campus using Terrestrial Photogrammetry

Author: Jarrod Braybon z j.braybon@student.unsw.edu.au
Supervisor: Dr. Bruce Harvey
Co-supervisors: Yincai Zhao and Professor John Trinder
Date: October 21st 2011

Declaration of Academic Integrity

Abstract

This thesis presents the findings of a comparison between the results presented by Gabriel Scarmana at the 2010 Fédération Internationale des Géomètres (FIG) conference and the results of an independent test. It is to be determined whether Scarmana's presented results of using close range photogrammetry to traverse around buildings can be successfully replicated by an inexperienced user. Experimental procedures were kept constant where possible to maintain consistency, and the same software program, Photomodeler Pro, was used for processing. One change to the project parameters was an increase in the quality of the camera; it was anticipated this would improve positional accuracy. Over a distance of 140 m, 52 photographs were taken and 284 reference points were identified and processed. Upon completion of the traverse the largest positional error was calculated to be 0.651 m from the coordinates measured using traditional surveying methods. This error occurred at the furthest point from the origin. The traverse was reprocessed as an incomplete loop, with the positional errors increasing to over 2 m. From this it was determined that a closed loop provides considerably more accurate positional results. The 0.651 m positional error over 140 m is significantly better than that suggested by Scarmana (1 m error for every 150 m of traverse). This improvement in accuracy is believed to be due to the higher quality camera used in this project. From the results obtained in this thesis it can be concluded that Scarmana's results can be reproduced by an inexperienced user to a similar standard. Several areas of potential improvement to the method were identified, including: the use of portable targets; investigating the effect of angle geometry on positional accuracy; and the use of more control points to improve three-dimensional accuracy. With improvements to accuracy, photogrammetry may become a useful alternative surveying technique in the future.

Table of Contents

Abstract
List of Figures
List of Tables
Acknowledgements
1. Introduction
2. Background
   2.1 Photogrammetry
       Close Range Photogrammetry
       Photogrammetry Process
       Factors affecting Photogrammetry
       Scaling Photogrammetry
   2.2 FIG paper
       Location
       Camera and Software
       Process
       Results
   2.3 Camera Calibration
   2.4 Camera Parameters
       Principal Distance or Focal Length
       Principal Point
       Indicated Principal Point - Xp Yp
       Radial Distortions - K1 K2 K3
       Decentring Distortions - P1 P2
3. Camera Calibration Project
   Camera
   Camera Calculations
       Field Of View Calculation
       Pixel Size Calculation
       View Angle Calculation
   Photomodeler Pro Calibration
   Camera Calibration Results
       Photomodeler Calibration

   3.3.3 Calibration Problems
   Image Acquisition
   Camera Calibration Results using Photomodeler Pro
   Photomodeler Comparison
   iWitness Camera Calibration
   iWitness Results and Comparisons
Field work
   Trial Location
   Field work
   Processing in Photomodeler
   Coordinate system
   Total Station Check
Results and Analysis
   Full Loop
       Point accuracies
       AutoCAD Comparisons
       Point Residuals
       Point Angles
   Incomplete Loop
       Point accuracies
       AutoCAD Comparisons
       Point Residuals
   Comparisons
Conclusions
References
Bibliography
Appendix
   Camera Features
   FOV calculation
   View angle calculation

List of Figures

Figure 1 - Single point and Multipoint Triangulation
Figure 2 - Multiple POIs from multiple camera images
Figure 3 - Analogue Photogrammetry System
Figure 4 - Digital Photogrammetry System
Figure 5 - Factors influencing accuracy of photogrammetric measurements
Figure 6 - Gabriel Scarmana's camera projections
Figure 7 - Results obtained by Scarmana
Figure 8 - Elements of a lens system
Figure 9 - Radial Distortions
Figure 10 - Misalignment of the components of a lens system
Figure 11 - Decentring Distortion values
Figure 12 - Referencing Tutorial in Photomodeler
Figure 13 - Photomodeler calibration grid and camera locations
Figure 14 - Image used for Photomodeler calibration
Figure 15 - Residuals produced by Photomodeler calibration
Figure 16 - Coded targets and layout for iWitness calibration
Figure 17 - Trial site
Figure 18 - The Hut Dance Studio
Figure 19 - Displayed photograph values
Figure 20 - Camera setup positions
Figure 21 - Epipolar lines intersection
Figure 22 - Processing results
Figure 23 - Control point coordinates
Figure 24 - Control network
Figure 25 - Plotted error bars for easting and northing
Figure 26 - Easting Errors
Figure 27 - Northing Errors
Figure 28 - Height Errors
Figure 29 - Incomplete loop
Figure 30 - Plotted easting and northing error bars
Figure 31 - ENH errors for incomplete loop

List of Tables

Table 1 - Calibration Results
Table 2 - Results from Photomodeler calibration
Table 3 - Result comparisons between two Photomodeler calibrations of the same camera
Table 4 - Results from calibration of the same camera using iWitness and Photomodeler
Table 5 - Control coordinates
Table 6 - Positional precisions
Table 7 - Coordinate Comparison
Table 8 - Top 5 worst residuals
Table 9 - Top 5 worst angles
Table 10 - Precision Comparisons
Table 11 - AutoCAD Comparison
Table 12 - Incomplete loop residuals

Acknowledgements

I would like to thank my supervisor Dr Bruce Harvey for all of his help and time, as well as providing guidance for the duration of this thesis. I would also like to thank Professor John Trinder and Yincai Zhao for their considerable assistance as co-supervisors. A special thank you to Paul Wigmore for his assistance as a field hand when completing the field work component of this thesis.

1. Introduction

At the 2010 Fédération Internationale des Géomètres (FIG) conference in Sydney, Gabriel Scarmana proposed the use of terrestrial photogrammetry as an alternative method for traversing around buildings, using non-stereo convergent images to coordinate features. Scarmana traversed a distance of approximately 450 m around a city block located within the business district of Surfers Paradise with the plan to measure a set of 80 points of interest (i.e. public assets such as traffic signs, bus shelters, street lights and major trees) located along the streets (Scarmana, 2010). Images were taken every few metres as Scarmana moved forward around the city loop, with shorter distances used when entering a turn at street corners. Three well defined marks were established through the use of a Leica TC2002 total station and used as coordinates for the initial control points of the network. Control marks assist in the initial orientation and scaling of a project.

Sequential photographs are then connected by the identification of suitable points of interest. A suitable point of interest must have a clearly defined edge or centre, so that the same point can be confidently and accurately marked on several different photographs. The best objects to use as points of interest are edges of windows, road centrelines or the intersection of cracks in the footpath.

Scarmana used the Windows-based photogrammetry software Photomodeler Pro from EOS Systems Inc. to process the results. This program takes 2D photographic images and creates a 3D representation of the scene complete with 3D coordinates. Photomodeler is designed in such a way that the user need not be an expert in the photogrammetry field. Using this method, Scarmana found that photogrammetry could be successfully used as a simple alternative method of surveying, with an accuracy error of approximately 1 m for every 150 m traversed. This thesis presents the results of an independent test of Scarmana's proposal.
The aim of the project was to determine if the method could be reproduced to the same standard by a non-photogrammetry expert. It is not suggested that his proposal or results are flawed in any way. Attempts were made to minimise changes to the method to ensure consistency; however, one notable improvement to the project was the quality of the camera used. The author started the project relatively inexperienced in the field of photogrammetry. Over the course of the research a better understanding of the technique, including its advantages and limitations, was gained. Trial field work, software tutorials, and learning about the features and settings

of the camera all assisted with the successful completion of the project. Much of the discussion focuses on lessons learned from the project and future recommendations for anyone working in the field with a limited understanding of photogrammetry.

2. Background

2.1 Photogrammetry

Photogrammetry is a coordination technique that utilises methods of image measurement and interpretation to derive and determine the shape, location and orientation of an object or point of interest (POI) from one or more photographs of that object (Luhman et al, 2006). A POI of an image refers to a distinct area or object in a photograph that can be clearly defined and referenced. Some examples of points that could be used as a POI are:

Building corners
Window edges
Street signs
Footpaths
Road markings

When selecting a POI to reference it is important to use all areas of the image, including both foreground and background areas. The focus of this investigation was close range photogrammetry, where the object or POI is less than 300 metres from the camera. Using multiple two dimensional photographs, three dimensional (3D) coordinates of a POI are produced by analysing the position of each photograph relative to the others. The photogrammetric process can be applied to any situation where the object in question can be photographically recorded.

The fundamental principle behind photogrammetry is a process called triangulation. Triangulation is used when two or more photographs with common features or POIs visible in both images are taken from different locations. Rays (or lines of sight) are created from the origin of the camera to the POI and are mathematically intersected to produce 3D coordinates for the POI.

Figure 1 - Single point and Multipoint Triangulation (Geodetic Surveys, 2006).

A bundle triangulation occurs where a numerical fit is simultaneously calculated for all distributed images (or bundles of rays). The bundle adjustment makes use of known input control coordinates

and, using scales and rays to common POIs, is able to adjust a coordinate system for all images. In a complex system of equations, an adjustment technique does the following (Luhman et al, 2006):

1. Estimates the 3D coordinates of each referenced POI
2. Orientates each photograph
3. Detects gross errors and outliers

Triangulation is the principle used by theodolites to produce 3D point measurements: By mathematically intersecting converging lines in space, the precise location of the point can be determined. However, unlike theodolites, photogrammetry can measure multiple points at a time with virtually no limit on the number of simultaneously triangulated points. (Geodetic Surveys, 2006)

Multiple photographs produce multiple lines of sight. If the positional location and direction of the camera are known, the lines of sight can be mathematically intersected to produce the xyz coordinates of the POI (see Figure 1 above). The produced xyz coordinates are calculated in a local Cartesian coordinate system. If Map Grid of Australia (MGA) coordinates are required, a transformation process is undertaken. As the main use of the processing software is 3D model creation, it is unable to handle scale factors and cannot perform this transformation. A coordinate transformation to MGA was not specifically required for this thesis.

Figure 2 - Multiple POIs from multiple camera images (Trinder, 2011).

The increasing availability of digital cameras over the past 20 years has led to an increase in the use of photogrammetry and its applications. The fundamental photogrammetric process has changed with this new technology and has improved rapidly over the last decade. Whereas special photogrammetric measuring instruments were once required for anyone planning a

photogrammetric project, standard computing equipment is now used due to the high degree of automation in all systems. Furthermore, expertise in the field is no longer necessary and a non-photogrammetric specialist is, in most cases, able to carry out all fieldwork and processing unaided. Figures 3 and 4 below show the reduction in total project time since the introduction of digital cameras.

Figure 3 - Analogue Photogrammetry System.
Figure 4 - Digital Photogrammetry System.

2.1.1 Close Range Photogrammetry

Close range photogrammetry is a specialised, predominantly terrestrial-based, branch of photogrammetry. It uses a camera-to-object distance of less than 300 metres. Specialised digital cameras, calibrated for their intended purposes, are used in the majority of close range photogrammetry applications. The advantages of close range photogrammetry over conventional surveying techniques in industry and engineering are (Trinder, 2011):

It is a precise measuring technique
For industrial monitoring, it involves a minimum of down-time of the production line
The photography is a permanent record for the future use of the images
It provides for rapid, remote measuring
It is usually cheaper than field techniques
Complete details of the object are available in the images

The mathematical properties of the image coordinates and camera positions govern the relationship between the image and the objects. The three components of the perspective centre, the object point and the image point are collinear and, together in the bundle adjustment, yield a functional model called the collinearity equations:

x_j = x_0 - Δx_j - f · [m_11(X_j - X_c) + m_12(Y_j - Y_c) + m_13(Z_j - Z_c)] / [m_31(X_j - X_c) + m_32(Y_j - Y_c) + m_33(Z_j - Z_c)]        (1)

y_j = y_0 - Δy_j - f · [m_21(X_j - X_c) + m_22(Y_j - Y_c) + m_23(Z_j - Z_c)] / [m_31(X_j - X_c) + m_32(Y_j - Y_c) + m_33(Z_j - Z_c)]        (2)

where (Trinder, 2011):

x_j, y_j are image coordinates
x_0, y_0 are displacement coordinates between the actual origin of the image coordinates and the true origin defined by the principal point
Δx_j, Δy_j are the corrections applied to the image coordinates for systematic errors in image geometry
f is the camera principal distance or focal length
X_j, Y_j, Z_j are the object coordinates of point j
X_c, Y_c, Z_c are the coordinates of the camera in the object space coordinate system
m_11 ... m_33 are the elements of a 3 x 3 orthogonal rotation matrix M, which is a function of the 3 rotations of the camera coordinate system, ω, φ, κ, about the x, y and z axes respectively

Two collinearity equations are produced for each point on a photograph, but as there are three unknowns (X, Y, Z) and only two equations, we are unable to solve for the object coordinates from a single image. However, when a point is common to multiple photographs we have four or more equations, two from each image, which allows us to solve for the three unknown values.

The collinearity equations describe the fundamental mathematical model for photogrammetric mapping. They demonstrate the relationship between the image and the object coordinate systems. With the collinearity equations, the bundle adjustment can perform and solve the two basic functions of photogrammetric mapping:

Resection: In resection, the position and orientation of an image is determined by placing a set of at least three points with known coordinates in the object frame as well as in the image frame.
Intersection: In intersection, two images with known position and orientation are used to determine the coordinates in the object frame of features found on the two images simultaneously, employing the principle of stereovision.
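The intersection principle just outlined can be sketched numerically. The following is a minimal illustration only, assuming ideal rays and known camera positions, and is not Photomodeler's actual algorithm: each camera contributes a line of sight, and the 3D point is recovered as the least-squares meeting point of all rays.

```python
import numpy as np

def intersect_rays(origins, directions):
    """Least-squares intersection of rays (camera centre plus line-of-sight
    direction): minimises the sum of squared perpendicular distances."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector onto the plane normal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

# Two cameras 10 m apart, both sighting the same POI at (5, 20, 3)
poi = np.array([5.0, 20.0, 3.0])
cams = [np.array([0.0, 0.0, 1.5]), np.array([10.0, 0.0, 1.5])]
rays = [poi - c for c in cams]
print(intersect_rays(cams, rays))  # recovers the POI coordinates
```

With more than two rays the same function performs an over-determined intersection, which is where the redundancy described above comes from.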
Both the resection and intersection methods are implemented through an iterative least squares adjustment.
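Equations (1) and (2) translate directly into code. The sketch below is a simplified illustration: the small correction terms Δx and Δy are omitted, and an ω-φ-κ rotation order is assumed, since the exact convention varies between software packages.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """3 x 3 orthogonal rotation matrix M from the three camera rotations
    (radians) about the x, y and z axes respectively."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def collinearity(X, Xc, M, f, x0=0.0, y0=0.0):
    """Image coordinates of object point X seen from a camera at Xc:
    equations (1) and (2) with the delta correction terms omitted."""
    u = M @ (np.asarray(X, float) - np.asarray(Xc, float))
    return x0 - f * u[0] / u[2], y0 - f * u[1] / u[2]

# Unrotated camera at the origin with the roughly 24 mm lens of Section 2.4.1,
# viewing a point 10 m along the camera axis and 1 m off-axis
M = rotation_matrix(0.0, 0.0, 0.0)
print(collinearity([1.0, 0.0, 10.0], [0.0, 0.0, 0.0], M, f=0.024))
```

Each marked point contributes one such (x, y) pair per photograph, which is why two images of the same POI give the four equations needed to solve for its three object coordinates.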

2.1.2 Photogrammetry Process

Due to digital advancements, modern processes are usually highly automated and require minimal referencing and calculation from the user. Below is a simplified outline of the four major stages of a photogrammetric coordination project.

1. Recording
Targeting - when selecting the areas for an image it must first be determined what POIs will be visible in the photograph. It is important to have approximately visible POIs in each image to help improve automation and increase accuracy (Scarmana, 2011). Using a large number of POIs will help link the rays within each photograph for the bundle adjustment, increasing redundancy and improving accuracy. Automation can be further improved by using coded targets.
Determination of control points or scaling lengths - in order to give the POIs meaningful coordinates, a coordinate system must be defined. This is usually done by placing control points (3 or more) in the first photograph. No control point should share more than one coordinate value with another control point (i.e. the xy, yz and xz coordinate pairs should not match for any two control points). Having one of the x, y or z coordinates matching is acceptable. Control points also help with scaling and orientating the photographs but are not essential. Other methods of scaling the photographs can be seen in Section 2.1.4 below (Luhman et al, 2006).

2. Pre-processing
Computation - calculation of control points with a total station to help coordinate the photographs (Luhman et al, 2006).

3. Orientation
Measurement of image points - identification and measurement of control points and common POIs (points of interest that are visible in two or more images).
Approximation - a rough calculation is given for unknown parameters and POIs based on the control points and the scale calculated. This is crucial as it yields approximate values for the bundle adjustment to work with.
Bundle adjustment - an adjustment program which simultaneously calculates the parameters of both interior (camera) and exterior (photograph) orientation. The object point coordinates are also calculated by the bundle adjustment (Luhman et al, 2006).
Removal of outliers - any gross errors are detected and removed (Luhman et al, 2006).

4. Measurement and Analysis
Single point measurement - 3D coordinates are created for all referenced POIs.

Graphical plotting - final coordinated POIs are easily mapped or made available for a CAD program (Luhman et al, 2006).

2.1.3 Factors affecting Photogrammetry

The accuracy achieved from a photogrammetric measurement varies quite significantly depending on the many interrelated factors involved in the photogrammetric process. The most influential factors include:

The quality of the camera and lens in use - the resolution of the camera plays a significant role in the ability to precisely pinpoint the location of a POI.
The size of the objects being photographed for measurement or coordination - smaller objects increase the accuracy of the photogrammetric process.
The number of photographs taken - possibly the most influential factor determining the accuracy of results; increasing the number of photographs increases the level of redundancy, which should lead to higher accuracy in the final output.
The geometric layout of the pictures relative to the object and to each other - the wider the angles between each photograph taken, the higher the accuracy of the coordination. The ideal ray intersection would be at 90°. However, this is not always possible and smaller angles can be used. The quality of the results will be compromised if the angle of intersection is less than 60° (Clemente et al, 2008). However, care must be taken to check that enough POIs can be seen to ensure that the image is useful for calculations.

Figure 5 below illustrates the effects of the four factors and their influence on accuracy. The higher the pyramid, the more accurate the results. To achieve the highest accuracy (a higher pyramid), a combination of higher resolution images, smaller object size, as many photographs as possible and optimal intersection geometry is needed.

Figure 5 - Factors influencing accuracy of photogrammetric measurements.
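The influence of intersection geometry can be demonstrated with a small Monte-Carlo sketch (an illustration written for this discussion, not taken from any of the cited sources): two rays to a target 20 m away are perturbed by a 0.1 milliradian pointing error and intersected, and the mean positional error is reported for several intersection angles.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_position_error(angle_deg, sigma=1e-4, n=2000):
    """Mean 2D intersection error for two rays meeting at the given angle,
    each pointing direction perturbed by sigma radians of random error."""
    half = np.radians(angle_deg) / 2.0
    base = 20.0  # metres to the target
    c1 = np.array([-base * np.tan(half), 0.0])
    c2 = np.array([base * np.tan(half), 0.0])
    target = np.array([0.0, base])
    errors = []
    for _ in range(n):
        a1 = np.arctan2(*(target - c1)[::-1]) + rng.normal(0.0, sigma)
        a2 = np.arctan2(*(target - c2)[::-1]) + rng.normal(0.0, sigma)
        d1 = np.array([np.cos(a1), np.sin(a1)])
        d2 = np.array([np.cos(a2), np.sin(a2)])
        # solve c1 + t1*d1 = c2 + t2*d2 for the intersection point
        t = np.linalg.solve(np.column_stack([d1, -d2]), c2 - c1)
        errors.append(np.linalg.norm(c1 + t[0] * d1 - target))
    return float(np.mean(errors))

for angle in (90, 60, 30, 10):
    print(f"{angle:3d} deg -> {mean_position_error(angle) * 1000:.1f} mm")
```

The error grows as the intersection angle narrows, consistent with the 60° guideline quoted above and the geometry face of the pyramid in Figure 5.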

2.1.4 Scaling Photogrammetry

When an image is taken, the photogrammetric measurements essentially have no scale dimensions. In order to scale objects in the image so that it is possible to produce coordinates for a POI, at least one known length measurement must be visible in the image. If the actual coordinates of two or more points in the image are known beforehand, the distance between them can be calculated and hence gives the image a scale. Another possibility for calculating the scale of an image is to use a targeted fixture and measure along the object. The known distance between the target marks can be used to scale the photographs. The most common form of scaling fixture is the scale bar (Marshall, 1989).

Whenever possible, more than one distance should be used to scale the measurement, as this enables scale errors to be found. This is important because, when a single scale distance is used and it is in error, the entire measurement will be incorrectly scaled. On the other hand, if multiple scale distances are used, scale errors can be detected and removed. With two known distances, if one is in error, a scale error can be detected, but it is usually not possible to determine which one is in error (sometimes, however, it is possible to tell by inspecting the scale points). With three known scale distances, it is usually possible to determine which is in error and remove it. When scale bars are used, a bar that has more than two targets is an effective technique. Alternatively, more than one scale bar can be used, or a combination of both techniques. Whenever feasible, it is recommended that multiple scale distances are used to maximise the accuracy of the results. The scale distance(s) should be as long as practical, because any inaccuracy in the scale distance is magnified by the proportion of the size of the object to the scale distance (Atkinson, 1996).
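The argument for multiple scale distances can be made concrete with a toy calculation (the numbers are hypothetical): each known distance yields its own scale estimate, and an estimate that disagrees with the others points to the distance that is in error.

```python
def scale_factors(model_lengths, known_lengths):
    """One scale estimate per scale distance: the ratio of the known
    field length to the corresponding unscaled model length."""
    return [known / model for model, known in zip(model_lengths, known_lengths)]

# Three scale distances; the third known length was recorded wrongly
model = [1.00, 2.00, 1.50]   # lengths in unscaled model units
known = [2.50, 5.00, 4.20]   # field lengths in metres (third should be 3.75)
print(scale_factors(model, known))  # the third estimate disagrees with the other two
```

With only two distances the disagreement could be detected but not attributed; the third estimate identifies the suspect distance, exactly as described above.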
One disadvantage introduced by using a scale bar is the inability to include a vertical direction. Using coordinated points instead of scale bars allows the introduction of heights, orientation and azimuth.

2.2 FIG paper

In 2010, at the 34th FIG conference in Sydney, Mr Gabriel Scarmana proposed an alternative concept for mapping and navigating in GPS-degraded areas. In areas such as dense forest or amongst high rise buildings, GPS signals can be quite difficult or sometimes even impossible to obtain. Scarmana (2010) proposed that an alternative method be employed where otherwise reliable GPS navigation signals are blocked or weakened due to nearby high rise buildings or signal interference.

Scarmana's proposal attempts to use close range photogrammetry to survey city blocks where the only sensory input is a single low-cost digital camera (Scarmana, 2010). The process involves traversing around a city block using a series of photographs taken with a simple off-the-shelf camera and extracting 3D coordinates from visible POIs in each photograph. Scarmana's main objective was to calculate and coordinate several important POIs rather than every visible one. Scarmana noted that he intended to combine his data with that of local and state government authorities, who routinely carry out periodic surveys of public assets in order to update and monitor their state (Scarmana, 2010).

2.2.1 Location

Scarmana performed his experiment in narrow lanes sandwiched between high-rise buildings at Surfers Paradise on the Gold Coast, where GPS signal paths provided limited visibility to satellites and caused multipath effects, resulting in degraded navigation accuracy and reliability (Scarmana, 2010). In this project, the site used for recreating Scarmana's project is the UNSW campus, as it has similar site characteristics. The final site selected was the Hut Dance Studio in the north-west corner of the UNSW Kensington Campus. Justifications for this site are outlined in a later section.

2.2.2 Camera and Software

Scarmana used a Fuji A500 camera for his fieldwork. This is an off-the-shelf, readily available camera with no special photogrammetric functions or lenses. It takes 5 megapixel photographs, which is low by today's standards, and retailed for around $100 in 2008 (Scarmana, 2010). Scarmana theorised that, if he could produce reasonable mapping and navigation results using a simple camera, then the possibilities for using this technology as a navigation tool in the future would expand. This investigation uses a more sophisticated camera than that used by Scarmana.
A digital SLR Canon 450D camera with 12 megapixels, which retailed for around $1200 (2010), was used in an attempt to eliminate camera quality as a source of error or limitation. Section 3.1 outlines the camera in more detail. Scarmana's processing and coordination of the photographs was done using Photomodeler Pro, a photogrammetry program developed by EOS Systems. This low cost software is user friendly, has a broad range of applications and is designed for use by non-photogrammetric experts. This program was also used in this work in an attempt to maintain consistency between the two projects. More information on Photomodeler can be found in a later section.

2.2.3 Process

Measurements obtained from any photogrammetric processing cannot be fully accurate unless the internal characteristics of the camera are known. Before any photogrammetric measurements are made, the camera must be calibrated to determine its optical and geometric characteristics (Scarmana, 2010). Scarmana used Photomodeler's built-in calibration program to determine the focal length and camera distortions. The process used by Photomodeler for the calibration can be found in Section 3.3.

Scarmana's proposed traverse length was approximately 450 m, with the plan to measure a set of 80 POIs (i.e. public assets such as traffic signs, bus shelters, street lights and major trees) located along the streets (Scarmana, 2010). Scarmana's mapping/measuring project started from three well defined control points. These three marks were established through the use of a Leica TC2002 total station and consisted of natural permanent targets such as the corner of tiles on building walls or stable street signs (Scarmana, 2010). The coordinates of the initial control points were measured in GDA94, placing all future coordinate calculations in the same datum. Scarmana suggests that it was important that these three control points were spread apart at different distances and did not lie on the same line. These control marks assisted the initial orientation and scaling of his project. The first three images Scarmana took of his traverse were images of the control points in progression, so as to carry the correct scaling and orientation forward along the street (Scarmana, 2010). From then on, images were taken every few metres as Scarmana moved forward around the city loop. Scarmana was forced to use such short distances for long straights due to environmental constraints. Scarmana suggests that, although not necessary, it is advantageous to use shorter distances when entering a turn at street corners.
The geometry of the intersecting rays is a vital component of the processing of the images. It is desirable to have the rays intersect at 90° and not at angles less than 60° (Clemente et al, 2008). To improve the angles, images may be taken in a zigzag pattern by alternating between different sides of the road, as long as multiple POIs are visible in at least two images. To ensure that enough POIs were recorded and no further field work would be required, Scarmana took 20 more photographs than were strictly necessary.

To accurately connect sequential photographs there must be sufficient suitable POIs in each image. A suitable POI must have a clearly defined edge or centre, so that the same point can be confidently and accurately marked on several different photographs. The best objects to use as POIs are edges of windows, road centrelines or the intersection of cracks in the footpath. If

an area has unsuitable POIs there are several ways to overcome the problem. The ideal method is to place temporary marks in the field of view of the camera; stick-on coded targets or more substantial objects such as change plates can be used. The marks need not be coordinated but are rather used for a transfer of coordinates.

Figure 6 below shows an example of the trajectory of Scarmana's camera as he took images in progression. Photomodeler computes the 3D coordinates of the camera at each setup. It can be seen that Scarmana used a zigzagging technique whilst taking his images: one photograph is taken from one side of the road in a forward looking direction, and the next is taken from the other side of the road in the same direction but a little further along in the direction of travel. This technique is the most advantageous as it usually captures a larger array of POIs. However, care must be taken to maintain suitable angles of intersection.

Figure 6 - Gabriel Scarmana's camera projections (Scarmana, 2010).

Scarmana loaded the images into Photomodeler and began marking all common visible POIs. Photomodeler has many useful tools to help mark images, including a sub-pixel marking tool, which is used to help determine the centroid of circular targets. Photomodeler suggests that these point marking tools are accurate to around 1 pixel. One pixel equates to 5.1 μm on the image plane, 0.4 mm at 2 m from the camera and 3 mm at 15 m from the camera. The referencing stage is the final stage before the bundle adjustment is calculated. Common POIs were referenced in multiple photographs, with at least six common POIs required to fully reference an image. Once the minimum number of points has been referenced in at least two images, automatic processing occurs.
During this phase, Photomodeler processes the camera calibration and the referencing data and creates spatial point coordinates to produce 3D coordinates of all selected POIs (Scarmana, 2010).
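The figures quoted above (one 5.1 μm pixel corresponding to roughly 0.4 mm at 2 m and 3 mm at 15 m) follow from similar triangles, assuming the approximately 24 mm focal length given in Section 2.4.1. A minimal check:

```python
def ground_size_of_pixel(pixel_pitch, focal_length, object_distance):
    """Object-space size of one pixel by similar triangles:
    pixel pitch scaled by (object distance / focal length)."""
    return pixel_pitch * object_distance / focal_length

# 5.1 um pixel pitch, ~24 mm lens; distances in metres
for d in (2.0, 15.0):
    size_mm = ground_size_of_pixel(5.1e-6, 0.024, d) * 1000.0
    print(f"at {d:4.1f} m one pixel covers about {size_mm:.2f} mm")
```

A one-pixel marking error therefore translates to sub-millimetre positional uncertainty near the camera but several millimetres at typical traverse distances.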

2.2.4 Results

Over the 450 metres travelled, Scarmana's final coordinate errors were ±3 m, or roughly 1 m for every 150 m travelled. An example of the errors can be seen below in Figure 7.

Figure 7 - Results obtained by Scarmana (Scarmana, 2010).

The above graph shows that positional accuracy degraded with distance travelled. Scarmana suggests that positional errors are directly dependent upon (Scarmana, 2010):

Distance covered
Number of observations or camera stations
Precision of the system components
Measuring geometry

The distance covered in this thesis is significantly shorter, at 140 metres. Based on Scarmana's results, this project should produce results to within one-metre accuracy. However, as this project uses a far superior camera, it would be expected that the accuracy would be better still.

2.3 Camera Calibration

Instrument calibration is an important element in all surveying fields, including close range photogrammetry. Camera calibration has multiple functions for close range photogrammetry, as it accurately evaluates several properties of the camera that can affect image calculation and coordination. Apart from evaluating both the performance and stability of the camera's lens, an accurate calibration can also determine the optical and geometrical parameters of the lens, camera system and image data acquisition system. Photomodeler has an inbuilt calibration program that is simple to run and records and stores all parameters for that camera for all its future calculations. The parameters solved for by the program include:

Principal Distance or Focal Length

- Principal Point
- Format Width/Height
- Radial Distortions
- Decentring Distortions

Along with the general calibration that is performed before the commencement of field work there is also the option to perform an in-field calibration, which will produce a more accurate set of results since it is possible to calibrate the camera using objects of similar size. The calibration results can be seen later in this thesis in Section 3.3.

2.4 Camera Parameters

2.4.1 Principal Distance or Focal Length

The principal distance of a camera refers to the perpendicular distance from the perspective centre of the lens system to the image plane (Fryer, 1996b). In Figure 8 below this distance is shown as c and is often referred to as the focal length of the camera when the camera is focused at infinity. This principal distance is a key parameter in defining the calibration of a camera. However, in many applications where close range photogrammetry is used, the value can be determined during the image processing stage. Using the geometric configuration of the camera stations and the mathematical techniques (Fryer, 1996b) that are employed to calculate the 3D coordinates from the images, the principal distance or focal length of the camera can also be calculated. This means that only an approximate value is required for early processing. For this camera/lens system the focal length is about 24 mm.

Figure 8 - Elements of a lens system (Fryer, 1989).

2.4.2 Principal Point

The principal point represents the exact geometrical centre of the image plane. Its location is determined by projecting a direct axial ray through the perspective centre of the lens to the image.

2.4.3 Indicated Principal Point - Xp, Yp

In modern digital cameras the fiducial origin is now referred to as the indicated principal point. This refers to the point on the image plane that the processing software determines to be the ideal position for the origin. In an ideal camera with no distortions the indicated principal point would correspond with the principal point. However, it is rare to find a camera system free from errors. Therefore, to centre the image coordinates correctly, it is necessary to add calculated offsets (Xp and Yp) from the principal point to the origin of the principal point coordinate system. The origin of the principal point coordinate system will vary depending on the software used. The offset between the principal point and indicated principal point generally has a magnitude of less than 1 mm. Section 3.3 shows the calculated difference between the principal and indicated principal point for this thesis.

2.4.4 Radial Distortions - K1, K2, K3

The radial distortion component of the calibration is the determination of any radial movement of the image rays from the principal point, i.e. closer to or further away from the principal point. The amount of distortion increases with the distance of the image rays from the principal point, as seen in Figure 9 below. There is also a relationship between the focusing distance and the amount of radial distortion that occurs.

Figure 9 - Radial Distortions (Fryer, 1996b).

According to Fryer (1996b) the radial distortion (δr) is expressed by a polynomial with a series of odd-powered terms:

δr = k1r³ + k2r⁵ + k3r⁷ ...   (3)

where k1, k2 and k3 are the radial distortion coefficients when the lens is focused at infinity and r corresponds to the radial distance of the image point from the principal point. The radial distance is derived from the following equation:

r² = (x - xp)² + (y - yp)²   (4)

Values for the radial distortions of the camera used in this thesis can be found in Section 3.3.

2.4.5 Decentring Distortions - P1, P2

Ideally all lenses in a camera system should be perfectly aligned, but this is not always the case. During a calibration the amount of decentring distortion can be calculated and accounted for when performing further calculations with the images. Any displacement of a lens element, be it vertical or rotational, will cause some geometric displacement of the images (Fryer, 1996b). As the amount of distortion that normally occurs is so minute (rarely exceeding 30 μm, or about 6 pixels, at its largest) it is difficult to physically see what is happening. An exaggerated example can be seen below in Figure 10.

Figure 10 - Misalignment of the components of a lens system (Fryer, 1996b).

Figure 11 below shows the effect of the radial distance from the fiducial origin on the amount of decentring distortion.

Figure 11 - Decentring Distortion values (Fryer, 1996b).

According to Fryer (1996b) the decentring distortion is modelled by the following equation, where P(r) refers to the amount of decentring distortion that occurs:

P(r) = (P1² + P2²)^(1/2) r²   (5)

P1 and P2 refer to values when the camera is focused at infinity and r is again the radial distance of the image point from the principal point. Values for the decentring distortions of the camera used in this thesis can be found in Section 3.3.
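To make equations (3) and (4) concrete, the sketch below applies the radial distortion model to an image point. The k1 value is taken from this project's calibration results reported later; the sample point and the remaining coefficients are illustrative only:

```python
import math

def radial_distortion(r: float, k1: float, k2: float, k3: float = 0.0) -> float:
    """Radial distortion delta_r = k1*r^3 + k2*r^5 + k3*r^7 (equation 3)."""
    return k1 * r**3 + k2 * r**5 + k3 * r**7

def correct_radial(x: float, y: float, xp: float, yp: float,
                   k1: float, k2: float = 0.0, k3: float = 0.0):
    """Remove radial distortion from image point (x, y); coordinates in mm."""
    dx, dy = x - xp, y - yp
    r = math.hypot(dx, dy)        # radial distance from the principal point, equation (4)
    if r == 0.0:
        return x, y               # the principal point itself is undistorted
    scale = radial_distortion(r, k1, k2, k3) / r
    # Shift the point along the radial direction by the modelled distortion.
    return x - dx * scale, y - dy * scale

# k1 from this project's calibration; the image point here is made up.
K1 = 1.784e-4
print(correct_radial(10.0, 5.0, 0.0, 0.0, K1))
```

Because the polynomial grows with odd powers of r, the correction is negligible near the principal point and largest at the image corners, matching the behaviour shown in Figure 9.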

3. Camera Calibration

3.1 Project Camera

The camera used for this thesis project was the 12 megapixel Canon EOS 450D (s/n Camera body: , Lens: , Camera Number: 4). Prior to beginning field work it was essential that the user became familiar with the different modes, settings and functions of the camera. A full list of the camera's specifications can be found in Appendix 8.1.

Type - the Canon EOS is a non-metric camera, meaning it is usually cheaper than a metric camera, has interchangeable lenses, is lighter in weight and is smaller. However, non-metric cameras have an unstable interior orientation: the effective focal length may change for each exposure and the direction of the optical axis may alter with focusing movement.

Settings - after several experiments it was determined that the best setting for the camera is the M, or manual, setting. On this setting both the shutter speed and aperture values can be set to the appropriate values. When outside there is no particular combination of settings that will be appropriate for all photographs.

Extras - when taking images, particularly for the calibrations, a tripod should be used for stability. A cable release and eye piece should also be used to increase the accuracy of the calibration.

Save format - images are saved in JPEG format. This is the only format that needs to be saved, as Photomodeler does not require the RAW data for its processing. Raw image files are sometimes called digital negatives, as they fulfil the same role as negatives in film photography.

The camera used is the single largest distinguishing factor when it comes to the quality of the results obtained. Photographs of high resolution with appropriate exposure allow for referenced marks to be precisely marked in all photographs. With the manual mode the user has full control over every aspect of the camera and is able to set the aperture, shutter speed, ISO, white balance, and flash values.
A display in the viewfinder reports whether the camera expects the settings to result in under-, over-, or correctly exposed photographs.

3.1.1 Camera Calculations

In addition to understanding the functions and settings of the camera to be used in the survey, it was necessary to calculate several internal measurements of the camera, including field of view (FOV), pixel size and view angles. The calculations can be found in detail in Appendices 8.2 and 8.3.

Camera Settings for the Calculations
Vertical Camera Height: m

Wall to Camera Distance: 2.01 m
Camera Shooting Mode: Manual
Shutter Speed: 1/125
Aperture: 8.0
ISO: 400
Image Quality: L

Note: The camera settings were kept constant for the entirety of the exercise.

3.1.2 Field of View Calculation

Field of view (FOV) is an important parameter, as it is necessary to know whether everything that is seen in the viewfinder is captured in the image. Simple field exercises were performed to determine whether the extents seen in the viewfinder were identical to the extents produced in the photograph. The experiment consisted of measuring the distance visible through the viewfinder on a wall, both vertically and horizontally, and comparing the measurements to a photograph taken with a scale bar (level staff) in the image for an accurate measurement of the photograph distance. It was also useful to know the FOV angles for planning close range photogrammetry surveys.

Horizontal
View Finder: m
Photograph: m
Therefore, at 2.01 m the photograph will capture 43 mm more of the scene horizontally than is seen in the view finder (21.5 mm to the left and right of the image).

Vertical
View Finder: m
Photograph: m
Therefore, at 2.01 m the photograph will capture 48 mm more of the scene vertically than is seen in the view finder (24 mm at the top and bottom of the image).

From the above calculations, it can be said that when a photograph is taken more of the scene will be captured in the photograph than is seen in the view finder. Given that the extra image area captured in the horizontal and vertical directions differs by only 5 mm, it can be assumed that there is an approximately equal amount of extra image captured on all four sides, i.e. about 23 mm for an image taken at 2.01 m from the object. The discrepancy would have been due to the facts that level staffs were used as the scale bar and that human error entered the calculations when estimating the millimetres between intervals.
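The measured extents above can also be cross-checked against the camera's nominal geometry: the full view angle in each direction follows from the format size and principal distance as 2·atan(w / 2c). A sketch, assuming the Canon 450D's nominal APS-C format of 22.2 mm × 14.8 mm (a manufacturer figure, not measured in this project) and the ~24 mm principal distance:

```python
import math

def view_angle_deg(format_mm: float, principal_distance_mm: float) -> float:
    """Full view angle (degrees) across one image dimension: 2*atan(w / 2c)."""
    return math.degrees(2.0 * math.atan(format_mm / (2.0 * principal_distance_mm)))

# Nominal APS-C format dimensions (assumed) and nominal principal distance.
print(f"horizontal: {view_angle_deg(22.2, 24.0):.1f} degrees")
print(f"vertical:   {view_angle_deg(14.8, 24.0):.1f} degrees")
```

These nominal angles land in the same region as the roughly 45° horizontal and 30° vertical angles derived from the field measurements in the following section.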

3.1.3 Pixel Size Calculation

A pixel, or picture element, is defined in the Oxford Dictionary online edition as "a minute area of illumination on a display screen, one of many from which an image is composed" (Oxford, 2011). All electronic displays consist of thousands of illuminated pixels that, when lined together, form an image or display. From the previous calculations, both the horizontal and vertical distances of the photograph have already been determined. Using this information it was possible to calculate the size of each pixel as well as to determine whether the pixels are square or rectangular.

Horizontal
Distance: m
No. of Pixels: 4272
Therefore there are pixels per millimetre horizontally.

Vertical
Distance: m
No. of Pixels: 2848
Therefore there are pixels per millimetre vertically.

It is reasonable to assume that each pixel is square. The slight discrepancy in the number of pixels per millimetre would again be due to the human error introduced in estimating the distances whilst using the level staff. If each pixel is assumed to be square, then there are 2.46 pixels per millimetre, or each pixel is 0.41 mm x 0.41 mm at a distance of 2.01 m. This equates to 5.1 μm square on the image plane.

3.1.4 View Angle Calculation

Knowing the view angle of a camera makes it possible to calculate the distance from the object that is required in order to fully capture the required parts of the object in a photograph. Using the values from the previous calculations the following viewing angles were determined:

Horizontal - The horizontal view angle is . The horizontal view angle produced in the photograph is similar to that of the human eye, which has a viewing angle of about 45°.

Vertical - The vertical view angle is .

The calculations show that, when photographing an object, the field of view extends approximately 45° from the origin horizontally and approximately 30° from the origin vertically. If the size of the

object to be captured is known, these values can be used to position the camera at the correct distance to capture the entire object in the photograph. An object three metres tall would require a distance of 5.2 m in order to capture the whole object. Knowing the viewing angle also indicates where the camera should be placed to achieve a good overlap of photographs. It should be noted that the calculations were also performed for the values obtained from the view finder. As expected, these view angles were slightly smaller than the angles produced in the photograph.

3.2 Photomodeler Pro

The program used for the majority of this project was Photomodeler Pro 6, a Windows-based photogrammetry software program that provides "image-based modelling, for accurate measurement and 3D models in engineering, architecture, film and forensics" (EOS Systems, 2011). Photomodeler takes 2D photographic images and creates a 3D representation of the scene complete with 3D coordinates. An advantage of Photomodeler is that it is designed in such a way that the user need not be an expert in the photogrammetry field. Photomodeler was used in this project to maintain consistency with Scarmana's experiments and results.

As Photomodeler is designed for non-photogrammetry experts, the website provides several interactive tutorials designed to instruct the user in the basics of the program. Relevant tutorials were completed prior to using the software. The tutorials covered the basics of Photomodeler including:

- Calibration, both single sheet and in-field calibration
- Point projection
- Dimensioning
- Measuring
- Referencing
- Automated Coded Targets

Figure 12 - Referencing Tutorial in Photomodeler (EOS Systems, 2011).

3.3 Calibration

Initial practical work focussed on calibration and analysis of the Canon EOS 450D camera. As mentioned in Section 2.3, many different distortions can occur in the internal geometry of a camera that will affect the overall accuracy of the measurements. For this project the camera was calibrated three times, twice with Photomodeler and once with iwitness. The calibration was carried out twice with Photomodeler to analyse the consistency of the results and distortions, and once with iwitness to compare results using different software.

3.3.1 Camera Calibration Results

Table 1 below shows the results from the first successful calibration using Photomodeler.

Table 1 - Calibration Results.

Photomodeler Calibration Summary
Iterations: 3
First Error:
Last Error:

Calibration Values
Focal Length: mm
Xp - principal point x: mm
Yp - principal point y: mm
Fw - format width: mm
Fh - format height: mm
K1 - radial distortion 1: 1.784e-004
K2 - radial distortion 2: e-007
K3 - radial distortion 3: 0.000e+000
P1 - decentering distortion 1: e-006
P2 - decentering distortion 2: e-006

Point Marking Residuals
Overall RMS: pixels
Maximum: pixels
Minimum: pixels
Maximum RMS: pixels
Minimum RMS: pixels

3.3.2 Photomodeler Calibration

The Photomodeler calibration process is a completely automated process that uses a series of images taken of a 10x10 calibration grid. Images are taken from the four sides of the grid at various rotations. The grid and camera locations/orientations can be seen below in Figure 13.

Figure 13 - Photomodeler calibration grid and camera locations.

The calibration grid can be printed at various sizes depending on the object/s being modelled. The grid was printed on A1 and photographed from a distance of about 1.5 m. An alternative method that could have been used would involve projecting the grid onto a wall. It is important to calibrate the camera at a distance similar to that of the objects to be photographed because when the lens is focused the internal geometry of the camera changes, and each time the internal geometry changes so will the values of the calibration. With the lens being used, the Canon EOS will focus at infinity when objects are more than three metres away. Photomodeler's standard calibration is designed for projects where the object to be photographed is small in size and less than a few metres away.

Twelve images in total were taken for the calibration, although it must be noted that Photomodeler will perform a calibration with only six images. Twelve images were used in the process to increase redundancy and improve the accuracy of the results. At each side of the grid three images were taken: one at horizontal orientation, then two more at a 90° left rotation and a 90° right rotation. It is important that each image has all four control points visible and that the field of view is covered by as much of the grid as possible. The four control points can be seen in Figure 13 above as the four marks outlined by heavier circles.

3.3.3 Calibration Problems

Several problems arose during the calibration process that caused the calibration to either fail or give insufficient results. These problems provided an insight into how the calibration is performed

and what factors are the most influential when taking the photographs. The major problems consisted of:

Background error - the first calibration attempt took place in the corridor of the top level of the Electrical Engineering Building. The surface of the floor consists of black and white speckled linoleum and the room is lit with fluorescent bulbs. The image acquisition was completed with relative ease but problems were encountered during the initial processing of the images. Photomodeler was picking up sections of the floor around the calibration sheet and using them as reference points to perform the calibration. This caused errors to be greatly exaggerated and the calibration to fail. An unsuccessful attempt was made to manually remove these unwanted marks from the calibration.

Glossy cover - the second problem that occurred during the calibration process was an error with Photomodeler recognising the dots on the grid. The program provided the following advice on resolving this issue:

"A large percentage of your points are sub-pixel marked so it is assumed you are striving for a high accuracy result. The largest residual (Point ) is greater than 1.00 pixels. Suggestion: In high accuracy projects, strive to get all point residuals under 1.00 pixels. If you have just a few high residual points, study them on each photo to ensure they are marked and referenced correctly. If many of your points have high residuals then make sure the camera stations are solving correctly. Ensure that you are using the best calibrated camera possible. Remove points that have been manually marked unless you need them."

It was originally thought that this error was due to light reflections on the grid. An attempt was made to evenly light the whole grid with transportable photography lamps, but this had little to no effect. It was then suggested that the fact that the grid was printed on glossy paper might be having an effect.
After re-printing the grid on matt paper, the above error no longer appeared.

Image coverage - the calibration grid should cover at least 80% of the combined image format. It is not essential that each individual image has 80% coverage. Less than 80% coverage will result in a less accurate calibration.
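The coverage guideline can be illustrated with a small bookkeeping sketch. Only the 80% threshold comes from the text; the per-image figures are made up, and averaging per-image coverage is used here as a simple stand-in for Photomodeler's combined-format check:

```python
def combined_coverage_ok(per_image_fraction, threshold=0.80):
    """Approximate the combined-format coverage guideline by averaging the
    fraction of the image format covered by the grid in each photograph."""
    avg = sum(per_image_fraction) / len(per_image_fraction)
    return avg >= threshold, avg

# Hypothetical coverage fractions for a four-photo subset of a calibration set.
ok, avg = combined_coverage_ok([0.92, 0.85, 0.88, 0.83])
print(ok, round(avg, 2))
```

A set in which individual photographs dip below 80% can still pass, matching the note above that each individual image need not reach 80% on its own.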

3.3.4 Image Acquisition

To ensure camera and image stability a tripod and remote trigger were used for image acquisition. The focus was set for the first image then left unchanged for the remainder of the photographs (although it was checked each time before taking an image). The grid was taped to the floor and weights were placed on its corners and edges for extra stability.

When taking the photographs, the camera was set to the Tv setting, which allows the shutter speed to be set manually with the aperture set automatically to match. This setting was used following experimentation with the differences between setting either the shutter speed or the aperture manually as well as setting both manually. It was found that the best setting for taking images in this light was to set the shutter speed manually to 1. No flash was used during the image taking process. Figure 14 below is an example of one of the photographs taken.

Figure 14 - Image used for Photomodeler calibration.

3.3.5 Camera Calibration Results using Photomodeler Pro

Table 2 below shows the results from the first successful calibration using Photomodeler.

Table 2 - Results from Photomodeler calibration.

Fri Apr 08 12:50
Status: successful
Problems and Suggestions: None
Processing Iterations: 3
First Error:
Last Error:

Camera Calibration Standard Deviations - Canon EOS 450D [24.00]
Parameter                         Value      Std Dev.
Focal Length                      mm         5.6e-004 mm
Xp - principal point x            mm         9.5e-004 mm
Yp - principal point y            mm         mm
Fw - format width                 mm         2.9e-004 mm

Fh - format height                mm
K1 - radial distortion 1                     e-007
K2 - radial distortion 2                     e-009
K3 - radial distortion 3          0.000e+000
P1 - decentering distortion 1                e-007
P2 - decentering distortion 2                e-007

Photograph Quality
Total Number: 12
Bad Photos: 0
Weak Photos: 0
OK Photos: 12
Average Photo Point Coverage: 87%

Point Marking Residuals
Overall RMS: pixels
Maximum: pixels (Point 19 on Photo 5)
Minimum: pixels (Point 60 on Photo 6)
Maximum RMS: pixels (Point 3)
Minimum RMS: pixels (Point 43)

If the calibration is to be acceptable and usable for a project, then the value of the Last Error should be less than 1. The last error value for this calibration (0.598) is acceptable (<1), thus the calibration was successful and stored for use. Photomodeler cannot solve for both the Format Width and the Format Height; the Format Width refers to the total width of the image format or image plane. The principal point and indicated principal point should always be less than 1 mm apart; in this case the offsets were calculated as x = 5.5 × 10⁻⁴ mm and y = 0.05 mm. Radial Distortion 3 was not calculated as it is only required when wide angle lenses are used.

For an acceptable outcome, the Maximum Residual value should be no greater than 1.5 pixels, and under 1 pixel for high accuracy; in this case the maximum residual was 0.258 pixels, indicating a high accuracy calibration. The Maximum RMS for this calibration occurred on Point 3. This value is well below the suggested 0.5 pixels, which is the maximum allowed for an accurate calibration.

Figure 15 below shows the residuals for one of the calibration images magnified 2000x. The lines are a representation of where Photomodeler determines the points should be. When looking at the residuals several factors must be checked:

- All points should not be pointing the same way
- All points should not be pointing towards the centre
- All points should not be pointing away from the centre
- There should be no distinct patterns
- All lines should be random in both direction and size

By viewing these factors it is possible to determine whether the calibration contains systematic errors. If the residual lines are completely random then it is likely that the only errors present are random errors that can be ignored. If a distinct pattern is found then further investigation must be completed, as there are most likely systematic errors involved in the calibration process. Systematic errors may have been caused by bad lighting in one area of the calibration grid or a slight movement of the grid between photographs.

Figure 15 - Residuals produced by Photomodeler calibration.

3.3.6 Photomodeler Comparison

Table 3 - Result comparisons between two Photomodeler calibrations of the same camera.

Parameter                       Calibration one   Calibration two   Difference (mm / %)
Focal Length (mm)
Principal point xp (mm)
Principal point yp (mm)
Format width fw (mm)
Radial distortion K1            1.784e-004
Radial distortion K2
Decentering distortion P1
Decentering distortion P2

A comparison of results from two different Photomodeler calibrations is important in order to prove reliability and give credibility to the first calibration results. Table 3 above shows that the two calibrations yielded similar results, which indicates a reliable calibration.

3.3.7 iwitness Camera Calibration

iwitness, another photogrammetry program, has an automated built-in camera calibration program. Similar to that of Photomodeler, twelve photographs are taken of specially coded targets from various locations around the targets and loaded into the program. The major difference is that iwitness has several different individual coded targets that need to be individually placed. At least

thirteen targets must be used and one or more must be at a different height to the others. The layout used for the calibration in this investigation can be seen in Figure 16 below.

Figure 16 - Coded targets and layout for iwitness calibration.

Seventeen images were taken in an effort to maximise the image quality of the photographs taken and loaded into the iwitness program. One notable advantage of this program is that the calibration takes about two minutes to perform, compared with 15 minutes for the Photomodeler program. This is a consideration to be examined when selecting the processing software. Another consideration is the fact that iwitness is less dependent on the location of the camera when taking the photographs: whereas Photomodeler requires very specific locations for the image acquisition, for iwitness the photographs can be taken from any position around the targets. The iwitness calibration was done with the camera positioned 1.5 m from the coded targets. iwitness has an option to print larger targets for the calibration process, which allows the camera to be calibrated at a longer distance.

3.3.8 iwitness Results and Comparisons

Table 4 below compares the results obtained from the iwitness calibration with the Photomodeler results.

Table 4 - Results from calibration of the same camera using iwitness and Photomodeler.

Parameter                       iwitness          Photomodeler      Difference (mm / %)
Focal Length (mm)
Principal point xp (mm)
Principal point yp (mm)
Radial distortion K1            1.979e-004        1.784e-004
Radial distortion K2
Decentering distortion P1
Decentering distortion P2

The first major comparison made between the two calibration results was the difference between focal lengths. The value given for the focal length is in direct proportion to the extension of the lens when focused for the image acquisition. The difference of over 1 mm in focal length was expected, due to the fact that the calibrations were performed at different times and thus the camera was refocussed (changing the focal length).

The second major difference in the calibration comparisons was the difference in principal point locations. This is because, although both Photomodeler and iwitness use the same reference frame, they clearly use a different location for the origin of their principal point coordinate system. From the x and y principal point values it was determined that Photomodeler uses the bottom left corner of its image for the origin of its principal point coordinate system, whilst iwitness uses a point closer to the geometrical centre of the photographs.
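The difference in origin conventions can be reconciled numerically by shifting the reported principal point to a common origin before comparing. A minimal sketch (the format dimensions used below are illustrative, not this project's calibrated values):

```python
def to_centre_origin(xp_mm: float, yp_mm: float,
                     format_w_mm: float, format_h_mm: float):
    """Re-express a principal point reported from a bottom-left image origin
    as offsets from the geometrical centre of the format."""
    return xp_mm - format_w_mm / 2.0, yp_mm - format_h_mm / 2.0

# A point reported at exactly half the format width and height is a
# perfectly centred principal point: both centre-origin offsets are zero.
print(to_centre_origin(11.1, 7.4, 22.2, 14.8))
```

After this shift, principal point values from software using a bottom-left origin and software using a centre origin are directly comparable.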

4. Field work

The field work component of this thesis was carried out over four days during August and September. The majority of the photographs were taken on August 10th but, after an initial processing attempt, it was discovered that more photographs were required to complete a loop. Three more field days were completed on the 22nd of August and the 7th and 13th of September.

4.1 Trial

Before the commencement of field work a trial was completed to determine what factors need to be considered during the field work to achieve maximum accuracy. A simple straight line along a walkway at UNSW was used as the test area, due to being able to use the intersections of the pavers as POIs to reference in photographs. The site chosen can be seen in Figure 17.

Figure 17 - Trial site.

The trial was invaluable in helping to understand how Photomodeler works and the importance of referencing corresponding points correctly. One incorrectly referenced point will cause large positional errors in the job and Photomodeler will be unsuccessful in its attempt to process the photographs. It became clear that it can be easy to mis-reference points in areas where there are similar looking POIs, e.g. the joints of pavers on a path. In an attempt to avoid this, care was taken in the field to select distinguishable POIs where possible.

When doing some brief processing in Photomodeler it was discovered that each photograph in the project has three different options for the processing: "Use and adjust", "Use and don't adjust" and "Do not use". This became useful when the incomplete loop was processed. If the project is processed and a photograph is unsuccessfully orientated, Photomodeler will automatically change that photograph's setting to "Do not use" and it must be manually changed back to "Use and adjust" before the project is reprocessed.
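The bookkeeping implied by this behaviour can be sketched as follows. This is not Photomodeler's API, just an illustration of resetting demoted photographs before a reprocess:

```python
from enum import Enum

class PhotoStatus(Enum):
    USE_AND_ADJUST = "Use and adjust"
    USE_DONT_ADJUST = "Use and don't adjust"
    DO_NOT_USE = "Do not use"

def reset_demoted(statuses: dict) -> list:
    """Flip photographs automatically demoted to 'Do not use' back to
    'Use and adjust' so they are included in the next processing run;
    return the photo ids that were changed."""
    touched = []
    for photo_id, status in statuses.items():
        if status is PhotoStatus.DO_NOT_USE:
            statuses[photo_id] = PhotoStatus.USE_AND_ADJUST
            touched.append(photo_id)
    return touched

# Hypothetical project state after a failed orientation of photo 2.
photos = {1: PhotoStatus.USE_AND_ADJUST, 2: PhotoStatus.DO_NOT_USE}
print(reset_demoted(photos))
```

Tracking which photographs were demoted on each run makes it easier to see which images repeatedly fail to orientate and may need extra referencing.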

4.2 Location

The chosen site for the replication of Scarmana's project was the Hut dance studio on campus at the University of NSW. This site was chosen for several distinguishing factors. The first factor influencing the selection of the site was its proximity to Room EE401, where the post-processing was to be completed. A close proximity allowed for easy site access when new photographs were required to complete a section of the loop. The second influencing factor was the simplicity of the loop around the Hut. The loop consisted of four straight sides with no unusual features or sharp turns. As Scarmana had indicated that traversing around tight corners could be particularly difficult, this was a consideration in site selection. The final consideration was the foot traffic and general use of the area. It would be impractical to select an area which deals with heavy foot traffic on a daily basis; the Hut is a relatively quiet pedestrian area.

Figure 18 - The Hut Dance Studio. Traverse path highlighted in blue.

4.3 Field work

A brief inspection of the site was conducted before taking photographs to enable a mental positioning of cameras and an evaluation of the best position for each camera station set-up. The loop must be started in an area where there are several clearly defined POIs because, early in processing, there are limited photographs in which to reference POIs. The area shown in Figure 23 was selected as the optimal starting photograph for the loop traverse. The photograph includes two separate pedestrian crossings as well as a wall of windows and panel intersections. As discussed below in Section 4.4, there were over 40 referenced POIs in this starting photograph.

A manual camera setting was selected for the entirety of the field work to optimise the quality of the photographs. As the photographs were taken from several different positions, some in direct sunlight and some in shaded areas, there was no single perfect combination of aperture and shutter speed values. To compensate for this, the aperture value was set to 9 and the shutter speed was adjusted to suit the conditions for each individual photograph. The shutter speed was increased or decreased until the exposure compensation indicator sat on zero, as this produced the highest quality photographs.

Figure 19 - Displayed photograph values: shutter speed, exposure compensation, aperture and ISO speed.

The exposure compensation scale ranges from -2 to +2 and is used to alter the standard exposure set by the camera. The image can be made to look brighter or darker by changing the exposure time (shutter speed), which in turn changes the exposure compensation value. The starting value of the exposure compensation depends on a combination of how bright the imaged area is and the shutter speed.

During the initial field work 54 photographs were taken in the loop around the Hut. More photographs than necessary were taken to ensure difficult areas were able to be processed. After several attempts at processing it was determined that more photographs from strategic locations were required; at the completion of the field work 80 photographs had been taken over the four days. Scarmana reported that entering and rounding corners were the most difficult parts of the processing (Scarmana, 2010). Although several extra photographs were initially taken near the four corners of the loop, areas of each corner requiring further imaging were identified. The camera was then strategically positioned so that sufficient POIs could be recognised and referenced to other photographs.

This project used a different technique to Scarmana's zigzagging approach to taking photographs.
Three photographs were taken every 15-20 m: one in a forward-looking direction parallel to the direction of travel and the other two to either side in a convergent direction. A typical camera set-up can be seen below in Figure 20. After the trial it was decided that this was the best way for an inexperienced user to achieve the best possible chance of photographing multiple POIs in multiple photographs. This

40 method does have disadvantages as it dramatically increases the number of photographs to be processed and creates a risk of using small angles in the processing calculations. Figure 20 - Camera setup positions In the above Figure the camera stations are 15m apart in the north south direction and 5m in the east west direction. The white dots are referenced POI s. 4.4 Processing in Photomodeler Photographs were loaded into Photomodeler for cross-referencing of common POI s. Initially only two photographs were referenced together with as many POI s as possible and processed. The starting photographs had 40 reference points to assist in strengthening the network during the early stages of the processing. Once Photomodeler had processed the two images and arbitrarily orientated the project a third photograph was added and the project was reprocessed. Images were added one by one in a clockwise direction around the Hut and the project was always processed after the addition of one or two images. Each time the project is processed, Photomodeler completes a bundle adjustment for all photographs simultaneously by referencing each image individually. By processing the project continuously, errors are easily spotted. A single full project adjustment at the end of the referencing process can make it difficult for an untrained photogrammetrist to spot potential error sources in the adjustment should it fail. The referencing procedure is the major stage of the processing of the photographs in Photomodeler. It involves marking several points in one photograph and then referencing them to other photographs by marking corresponding points. Photomodeler requires at least six common reference points on each image before it will attempt any orientation and adjustment calculations. If Photomodeler is successful in its processing and the photographs are marked as orientated then it is possible to proceed to referencing a new image. 
When a photograph has been orientated, Photomodeler has solved for the relative positions (xyz and rotations) from which the image was 33

41 recorded. Although Photomodeler will give the option to process a photograph after six reference points are marked on it, this will usually fail. After several trials it became clear that the orientation process will only succeed with between 10 and 15 referenced POI s. A spread of these reference points across the image is also important to strengthen the geometry of the network for the bundle adjustment. When two images have been successfully orientated Photomodeler will display epipolar lines to further aid in the referencing process. Epipolar lines represent the rays produced from the principal point of the first image projected onto the second image (Schmalfeldt, 2003). In other words a ray is produced upon which the point on the first image should appear on the second. If multiple orientated photographs are used then multiple epipolar lines will be produced. If the initial orientation was strong then the referenced point should appear at the intersection point of all epipolar lines. Figure 21 below shows an intersection of multiple epipolar lines. Figure 21 - Epipolar lines intersection. As mentioned by Scarmana there were difficulties when entering and rounding the corners of the loop. At each corner there were insufficient visible POI s in the images to successfully navigate around. After a careful analysis of what needed to be imaged several more photographs were taken to ensure the completion of the loop. The ability to determine the optimal camera positions for corners is acquired with experience in the field. After each adjustment Photomodeler provides a list of possible errors and indicates the point with the highest residual. Errors listed could refer to poor geometry of points on the photographs, a maximum residual over 5 pixels and whether or not all images have enough common tie points to create one complete loop or series. Figure 22 below shows the final output by Photomodeler with camera positions and referenced POI displayed. 
The output has been overlaid onto an image of the site obtained from NearMap.
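The epipolar relationship described above can be sketched numerically. In conventional notation (which Photomodeler does not expose directly), two orientated images are related by a fundamental matrix F, and a point x in the first image maps to the epipolar line l' = Fx in the second; the matching point should lie on, or very near, that line. A minimal sketch, with an illustrative F for a pure horizontal camera translation (none of these values are from this project):

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line l' = F @ x in image 2 for a homogeneous pixel x in image 1."""
    l = F @ x
    return l / np.linalg.norm(l[:2])   # normalise so point-line distance is in pixels

def point_line_distance(l, x2):
    """Perpendicular distance of homogeneous point x2 from the normalised line l."""
    return abs(l @ x2)

# For a pure horizontal translation, corresponding points share the same row (y' = y):
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])

x1 = np.array([150.0, 200.0, 1.0])                 # marked POI in image 1
l2 = epipolar_line(F, x1)
print(point_line_distance(l2, np.array([400.0, 200.0, 1.0])))  # candidate on the line
print(point_line_distance(l2, np.array([400.0, 203.0, 1.0])))  # candidate 3 px off it
```

A residual of a few pixels from the epipolar line is exactly the kind of marking error Photomodeler reports after each adjustment.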

Figure 22 - Processing results.

4.5 Coordinate system

It is critical to incorporate an accurate coordinate system into the processing, as it forms the basis for all of the internal measurements made by the bundle adjustment and collinearity equations. If errors are made in the initial control coordinates then all internal calculations and measurements will be compromised. The coordinate system not only gives the entire project a reference frame but also defines the project scale and orientation. Photomodeler requires three coordinated points (xyz) from one photograph to be input before any calculated measurements can be made (the three green points in Figure 23 below). It is critical for the accuracy of the project that the three coordinated points do not lie in a single line. Improved accuracy would be achieved if Photomodeler allowed the input of four control points, so that the points need not all lie on the same plane.

Figure 23 - Control point coordinates.
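The requirement that the three control points not lie on a single line can be checked before they are entered: the cross product of the two difference vectors vanishes exactly when the points are collinear. A minimal sketch (the coordinates are hypothetical, not the project's control values):

```python
import numpy as np

def collinear(p1, p2, p3, tol=1e-6):
    """True if three 3D points lie (nearly) on a single line.

    The cross product of the two difference vectors is (near) zero
    precisely when the points are collinear.
    """
    v1 = np.asarray(p2, float) - np.asarray(p1, float)
    v2 = np.asarray(p3, float) - np.asarray(p1, float)
    return np.linalg.norm(np.cross(v1, v2)) < tol

# Hypothetical control points (E, N, H) in a local system:
print(collinear([0, 0, 0], [10, 0, 0], [20, 0, 0]))    # unusable as control
print(collinear([0, 0, 0], [10, 0, 0], [10, 8, 0.5]))  # acceptable geometry
```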

Because the control points essentially anchor the project, the greater the redundancy in these point positions, the higher the total project accuracy. Control points should therefore appear in as many photographs as possible. As all the results of this thesis are expressed in a local coordinate system with internal local positional accuracies, it was possible to define an arbitrary coordinate system. A Sokkia Set530rk was used to create the coordinates for the three control points in this arbitrary coordinate system. While the instrument was set up, several other points were also coordinated for use as intermittent check points during the processing of the photographs. The control points are listed in the table below.

Table 5 - Control coordinates (Pt ID, E, N, H).

Total Station Check

To check the accuracy of the positions produced by Photomodeler, a comparison of several points was made using a Sokkia total station. Three station set-ups were conducted and 11 points were coordinated with respect to the defined coordinate system. The points were spread around the loop to indicate where the largest positional differences occurred. Section 5 below outlines the results of the comparisons between the coordinates obtained by traditional survey methods and those obtained by close range photogrammetry.

Figure 24 - Control network.

5. Results and Analysis

5.1 Full Loop

The first full loop processing of the photographs in Photomodeler consisted of 284 referenced points in 52 photographs. The project achieved a successful overall adjustment, with the largest residual being 4.8 pixels. Photomodeler suggests that the largest residual should not exceed 5 pixels for a successful project. Photomodeler will still adjust the project, but an alert will inform the user that the largest residual falls outside the tolerances of an accurate job. Tolerances can be changed depending on the accuracy required for the project.

5.1.1 Point accuracies

The first comparison made from Photomodeler's output of coordinates was the precision of each point. Table 6 and Figure 25 below show a random sample of the point precisions, or error bars, from around the loop traverse. The error bars have been magnified 100x for ease of viewing. As expected, the error bars increased as the points moved away from the origin, because the errors propagate from this origin point in equal amounts in both the clockwise and anticlockwise directions (e.g. points 834 and 988).

Figure 25 - Plotted error bars for easting and northing. Note: four points exist directly south of point 834 but were omitted from this graph for ease of viewing.
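The way errors grow away from the fixed origin, and shrink again as a closed loop ties back to it, can be illustrated with a simple one-dimensional random-walk sketch. This is purely illustrative: the per-station errors are invented, and the closed-loop case is approximated by a Bowditch-style proportional distribution of the misclosure, not by Photomodeler's bundle adjustment.

```python
import random

random.seed(1)
n = 52                                              # stations around the loop
steps = [random.gauss(0, 0.02) for _ in range(n)]   # invented per-station errors (m)

# Open traverse: errors simply accumulate away from the fixed origin.
open_err = []
total = 0.0
for s in steps:
    total += s
    open_err.append(abs(total))

# Closed loop: distributing the misclosure linearly back through the traverse
# forces the final station back onto the origin.
misclosure = sum(steps)
closed_err = [abs(sum(steps[:k + 1]) - misclosure * (k + 1) / n) for k in range(n)]

print(f"open-ended final error  {open_err[-1]:.3f} m")
print(f"closed-loop final error {closed_err[-1]:.3f} m")
```

The closed-loop correction drives the end-of-loop error to zero by construction, mirroring the finding in Section 5.2 that an incomplete loop produces far larger positional errors.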

Table 6 - Positional precisions (Point ID; X, Y and Z precision in cm).

At the beginning of the loop the error bars are of the order of 2-3 cm in the x direction and 3-4 cm in the y direction. It would initially be assumed that the error bars would be similar in both directions, as no significant gross errors had yet been introduced into the project and the errors would consist only of random errors. The precision in the x direction ranges from 1.6 cm to 22 cm, and from 2 cm to 58 cm in the y direction. Both maximum values occurred at point 838, which was 120 m from the starting point. Points 828, 836, 838 and 856 all have significantly worse precisions in all directions when compared to the other points in the loop; in particular, their y and z precision components are more than twice as large as those of all other points. This sudden jump in precision is attributed to the fact that these points were photographed from a distance of over 50 m and appear in only two photographs. Clearly marking a point at such a distance is difficult, as the image becomes blurred and pixelated when the photograph is enlarged.

Although Scarmana does not record the precisions he achieved in his final results, it is assumed that they would be of a similar magnitude.

5.1.2 AutoCAD Comparisons

The best comparison that can be made is between the coordinates produced by Photomodeler and the control points coordinated using the Sokkia total station. Table 7 below lists the coordinates of 13 randomly distributed points around the loop and the differences in their coordinates.

Table 7 - Coordinate comparison (order from start; Pt ID; CAD coordinates E, N, H; photo coordinates E, N, H; differences E, N, H).

From Table 7 it can be seen that the discrepancies between Photomodeler's adjusted coordinates and the control points grow larger as the points move away from the fixed points (4, 7 and 13). This is as expected: with any loop traverse, the errors increase with distance from the control points held fixed. At point 834, which is furthest from the three fixed control points, the errors are the largest in all three directions. The maximum horizontal positional error obtained in this project was 0.651 m. As Scarmana achieved an average error of 1 m for every 150 m travelled in the horizontal direction, it was assumed the error would be of a similar magnitude. The height component for all points is significantly smaller than the x and y components because the points all lie on the ground; any points that are raised significantly have a considerably larger error in height. As the samples analysed here are a random selection of the data points, it is expected that the overall project accuracy is in this range. The positional accuracies also correlate with the x and y precisions discussed in Section 5.1.1 above. From the table it was predictable that the point with the largest discrepancy from the calculated AutoCAD coordinate would be point 834.

Easting difference

The easting component of the positional errors ranges from 1.9 mm to 480 mm. This falls within the expected accuracies for a project of this distance using a quality camera. The graph below shows the easting discrepancies from the calculated coordinates increasing as the points move away from the starting point to a maximum distance at point 834, then decreasing as the loop is closed and the points approach the starting control points that were held fixed.

Figure 26 - Easting Errors.

The root mean square (RMS) value was calculated for the easting component of the positional error using the formula

RMS = √(ΣD² / n)    (6)

where D is the positional difference between Photomodeler and AutoCAD and n is the number of errors being calculated. The RMS value for the easting component was calculated at m with a standard deviation of m.

Northing difference

In the northing direction the positional errors ranged from 63 mm at the point closest to the starting marks to 647 mm at the points furthest away. The northing errors follow the same trend as the easting errors: they increase to a maximum at the furthest point and then decrease back to small errors as the points converge on the start. The graph below shows the movement of the northing errors.

Figure 27 - Northing Errors.

It is interesting to note that point 834 does not have the largest northing error, as would have been expected. Point 722, which is closer to the start than 834, has a much higher error in the northing direction; it is uncertain what gross errors led to this point having the largest northing error. For all points the error in the northerly direction was larger than its easterly counterpart. This is because the entire project runs predominantly in a north-south direction, with limited movement east-west. For the northing errors, the RMS is m and the standard deviation is m.

Height difference

The error in the height component of each point is relatively small and quite consistent, with the exception of point 834. All height differences were less than 50 mm except for two (points 1154 and 834). The graph below again demonstrates an increase, then decrease, in positional differences. It appears that the higher a marked point sits above the original control points, the larger the error in its height. Almost all points are on the same level as the control points except for points 1154 and 834, and their errors reflect this. Point 834 is over two metres above the heights of the three anchored control points and hence has an error significantly larger (around six times) than all other points. This suggests that Photomodeler has difficulty calculating the heights of points in images, and that as the height difference increases the errors grow at a faster rate.
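Equation (6) and the accompanying standard deviation are straightforward to compute. A short sketch using invented difference values (not the project's data); note that RMS² = std² + mean², so the RMS can never be smaller than the standard deviation:

```python
import math

def rms(diffs):
    """Root mean square: sqrt(sum(D^2) / n), as in equation (6)."""
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def std_dev(diffs):
    """Population standard deviation of the differences about their mean."""
    m = sum(diffs) / len(diffs)
    return math.sqrt(sum((d - m) ** 2 for d in diffs) / len(diffs))

# Invented height differences (m), for illustration only:
d = [0.01, -0.02, 0.03, 0.05, -0.04, 0.30]
print(f"RMS {rms(d):.3f} m, std dev {std_dev(d):.3f} m")
```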

Figure 28 - Height Errors.

For the height errors, the RMS is 0.092 m and the standard deviation is 0.082 m.

5.1.3 Point Residuals

The point residuals of the project all fell within the 5 pixel tolerance suggested by Photomodeler. Some points did exceed the 5 pixel tolerance during processing, but all were redundant points and were simply removed from the photograph(s). Photomodeler has a function that continuously displays the highest residual in pixels, the point it belongs to and the photograph it appears on. Overall the residuals of the project averaged pixels, which was less than expected; after research and viewing of Photomodeler's tutorial videos it was originally thought that the average would be around 3-4 pixels due to the inexperience of the user. Table 8 below lists the five points with the worst residuals.

Table 8 - Top 5 worst residuals (Id; X, Y, Z in m; largest residual in pixels; project average).

Investigation of the points with higher residuals, including those removed, showed three distinguishing factors common to all of them. The first was the distance of the referenced point from the camera station: the further away the point, the larger its residual. Accurate selection of the right reference mark requires zooming into the photograph, and once the photograph is zoomed in too far, all objects in the image, including the reference point, become pixelated. If the mark is crucial to the project then the exact position of the reference point must be estimated; otherwise the point should be ignored and not referenced.

The second observation concerned the object point being referenced. In positions where limited POIs were available, improvisation was necessary and features such as cracks in the path and walls were used. To ensure visibility in multiple photographs, quite significant cracks must be used as reference points. Ideally the middle of a crack should be selected, as this feature is usually distinguishable by colour in all photographs, whereas edges may not always be discernible. Errors occur when the exact centre of the crack is not selected in every photograph.

The third factor common to the high residuals became evident when referencing corners or edges of buildings and features. The reliability and accuracy with which corners and edges can be identified is highly dependent on the perspective of the image relative to the feature of interest. Unless an image is taken perpendicular to the face of a structure, the positions of its edges and corners will be difficult to reference accurately.

All errors that occurred during the project are now knowledge for future work. If the project were repeated, care would be taken to select more appropriate POIs where possible, or to use portable manual targets in the field.

5.1.4 Point Angles

It is evident that a significant improvement in the accuracy of the results could be achieved with an increase in intersection angles. Due to inexperience in the photogrammetry field, several points were calculated using very poor geometry. It is suggested that good overall accuracy requires intersection angles between projection rays of between 60° and 90°, because smaller angles will compromise the geometry of the bundle adjustment.
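The intersection angle at a point can be recovered from the two camera stations that observe it, which makes the effect of point distance on geometry easy to see. A small sketch (station and point coordinates are hypothetical, chosen only to match the 15 m station spacing of Figure 20):

```python
import math

def intersection_angle(station_a, station_b, point):
    """Angle in degrees at `point` between the rays back to the two camera stations."""
    va = [a - p for a, p in zip(station_a, point)]
    vb = [b - p for b, p in zip(station_b, point)]
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    return math.degrees(math.acos(dot / (na * nb)))

# Stations 15 m apart observing a POI 10 m ahead gives a strong angle:
print(intersection_angle((0, 0, 0), (15, 0, 0), (7.5, 10, 0)))
# The same baseline with a POI 50 m away gives a much weaker angle:
print(intersection_angle((0, 0, 0), (15, 0, 0), (7.5, 50, 0)))
```

With the baseline fixed, moving the point five times further away collapses the intersection angle from roughly 74° to roughly 17°, which is why the distant points discussed in Section 5.1.1 had the worst precisions.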
Photomodeler will continue to process the project regardless of the geometry of each projection ray and will only notify the user after processing if the project contains an angle of less than 5°. The table below lists the five smallest angles in the project, all of which were subsequently removed. No further points could be removed without compromising the integrity of other sections of the project.

Table 9 - Top 5 worst angles (Id; X, Y, Z in m; angle in degrees; project average).

The final project had angles ranging from 5° to 90°, with an average of 46° and a median of 48°. This is well below what is recommended as the standard in the photogrammetry field. However, there were sections of the loop that required small angles due to the limited space. In areas where the field of view was limited it was necessary to compromise between using small angles and capturing enough POIs to reference to other photographs; the south-west corner of the traverse was particularly difficult due to the narrow path between two opposing buildings.

Based on the results of this project, it is evident that, with further experience in the field of photogrammetry, it would be possible to set the camera stations closer to their optimal positions. Network geometry is crucial to project accuracy, and without doubt the accuracy of this project could be increased by improving the ray intersection angles.

5.2 Incomplete Loop

For comparison purposes, the project was processed again with several photographs set to "Do not use". To gain an accurate reading of the types of errors that occur when the loop is incomplete, there should be no connection between the photographs at the start and end of the loop. Because almost all of the photographs on the western side of the loop use points from photographs 2 and 3 to orientate themselves, fifteen photographs (19, 21-29, 31-32, 68 and 71-72) were taken out of the loop before it was reprocessed. Figure 29 below illustrates the incomplete loop.

Figure 29 - Incomplete loop.


More information

PROPERTY OF THE LARGE FORMAT DIGITAL AERIAL CAMERA DMC II

PROPERTY OF THE LARGE FORMAT DIGITAL AERIAL CAMERA DMC II PROPERTY OF THE LARGE FORMAT DIGITAL AERIAL CAMERA II K. Jacobsen a, K. Neumann b a Institute of Photogrammetry and GeoInformation, Leibniz University Hannover, Germany jacobsen@ipi.uni-hannover.de b Z/I

More information

NREM 345 Week 2, Material covered this week contributes to the accomplishment of the following course goal:

NREM 345 Week 2, Material covered this week contributes to the accomplishment of the following course goal: NREM 345 Week 2, 2010 Reading assignment: Chapter. 4 and Sec. 5.1 to 5.2.4 Material covered this week contributes to the accomplishment of the following course goal: Goal 1: Develop the understanding and

More information

Zoom-Dependent Camera Calibration in Digital Close-Range Photogrammetry

Zoom-Dependent Camera Calibration in Digital Close-Range Photogrammetry Zoom-Dependent Camera Calibration in Digital Close-Range Photogrammetry C.S. Fraser and S. Al-Ajlouni Abstract One of the well-known constraints applying to the adoption of consumer-grade digital cameras

More information

HD aerial video for coastal zone ecological mapping

HD aerial video for coastal zone ecological mapping HD aerial video for coastal zone ecological mapping Albert K. Chong University of Otago, Dunedin, New Zealand Phone: +64 3 479-7587 Fax: +64 3 479-7586 Email: albert.chong@surveying.otago.ac.nz Presented

More information

Engineering Graphics Essentials with AutoCAD 2015 Instruction

Engineering Graphics Essentials with AutoCAD 2015 Instruction Kirstie Plantenberg Engineering Graphics Essentials with AutoCAD 2015 Instruction Text and Video Instruction Multimedia Disc SDC P U B L I C AT I O N S Better Textbooks. Lower Prices. www.sdcpublications.com

More information

MODULE No. 34: Digital Photography and Enhancement

MODULE No. 34: Digital Photography and Enhancement SUBJECT Paper No. and Title Module No. and Title Module Tag PAPER No. 8: Questioned Document FSC_P8_M34 TABLE OF CONTENTS 1. Learning Outcomes 2. Introduction 3. Cameras and Scanners 4. Image Enhancement

More information

Robert B.Hallock Draft revised April 11, 2006 finalpaper2.doc

Robert B.Hallock Draft revised April 11, 2006 finalpaper2.doc How to Optimize the Sharpness of Your Photographic Prints: Part II - Practical Limits to Sharpness in Photography and a Useful Chart to Deteremine the Optimal f-stop. Robert B.Hallock hallock@physics.umass.edu

More information

DEVELOPMENT AND APPLICATION OF AN EXTENDED GEOMETRIC MODEL FOR HIGH RESOLUTION PANORAMIC CAMERAS

DEVELOPMENT AND APPLICATION OF AN EXTENDED GEOMETRIC MODEL FOR HIGH RESOLUTION PANORAMIC CAMERAS DEVELOPMENT AND APPLICATION OF AN EXTENDED GEOMETRIC MODEL FOR HIGH RESOLUTION PANORAMIC CAMERAS D. Schneider, H.-G. Maas Dresden University of Technology Institute of Photogrammetry and Remote Sensing

More information

Quintic Hardware Tutorial Camera Set-Up

Quintic Hardware Tutorial Camera Set-Up Quintic Hardware Tutorial Camera Set-Up 1 All Quintic Live High-Speed cameras are specifically designed to meet a wide range of needs including coaching, performance analysis and research. Quintic LIVE

More information

This experiment is under development and thus we appreciate any and all comments as we design an interesting and achievable set of goals.

This experiment is under development and thus we appreciate any and all comments as we design an interesting and achievable set of goals. Experiment 7 Geometrical Optics You will be introduced to ray optics and image formation in this experiment. We will use the optical rail, lenses, and the camera body to quantify image formation and magnification;

More information

Applications of Optics

Applications of Optics Nicholas J. Giordano www.cengage.com/physics/giordano Chapter 26 Applications of Optics Marilyn Akins, PhD Broome Community College Applications of Optics Many devices are based on the principles of optics

More information

Topic 6 - Optics Depth of Field and Circle Of Confusion

Topic 6 - Optics Depth of Field and Circle Of Confusion Topic 6 - Optics Depth of Field and Circle Of Confusion Learning Outcomes In this lesson, we will learn all about depth of field and a concept known as the Circle of Confusion. By the end of this lesson,

More information

ENGINEERING GRAPHICS ESSENTIALS

ENGINEERING GRAPHICS ESSENTIALS ENGINEERING GRAPHICS ESSENTIALS with AutoCAD 2012 Instruction Introduction to AutoCAD Engineering Graphics Principles Hand Sketching Text and Independent Learning CD Independent Learning CD: A Comprehensive

More information

RESULTS OF 3D PHOTOGRAMMETRY ON THE CMS BARREL YOKE

RESULTS OF 3D PHOTOGRAMMETRY ON THE CMS BARREL YOKE RESULTS OF 3D PHOTOGRAMMETRY ON THE CMS BARREL YOKE R. GOUDARD, C. HUMBERTCLAUDE *1, K. NUMMIARO CERN, European Laboratory for Particle Physics, Geneva, Switzerland 1. INTRODUCTION Compact Muon Solenoid

More information

CALIBRATION OF OPTICAL SATELLITE SENSORS

CALIBRATION OF OPTICAL SATELLITE SENSORS CALIBRATION OF OPTICAL SATELLITE SENSORS KARSTEN JACOBSEN University of Hannover Institute of Photogrammetry and Geoinformation Nienburger Str. 1, D-30167 Hannover, Germany jacobsen@ipi.uni-hannover.de

More information

CALIBRATING THE NEW ULTRACAM OSPREY OBLIQUE AERIAL SENSOR Michael Gruber, Wolfgang Walcher

CALIBRATING THE NEW ULTRACAM OSPREY OBLIQUE AERIAL SENSOR Michael Gruber, Wolfgang Walcher CALIBRATING THE NEW ULTRACAM OSPREY OBLIQUE AERIAL SENSOR Michael Gruber, Wolfgang Walcher Microsoft UltraCam Business Unit Anzengrubergasse 8/4, 8010 Graz / Austria {michgrub, wwalcher}@microsoft.com

More information

Chapter 18 Optical Elements

Chapter 18 Optical Elements Chapter 18 Optical Elements GOALS When you have mastered the content of this chapter, you will be able to achieve the following goals: Definitions Define each of the following terms and use it in an operational

More information

Optical Performance of Nikon F-Mount Lenses. Landon Carter May 11, Measurement and Instrumentation

Optical Performance of Nikon F-Mount Lenses. Landon Carter May 11, Measurement and Instrumentation Optical Performance of Nikon F-Mount Lenses Landon Carter May 11, 2016 2.671 Measurement and Instrumentation Abstract In photographic systems, lenses are one of the most important pieces of the system

More information

PHOTOGRAMMETRIC RESECTION DIFFERENCES BASED ON LABORATORY vs. OPERATIONAL CALIBRATIONS

PHOTOGRAMMETRIC RESECTION DIFFERENCES BASED ON LABORATORY vs. OPERATIONAL CALIBRATIONS PHOTOGRAMMETRIC RESECTION DIFFERENCES BASED ON LABORATORY vs. OPERATIONAL CALIBRATIONS Dean C. MERCHANT Topo Photo Inc. Columbus, Ohio USA merchant.2@osu.edu KEY WORDS: Photogrammetry, Calibration, GPS,

More information

Focus on an optical blind spot A closer look at lenses and the basics of CCTV optical performances,

Focus on an optical blind spot A closer look at lenses and the basics of CCTV optical performances, Focus on an optical blind spot A closer look at lenses and the basics of CCTV optical performances, by David Elberbaum M any security/cctv installers and dealers wish to know more about lens basics, lens

More information

Understanding Projection Systems

Understanding Projection Systems Understanding Projection Systems A Point: A point has no dimensions, a theoretical location that has neither length, width nor height. A point shows an exact location in space. It is important to understand

More information

Camera Calibration Certificate No: DMC III 27542

Camera Calibration Certificate No: DMC III 27542 Calibration DMC III Camera Calibration Certificate No: DMC III 27542 For Peregrine Aerial Surveys, Inc. #201 1255 Townline Road Abbotsford, B.C. V2T 6E1 Canada Calib_DMCIII_27542.docx Document Version

More information

Aerial photography: Principles. Frame capture sensors: Analog film and digital cameras

Aerial photography: Principles. Frame capture sensors: Analog film and digital cameras Aerial photography: Principles Frame capture sensors: Analog film and digital cameras Overview Introduction Frame vs scanning sensors Cameras (film and digital) Photogrammetry Orthophotos Air photos are

More information

DETERMINATION OF ST. GEORGE BASILICA TOWER HISTORICAL INCLINATION FROM CONTEMPORARY PHOTOGRAPH

DETERMINATION OF ST. GEORGE BASILICA TOWER HISTORICAL INCLINATION FROM CONTEMPORARY PHOTOGRAPH DETERMINATION OF ST. GEORGE BASILICA TOWER HISTORICAL INCLINATION FROM CONTEMPORARY PHOTOGRAPH Bronislav KOSKA 1 1 Czech Technical University in Prague, Faculty of Civil Engineering Thákurova 7, Prague

More information

EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES 4.2 AIM 4.1 INTRODUCTION

EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES 4.2 AIM 4.1 INTRODUCTION EXPERIMENT 4 INVESTIGATIONS WITH MIRRORS AND LENSES Structure 4.1 Introduction 4.2 Aim 4.3 What is Parallax? 4.4 Locating Images 4.5 Investigations with Real Images Focal Length of a Concave Mirror Focal

More information

ECEN 4606, UNDERGRADUATE OPTICS LAB

ECEN 4606, UNDERGRADUATE OPTICS LAB ECEN 4606, UNDERGRADUATE OPTICS LAB Lab 2: Imaging 1 the Telescope Original Version: Prof. McLeod SUMMARY: In this lab you will become familiar with the use of one or more lenses to create images of distant

More information

IMAGE FORMATION. Light source properties. Sensor characteristics Surface. Surface reflectance properties. Optics

IMAGE FORMATION. Light source properties. Sensor characteristics Surface. Surface reflectance properties. Optics IMAGE FORMATION Light source properties Sensor characteristics Surface Exposure shape Optics Surface reflectance properties ANALOG IMAGES An image can be understood as a 2D light intensity function f(x,y)

More information

Film Cameras Digital SLR Cameras Point and Shoot Bridge Compact Mirror less

Film Cameras Digital SLR Cameras Point and Shoot Bridge Compact Mirror less Film Cameras Digital SLR Cameras Point and Shoot Bridge Compact Mirror less Portraits Landscapes Macro Sports Wildlife Architecture Fashion Live Music Travel Street Weddings Kids Food CAMERA SENSOR

More information

Photographing Art By Mark Pemberton March 26, 2009

Photographing Art By Mark Pemberton March 26, 2009 Photographing Art By Mark Pemberton March 26, 2009 Introduction Almost all artists need to photograph their artwork at some time or another. Usually this is for the purpose of creating a portfolio of their

More information

Cameras have number of controls that allow the user to change the way the photograph looks.

Cameras have number of controls that allow the user to change the way the photograph looks. Anatomy of a camera - Camera Controls Cameras have number of controls that allow the user to change the way the photograph looks. Focus In the eye the cornea and the lens adjust the focus on the retina.

More information

ContextCapture Quick guide for photo acquisition

ContextCapture Quick guide for photo acquisition ContextCapture Quick guide for photo acquisition ContextCapture is automatically turning photos into 3D models, meaning that the quality of the input dataset has a deep impact on the output 3D model which

More information

6.098 Digital and Computational Photography Advanced Computational Photography. Bill Freeman Frédo Durand MIT - EECS

6.098 Digital and Computational Photography Advanced Computational Photography. Bill Freeman Frédo Durand MIT - EECS 6.098 Digital and Computational Photography 6.882 Advanced Computational Photography Bill Freeman Frédo Durand MIT - EECS Administrivia PSet 1 is out Due Thursday February 23 Digital SLR initiation? During

More information

Principles of Photogrammetry

Principles of Photogrammetry Winter 2014 1 Instructor: Contact Information. Office: Room # ENE 229C. Tel: (403) 220-7105. E-mail: ahabib@ucalgary.ca Lectures (SB 148): Monday, Wednesday& Friday (10:00 a.m. 10:50 a.m.). Office Hours:

More information

MULTIPLE SENSORS LENSLETS FOR SECURE DOCUMENT SCANNERS

MULTIPLE SENSORS LENSLETS FOR SECURE DOCUMENT SCANNERS INFOTEH-JAHORINA Vol. 10, Ref. E-VI-11, p. 892-896, March 2011. MULTIPLE SENSORS LENSLETS FOR SECURE DOCUMENT SCANNERS Jelena Cvetković, Aleksej Makarov, Sasa Vujić, Vlatacom d.o.o. Beograd Abstract -

More information

APPLICATION AND ACCURACY POTENTIAL OF A STRICT GEOMETRIC MODEL FOR ROTATING LINE CAMERAS

APPLICATION AND ACCURACY POTENTIAL OF A STRICT GEOMETRIC MODEL FOR ROTATING LINE CAMERAS APPLICATION AND ACCURACY POTENTIAL OF A STRICT GEOMETRIC MODEL FOR ROTATING LINE CAMERAS D. Schneider, H.-G. Maas Dresden University of Technology Institute of Photogrammetry and Remote Sensing Mommsenstr.

More information

I-I. S/Scientific Report No. I. Duane C. Brown. C-!3 P.O0. Box 1226 Melbourne, Florida

I-I. S/Scientific Report No. I. Duane C. Brown. C-!3 P.O0. Box 1226 Melbourne, Florida S AFCRL.-63-481 LOCATION AND DETERMINATION OF THE LOCATION OF THE ENTRANCE PUPIL -0 (CENTER OF PROJECTION) I- ~OF PC-1000 CAMERA IN OBJECT SPACE S Ronald G. Davis Duane C. Brown - L INSTRUMENT CORPORATION

More information

Using Optics to Optimize Your Machine Vision Application

Using Optics to Optimize Your Machine Vision Application Expert Guide Using Optics to Optimize Your Machine Vision Application Introduction The lens is responsible for creating sufficient image quality to enable the vision system to extract the desired information

More information

Ch 24. Geometric Optics

Ch 24. Geometric Optics text concept Ch 24. Geometric Optics Fig. 24 3 A point source of light P and its image P, in a plane mirror. Angle of incidence =angle of reflection. text. Fig. 24 4 The blue dashed line through object

More information

CAMERA BASICS. Stops of light

CAMERA BASICS. Stops of light CAMERA BASICS Stops of light A stop of light isn t a quantifiable measurement it s a relative measurement. A stop of light is defined as a doubling or halving of any quantity of light. The word stop is

More information

ELIMINATION OF COLOR FRINGES IN DIGITAL PHOTOGRAPHS CAUSED BY LATERAL CHROMATIC ABERRATION

ELIMINATION OF COLOR FRINGES IN DIGITAL PHOTOGRAPHS CAUSED BY LATERAL CHROMATIC ABERRATION ELIMINATION OF COLOR FRINGES IN DIGITAL PHOTOGRAPHS CAUSED BY LATERAL CHROMATIC ABERRATION V. Kaufmann, R. Ladstädter Institute of Remote Sensing and Photogrammetry Graz University of Technology, Austria

More information

An Introduction to Automatic Optical Inspection (AOI)

An Introduction to Automatic Optical Inspection (AOI) An Introduction to Automatic Optical Inspection (AOI) Process Analysis The following script has been prepared by DCB Automation to give more information to organisations who are considering the use of

More information

Laboratory 7: Properties of Lenses and Mirrors

Laboratory 7: Properties of Lenses and Mirrors Laboratory 7: Properties of Lenses and Mirrors Converging and Diverging Lens Focal Lengths: A converging lens is thicker at the center than at the periphery and light from an object at infinity passes

More information

High Precision Positioning Unit 1: Accuracy, Precision, and Error Student Exercise

High Precision Positioning Unit 1: Accuracy, Precision, and Error Student Exercise High Precision Positioning Unit 1: Accuracy, Precision, and Error Student Exercise Ian Lauer and Ben Crosby (Idaho State University) This assignment follows the Unit 1 introductory presentation and lecture.

More information

Sensors and Sensing Cameras and Camera Calibration

Sensors and Sensing Cameras and Camera Calibration Sensors and Sensing Cameras and Camera Calibration Todor Stoyanov Mobile Robotics and Olfaction Lab Center for Applied Autonomous Sensor Systems Örebro University, Sweden todor.stoyanov@oru.se 20.11.2014

More information

PERFORMANCE EVALUATIONS OF MACRO LENSES FOR DIGITAL DOCUMENTATION OF SMALL OBJECTS

PERFORMANCE EVALUATIONS OF MACRO LENSES FOR DIGITAL DOCUMENTATION OF SMALL OBJECTS PERFORMANCE EVALUATIONS OF MACRO LENSES FOR DIGITAL DOCUMENTATION OF SMALL OBJECTS ideharu Yanagi a, Yuichi onma b, irofumi Chikatsu b a Spatial Information Technology Division, Japan Association of Surveyors,

More information

IMAGE SENSOR SOLUTIONS. KAC-96-1/5" Lens Kit. KODAK KAC-96-1/5" Lens Kit. for use with the KODAK CMOS Image Sensors. November 2004 Revision 2

IMAGE SENSOR SOLUTIONS. KAC-96-1/5 Lens Kit. KODAK KAC-96-1/5 Lens Kit. for use with the KODAK CMOS Image Sensors. November 2004 Revision 2 KODAK for use with the KODAK CMOS Image Sensors November 2004 Revision 2 1.1 Introduction Choosing the right lens is a critical aspect of designing an imaging system. Typically the trade off between image

More information

Using PhotoModeler for 2D Template Digitizing Eos Systems Inc.

Using PhotoModeler for 2D Template Digitizing Eos Systems Inc. Using PhotoModeler for 2D Template Digitizing 2017 Eos Systems Inc. Table of Contents The Problem... 3 Why use a photogrammetry package?... 3 Caveats and License to Use... 3 The Basic Premise... 3 The

More information

Optical design of a high resolution vision lens

Optical design of a high resolution vision lens Optical design of a high resolution vision lens Paul Claassen, optical designer, paul.claassen@sioux.eu Marnix Tas, optical specialist, marnix.tas@sioux.eu Prof L.Beckmann, l.beckmann@hccnet.nl Summary:

More information

Exposure settings & Lens choices

Exposure settings & Lens choices Exposure settings & Lens choices Graham Relf Tynemouth Photographic Society September 2018 www.tynemouthps.org We will look at the 3 variables available for manual control of digital photos: Exposure time/duration,

More information

LENSES. INEL 6088 Computer Vision

LENSES. INEL 6088 Computer Vision LENSES INEL 6088 Computer Vision Digital camera A digital camera replaces film with a sensor array Each cell in the array is a Charge Coupled Device light-sensitive diode that converts photons to electrons

More information

Integral 3-D Television Using a 2000-Scanning Line Video System

Integral 3-D Television Using a 2000-Scanning Line Video System Integral 3-D Television Using a 2000-Scanning Line Video System We have developed an integral three-dimensional (3-D) television that uses a 2000-scanning line video system. An integral 3-D television

More information

INSTRUCTION MANUAL: PHOTOGRAMMETRY AS A NON-CONTACT MEASUREMENT SYSTEM IN LARGE SCALE STRUCTURAL TESTING

INSTRUCTION MANUAL: PHOTOGRAMMETRY AS A NON-CONTACT MEASUREMENT SYSTEM IN LARGE SCALE STRUCTURAL TESTING INSTRUCTION MANUAL: PHOTOGRAMMETRY AS A NON-CONTACT MEASUREMENT SYSTEM IN LARGE SCALE STRUCTURAL TESTING CEE 597 Summer Independent Study Deliverable Submitted: August 16, 2012 Anahid A. Behrouzi, Rui

More information

CALIBRATION OF IMAGING SATELLITE SENSORS

CALIBRATION OF IMAGING SATELLITE SENSORS CALIBRATION OF IMAGING SATELLITE SENSORS Jacobsen, K. Institute of Photogrammetry and GeoInformation, University of Hannover jacobsen@ipi.uni-hannover.de KEY WORDS: imaging satellites, geometry, calibration

More information

Lab #10 Digital Orthophoto Creation (Using Leica Photogrammetry Suite)

Lab #10 Digital Orthophoto Creation (Using Leica Photogrammetry Suite) Lab #10 Digital Orthophoto Creation (Using Leica Photogrammetry Suite) References: Leica Photogrammetry Suite Project Manager: Users Guide, Leica Geosystems LLC. Leica Photogrammetry Suite 9.2 Introduction:

More information

Lighting Techniques 18 The Color of Light 21 SAMPLE

Lighting Techniques 18 The Color of Light 21 SAMPLE Advanced Evidence Photography Contents Table of Contents General Photographic Principles. 2 Camera Operation 2 Selecting a Lens 2 Focusing 3 Depth of Field 4 Controlling Exposure 6 Reciprocity 7 ISO Speed

More information

The Bellows Extension Exposure Factor: Including Useful Reference Charts for use in the Field

The Bellows Extension Exposure Factor: Including Useful Reference Charts for use in the Field The Bellows Extension Exposure Factor: Including Useful Reference Charts for use in the Field Robert B. Hallock hallock@physics.umass.edu revised May 23, 2005 Abstract: The need for a bellows correction

More information

How to combine images in Photoshop

How to combine images in Photoshop How to combine images in Photoshop In Photoshop, you can use multiple layers to combine images, but there are two other ways to create a single image from mulitple images. Create a panoramic image with

More information

Imaging Optics Fundamentals

Imaging Optics Fundamentals Imaging Optics Fundamentals Gregory Hollows Director, Machine Vision Solutions Edmund Optics Why Are We Here? Topics for Discussion Fundamental Parameters of your system Field of View Working Distance

More information