A Comparative Study of Structured Light and Laser Range Finding Devices

Todd Bernhard, Anuraag Chintalapally, Daniel Zukowski

Abstract: This is a survey of the resolution of data gathered by different 3D-imaging devices. The sensors covered in this paper are the Asus Xtion Pro Live [1], the Microsoft Kinect [2], the Microsoft Kinect with the Nyko Zoom Lens [3], and the Hokuyo URG-04LX Laser Range Finder [4]. The results of this survey suggest that the Asus and the Kinect (both with and without the Zoom Lens) possess similar capabilities and are well-suited to near real-time applications, while the Hokuyo URG-04LX Laser Range Finder is able to detect smaller features than both the Kinect and the Asus, but is only viable for non-real-time applications.

I. INTRODUCTION

This survey was conducted with the goal of deciding upon the 3D-imaging platform for the Autoponics project [5] being conducted at Solid State Depot (SSD) in Boulder, Colorado [6]. Autoponics is a method of growing plants in which robots and other computerized machinery perform all aspects of plant food production: seeding, growing, harvesting, and processing. The Autoponics project at SSD uses the Robot Operating System (ROS) [7] framework for reading data and controlling system operations. Therefore, the ideal 3D-imaging device should have two important properties: it must be supported by ROS, and it must have a sufficiently high resolution to gather useful and reliable data. Specifically, the point cloud data produced by the device must be of sufficient resolution that it can be used to produce a 3D model for path planning and end-effector manipulation (i.e., harvesting parts of a plant). All of the devices tested in this survey are supported by ROS, and thus the ability to distinguish plants and their component features becomes the most important factor.
The sensor package (which includes the imaging device) of the Autoponics system is mounted on a vertically-oriented cartesian robot, and so we will refer to a coordinate system throughout this paper with respect to the orientation of this vertical cartesian robot. Let us define the coordinate system such that the sensor is at the origin, pointed directly along the positive Z-axis, which is parallel to the ground and orthogonal to the vertical XY-plane. The depth data is translated into Z-coordinates, and the resulting point clouds generated by the sensors create the basis for a varying surface that extends in the X- and Y-directions.

Although the experiment is conducted in a similar manner for every device, it is important to note the differences in the purpose of each image sensor, as this may explain some of the results. The Microsoft Kinect was originally created as an enhanced gaming device for the Xbox platform; its primary purpose is to track players and their gestures. Since it is a mass-market device, the cost of the sensor is relatively low at approximately $150. Another mass-market device is the Asus Xtion Pro Live, which costs about $190. The Xtion Pro was built for sensing human movement on the PC platform. Since they are built to enhance an interactive experience, both the Kinect and the Xtion Pro are real-time systems. The third image sensor used in the experiment is the Hokuyo URG-04LX Laser Range Finder. The URG-04LX differs significantly in function and purpose from the previous systems, which helps explain its price tag of almost $1,200. The URG-04LX is intended for use in indoor robot navigation and obstacle avoidance. It delivers a line of depth data in real-time, but creating 3D point clouds from this data requires post-processing, described below, and introduces enough latency that it is not considered a real-time system.
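To make this coordinate convention concrete, each depth sample from a structured light sensor can be back-projected into the sensor frame with a standard pinhole camera model. The sketch below uses hypothetical intrinsics (optical center cx, cy and focal length f_px); these are illustrative placeholders, not published specifications of the Kinect or Xtion Pro.

```python
def pixel_to_point(u, v, depth_m, cx=320.0, cy=240.0, f_px=575.0):
    """Back-project a depth-image sample (u, v, depth) into an (X, Y, Z)
    point: sensor at the origin, +Z toward the scene, X and Y spanning
    the vertical plane."""
    x = (u - cx) * depth_m / f_px
    y = (cy - v) * depth_m / f_px  # flip so +Y points up
    return (x, y, depth_m)

# A sample at the optical center lies on the Z-axis:
print(pixel_to_point(320, 240, 1.0))  # (0.0, 0.0, 1.0)
```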
The remainder of this paper is as follows: Materials, Methods, Results, Discussion, and Conclusion.

II. MATERIALS

The experiment conducted in this survey consists of the pieces seen in Figure 1. The three main components of the experiment are the 3D imaging devices, the visual target, and the white cardboard backdrop behind the target. All three are placed on a flat eight-foot table. The visual target (Figure 2) is a one-foot by one-foot square of 1/4 inch-thick acrylic, laminated with an opaque brown material so as to eliminate the factor of transparency from the experimental variables. The features of the target are laser-cut holes that increase in size from right to left and from bottom to top. The diameters of the holes are 0.25 cm, 0.50 cm, 0.75 cm, 1 cm, 2 cm, 3 cm, 4 cm, 6 cm, 8 cm, and 10 cm.

Fig. 1: The test setup from the perspective of a sensor. In the foreground is the Microsoft Kinect with the Nyko Zoom Lens. Beyond it is the visual target and the backdrop.

Fig. 2: Diagram of the visual target. The outline is a 1 ft by 1 ft square.

The Xtion Pro and the Kinect were tested without modification. Both of these devices use structured light to cast a grid of infrared points onto the environment. Because the light is distributed in two dimensions, each frame of the sensor data produces a complete point cloud. The Hokuyo Laser Range Finder differs in that it creates a 1D line of points at different depths, or a slice of a point cloud. To generate a full 3D point cloud from this data, we orient the laser such that it produces slices parallel to the YZ-plane and take a series of slices by incrementing the X-position of the laser.

The full point cloud is produced by programming a ROS node that stitches the slices together by inserting an X-coordinate into each YZ laser scan. The laser is mounted to a carriage on a Makerslide [8] rail, as shown in Figure 3. A stepper motor, also mounted to the carriage, controls a belt drive to move the carriage along the rail, and continually publishes its position to a ROS message topic. The stitching node reads both the position and laser scan topics, inserts the X-coordinates, buffers the modified laser scans, and then displays a collection of slices simultaneously in the ROS Visualizer (RVIZ) [9] to create the full 3D point cloud.

Fig. 3: Picture of the X-Carriage on the Makerslide. The Hokuyo laser scanner (right, black and square) is mounted next to the stepper motor (center, black). Also visible is the IPEVO Document Camera (left, long and silver), which was unused in these tests.

III. METHODS

We conducted two different tests in this survey: the size of the smallest detectable feature, and the depth resolution at increasing distances between the sensor and the target. The depth resolution demonstrates the minimum consistently detectable difference in depth of different features.

A. Smallest Feature Test

The sensor was placed at one end of the table and the white backdrop at the other. We gradually moved the target from 10 cm from the sensor to 250 cm, taking screenshots and noting the distance each time a feature (the holes) was no longer consistently visible. Figure 4 shows a screenshot from this test.

B. Depth Resolution Test

In this test, the sensor was again stationary at one end of the table. The target was placed at increasing distances from the sensor, from 50 cm to 210 cm at 10 cm increments. At each distance, we placed the backdrop immediately behind the target and slowly increased the distance between the backdrop and the visual target until we could visibly differentiate between the backdrop and the features of the target. The depth data was mapped to the color spectrum, rather than simply gray-scale, making the difference more noticeable.

IV. RESULTS

A. Smallest Feature Test

As distance increased, all devices exhibited an increase in the size of the smallest detectable feature, as shown in Figure 5.
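The stitching step can be expressed compactly. The following is a minimal sketch of the logic, not the actual ROS node: each (carriage X position, planar scan) pair is converted to YZ points and tagged with its X-coordinate. Topic handling, buffering, and RVIZ display are omitted, and the scan geometry (start angle, increment) is illustrative.

```python
import math

def scan_to_points(x, ranges, angle_min=-math.pi / 2, angle_inc=math.pi / 180):
    """Convert one planar laser scan (a YZ slice) into 3D points by
    inserting the carriage's X position into every sample."""
    points = []
    for i, r in enumerate(ranges):
        theta = angle_min + i * angle_inc
        y = r * math.sin(theta)  # across the slice
        z = r * math.cos(theta)  # depth along the sensor axis
        points.append((x, y, z))
    return points

def stitch(slices):
    """slices: iterable of (x_position, ranges) pairs; returns one cloud."""
    cloud = []
    for x, ranges in slices:
        cloud.extend(scan_to_points(x, ranges))
    return cloud

# Two slices, 1 cm apart, each seeing a wall 1 m away:
cloud = stitch([(0.00, [1.0] * 181), (0.01, [1.0] * 181)])
print(len(cloud))  # 362
```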

The Kinect performed worse than the other devices at every distance. The Kinect with the Nyko Zoom lens and the Asus performed nearly identically, and the Hokuyo performed better than the other devices. However, the data for the Hokuyo is restricted to distances under 1.5 m. The three structured light sensors (the Kinect with and without the Zoom lens and the Xtion Pro Live) all exhibit nearly linear growth until approximately 1.6 meters, at which point their performance degrades sharply.

Fig. 4: An example screenshot from the smallest feature test, using the Asus Xtion Pro at 60 cm from the target.

Fig. 5: Size of smallest detectable feature at varying distances. Smaller values indicate better performance.

B. Ratio of Smallest Feature Size to Published Linear Resolution

The minimum size of detectable features at given distances is closely related to the angular resolution of the sensors. The distance between two pixels of the point cloud at a given distance, which we will call linear resolution, is given by

L = D tan(A)

where L is the linear resolution, A is the angular resolution of the sensor, and D is the distance between the sensor and the target. While our experiments aimed to determine the best device for our application, we were also interested in comparing our experimental results with the published specifications for the devices we tested. From the published specifications, we determined the angular resolution of each device, except the Nyko Zoom Lens (footnote 1), listed in Table I. As expected, the structured light sensors do not generate any useful data at ranges less than roughly 0.5 meters.

TABLE I: The published angular resolution (deg) of each device (Kinect, Xtion Pro Live, URG-04LX).

TABLE II: Linear resolution and smallest detectable feature at 1 m for each device.

Device          | Linear Res at 1 m | Smallest Feature at 1 m
Kinect          | 0.16 cm           | 3 cm
Xtion Pro Live  | 0.16 cm           | 2 cm
URG-04LX        | 0.63 cm           | 1 cm
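The relation L = D tan(A) is easy to evaluate numerically. In the sketch below the angular resolution is an illustrative input, not a value taken from Table I; note that an angular resolution of roughly 0.09 degrees reproduces the 0.16 cm linear resolution at 1 m listed for the structured light sensors.

```python
import math

def linear_resolution_cm(distance_m, angular_res_deg):
    """L = D * tan(A): spacing between adjacent point-cloud samples
    at a given distance, converted to centimeters."""
    return distance_m * math.tan(math.radians(angular_res_deg)) * 100.0

print(round(linear_resolution_cm(1.0, 0.09), 2))  # 0.16
# L grows linearly with distance, so resolution degrades with range:
print(round(linear_resolution_cm(2.0, 0.09), 2))  # 0.31
```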
The Kinect claims to have a minimum effective distance of 0.8 m; however, we were able to detect a 2 cm feature at a distance of 0.6 m. Similarly, the Xtion Pro Live claims a minimum effective distance of 0.8 m, and we detected a 1 cm feature at a distance of 0.6 m. Based on the angular resolution of the data, the linear resolution at 1 m for each device is presented in Table II, along with the smallest detectable feature size at 1 m. As a means to approximate the minimum number of pixels required to discern a feature, we used the ratio of the experimentally-derived minimum detectable feature size over the theoretical linear resolution calculated from published device specifications. Figure 6 depicts this ratio as a function of distance. To detect a feature, the Hokuyo required the feature to be 2-5 times larger than the linear resolution at a given distance, whereas the Kinect and Asus required features to be many times larger than the linear resolution at a given distance.

C. Depth Resolution Test

Of the three devices tested, the Asus exhibited the best depth resolution, while the Kinect with Nyko Zoom exhibited the worst depth resolution. Figure 7 charts the depth resolution over distance for all three devices, and includes a linear regression fit for each data series.

Footnote 1: Nyko has not published any specifications regarding the angular resolution or field of view of the Kinect with the lens attached.
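Using the values from Table II, the pixel ratio at 1 m can be computed directly. This is a small sketch of the calculation using the paper's own numbers:

```python
# (linear resolution at 1 m, smallest detectable feature at 1 m), in cm,
# taken from Table II.
table_ii = {
    "Kinect": (0.16, 3.0),
    "Xtion Pro Live": (0.16, 2.0),
    "URG-04LX": (0.63, 1.0),
}

for device, (lin_res_cm, feature_cm) in table_ii.items():
    ratio = feature_cm / lin_res_cm  # rough pixel count needed per feature
    print(f"{device}: feature is {ratio:.1f}x the linear resolution")
```

At 1 m this gives about 18.8x for the Kinect, 12.5x for the Xtion Pro Live, and 1.6x for the URG-04LX, consistent with the Hokuyo performing much closer to its published specifications.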

V. DISCUSSION

Fig. 6: This graph displays the ratios of the sizes of the smallest detectable feature at a given distance over the theoretical linear resolution at that distance. This quantity is an approximation of the number of pixels necessary to distinguish a feature from its surroundings.

Fig. 7: The depth resolution test yielded highly variable results, but least-squares regression shows that there are consistent trends within the data.

The results of our experiment suggest that the Hokuyo URG-04LX Laser Range Finder has superior feature-detection ability compared to the Xtion Pro, the Kinect, and the Kinect with Nyko Zoom Lens. Surprisingly, there was little difference in performance between the Kinect and the Kinect with the Nyko Zoom lens. The Xtion Pro Live outperformed the Kinect, both with and without the Zoom lens, in both tests, particularly in depth resolution. The Nyko Zoom proved to be a double-edged sword: it improved the Kinect's smallest-feature detection to the level of the Xtion Pro Live, but negatively affected its depth resolution.

All of the image sensors had a significantly lower usable resolution than what was listed. This is to be expected, given the observation that it takes more than one or two pixels to distinguish a feature in a point cloud, while the resolution listed by the manufacturers is just the real-world distance between each observed point. In practical applications, the minimum detectable feature size for the Kinect and Asus is many times larger than the linear resolution at a given distance. In this regard, the Hokuyo performed closer to its published specs, requiring minimum detectable feature sizes of only 2-5x what the published specs suggested. Users of the Kinect and Asus devices should consider this large discrepancy when choosing either sensor for feature recognition applications. However, the poorer resolution does allow for a faster sampling rate and greatly reduced cost.
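The least-squares fits shown in Figure 7 can be reproduced with a few lines. This is a generic ordinary-least-squares sketch with made-up sample data, not the authors' measurements:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = m*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - m * mean_x
    return m, b

# Hypothetical depth-resolution readings (cm) at increasing distances (cm):
distances = [50, 100, 150, 200]
resolutions = [1.0, 2.1, 2.9, 4.0]
m, b = linear_fit(distances, resolutions)
print(round(m, 4), round(b, 4))  # slope and intercept of the trend line
```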
For the purposes of the Autoponics project, real-time sampling is not critical and accurate detection of small features is important; thus, the Hokuyo Laser Range Finder is a more appropriate device for mapping and feature detection in our application.

There were a number of possible sources of error in the experiments. The measurements included some degree of subjectivity, as each data point required a decision by the experimenter. During the smallest feature test, the holes frequently flickered within a small range of distances rather than blinking out immediately. To maintain consistency, we waited until the end of this range, though in some cases there was an asymptotic falloff instead of a clear end. In addition, at longer distances the smaller features shrank to only a few pixels in diameter, which may have been detectable to a finely tuned algorithm but not to the human eye. In future testing, we intend to use an image processing algorithm rather than a human operator to create static thresholds, and will conduct more trials for more reliable averages.

Similarly, the method used to test the depth resolution required human judgement. The depth data was mapped to the color spectrum, and thus the test relied on the experimenter to decide when the backdrop seen through the holes of the target was a different hue than the target itself. This proved to be a highly subjective metric, as evidenced by the high variance of our results. We suspect the error may have varied with the color spectrum (i.e., differences in hues of green were more difficult to distinguish than hues of blue); by depending on color, slight color blindness in the experimenters may have

contributed minor error as well. Again, in future testing we will use an image processing algorithm for more consistent results, in addition to more trials. By varying the offset of the color spectrum during the trials (such as cycling from green to green, then red to red, then blue to blue) we can eliminate the bias due to particular hues.

The experimental setup may have also contributed minor errors. It is difficult to align the visual target at the correct angle to the image sensor. Both the laser from the URG-04LX and the field of view of the Kinect and Xtion Pro must be perfectly orthogonal to the target acrylic board to get accurate measurements. Although we used a level to orient the target acrylic and the image sensor, our equipment was not accurate enough to ensure perfect angles for any measurement. Though the offset was minor, it likely affected our results.

Finally, the visual target was composed of features of constant size, and in the smallest feature test this limited the number of possible measurements. To remove this barrier, one could construct a target with a window of variable size. Such a device would allow for continuous measurements as one increases the distance between the sensor and the target.

VI. CONCLUSION

In this paper, we have presented and discussed the materials, methods, and results of surveying different sensors used to generate point clouds for mapping and feature detection. We conducted what we believe to be a fair experiment in real-world conditions, using four different systems that are available on the market today. Published sensor specifications do not necessarily provide enough information to evaluate the efficacy of a device for a specific application. While the linear resolutions of the Kinect and Asus look impressive on paper, the practical use of these devices for feature detection falls short of what one might expect from the published figures. The data we collected provides a perspective on the feature detection abilities and depth resolution that can be expected when using these devices in a real-world environment. From this data, we were able to determine that, among the devices tested, the Hokuyo URG-04LX Laser Range Finder is best suited for use as the primary tool for feature detection and localization for the Autoponics project. In addition to superior feature detection abilities, the URG-04LX is able to provide useful data at short distances, unlike the structured light sensors. Although our experiment was inherently subjective in the techniques used to collect data, the URG-04LX was clearly superior in terms of its ability to recognize small features at near and far distances.

ACKNOWLEDGMENT

For acting as a mentor and providing valuable guidance to the Autoponics project team, we would like to thank Dr. Correll. We also want to acknowledge the members of Solid State Depot for providing a unique perspective on both our experiment and the Autoponics system.

REFERENCES

[1] Asus Xtion Pro Live, Sensor/ Xtion PRO LIVE/.
[2] Microsoft Xbox Kinect.
[3] Microsoft Xbox Kinect.
[4] Hokuyo 04LX-UG01 laser scanner, 07scanner/urg 04lx ug01.html.
[5] The Autoponics project.
[6] Solid State Depot.
[7] ROS (Robot Operating System).
[8] Makerslide.
[9] RVIZ.


More information

Camera Overview. Digital Microscope Cameras for Material Science: Clear Images, Precise Analysis. Digital Cameras for Microscopy

Camera Overview. Digital Microscope Cameras for Material Science: Clear Images, Precise Analysis. Digital Cameras for Microscopy Digital Cameras for Microscopy Camera Overview For Materials Science Microscopes Digital Microscope Cameras for Material Science: Clear Images, Precise Analysis Passionate about Imaging: Olympus Digital

More information

TECHNICAL DATA. OPTIV CLASSIC 322 Version 3/2013

TECHNICAL DATA. OPTIV CLASSIC 322 Version 3/2013 TECHNICAL DATA OPTIV CLASSIC 322 Version 3/2013 Technical Data Product description The Optiv Classic 322 combines optical and tactile measurement in one system (optional touchtrigger probe). The system

More information

Unpredictable movement performance of Virtual Reality headsets

Unpredictable movement performance of Virtual Reality headsets Unpredictable movement performance of Virtual Reality headsets 2 1. Introduction Virtual Reality headsets use a combination of sensors to track the orientation of the headset, in order to move the displayed

More information

ECEN 4606, UNDERGRADUATE OPTICS LAB

ECEN 4606, UNDERGRADUATE OPTICS LAB ECEN 4606, UNDERGRADUATE OPTICS LAB Lab 2: Imaging 1 the Telescope Original Version: Prof. McLeod SUMMARY: In this lab you will become familiar with the use of one or more lenses to create images of distant

More information

Progress Report. Mohammadtaghi G. Poshtmashhadi. Supervisor: Professor António M. Pascoal

Progress Report. Mohammadtaghi G. Poshtmashhadi. Supervisor: Professor António M. Pascoal Progress Report Mohammadtaghi G. Poshtmashhadi Supervisor: Professor António M. Pascoal OceaNet meeting presentation April 2017 2 Work program Main Research Topic Autonomous Marine Vehicle Control and

More information

Digital Image Processing. Lecture # 6 Corner Detection & Color Processing

Digital Image Processing. Lecture # 6 Corner Detection & Color Processing Digital Image Processing Lecture # 6 Corner Detection & Color Processing 1 Corners Corners (interest points) Unlike edges, corners (patches of pixels surrounding the corner) do not necessarily correspond

More information

Visual Perception. human perception display devices. CS Visual Perception

Visual Perception. human perception display devices. CS Visual Perception Visual Perception human perception display devices 1 Reference Chapters 4, 5 Designing with the Mind in Mind by Jeff Johnson 2 Visual Perception Most user interfaces are visual in nature. So, it is important

More information

Lab 8: Introduction to the e-puck Robot

Lab 8: Introduction to the e-puck Robot Lab 8: Introduction to the e-puck Robot This laboratory requires the following equipment: C development tools (gcc, make, etc.) C30 programming tools for the e-puck robot The development tree which is

More information

PHYS 3153 Methods of Experimental Physics II O2. Applications of Interferometry

PHYS 3153 Methods of Experimental Physics II O2. Applications of Interferometry Purpose PHYS 3153 Methods of Experimental Physics II O2. Applications of Interferometry In this experiment, you will study the principles and applications of interferometry. Equipment and components PASCO

More information

Improving the Detection of Near Earth Objects for Ground Based Telescopes

Improving the Detection of Near Earth Objects for Ground Based Telescopes Improving the Detection of Near Earth Objects for Ground Based Telescopes Anthony O'Dell Captain, United States Air Force Air Force Research Laboratories ABSTRACT Congress has mandated the detection of

More information

Durst HL 2506 AF. Durst HL 2506 AF

Durst HL 2506 AF. Durst HL 2506 AF Durst HL 2506 AF Durst HL 3506 AF Professional horizontal enlarger for colour and BW-enlargements from film formats up to 25 x 25 cm (10 x 10 in.) with computer driven Permanent Closed Loop light monitoring

More information

11Beamage-3. CMOS Beam Profiling Cameras

11Beamage-3. CMOS Beam Profiling Cameras 11Beamage-3 CMOS Beam Profiling Cameras Key Features USB 3.0 FOR THE FASTEST TRANSFER RATES Up to 10X faster than regular USB 2.0 connections (also USB 2.0 compatible) HIGH RESOLUTION 2.2 MPixels resolution

More information

This document explains the reasons behind this phenomenon and describes how to overcome it.

This document explains the reasons behind this phenomenon and describes how to overcome it. Internal: 734-00583B-EN Release date: 17 December 2008 Cast Effects in Wide Angle Photography Overview Shooting images with wide angle lenses and exploiting large format camera movements can result in

More information

Haptic control in a virtual environment

Haptic control in a virtual environment Haptic control in a virtual environment Gerard de Ruig (0555781) Lourens Visscher (0554498) Lydia van Well (0566644) September 10, 2010 Introduction With modern technological advancements it is entirely

More information

Artificial Beacons with RGB-D Environment Mapping for Indoor Mobile Robot Localization

Artificial Beacons with RGB-D Environment Mapping for Indoor Mobile Robot Localization Sensors and Materials, Vol. 28, No. 6 (2016) 695 705 MYU Tokyo 695 S & M 1227 Artificial Beacons with RGB-D Environment Mapping for Indoor Mobile Robot Localization Chun-Chi Lai and Kuo-Lan Su * Department

More information

Bias errors in PIV: the pixel locking effect revisited.

Bias errors in PIV: the pixel locking effect revisited. Bias errors in PIV: the pixel locking effect revisited. E.F.J. Overmars 1, N.G.W. Warncke, C. Poelma and J. Westerweel 1: Laboratory for Aero & Hydrodynamics, University of Technology, Delft, The Netherlands,

More information

A LARGE COMBINATION HORIZONTAL AND VERTICAL NEAR FIELD MEASUREMENT FACILITY FOR SATELLITE ANTENNA CHARACTERIZATION

A LARGE COMBINATION HORIZONTAL AND VERTICAL NEAR FIELD MEASUREMENT FACILITY FOR SATELLITE ANTENNA CHARACTERIZATION A LARGE COMBINATION HORIZONTAL AND VERTICAL NEAR FIELD MEASUREMENT FACILITY FOR SATELLITE ANTENNA CHARACTERIZATION John Demas Nearfield Systems Inc. 1330 E. 223rd Street Bldg. 524 Carson, CA 90745 USA

More information

Frequency Hopping Pattern Recognition Algorithms for Wireless Sensor Networks

Frequency Hopping Pattern Recognition Algorithms for Wireless Sensor Networks Frequency Hopping Pattern Recognition Algorithms for Wireless Sensor Networks Min Song, Trent Allison Department of Electrical and Computer Engineering Old Dominion University Norfolk, VA 23529, USA Abstract

More information

PAD Correlator Computer

PAD Correlator Computer ALIGNMENT OF CONVENTIONAL ROATING ARM INSTRUMENT GENERAL PRINCIPLES The most important thing in aligning the instrument is ensuring that the beam GOES OVER THE CENTER OF THE TABLE. The particular direction

More information

Vishnu Nath. Usage of computer vision and humanoid robotics to create autonomous robots. (Ximea Currera RL04C Camera Kit)

Vishnu Nath. Usage of computer vision and humanoid robotics to create autonomous robots. (Ximea Currera RL04C Camera Kit) Vishnu Nath Usage of computer vision and humanoid robotics to create autonomous robots (Ximea Currera RL04C Camera Kit) Acknowledgements Firstly, I would like to thank Ivan Klimkovic of Ximea Corporation,

More information

Laboratory 1: Motion in One Dimension

Laboratory 1: Motion in One Dimension Phys 131L Spring 2018 Laboratory 1: Motion in One Dimension Classical physics describes the motion of objects with the fundamental goal of tracking the position of an object as time passes. The simplest

More information

Viewing Environments for Cross-Media Image Comparisons

Viewing Environments for Cross-Media Image Comparisons Viewing Environments for Cross-Media Image Comparisons Karen Braun and Mark D. Fairchild Munsell Color Science Laboratory, Center for Imaging Science Rochester Institute of Technology, Rochester, New York

More information

Fig Color spectrum seen by passing white light through a prism.

Fig Color spectrum seen by passing white light through a prism. 1. Explain about color fundamentals. Color of an object is determined by the nature of the light reflected from it. When a beam of sunlight passes through a glass prism, the emerging beam of light is not

More information

A Systematic Approach for Evaluating LED Street Light Fixtures

A Systematic Approach for Evaluating LED Street Light Fixtures A Systematic Approach for Evaluating LED Street Light Fixtures By Blake Redfield LED street lights are gaining popularity throughout the US and the world. Numerous companies have offered LED street lights

More information

PH 481/581 Physical Optics Winter 2013

PH 481/581 Physical Optics Winter 2013 PH 481/581 Physical Optics Winter 2013 Laboratory #1 Week of January 14 Read: Handout (Introduction & Projects #2 & 3 from Newport Project in Optics Workbook), pp. 150-170 of "Optics" by Hecht Do: 1. Experiment

More information

Face Detection using 3-D Time-of-Flight and Colour Cameras

Face Detection using 3-D Time-of-Flight and Colour Cameras Face Detection using 3-D Time-of-Flight and Colour Cameras Jan Fischer, Daniel Seitz, Alexander Verl Fraunhofer IPA, Nobelstr. 12, 70597 Stuttgart, Germany Abstract This paper presents a novel method to

More information

Mode analysis of Oxide-Confined VCSELs using near-far field approaches

Mode analysis of Oxide-Confined VCSELs using near-far field approaches Annual report 998, Dept. of Optoelectronics, University of Ulm Mode analysis of Oxide-Confined VCSELs using near-far field approaches Safwat William Zaki Mahmoud We analyze the transverse mode structure

More information

CRISATEL High Resolution Multispectral System

CRISATEL High Resolution Multispectral System CRISATEL High Resolution Multispectral System Pascal Cotte and Marcel Dupouy Lumiere Technology, Paris, France We have designed and built a high resolution multispectral image acquisition system for digitizing

More information

VisionGauge OnLine Standard Edition Spec Sheet

VisionGauge OnLine Standard Edition Spec Sheet VisionGauge OnLine Standard Edition Spec Sheet VISIONx INC. www.visionxinc.com Powerful & Easy to Use Intuitive Interface VisionGauge OnLine is a powerful and easy-to-use machine vision software for automated

More information

Improving Measurement Accuracy of Position Sensitive Detector (PSD) for a New Scanning PSD Microscopy System

Improving Measurement Accuracy of Position Sensitive Detector (PSD) for a New Scanning PSD Microscopy System Proceedings of the 2014 IEEE International Conference on Robotics and Biomimetics December 5-10, 2014, Bali, Indonesia Improving Measurement Accuracy of Position Sensitive Detector (PSD) for a New Scanning

More information

INNOVATIVE CAMERA CHARACTERIZATION BASED ON LED LIGHT SOURCE

INNOVATIVE CAMERA CHARACTERIZATION BASED ON LED LIGHT SOURCE Image Engineering imagequalitytools INNOVATIVE CAMERA CHARACTERIZATION BASED ON LED LIGHT SOURCE Image Engineering Relative Power ILLUMINATION DEVICES imagequalitytools The most flexible LED-based light

More information

Exercise 1-3. Radar Antennas EXERCISE OBJECTIVE DISCUSSION OUTLINE DISCUSSION OF FUNDAMENTALS. Antenna types

Exercise 1-3. Radar Antennas EXERCISE OBJECTIVE DISCUSSION OUTLINE DISCUSSION OF FUNDAMENTALS. Antenna types Exercise 1-3 Radar Antennas EXERCISE OBJECTIVE When you have completed this exercise, you will be familiar with the role of the antenna in a radar system. You will also be familiar with the intrinsic characteristics

More information

Application Note. Digital Low-Light CMOS Camera. NOCTURN Camera: Optimized for Long-Range Observation in Low Light Conditions

Application Note. Digital Low-Light CMOS Camera. NOCTURN Camera: Optimized for Long-Range Observation in Low Light Conditions Digital Low-Light CMOS Camera Application Note NOCTURN Camera: Optimized for Long-Range Observation in Low Light Conditions PHOTONIS Digital Imaging, LLC. 6170 Research Road Suite 208 Frisco, TX USA 75033

More information

Cost efficient design Operates in full sunlight Low power consumption Wide field of view Small footprint Simple serial connectivity Long Range

Cost efficient design Operates in full sunlight Low power consumption Wide field of view Small footprint Simple serial connectivity Long Range Cost efficient design Operates in full sunlight Low power consumption Wide field of view Small footprint Simple serial connectivity Long Range sweep v1.0 CAUTION This device contains a component which

More information

771 Series LASER SPECTRUM ANALYZER. The Power of Precision in Spectral Analysis. It's Our Business to be Exact! bristol-inst.com

771 Series LASER SPECTRUM ANALYZER. The Power of Precision in Spectral Analysis. It's Our Business to be Exact! bristol-inst.com 771 Series LASER SPECTRUM ANALYZER The Power of Precision in Spectral Analysis It's Our Business to be Exact! bristol-inst.com The 771 Series Laser Spectrum Analyzer combines proven Michelson interferometer

More information

Lab 3 Swinging pendulum experiment

Lab 3 Swinging pendulum experiment Lab 3 Swinging pendulum experiment Agenda Time 10 min Item Review agenda Introduce the swinging pendulum experiment and apparatus 95 min Lab activity I ll try to give you a 5- minute warning before the

More information

Visual Perception Based Behaviors for a Small Autonomous Mobile Robot

Visual Perception Based Behaviors for a Small Autonomous Mobile Robot Visual Perception Based Behaviors for a Small Autonomous Mobile Robot Scott Jantz and Keith L Doty Machine Intelligence Laboratory Mekatronix, Inc. Department of Electrical and Computer Engineering Gainesville,

More information

AgilOptics mirrors increase coupling efficiency into a 4 µm diameter fiber by 750%.

AgilOptics mirrors increase coupling efficiency into a 4 µm diameter fiber by 750%. Application Note AN004: Fiber Coupling Improvement Introduction AgilOptics mirrors increase coupling efficiency into a 4 µm diameter fiber by 750%. Industrial lasers used for cutting, welding, drilling,

More information

Chapter 4 MASK Encryption: Results with Image Analysis

Chapter 4 MASK Encryption: Results with Image Analysis 95 Chapter 4 MASK Encryption: Results with Image Analysis This chapter discusses the tests conducted and analysis made on MASK encryption, with gray scale and colour images. Statistical analysis including

More information

Far field intensity distributions of an OMEGA laser beam were measured with

Far field intensity distributions of an OMEGA laser beam were measured with Experimental Investigation of the Far Field on OMEGA with an Annular Apertured Near Field Uyen Tran Advisor: Sean P. Regan Laboratory for Laser Energetics Summer High School Research Program 200 1 Abstract

More information

Leica DMi8A Quick Guide

Leica DMi8A Quick Guide Leica DMi8A Quick Guide 1 Optical Microscope Quick Start Guide The following instructions are provided as a Quick Start Guide for powering up, running measurements, and shutting down Leica s DMi8A Inverted

More information

GEO 428: DEMs from GPS, Imagery, & Lidar Tuesday, September 11

GEO 428: DEMs from GPS, Imagery, & Lidar Tuesday, September 11 GEO 428: DEMs from GPS, Imagery, & Lidar Tuesday, September 11 Global Positioning Systems GPS is a technology that provides Location coordinates Elevation For any location with a decent view of the sky

More information

Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data

Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Hrvoje Benko Microsoft Research One Microsoft Way Redmond, WA 98052 USA benko@microsoft.com Andrew D. Wilson Microsoft

More information

Helicopter Aerial Laser Ranging

Helicopter Aerial Laser Ranging Helicopter Aerial Laser Ranging Håkan Sterner TopEye AB P.O.Box 1017, SE-551 11 Jönköping, Sweden 1 Introduction Measuring distances with light has been used for terrestrial surveys since the fifties.

More information

Blur Detection for Historical Document Images

Blur Detection for Historical Document Images Blur Detection for Historical Document Images Ben Baker FamilySearch bakerb@familysearch.org ABSTRACT FamilySearch captures millions of digital images annually using digital cameras at sites throughout

More information

Towards a New Age Graphic Design DIGITAL PRINTING

Towards a New Age Graphic Design DIGITAL PRINTING 90 Chapter 08 Towards a New Age Graphic Design DIGITAL IMAGING and PRINTING Graphic designers work with visual images, either for print media or for digital media. With the advent of computers, most of

More information

A Beam-Level Delivery Accuracy Study of the Robotic Image Guided Radiosurgery System Using a Scintillator/CCD Phantom

A Beam-Level Delivery Accuracy Study of the Robotic Image Guided Radiosurgery System Using a Scintillator/CCD Phantom A Beam-Level Delivery Accuracy Study of the Robotic Image Guided Radiosurgery System Using a Scintillator/CCD Phantom Lei Wang 1, Shi Liu 1, Brett Nelson 2 1. Department of Radiation Oncology, Stanford

More information

PH 481/581 Physical Optics Winter 2014

PH 481/581 Physical Optics Winter 2014 PH 481/581 Physical Optics Winter 2014 Laboratory #1 Week of January 13 Read: Handout (Introduction & Projects #2 & 3 from Newport Project in Optics Workbook), pp.150-170 of Optics by Hecht Do: 1. Experiment

More information

Camera Overview. Digital Microscope Cameras for Material Science: Clear Images, Precise Analysis. Digital Cameras for Microscopy

Camera Overview. Digital Microscope Cameras for Material Science: Clear Images, Precise Analysis. Digital Cameras for Microscopy Digital Cameras for Microscopy Camera Overview For Materials Science Microscopes Digital Microscope Cameras for Material Science: Clear Images, Precise Analysis Passionate about Imaging: Olympus Digital

More information

Optical Coherence: Recreation of the Experiment of Thompson and Wolf

Optical Coherence: Recreation of the Experiment of Thompson and Wolf Optical Coherence: Recreation of the Experiment of Thompson and Wolf David Collins Senior project Department of Physics, California Polytechnic State University San Luis Obispo June 2010 Abstract The purpose

More information

PHYS 1112L - Introductory Physics Laboratory II

PHYS 1112L - Introductory Physics Laboratory II PHYS 1112L - Introductory Physics Laboratory II Laboratory Advanced Sheet Thin Lenses 1. Objectives. The objectives of this laboratory are a. to be able to measure the focal length of a converging lens.

More information

High Accuracy Spherical Near-Field Measurements On a Stationary Antenna

High Accuracy Spherical Near-Field Measurements On a Stationary Antenna High Accuracy Spherical Near-Field Measurements On a Stationary Antenna Greg Hindman, Hulean Tyler Nearfield Systems Inc. 19730 Magellan Drive Torrance, CA 90502 ABSTRACT Most conventional spherical near-field

More information

Managing Complex Land Mobile Radio Systems

Managing Complex Land Mobile Radio Systems Anyone responsible for a multiple-site, multiple-channel land mobile radio communications system knows that management of even just a single site can often be a complex task. Failures or degradation in

More information

Learning and Using Models of Kicking Motions for Legged Robots

Learning and Using Models of Kicking Motions for Legged Robots Learning and Using Models of Kicking Motions for Legged Robots Sonia Chernova and Manuela Veloso Computer Science Department Carnegie Mellon University Pittsburgh, PA 15213 {soniac, mmv}@cs.cmu.edu Abstract

More information

Office europeen des Publication number : EUROPEAN PATENT APPLICATION

Office europeen des Publication number : EUROPEAN PATENT APPLICATION Office europeen des brevets @ Publication number : 0 465 1 36 A2 @ EUROPEAN PATENT APPLICATION @ Application number: 91305842.6 @ Int. CI.5 : G02B 26/10 (22) Date of filing : 27.06.91 ( ) Priority : 27.06.90

More information

Robot Visual Mapper. Hung Dang, Jasdeep Hundal and Ramu Nachiappan. Fig. 1: A typical image of Rovio s environment

Robot Visual Mapper. Hung Dang, Jasdeep Hundal and Ramu Nachiappan. Fig. 1: A typical image of Rovio s environment Robot Visual Mapper Hung Dang, Jasdeep Hundal and Ramu Nachiappan Abstract Mapping is an essential component of autonomous robot path planning and navigation. The standard approach often employs laser

More information