Preliminary Design Review


Proximity Identification, Characterization, And Neutralization by Thinking before Acquisition (PIRANHA)
Customer: Barbara Bicknell, Jeffrey Weber
Team: Aaron Buysse, Kevin Rauhauser, Chad Caplan, Ryan Slabaugh, Matt Holmes, Rebecca Travers, Colin Nugen
Advisor: Dr. Jelliffe Jackson
11/11/2013, University of Colorado Aerospace Engineering Sciences

Outline
- Background: Need & Motivation, Objective & CONOPS, Functional Block Diagram, Baseline Design
- Critical Elements & Feasibility: Detection, Ranging, Position Finding, Size Characterization, Extrusion
- Overall Feasibility
- Summary

Background: Facts
- Kessler Syndrome: millions of pieces of debris are in orbit [1].
- Debris larger than 4 inches will cause catastrophic damage to a space asset; more than 22,000 pieces of debris fall in this category [1].
- The largest density of debris is in Low Earth Orbit (750-800 km).
- No active measures are being taken to reduce debris.
[Image: European Space Agency, space debris in orbit [2]]

CONOPS (Space Operations Simulation Center)
[Diagram: debris, the DCS on a 6-DOF robot, and PIRANHA. PIRANHA functions: DETECT & RANGE, THINK (position finding), COMMUNICATE.]


Baseline Design
- Detection & Position Finding: camera & rangefinder
- Size Characterization: camera
- Extrusion Detection: optical rim sensor

[System diagram: PIRANHA mounted on the heritage Debris Capture System (DCS); rangefinder and camera on a 2-DOF actuator, software & computing hardware, and an optical rim sensor. Approximate envelope: 90 cm x 60 cm.]

2-DOF Motor & Feedback Control for Centering Camera Field of View

Detection
- Functional Requirement FNC.2: PIRANHA shall detect the presence of debris within a 20 m proximity of the DCS.
- Critical Project Element CPE.1.1: PIRANHA must detect debris in a simulated space environment.
- Baseline design from trade study: camera.
[Detection CONOPS diagram]

Digital images are composed of a finite number of units called pixels, each holding one value that represents a color or shade. As distance from the lens increases, the same number of pixels must cover a larger area, so the number of pixels the object fills decreases with distance from the lens.
[Diagram: object spanning 2 pixels within the camera field of view angle]

We want some knowledge of the object in view. A relationship exists between the number of pixels filled by the object and the knowledge that can be gained.
[Diagram: light entering the camera vs. the image the camera produces; object spanning 2 pixels of the field of view]
With just one pixel filled, there is no way to tell the object's shape.

Experiments have been done specifically with electro-optical sensors to explore the knowledge that can be gained versus the number of pixels filled by the object. Experiment results [3]:

Knowledge of Object to be Gained | Required Resolution (pixels / minimum object dimension)
Detection      | 2.0 ± 0.5
Orientation    | 2.8 ± 0.7
Recognition    | 8.0 ± 1.6
Identification | 12.8 ± 3.0

If the object fills two pixels across its smallest dimension, detection is possible.
[Diagram: object spanning 8 pixels within a camera field of view of 6 x 8 pixels]
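
As a small illustration of how these thresholds could be applied in software (a sketch only; the threshold values come from the table above, while the function name and structure are ours):

# Map an achieved resolution (pixels per minimum object dimension) to the most
# demanding Johnson-style task whose nominal threshold is met.
JOHNSON_THRESHOLDS = [
    ("identification", 12.8),
    ("recognition", 8.0),
    ("orientation", 2.8),
    ("detection", 2.0),
]

def achievable_task(pixels_per_min_dimension):
    for task, required in JOHNSON_THRESHOLDS:
        if pixels_per_min_dimension >= required:
            return task
    return None

print(achievable_task(2.1))   # -> "detection"
print(achievable_task(16.6))  # -> "identification"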

We must find the pixel size at maximum range and relate it to the smallest detectable object. Worst case: range = 20 m and object dimension = 10 cm.

Camera: Point Grey Flea 3 | Field of view: 36 degrees (vertical) | Pixels: 2160 (vertical) | Pixels per minimum dimension: 16.6
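
A quick Python check of that worst-case number, using the 36 degree vertical field of view and 2160-pixel column count quoted above:

import math

fov_deg = 36.0      # vertical field of view
n_pixels = 2160     # pixels along the vertical axis
range_m = 20.0      # worst-case range to the debris
object_m = 0.10     # minimum debris dimension (10 cm)

# Height of the field of view at that range, and the size of one pixel there.
fov_height_m = 2.0 * range_m * math.tan(math.radians(fov_deg) / 2.0)
pixel_size_m = fov_height_m / n_pixels

print(round(object_m / pixel_size_m, 1))  # -> 16.6 pixels per minimum dimension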

[Images: frame at one time, frame at a later time, and the difference of the two frames]
A simple video was taken with the camera rotating so that the object moves across the image frame. The background image is subtracted from each frame of the video to detect the object moving in the foreground, as sketched below.
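
A minimal background-subtraction sketch of the kind described (Python/NumPy; the array names and fixed threshold are illustrative assumptions, not the team's implementation):

import numpy as np

def detect_foreground(frame, background, threshold=30):
    """Boolean mask of pixels that differ from the background by more than a threshold."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

# Usage with 8-bit grayscale frames of equal shape:
#   mask = detect_foreground(current_frame, background_frame)
# Debris is flagged when enough pixels in the mask are set.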

Considerations in First-Level Analysis
- Johnson does not define the visual tasks very precisely.
- The experiments were performed on relatively large objects in laboratory conditions.
- The method of detection is not accounted for.
Plan for Addressing Considerations
- Perform tests in the SOSC to ensure detection is possible and reliable.
Feasibility
- The Point Grey Flea 3 USB 3.0 represents a feasible solution for detection.

Point Grey Flea 3 USB 3.0 specifications [6]:
- Pixels: 4096 x 2160
- Field of View: 48 x 36 degrees
- Dimensions: 29 x 29 x 30 mm
- Weight: 0.57 N
- Power: 5 V, <3 W
- Sensor Size: 5.76 x 4.29 mm
- Price: $975

Ranging
- Functional Requirement FNC.2: PIRANHA shall detect the presence of debris within a 20 m proximity of the DCS.
- Design Requirement DES.2.1: Detection of debris shall also include determining the relative distance between the DCS and the debris.
- Baseline design: camera & rangefinder on an actuator with feedback control for centering debris in the camera field of view.
[Diagram: ranging to debris]

Convolution filters are used to remove noise and detect the edges of the object.
- Sobel filter: computes the spatial gradient (horizontal and vertical) of pixel values to detect edges.
- Averaging filter: computes the average of surrounding pixels to remove noise.
[Images: original image; image with Sobel filter (edge detection); image with Sobel and averaging filters]
Feedback control and motors are then used to align the object centroid with the center of the field of view, as sketched below.
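
A minimal sketch of this filtering and centroiding chain (Python with NumPy/SciPy; the kernel values are the standard averaging and Sobel kernels, while the edge threshold is an illustrative assumption):

import numpy as np
from scipy.signal import convolve2d

AVG = np.ones((3, 3)) / 9.0                                 # averaging (noise-removal) kernel
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])    # horizontal gradient kernel
SOBEL_Y = SOBEL_X.T                                         # vertical gradient kernel

def edge_centroid(image):
    """Sobel gradient magnitude, averaged to suppress noise, then the edge centroid (row, col)."""
    gx = convolve2d(image, SOBEL_X, mode="same", boundary="symm")
    gy = convolve2d(image, SOBEL_Y, mode="same", boundary="symm")
    mag = convolve2d(np.hypot(gx, gy), AVG, mode="same", boundary="symm")
    rows, cols = np.nonzero(mag > mag.mean())               # crude edge mask
    if rows.size == 0:
        return None
    return rows.mean(), cols.mean()

# The offset between this centroid and the image center is the error fed to the 2-DOF motors.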

First-Level Analysis: Laser Rangefinder
[Diagram: rangefinder mounted a small height above the camera, with an angle between the rangefinder and the camera field of view; 20 m range; minimum diameter of debris]

Distance where rangefinder is centered on debris (m) | Angle (degrees)
20 | 0.114 ± 0.143
10 | 0.229 ± 0.286
5  | 0.458 ± 0.573
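
The tabulated angles follow from simple parallax geometry; a sketch is below. The roughly 4 cm mounting height is inferred from the tabulated values, and the ± term uses half of the 10 cm minimum debris diameter; both are assumptions rather than stated design values.

import math

def rangefinder_angle(range_m, mount_height_m=0.04, min_debris_diam_m=0.10):
    """Tilt needed to center the rangefinder on debris centered in the camera field of view,
    plus the angular tolerance set by the smallest debris of interest."""
    nominal = math.degrees(math.atan(mount_height_m / range_m))
    tolerance = math.degrees(math.atan((min_debris_diam_m / 2.0) / range_m))
    return nominal, tolerance

for r in (20.0, 10.0, 5.0):
    print(r, rangefinder_angle(r))   # approximately reproduces the table above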

Considerations in First-Level Analysis
- Debris is contrasted against the background.
- Feedback control is available to align the debris in the center of the field of view.
Plan for Addressing Considerations
- Design the feedback control law.
Feasibility
- The Fluke 411D rangefinder, MX-106 servos, and a microcontroller (Microchip PIC18F452, used to command the servos) represent a feasible solution for ranging.

Fluke 411D Rangefinder specifications [4]:
- Weight: 1.47 N
- Supply Voltage: 5 V
- Resolution: 3 mm
- Size: 123 x 50 x 26 mm
- Price: $100 ($250 with board)

MX-106 Servo & Bracket specifications [5]:
- Weight: 1.5 N
- Supply Voltage: 11-15 V
- Resolution: 0.088 degrees
- Holding Torque: 8.2 N·m
- Size: 40 x 65 x 46 mm
- Price: $1000 for 2 motors ($1060 with brackets)

Position Finding
- Functional Requirement FNC.4: PIRANHA shall output relative position and velocity vectors (pointing and tracking telemetry) of the debris with respect to the DCS.
- Critical Project Element CPE.1.2: PIRANHA must be capable of outputting commands at the rates and specifications required by the SOSC robot.
- Baseline design: camera & rangefinder.
[Position Finding CONOPS diagram]

Camera and Laser First-Level Analysis
[Diagram: reference frames and position vectors. A: DCS frame; B: DCS gimbal point (camera frame); C: front of camera; D: debris centroid; E: Earth (J2000 reference). The diagram marks which vectors are sensed, known, and calculated; the desired goal is the debris position vector r_D expressed in frame E.]

Calculation of the transformation from the camera frame (B) to the DCS frame (A)
- The camera reference frame will have two degrees of freedom.
- The motor actuation angles can be measured with: a stepper motor with a known step size, an encoder, a potentiometer, an inclinometer, or a COTS motor that outputs position.
- One COTS option is the MX-106 servo. Final selection: TBA.
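
A minimal sketch of how the two measured gimbal angles could form the camera-to-DCS rotation (Python/NumPy; the pan-tilt axis sequence and names are assumptions, since the actual gimbal geometry is not specified here):

import numpy as np

def rot_z(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def dcs_from_camera(pan, tilt):
    """Rotation taking vectors from the camera frame B into the DCS frame A,
    assuming a pan about the DCS z-axis followed by a tilt about the intermediate y-axis."""
    return rot_z(pan) @ rot_y(tilt)

# Usage: r_in_A = dcs_from_camera(pan, tilt) @ r_in_B, with pan/tilt in radians
# (e.g. read back from the MX-106 position feedback).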

Considerations in First-Level Analysis
- The transformation from the DCS frame to the camera frame is known accurately.
- The range to the object is known accurately.
- The inertial position is available as a function of time.
Plan for Addressing Considerations
- Create an STK scenario to obtain inertial position data.
Feasibility
- The MX-106 servo represents a feasible solution for position finding: it provides position, temperature, and voltage feedback and is easily extendable to multiple degrees of freedom.

MX-106 Servo specifications:
- Weight: 1.5 N
- Supply Voltage: 12-15 V
- Resolution: 0.088 degrees
- Size: 40 x 65 x 46 mm
- Price: $1000 for 2 motors

Size Characterization
- Functional Requirement FNC.3: PIRANHA shall characterize objects based on their size and shape.
- Baseline design: camera with image processing (pixels to maximum diameter).
[Size Characterization CONOPS diagram: possible scenarios; contrasted image of target]

First-Level Analysis
Using the focal length (f) of the camera and the height (v) of the digital image sensor, the vertical field of view (theta) can be calculated: theta = 2*arctan(v / (2*f)). Then the distance per pixel (d) in the image can be calculated using geometric relations and the number of pixels (N) in the vertical direction: d = 2*r*tan(theta/2) / N, where r is the range to the object.
[Diagram: camera body with focal length f, image sensor height v, field of view theta, and range to object r]
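
A minimal sketch of those two relations in Python (the function and variable names are ours):

import math

def vertical_fov(sensor_height, focal_length):
    """Vertical field of view (radians) from sensor height v and focal length f."""
    return 2.0 * math.atan(sensor_height / (2.0 * focal_length))

def distance_per_pixel(range_to_object, fov, n_pixels_vertical):
    """Physical distance covered by one pixel at the given range."""
    return 2.0 * range_to_object * math.tan(fov / 2.0) / n_pixels_vertical

# Example: with the Flea 3's 4.29 mm sensor height, the focal length of the chosen lens
# (not quoted on this slide) sets theta; at range r, one pixel then spans
# distance_per_pixel(r, theta, 2160).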

First-Level Analysis
- An algorithm was developed to determine size; noise was handled with a pixel-density check.
- A sample image was taken and loaded into Matlab. The image was taken at 2 ft of an object with a diameter of roughly 1.19 in.
- The diameter was calculated using the distance per pixel and the number of pixels across the object.
- Calculated size: 1.25 in (error: 5%).
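
The team implemented this in Matlab; the following Python/NumPy sketch shows one way such an estimate could look (the density threshold and mask construction are illustrative assumptions, not the team's algorithm):

import numpy as np

def estimate_diameter(object_mask, dist_per_pixel, min_row_pixels=3):
    """Estimate the object's diameter from a boolean object mask:
    widest row of filled pixels times the distance per pixel."""
    row_counts = object_mask.sum(axis=1)
    # Crude pixel-density check: ignore rows with only a few stray (noise) pixels.
    row_counts = row_counts[row_counts >= min_row_pixels]
    return 0.0 if row_counts.size == 0 else row_counts.max() * dist_per_pixel

# In the slide's test case, this kind of estimate gave 1.25 in for a 1.19 in object (~5% error).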

Considerations in First-Level Analysis
- Contrast, lighting, and spherical debris.
Plan for Addressing Considerations
- Take pictures in the SOSC environment.
Feasibility
- The Point Grey Flea 3 USB 3.0 and an ASUS laptop represent a feasible solution for size characterization.

ASUS Laptop specifications [7]:
- Model Number: X401U-BE20602Z
- Processor: 1.7 GHz
- Memory: 4 GB
- Dimensions: 24 x 34 x 3 cm
- Weight: 18 N
- Price: $230

Point Grey Flea 3 USB 3.0 specifications [6]:
- Pixels: 4096 x 2160
- Field of View: 48 x 36 degrees
- Dimensions: 29 x 29 x 30 mm
- Weight: 0.57 N
- Power: 5 V, <3 W
- Sensor Size: 5.76 x 4.29 mm
- Price: $975

Extrusion Detection
- Functional Requirement FNC.3: PIRANHA shall characterize objects based on their size and shape.
- Design Requirements DES.3.2 & DES.3.3: If an extrusion is detected, PIRANHA shall stop the DCS approach and alert the DCS of the extrusion.
- Baseline design: photoelectric sensors around the rim will detect extrusions.
[Extrusion Detection CONOPS diagram]

The baseline design from the trade study has changed.
Original design: pressure transducers
- Infeasible to approximate the impact force.
- Risk of creating more debris, contrary to the U.S. Government Orbital Debris Mitigation Standard Practices [8].
New design: photoelectric sensor (optical rim sensor)
- Can detect an extrusion without physical contact, reducing the risk of the extrusion hitting the DCS.
- Requires few constraints on the extrusion characteristics.
- Simple electrical/software integration.
- An alert signal is sent to the DCS if the beam is broken, per requirements.

- Sensors will be mounted off the rim to provide full sensor coverage of the DCS rim.
- The sensors and reflectors will require brackets to mount to the DCS; COTS brackets could be used, and mounting brackets specific to the sensors are available for purchase with the sensors.
- Tethers attach the sensors/reflectors to the DCS to minimize debris creation if an extrusion were to hit one of them.
[Diagram: top view of the DCS showing photoelectric transmitters, receivers, reflectors, laser beams, and mounting brackets inside the DCS]

- The sensor must have a range of 24 cm x 8 = 1.92 m to transmit around the rim.
- Beam spreading must be low over the required range to meet the required pointing accuracy.
[Diagram: top view of the DCS; 24 cm segments; 5 mm beam width]

Photoelectric sensors [9,10]:
- Automation Direct FALD-XO-OA (Transmitter): range 50 m; beam spot at 2 m range: <5 mm; cost $75; size 18 mm diameter, 83.7 mm long
- Automation Direct FALD-BN-OA (Receiver): range 50 m; receiver area 18 mm; digital output; cost $37.50; size 18 mm diameter, 83.7 mm long

For this transmitter/receiver pair, the worst-case pointing accuracy is ± 0.44 degrees.
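
To give a feel for what the quoted ± 0.44 degree worst-case pointing accuracy means over the 1.92 m beam path, a quick arithmetic sketch (how the team derived the 0.44 degree figure from the beam and receiver dimensions is not reproduced here):

import math

path_length = 0.24 * 8                      # 1.92 m of beam path around the rim
pointing_error_deg = 0.44                   # quoted worst-case pointing accuracy

# Lateral walk of the beam at the far end of the path for that pointing error.
lateral_offset_mm = 1000.0 * path_length * math.tan(math.radians(pointing_error_deg))
print(round(lateral_offset_mm, 1))           # ~14.7 mm of beam walk over the full 1.92 m path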

Considerations in First-Level Analysis
- Detectable extrusions are assumed to have a minimum thickness of 5 mm.
- The required pointing accuracy is assumed to be achievable.
Plan for Addressing Considerations
- Obtain photoelectric sensors and reflectors as early as possible and test mechanical methods for achieving the required pointing accuracy.
Feasibility
- The Automation Direct FALD-XO-OA/FALD-BN-OA photoelectric transmitter/receiver pair represents a feasible solution for extrusion detection.
[Image: Automation Direct FALD-XO-OA transmitter]

Summary of Baseline Design
- Detection: Point Grey Flea 3 camera
- Ranging & Position Finding: Fluke 411D with Porcupine adapter board; MX-106 servos with mounting brackets
- Size Characterization: Point Grey Flea 3 camera; ASUS X401U-BE20602Z laptop; PIC18F452 microcontroller
- Extrusion Detection: Automation Direct FALD-XO-OA (transmitter); Automation Direct FALD-BN-OA (receiver)
Estimated Total Cost: $2700.00
Estimated Total Size: 24 x 34 x 20 cm

Next Steps
- Detection: take pictures in the SOSC.
- Ranging & Position Finding: design the feedback control law; create an STK scenario to obtain inertial ephemeris data; create a CAD model of the gimbal assembly.
- Size Characterization: test algorithms with pictures from the SOSC.
- Extrusion Detection: create a CAD model of the sensor brackets.


References
1. "The Threat of Orbital Debris and Protecting NASA Space Assets From Satellite Collisions," National Aeronautics and Space Administration. Web. 09 Sept. 2013.
2. "Space in Images - 2012-11 - Space Debris," European Space Agency, n.d. Web. 04 Oct. 2013.
3. Johnson, J.B., "Analysis of Image Forming Systems," Selected Papers on Infrared Design, SPIE Vol. 513 Part Two, Society of Photo-Optical Instrumentation Engineers, 1985.
4. Porcupine Electronics. [http://porcupineelectronics.com/laserbotics.html] Web. 13 Oct. 2013.
5. Trossen Robotics. [http://www.trossenrobotics.com/p/mx-106t-dynamixel-robot-servo.aspx] Web. 13 Oct. 2013.
6. Point Grey Imaging and Innovation. [http://ww2.ptgrey.com/] Web. 13 Oct. 2013.
7. ASUS X401U-BE20602Z, Best Buy. [http://www.bestbuy.com/site/asus-14-laptop-4gb-memory-500gb-hard-drive-matteblack/8850089.p?id=1218914354022&skuid=8850089#tab=specifications] Web. 13 Oct. 2013.
8. "U.S. Government Orbital Debris Mitigation Standard Practices," National Aeronautics and Space Administration, Feb. 2001. Web. 13 Sept. 2013.
9. "Through Beam (SS / FA / FB Series)," AutomationDirect.com, 2013. Web. 12 Oct. 2013.
10. "FA Series Laser Photoelectric Sensors: M18 (18mm) plastic DC," AutomationDirect.com, n.d. Web. 12 Oct. 2013.

Feasible Solution | Cost
- Point Grey Flea 3 USB 3.0 with lens: $975
- Fluke 411D Rangefinder with Porcupine PCB: $250
- 2 servo motors with brackets: $1060
- PIC18F452 Microcontroller: $80
- ASUS Laptop: $230
- Automation Direct transmitter & receiver: $112.50
- Total: $2707.50

Experiments have been done to explore the relationship between imaging system resolution per minimum object dimension and the visual task that can be performed with the image.
- Johnson (1958) quantized the tasks that can be done with an image and associated a minimum resolution with completing each task.
- Imaging system resolution is defined as the number of line pairs that can be fully resolved over the field of view.
[Diagram: one line pair; a camera with one line pair occupying the field of view. Does the field of view have enough pixels to resolve the line pair?]

- If the field of view contains one pixel, that pixel is half black and half white; it returns some value between black and white and appears as a shade of grey, so the line pair is not resolved.
- If the field of view contains two pixels, one pixel is white and the other is black; each pixel returns either white or black, and the line pair is resolved.
[Diagram: what the camera sees vs. what the camera reports for one pixel and for two pixels]
A minimum of two pixels is required to resolve a line pair, as illustrated numerically below.
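
A toy numerical illustration of this sampling argument (values are illustrative only):

import numpy as np

line_pair = np.array([0.0, 255.0])                  # one black stripe and one white stripe

one_pixel = line_pair.mean()                        # a single pixel averages the pair
two_pixels = line_pair.reshape(2, 1).mean(axis=1)   # one pixel per stripe

print(one_pixel)     # 127.5 -> a single grey value; the line pair is not resolved
print(two_pixels)    # [  0. 255.] -> black and white preserved; the line pair is resolved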


Feedback control block diagram variables:
- r: reference position of the object in the image frame
- x: position of the object in the camera image frame
- e: position error in the image frame (e = r - x)
- y: actual position of the object
- θ_d: desired orientation of the motor
- θ_a: actual orientation of the motor
- θ_e: orientation error (θ_e = θ_d - θ_a)
- V: voltage applied to the motor
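
A minimal proportional-control sketch using these variables (Python; the gains and the image-error-to-angle mapping are illustrative assumptions, not the team's control law, which is still being designed):

def control_step(r, x, theta_a, k_image=0.01, k_motor=2.0):
    """One pass through the loop: image error -> desired motor angle -> motor voltage."""
    e = r - x                        # position error in the image (pixels)
    theta_d = theta_a + k_image * e  # nudge the desired orientation toward the target
    theta_e = theta_d - theta_a      # orientation error
    V = k_motor * theta_e            # proportional voltage command to the motor
    return theta_d, V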

The averaging filter kernel, A, is the standard 3 x 3 kernel with every entry equal to 1/9. A subset of the input matrix is multiplied element-wise by the averaging filter kernel, and the sum of these products is the value of the output pixel.
The Sobel filter kernels in the horizontal (Dx) and vertical (Dy) directions are the standard pair
Dx = [ -1 0 +1 ; -2 0 +2 ; -1 0 +1 ]
Dy = [ -1 -2 -1 ;  0 0 0 ; +1 +2 +1 ]
The output pixel has value sqrt(Gx^2 + Gy^2), where Gx and Gy are the responses of the Dx and Dy kernels at that pixel.