Proximity Identification, Characterization, And Neutralization by Thinking before Acquisition (PIRANHA)
Preliminary Design Review
Customers: Barbara Bicknell, Jeffrey Weber
Team: Aaron Buysse, Kevin Rauhauser, Chad Caplan, Ryan Slabaugh, Matt Holmes, Rebecca Travers, Colin Nugen
Advisor: Dr. Jelliffe Jackson
11/11/2013
University of Colorado Aerospace Engineering Sciences
Outline
Background: Need & Motivation; Objective & CONOPS; Functional Block Diagram; Baseline Design
Critical Elements & Feasibility: Detection; Ranging; Position Finding; Size Characterization; Extrusion
Overall Feasibility Summary
Facts
Kessler Syndrome
Millions of pieces of debris are in orbit.1
Debris larger than 4 inches will cause catastrophic damage to a space asset; more than 22,000 pieces of debris are in this category.1
The largest density of debris is in Low Earth Orbit (750-800 km).
No active measures are taken to reduce debris.
[Figure: Space debris in orbit (European Space Agency).2]
Space Operations Simulation Center
[Figure: CONOPS in the SOSC - the DCS 6-DOF robot carries PIRANHA, which must DETECT & RANGE the debris, THINK (position finding), and COMMUNICATE.]
Baseline Design
Detection & Position Finding: Camera & Rangefinder
Size Characterization: Camera
Extrusion Detection: Optical Rim Sensor
[Figure: PIRANHA baseline layout (90 cm x 60 cm) on the heritage Debris Capture System - rangefinder and camera on a 2-DOF actuator, software & computing hardware, and an optical rim sensor.]
2-DOF Motor & Feedback Control for Centering Camera Field of View
Functional Requirement FNC.2: PIRANHA shall detect the presence of debris within a 20 m proximity of the DCS.
Critical Project Element CPE.1.1: PIRANHA must detect debris in a simulated space environment.
Baseline design from trade study: camera.
[Figure: Detection CONOPS]
Digital images are composed of a finite number of units called pixels; each pixel holds one value representing color or shade.
As distance from the lens increases, the same number of pixels must cover a larger area.
An object fills some number of pixels, and the number of pixels filled decreases with distance from the lens.
[Figure: An object spanning 2 pixels of the field of view, with the field-of-view angle shown.]
We want some knowledge of the object in view. A relationship exists between the pixels filled by the object and the knowledge that can be gained.
[Figure: Light entering the camera vs. the image the camera produces - the object spans 2 pixels of the field of view.]
There is no way to tell the object's shape with just one pixel filled.
Experiments have been done specifically with electro-optical sensors exploring the knowledge that can be gained vs. the pixels filled by the object.

Experiment Results3
Knowledge of Object to be Gained | Required Resolution (pixels/minimum object dimension)
Detection | 2.0 ± 0.5
Orientation | 2.8 ± 0.7
Recognition | 8.0 ± 1.6
Identification | 12.8 ± 3.0

[Figure: An object filling two pixels across its smallest dimension in a 6 x 8 pixel camera field of view - detection is possible.]
We must find the pixel size at maximum range and relate it to the smallest detectable object: pixel size = (field-of-view width at the given range) / (number of pixels).

Worst case: range = 20 m, object dimension = 10 cm
Camera field of view (Point Grey Flea 3): 36 degrees
Pixels: 2160
Pixels per minimum object dimension: 16.6
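The worst-case figure above can be reproduced with a few lines. This is a minimal sketch using the numbers on this slide (36 degree field of view, 2160 pixels, 20 m range, 10 cm object); the flat-screen geometry is an assumption.

```python
import math

fov_deg = 36.0    # camera field of view (degrees)
n_pixels = 2160   # pixels across that field of view
range_m = 20.0    # worst-case range to the object
obj_m = 0.10      # minimum object dimension (10 cm)

fov_width = 2 * range_m * math.tan(math.radians(fov_deg / 2))  # m covered by the FOV
pixel_size = fov_width / n_pixels       # m of scene per pixel at 20 m
pixels_per_dim = obj_m / pixel_size     # pixels across a 10 cm object
print(round(pixels_per_dim, 1))         # 16.6, well above the 2.0 detection threshold
```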
A simple video was taken with the camera rotating such that the object moves across the image frame. The background image is subtracted from each frame in the video to detect the object moving in the foreground.
[Figure: Frame at time t1, frame at time t2, and the difference between the frames.]
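The subtraction step above can be sketched in a few lines. A minimal sketch, assuming 8-bit grayscale frames and a hypothetical brightness threshold of 30:

```python
import numpy as np

def detect_foreground(frame, background, threshold=30):
    """Flag pixels whose difference from the background exceeds a threshold."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold                    # boolean mask of foreground pixels

background = np.zeros((6, 8), dtype=np.uint8)  # empty background frame
frame = background.copy()
frame[2:4, 3:5] = 200                          # a bright object enters the frame
mask = detect_foreground(frame, background)
print(int(mask.sum()))                         # 4 pixels flagged as moving foreground
```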
Considerations in First-Level Analysis
- Johnson does not define the visual tasks very precisely.
- Experiments were on relatively large objects in laboratory conditions.
- Method of detection is not accounted for.

Plan for Addressing Considerations
- Perform tests in the SOSC to assure detection is possible and reliable.

Feasibility: The Point Grey Flea 3 USB 3.0 represents a feasible solution for detection.

Point Grey Flea 3 USB 3.06
Specification | Value
Pixels | 4096 x 2160
Field of View | 48 x 36 degrees
Dimensions | 29 x 29 x 30 mm
Weight | 0.57 N
Power | 5 V, <3 W
Sensor Size | 5.76 x 4.29 mm
Price | $975
Functional Requirement FNC.2: PIRANHA shall detect the presence of debris within a 20 m proximity of the DCS.
Design Requirement DES.2.1: Detection of debris shall also include determining the relative distance between the DCS and the debris.
Baseline design: camera & rangefinder on an actuator with feedback control.
[Figure: Centering the debris in the camera field of view, then ranging the debris.]
Convolution filters are used to remove noise and detect the edges of the object.
- Sobel filter: computes the spatial gradient (horizontal and vertical) of pixel values to detect edges.
- Averaging filter: computes the average of surrounding pixels to remove noise.
[Figure: Original image; image with Sobel filter (edge detection); image with Sobel and averaging filters.]
Feedback control and motors are then used to align the object's centroid with the center of the field of view.
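The two filters can be sketched as below. This is a minimal sketch assuming the standard 3x3 averaging and Sobel kernels (the slide does not fix the kernel sizes):

```python
import numpy as np
from scipy.signal import convolve2d

avg = np.full((3, 3), 1 / 9)                              # averaging (noise-removal) kernel
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])  # horizontal-gradient kernel
sobel_y = sobel_x.T                                       # vertical-gradient kernel

img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0                                       # bright square on a dark background

gx = convolve2d(img, sobel_x, mode="same")
gy = convolve2d(img, sobel_y, mode="same")
edges = np.hypot(gx, gy)                                  # gradient magnitude: large at edges
smooth = convolve2d(edges, avg, mode="same")              # averaging pass suppresses pixel noise
print(edges[4, 4] == 0.0, edges.max() > 0.0)              # True True: flat interior, strong edges
```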
First-Level Analysis: Laser Rangefinder
The rangefinder is mounted a small height above the camera, so it must be angled down relative to the camera field of view to center on the debris; the tolerance on that angle comes from the minimum diameter of the debris.

Distance where rangefinder is centered on debris (m) | Angle between rangefinder and camera field of view (degrees)
20 | 0.114 ± 0.143
10 | 0.229 ± 0.286
5 | 0.458 ± 0.573
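The table above follows from simple geometry. The mounting height is not stated on this slide; h = 4 cm is an assumed value that approximately reproduces the tabulated angles, and the ± term is half the 10 cm minimum debris diameter.

```python
import math

h = 0.04        # m, assumed rangefinder height above the camera boresight
r_min = 0.05    # m, half the 10 cm minimum debris diameter

for dist in (20.0, 10.0, 5.0):
    theta = math.degrees(math.atan(h / dist))    # tilt needed to center on the debris
    tol = math.degrees(math.atan(r_min / dist))  # slack allowed by the debris size
    print(f"{dist:4.0f} m: {theta:.3f} +/- {tol:.3f} deg")
```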
Considerations in First-Level Analysis
- Debris is contrasted from the background.
- Feedback control is available to align the debris in the center of the field of view.

Plan for Addressing Considerations
- Design the feedback control law.

Feasibility: The Fluke 411D rangefinder, MX-106 servos, and a microcontroller represent a feasible solution for ranging. A Microchip PIC18F452 microcontroller will be used to command the servos.

Fluke 411D Rangefinder4
Specification | Value
Weight | 1.47 N
Supply Voltage | 5 V
Resolution | 3 mm
Size | 123 x 50 x 26 mm
Price | $100 ($250 with board)

MX-106 Servo & Bracket5
Specification | Value
Weight | 1.5 N
Supply Voltage | 11-15 V
Resolution | 0.088 degrees
Holding Torque | 8.2 N·m
Size | 40 x 65 x 46 mm
Price for 2 motors | $1000 ($1060 with brackets)
Functional Requirement FNC.4: PIRANHA shall output relative position and velocity vectors (pointing and tracking telemetry) of the debris with respect to the DCS.
Critical Project Element CPE.1.2: PIRANHA must be capable of outputting commands at the rates and specifications required by the SOSC robot.
Baseline design: camera & rangefinder.
[Figure: Position Finding CONOPS]
Camera and Laser First-Level Analysis
[Figure: Reference frames and position vectors. A: DCS frame. B: camera frame at the DCS gimbal point. C: front of camera. D: debris centroid. E: Earth (J2000 reference). Some relative position vectors are sensed, some are known, and the desired goal is to calculate the debris position in the inertial frame.]
Calculation of the transformation from B (camera frame) to A (DCS frame)
The camera reference frame will have two degrees of freedom. The motor actuation angles can be measured with:
- a stepper motor with known step size
- an encoder
- a potentiometer
- an inclinometer
- a COTS motor that outputs position
One COTS option is the MX-106 servo.
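Once the two actuation angles are measured, the frame transformation can be assembled from elementary rotations. This is a sketch under assumed conventions (pan about z, then tilt about y; the actual gimbal axis order is not fixed by the slide):

```python
import numpy as np

def rot_z(a):
    """Elementary rotation about the z axis by angle a (rad)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    """Elementary rotation about the y axis by angle a (rad)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def camera_to_dcs(pan, tilt):
    """Transformation from camera frame B to DCS frame A (assumed z-pan, y-tilt order)."""
    return rot_z(pan) @ rot_y(tilt)

# Debris 5 m straight ahead of the camera, gimbal at 30 deg pan and 10 deg tilt.
r_cam = np.array([5.0, 0.0, 0.0])
r_dcs = camera_to_dcs(np.radians(30.0), np.radians(10.0)) @ r_cam
print(np.round(r_dcs, 3))
```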
Considerations in First-Level Analysis
- Transformation from the DCS frame to the camera frame is known accurately.
- Range to the object is known accurately.
- Inertial position is available as a function of time.

Plan for Addressing Considerations
- Create an STK scenario to obtain inertial position data.

Feasibility: The MX-106 servo represents a feasible solution for position finding. It provides position, temperature, and voltage feedback and is easily extendable to multiple degrees of freedom.

MX-106 Servo
Specification | Value
Weight | 1.5 N
Supply Voltage | 12-15 V
Resolution | 0.088 degrees
Size | 40 x 65 x 46 mm
Price for 2 motors | $1000
Functional Requirement FNC.3: PIRANHA shall characterize objects based on their size and shape.
Baseline design: camera.
[Figure: Size Characterization CONOPS - possible scenarios; a contrasted image of the target; image processing converts pixels to a maximum diameter.]
First-Level Analysis
Using the focal length (f) of the camera and the height (v) of the digital image sensor, the vertical field of view (θ) can be calculated:

θ = 2 arctan(v / (2f))

Then the distance per pixel (d) in the image can be calculated using geometric relations, the range to the object (r), and the number of pixels (n) in the vertical direction:

d = 2r tan(θ/2) / n

[Figure: Camera body showing focal length (f), image sensor height (v), field of view (θ), and range to object (r).]
First-Level Analysis
An algorithm was developed to determine size; noise was handled with a pixel-density check. A sample image was taken and loaded into Matlab. The image was taken at 2 ft with an object of diameter roughly 1.19 in, and the diameter was calculated using the distance per pixel and the number of pixels across the object.
Calculated size: 1.25 in (5% error)
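The size calculation can be sketched by combining the two relations from the previous slide with the Flea 3 numbers quoted elsewhere in these slides (4.29 mm sensor height, 36 degree vertical field of view, 2160 vertical pixels); the pixel count across the object (173) is an assumed value for illustration, not the measured count.

```python
import math

v = 4.29e-3                                   # m, image-sensor height (Flea 3 spec)
f = v / (2 * math.tan(math.radians(36 / 2)))  # focal length implied by the FOV relation
theta = 2 * math.atan(v / (2 * f))            # vertical field of view (rad), recovers 36 deg
n = 2160                                      # vertical pixels
r = 0.6096                                    # m, the 2 ft range of the test image
d = 2 * r * math.tan(theta / 2) / n           # m of scene per pixel at range r
pixels_across = 173                           # assumed pixel count across the object
print(f"diameter ~ {pixels_across * d / 0.0254:.2f} in")  # prints: diameter ~ 1.25 in
```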
Considerations in First-Level Analysis
- Contrast
- Lighting
- Spherical debris

Plan for Addressing Considerations
- Pictures in the SOSC environment

Feasibility: The Point Grey Flea 3 USB 3.0 and an ASUS laptop represent a feasible solution for size characterization.

ASUS Laptop7
Specification | Value
Model Number | X401U-BE20602Z
Processor | 1.7 GHz
Memory | 4 GB
Dimensions | 24 x 34 x 3 cm
Weight | 18 N
Price | $230

Point Grey Flea 3 USB 3.06
Specification | Value
Pixels | 4096 x 2160
Field of View | 48 x 36 degrees
Dimensions | 29 x 29 x 30 mm
Weight | 0.57 N
Power | 5 V, <3 W
Sensor Size | 5.76 x 4.29 mm
Price | $975
Functional Requirement FNC.3: PIRANHA shall characterize objects based on their size and shape.
Design Requirements DES.3.2 & DES.3.3: If an extrusion is detected, PIRANHA shall stop the DCS approach and alert the DCS of the extrusion.
Baseline design: photoelectric sensors around the rim will detect extrusions.
[Figure: Extrusion Detection CONOPS]
Baseline design from trade study changed: switched to an optical rim sensor.
Original design: pressure transducers
- Unfeasible to approximate impact force.
- Risk of creating more debris, contrary to the U.S. Government Orbital Debris Mitigation Standard Practices.8
New design: photoelectric sensor
- Can detect an extrusion without physical contact, reducing the risk of the extrusion hitting the DCS.
- Requires few constraints on the extrusion characteristics.
- Simple electrical/software integration.
- An alert signal is sent to the DCS if the beam is broken, per requirements.
Sensors will be mounted off the rim to provide full sensor coverage of the DCS rim. Sensors and reflectors will require brackets to mount to the DCS; COTS brackets could be used, and mounting brackets specific to the sensors are available for purchase with the sensors. Tethers attach the sensors and reflectors to the DCS to minimize debris creation if an extrusion were to hit one of them.
[Figure: Top view of DCS showing the photoelectric transmitter, receiver, reflectors, mounting brackets, and the laser beam path around the rim.]
The sensor must have a range of 24 cm x 8 = 1.92 m to transmit around the rim, and beam spreading must be low enough over that range to meet the required pointing accuracy.

Photoelectric Sensor9,10 | Range (m) | Beam/Receiver Dimensions | Output | Cost | Size
Automation Direct FALD-XO-OA (Transmitter) | 50 | Beam spot at 2 m range: <5 mm | Digital | $75 | 18 mm diameter, 83.7 mm long
Automation Direct FALD-BN-OA (Receiver) | 50 | Receiver area: 18 mm | -- | $37.50 | 18 mm diameter, 83.7 mm long

For this transmitter/receiver pair, the worst-case pointing accuracy is ± 0.44 degrees.
[Figure: Top view of DCS - 24 cm rim segments; 5 mm minimum extrusion width.]
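The range requirement above can be checked directly against the rated sensor range. A minimal sketch using the values from this slide (eight 24 cm rim segments, 50 m rated range):

```python
segment_m = 0.24                            # m, one rim segment
n_segments = 8
beam_path_m = segment_m * n_segments        # beam must cover the full rim perimeter
rated_range_m = 50.0                        # m, FALD-XO-OA rated range
print(round(beam_path_m, 2), beam_path_m <= rated_range_m)  # 1.92 True
```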
Considerations in First-Level Analysis
- Detectable extrusions are assumed to have a minimum thickness of 5 mm.
- The required pointing accuracy is assumed achievable.

Plan for Addressing Considerations
- Obtain photoelectric sensors and reflectors as early as possible and test mechanical methods for achieving the required pointing accuracy.

Feasibility: The Automation Direct FALD-XO-OA/FALD-BN-OA photoelectric transmitter/receiver pair represents a feasible solution for extrusion detection.
Detection: Point Grey Flea 3 camera
Ranging & Position Finding: Fluke 411D with Porcupine adapter board; MX-106 servos with mounting brackets
Size Characterization: Point Grey Flea 3 camera; ASUS X401U-BE20602Z laptop; PIC18F452 microcontroller
Extrusion Detection: Automation Direct FALD-XO-OA (transmitter); Automation Direct FALD-BN-OA (receiver)
Estimated total cost: $2700.00
Estimated total size: 24 x 34 x 20 cm
Detection: take pictures in the SOSC.
Ranging & Position Finding: design the feedback control law; create an STK scenario to obtain inertial ephemeris data; create a CAD model of the gimbal assembly.
Size Characterization: test algorithms with pictures from the SOSC.
Extrusion Detection: create a CAD model of the sensor brackets.
References
1. "The Threat of Orbital Debris and Protecting NASA Space Assets from Satellite Collisions," National Aeronautics and Space Administration. Web. 9 Sept. 2013.
2. "Space in Images - 2012-11 - Space Debris," European Space Agency, n.d. Web. 4 Oct. 2013.
3. Johnson, J.B., "Analysis of Image Forming Systems," Selected Papers on Infrared Design, SPIE Vol. 513 Part Two, Society of Photo-Optical Instrumentation Engineers, 1985.
4. Porcupine Electronics. [http://porcupineelectronics.com/laserbotics.html]. Web. 13 Oct. 2013.
5. Trossen Robotics. [http://www.trossenrobotics.com/p/mx-106t-dynamixel-robot-servo.aspx]. Web. 13 Oct. 2013.
6. Point Grey Imaging and Innovation. [http://ww2.ptgrey.com/]. Web. 13 Oct. 2013.
7. ASUS X401U-BE20602Z, BestBuy. [http://www.bestbuy.com/site/asus-14-laptop-4gb-memory-500gb-hard-drive-matteblack/8850089.p?id=1218914354022&skuid=8850089#tab=specifications]. Web. 13 Oct. 2013.
8. U.S. Government Orbital Debris Mitigation Standard Practices, National Aeronautics and Space Administration, Feb. 2001. Web. 13 Sept. 2013.
9. "Through Beam (SS / FA / FB Series)," AutomationDirect.com, 2013. Web. 12 Oct. 2013.
10. "FA Series Laser Photoelectric Sensors: M18 (18mm) plastic DC," AutomationDirect.com, n.d. Web. 12 Oct. 2013.
Feasible Solution | Cost
Point Grey Flea 3 USB 3.0 with lens | $975
Fluke 411D Rangefinder with Porcupine PCB | $250
2 servo motors with brackets | $1060
PIC18F452 microcontroller | $80
ASUS laptop | $230
Automation Direct transmitter & receiver | $112.50
Total | $2707.50
Experiments have been done to explore the relationship between imaging-system resolution per minimum object dimension and the visual task that can be done with the image. Johnson (1958) quantized the tasks that can be done with an image and associated a minimum resolution with completing each task. He defines imaging-system resolution as the number of line pairs that can be fully resolved over the field of view.
[Figure: One line pair; a camera with one line pair occupying the field of view. Does the field of view have enough pixels to resolve the line pair?]
- If the field of view contains one pixel, that pixel is half black and half white; it returns some value between black and white and appears some shade of grey.
- If the field of view contains two pixels, one pixel is white and the other is black; each pixel returns either white or black, and the line pair is resolved.
Two pixels minimum are required to resolve a line pair.
[Figure: What the camera sees vs. what the camera reports for one pixel and for two pixels.]
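The one-pixel vs. two-pixel argument can be stated numerically. A minimal sketch, treating black as 0.0 and white as 1.0:

```python
line_pair = [0.0, 1.0]              # one black half, one white half
one_pixel = sum(line_pair) / 2      # a single pixel averages the pair to grey
two_pixels = line_pair              # one pixel per half keeps black and white distinct
print(one_pixel, two_pixels)        # 0.5 [0.0, 1.0]
```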
r - reference position of object in frame
x - position of object in camera, in frame
e - position error in frame (e = r - x)
y - actual position of object
θd - desired orientation of motor
θa - actual orientation of motor
θe - orientation error (θe = θd - θa)
V - voltage applied to motor
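These variables fit together in a simple centering loop. A minimal sketch assuming a purely proportional law with a hypothetical gain k = 0.5 and the Flea 3 pixel scale (the actual control law is still being designed):

```python
k = 0.5                             # proportional gain (assumed for illustration)
pixels_per_deg = 2160 / 36          # camera scale factor (pixels per degree)
r = 0.0                             # reference position: object centered (pixels)
x = 10.0                            # object position in the frame (pixels off-center)
theta_a = 0.0                       # actual motor orientation (deg)
for _ in range(20):
    e = r - x                       # position error in pixels
    theta_e = e / pixels_per_deg    # pixel error mapped to an orientation error (deg)
    theta_a += k * theta_e          # command the motor by a fraction of the error
    x += k * e                      # camera motion moves the object toward center
print(abs(x) < 0.01)                # True: the error is driven toward zero
```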
The averaging filter kernel, A, is

A = (1/9) [1 1 1; 1 1 1; 1 1 1]

A subset of the input matrix is multiplied element-wise by the averaging filter kernel; the sum of these products is the value of the output pixel.

The Sobel filter kernels in the horizontal (Dx) and vertical (Dy) directions are

Dx = [-1 0 1; -2 0 2; -1 0 1],  Dy = [-1 -2 -1; 0 0 0; 1 2 1]

The output pixel has value sqrt(Gx^2 + Gy^2), where Gx and Gy are the horizontal and vertical gradient responses.
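The element-wise multiply-and-sum step can be shown on a single 3x3 patch. A minimal sketch assuming the standard kernels named above:

```python
import numpy as np

A = np.full((3, 3), 1 / 9)                                # averaging kernel
Dx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])       # horizontal Sobel kernel
patch = np.array([[0, 0, 1],
                  [0, 0, 1],
                  [0, 0, 1]], dtype=float)                # image patch with a vertical edge
gx = float((patch * Dx).sum())    # element-wise multiply-and-sum: horizontal gradient
avg = float((patch * A).sum())    # same step with the averaging kernel: smoothed value
print(gx, round(avg, 3))          # 4.0 0.333
```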