EECS498: Autonomous Robotics Laboratory


EECS498: Autonomous Robotics Laboratory Edwin Olson University of Michigan

Course Overview Goal: Develop a pragmatic understanding of both theoretical principles and real-world issues, enabling you to design and program robotic systems incorporating sensing, planning, and acting. Course topics: Kinematics Inverse Kinematics Sensors & Sensor Processing Motors & Control Planning State Estimation Embedded Systems

Evaluation Two major labs, each with multiple checkpoints: ArmLab and BotLab. Midterm bonus. Labs 30%, Midterms 32%, Final Project 32%, Quizzes 5%, Course Eval 1%.

Lab/Project Deliverables In addition to short-response lab writeups: ArmLab: create a poster (abstract, effective visuals). BotLab: oral presentations (e.g. PowerPoint). Final project: interactive demonstration in Tishman Hall.

Course Policies Collaboration: Pair programming, not parallelization. No use of outside resources. Teams can share ideas, but not solutions/code. Group work certifications: "I participated and contributed to team discussions on each problem, and I attest to the integrity of each solution. Our team met as a group on [DATE(s)]." Note any qualifications (we're reasonable). Signatures.

Lateness Assignments due at 11:59p; 10% lateness penalty per day; no credit after three days Excused missed exams/quizzes Quizzes: not considered in grading Exams: oral make-up exams Unexcused exams/quizzes: 0.

Lab Policies Food restricted Non-sticky beverages at stations Anything else discouraged, but some tolerance for responsible snacking away from workstation. No removal of equipment without advance permission. Notify staff of accidents, broken equipment. Secret door code: XXXXXX

Teams ArmLab & BotLab Teams assigned by staff Final project Student-selected teams Peer Evaluations

Teaming Working on a team is an engineering problem in itself. At the beginning of each lab, discuss When/where will you meet? What do you expect of each other? What will you do if problems arise?

Final Projects Scope: Implement a more complicated algorithm; implement a system of multiple algorithms; develop a principled new algorithm; develop a compelling real-world implementation. Evaluation: 50% technical merit, 25% interactivity and engagement of the presentation, 25% web exhibit.

Course Resources Email lists eecs498-staff@april.eecs.umich.edu eecs498@april.eecs.umich.edu Subscribe yourself at: http://april.eecs.umich.edu/mailman/listinfo/eecs498 Wiki http://april.eecs.umich.edu/courses/eecs498_w12/wiki

Course Resources Apps Peer evaluations Real-time course standing Books There is no textbook.

Shared lab space Lab space is shared with 373, which creates some scheduling hazards!

Lab Hours: weekly schedule grid (Monday-Friday, 8am-8pm) showing Labture sessions and staffed lab hours (Ols, Ols/Mort, Mort).

Cameras and Image Formation

World's Simplest Camera? Just hold up a piece of film in front of the world. Do we get an image on the film? For each piece of the film, where do the photons come from?


Let's add an aperture An aperture blocks all but a small subset of the rays, causing the image to appear in focus!

Aperture Size Why not make the aperture super small (a pinhole)? Not enough light registers on our film. What happens when the aperture is bigger? More rays can fit through, so the image gets blurrier. Is there any way of getting a sharp image but allowing more light through? Yes! A lens.

Lenses A lens collects rays with a particular divergence and refocuses them to a point, but points at the wrong distance won't be refocused exactly. Depth of field: how much of the scene is in focus. We're going to ignore this today, however; we'll assume a pinhole model.

Perspective Projection The pinhole creates two similar triangles (involving the focal length f and depth z), allowing us to determine the image coordinate x' in terms of the world coordinate x.

Perspective Projection The pinhole creates two similar triangles, allowing us to determine x' in terms of x: x' = -xf/z. (Why is it negative? The image is flipped; we'll assume from here on out that the camera unflips the image.)

Perspective Projection What are the pixel coordinates where the flame appears? x' = fx/z + c. Measure f in pixels and add an offset c (so that the optical center lands on the middle pixel of the image).
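As a quick illustration of the projection formula above, here is a minimal Python sketch. The focal length, principal point, and 3D point values are made up for illustration and are not parameters of any actual course camera.

```python
def project_pinhole(point_xyz, f_px, cx, cy):
    """Pinhole projection: u = f*x/z + cx, v = f*y/z + cy
    (camera frame with z pointing forward, image already un-flipped)."""
    x, y, z = point_xyz
    return f_px * x / z + cx, f_px * y / z + cy

# Hypothetical numbers: 500-pixel focal length, 640x480 image,
# a point 2 m in front of the camera and 0.5 m to its right.
u, v = project_pinhole((0.5, 0.0, 2.0), f_px=500.0, cx=320.0, cy=240.0)
print(u, v)  # 445.0 240.0
```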

Lens distortions Unfortunately, real (imperfect) lenses further complicate life. Undistorted, pincushion, and barrel (common) distortion.

Calibration Often use a planar target. Compute the geometric relationship between points on the (known) target and observed points; for planar targets this relationship is a homography. Optimize camera parameters to match the observed images.
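The slides don't spell out the homography math, so as a rough sketch under simple assumptions, the following unnormalized direct linear transform (DLT) estimates a 3x3 homography from four or more target-to-image point correspondences; a real calibration would normalize the points and then refine the full camera model with non-linear optimization, as the slide suggests. The correspondence values below are made up.

```python
import numpy as np

def fit_homography(src, dst):
    """Unnormalized DLT: estimate H (3x3) so that dst ~ H @ [x, y, 1]
    for each correspondence. src, dst: (N, 2) point lists, N >= 4."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    # The solution is the right singular vector of A with the smallest
    # singular value (the least-squares null vector of A).
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Made-up example: corners of a unit square on the target mapped to pixels.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(100, 100), (300, 120), (280, 320), (110, 290)]
H = fit_homography(src, dst)
```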

Correcting for lens distortion (Radial Distortion) 1. Compute the pixel coordinates assuming the lens is undistorted. 2. Convert to polar form (r, θ). 3. Compute r' = f(r). 4. Convert r' and θ back to Cartesian coordinates. The function f() is typically a nasty polynomial; we find its parameters using non-linear optimization algorithms.
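A minimal Python sketch of that four-step recipe. The two-coefficient polynomial r' = r(1 + k1*r^2 + k2*r^4), the choice to apply it directly to pixel radii, and the direction of the mapping are illustrative assumptions rather than the specific model used in the course.

```python
import math

def remap_radial(u, v, cx, cy, k1, k2):
    """Apply a radial remapping about the principal point (cx, cy)."""
    dx, dy = u - cx, v - cy                    # 1. coords relative to principal point
    r = math.hypot(dx, dy)                     # 2. convert to polar form
    theta = math.atan2(dy, dx)
    r_new = r * (1.0 + k1 * r**2 + k2 * r**4)  # 3. r' = f(r)
    return (cx + r_new * math.cos(theta),      # 4. back to Cartesian
            cy + r_new * math.sin(theta))
```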

Color Cameras Incoming light is described in terms of a power spectral density. Color isn't a physical property of light; it's made up by our eyes and brain! Different types of incoming light can have the same color because they produce the same S, M, and L cone responses in the eye.

Just for fun...

Bayer Patterns

Bayer Patterns Why does this matter? At each pixel, two of the color channels are interpolated from nearby pixels. Thus, a color camera is blurrier than a monochrome camera.

Bayer Pattern Artifacts When the color of an area is uniform, Bayer patterns work well. What happens when there is a rapid change in color? R, G, and B sub-pixels may observe different PSDs, so interpolated colors may not exist anywhere in the scene! The average of nearby red pixels is red... so there will be a red output pixel even though the incoming light is either white or black.
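A toy numeric version of that failure case, using a made-up single row of an RGGB mosaic and a hard white-to-black edge:

```python
# One row of a Bayer mosaic where sites alternate R G R G R G.
# Scene along this row: left half pure white (255), right half pure black (0).
scene = [255, 255, 255, 0, 0, 0]

# At the green site at index 3 (the first black pixel), the red value must be
# interpolated from the red sites at indices 2 and 4.
interpolated_red = (scene[2] + scene[4]) / 2   # (255 + 0) / 2 = 127.5
measured_green = scene[3]                      # 0
print(interpolated_red, measured_green)        # 127.5 0 -> a reddish output pixel
# The scene contains only white and black, yet the demosaicked output has a
# red-tinted pixel at the edge: a color that exists nowhere in the scene.
```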

JCam

Visualization Fraction of brain devoted to vision: 25-50% (depending on who you ask). That's an awful lot of processing power. Try to use it when you're working on a hard problem!

Why make visualizations? Visualization is the single best use of researcher time. Find bugs faster. Verify algorithms and build intuition. Generate figures/movies for papers/talks.

Visualization Tips Start by visualizing: when designing a system, design your debugging interface first. Visualize creatively: experiment with different rendering schemes; a pretty interface is often a good interface. Exploit time: make movies, not just images, especially with iterative algorithms! Become an expert in a visualization package (Vis).

Example: ICP

Minard's Graph of Napoleon's Army

Name Voyager

Graph Clustering

Vis