Introduction to Vision & Robotics


Introduction to Vision & Robotics
by Bob Fisher (rbf@inf.ed.ac.uk)

Introduction to Robotics. Outline:
- Introduction and some definitions
- Applications of robotics and vision
- The challenge: a demonstration
- Historical highlights

Vision and Robotics: some definitions

Vision is "connecting the computer to the raw unwashed world" (Russell & Norvig); the task of creating, from a 2-D image, "an accurate representation of the three-dimensional world and its properties, then using this information we can perform any visual task" (Aloimonos & Rosenfeld); or "the direct extraction of affordances from the optic array" (Gibson).

A robot is "a programmable multi-function manipulator designed to move material, parts, or specialised devices through variable programmed motions for the performance of a variety of tasks" (Robot Institute of America). "Robotics is the intelligent connection of perception to action" (Brady).

Applications: dull, dirty or dangerous
- Visual inspection of parts
- Detecting crime on CCTV
- Welding on cars
N.B. Overlap with automation.

Applications: dull, dirty or dangerous
- Robot vacuum cleaners
- Cleaning nuclear plants
- Robot sewer inspection
N.B. Overlaps with teleoperation.

More applications: dull, dirty or dangerous
- Visual aids for driving
- Demining
- Space exploration

Applications: also...?
- Entertainment industry
- Service industry
- Science

A challenging problem: we don't have much introspective insight into how we see or how we control action. Building vision and robot systems involves a variety of interacting technology domains: mechanical, electrical, digital, computational... This has proved to be a hard problem for AI: we can beat the human grandmaster at chess, but can't replace a house cleaner.

Vision and robotics use all areas of AI: problem solving, planning, search, inference, knowledge representation, learning, etc. But we can't just plug sensors and effectors onto an AI simulation and expect it to work. We have constraints such as:
- limited, noisy, raw information
- a continuous, dynamic problem space
- time, power, cost and hardware limitations
Often solutions grounded in these constraints do not resemble conventional AI approaches.

Historical highlights: ancient Greek hydraulic and mechanical automata, e.g. Hero of Alexandria, c. AD 100.

Renaissance optics: the algorithmic connection between the world and the image (Dürer, c. 1500). 18th-century clockwork animals: Vaucanson's duck.

Early 20th century: electronic devices for remote control (Tesla); methods for transducing images into electrical signals. "Robot" was used to describe artificial humanoid slaves in Čapek's 1920 play Rossum's Universal Robots.

1940s-1950s: development of the electronic computer and control theory, used for artificial creatures, e.g. Walter's tortoise and the Johns Hopkins Beast.

1960s: industrial robot arms (Unimation); methods for image enhancement and pattern recognition.

1970s: work on systems in restricted domains, e.g. Shakey in the blocks world, and the Freddy assembly task.

1980s: tackling more realistic problems: natural scene analysis, face recognition, dynamic locomotion. Significant impact in manufacturing. Active vision.

Recent highlights: Leg Lab, MIT, 1980 onward; 1995: biped acrobatics.

(Leg Lab continued) 2000: complex biped.

Recent highlights: NavLab, CMU, 1987 onwards. 1995: "No Hands Across America", a drive from Pittsburgh to San Diego, 98.2% autonomous.

Military: Predator UAV.

Walking reactive insects: Attila & Genghis, MIT Brooks Lab, c. 1990.

Barrett Gripper.

Classical control paradigm: SPA (sense-plan-act). SPA is serial, ad hoc, analytical, and assumption-laden. SPA lacks:
- speed and efficiency
- flexibility and adaptivity
- modularity and scalability
- error-tolerance and robustness

Quotations by R. Brooks ("fast, cheap, and out of control"):
- "Planning is just a way of avoiding figuring out what to do next."
- "The world is its own best model."
- "Complex behavior need not necessarily be the product of a complex control system."
- Simplicity is a virtue; robots should be cheap.
- All on-board computation is important.
- Systems should be built incrementally.
- "Intelligence is in the eye of the observer."
- No representation, no calibration, no complex computers.

Subsumption Architecture: a stack of competences between sense and act, where each level takes over "if not" handled by the level below:
- Evaluation of progress
- Scheduling of subtasks
- (Sub-)goal approach
- Path planning
- Self-localization & calibration
- Obstacle avoidance
- Move when clear
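The "if not" chain amounts to running the highest-priority behaviour that claims control. A minimal Python sketch of that arbitration with two of the layers above; the sensor field and action names are made up for illustration:

```python
# Behaviours are tried in priority order; each either claims control by
# returning an action or defers ("if not") to the next layer down.

def obstacle_avoidance(sensors):
    # Higher-priority competence: turn away if something is too close.
    if sensors["front_range_m"] < 0.3:
        return "turn_left"
    return None  # defer

def move_when_clear(sensors):
    # Basic competence: just drive forward.
    return "forward"

LAYERS = [obstacle_avoidance, move_when_clear]  # highest priority first

def act(sensors):
    for behaviour in LAYERS:
        action = behaviour(sensors)
        if action is not None:  # this layer suppresses those below it
            return action

print(act({"front_range_m": 0.2}))  # -> turn_left
print(act({"front_range_m": 2.0}))  # -> forward
```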

Evaluation of the Subsumption Architecture:
- "I wouldn't want one to be my chauffeur" (C. Thorpe)
- Modifications at low levels affect higher levels
- Often the hierarchy is not strict: priorities rather than inhibition
- Representations, plans, and models do help
- Reproducibility is a virtue
- SPA is top-down, the subsumption architecture is bottom-up: "neats" vs. "scruffies"

Sensing the world. Keypoints:
- Why robots need sensing
- Factors that affect sensing capability
- Contact sensing
- Proximity and range sensing
- Sensing light

Why robots need sensing: for a robot to act successfully in the real world it needs to be able to perceive the world, and itself in the world. Sensing tasks fall into two broad classes:
- Finding out what is out there: e.g. is there a goal; is this a team-mate; is there danger? = Recognition
- Finding out where things are: e.g. where is the ball and how can I get to it; where is the cliff-edge and how can I avoid it? = Location
But note that this need not be explicit knowledge.

Sensing capability depends on a number of factors:
1. What signals are available? Light, pressure & sound, chemicals.

N.B. There are many more signals in the world than humans usually sense: e.g. electric fish generate an electric field and detect its distortion.

Sensing capability depends on a number of factors:
1. What signals are available?
2. What are the capabilities of the sensors?
- Distance: vision, hearing, smell
- Contact: taste, pressure, temperature
- Internal: balance, actuator position/movement, pain/damage

Note this differs across animals: e.g. bees see ultraviolet light. [Figure: the same flower in visible and ultraviolet light; in UV it looks more like a target.] We need to choose what to build into a robot: options and costs.

Sensors perform transduction. Transduction: transformation of energy from one form to another (typically into electrical signals).

Sensors perform transduction. Sensor characteristics mean there is rarely an isomorphic mapping between the environment and the internal signal, e.g.:
- Most transducers have a limited range
- Most transducers have limited resolution, accuracy, and repeatability
- Most transducers have lags or sampling delays
- Many transducers have a non-linear response
- Biological transducers are often adaptive
- Good sensors are usually expensive in cost, power, size

Sensing capability depends on a number of factors:
1. What signals are available?
2. What are the capabilities of the sensors?
3. What processing is taking place? E.g. extracting useful information from a sound signal is difficult:

Sound sources cause air vibration. A diaphragm (ear drum or microphone) has a complex pattern of vibration in response to sound. This is usually analysed by separating frequencies and grouping through harmonic/temporal cues (see the sketch below). [Figure: spectrogram, frequency vs. time.]

Sensing capability depends on a number of factors:
1. What signals are available?
2. What are the capabilities of our sensors?
3. What processing is taking place?
4. What is the task?
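The frequency separation above is commonly done with a short-time Fourier transform. A minimal numpy sketch on a synthetic tone; the sample rate, frame size, and signal are illustrative choices, not from the lecture:

```python
import numpy as np

fs = 8000                                  # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
# Synthetic "sound": a 440 Hz tone, its harmonic, and some noise.
x = np.sin(2*np.pi*440*t) + 0.5*np.sin(2*np.pi*880*t) \
    + 0.1*np.random.randn(t.size)

frame = 256                                # samples per analysis window
window = np.hanning(frame)
spectrogram = []
for start in range(0, x.size - frame, frame // 2):   # 50% overlap
    spectrum = np.fft.rfft(window * x[start:start + frame])
    spectrogram.append(np.abs(spectrum))   # magnitude per frequency bin

spectrogram = np.array(spectrogram)        # shape: (time frames, freq bins)
freqs = np.fft.rfftfreq(frame, 1 / fs)
peak = freqs[spectrogram.mean(axis=0).argmax()]
print(f"dominant frequency ~ {peak:.0f} Hz")  # close to 440 Hz
```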

Classical view: Transduction -> Processing -> Internal model -> Decision on action (given the task) -> Plan of action -> Motor commands -> Actuators.

Alternative view: task-specific channels running side by side, each with its own task-specific transduction -> task-specific processing -> task-specific action (one channel per task: task 1, task 2, task 3). A sketch contrasting the two views follows below.
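A small sketch of the contrast; the sensor fields, tasks, and channel functions are hypothetical stand-ins:

```python
RAW = {"bumper": True, "light": 0.8}   # made-up raw sensor readings

# Classical view: one serial pipeline through a shared internal model.
def classical(raw):
    model = {"obstacle": raw["bumper"], "bright": raw["light"] > 0.5}
    return "back_up" if model["obstacle"] else "seek_light"

# Alternative view: independent task-specific channels, each wiring its
# own transduction, processing and action.
def avoid_channel(raw):
    return "back_up" if raw["bumper"] else None

def phototaxis_channel(raw):
    return "seek_light" if raw["light"] > 0.5 else None

print(classical(RAW))                                   # back_up
print([ch(RAW) for ch in (avoid_channel, phototaxis_channel)])
```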

Contact sensors. Principal function is location, e.g. a bump switch or pressure sensor: is the object contacting this part of the robot? Antennae extend the range with a flexible element.

Contact sensors can also be used for recognition, e.g. is it moving or are you? Human touch can distinguish shape, force, slip, surface texture. Rat whiskers are used to distinguish textures.

Contact sensors: note these kinds of sensors can also be used to detect flow, e.g. wind sensors.

Proximity and range sensors. Again the main function is position: distance to an object at a specific angle to the robot. These typically work by emitting a signal and detecting the reflection. Short range = proximity sensor, e.g. IR.

Proximity and range sensors. Over longer distances = range sensors, e.g. sonar: emit sound and detect the reflection.
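Since the ping travels out and back, range follows directly from the echo delay. A minimal sketch, assuming sound at roughly 343 m/s in air:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def sonar_range(echo_delay_s: float) -> float:
    """Range (m) to the reflecting surface from round-trip echo time."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0  # halve: out and back

print(sonar_range(0.01))  # a 10 ms round trip -> ~1.7 m
```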

Sonar behaviour (figure panels):
a. Sonar reflection time gives range
b. Can only resolve objects of beam width
c. Apparent range shorter than axial range
d. Angle too large, so wall invisible
e. Invisible corner
f. False reflection makes apparent range greater

Using sonar to construct an occupancy grid. The robot wants to know about free space. Map space as a grid; each element has a value which is the probability that it contains an obstacle. Update the probability estimates from sonar readings.

Learning the map. Assuming the robot knows where it is in the grid, sensory input provides noisy information about obstacles, e.g. for sonar:

[Figure: sonar beam geometry: beam half-angle β, maximum range R, measured range s; a cell z at range r and bearing α falls in one of regions I, II, III of the beam.]

The probability p(z|O) of grid element z = (r, α) in region I, if occupied (O), given measurement s:

p(z|O) = (λ/2) [ (R − r)/R + (β − α)/β ]

Using a Bayesian approach, where the prior p(O) depends on previous measurements:

p(O|z) = p(z|O) p(O) / [ p(z|O) p(O) + p(z|¬O) p(¬O) ]

Sample occupancy grid: noisy fusion of multiple sonar observations.
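A minimal sketch of this Bayesian update for a single grid cell; the likelihood values are illustrative numbers rather than outputs of the sonar model above:

```python
def update_occupancy(prior, p_z_occ, p_z_free):
    """P(O|z) = p(z|O)P(O) / (p(z|O)P(O) + p(z|~O)P(~O))."""
    num = p_z_occ * prior
    return num / (num + p_z_free * (1.0 - prior))

p = 0.5                      # uninformed prior for this cell
for _ in range(3):           # fuse three readings suggesting "occupied"
    p = update_occupancy(p, p_z_occ=0.7, p_z_free=0.3)
    print(round(p, 3))       # 0.7, 0.845, 0.927 -> belief strengthens
```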

Proximity and range sensors: more accurate information (same principle) from a laser range finder, either planar or scanning: 1,000,000 pixels per second, a range of 30 metres, accuracy of a few mm. [Figure: sample laser scan.]

Light sensors. Why is it so useful to detect light?
- Light travels in straight lines, so the rays reflected from objects can be used to form an image, giving you where.
- Very short wavelengths give detailed structural information (including reflectance properties of the surface, seen as colour) to determine what.
- Light is very fast, so it is especially useful over large distances.
- But it requires half our brain to do vision.

Conclusions:
- Robots need sensing: location, objects, obstacles
- Commonly used sensors: laser range, sonar, contact, proprioceptive, GPS (outdoors), markers
- General scene scanning vs. affordances