COSC343: Artificial Intelligence


Lecture 2: Starting from scratch: robotics and embodied AI
Alistair Knott, Dept. of Computer Science, University of Otago

Embodied AI: starting at the beginning

How hard is it to build something like us? One way of measuring the difficulty of a task is to look at how long evolution took to discover a solution.

Evolving plants from single-cell organisms: 1 billion years
Evolving fish/vertebrates from plants: 1.5 billion years
Evolving mammals: 300 million years
Evolving primates: 130 million years
Evolving the ancestors of great apes: 100 million years
Evolving homo sapiens: 15.5 million years

Most of evolutionary time was spent designing robust systems for physical survival. In this light, what's distinctively human seems relatively unimportant!

An evolutionary approach to AI

Proponents of embodied AI believe that in order to reproduce human intelligence, we need to retrace an evolutionary process:

We should begin by building robust systems that perform very simple tasks, but in the real world.
When we've solved this problem, we can progressively add new functionality to our systems, to allow them to perform more complex tasks.
At every stage, we need to have a robust, working real-world system.

Maybe systems of this kind will provide a good framework on which to implement distinctively human capabilities. This line of thought is associated with a researcher called Rodney Brooks.

Beginning with some simple agents

Some AI researchers are interested in modelling simple biological organisms like cockroaches, woodlice, etc. These creatures are basically reflex agents (cf. Lecture 1):

They sense the world, and their actions are linked directly to what they sense.
They don't have any internal representations of the world.
They don't do any reasoning.
They don't do any planning.

Simple reflex agents

Recall: in Lecture 1, you heard about the agent function, which maps from percept histories to actions:

f : P* → A

For a simple reflex agent, this function is much simpler: it just maps the current percept to an action:

f : P → A
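To make this concrete, here is a minimal sketch of a reflex agent function in the ROBOTC style used later in this lecture. It assumes a light sensor configured under the name lightSensor, and uses an arbitrary threshold of 50; both are illustrative assumptions, not part of the course robots' setup.

task main()
{
  while (true)
  {
    // The agent function: map the current percept directly to an action.
    // lightSensor is an assumed sensor name; 50 is an arbitrary threshold.
    if (SensorValue(lightSensor) > 50)
    {
      motor[motorB] = 50;    // bright ground: drive forward
      motor[motorC] = 50;
    }
    else
    {
      motor[motorB] = 0;     // dark ground: stop
      motor[motorC] = 0;
    }
    wait1Msec(20);           // short pause before the next percept
  }
}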

Braitenberg Vehicles

One very simple simulated organism is a Braitenberg vehicle. The discs on the front are light sensors: lots of light → strong signal; less light → weaker signal. The sensors are connected to motors driving the wheels.

Braitenberg Vehicles

Two initial configurations:

A: same-side connections
B: crossed connections

From these simple connections, can you work out the behaviour?
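A hedged sketch of how the two wirings might look in ROBOTC, assuming two light sensors configured as leftLight and rightLight (hypothetical names) whose readings fall roughly in the same 0-100 range used for motor power:

task main()
{
  while (true)
  {
    // Configuration A (same-side connections): each sensor drives the
    // wheel on its own side, so the vehicle turns away from a light source.
    motor[motorB] = SensorValue(leftLight);    // left wheel  <- left sensor
    motor[motorC] = SensorValue(rightLight);   // right wheel <- right sensor

    // Configuration B (crossed connections) would swap the assignments:
    //   motor[motorB] = SensorValue(rightLight);
    //   motor[motorC] = SensorValue(leftLight);
    // making the vehicle turn towards the light instead.

    wait1Msec(20);
  }
}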

Perception to Action

These simple agents blur the lines between perception and action.

Perception: processes that are used to interpret the environment of an agent. These are processes that turn stimulation into meaningful features.
Action: any process that changes the environment (including the position of the agent in the environment!).

Actually, the lines are blurred in human agents too.


Simple Agents, Complex Behaviour

Often you can get complex behaviour emerging from a simple action function being executed in a complex environment. If an agent is working in the real world, its behaviour is a result of interactions between its agent function and the environment it is in. We can therefore distinguish between algorithmic complexity and behavioural complexity.


Emergent Behaviour

Behaviours which result from the interaction between the agent function and the environment can be termed emergent behaviours. Some particularly interesting emergent behaviours occur when several agents are placed in the same environment.

It's very hard to experiment with emergent behaviours except by building simulations and seeing how they work. Often even simulations are not enough: you need to build real robots.

The subsumption architecture

Rodney Brooks developed an influential model of the architecture of a reflex agent function. The basic idea is that there are lots of separate reflex functions, built on top of one another.

[Diagram: sensory signals feed a stack of layers (wandering, obstacle avoidance, corridor travelling), each with its own perception and action-computation stages; higher layers can suppress the outputs of the layers below. © 1998 Morgan Kaufmann Publishers]

Each agent function is called a behaviour. Behaviours all run concurrently (e.g. wander around, avoid obstacle). One behaviour's output can override that of less urgent behaviours.

An example of the subsumption architecture

Say you're building a robot whose task is to grab any drink can it sees.

Behaviour 1: move around the environment at random.
Behaviour 2: if you bump into an obstacle, inhibit Behaviour 1, and avoid the obstacle.
Behaviour 3: if you see a drink can directly ahead of you, inhibit Behaviour 1, and extend your arm.
Behaviour 4: if something appears between your fingers, inhibit Behaviour 3 and execute a grasp action.
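As a rough illustration only, the inhibition scheme above could be collapsed into a single prioritised loop in ROBOTC. The sensor names canSensor, gripSensor, touchLeft and touchRight are hypothetical placeholders, and a real subsumption controller would run the behaviours as concurrent tasks rather than one loop:

task main()
{
  while (true)
  {
    if (SensorValue(gripSensor) == 1)          // Behaviour 4: something between the fingers
    {
      // grasp action goes here (inhibits Behaviour 3)
    }
    else if (SensorValue(canSensor) == 1)      // Behaviour 3: drink can directly ahead
    {
      motor[motorB] = 0;                       // inhibit Behaviour 1 (stop wandering)
      motor[motorC] = 0;
      // extend the arm here
    }
    else if (SensorValue(touchLeft) == 1 || SensorValue(touchRight) == 1)
    {                                          // Behaviour 2: bumped into an obstacle
      motor[motorB] = -50;                     // back off and turn away
      motor[motorC] = -20;
    }
    else                                       // Behaviour 1: wander around
    {
      motor[motorB] = 50;
      motor[motorC] = 50;
    }
    wait1Msec(20);
  }
}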

Embodied AI: the importance of practical work

Roboticists stress the importance of working in the physical world.

It's very hard to simulate all the relevant aspects of a robot's physical environment.
The physical world is far harder to work with than we might think: there are always unexpected problems.
There are also often unexpected benefits from working with real physical systems.

So we're going to do some practical work.

The LEGO Mindstorms project

Researchers at the MIT Media Lab have been using LEGO for prototyping robotic systems for a long time. One of these projects led to a collaboration with LEGO, which resulted in the Mindstorms LEGO kit. Mindstorms is now a very popular product, which is used by many schools and universities, and comes with a range of different operating systems and programming languages. We're using the third generation of Mindstorms, called EV3.

Mindstorms EV3 components

As well as a lot of different ordinary LEGO pieces, the EV3 kit comes with some special ones.

1. The EV3 brick: a microcontroller, with four inputs and three outputs.

The heart of the EV3 is a 32-bit ARM microcontroller chip. This has a CPU, ROM, RAM and I/O routines. To run, the chip needs an operating system; this is known as its firmware. The chip also needs a program for the O/S to run. The program needs to be given in bytecode (one step up from assembler). Both the firmware and the bytecode are downloaded onto the EV3 brick from an ordinary computer, via a USB link.

Mindstorms components

2. A number of different sensors.

Two touch sensors: basically on-off buttons. These are often connected to whisker sensors.
A colour sensor, which detects light intensity in 3 wavelengths (RGB). This can be used either to pick up ambient light, or to detect the reflectance of a (close) object: the sensor shines a beam of light outwards and, if there's a close object, picks up the light reflected off that object.
A sonar sensor, which calculates the distance of surfaces in front of it.
A microphone, which records sound intensity.
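A minimal sketch of reading these sensors in ROBOTC, under the assumption that they have been configured with the names touchSensor and sonarSensor (placeholder names; the 15 cm threshold is arbitrary):

task main()
{
  while (true)
  {
    int bumped   = SensorValue(touchSensor);   // 1 while the whisker is pressed, else 0
    int distance = SensorValue(sonarSensor);   // distance to the nearest surface ahead

    if (bumped == 1 || distance < 15)          // hit something, or something is close
    {
      motor[motorB] = 0;                       // stop both drive motors
      motor[motorC] = 0;
    }
    wait1Msec(50);                             // poll the sensors ~20 times a second
  }
}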

Mindstorms components

3. Two servomotors (actuators).

The motors all deliver rotational force (i.e. torque). Motors can be programmed to turn at particular speeds, either forwards or backwards. They can also be programmed to lock, or to freewheel. Servomotors come with rotation sensors, so they can be programmed to rotate through a precise angle and then stop.
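For instance, using the rotation sensor (nMotorEncoder, which also appears in the synced-motor example later in this lecture), a motor can be run until it has turned through a set angle. This is only a sketch; it assumes the usual convention of 360 encoder counts per full revolution:

task main()
{
  nMotorEncoder[motorB] = 0;            // reset Motor B's rotation sensor to 0
  while (nMotorEncoder[motorB] < 360)   // 360 counts ~ one full revolution (assumed)
  {
    motor[motorB] = 30;                 // keep running at 30% power
  }
  motor[motorB] = 0;                    // stop once the target angle is reached
}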


A simple mobile robot

The robots you will be working with all have the same design. They're navigational robots, i.e. they move around in their environment.

They are turtles: they have a chassis with two separately controllable wheels at the front, and a pivot wheel at the back.
They have a light sensor underneath, for sensing the colour of the ground at this point.
They have two whisker sensors on the front, for sensing contact with objects to the left and to the right.
They also have a microphone and a sonar device (but there's only space to plug in one of these at a time).

What commands would you need to give to a turtle to make it turn?
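One possible answer, assuming motorB drives the left wheel and motorC the right (the actual wiring on the course robots may differ): drive the two wheels in opposite directions to spin on the spot, or at different speeds to turn in an arc.

task main()
{
  motor[motorB] = 50;    // left wheel forwards (assumed wiring)
  motor[motorC] = -50;   // right wheel backwards: the robot spins on the spot
  wait1Msec(1000);       // keep turning for 1 second
  motor[motorB] = 0;     // then stop both wheels
  motor[motorC] = 0;
}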

The ROBOTC programming language

We'll be programming the robots using a language based on C, called ROBOTC. Here's a simple example program.

task main()
{
  motor[motorC] = 100;    // Start Motor C running at power 100
  motor[motorB] = 100;    // Start Motor B running at power 100
  wait1Msec(4000);        // Wait 4000 milliseconds
  motor[motorC] = -100;   // Start Motor C running backwards
  motor[motorB] = -100;   // Start Motor B running backwards
  wait1Msec(4000);        // Wait 4000 milliseconds
}

The ROBOTC program development environment

ROBOTC program development and execution

Program development cycle:

Boot up on Windows.
Turn the robot on, and connect it to your machine's USB port.
Start the ROBOTC app.
Write a program (in the top panel). Then hit Compile Program. (Errors appear in the bottom panel.)
When it compiles cleanly, hit Download to Robot.

See the 343 web page for much more info.

The EV3 brick control panel

The EV3 brick has an LCD display, and a few buttons to navigate a hierarchical menu. Hit the right button, and then select rc to get a listing of all ROBOTC programs on the robot. When you select a program, it will run. (Make sure the robot's not on the edge of a table when you do this!) To abort the program, hit the top-left button.

Synched motors and servomotors

task main()
{
  nSyncedMotors = synchBC;              // Motor B is master, C is slave
  nSyncedTurnRatio = 100;               // motors move with 100% alignment
  nMotorEncoder[motorB] = 0;            // set the position of Motor B to 0
  while (nMotorEncoder[motorB] < 720)   // run Motor B at 30% power,
  {                                     // until it reaches posn 720
    motor[motorB] = 30;                 // (Motor C is synched)
  }
  motor[motorB] = 0;                    // turn both motors off
  wait1Msec(3000);                      // wait for the above routine to finish!
}

Threads in ROBOTC

ROBOTC supports multiple threads (called tasks). Every program has to contain a task called main. In this example, main starts TOne and TTwo. TTwo kills TOne if the bump sensor is pressed, and restarts TOne when the bump sensor is released.

task TOne();
task TTwo();

task main()
{
  startTask(TOne);
  startTask(TTwo);
  while (true)
  {
    wait1Msec(300);
    // [then do something else, maybe]
  }
  return;
}

Threads in ROBOTC

task TOne()
{
  while (true)
  {
    wait1Msec(300);
    // [then do something]
  }
  return;
}

task TTwo()
{
  while (true)
  {
    wait1Msec(300);
    while (SensorValue(touchSensor) == 1)
    {
      stopTask(TOne);
    }
    startTask(TOne);
  }
  return;
}

For more information about ROBOTC...

There are lots of sample programs in the programming environment (File → Open Sample Program). The 343 web page (LEGO resources) has lots more info, including a user manual and a useful API guide.

The challenge

The physical capabilities of a LEGO robot are very limited. The challenge is to program this simple robot to produce complex behaviours.

Summary

Embodied AI models the aspects of intelligence which relate to how an agent operates in the real world.
In complex environments, simple systems can generate complex behaviours.
The subsumption architecture is a useful control architecture for embodied agents.
We can study embodied agents using LEGO Mindstorms robots.

Reading

For this lecture: take a look at the info about LEGO robots on the COSC343 webpage.
For next lecture: AIMA Sections 18.1, 18.2