
Design Lab 2: Controlling Robots
6.01 Fall 2011

Goals:
- Experiment with state machines controlling real machines.
- Investigate real-world distance sensors on 6.01 robots: sonars.
- Build and demonstrate a state machine to make the robot do a task: following a boundary.

1 Materials

This lab should be done with a partner. Each partnership should have:
- A lab laptop.
- A robot, a (long, gray) serial cable, and a (short, blue) serial-to-USB adapter. The serial cable is a long beige or gray cable. Most of the robots already have one attached.
- A white foam-core board with bubble wrap on one side.

Warning: if your robot starts to go too fast or get away from you, pick it up!

Be sure to mail all of your code and data to your partner. You will both need to bring it with you to your first interview.

2 Simple Brains

A brain is a Python program that specifies behavior for the robot. The process of constructing and running a brain is described in detail in the Robot Infrastructure Guide.

Objective: Build a state machine brain for controlling a robot, first in the soar simulator, then on a real robot.
- Run a simple brain in soar, and record the robot's path.
- Modify the simple brain to make the robot rotate in place.
- Run the brain on the Pioneer robot platform.

Note: Some of the software and design labs contain the command athrun 6.01 getfiles. Please disregard this instruction; the same files are available on the 6.01 OCW Scholar site as a .zip file, labeled Code for [Design or Software Lab number].
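The Robot Infrastructure Guide describes the state machine interface in detail. As a rough standalone sketch (a simplified assumption here, not the full 6.01 library class), a state machine is an object with a startState attribute and a getNextValues(state, inp) method returning a (nextState, output) pair:

```python
# Minimal sketch of a 6.01-style state machine interface.
# Assumption: this is a stripped-down stand-in; the real sm.SM class
# in the course software has more functionality.

class SM:
    startState = None

    def start(self):
        # Reset the machine to its initial state.
        self.state = self.startState

    def step(self, inp):
        # Advance one step: compute (nextState, output) from the
        # current state and the input, store the state, return output.
        (s, o) = self.getNextValues(self.state, inp)
        self.state = s
        return o

    def transduce(self, inputs):
        # Run the machine over a whole input sequence.
        self.start()
        return [self.step(inp) for inp in inputs]

# Tiny example machine: outputs how many inputs it has seen so far.
class Counter(SM):
    startState = 0
    def getNextValues(self, state, inp):
        return (state + 1, state + 1)
```

A brain's MySMClass follows this same pattern, except that the input is sensor data and the output is an action for the robot.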

Resources:
- ~/Desktop/6.01/designLab02/smBrain.py: a simple robot brain, which uses the 6.01 state machine class sm
- tutorial.py: a virtual world in soar

Detailed guidance: Do the following with a 6.01 lab laptop.

1. Run a brain in the simulator.
   a. In the Terminal window, type soar &.
   b. Click soar's Simulator button and double-click tutorial.py. This loads a specific virtual world into our robot simulator.
   c. Click soar's Brain button, navigate to Desktop/6.01/designLab02/smBrain.py, and click Open. This loads a specific state machine definition into the robot simulator. That state machine describes the actions that the robot will take in response to sensed information about the virtual world surrounding it.
   d. Click soar's Start button, and let the robot run for a little while.
   e. Click soar's Stop button.
   f. Notice the graph that was produced; it shows a "slime trail" of the path that the robot followed while the brain was running. You can just close the window. (If you don't want the brain to produce a slime trail, set the drawSlimeTrail argument to the RobotGraphics constructor in the smBrain.py file to False.)

2. Modify the brain and run it.
   a. In the Terminal window, type idle & to open an Idle environment.
   b. Click Idle's File menu, select Open..., navigate to Desktop/6.01/designLab02/smBrain.py, and click Open.
   c. The state machine that controls the robot's actions is defined by the MySMClass definition. Think of this state machine as taking sensory data as input, and returning as output instructions to the robot on how to behave. The io.Action object returned as the output by the getNextValues method of MySMClass tells the robot how to change its behavior, and has two attributes that are important to us:
      - fvel: the forward velocity of the robot (in meters per second)
      - rvel: the rotational velocity of the robot (in radians per second), where positive rotation is counterclockwise
   d. Find the place where the velocities are set in the brain, and modify it so that it makes the simulated robot rotate in place.
   e. Save the file.
   f. Go back to the soar window and click the Reload Brain button.
   g. Run the brain by clicking the Start and then the Stop buttons.

3. Run it on the robot.
   a. Connect the robot to your laptop, making sure the cable is tied around the handle in the back of the robot.
   b. Power on the robot with the switch on the side panel.

   c. Click soar's Pioneer button to select the robot. You should be able to hear the sonar sensors making a ticking noise.
   d. One partner should be in charge of keeping the robot safe. Keep the cable from getting tangled in the robot's wheels. If the robot starts to get away from you, pick it up, then turn it off using the switch on the robot.
   e. Click soar's Start button.

3 Sonars

Objective: Investigate the behavior of the sonar sensors, and modify a robot brain to make a robot keep a certain distance from an obstacle.

Don't spend more than 10 or 15 minutes experimenting with the sonars. When you're done, ask a staff member for a checkoff.

The inp argument to the getNextValues method of MySMClass is an instance of the soar.io.SensorInput class, which we have imported as io.SensorInput. It has two attributes, odometry and sonars. For this lab, we will just use the sonars attribute, which contains a list of 8 numbers representing readings from the robot's 8 sonar sensors; each reading is a distance in meters. The first reading in the list (index 0) is from the leftmost sensor (from the robot's perspective); the reading from the rightmost sensor is the last one (index 7).

Detailed guidance:
- Modify the brain so that it sets both velocities to 0, and uncomment the line print inp.sonars[3]. Reload the brain and run it. It will print the value of inp.sonars[3], which is the reading from one of the forward-facing sonar sensors.
- From how far away can you get reliable distance readings? What happens when the closest thing is farther away than that? What happens with things very close to the sensor?
- Does changing the angle between the sonar transducer and the surface that it is pointed toward affect the readings? Does this behavior depend on the material of the surface? Try bubble wrap versus smooth foam core.
- Now, set the sonarMonitor argument to the RobotGraphics constructor to True. Reload the brain and run it.
This will bring up a window that shows all the sonar readings graphically. The length of each beam corresponds to the reading; red beams correspond to no valid measurement. Test that all your sonars are working by blocking each one in turn. If you notice a problem with any of the sensors, talk to the staff.

Checkoff 1. Wk.2.2.1: Explain to a staff member the results of your experiments with the sonars. Demonstrate that you know your partner's name and email address.
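For the distance-keeping task below, one natural approach is proportional control on the front sonar reading: drive forward when too far from the obstacle, backward when too close, with speed proportional to the error. A standalone sketch, where Action and SensorInput are stand-ins for soar's io classes, and the gain is an illustrative assumption rather than the lab's required solution:

```python
# Sketch of a distance-keeping getNextValues using proportional control.
# Assumptions: Action and SensorInput stand in for soar's io.Action and
# io.SensorInput; GAIN is an illustrative value.
from collections import namedtuple

Action = namedtuple("Action", ["fvel", "rvel"])
SensorInput = namedtuple("SensorInput", ["sonars", "odometry"])

DESIRED = 0.5    # target distance to the obstacle, in meters
GAIN = 1.0       # illustrative proportional gain
MAX_FVEL = 0.3   # the lab's speed limit, in m/s

def clip(v, lo, hi):
    return max(lo, min(hi, v))

def getNextValues(state, inp):
    # Positive error: too far away, drive forward.
    # Negative error: too close, back up.
    error = inp.sonars[3] - DESIRED
    fvel = clip(GAIN * error, -MAX_FVEL, MAX_FVEL)
    return (state, Action(fvel=fvel, rvel=0.0))
```

With these numbers, a front reading of 1.5 m produces forward motion at the 0.3 m/s cap, and a reading of exactly 0.5 m produces zero velocity, so the robot settles near the target distance.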

Make the robot move forward to approximately 0.5 meters from an obstacle in front of it, and keep it at that distance, even if the obstacle moves back and forth. Do this by editing the getNextValues method of MySMClass; there is no need to change any other part of the brain. Don't set the forward velocity higher than 0.3 (or lower than -0.3).

Debug it in simulation, by clicking soar's Simulator button and choosing tutorial.py. Once it seems good, run it on a real robot, by choosing soar's Pioneer button.

Checkoff 2. Wk.2.2.2: Demonstrate your distance-keeping brain on a real robot to a staff member.

4 Following Boundaries

Objective: Our goal now is to build a state machine that controls the robot to do a more complicated task:
1. When there is nothing nearby, it should move straight forward.
2. As soon as it reaches an obstacle in front, it should follow the boundary of the obstacle, keeping the right side of the robot between 0.3 and 0.5 meters from the obstacle.

Draw a state-transition diagram that describes each distinct situation (state) during wall-following, and what the desired output (action) and next state should be in response to the possible inputs (sonar readings) in that state. Start by considering the case of the robot moving straight ahead through empty space, and then think about the input conditions that you encounter and the new states that result. Think carefully about what to do at both inside and outside corners. Remember that the robots rotate about their center points. Try to keep the number of states to a minimum.
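One common way to implement a brain from such a diagram is to name the states and dispatch on them inside getNextValues. The sketch below is not the lab's required design: the two state names, the thresholds, and the velocities are illustrative assumptions, and Action and SensorInput again stand in for soar's io classes.

```python
# Skeleton of a state-dispatching boundary follower.
# Assumptions: state names, thresholds, and velocities are illustrative;
# Action and SensorInput stand in for soar's io.Action and io.SensorInput.
from collections import namedtuple

Action = namedtuple("Action", ["fvel", "rvel"])
SensorInput = namedtuple("SensorInput", ["sonars", "odometry"])

FRONT_LIMIT = 0.5  # start turning when the front reading drops below this

def getNextValues(state, inp):
    front = inp.sonars[3]  # a forward-facing sonar
    right = inp.sonars[7]  # the rightmost sonar
    if state == "forward":
        if front < FRONT_LIMIT:
            # Obstacle ahead: turn left in place, then follow its boundary.
            return ("follow", Action(fvel=0.0, rvel=0.3))
        return ("forward", Action(fvel=0.3, rvel=0.0))
    else:  # state == "follow"
        # Steer to keep the right side 0.3 to 0.5 m from the wall.
        if right < 0.3:
            return ("follow", Action(fvel=0.1, rvel=0.3))   # veer away
        elif right > 0.5:
            return ("follow", Action(fvel=0.1, rvel=-0.3))  # veer toward
        return ("follow", Action(fvel=0.2, rvel=0.0))
```

A real follower also has to handle inside corners (the front sonar sees a wall while following), which adds conditions or states beyond this two-state sketch; that is exactly what the state-transition diagram above should work out.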

Checkoff 3. Wk.2.2.3: Show your state-transition diagram to a staff member. Make clear what the conditions on state transitions are, and what actions are associated with each state.

Copy your current smBrain.py file to boundaryBrain.py (you can do this with Save As in Idle), and modify it to implement the state machine defined by your diagram. Make sure that you define a startState attribute and a getNextValues method.

Try hard to keep your solution simple and general. Use good software practice: do not repeat code; use helper procedures with mnemonic names; try to use few arbitrary constants, and give the ones you do use descriptive names. To debug, add print statements that show the relevant inputs, the current state, the next state, and the output action.

Record a slime trail of the simulated robot following a sequence of walls; make sure that it can handle outside and inside corners. Going around very sharp corners or hairpin turns, such as the L in tutorial.py, is not required, but is extra cool.

Checkoff 4. Wk.2.2.4: Demonstrate your boundary follower to a staff member. Explain why it behaves the way it does. Mail your code to both partners.
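The debugging advice above (print the inputs, current state, next state, and action) can be centralized in one place instead of scattered through every branch. A sketch of such a helper, assuming only the (state, inp) -> (nextState, action) signature used in this lab; the helper itself is illustrative, not part of the course software:

```python
# Sketch of a tracing wrapper for any getNextValues-style function.
# Assumption: illustrative helper, not part of the 6.01 libraries.
def traced(getNextValues, name="brain"):
    def wrapped(state, inp):
        nextState, action = getNextValues(state, inp)
        # Print sonars if the input has them, else the raw input.
        print("%s: inp=%s state=%r -> next=%r action=%s"
              % (name, getattr(inp, "sonars", inp), state, nextState, action))
        return (nextState, action)
    return wrapped
```

Wrapping the function once keeps the state-machine logic free of print statements, and the trace can be removed by deleting a single line.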

MIT OpenCourseWare
http://ocw.mit.edu

6.01SC Introduction to Electrical Engineering and Computer Science
Spring 2011

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.