CS 1112 Fall 2010 Project 3 Part 2
Due Thursday 10/14 at 11pm (Part 1 appears in a separate document. Both parts have the same submission deadline.)

You must work either on your own or with one partner. You may discuss background issues and general solution strategies with others, but the project you submit must be the work of just you (and your partner). If you work with a partner, you and your partner must first register as a group in CMS and then submit your work as a group.

Objectives

Completing this project will solidify your understanding of user-defined functions and vectors. You will also do more graphics. In Part 2 you will develop code for the iRobot Create (a robot) using a simulator. This is an opportunity to see and appreciate the approximations and errors associated with real robotic control programs!

Matlab Files and Simulator Toolbox

Download the file p3part2.zip from the Projects page on the course website. The seven files contained in p3part2.zip must be in the Current Directory in Matlab.

This project uses the iRobot Create Simulator Toolbox, a set of Matlab files developed in a joint project between CS 1112 and MAE 4180 (Autonomous Mobile Robots). The simulator toolbox is based on the iRobot Create Matlab Toolbox for controlling the actual robot, developed at the US Naval Academy. The code that runs on the simulator can run directly on the Create robot.

The simulator toolbox runs on all 2008 and later versions of Matlab on Windows machines; on Macs it works only with the 2010 full version (not the student version) of Matlab. Use the PCs in the campus computer labs if you have trouble with the simulator on your own computer. We are working with CIT and ACCEL to install the simulator in as many labs as possible. On the Projects page of the course website there is a Simulation Toolbox Bulletin that gives the installation instructions and lists the labs where the toolbox is already installed.

Still, at any lab PC perform this quick check to see whether the simulator is correctly installed before you start working: look in Matlab's search path for a directory that ends with ...\iRobotCreateSimulatorToolbox. The capitalization must be exact! You can see the search path by typing path in the Command Window; the simulator toolbox directory would be near the top or bottom of the displayed list, if it is there at all. If the exact directory name iRobotCreateSimulatorToolbox is present, then the simulator is correctly installed and you can start working at that computer. If the directory is not present, or if only a lowercase version of the name is present, then you need to install the simulator or correct the path before you can use the simulator. See the Simulation Toolbox Bulletin on the course website for instructions.
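If you prefer to automate this check, here is a minimal sketch (not one of the provided project files) that searches the path string for the toolbox directory name, assuming the name iRobotCreateSimulatorToolbox described above:

% Quick, case-sensitive check for the simulator toolbox on the Matlab path.
% Illustrative sketch only; the authoritative instructions are in the
% Simulation Toolbox Bulletin on the course website.
p = path;                                            % the entire search path as one string
if isempty(strfind(p, 'iRobotCreateSimulatorToolbox'))
    disp('Simulator toolbox not found on the path -- install it or fix the path.')
else
    disp('Simulator toolbox found on the path.')
end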
The Robot and an Example Control Program

The robot is round with a diameter of roughly 34 cm. It has two motored wheels that we can control; a third caster wheel near the front of the robot provides balance. The robot has many sensors, but in this project we will use only three kinds: the bump sensors, which detect collisions; the cliff sensors, for reading markings on the floor; and the virtual wall sensor, for detecting an infrared signal. In the final problem we will also use an overhead localization system to determine the robot's global position.

An example control program, driveForwardUntilWall, demonstrates two functions available in the simulator for robot movement and sensing. Here is how to run any control program in the simulator, assuming that the simulator code is accessible:

1. Set the Current Directory to the folder that holds the control program file and the map file.

2. Type SimulatorGUI in the Command Window. The Simulator starts with a blue circle representing the robot in the center. The blue line indicates the heading of the robot. By default the robot is at (0,0) oriented towards the east.

3. Click the Load Map button. Select the map file in the dialog box that opens up. For this example the map is squareenclosure.txt. Four walls (lines) forming a square room appear.

4. Optional: Change the position of the robot by clicking the Set Position button. Then make one click in the plot area to position the center of the robot and a second click to indicate the robot's orientation. For this example keep the robot inside the room, facing but not touching a wall.

5. Optional: Under Sensors, select the Bump checkbox to indicate that you wish to visualize the bump sensor (magenta when it is activated).

6. Click the Start button under Autonomous to select a control program to run. Select driveForwardUntilWall.m. The robot moves forward until it hits a wall. You will hear four beeps. Observe that the Manual Control keys are dimmed (inactive) when autonomous code is running and light up once the control program ends. You can stop autonomous code execution by clicking the Stop key under Autonomous.

Now read driveForwardUntilWall.m and note the following:

- The function parameter serPort is for communication with a real robot; in the simulator it is not meaningful. Nevertheless we must keep that parameter and use it when calling functions that run on the robot. The code you develop in the simulator is the same code for controlling the real robot!

- The BumpsWheelDropsSensorsRoomba function reads specific sensors on the robot and detects, among other things, bumps on the front half circle of the robot. We make use of only three of the six returned values: BumpRight, BumpLeft, and BumpFront. The robot actually has only two physical bumpers: if the right bumper is activated, BumpRight is 1; if the left bumper is activated, BumpLeft is 1; if both the left and right bumpers are activated, it is treated as a front hit, so BumpFront is 1 and BumpRight and BumpLeft are both 0. The only argument needed by this function is serPort.

- The SetDriveWheelsCreate function allows us to set the velocities of the right and left wheels. Use three arguments: serPort, the right wheel velocity, and the left wheel velocity. The unit is meters/second and the range is -0.5 to 0.5. A negative velocity is backward.

- The pause halts the execution of the code, not the robot. After setting the wheel velocities we pause code execution to let the robot move. If you remove the pause, the code calls functions on the robot too fast and the program will get stuck.

- StopCreate is a subfunction in driveForwardUntilWall. Look at the given code: it simply sets the wheel velocities to zero.

- Signal is a subfunction in driveForwardUntilWall. We include it so that you hear an audible signal when the control program finishes execution.

Now modify driveForwardUntilWall.m: add pause(2) after the loop ends and before calling StopCreate. When you re-run the program (set the position of the robot in the simulator, click Autonomous Start, and so on), you will notice that after the robot hits a wall, it slips along the wall before it stops and you hear the four beeps. This is because after hitting the wall, the wheels keep spinning for two seconds, pushing the robot against the wall.
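To make the structure of such a control program concrete, here is a minimal sketch of a driveForwardUntilWall-style loop. It uses only the calls described above; the assumed output order of BumpsWheelDropsSensorsRoomba, the chosen speed, and the function name are illustrative, so check them against the provided driveForwardUntilWall.m rather than copying this sketch verbatim.

function driveForwardSketch(serPort)
% Sketch: drive forward until any bumper is activated, then stop.
% The output order of the sensor function assumed below should be
% confirmed against the provided driveForwardUntilWall.m.
bumped = false;
while ~bumped
    SetDriveWheelsCreate(serPort, 0.3, 0.3)   % right and left wheel velocities (m/s)
    pause(0.1)                                % let the robot move; also avoids calling the robot too fast
    [BumpRight, BumpLeft, DropRight, DropLeft, DropCaster, BumpFront] = ...
        BumpsWheelDropsSensorsRoomba(serPort);
    bumped = BumpRight || BumpLeft || BumpFront;   % any bumper activated?
end
SetDriveWheelsCreate(serPort, 0, 0)   % stop the wheels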
The simulator models wheel slips, and you will likely see them, as well as other approximations and errors, when you solve the next two problems. Now you're ready to write your own control program!

2 The Randomly Wandering Robot

Complete the function randomWalkRobot to have our robot wander randomly inside a room until it hits a wall. Assume the room is completely walled and the robot starts inside the room not touching any wall. In each step of this random walk, the robot rotates a random angle from its current heading and then drives forward for one second at 0.5 m/s.

The random angle is generated at each step as specified here. A rotation of -π/2 to π/2 is called an advance angle, since the robot would then move forward relative to its previous position. A rotation of π/2 to 3π/2 is called a retreat angle. Generate a random angle such that an advance angle is w times as likely as a retreat angle. Note: first choose whether it is an advance or a retreat angle; then generate a uniformly random number within the advance or retreat range. Implement a function (not a subfunction) to get this random angle:

function d = getRandAngle(w)
% d is a random angle in degrees. d is w times as likely to be an advance
% angle as a retreat angle. w is a positive integer.
% d is an advance angle if -90<d<90; d is a retreat angle if 90<d<270.
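The Note above combines two standard uses of rand: a weighted two-way choice (the probability of an advance angle is w/(w+1)) and a uniform draw within an interval. A minimal sketch under those assumptions is shown below, with an illustrative name so it is not confused with the function you must write and submit; treat it as a starting point, not the required implementation.

function d = getRandAngleSketch(w)
% Sketch of the weighted random angle described above (illustrative name).
% An advance angle is chosen with probability w/(w+1), i.e., w times as
% likely as a retreat angle.
if rand() < w/(w+1)
    d = -90 + 180*rand();    % advance angle: uniform in (-90, 90)
else
    d = 90 + 180*rand();     % retreat angle: uniform in (90, 270)
end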

Take a look at the incomplete function randomWalkRobot. The weight w for advance versus retreat is set to three, but make sure that your code works for other positive integer values of w. In randomWalkRobot you must call your function getRandAngle. In addition to the functions BumpsWheelDropsSensorsRoomba and SetDriveWheelsCreate demonstrated in the example above, you will need the function turnAngle. For example, the function call turnAngle(serPort, 0.2, 80) turns the robot 80 degrees at 0.2 rad/s. The allowed angle range is -360 to 360 degrees, and positive is counterclockwise. The allowed speed range is 0 to 0.2 rad/s. Each call to turnAngle produces a line of output under normal execution; don't worry about it.

Use the map file squareenclosure.txt with this problem. (However, the solution does not depend on the shape of the enclosure.)

Submit your functions randomWalkRobot and getRandAngle on CMS.

3 The Reconnaissance Robot

Our intrepid robot now goes on a mission to discover the markings on the floor of a secret lab. The rectangular lab has three solid walls, on the west, south, and east sides. The north side has a virtual wall, i.e., an infrared beam marks the northern extent of the lab. It is known that the lab floor is white and marked with dark lines, but near the walls and the virtual wall, within about a robot's width, there is no floor marking. In addition to the bump sensors, the robot will employ its cliff sensors, which measure the reflectivity of the floor, and its virtual wall sensor. In order to map out the floor markings, the robot will also use a GPS-like system to obtain its location. Complete the control program exploreRoom to enable the robot to perform this recon mission.

3.1 Traversing the Floor

As usual, start by decomposing the problem! The first task is to travel over the entire floor area. Things to know/consider:

- The robot always begins in the southwest corner facing east.
- Use the map file labsmall.txt for program development; later, test the program also on lab.txt.
- The only robot sensors to use for traveling are the bump sensors and the virtual wall sensor. The statement vws = VirtualWallSensorCreate(serPort) will assign to the variable vws the value 1 if a virtual wall is sensed and 0 otherwise. It is OK for the robot to be in the infrared range: it won't get fried, and hopefully it is close to completing its mission by then.
- The only robot motion functions to use are those used in the previous problem: SetDriveWheelsCreate and turnAngle.
- What is your plan for moving over the entire floor? Can you decompose that plan into specific parts that are repeated in a systematic way?

You probably want the robot to move at the maximum allowed speed, but to minimize hard bumps and wheel slips you might want to read the sensors frequently. Remember to insert a pause to slow down repeated function calls to the robot; otherwise your program will hang. Test your code to make sure that the robot covers the floor.

You are now working with a robot (although in a simulator), so keep in mind that there are errors and approximations everywhere. The robot wheels will slip sometimes; the right-angle turns are not always so right-angled. Some redundancy (overlap) is good, so don't spend a lot of time worrying about things like exactly how far (what fraction of a second) a move needs to be. We can't get perfection, but we can get a program that works.

3.2 Mapping the Markings

The robot will produce a map of the markings at the end of its travel. As it travels, it needs to check the reflectivity of the floor under its cliff sensors. Dark colors have low reflectivity: it is known that most white floors have a reflectivity above 20, while dark paint has a reflectivity around 3. Here's the general plan: when the robot detects a point that it decides is dark, it calls its localization system to get its current coordinates and records them. At the end of its travels it makes a plot of the recorded coordinates. But it's not quite that simple...

The localization system gives the coordinates of the center of the robot, (xc, yc), and the robot's orientation, θ. The four cliff sensors are symmetric about the robot's orientation line: the front-left and front-right sensors are at an angle φ = 0.07π from the orientation line; the left and right sensors are at an angle α = π/3 from the orientation line. So if, for example, the front-left cliff sensor detects a dark spot, the coordinates of the dark spot are

    ( xc + r cos(θ + φ),  yc + r sin(θ + φ) )

where r is the radius of the robot, 0.17 m.

The statement rf = CliffFrontLeftSignalStrengthRoomba(serPort) assigns to rf the reflectivity of the floor under the front-left cliff sensor; it is a value in the range [0,100]. The functions for the other three cliff sensors have these names and can be called in the same way:

    CliffFrontRightSignalStrengthRoomba
    CliffLeftSignalStrengthRoomba
    CliffRightSignalStrengthRoomba

[Revised 10/14] To get the current location and heading of the robot, call the function OverheadLocalizationCreate:

    [xc, yc, theta] = OverheadLocalizationCreate(serPort)

The returned values xc and yc are the x- and y-coordinates of the center of the robot; theta is the heading of the robot in radians from the positive x-axis. [End revision 10/14]

You will implement the subfunction ReadFloor to call the four cliff sensor functions and return the coordinates of any dark spots found:

function [x, y] = ReadFloor(serPort)
% Return the coordinates of any dark spots detected by the cliff sensors.
% x and y are vectors that store the coordinates of dark spots. If there
% are no dark spots, x=[] and y=[]. The length of x (and y) is the number
% of dark spots detected.
% serPort is the serial port number (for controlling the actual robot).

ReadFloor is called by the main function exploreRoom every time the robot needs to take a reading of the floor on which it stands. If there are no dark spots, the returned vectors are empty, i.e., they are vectors of length 0. If there is one dark spot, then the returned vectors have length 1. Since there are only four sensors, the maximum length of the returned vectors is 4.
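To make the geometry above concrete, here is a minimal sketch that converts one front-left cliff reading into dark-spot coordinates. The darkness threshold (10) and the function name are illustrative assumptions, not part of the project specification; choose your own threshold based on the reflectivity figures given above.

function [x, y] = frontLeftDarkSpotSketch(serPort)
% Sketch: return the coordinates of a dark spot under the front-left cliff
% sensor, or empty vectors if the floor there is not dark.
r   = 0.17;       % robot radius in meters
phi = 0.07*pi;    % angle of the front-left sensor from the orientation line
x = [];  y = [];
rf = CliffFrontLeftSignalStrengthRoomba(serPort);    % reflectivity in [0,100]
if rf < 10                                           % assumed darkness threshold
    [xc, yc, theta] = OverheadLocalizationCreate(serPort);
    x = xc + r*cos(theta + phi);                     % formula from the handout
    y = yc + r*sin(theta + phi);
end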

After the robot stops moving, the program should issue an audible signal: call the subfunction Signal. Then the program draws a plot of the coordinates of the dark spots:

figure(1)                 % start a figure window and number it window 1
plot(xDark, yDark, '*')   % xDark, yDark are vectors storing the x, y coords of the dark spots
axis equal
axis([0 1.5 0 1.5])
grid on                   % show grid lines on the plot

Below are figures of the actual marking in the map labsmall.txt and an example of a robot-generated map of the marking. As one would expect, the result is not a perfect match! (But it is reasonably close.)

Submit your file exploreRoom.m on CMS.