
Date: 03/25/10
Name: Sean Frucht
TAs: Mike Pridgen, Thomas Vermeer
Instructors: Dr. A. Antonio Arroyo, Dr. Eric M. Schwartz

University of Florida
Department of Electrical and Computer Engineering
EEL 5666 Intelligent Machines Design Laboratory
Formal Report

Table of Contents

Opening
    Abstract
    Executive Summary
    Introduction
Main Body
    Integrated System
    Mobile Platform
    Actuation
    Sensors
        Tilt
        Sonar
        Bump
        Infrared Distance
        CMUcam3
    Behaviors
        Following
        Walking
        Getting Up
        Obstacle Avoidance
    Experimental Layout and Results
Closing
    Conclusion
    Documentation
    Appendices

OPENING

Abstract

This paper covers the proposal, development, and testing of a biped robot, "Ender," and a quadruped, "Kitten." The project goals are to create a biped robot that is able to stand, walk, avoid obstacles, get up, and follow Kitten. To accomplish this, Ender will make use of five sensors and two data systems. Ender will use a Pridgen Vermeer Robotics Xmega128 microcontroller linked to the aforementioned sensors, along with twelve HSR-5990TG robotic servos. Ender's frame will be composed of Lynxmotion Servo Erector Set brackets and custom machined parts.

Executive Summary

Ender, a two-legged robot, was designed to track the smaller four-legged Kitten. The majority of the class was spent developing Ender, which on media day consisted of 4 sensors and a CMUcam3 and uses 10 servos for movement. Kitten was developed and built in the last two weeks of class and served as a moving red box for Ender to track; it contains only one IR sensor for simple obstacle avoidance.

Ender uses a CMUcam3 to track Kitten. The CMUcam is mounted on two servos to allow for pan and tilt. By rewriting part of the CMUcam's firmware it was possible to have the camera control its own pan and tilt separately from Ender and return only the positions of the servos. This gave Ender the ability to walk in one direction while tracking in another. Many experiments were done to determine the most effective color to use with the CMUcam, and it turned out that an LED of any standard color was effective because it emits a pure representation of the color that changes only minutely with lighting conditions. Red LEDs were chosen as the color source for this project.

Sharp GP2D12 IR sensors were used for obstacle avoidance on both Ender and Kitten. These sensors were very easy to use and gave reliable data in most indoor areas. The largest issue in using them came from the sensors' behavior at 3.3 V and the setup of the PVR board. While the sensors are stated to work from a 3.3 V source, when supplied with 3.3 V they saturate at long distances and give a false "close" reading. To remedy this, the servo wires had to be split and power taken from the servo power bus. With the supply at 5 V the sensors output an analog voltage from 0-5 V, which was being read against a reference voltage of roughly 2 V on the board. To fix the low reference voltage, a jumper was used to connect a 5 V line from the servo power bus to Port A-0 on the board, and the Vref register was set to 0x20, which sets the reference voltage to whatever is present on Port A-0. Finally, a small delay of 1 ms was added to the conversion routine so that each conversion fully completes when sensors are read in rapid succession (a short sketch of this configuration follows at the end of this summary).

Walking for Ender is controlled by a series of preprogrammed parametric gaits that allow walking forward and turning based on parameters passed into them. These gaits were easy to develop but required a large amount of time because of their experiment-driven design. Kitten's gaits, on the other hand, were based on inverse kinematics, which allowed dynamic changes to gait characteristics such as speed and range of movement.

Ender's behaviors are all based on following Kitten. Walking forward was designed around a preprogrammed gait that only allowed the speed between step interpolations to change and did not allow the gait itself to change. This gait had to be continuously tweaked by hand to improve its stability and is still not as efficient as it could be. Turning was designed to be parametric and consists of four parts. In part one the robot lifts its right or left foot off the floor and balances on the other foot. Once balance has been established, the lifted foot rotates a given number of degrees and extends itself in front of the stationary foot. This throws the robot off balance and requires that part three, placing the rotated foot back on the ground, happen within a small timeframe for the robot not to fall over or exhibit large oscillations on landing. Once part three has been completed, part four quickly lifts the originally stationary leg and rotates the robot back into a standing position by pivoting on the rotated foot. Part four tries to use the momentum generated in part three so that the final pose has little to no oscillation.

Ender's obstacle avoidance is configured such that it takes priority over other functions any time a sensor locates something within a minimum safe turn distance, and takes a high, but not necessarily the highest, priority if it notices anything at all. Kitten's obstacle avoidance, on the other hand, is its only behavior; Kitten changes the length and direction of its steps as it approaches an obstacle based on sensor readings.
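As a concrete illustration of the ADC change described above (external 5 V reference jumpered onto Port A-0 via a REFCTRL value of 0x20, plus a short settling delay between conversions), here is a minimal sketch in avr-gcc style. It assumes the PVR board's ATxmega128A1 and a 32 MHz clock; the channel, prescaler, and helper names are illustrative assumptions, not the report's actual code.

#include <avr/io.h>
#define F_CPU 32000000UL        /* assumed system clock, needed by _delay_ms() */
#include <util/delay.h>

/* Hedged sketch of the IR-sensor ADC setup described in the summary above. */
static void adc_init(void)
{
    ADCA.REFCTRL   = 0x20;                           /* REFSEL = AREFA: reference taken from Port A-0 */
    ADCA.CTRLB     = 0x00;                           /* unsigned, 12-bit, right-adjusted result */
    ADCA.PRESCALER = ADC_PRESCALER_DIV64_gc;         /* illustrative prescaler choice */
    ADCA.CH0.CTRL  = ADC_CH_INPUTMODE_SINGLEENDED_gc;
    ADCA.CTRLA    |= ADC_ENABLE_bm;                  /* enable ADC A */
}

static uint16_t adc_read(uint8_t pin)
{
    ADCA.CH0.MUXCTRL  = (uint8_t)(pin << 3);         /* MUXPOS field starts at bit 3 */
    ADCA.CH0.INTFLAGS = ADC_CH_CHIF_bm;              /* clear any stale complete flag */
    ADCA.CH0.CTRL    |= ADC_CH_START_bm;             /* start one conversion */
    while (!(ADCA.CH0.INTFLAGS & ADC_CH_CHIF_bm))
        ;                                            /* wait for conversion complete */
    _delay_ms(1);                                    /* 1 ms settling delay for back-to-back reads */
    return ADCA.CH0.RES;
}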

Introduction

Biped locomotion is a growing branch of robotics that began in the late 20th century and has boomed over the past six years. Legged locomotion has been used in industry for years to overcome the large environmental footprint created by wheeled locomotion in large vehicles and to better traverse uneven terrain. These industrial walkers, however, generally have more than four legs, which provides stabilization. Biped locomotion does not inherently include this stability; because of this, one of the largest hurdles in creating consumer and industrial biped robots is creating stable designs. I propose to design a 10-degree-of-freedom biped robot (4 per leg, 2 for camera pan and tilt) which can walk while avoiding obstacles and following a smaller 12-degree-of-freedom (3 per leg x 4 legs) quadruped.

MAIN BODY

Integrated System

The robot will use the Pridgen Vermeer Robotics Xmega128 microcontroller to control a variety of sensors and interface devices, as shown in Figures 1 and 2 below.

Figure 1 - Ender's Integrated System
Figure 2 - Kitten's Integrated System

Mobile Platform

The mobile platform for my robot will be constructed out of Lynxmotion Servo Erector Set (SES) brackets. These brackets are machined out of aluminum and have a standardized bolt pattern which allows for easy assembly and simple design. The design of the platform is integral to the success of my robot. If the joints are not designed so that the servo axes of each multi-DOF joint intersect at a common point, then the inverse kinematic calculations become much more complex. Also, the frame must be designed so that the forces on the servo outputs don't create unnecessary torques while standing still; otherwise the servos will be forced to continually draw power to hold the robot in a standing position.

Actuation

The robot will be controlled by 12 HSR-5990TG servos, which have been designed specifically for robotic applications. I chose them for my robot for three main reasons. First, the HSR-5990TG servos (HSR servos) use the HMI (Hitec Multi-protocol Interface) protocol, which is designed to provide position feedback and daisy chaining through serial connections.

Figure 3 - An HSR-5990 Servo

Furthermore, the HSR servos are standard-size servos but come with additional parts that add a second, unpowered output shaft directly below the main shaft, which allows for much better weight distribution and facilitates using the servo in joints. Finally, the servos have much higher speed (0.14 s/60°), precision, and torque (417 oz-in) than standard servos of the same weight and size. These high-torque servos are required because at certain times the robot's balance may depend on pivoting about a single servo, in which case stability is determined by whether that servo has the torque required to correct the imbalance. An example of this is standing on one leg: your body becomes an inverted pendulum pivoting on your ankle, and balance is maintained mainly by rotating the ankle, which supports your entire weight.

Sensors

I propose the use of the following five types of sensors and two methods to augment them:

Tilt - the tilt sensor will notify the robot when it has fallen over so that it can autonomously return to a standing position.
Bump - one to two bump switches in each foot will detect contact with the terrain and control which stage of the walking cycle the robot is currently in.
Infrared Distance - these will be used for obstacle avoidance and will be mounted on two servos at the robot's hip. Kitten will also use one mounted on a servo.
Sonar - this will also be used for obstacle avoidance.
CMUcam3 - this will be used to track Kitten around the arena.

Behaviors

Walking - walking will be done with a preprogrammed static gait.
Getting Up - the robot will stand up from rest using a preprogrammed static movement.
Obstacle Avoidance - while walking, the robot will use infrared sensors and sonar to detect obstructions; once an obstruction is detected, the robot will stop, turn in place until it finds a clear path, and then continue on its course (a sketch of this loop follows this list).
Following - Ender will follow Kitten around an area.
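The following is a rough sketch of that stop-and-turn avoidance loop in C. The helper functions (read_ir_cm, read_sonar_cm, gait_step_forward, gait_turn_in_place) and the distance threshold are hypothetical placeholders for the robot's actual sensor and gait routines, included only to show the control flow.

#include <stdint.h>
#include <stdbool.h>

#define CLEAR_PATH_CM 60                      /* assumed minimum safe distance before turning */

extern uint16_t read_ir_cm(void);             /* hypothetical: averaged GP2D12 range, cm */
extern uint16_t read_sonar_cm(void);          /* hypothetical: sonar range, cm */
extern void gait_step_forward(uint8_t speed); /* hypothetical: one interpolated forward step */
extern void gait_turn_in_place(int8_t dir);   /* hypothetical: +1 turns right, -1 turns left */

static bool path_is_clear(void)
{
    /* the path is clear only if both rangefinders see nothing close */
    return read_ir_cm() > CLEAR_PATH_CM && read_sonar_cm() > CLEAR_PATH_CM;
}

void walk_with_avoidance(void)
{
    for (;;) {
        if (path_is_clear()) {
            gait_step_forward(1);
        } else {
            /* stop forward motion and rotate in place until a clear path appears */
            while (!path_is_clear())
                gait_turn_in_place(+1);
        }
    }
}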

Experimental Layout and Results

Designs for Ender

Design 1:

Figure 4 - 3D Rendering of Design 1

Design 1 had three main problems. First, the tubing that connected each joint would begin to vibrate loose after activity, which caused large instabilities. Second, the small brackets that rotate around the axis of each servo limit that servo's range of movement. Finally, to accommodate the camera, two servos were required for pan and tilt operations, and these servos were taken from the knees. To remedy these problems, design 2 uses longer brackets which take the place of the tubing and allow a greater range of movement than the original brackets.

Design 2:

Figure 5 - Design 1 Built
Figure 6 - Full View of Ender
Figure 7 - Ender's Wiring
Figure 8 - Close-up of Ender

Along with fixing the issues in design 1, design 2 has custom-made feet with built-in battery holders. Having the battery holders in the feet lowers the center of gravity and allows for greater

maneuverability due to the extra shifting weight that can be used to lift one foot off the ground. Aside from the camera, there was no change in the mounting style of the electronics between design 1 and design 2.

It was found that design 2 had grown to the point that the batteries could no longer support the current draw of the twelve servos for extended amounts of time; this was due to the height and sheer weight of the large aluminum frame. Although the battery life of design 2 was shortened drastically, it was also found (during its short 3-7 minute run times) that this taller design was much less stable and had trouble shifting its weight effectively.

Figure 9 - Full View of Ender
Figure 10 - Ender's Back and Wiring

Design 3 (Final Design):

Design 3 was created to fix the power issues introduced in design 2. Because time was running out in the class, it was not feasible to purchase larger batteries with higher current-discharge capabilities. This meant that the number of servos, and therefore the weight of the robot, had to be decreased for it to work effectively. Design 3 therefore incorporated only 8 HSR-5990TG servos for its legs and 2 standard hobby servos for the camera's pan and tilt functions. This shorter design made walking about 4 times easier than with the higher-DOF frames and allowed me to get Ender walking fairly quickly. Another change that came from the insufficient battery problem was that design 3 uses a 9 V battery to power the electronics on the PVR board and the CMUcam3, while the servos remain powered separately by the 6 V RC battery pack. Because two different batteries were used, their grounds had to be tied together.

CMUcam3

The original design of Ender used a CMUcam2 to track Kitten; however, after hours of trying to get the CMUcam2's servo ports to work, I was informed by Seattle Robotics that they had changed the

processor and in doing so lost the 4 pins required to generate the PWM for the servos (although the processor could no longer drive the servos, the ports were left in place and still had power routed to them). Luckily, I was borrowing Seon Kim's CMUcam3 while I waited over a month for my CMUcam2, and after hearing about the CMUcam2's servo port issues he agreed to let me continue using the camera for the rest of the semester.

Data/Experiments Pertaining to the CMUcam

The CMUcam's most important function is blob tracking; however, out of the box the camera is out of focus and must be focused before anything will work. Focusing is done by repeatedly turning the lens in small increments and dumping a frame to the computer. Once the camera is in focus, the CMUcam GUI can be used to grab colors from a dumped frame by selecting a tolerance and clicking on the desired color. Picking the proper color to track is the most important part of setting up the CMUcam. The following are some results I had while testing different colors:

Shiny/reflective colors or materials - These fared the worst, as they effectively change color depending on the lighting of the room.
Light colors such as baby blue or light green - These can be picked up; however, they were prone to false negatives because of their similarity to white light in the environment.
Blue/Green - Blue tracks fairly well in a room without windows; however, against a windowed backdrop the camera tends to saturate blue, causing a loss of tracking.
Red - Red was the easiest to track of the colors I tested; it seems to be picked up by the camera much more strongly than other colors in a variety of environments.

One problem that developed regardless of the color being tracked was lighting. As the amount of light in a room changes, the colors of objects change due to shadows and lack of light. By setting careful tolerances on the chosen color it is possible to get around some small changes in light; however, it is nearly impossible to account for all changes in light, even by calibrating the camera to the light in the room before tracking a color. To address this problem I conducted the following two tests:

Spotlights mounted to the camera to control lighting - This solution worked well: as long as the object remained in the cone of light created by the spotlights, it was almost certain to be tracked. The major problem with this setup was creating a spotlight with a projected cone of light large enough that servoing towards an object didn't move the object out of the cone at any point.
Tracking a light source - This solution worked the best of all tests done. An ultra-bright red LED produces a nearly perfectly colored red circle that barely changes with external lighting. The camera also reliably tracks an LED at much greater range than a solid matte object of the same color.
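To make the tolerance idea concrete, the sketch below thresholds an RGB frame against a target color with a per-channel tolerance and returns the centroid of the matching pixels. This is only an illustration of the blob-tracking principle; it is not the CMUcam3's actual firmware or API, and the frame layout, function name, and parameters are assumptions.

#include <stdint.h>
#include <stdlib.h>

/* Centroid of all pixels within +/- tol of the target color (tr, tg, tb).
 * The frame is assumed to be packed 8-bit RGB, w pixels wide and h tall. */
typedef struct { uint16_t x, y; uint32_t pixels; } blob_t;

static int within(uint8_t v, uint8_t target, uint8_t tol)
{
    return abs((int)v - (int)target) <= (int)tol;
}

blob_t track_color(const uint8_t *rgb, uint16_t w, uint16_t h,
                   uint8_t tr, uint8_t tg, uint8_t tb, uint8_t tol)
{
    uint32_t sx = 0, sy = 0, n = 0;

    for (uint16_t y = 0; y < h; y++)
        for (uint16_t x = 0; x < w; x++) {
            const uint8_t *p = &rgb[3u * ((uint32_t)y * w + x)];
            /* a pixel matches only if every channel is inside the tolerance window */
            if (within(p[0], tr, tol) && within(p[1], tg, tol) && within(p[2], tb, tol)) {
                sx += x;
                sy += y;
                n++;
            }
        }

    blob_t b = { 0, 0, n };
    if (n) {
        b.x = (uint16_t)(sx / n);
        b.y = (uint16_t)(sy / n);
    }
    return b;
}

A bright red LED occupies a tight cluster in this color space, which is why the tolerance window can stay small without losing the target as the ambient light changes.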

In conclusion, it was determined that a matte red cardboard box fitted with 3 super-bright red LEDs was the most effective object to track with the CMUcam. This box is mounted on Kitten's back to provide a simple "proof of concept" target for tracking.

Designs for Kitten

Design 1:

Kitten was created the weekend before media day and consists of a PVR microcontroller, a single IR sensor, 12 HXT900 sub-micro servos, and 6 AAA batteries. The inspiration for Kitten's design came from last semester's CAT (created by Seon Kim), which was a much larger and more capable quadruped. The largest design issue with Kitten is its servos, which are cheap $3 servos from China and are prone to failure.
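Kitten's gaits were generated with inverse kinematics, as noted in the executive summary. The sketch below shows a minimal two-link planar leg solver using the law of cosines; the function name, link-length parameters, and sign conventions are hypothetical and are meant only to illustrate the approach, not to reproduce Kitten's code.

#include <math.h>

/* Hypothetical planar 2-link leg IK (hip pitch + knee): given a foot target
 * (x, y) in the hip frame and link lengths L1, L2, solve for joint angles. */
typedef struct { double hip, knee; } leg_angles_t;   /* radians */

int leg_ik(double x, double y, double L1, double L2, leg_angles_t *out)
{
    double d2 = x * x + y * y;
    double c_knee = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2);
    if (c_knee < -1.0 || c_knee > 1.0)
        return -1;                          /* target out of reach */

    double knee = acos(c_knee);             /* knee-bent solution from the law of cosines */
    double hip  = atan2(y, x) - atan2(L2 * sin(knee), L1 + L2 * cos(knee));

    out->hip  = hip;
    out->knee = knee;
    return 0;
}

Sweeping the foot target (x, y) along a step trajectory and converting the resulting angles to servo positions is what makes step length, height, and speed adjustable on the fly, the dynamic-gait property mentioned earlier.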

CLOSING

Having never done anything with electronics or robotics prior to this class, I was surprised by the sheer breadth of possibilities and complexities that surround even the "simple" robots that come out of a class like this. Before this point I liked to work on mechanical projects such as rebuilding my car's engine or turning scrap commercial air conditioners into super-powered floor fans. These projects were always very linear: the car engine was always a car engine and always did the same thing regardless of how it was constructed. Robots, on the other hand, do not share this linearity; they take raw information and do whatever they need or want with it. Because of the huge range of possibilities and my own love for new things, the hardest part of the class became picking one thing and sticking to it. Every time one of my classmates created something interesting for their robot, I wanted to stop everything and somehow incorporate it into mine, which was something I could not do if I ever wanted to finish. Due to the nonlinear style of the class, I feel I have never learned more practical and applicable knowledge about a single subject in one semester than I did in this class.

Conclusion

By media day both Ender and Kitten were able to walk and avoid obstacles. Ender was able to walk and turn reliably on two legs and follow Kitten around a large arena. However, due to Ender's slow walking gait, he was not able to walk fast enough to keep up with Kitten, even when Kitten walked at half speed. In the future I plan to use inverse kinematics on Ender and to develop a better form of actuation that allows dynamic changes in stiffness and free rotation, so that Ender's walk can become more efficient by incorporating aspects of passive dynamic walking and damping, similar to what a human's muscles do through a standard walk cycle. I also plan to create an upper body and redistribute Ender's weight so that it is proportioned like a human's, allowing me to apply as many human gait characteristics to the robot as possible. By attaching an upper body and two robotic arms I hope to give Ender a wider range of activities, such as sorting objects and writing, along with using them to better shift the position of the robot's center of mass during motion.

Documentation

Documents

Hitec HSR-5990TG Servos: http://www.lynxmotion.com/images/data/hsr5990tg.pdf
Sharp GP2D12 IR sensor: http://www.acroname.com/robotics/info/articles/sharp/sharp.html
CMUcam3: http://www.cmucam.org/
Pridgen Vermeer Robotics Xmega128 Microcontroller: http://plaza.ufl.edu/rhaegar/xmega%20manual.pdf
Servo Erector Set Brackets: http://www.lynxmotion.com/

Appendices

Code

The code for both robots can be found at: http://plaza.ufl.edu/seanfrucht/index.shtml