Mini Turty II Robot Getting Started V1.0


Getting Started with Mini Turty II Robot

Thank you for your purchase, and welcome to Rhoeby Dynamics products! This document provides a quick overview of the steps required to run ROS Navigation on the Mini Turty II platform.

Pre-requisites

To proceed, you will need the following items:
- A ROS-enabled PC (for robot setup, and for running the ROS application RViz)
- An HDMI monitor and USB keyboard (for robot setup)
- Your ROS navigation-capable Mini Turty II robot!

With the above to hand, proceed to the Quick-Start Instructions to get up and running now.

Quick-Start Instructions

1. Plug the following items into the robot:
   - HDMI monitor (to the Raspberry Pi 3)
   - USB keyboard (to the Raspberry Pi 3)
2. Switch on the Mini Turty II, using the switch on the side of the robot.
3. Wait for Linux to boot and the login prompt to appear on the HDMI monitor.
4. Log in with the following:
   - user name: ubuntu
   - password: rhoeby001
5. Add your WiFi network credentials:
   $ sudo vi /etc/network/interfaces
   Modify these lines to match your network:
   iface wlan0 inet dhcp
   wpa-ssid "Your Network SSID"
   wpa-psk "Your Password"
   Note: retain the quotes (""). For example, if your WiFi network is called NETGEAR101 and your password is mypassword, the file entries would look like this:
   iface wlan0 inet dhcp
   wpa-ssid "NETGEAR101"
   wpa-psk "mypassword"

6. Reboot the robot:
   $ sudo reboot
7. On the ROS-enabled PC, ssh into the robot (the default robot host name is: ubuntu):
   $ ssh -l ubuntu ubuntu
   (Alternatively, on Windows, you could use PuTTY.)
8. At the prompt, verify you can see the easy-start scripts 'mini_turty_init.sh', 'mini_turty_mapping.sh' and 'mini_turty_nav.sh':
   $ ls
   catkin_ws  robot  tmp  mini_turty_init.sh  mini_turty_mapping.sh  mini_turty_nav.sh
9. Unplug the HDMI monitor and keyboard. Your robot is ready for action!

Things You Can Do With Your Mini Turty II Robot

There are many things you can do with your Mini Turty II robot, including:
- Basic ROS Learning
- Teleop: robot remote control
- Map Building: make maps of your home or office for the robot to use
- Navigation: the robot moves autonomously around your home or office
- Frontier Exploration: the robot autonomously explores unknown terrain*
- Tele-Viewing: see what your robot sees, even from another room*
- Computer Vision: Mini Turty II recognizes objects in its environment*

*Coming soon, to a robot near you!

In the sections below, we explore each of these features in more detail.

Basic ROS Learning

Learn the basics of ROS, including things like running ROS nodes, working with topics, and using rosrun and roslaunch. See and interact with working examples such as a publisher and subscriber, and record and playback of data (e.g. LiDAR data); a short command-line sketch appears after the Teleop section below. Follow instructions for running ROS across multiple machines, and much more. Explore topics such as robot coordinate frames, for example using the view_frames tool to graphically examine the robot's tf frames. The graphical output helps simplify your view of the robot's internal structure.

Teleop

Mini Turty II supports tele-operation (remote control of the robot) via several methods, including:
- PC keyboard control (via an SSH terminal, e.g. PuTTY*, or Linux minicom)
- Phone-based control (install the Play Store app 'Map Nav')
- ROS control via the cmd_vel topic

* PuTTY is a simple terminal program you can run on any Windows PC; see http://putty.org

All of the above methods control the robot via the built-in WiFi connection.
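As a concrete taste of the Basic ROS Learning items above, the following commands (run in an ssh session on the robot, after initializing it with ./mini_turty_init.sh) show simple topic inspection, LiDAR data record and playback, and the view_frames tool. This is only a sketch: the scan topic is assumed here to be published under a name such as /scan, which may differ on your robot (check 'rostopic list').

   # List the topics currently being published
   $ rostopic list
   # Subscribe to the LiDAR scan topic and print messages (topic name is an assumption)
   $ rostopic echo /scan
   # Record LiDAR data to a bag file, then play it back later
   $ rosbag record -O lidar_demo.bag /scan
   $ rosbag play lidar_demo.bag
   # Generate a PDF diagram of the robot's tf coordinate frames
   $ rosrun tf view_frames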

Using PuTTY for Teleop

Using PuTTY is easy. Simply download the program from http://putty.org and follow the instructions for installation. Then enter the IP address of your robot and use SSH to connect to it. Log in using the Mini Turty II username and password (default user name: ubuntu, password: rhoeby001), and begin using your robot!
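Once you are connected over SSH (via PuTTY or otherwise), one common way to drive a ROS robot from the PC keyboard is the generic teleop_twist_keyboard package, which publishes to the cmd_vel topic. Treat this as a sketch: teleop_twist_keyboard is a standard ROS package rather than part of the Mini Turty II software, so it may not be pre-installed, and the ROS distribution name used below (kinetic) is an assumption.

   # Install the generic keyboard teleop package (distribution name is an assumption)
   $ sudo apt-get install ros-kinetic-teleop-twist-keyboard
   # Drive the robot with the keyboard; it publishes geometry_msgs/Twist on /cmd_vel
   $ rosrun teleop_twist_keyboard teleop_twist_keyboard.py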

Using Phone-Based Control

You can remotely control Mini Turty II using a phone-based application called Map Nav, which is available in the Play Store. Download and install the app, specify the IP address of the robot, and begin driving your robot around using the virtual joystick control! With nothing more than the phone app itself (and a properly set up robot), you can be controlling your robot quickly.

Using ROS Control Via the cmd_vel Topic

Mini Turty II subscribes to the ROS cmd_vel topic. As such, you can remotely control the robot by issuing commands on that topic. The following command tells the robot to move forward at 0.2 m/s:
   $ rostopic pub -1 /cmd_vel geometry_msgs/Twist '[0.2, 0, 0]' '[0, 0, 0]'
You can stop the robot by issuing this command:
   $ rostopic pub -1 /cmd_vel geometry_msgs/Twist '[0, 0, 0]' '[0, 0, 0]'
The first argument, 0.2, is linear motion on the x-axis. The next two are the y-axis and z-axis respectively. The robot can only translate along the x-axis because it is a differential drive robot. However, it does support angular motion about the z-axis! So you turn the robot by applying some z-axis angular motion, for example:
   $ rostopic pub -1 /cmd_vel geometry_msgs/Twist '[0, 0, 0]' '[0, 0, 0.1]'
The above command will cause the robot to rotate on the spot! Of course, you can combine motions to provide linear and angular motion at the same time, like this:
   $ rostopic pub -1 /cmd_vel geometry_msgs/Twist '[0.1, 0, 0]' '[0, 0, 0.05]'
The above command will cause the robot to move forward whilst simultaneously turning.
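The '-1' flag publishes a single message and then exits. If a single message is not enough to keep the robot moving (this depends on how the robot base software handles cmd_vel, so treat it as an assumption to verify), you can instead publish the command repeatedly at a fixed rate with the '-r' flag and stop with Ctrl-C:

   # Publish the forward command at 10 Hz until interrupted with Ctrl-C
   $ rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '[0.1, 0, 0]' '[0, 0, 0]'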

Map Building

The map building aspect is an exciting function of your new robot! It provides the means to make maps of your home or office for the robot to use. This can be both an interesting learning experience and very useful to your robot in support of its more advanced features.

Mapping is the process of building maps based on data acquired from one or more sensors. A LiDAR sweep of the full 360-degree view, taking many range samples, yields a crude map. But this map is usually far from complete. The second part of the process is to take many of these 360-degree scans and assemble them into a more complete map. As the robot begins to move around in its environment, it is able to estimate where it is relative to the current and previously scanned data (the process known as 'localization'), and then take new scans and add them into the map.

The process of performing localization and mapping together is referred to as Simultaneous Localization And Mapping, or simply SLAM! Using your robot, run mapping techniques such as:
- gmapping
- hector mapping
- karto mapping
Save the map(s) for later use by the robot during navigation.

Starting Mapping

1. At the ssh prompt, initialize the robot:
   $ ./mini_turty_init.sh
2. Wait for initialization to complete, then initiate mapping:
   $ ./mini_turty_mapping.sh
3. On the ROS-enabled PC, set up your connection to the robot by following these instructions: http://wiki.ros.org/ROS/NetworkSetup and http://wiki.ros.org/ROS/Tutorials/MultipleMachines
   Basically:
   Set the hostname (default is: ubuntu):
   $ sudo vi /etc/hostname
   $ sudo vi /etc/hosts
   127.0.1.1 ubuntu
   Add the remote machine (where RViz will run):
   $ sudo vi /etc/hosts   (on the Mini Turty II robot)
   192.168.0.51 vm-ubuntu
   $ sudo vi /etc/hosts   (on the remote PC machine)
   192.168.0.27 ubuntu

The above IP addresses and host names are just examples; replace them with your own addresses.
4. Set the environment variable (on the remote machine):
   $ export ROS_MASTER_URI=http://ubuntu:11311
5. Run RViz (on the remote machine):
   $ rosrun rviz rviz
6. In RViz, display the topic 'laser_data'; you should now see the scanner output on screen (you may need to set the Fixed Frame appropriately).
7. In RViz, display the topic '/map'; you should now see a partial map being built.
8. In the ssh session (where you ran ./mini_turty_mapping.sh), tele-operate the robot using the keyboard arrow keys.
9. Observe the map being built up.
10. When enough of the map has been created, save the map using the following command (in a new ssh session):
   $ rosrun map_server map_saver

Voila! Your map is saved!
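By default, map_saver writes the files map.pgm and map.yaml into the current directory. If you prefer to save the map under a specific name and location (for example, so that mini_turty_nav.sh can find it later; see Running Navigation below), you can pass the '-f' option. The path used here is only an illustration:

   # Save the map as my_map.pgm and my_map.yaml in the robot's maps directory (example path)
   $ rosrun map_server map_saver -f /home/ubuntu/robot/maps/my_map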

Navigation

Autonomous navigation includes use of the generated map, path planning, obstacle detection and obstacle avoidance. Navigation comprises path planning to reach a goal within a map, along with detection of new obstacles that are not part of the map. This is where the preceding elements of LiDAR, scanning, and mapping are all brought together into a functioning system. During navigation, the software provides real-time updates to the proposed path to keep the robot within the mapped area, whilst avoiding any new obstacles.

In the picture below, we see a navigation goal being set. The user clicks a location on the map where they would like the robot to go, and the navigation software displays a large green arrow to show this goal.

In addition to showing the goal, the picture above also shows:
- the mapped data: the black areas
- the scan data from the LiDAR: the white dots
- the robot's current position: the red circle
- obstacle inflation: the turquoise/purple areas

In deriving a path to the goal, the robot must allow a safe distance around all obstacles. To this end, the purpose of obstacle inflation is to make obstacles appear larger than they are, adding a safety margin to help ensure the robot does not collide with them. So the turquoise areas are areas the robot should avoid: they are the danger spots where a collision could occur. During path planning, the robot footprint (in red) should not be made to overlap significantly with those inflated areas. As the robot navigates towards a goal, it maintains an area around itself where it performs obstacle inflation, and seeks to keep its path outside of those areas.
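Besides clicking in RViz, a navigation goal can also be sent from the command line once navigation is running (see Running Navigation below). The sketch below assumes the robot's navigation launch file uses the standard ROS move_base node, which subscribes to the /move_base_simple/goal topic; this is an assumption, not something stated elsewhere in this document. The coordinates are expressed in the 'map' frame:

   # Send a goal 1.0 m forward and 0.5 m to the left of the map origin (example values)
   $ rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}'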

Running Navigation

Running navigation is simple with your Mini Turty II robot.
1. To run the navigation suite, first close any previous mapping session (press 'q', followed by Ctrl-C), then execute the following:
   $ ./mini_turty_nav.sh
   Note: you may need to modify the path to the new map, as specified in mini_turty_nav.sh; see the line:
   'roslaunch scanner2d amcl_rd_demo.launch map_file:=/home/ubuntu/robot/maps/map.yaml'
   Just do 'vi mini_turty_nav.sh' and make your changes if necessary, depending on where you have saved your map.

Frontier Exploration

Frontier Exploration is one of the newest and coolest features of our robots. This functionality is what enables the robot to autonomously enter previously unexplored areas and, without any human intervention, travel around to fully explore its environment whilst building a map for later use. Set Mini Turty II free to explore the great unknown!

Tele-Viewing

Tele-Viewing allows users to see what the robot sees, even from another room! Using the on-robot camera with teleop, users can remotely operate the robot and monitor what's in the robot's field of view.
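If the on-robot camera publishes its images as a standard ROS image topic (the topic name is not given in this document, so this is an assumption to verify with 'rostopic list'), a simple way to try Tele-Viewing from the remote PC is the rqt_image_view tool:

   # Open a window listing available image topics; select the camera topic to view the stream
   $ rosrun rqt_image_view rqt_image_view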

Computer Vision

Using OpenCV, run the 'Find Cat' app to enable Mini Turty II to search the mapped area for the presence of a cat! With this function, the robot searches the entire mapped area and, when it finds a cat, it moves towards it and highlights the cat's face with a bright magenta circle on the tele-viewed video image! This is a super-fun application that provides many opportunities to learn about the exciting field of computer vision. The app could be adapted to detect dogs, people and even many household objects.

For further assistance, visit www.rhoeby.com, or send email to support@rhoeby.com.

Enjoy your new robot!