Probabilistic Robotics Course. Robots and Sensors Orazio


Probabilistic Robotics Course. Robots and Sensors: Orazio. Giorgio Grisetti, grisetti@dis.uniroma1.it, Dept. of Computer, Control and Management Engineering, Sapienza University of Rome

Outline Robot devices, overview of typical sensors and actuators, mobile bases, MARRtino, hardware, firmware

Mobile Base A mobile platform is a device capable of moving in the environment and carrying a certain load (sensors and actuators). At low level the inputs are the desired velocities of the joints, and the output is the state of the joints. At high level it can be controlled with a linear/angular velocity, and it provides the relative position of the mobile base w.r.t. an initial instant, obtained by integrating the joints' states (odometry).
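
To make the last point concrete, here is a minimal sketch of the odometry integration for a differential-drive base; the wheel radius, baseline and encoder resolution below are placeholder values, not those of any specific robot.

```cpp
#include <cmath>

// Minimal differential-drive odometry: integrate encoder tick increments
// into a 2D pose (x, y, theta). All parameters are illustrative placeholders.
struct DiffDriveOdometry {
  double x = 0, y = 0, theta = 0;      // pose w.r.t. the initial instant
  double wheel_radius = 0.035;         // [m], assumed
  double baseline = 0.20;              // [m], wheel separation, assumed
  double ticks_per_revolution = 3600;  // encoder resolution, assumed

  // delta_ticks_l/r: encoder increments since the previous update
  void update(long delta_ticks_l, long delta_ticks_r) {
    const double kPi = 3.14159265358979323846;
    const double k = 2.0 * kPi * wheel_radius / ticks_per_revolution;
    const double dl = k * delta_ticks_l;         // left wheel arc length
    const double dr = k * delta_ticks_r;         // right wheel arc length
    const double ds = 0.5 * (dl + dr);           // translation of the base
    const double dtheta = (dr - dl) / baseline;  // rotation of the base
    // simple midpoint integration of the kinematic model
    x += ds * std::cos(theta + 0.5 * dtheta);
    y += ds * std::sin(theta + 0.5 * dtheta);
    theta += dtheta;
  }
};
```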

Sensors for Ego-Motion Wheel encoders mounted on the wheels. IMU: accelerometers, gyros. The estimate of ego-motion is obtained by integrating the sensor measurements of these devices. This results in an accumulated drift due to the noise affecting the measurements. In the absence of an external reference there is no way to recover from these errors.
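
A toy 1D illustration of this drift: integrating a noisy velocity reading, the error of the estimated position keeps growing and, without an external reference, is never corrected. The noise level and duration are arbitrary.

```cpp
#include <cstdio>
#include <random>

// Integrate a noisy velocity measurement and compare with the ground truth.
int main() {
  std::mt19937 rng(42);
  std::normal_distribution<double> noise(0.0, 0.01);  // measurement noise, assumed
  const double dt = 0.01, true_velocity = 1.0;
  double true_x = 0.0, estimated_x = 0.0;
  for (int i = 0; i < 100000; ++i) {
    true_x += true_velocity * dt;
    estimated_x += (true_velocity + noise(rng)) * dt;  // integrate the noisy reading
  }
  // the accumulated drift is never recovered without an external reference
  std::printf("drift after %.0f s: %f m\n", 100000 * dt, estimated_x - true_x);
  return 0;
}
```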

Measuring the Environment Perception of the environment. Active: ultrasound, laser range finders, structured-light cameras, infrared. Passive: RGB cameras, tactile sensors.

Laser Scanner Wide FOV, highly accurate, safety-certified for collision detection.

Typical Scans

RGB Monocular Camera

RGB Monocular Camera Cameras measure the intensity of the light projected onto a (typically planar) CCD through a system of lenses and/or mirrors. They provide a lot of information, but they project 3D onto 2D, which results in the unobservability of the depth. The scene can be reconstructed from multiple images (see Structure from Motion, SfM).
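
A minimal sketch of the pinhole projection behind this statement, with made-up intrinsic parameters: the depth is divided out during projection, so every point along the same viewing ray maps to the same pixel and a single image cannot recover it.

```cpp
#include <array>

// Pinhole projection of a 3D point expressed in the camera frame.
// The intrinsics (fx, fy, cx, cy) are illustrative values, not a calibrated camera.
struct PinholeCamera {
  double fx = 500, fy = 500, cx = 320, cy = 240;

  // Returns pixel coordinates (u, v); only the ratios x/z and y/z matter.
  std::array<double, 2> project(double x, double y, double z) const {
    return {fx * x / z + cx, fy * y / z + cy};
  }
};

// Any point (s*x, s*y, s*z) with s > 0 projects to the same pixel,
// which is why the depth is unobservable from a single image.
```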

RGB Stereo Camera (figure: reconstruction from top) Stereo cameras are a combination of two monocular cameras that allow triangulation, given a known relative geometry. If the corresponding points in the images are known, we can reconstruct the 3D scene. The error in the depth depends on the distance! Sensitive to lack of texture.
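
As a sketch of the triangulation step, assuming a rectified pair and made-up focal length and baseline: the depth comes from the disparity, and a fixed disparity error translates into a depth error that grows roughly quadratically with the distance.

```cpp
// Depth from disparity for a rectified stereo pair: z = f * b / d.
// Focal length and baseline below are assumed values for illustration.
double depth_from_disparity(double disparity_px,
                            double focal_px = 500.0,     // assumed
                            double baseline_m = 0.12) {  // assumed
  return focal_px * baseline_m / disparity_px;
}

// First-order propagation of a disparity error into a depth error:
// sigma_z ~ z^2 / (f * b) * sigma_d, hence the degradation with distance.
double depth_std(double z, double disparity_std_px,
                 double focal_px = 500.0, double baseline_m = 0.12) {
  return z * z / (focal_px * baseline_m) * disparity_std_px;
}
```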

RGBD Cameras Cameras that are able to sense both color and depth, even with poor/no texture. They use an active light source and retrieve the depth either via stereo triangulation (emitter and receiver are in different positions) or via time of flight (emitter and receiver are in the same position). Environmental conditions should allow sensing the emitted light: typically OK indoors.
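
For the time-of-flight case the depth follows directly from the round-trip time of the emitted light; the helper below only illustrates that relation.

```cpp
// Time of flight: emitter and receiver are co-located, so the measured
// round-trip time of the light corresponds to twice the depth.
constexpr double kSpeedOfLight = 299792458.0;  // [m/s]

double depth_from_round_trip_time(double round_trip_seconds) {
  return 0.5 * kSpeedOfLight * round_trip_seconds;
}
```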

MARRtino Is a simple but complete mobile base designed to be used in the MARR course. The cost of the parts is around 300 euros. It is entirely open source. It is integrated into ROS through a simple node that publishes/subscribes to standard topics.

Orazio Is a simplified yet complete redesign of MARRtino, with the goals of using easy-to-find hardware (Arduino) and reducing the assembly time (2 hours for non-skilled users). It is entirely open source. It is integrated into ROS through a simple node that publishes/subscribes to standard topics. Firmware at https://bitbucket.org/ggrisetti/arduino_robot

Electronics (block diagram: left and right motors with encoders, two ½ H-bridges, controller board, RS232 link to the PC)

Electronics The PC communicates with the Arduino through USB. Each encoder provides two signals. Each PWM requires at least 2 wires; the wiring of the PWM depends on the H-bridge used.

Power Control board: 6 V from one of the batteries. H-bridges: 12 V from both batteries, 5 V from the logic. The system can either charge the batteries or be powered on. The controller is powered through USB. The controller and the H-bridges share the GND.

Encoders Each encoder has two signals (A, B) and requires a 5 V supply provided by the controller board. The encoders are managed by the Quadrature Encoder Interface (QEI) module of the controller, which takes care of counting ticks and the direction.

Encoders Each encoder has two signals (A, B) and requires a 5 V supply provided by the controller board. The encoders are managed by interrupts on the signal edges, which take care of counting ticks and the direction.
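
A minimal Arduino-style sketch of this interrupt-based decoding, with placeholder pin numbers that are not taken from the Orazio firmware: on each rising edge of channel A, the level of channel B gives the direction of rotation.

```cpp
#include <Arduino.h>

const int PIN_A = 2;  // interrupt-capable pin, assumed
const int PIN_B = 3;  // assumed

volatile long ticks = 0;  // signed tick count updated in the ISR

// On a rising edge of A, B leads or lags depending on the rotation direction.
void onEdgeA() {
  if (digitalRead(PIN_B) == HIGH) ticks++;
  else                            ticks--;
}

void setup() {
  pinMode(PIN_A, INPUT_PULLUP);
  pinMode(PIN_B, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(PIN_A), onEdgeA, RISING);
}

void loop() {
  // ticks holds the accumulated encoder count (read it with interrupts disabled).
}
```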

H-Bridge The motor is connected to the H-bridge, which provides the voltage and current necessary to drive it. The H-bridge requires 12 V power directly from the battery. The controller board controls the H-bridge through: a square wave whose duty cycle is proportional to the voltage applied to the motor, which controls the speed (PWM); a direction pin, which reverses the voltage when asserted, causing the motor to rotate in the opposite direction.
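
A minimal Arduino-style sketch of this control scheme, again with placeholder pin numbers: the sign of the commanded speed drives the direction pin, its magnitude sets the PWM duty cycle and hence the average voltage seen by the motor.

```cpp
#include <Arduino.h>

const int PIN_PWM = 5;  // PWM-capable pin, assumed
const int PIN_DIR = 4;  // direction pin, assumed

void setup() {
  pinMode(PIN_PWM, OUTPUT);
  pinMode(PIN_DIR, OUTPUT);
}

// speed in [-255, 255]: sign -> direction pin, magnitude -> duty cycle.
void setMotorSpeed(int speed) {
  digitalWrite(PIN_DIR, speed >= 0 ? HIGH : LOW);
  analogWrite(PIN_PWM, constrain(abs(speed), 0, 255));
}

void loop() {
  setMotorSpeed(128);  // about half duty cycle, forward
}
```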

PC Connection The robot communicates with the PC through an RS232 interface at TTL levels (0-5 V). The TTL RS232 signal is converted to USB through an FTDI chip. The device is visible on Linux as /dev/ttyxxx.
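
On the PC side the port can be opened like any POSIX serial device; the sketch below assumes the device appears as /dev/ttyUSB0 and runs at 115200 baud, both of which depend on the actual system and firmware configuration.

```cpp
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

// Open the FTDI serial device in raw 8N1 mode; returns the file descriptor
// or -1 on failure. Device name and baud rate are assumptions.
int open_serial(const char* device = "/dev/ttyUSB0") {
  int fd = open(device, O_RDWR | O_NOCTTY);
  if (fd < 0) return -1;

  termios tty{};
  tcgetattr(fd, &tty);
  cfmakeraw(&tty);               // no echo, no line processing
  cfsetispeed(&tty, B115200);
  cfsetospeed(&tty, B115200);
  tty.c_cflag |= (CLOCAL | CREAD);
  tcsetattr(fd, TCSANOW, &tty);
  return fd;                     // use read()/write() on fd afterwards
}
```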