Major Project SSAD. Advisor: Dr. Kamalakar Karlapalem. Mentor: Raghudeep. SSAD Mentor: Manish Jha. Group: Group 20. Members: Harshit Daga (200801028), Aman Saxena (200801010).

We were supposed to calculate the error in the localisation of the NAO robot. In general terms: assume a map given in rectangular coordinates, and assume the robot is standing at some (x, y) after a given operation, say moving from some other position. In reality it has not reached that point, because of errors in the motors and in the movement of the robot. We were therefore supposed to calculate the (theta, h, k) the robot must move through so that it reaches the desired position.
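In other words, assuming the error is modelled as a planar rotation theta followed by a shift (h, k) (the slides do not spell the model out, so this is our reading), a point (x, y) in the robot's believed frame maps to the corrected position:

    x' = x*cos(theta) - y*sin(theta) + h
    y' = x*sin(theta) + y*cos(theta) + k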

Nao is an autonomous, programmable, medium-sized humanoid robot developed by the French company Aldebaran Robotics. It is designed for entertainment purposes and is able to interact with its owner, with evolving behaviours and functionalities. Additionally, the user can teach Nao new behaviours using a computer with Wi-Fi connectivity. The behaviour-creation software is designed to fit any user's level: from graphical block editing for beginners to code for more skilled users. The possible behaviours are limited only by our imagination!

Nao is based on a Linux platform and scripted with Urbi, an easy-to-learn programming language, with the option of a graphical interface for beginners or code commands for experts. On August 15, 2007, Nao replaced Sony's robot dog Aibo as the standard platform for RoboCup ("Robot Soccer World Cup"), an international robotics competition.

Technical Specification:
Height: 58 cm
Weight: 4.3 kg
Autonomy: 45 min
Degrees of freedom: 21 to 25
CPU: x86 AMD Geode 500 MHz
Built-in OS: Linux
Compatible with: Windows, MacOS, Linux
Programming languages: C++, C, Python, Urbi
Vision: two 640 x 480 CMOS cameras
Connectivity: Ethernet, Wi-Fi
All versions feature an inertial sensor and four ultrasound sensors that give NAO stability and positioning within space. Nao also features a powerful multimedia system (4 microphones, 2 hi-fi speakers, 2 CMOS cameras) for text-to-speech synthesis, sound localization, and facial and shape recognition, among various other abilities. The package includes dedicated programming software, Aldebaran Choregraphe, and Nao is compatible with Microsoft Robotics Studio, Cyberbotics' Webots and Gostai Urbi Studio.

What we did: Our task was divided into two parts. The first part was writing the C/C++ code; the second part was putting the code on NAOqi.

First task, C/C++ code: In this task we wrote C/C++ code that satisfied the required conditions. We were given:
1. The matrix (the map).
2. The standing position of the robot.
3. The number and coordinates of the obstacles (obstacles are considered stationary).
4. Theta, h, k as input taken from the user, used to generate a virtual map of the kind that would have been obtained from sensor readings if working on the robot (see the sketch below).
We wrote this code in C, and it can be used for any general robot for the purpose of localisation.

Second task, NAOqi part: We put the C code modules on NAOqi so that the general code we wrote for any robot is compatible with the NAOqi framework on which the NAO robot runs; we thus registered our modules in NAOqi for anyone else to use.
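A minimal sketch of the virtual-map generation step, assuming the error model above (this is illustrative, not the project's actual code, and it omits the map matrix and the standing position for brevity): read the obstacle coordinates and the user-supplied (theta, h, k), and output the coordinates as the mis-localised robot would perceive them.

    /* Sketch only: generate the "virtual" obstacle coordinates from the
       ideal map coordinates and a user-supplied error (theta, h, k). */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        int n;                                   /* number of obstacles */
        double theta, h, k;

        printf("number of obstacles: ");
        scanf("%d", &n);
        printf("error theta h k: ");
        scanf("%lf %lf %lf", &theta, &h, &k);

        double c = cos(theta), s = sin(theta);
        for (int i = 0; i < n; ++i) {
            double x, y;                         /* ideal map coordinates */
            scanf("%lf %lf", &x, &y);
            /* coordinates as the (mis-localised) robot would perceive them */
            double vx = c * x - s * y + h;
            double vy = s * x + c * y + k;
            printf("obstacle %d: virtual (%.2f, %.2f)\n", i, vx, vy);
        }
        return 0;
    }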

What we SAID: We broke the task into two modules: error generation and graphical interface. In the error-generation part, we would calculate the error on the basis of the sensor readings we would get. Graphics generation would use the GLUT/GL library (see the sketch below).

What we DID, with problems and the reason for not doing it: As we were not able to take the sensor readings from the robot (reason mentioned in the next few slides), we instead wrote general C/C++ code for all robots, which can then be adapted to the particular robot simulator. In coding terms, though, the task was still completed.
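For the graphics module, a minimal GLUT/GL sketch of the kind we had in mind (assumed structure, not the project's actual code): open a window and plot a few map points with GL_POINTS.

    /* Minimal GLUT sketch: draw example map points in a window. */
    #include <GL/glut.h>

    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glPointSize(6.0f);
        glBegin(GL_POINTS);
            glVertex2f(-0.5f, -0.5f);   /* example ideal map point */
            glVertex2f(-0.4f, -0.45f);  /* corresponding virtual point */
            glVertex2f(0.3f, 0.6f);
        glEnd();
        glFlush();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
        glutInitWindowSize(400, 400);
        glutCreateWindow("Localisation map");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }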

The task given was to make the robot move to the point specified in the input as its standing position, since because of the error the robot is not actually at that point. We were supposed to take the sensor readings from the robot's eyes and then, using these sensor readings together with the ideal map that we know, use geometry to calculate the (theta, h, k) the robot should move in order to reach the desired point. The sensor readings can be obtained from two sets of sensors provided on the robot: one is a set of infrared sensors located in its eyes; the second is a set of four ultrasonic sensors located on its chest.

Problem: Since no API has been developed so far to read the infrared sensors from the NAO robot's memory, we were not able to use these sensors to get the actual obstacle positions relative to where the robot is standing. And if we use the chest sensors, the robot needs to rotate, which introduces a great amount of error into our calculation.

Solution: Since we could not afford to introduce more error by taking the sensor readings from the robot's chest (as it would need to be rotated), we wrote the C/C++ code and the NAOqi part taking the map and the ideal situation as input, and then generated, using mathematics, the virtual map as it would be seen by the robot. We thus obtained two sets of points, the ideal map points and the points as seen by the robot, and by running the localisation algorithm on these sets we calculated the (theta, h, k) the robot needs to move in order to reach the correct position on the map. A sketch of one such calculation follows.
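The slides do not give the localisation algorithm itself, so the following is only one standard recipe for this step (2D rigid registration via centroids and atan2), shown under the assumption that the observed points are related to the ideal points by observed = rotate(ideal, theta) + (h, k):

    /* Sketch only: recover (theta, h, k) from matched ideal/observed pairs. */
    #include <math.h>
    #include <stdio.h>

    typedef struct { double x, y; } Point;

    void estimate_error(const Point *ideal, const Point *observed, int n,
                        double *theta, double *h, double *k)
    {
        double cix = 0, ciy = 0, cox = 0, coy = 0;
        for (int i = 0; i < n; ++i) {
            cix += ideal[i].x;    ciy += ideal[i].y;
            cox += observed[i].x; coy += observed[i].y;
        }
        cix /= n; ciy /= n; cox /= n; coy /= n;   /* centroids of both sets */

        double sdot = 0, scross = 0;
        for (int i = 0; i < n; ++i) {
            double ax = ideal[i].x - cix,    ay = ideal[i].y - ciy;
            double bx = observed[i].x - cox, by = observed[i].y - coy;
            sdot   += ax * bx + ay * by;
            scross += ax * by - ay * bx;
        }
        *theta = atan2(scross, sdot);             /* best-fit rotation */
        *h = cox - (cos(*theta) * cix - sin(*theta) * ciy);
        *k = coy - (sin(*theta) * cix + cos(*theta) * ciy);
    }

    int main(void)
    {
        Point ideal[3]    = {{0, 0}, {1, 0}, {0, 1}};
        Point observed[3] = {{0.5, 0.2}, {1.49, 0.3}, {0.4, 1.19}};
        double theta, h, k;
        estimate_error(ideal, observed, 3, &theta, &h, &k);
        printf("theta=%.3f  h=%.3f  k=%.3f\n", theta, h, k);
        return 0;
    }

On this example the recovered values come out close to theta = 0.1 rad and (h, k) = (0.5, 0.2), which is the error used to generate the observed points.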

What we learned and ideas that we have. Learned / overviewed: the localization algorithm; the coding part of robotics; about NAO. Ideas: virtual map generation; usage of previous data.

Learned:
1.) Localization algorithm: how the error in the robot's location can be corrected using mathematics.
2.) Coding part of robotics: we got an overview of how robots are coded and how this code is used to control the robot.
3.) About NAO: we got an overview of how NAO works, how this robot is used to play a game like football, how it gets an idea of the obstacles, etc.

What we learned and ideas that we have. Learned / overviewed: the localization algorithm; the coding part of robotics; about NAO. Ideas: virtual map generation; usage of previous data.

Ideas:
1.) Virtual map generation: in the localisation algorithm we generated a virtual map because we did not have the sensor readings. Apart from what was specified in the algorithm for calculating the virtual map as actually seen by the robot, we figured out another way of generating the set of points, and we felt those points were closer to what we could actually get from sensor readings (since theta, h, k were provided in the input).
2.) Usage of previous data: although we were not able to work on the robot itself, we still had the following thought. We considered moving the robot step by step towards its correct position, but after every step the sensor readings need to be taken again, which takes time. Since the error shrinks after every step, we decided that if we begin with a particular number of readings and decrease this number after every step, we can save some time (fresh sensor readings are still needed every time, because of the various factors in play, so the stored ones cannot be reused).

Thank You -Harshit Daga -Aman Saxena