GESTURE BASED HUMAN MULTI-ROBOT INTERACTION. Gerard Canal, Cecilio Angulo, and Sergio Escalera


Slide 2/27: Introduction. Robots are nowadays able to perform many useful tasks, and most human communication is non-verbal. This motivates HRI research on a gesture-based interaction system.

Slide 3/27: Motivation. The use case of an elderly or disabled person.

Slide 4/27: Outline. Goals; resources; system overview; gesture recognition; HRI methods; results (gesture recognition performance); results (user evaluation); conclusions; future work.

Slide 5/27: Goals. Design a system that is easy to use and intuitive; real-time operation and therefore fast response; recognition of both static and dynamic gestures; accuracy in pointing at a location; allowing the robot to respond in an intuitive manner; resolving ambiguous situations.


Slide 7/27: Goals. System set-up: allowing the robot to respond in an intuitive manner. The vision sensor is too large to be carried by the robot, so the set-up follows the DARPA Grand Challenge idea of a driving humanoid.

Slide 8/27: Hardware resources. Microsoft Kinect v2 (Windows 8.1 driver, USB 3.0). NAO humanoid (Geode CPU, NAOqi OS). Two laptops (Intel i5; Intel Core 2 Duo). Wifibot mobile platform (Intel Atom, Ubuntu 12.04).

Slide 9/27: Hardware resource modifications.

Slide 10/27: Software resources. ROS (Robot Operating System) to program the robots, with SMACH to implement the finite state machines in Python; Indigo Igloo version on Ubuntu 14.04. Kinect for Windows SDK 2.0, used from C++. PCL (Point Cloud Library), implemented in C++.

Slide 11/27: System overview.

Slide 12/27: System overview (continued).

Slide 13/27: Gesture recognition. Two types of gestures, static and dynamic, with one gesture of each type: wave and point-at. Gestures are described by means of skeletal features [1]. [1] J. Shotton, A. Fitzgibbon, M. Cook, T. Sharp, M. Finocchio, R. Moore, A. Kipman, and A. Blake. Real-time human pose recognition in parts from single depth images. In Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition, CVPR '11, pages 1297-1304, Washington, DC, USA, 2011. IEEE Computer Society.

Slide 14/27: Skeletal features. Wave gesture: θ1, neck-hand distance; θ2, elbow angle. Point-at gesture: θ1, hand-hip distance; θ2, elbow angle; θ3, hand 3D position.
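The two recurring features, joint-to-joint distance and elbow angle, can be computed directly from the 3D joint positions the Kinect skeleton tracker provides. A minimal sketch (not the paper's implementation; joint names and argument order are illustrative):

```python
import numpy as np

def elbow_angle(shoulder, elbow, hand):
    """Angle at the elbow (radians) from three 3D joint positions."""
    u = np.asarray(shoulder, float) - np.asarray(elbow, float)
    v = np.asarray(hand, float) - np.asarray(elbow, float)
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))

def joint_distance(a, b):
    """Euclidean distance between two joints, e.g. neck-hand or hand-hip."""
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))
```

A fully bent arm gives an angle near 0, a straight arm near π, which is what makes the elbow angle useful for separating a raised waving arm from a resting one.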

Slide 15/27: Gesture recognition: Dynamic Time Warping, using a weighted L1 distance measure. A gesture is recognized when the input sequence is close enough to the gesture model.
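The DTW step above can be sketched as follows: a standard dynamic-programming alignment where the per-frame cost is a weighted L1 distance between feature vectors, and the gesture fires when the total alignment cost falls below a threshold. This is a minimal sketch under those assumptions, not the paper's actual code; the weights `w` and `threshold` are illustrative parameters:

```python
import numpy as np

def weighted_l1(x, y, w):
    """Weighted L1 distance between two feature vectors."""
    return float(np.sum(w * np.abs(np.asarray(x, float) - np.asarray(y, float))))

def dtw_cost(seq, model, w):
    """Classic DTW alignment cost between an input sequence and a gesture model."""
    n, m = len(seq), len(model)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = weighted_l1(seq[i - 1], model[j - 1], w)
            # Allow insertion, deletion, or match, as in standard DTW.
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def is_gesture(seq, model, w, threshold):
    """Recognize the gesture when the input is close enough to the model."""
    return dtw_cost(seq, model, w) < threshold
```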

Slide 16/27: Static gesture recognition. Check that the features exceed their thresholds (θ1 > T1, θ2 > T2) and that the involved limb does not move for a certain number of frames. Dynamic and static recognition are performed jointly in a multi-threaded way.
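The static check above (thresholds held while the limb stays still over a window of frames) can be sketched as a small stateful detector. This is an illustrative sketch, not the paper's implementation; `T1`, `T2`, `motion_tol` and `n_frames` are assumed parameters:

```python
from collections import deque

import numpy as np

class StaticGestureDetector:
    """Fires when theta1 > T1 and theta2 > T2 hold, and the limb is
    (nearly) still, for n_frames consecutive frames."""

    def __init__(self, T1, T2, motion_tol=0.05, n_frames=10):
        self.T1, self.T2 = T1, T2
        self.motion_tol = motion_tol
        self.buf = deque(maxlen=n_frames)

    def update(self, theta1, theta2, hand_pos):
        """Feed one frame; returns True once the static gesture is detected."""
        self.buf.append((theta1, theta2, np.asarray(hand_pos, float)))
        if len(self.buf) < self.buf.maxlen:
            return False
        thresholds_ok = all(t1 > self.T1 and t2 > self.T2
                            for t1, t2, _ in self.buf)
        # "Not moving": the hand's bounding box over the window is tiny.
        positions = np.stack([p for _, _, p in self.buf])
        still = np.linalg.norm(positions.max(0) - positions.min(0)) < self.motion_tol
        return bool(thresholds_ok and still)
```

Running the detector on a per-frame thread alongside the DTW matcher is one way to realize the multi-threaded joint recognition the slide mentions.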

Slide 17/27: Gesture recognition: pointing-gesture methods. Ground plane detection by RANSAC model fitting [2]; pointed-location extraction using skeletal joint information; object segmentation by Euclidean cluster extraction [3]. [2] M. A. Fischler and R. C. Bolles. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6):381-395, June 1981. [3] R. B. Rusu. Clustering and segmentation. In Semantic 3D Object Maps for Everyday Robot Manipulation, volume 85 of Springer Tracts in Advanced Robotics, chapter 6, pages 75-85. Springer Berlin Heidelberg, 2013.
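Once RANSAC has produced a ground-plane model, the pointed location can be obtained by casting a ray through two arm joints and intersecting it with that plane. A hedged sketch (the paper does not specify which joints define the ray; elbow-to-hand is one common choice):

```python
import numpy as np

def pointed_ground_location(elbow, hand, plane):
    """Intersect the elbow->hand ray with a ground plane (a, b, c, d),
    where ax + by + cz + d = 0, e.g. as fitted by RANSAC."""
    elbow = np.asarray(elbow, float)
    hand = np.asarray(hand, float)
    n = np.asarray(plane[:3], float)  # plane normal
    d = float(plane[3])
    direction = hand - elbow
    denom = n.dot(direction)
    if abs(denom) < 1e-9:
        return None  # arm parallel to the ground plane: no intersection
    t = -(n.dot(elbow) + d) / denom
    if t < 0:
        return None  # intersection lies behind the arm, not pointed at
    return elbow + t * direction
```

The returned 3D point can then seed the Euclidean cluster extraction step: clusters near the intersection are the candidate pointed-at objects.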

Slide 18/27: HRI methods: object disambiguation. Extra information may be needed in case of doubt; it is obtained through a short spoken dialogue using simple questions about object features such as size and position.

Slide 19/27: HRI methods: interaction techniques. The robot performs human-like gestures and verbalizes its actions in a non-repetitive way to enhance understanding.

Slide 20/27: Results: recognition performance (Jaccard index). Performance was measured on a labeled set of 61 gesture samples (27 static, 34 dynamic) comprising 2082 gesture frames, using the overlap (Jaccard index) as the metric. Mean Jaccard index under LOOCV: static gestures 0.46, dynamic gestures 0.49, overall mean 0.49.
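The overlap metric on the slide can be computed per gesture as the Jaccard index between the predicted and ground-truth frame labelings. A minimal sketch of that frame-level computation (the exact evaluation protocol is an assumption):

```python
import numpy as np

def frame_jaccard(pred, gt):
    """Jaccard index (intersection over union) between binary per-frame
    predicted and ground-truth gesture label sequences."""
    pred = np.asarray(pred, bool)
    gt = np.asarray(gt, bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    # Both sequences empty: perfect agreement by convention.
    return float(inter / union) if union else 1.0
```

Averaging this score over the left-out samples of each LOOCV fold yields the per-type means reported on the slide.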

Slide 21/27: Results: user experience evaluation. Testing environment.

Slide 22/27: Results: user experience evaluation (user survey). 24 users tested the system.

Slide 23/27: Results: user experience evaluation (user survey).

Slide 24/27: Demonstration.

Slide 25/27: Conclusions. The system has potential utility in household environments, and test users described the gestures as natural. Users found the system easy to interact with and were able to complete a task successfully in most cases. It works in near real time (~20 FPS) with adequate response times, and the framework is generic and scalable.

Slide 26/27: Future improvements. Enhance the pointing-location estimation: compensate for user pointing imprecision by learning from it, and use additional cues such as gaze direction and hand pose estimation. More precise navigation (no free-path assumption, scene understanding). Affective and cognitive interaction.

Slide 27/27: Thank you. No robot was harmed in the making of this paper.