3D Interaction using Hand Motion Tracking. Srinath Sridhar, Antti Oulasvirta


3D Interaction using Hand Motion Tracking. Srinath Sridhar, Antti Oulasvirta. EIT ICT Labs Smart Spaces Summer School, 05-June-2013

Speakers. Srinath Sridhar, PhD student supervised by Prof. Dr. Christian Theobalt and Dr. Antti Oulasvirta, Max Planck Institut für Informatik, Saarbrücken, Germany (www.mpi-inf.mpg.de/~ssridhar/). Antti Oulasvirta, Senior Researcher, Max Planck Institut für Informatik, Saarbrücken, Germany (www.mpi-inf.mpg.de/~oantti/).

Overview of Today's Session. We will have four parts. Part I: You are here! Part II: Introduction to 3D Interaction using Hand Motion Tracking. Part III: Introduction to the Leap Motion Sensor and SDK. Part IV: Hands-on Exercises. Please feel free to interrupt with questions at any time.

Requirements for Today's Session. Requirements: a WiFi-enabled laptop with a WebSocket-compatible web browser (Firefox 6+, Chrome 14+, IE 10); a text editor and basic Java/C++ skills; Google Earth for Windows or Mac; cool 3D interaction ideas. Audience poll: requirements, teams.

Objectives. Gain the ability to understand and create 3D interactive interfaces using hand motion tracking: computer vision techniques for hand motion tracking and their relative performance; the different sensing devices, with emphasis on the Leap Motion sensor and the Leap SDK; and implementing a simple 3D interaction interface for Google Earth.

Part II. INTRODUCTION TO 3D INTERACTION USING HAND MOTION TRACKING

Motivation

Motivation: The Human Hand. Joints: 26 degrees of freedom (roughly 4 per finger across 5 fingers, plus 6 for the global pose). Muscles: fine motor control. Brain: grasping and gestures.

Motivation: Potential HCI Applications. 2D/3D UI interaction, sign language recognition, retargeting, musical instruments, and a Tony Stark / Tom Cruise-esque interface of the future.

Components of 3D Interaction using Hand Tracking. Human -> articulated hand motion tracking (output: set of points, skeleton, etc.; computer vision, Part II) -> 3D interaction interface (3D desktop, Google Earth, etc.; interaction design, Parts III and IV) -> computer.

Requirements for Hand Tracking in HCI. Interactive: real-time performance and minimal latency. Markerless: no gloves or markers. DoF: captures many degrees of freedom or the full hand skeleton. Occlusions: robust to partial self-occlusions. Environment: works with general backgrounds and illumination.

Leap Motion. Tracks semantically meaningful parts of the hand (fingertips, palm), each with 6 DoF. Very high accuracy and low latency. Internally uses a depth sensor. No skeleton tracking.

Efficient model-based 3D tracking of hand articulations using Kinect. Oikonomidis et al. (ICCV 2011, CVPR 2012). Captures 26 DoF of the hand using a model composed of geometric primitives. Performance: 15 Hz; latency due to the Kinect. Limited to the range of the Kinect. Skin colour-based segmentation of depth data.
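Skin-colour segmentation of this kind can be sketched with a simple chromaticity threshold. The snippet below is illustrative only (not the authors' implementation); the threshold box values are assumptions, standing in for the learned colour models used in practice:

```javascript
// Minimal skin-colour segmentation sketch (illustrative, not the paper's method).
// Classifies a pixel as skin if its normalized red/green chromaticities fall
// inside a fixed box.
function isSkin(r, g, b) {
  const sum = r + g + b;
  if (sum === 0) return false;
  const rn = r / sum; // normalized red chromaticity
  const gn = g / sum; // normalized green chromaticity
  // Empirical box for skin tones in r-g chromaticity space (values illustrative).
  return rn > 0.36 && rn < 0.465 && gn > 0.28 && gn < 0.363;
}

// Build a binary mask over an RGBA pixel buffer (e.g. from a canvas).
function skinMask(pixels) {
  const mask = new Uint8Array(pixels.length / 4);
  for (let i = 0; i < mask.length; i++) {
    mask[i] = isSkin(pixels[4 * i], pixels[4 * i + 1], pixels[4 * i + 2]) ? 1 : 0;
  }
  return mask;
}
```

The mask would then gate which depth pixels are fed to the model-fitting stage.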

6D Hands: Markerless Hand-Tracking for Computer Aided Design. Wang et al. (UIST 2011). Captures 27 DoF of the hand using a skeletal hand model. Performance: 17 Hz. Skin colour-based segmentation of depth data. Used as a control interface for 3D CAD modelling.

Hybrid Hand Tracking using RGB and Depth Data. MPI Informatik. Captures 26 DoF of the hand using a kinematic skeleton model. Performance: 17 Hz with 30-60 ms latency. Uses colour information from RGB cameras together with depth data. Multi-view camera setup with 5 RGB cameras and 1 depth camera. Used as an interface for musical expression.

Hand Tracking Approach (pipeline). Multi-view image sequence -> feature extraction -> voting; depth data -> normalization -> lookup in a database of hand poses; combined to produce the final pose. (CG Lunch, 07-Feb-2013)
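The database-lookup step in this pipeline can be sketched as a nearest-neighbour vote: each extracted feature retrieves its closest stored pose, and the pose with the most votes wins. A toy version, with flat arrays standing in for real feature descriptors and all names hypothetical:

```javascript
// Toy nearest-neighbour pose lookup (a sketch, not the MPI system).
// Each database entry pairs a feature vector with a pose label; every query
// feature votes for its closest entry, and the most-voted pose wins.
function sqDist(a, b) {
  let d = 0;
  for (let i = 0; i < a.length; i++) d += (a[i] - b[i]) ** 2;
  return d;
}

function lookupPose(queryFeatures, database) {
  const votes = new Map();
  for (const f of queryFeatures) {
    // Find the database entry whose feature vector is closest to f.
    let best = null, bestD = Infinity;
    for (const entry of database) {
      const d = sqDist(f, entry.features);
      if (d < bestD) { bestD = d; best = entry; }
    }
    votes.set(best.pose, (votes.get(best.pose) || 0) + 1);
  }
  // Return the pose label with the most votes.
  let winner = null, max = -1;
  for (const [pose, n] of votes) if (n > max) { max = n; winner = pose; }
  return winner;
}
```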

Comparison of Hand Motion Tracking Systems.
System | Interactive | No. of DoF | Accuracy | Technology | Number of Views | HCI Application
Leap Motion | high | 0-36+ (no articulations) | high | depth | 1 | Google Earth, 3D UI, etc.
ICS FORTH | 15-20 fps | 26 | 10 mm | depth + RGB | 1 | object interaction
Wang et al. | 15 fps | 27 | - | RGB (also depth) | 2 (also 1) | 3D CAD modelling
MPI | 17 fps | 26 | 13 mm | depth + RGB | 4-6 | musical instrument
ETH Zurich | 2 fpm | 26+ | ~10-15 mm | RGB | 7-8 | multiple hands
Intel | 50 fps | ~26 | - | depth | 1 | -

Part III. INTRODUCTION TO SENSING DEVICES AND THE LEAP MOTION SDK

What is the Leap Motion controller? A close-range depth sensor (range < 50 cm), similar to the Microsoft Kinect, SoftKinetic DepthSense, etc. Comes with a bundled API for tracking fingertips, hands, and tools (any pointy object). USB 2.0/3.0 input. Available in June/July for $70. Airspace app store.
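The controller streams tracking frames as JSON over a local WebSocket, which is why a WebSocket-capable browser is on the requirements list. A minimal browser-side sketch; the endpoint (ws://localhost:6437) and message layout follow the early protocol and should be checked against the SDK version you actually run:

```javascript
// Minimal sketch of receiving Leap tracking frames in the browser.
// Endpoint and message layout are assumptions based on the early
// Leap WebSocket protocol.
function parseFrame(message) {
  const frame = JSON.parse(message);
  // Version/handshake messages carry no tracking data.
  if (!frame.hands) return null;
  return {
    id: frame.id,
    hands: frame.hands.length,
    pointables: frame.pointables ? frame.pointables.length : 0,
  };
}

function connect(onFrame) {
  const ws = new WebSocket("ws://localhost:6437");
  ws.onmessage = (event) => {
    const frame = parseFrame(event.data);
    if (frame) onFrame(frame);
  };
  return ws;
}
```

Usage: `connect((f) => console.log(f.hands, "hand(s) in view"));`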

How does it (most likely) work? Possibly time-of-flight combined with stereo. (Slide compares structured-light sensing, as in the Kinect, with the Leap Motion.)

Functionality Exposed in the API. Hands: palm centre and orientation. Fingers: fingertip location, finger length (not exact), finger pointing direction. Tools (any pointy object): tooltip location, tool length, tool pointing direction. https://developer.leapmotion.com/documentation/guide/leap_overview
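Pulling those fields out of a frame object might look like the sketch below. Field names (palmPosition, palmNormal, tipPosition, direction, length, tool) follow the Leap overview linked above; positions are [x, y, z] triples in millimetres relative to the device:

```javascript
// Extract palm and fingertip data from one Leap frame (sketch; field names
// per the Leap API overview, treated here as assumptions).
function summarizeFrame(frame) {
  const hands = (frame.hands || []).map((h) => ({
    palm: h.palmPosition,   // palm centre [x, y, z] in mm
    normal: h.palmNormal,   // palm orientation
  }));
  const fingers = (frame.pointables || [])
    .filter((p) => !p.tool) // pointables flagged as tools are excluded
    .map((p) => ({
      tip: p.tipPosition,   // fingertip location
      direction: p.direction, // pointing direction
      length: p.length,     // approximate finger length
    }));
  return { hands, fingers };
}
```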

Pros and Cons of the Leap Motion. Pros: jitter-free point tracking, high frame rate, low latency, fairly large field of view. Cons: no skeleton tracking; tracked points have no semantics; no access to raw data (depth data, or RGB data if available); single viewpoint.

Other Depth Sensors

BREAK?

Part IV. HANDS-ON EXERCISES

Information. Connect to WiFi: SSID minerva, password 3dinteraction. Please install Google Earth if you have not; Google Earth API basics are enough. Visit 192.168.1.100:8080. You should see this: (screenshot)

Exercises Overview. Implement panning and zooming using one of the following: the data structure from the Leap Motion SDK, or 3D position data from the Intel depth tracker. Implement flying at terrain level using one of the following: the data structure from the Leap Motion SDK (hint: think about the palm), or 6D position data from the Intel depth tracker. Bonus: panning with clutching.
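One way to start on the panning/zooming exercise (including the clutching bonus): map the palm's frame-to-frame displacement to camera deltas, and treat hand presence as the clutch. The scale factors, axis mapping, and the moveCamera callback below are all hypothetical placeholders, not part of the exercise materials:

```javascript
// Sketch of palm-driven pan/zoom with clutching (illustrative only).
// moveCamera is a placeholder for whatever updates the Google Earth view.
function makePanZoomController(moveCamera, panScale = 0.01, zoomScale = 0.05) {
  let lastPalm = null; // null = clutch released (no hand in view)
  return function onFrame(frame) {
    const hand = frame.hands && frame.hands[0];
    if (!hand) { lastPalm = null; return; }      // release the clutch
    const palm = hand.palmPosition;              // [x, y, z] in mm
    if (lastPalm) {
      moveCamera({
        panLon: (palm[0] - lastPalm[0]) * panScale,  // left/right -> longitude
        panLat: (palm[2] - lastPalm[2]) * panScale,  // toward/away -> latitude
        zoom:   (palm[1] - lastPalm[1]) * zoomScale, // up/down -> altitude
      });
    }
    lastPalm = palm;
  };
}
```

Because lastPalm resets whenever the hand leaves the view, re-entering at a different position does not cause a camera jump; that reset is the clutch.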

Exercises. Implement panning and zooming using one of the following: the data structure from the Leap Motion SDK, or 3D position data from the Intel depth tracker.

WRAP-UP

Conclusion. Feedback. Contact: Srinath, ssridhar@mpi-inf.mpg.de; Antti, oantti@mpi-inf.mpg.de.