3D Interaction using Hand Motion Tracking Srinath Sridhar Antti Oulasvirta EIT ICT Labs Smart Spaces Summer School 05-June-2013
Speaker Srinath Sridhar PhD Student Supervised by Prof. Dr. Christian Theobalt and Dr. Antti Oulasvirta Max Planck Institut für Informatik, Saarbrücken, Germany www.mpi-inf.mpg.de/~ssridhar/ Antti Oulasvirta Senior Researcher Max Planck Institut für Informatik, Saarbrücken, Germany www.mpi-inf.mpg.de/~oantti/ 2
Overview of Today's Session We will have four parts. Part I: You are here! Part II: Introduction to 3D Interaction using Hand Motion Tracking Part III: Introduction to the Leap Motion Sensor and SDK Part IV: Hands-on Exercises Please feel free to interrupt with questions anytime 3
Requirements for Today's Session Requirements WiFi-enabled laptop Laptop with a WebSocket-compatible web browser (Firefox 6+, Chrome 14+, IE 10) Text editor and basic Java/C++ skills Google Earth for Windows or Mac Cool 3D interaction ideas Audience poll Requirements Teams 4
Objectives Gain the ability to understand and create 3D interactive interfaces using hand motion tracking: computer vision techniques for hand motion tracking and their relative performance; different sensing devices, with emphasis on the Leap Motion sensor and the Leap SDK. Implement a simple 3D interaction interface for Google Earth 5
Part II INTRODUCTION TO 3D INTERACTION USING HAND MOTION TRACKING 6
Motivation 7
Motivation The Human Hand Joints: 26 degrees-of-freedom. Muscles: fine motor control. Brain: grasping and gestures 8
Motivation Potential HCI Applications 2D/3D UI Interaction Sign Language Recognition Retargeting Musical Instruments Tony Stark / Tom Cruise-esque interface of the future 9
Components of 3D Interaction using Hand Tracking Interaction Design (Part III & IV) 3D Interaction Interface (3D Desktop, Google Earth, etc.) Computer Human Articulated Hand Motion Tracking (Output: Set of points, skeleton, etc.) Computer Vision (Part II) 10
Requirements for Hand Tracking in HCI Interactive: real-time performance and minimal latency. Markerless: no gloves or markers. DoF: capture many degrees-of-freedom or the hand skeleton. Occlusions: robust to partial self-occlusions. Environment: general background and illumination 11
Leap Motion Tracks semantically meaningful parts of the hand (fingertips, palm), each with 6 DoF. Very high accuracy and low latency. Internally uses a depth sensor. No skeleton tracking 12
Efficient model-based 3D tracking of hand articulations using Kinect Oikonomidis et al. (ICCV 2011, CVPR 2012) Captures 26 DoF of the hand using a model composed of geometric primitives Performance - 15 Hz; Latency due to Kinect Limited to range of the Kinect Skin colour-based segmentation of depth data 13
6D Hands: Markerless Hand-Tracking for Computer Aided Design Wang et al. (UIST 2011) Captures 27 DoF of the hand using a skeleton hand model Performance - 17 Hz Skin colour-based segmentation of depth data Used as a control interface for 3D CAD Modelling 14
Hybrid Hand Tracking using RGB and Depth Data MPI Informatik Captures 26 DoF of the hand using a kinematic skeleton model Performance - 17 Hz. 30-60 ms latency Uses colour information from RGB cameras and depth data Multi-view camera setup with 5 RGB and 1 Depth camera Interface for musical expression 15
Hand Tracking Approach Multi-view Image Sequence Feature Extraction Voting Depth Data Normalization Database of Hand Poses Final Pose 16
Comparison of Hand Motion Tracking Systems
System      | Interactive          | No. of DoF              | Accuracy  | Technology       | No. of Views | HCI Application
Leap Motion | 20 fps, low latency  | 0-36+ (no articulation) | High      | Depth            | 1            | Google Earth, 3D UI, etc.
ICS FORTH   | 15 fps, high latency | 26                      | 10 mm     | Depth + RGB      | 1            | Object interaction
Wang et al. | 15 fps               | 27                      | -         | RGB (also depth) | 2 (also 1)   | 3D CAD Modelling
MPI         | 17 fps               | 26                      | 13 mm     | Depth + RGB      | 4-6          | Musical Instrument
ETH Zurich  | 2 fpm                | 26+                     | ~10-15 mm | RGB              | 7-8          | Multiple hands
Intel       | 50 fps               | ~26                     | -         | Depth            | 1            | -
17
Part III INTRODUCTION TO SENSING DEVICES AND THE LEAP MOTION SDK 18
What is the Leap Motion controller? A close-range depth sensor Range < 50 cm Similar to the Microsoft Kinect, Softkinetic DepthSense, etc. Bundled API for tracking Fingertips Hands Tools (any pointy object) USB 2.0/3.0 input Available in June/July for $70 Airspace app store 19
How does it (most likely) work? Possibly time-of-flight combined with stereo. For comparison: structured light (Kinect) vs. the Leap Motion 20
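Whatever the exact sensing technology, both structured light and stereo recover depth by triangulation. A minimal sketch of the stereo case (the focal length, baseline, and disparity values below are made-up illustration numbers, not Leap Motion specifications):

```javascript
// Depth from stereo disparity: z = f * B / d, where f is the focal
// length in pixels, B the baseline between the two cameras, and d the
// disparity (pixel shift of the same point between the two views).
function depthFromDisparity(focalPx, baselineMm, disparityPx) {
  if (disparityPx <= 0) return Infinity; // no shift -> point at (effectively) infinite range
  return (focalPx * baselineMm) / disparityPx;
}

// Example: f = 700 px, baseline = 40 mm, disparity = 56 px -> 500 mm.
console.log(depthFromDisparity(700, 40, 56)); // 500
```

Note how depth resolution degrades with distance: at large range a one-pixel disparity change corresponds to a large depth change, which is one reason close-range sensors like the Leap can be so accurate.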
Functionality Exposed in API Hands Palm center and orientation Fingers Fingertip location Finger length (not exact) Finger pointing direction Tools (any pointy object) Tooltip location Tool length Tool pointing direction https://developer.leapmotion.com/documentation/guide/leap_overview 21
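The hand and finger data above arrives per frame and can be consumed from JavaScript (the language the browser exercises use). A minimal sketch using a hand-written sample frame in the general shape of the Leap frames documented at the URL above: positions are [x, y, z] triples in millimetres, and the specific coordinate values here are made up:

```javascript
// Sample frame in the shape of Leap Motion frame data (values invented).
const frame = {
  hands: [{ id: 4, palmPosition: [12.1, 180.5, -3.2], palmNormal: [0.05, -0.99, 0.08] }],
  pointables: [
    { id: 7, handId: 4, tipPosition: [40.0, 210.3, -15.8], length: 55.0, tool: false },
    { id: 9, handId: 4, tipPosition: [-28.4, 205.1, -10.2], length: 50.2, tool: false }
  ]
};

// Palm centre of the first tracked hand, if any.
function palmCenter(frame) {
  return frame.hands.length > 0 ? frame.hands[0].palmPosition : null;
}

// Fingertip positions belonging to a given hand (tools excluded).
function fingertips(frame, handId) {
  return frame.pointables
    .filter(p => p.handId === handId && !p.tool)
    .map(p => p.tipPosition);
}

console.log(palmCenter(frame));           // [12.1, 180.5, -3.2]
console.log(fingertips(frame, 4).length); // 2
```

Because the tracked points carry no semantics (see the next slide), an application has to decide for itself which pointable is, say, the index finger.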
Pros and Cons of the Leap Motion Jitter-free point tracking High frame rate Low latency Fairly large FOV No skeleton tracking Tracked points have no semantics No access to raw data Depth data RGB data (if available) Single viewpoint 22
Other Depth Sensors 23
BREAK?
Part IV HANDS-ON EXERCISES 25
Information Connect to WiFi SSID: minerva Password: 3dinteraction Please install Google Earth if you have not. Google Earth API basics are enough Visit: 192.168.1.100:8080 You should see this: 26
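In the browser, tracking data can be read over a plain WebSocket. A minimal sketch, assuming the workshop server at the address above streams one JSON frame per message with Leap-style field names (an assumption: adapt the URL and field names to the actual server):

```javascript
// Parse one WebSocket message into a frame object, ignoring
// non-frame messages (e.g. a version/handshake message).
function parseFrame(message) {
  const data = JSON.parse(message);
  return Array.isArray(data.hands) ? data : null;
}

// Connect only in a browser, where a WebSocket implementation exists.
if (typeof window !== "undefined" && typeof WebSocket !== "undefined") {
  const ws = new WebSocket("ws://192.168.1.100:8080"); // workshop server (assumed endpoint)
  ws.onmessage = (event) => {
    const frame = parseFrame(event.data);
    if (frame && frame.hands.length > 0) {
      console.log("palm at", frame.hands[0].palmPosition);
    }
  };
  ws.onerror = () => console.log("connection failed - is the server up?");
}

console.log(parseFrame('{"id":1,"hands":[],"pointables":[]}')); // a valid frame with no hands
```

The `parseFrame` guard matters in practice: servers often send a handshake or status message first, and treating it as a frame crashes the handler.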
Exercises Overview Implement panning and zooming using one of the following. Data structure from the Leap Motion SDK 3D position data from the Intel Depth Tracker Implement flying at the terrain level using one of the following. Data structure from the Leap Motion SDK (Hint: think about the palm) 6D position data from the Intel Depth Tracker Bonus Panning with clutching 27
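For the flying exercise, one option (following the palm hint above) is to treat the palm like an aircraft: sideways roll steers the heading, forward/backward pitch climbs or dives. A minimal sketch of that mapping, assuming the palm normal arrives as an [x, y, z] unit vector as in the Leap frames; the gain constants are made-up values to tune by hand:

```javascript
// Map a palm normal (unit vector; roughly [0, -1, 0] when a flat palm
// faces straight down) to flying controls: roll turns, pitch climbs.
function flyControls(palmNormal) {
  const [nx, ny, nz] = palmNormal;
  return {
    headingDeltaDeg: nx * 5.0,  // tilt sideways -> turn (gain is arbitrary)
    altitudeDeltaM:  nz * 50.0  // tilt forward/back -> dive/climb (gain is arbitrary)
  };
}

// Flat palm facing straight down: no turning, no climbing.
console.log(flyControls([0, -1, 0])); // headingDeltaDeg: 0, altitudeDeltaM: 0
```

Each frame, the two deltas would be added to the current camera heading and altitude; applying them relative to the current view (rather than setting absolute values) is what makes the interaction feel like steering.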
Exercises Implement panning and zooming using one of the following Data structure from the Leap Motion SDK 3D position data from the Intel Depth Tracker 28
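One way to approach this exercise (a sketch, not the reference solution): map palm displacement in x/z to panning and palm height (y) to zoom, then push the result into the view through the Google Earth plugin's `ge.getView()` / `KmlLookAt`. The mapping itself is a pure function; the gains and the 200 mm rest height are made-up values:

```javascript
// Map a palm position ([x, y, z] in mm above the sensor) plus the
// current view to a new view: sideways/forward motion pans, raising
// or lowering the hand zooms (exponentially, so it feels uniform).
function panZoom(palmPosition, lookAt) {
  const [x, y, z] = palmPosition;
  return {
    latitude:  lookAt.latitude  - z * 0.001,               // push forward -> pan north
    longitude: lookAt.longitude + x * 0.001,               // move right -> pan east
    range:     lookAt.range * Math.pow(2, (y - 200) / 200) // raise hand -> zoom out
  };
}

// In the Google Earth plugin this would be applied each frame roughly as:
//   const la = ge.getView().copyAsLookAt(ge.ALTITUDE_RELATIVE_TO_GROUND);
//   const v = panZoom(frame.hands[0].palmPosition,
//                     { latitude: la.getLatitude(),
//                       longitude: la.getLongitude(),
//                       range: la.getRange() });
//   la.setLatitude(v.latitude); la.setLongitude(v.longitude); la.setRange(v.range);
//   ge.getView().setAbstractView(la);

console.log(panZoom([100, 200, 0], { latitude: 49.25, longitude: 7.04, range: 1000 }));
```

For the clutching bonus, the same mapping applies only while some engagement condition holds (for example, a closed-then-opened hand or a second pointable present); otherwise palm motion is ignored, letting the user reposition the hand without moving the view.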
WRAP-UP 29
Conclusion Feedback Contact Srinath: ssridhar@mpi-inf.mpg.de Antti: oantti@mpi-inf.mpg.de 30