
Composite Body-Tracking: Device Abstraction Layer with Data Fusion for Gesture Recognition in Virtual Reality Applications

Presenter: Luis Alejandro Rojas Vargas
Supervisor: M. Sc. Florian Weidner
Responsible professor: Prof. Dr. Wolfgang Broll

Virtual Worlds and Digital Games Group, 2016-12-13

Structure
1. Motivation
2. Objectives
3. The ALVR System
   3.1. Body Structure
   3.2. Abstraction Layer
   3.3. Data Fusion
   3.4. Gesture Recognition
   3.5. Interface to UE4
   3.6. Prototype
4. Results
5. Evaluation
6. Summary

1. Motivation
Heterogeneous devices [LM] [MK] [OPT] [HTC] [OCR]:
- Different types of information (data model)
- Different interfaces
- No unified gesture system

2. Objectives
- Design and implementation of a Device Abstraction Layer with data fusion of body-tracking information (position and orientation)
- Data model (body structure)
- Gesture recognition system for Virtual Reality applications
- Implementation in Unreal Engine (UE)
- Prototype

3. The ALVR System
The Abstraction Layer for Virtual Reality (ALVR) consists of:
- C++ library (DLL)
- Device Server (remote devices)
- Basic application (body-model visualization and gesture configuration)
- Plugin for Unreal Engine

3. The ALVR System
Supported devices:
- Microsoft Kinect v2: markerless system for body-tracking information.
- Leap Motion: markerless device aimed at detecting hands and fingers.
- OptiTrack: optical motion-capture (MoCap) system with markers for body tracking through the rigid-body method (reference sensor).

System overview: the ALVR library loads a configuration file and contains the Device Abstraction Layer (Kinect, Leap Motion, OptiTrack, ...), a coordinate-transformation module (T) per device, the Data Fusion stage, and the Gesture Recognition stage with its gestures DB; the fused output feeds Unreal Engine 4. Remote sensors (sensor n ... sensor m) deliver body-tracking information over the network through the Device Server. Configuration and gesture information are exchanged through files.

3.1 Body Structure (Data Model)
Body joints: Head, Neck, Spine Shoulder, Spine Mid, Spine Base (origin); Shoulder, Elbow, Wrist and Hand (left and right); Hip, Knee, Ankle and Foot (left and right). Per hand: Thumb, Index, Middle, Ring and Pinky, each with Metacarpal, Proximal, Intermediate and Distal joints.
- 65 joints (25 body + 20 per hand)
- Linear and hierarchical access
- Unified output body structure
- Extensible structure (joints and body parts)
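A data model with both linear and hierarchical access could be sketched roughly as below; the type names, fields, and the three-joint example chain are illustrative assumptions, not the actual ALVR layout:

```cpp
#include <string>
#include <vector>

// Illustrative sketch of a body structure with linear and hierarchical
// access. Names and indices are assumptions, not the ALVR types.
struct Joint {
    std::string name;
    int parent;                 // index of the parent joint, -1 for the origin
    float position[3];          // x, y, z in the common coordinate system
    float orientation[4];       // quaternion x, y, z, w
    bool tracked;               // false when no device delivers this joint
};

struct BodyStructure {
    std::vector<Joint> joints;  // linear access: joints[i]

    // Hierarchical access: children of a joint via the parent links.
    std::vector<int> childrenOf(int index) const {
        std::vector<int> result;
        for (int i = 0; i < (int)joints.size(); ++i)
            if (joints[i].parent == index) result.push_back(i);
        return result;
    }
};

// A minimal spine chain: SpineBase (origin) -> SpineMid -> SpineShoulder.
BodyStructure makeMinimalBody() {
    BodyStructure body;
    body.joints.push_back({"SpineBase", -1, {0, 0, 0}, {0, 0, 0, 1}, true});
    body.joints.push_back({"SpineMid", 0, {0, 0.3f, 0}, {0, 0, 0, 1}, true});
    body.joints.push_back({"SpineShoulder", 1, {0, 0.6f, 0}, {0, 0, 0, 1}, true});
    return body;
}
```

Keeping a flat vector plus parent indices gives the linear access for iteration and the hierarchical access for skeleton traversal, and new joints or body parts can be appended without changing existing indices (the extensibility point above).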

3.2 Abstraction Layer
Local devices (Device A) connect directly to the Device Abstraction Layer; remote devices (Device B) send their tracking data through the Device Server over UDP/TCP to a proxy Device Client. The data of each device is converted into the common body structure.
- Device Tracking: one interface per device
- Synchronization between Device Server and Device Client
- Output data: body structure
- Device-tracking modes: manual, static, dynamic
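The per-device interface behind which local and remote devices look alike could be sketched as follows; all class and method names here are hypothetical, the slides only describe the concept:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of a device-tracking interface: each device (local
// SDK or remote proxy) implements the same polling contract.
struct TrackedJoint {
    int id;            // index into the common body structure
    float pos[3];      // position in device coordinates
    bool valid;        // whether the device currently tracks this joint
};

class DeviceTracking {
public:
    virtual ~DeviceTracking() = default;
    virtual std::string name() const = 0;
    virtual std::vector<TrackedJoint> poll() = 0;  // device-specific read
};

// A local device would read its SDK directly; a proxy Device Client would
// instead receive the same data from the Device Server over UDP/TCP.
class FakeKinect : public DeviceTracking {
public:
    std::string name() const override { return "KinectV2"; }
    std::vector<TrackedJoint> poll() override {
        return { {0, {0.f, 1.f, 2.f}, true} };  // canned sample for the sketch
    }
};
```

The abstraction layer then only ever sees `DeviceTracking*`, so adding a device (one of the future-work goals) means adding one implementation of this interface.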

3.2 Abstraction Layer
Device Server: per device it runs a hardware thread, a command thread (TCP), and a data-update thread (UDP) serving the connected Device Clients.
- Support for remote devices
- Multiple connections to one device
- Different update rates per client
- Only one device connected to each server
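One way the hardware thread can serve several clients at different update rates is a single shared "latest sample" that the hardware thread overwrites and each client reads on its own schedule. This is a minimal sketch of that pattern under assumed names, not the ALVR Device Server API:

```cpp
#include <cstdint>
#include <mutex>

// Sketch: the hardware thread overwrites one shared sample; each client
// thread reads it at its own rate. The sequence number lets a client
// detect whether it received a new or a stale sample.
class LatestSample {
    std::mutex m;
    uint64_t seq = 0;                 // incremented on every hardware update
    float pos[3] = {0, 0, 0};
public:
    void write(float x, float y, float z) {      // called by hardware thread
        std::lock_guard<std::mutex> g(m);
        ++seq;
        pos[0] = x; pos[1] = y; pos[2] = z;
    }
    uint64_t read(float out[3]) {                // called by any client thread
        std::lock_guard<std::mutex> g(m);
        out[0] = pos[0]; out[1] = pos[1]; out[2] = pos[2];
        return seq;
    }
};
```

A slow client simply calls `read` less often and skips intermediate samples, while a fast client compares sequence numbers to notice it is re-reading old data; neither blocks the hardware thread for long.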

3.3 Data Fusion
The data of each device (Device 1 ... Device 3) passes through a filter (F) and a coordinate transformation (T) in the Device Abstraction Layer before the data merge produces the output body structure.
- F: a median filter that adapts the information by removing spurious samples (impulsive noise).
- T: transforms the tracking data into a common coordinate system.
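A sliding-window median filter over one coordinate signal could look like this; the window size and API are assumptions, since the slides only name the filter type:

```cpp
#include <algorithm>
#include <deque>
#include <vector>

// Sliding-window median filter for one coordinate signal (x, y, or z).
// Spikes (impulsive noise) never become the median of the window, so
// they are suppressed without smearing the signal like an average would.
class MedianFilter {
    std::deque<float> window;
    size_t size;
public:
    explicit MedianFilter(size_t n) : size(n) {}
    float push(float x) {
        window.push_back(x);
        if (window.size() > size) window.pop_front();
        std::vector<float> sorted(window.begin(), window.end());
        std::sort(sorted.begin(), sorted.end());
        return sorted[sorted.size() / 2];   // middle element of the window
    }
};
```

With a window of 3, a single tracking spike of e.g. 100 between samples near 1 is replaced by the median 1, which is exactly the "remove spurious information" behavior described above.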

3.3 Data Fusion
Data model for the used devices (Kinect, Leap Motion, OptiTrack): for each device, every joint of the body structure is classified as complementary, redundant, or not tracked.

3.3 Data Fusion
Data Merge
Fusion of redundant information: the merged joint is the average over the devices that track it,

    Joint = (1 / N_D) * Sum_{i=0}^{N_D} Joint_i,

where N_D = number of devices.
Fusion of complementary information: the joint position is rotated and translated into the common system,

    RotatedJoint    = Q_J * JointPosition
    TranslatedJoint = T_J + RotatedJoint
    ResultingJoint  = T_J + Q_J * JointPosition,

where T_J is the translation vector and Q_J the orientation quaternion.
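The two merge rules can be sketched with plain structs; `Vec3` and `Quat` and the function names are assumptions, and the quaternion rotation is the standard unit-quaternion formula rather than the ALVR code:

```cpp
#include <vector>

struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };   // orientation quaternion Q_J (unit length)

// Redundant information: average the same joint over all devices that
// track it, i.e. (1 / N_D) * sum of Joint_i.
Vec3 mergeRedundant(const std::vector<Vec3>& samples) {
    Vec3 m{0, 0, 0};
    for (const Vec3& s : samples) { m.x += s.x; m.y += s.y; m.z += s.z; }
    float n = (float)samples.size();
    return {m.x / n, m.y / n, m.z / n};
}

// Rotate v by unit quaternion q: v' = 2(u.v)u + (s^2 - u.u)v + 2s(u x v).
Vec3 rotate(const Quat& q, const Vec3& v) {
    Vec3 u{q.x, q.y, q.z};
    float s = q.w;
    float cx = u.y * v.z - u.z * v.y;               // u x v
    float cy = u.z * v.x - u.x * v.z;
    float cz = u.x * v.y - u.y * v.x;
    float dot = u.x * v.x + u.y * v.y + u.z * v.z;  // u . v
    float uu  = u.x * u.x + u.y * u.y + u.z * u.z;  // u . u
    return { 2 * dot * u.x + (s * s - uu) * v.x + 2 * s * cx,
             2 * dot * u.y + (s * s - uu) * v.y + 2 * s * cy,
             2 * dot * u.z + (s * s - uu) * v.z + 2 * s * cz };
}

// Complementary information: ResultingJoint = T_J + Q_J * JointPosition.
Vec3 mergeComplementary(const Vec3& tJ, const Quat& qJ, const Vec3& p) {
    Vec3 r = rotate(qJ, p);
    return {tJ.x + r.x, tJ.y + r.y, tJ.z + r.z};
}
```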

3.4 Gesture Recognition
Pipeline: tracking data from the body structure -> selector picks the gesture joints (X, Y, Z data per joint) -> coordinate transformation (T) relative to a reference joint -> DTW comparison against the gestures DB -> recognized gesture X.
- Real-time comparison of the tracking information with the gestures DB
- DTW algorithm applied to the three signals (X, Y, Z)
- Configurable 3D spatial gestures
- Variable reference joint
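The core of this pipeline is the classic dynamic-time-warping distance between two 1-D signals, run once per axis. This is the textbook DP formulation as a sketch, not the ALVR implementation:

```cpp
#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

// Dynamic time warping distance between two 1-D signals (one axis of a
// gesture trajectory). Smaller distance = more similar shapes, even when
// the gesture is performed faster or slower than the stored template.
float dtw(const std::vector<float>& a, const std::vector<float>& b) {
    size_t n = a.size(), m = b.size();
    const float INF = std::numeric_limits<float>::infinity();
    std::vector<std::vector<float>> d(n + 1, std::vector<float>(m + 1, INF));
    d[0][0] = 0.f;
    for (size_t i = 1; i <= n; ++i) {
        for (size_t j = 1; j <= m; ++j) {
            float cost = std::fabs(a[i - 1] - b[j - 1]);
            // Extend the cheapest of: match, insertion, deletion.
            d[i][j] = cost + std::min({d[i - 1][j], d[i][j - 1], d[i - 1][j - 1]});
        }
    }
    return d[n][m];
}
```

Per the slide, this comparison would run for X, Y, and Z of each gesture joint; a plausible decision rule (an assumption here) is to pick the DB gesture with the smallest combined distance, provided it falls below a threshold.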

3.5 Interface to UE4
The ALVR plug-in modules connect the ALVR library (and thus the devices) to the Unreal Engine core of a VR application. The interface provides:
- Tracking information about the user's body joints
- The list of the used devices
- Position and orientation of the used devices in the work area
- Information about the performed gestures

3.6 Prototype
- User interaction in a closed work area; visualization through a screen
- The aim is to hit three virtual targets with virtual objects
- Floating menu
- Extra gestures from Leap Motion
- 3 preconfigured gestures to create and control virtual objects


4. Results
[Video demonstration]

5. Evaluation - Detection accuracy, Test 1
(Gestures performed at an angle α to the user direction; axes +X, +Y, +Z span the work area.)

New user:
Gesture     Samples  False  Hit  Accuracy
Swipe Left  20       2      18   90%
Up          20       1      19   95%
Push        20       3      17   85%

Expert user:
Gesture     Samples  False  Hit  Accuracy
Swipe Left  20       0      20   100%
Up          20       1      19   95%
Push        20       2      18   90%

5. Evaluation - Detection accuracy, Test 2
(Push and Up gestures performed at an angle α to the line of sight; axes -X, +Y, +Z span the work area.)

New user:
Gesture     Samples  False  Hit  Accuracy
Swipe Left  20       3      17   85%
Up          20       8      12   60%
Push        20       4      16   80%

Expert user:
Gesture     Samples  False  Hit  Accuracy
Swipe Left  20       2      18   90%
Up          20       4      16   80%
Push        20       2      18   90%

5. Evaluation - Interference Problem
- Interference because the optical sensors use similar wavelengths
- Different logic for the recognition of elements (contrast vs. intensity)

5. Evaluation - Interference Problem
Possible countermeasures:
- Change the position of the markers
- Physical (optical) filter
- Digital filter in the firmware

6. Summary and Future Work
Summary:
- Design and implementation of a Device Abstraction Layer
- Data fusion of body-tracking information
- Design of a data model for body-tracking information
- Device Server for distributed devices
- Implementation of a gesture recognition system
- Plug-in for Unreal Engine 4
- Prototype in Unreal Engine 4

6. Summary and Future Work
Future work:
- Increase the number of supported devices
- Extend the system to multiple users for collaborative Virtual Reality
- Improve the tracking-device module to support mixed reality
- Prediction system to assist the data fusion process
- Dynamic window-size method for the gesture recognition

Bibliography
[LM] https://developer.leapmotion.com/documentation/cpp/devguide/leap_overview.html, accessed 24.11.2016
[MK] http://www.windowscentral.com/kinect-windows-v2-sensor-sales-end-developers-can-use-xbox-one-version, accessed 24.11.2016
[HTC] http://arstechnica.com/gaming/2016/10/best-vr-headset-2016-psvr-rift-vive/, accessed 24.11.2016
[OCR] https://www3.oculus.com/en-us/blog/oculus-rift-pre-orders-now-open-first-shipments-march-28/, accessed 24.11.2016
[OPT] http://optitrack.com/products/prime-13/, accessed 24.11.2016
[DEVAL] J. Ohlenburg, W. Broll, and I. Lindt, "DEVAL: a device abstraction layer for VR/AR," Proceedings of the 4th International Conference on Universal Access in Human-Computer Interaction (UAHCI'07), pp. 497-506, 2007.
D. Y. Kwon and M. Gross, "A Framework for 3D Spatial Gesture Design and Modeling Using a Wearable Input Device," Proceedings of the 11th IEEE International Symposium on Wearable Computers (Boston, MA, USA), pp. 95-101, IEEE Press, 2007.

Thanks for your attention