Composite Body-Tracking: Device Abstraction Layer with Data Fusion for Gesture Recognition in Virtual Reality Applications
Speaker: Luis Alejandro Rojas Vargas
Supervisor: M. Sc. Florian Weidner
Responsible Professor: Prof. Dr. Wolfgang Broll
Luis Alejandro Rojas Vargas, Virtual Worlds and Digital Games Group, 2016-12-13
Structure
1. Motivation
2. Objective
3. The ALVR System
   3.1 Body Structure
   3.2 Abstraction Layer
   3.3 Data Fusion
   3.4 Gesture Recognition
   3.5 Interface to UE4
   3.6 Prototype
4. Results
5. Evaluation
6. Summary
1. Motivation
Heterogeneous devices [LM] [MK] [OPT] [HTC] [OCR]:
- Different types of information (data model)
- Different interfaces
- No unified gesture system
2. Objectives
Design and implementation of a device abstraction layer with data fusion of body-tracking information (position and orientation):
- Data model (body structure)
- Gesture recognition system
- For Virtual Reality applications
- Implementation in Unreal Engine (UE)
- Prototype
3. The ALVR System
The Abstraction Layer for Virtual Reality (ALVR) consists of:
- A C++ library (DLL)
- A Device Server (for remote devices)
- A basic application (body-model visualization and gesture configuration)
- A plugin for Unreal Engine
3. The ALVR System
Supported devices:
- Microsoft Kinect V2: markerless system for body-tracking information.
- Leap Motion: markerless device aimed at detecting hands and fingers.
- Optitrack: optical motion-capture (MoCap) system with markers, tracking the body through the rigid-body method (reference sensor).
ALVR architecture (diagram): devices (Kinect, Leap Motion, Optitrack, ...) feed the Device Abstraction Layer inside the ALVR Library, each through a coordinate-transformation module (T); remote sensors send their body-tracking information over the network through the Device Server. The abstracted data then passes through Data Fusion and Gesture Recognition (backed by a Gestures DB and a configuration file read from disk) and is finally exposed to Unreal Engine 4.
3.1 Body Structure (Data Model)
Skeleton diagram: 25 body joints (Head, Neck, Spine Shoulder, Spine Mid, Spine Base (origin); Shoulder, Elbow, Wrist, and Hand left/right; Hip, Knee, Ankle, and Foot left/right) plus 20 joints per hand (Metacarpal, Proximal, Intermediate, and Distal bones of Thumb, Index, Middle, Ring, and Pinky).
- 65 joints (25 body + 20 per hand)
- Linear and hierarchical access
- Unified output body structure
- Extensible structure (joints and body parts)
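The two access paths can be sketched as follows. This is an illustrative Python sketch with hypothetical names (the actual ALVR library is C++): joints live in one flat list (linear access), and each joint stores the index of its parent (hierarchical access), which also makes the structure extensible with new joints.

```python
# Illustrative sketch of the unified body structure; names are assumptions,
# not the actual ALVR API.
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    parent: int                                 # index of the parent joint, -1 for the root
    position: tuple = (0.0, 0.0, 0.0)           # x, y, z
    orientation: tuple = (1.0, 0.0, 0.0, 0.0)   # quaternion (w, x, y, z)

class BodyStructure:
    def __init__(self):
        self.joints = []       # linear access: one flat list
        self._index = {}       # name -> list index

    def add_joint(self, name, parent=None):
        """Append a joint; new joints/body parts can be added the same way."""
        parent_idx = self._index[parent] if parent is not None else -1
        self.joints.append(Joint(name, parent_idx))
        self._index[name] = len(self.joints) - 1
        return self._index[name]

    def chain_to_root(self, name):
        """Hierarchical access: walk from a joint up to the root."""
        idx = self._index[name]
        chain = []
        while idx != -1:
            chain.append(self.joints[idx].name)
            idx = self.joints[idx].parent
        return chain

body = BodyStructure()
body.add_joint("SpineBase")                    # root (origin)
body.add_joint("SpineMid", "SpineBase")
body.add_joint("SpineShoulder", "SpineMid")
body.add_joint("ShoulderLeft", "SpineShoulder")
body.add_joint("ElbowLeft", "ShoulderLeft")
# chain_to_root("ElbowLeft") walks ElbowLeft -> ... -> SpineBase
```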
3.2 Abstraction Layer
Diagram: Device A delivers its tracking data directly; Device B delivers it through the Device Server over UDP/TCP to a proxy Device Client. The Device Abstraction Layer maps both data streams onto the common body structure.
- Device Tracking: one interface per device
- Synchronization between Device Server and Device Client
- Output data mapped onto the body structure
- Device-tracking modes: manual, static, dynamic
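The core idea of the abstraction layer, every tracker hidden behind one interface and delivering data already keyed by the unified joint names, can be sketched like this (hypothetical Python names; the real ALVR interface is a C++ class):

```python
# Illustrative sketch of the device abstraction idea; class and method names
# are assumptions, not the actual ALVR interface.
from abc import ABC, abstractmethod

class DeviceTracker(ABC):
    """Common interface: whatever the device, the layer above sees only this."""

    @abstractmethod
    def poll(self):
        """Return the latest sample as {joint_name: (x, y, z)} in device coordinates."""

class KinectTracker(DeviceTracker):
    def poll(self):
        # A real implementation would call the Kinect SDK here.
        return {"Head": (0.0, 1.7, 2.0), "HandLeft": (-0.3, 1.1, 1.8)}

class LeapMotionTracker(DeviceTracker):
    def poll(self):
        # A real implementation would call the Leap Motion SDK here.
        return {"HandLeft": (-0.29, 1.12, 1.79), "IndexDistalLeft": (-0.27, 1.15, 1.75)}

def gather(trackers):
    """The abstraction layer iterates over DeviceTracker, never concrete devices."""
    return [t.poll() for t in trackers]

samples = gather([KinectTracker(), LeapMotionTracker()])
```

A proxy Device Client for a remote device would implement the same interface and fill its sample dictionary from the network instead of a local SDK.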
3.2 Abstraction Layer: Device Server
Diagram: the Device Server runs a hardware thread for the attached device, a command thread (TCP), and one data-update thread per connected Device Client (UDP).
- Support for remote devices
- Multiple client connections to one device
- Different update rates per client
- Only one device connected to each server
3.3 Data Fusion
Pipeline: the data of each device passes through a filter (F) and a coordinate transformation (T) before the Data Merge step produces the output body structure.
- F: median filter that adapts the information by removing spurious samples (impulsive noise)
- T: transforms the tracking data into a common coordinate system
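The F stage can be sketched as a sliding-window median filter applied per coordinate: the median discards isolated outliers (impulsive noise) without smearing the signal the way a mean would. The window size of 5 is an assumption for illustration.

```python
# Illustrative sketch of the F stage: a 1-D sliding-window median filter.
from statistics import median

def median_filter(samples, window=5):
    """Median-filter a sequence of scalars; edges use a shrunken window."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        out.append(median(samples[lo:hi]))
    return out

# A single spurious spike in otherwise smooth tracking data is removed:
noisy = [1.0, 1.1, 9.9, 1.2, 1.3]
smoothed = median_filter(noisy)
```

For joint positions the same filter is applied independently to the X, Y, and Z series.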
3.3 Data Fusion
Data model for the used devices: each joint is classified per device (Kinect, Leap Motion, Optitrack) as complementary, redundant, or not tracked.
3.3 Data Fusion: Data Merge
Fusion of redundant information (averaging over the devices that track the joint):

  Joint = (1 / N_D) * Σ_{i=0}^{N_D - 1} Joint_i,  where N_D = number of devices

Fusion of complementary information:

  Rotated_Joint    = Q_J * Joint
  Translated_Joint = T_J + Rotated_Joint
  Resulting_Joint  = T_J + Q_J * Joint

where T_J is the translation vector and Q_J the orientation quaternion.
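The complementary fusion, rotating a joint by the quaternion Q_J and shifting it by the translation vector T_J, can be sketched as follows (illustrative Python with assumed names; the rotation uses the standard quaternion-vector formula v' = v + w·t + q_vec × t with t = 2·(q_vec × v)):

```python
# Illustrative sketch of Resulting_Joint = T_J + Q_J * Joint.
import math

def quat_rotate(q, v):
    """Rotate vector v = (x, y, z) by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * (q_vec x v)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    # v' = v + w*t + q_vec x t
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def fuse_complementary(T_J, Q_J, joint):
    """Resulting_Joint = T_J + Q_J * Joint."""
    rotated = quat_rotate(Q_J, joint)
    return tuple(t + c for t, c in zip(T_J, rotated))

# Example: rotate 90 degrees about the Y axis, then translate by (1, 0, 0).
q = (math.cos(math.pi / 4), 0.0, math.sin(math.pi / 4), 0.0)
result = fuse_complementary((1.0, 0.0, 0.0), q, (1.0, 0.0, 0.0))
# (1, 0, 0) rotates to (0, 0, -1), so result is approximately (1, 0, -1)
```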
3.4 Gesture Recognition
Pipeline: the X, Y, Z data of the gesture joints, taken relative to a reference joint from the body structure, is coordinate-transformed (T) and compared by DTW against the gestures selected from the Gestures DB.
- Real-time comparison of the tracking information against the Gestures DB
- DTW algorithm applied to the three signals (X, Y, Z)
- Configurable 3D spatial gestures
- Variable reference joint
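The matching step can be sketched with classic dynamic time warping (DTW): a recorded gesture template and the live tracking window are aligned sample by sample, with the X, Y, Z signals folded into a single 3-D point distance. This is an illustrative sketch, not the ALVR implementation; any acceptance threshold on the returned distance is a configuration choice.

```python
# Illustrative DTW over sequences of (x, y, z) samples.
def dtw(seq_a, seq_b):
    """Return the DTW distance between two 3-D trajectories."""
    inf = float("inf")
    n, m = len(seq_a), len(seq_b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            ax, ay, az = seq_a[i - 1]
            bx, by, bz = seq_b[j - 1]
            d = ((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j],      # skip a template sample
                                 cost[i][j - 1],      # skip a live sample
                                 cost[i - 1][j - 1])  # match both
    return cost[n][m]

# A time-warped copy of a gesture still scores (near) zero,
# while a different trajectory scores high:
template = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (3, 0, 0)]
warped   = [(0, 0, 0), (1, 0, 0), (1, 0, 0), (2, 0, 0), (3, 0, 0)]
```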
3.5 Interface to UE4
The ALVR plug-in modules connect the Unreal Engine core, and through it the VR application, to the ALVR Library and the devices. The interface provides:
- Tracking information for the user's body joints
- The list of the used devices
- Position and orientation of the used devices in the work area
- Information about the performed gestures
3.6 Prototype
- User interaction in a closed work area, visualized on a screen
- The aim is to hit three virtual targets with virtual objects
- Floating menu
- Extra gestures from the Leap Motion
- Three preconfigured gestures to create and control virtual objects
3.6 Prototype
4. Results: VIDEO
5. Evaluation - Detection Accuracy, Test 1
Setup diagram: user inside the work area (+X, +Y, +Z axes), gesture performed at an angle α to the user direction.

New user:
  Gesture     Samples  False  Hit  Accuracy
  Swipe Left  20       2      18   90%
  Up          20       1      19   95%
  Push        20       3      17   85%

Expert user:
  Gesture     Samples  False  Hit  Accuracy
  Swipe Left  20       0      20   100%
  Up          20       1      19   95%
  Push        20       2      18   90%
5. Evaluation - Detection Accuracy, Test 2
Setup diagrams: user rotated by an angle α away from the line of sight (+Y) while performing the Push and Up gestures in the work area (-X, +Z).

New user:
  Gesture     Samples  False  Hit  Accuracy
  Swipe Left  20       3      17   85%
  Up          20       8      12   60%
  Push        20       4      16   80%

Expert user:
  Gesture     Samples  False  Hit  Accuracy
  Swipe Left  20       2      18   90%
  Up          20       4      16   80%
  Push        20       2      18   90%
5. Evaluation - Interference Problem
- Interference because the optical sensors use similar wavelengths
- Different logic for recognizing elements (contrast vs. intensity)
5. Evaluation - Interference Problem
Possible countermeasures:
- Change the position of the markers
- Physical (optical) filter
- Digital filter in the firmware
6. Summary and Future Work
- Design and implementation of a Device Abstraction Layer
- Data fusion of body-tracking information
- Design of a data model for body-tracking information
- Device Server for distributed devices
- Implementation of a gesture recognition system
- Plug-in for Unreal Engine 4
- Prototype in Unreal Engine 4
6. Summary and Future Work
- Increase the number of supported devices
- Extend the system to multiple users for collaborative Virtual Reality
- Improve the tracking-device module to support mixed reality
- Prediction system to assist the data-fusion process
- Implementation of a dynamic window-size method for gesture recognition
Bibliography
[LM] https://developer.leapmotion.com/documentation/cpp/devguide/leap_overview.html, accessed 2016-11-24
[MK] http://www.windowscentral.com/kinect-windows-v2-sensor-sales-end-developers-can-use-xbox-one-version, accessed 2016-11-24
[HTC] http://arstechnica.com/gaming/2016/10/best-vr-headset-2016-psvr-rift-vive/, accessed 2016-11-24
[OCR] https://www3.oculus.com/en-us/blog/oculus-rift-pre-orders-now-open-first-shipments-march-28/, accessed 2016-11-24
[OPT] http://optitrack.com/products/prime-13/, accessed 2016-11-24
[DEVAL] J. Ohlenburg, W. Broll, and I. Lindt, "DEVAL: a device abstraction layer for VR/AR," Proceedings of the 4th International Conference on Universal Access in Human-Computer Interaction (UAHCI '07), pp. 497-506, 2007.
D. Y. Kwon and M. Gross, "A Framework for 3D Spatial Gesture Design and Modeling Using a Wearable Input Device," Proceedings of the 11th IEEE International Symposium on Wearable Computers, Boston, MA, USA, pp. 95-101, IEEE Press, 2007.
Thanks for your attention