A Study on Motion-Based UI for Running Games with Kinect

Jimin Kim, Pyeong Oh, Hanho Lee, Sun-Jeong Kim*
Interaction Design, Graduate School, Hallym University
1 Hallymdaehak-gil, Chuncheon-si, Gangwon-do 200-702, Korea
{Rudengkim, vudhrh}@gmail.com, lee_hanho@naver.com, sunkim@hallym.ac.kr

ABSTRACT

This study examines the efficiency of human motion-based UI for video games using the motion capture system Kinect. We conducted an investigation in which participants played a running game, developed and designed with two kinds of UI, using the Kinect sensor. One UI consists of more intuitive and familiar motions such as turning and jumping. The other UI consists of arm motions such as raising the hands. As a result, the UI with arm motions was easier for users to master and yielded higher success rates than the other UI. We therefore conclude that when a game is developed with Kinect and its UI is configured with motion recognition, motions performed with the arms, rather than with other parts of the body, better help players enhance their playing skills and immerse themselves in the game.

KEYWORDS

NUI (Natural User Interface), Kinect, Video Games, Unity3D, Motion Recognition

1 INTRODUCTION

With the development of computers, the input devices for human-computer interaction have diversified into keyboard, mouse, touch pad, speech recognition, and so on. Nowadays there is much active research on NUI (Natural User Interface), especially using Kinect, a motion sensor that Microsoft brought to market at a low price in 2011. As a gesture recognition sensor, Kinect shows potential as a new-generation interface that could replace the mouse and keyboard.

Some studies show the potential of using motion-based interaction for learning. For example, Touching Notes presents how gesture-based interfaces can stimulate and motivate children to learn the basics of music notation [1]. To motivate students and enhance effectiveness, the ARCS model of motivational design was considered when developing a Kinect sensor-assisted game-based learning system [2]. The ARCS model consists of four major steps for learners to become and remain motivated in the learning process: Attention, Relevance, Confidence, and Satisfaction. In addition to learning systems, Kinect is also utilized in the area of rehabilitation. An interactive game-based rehabilitation tool for balance training of adults with neurological injury was developed [3]; instead of Wii Fit, Kinect provided markerless full-body tracking on a conventional PC. The Kinect-based rehabilitation game JewelMine consists of a set of static balance training exercises which encourage players to reach out of their base of support [4].

Most previous studies dealt with the accuracy of recognizing an implemented gesture, but the efficiency and convenience of the UI itself were not considered. Our research motivation is to search for more natural and efficient motions that make it easy for users to play games with Kinect. In this study we focus on analyzing motion-based UI for running games with Kinect.

2 MOTION-BASED UI WITH KINECT

We implemented a running game using the Unity3D game engine (ver. 4.6.0) and the Kinect SDK (ver. 1.8). In common running games, a main character is usually running during play. To avoid obstacles, players choose to jump over them or turn left or right.

For the input of jumping and turning, we implemented two kinds of motion-based UIs and then examined which one is more natural and efficient for users to play.

2.1 Implementation of a Running Game

In our game, the road where the main character runs is randomly generated using prefabs. We display ten prefabs at each frame, and among them we place five obstacles in the scene. We prepared five types of obstacles, as shown in Figure 1.

Figure 1. Five obstacles are used in our running game.

Figure 2. Screen shots of our running game.
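The paper does not list the generation code itself; the following is a minimal Unity C# sketch of how such prefab-based road generation with obstacle placement could look. All class, field, and prefab names here are our own assumptions and do not come from the authors' project.

```csharp
using UnityEngine;

// Hypothetical sketch of prefab-based road generation as described above.
// Class, field, and prefab names are assumptions, not the authors' code.
public class RoadGenerator : MonoBehaviour
{
    public GameObject roadPrefab;          // one straight road segment
    public GameObject[] obstaclePrefabs;   // the five obstacle types
    public int segmentsOnScreen = 10;      // "ten prefabs at each frame"
    public int obstaclesOnScreen = 5;      // five of them carry an obstacle
    public float segmentLength = 10f;

    private float nextZ = 0f;

    void Start()
    {
        // Spawn the initial stretch of road; in a full game, segments would be
        // recycled ahead of the runner as the character advances.
        for (int i = 0; i < segmentsOnScreen; i++)
            SpawnSegment(withObstacle: i < obstaclesOnScreen);
    }

    // Instantiate one road segment and, optionally, a random obstacle on it.
    void SpawnSegment(bool withObstacle)
    {
        Vector3 pos = new Vector3(0f, 0f, nextZ);
        Instantiate(roadPrefab, pos, Quaternion.identity);

        if (withObstacle && obstaclePrefabs.Length > 0)
        {
            GameObject prefab = obstaclePrefabs[Random.Range(0, obstaclePrefabs.Length)];
            Instantiate(prefab, pos + Vector3.forward * (segmentLength * 0.5f),
                        Quaternion.identity);
        }
        nextZ += segmentLength;
    }
}
```

A script like this would be attached to an empty scene object, with the road and obstacle prefabs assigned in the Unity Inspector.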

Figure 2 shows several screen shots of our running game. The main character keeps running, and the user chooses to jump or to turn left or right whenever she meets an obstacle. If she fails to avoid an obstacle or runs off the road, the game is over. The game score is the distance run.

To obtain human motion data from the Kinect motion sensor in a Unity3D application, we used the Kinect Wrapper Package for Unity3D [5], which is provided by the Entertainment Technology Center (ETC) at Carnegie Mellon University. After importing this package, we could use its assets for motion tracking. To obtain the position data of joints, we modified the Update() function in the script KinectPointController.cs, which is attached to the prefab KinectPointMan. Kinect tracks the skeleton, and a tracked skeleton provides information about the positions of twenty joints of the user's body (Figure 3).

Figure 3. Twenty joint names which Kinect tracks [6].

To recognize a gesture, we compute a joint angle θ using the following equation:

θ = acos(u · v)   (1)

where the unit vector u is (X1 − Cx, Y1 − Cy) / |(X1 − Cx, Y1 − Cy)|, the unit vector v is (X2 − Cx, Y2 − Cy) / |(X2 − Cx, Y2 − Cy)|, and (Cx, Cy), (X1, Y1), and (X2, Y2) are the coordinates of the joint positions (Figure 4).

Figure 4. A joint angle θ is defined by three joints.

In fact, the position of a joint is a 3D coordinate; however, we carried out the calculation in 2D for speed. For turning gestures, the joint positions were projected onto the XY plane and the joint angle was computed in 2D. For the jumping gesture, the joint positions were projected onto the YZ plane and the joint angle was likewise computed in 2D.

2.2 Design of Motion-Based UIs

For the experiments we designed two kinds of motion-based UIs. One UI was designed with more intuitive and familiar motions, as shown in Figure 5: a gesture that makes the game character turn right and a gesture that makes it jump.

Figure 5. Motions for turning and jumping.
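As a concrete illustration, here is a minimal C# sketch of the joint-angle computation of Eq. (1), including the 2D projections described above. The class and method names are ours (the paper only states that Update() in KinectPointController.cs was modified), so treat this as an assumed reconstruction rather than the authors' code.

```csharp
using UnityEngine;

// Sketch of the joint-angle computation of Eq. (1). Joint positions would come
// from the tracked skeleton; here they are plain Vector3/Vector2 arguments.
public static class JointAngles
{
    // Project a 3D joint position onto the XY plane (used for turning gestures).
    public static Vector2 ProjectXY(Vector3 p) { return new Vector2(p.x, p.y); }

    // Project a 3D joint position onto the YZ plane (used for the jump gesture).
    public static Vector2 ProjectYZ(Vector3 p) { return new Vector2(p.z, p.y); }

    // Angle (degrees) at the center joint C formed by joints P1 and P2,
    // i.e. theta = acos(u . v) with u, v the unit vectors C->P1 and C->P2.
    public static float AngleAt(Vector2 c, Vector2 p1, Vector2 p2)
    {
        Vector2 u = (p1 - c).normalized;
        Vector2 v = (p2 - c).normalized;
        float dot = Mathf.Clamp(Vector2.Dot(u, v), -1f, 1f); // guard acos domain
        return Mathf.Acos(dot) * Mathf.Rad2Deg;
    }
}
```

For example, the knee angle used for the jump gesture would be computed as AngleAt(ProjectYZ(knee), ProjectYZ(hip), ProjectYZ(ankle)), with the three joint positions taken from the tracked skeleton.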

To turn, the user leans the upper body to the left or right side. To recognize this motion, we obtain the positions of three joints: Head, Shoulder Center, and Hip Center. We decide whether to turn left or right by comparing the x coordinates of Head and Shoulder Center. For example, if the x coordinate of Head is less than the x coordinate of Shoulder Center, the gesture is a left turn; otherwise, it is a right turn. To jump, the user bends the knees. To recognize this motion, we obtain the positions of three joints: Hip, Knee, and Ankle. After computing the joint angle, if the angle is within the threshold range, we decide that the user has made a jumping motion. Because the joint angles of the initial standing posture are not zero, we defined a threshold range for each joint angle (Table 1).

Table 1. The range of threshold for joint angles (degrees)

  Types of Motions                      Initial   Maximum   Minimum
  UI with natural motions   Turning     -10~10    60        180
                            Jumping     0~15      60        180
  UI with arm motions       Turning     45~55     20        140
                            Jumping     45~57     -90       -30

The other UI was designed with arm motions, as shown in Figure 6. This UI design focuses on the convenience of the gestures: one arm gesture makes the game character turn right and another makes it jump.

Figure 6. Arm motions for turning and jumping.

To turn right, the user raises the right hand. To recognize this motion, we obtain the positions of three joints: Elbow Right, Shoulder Right, and Hip Right. To detect the motion of turning left, we obtain the positions of three joints: Elbow Left, Shoulder Left, and Hip Left. To jump, the user raises both hands. To recognize this motion, we obtain the positions of six joints: Elbow Left, Elbow Right, Shoulder Left, Shoulder Right, Hip Left, and Hip Right.

3 EXPERIMENTS AND RESULTS

To compare the two types of motion-based UIs, eleven university students took part in an experimental investigation. For each UI they played our running game ten times, and we averaged the top five game scores per person. Figure 7 shows the result. In the graph, the vertical values (0~120) represent the average game scores and the horizontal values (1~11) represent the participant IDs.

Figure 7. The average of game scores for the eleven test subjects.
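To make the two recognition schemes concrete, here is a minimal C# sketch of the per-frame checks. The lean check follows the Head/Shoulder Center x-coordinate comparison described above; the arm check is a deliberately simplified proxy (elbow above shoulder) standing in for the joint-angle thresholds of Table 1. All names and the lean threshold value are our assumptions, not the authors' code.

```csharp
using UnityEngine;

// Hypothetical per-frame gesture checks; joint positions would be read from
// the tracked Kinect skeleton. Names and thresholds are assumptions.
public enum RunAction { None, TurnLeft, TurnRight, Jump }

public static class GestureChecks
{
    // Natural-motion UI: the lean of the upper body decides the turn,
    // comparing the x coordinates of Head and Shoulder Center.
    public static RunAction CheckLean(Vector3 head, Vector3 shoulderCenter,
                                      float minLean = 0.08f)
    {
        float dx = head.x - shoulderCenter.x;
        if (Mathf.Abs(dx) < minLean) return RunAction.None;   // still upright
        return dx < 0f ? RunAction.TurnLeft : RunAction.TurnRight;
    }

    // Arm-motion UI: simplified proxy for the joint-angle thresholds of
    // Table 1 -- an elbow raised above its shoulder counts as a raised arm.
    public static RunAction CheckArms(Vector3 leftElbow, Vector3 leftShoulder,
                                      Vector3 rightElbow, Vector3 rightShoulder)
    {
        bool leftUp  = leftElbow.y  > leftShoulder.y;
        bool rightUp = rightElbow.y > rightShoulder.y;
        if (leftUp && rightUp) return RunAction.Jump;      // both arms raised
        if (rightUp)           return RunAction.TurnRight;
        if (leftUp)            return RunAction.TurnLeft;
        return RunAction.None;
    }
}
```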

As a result, the UI with arm motions was easier for users to master and led to higher success rates in our running game than the other UI. We attribute this to the following: people frequently use their arms, rather than other parts of the body, to express their judgment, and therefore arm motions are easier for users to learn and can be trained more precisely than motions of other body parts.

4 CONCLUSIONS

In this study we designed two types of motion-based UI for a running game with Kinect. One UI consists of more natural and intuitive motions, such as leaning the upper body to the left or right side to turn left or right. The other UI consists only of arm motions, such as raising both hands to jump. As the result of our experiments, game players preferred the arm motions: because people usually use their arms to express their decisions in daily life, the arm-based UI made it easier for users to improve their playing skills, which helped them immerse themselves in the game. Therefore, to improve players' mastery and immersion, it is better to design a motion-based UI with arm motions rather than with motions of other parts of the body.

ACKNOWLEDGEMENTS

This research was supported by the Hallym University Research Fund (HRF-201409-012).

REFERENCES

[1] M. Renzi, S. Vassos, T. Catarci, and S. Kimani, "Touching Notes: A Gesture-Based Game for Teaching Music to Children," in Proceedings of TEI 2015 (9th International Conference on Tangible, Embedded and Embodied Interaction), Stanford, CA, USA, pp. 603-606, January 15-19, 2015.

[2] C.-H. Tsai, Y.-H. Kuo, K.-C. Chu, and J.-C. Yen, "Development and Evaluation of Game-Based Learning System Using the Microsoft Kinect Sensor," International Journal of Distributed Sensor Networks, in press.

[3] B. Lange, C.-Y. Chang, E. Suma, B. Newman, A. S. Rizzo, and M. Bolas, "Development and evaluation of low cost game-based balance rehabilitation tool using the Microsoft Kinect sensor," in Proceedings of EMBC 2011 (IEEE Engineering in Medicine and Biology Society), pp. 1831-1834, August 30-September 3, 2011.

[4] B. Lange, S. Koenig, E. McConnell, C. Chang, R. Juang, E. Suma, M. Bolas, and A. Rizzo, "Interactive game-based rehabilitation using Microsoft Kinect," in Proceedings of VRW 2012 (IEEE Virtual Reality Short Papers and Posters), pp. 171-172, March 4-8, 2012.

[5] http://wiki.etc.cmu.edu/unity3d/index.php/Microsoft_Kinect_-_Microsoft_SDK

[6] https://msdn.microsoft.com/en-us/library/jj131025.aspx