Real-time AR Edutainment System Using Sensor Based Motion Recognition

International Journal of Software Engineering and Its Applications (IJSEIA), vol. 10, no. 1 (2016), pp. 271-278, ISSN: 1738-9984
http://dx.doi.org/10.14257/ijseia.2016.10.1.26

Sungdae Hong 1, Hyunyi Jung 2 and Sanghyun Seo 3,*
1 Dept. of Film and Digital Media, Seokyeong University
2 GSAIM, Chung-Ang University
3 Electronics and Telecommunications Research Institute
1 sungdaehong@naver.com, 2 hyunning1202@gmail.com, 3 shseo@etri.re.kr
* Corresponding Author

Abstract

With the development of digital technology, Natural User Interface (NUI) technology that can sense the whole human body has recently come to the fore. By arousing participants' interest and maximizing their learning effect through contact-free, gesture recognition-based interactive education, this new interface has established itself as competitive content in the growing experiential-learning market. This study presents educational content that lets participants explore the human body in real time. We developed a gesture interface using the Kinect sensor, which recognizes a participant's skeleton and behavior, and produced the final content with the Unity 3D authoring tool, which combines real-time 3D models and animation. As a result, participants can learn about inner organs such as the brain and heart through different gestures, and the developed real-time augmented reality system for exploring the human body is expected to find wide use in educational institutions.

Keywords: Augmented Reality, Kinect, Interactive Human, NUI, Physical Interface

1. Introduction

The use of ICT (Information and Communication Technology), including new media, is rapidly increasing in the education field, and the smart-education market is expanding worldwide. In practice, however, field applications of the latest ICT are rare; it is mostly confined to textbook-style applications such as e-books. As one alternative that meets the demand for high-quality educational content, we attempted a tangible edutainment content design that promotes a sense of reality and immersion. In particular, a tangible edutainment design is needed that maximizes the learning effect through experiential factors. This study therefore proposes a real-time interactive, tangible, experiential-learning content design that recognizes the user's movements and reacts to them while teaching the human body.

Chapter 2 discusses interaction technology for virtual reality-based human-body learning and surveys other research on real-time educational experience content to establish the applicable scope for content production and development. Chapter 3 details the production process of the real-time human-body edutainment design: the human skeleton recognition and image processing of the Kinect sensor interface for recognizing the participant's behavior in real time, and the design that applies augmented reality-based real-time 3D content.

The real-time game engine Unity 3D was used to present the final content, and the paper closes with conclusions and directions for future research on the real-time human-body edutainment design.

2. Previous Work

In the digital content field, physical interactive games such as those on the Xbox 360 and Nintendo Wii, serious games for fields such as education, entertainment, military, management, and medical treatment, and virtual reality-based interactive experiential content have been leading the market. Recently, virtual reality-based participatory content has been expanding into learning services in which the user directly participates in the virtual content, interacts with it, and experiences the learning material [1, 2].

2.1. Digital Participatory Interface Technology

Nintendo developed the Wii game service around the Wiimote, a game controller equipped with a 3D acceleration sensor, shown in Figure 1. The controller provides the interaction required for gameplay by recognizing the user's manipulation of it, and it detects even the minor motions a game environment demands. However, because the device is inconvenient to use and the serviceable environment is limited, it is not well suited to learning purposes.

Figure 1. Nintendo Wii

Figure 2. Kinect by Microsoft

Kinect is a motion-sensing input device officially launched by Microsoft in November 2010 for the Xbox 360 game console. The edutainment content for children unveiled by Microsoft in 2012 is characterized by its use of gesture recognition technology, adding enhanced voice recognition to the Kinect's 3D-camera-based motion recognition (Figure 2). However, the low accuracy of its interaction makes it unsuitable for the education or broadcasting fields, and it even requires additional training before a user can operate it [3, 4].

2.2. Cases of Digital Experience Content

d'strict developed content that controls commercial menus freely with an infrared-based electronic glove, shown in Figure 3. The augmented reality advertising solution developed by the company lets a user grab and move commercial images floating in the air by hand, using an electronic glove equipped with infrared markers.

Figure 3. Hyper Presentation by d'strict

Figure 4. GestureTek's 3D Avatar

Figure 5. 4D Technology Showcase by d'strict

GestureTek developed a full-body 3D avatar control technology based on user extraction; Figure 4 shows the 3D avatar control by GestureTek. The technology performs virtual reality navigation by extracting human skeleton data from the user image, but it cannot provide an augmented reality experience that partially combines the user image with virtual imagery.

d'strict also released a 4D system that obtains a view transform in the virtual space by recognizing the user's posture and gestures through the Kinect sensor. On this basis it supports moving and turning in the space and can interpret concrete motions such as stamping a foot or stretching out a hand. Figure 5 shows the 4D technology showcase by d'strict.

3. AR-Based Human Body Experience System

Augmented reality is a system that links virtual objects to the real world. According to Paul Milgram's definition, mixed reality divides into Augmented Reality, in which virtual objects are overlaid on the real world, and Augmented Virtuality, in which reality is superimposed on a virtual world [5]. AR registers virtual images in the real world and carries out real-time interaction; because it is presented seamlessly within 3D space, it provides the user with an improved sense of immersion and reality [6].

Figure 6 shows the hardware of the human-body learning system. The system is a learning service in which participants take part directly in 3D stereoscopic virtual content and experience the human-body learning content through mutual interaction [7]. Interaction through the user's motion in the real-time 3D virtual space increases immersion in learning as well as the sense of connection between learners. The Kinect sensor captures the user's motion, allowing close observation of the user's skeleton, organs, frame, and even muscles.
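The overlay at the core of this section - virtual organ imagery registered onto the live camera image of the participant, restricted to the sensed user region - can be illustrated with a short sketch. The following Python/NumPy fragment is a minimal illustration under assumed inputs (a color frame, a depth map in meters, and a pre-rendered organ layer with an alpha channel), not the authors' Unity implementation:

```python
import numpy as np

def extract_user_mask(depth_m: np.ndarray,
                      near: float = 1.2, far: float = 3.5) -> np.ndarray:
    """Segment the participant as the pixels whose depth falls inside
    the sensor's reliable working range (1.2 m to 3.5 m for Kinect)."""
    return (depth_m >= near) & (depth_m <= far)

def composite_ar(frame: np.ndarray, overlay_rgba: np.ndarray,
                 user_mask: np.ndarray) -> np.ndarray:
    """Alpha-blend the rendered organ layer onto the camera frame,
    restricted to the user region so organs appear inside the body."""
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    alpha = alpha * user_mask[..., None]          # keep overlay on the user only
    rgb = overlay_rgba[..., :3].astype(np.float32)
    out = alpha * rgb + (1.0 - alpha) * frame.astype(np.float32)
    return out.astype(np.uint8)

# Example with synthetic data; a real system would read the webcam and Kinect.
h, w = 480, 640
frame = np.full((h, w, 3), 128, np.uint8)      # stand-in camera image
depth = np.full((h, w), 2.0, np.float32)       # participant about 2 m away
organs = np.zeros((h, w, 4), np.uint8)
organs[200:280, 280:360] = (200, 40, 40, 180)  # translucent 'heart' patch
result = composite_ar(frame, organs, extract_user_mask(depth))
```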

The content driven by the user's sensed motion, shown in Figure 7, enables the participant to learn the human body in detail by showing the organs, bones, and muscles inside the body in an image-based augmented reality space.

Figure 6. Real-Time Human-Body Learning H/W System

Figure 7. The Concept of Real-Time Human-Body Learning

Because the Kinect used as the main device of this system is equipped with 3D depth sensors, it can recognize depth information from 1.2 m to 3.5 m. The system is implemented with the Unity 3D game engine, so it can segue into the 3D virtual space area that recognizes the participant's motion, through a natural dissolve from the 2D webcam view to the 3D camera view. Figures 8 and 9 show the original AR workflow and the simplified AR workflow, respectively; we employed the simplified flow in our system. A motion detection algorithm is applied in the augmented reality, and tracking is implemented so that the system recognizes the participant's motion in real time and accurately synchronizes the 3D character with the images displayed on the UHD screen.

Figure 10 explains the technical plan of our exhibition system. The participant is recognized as a skeleton by the 2D camera according to distance, and can directly select the desired screen with a motion gesture. The screen interface is divided into three layers - Skeleton and Organs, Skeleton, and Organs - so that the user can view the whole body by touching each layer icon. In the Organs view, six organ icons - Brain, Heart, Lung, Liver, Stomach, and Intestine - are shown in the right section (Figures 11 and 12). Hovering a hand over an icon shows the corresponding organ while the other parts become translucent (Figure 13).
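The hover-to-select interaction above reduces to a small dwell-timer check on the tracked hand joint. The sketch below is a hypothetical reconstruction of that pattern, not the production code; the icon rectangles, the one-second dwell time, and the screen-space joint coordinates are all assumptions made for the example:

```python
import time

# Hypothetical screen-space icon regions (x, y, w, h); the real layout
# places the six organ icons along the right edge of the screen.
ICONS = {
    "Brain":     (1700, 100, 120, 120),
    "Heart":     (1700, 240, 120, 120),
    "Lung":      (1700, 380, 120, 120),
    "Liver":     (1700, 520, 120, 120),
    "Stomach":   (1700, 660, 120, 120),
    "Intestine": (1700, 800, 120, 120),
}
DWELL_SECONDS = 1.0  # assumed hover time before a selection fires

class DwellSelector:
    """Fire a selection when the tracked hand joint stays over one icon
    for DWELL_SECONDS; moving off an icon resets the timer."""
    def __init__(self):
        self.current = None
        self.entered_at = 0.0

    def update(self, hand_xy):
        hx, hy = hand_xy
        hit = None
        for name, (x, y, w, h) in ICONS.items():
            if x <= hx <= x + w and y <= hy <= y + h:
                hit = name
                break
        now = time.monotonic()
        if hit != self.current:            # entered a new icon, or left one
            self.current, self.entered_at = hit, now
            return None
        if hit and now - self.entered_at >= DWELL_SECONDS:
            self.entered_at = float("inf")  # fire only once per hover
            return hit                      # show this organ, dim the rest
        return None
```

Each frame, the hand joint reported by the skeleton tracker would be projected to screen coordinates and passed to update(); a returned organ name triggers the organ display while the remaining body parts are rendered translucent.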

When an organ icon is touched, the selected organ is shown enlarged on screen with a rotation and its function is displayed as text; it then returns to its place after 8 to 10 seconds.

Figure 8. General AR Workflow

Figure 9. Simplified AR Workflow

Figure 10. Technical Plan of the Distance Sensor System

The user's figure captured by the webcam changes into the 3D state through a dissolve effect when the user enters the 3D area, and the UI design lets the user be recognized by the webcam as a skeleton frame according to distance and easily select the desired screen.

Figure 14 shows the operation of our system. The complete system of the developed real-time interactive human-body experience content is shown in the figure, and a conversational space supporting real-time interaction is implemented as an extension of the Unity 3D engine. Specifically, the system is equipped with a photorealistic 2D camera and a depth sensor (Z-cam), and it recognizes the user's motion from the depth map obtained from the Z-cam and the user's skeleton structure. A UHD display system is used to heighten the sense of realism for the participant.
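Both the 2D-to-3D dissolve and the timed return of a focused organ are simple state transitions driven by the sensed depth and a clock. The Python sketch below is an assumed reconstruction of that control flow (the actual system, built in Unity 3D, would use coroutines and material blending); only the 1.2-3.5 m range and the 8-to-10-second display window come from the text, the rest is illustrative:

```python
import time

NEAR, FAR = 1.2, 3.5   # Kinect's usable depth range in meters (from the text)
FOCUS_SECONDS = 9.0    # the focused organ returns to place after 8-10 s
DISSOLVE_SPEED = 1.5   # blend change per second; an assumed tuning value

class SceneState:
    """Cross-fades between the 2D webcam view (blend = 0.0) and the 3D
    virtual space (blend = 1.0) as the user enters the sensing range,
    and times a focused organ back to its place."""

    def __init__(self):
        self.blend = 0.0
        self.focused = None
        self.focus_started = 0.0

    def tick(self, user_depth_m, dt):
        """Call once per frame with the user's depth (None if absent)
        and the frame time dt in seconds."""
        in_range = user_depth_m is not None and NEAR <= user_depth_m <= FAR
        target = 1.0 if in_range else 0.0
        step = DISSOLVE_SPEED * dt
        if target > self.blend:
            self.blend = min(self.blend + step, target)   # dissolve into 3D
        elif target < self.blend:
            self.blend = max(self.blend - step, target)   # dissolve back to 2D
        # Release the focused organ once its display window has elapsed.
        if self.focused and time.monotonic() - self.focus_started > FOCUS_SECONDS:
            self.focused = None          # organ rotates back into the body

    def focus(self, organ_name):
        """Enlarge and rotate the selected organ and show its caption."""
        self.focused = organ_name
        self.focus_started = time.monotonic()
```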

Figure 11. Screen Transition of the 3D Virtual Environment

Figure 12. 3D Objects of Organs

Figure 13. Interaction Image of 3D Organs

Figure 14. The Result of Our System

4. Conclusions

NUI-type experience content that senses the whole human body bridges the gap between user and avatar by projecting the user's motion directly onto the avatar, and it induces a direct, illusory experience of the depicted situation by disguising the process, behavior, or action involved [8, 9]. Accordingly, this study provides the strong experiential effect of game-like behavior that the user can perform directly, and introduces a tangible edutainment design that improves the learning effect through human-body-experience educational content.

This study suggests a new direction toward NUI-based interactive learning. Events that let the user explore the human body by responding to various motion gestures in real time have been implemented, turning the whole body into a single interface. We have also tried to maximize the effect of experiential learning by reproducing the role and state of real organs in an interactive simulation format. This user-driven interactive educational content is expected to suggest a new direction for participation and learning in the growing edutainment market, and to expand beyond mere interaction into the broader education field. In future research, we plan to develop content that achieves a learning effect with multiple participants rather than individual participation alone.

Acknowledgments

This research was supported by Seokyeong University in 2016.

References

[1] H.-C. Yoon and J.-S. Park, "Avatar Animation Using Skeleton Tracking with Kinect Sensor", International Journal of Advancements in Computing Technology (IJACT), vol. 5, no. 12, (2013), pp. 339-344.
[2] L. Meng, H. Xinyuan and Guoxin, "The Research and Experiment about Interactivity and Immersion of Virtual Reality", International Journal of Digital Content Technology and its Applications (JDCTA), vol. 7, no. 13, (2013), pp. 132-141.
[3] P. Horejsi and T. Gorner, "Using Kinect Technology Equipment for Ergonomics", MM Science Journal, (2013), pp. 388-340.
[4] C. Sinthanayothin, N. Wongwaen and W. Bholsithi, "Skeleton Tracking using Kinect Sensor & Displaying in 3D Virtual Scene", International Journal of Advancements in Computing Technology (IJACT), vol. 4, no. 11, (2012), pp. 213-223.
[5] P. Milgram, H. Takemura, A. Utsumi and F. Kishino, "Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum", Proc. SPIE, vol. 2351, (1994), pp. 282-292.
[6] J. S. Lee, J. A. Noh, S. H. Lim and J. S. Lee, "An Activity Contents Technology Trend Based on Virtual Reality", Electronics and Telecommunications Trends, ETRI, (2012), pp. 73-82.
[7] S.-M. Hwang and H.-G. Yeom, "Computer Interface Construction for Recognition of Motion and Voice using KINECT", International Journal of Advancements in Computing Technology (IJACT), vol. 5, no. 12, (2013), pp. 235-240.
[8] J.-Y. Kim and J.-H. Sung, "The Formation of New Game Generation in Game-Extended Space: Focused on the Experience Game", Korea Game Society, vol. 10, no. 5, (2010), pp. 1-11.
[9] H. Y. Jung, "Characteristics of Interactive Performance Focused on Body Medium", Chung-Ang University, (2013).

Authors

Sungdae Hong is a full-time faculty member of the Dept. of Film and Digital Media at Seokyeong University in South Korea. He received his B.S. degree in Computer Design and his M.S. and Ph.D. in Art and Technology from Chung-Ang University (Seoul) between 2002 and 2008. With an educational background spanning computer graphics and media arts, he is interested in interactive media art and image processing. He has presented his artwork in various digital art shows, including SIGGRAPH 2006 G-Studio, ASCI Digital '07, Graphite 2006-2007, and Digital Playground in Island 2010-2013 (as director). His research focuses on digital image processing.

Hyunyi Jung received her B.A. degree in Dance from Rutgers, The State University of New Jersey, New Brunswick, NJ, USA. She received her M.S. degree and is pursuing a Ph.D. in Art & Technology at the Graduate School of Advanced Imaging Science, Multimedia & Film, Chung-Ang University. She works at the interface between movement and technology, applying the new possibilities arising from the integration of different media to exhibitions and creative work. Her research focuses on media performance and interactive art.

Sanghyun Seo received his B.S. degree in Computer Science and Engineering from Chung-Ang University, Seoul, Rep. of Korea, in 1998, and his M.S. and Ph.D. degrees from the GSAIM department at Chung-Ang University in 2000 and 2010, respectively. He was a senior researcher at G-Inno System from 2002 to 2005, a postdoctoral researcher at Chung-Ang University in 2010, and a postdoctoral researcher at the LIRIS Lab, Lyon 1 University, from February 2011 to February 2013. He now works at ETRI. His research interests include computer graphics, non-photorealistic rendering, 3D GIS systems, and game technology.