Standard for metadata configuration to match scale and color difference among heterogeneous MR devices


Standard for metadata configuration to match scale and color difference among heterogeneous MR devices
ISO/IEC JTC 1 SC 24 WG 9 Meetings, January 2019, Seoul, Korea
Gerard J. Kim, Korea Univ., Korea; Dongsik Jo, Wonkwang Univ., Korea; Howon Kim, ETRI, Korea

Motivation: Participants' perceptions of a virtual object (e.g., its color or scale) differ according to the mixed reality device used. Participants must be able to perceive color and scale consistently across different MR devices (same color and same size as the real object).

CorrectAR (MatchAR): make users perceive the same color and scale across different MR devices (video see-through and optical see-through), and define validation experimental standards: setting, procedure, etc.

Previous work (1/2): Users under-estimate egocentric distance in a VE, e.g., throwing at a shorter distance. Third Person View and Guidance for More Natural Motor Behaviour in Immersive Basketball Playing (VRST 2014).

Previous work (2/2): Various design decisions affect users' depth perception, including aerial perspective, billboarding and cast shadows, ray tracing, dimensionality (2D vs. 3D), shading, and texture. Designing for Depth Perception in Augmented Reality (ISMAR 2017), https://www.youtube.com/watch?v=klq-gn99qbw

How to derive the standard guideline: With respect to AR/MR, what are the parameters that affect user perceptions? Independent variables: display type, distance, etc. Dependent variables: perceived size (scale) and color difference between virtual and real objects. Validation experimental standards for mixed and augmented reality: control variables, experimental setup, procedure.

Parameters that affect user perceptions

Parameters that affect user perceptions.
Device characteristics: display type (video or optical), resolution, aspect ratio, brightness, contrast, FOV (viewing angle), refresh rate, response time (tracking).
Environment: light condition, occlusion.
Object characteristics: type (LOD, realistic representation), texture quality, distance between the user and the object, viewing direction.
Participants' perceptions: color, scale, naturalness, visibility, readability, ...

Parameters that affect user perceptions, grouped by their role in the experiment. Independent variable: display type (video or optical). Control variables: the remaining device characteristics (resolution, aspect ratio, brightness, contrast, FOV/viewing angle, refresh rate, tracking response time), the environment (light condition, occlusion), and the object characteristics (type/LOD and realistic representation, texture quality, distance between the user and the object, viewing direction). Dependent variables: participants' perceptions of color, scale, naturalness, visibility, readability, etc. A sketch of how these parameters could be captured as device metadata follows below.
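
As an illustration only, the parameters above could be grouped into a per-device metadata record that a matching standard might exchange. The field names and types below are hypothetical assumptions, not wording from the proposed standard.

```python
from dataclasses import dataclass, field
from typing import Tuple

@dataclass
class DeviceCharacteristics:
    # Hypothetical field names; the real standard would define its own schema.
    display_type: str              # "video_see_through" or "optical_see_through"
    resolution: Tuple[int, int]    # (width, height) in pixels
    aspect_ratio: float
    brightness_nits: float
    contrast_ratio: float
    fov_deg: float                 # horizontal field of view (viewing angle)
    refresh_rate_hz: float
    tracking_response_ms: float

@dataclass
class EnvironmentConditions:
    illuminance_lux: float         # light condition
    occlusion_present: bool

@dataclass
class ObjectCharacteristics:
    lod: str                       # level of detail / realism of representation
    texture_quality: str
    distance_m: float              # distance between the user and the object
    viewing_direction_deg: float

@dataclass
class PerceptionCorrection:
    # Quantities the standard would aim to equalize across devices.
    scale_gain: float = 1.0        # multiply virtual size by this to match perceived size
    color_gain_rgb: Tuple[float, float, float] = (1.0, 1.0, 1.0)

@dataclass
class MRDeviceMetadata:
    device: DeviceCharacteristics
    environment: EnvironmentConditions
    target_object: ObjectCharacteristics
    correction: PerceptionCorrection = field(default_factory=PerceptionCorrection)
```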

Validation experimental standards

Display type: video see-through HMD. (Image source: UX Collective)

Display type: video see-through HMD. (Image source: http://ovrvision.com/entop/)

Display type: optical see-through HMD. (Image source: UX Collective)

Display type: optical see-through HMD. (Image source: https://www.microsoft.com/en-ie/hololens)

Display type: optical see-through HMD (MS HoloLens).

Experimental configuration

Experimental configuration: HMD, dark environment (3 m x 3 m curtain), real object, studio lights, marker at a fixed location, joystick (or controller).

Experimental configuration: video see-through HMD and optical see-through HMD.

Experimental configuration: HMD, joystick, marker for tracking, real object.

Real object for comparison: a cube puzzle with 5.5 cm edges and 6 colors, used for both scale and color comparison.

Control variables: light condition. A curtain is installed and studio lights are used so that the real and virtual objects are seen under the same light condition.

Realistic rendering: match the real and virtual environments with the same lights and shadows, e.g., create the same shadow in the virtual environment as the real-life shadow under the same light conditions (light #1: left; light #2: left and right).

Stereo camera calibration for the video see-through HMD: https://www.youtube.com/watch?v=wsjqimfmxdy
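
The slide only links a calibration video. As a rough sketch, stereo calibration of the HMD's two cameras could follow the standard OpenCV chessboard workflow; the file paths, board size, and square size below are assumptions, and this is not necessarily the authors' pipeline.

```python
import glob
import cv2
import numpy as np

# Chessboard with 9x6 inner corners and 25 mm squares (example values).
pattern = (9, 6)
square_mm = 25.0
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("left/*.png")), sorted(glob.glob("right/*.png"))):
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    okl, cl = cv2.findChessboardCorners(gl, pattern)
    okr, cr = cv2.findChessboardCorners(gr, pattern)
    if okl and okr:
        obj_pts.append(objp)
        left_pts.append(cl)
        right_pts.append(cr)

size = gl.shape[::-1]
# Calibrate each camera, then estimate the rotation/translation between them.
_, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
err, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
print("stereo reprojection error:", err, "baseline (mm):", np.linalg.norm(T))
```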

Movie

Experiment 1: Scale perception. Video see-through HMD vs. optical see-through HMD. Independent variable: display type. Dependent variable: user's scale perception. Participants: 60 people in total (between-subjects design, 30 people per display type). The subject compares the scale of a virtual cube against the real cube on each HMD.

Procedure: The subject sits on a chair at a fixed position and wears an MR HMD, looks at a real cube on a desk, and compares its size with a virtual cube shown next to the real cube (the side-by-side placement removes the user's mental load).

Scale adjustment: The initial size of the virtual cube is 150 mm x 150 mm; the real cube is 55 mm x 55 mm. The distance between the subject and the real cube (and the virtual cube) is 10 cm, 40 cm, or 70 cm (3 cases in our experiments). The subject adjusts the scale with a joystick until the virtual cube appears equal in size to the actual cube.

Joystick interaction: The subject tries to fit the virtual cube to the same size as the real cube using a joystick. Buttons 4/5 control large scale changes; buttons 6/7 control small scale changes (a sketch of this adjustment loop follows below).
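
A minimal sketch of the scale-adjustment loop. Only the button grouping (4/5 for large steps, 6/7 for small steps) comes from the slide; the step sizes and which button of each pair enlarges or shrinks the cube are illustrative assumptions.

```python
# Coarse/fine scale control: buttons 4/5 change the edge length in large steps,
# buttons 6/7 in small steps (step sizes and directions are assumptions).
LARGE_STEP_MM = 5.0
SMALL_STEP_MM = 0.5

def apply_button(size_mm: float, button: int) -> float:
    """Return the new virtual-cube edge length after one button press."""
    if button == 4:
        size_mm += LARGE_STEP_MM
    elif button == 5:
        size_mm -= LARGE_STEP_MM
    elif button == 6:
        size_mm += SMALL_STEP_MM
    elif button == 7:
        size_mm -= SMALL_STEP_MM
    return max(size_mm, 1.0)   # keep the cube visible

# Example: start at the 150 mm initial size and adjust toward the 55 mm real cube.
size = 150.0
for press in [5] * 18 + [7] * 10:   # 18 large steps down, then 10 small steps down
    size = apply_button(size, press)
print(size)                          # 150 - 18*5 - 10*0.5 = 55.0 mm
```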

Results: Scale, video see-through HMD (30 participants in total; adjusted virtual-cube size in cm per viewing distance, values shown for subjects 1-22):
Subject | 10 cm | 40 cm | 70 cm
1 | 5.7 | 5.7 | 5.4
2 | 5.4 | 5.4 | 4.4
3 | 5.4 | 5.0 | 5.1
4 | 4.9 | 5.3 | 4.8
5 | 5.6 | 5.2 | 4.8
6 | 5.4 | 5.7 | 5.4
7 | 5.4 | 6.0 | 6.0
8 | 6.6 | 6.0 | 6.5
9 | 6.0 | 6.0 | 5.8
10 | 5.8 | 6.2 | 6.2
11 | 5.6 | 5.9 | 9.3
12 | 5.9 | 7.4 | 6.5
13 | 6.2 | 6.1 | 5.6
14 | 7.0 | 6.4 | 6.2
15 | 5.8 | 6.6 | 6.0
16 | 6.4 | 6.5 | 6.4
17 | 7.4 | 6.8 | 7.3
18 | 6.2 | 6.2 | 5.9
19 | 6.6 | 7.0 | 6.0
20 | 5.9 | 6.3 | 6.1
21 | 6.3 | 6.7 | 6.2
22 | 7.2 | 7.3 | 5.9

Results: Scale, optical see-through HMD (30 participants in total; adjusted virtual-cube size in cm per viewing distance, values shown for subjects 1-22):
Subject | 10 cm | 40 cm | 70 cm
1 | 5.7 | 5.7 | 5.4
2 | 5.4 | 5.4 | 4.4
3 | 5.4 | 5.0 | 5.1
4 | 4.9 | 5.3 | 4.8
5 | 5.6 | 5.2 | 4.8
6 | 5.4 | 5.7 | 5.4
7 | 5.4 | 6.0 | 6.0
8 | 6.6 | 6.0 | 6.5
9 | 6.0 | 6.0 | 5.8
10 | 5.8 | 6.2 | 6.2
11 | 5.6 | 5.9 | 9.3
12 | 5.9 | 7.4 | 6.5
13 | 6.2 | 6.1 | 5.6
14 | 7.0 | 6.4 | 6.2
15 | 5.8 | 6.6 | 6.0
16 | 6.4 | 6.5 | 6.4
17 | 7.4 | 6.8 | 7.3
18 | 6.2 | 6.2 | 5.9
19 | 6.6 | 7.0 | 6.0
20 | 5.9 | 6.3 | 6.1
21 | 6.3 | 6.7 | 6.2
22 | 7.2 | 7.3 | 5.9

One-way ANOVA (an illustrative run with SciPy is sketched below).
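
A sketch of how the one-way ANOVA over the three distance conditions could be run with SciPy, using the per-subject video see-through values transcribed in the table above (subjects 1-22).

```python
from scipy import stats

# Per-subject adjusted sizes (cm) from the video see-through table above.
d10 = [5.7, 5.4, 5.4, 4.9, 5.6, 5.4, 5.4, 6.6, 6.0, 5.8, 5.6,
       5.9, 6.2, 7.0, 5.8, 6.4, 7.4, 6.2, 6.6, 5.9, 6.3, 7.2]
d40 = [5.7, 5.4, 5.0, 5.3, 5.2, 5.7, 6.0, 6.0, 6.0, 6.2, 5.9,
       7.4, 6.1, 6.4, 6.6, 6.5, 6.8, 6.2, 7.0, 6.3, 6.7, 7.3]
d70 = [5.4, 4.4, 5.1, 4.8, 4.8, 5.4, 6.0, 6.5, 5.8, 6.2, 9.3,
       6.5, 5.6, 6.2, 6.0, 6.4, 7.3, 5.9, 6.0, 6.1, 6.2, 5.9]

f, p = stats.f_oneway(d10, d40, d70)
print(f"F = {f:.3f}, p = {p:.3f}")  # the slides report p > 0.05 (no effect of distance)
```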

Results: Scale, video see-through HMD, difference by distance. Mean adjusted size: 6.04 cm (10 cm), 6.07 cm (40 cm), 6.21 cm (70 cm). No statistically significant difference by distance (p-value > 0.05).

Results: Scale, optical see-through HMD, difference by distance. Mean adjusted size: 5.88 cm (10 cm), 5.98 cm (40 cm), 6.78 cm (70 cm). No statistically significant difference by distance (p-value > 0.05).

Results: Scale, video vs. optical. Mean adjusted sizes at 10/40/70 cm: video 6.04/6.07/6.21 cm, optical 5.88/5.98/6.78 cm. No statistically significant difference by distance or display type (p-value > 0.05). However, with both optical and video see-through HMDs the virtual cube has to be made larger than the real cube to appear equal in size, i.e., its size is under-estimated: e.g., with the video see-through HMD, people feel that a 6.04 cm virtual cube equals the 5.5 cm real cube. Scale gain: video 1.11, optical 1.13.
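
A small check of the reported scale gains, assuming the gain is the mean adjusted size averaged over the three distances divided by the 5.5 cm real edge length.

```python
REAL_EDGE_CM = 5.5

video_means = [6.04, 6.07, 6.21]    # mean adjusted size per distance (cm)
optical_means = [5.88, 5.98, 6.78]

def scale_gain(means):
    """Average adjusted size over distances divided by the real edge length."""
    return sum(means) / len(means) / REAL_EDGE_CM

print(round(scale_gain(video_means), 2))    # 1.11
print(round(scale_gain(optical_means), 2))  # 1.13
```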

Experiment 2: Color perception. Video see-through HMD vs. optical see-through HMD. Independent variable: display type. Dependent variable: user's color perception. Participants: 60 people in total (between-subjects design, 30 people per display type). The subject compares the color of a virtual cube against the actual cube on each HMD.

Same procedure as Experiment 1: the subject sits on a chair at a fixed position and wears an MR HMD, looks at a real cube on a desk, and compares it with a virtual cube shown next to the real cube (the side-by-side placement removes the user's mental load).

Color values: Using a color meter, we measured the RGB colors of the real cube. To create the virtual cubes, each of R, G, and B was set to 7 levels, giving 21 settings in total (7 R + 7 G + 7 B), applied as adjusted material color values.

Color values of the actual cube (average RGB values, measured 5 times, decimals omitted):
Section | R | G | B
R section | 189 | 30 | 44
G section | 49 | 115 | 56
B section | 57 | 90 | 142

Color values for the experimental setting: The 3D (Euclidean) distance in RGB space is sqrt(dr^2 + dg^2 + db^2), where dr, dg, and db are the differences along the R, G, and B axes. Seven virtual cubes with different colors are created per channel (R/G/B, 21 settings in total), and the subject tries to select the virtual cube with the same color as the real cube using a wireless keyboard (see the selection sketch below).
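
A sketch of the Euclidean RGB distance from the slide and of picking the closest candidate color. The seven candidate levels below are illustrative values, not the ones actually used in the experiment.

```python
import math

def rgb_distance(c1, c2):
    """Euclidean distance sqrt(dr^2 + dg^2 + db^2) between two RGB triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

def closest_color(target, candidates):
    """Return the candidate color with the smallest distance to the target."""
    return min(candidates, key=lambda c: rgb_distance(target, c))

# Measured red face of the real cube (from the slide) and hypothetical
# virtual-cube levels varying only the R channel.
real_red = (189, 30, 44)
levels = [(r, 30, 44) for r in (135, 155, 175, 195, 215, 235, 255)]
print(closest_color(real_red, levels))   # (195, 30, 44)
```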

Color mapping table.

Color measurement results, video see-through HMD (30 participants in total; selected level 1-7 for each of the R, G, and B cubes, values shown for subjects 1-22):
Subject | R | G | B
1 | 7 | 1 | 1
2 | 5 | 2 | 2
3 | 6 | 1 | 2
4 | 6 | 1 | 1
5 | 6 | 2 | 1
6 | 6 | 2 | 1
7 | 7 | 3 | 1
8 | 6 | 1 | 1
9 | 7 | 2 | 1
10 | 7 | 1 | 1
11 | 6 | 1 | 2
12 | 7 | 2 | 2
13 | 6 | 1 | 2
14 | 7 | 1 | 1
15 | 6 | 5 | 1
16 | 6 | 1 | 1
17 | 7 | 2 | 1
18 | 7 | 2 | 3
19 | 7 | 6 | 2
20 | 6 | 2 | 1
21 | 5 | 2 | 1
22 | 7 | 6 | 3

Color measurement results, optical see-through HMD (30 participants in total; selected level 1-7 for each of the R, G, and B cubes, values shown for subjects 1-22):
Subject | R | G | B
1 | 7 | 1 | 1
2 | 5 | 2 | 2
3 | 6 | 1 | 2
4 | 6 | 1 | 1
5 | 6 | 2 | 1
6 | 6 | 2 | 1
7 | 7 | 3 | 1
8 | 6 | 1 | 1
9 | 7 | 2 | 1
10 | 7 | 1 | 1
11 | 6 | 1 | 2
12 | 7 | 2 | 2
13 | 6 | 1 | 2
14 | 7 | 1 | 1
15 | 6 | 5 | 1
16 | 6 | 1 | 1
17 | 7 | 2 | 1
18 | 7 | 2 | 3
19 | 7 | 6 | 2
20 | 6 | 2 | 1
21 | 5 | 2 | 1
22 | 7 | 6 | 3

Results: Color. Mean selected color values by display type: R 211 (video) / 175 (optical); G approximately 106-107 and B approximately 124-130 for the two display types. Only the red color showed a statistically significant difference by display type (p-value < 0.05): initial red value 189, result 211 (video) and 175 (optical). With the optical see-through HMD, people feel that red 189 is equal to 175: color under-estimation.
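
By analogy with the scale gain above, a per-channel ratio can be read off the red results. The "color gain" term and this computation are an illustration, not part of the slides.

```python
REAL_RED = 189                 # measured red value of the real cube

red_video, red_optical = 211, 175   # mean selected red values from the results

print(round(red_video / REAL_RED, 2))    # 1.12 -> red matched above the real value (video)
print(round(red_optical / REAL_RED, 2))  # 0.93 -> red matched below the real value (optical)
```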

Future work

Color values for the experimental setting: HSI (hue, saturation, intensity) corresponds more closely to the way humans describe and interpret color. Optical see-through displays tend to show lower intensity, video see-through displays higher intensity (an RGB-to-HSI conversion sketch follows below).
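
A sketch of the commonly used RGB-to-HSI conversion, which could be used to compare intensities between optical and video see-through renderings. These are standard textbook formulas, not code from the authors.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert 0-255 RGB to (hue in degrees, saturation 0-1, intensity 0-1)."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    intensity = (r + g + b) / 3.0
    minimum = min(r, g, b)
    saturation = 0.0 if intensity == 0 else 1.0 - minimum / intensity
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0:
        hue = 0.0                      # achromatic: hue undefined, use 0
    else:
        theta = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
        hue = theta if b <= g else 360.0 - theta
    return hue, saturation, intensity

# Example: the measured red face of the real cube.
print(rgb_to_hsi(189, 30, 44))
```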

AdaptAR: Participants' perceptions of a virtual object (e.g., its color or scale) can also differ on the same mixed reality device (e.g., optical see-through vs. optical see-through), i.e., disparity depending on each person.

Acknowledgement: This research is supported by the Ministry of Culture, Sports and Tourism (MCST) and the Korea Creative Content Agency (KOCCA) in the Culture Technology (CT) Research & Development Program 2018.