VR Basics. Virtual Reality /23/2018

VR Basics
Reference: Virtual Reality Technology and Applications & Stanford VR Course

Virtual Reality 101
Virtual Reality: an interactive computer simulation that senses the user's state and replaces or augments sensory feedback to one or more senses in such a way as to immerse the user within the virtual environment.
Four basic elements:
1. Virtual Environment
2. Virtual Presence
3. Sensory Feedback
4. Interactivity

Virtual Reality 101 - Virtual Environment
Descriptions of the objects within the simulation and the rules/relationships that govern them. Displayed through visual, aural, and haptic devices, and perceived through vision, hearing, touch, etc.
- Environment - topology, areas, and features
- Objects - things that occupy space
- Intermediaries - avatars controlled by players
- User Interface Elements - interface elements such as buttons, switches, and sliders

Virtual Reality 101 - Virtual Presence
The feeling of being in the environment.
- Physical Presence - produced by synthetic stimuli that affect the user's senses
- Mental Presence - engagement, expectations, the feeling of being part of the world
- Telepresence - the feeling of virtual presence in a distant location
- Absence - removing oneself from the environment / ignoring it. To immerse yourself in a book, you must detach from the environment around you.

Virtual Reality 101 - Physical (sensory) Virtual Presence
[figure slides]

Virtual Reality 101 - Sensory Feedback
- Most feedback is visual
- Usually requires, at minimum, tracking the user's position and orientation
- ART Hand Tracking System: https://www.youtube.com/watch?v=7o9odigmeg8&ab_channel=virtalis

Virtual Reality 101 - Interactivity
- Respond to the user's actions
- Ability to affect the environment
- Update the location/view (move around)
- Perspective:
  - First Person - looking through the eyes of an avatar
  - Second Person - looking from the immediate vicinity of the relevant activity (maybe the arm of a robot)
  - Third Person - viewing from an independent location

Virtual Reality History
- 1838 - stereoscopes
- 1931 - Brave New World describes movies called "feelies" that add touch to motion pictures
- 1957 - Sensorama, a virtual bicycle ride featuring moving displays, wind, and vibration (first VR)

Virtual Reality History
- 1968 - Sword of Damocles superimposed wireframe images over the environment (first AR)
- 1970s - Videoplace allowed artists to interact with projected environments
- 1987 - VR on the cover of Scientific American; Star Trek TNG's Holodeck
- 1990s - CAVE (Cave Automatic Virtual Environment), where the walls are screens: https://youtu.be/k_rcxof5bre?t=122

Virtual Reality History
- 1995 - Nintendo Virtual Boy
- 2012+ - Oculus Rift, HTC Vive; an explosion of VR and AR devices from major companies and startups

Virtual Reality Applications
- Flight and driving simulations: https://www.youtube.com/watch?v=i-Ku1ZEFBJY&ab_channel=BigReviewTV
- Surgery simulators: https://youtu.be/squlhteuqy0?t=2m31s
- Design / visualization: https://www.youtube.com/watch?v=1q3kvwqfti8&ab_channel=Emulate3D-IndustrialControlsTesting%2CSimulationandDemonstration

Mixed Reality
[figure slides]

Modeling & Simulation
- Resiliency against superstorms
- Visualize events for public education
- Explain scenarios

Medical
[figure slide]

Mixed Reality
- Mixed Reality encompasses all types of reality, from real to virtual
- Augmented Reality refers to virtual objects being merged with reality
- The lesser-known Augmented Virtuality consists of real-world objects merging with the virtual world

Mixed Reality
Here we see my son using a 3D-printed T-Rex jawbone to try to complete the virtual skeleton.

Virtual Reality System
[figure slide]

Human Visual System
- Light passes through the cornea, which protects the eye
- It then passes through the pupil, the hole in the iris
- A flexible lens redirects the rays of light
- The light passes through the vitreous humor, a gelatinous substance that fills the eye, and strikes the retina
- The fovea, at the center of the retina, has the densest concentration of photoreceptors and provides our sharp central vision

Human Visual System
Two types of photoreceptors:
- Rods - brightness
- Cones - color
Signals from the photoreceptors are transmitted via the optic nerve through the optic chiasm to the lateral geniculate nucleus and on to the visual cortex in the occipital lobe.

Depth Cues
Monocular depth cues (stronger than binocular cues):
- Lighting & Shadows - close is bright, far is dim
- Perspective - objects appear smaller the further away they are
- Size of known objects - size expectations
- Detail - more detail in close objects
- Occlusion - blocking objects are in front
- Relative Motion - far away things move slowly
http://paulbourke.net/stereographics/stereorender/

Depth Cues
- Lighting & Shadows - shadows reveal the orientation of objects and the distances between lights and objects
- Occlusion/Interposition - when objects partially hide one another, the hidden object is interpreted as further away
http://paulbourke.net/stereographics/stereorender/

Depth Cues
- Relative Size - objects of the same type that get smaller appear further away
- Detail - more detail in close objects
http://paulbourke.net/stereographics/stereorender/

Depth Cues
- Perspective - objects appear smaller the further away they are; two types:
  - Oblique Perspective - edges remain parallel
  - Linear Perspective - has a vanishing point
- Variation in Visibility - fog, particles suspended in the air
http://paulbourke.net/stereographics/stereorender/

Depth Cues
- Motion Parallax - close objects move faster across our view than objects further away
https://youtu.be/7_bdqyw0jdo?t=148

Depth Cues
Proprioceptive Cues - information from joints and muscles through the nervous system:
- Accommodation - the muscles that shape the lens control the sharpness of the image
- Convergence - the muscles that rotate the eyes in their sockets

Depth Cues
- Stereopsis - using our two horizontally separated eyes to obtain supplementary depth cues
- Retinal Disparity - the closer the object, the larger the angle it subtends at the fovea of each eye

Depth Cues
- Binocular disparity is a strong depth cue, but it requires creating two images, one for each eye, from independent views
- If the depth cues are not correct, one may become dominant, depth can be exaggerated, or the scene may become uncomfortable to view
- Using stereo pairs does not fix the accommodation problem, since we are always focusing on a flat plane
- Maximum separation should be 1/30th of the distance from the viewer to the display
http://paulbourke.net/stereographics/stereorender/

Artificial Stereoscopic Vision
- Two images are presented to the user, with some method used to allow each eye to see only one image
- Objects will appear behind, at, or in front of the screen based on where the lines of sight of our eyes intersect

Glasses-Based Stereo - Anaglyph
- Passive and inexpensive
- Only requires rendering the display in two colors
- Not capable of rendering all colors
- Monochrome Anaglyph - convert to grayscale, then assign the red channel to the left eye and green and blue to the right
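The monochrome anaglyph assignment above can be sketched per pixel. This is a minimal illustration, not engine code; the class and method names are my own, and the luminance weights are the common Rec. 601 values rather than anything specified in the slides.

```csharp
using UnityEngine;

public static class Anaglyph
{
    // Standard (Rec. 601) luminance weights for the grayscale conversion step.
    public static float Luma(Color c)
    {
        return 0.299f * c.r + 0.587f * c.g + 0.114f * c.b;
    }

    // Red channel carries the left-eye image; green and blue carry the right,
    // as described in the slide above.
    public static Color Compose(float leftLuma, float rightLuma)
    {
        return new Color(leftLuma, rightLuma, rightLuma, 1f);
    }
}
```

Red/cyan glasses then filter the channels so each eye sees only its own image.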

Glasses-Based Stereo - Anaglyph
- Full-color Anaglyph - assign the red channel to the left eye and the green and blue channels to the right

Glasses-Based Stereo - Polarization
- Restricts the light that reaches each eye
- Two images are projected on the same screen, each through a different polarizing filter
- Glasses restrict what type of light reaches each eye
- Two types:
  - Linear polarization - requires the head to stay level
  - Circular polarization - does not require the head to stay level

Glasses-Based Stereo - Polarization
- Produces full-color images and does not suffer binocular rivalry
- Costs about 50% more than anaglyph
- Does not require power or any synchronization with the display
- Lightweight
- Halves the effective resolution on a single screen, or requires multiple projectors

Glasses-Based Stereo - Shutter Glasses
- Actively shutters to present light to one eye while blocking the other
- Uses liquid crystal shutter glasses that turn opaque when voltage is applied
- Uses a timing signal (infrared or radio) to synchronize the display with the glasses
- Ghosting or crosstalk between the eyes can be apparent
- Requires batteries; more expensive

Glasses-Based Stereo - Shutter Glasses
[figure slide: ghosting]

Auto-Stereoscopic Screens
- Parallax-Barrier Screens (e.g. Nintendo 3DS) - interlace a left and a right image, using a barrier to control which parts are visible to each eye
- Requires exact placement of the eyes to get the proper image

Auto-Stereoscopic Screens
- Lenticular Arrays - semi-cylindrical lenses in front of interlaced left and right images

VR Headsets
- A separate screen region is reserved for each eye, with a divider between them, placed very close to the eyes
- Lenses in the headset alter the light so that our crystalline lens can focus on a screen normally too close for our eyes

Parallax
http://paulbourke.net/stereographics/stereorender/

Stereo Pairs
- Toe-in (where the two cameras face inward toward a single point) is easy to implement
- Drawback: the projection planes differ, introducing vertical parallax toward the edges
- If the depth cues are not correct, one may become dominant, depth can be exaggerated, or the scene may become uncomfortable to view
http://paulbourke.net/stereographics/stereorender/

Stereo Pairs
- Off-axis - the projection planes are aligned, producing less stressful stereo
- Requires the use of a non-symmetric camera frustum
http://paulbourke.net/stereographics/stereorender/

Creating the Off-Axis Frustum
Calculate widthdiv2, half the width of the near plane, from the camera aperture:
  tan(theta) = opposite / adjacent
  tan(aperture / 2) = widthdiv2 / camera.near
  widthdiv2 = camera.near * tan(aperture / 2)
http://paulbourke.net/stereographics/stereorender/
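The off-axis frustum math above can be sketched as a small helper. This is a hedged reconstruction of the Paul Bourke approach the slides follow: `aperture` is the horizontal field of view in radians, `focalLength` is the zero-parallax distance, and the function and parameter names are illustrative rather than taken from the slides.

```csharp
using UnityEngine;

public static class OffAxisStereo
{
    // Computes the left/right frustum extents at the near plane for one eye.
    // eyeSign is -1 for the left eye, +1 for the right eye.
    public static void FrustumExtents(
        float aperture, float near, float focalLength,
        float eyeSeparation, int eyeSign,
        out float left, out float right)
    {
        // tan(aperture/2) = widthdiv2 / near  =>  widthdiv2 = near * tan(aperture/2)
        float widthDiv2 = near * Mathf.Tan(aperture / 2f);

        // Horizontal shift of the frustum at the near plane for this eye.
        float d = 0.5f * eyeSeparation * near / focalLength;

        // Shift the symmetric frustum sideways: left eye right, right eye left.
        left  = -widthDiv2 - eyeSign * d;
        right =  widthDiv2 - eyeSign * d;
    }
}
```

The resulting extents can be fed into an off-center projection matrix (e.g. Unity's Matrix4x4.Frustum) together with the unchanged bottom, top, near, and far values.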

Creating the Off-Axis Frustum
Calculate the frustum's left, right, bottom, top, near, and far planes. The horizontal shift of each eye's frustum at the near plane is:
  D = 0.5 * eyeSeparation * camera.near / focalLength
http://paulbourke.net/stereographics/stereorender/

Tracking & User Environment
- Tracking the user's pose and actions is important
- Pose sensors include the mechanical, ultrasonic, optical, and electromagnetic systems covered next

Tracking & User Environment
In a perfect world, motion tracking would be:
- Small and self-contained
- Complete (6 degrees of freedom)
- Accurate (< 1 mm and < 0.1 degrees)
- Fast (1 kHz)
- Occlusion-free
- Robust (to temperature, moisture, radio frequency)
- Unlimited in working area
- Wireless
- Cheap
Of course, this is impossible to achieve in the real world.

Mechanical Tracking
- Assumes a direct physical connection between the user and the measurement device
- Segments with joints for measuring angles
- High accuracy and sampling rate, but complex and introduces motion constraints

Ultrasonic Tracking
- Uses high-frequency sound to measure the distance between a speaker and a receiver
- In dry air at 20 degrees C, sound moves at 343.2 m/sec
- Length = c * travel time
- Three receivers are used in an XYZ configuration
- Greatest weaknesses are the changing speed of sound (with temperature, pressure, and humidity) and occlusion

Optical Tracking / Videometric
- Optical tracking can use active or passive markers, usually with infrared light
- The camera lens can be mathematically modeled, and a 3D point determined by triangulation
- Videometric tracking attaches the camera to the user, with fixed reference points around the room
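The ultrasonic distance calculation above, including the temperature dependence the slides call its greatest weakness, can be sketched as follows. The class and method names are illustrative; the temperature formula is the standard ideal-gas approximation, which reproduces the slide's 343.2 m/s at 20 degrees C.

```csharp
using System;

public static class Ultrasonic
{
    // Speed of sound in dry air as a function of temperature (Celsius).
    // 331.3 * sqrt(1 + T/273.15) gives ~343.2 m/s at 20 C, matching the slides.
    public static double SpeedOfSound(double tempC)
    {
        return 331.3 * Math.Sqrt(1.0 + tempC / 273.15);
    }

    // Length = c * travel time (one-way time of flight, in seconds).
    public static double Distance(double travelTimeSec, double tempC)
    {
        return SpeedOfSound(tempC) * travelTimeSec;
    }
}
```

An uncompensated 10-degree temperature swing changes the speed of sound by roughly 2%, which is why real systems either measure temperature or calibrate against a known distance.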

Optical Tracking / Videometric
[figure slide]

Electromagnetic Tracking
- Based on the local vector of the magnetic field
- Earth's magnetic field isn't accurate enough
- Uses a source and a sensor to determine a 6-DoF pose
- Sensors are compact, light, and cheap
- No ferromagnetic materials can be nearby
- Update frequency is limited by the generation of the magnetic fields

Considerations to Improve Presentation
- Ghosting - leakage of the left-eye image into the right eye and vice versa; high contrast increases ghosting
- Screen border - objects cut off by the screen border conflict with depth cues, since the border sits at zero parallax
- Occlusion by audience - someone in front of you creates conflicting depth cues
- Motion cues - animation enhances the understanding of depth
- Vertical structure
http://paulbourke.net/stereographics/stereorender/

Considerations to Improve Presentation
- Parallax/structure interference - the frequency of the geometry matches the parallax separation
- Noisy texture - high-frequency information gives little depth information
- Mirror reflections - we often cheat with reflections; with stereo this can create big problems
http://paulbourke.net/stereographics/stereorender/

Considerations to Improve Presentation
- Specular highlights - depend on camera position, and since we have two different cameras, each eye may see a different image
- Positive parallax - easier to look at, reducing eye strain
- Focal distance changes - quick changes cause eye strain
http://paulbourke.net/stereographics/stereorender/

Causes of Visual Fatigue in Stereoscopic Vision
- Conflict between accommodation and convergence: in artificial stereo we accommodate on the screen but converge at the depth of the object
- During adolescence, a link develops between accommodation and convergence
- If the disparity is too great, it can cause double vision
(Eyestrain Reduction in Stereoscopy)

VR in Unity - HTC Vive
- Enable Virtual Reality in your application
- Remove Oculus and choose OpenVR for the HTC Vive
- Steam and SteamVR are required for OpenVR applications to work on your computer

VR in Unity - HTC Vive
For people using Macs:
- Unity OpenVR requires Metal graphics and a 64-bit application; OpenGL is not supported
- Can work with macOS 10.11.6+, but is optimized for 10.13 High Sierra+

VR in Unity - HTC Vive
Controller anatomy: menu button, sensor, trackpad, trigger, system status light, grip, micro USB port.

VR in Unity - HTC Vive
- Go into your Input Manager
- Add 16 additional inputs (18 -> 34)
- Consult the Unity manual on the controller hardware that is exposed: https://docs.unity3d.com/manual/openvrcontrollers.html

VR in Unity - HTC Vive
- Example of input for the left and right controller menu buttons
- Note that to access joystick buttons, we use the syntax: joystick button X

VR in Unity - HTC Vive
- Example of input for the left trackpad's horizontal and vertical axes
- Note that axis 1 and axis 2 are labeled "X axis" and "Y axis" in Unity, respectively
- Note the Dead Zone: this is how far you need to move the trackpad before it starts returning non-zero values
- No trackpad/axis is perfect; set your dead zone to some value above zero, or you will get drift!

VR in Unity - HTC Vive
We have three classes to work with under the XR namespace:
- XR.XRSettings - global XR settings, such as: enabled, which device is loaded, supported devices
- XR.XRDevice - describes the XR device being used, such as: is present, refresh rate, model
- XR.InputTracking - handles data coming in from the tracking system, such as position and rotation
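The three XR classes just listed can be queried from any script. A minimal sketch (the class name `XRStatus` is my own; the properties are from the UnityEngine.XR API of the Unity version this course targets):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class XRStatus : MonoBehaviour
{
    void Start()
    {
        // Global settings: is XR enabled, and which device is loaded?
        Debug.Log("XR enabled: " + XRSettings.enabled);
        Debug.Log("Loaded device: " + XRSettings.loadedDeviceName);

        // Device-level information for the headset currently in use.
        Debug.Log("Device present: " + XRDevice.isPresent);
        Debug.Log("Refresh rate: " + XRDevice.refreshRate);
        Debug.Log("Model: " + XRDevice.model);
    }
}
```

Attach it to any GameObject and check the Console on play to confirm the headset is detected before debugging input.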

VR in Unity - HTC Vive
- At any time we could lose connection to our controllers due to Bluetooth issues, battery failure, or one being thrown through a monitor
- We need to continually check whether the controllers are available
- We can query the connected joysticks using: string[] joysticks = Input.GetJoystickNames();
- Vive controllers have specific names:
  - OpenVR Controller - Left
  - OpenVR Controller - Right
- Iterating through the array of joystick names and searching for these strings tells us whether they are connected
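The connection check above can be sketched as a small component. The exact joystick name strings vary between Unity and SteamVR versions, so this sketch matches on substrings rather than the full names given in the slides:

```csharp
using UnityEngine;

public class ViveControllerCheck : MonoBehaviour
{
    public bool LeftConnected  { get; private set; }
    public bool RightConnected { get; private set; }

    void Update()
    {
        // Controllers can drop out at any time, so poll every frame.
        LeftConnected = false;
        RightConnected = false;

        foreach (string name in Input.GetJoystickNames())
        {
            // e.g. "OpenVR Controller - Left" (exact text varies by version).
            if (name.Contains("OpenVR") && name.Contains("Left"))
                LeftConnected = true;
            if (name.Contains("OpenVR") && name.Contains("Right"))
                RightConnected = true;
        }
    }
}
```

Other scripts can read LeftConnected / RightConnected before querying buttons or axes, avoiding input from a controller that is no longer there.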

VR in Unity - HTC Vive
- We can query button presses as usual with Input.GetKeyDown and the JoystickButton virtual keycodes
- We can query axes based on the names we set up in the Input Manager

C# Delegates & Events
- Think of a delegate as a function pointer
- Created using the keyword delegate, followed by the signature of the method
- A delegate can hold a method and call it later
- It is the first step in creating events, which fire at the correct time and notify subscribers of the event
  public delegate int DelegateName(int x, float y);
- Notice that this looks just like a typical method signature

C# Delegates & Events
- Events work with delegates to fire when something important happens:
  public event DelegateType InputReceived;
- An event notifies everyone who has subscribed
- Only methods with the delegate's signature are allowed to subscribe to the event!
  public void HandleLTrackpadHorz() { /* do something here */ }

C# Delegates & Events
- We can add and remove methods from the event using the += and -= operators:
  OnLeftTrackpadHorz += HandleLTrackpadHorz;
  OnLeftTrackpadHorz -= HandleLTrackpadHorz;
- When the time comes, our event calls all the methods that have subscribed
- (Diagram: check if the event happened -> OnInputReceived fires -> all subscribers are told: Player.ReactToInput(), Movement.Reaction(), MenuSystem.DealWith())
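Putting the delegate, event, and subscription pieces together gives a pattern like the sketch below. The axis name "LeftTrackpadHorz" and both class names are illustrative placeholders for whatever you defined in your own Input Manager:

```csharp
using UnityEngine;

public class InputEvents : MonoBehaviour
{
    // The delegate defines the signature subscribers must match.
    public delegate void TrackpadHorz(float value);

    // The event fires when we detect input and notifies all subscribers.
    public static event TrackpadHorz OnLeftTrackpadHorz;

    void Update()
    {
        // Axis name as configured in the Input Manager (illustrative).
        float value = Input.GetAxis("LeftTrackpadHorz");

        // Only fire if someone has subscribed and we are outside the dead zone.
        if (OnLeftTrackpadHorz != null && value != 0f)
            OnLeftTrackpadHorz(value);
    }
}

public class Player : MonoBehaviour
{
    // Subscribe and unsubscribe so we never call into a destroyed object.
    void OnEnable()  { InputEvents.OnLeftTrackpadHorz += HandleLTrackpadHorz; }
    void OnDisable() { InputEvents.OnLeftTrackpadHorz -= HandleLTrackpadHorz; }

    void HandleLTrackpadHorz(float value)
    {
        // React to the trackpad input here.
    }
}
```

The null check before firing matters: invoking an event with no subscribers throws a NullReferenceException.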

C# Delegates & Events
- Make your delegates and events
- Poll for a change, check if anyone has subscribed, and fire!
- Somewhere else, create your method and register it

VR in Unity - HTC Vive
XRNode enumeration - the possible tracked objects you can query for information such as position and rotation:
- XRNode.LeftEye - left eye
- XRNode.RightEye - right eye
- XRNode.CenterEye - between the left and right eye
- XRNode.Head - user's head
- XRNode.LeftHand - left hand
- XRNode.RightHand - right hand
- XRNode.GameController - a controller not associated with a hand *
- XRNode.TrackingReference - a stationary physical device *
- XRNode.HardwareTracker - a device arbitrarily attached to other objects *
* These can have multiples in the scene and require calling InputTracking.GetNodeStates()
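Querying a pose for one of the XRNode values above looks like this sketch, which drives a GameObject from the left-hand controller (the class name `HandTracker` is my own):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class HandTracker : MonoBehaviour
{
    void Update()
    {
        // Query the tracking system for the left hand's pose this frame.
        Vector3 pos = InputTracking.GetLocalPosition(XRNode.LeftHand);
        Quaternion rot = InputTracking.GetLocalRotation(XRNode.LeftHand);

        // Apply it to this object so it follows the controller.
        transform.localPosition = pos;
        transform.localRotation = rot;
    }
}
```

Parenting the object under the same transform as the camera rig keeps the hand and head poses in a shared local space.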

VR in Unity - HTC Vive
We can query tracking data such as position and orientation.

VR in Unity - SteamVR
- The SteamVR plugin is also offered on the Unity Asset Store, and it provides many helpful scripts and hints at how to use VR successfully
- Using OpenVR/OpenXR properly means learning Valve's way of coding
- For instance, input is handled very differently than we have seen so far in Unity
- Note: what we are about to go over was released September 21st, 2018, so it is very new and buggy; even I can't get everything to work yet!

VR in Unity - SteamVR Actions
- Actions in SteamVR are a way to abstract away all the input possibilities
- Rather than thinking about the controller being depressed by X%, we think of the action we want to use, such as "Grab"
- There are six types of action input:
  1. Boolean - binary; either on or off, no in-between
  2. Single - analog values between 0 and 1, such as a throttle
  3. Vector2 - X and Y analog values, such as a touchpad
  4. Vector3 - not commonly used
  5. Pose - tracking your VR controllers
  6. Skeleton - estimates finger orientation
- One output option: vibration

VR in Unity - SteamVR Actions
- Actions are placed in groups called Action Sets
- Sets help allow the player to rebind your actions
- The SteamVR_ActivateSetOnLoad component turns action sets on and off based on the scene you are in; it activates in Start() and deactivates in OnDestroy()

VR in Unity - SteamVR Input Window
- Actions are stored in an actions.json file at your project's root
- The default set is active all the time; the device-specific sets are only active while using that device
- Pressing "Save & Generate" creates scriptable objects for each action so we can use them in inspectors

VR in Unity - SteamVR Input Window
- The first time your game is loaded, either your default bindings will be used or SteamVR will ask the user to create a binding or use a community binding
- Go into "Open Binding UI" to bind your controls; it takes you to a local web page to set up the binding


VR in Unity - SteamVR Input Window
- For example, I define my own action called "Walk", which is a vector2 type and is mandatory to play the game
- Create a new script and expose the action and the input source in the inspector
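A script for the "Walk" action described above might look like this sketch. It assumes the SteamVR plugin 2.0 action API (Valve.VR namespace); the class name, the 0.1 dead zone, and the movement logic are my own illustrative choices, and the action itself must be wired up in the inspector:

```csharp
using UnityEngine;
using Valve.VR; // SteamVR Unity plugin namespace

public class WalkInput : MonoBehaviour
{
    // Exposed in the inspector; bind these to the "Walk" action defined
    // in the SteamVR Input window.
    public SteamVR_Action_Vector2 walkAction;
    public SteamVR_Input_Sources inputSource = SteamVR_Input_Sources.LeftHand;

    void Update()
    {
        // Read the bound trackpad/thumbstick value as a Vector2.
        Vector2 walk = walkAction.GetAxis(inputSource);

        // Ignore tiny values so the player doesn't drift (illustrative threshold).
        if (walk.magnitude > 0.1f)
        {
            transform.Translate(new Vector3(walk.x, 0f, walk.y) * Time.deltaTime);
        }
    }
}
```

Because the script references the action asset rather than a raw axis, the user can rebind "Walk" to any control in the Binding UI without code changes.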
