Intro to Virtual Reality (Cont)

Lecture 37: Intro to Virtual Reality (Cont) Computer Graphics and Imaging UC Berkeley CS184/284A

Overview of VR Topics. Areas we will discuss over the next few lectures: VR Displays, VR Rendering, VR Imaging. CS184/284A Ren Ng

Display Requirements Derive From Human Perception. Example 3: Binocular Stereo and Eye Focus (Accommodation)

Two Eyes: Two Views CS184/284A Charles Wheatstone stereoscope, 1838 Ren Ng

Recall: Current VR HMD Optical Design Image credit: ifixit.com https://www.ifixit.com/teardown/oculus+rift+cv1+teardown/60612

Stereo Vergence


Stereo, Passive (no tracking of eyes). Present each eye with the perspective view corresponding to that eye's location relative to the other eye. The eyes will converge by rotating physically in their sockets in order to bring nearer and farther objects into alignment on the retina. CS184/284A Ren Ng
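The per-eye viewpoints can be sketched as two view matrices offset by half the interpupillary distance. This is a minimal sketch, not the course's code; the 64 mm default IPD and the choice of the head-space x axis as the eye axis are illustrative assumptions:

```python
import numpy as np

def eye_view_matrices(head_view, ipd=0.064):
    """Build per-eye view matrices by offsetting the head's 4x4
    world-to-head view matrix by half the interpupillary distance
    (IPD) along the head-space x axis. The 0.064 m default is an
    illustrative average adult IPD."""
    def translate_x(dx):
        t = np.eye(4)
        t[0, 3] = dx
        return t
    # Shifting the view left/right in head space yields the two
    # perspective viewpoints the slide describes, one per eye.
    left = translate_x(+ipd / 2) @ head_view
    right = translate_x(-ipd / 2) @ head_view
    return left, right
```

Rendering the scene once with each matrix produces the stereo pair; the eyes then converge on their own, with no eye tracking required.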

Human Eye Muscles and Optical Controls Slide credit: Gordon Wetzstein

Human Eye Muscles and Optical Controls. Accommodation range (near focus to far focus): at 16 years, ~8 cm to ∞; at 50 years, ~50 cm to ∞. adithyakiran.wordpress.com Slide credit: Gordon Wetzstein

Accommodation and Vergence. Accommodation: changing the optical power of the eye's lens to focus at different distances (eye accommodated to focus on a distant object vs. on a nearby object). Vergence: rotation of the eyes in their sockets to ensure the projection of an object is centered on the retina.

Accommodation-Vergence Conflict. Given the design of current VR displays, consider what happens when an object is up close to the eye in the virtual scene. The eyes must remain accommodated to a far distance (otherwise the image on the screen won't be in focus), but they must converge in an attempt to fuse the stereoscopic images of the nearby object. The brain receives conflicting depth cues (discomfort, fatigue, nausea). This problem stems from the nature of the display design: if you could make a display that emits the light field that would be produced by the virtual scene, you could avoid the accommodation-vergence conflict.
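The mismatch can be made concrete with a small vergence-angle calculation. This is a sketch; the 64 mm IPD and the ~2 m focal distance of the headset optics are illustrative assumptions, not specifications of any particular headset:

```python
import math

def vergence_angle_deg(distance_m, ipd=0.064):
    """Vergence angle (in degrees) between the two eyes' lines of
    sight when fixating a point at distance_m. The 0.064 m IPD
    default is an illustrative average."""
    return math.degrees(2 * math.atan((ipd / 2) / distance_m))

# Suppose the headset optics fix accommodation at ~2 m, while a
# virtual object sits at 0.3 m: the eyes must converge for 0.3 m
# but stay accommodated for 2 m -- the conflict described above.
far_angle = vergence_angle_deg(2.0)   # vergence matching accommodation
near_angle = vergence_angle_deg(0.3)  # vergence demanded by the object
```

With these assumed numbers the eyes verge roughly 1.8 degrees for the 2 m focal plane but over 12 degrees for the 0.3 m object, while accommodation cannot follow.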

Aside: Research on Near-Eye Light Field Displays Goal: recreate light field in front of eye Lanman and Luebke, SIGGRAPH Asia 2013.

Display Requirements Derive From Human Perception Example: Motion Parallax from Eye Motion

The 5D Plenoptic Function P(x, y, z, θ, φ): 3D position (x, y, z), 2D direction (θ, φ). [Adelson, Bergen 1991] CS184/284A Ren Ng

Discussion: How to Track Head Position for VR? We need to track the 3D position and orientation of the head and eyes to render the left/right viewpoints correctly. High positional accuracy is needed (e.g. 1 mm), because the user can move very close to objects and very precisely relative to them, and the rendering needs to reflect this view. Discussion: ideas on how to track the position and orientation of a VR headset? CS184/284A Ren Ng

Google Cardboard: Tracking Using Headset Camera. Tracking uses the gyro / rear-facing camera to estimate the user's viewpoint. Rotation tracking generally works well; 3D positional tracking is a challenge in general environments. CS184/284A Ren Ng
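Rotation tracking from the gyro amounts to integrating angular velocity over time. A minimal quaternion dead-reckoning step might look like the following; this is a sketch, not Cardboard's actual implementation, and real trackers fuse accelerometer/camera data to fight the drift that pure integration accumulates:

```python
import numpy as np

def integrate_gyro(q, omega, dt):
    """One dead-reckoning step of orientation from a gyro sample.
    q: orientation quaternion (w, x, y, z); omega: angular velocity
    (rad/s) in the body frame; dt: timestep in seconds."""
    w, x, y, z = q
    ox, oy, oz = omega
    # Quaternion kinematics: q_dot = 0.5 * q * (0, omega)
    dq = 0.5 * np.array([
        -x * ox - y * oy - z * oz,
         w * ox + y * oz - z * oy,
         w * oy - x * oz + z * ox,
         w * oz + x * oy - y * ox,
    ])
    q_new = np.array(q) + dq * dt
    return q_new / np.linalg.norm(q_new)  # renormalize each step
```

Calling this at the IMU rate (e.g. hundreds of Hz) keeps the per-step integration error small, but any gyro bias still integrates into unbounded drift, which is why positional and long-term rotational tracking need the camera.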

Environment-Supported Vision-Based Tracking? Image credit: gizmodo.com Early VR test room at Valve, with markers positioned throughout environment

Oculus Rift IR LED Tracking System Oculus Rift + IR LED sensor

Oculus Rift LED Tracking System (DK2). The headset contains: 40 IR LEDs; gyro + accelerometer (1000 Hz); external 60 Hz IR camera. Image credit: ifixit.com. Photo taken with an IR-sensitive camera (the IR LEDs are not visible in real life).

Oculus Rift IR LED Tracking Hardware Photo taken with IR-sensitive camera https://www.ifixit.com/teardown/oculus+rift+constellation+teardown/61128

Oculus Rift IR Camera IR filter (blocks visible spectrum) Camera lens CMOS sensor Note: silicon is sensitive to visible and IR wavelengths https://www.ifixit.com/teardown/oculus+rift+constellation+teardown/61128

Recall: Passive Optical Motion Capture. Retroreflective markers are attached to the subject, with IR illumination and cameras; marker positions are recovered by triangulation from multiple cameras (8+ cameras, 240 Hz; occlusions are difficult). Slide credit: Steve Marschner

Active Optical Motion Capture. Each LED marker emits a unique blinking pattern (its ID), which reduces marker ambiguities / unintended swapping, at the cost of some lag to acquire the marker IDs. Phoenix Technologies, PhaseSpace

Oculus Rift Uses Active-Marker Motion Capture. Credit: Oliver Kreylos, https://www.youtube.com/watch?v=o7dt9im34oi Motion capture: unknown shape, multiple cameras. VR head tracking: known shape, single camera.

6 DOF Head Pose Estimation. Head pose has 6 degrees of freedom (the unknowns): the 3D position and 3D rotation of the headset (e.g. can be represented as a 4x4 matrix).
Inputs:
- Fixed: relative 3D positions of the markers on the headset (e.g. each marker offset can be represented as a 4x4 matrix)
- Fixed: camera viewpoint (ignoring distortion, also a 4x4 projective mapping of the 3D scene to the 2D image)
- Each frame: the 2D position of each headset marker in the image
Pose calculation:
- Write down equations mapping each marker to its image pixel location as a function of the 6 degrees of freedom
- Solve for the 6 degrees of freedom (e.g. by least squares)
CS184/284A Ren Ng
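The pose calculation above can be sketched numerically: write the pinhole projection of each known marker as a function of the six pose parameters, then minimize the reprojection error by least squares. This is a sketch with an assumed ideal pinhole camera (no distortion); the focal length of 1.0 and the initial guess are illustrative, not Oculus's actual solver:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def estimate_pose(markers_3d, observed_2d, focal=1.0):
    """Recover 6-DOF pose (rotation vector + translation) of a rigid
    marker constellation from one camera frame.
    markers_3d: (N, 3) marker offsets in the headset frame.
    observed_2d: (N, 2) marker positions in the image."""
    def residuals(pose):
        rvec, t = pose[:3], pose[3:]
        # Transform markers into camera space, then project (pinhole).
        pts = Rotation.from_rotvec(rvec).apply(markers_3d) + t
        proj = focal * pts[:, :2] / pts[:, 2:3]
        return (proj - observed_2d).ravel()  # reprojection error

    # Start from a guess in front of the camera and refine.
    x0 = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 2.0])
    return least_squares(residuals, x0).x  # (rx, ry, rz, tx, ty, tz)
```

Each marker contributes two equations (its x and y pixel coordinates), so a handful of non-coplanar markers over-determines the six unknowns, which is exactly why the Rift can get away with a single camera once the marker layout is known.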

HTC Vive Tracking System ( Lighthouse ) Structured light transmitter Photodiode arrays on headset and hand-held controllers

Vive Headset & Controllers Have Array of IR Photodiodes IR photodiode Image credit: uploadvr.com (Prototype) Headset and controller are covered with IR photodiodes

HTC Vive Structured Light Emitter ( Lighthouse ) Light emitter contains array of LEDs (white) and two spinning wheels with lasers Sequence of LED flash and laser sweeps provide structured lighting throughout room Credit: Gizmodo: http://gizmodo.com/this-is-how-valve-s-amazing-lighthouse-tracking-technol-1705356768


HTC Vive Tracking System. For each frame, the lighthouse does the following:
- LED pulse, followed by a horizontal laser sweep
- LED pulse, followed by a vertical laser sweep
Each photodiode on the headset measures the time offset between the pulse and the laser's arrival, which determines its x and y offset in the lighthouse's field of view. In effect, we obtain an image containing the 2D location of each photodiode in the world (you can think of the lighthouse as a "virtual camera"). CS184/284A Ren Ng
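The timing-to-angle conversion at the heart of this scheme fits in a few lines. A sketch under stated assumptions: the 60 Hz rotor speed is an illustrative value, not a Vive specification:

```python
def sweep_angle_deg(dt_seconds, rotations_per_second=60.0):
    """Convert the measured delay between the lighthouse's sync
    pulse and the laser sweep hitting a photodiode into an angular
    position within the sweep. A rotor spinning at
    rotations_per_second covers 360 * rotations_per_second degrees
    each second, so the delay maps linearly to an angle."""
    return 360.0 * rotations_per_second * dt_seconds

# Doing this once for the horizontal sweep and once for the
# vertical sweep gives each photodiode an (x, y) angular position,
# i.e. one sample of the lighthouse's "virtual camera" image.
```

Microsecond-level timing is what makes this precise: at 60 rotations per second, one microsecond of delay corresponds to about 0.02 degrees of sweep.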

HTC Vive Tracking System ( Lighthouse ) Credit: rvdm88 / youtube. https://www.youtube.com/watch?v=j54dottt7k0

Tracking Summary. We looked at three tracking methods:
- Camera on headset + computer vision + gyro
- External camera + marker array on headset
- External structured light + sensor array on headset
3D tracking and depth sensing are an active research area: SLAM, PTAM, DTAM; Microsoft HoloLens, Google Tango, Intel RealSense, ... CS184/284A Ren Ng

Acknowledgments Thanks to Kayvon Fatahalian, Alyosha Efros and Brian Wandell for lecture resources and slides!