Realtime 3D Computer Graphics Virtual Reality


Virtual Reality Input Devices

Special input devices are required for interaction, navigation, and motion tracking (e.g., for depth cue calculation):

1. Motion trackers: Position and orientation of a reference system in 3D require measuring 6 degrees of freedom (DOFs); a minimal pose sketch follows this list.
2. 3D mice/wands etc.: Specialized devices for point-and-click; WIMP(1)-style metaphors have to account for the additional DOFs.
3. Joint sensors: Sensors that measure the movement of the user's joints (also possible with trackers and inverse kinematics).
4. Props: Real placeholders for virtual objects.
5. Movement effect sensors: Measure the effect the user's movement has on the surroundings (no kinematics involved).
6. Skin sensors, neural interfaces, bio-sensors: Measure skin resistance, brain activity, and other body-related data.

...and hybrid devices.

(1) WIMP: Windows, Icons, Menu, Pointer
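The 6 DOFs in item 1 are simply three translational and three rotational values per tracker sample. A minimal sketch of such a pose record in Python (the name Pose6DOF and its fields are illustrative assumptions, not a particular tracker SDK); orientation is kept as a unit quaternion so rotations compose without gimbal lock:

from dataclasses import dataclass
import math

@dataclass
class Pose6DOF:
    x: float      # position in meters (3 translational DOFs)
    y: float
    z: float
    qw: float     # unit quaternion (3 rotational DOFs)
    qx: float
    qy: float
    qz: float

    def normalized(self) -> "Pose6DOF":
        # Re-normalize the quaternion; raw tracker samples accumulate numeric drift.
        n = math.sqrt(self.qw**2 + self.qx**2 + self.qy**2 + self.qz**2)
        return Pose6DOF(self.x, self.y, self.z,
                        self.qw / n, self.qx / n, self.qy / n, self.qz / n)

# Example: identity orientation, 1.2 m in front of the tracker origin.
sample = Pose6DOF(0.0, 0.0, 1.2, 1.0, 0.0, 0.0, 0.0).normalized()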

Input is measured by a multitude of physical and biological principles, e.g.:

1. electromagnetism
2. optics (marker-based/markerless, visible spectrum/infrared)
3. electrics (voltage, impedance, electrical flow, ...)
4. acoustics (ultrasound, ...)
5. inertia

Input devices produce data either discretely, event-based (buttons, state changers), or continuously (discrete values, but continuously sampled).
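That distinction shows up directly in the input loop: event-based devices are consumed from a queue, continuous devices are polled every frame. A small sketch with hypothetical stand-ins (read_tracker, update_scene), not a real device API:

import queue
import time

button_events = queue.Queue()                          # discrete, event-based input

def handle_event(name):
    print("event:", name)

def read_tracker():
    # Stand-in for a real driver call; returns the latest 6-DOF sample.
    return (0.0, 0.0, 1.2, 1.0, 0.0, 0.0, 0.0)

def update_scene(pose):
    pass                                               # placeholder for rendering/interaction

def frame_loop(frames=3, dt=1 / 60):
    for _ in range(frames):
        while not button_events.empty():               # drain queued button events
            handle_event(button_events.get_nowait())
        update_scene(read_tracker())                   # continuously sampled, once per frame
        time.sleep(dt)

button_events.put("wand_button_down")
frame_loop()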

Electromagnetic trackers used to be most common; see put-that-there (Bolt, 1980).
Transmitter: creates three orthogonal low-frequency magnetic fields. Short-range version: < 1 m; long-range version: < 3 m.
Receiver(s): three perpendicular antennas; distance is inferred from the currents induced in the antennas.
- Noisy: requires filtering.
- Affected by metal: requires non-linear calibration.
- Wireless versions are expensive.
Pictured: 6DOF magnetic tracker & DataGlove.
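Because the raw samples are noisy, a typical first step (before any non-linear metal calibration) is to low-pass filter them. A minimal sketch of exponential smoothing of the position, assuming a hypothetical read_position() driver call:

def smooth_positions(read_position, alpha=0.2):
    # alpha in (0, 1]: smaller alpha = heavier smoothing, but more lag.
    filtered = None
    while True:
        x, y, z = read_position()                      # raw, noisy sample
        if filtered is None:
            filtered = [x, y, z]
        else:
            filtered = [alpha * raw + (1 - alpha) * old
                        for raw, old in zip((x, y, z), filtered)]
        yield tuple(filtered)

# Usage: for pos in smooth_positions(my_driver.read_position): ...   (my_driver is hypothetical)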

Acoustic trackers: use ultrasound. Typical setup for 3 DOF: 3 microphones and 1 speaker; distance is inferred from the travel time of the sound (a trilateration sketch follows this slide).
+ No interference with metal
+ Relatively inexpensive
- Line-of-sight issues
- Sensitive to air temperature and certain noises
Example: Logitech Fly Mouse.

Hybrid trackers (e.g., InterSense IS-600/900): inertial (orientation) + acoustic (position).

Inertial trackers (e.g., InterSense IS-300): use gyroscopes and accelerometers.
+ Less noise, less lag
- Only 3 DOFs (orientation)

Optical marker-based trackers: markers reflect IR light and are combined into a unique spatial configuration per tracked position.
+ No interference with metal
+ Low latency
+ High resolution
- Line-of-sight issues (more cameras help)
Example: 6DOF optical tracker by ART.
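Each travel time t gives a speaker-to-microphone distance d = c * t, with c of roughly 343 m/s at room temperature (which is why air temperature matters), and intersecting the three resulting spheres yields the 3D position. A rough trilateration sketch; the microphone layout (all in the z = 0 plane) is an assumption chosen for illustration:

import math

SPEED_OF_SOUND = 343.0    # m/s at about 20 degrees C; drifts with air temperature

def acoustic_position(t1, t2, t3, U=0.5, Vx=0.25, Vy=0.5):
    # t1..t3: travel times (s) to microphones at (0,0,0), (U,0,0), (Vx,Vy,0).
    d1, d2, d3 = (SPEED_OF_SOUND * t for t in (t1, t2, t3))
    x = (d1**2 - d2**2 + U**2) / (2 * U)
    y = (d1**2 - d3**2 + Vx**2 + Vy**2 - 2 * Vx * x) / (2 * Vy)
    z = math.sqrt(max(d1**2 - x**2 - y**2, 0.0))   # pick the half-space in front of the mics
    return x, y, z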

3D mice/wands: several buttons and sensors for the selection of binary states and/or continuous state changes (e.g., potentiometers). Often hybrid devices with additional position/orientation tracking. Examples: tracked wand, ring mouse, SpaceOrb.

CubicMouse(TM): the first 12-DOF input device; tracks position and rotation of rods using potentiometers. Other shapes and implementations are possible (e.g., Mini Cubic Mouse). Pictures courtesy of IMK, Fraunhofer Gesellschaft.
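Reading such a potentiometer rod comes down to mapping a raw ADC value onto the rod's mechanical travel. A tiny sketch; the 10-bit ADC range and the 10 cm travel are illustrative assumptions, not CubicMouse specifications:

def rod_displacement(adc_value, adc_max=1023, travel_m=0.10):
    # Normalize the ADC reading, then center it at the rod's rest position.
    normalized = adc_value / adc_max           # 0.0 .. 1.0 across the full travel
    return (normalized - 0.5) * travel_m       # signed displacement in meters

print(rod_displacement(768))    # about +0.025 m from the rest position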

Data gloves: used to track the user's finger movements, for posture and gesture detection (a toy posture sketch follows this slide). Almost always used with a tracker sensor mounted on the wrist. Common types: 5DT Glove (left), 5/16 sensors; CyberGlove (right), 18/22 sensors, here with a hybrid modification for flexion and pinch.

Body suits: used to track the overall body movement. Angles are measured by resistance or by inverse kinematics based on certain body points. Example: wireless suit (Ascension Technology): 20 sensors per suit, 100 updates/sec, 3 m range from the base unit, resolution < 2 mm and < 0.2 degrees, electronic unit with 2 hours of battery life.
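Posture detection on such gloves is often no more than thresholding the per-finger flexion values. A toy sketch over five normalized readings (0.0 = finger straight, 1.0 = fully bent); names and thresholds are illustrative, not a real glove SDK:

def classify_posture(flex, bent=0.7, straight=0.3):
    # flex: (thumb, index, middle, ring, pinky), each normalized to 0.0 .. 1.0.
    thumb, index, middle, ring, pinky = flex
    if all(f > bent for f in flex):
        return "fist"
    if index < straight and all(f > bent for f in (middle, ring, pinky)):
        return "point"
    if thumb > bent and index > bent and all(f < straight for f in (middle, ring, pinky)):
        return "pinch"
    return "unknown"

print(classify_posture([0.2, 0.1, 0.9, 0.9, 0.8]))    # -> point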

Props and locomotion devices: head-prop (courtesy of Hinckley et al.), CyberGlove with haptics, ShapeTape prop (courtesy of Balakrishnan et al.), treadmill types (e.g., bicycles).

Speech input can complement all of the preceding devices. Issues: continuous vs. one-time recognition, choice and placement of the microphone, training vs. no training, handling of false-positive recognition, interference from surrounding noise. This enables multi-modal interaction (e.g., by additionally including gesture processing, which benefits from the VR sensory equipment).
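The classic multi-modal example is Bolt's put-that-there: a spoken command is only resolved once it is fused with the pointing pose current at the time of each deictic word. A schematic sketch; next_word, current_pointing_ray, pick_object, and pick_location are hypothetical stand-ins, not real APIs:

def put_that_there(next_word, current_pointing_ray, pick_object, pick_location):
    # In the spirit of put-that-there (Bolt, 1980): fuse speech with pointing.
    obj = target = None
    for word in iter(next_word, None):                  # consume recognized words until None
        if word == "that":
            obj = pick_object(current_pointing_ray())   # deictic "that" -> pointed-at object
        elif word == "there":
            target = pick_location(current_pointing_ray())  # deictic "there" -> pointed-at spot
        if obj is not None and target is not None:
            return obj, target                          # both referents resolved
    return None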

Fiction: Interaction

The Ultimate Display

The Ultimate Display: "The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be the Wonderland into which Alice walked." (Sutherland 1965)

Fiction: An interface for the Ultimate Display?

Fiction: Results of physical contact

References

Bolt, R. A. (1980): "Put-That-There": Voice and Gesture at the Graphics Interface. In: Computer Graphics 14(3), pp. 262-270.
Bühl, Achim (1997): Die virtuelle Gesellschaft. Politik, Ökonomie und Kultur im Zeichen des Cyberspace. In: Gräf, Lorenz / Krajewski, Markus (eds.): Soziologie des Internet. Handeln im elektronischen Web-Werk. Frankfurt/M. / New York: Campus, pp. 39-59.
Gibson, William (1984): Neuromancer (first printing).
Gibson, William (1999): Neuromancer, 9th ed., München: Heyne.
Okoshi, T. (1976): Three-Dimensional Imaging Techniques. Academic Press, New York.
Peters, G. (2000): Theories of Three-Dimensional Object Perception - A Survey. In: Recent Research Developments in Pattern Recognition, Transworld Research Network.
Sutherland, I. E. (1968): A Head-Mounted Three-Dimensional Display. In: AFIPS Conference Proceedings, Vol. 33, Part I, pp. 757-764.