Virtual Environments: Tracking and Interaction


1 Virtual Environments: Tracking and Interaction Simon Julier, Department of Computer Science, University College London Outline Problem Statement: Models of Interaction Tracking: Requirements; Tracking Systems: Hardware; Sources of errors Interaction: Basic interaction; Locomotion; Selection & Manipulation

2 Tracking and Interaction (diagram: User - User Interface Devices - Computer - Synthetic Environment, situated within the Real Environment; tracking and interaction happen at the user interface devices) Basic Interaction Tasks Locomotion or Travel: how to effect movement through the space. Selection: how to indicate an object of interest. Manipulation: how to move an object of interest. Symbolic: how to enter text and other parameters Models of Interaction Extended Desktop Model: the user needs tools to do 3D tasks. Virtual Reality Model: the user is using their body as an interface to the world, and the system responds to everything they do or say

3 Extended Desktop Model Focus on analysing a task and creating devices that fit the task. Study the ergonomics of the device and its applicability/suitability for the role Limits of ED Model 3D tasks are quite complicated to perform. Tasks can become very specialised, leading to a proliferation of real (and virtual) devices Fakespace Cubic Mouse Types of Device 3DConnexion Spacemouse Polhemus Isotrak 3-Ball Logitech 3D Mouse Ascension Wanda 3DConnexion Spaceball Inition 3DiStick

4 Virtual Reality Model Need to track the user precisely and interpret what they do. Focus is on users exploring the environment. Tension between magical and mundane responses of the environment: mundane responses are where the world responds as if it were controlled by the laws of physics; magical responses are everything else (casting spells, automatic doors, etc.) Limits of VR Model Can't track the user over very large areas, so some form of locomotion metaphor will be required for long-distance travel (see later). Physical constraints of systems: limited precision and tracking points; lack of physical force feedback Tracking System Problem Statement: Models of Interaction Tracking Requirements Tracking Systems: Hardware Sources of errors Interaction: Basic interaction Locomotion Selection & Manipulation

5 Connection Between Interaction and Tracking Irrespective of the interaction model, the user must be instrumented in some way to convey information to the system. This is carried out using the tracking system Requirements for Trackers Resolution: the ability to detect small changes in the system. Accuracy: the size of the range of correct positions reported by the system. Sample Rate: the frequency at which the sensors are checked for new data; the sampling rate must be greater than the data rate. Data Rate: the number of computed positions per second; the higher the rate, the more desirable the system Requirements for Trackers Update rate: the rate at which new positions are reported to the host computer. Lag: the delay between a movement being made and the new position being reported. Range of operation: the area/range/volume in which the tracker can accurately report positions (e.g. the distance, the height). This is determined by wire length, signal strength, etc.

6 Requirements for Trackers Robustness: the ability of the tracker to cope with uncertainty and noise (e.g. water, metal, keys). Fitness for tracking multiple objects: the ability to independently determine the positions of multiple objects, which is determined by the design of the system architecture, and the ability to cope with interference between tracked objects (for example, one sensor occluding another) Types of Tracking Technology Many types of tracker are available, from ultrasonic consumer devices ($10s) through to very precise mechanical trackers ($100,000s). Not all trackers are suited to all applications, e.g. mechanical trackers aren't well suited to CAVEs, since you can see the device. Cost is still a big problem if you want to track at a fine enough scale for head-tracked virtual reality The Ideal Tracker A magical, ideal tracker would have these characteristics: Tiny (transistor size) Self-Contained Complete (6 DoF) Accurate (1mm position, 0.1 degree orientation) Fast (1000Hz, <1ms latency) Immune to occlusions (no line-of-sight requirement) Robust (no interference) No range limitation Cheap

7 Tracking Technologies 5 main types: mechanical, inertial, acoustic, optical, magnetic. Most can be classed as outside-in or inside-out. Outside-In: the user emits a signal that indicates their location to sensors in the environment. Inside-Out: the environment emits signals which sensors on the user use to determine their location Mechanical Trackers First & simplest systems. Use prior knowledge of rigid mechanical linkages and measurements from sensors. Typically boom-type tracked displays with counterweights Mechanical Trackers Some example systems

8 Mechanical Trackers Pros: Accurate Low latency Force-feedback No line-of-sight or magnetic interference problems Cons: Large & cumbersome Limited range Inertial Trackers 3 linear accelerometers measure the acceleration vector, rotated using the current rotation matrix (orientation) determined by gyroscopes Inertial Trackers Pros: Small (chip form), self-contained Immune to occlusions No interference Low latency (typically <2ms) High sample rate Cons: Drift is the show stopper An accelerometer bias of 1 milli-g gives 4.5m drift after 30s Close, but no silver bullet High potential as part of hybrid systems
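The drift figure quoted above can be checked with a quick double integration: a constant accelerometer bias a, integrated twice, produces a position error of ½·a·t². A minimal sketch (the numbers are the slide's own):

```python
# Position drift from a constant accelerometer bias: integrating a
# constant acceleration error twice gives d = 0.5 * bias * t**2.
G = 9.81                  # m/s^2
bias = 1e-3 * G           # 1 milli-g accelerometer bias
t = 30.0                  # seconds of unaided inertial tracking
drift = 0.5 * bias * t ** 2
print(f"drift after {t:.0f}s: {drift:.1f} m")  # ~4.4 m, close to the slide's ~4.5 m
```

This is why even very good MEMS accelerometers cannot track position alone for more than a few seconds, and why inertial sensing is usually paired with an absolute reference in hybrid systems.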

9 Acoustic Trackers Use sound waves for transmission and sensing, with pulses emitted at intervals. SONAR is the best-known example, determining the time of flight of a pulse. Use ultrasound. Outside-In (microphone sensors) (Logitech Acoustic Tracker) (Samba De Amigo Maracas) Acoustic Trackers Pros: Very small, so can be worn Line of sight less of an issue than with optical systems Better range than mechanical systems Cons: Size proportional to range Environmental considerations (temperature, humidity) Acoustic issues can cause a slow update rate (10Hz) (5-100ms) Attenuation at the desirable high frequencies (which reduce interference) Jingling of keys Magnetic Trackers Measure changes in the magnetic field. Can be done by magnetometers (for DC) or by induced current in an electromagnetic field (for AC). 3 orthogonally arranged sensors will produce a 3D vector. In tracking, each coil of a multi-coil source unit is energised (excited) in turn, and the measured responses yield position and orientation. A compass, which uses the earth's naturally occurring DC magnetic field to determine heading, can be used here (Ascension spacepad)
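The acoustic principle above reduces to two steps: convert a pulse's time of flight into a distance, then combine distances to several microphones into a position. A hedged sketch, with an invented microphone layout (mics at (0,0), (1,0) and (0,1) metres) chosen so the algebra stays simple:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at ~20 C; temperature-dependent, hence the
                        # environmental sensitivity noted on the slide

def tof_to_distance(t_seconds):
    """One-way time of flight from ultrasonic emitter to microphone."""
    return SPEED_OF_SOUND * t_seconds

def trilaterate_2d(d1, d2, d3):
    """Position of the emitter from distances to mics fixed at
    (0,0), (1,0) and (0,1): subtracting the circle equations
    pairwise leaves two linear equations in x and y."""
    x = (d1 ** 2 - d2 ** 2 + 1.0) / 2.0
    y = (d1 ** 2 - d3 ** 2 + 1.0) / 2.0
    return x, y

# Simulate an emitter at (0.3, 0.4) and recover it from the distances.
d1 = math.hypot(0.3, 0.4)
d2 = math.hypot(0.3 - 1.0, 0.4)
d3 = math.hypot(0.3, 0.4 - 1.0)
x, y = trilaterate_2d(d1, d2, d3)
```

A real system solves the 3D version with more microphones and a least-squares fit, since each measured time of flight carries noise.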

10 Magnetic Trackers Pros: User-worn component is small No line-of-sight issues (magnetic fields go through us) One source unit can excite many sensor units Very low latency (~5ms) Ability to track multiple users using a single source unit Cons: Field distortions (foreign objects, the natural magnetic field) require some compensation Jingling of keys (or anything magnetically conductive) Need to wait for the energised excitation of one coil to subside before the next, so the update rate is slow Jitter increases with distance between emitter and sensor Optical Trackers Measure reflected or emitted light. Involve a source (active or passive) and a sensor. Sensors can be analogue or digital: photo sensing (light intensity) or image forming (CCD). Triangulation with multiple sensors. Possible to be both outside-in and inside-out Optical Trackers Pros: Analogue sensors with an active light source give high update rates and spatial precision Passive sensing with image-forming sensors can be used in an unmodified environment Image-forming sensors provide closed-loop feedback between the real environment and the tracker Cons: Line of sight is critical Target's orientation is harder to determine
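The triangulation mentioned for optical trackers has a particularly simple form when the two sensors are rectified stereo cameras: depth is focal length times baseline divided by disparity. A sketch with illustrative numbers (focal length, baseline and disparity are invented for the example):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a tracked feature seen by two horizontally offset
    image-forming sensors: z = f * b / d (rectified stereo geometry)."""
    return focal_px * baseline_m / disparity_px

# A marker imaged 20 pixels apart by two cameras 10 cm apart (f = 800 px):
z = stereo_depth(800.0, 0.10, 20.0)   # 4.0 m from the camera pair
```

The formula also shows why line of sight is critical: with no disparity measurement there is no depth, and why precision falls with range (a fixed pixel error corresponds to a larger depth error as disparity shrinks).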

11 Hybrid Trackers No single solution suits all applications. Many different approaches, each with advantages and limitations. Can address the limitations by building hybrid systems which combine the advantages of each approach. Inertial sensors have provided the basis for several successful hybrid systems due to their advantages. For example, the VisTracker uses an opto-inertial hybrid Hybrid Tracking Algorithms Hybrid tracking is an example of a data fusion algorithm: information from a set of disparate modalities is fused together to provide a consistent estimate. The most common implementation is to use a Kalman filter Kalman Filtering The Kalman filter is a recursive minimum mean squared error estimator. It uses a predict-update cycle: Initialize Predict Update This makes it possible to combine many types of information in an asynchronous manner
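The predict-update cycle can be sketched in one dimension. This is a minimal scalar Kalman filter with a constant-position model; the tuning values q and r are illustrative, not taken from any real tracker:

```python
class Kalman1D:
    """Minimal scalar Kalman filter. predict() grows the uncertainty p
    by the process noise q; update() fuses a measurement z of
    variance r, weighted by the Kalman gain."""
    def __init__(self, x0, p0, q, r):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def predict(self):
        self.p += self.q                 # state unchanged, confidence decays
        return self.x

    def update(self, z):
        k = self.p / (self.p + self.r)   # Kalman gain: trust in measurement
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Predict every frame; update only when a (noisy) sensor reports.
kf = Kalman1D(x0=0.0, p0=1.0, q=0.01, r=0.1)
for z in (0.9, 1.1, 1.0):
    kf.predict()
    estimate = kf.update(z)
```

The asynchrony mentioned on the slide falls out naturally: predict() can run at the display rate while update() is called only when a camera or ultrasonic measurement actually arrives.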

12 Fusing Multiple Measurements (timeline figure: camera updates at t, t+50ms and t+100ms, interleaved with inertial predictions from a motion model) Hybrid Trackers InterSense IS-900 Tracking system for VR walkthrough applications. An inertial (orientation & position) & ultrasonic (drift correction) hybrid tracker with highly accurate 6 degree of freedom tracking over a wide area. Features fast updates, low latency, filtering to reduce jitter, and advanced prediction algorithms to reduce latency: very smooth and precise. Four tracked devices: a head tracker, a hand tracker, a wand (with four buttons and an integrated joystick), and a stylus (with two buttons) Tracking Errors Static (tracked object): Misalignment Spatial Distortion (inaccuracy) Spatial Jitter (noise) Creep Dynamic (tracked object): Lag (time delay; tracker + subsystems in a complex relation) Latency Jitter (variations in latency) Dynamic Errors (other inaccuracies, e.g. from prediction algorithms)

13 Tracking errors of < 1 degree are noticeable Misalignment Referentials: W: world; B: base (referential) of tracker; S: sensor of tracker; M: display (manipulator). Transformation (pose) AB: the transformation that maps referential A into B, i.e. the pose of B with respect to A, expressed as a 4x4 homogeneous transformation matrix. AB = (BA)^-1 and AB = AC.CB, so WM = WB.BS.SM Spatial Distortion Repeatable errors at different poses in the tracking volume. Many factors, including incorrect calibration and persistent environmental disturbances
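The chain WM = WB.BS.SM can be sketched with 4x4 homogeneous matrices. Pure translations are used here so the result is easy to check by eye; a real pose would also carry a rotation in the upper-left 3x3 block, and the distances are invented for the example:

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transformation matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """Homogeneous transform carrying only a translation."""
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

WB = translation(1, 0, 0)   # pose of the tracker base in the world
BS = translation(0, 2, 0)   # sensor reading: pose of sensor in base frame
SM = translation(0, 0, 3)   # calibrated offset of display from sensor
WM = matmul4(matmul4(WB, BS), SM)   # pose of the display in the world
```

Misalignment errors arise when any link in this chain (typically the calibrated WB or SM) is wrong, and the composition carries the error through to the displayed image.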

14 Spatial Jitter Caused by noise in the sensors. Even with the same noise on the sensors, the jitter on pose estimates can change with the pose. Hybrid sensors can improve performance. A General Method for Comparing the Expected Performance of Tracking and Motion Capture Systems - Danette Allen, Greg Welch Creep Slow but steady changes in tracker output over time. Caused by temperature drift or other similar start-up transients in a system. Evaluation of a Solid State Gyroscope for Robotics Applications - Barshan and Durrant-Whyte (figure: measurements from a stationary gyro) System Latency Mine, M. Characterization of end-to-end delays in head-mounted displays. Tech. Rep. TR93-001, Department of Computer Science, The University of North Carolina, 1993. Definition: End-to-end delay: the total time required for the image displayed by the HMD to change in response to the movement of the user's head.

15 Delays in HMD Pipeline Tracking system: physical sensing, filtering on the tracking device, transmission delays (RS232, Ethernet, etc.) Application delay: collision detection, interaction events, etc. Image generation: at roughly the display refresh rate. Display system: time taken to transfer and display an image (pipeline figure from Mine (1993)) Measuring delay Mine constructed a system to measure delay in HMD systems, with measurements at several points in the pipeline: Tracking Application Image generation Display (timing diagram: T_start, T_report, T_display, T_display + 17ms)

16 Measuring delay (results from Mine (1993)) Tracking delays: the best had delays of ~10ms, the worst ~60ms; more tracked units imply a longer delay. Application/Image generation: 55ms on average, although the application delay itself was minimal. Display system delay: NTSC has a delay of 16.67ms Tracking Summary Quite a complex and challenging problem, with no real ideal solution. Several tracking technologies exist with different levels of suitability based on the application in question. All of the technologies have both pros and cons. The ultimate tracker will probably not be developed from a single technology, but as a hybrid of these technologies. A VR application should provide the following: High data rates for accurate mapping without lag High tolerance to environmentally induced errors Consistent registration between physical and virtual environments Good sociability so that multiple users can move freely

17 Interaction Problem Statement: Models of Interaction Tracking Requirements Tracking Systems: Hardware Sources of errors Interaction: Basic interaction Locomotion Selection & Manipulation Basic Interaction Tasks Locomotion or Travel How to effect movement through the space Selection How to indicate an object of interest Manipulation How to move an object of interest Symbolic How to enter text and other parameters Direct Locomotion User walks from one part of the environment to another. Intuitive, easy to use. Requires a great deal of space

18 Constrained Walking User walks but motion is constrained: VirtuSphere Treadmills However, most forms can be very difficult to use: mismatch in perceptual cues; the dynamics / inertia of the device make it hard to navigate effectively CirculaFloor The floor consists of a set of movable tiles. As the user walks forwards, tiles move in front of the user's feet to allow near-infinite movement CirculaFloor (video clip)

19 Walking-in-Place User walks in place. Movement detected by gait analysis. No perceptual mismatch Redirected Walking in the CAVE Problems with walking in the CAVE: you eventually hit the walls, and you can turn and see the missing back wall. One means of countering this is to rotate the environment so that the user is directed back to the front wall Redirected Walking in the CAVE Apply a small rotation to the scene to cause the user to turn towards the centre, sufficiently small that it is not consciously noticed; the subject responds to maintain balance. Increase the rate when the user is navigating or rapidly turning their head. Results: Variance in the number of times users saw the back wall decreased. Rates of simulator sickness were not increased. Some users did not notice the rotation
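The redirection rule described above, a small baseline scene rotation that is boosted during navigation or fast head turns, can be sketched per frame. The rates, thresholds and boost factor here are illustrative values, not the thresholds from the published study:

```python
import math

def redirect_scene_yaw(scene_yaw, head_yaw_rate, navigating, dt,
                       base_rate=math.radians(0.5), boost=4.0):
    """Inject a slow scene rotation each frame to steer the user back
    towards the front wall. The rotation stays below conscious
    perception; it is increased when the user is navigating or
    turning their head quickly, when it is masked by self-motion."""
    fast_head_turn = abs(head_yaw_rate) > math.radians(30)  # rad/s threshold
    rate = base_rate * (boost if (navigating or fast_head_turn) else 1.0)
    return scene_yaw + rate * dt

# One second of standing still at 60 Hz rotates the scene ~0.5 degrees.
yaw = 0.0
for _ in range(60):
    yaw = redirect_scene_yaw(yaw, head_yaw_rate=0.0, navigating=False, dt=1 / 60)
```

Over a minute of walking this slow bias accumulates into a substantial reorientation, which is exactly the effect the study exploited.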

20 Basic Interaction Tasks Locomotion or Travel How to effect movement through the space Selection How to indicate an object of interest Manipulation How to move an object of interest Symbolic How to enter text and other parameters Locomotion User points (somehow) in the direction of motion. User presses a button Selection and Manipulation User points at an object with their hand. User selects by pressing a button. User grabs by pressing a 2nd button. The object is rigidly attached to the hand coordinate system
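Rigidly attaching a grabbed object to the hand amounts to recording the object's pose relative to the hand at grab time, then reapplying that offset every frame. A sketch with poses reduced to translations for clarity (a full system would store a relative 4x4 transform so rotation carries over too; the coordinates are invented):

```python
def grab(hand_pos, obj_pos):
    """On the 2nd button press, record the object's offset
    in the hand's frame."""
    return tuple(o - h for o, h in zip(obj_pos, hand_pos))

def follow(hand_pos, offset):
    """Each frame while grabbed, the object rigidly follows the hand."""
    return tuple(h + d for h, d in zip(hand_pos, offset))

offset = grab((0.0, 1.0, 0.0), (0.2, 1.1, -0.3))   # grab button pressed
obj = follow((0.5, 1.2, 0.1), offset)              # hand has since moved
```

Storing the offset rather than snapping the object to the hand is what prevents the object from jumping at the moment of the grab.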

21 Selection Only Occlusion selection Similar to selection with a mouse: put the hand over an object (occlude it) to select it Locomotion Travel in Immersive Virtual Environments: An Evaluation of Viewpoint Motion Control Techniques, Bowman, Koller and Hodges One of the first rigorous studies of some of the trade-offs between different travel techniques Taxonomy of Travel Bowman, Koller and Hodges

22 Quality Factors 1. Speed (appropriate velocity) 2. Accuracy (proximity to the desired target) 3. Spatial Awareness (the user's implicit knowledge of their position and orientation within the environment during and after travel) 4. Ease of Learning (the ability of a novice user to use the technique) 5. Ease of Use (the complexity or cognitive load of the technique from the user's point of view) 6. Information Gathering (the user's ability to actively obtain information from the environment during travel) 7. Presence (the user's sense of immersion or being within the environment) Experiment 1 Absolute motion task: gaze v. point AND constrained v. unconstrained. Note the immediate trade-offs with point and gaze. Bowman expected gaze to be better: neck muscles are more stable; more immediate feedback. Eight subjects, each completing four blocks of 80 trials (five repetitions of 4 target distances and four target sizes) Experiment 1 No difference between techniques. Significant factors were target distance and size. Bowman, Koller and Hodges

23 Experiment 1 No difference between techniques. Significant factors were target distance and size. Bowman, Koller and Hodges Experiment 2 Relative motion task. No prior expectation, though there is an obvious one. Needs forward and reverse direction. Nine subjects, four sets of 20 trials. Bowman, Koller and Hodges Experiment 2 Obvious difference: can't point at the target and look at the departure point simultaneously. Bowman, Koller and Hodges

24 Summary of 1st Two Experiments Bowman, Koller and Hodges Experiment 3 Testing spatial awareness based on four travel variations: Constant speed (slow) Constant speed (fast) Variable speed (smooth acceleration) Jump (instant translation) The concern is that jumps and other fast transitions confuse users Experiment 3 However, there was no main effect. This is still worth further study. Bowman, Koller and Hodges

25 Other Locomotion Techniques Direct walking Constrained movement Redirected walking Selection and Manipulation Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction, Mine, Brooks Jr. and Sequin One of the first papers to discuss a range of selection and manipulation tasks Body-Relative Interaction Provides: a physical frame of reference in which to work; a more direct and precise sense of control; eyes-off interaction Enables: direct object manipulation (for a sense of the position of the object); physical mnemonics (objects fixed relative to the body); gestural actions (invoking commands)

26 Working within Arm's Reach Takes advantage of proprioception. Provides a direct mapping between hand motion and object motion. Provides finer angular precision of motion Ray-Based Interaction Ray-Based The ray is centred on the user's hand. All manipulations are relative to hand motion. Translation in the beam direction is hard. Rotation in local object coordinates is nearly impossible. Mark Mine Object-Centred Interaction Object-Centred Select with the ray as before. Local movements of the hand are copied to object local coordinates. Mark Mine

27 Go-Go Hand Interaction Arm stretches to reach the object, amplifying local movements Stretch Go-Go Hand Technique, Bowman & Hodges, based on Go-Go Hand from Poupyrev, Billinghurst, Weghorst, Ichikawa World in Miniature (WIM) Interaction A smaller version of the world is created and superimposed on the real world. The user controls the WIM using a handheld ball. Can interact with the environment by selecting either the 1:1 scale object or the same object in the WIM. World in Miniature, Stoakley and Pausch Scaled-World Grab Automatically scale the world so that the selected object is within arm's reach. Near and far objects are easily moved; the user doesn't always notice the scaling; slight head movements have dramatic effects
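The Go-Go amplification is conventionally a piecewise mapping: the virtual arm matches the real arm close to the body, then grows quadratically beyond a threshold so distant objects come within reach. A sketch after Poupyrev et al.; the threshold d and gain k are illustrative tuning values, not those from the paper:

```python
def gogo_extend(r_real, d=0.3, k=6.0):
    """Map real arm extension (metres from the body) to virtual arm
    extension: identity within d, then quadratic amplification.
    The mapping is continuous at r_real == d."""
    if r_real < d:
        return r_real
    return r_real + k * (r_real - d) ** 2

near = gogo_extend(0.2)   # inside the threshold: 1:1, full precision
far = gogo_extend(0.6)    # beyond it: reach amplified to 1.14 m
```

Keeping the mapping 1:1 near the body preserves the fine proprioceptive control discussed above, while the quadratic term trades precision for reach only where it is needed.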

28 Mine, Brooks Jr, Sequin Scaled-World Grab for Locomotion Users transport themselves by grabbing an object in the desired travel direction and pulling themselves towards it. The user can view the point of interest from all sides very simply. For exploration of nearby objects, virtual walking is more suitable; for going much further, invoking a separate scaling operation or switching to an alternate movement mode is better Physical Mnemonics Storing of virtual objects and controls relative to the user's body: 1. Pull-down menus 2. Hand-held widgets 3. Field-of-view-relative mode switching

29 Pull-Down Menus Problems with virtual menus: heads-up menus are difficult to manage; menus fixed in the world often get lost. Could enable the menu with a virtual button (too small) or a physical button (low acceptability). Instead, hide menus around the body, e.g. above the FOV Hand-Held Widgets Hold controls in the hands, rather than on objects. Use relative motion of the hands to effect widget changes. Mine, Brooks Jr, Sequin FOV-Relative Mode Switching Change behaviour depending on whether a limb is visible: hand visible, use occlusion selection; hand not visible, use ray selection

30 Gestural Actions Head-butt zoom Look-at menus Two-handed flying Over-the-shoulder deletion Mine, Brooks Jr, Sequin Experiment 1 Align a docking cube with a target cube as quickly as possible, comparing three manipulation techniques: Object in hand Object at fixed distance Object at variable distance (scaled by arm extension) Experiment 1 18 subjects. In-hand was significantly faster. Mine, Brooks Jr, Sequin

31 Experiment 2 Virtual widget comparison, comparing: Widget in hand Widget fixed in space 18 subjects (as before). Performance measured by accuracy, not time Experiment 2 Widget in hand was significantly better. Mine, Brooks Jr, Sequin Putting it All Together (video clip)

32 Summary Tracking systems provide a way to model the user (VR model) or provide direct input to a control system (ED model). A lot of work has been done and is being done in 3D interaction; we covered locomotion and selection & manipulation. However, it is still quite tedious to use most 3D user interfaces, and lack of precision is probably the main problem. Nevertheless, people are able to interact


More information

Interaction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application

Interaction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application Interaction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application Doug A. Bowman Graphics, Visualization, and Usability Center College of Computing Georgia Institute of Technology

More information

Lecture: Sensors , Fall 2008

Lecture: Sensors , Fall 2008 All images are in the public domain and were obtained from the web unless otherwise cited. 15-491, Fall 2008 Outline Sensor types and overview Common sensors in detail Sensor modeling and calibration Perception

More information

3D UIs 101 Doug Bowman

3D UIs 101 Doug Bowman 3D UIs 101 Doug Bowman Welcome, Introduction, & Roadmap 3D UIs 101 3D UIs 201 User Studies and 3D UIs Guidelines for Developing 3D UIs Video Games: 3D UIs for the Masses The Wii Remote and You 3D UI and

More information

Virtuelle Realität. Overview. Part 13: Interaction in VR: Navigation. Navigation Wayfinding Travel. Virtuelle Realität. Prof.

Virtuelle Realität. Overview. Part 13: Interaction in VR: Navigation. Navigation Wayfinding Travel. Virtuelle Realität. Prof. Part 13: Interaction in VR: Navigation Virtuelle Realität Wintersemester 2006/07 Prof. Bernhard Jung Overview Navigation Wayfinding Travel Further information: D. A. Bowman, E. Kruijff, J. J. LaViola,

More information

Sensing self motion. Key points: Why robots need self-sensing Sensors for proprioception in biological systems in robot systems

Sensing self motion. Key points: Why robots need self-sensing Sensors for proprioception in biological systems in robot systems Sensing self motion Key points: Why robots need self-sensing Sensors for proprioception in biological systems in robot systems Position sensing Velocity and acceleration sensing Force sensing Vision based

More information

Brainstorm. In addition to cameras / Kinect, what other kinds of sensors would be useful?

Brainstorm. In addition to cameras / Kinect, what other kinds of sensors would be useful? Brainstorm In addition to cameras / Kinect, what other kinds of sensors would be useful? How do you evaluate different sensors? Classification of Sensors Proprioceptive sensors measure values internally

More information

Proprioception & force sensing

Proprioception & force sensing Proprioception & force sensing Roope Raisamo Tampere Unit for Computer-Human Interaction (TAUCHI) School of Information Sciences University of Tampere, Finland Based on material by Jussi Rantala, Jukka

More information

Chapter 15 Principles for the Design of Performance-oriented Interaction Techniques

Chapter 15 Principles for the Design of Performance-oriented Interaction Techniques Chapter 15 Principles for the Design of Performance-oriented Interaction Techniques Abstract Doug A. Bowman Department of Computer Science Virginia Polytechnic Institute & State University Applications

More information

CSE 165: 3D User Interaction. Lecture #14: 3D UI Design

CSE 165: 3D User Interaction. Lecture #14: 3D UI Design CSE 165: 3D User Interaction Lecture #14: 3D UI Design 2 Announcements Homework 3 due tomorrow 2pm Monday: midterm discussion Next Thursday: midterm exam 3D UI Design Strategies 3 4 Thus far 3DUI hardware

More information

CSE 165: 3D User Interaction. Lecture #7: Input Devices Part 2

CSE 165: 3D User Interaction. Lecture #7: Input Devices Part 2 CSE 165: 3D User Interaction Lecture #7: Input Devices Part 2 2 Announcements Homework Assignment #2 Due tomorrow at 2pm Sony Move check out Homework discussion Monday at 6pm Input Devices CSE 165 -Winter

More information

Robotic Vehicle Design

Robotic Vehicle Design Robotic Vehicle Design Sensors, measurements and interfacing Jim Keller July 19, 2005 Sensor Design Types Topology in system Specifications/Considerations for Selection Placement Estimators Summary Sensor

More information

Development of intelligent systems

Development of intelligent systems Development of intelligent systems (RInS) Robot sensors Danijel Skočaj University of Ljubljana Faculty of Computer and Information Science Academic year: 2017/18 Development of intelligent systems Robotic

More information

Virtual Reality & Interaction

Virtual Reality & Interaction Virtual Reality & Interaction Virtual Reality Input Devices Output Devices Augmented Reality Applications What is Virtual Reality? narrow: immersive environment with head tracking, headmounted display,

More information

What was the first gestural interface?

What was the first gestural interface? stanford hci group / cs247 Human-Computer Interaction Design Studio What was the first gestural interface? 15 January 2013 http://cs247.stanford.edu Theremin Myron Krueger 1 Myron Krueger There were things

More information

VEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu

More information

tracker hardware data in tracker CAVE library coordinate system calibration table corrected data in tracker coordinate system

tracker hardware data in tracker CAVE library coordinate system calibration table corrected data in tracker coordinate system Line of Sight Method for Tracker Calibration in Projection-Based VR Systems Marek Czernuszenko, Daniel Sandin, Thomas DeFanti fmarek j dan j tomg @evl.uic.edu Electronic Visualization Laboratory (EVL)

More information

Time of Flight Capture

Time of Flight Capture Time of Flight Capture CS635 Spring 2017 Daniel G. Aliaga Department of Computer Science Purdue University Range Acquisition Taxonomy Range acquisition Contact Transmissive Mechanical (CMM, jointed arm)

More information

Augmented and mixed reality (AR & MR)

Augmented and mixed reality (AR & MR) Augmented and mixed reality (AR & MR) Doug Bowman CS 5754 Based on original lecture notes by Ivan Poupyrev AR/MR example (C) 2008 Doug Bowman, Virginia Tech 2 Definitions Augmented reality: Refers to a

More information

Augmented Reality Mixed Reality

Augmented Reality Mixed Reality Augmented Reality and Virtual Reality Augmented Reality Mixed Reality 029511-1 2008 년가을학기 11/17/2008 박경신 Virtual Reality Totally immersive environment Visual senses are under control of system (sometimes

More information

A FRAMEWORK FOR TELEPRESENT GAME-PLAY IN LARGE VIRTUAL ENVIRONMENTS

A FRAMEWORK FOR TELEPRESENT GAME-PLAY IN LARGE VIRTUAL ENVIRONMENTS A FRAMEWORK FOR TELEPRESENT GAME-PLAY IN LARGE VIRTUAL ENVIRONMENTS Patrick Rößler, Frederik Beutler, and Uwe D. Hanebeck Intelligent Sensor-Actuator-Systems Laboratory Institute of Computer Science and

More information

Gestaltung und Strukturierung virtueller Welten. Bauhaus - Universität Weimar. Research at InfAR. 2ooo

Gestaltung und Strukturierung virtueller Welten. Bauhaus - Universität Weimar. Research at InfAR. 2ooo Gestaltung und Strukturierung virtueller Welten Research at InfAR 2ooo 1 IEEE VR 99 Bowman, D., Kruijff, E., LaViola, J., and Poupyrev, I. "The Art and Science of 3D Interaction." Full-day tutorial presented

More information

Ultrasonic Calibration of a Magnetic Tracker in a Virtual Reality Space

Ultrasonic Calibration of a Magnetic Tracker in a Virtual Reality Space Ultrasonic Calibration of a Magnetic Tracker in a Virtual Reality Space Morteza Ghazisaedy David Adamczyk Daniel J. Sandin Robert V. Kenyon Thomas A. DeFanti Electronic Visualization Laboratory (EVL) Department

More information

Robotic Vehicle Design

Robotic Vehicle Design Robotic Vehicle Design Sensors, measurements and interfacing Jim Keller July 2008 1of 14 Sensor Design Types Topology in system Specifications/Considerations for Selection Placement Estimators Summary

More information

MEASURING AND ANALYZING FINE MOTOR SKILLS

MEASURING AND ANALYZING FINE MOTOR SKILLS MEASURING AND ANALYZING FINE MOTOR SKILLS PART 1: MOTION TRACKING AND EMG OF FINE MOVEMENTS PART 2: HIGH-FIDELITY CAPTURE OF HAND AND FINGER BIOMECHANICS Abstract This white paper discusses an example

More information

Chapter 2 Understanding and Conceptualizing Interaction. Anna Loparev Intro HCI University of Rochester 01/29/2013. Problem space

Chapter 2 Understanding and Conceptualizing Interaction. Anna Loparev Intro HCI University of Rochester 01/29/2013. Problem space Chapter 2 Understanding and Conceptualizing Interaction Anna Loparev Intro HCI University of Rochester 01/29/2013 1 Problem space Concepts and facts relevant to the problem Users Current UX Technology

More information

Occlusion based Interaction Methods for Tangible Augmented Reality Environments

Occlusion based Interaction Methods for Tangible Augmented Reality Environments Occlusion based Interaction Methods for Tangible Augmented Reality Environments Gun A. Lee α Mark Billinghurst β Gerard J. Kim α α Virtual Reality Laboratory, Pohang University of Science and Technology

More information

The architectural walkthrough one of the earliest

The architectural walkthrough one of the earliest Editors: Michael R. Macedonia and Lawrence J. Rosenblum Designing Animal Habitats within an Immersive VE The architectural walkthrough one of the earliest virtual environment (VE) applications is still

More information

Réalité Virtuelle et Interactions. Interaction 3D. Année / 5 Info à Polytech Paris-Sud. Cédric Fleury

Réalité Virtuelle et Interactions. Interaction 3D. Année / 5 Info à Polytech Paris-Sud. Cédric Fleury Réalité Virtuelle et Interactions Interaction 3D Année 2016-2017 / 5 Info à Polytech Paris-Sud Cédric Fleury (cedric.fleury@lri.fr) Virtual Reality Virtual environment (VE) 3D virtual world Simulated by

More information

Stereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays. Habib Abi-Rached Thursday 17 February 2005.

Stereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays. Habib Abi-Rached Thursday 17 February 2005. Stereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays Habib Abi-Rached Thursday 17 February 2005. Objective Mission: Facilitate communication: Bandwidth. Intuitiveness.

More information

Learning From Where Students Look While Observing Simulated Physical Phenomena

Learning From Where Students Look While Observing Simulated Physical Phenomena Learning From Where Students Look While Observing Simulated Physical Phenomena Dedra Demaree, Stephen Stonebraker, Wenhui Zhao and Lei Bao The Ohio State University 1 Introduction The Ohio State University

More information

FORCE FEEDBACK. Roope Raisamo

FORCE FEEDBACK. Roope Raisamo FORCE FEEDBACK Roope Raisamo Multimodal Interaction Research Group Tampere Unit for Computer Human Interaction Department of Computer Sciences University of Tampere, Finland Outline Force feedback interfaces

More information

Integrated Navigation System

Integrated Navigation System Integrated Navigation System Adhika Lie adhika@aem.umn.edu AEM 5333: Design, Build, Model, Simulate, Test and Fly Small Uninhabited Aerial Vehicles Feb 14, 2013 1 Navigation System Where am I? Position,

More information

INDOOR HEADING MEASUREMENT SYSTEM

INDOOR HEADING MEASUREMENT SYSTEM INDOOR HEADING MEASUREMENT SYSTEM Marius Malcius Department of Research and Development AB Prospero polis, Lithuania m.malcius@orodur.lt Darius Munčys Department of Research and Development AB Prospero

More information

Tangible User Interface for CAVE TM based on Augmented Reality Technique

Tangible User Interface for CAVE TM based on Augmented Reality Technique Tangible User Interface for CAVE TM based on Augmented Reality Technique JI-SUN KIM Thesis submitted to the Faculty of the Virginia Polytechnic Institute and State University in partial fulfillment of

More information

Physical Presence in Virtual Worlds using PhysX

Physical Presence in Virtual Worlds using PhysX Physical Presence in Virtual Worlds using PhysX One of the biggest problems with interactive applications is how to suck the user into the experience, suspending their sense of disbelief so that they are

More information

TRAVEL IN SMILE : A STUDY OF TWO IMMERSIVE MOTION CONTROL TECHNIQUES

TRAVEL IN SMILE : A STUDY OF TWO IMMERSIVE MOTION CONTROL TECHNIQUES IADIS International Conference Computer Graphics and Visualization 27 TRAVEL IN SMILE : A STUDY OF TWO IMMERSIVE MOTION CONTROL TECHNIQUES Nicoletta Adamo-Villani Purdue University, Department of Computer

More information

EVALUATING 3D INTERACTION TECHNIQUES

EVALUATING 3D INTERACTION TECHNIQUES EVALUATING 3D INTERACTION TECHNIQUES ROBERT J. TEATHER QUALIFYING EXAM REPORT SUPERVISOR: WOLFGANG STUERZLINGER DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING, YORK UNIVERSITY TORONTO, ONTARIO MAY, 2011

More information

ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems

ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems Wayne Piekarski and Bruce H. Thomas Wearable Computer Laboratory School of Computer and Information Science

More information

History of Virtual Reality. Trends & Milestones

History of Virtual Reality. Trends & Milestones History of Virtual Reality (based on a talk by Greg Welch) Trends & Milestones Displays (head-mounted) video only, CG overlay, CG only, mixed video CRT vs. LCD Tracking magnetic, mechanical, ultrasonic,

More information

RV - AULA 05 - PSI3502/2018. User Experience, Human Computer Interaction and UI

RV - AULA 05 - PSI3502/2018. User Experience, Human Computer Interaction and UI RV - AULA 05 - PSI3502/2018 User Experience, Human Computer Interaction and UI Outline Discuss some general principles of UI (user interface) design followed by an overview of typical interaction tasks

More information

Robot Sensors Introduction to Robotics Lecture Handout September 20, H. Harry Asada Massachusetts Institute of Technology

Robot Sensors Introduction to Robotics Lecture Handout September 20, H. Harry Asada Massachusetts Institute of Technology Robot Sensors 2.12 Introduction to Robotics Lecture Handout September 20, 2004 H. Harry Asada Massachusetts Institute of Technology Touch Sensor CCD Camera Vision System Ultrasonic Sensor Photo removed

More information

Recent Progress on Wearable Augmented Interaction at AIST

Recent Progress on Wearable Augmented Interaction at AIST Recent Progress on Wearable Augmented Interaction at AIST Takeshi Kurata 12 1 Human Interface Technology Lab University of Washington 2 AIST, Japan kurata@ieee.org Weavy The goal of the Weavy project team

More information

Introduction to Virtual Reality (based on a talk by Bill Mark)

Introduction to Virtual Reality (based on a talk by Bill Mark) Introduction to Virtual Reality (based on a talk by Bill Mark) I will talk about... Why do we want Virtual Reality? What is needed for a VR system? Examples of VR systems Research problems in VR Most Computers

More information

Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS

Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS Abstract Over the years from entertainment to gaming market,

More information

Virtual Reality I. Visual Imaging in the Electronic Age. Donald P. Greenberg November 9, 2017 Lecture #21

Virtual Reality I. Visual Imaging in the Electronic Age. Donald P. Greenberg November 9, 2017 Lecture #21 Virtual Reality I Visual Imaging in the Electronic Age Donald P. Greenberg November 9, 2017 Lecture #21 1968: Ivan Sutherland 1990s: HMDs, Henry Fuchs 2013: Google Glass History of Virtual Reality 2016:

More information

OughtToPilot. Project Report of Submission PC128 to 2008 Propeller Design Contest. Jason Edelberg

OughtToPilot. Project Report of Submission PC128 to 2008 Propeller Design Contest. Jason Edelberg OughtToPilot Project Report of Submission PC128 to 2008 Propeller Design Contest Jason Edelberg Table of Contents Project Number.. 3 Project Description.. 4 Schematic 5 Source Code. Attached Separately

More information

Laboratory Seven Stepper Motor and Feedback Control

Laboratory Seven Stepper Motor and Feedback Control EE3940 Microprocessor Systems Laboratory Prof. Andrew Campbell Spring 2003 Groups Names Laboratory Seven Stepper Motor and Feedback Control In this experiment you will experiment with a stepper motor and

More information

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7

More information

Issues and Challenges of 3D User Interfaces: Effects of Distraction

Issues and Challenges of 3D User Interfaces: Effects of Distraction Issues and Challenges of 3D User Interfaces: Effects of Distraction Leslie Klein kleinl@in.tum.de In time critical tasks like when driving a car or in emergency management, 3D user interfaces provide an

More information

MOVING COWS IN SPACE: EXPLOITING PROPRIOCEPTION AS A FRAMEWORK FOR VIRTUAL ENVIRONMENT INTERACTION

MOVING COWS IN SPACE: EXPLOITING PROPRIOCEPTION AS A FRAMEWORK FOR VIRTUAL ENVIRONMENT INTERACTION 1 MOVING COWS IN SPACE: EXPLOITING PROPRIOCEPTION AS A FRAMEWORK FOR VIRTUAL ENVIRONMENT INTERACTION Category: Research Format: Traditional Print Paper ABSTRACT Manipulation in immersive virtual environments

More information

MEMS Solutions For VR & AR

MEMS Solutions For VR & AR MEMS Solutions For VR & AR Sensor Expo 2017 San Jose June 28 th 2017 MEMS Sensors & Actuators at ST 2 Motion Environmental Audio Physical change Sense Electro MEMS Mechanical Signal Mechanical Actuate

More information

Testbed Evaluation of Virtual Environment Interaction Techniques

Testbed Evaluation of Virtual Environment Interaction Techniques Testbed Evaluation of Virtual Environment Interaction Techniques Doug A. Bowman Department of Computer Science (0106) Virginia Polytechnic & State University Blacksburg, VA 24061 USA (540) 231-7537 bowman@vt.edu

More information

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Joan De Boeck, Karin Coninx Expertise Center for Digital Media Limburgs Universitair Centrum Wetenschapspark 2, B-3590 Diepenbeek, Belgium

More information

Marco Cavallo. Merging Worlds: A Location-based Approach to Mixed Reality. Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO

Marco Cavallo. Merging Worlds: A Location-based Approach to Mixed Reality. Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO Marco Cavallo Merging Worlds: A Location-based Approach to Mixed Reality Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO Introduction: A New Realm of Reality 2 http://www.samsung.com/sg/wearables/gear-vr/

More information

AUTOMATIC SPEED CONTROL FOR NAVIGATION IN 3D VIRTUAL ENVIRONMENT

AUTOMATIC SPEED CONTROL FOR NAVIGATION IN 3D VIRTUAL ENVIRONMENT AUTOMATIC SPEED CONTROL FOR NAVIGATION IN 3D VIRTUAL ENVIRONMENT DOMOKOS M. PAPOI A THESIS SUBMITTED TO THE FACULTY OF GRADUATE STUDIES IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER

More information

The introduction and background in the previous chapters provided context in

The introduction and background in the previous chapters provided context in Chapter 3 3. Eye Tracking Instrumentation 3.1 Overview The introduction and background in the previous chapters provided context in which eye tracking systems have been used to study how people look at

More information

What is Virtual Reality? Burdea,1993. Virtual Reality Triangle Triangle I 3 I 3. Virtual Reality in Product Development. Virtual Reality Technology

What is Virtual Reality? Burdea,1993. Virtual Reality Triangle Triangle I 3 I 3. Virtual Reality in Product Development. Virtual Reality Technology Virtual Reality man made reality sense world What is Virtual Reality? Dipl-Ing Indra Kusumah Digital Product Design Fraunhofer IPT Steinbachstrasse 17 D-52074 Aachen Indrakusumah@iptfraunhoferde wwwiptfraunhoferde

More information

Einführung in die Erweiterte Realität. 5. Head-Mounted Displays

Einführung in die Erweiterte Realität. 5. Head-Mounted Displays Einführung in die Erweiterte Realität 5. Head-Mounted Displays Prof. Gudrun Klinker, Ph.D. Institut für Informatik,Technische Universität München klinker@in.tum.de Nov 30, 2004 Agenda 1. Technological

More information

Sponsored by. Nisarg Kothari Carnegie Mellon University April 26, 2011

Sponsored by. Nisarg Kothari Carnegie Mellon University April 26, 2011 Sponsored by Nisarg Kothari Carnegie Mellon University April 26, 2011 Motivation Why indoor localization? Navigating malls, airports, office buildings Museum tours, context aware apps Augmented reality

More information

User Interface Constraints for Immersive Virtual Environment Applications

User Interface Constraints for Immersive Virtual Environment Applications User Interface Constraints for Immersive Virtual Environment Applications Doug A. Bowman and Larry F. Hodges {bowman, hodges}@cc.gatech.edu Graphics, Visualization, and Usability Center College of Computing

More information

Interaction in VR: Manipulation

Interaction in VR: Manipulation Part 8: Interaction in VR: Manipulation Virtuelle Realität Wintersemester 2007/08 Prof. Bernhard Jung Overview Control Methods Selection Techniques Manipulation Techniques Taxonomy Further reading: D.

More information

Mechatronics Project Report

Mechatronics Project Report Mechatronics Project Report Introduction Robotic fish are utilized in the Dynamic Systems Laboratory in order to study and model schooling in fish populations, with the goal of being able to manage aquatic

More information

Attitude and Heading Reference Systems

Attitude and Heading Reference Systems Attitude and Heading Reference Systems FY-AHRS-2000B Installation Instructions V1.0 Guilin FeiYu Electronic Technology Co., Ltd Addr: Rm. B305,Innovation Building, Information Industry Park,ChaoYang Road,Qi

More information