Realtime 3D Computer Graphics Virtual Reality


Virtual Reality Input Devices

Special input devices are required for interaction, navigation, and motion tracking (e.g., for depth cue calculation):

1. Motion trackers: Determining the position and orientation of a reference system in 3D requires measuring 6 degrees of freedom (DOFs); a minimal pose representation is sketched after this list.
2. 3D mice/wands etc.: Specialized devices for point-and-click interaction; WIMP¹-style metaphors have to account for the additional DOFs.
3. Joint sensors: Sensors which measure the movement of the user's joints (also possible with trackers and inverse kinematics).
4. Props: Real placeholders for virtual objects.
5. Movement effect sensors: Measure the effect user movement has on the surroundings (no kinematics involved).
6. Skin sensors, neural interfaces, bio-sensors: Measure skin resistance, brain activity, and other body-related data.

...and hybrid devices.

¹ WIMP: Windows, Icons, Menu, Pointer
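As a concrete illustration of the 6 DOFs a motion tracker has to deliver, the following minimal sketch represents a tracked pose as a 3D position plus a unit quaternion for orientation and composes two poses. The class name and conventions are our own illustration, not taken from any particular tracker SDK.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose6DOF:
    """3 translational + 3 rotational DOFs: position vector and unit quaternion (w, x, y, z)."""
    position: np.ndarray      # shape (3,)
    orientation: np.ndarray   # shape (4,), unit quaternion

    def compose(self, other: "Pose6DOF") -> "Pose6DOF":
        """Apply 'other' in this pose's local frame (e.g., a fixed sensor offset on a prop)."""
        w1, x1, y1, z1 = self.orientation
        w2, x2, y2, z2 = other.orientation
        # Hamilton quaternion product q_self * q_other
        q = np.array([
            w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
        ])
        # Rotate other's position into this frame, then translate by this position.
        return Pose6DOF(self.position + self._rotate(other.position), q / np.linalg.norm(q))

    def _rotate(self, v: np.ndarray) -> np.ndarray:
        """Rotate vector v by this pose's orientation quaternion."""
        w, x, y, z = self.orientation
        u = np.array([x, y, z])
        return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)
```

Composing a wrist-mounted sensor reading with a fixed offset, e.g. `tracker_pose.compose(sensor_offset)`, would yield the pose of a hand-held prop in the world frame.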

Input is measured by a multitude of physical and biological principles, e.g.:

1. electro-magnetism
2. optics (marker-based/markerless, visible spectrum/infrared)
3. electrics (voltage, impedance, electrical flow, ...)
4. acoustics (ultrasound, ...)
5. inertia

Input devices produce data...
...discrete, event-based (buttons, state changers).
...continuously (discrete, but continuously sampled).

Electromagnetic trackers used to be the most common, see "Put-That-There" (Bolt, 1980).
Transmitter: creates three orthogonal low-frequency magnetic fields. Short-range version: < 1 m; long-range version: < 3 m.
Receiver(s): three perpendicular antennas. Distance is inferred from the currents induced in the antennas.
- Noisy, requires filtering (a minimal filter sketch follows below).
- Affected by metal, requires non-linear calibration.
- Wireless versions are expensive.
[Figure: transmitter and receiver of a 6DOF magnetic tracker, and a DataGlove]
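Because raw electromagnetic tracker samples are noisy, some form of filtering is applied before the pose is used for rendering. Below is a minimal sketch of one common, simple option, an exponential (single-pole low-pass) filter over the position samples; the smoothing factor and the example values are assumptions for illustration, not the behaviour of any particular tracker driver.

```python
import numpy as np

class ExponentialSmoother:
    """Single-pole low-pass filter: stronger smoothing reduces jitter but adds lag."""

    def __init__(self, alpha: float = 0.3):
        # alpha in (0, 1]: 1.0 = no smoothing, small values = strong smoothing / more latency
        self.alpha = alpha
        self._state = None

    def update(self, sample: np.ndarray) -> np.ndarray:
        """Feed one raw position sample (shape (3,)), get the filtered position back."""
        sample = np.asarray(sample, dtype=float)
        if self._state is None:
            self._state = sample            # initialize with the first measurement
        else:
            self._state = self.alpha * sample + (1.0 - self.alpha) * self._state
        return self._state


# Example: smooth a stream of noisy tracker positions (values are made up).
smoother = ExponentialSmoother(alpha=0.25)
for raw in [np.array([0.10, 1.52, 0.30]),
            np.array([0.12, 1.50, 0.31]),
            np.array([0.09, 1.53, 0.29])]:
    print(smoother.update(raw))
```

The filter trades jitter against latency; orientation samples would typically be smoothed by interpolating (slerping) quaternions rather than by averaging their components.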

Acoustic trackers
Use ultrasound. Typical setup for 3 DOFs: 3 microphones and 1 speaker. Distance is inferred from the travel time of the sound (a worked example follows after this slide).
+ No interference with metal
+ Relatively inexpensive
- Line-of-sight issues
- Sensitive to air temperature and certain noises
[Figure: Logitech Fly Mouse]

Hybrid trackers (e.g., InterSense IS-600/900)
Combine inertial sensing (orientation) with acoustic sensing (position).

Inertial trackers (e.g., InterSense IS-300)
Use gyroscopes and accelerometers.
+ Less noise, less lag
- Only 3 DOFs (orientation)

Optical marker-based trackers
Markers reflect IR light and are combined into a unique spatial configuration per tracked position.
+ No interference with metal
+ Low latency
+ High resolution
- Line-of-sight issues (more cameras help)
[Figure: 6DOF optical tracker by ART]
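To make the acoustic principle concrete, the sketch below first converts an ultrasound travel time into a distance using a temperature-dependent speed of sound (which is why such trackers are sensitive to air temperature), and then trilaterates a 3-DOF position from the three microphone distances. The microphone layout and timings are invented for illustration; real devices add calibration and filtering on top.

```python
import numpy as np

def speed_of_sound(temp_celsius: float) -> float:
    """Approximate speed of sound in air [m/s]; its temperature dependence is
    the reason acoustic trackers drift when the room temperature changes."""
    return 331.3 + 0.606 * temp_celsius

def distance_from_tof(travel_time_s: float, temp_celsius: float = 20.0) -> float:
    """Distance [m] covered by an ultrasound pulse for a measured travel time."""
    return speed_of_sound(temp_celsius) * travel_time_s

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Position of the speaker from three microphone positions and distances.
    Returns the solution on the +z side of the microphone plane; the mirror
    solution is discarded because the emitter sits in front of the array."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = np.dot(ex, p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = np.dot(ey, p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))   # clamp against measurement noise
    return p1 + x * ex + y * ey + z * ez

# Hypothetical microphone layout (metres) and measured travel times (seconds).
mics = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (0.0, 0.5, 0.0)]
times = [1.8e-3, 2.0e-3, 2.1e-3]
radii = [distance_from_tof(t, temp_celsius=22.0) for t in times]
print(trilaterate(*mics, *radii))
```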

3D mice/wands
Several buttons and sensors for the selection of binary states and/or continuous state changes (e.g., potentiometers). Often hybrid devices with additional position/orientation tracking. Examples: tracked wand, ring mouse, SpaceOrb. How such a device maps its raw readings onto the event-based vs. continuously sampled data types introduced above is sketched after this slide.

CubicMouse™
The first 12-DOF input device: tracks the position and rotation of rods using potentiometers. Other shapes and implementations are possible (e.g., the Mini Cubic Mouse). Pictures courtesy of IMK, Fraunhofer Gesellschaft.
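A minimal polling loop for a hypothetical wand might separate the two kinds of data like this: buttons only produce discrete events on state changes, while potentiometer axes are normalized and reported every frame. The device class, its value ranges, and the button names are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class WandState:
    buttons: dict = field(default_factory=dict)   # name -> bool
    axes: dict = field(default_factory=dict)      # name -> raw potentiometer reading

class WandInput:
    """Turns raw wand readings into discrete events and continuous axis values."""

    def __init__(self):
        self._prev_buttons = {}

    def poll(self, raw: WandState):
        # Discrete, event-based data: emit an event only on a state change.
        events = []
        for name, pressed in raw.buttons.items():
            if pressed != self._prev_buttons.get(name, False):
                events.append((name, "pressed" if pressed else "released"))
            self._prev_buttons[name] = pressed

        # Continuously sampled data: normalize every frame, no state change needed.
        # Assumes 10-bit potentiometer readings (0..1023) mapped to [-1, 1].
        axes = {name: (value / 1023.0) * 2.0 - 1.0 for name, value in raw.axes.items()}
        return events, axes

# Example frame: the trigger just went down, the analog stick is slightly deflected.
wand = WandInput()
frame = WandState(buttons={"trigger": True}, axes={"stick_x": 640, "stick_y": 512})
print(wand.poll(frame))
```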

Data gloves
Used to track the user's finger movements, for posture and gesture detection. Almost always used together with a tracker sensor mounted on the wrist. Common types:
- 5DT Glove: 5/16 sensors
- CyberGlove: 18/22 sensors (here a hybrid modification for flexion and pinch)
A minimal posture-detection sketch based on such flexion values follows after this slide.

Body suits
Used to track the overall body movement. Angles are measured by resistance or by inverse kinematics based on certain body points. Example, a wireless suit (Ascension Technology): 20 sensors per suit, 100 updates/sec, 3 m range from the base unit, resolution < 2 mm and < 0.2 degrees, electronic unit with 2 hours of battery life.
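Data-glove flexion sensors deliver one bend value per measured joint; static posture detection then boils down to comparing those values against stored templates. The sketch below shows the simplest possible version, nearest-template matching on normalized flexion values; the sensor count, the postures, and the threshold are illustrative assumptions, not the behaviour of any specific glove SDK.

```python
import numpy as np

# Hypothetical templates: 5 normalized flexion values (0 = straight, 1 = fully bent),
# one per finger in the order thumb, index, middle, ring, little.
POSTURES = {
    "open_hand": np.array([0.1, 0.1, 0.1, 0.1, 0.1]),
    "fist":      np.array([0.9, 0.9, 0.9, 0.9, 0.9]),
    "point":     np.array([0.9, 0.1, 0.9, 0.9, 0.9]),  # only the index finger extended
    "pinch":     np.array([0.6, 0.6, 0.1, 0.1, 0.1]),  # thumb and index bent towards each other
}

def classify_posture(flexion: np.ndarray, max_distance: float = 0.35) -> str:
    """Return the closest posture template, or 'unknown' if nothing is close enough."""
    best_name, best_dist = "unknown", float("inf")
    for name, template in POSTURES.items():
        dist = np.linalg.norm(flexion - template) / np.sqrt(len(template))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else "unknown"

# Example: a glove frame where only the index finger is extended.
print(classify_posture(np.array([0.85, 0.15, 0.92, 0.88, 0.90])))   # -> "point"
```

Gesture (as opposed to posture) detection would additionally look at how such classifications or raw values evolve over time.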

Props and locomotion devices
[Figures: head prop (courtesy of Hinckley et al.), CyberGlove with haptics, ShapeTape prop (courtesy of Balakrishnan et al.), treadmill types (e.g., bicycles)]

Speech input
Can be combined with all of the preceding devices. Design issues:
- continuous vs. one-time recognition
- choice and placement of the microphone
- training vs. no training
- handling of false-positive recognition
- interference from surrounding noise
Speech can complement other modes of interaction: multi-modal interaction (e.g., by additionally including gesture processing, which benefits from the VR sensory equipment). A minimal fusion sketch follows after this slide.
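The classic example of such multi-modal interaction is Bolt's "Put-That-There" (1980), where deictic words in the speech stream are resolved against what the user was pointing at when the word was spoken. The sketch below illustrates that fusion idea only; the timestamps, object names, and pointing log are invented for illustration and stand in for a real recognizer and ray-casting subsystem.

```python
import bisect

class PointingLog:
    """Stores time-stamped pointing targets from the tracker/ray-casting subsystem."""

    def __init__(self):
        self._times, self._targets = [], []

    def record(self, t: float, target: str):
        self._times.append(t)
        self._targets.append(target)

    def target_at(self, t: float) -> str:
        """Target the user was pointing at closest to (but not later than) time t."""
        idx = bisect.bisect_right(self._times, t) - 1
        return self._targets[idx] if idx >= 0 else "nothing"

def put_that_there(speech_tokens, pointing_log):
    """Resolve the deictic words 'that' and 'there' against the pointing log.
    speech_tokens: list of (timestamp, word) pairs from the speech recognizer."""
    resolved = {}
    for t, word in speech_tokens:
        if word in ("that", "there"):
            resolved[word] = pointing_log.target_at(t)
    return f"move {resolved.get('that', '?')} to {resolved.get('there', '?')}"

# Example: the user says "put that there" while sweeping the wand across the scene.
log = PointingLog()
log.record(0.0, "blue cube")
log.record(1.2, "red sphere")
log.record(2.0, "table top")
print(put_that_there([(0.9, "put"), (1.3, "that"), (2.1, "there")], log))
# -> "move red sphere to table top"
```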

Fiction: Interaction. The Ultimate Display

The Ultimate Display

"The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be the Wonderland into which Alice walked." (Sutherland, 1965)

Fiction: An interface for the Ultimate Display? Fiction: Results of physical contact.

References

Bolt, R. A. (1980): "Put-That-There": Voice and Gesture at the Graphics Interface. In: Computer Graphics 14(3).
Bühl, Achim (1997): Die virtuelle Gesellschaft. Politik, Ökonomie und Kultur im Zeichen des Cyberspace. In: Gräf, Lorenz / Krajewski, Markus (eds.): Soziologie des Internet. Handeln im elektronischen Web-Werk. Frankfurt/M., New York: Campus.
Gibson, William (1984): Neuromancer (first print).
Gibson, William (1999): Neuromancer, 9th ed., München: Heyne.
Okoshi, T. (1976): Three-Dimensional Imaging Techniques. Academic Press, New York.
Peters, G. (2000): Theories of Three-Dimensional Object Perception: A Survey. In: Recent Research Developments in Pattern Recognition, Transworld Research Network.
Sutherland, I. E. (1965): The Ultimate Display. In: Proceedings of the IFIP Congress, Vol. 2, pp. 506-508.
Sutherland, I. E. (1968): A Head-Mounted Three-Dimensional Display. In: AFIPS Conference Proceedings, Vol. 33, Part I.
