Designing and Building the PIT: a Head-Tracked Stereo Workspace for Two Users

Kevin Arthur, Timothy Preston, Russell M. Taylor II, Frederick P. Brooks, Jr., Mary C. Whitton, William V. Wright
Department of Computer Science, University of North Carolina at Chapel Hill, Sitterson Hall, Chapel Hill, NC
Additional information is available at:

Abstract

The PIT (Protein Interactive Theater) is a dual-screen, stereo display system and workspace designed specifically for two-person, seated, local collaboration. Each user has a correct head-tracked stereo view of a shared virtual space containing a 3D model under study. The model appears to occupy the same location in lab space for each user, allowing them to augment their verbal communication with gestures. This paper describes our motivation and goals in designing the PIT workspace, the design and fabrication of the hardware and software components, and initial user testing.

Motivation and design goals

The PIT display system is the most recent of a series of displays used by the GRIP molecular graphics project at UNC [Brooks et al., 1990]. While we intend for the PIT to be generally applicable to other application domains, our initial use is in molecular graphics applications such as protein fitting. The PIT design is motivated by observations gathered over the years by collaborating with biochemists to develop molecular graphics systems. The following were our primary goals in developing the PIT.

High quality 3D display. To provide strong 3D depth cues, we employ the technique of head-tracked stereo display. For each user, a stereo image is displayed and updated in real time according to the perspective projection determined by the positions of the user's eyes. The stereo and motion parallax cues give users the illusion of a stable 3D scene located in front of them and fixed in laboratory space. The users wear LCD shutter glasses and tracking sensors. High-resolution images (four images per frame, each rendered at pixels) are displayed on two large rear-projection screens oriented at 90 degrees to each other, with one screen corresponding to each user. Rendering is performed on a Silicon Graphics Onyx workstation with InfiniteReality graphics. We allow for decoupling the application's display and simulation loops so that they run as separate processes, in order to maintain a high display update rate that is independent of the complexity of computations the application may be performing. The display screens may also be oriented at 120 degrees to each other for applications desiring a high-resolution, wide field-of-view panoramic view for a single user.

Including a second user. Over the years, we have observed that our users, who are biochemists and other scientists, quite often work in pairs to conduct their experiments.

To allow close collaboration between the two users we wanted to provide the second user with a view equal in quality and realism to that given to the first user. In our past work we found that using a single head-mounted display for the first user and a monitor view for the second user was unsatisfactory; the monitor viewer found it difficult to understand the scene. The situation is improved somewhat with large-screen head-tracked displays such as the CAVE [Cruz-Neira et al., 1993], but, because the perspective images are correct for only the first user, the second user will still see an inaccurate view unless he or she makes an effort to follow the other's movements. More seriously, the second user's view jumps around when the first user moves. This limits that user's ability to interact with the virtual scene and to collaborate actively with the other user.

For the PIT display, we decided to provide two head-tracked stereo views, so that each user has an independent view of the virtual scene under study, and may optionally apply their own custom viewing and model manipulations. To create two independent stereo views without perceptible flicker on a single screen using LCD shutter glasses requires a refresh rate higher than that available with current commercially available projectors and shutter glasses. This drove our decision to have two projectors and opened the question of whether the two screens should be side-by-side or in some other configuration. We decided to use two projection screens oriented at 90 degrees to each other, and to display one user's view on each screen.

The display gives the illusion of a stable 3D structure fixed in lab space in front of the two seated users. Because they see the displayed structure in the same position in physical space, they may physically point to the same places in the model with their hands. The users may work with their two instances of the virtual scene registered in this way, or they may alter the view according to their own preference. For example, a toggle button is provided for either user to rotate his view by 90 degrees so that both users are viewing the model along the same axis, as if they were both seated at the same virtual position, looking in the same direction. In this case they use virtual icons to point to the scene. We have optimized the display for local collaboration, where the two users are seated next to one another in the same physical place, rather than for distance or tele-collaboration. Additional users can stand behind the two users and view the screens with stereo glasses (without head tracking).

Figure 1. Two users interacting in the PIT.

Access to common devices. Over the years, our chemist collaborators have preferred to use multiple physical input devices such as dials and buttons rather than use modal or overloaded 2D interfaces. We wanted to provide a 3D display that would allow the user access to a lab notebook, a keyboard, a small display for text and GUI elements, physical dials and buttons, and other common devices. This led us away from opaque head-mounted displays, and towards head-tracked stereo in an environment that included space for a table, portable PCs, and other equipment. Input devices may be shared or user-specific. Our intention is to provide devices that are appropriate and natural for the tasks that need to be performed without restricting the users to particular modes of 3D or 2D interaction. Each user has a laptop PC for accessing 2D graphical user interface elements, for reading experiment plans, and for making lab notebook entries based on observations. Two tracked 6 degree-of-freedom handheld controllers provide pointing, picking, and other scene manipulations. Physical dials and buttons are used for shared interactions such as repositioning the displayed structure and controlling PIT display modes. A SensAble Technologies PHANToM arm is available for force-feedback display.

Table-top workspace within arm's reach. We found in our earlier experiments using head-mounted displays for molecular graphics applications that users did not choose to fly around or walk around room-filling molecules. The PIT provides a more comfortable environment where the user can sit down and easily reach and manipulate the model under study. Its size can be made similar to the 2 cm/Å scale of the familiar brass models that chemists have historically used.

Related work

Systems employing head-tracked stereo on multiple large screens for a single tracked user, such as the CAVE, have been actively used in recent years [Cruz-Neira et al., 1993; Deering, 1993]. Single-screen head-tracked workbench-style displays have also been widely reported [Agrawala et al., 1997; Czernuszenko et al., 1997; Grant et al., 1998]. Agrawala and others [1997] extended the Responsive Workbench display to allow for two tracked users viewing a single screen. They made custom modifications to their shutter glasses and video system to multiplex the four views on a single projector. Extensions to CAVE systems to allow for multiple tracked users are reported in on-line CAVE documentation maintained by the Electronic Visualization Laboratory at UIC. Others have used head-mounted displays for multi-user immersive virtual reality or augmented reality systems [Blanchard, 1990; Szalavari, 1998].

Hardware components

Figure 2 shows a block diagram of the hardware components of the PIT workspace.

Image generation. We use a Silicon Graphics Onyx2 workstation with InfiniteReality graphics to generate two video output channels of pixels each. The video is displayed in a rear-projection arrangement using two AmPro 3600 projectors positioned to give their smallest focused image size. A StereoGraphics CrystalEyes box doubles the vertical refresh rate of the projectors to provide stereo images synchronized to LCD shutter glasses. Each eye's view is rendered into a region in the frame buffer of pixels [StereoGraphics, 1997].
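One common arrangement with sync-doubling hardware of this kind is the above-and-below stereo format, in which each eye's image occupies roughly half of the output channel and the doubler scans the two halves out as alternating fields. The following is a minimal OpenGL sketch of that per-eye viewport split; the resolution constants and the drawScene stub are illustrative placeholders, not the PIT's actual values or code.

```cpp
// Sketch (assumption): above-and-below stereo, with each eye's image rendered
// into half of the frame buffer.  Constants and drawScene() are placeholders.
#include <GL/gl.h>

const int kFrameWidth  = 1280;                 // hypothetical channel width
const int kFrameHeight = 1024;                 // hypothetical channel height
const int kFieldHeight = kFrameHeight / 2;     // one eye's region

void drawScene(int eye) { /* application rendering for the given eye (0 = left, 1 = right) */ }

void renderStereoFrame()
{
    // Left eye into the upper region of the frame buffer.
    glViewport(0, kFieldHeight, kFrameWidth, kFieldHeight);
    drawScene(0);

    // Right eye into the lower region.
    glViewport(0, 0, kFrameWidth, kFieldHeight);
    drawScene(1);
}
```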

Figure 2. PIT hardware components.

In order to do geometrically correct head-tracked stereo, we must accurately measure the physical positions of the screen corners. We have done this manually, and have also used a theodolite surveying device to obtain more accurate measurements [Grant et al., 1998]. We have found that we also need to retune the projectors frequently to maintain alignment of the displayed video with the physical screen corners and minimize other distortions. We also require accurate measurements of the tracker sensor-to-head position transformation, and of the users' inter-pupillary distance (IPD). We allow the users to adjust the software's IPD value at runtime using dials.

Screen construction. Figures 3 and 4 show the PIT screen material and frame used for our initial implementation of the system. The screen surfaces are each 3 ft. high by 4 ft. wide. We chose a screen size that would provide an adequate field of view for a seated user with line of sight intersecting the center of the screen, as constrained by the smallest focused image we could get from the projectors.

Screen material. We chose a vinyl screen material according to its brightness and gain properties. We wished to maximize image brightness across the range of normal head positions, without brightness falloff being noticeable, and without serious inter-reflection between the two screens. A single sheet is used across the two screen areas and is tensioned to make both surfaces vertical and flat. A Naugahyde fabric border is stitched around the outside of the screen, and behind the corner seam. The border contains grommet holes for tying the screen to the external frame, as well as loops through the material to contain a rod to help keep the screen material flat.
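Returning to the projection setup: given the surveyed screen-corner positions and a tracked eye position, one standard way to construct each per-eye view is an off-axis frustum expressed in the screen's own coordinate frame. The sketch below illustrates that construction in fixed-function OpenGL; it is not the PIT library's actual code, and the Vec3 helpers and parameter names are placeholders.

```cpp
// Off-axis projection from measured screen corners and a tracked eye position.
#include <GL/gl.h>
#include <cmath>

struct Vec3 { double x, y, z; };
static Vec3   sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3   cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static Vec3   normalize(Vec3 v)     { double l = std::sqrt(dot(v, v)); return {v.x/l, v.y/l, v.z/l}; }

// pa = lower-left, pb = lower-right, pc = upper-left screen corner (lab coords);
// pe = tracked eye position; n, f = near and far clip distances.
void setOffAxisProjection(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe, double n, double f)
{
    Vec3 vr = normalize(sub(pb, pa));        // screen right axis
    Vec3 vu = normalize(sub(pc, pa));        // screen up axis
    Vec3 vn = normalize(cross(vr, vu));      // screen normal, toward the viewer

    Vec3 va = sub(pa, pe), vb = sub(pb, pe), vc = sub(pc, pe);
    double d = -dot(va, vn);                 // eye-to-screen-plane distance

    // Frustum extents on the near plane.
    double l = dot(vr, va) * n / d;
    double r = dot(vr, vb) * n / d;
    double b = dot(vu, va) * n / d;
    double t = dot(vu, vc) * n / d;

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(l, r, b, t, n, f);

    // Rotate lab space into the screen's frame, then translate the eye to the origin.
    GLdouble M[16] = { vr.x, vu.x, vn.x, 0,
                       vr.y, vu.y, vn.y, 0,
                       vr.z, vu.z, vn.z, 0,
                       0,    0,    0,    1 };
    glMultMatrixd(M);
    glTranslated(-pe.x, -pe.y, -pe.z);

    glMatrixMode(GL_MODELVIEW);
}
```

Because the eyes move every frame, a projection of this form is recomputed for each of the four views; the measured IPD and sensor-to-head transformation determine where the two eye positions sit relative to the tracked sensor.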

The material is gathered at the corner, between the two sides of the screen, and re-sealed. A loop in the border behind it is used for support. We desired that the inside seam at the corner be straight and not allow light leakage across the two screens. We also required that it be of minimal thickness so as to give the impression of a continuous display surface with no visible line at the seam when the screens are in a 120-degree panoramic display configuration. In our first screen the corner seam was heat-sealed. In our second, the seam was stitched. Stitching produced a more durable seam that was less susceptible to tearing, but we found its visual appearance to be worse. The stitched seam wasn't as straight as the heat-sealed seam, and light leakage was visible at the holes created by the stitching.

Screen frame and projectors. We required that the frame that holds the screens be made of a non-ferrous material so that its presence would not interfere with the electromagnetic tracker. We chose to use 4-inch PVC pipe for this reason, and for its light weight, rigidity, and convenience. Holes are spaced along the outside of the frame for attaching the screen with adjustable elastic bungee cords passing through the grommets. The projectors are fixed to the floor behind the screens on mobile wooden frames that have adjustable bolts for leveling the projectors and adjusting their height.

Screen hinge. The corner of the screen frame is hinged to allow the screens to be placed at angles of 90 and 120 degrees. We fix the screens and projectors in place using pins through holes in the screen frame, projector mounts, and the lab floor. The left screen and projector are fixed at all times, and the right screen and projector can be easily moved between the two positions and secured to the floor with the pins. To maintain a straight seam at the corner between the two screens we place a vertical glass rod through a loop in the screen border material.

Lessons learned. Were we building the PIT screens again today, we would construct the frame from 3-inch T-section fiberglass-impregnated extrusions, for looks and ease of manufacture. This material would simplify right-angle joints and holes for the bungee cord. The T-section also offers the possibility of a less complex hinge assembly. The glass rod used for tensioning the screen could be made of less fragile material, perhaps as simple as a 1 x 4 board. Both rod and bungee cord are needed for tensioning the screen. We find that three pieces of cord on each side are sufficient and that surprisingly little tension is required to ensure a flat screen.

Servers for peripheral devices. We use a Dell 200 MHz Pentium Pro PC running the Linux operating system as a server for multiple input devices. The PC runs a server process using the locally developed VRPN library (Virtual Reality Peripheral Network, described at ). It relays data from trackers, buttons, and other input devices over the local ethernet. Additional PC servers provide optional access to a PHANToM force-feedback arm, the UNC ceiling tracker, or a sound server for audio output. We also use an SGI button and dial box connected directly to the serial port of an SGI Onyx.

Tracking. For head and hand tracking we have used both a commercial electromagnetic tracker and a locally developed optical tracker. We use the VRPN server PC to drive an Ascension Flock of Birds tracker with Extended Range Transmitter and four sensor units (two for head tracking and two for hand tracking). The server reads records from the four tracker sensors in parallel at an average rate of 100 Hz.
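A client application anywhere on the lab network receives these reports through the VRPN client interface. The following minimal sketch uses the public VRPN tracker API; the device name and host are placeholders rather than the actual PIT configuration.

```cpp
#include <vrpn_Tracker.h>
#include <cstdio>

// Callback invoked for every tracker report (position plus orientation quaternion).
void VRPN_CALLBACK handleTracker(void *userdata, const vrpn_TRACKERCB t)
{
    std::printf("sensor %d: pos (%.3f %.3f %.3f)\n",
                t.sensor, t.pos[0], t.pos[1], t.pos[2]);
}

int main()
{
    // Device and host names are illustrative, not the PIT's actual server name.
    vrpn_Tracker_Remote tracker("Tracker0@pit-server.example.edu");
    tracker.register_change_handler(nullptr, handleTracker);

    while (true)
        tracker.mainloop();   // poll the connection and dispatch callbacks
}
```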
We have observed distortion in tracker readings across the PIT working volume, and intend to apply correction tables using methods developed by Livingston and State [1997].
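One simple form of table-based correction, shown below as an illustration rather than as Livingston and State's exact procedure, stores measured position offsets on a regular grid spanning the working volume and applies them by trilinear interpolation. Grid size, spacing, and origin are assumed values.

```cpp
#include <algorithm>

struct Vec3 { double x, y, z; };

// Correction table: a regular NX x NY x NZ grid of position offsets measured
// across the working volume during a calibration pass (dimensions are illustrative).
const int NX = 16, NY = 16, NZ = 16;
const double kCell = 0.1;                  // grid spacing in meters (assumed)
Vec3 offsetTable[NX][NY][NZ];              // filled in by the calibration pass

Vec3 correctPosition(Vec3 p)
{
    // Continuous grid coordinates, clamped to the table interior.
    double gx = std::clamp(p.x / kCell, 0.0, NX - 1.001);
    double gy = std::clamp(p.y / kCell, 0.0, NY - 1.001);
    double gz = std::clamp(p.z / kCell, 0.0, NZ - 1.001);
    int ix = (int)gx, iy = (int)gy, iz = (int)gz;
    double fx = gx - ix, fy = gy - iy, fz = gz - iz;

    // Trilinear interpolation of the eight surrounding offsets.
    Vec3 o{0, 0, 0};
    for (int dx = 0; dx <= 1; ++dx)
      for (int dy = 0; dy <= 1; ++dy)
        for (int dz = 0; dz <= 1; ++dz) {
            double w = (dx ? fx : 1 - fx) * (dy ? fy : 1 - fy) * (dz ? fz : 1 - fz);
            const Vec3 &c = offsetTable[ix + dx][iy + dy][iz + dz];
            o.x += w * c.x;  o.y += w * c.y;  o.z += w * c.z;
        }
    return {p.x + o.x, p.y + o.y, p.z + o.z};
}
```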

Figure 3. The PIT workspace in its 120-degree configuration.

Figure 4. Detail of screen corner, showing frame hinge, screen material, and glass rod for support.

For more accurate and faster head tracking we have also used the UNC ceiling tracker [Ward et al., 1992; Welch and Bishop, 1997]. We attach a HiBall optical sensor to each pair of stereo glasses (see Figure 5). The sensor views active infrared LEDs on the ceiling above the PIT and processes these views to provide position and orientation records at approximately 1 kHz. The ceiling tracker's latency is user-configurable from approximately 1 ms to 100 ms, depending on the level of filtering. We expect that a level of filtering resulting in approximately 30 ms of tracker latency will be adequate. In our initial tests, the infrared LEDs on the ceiling occasionally interfered with the infrared synchronization signal sent to the LCD shutter glasses, triggering the shutters. To solve this problem we plan to investigate methods to filter or shield the glasses from seeing the ceiling signal.

Figure 5. StereoGraphics shutter glasses with the UNC HiBall tracker.

Software structure

The PIT software library has been designed to facilitate fast conversion of existing applications, and to allow for the same application code to run on different display devices, with either one or two users. Figure 6 shows the basic software structure. The core PIT code handles all functions required to display head-tracked stereo images and to allow a small set of viewing manipulations (primarily translate, rotate, scale, grab world, and adjust IPD). The user performs these manipulations using dials, buttons, hand controllers, and GUI control panels displayed on the laptop computers using the X and Tcl/Tk libraries. Applications link with the PIT API library, which is written in C++ and uses OpenGL and the UNC Vlib, VRPN, and other supporting libraries.

We are testing methods for decoupling the simulation and display processes, similar to methods described by Pausch et al. [1994] and Shaw et al. [1992]. Our intent is that at runtime, PIT applications will be split into two main threads of execution: a model simulation loop, executing application-defined code, and a display loop, executing PIT library code and application-defined callback functions. An application must supply a display callback function that draws the model in world space using OpenGL. The PIT library sets the necessary viewing projections prior to calling the display function four times per frame (once per user, per eye). The application's display callback is passed information about which view is being drawn (left or right user, and left or right eye), and may use those parameters to customize its model display. In the simplest mode of operation, the display callback ignores these parameters and draws the same geometry for both users. The PIT will also call other optional application-specified callbacks: a model update callback to coordinate data sharing between the application's simulation and display code, and a graphics initialization callback, called once during initialization in the display execution thread.
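The sketch below illustrates this callback structure. The type and function names are hypothetical stand-ins rather than the actual PIT API, and the simple loop stands in for the library-side code that would set each view's viewport and head-tracked projection before invoking the application's callback.

```cpp
// Illustrative sketch of the four-views-per-frame callback structure.
// All names here are hypothetical stand-ins, not the actual PIT API.
#include <cstdio>

enum PitUser { PIT_LEFT_USER = 0, PIT_RIGHT_USER = 1 };
enum PitEye  { PIT_LEFT_EYE  = 0, PIT_RIGHT_EYE  = 1 };

typedef void (*PitDisplayCallback)(PitUser, PitEye);

// --- application code --------------------------------------------------------
void appDisplay(PitUser user, PitEye eye)
{
    // In the simplest mode these parameters are ignored and the same world-space
    // geometry is drawn for every view; an application may also branch on them.
    std::printf("drawing view: user %d, eye %d\n", user, eye);
    // ... OpenGL calls that draw the model in world space would go here ...
}

// --- library-side frame loop (simplified stand-in) ---------------------------
void renderFrame(PitDisplayCallback display)
{
    for (int u = 0; u < 2; ++u)
        for (int e = 0; e < 2; ++e) {
            // The real library would set the per-view viewport and the
            // head-tracked off-axis projection here, before the callback.
            display(static_cast<PitUser>(u), static_cast<PitEye>(e));
        }
}

int main()
{
    renderFrame(appDisplay);   // one frame's worth of the four views
    return 0;
}
```

The model update callback would run on the simulation side of this split, copying simulation results into the data read by the display loop so that the display rate stays independent of simulation cost.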

Figure 6. PIT software structure. Blocks indicate modules of source code. Arrows indicate data passed between the code modules.

The PIT API library also includes functions to modify display parameters and to query the PIT for data from the trackers, hand controllers, and other input devices to perform more complicated interactions such as selecting and moving individual model elements. To provide 3D picking, the application queries the PIT API for the status of the hand controller's buttons, and for the hand controller's position and orientation with respect to the application's model space.
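A sketch of this query-and-pick pattern follows. The pitQuery* functions are hypothetical stand-ins for the PIT API query calls, and the atom list and distance threshold are illustrative.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

// Hypothetical stand-ins for PIT API queries (the real names differ).
bool pitQueryControllerButton(int user, int button);
Vec3 pitQueryControllerPositionInModelSpace(int user);

struct Atom { Vec3 pos; bool selected; };

// On a button press, select the atom nearest the hand controller, if it lies
// within grabDistance of the controller tip (model-space units).
void pickNearestAtom(std::vector<Atom> &atoms, int user, double grabDistance)
{
    if (!pitQueryControllerButton(user, /*button=*/0))
        return;

    Vec3 tip = pitQueryControllerPositionInModelSpace(user);
    Atom *best = nullptr;
    double bestDist = grabDistance;
    for (Atom &a : atoms) {
        double dx = a.pos.x - tip.x, dy = a.pos.y - tip.y, dz = a.pos.z - tip.z;
        double d = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (d < bestDist) { bestDist = d; best = &a; }
    }
    if (best) best->selected = true;
}
```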

Evaluation and next steps

We have ported the crystallography application CORWIN (for "coupled reciprocal windows") for members of the Biochemistry department at UNC to use in the PIT. CORWIN allows users to manipulate molecules to perform protein-fitting tasks. The proteins can be viewed in real-space or reciprocal-space (Fourier-space) representations. Users manipulate a protein model by moving individual residues and rotating about individual bonds to make the model fit an electron density map. The basic PIT display provides controls for changing viewing and world-space transformations. The CORWIN application code handles application-specific interaction, such as selecting and manipulating individual atoms and residues, and provides other interactions through a GUI interface on the laptop computers.

We plan to test the effectiveness of the PIT version of CORWIN by having UNC biochemists use the system on real data. We plan to gather additional human-factors results by extracting from CORWIN a simple 3D fitting task and conducting controlled experiments to measure performance under different viewing conditions and using different manipulation techniques. We have also created general polygonal model viewers for the PIT for other applications, such as architectural walkthrough or mechanical part design, and are working with outside users to port other molecular graphics applications to the PIT. For additional information, including a list of material specifications and sources, and additional images and video sequences of the PIT, please see the project web page at:

Acknowledgments

We thank the following people for their contributions to this work: Sumedh Barde, David Harrison, John Thomas, Ruigang Yang, Hans Weber, Brent Insko, Michael Meehan, Kurtis Keller, Andy Wilson, and members of the UNC-CH Tracker and nanoManipulator projects. The PIT screen design was done in collaboration with Fakespace, Inc. Work on the PIT is funded by the NIH Division of Research Resources grant number RR02170, with other significant support from Intel Corporation.

References

Agrawala, Maneesh, Andrew C. Beers, Bernd Fröhlich, and Pat Hanrahan. The two-user responsive workbench: Support for collaboration through individual views of a shared space. Proceedings of SIGGRAPH 97, Computer Graphics Proceedings, Annual Conference Series, 1997, ACM SIGGRAPH, pp.

Blanchard, Chuck, Scott Burgess, Young Harvill, Jaron Lanier, Ann Lasko, Mark Oberman, and Michael Teitel. Reality built for two: A virtual reality tool. Proceedings of the 1990 Symposium on Interactive 3D Graphics, Computer Graphics, Vol. 24, No. 2, March 1990, pp.

Brooks, F. P., Jr., M. Ouh-Young, J. J. Batter, and P. J. Kilpatrick. Project GROPE: Haptic displays for scientific visualization. Proceedings of SIGGRAPH 90, Computer Graphics, August 1990, Dallas, TX, pp.

Cruz-Neira, Carolina, Daniel J. Sandin, and Thomas A. DeFanti. Surround-screen projection-based virtual reality: The design and implementation of the CAVE. Proceedings of SIGGRAPH 93, Computer Graphics Proceedings, Annual Conference Series, ACM SIGGRAPH, New York, 1993, pp.

Czernuszenko, M., D. Pape, D. Sandin, T. DeFanti, G. Dawe, and M. Brown. The ImmersaDesk and Infinity Wall projection-based virtual reality displays. Computer Graphics, Vol. 31, No. 2, May 1997.

Deering, Michael. Making virtual reality more real: Experience with the virtual portal. Proceedings of Graphics Interface 93, Canadian Information Processing Society, pp.

Grant, Brian, Aron Helser, and Russell M. Taylor II. Adding force display to a stereoscopic head-tracked projection display. Proceedings of VRAIS '98, Atlanta, Georgia, 1998.

Livingston, Mark A. and Andrei State. Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators and Virtual Environments, MIT Press, Vol. 6, No. 5, October 1997, pp.

Pausch, Randy, Matthew Conway, Robert DeLine, Rich Gossweiler, Steve Maile, Jonathan Ashton, and Richard Stoakley. Alice & DIVER: A software architecture for the rapid prototyping of virtual environments. Course notes for SIGGRAPH '94 course, Programming Virtual Worlds, 1994.

Shaw, Chris, Jiandong Liang, Mark Green, and Yunqi Sun. The decoupled simulation model for virtual reality systems. Proceedings of the CHI 92 Conference on Human Factors in Computing Systems, 1992, pp.

StereoGraphics Corporation. StereoGraphics developers handbook. StereoGraphics Corporation, San Rafael, CA, 1997. Available at

Szalavari, Zs., D. Schmalstieg, A. Fuhrmann, and M. Gervautz. Studierstube: An environment for collaboration in augmented reality. To appear in Virtual Reality Journal, 1998. Available at

Ward, Mark, Ronald Azuma, Robert Bennett, Stefan Gottschalk, and Henry Fuchs. A demonstrated optical tracker with scalable work area for head-mounted display systems. Proceedings of the 1992 Symposium on Interactive 3D Graphics, Computer Graphics, Vol. 25, No. 2, pp.

Welch, Gregory and Gary Bishop. SCAAT: Incremental tracking with incomplete information. Proceedings of SIGGRAPH 97, Computer Graphics Proceedings, Annual Conference Series, 1997, pp.
