Immersive Multi-Projector Display on Hybrid Screens with Human-Scale Haptic Interface
IEICE TRANS. INF. & SYST., VOL.E88-D, NO.5 MAY 2005

PAPER Special Section on Cyberworlds

Immersive Multi-Projector Display on Hybrid Screens with Human-Scale Haptic Interface

Seungzoo JEONG a), Nonmember, Naoki HASHIMOTO, and Makoto SATO, Members

SUMMARY Many immersive displays developed in previous research are strongly influenced by the design concept of the CAVE, the origin of immersive displays. Viewed as human-scale interactive systems for virtual environments (VE), the existing immersive systems do not fully exploit the potential of the human senses. Their displays require more complicated structures for flexible extension, and are more restrictive of the user's movement. We therefore propose a novel multi-projector display for immersive VE with a haptic interface for more flexible and dynamic interaction. The display part of our system, named D-vision, has a hybrid curved screen that combines flat and curved screen surfaces. It renders images seamlessly in real time, and generates high-quality stereovision by means of a PC cluster and a two-pass rendering technique. Furthermore, a human-scale string-based haptic device is integrated with the D-vision for a more interactive and immersive VE. In this paper, we give an overview of the D-vision and the technologies used for the human-scale haptic interface.

key words: immersive projection technology, human-scale virtual environment, multi-modal interaction, force feedback

1. Introduction

Recently, many immersive displays have been developed for virtual reality, education, industry, entertainment, etc. These displays originate from the CAVE [3], which surrounds users with large flat screens. Despite the simple design of the CAVE, it realizes highly immersive virtual environments effectively. Other studies have proposed systems extending the number of its screens, creating a cubic space completely surrounded by six large screens [21].
This may be the ultimate structure for CAVE-like displays, because it produces an overwhelming feeling of presence with no leakage of the outer world. However, when users interact with a VE, not only visual quality but also natural modes of interaction and control, and the perception of self-movement, affect presence and immersion [5]. From this point of view, adding force feedback to an immersive VE is the most direct and natural approach to increasing user involvement. Studies on presence [14], [16] have demonstrated that haptic feedback improves task performance and increases presence in a VE: users felt present in the virtual environment to a higher degree when they received haptic information. Though CAVE-like displays have been tried with the integration of an arm-type force feedback device, these solutions have several drawbacks. The arm can interfere with the interaction with virtual objects or hide parts of the visualization space. Furthermore, it only works within a limited space [4], [20]. For human-scale interaction between the real and virtual worlds, it is very important that the user can interact with virtual objects at or close to his hands, to obtain a high degree of presence and realism. This means that a display system should allow the user to step into the screen area so as to gain proper control and lessen the gap between perceived distance and physical length. Above all, for our desired interaction system, the display and human interfaces should have a flexible configuration and high visualization quality. In this paper, we propose a novel multi-projector display with a string-based haptic interface for a more interactive and immersive VE.

Manuscript received September 13, revised December 1. The authors are with the Precision and Intelligence Laboratory, Tokyo Institute of Technology, Yokohama-shi, Japan. a) jeongzoo@hi.pi.titech.ac.jp DOI: /ietisy/e88 d
We have developed a multi-projection display system named D-vision [6]. The "D" of D-vision originates in "Divide" and "Duplex-vision". The system adopts multi-projector technology, using 24 PCs and projectors to provide high-resolution, wide-view-angle images. Images are projected on a hybrid screen, which consists of a flat central screen and a curved peripheral screen. Adopting a flat screen at the front secures brightness and resolution per unit area, and the distortion-free flat surface can create high-quality stereoscopic vision. This design concept establishes an effective installation space, and more seamless images are generated on the hybrid screen through our rendering method. To generate force feedback, the D-vision is combined with a human-scale haptic interface named SPIDAR-H. The haptic interface consists of motors and flexible strings, instead of an electromagnetic sensor. Because of its simplicity, it can be integrated into the D-vision flexibly. It also allows the user full-body activity and a practicable manipulation space. Our aim is to increase the sense of togetherness through touch by allowing physical contact. Our proposed system invokes feelings of physical and social presence unobtainable through hearing and seeing alone. Hence, when a VE provides life-size interaction and an immersive virtual space, human metaphors enlarge their potential and the interfaces extend their capability further.

2. Immersive Multi-Projector: D-vision

In this section, we will describe how to implement our

Copyright © 2005 The Institute of Electronics, Information and Communication Engineers
system, the D-vision, as an immersive and interactive multi-projection system in detail. An overview of the D-vision is shown in Fig. 1.

Fig. 1 An overview of the multi-projector display D-vision. Projectors and a PC cluster for graphics rendering are placed around a hybrid screen.

Fig. 2 The structure of the hybrid curved screen. Its central area is flat, and its peripheral area is spherical, containing partial surfaces of such objects as planes, spheres, cylinders and tori.

2.1 Hybrid Curved Screen

In many existing CAVE-like immersive displays, a cubic screen has mostly been the dominant solution. The cubic type, with six flat screens that completely surround the users inside it, is said to be the ultimate structure. However, it requires a more complicated structure for flexible extension, and makes it more difficult to integrate other mechanical and large-scale interfaces. Other immersive displays with a spherical or arched screen have also been developed [7], [19]. They reduce image distortion and serve multiple users at the same time. These spherical display systems used only front projection to avoid extra space. However, such systems suffer from a fundamental problem: a user's shadow will inevitably appear on the displayed image [12]. For this reason, users cannot freely enter the inside of the screen. When rear projection is applied to the front screens to avoid shadows, the existing displays run into the crucial problems of massive space requirements and high hardware cost. For instance, a spherical screen causes ill-focused and dark areas on its steeply curved surface. To diminish the dark areas, the screen must be subdivided into smaller parts, each projected by its own projector; this is impractical in cost. Our screen design attempts to use limited space efficiently by reducing the installation space.
Moreover, it was designed so that the system provides a virtual experience with human-scale bodily input and can use force feedback with ease. Firstly, it is composed of a hybrid curved screen, which adopts a flat screen for the central view and curved screens for the peripheral view, based on the structure of human vision. In human vision, the central view perceives the outer world precisely with high-resolution input, while the peripheral view detects the movements of objects in the outer world with low resolution but a wide view angle. Therefore, a flat Fresnel-lenticular stereoscopic screen is used in the central area of the hybrid screen for high-quality image projection, and in the peripheral area a simple curved screen made of fiberglass-reinforced plastic (FRP) is used to realize a wide view angle. The curved screen is composed of simple primitive shapes, as shown in Fig. 2. The size of the hybrid screen amounts to 6.3 m (width) × 4.0 m (height) × 1.5 m (depth). Because large curved surfaces are difficult to build accurately, each of the screen elements in Fig. 2 is based on a simple shape and is relatively small; for example, the radius of the spherical part is 1.5 m. These curved surfaces are connected smoothly so as not to distort projected images at the joints between the screens. Even though the hybrid screen covers only the user's front half, no loss of immersion is detected: the user's view direction can be controlled and kept central by using a locomotion interface [11]. As shown in Fig. 1, the central flat part of the screen is rear-projected by 8 projectors at SXGA (1280 × 1024 pixels) resolution. The remaining part of the screen is front-projected by 16 projectors at XGA (1024 × 768 pixels) resolution. The brightness of each projector is about 3,000 ANSI lumens. These projectors are installed at the positions indicated in Fig. 3. The projectors for the central screen, shown in Fig.
3 (a), are arranged in a common configuration with a lens-shift function to realize stereoscopic image projection. The other projectors, shown in Fig. 3 (b), adopt an original arrangement to minimize installation space and avoid casting shadows on the screen. The whole size in Table 1 indicates the volume of the installation space including screens, projectors and PCs [9], [21]. Despite its smaller overall volume, the ratio of the D-vision's screen volume to its whole volume is much higher than that of the other displays. Therefore, we can confirm that the D-vision's design concept realizes an efficient configuration in a limited space.
Fig. 3 Projector arrays. (a) shows the projectors behind the screen, and (b) those behind the users. The arrangement of these projectors is original, reducing installation space and cast shadows.

Table 1 Sizes of existing immersive displays: system (screens), whole (m), screen (m), ratio, for the D-vision (4), CAVE (1), CABIN (6) and COSMOS (17); size is (W) × (D) × (H).

2.2 Real-Time Image Correction

The D-vision uses a parallel rendering strategy to generate distributed, high-resolution images in real time. The rendering processes are distributed over the PCs of the PC cluster, each working on its own target area of the whole screen. In this system, whole images of a scene are divided into 16 areas for distributed rendering, with each PC connected to one projector. The 8 areas covering the central view and the areas directly above and below it are rendered by 16 PCs for stereoscopic viewing with polarized glasses. As a result, a total of 24 PCs are used for image generation in the D-vision. In the compound arrangement of flat and curved screens, however, the color maps of the projectors are not balanced and lack uniformity in the area linking the two screens. Images also become more distorted as the user's viewpoint changes. Even though the front screen is a mostly flat, distortion-free area, geometry and color correction of the rendered images is still needed to generate and perceive a seamless virtual world. In our rendering method, the correction tables are measured with a digital camera and calculated accurately in advance [22]. The tables are applied in two ways, as shown in Fig. 4 (a). One is to use custom image-processing hardware to perform the correction in real time; this method is stable and reliable with such special hardware. The other is to use a programmable GPU on a PC's video card.
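The table-lookup idea behind this correction can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: a tiny NumPy emulation of a second pass that warps each output pixel through a measured geometry table and attenuates it by a gray-scale brightness map. The array names and shapes are assumptions.

```python
import numpy as np

H, W = 4, 4  # tiny frame for illustration (real projectors are XGA/SXGA)

rendered = np.arange(H * W, dtype=float).reshape(H, W)  # pass-1 output

# Geometry table: for each output pixel, the (row, col) to sample from.
# Identity mapping here; a measured table would encode the screen warp.
geom_rows, geom_cols = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")

# Brightness map: per-pixel gain (e.g. edge-blending ramps between projectors).
gain = np.full((H, W), 0.5)

# Pass 2: warp by table lookup, then attenuate by the brightness map.
corrected = rendered[geom_rows, geom_cols] * gain

print(corrected[0, 0], corrected[-1, -1])  # 0.0 7.5
```

On a GPU the same two lookups would be texture fetches in an extra fragment-shader pass, which is why the per-frame cost stays low.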
Today's GPUs have enough power to process images in real time. Each correction table is stored in video memory as a texture, as shown in Fig. 4 (b), and referenced in extra rendering passes to perform the image corrections. Figure 4 (b) illustrates examples of brightness-correction maps as gray-scale texture images. Figures 4 (c) and 4 (d) show the results of the image corrections.

Fig. 4 Image correction for the curved screen. Before correction, images projected by the various projectors overlap each other in a state of disorder. After correction, the images are seamlessly blended, and a virtual world is realized in the D-vision.

This correction method has low computation cost and low latency because of the high-speed image-processing hardware or powerful GPUs.

3. Human-Scale Haptic Interface

In this section, we describe a less invasive, more flexible human-scale haptic device, SPIDAR-H, and discuss how to integrate this separately developed haptic interface with human-scale immersive VEs.

3.1 SPIDAR-H System

Adding haptic information to immersive virtual environments is especially effective for interaction and for enhancing presence in those environments. Recent mainstream haptic interfaces for virtual reality adopt a mechanical link structure or a wire-driven system. In an immersive VE, however, invasive mechanical links prevent users from seeing the surrounding images seamlessly and work only within a limited space, even though such structures are stable for highly accurate and scalable operation. On the other hand, a wearable force display system for immersive projection displays [8] has been developed. Its mechanism is based on the same wire-driven approach as our haptic system, SPIDAR. Its compact size and
portable functions let the user walk around freely within a large workspace thanks to its wearable design. However, it has a limited haptic space, and its structure requires a larger frame for two-handed interaction; the extended frames can also interfere with the user's vision. Although several types of improved haptic devices have been developed, they are still not suitable for use in immersive virtual environments because of their size and mechanism [18]. This has been a constraint, and even a difficulty, in integrating force feedback devices with immersive VEs.

Fig. 5 A human-scale haptic interface.

These problems can be solved by an alternative interface that provides force feedback for direct manipulation within a sufficiently large space: less invasive, more flexible string-based haptic devices. We therefore adopted a human-scale interface named SPIDAR-H, which is based on a string-based haptic device for desktop operation [10]. Like the previous human-scale haptic system [1], the SPIDAR-H provides force feedback at one distinct point per hand, such as the middle finger, in 3 translational degrees of freedom. As four motors are necessary for each point, 8 motors are installed for both hands. They are positioned on non-adjacent vertices of a cubic frame in the original structure (see Fig. 5). This is a simple configuration for two-handed force feedback that can display haptic forces in every direction. The haptic rendering is implemented with a spring-and-damper model: force feedback occurs when collisions or impacts happen. The simulation frame rate can vary from 1 kHz to 10 kHz. The haptic and visual rendering are synchronized stably by high-speed I/O and a high-resolution haptic controller.

3.2 Integrating SPIDAR-H into the D-Vision

Configuring the haptic device to satisfy both human-scale force feedback and a highly immersive display takes some care.
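The spring-and-damper haptic rendering described in Sect. 3.1 can be sketched as below. The gains and the simple one-dimensional contact model are assumptions for illustration, not the system's actual constants or collision handling.

```python
def haptic_force(penetration_m, velocity_m_s, k=500.0, d=5.0):
    """Force pushing the finger out of a virtual surface.

    Zero until the tracked point penetrates the surface; then a spring
    term proportional to penetration depth plus a damper term opposing
    the penetration velocity (k, d are assumed gains).
    """
    if penetration_m <= 0.0:       # no collision -> no force feedback
        return 0.0
    return k * penetration_m - d * velocity_m_s

# At a 1 kHz simulation rate, the controller evaluates this every millisecond.
print(haptic_force(0.01, 0.0))   # 5.0 N of spring force at 1 cm penetration
print(haptic_force(-0.01, 0.0))  # 0.0, the finger is outside the object
```

The damper term is what keeps the rendered contact stable at the high update rates the paper cites; a pure spring tends to oscillate.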
String-based haptic devices are flexible, but in exchange their precision degrades as they are scaled up: the larger the SPIDAR workspace, the less accurate the manipulations become. However, the choice of interface depends on whether the interaction is oriented toward task performance or toward intuitive human senses. If we design the interface to focus on the potential of human physical activity, its bandwidth can be extended to an acceptable level. For a manipulation space of about 1.1 m³, the absolute maximum error is ±1.5 cm; even in a space of double that size, it remains within a 2-3 cm range. In an experiment [2] tracing both a real cube and a virtual cube at a control frequency of about 1 kHz, the traced positions on the virtual object differed by at most about 10 mm from the traced rim of the real object in a well-calibrated working space. The results showed that adding haptic information to the interaction helps the user transfer physical information as in the real world, and that the user can interact with virtual objects accurately even though our system is a large-scale interface.

Fig. 6 A snapshot of the whole system, with the SPIDAR-H interface integrated into the D-vision. Each of the user's hands is connected to 4 strings and motors for force feedback. The string-based structure is quite safe and is less restrictive of the user's full-body movements.

We implemented the SPIDAR-H without the fixed cubic frame in the D-vision so as to offer a flexible hardware configuration and high visualization quality. The manipulation space of each hand is extended to approximately 3 m³ to cover the visualization space of the D-vision fully. In this system, a total of 8 motors for both hands are placed around the user. The circles at the top of Fig. 6 illustrate the positions of the motors on the D-vision. The four motors placed in front of the users are fixed behind the screens, and their strings pass through small holes in the curved peripheral screen.
The other motors are placed behind the users, mounted on a frame used for the projectors. The SPIDAR-H has two rings, one worn on a finger of each hand, and each ring is attached to four strings through which motor torque is transmitted. Each motor has a rotary encoder, which measures the length of the string from the ring to the motor. From the lengths of the four strings, the position of the ring is calculated,
and the force displayed to the users is controlled as they interact with virtual objects directly with their own hands. The strings never prevent the users from immersing themselves in virtual worlds with surrounding images, and their flexibility enables the users to perform various motions freely.

4. Applications

The D-vision is a novel immersive projection display with the multi-modal interfaces described in the sections above. As in other immersive VEs, users can control applications with familiar input devices such as a game pad or joystick, and interact while wearing common interface hardware such as electromagnetic tracking sensors, markers, and polarized glasses. In particular, users can walk around a large-scale workspace during manipulation, and apply force interaction to virtual objects with their own hands. This shows that our system is well suited to direct and dynamic interaction. In this section, we illustrate the potential and capabilities of our system through some example applications. First, Fig. 7 (a) illustrates a psychological experiment for the analysis of human vision using a large screen surrounding the user's field of view [17]. If the screen is extended to the user's feet, the sensation of self-motion can be obtained effectively. This feature of the hybrid curved screen helps simulate a properly immersive VE for human-factor analysis. When an image with a specific pattern moves forward and backward, it invokes in the participants a sense of vection, like swinging back and forth, as in Fig. 7 (a). The sway of the participant's body is then measured and analyzed using a force plate containing pressure sensors. Secondly, as described above, haptic feedback can be used in any aspect of virtual reality where more information can be conveyed by touch than by sight and sound alone.
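The ring-position computation described in Sect. 3.2, recovering one 3-D point from the four encoder-measured string lengths, can be sketched as a small least-squares problem. The anchor coordinates below (non-adjacent vertices of a unit cube) are assumptions for illustration, not the D-vision's actual motor positions.

```python
import numpy as np

def ring_position(anchors, lengths):
    """Solve |p - a_i| = l_i for p by linearizing against the first anchor.

    Subtracting the first sphere equation from the others removes the
    quadratic |p|^2 term, leaving a linear system 2(a_i - a_1) . p = b_i.
    """
    a1, l1 = anchors[0], lengths[0]
    A = 2.0 * (anchors[1:] - a1)
    b = (l1**2 - lengths[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a1**2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Four non-adjacent vertices of a unit cube, as in the SPIDAR layout.
anchors = np.array([[0, 0, 0], [1, 1, 0], [1, 0, 1], [0, 1, 1]], float)
true_p = np.array([0.5, 0.5, 0.5])
lengths = np.linalg.norm(anchors - true_p, axis=1)  # simulated encoder readings

print(np.round(ring_position(anchors, lengths), 3))  # [0.5 0.5 0.5]
```

With exactly four strings the linearized system has three equations in three unknowns; least squares also tolerates the small length-measurement errors a real encoder produces.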
Typical haptic interfaces can be used to manipulate virtual objects with force response. Figure 7 (c) shows an example of molecule visualization [15]. We can intuitively change the positions of stereoscopically displayed molecules with the SPIDAR-H and feel the intermolecular forces. By introducing life-like virtual humans into a VE, interface technologies can enrich and improve human interaction. We can realize force reaction with a virtual human that generates its own motions [13] (see Fig. 7 (d)). The movement of this reactive virtual human is generated from real human motion data, pre-captured with a traditional magnetic motion capture system and stored in our database. Using the SPIDAR-H device of our proposed system, the user can hold and throw a virtual ball like a real one. The haptic information originating from the user becomes the key factor in generating the virtual human's reactions. As the proposed system gives visual and force feedback to the user's body immediately, the virtual human in the D-vision animates in accordance with the user's timing and physical space while transmitting an appropriate force to the user, enabling the interactive and dynamic applications mentioned above.

Fig. 7 Applications on the D-vision.

5. Conclusions

In this paper, we proposed a novel immersive multi-projector display with a human-scale haptic interface. The immersive system with a hybrid curved screen renders seamless images in real time and generates high-quality stereoscopic images by means of a PC cluster and two-pass rendering. Furthermore, a string-based haptic device is integrated with the D-vision. Because of its flexible configuration and high visualization quality, it enables increasingly believable and dynamic physical interaction with virtual objects and humans. Through the development of various applications so far, we have confirmed its potential for extending the frontier of interaction and haptic interfaces. In future work, we will develop software that enables users to create applications for the D-vision easily and efficiently, and will also try to improve the accuracy and richness of the immersive VE system.

References

[1] Y. Cai, M. Ishii, and M. Sato, "A human interface device for CAVE-size virtual workspace," IEEE International Conference on Systems, Man, and Cybernetics, vol.3, no.3, pp. , Oct.

[2] W. Choi, S. Jeong, N. Hashimoto, S. Hasegawa, Y. Koike, and M. Sato, "A development and evaluation of reactive motion capture system with haptic feedback," Proc. 6th IEEE International Conference on Automatic Face and Gesture Recognition, pp. .

[3] C. Cruz-Neira, D.J. Sandin, and T.A. DeFanti, "Surround-screen projection-based virtual reality: The design and implementation of the CAVE," Proc. SIGGRAPH 93, pp. .

[4] A. Fischer and J.M. Vance, "PHANToM haptic device implemented in a projection screen virtual environment," Proc. Workshop on Virtual Environments 2003, pp. .

[5] B.G. Witmer and M.J. Singer, "Measuring presence in virtual environments: A presence questionnaire," Presence, vol.7, no.3, pp. .

[6] N. Hashimoto, J. Ryu, M. Yamasaki, T. Minakawa, H. Takeda, S. Hasegawa, and M. Sato, "D-vision: Immersive multi-projection display with a curved hybrid screen," Proc.
8th International Conference on Virtual Systems and MultiMedia (VSMM2002), pp. .

[7] W. Hashimoto and H. Iwata, "Ensphered vision: Spherical immersive display using convex mirror," Transactions of the Virtual Reality Society of Japan, vol.4, no.3, pp. .

[8] M. Hirose, K. Hirota, T. Ogi, H. Yano, N. Kakehi, M. Saito, and M. Nakashige, "HapticGEAR: The development of a wearable force display system for immersive projection displays," Proc. Virtual Reality 2001 Conference (VR'01), pp. .

[9] M. Hirose, T. Ogi, S. Ishiwata, and T. Yamada, "Development and evaluation of immersive multiscreen display: CABIN," Systems and Computers in Japan, Scripta Technica, vol.30, no.1, pp.13-22.

[10] M. Ishii and M. Sato, "A 3D spatial interface device using tensed strings," Presence, vol.3, no.1, pp.81-86.

[11] M. Iwashita, A. Toyama, N. Hashimoto, S. Hasegawa, and M. Sato, "Development of locomotion interface based on step-in-place movement," IEICE Trans. Fundamentals (Japanese Edition), vol.J87-A, no.1, pp.87-95, Jan.

[12] C. Jaynes, S. Webb, R.M. Steele, M. Brown, and W.B. Seales, "Dynamic shadow removal from front projection displays," Proc. ITCVG'01, pp. .

[13] S. Jeong, W. Choi, N. Hashimoto, S. Hasegawa, and M. Sato, "A design of reactive virtual human with force interaction," IEICE Technical Report, HIP.

[14] E.-L. Sallnäs, K. Rassmus-Gröhn, and C. Sjöström, "Supporting presence in collaborative environments by haptic force feedback," ACM Trans. Computer-Human Interaction, pp. .

[15] J. Murayama, L. Bougrila, Y. Luo, K. Akahane, S. Hasegawa, B. Hirsbrunner, and M. Sato, "SPIDAR G&G: A two-handed haptic interface for bimanual VR interaction," Proc. EuroHaptics 2004, pp. .

[16] B. Petzold, M.F. Zaeh, B. Faerber, B. Demi, H. Egermeier, J. Schilp, and S. Clarke, "A study on visual, auditory, and haptic feedback for assembly tasks," Presence, vol.13, no.1, pp.16-21.

[17] J. Ryu, N. Hashimoto, and M.
Sato, "Analysis of vection using body sway in immersive virtual environment," Correspondences on Human Interface, vol.5, no.3, pp. .

[18] K. Salisbury, F. Conti, and F. Barbagli, "Haptic rendering: Introductory concepts," IEEE Comput. Graph. Appl., vol.24, no.2, pp.25-32.

[19] N. Shibano, T. Hatanaka, H. Nakanishi, H. Hoshino, R. Nagahama, K. Sawada, and J. Nomura, "Development of VR presentation system with spherical screen for urban environment human media," Transactions of the Virtual Reality Society of Japan, vol.4, no.3, pp. .

[20] N. Tarrin, S. Coquillart, S. Hasegawa, L. Bouguila, and M. Sato, "The stringed haptic workbench: A new haptic workbench solution," Proc. Eurographics, FP15-2, Sept.

[21] T. Yamada, M. Hirose, and Y. Iida, "Development of complete immersive display: COSMOS," Proc. 4th International Conference on Virtual Systems and Multimedia (VSMM'98), pp. .

[22] M. Yamasaki, T. Minakawa, H. Takeda, S. Hasegawa, and M. Sato, "Technology for seamless multi-projection onto a hybrid screen composed of differently shaped surface elements," Proc. 7th Annual Immersive Projection Technology Symposium (IPT), VR, March.

Seungzoo Jeong received a master's degree from the Department of Computational Intelligence and Systems Science, Graduate School of Science and Engineering, Tokyo Institute of Technology. Currently she is in the doctoral course at the Precision and Intelligence Laboratory, Tokyo Institute of Technology. She is interested in human-scale interaction, virtual humans and human interfaces.

Naoki Hashimoto received B.S. and M.S. degrees from Tokyo Institute of Technology in 1997 and 1999, and a Dr. Eng. degree from the same institute. He has been a research assistant at the Precision and Intelligence Laboratory, Tokyo Institute of Technology. His research interests include immersive projection technology, human-scale virtual environments, computer graphics and its applications. He is a member of VRSJ.
Makoto Sato graduated in 1973 from the Department of Physical Electronics, Faculty of Engineering, Tokyo Institute of Technology, where he also obtained the Doctor of Engineering degree. He became an assistant in the same faculty, and is now a Professor at the Precision and Intelligence Laboratory, Tokyo Institute of Technology. He is engaged in research on pattern recognition and virtual reality.
More informationToward an Augmented Reality System for Violin Learning Support
Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp
More informationWe are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors
We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists 3,800 116,000 120M Open access books available International authors and editors Downloads Our
More informationCSE 190: Virtual Reality Technologies LECTURE #7: VR DISPLAYS
CSE 190: Virtual Reality Technologies LECTURE #7: VR DISPLAYS Announcements Homework project 2 Due tomorrow May 5 at 2pm To be demonstrated in VR lab B210 Even hour teams start at 2pm Odd hour teams start
More informationLOW COST CAVE SIMPLIFIED SYSTEM
LOW COST CAVE SIMPLIFIED SYSTEM C. Quintero 1, W.J. Sarmiento 1, 2, E.L. Sierra-Ballén 1, 2 1 Grupo de Investigación en Multimedia Facultad de Ingeniería Programa ingeniería en multimedia Universidad Militar
More information2. Introduction to Computer Haptics
2. Introduction to Computer Haptics Seungmoon Choi, Ph.D. Assistant Professor Dept. of Computer Science and Engineering POSTECH Outline Basics of Force-Feedback Haptic Interfaces Introduction to Computer
More informationProp-Based Haptic Interaction with Co-location and Immersion: an Automotive Application
HAVE 2005 IEEE International Workshop on Haptic Audio Visual Environments and their Applications Ottawa, Ontario, Canada, 1-2 October 2005 Prop-Based Haptic Interaction with Co-location and Immersion:
More informationChapter 1 - Introduction
1 "We all agree that your theory is crazy, but is it crazy enough?" Niels Bohr (1885-1962) Chapter 1 - Introduction Augmented reality (AR) is the registration of projected computer-generated images over
More informationPinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data
Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Hrvoje Benko Microsoft Research One Microsoft Way Redmond, WA 98052 USA benko@microsoft.com Andrew D. Wilson Microsoft
More informationThe Mixed Reality Book: A New Multimedia Reading Experience
The Mixed Reality Book: A New Multimedia Reading Experience Raphaël Grasset raphael.grasset@hitlabnz.org Andreas Dünser andreas.duenser@hitlabnz.org Mark Billinghurst mark.billinghurst@hitlabnz.org Hartmut
More informationTele-Nursing System with Realistic Sensations using Virtual Locomotion Interface
6th ERCIM Workshop "User Interfaces for All" Tele-Nursing System with Realistic Sensations using Virtual Locomotion Interface Tsutomu MIYASATO ATR Media Integration & Communications 2-2-2 Hikaridai, Seika-cho,
More informationMECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES
INTERNATIONAL CONFERENCE ON ENGINEERING AND PRODUCT DESIGN EDUCATION 4 & 5 SEPTEMBER 2008, UNIVERSITAT POLITECNICA DE CATALUNYA, BARCELONA, SPAIN MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL
More informationUniversity of Geneva. Presentation of the CISA-CIN-BBL v. 2.3
University of Geneva Presentation of the CISA-CIN-BBL 17.05.2018 v. 2.3 1 Evolution table Revision Date Subject 0.1 06.02.2013 Document creation. 1.0 08.02.2013 Contents added 1.5 12.02.2013 Some parts
More informationThe CHAI Libraries. F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, D. Morris L. Sentis, E. Vileshin, J. Warren, O. Khatib, K.
The CHAI Libraries F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, D. Morris L. Sentis, E. Vileshin, J. Warren, O. Khatib, K. Salisbury Computer Science Department, Stanford University, Stanford CA
More informationDevelopment of Informal Communication Environment Using Interactive Tiled Display Wall Tetsuro Ogi 1,a, Yu Sakuma 1,b
Development of Informal Communication Environment Using Interactive Tiled Display Wall Tetsuro Ogi 1,a, Yu Sakuma 1,b 1 Graduate School of System Design and Management, Keio University 4-1-1 Hiyoshi, Kouhoku-ku,
More informationISCW 2001 Tutorial. An Introduction to Augmented Reality
ISCW 2001 Tutorial An Introduction to Augmented Reality Mark Billinghurst Human Interface Technology Laboratory University of Washington, Seattle grof@hitl.washington.edu Dieter Schmalstieg Technical University
More informationRealization of Multi-User Tangible Non-Glasses Mixed Reality Space
Indian Journal of Science and Technology, Vol 9(24), DOI: 10.17485/ijst/2016/v9i24/96161, June 2016 ISSN (Print) : 0974-6846 ISSN (Online) : 0974-5645 Realization of Multi-User Tangible Non-Glasses Mixed
More informationEvaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface
Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface Xu Zhao Saitama University 255 Shimo-Okubo, Sakura-ku, Saitama City, Japan sheldonzhaox@is.ics.saitamau.ac.jp Takehiro Niikura The University
More informationPractical Data Visualization and Virtual Reality. Virtual Reality VR Display Systems. Karljohan Lundin Palmerius
Practical Data Visualization and Virtual Reality Virtual Reality VR Display Systems Karljohan Lundin Palmerius Synopsis Virtual Reality basics Common display systems Visual modality Sound modality Interaction
More informationUsing Scalable, Interactive Floor Projection for Production Planning Scenario
Using Scalable, Interactive Floor Projection for Production Planning Scenario Michael Otto, Michael Prieur Daimler AG Wilhelm-Runge-Str. 11 D-89013 Ulm {michael.m.otto, michael.prieur}@daimler.com Enrico
More informationBeyond: collapsible tools and gestures for computational design
Beyond: collapsible tools and gestures for computational design The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published
More informationVR-programming. Fish Tank VR. To drive enhanced virtual reality display setups like. Monitor-based systems Use i.e.
VR-programming To drive enhanced virtual reality display setups like responsive workbenches walls head-mounted displays boomes domes caves Fish Tank VR Monitor-based systems Use i.e. shutter glasses 3D
More informationTangible interaction : A new approach to customer participatory design
Tangible interaction : A new approach to customer participatory design Focused on development of the Interactive Design Tool Jae-Hyung Byun*, Myung-Suk Kim** * Division of Design, Dong-A University, 1
More informationVirtual Chromatic Percussions Simulated by Pseudo-Haptic and Vibrotactile Feedback
Virtual Chromatic Percussions Simulated by Pseudo-Haptic and Vibrotactile Feedback Taku Hachisu The University of Electro- Communications 1-5-1 Chofugaoka, Chofu, Tokyo 182-8585, Japan +81 42 443 5363
More informationAR 2 kanoid: Augmented Reality ARkanoid
AR 2 kanoid: Augmented Reality ARkanoid B. Smith and R. Gosine C-CORE and Memorial University of Newfoundland Abstract AR 2 kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular
More informationInformation Layout and Interaction on Virtual and Real Rotary Tables
Second Annual IEEE International Workshop on Horizontal Interactive Human-Computer System Information Layout and Interaction on Virtual and Real Rotary Tables Hideki Koike, Shintaro Kajiwara, Kentaro Fukuchi
More informationPhysical Presence in Virtual Worlds using PhysX
Physical Presence in Virtual Worlds using PhysX One of the biggest problems with interactive applications is how to suck the user into the experience, suspending their sense of disbelief so that they are
More informationFlexTorque: Exoskeleton Interface for Haptic Interaction with the Digital World
FlexTorque: Exoskeleton Interface for Haptic Interaction with the Digital World Dzmitry Tsetserukou 1, Katsunari Sato 2, and Susumu Tachi 3 1 Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku-cho,
More informationA Road Traffic Noise Evaluation System Considering A Stereoscopic Sound Field UsingVirtual Reality Technology
APCOM & ISCM -4 th December, 03, Singapore A Road Traffic Noise Evaluation System Considering A Stereoscopic Sound Field UsingVirtual Reality Technology *Kou Ejima¹, Kazuo Kashiyama, Masaki Tanigawa and
More informationMulti-Rate Multi-Range Dynamic Simulation for Haptic Interaction
Multi-Rate Multi-Range Dynamic Simulation for Haptic Interaction Ikumi Susa Makoto Sato Shoichi Hasegawa Tokyo Institute of Technology ABSTRACT In this paper, we propose a technique for a high quality
More informationIntegrating PhysX and OpenHaptics: Efficient Force Feedback Generation Using Physics Engine and Haptic Devices
This is the Pre-Published Version. Integrating PhysX and Opens: Efficient Force Feedback Generation Using Physics Engine and Devices 1 Leon Sze-Ho Chan 1, Kup-Sze Choi 1 School of Nursing, Hong Kong Polytechnic
More informationAugmented and Virtual Reality
CS-3120 Human-Computer Interaction Augmented and Virtual Reality Mikko Kytö 7.11.2017 From Real to Virtual [1] Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS
More informationEnhancing Fish Tank VR
Enhancing Fish Tank VR Jurriaan D. Mulder, Robert van Liere Center for Mathematics and Computer Science CWI Amsterdam, the Netherlands mullie robertl @cwi.nl Abstract Fish tank VR systems provide head
More informationDevelopment of an Intuitive Interface for PC Mouse Operation Based on Both Arms Gesture
Development of an Intuitive Interface for PC Mouse Operation Based on Both Arms Gesture Nobuaki Nakazawa 1*, Toshikazu Matsui 1, Yusaku Fujii 2 1 Faculty of Science and Technology, Gunma University, 29-1
More informationMRT: Mixed-Reality Tabletop
MRT: Mixed-Reality Tabletop Students: Dan Bekins, Jonathan Deutsch, Matthew Garrett, Scott Yost PIs: Daniel Aliaga, Dongyan Xu August 2004 Goals Create a common locus for virtual interaction without having
More informationInterior Design using Augmented Reality Environment
Interior Design using Augmented Reality Environment Kalyani Pampattiwar 2, Akshay Adiyodi 1, Manasvini Agrahara 1, Pankaj Gamnani 1 Assistant Professor, Department of Computer Engineering, SIES Graduate
More informationBeyond Visual: Shape, Haptics and Actuation in 3D UI
Beyond Visual: Shape, Haptics and Actuation in 3D UI Ivan Poupyrev Welcome, Introduction, & Roadmap 3D UIs 101 3D UIs 201 User Studies and 3D UIs Guidelines for Developing 3D UIs Video Games: 3D UIs for
More informationHUMAN MOVEMENT INSTRUCTION SYSTEM THAT UTILIZES AVATAR OVERLAYS USING STEREOSCOPIC IMAGES
HUMAN MOVEMENT INSTRUCTION SYSTEM THAT UTILIZES AVATAR OVERLAYS USING STEREOSCOPIC IMAGES Masayuki Ihara Yoshihiro Shimada Kenichi Kida Shinichi Shiwa Satoshi Ishibashi Takeshi Mizumori NTT Cyber Space
More informationSound rendering in Interactive Multimodal Systems. Federico Avanzini
Sound rendering in Interactive Multimodal Systems Federico Avanzini Background Outline Ecological Acoustics Multimodal perception Auditory visual rendering of egocentric distance Binaural sound Auditory
More informationShopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction
Shopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction Minghao Cai 1(B), Soh Masuko 2, and Jiro Tanaka 1 1 Waseda University, Kitakyushu, Japan mhcai@toki.waseda.jp, jiro@aoni.waseda.jp
More informationUngrounded Kinesthetic Pen for Haptic Interaction with Virtual Environments
The 18th IEEE International Symposium on Robot and Human Interactive Communication Toyama, Japan, Sept. 27-Oct. 2, 2009 WeIAH.2 Ungrounded Kinesthetic Pen for Haptic Interaction with Virtual Environments
More informationVEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu
More informationInteractive intuitive mixed-reality interface for Virtual Architecture
I 3 - EYE-CUBE Interactive intuitive mixed-reality interface for Virtual Architecture STEPHEN K. WITTKOPF, SZE LEE TEO National University of Singapore Department of Architecture and Fellow of Asia Research
More informationITS '14, Nov , Dresden, Germany
3D Tabletop User Interface Using Virtual Elastic Objects Figure 1: 3D Interaction with a virtual elastic object Hiroaki Tateyama Graduate School of Science and Engineering, Saitama University 255 Shimo-Okubo,
More informationExpression of 2DOF Fingertip Traction with 1DOF Lateral Skin Stretch
Expression of 2DOF Fingertip Traction with 1DOF Lateral Skin Stretch Vibol Yem 1, Mai Shibahara 2, Katsunari Sato 2, Hiroyuki Kajimoto 1 1 The University of Electro-Communications, Tokyo, Japan 2 Nara
More informationGesture Recognition with Real World Environment using Kinect: A Review
Gesture Recognition with Real World Environment using Kinect: A Review Prakash S. Sawai 1, Prof. V. K. Shandilya 2 P.G. Student, Department of Computer Science & Engineering, Sipna COET, Amravati, Maharashtra,
More informationWelcome to this course on «Natural Interactive Walking on Virtual Grounds»!
Welcome to this course on «Natural Interactive Walking on Virtual Grounds»! The speaker is Anatole Lécuyer, senior researcher at Inria, Rennes, France; More information about him at : http://people.rennes.inria.fr/anatole.lecuyer/
More informationColumn-Parallel Architecture for Line-of-Sight Detection Image Sensor Based on Centroid Calculation
ITE Trans. on MTA Vol. 2, No. 2, pp. 161-166 (2014) Copyright 2014 by ITE Transactions on Media Technology and Applications (MTA) Column-Parallel Architecture for Line-of-Sight Detection Image Sensor Based
More informationExhibition Strategy of Digital 3D Data of Object in Archives using Digitally Mediated Technologies for High User Experience
, pp.150-156 http://dx.doi.org/10.14257/astl.2016.140.29 Exhibition Strategy of Digital 3D Data of Object in Archives using Digitally Mediated Technologies for High User Experience Jaeho Ryu 1, Minsuk
More informationHMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University
HMD based VR Service Framework July 31 2017 Web3D Consortium Kwan-Hee Yoo Chungbuk National University khyoo@chungbuk.ac.kr What is Virtual Reality? Making an electronic world seem real and interactive
More informationDevelopment of Flexible Pneumatic Cylinder with Backdrivability and Its Application
Development of Flexible Pneumatic Cylinder with Backdrivability and Its Application Takafumi Morimoto, Mohd Aliff, Tetsuya Akagi, and Shujiro Dohta Department of Intelligent Mechanical Engineering, Okayama
More informationA Study on the control Method of 3-Dimensional Space Application using KINECT System Jong-wook Kang, Dong-jun Seo, and Dong-seok Jung,
IJCSNS International Journal of Computer Science and Network Security, VOL.11 No.9, September 2011 55 A Study on the control Method of 3-Dimensional Space Application using KINECT System Jong-wook Kang,
More informationStereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays. Habib Abi-Rached Thursday 17 February 2005.
Stereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays Habib Abi-Rached Thursday 17 February 2005. Objective Mission: Facilitate communication: Bandwidth. Intuitiveness.
More informationCSE 165: 3D User Interaction. Lecture #14: 3D UI Design
CSE 165: 3D User Interaction Lecture #14: 3D UI Design 2 Announcements Homework 3 due tomorrow 2pm Monday: midterm discussion Next Thursday: midterm exam 3D UI Design Strategies 3 4 Thus far 3DUI hardware
More informationFigure 2. Haptic human perception and display. 2.2 Pseudo-Haptic Feedback 2. RELATED WORKS 2.1 Haptic Simulation of Tapping an Object
Virtual Chromatic Percussions Simulated by Pseudo-Haptic and Vibrotactile Feedback Taku Hachisu 1 Gabriel Cirio 2 Maud Marchal 2 Anatole Lécuyer 2 Hiroyuki Kajimoto 1,3 1 The University of Electro- Communications
More informationAffordance based Human Motion Synthesizing System
Affordance based Human Motion Synthesizing System H. Ishii, N. Ichiguchi, D. Komaki, H. Shimoda and H. Yoshikawa Graduate School of Energy Science Kyoto University Uji-shi, Kyoto, 611-0011, Japan Abstract
More informationImmersive Real Acting Space with Gesture Tracking Sensors
, pp.1-6 http://dx.doi.org/10.14257/astl.2013.39.01 Immersive Real Acting Space with Gesture Tracking Sensors Yoon-Seok Choi 1, Soonchul Jung 2, Jin-Sung Choi 3, Bon-Ki Koo 4 and Won-Hyung Lee 1* 1,2,3,4
More informationTouch Feedback in a Head-Mounted Display Virtual Reality through a Kinesthetic Haptic Device
Touch Feedback in a Head-Mounted Display Virtual Reality through a Kinesthetic Haptic Device Andrew A. Stanley Stanford University Department of Mechanical Engineering astan@stanford.edu Alice X. Wu Stanford
More informationDepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface
DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface Hrvoje Benko and Andrew D. Wilson Microsoft Research One Microsoft Way Redmond, WA 98052, USA
More informationIntroduction to Virtual Reality (based on a talk by Bill Mark)
Introduction to Virtual Reality (based on a talk by Bill Mark) I will talk about... Why do we want Virtual Reality? What is needed for a VR system? Examples of VR systems Research problems in VR Most Computers
More informationForce feedback interfaces & applications
Force feedback interfaces & applications Roope Raisamo Tampere Unit for Computer-Human Interaction (TAUCHI) School of Information Sciences University of Tampere, Finland Based on material by Jukka Raisamo,
More informationA STUDY ON DESIGN SUPPORT FOR CONSTRUCTING MACHINE-MAINTENANCE TRAINING SYSTEM BY USING VIRTUAL REALITY TECHNOLOGY
A STUDY ON DESIGN SUPPORT FOR CONSTRUCTING MACHINE-MAINTENANCE TRAINING SYSTEM BY USING VIRTUAL REALITY TECHNOLOGY H. ISHII, T. TEZUKA and H. YOSHIKAWA Graduate School of Energy Science, Kyoto University,
More informationUbiquitous Computing Summer Episode 16: HCI. Hannes Frey and Peter Sturm University of Trier. Hannes Frey and Peter Sturm, University of Trier 1
Episode 16: HCI Hannes Frey and Peter Sturm University of Trier University of Trier 1 Shrinking User Interface Small devices Narrow user interface Only few pixels graphical output No keyboard Mobility
More informationWhat was the first gestural interface?
stanford hci group / cs247 Human-Computer Interaction Design Studio What was the first gestural interface? 15 January 2013 http://cs247.stanford.edu Theremin Myron Krueger 1 Myron Krueger There were things
More informationCollaborative Visualization in Augmented Reality
Collaborative Visualization in Augmented Reality S TUDIERSTUBE is an augmented reality system that has several advantages over conventional desktop and other virtual reality environments, including true
More informationOutput Devices - Visual
IMGD 5100: Immersive HCI Output Devices - Visual Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu Overview Here we are concerned with technology
More informationE90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright
E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7
More informationVIEW: Visual Interactive Effective Worlds Lorentz Center International Center for workshops in the Sciences June Dr.
Virtual Reality & Presence VIEW: Visual Interactive Effective Worlds Lorentz Center International Center for workshops in the Sciences 25-27 June 2007 Dr. Frederic Vexo Virtual Reality & Presence Outline:
More informationHuman-Scale Virtual Environment for Product Design: Effect of Sensory Substitution
Human-Scale Virtual Environment for Product Design: Effect of Sensory Substitution Paul Richard, Damien Chamaret, François-Xavier Inglese, Philippe Lucidarme, Jean-Louis Ferrier 37 Abstract This paper
More informationFORCE FEEDBACK. Roope Raisamo
FORCE FEEDBACK Roope Raisamo Multimodal Interaction Research Group Tampere Unit for Computer Human Interaction Department of Computer Sciences University of Tampere, Finland Outline Force feedback interfaces
More informationSpatial Interfaces and Interactive 3D Environments for Immersive Musical Performances
Spatial Interfaces and Interactive 3D Environments for Immersive Musical Performances Florent Berthaut and Martin Hachet Figure 1: A musician plays the Drile instrument while being immersed in front of
More informationA Hybrid Actuation Approach for Haptic Devices
A Hybrid Actuation Approach for Haptic Devices François Conti conti@ai.stanford.edu Oussama Khatib ok@ai.stanford.edu Charles Baur charles.baur@epfl.ch Robotics Laboratory Computer Science Department Stanford
More informationMobile Haptic Interaction with Extended Real or Virtual Environments
Mobile Haptic Interaction with Extended Real or Virtual Environments Norbert Nitzsche Uwe D. Hanebeck Giinther Schmidt Institute of Automatic Control Engineering Technische Universitat Miinchen, 80290
More informationInnovative Solutions for Immersive 3D Visualization Laboratory
Innovative Solutions for Immersive 3D Visualization Laboratory Jacek Lebiedź Gdańsk University of Technology, Faculty of ETI Dept. of Intelligent Interactive Systems ul. G. Narutowicza 11/12 80-233 Gdańsk,
More informationtracker hardware data in tracker CAVE library coordinate system calibration table corrected data in tracker coordinate system
Line of Sight Method for Tracker Calibration in Projection-Based VR Systems Marek Czernuszenko, Daniel Sandin, Thomas DeFanti fmarek j dan j tomg @evl.uic.edu Electronic Visualization Laboratory (EVL)
More informationUsing Mixed Reality as a Simulation Tool in Urban Planning Project for Sustainable Development
Journal of Civil Engineering and Architecture 9 (2015) 830-835 doi: 10.17265/1934-7359/2015.07.009 D DAVID PUBLISHING Using Mixed Reality as a Simulation Tool in Urban Planning Project Hisham El-Shimy
More informationWhat is Virtual Reality? Burdea,1993. Virtual Reality Triangle Triangle I 3 I 3. Virtual Reality in Product Development. Virtual Reality Technology
Virtual Reality man made reality sense world What is Virtual Reality? Dipl-Ing Indra Kusumah Digital Product Design Fraunhofer IPT Steinbachstrasse 17 D-52074 Aachen Indrakusumah@iptfraunhoferde wwwiptfraunhoferde
More informationDevelopment of Video Chat System Based on Space Sharing and Haptic Communication
Sensors and Materials, Vol. 30, No. 7 (2018) 1427 1435 MYU Tokyo 1427 S & M 1597 Development of Video Chat System Based on Space Sharing and Haptic Communication Takahiro Hayashi 1* and Keisuke Suzuki
More informationSimultaneous presentation of tactile and auditory motion on the abdomen to realize the experience of being cut by a sword
Simultaneous presentation of tactile and auditory motion on the abdomen to realize the experience of being cut by a sword Sayaka Ooshima 1), Yuki Hashimoto 1), Hideyuki Ando 2), Junji Watanabe 3), and
More information