Mixed Reality Approach and the Applications using Projection Head Mounted Display
Ryugo KIJIMA, Takeo OJIKA
Faculty of Engineering, Gifu University, 1-1 Yanagido, Gifu City, Gifu, Japan
phone: , fax:

Abstract
The aim of this study is to explore a feasible technology for a multiple-display environment inserted into the real world. In this paper, a wearable projector with an infrared camera and a light source is used in combination with retro-reflective screens. The visible image and infrared light are projected onto the screen and reflected back to the user's eye and the infrared camera. The screen location and the user's fingertip position are calculated by image processing to enable interaction. Several example applications and demonstrations were constructed to show the usage of this configuration. The screens function simultaneously as visible screens, as high-gain infrared markers, and as supporting equipment for input. Owing to the high reflection gain of the retro-reflective screens, a small, lightweight projector can be used. Also, the high contrast between the screen and other environmental objects in the captured infrared image reduces the difficulty of the image processing.

Keywords: Augmented Reality, Real World Oriented Computing, Ubiquitous Computing, Mixed Reality, Projection Head Mounted Display, Image Processing

1. Introduction
1.1 Sensing for Mixed Reality
The sensing issue is one of the essential problems for Mixed Reality in terms of the feasibility of the technology. Magnetic sensors such as the Polhemus FASTRAK or the Ascension Bird have been widely used to measure the location of the user's head for rendering and the location of the hand or input devices for interaction. They are generally used within a laboratory environment and are difficult to apply to the larger area of activity of our daily life.
There have been many efforts to develop alternative sensing devices, such as ultrasonic sensors, sensor fusion using gyroscopes, electronic compasses, etc., and vision-based sensing. However, these seem to be far from maturity, and there is room for further investigation. The authors' approach is to combine display and sensing in one device using the projection head mounted display (PHMD) [12], or wearable projector, together with retro-reflective screen(s) and image processing.
1.2 Inserting Virtual Information into Objects
Mixed Reality often means superimposing a generated image over the whole of the user's vision using an optical or video see-through head mounted display. While this type of approach provides freedom and flexibility, the displayed information is in most cases strongly related to a particular object or location. Therefore, the authors focus on attaching additional graphical information to specific objects [6] or locations. The user wears a lightweight wearable projector (PHMD) and an infrared camera with light sources. The PHMD projects images only on the screens. At the same time, infrared rays are irradiated to capture an image of the screen. Using image processing, the computer calculates the location of the screen relative to the user's head, as well as the fingertip position on the screen, for interaction. While this approach requires attaching a screen in advance wherever information will be displayed, the screen's high reflection gain reduces the difficulty of sensing by image processing.

2. Proposed Configuration
2.1 Projection Head Mounted Display (PHMD)
The Projection Head Mounted Display (PHMD) is a relatively recent technology. The principle of the PHMD was proposed, with the first prototype, by the first author in 1995 [3][4]. This prototype was used for an application to lay out and design the furniture of a room, based on the idea of moving objects among multiple spaces and enabling a seamless connection between the spaces. This idea is similar to that of Pick-and-Drop [8]. A shape modeling application ran in a desktop-type virtual environment on a CRT, while the user also existed in a surrounding virtual environment with the PHMD. In the surrounding VE, the user could grab and change the location of furniture using a flying mouse (both for 2D and 3D).
If the user did not like a piece of furniture's shape, the user could grab it in the surrounding VE and push it into the CRT monitor. The furniture then appeared in the shape modeling application on the CRT, where the user could modify the shape. Finally, the user grabbed it with the same flying mouse and drew it back into the surrounding VE. Afterward, retro-reflective material was introduced to enable stereoscopy for the projection display [5] in 1996, and soon the PHMD was used in combination with the retro-reflector [6][7].

The fundamental principle of the PHMD is the optical conjugation between the user's eye and the optical center of the projection. Due to this conjugation, the image is projected from the user's eye position and seen from the same position. This arrangement enables the user to watch the image without distortion even when the screen is at an angle to the line of sight or is bent, because the viewing transformation is the inverse of the projection transformation and the distortion derived from the screen's shape is canceled (Fig. 1).

The author has developed a series of PHMDs, as shown in Fig. 2 - Fig. 6. One of the essential issues in making a small projector was the power-to-weight and power-to-heat ratio of the light source. The five-watt halogen lamp used for Rev. 1 caused overheating of the LCD panel even with heat protection using a hot mirror and a heat absorber. In Rev. 2, the light source was separated from the projection head, and the two were connected by optical fibers. In Rev. 3, the current prototype, a high-luminance white LED block is attached to the projection head. The projection head for each eye is composed of an LCD panel (Sony LCX009AKB, 0.7 inches, 800x225 pixels), a projection lens (18 mm diameter), and the light source. The LED block generates a luminance of approximately 250 millicd/mm2, which is sufficient for use within a room but not for outdoor usage. The weight of this PHMD is about 150 grams.
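The cancellation argument above can be written in one line. In notation introduced here (not the paper's), let P be the map that projects a retinal point out from the eye position onto the (arbitrarily shaped) screen surface, and V the viewing map that carries a screen point back to the retina from the same eye position. Optical conjugation makes the two optical paths coincide, so

```latex
V = P^{-1} \quad\Longrightarrow\quad V \circ P = \mathrm{id},
```

hence the retinal image equals the intended image regardless of the screen's shape or orientation: any warp introduced by the projection P is undone on the viewing path V.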
(Left) Fig. 1. Principle of PHMD (labels: LCD, Projection Lens, Eye, Half Mirror, Screen). (Middle) Fig. 2. Confirming the Optical Design (PHMD Rev. 0, Kijima, Oct. 1998). (Right) Fig. 3. Head Mounted Configuration (PHMD Rev. 1, Kijima & Haza, Mar. 1999).
(Left) Fig. 4. Light Source Divided for Heat Protection (PHMD Rev. 2, Haza & Kijima, Jul. 1999; labels: Light Source, Half Mirror, Mirror, Projection Head). (Middle) Fig. 5. Current Prototype in Stereoscopic Configuration (Rev. 3, Haza, Miwa & Kijima, Dec. 1999). (Right) Fig. 6. Projection Head.

The PHMD became more practical with the introduction of retro-reflective material as the screen. This material reflects incoming light back in the direction of the light source. The image is projected from the user's eye position onto the reflector and travels back to the eye position. Almost all the energy from the projector returns to the eye, with only minimal loss. Therefore, a small, lightweight, dim projector can serve as the PHMD. Another merit is the capability of stereoscopy. The image projected from one eye returns to the same eye and does not reach the other eye, so each eye of the user sees a different image from its corresponding projector. The retro-reflective screen is a cloth covered with small glass beads. This is a cheap material widely used for road signs, so many displays can be attached easily at low cost. The material can also be folded and stored in a pocket like a handkerchief.

2.2 Vision-Based Measurement using Infrared Camera
To obtain the relative location of the user's head with respect to the screen, an infrared NTSC CCD camera and infrared LEDs were attached to the PHMD, and vision-based measurement was applied. Since the original shape of the screen is known, the screen's location relative to the user's head can be calculated from the image of the screen. The screen serves as both the visual screen for the user and the anchor of the vision-based measurement at the same time.
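The pose computation implied here, recovering the screen's position and orientation from the image of a planar screen of known shape, can be sketched with a plane homography. The code below is a hypothetical reconstruction, not the authors' implementation: it assumes the four screen corners have already been located in the image and that the camera intrinsics K are known.

```python
import numpy as np

def homography_from_quad(screen_pts, image_pts):
    """Estimate the 3x3 homography mapping screen-plane points (z = 0)
    to image points (pixels) from four correspondences, via the DLT."""
    A = []
    for (X, Y), (u, v) in zip(screen_pts, image_pts):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    # The homography is the null vector of A (last right-singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def pose_from_homography(H, K):
    """Recover rotation R and translation t of the screen plane relative
    to the camera from H ~ K [r1 r2 t]."""
    M = np.linalg.inv(K) @ H
    s = np.linalg.norm(M[:, 0])      # |scale|: r1 must be unit length
    if M[2, 2] < 0:                  # keep the screen in front of the camera
        s = -s
    r1, r2, t = M[:, 0] / s, M[:, 1] / s, M[:, 2] / s
    r3 = np.cross(r1, r2)
    R = np.column_stack([r1, r2, r3])
    # Re-orthonormalize: with noisy corners R is only approximately a rotation.
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

With exact corner positions this recovers the pose exactly; with the corners fitted from the edge image (as in the paper's pipeline), the final SVD projection onto the rotation group absorbs the residual noise.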
Generally speaking, methods to control the environment, beyond the image analysis algorithm itself, are important for actual applications. Since the calculation power of a wearable configuration is limited, a clever way to obtain a clear image of the necessary target is required.
Fig. 7. Differential Image Capture: (Top) Original, (Bottom) Binary Image; Infrared Light (Left) ON, (Right) OFF.
Fig. 8. Detection of Screen Edge and Screen Vertices: (Top Left) Original Image, (Top Right) Approximation to Convex Polygon, (Bottom Left) Shrunken Quadrangle and Division (labels: Screen, Vertices of Convex Polygon Approximation, Screen Edge, Regions 1-4, Diagonal Line), (Bottom Right) Line Fitting by Mean Squared Error Method (labels: Initial Vertex, Resulting Vertex, Fitted Line).

The first device was to conjugate the LED and the camera optically with each other. Due to the high gain of the retro-reflector, a small, low-power LED serves well enough as a bright light source to obtain a clear image of the screen. The next idea was differential image calculation. The infrared light was turned on and off in synchronization with the frame, and the difference between the images was calculated. The background noise was thus largely suppressed and a high-contrast image of the screen was achieved.

The following is a brief description of the image processing. First, edge detection was performed on the captured image, and the longest edge was regarded as the outline of the screen. After compensating for the distortion of the camera, 12 lines, one every 15 degrees, were circumscribed around this edge, so the shape of the screen was approximated as a convex polygon with 24 vertices. The number of vertices was then reduced to a quadrangle by considering the distances between neighboring vertices and the angles between neighboring edges. Thus the first approximation of the quadrangle was obtained. Next, the image was divided into four regions by the diagonal lines of this quadrangle. Using the minimum squared error method, four lines were fitted to the edges in each region. Finally, the screen's location relative to the user's head was calculated. This calculation is robust even when a part of the screen is hidden by the user's hand, because it is based on the edge information, not the vertex information.

In addition, when the user places a finger on the screen, it is seen as a dark area within the bright screen. The second-order moments of the finger image are estimated, and the fingertip position is calculated. The user can interact with the displayed object with their fingertip; for example, the user can draw graphics on the screen with the fingertip. The screen represents a drawing plane in three-dimensional space, and since the user can move it with the other hand, a spatial object can be drawn.

3. Applications and Demos
3.1 Distributed Display
Fig. 9 shows demonstrations of an omni-display or distributed-display environment in our daily life. The top-left figure shows the virtual window demonstration. A video camera was placed outside the laboratory building, and its view was displayed on a retro-reflective screen on the wall. Motion parallax according to the user's location was realized by simple image-based rendering. Fig. 9 (Top Right) shows the price tag application. A small tag of reflector was attached to a product in a shop, and the current price was projected on it. Fig. 9 (Bottom Left) shows the calendar application. The calendar itself was a reflector without any print, and the date was displayed on it. The user can draw a plan on the calendar by fingertip operation and call it back with the fingertip. Fig. 9 (Bottom Right) shows the internal state application. A reflector was attached to the back of the user's hand, and a graphic of the blood vessels was animated according to the heart beat and blood pressure.

Fig. 9. Distributed Display Demos: (Top Left) Projected Virtual Window on Wall Showing the Scene Outside, (Top Right) Virtual Price Tag Attached to Product, (Bottom Left) Projected Calendar with Schedule Board Functionality, (Bottom Right) Visualized Blood Vessels.
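The differential capture of Fig. 7 amounts to a per-pixel subtraction of the light-off frame from the light-on frame: ambient infrared appears in both frames and cancels, while the retro-reflected LED light survives. A minimal sketch (the threshold value is an assumption, not taken from the paper):

```python
import numpy as np

def differential_screen_mask(frame_on, frame_off, threshold=40):
    """Isolate the retro-reflective screen by subtracting the frame captured
    with the infrared LED off from the frame captured with it on.
    Background infrared (sunlight, lamps) cancels in the subtraction;
    only the retro-reflected light remains above the threshold."""
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)
    return (diff > threshold).astype(np.uint8)   # binary mask, 1 = screen
```

Because the screen's retro-reflective gain is high, even a small difference threshold separates it cleanly from the background, which is what keeps the subsequent edge detection cheap enough for a wearable configuration.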
3.2 Screen Location Measurement
Fig. 10 shows a demonstration application that makes active use of the screen location measurement. A sliced image of a human brain from MRI data was displayed on the screen. As the user held the real screen and moved it, the slicing plane in the virtual world was controlled by the location of the real screen, so the user could see various slices of the brain. The user could also draw a graphical memorandum on the sliced image with a fingertip.

Fig. 10. Application: Interactively Slicing an MRI Brain Image with a Hand-Held Screen.

3.3 Medical Applications
Fig. 11 shows the medical applications. The left figure shows the user touching a mockup of a human knee with the internal structure projected on it. This mockup was made of stiff bones and soft organic structures covered with skin. The user can touch it to confirm the position of the bones by hand feeling, and can see the internal structure on the real mockup. The user can also select the projected image: the whole structure, bones and tendons, blood vessels and major nerves, etc. The middle figure shows the endoscope simulator. The user can insert a toy endoscope into the mockup of the knee while watching the projected endoscope immersed in the internal structures. On another monitor, the simulated image from the virtual endoscope was displayed.

Fig. 11. Medical Simulator: (Left) Touching the Model of a Knee with the Internal Structure Displayed, (Middle) Endoscope Simulator, (Right) Ultrasonic Echo Image Projected on the Stomach (labels: Probe, Stomach, Echo Image).
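The fingertip operation used for the memorandum drawing relies on the second-order-moment computation described earlier: the finger is a dark, elongated blob inside the bright screen. A hypothetical sketch of that step (the dark-pixel mask is assumed to be extracted already; function and variable names are mine):

```python
import numpy as np

def fingertip_from_moments(mask):
    """Locate a fingertip seen as a dark elongated blob on the bright screen.
    mask: binary image of the dark pixels.  The centroid and principal axis
    come from first- and second-order image moments; the tip is taken as the
    blob pixel farthest from the centroid along the major axis."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()                    # first-order moments
    mu20 = ((xs - cx) ** 2).mean()                   # central second-order moments
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # major-axis orientation
    axis = np.array([np.cos(theta), np.sin(theta)])
    proj = (xs - cx) * axis[0] + (ys - cy) * axis[1]
    i = np.argmax(np.abs(proj))                      # extremal pixel along the axis
    return (xs[i], ys[i]), (cx, cy), theta
```

In practice the finger enters the screen region from one edge, so the extremal point on the side away from that edge would be taken as the tip; the sketch above omits that disambiguation.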
The Polhemus sensors were used to measure the location of the endoscope. The right figure shows the display of an ultrasonic echo image projected on the stomach. The infrared camera on the user's head measured the location of the ultrasonic probe by detecting four visual markers attached to it, so the user can watch the echo image at the place where it was taken.

4. Summary
A novel configuration for Mixed Reality using a wearable projector with an infrared camera was described, focusing on attaching graphical information to objects and locations. A small, lightweight wearable projector was constructed and used with a retro-reflective cloth screen. The infrared camera was attached to the PHMD, and the movement and location of both the user's head and fingertip, obtained by image processing, are used for the interaction. Several applications were constructed to show example usage. Since this configuration is autonomous in terms of sensing, it can be used in the larger space of our daily life, not limited to the space of a laboratory.

References
1. Wisneski, C., Ishii, H., Dahley, A., Gorbet, M., Brave, S., Ullmer, B., Yarin, P., "Ambient Displays: Turning Architectural Space into an Interface between People and Digital Information," Procs. of the 1st Int. Workshop on Cooperative Buildings, Springer.
2. Pedersen, E. and Sokoler, T., "AROMA: Abstract Representation of Presence Supporting Mutual Awareness," Procs. of CHI '97, ACM Press.
3. Kijima, R. and Hirose, M., "A Compound Virtual Environment Using the Projective Head Mounted Display," Procs. of ICAT/VRST '95, ACM-SIGCHI.
4. Kijima, R., Ojika, T., "Transition between Virtual Environment and Workstation Environment with Projective Head-Mounted Display," Procs. of IEEE Virtual Reality Annual International Symposium 1997, IEEE.
5. Ishikawa, J., "3D B-VISION," Journal of Three Dimensional Images, 10-4, pp. 11-14 (in Japanese).
6. Inami, M., Kawakami, N., Sekiguchi, D., Yanagida, Y., Maeda, T., and Tachi, S., "Visuo-Haptic Display Using Head-Mounted Projector," Procs. of IEEE Virtual Reality 2000 Conference, IEEE.
7. Hua, H., Gao, C., Biocca, F., Rolland, J. P., "An Ultra-light and Compact Design and Implementation of Head Mounted Projective Displays," Procs. of IEEE Virtual Reality 2001 Conference, IEEE.
8. Rekimoto, J., "Pick-and-Drop: A Direct Manipulation Technique for Multiple Computer Environments," Procs. of User Interface Software and Technology (UIST '97), pp. 31-39, ACM.
9. Ullmer, B. and Ishii, H., "The metaDESK: Models and Prototypes for Tangible User Interfaces," Procs. of User Interface Software and Technology (UIST '97), ACM.
10. Rekimoto, J., Nagao, K., "The World through the Computer: Computer Augmented Interaction with Real World Environments," Procs. of UIST '95, pp. 29-35, ACM.
11. Fitzmaurice, G. W., Zhai, S., Chignell, M. H., "Virtual Reality for Palmtop Computers," ACM Transactions on Information Systems, 11-3, ACM.
12. Kijima, R., "Wearable Interface Device," Procs. of Human Computer Interaction 2001, Vol. 1.
13. Kijima, R., Yamada, E., Ojika, T., "A Development of Reflex HMD - HMD with Time Delay Compensation Capability," Procs. of International Symposium on Mixed Reality 2001 (ISMR2001), pp. 40-47, VRSJ.
More informationPhysical Interaction and Multi-Aspect Representation for Information Intensive Environments
Proceedings of the 2000 IEEE International Workshop on Robot and Human Interactive Communication Osaka. Japan - September 27-29 2000 Physical Interaction and Multi-Aspect Representation for Information
More informationThe Mixed Reality Book: A New Multimedia Reading Experience
The Mixed Reality Book: A New Multimedia Reading Experience Raphaël Grasset raphael.grasset@hitlabnz.org Andreas Dünser andreas.duenser@hitlabnz.org Mark Billinghurst mark.billinghurst@hitlabnz.org Hartmut
More informationChapter 18 Optical Elements
Chapter 18 Optical Elements GOALS When you have mastered the content of this chapter, you will be able to achieve the following goals: Definitions Define each of the following terms and use it in an operational
More informationRegan Mandryk. Depth and Space Perception
Depth and Space Perception Regan Mandryk Disclaimer Many of these slides include animated gifs or movies that may not be viewed on your computer system. They should run on the latest downloads of Quick
More informationAnalysis of retinal images for retinal projection type super multiview 3D head-mounted display
https://doi.org/10.2352/issn.2470-1173.2017.5.sd&a-376 2017, Society for Imaging Science and Technology Analysis of retinal images for retinal projection type super multiview 3D head-mounted display Takashi
More informationToward an Augmented Reality System for Violin Learning Support
Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp
More informationAUGMENTED VIRTUAL REALITY APPLICATIONS IN MANUFACTURING
6 th INTERNATIONAL MULTIDISCIPLINARY CONFERENCE AUGMENTED VIRTUAL REALITY APPLICATIONS IN MANUFACTURING Peter Brázda, Jozef Novák-Marcinčin, Faculty of Manufacturing Technologies, TU Košice Bayerova 1,
More informationOPTICAL CAMOUFLAGE. ¾ B.Tech E.C.E Shri Vishnu engineering college for women. Abstract
OPTICAL CAMOUFLAGE Y.Jyothsna Devi S.L.A.Sindhu ¾ B.Tech E.C.E Shri Vishnu engineering college for women Jyothsna.1015@gmail.com sindhu1015@gmail.com Abstract This paper describes a kind of active camouflage
More informationUsing Scalable, Interactive Floor Projection for Production Planning Scenario
Using Scalable, Interactive Floor Projection for Production Planning Scenario Michael Otto, Michael Prieur Daimler AG Wilhelm-Runge-Str. 11 D-89013 Ulm {michael.m.otto, michael.prieur}@daimler.com Enrico
More informationOmni-Directional Catadioptric Acquisition System
Technical Disclosure Commons Defensive Publications Series December 18, 2017 Omni-Directional Catadioptric Acquisition System Andreas Nowatzyk Andrew I. Russell Follow this and additional works at: http://www.tdcommons.org/dpubs_series
More informationDESIGN STYLE FOR BUILDING INTERIOR 3D OBJECTS USING MARKER BASED AUGMENTED REALITY
DESIGN STYLE FOR BUILDING INTERIOR 3D OBJECTS USING MARKER BASED AUGMENTED REALITY 1 RAJU RATHOD, 2 GEORGE PHILIP.C, 3 VIJAY KUMAR B.P 1,2,3 MSRIT Bangalore Abstract- To ensure the best place, position,
More informationA SURVEY ON GESTURE RECOGNITION TECHNOLOGY
A SURVEY ON GESTURE RECOGNITION TECHNOLOGY Deeba Kazim 1, Mohd Faisal 2 1 MCA Student, Integral University, Lucknow (India) 2 Assistant Professor, Integral University, Lucknow (india) ABSTRACT Gesture
More informationClassifying 3D Input Devices
IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu Motivation The mouse and keyboard
More informationColumn-Parallel Architecture for Line-of-Sight Detection Image Sensor Based on Centroid Calculation
ITE Trans. on MTA Vol. 2, No. 2, pp. 161-166 (2014) Copyright 2014 by ITE Transactions on Media Technology and Applications (MTA) Column-Parallel Architecture for Line-of-Sight Detection Image Sensor Based
More informationdoi: /
doi: 10.1117/12.872287 Coarse Integral Volumetric Imaging with Flat Screen and Wide Viewing Angle Shimpei Sawada* and Hideki Kakeya University of Tsukuba 1-1-1 Tennoudai, Tsukuba 305-8573, JAPAN ABSTRACT
More informationChapter 1 - Introduction
1 "We all agree that your theory is crazy, but is it crazy enough?" Niels Bohr (1885-1962) Chapter 1 - Introduction Augmented reality (AR) is the registration of projected computer-generated images over
More informationMulti-touch Interface for Controlling Multiple Mobile Robots
Multi-touch Interface for Controlling Multiple Mobile Robots Jun Kato The University of Tokyo School of Science, Dept. of Information Science jun.kato@acm.org Daisuke Sakamoto The University of Tokyo Graduate
More informationHCI Outlook: Tangible and Tabletop Interaction
HCI Outlook: Tangible and Tabletop Interaction multiple degree-of-freedom (DOF) input Morten Fjeld Associate Professor, Computer Science and Engineering Chalmers University of Technology Gothenburg University
More informationTOUCHABLE HOLOGRAMS AND HAPTIC FEEDBACK: REAL EXPERIENCE IN A VIRTUAL WORLD
TOUCHABLE HOLOGRAMS AND HAPTIC FEEDBACK: REAL EXPERIENCE IN A VIRTUAL WORLD 1 PRAJAKTA RATHOD, 2 SANKET MODI 1 Assistant Professor, CSE Dept, NIRMA University, Ahmedabad, Gujrat 2 Student, CSE Dept, NIRMA
More informationISMCR2004. Abstract. 2. The mechanism of the master-slave arm of Telesar II. 1. Introduction. D21-Page 1
Development of Multi-D.O.F. Master-Slave Arm with Bilateral Impedance Control for Telexistence Riichiro Tadakuma, Kiyohiro Sogen, Hiroyuki Kajimoto, Naoki Kawakami, and Susumu Tachi 7-3-1 Hongo, Bunkyo-ku,
More informationInformation Layout and Interaction on Virtual and Real Rotary Tables
Second Annual IEEE International Workshop on Horizontal Interactive Human-Computer System Information Layout and Interaction on Virtual and Real Rotary Tables Hideki Koike, Shintaro Kajiwara, Kentaro Fukuchi
More informationA Multimodal Locomotion User Interface for Immersive Geospatial Information Systems
F. Steinicke, G. Bruder, H. Frenz 289 A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems Frank Steinicke 1, Gerd Bruder 1, Harald Frenz 2 1 Institute of Computer Science,
More informationA Gesture-Based Interface for Seamless Communication between Real and Virtual Worlds
6th ERCIM Workshop "User Interfaces for All" Long Paper A Gesture-Based Interface for Seamless Communication between Real and Virtual Worlds Masaki Omata, Kentaro Go, Atsumi Imamiya Department of Computer
More informationDevelopment of an Education System for Surface Mount Work of a Printed Circuit Board
Development of an Education System for Surface Mount Work of a Printed Circuit Board H. Ishii, T. Kobayashi, H. Fujino, Y. Nishimura, H. Shimoda, H. Yoshikawa Kyoto University Gokasho, Uji, Kyoto, 611-0011,
More informationUnit Two Part II MICROSCOPY
Unit Two Part II MICROSCOPY AVERETT 1 0 /9/2013 1 MICROSCOPES Microscopes are devices that produce magnified images of structures that are too small to see with the unaided eye Humans cannot see objects
More informationInterior Design using Augmented Reality Environment
Interior Design using Augmented Reality Environment Kalyani Pampattiwar 2, Akshay Adiyodi 1, Manasvini Agrahara 1, Pankaj Gamnani 1 Assistant Professor, Department of Computer Engineering, SIES Graduate
More informationMario Romero 2014/11/05. Multimodal Interaction and Interfaces Mixed Reality
Mario Romero 2014/11/05 Multimodal Interaction and Interfaces Mixed Reality Outline Who am I and how I can help you? What is the Visualization Studio? What is Mixed Reality? What can we do for you? What
More informationThe Design of Internet-Based RobotPHONE
The Design of Internet-Based RobotPHONE Dairoku Sekiguchi 1, Masahiko Inami 2, Naoki Kawakami 1 and Susumu Tachi 1 1 Graduate School of Information Science and Technology, The University of Tokyo 7-3-1
More informationAdvancements in Gesture Recognition Technology
IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka
More informationImage Manipulation Interface using Depth-based Hand Gesture
Image Manipulation Interface using Depth-based Hand Gesture UNSEOK LEE JIRO TANAKA Vision-based tracking is popular way to track hands. However, most vision-based tracking methods can t do a clearly tracking
More informationStereoscopic Augmented Reality System for Computer Assisted Surgery
Marc Liévin and Erwin Keeve Research center c a e s a r, Center of Advanced European Studies and Research, Surgical Simulation and Navigation Group, Friedensplatz 16, 53111 Bonn, Germany. A first architecture
More informationDevelopment of a telepresence agent
Author: Chung-Chen Tsai, Yeh-Liang Hsu (2001-04-06); recommended: Yeh-Liang Hsu (2001-04-06); last updated: Yeh-Liang Hsu (2004-03-23). Note: This paper was first presented at. The revised paper was presented
More information3D Form Display with Shape Memory Alloy
ICAT 2003 December 3-5, Tokyo, JAPAN 3D Form Display with Shape Memory Alloy Masashi Nakatani, Hiroyuki Kajimoto, Dairoku Sekiguchi, Naoki Kawakami, and Susumu Tachi The University of Tokyo 7-3-1 Hongo,
More informationDistributed Vision System: A Perceptual Information Infrastructure for Robot Navigation
Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp
More informationINTERIOUR DESIGN USING AUGMENTED REALITY
INTERIOUR DESIGN USING AUGMENTED REALITY Miss. Arti Yadav, Miss. Taslim Shaikh,Mr. Abdul Samad Hujare Prof: Murkute P.K.(Guide) Department of computer engineering, AAEMF S & MS, College of Engineering,
More informationA reduction of visual fields during changes in the background image such as while driving a car and looking in the rearview mirror
Original Contribution Kitasato Med J 2012; 42: 138-142 A reduction of visual fields during changes in the background image such as while driving a car and looking in the rearview mirror Tomoya Handa Department
More informationIndoor Positioning with a WLAN Access Point List on a Mobile Device
Indoor Positioning with a WLAN Access Point List on a Mobile Device Marion Hermersdorf, Nokia Research Center Helsinki, Finland Abstract This paper presents indoor positioning results based on the 802.11
More informationEnhancing Fish Tank VR
Enhancing Fish Tank VR Jurriaan D. Mulder, Robert van Liere Center for Mathematics and Computer Science CWI Amsterdam, the Netherlands mullie robertl @cwi.nl Abstract Fish tank VR systems provide head
More informationOpen Archive TOULOUSE Archive Ouverte (OATAO)
Open Archive TOULOUSE Archive Ouverte (OATAO) OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible. This is an author-deposited
More informationSensor system of a small biped entertainment robot
Advanced Robotics, Vol. 18, No. 10, pp. 1039 1052 (2004) VSP and Robotics Society of Japan 2004. Also available online - www.vsppub.com Sensor system of a small biped entertainment robot Short paper TATSUZO
More informationEnhancedTable: An Augmented Table System for Supporting Face-to-Face Meeting in Ubiquitous Environment
EnhancedTable: An Augmented Table System for Supporting Face-to-Face Meeting in Ubiquitous Environment Hideki Koike 1, Shinichiro Nagashima 1, Yasuto Nakanishi 2, and Yoichi Sato 3 1 Graduate School of
More informationOptoliner NV. Calibration Standard for Sighting & Imaging Devices West San Bernardino Road West Covina, California 91790
Calibration Standard for Sighting & Imaging Devices 2223 West San Bernardino Road West Covina, California 91790 Phone: (626) 962-5181 Fax: (626) 962-5188 www.davidsonoptronics.com sales@davidsonoptronics.com
More informationDesign and Implementation of the 3D Real-Time Monitoring Video System for the Smart Phone
ISSN (e): 2250 3005 Volume, 06 Issue, 11 November 2016 International Journal of Computational Engineering Research (IJCER) Design and Implementation of the 3D Real-Time Monitoring Video System for the
More information