Projection-based head-mounted displays for wearable computers

Ricardo Martins (a), Vesselin Shaoulov (b), Yonggang Ha (b), and Jannick Rolland (a,b)
University of Central Florida, Orlando, FL 32816
(a) Institute of Modeling and Simulation; (b) School of Optics/CREOL/FPCE
ricky@odalab.ucf.edu

Helmet- and Head-Mounted Displays IX: Technologies and Applications, edited by Clarence E. Rash and Colin E. Reese, Proceedings of SPIE Vol. 5442 (SPIE, Bellingham, WA, 2004), doi: 10.1117/12.542778

ABSTRACT

The projection-based head-mounted display (HMD) constitutes a new paradigm in the field of wearable computers. Expanding on our previous projection-based HMD, we developed a wearable computer consisting of a pair of miniature projection lenses combined with a beam splitter and miniature displays. This wearable computer utilizes a novel conceptual design that integrates phase conjugate material (PCM) packaged inside the HMD. Applications benefiting from this innovative wearable HMD include those for government agencies and consumers requiring mobility, a large field of view (FOV), and an ultra-lightweight headset. The key contribution of this paper is the compact design and mechanical assembly of the mobile HMD.

1. INTRODUCTION

Projection optics, as opposed to eyepiece designs, has emerged as a new optical approach for 3D visualization in HMDs [1-4]. The HMD is a key component for 3D visualization tasks such as surgical planning, medical training, and engineering design [5]. A recent innovation in the HMD field is the head-mounted projection display (HMPD), which may be thought of as a miniature projector mounted on the head with PCM strategically placed in the environment. The HMPD is an emerging technology that lies on the boundary between conventional HMDs and projection displays such as the Cave Automatic Virtual Environment (CAVE) technology [6-9]. It yields 3D visualization capability with a large FOV (up to 70 degrees with a flat retroreflective screen based on current off-the-shelf PCM) [9], lightweight optics with low distortion, and correct occlusion of virtual objects by real objects [10].

The early HMPDs conceived in the Optical Diagnostics and Applications Laboratory (ODALab) consisted of a pair of miniature projection lenses combined with a beam splitter and miniature displays, all mounted in a headset, together with PCM placed strategically in the environment, as shown in Fig. 1. The PCM placed in the environment allows users to view computer-generated images embedded in the real environment. The stereoscopic images projected from the HMPD are retroreflected from the PCM back to the viewer's respective eyes, allowing stereoscopic perception. The PCM is flexible and can be used to partially or completely surround the users, or to inexpensively cover any surface or object of various shapes within the environment. Fig. 1 shows an example of a dynamic volumetric augmented reality (AR) object, a human femur, perceived by the user wearing the HMPD [11]. The virtual femur retains the physical properties of the real object, but it can also dynamically take on any visual property, including animation. The only hindrance of such an HMPD system is the lack of mobility outside of the PCM area, because the system is tethered to the external PCM placed in the real environment.

Figure 1: Current HMPD.

The outdoor HMPD that we propose builds on the previous HMPD concept; the novelty is the integration of the PCM within the HMPD itself [12]. This technology expands the boundaries of conventional HMDs and projection-based displays because it opens the door from an indoor environment tethered to the PCM to a mobile system with potential outdoor applications such as Military Operations on Urbanized Terrain (MOUT) [13]. The proposed wearable HMPD configuration allows 3D visualization with a large field of view (FOV), lightweight optics, and low distortion. The outdoor HMPD design comprises lightweight projection optics and PCM integrated into the headset, which eliminates the need for an external PCM. A key element of the design is not only the integration of the PCM but also the use of a lens in combination with this novel enclosed projection system, clearly facilitating the operability of the technology [14].

In this paper, a review of the conceptual design for the outdoor HMPD is presented in Section 2. In Section 3, we demonstrate a 42-degree projection optics module. Finally, in Section 4 we present an analysis of imaging with commercially available phase conjugate material, together with an experimental validation and conclusions for improving the image quality.

2. REVIEW OF THE OPTICAL LAYOUT FOR THE WEARABLE HMPD

Fig. 2 provides the conceptual design of an outdoor HMPD, which was developed in the ODALab and finalized in collaboration with the United States Army STRICOM Synthetic Natural Environment (SNE) project [14]. The fundamental principle of the outdoor HMPD is enabled by projection optics that projects a real image onto the PCM, where the rays are then retroreflected from the PCM back to the user's eye. Due to the nature of the PCM, rays hitting the surface are reflected back on themselves in the opposite direction. Therefore, a user positioned at the exit pupil of the optics can perceive the projected virtual image [15]. If the projected image and the PCM are conjugate to each other, the user can clearly view the virtual image. We previously demonstrated that not placing the PCM at the same location as the projected real image leads to a degraded and blurred image, rendering the virtual images useless. A solution for rendering clear virtual images was to place a lens between the projection optics and the PCM in order to conjugate the PCM and the projected real image. By conjugating the PCM and the projected real image in a compact solution, we enabled a wearable outdoor HMPD. However, other issues arose that degraded the virtual image quality; we address these issues in Section 4.
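The conjugation requirement above can be checked with back-of-the-envelope paraxial raytracing in the spirit of [15]. The following minimal thin-lens sketch uses assumed distances only (none of these numbers come from the actual prototype, and the real system is a multi-element design): given where the projection optics forms its real image and where the integrated PCM sits, it reports the focal length a conjugating lens placed between them would need so that the two planes are imaged onto each other.

```python
def required_focal_length(d_obj_mm, d_img_mm):
    """Thin-lens focal length that conjugates an object plane d_obj in front
    of the lens with an image plane d_img behind it: 1/f = 1/d_obj + 1/d_img
    (all distances taken as positive magnitudes)."""
    return (d_obj_mm * d_img_mm) / (d_obj_mm + d_img_mm)

def image_distance(f_mm, d_obj_mm):
    """Image distance for a given focal length and object distance."""
    return 1.0 / (1.0 / f_mm - 1.0 / d_obj_mm)

# Illustrative spacings only (hypothetical, not the prototype prescription):
# real image of the projection optics 25 mm in front of the conjugating lens,
# integrated PCM 40 mm behind it.
d_image_plane, d_pcm = 25.0, 40.0
f = required_focal_length(d_image_plane, d_pcm)
print(f"required conjugating-lens focal length: {f:.1f} mm")                 # ~15.4 mm
print(f"resulting image plane: {image_distance(f, d_image_plane):.1f} mm")   # 40.0 mm

# If the PCM sits away from this conjugate plane, the retroreflected rays no
# longer retrace toward a sharp virtual image, which is why a degraded,
# blurred image is observed when the two planes are not conjugate.
```

In the actual design, this conjugating role is played by the lens inserted between the projection optics and the integrated PCM.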

Figure 2: First-order layout of the HMPD conceptual design.

3. OPTICAL LENS DESIGN

The HMPD conceptual design shown in Fig. 3 is an example of how the miniature, lightweight projection optics and the PCM can be integrated on the head as a wearable headset.

Figure 3: Wearable HMPD concept. While only a grayscale picture can be shown here for publication, the display allows full color.

The lens module of the projection optics, integrated with the miniature display, is shown in Fig. 4. The miniature display was selected based on illumination requirements. An off-the-shelf 0.6 in diagonal organic light-emitting display (OLED) with a resolution of 800x600 pixels and a 15 µm pixel size, manufactured by eMagin Corp., was integrated into the lens module. Other off-the-shelf miniature displays use external light sources, adding to the overall length and weight. The self-emitting property of the OLED allows for an ultra-lightweight and compact solution for a wearable HMPD.

The optical design is composed of a main module consisting of four lenses and a field lens close to the miniature display. The projection lens for the wearable HMPD was designed with a combination of a diffractive optical element (DOE), plastic components, and aspheric surfaces, ensuring both compactness and high image quality while achieving a 42-degree FOV. The wearable HMPD was designed for a 15 mm eye relief and may be further modified before the final prototype is built. The eye relief, accounting for the tilt of the beam splitter and the lens module, is less than 26 mm; therefore, the prototype will not accommodate eyeglasses. The state-of-the-art compact lens was manufactured within a 1 in length, with lightweight optics of 8 grams per eye.
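As a sanity check on the numbers quoted above, the following sketch works out the OLED active area from the pixel count and pitch, and the effective focal length the projection lens would need for the stated FOV. It assumes the 15 µm pitch applies to both axes and that the 42 degrees refers to the full diagonal field of view; the paper does not state the latter explicitly, so treat the focal-length figure as illustrative.

```python
import math

# OLED parameters quoted in Section 3
h_px, v_px, pitch_mm = 800, 600, 0.015

width_mm, height_mm = h_px * pitch_mm, v_px * pitch_mm      # 12.0 mm x 9.0 mm
diag_mm = math.hypot(width_mm, height_mm)                   # 15.0 mm
print(f"active-area diagonal: {diag_mm:.1f} mm = {diag_mm / 25.4:.2f} in")  # ~0.59 in

# Effective focal length implied if the 42-degree FOV is the full diagonal
# field (an assumption on our part).
full_fov_deg = 42.0
efl_mm = (diag_mm / 2.0) / math.tan(math.radians(full_fov_deg / 2.0))
print(f"implied effective focal length: {efl_mm:.1f} mm")    # ~19.5 mm
```

The recovered diagonal of roughly 0.59 in is consistent with the quoted 0.6 in display size.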

Figure 4: Monocular lens-mount assembly.

4. EXPERIMENTAL RESULTS OF PCM

We investigated two different types of commercially available PCM, micro-optical bead and micro-corner-cube array geometries, with features approximately 100 µm in size, as shown in Fig. 5(a) and (b). The behavior of the non-uniform micro-bead array is described by a combination of Snell's law and specular reflection, while the micro-corner-cube array relies on total internal reflection; both provide the required retroreflective property. The commercially available PCMs are currently optimized not for imaging, but for applications such as traffic control and other safety purposes.

In the ideal case of a perfect retroreflector, the incoming rays emitted by the miniature display would be reflected back parallel and opposite to the incident light, without any deviation. The commercially available PCMs only partially return rays parallel to the incident light; instead, the returned rays deviate within a ±15-degree cone. This cone of light reflected from the PCM provides more illumination for devices such as stop signs and firefighters' vests, for example. For imaging, however, it degrades the virtual image, since the rays are reflected back in a cone instead of parallel to the incident light.

Due to imperfections of the micro-optical beads shown in Fig. 5(a), such as the randomness of the radii and of the separation between consecutive beads, the retroreflected rays deviate from being parallel to the incident light. The micro-optical beads also showed a greater loss of light efficiency than the micro-corner cubes, and high efficiency is needed to overcome indoor ambient light or outdoor illumination. The second PCM tested, the micro-corner-cube array based on an array of pyramids shown in Fig. 5(b), benefits from uniform spacing, but the faces of the pyramids are not perfectly planar, nor is each pair of faces exactly 90 degrees apart. In addition, if the surface of a pyramid face is slightly curved, the incident rays encounter a curved mirror, altering the optical path required for ideal retroreflection. Therefore, not all rays reflect parallel to the incoming rays; rather, they deviate, again producing image degradation.

Finally, to yield ideal imaging conditions, any PCM must satisfy stringent uniformity and surface criteria so that the incoming rays undergo near-perfect retroreflection. To produce the desired retroreflection we need either an optimized corner-cube array, shown in Fig. 5(c), or a custom-built microlenslet array, shown in Fig. 5(d), with uniform radii of curvature and lens separation as well as consistent performance of the microlenses across the array. Manufacturing such a PCM presents fabrication challenges that will need to be further investigated. For our subsequent implementation, the micro-corner-cube PCM was selected based on its higher light efficiency compared with the micro-optical beads.
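The deviation argument above can be made concrete by modeling a corner cube as three successive reflections. With mutually orthogonal faces, the composition of the three reflections exactly negates the ray direction; perturbing one face normal breaks the 90-degree geometry and the returned ray is no longer antiparallel to the incident one. The sketch below is a geometric toy model only: the 0.5-degree error is an arbitrary illustrative value, and the real micro-corner cubes retroreflect via total internal reflection, which follows the same geometric law of reflection at each face.

```python
import numpy as np

def reflect(d, n):
    """Reflect direction d off a plane with unit normal n (law of reflection)."""
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

def corner_cube(d, normals):
    """Apply three successive face reflections to a ray direction."""
    for n in normals:
        d = reflect(d, n)
    return d

d_in = np.array([0.3, -0.2, -0.933])
d_in /= np.linalg.norm(d_in)

# Ideal corner cube: three mutually orthogonal faces return exactly -d_in.
ideal = [np.eye(3)[i] for i in range(3)]
print(np.allclose(corner_cube(d_in, ideal), -d_in))   # True

# Tilt one face normal by 0.5 degrees (arbitrary, illustrative error).
err = np.radians(0.5)
tilted = [np.array([np.cos(err), np.sin(err), 0.0]), np.eye(3)[1], np.eye(3)[2]]
d_out = corner_cube(d_in, tilted)
dev_deg = np.degrees(np.arccos(np.clip(np.dot(d_out, -d_in), -1.0, 1.0)))
print(f"angular deviation from perfect retroreflection: {dev_deg:.2f} deg")
```

Randomly varying face angles or curvatures across the array therefore spread the retroreflected light into a cone, which is the mechanism behind the observed image degradation.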

Figure 5: Different types of microstructures: (a) micro-optical beads, (b) micro-corner-cube array, (c) optimized micro-corner-cube array, (d) microlenslet array.

With the micro-corner-cube array, a bench test was assembled to validate the conceptual design of the wearable HMPD and to qualitatively investigate the image degradation produced by the PCM. Fig. 6 shows the bench setup for the wearable HMPD, with the manufactured projection optics on the left and the PCM on the right. The projection optics re-images the computer-generated test image shown in Fig. 7 onto the PCM. Although we use a grayscale test image, the OLED is capable of projecting color images. The test image was projected onto the micro-corner-cube array and then captured with a CCD camera placed at the exit pupil location, which simulates a user's eye. Two scenarios were considered to qualitatively investigate the image quality: scenario 1 with the room lights off, and scenario 2 with the room lights on (i.e., 15 lux).

Figure 6: HMPD bench setup (projection optics on the left, PCM on the right).

Figure 7: Computer-generated test image. The grayscale version is shown.

We started our investigation with scenario 1 and captured the 42-degree FOV image at 1500 mm, as shown in Fig. 8. Next, we investigated scenario 2 and captured the projected virtual image together with the environment, providing a full see-through wearable HMPD, as shown in Figs. 9 and 10. The virtual images in Figs. 9 and 10 were captured under the same ambient light; the difference between them is that for Fig. 9 the camera was focused on the same image plane as in Fig. 8, while for Fig. 10 the camera was focused on the background.

Figure 8: Captured test image with lights off (scenario 1).

Figure 9: Captured test image with lights on, 15 lux (scenario 2), camera focused on the image plane.

Figure 10: Captured test image with lights on, 15 lux (scenario 2), camera focused on the background.
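The contrast loss reported for scenario 2 in the analysis that follows is consistent with a simple additive-luminance model: room light raises both the bright and dark levels of the retroreflected image by roughly the same amount, which compresses the contrast ratio. The luminance values in the sketch below are hypothetical (not measurements from the bench test) and serve only to illustrate the trend.

```python
def contrast_ratio(white_nits, black_nits, ambient_nits):
    """Simple contrast ratio with ambient luminance added to both levels."""
    return (white_nits + ambient_nits) / (black_nits + ambient_nits)

# Hypothetical retroreflected-image luminances (illustrative only).
white, black = 50.0, 0.5

for ambient in (0.0, 5.0, 20.0):   # lights off, dim room, brighter room
    cr = contrast_ratio(white, black, ambient)
    print(f"ambient {ambient:4.1f} nits -> contrast {cr:6.1f}:1")
# The ratio falls from 100:1 with no ambient light to roughly 3:1 at 20 nits,
# mirroring the qualitative contrast loss seen with the room lights on.
```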

Figs. 8-10 qualitatively demonstrate the difference in image quality between scenario 1 and scenario 2. Comparing the computer-generated test image in Fig. 7 with the results of scenarios 1 and 2 in Figs. 8-10 shows that scenario 1 yields a better representation of the test image than scenario 2. In scenario 2 the ambient room light was less than the microdisplay illumination, so the images remained visible, but the contrast of the virtual images decreased. In addition, the PCM was not optimized to retroreflect all of the light back to the user's eye (in our case, the CCD camera), leading to a further decrease in the contrast ratio.

5. CONCLUSION

The research presented in this paper led to the conceptual design of a novel single-unit optical system consisting of an OLED microdisplay, projection optics, and PCM integrated into the HMPD. This unique design enables applications such as augmented reality for urban combat (MOUT), guided surgery, and wearable computers, allowing the user to view computer-generated images in indoor or outdoor environments. This design also led to specific requirements for manufacturing custom PCM that will be integrated into our ultra-lightweight, wide-field-of-view HMPD assembly to improve the image quality.

ACKNOWLEDGMENTS

We thank eMagin Corporation for the OLED microdisplay documentation and Rick Plympton of Optimax Corporation for his generous assistance with the lens fabrication. We also thank 3M Inc. for the generous custom fabrication and donation of the micro-corner-cube PCM to the ODALab, the M.I.N.D. Lab, and the 3DVIS Lab, located at three different collaborating universities. This research was supported by grant number N00014-03-10677 awarded by the Office of Naval Research.

REFERENCES

1. R.W. Fisher, "Head-mounted projection display system featuring beam splitter and method of making same," US Patent 5,572,229, November 5, 1996.
2. K.F. Arrington and G.A. Geri, "Conjugate-Optical Retroreflector Display System: Optical Principles and Perceptual Issues," Journal of the SID, August 2000.
3. H. Hua, A. Girardot, C. Gao, and J.P. Rolland, "Engineering of Head-Mounted Projective Displays," Applied Optics, Vol. 39(22), pp. 3814-3824, 2000.
4. J.P. Rolland, F. Biocca, H. Hua, Y. Ha, C. Gao, and O. Harrysson, "Teleportal Augmented Reality System: Integrating virtual objects, remote collaborators, and physical reality for distributed networked manufacturing," Chapter 11, Springer-Verlag, June 2004.
5. L. Davis et al., "Application of Augmented Reality to Visualizing Anatomical Airways," SPIE AeroSense: Helmet- and Head-Mounted Displays VII: Technologies and Applications, C.E. Rash and C.E. Reese, eds., Proceedings of SPIE Vol. 4711, pp. 400-405, August 2002.
6. R. Kijima and T. Ojika, "Transition between virtual environment and workstation environment with projective head-mounted display," Proceedings of the IEEE 1997 Virtual Reality Annual International Symposium, IEEE Computer Society Press, pp. 130-137, Los Alamitos, CA, USA, 1997.
7. C. Cruz-Neira, D.J. Sandin, and T.A. DeFanti, "Surround-screen projection-based virtual reality: the design and implementation of the CAVE," Proceedings of ACM SIGGRAPH 93, pp. 135-142, ACM, New York, NY, USA, Anaheim, CA, 1-6 August 1993.
8. M. Inami, N. Kawakami, D. Sekiguchi, Y. Yanagida, T. Maeda, and S. Tachi, "Visuo-haptic display using head-mounted projector," Proceedings of IEEE Virtual Reality 2000, IEEE Computer Society, pp. 233-240, Los Alamitos, CA, USA, 2000.
9. Y. Ha and J.P. Rolland, "Compact lens assembly for the teleportal augmented reality system," US Patent, University of Central Florida, issued August 2004.
10. H. Hua, J.P. Rolland, and F. Biocca, "Compact Lens Assembly for Wearable Displays, Projection Systems, and Cameras," US Patent, University of Central Florida, filed 2001.
11. H. Hua, C. Gao, L. Brown, F. Biocca, and J.P. Rolland, "Design of an ultralight head-mounted projective display (HMPD) and its applications in augmented collaborative environments," Stereoscopic Displays and Virtual Reality Systems IX, A.J. Woods, J.O. Merritt, S.A. Benton, and M.T. Bolas, eds., Proceedings of SPIE Vol. 4660, pp. 492-497, May 2002.
12. R. Martins, J.P. Rolland, and Y. Ha, "Head-mounted display by integration of phase-conjugate material," US Patent, University of Central Florida, filed April 18, 2003.
13. S. Julier, Y. Baillot, M. Lanzagorta, D. Brown, and L. Rosenblum, "BARS: Battlefield Augmented Reality Systems," NATO Symposium on Information Processing Techniques for Military Systems, Istanbul, Turkey, 2000.
14. R. Martins and J.P. Rolland, "Diffraction of Phase Conjugate Material in a New HMD Architecture," SPIE AeroSense: Helmet- and Head-Mounted Displays VIII: Technologies and Applications, C.E. Rash and C.E. Reese, eds., Proceedings of SPIE Vol. 5186, pp. 277-283, September 2003.
15. J.P. Rolland, V. Shaoulov, and F.J. Gonzalez, "The art of back-of-the-envelope paraxial raytracing," IEEE Transactions on Education, Vol. 44(4), pp. 365-372, November 2001.