Geo-Located Content in Virtual and Augmented Reality

Technical Disclosure Commons, Defensive Publications Series
October 02, 2017
Author: Thomas Anglaret
Available at: http://www.tdcommons.org/dpubs_series

Recommended Citation: Anglaret, Thomas, "Geo-Located Content in Virtual and Augmented Reality", Technical Disclosure Commons (October 02, 2017).

This work is licensed under a Creative Commons Attribution 4.0 License.

Abstract: A three-dimensional (3D) world representation is used to place virtual objects that represent geolocalized content into a virtual reality (VR) or augmented reality (AR) environment. Browsing navigation may then be performed in 3D by touching, pointing, or focusing on a virtual object to reveal its corresponding media content. The media content may be static and/or interactive content based on video, audio, images, and/or text. Rather than being presented on flat surfaces, the content surrounds the user in a more-immersive mode, and more-natural VR or AR techniques can be used to select, expand, move, and interact with the content.

Keywords: virtual reality, augmented reality, content, browsing, searching, navigating, immersive navigation, gaze, point, touch, geolocation, geolocalized content

Background: Virtual reality (VR) environments rely on display, tracking, and VR-content systems. Through these systems, realistic images, sounds, and sometimes other sensations simulate a user's physical presence in an artificial environment. Each of these three systems is illustrated below in Fig. 1.

[Fig. 1: Block diagram of a VR device, showing a tracking system (image sensors: wide-angle camera, narrow-angle camera, depth sensor, user-facing camera; non-image sensors: gyroscope, magnetometer, accelerometer, GPS receiver; user interfaces: touchscreen, keyboard, pointing device, mouse), a VR-content system (host server, network, mobile device, VR device), and a display system (head-mounted display, projection system, monitor, mobile-device display), coordinated by a processor.]

The systems described in Fig. 1 may be implemented in one or more of various computing devices that can support VR applications, such as servers, desktop computers, VR goggles, computing spectacles, laptops, or mobile devices. These devices include a processor that can manage, control, and coordinate operations of the display, tracking, and VR-content systems. The devices also include memory and interfaces. These interfaces connect the memory with the systems using various buses and other connection methods as appropriate.

The display system enables a user to look around within the virtual world. The display system can include a head-mounted display, a projection system within a virtual-reality room, a monitor, or a mobile device's display, either held by a user or placed in a head-mounted device.
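The composition in Fig. 1 can be summarized, purely as an illustrative sketch rather than an implementation taken from the disclosure, as a device object that bundles the three systems under one processor-coordinated unit. All class and field names below are hypothetical.

```python
# Minimal sketch (hypothetical, not from the disclosure) of how the three
# systems in Fig. 1 might be composed on a single VR-capable device.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DisplaySystem:
    kind: str  # e.g., "head-mounted display", "projection system", "monitor"

@dataclass
class TrackingSystem:
    image_sensors: List[str] = field(default_factory=lambda: [
        "wide-angle camera", "narrow-angle camera", "depth sensor", "user-facing camera"])
    non_image_sensors: List[str] = field(default_factory=lambda: [
        "gyroscope", "magnetometer", "accelerometer", "GPS receiver"])
    user_interfaces: List[str] = field(default_factory=lambda: [
        "touchscreen", "keyboard", "pointing device", "mouse"])

@dataclass
class VRContentSystem:
    source: str  # e.g., "host server", "mobile device", "dedicated VR device"

@dataclass
class VRDevice:
    """A device bundling the display, tracking, and VR-content systems, as in Fig. 1."""
    display: DisplaySystem
    tracking: TrackingSystem
    content: VRContentSystem

device = VRDevice(DisplaySystem("head-mounted display"),
                  TrackingSystem(),
                  VRContentSystem("host server"))
```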

The VR-content system provides content that defines the VR environment, such as images and sounds. The VR-content system provides the content using a host server, a network-based device, a mobile device, or a dedicated virtual reality device, to name a few.

The tracking system enables the user to interact with and navigate through the VR environment, using sensors and user interfaces. The sensors may include image sensors such as a wide-angle camera, a narrow-angle camera, a user-facing camera, and a depth sensor. Non-image sensors may also be used, including gyroscopes, magnetometers, accelerometers, GPS sensors, retina/pupil detectors, pressure sensors, biometric sensors, temperature sensors, humidity sensors, optical or radio-frequency sensors that track the user's location or movement (e.g., the user's fingers, arms, or body), and ambient light sensors. The sensors can be used to create and maintain virtual environments, integrate real-world features into the virtual environment, properly orient virtual objects (including those that represent real objects, such as a mouse or pointing device) in the virtual environment, and account for the user's body position and motion.

The user interfaces may be integrated with or connected to the computing device and enable the user to interact with the VR environment. The user interfaces may include a touchscreen, a keyboard, a pointing device, a mouse or trackball device, a joystick or other game controller, a camera, a microphone, or an audio device with user controls. The user interfaces allow a user to interact with the virtual environment by performing an action, which causes a corresponding action in the VR environment (e.g., raising an arm, walking, or speaking).

The tracking system may also include output devices that provide visual, audio, or tactile feedback to the user (e.g., vibration motors or coils, piezoelectric devices, electrostatic devices, LEDs, strobes, and speakers). For example, output devices may provide feedback in the form of blinking and/or flashing lights or strobes, audible alarms or other sounds, songs or other audio files, increased or decreased resistance of a control on a user interface device, or vibration of a physical component, such as a head-mounted display, a pointing device, or another user interface device.
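As a hedged sketch of how a tracked physical action might be dispatched to a corresponding VR action and acknowledged through an output device, consider the following. The action names, the mapping, and the feedback channels are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the tracking-system loop described above: a physical
# user action detected by sensors or a user interface is mapped to a
# corresponding action in the VR environment, and output devices give feedback.
from typing import Optional

# Hypothetical mapping from detected physical actions to VR-environment actions.
ACTION_MAP = {
    "raise_arm": "avatar_raise_arm",
    "walk_forward": "move_camera_forward",
    "speak": "voice_command",
}

def feedback(channel: str, signal: str) -> None:
    """Stand-in for driving an output device (speaker, LED, vibration motor)."""
    print(f"[feedback] {channel}: {signal}")

def handle_tracked_event(event: str) -> Optional[str]:
    """Dispatch a tracked physical action to its VR counterpart and confirm it."""
    vr_action = ACTION_MAP.get(event)
    if vr_action is None:
        feedback("audio", "error tone")          # e.g., an audible alarm
        return None
    feedback("haptic", "short vibration pulse")  # e.g., a vibration motor in the HMD
    return vr_action

if __name__ == "__main__":
    print(handle_tracked_event("raise_arm"))   # -> avatar_raise_arm
    print(handle_tracked_event("wave"))        # unmapped action -> error feedback
```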

Fig. 1 illustrates the display, tracking, and VR-content systems as disparate entities in part to show the communications between them, though they may be integrated (e.g., a smartphone mounted in a VR receiver) or operate separately in communication with other systems. These communications can be internal, wireless, or wired. Through these illustrated systems, a user can be immersed in a VR environment.

While these illustrated systems are described in the VR context, they can be used, in whole or in part, to augment the physical world. This augmentation, called augmented reality or AR, includes audio, video, or images that overlay or are presented in combination with the real world or images of the real world. Examples include visual or audio overlays to computing spectacles (e.g., some real-world/VR-world video games or information overlays to a real-time image on a mobile device) or to an automobile's windshield (e.g., a heads-up display), to name just a few possibilities.

In typical configurations of the VR and AR systems described in Fig. 1, the browsing experience is served via a two-dimensional (2D) paradigm. This represents a challenge for VR and AR users, who must use VR techniques, such as a controller, a touch, or a gaze (and sometimes a virtual keyboard to aid in searching), to navigate a flat and static interface that mimics a conventional mobile or desktop interface. This means that the VR/AR user may have to break away from what is intended to be an immersive virtual experience to engage in 2D interactions in a three-dimensional (3D) environment, which can be tiring and frustrating for users seeking an immersive and intuitive VR experience.

Description: To address the problem of requiring a VR/AR user to engage in two-dimensional (2D) interactions in a three-dimensional (3D) environment, a 3D world representation is used to place virtual objects that represent geolocalized content into a virtual reality (VR) or augmented reality (AR) environment. The 3D world representation may be a partial or full representation and may include both real and imaginary elements. Browsing navigation may then be performed in 3D by touching, pointing, or focusing on a virtual object to reveal its corresponding media content. The media content may be static and/or interactive content based on video, audio, images, and/or text.

Rather than presenting flat surfaces to click on, the content surrounds the user in a more-immersive mode. The user can move around the objects and, as the content is revealed, interact with the different media using more-natural VR or AR techniques to further select, expand, move, and interact with the content. Selected content can be presented in a 360° view, as appropriate for the application and the environment. For example, in a VR environment, content can surround the user so that however the user moves, the appropriate content will be viewable in 360°. In an AR environment, content can be viewable over 360° by using area learning, so that the AR application remembers where objects are, and the gyroscope in the AR device can track the device's position and smoothly display content properly situated with respect to the user's position in the physical world. In some cases, an AR user can switch to VR mode to maintain 360° views.

Geolocalized content is content that is relevant to the real or virtual location associated with the VR or AR environment the user is exploring. For example, a VR user might want to browse trending news. Instead of clicking points on a flat panel display, the VR user can interact with the news by standing on a virtual globe or map and having various topics presented in a 3D surround view.
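The focus-to-reveal navigation described above can be sketched with a simple angular test between the user's gaze direction and the direction to each virtual object. The function names, data layout, and 10° threshold below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch: reveal a geolocalized object's media content when the
# user's gaze points close enough to it.
import math
from typing import Optional, Sequence, Tuple

Vec3 = Tuple[float, float, float]

def normalize(v: Vec3) -> Vec3:
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return (v[0] / length, v[1] / length, v[2] / length)

def angle_between(a: Vec3, b: Vec3) -> float:
    a, b = normalize(a), normalize(b)
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))

def gazed_object(user_pos: Vec3, gaze_dir: Vec3,
                 objects: Sequence[dict], threshold_deg: float = 10.0) -> Optional[dict]:
    """Return the virtual object whose direction best matches the gaze, if it
    falls within the angular threshold; its media content can then be revealed."""
    best, best_angle = None, threshold_deg
    for obj in objects:
        to_obj = tuple(p - u for p, u in zip(obj["position"], user_pos))
        angle = angle_between(gaze_dir, to_obj)
        if angle <= best_angle:
            best, best_angle = obj, angle
    return best

objects = [
    {"name": "Economic Headlines", "position": (0.0, 1.6, -2.0), "content": "news feed"},
    {"name": "Oil Prices",         "position": (1.5, 1.6, -1.5), "content": "chart"},
]
hit = gazed_object((0.0, 1.6, 0.0), (0.0, 0.0, -1.0), objects)
print(hit["name"] if hit else "nothing in focus")  # -> Economic Headlines
```

The same test works whether the "focus" comes from head orientation, eye tracking, or a pointed controller; only the source of the direction vector changes.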

Fig. 2 illustrates an example of this news-browsing application: thumbnail images, displayed in a semi-circle around the user, that the user can touch, gaze at, or select with a user interface (e.g., a VR controller) to interact with in more detail.

[Fig. 2: A user surrounded by a semi-circle of news thumbnails, e.g., "Economic Headlines: GDP improving; more growth predicted", "New Jobs: 156,000; forecast reduced", "Uptick in unemployment; now 4.4%", "Oil prices flat; shale holding its own".]
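As a rough illustration of the semi-circular layout in Fig. 2, the sketch below spreads N content thumbnails evenly across an arc in front of the user. The 120° arc width, 2 m radius, and eye height are assumed values, not specified in the disclosure.

```python
# Hypothetical layout sketch: place N thumbnails on a semi-circular arc centered
# on the user, with the user at the origin looking down the -z axis.
import math
from typing import List, Tuple

def semicircle_positions(n: int, radius: float = 2.0, arc_deg: float = 120.0,
                         eye_height: float = 1.6) -> List[Tuple[float, float, float]]:
    """Return (x, y, z) positions, evenly spaced on an arc in front of the user."""
    positions = []
    for i in range(n):
        # Spread items from -arc/2 to +arc/2 around straight ahead.
        t = 0.5 if n == 1 else i / (n - 1)
        angle = math.radians(-arc_deg / 2 + t * arc_deg)
        x = radius * math.sin(angle)
        z = -radius * math.cos(angle)
        positions.append((x, eye_height, z))
    return positions

for pos in semicircle_positions(4):
    print(tuple(round(c, 2) for c in pos))
```

Widening the arc toward 360° gives the fully surrounding presentation described earlier for VR environments.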

Fig. 3 shows another example of a VR application. In the example of Fig. 3, a VR map is presented to the user. The VR map includes multiple objects that can be accessed to display content. By using the VR controller (e.g., the pointing device of Fig. 1), the user can aim a virtual pointer at a particular object and display the content associated with the object.

[Fig. 3: A VR map populated with selectable objects; a controller-driven virtual pointer is aimed at one object to display its associated content.]
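One common way to realize this pointer-based selection is to cast a ray from the controller against bounding spheres around the map objects. The sketch below is a generic illustration of that idea under assumed names and data, not the disclosure's specific method.

```python
# Hypothetical ray-picking sketch: cast a ray from the VR controller and return
# the nearest map object (modeled with a bounding sphere) that it hits.
import math
from typing import Optional, Sequence, Tuple

Vec3 = Tuple[float, float, float]

def ray_hits_sphere(origin: Vec3, direction: Vec3,
                    center: Vec3, radius: float) -> Optional[float]:
    """Return the distance along the (normalized) ray to the sphere, or None if missed."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None

def pick_object(origin: Vec3, direction: Vec3, objects: Sequence[dict]) -> Optional[dict]:
    """Return the closest object hit by the pointer ray, if any."""
    hits = [(t, obj) for obj in objects
            if (t := ray_hits_sphere(origin, direction, obj["center"], obj["radius"])) is not None]
    return min(hits, key=lambda h: h[0])[1] if hits else None

map_objects = [
    {"name": "museum",  "center": (0.0, 1.0, -3.0), "radius": 0.3, "content": "opening hours"},
    {"name": "stadium", "center": (2.0, 1.0, -4.0), "radius": 0.3, "content": "event schedule"},
]
picked = pick_object((0.0, 1.0, 0.0), (0.0, 0.0, -1.0), map_objects)
print(picked["name"] if picked else "no object under the pointer")  # -> museum
```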

Fig. 4 illustrates the concept in an example AR environment. In the example of Fig. 4, an AR user sees a virtual globe showing geolocated places in the physical world. The user can gaze at, or reach out and virtually touch, the object or an AR element associated with the place or object (e.g., the images or numbers shown on the globe). This selection presents, in the AR display, content associated with the place or object, such as news or information, hours of operation, or related objects and places.

[Fig. 4: An AR view of a virtual globe annotated with images and numbers marking geolocated places; selecting a marker reveals content about that place.]

Because the content is associated with a geolocation, the spatial context of the content can be used to curate it. The spatial relationships between geolocated objects in a VR or AR environment, and between the geolocated objects and the VR/AR user, allow the content to be curated to be relevant to the user's physical or virtual location, or to more closely match trending searches or activity related to the place or object with which the content is associated.
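To make the geolocation-to-globe mapping and the distance-based curation concrete, here is a hedged sketch: it converts latitude/longitude into a point on the virtual globe's surface and ranks content by great-circle distance from the user's real or virtual location. The globe radius, sample coordinates, and the proximity-only ranking rule are illustrative assumptions; the disclosure also allows curation by trending searches or activity.

```python
# Hypothetical sketch: place geolocated markers on a virtual globe and curate
# content by proximity to the user's real or virtual geolocation.
import math
from typing import List, Tuple

def latlon_to_globe(lat_deg: float, lon_deg: float, radius: float = 0.5) -> Tuple[float, float, float]:
    """Map latitude/longitude to a point on a virtual globe of the given radius."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.sin(lat)
    z = radius * math.cos(lat) * math.sin(lon)
    return (x, y, z)

def great_circle_km(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Haversine distance in kilometers between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def curate_by_proximity(user_latlon: Tuple[float, float], items: List[dict]) -> List[dict]:
    """Order geolocalized content so items nearest the user's location come first."""
    return sorted(items, key=lambda it: great_circle_km(user_latlon, it["latlon"]))

places = [
    {"name": "museum exhibit news", "latlon": (48.8606, 2.3376)},   # illustrative coordinates
    {"name": "stadium event info",  "latlon": (40.8296, -73.9262)},
]
for item in curate_by_proximity((48.8566, 2.3522), places):
    print(item["name"], "at globe point", latlon_to_globe(*item["latlon"]))
```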