
DEPARTMENT OF THE NAVY
NAVAL UNDERSEA WARFARE CENTER DIVISION NEWPORT
OFFICE OF COUNSEL
PHONE: (401) 832-3653
FAX: (401) 832-4432
DSN: 432-3853

Attorney Docket No. 98580
Date: 25 April 2008

The below identified patent application is available for licensing. Requests for information should be addressed to:

TECHNOLOGY PARTNERSHIP ENTERPRISE OFFICE
NAVAL UNDERSEA WARFARE CENTER
1176 HOWELL ST.
CODE 07TP, BLDG. 990
NEWPORT, RI 02841

Serial Number: 12/075,267
Filing Date: 6 February 2008
Inventor: Douglas B. Maxwell

Address any questions concerning this matter to the Office of Technology Transfer at (401) 832-1511.

DISTRIBUTION STATEMENT
Approved for Public Release; Distribution is unlimited.

VIRTUAL REALITY TRAINING SYSTEM FOR A SUBMARINE COMMAND CENTER

This application claims the benefit of U.S. Provisional Application No. 60/900,310, filed February 6, 2007, and entitled VIRTUAL REALITY TRAINING SYSTEM FOR A SUBMARINE COMMAND CENTER by Douglas B. Maxwell.

STATEMENT OF GOVERNMENT INTEREST

[0001] The invention described herein may be manufactured and used by or for the Government of the United States of America for governmental purposes without the payment of any royalties thereon or therefor.

BACKGROUND OF THE INVENTION

(1) FIELD OF THE INVENTION

[0002] The present invention is directed to training systems. In particular, the present invention is directed to a training system that employs a computer generated virtual reality submarine combat control system.

(2) DESCRIPTION OF THE PRIOR ART

[0003] Currently, concept of operation exercises, in which submarine control system physical layouts (such as combat control systems) are tested and individuals are trained to operate control systems on a submarine, are performed using expensive (in both cost and time) physical mock-ups. One physical mock-up prototype must be built for each control room/attack center configuration in order to assess its layout and functionality. It is impractical, due to time and cost, to continue to build physical mock-ups for future submarines, such as littoral combat vehicles. Virtual models offer the flexibility to assemble and visualize different configurations of the control room/attack center efficiently and cheaply because they can be reconfigured electronically. What is therefore needed is a system that uses a mixed real and virtual display interaction methodology to generate the visual appearance of control rooms and allow user interaction with mixed real and virtual control panels.

SUMMARY OF THE INVENTION

[0004] It is a general purpose and object of the present invention to enable rapid acquisition of expertise in operations and maintenance using simulation and virtual environments, performance measurement and coaching.

[0005] Another general purpose and object of the present invention is to increase training efficiency using distributed training with minimal instructor and exercise monitor resources and authoring tools.

[0006] Another general purpose and object of the present invention is to enhance operational capability using virtual and distributed training aids.

[0007] The above objects are accomplished with the present invention through the use of a system that uses a mixed real and virtual display and interaction methodology to generate the visual appearance of control rooms and the ability to interact with control panels. The tactical environment is modeled using actual design specifications for current combat control systems or proposed design specifications for future/experimental combat control systems. The virtual reality environment is generated through a combination of video clustering, gestural input devices, see-through head mounted displays and head tracking devices. A user is able to operate conceptual virtual displays and work with real tactical data located within a virtual submarine attack center. Multiple users, whether students or instructors, are accommodated in the environment, each having the capability to interact with different individual displays pertaining to the user's function in the combat system. Users are able to interact with and control the scene using data fed from an actual combat control system trainer in real time. Instructors are able to observe the students performing tasks, take control of their systems to guide or tutor, and identify or assess weak points in the different control panel design configurations. The present invention accommodates a team of users, student operators and instructors, each equipped with a head mounted display, head tracker and a communication system.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] A more complete understanding of the invention and many of the attendant advantages thereto will be more readily appreciated by referring to the following detailed description when considered in conjunction with the accompanying drawings, wherein like reference numerals refer to like parts and wherein:

[0009] FIG. 1 is a flow diagram of the method of implementing the virtual reality training system for a submarine command center.

DETAILED DESCRIPTION OF THE INVENTION

[0010] Referring to FIG. 1, there is illustrated the method of implementing the virtual reality training system of the present invention. The first step in implementing the training environment is to create a model of the actual environment using design data from existing submarines in operation 10. Multiple virtual models of various attack centers are developed, including traditional and conceptual future attack-center versions. Modular design principles are employed to allow the model to be reconfigurable by the training system. Individual displays and consoles are modeled separately 20 and inserted into the environments 30. The displays are overlaid with live tactical streams 40 while in the training environment system, as described below. In addition to the model, a scene graph is developed that properly displays the model to scale 50. The scene graph has rules that define the physics of the environment. In a preferred embodiment, generating the model and scene graph is accomplished through the use of computer aided design software and a computer cluster having significant graphics capability.
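The modular, reconfigurable model described in paragraph [0010] can be pictured as a simple scene graph into which separately modeled consoles are inserted. The following is a minimal illustrative sketch only, not part of the application; the class and node names are assumptions, and a real system would carry CAD-derived geometry and physics rules on a graphics cluster.

    # Illustrative sketch only: a minimal, reconfigurable scene graph in the
    # spirit of paragraph [0010]. Class and node names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class SceneNode:
        name: str
        scale: float = 1.0                       # model is displayed to true scale
        children: list = field(default_factory=list)

        def add(self, child):
            self.children.append(child)          # modular: consoles can be inserted or removed
            return child

    # Build an attack-center model, then insert separately modeled consoles.
    attack_center = SceneNode("attack_center")
    for console_name in ("fire_control_console", "sonar_console"):
        console = attack_center.add(SceneNode(console_name))
        console.add(SceneNode(console_name + "_display"))   # display later overlaid with live data

    def walk(node, depth=0):
        print("  " * depth + node.name)
        for child in node.children:
            walk(child, depth + 1)

    walk(attack_center)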

[0011] The next step in implementing the training environment is to develop a training instructor system to allow instructors to network with the students 60. The system hardware consists of a processor, a view and one or more input devices. The processor is a standard desktop personal computer. The view is either a large monitor or a wall projector. One of the input devices is a gestural input device that allows the user to navigate the virtual environment in three dimensions. The gestural input device also allows the user to be tracked in the environment. The gestural input device must have proper registration 70, that is, a minimum sensitivity sufficient to detect changes in tracking. Registration of the user in the environment through tracking is key to creating a realistic three dimensional environment. If the tracking equipment is not accurate enough, then a user may not be able to aim and correctly interact with the environment. For example, if the tracking equipment can only maintain +/- 6 inch accuracy within the tracking volume, then a 1 inch diameter button would be very difficult to push in a virtual environment. The software gives the instructor the ability to monitor and modify all command center operations 80. The capabilities include viewing the control center in "God's eye mode," viewing and taking control of any student's view, reconfiguring the layout of the control center, and changing the live tactical displays on the control panels.
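The tracking-accuracy example in paragraph [0011] reduces to comparing tracker error against the size of the virtual control. A minimal sketch using the 6 inch and 1 inch figures from that paragraph; the usability margin is an assumption for illustration only.

    # Illustrative sketch: compare tracker error to the size of a virtual control.
    # The 6 inch / 1 inch values come from the example in paragraph [0011];
    # the 0.5 margin factor is an assumed threshold, not the application's rule.
    def control_is_usable(tracker_error_in, control_diameter_in, margin=0.5):
        """Treat a control as usable if tracking error is at most
        `margin` times its diameter."""
        return tracker_error_in <= margin * control_diameter_in

    print(control_is_usable(6.0, 1.0))   # False: +/- 6 in error, 1 in diameter button
    print(control_is_usable(0.25, 1.0))  # True: sub-inch tracking suffices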

[0012] The next step in implementing the training environment is to develop a student system using a mixed real and virtual methodology 90. The virtual reality is used to show mock-ups of the submarine attack centers and the locations of other students within the environment. The virtual reality also provides a mix of controls 100 to navigate the space, perform tasks on the control consoles, and interact with other students. The real component of the system is the real interface panel mock-ups 110. The students are able to interact with the virtual control panels using the real interface panel mock-ups. Each student system consists of a head mounted display, a mobile computer, keyboard, mouse, head tracker, gestural input device and software. The entire student system is portable and fits within a standard duffle bag. Either a see-through or monocle based head mounted display is used so that the student can see the real interface device in front of him or her. The head mounted display supports stereoscopic vision to provide a better sense of depth. The mobile computer can be a laptop, tablet or wearable PC. The head tracker is an inertia based tracker that fits compactly onto the head mounted display. The software shows the training environment and allows the students to move around in virtual space, interact with the control panels, and interact with other students. The software is interactive and distributed, allowing for collaboration with other students at local or remote locations. The real interface device that serves as a model control panel is populated with real tactical data from simulators as well as live feeds. The feeds become active when the student is positioned in front of the control panels in the virtual attack center. The real interface device is mapped with the same control sequences associated with the virtual control panels. The software interfaces with the head tracker to map real head turns to the virtual environment. The software also generates user information about the state of the system and overlays it on the head mounted display screen. When in use, the student system can accommodate all of the students geographically located in the same room, or one student may be in a remote location; in either case, they all see each other in the virtual environment and communicate verbally with each other using voice over internet protocol 120. The instructor may also be in a remote location with complete oversight within the virtual environment.
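Paragraph [0012] describes two runtime behaviors of the student system: real head turns are mapped into the virtual scene, and a panel's live feed becomes active when the student is positioned in front of it. A minimal sketch of both behaviors, with assumed field names and an assumed 1 metre activation distance; it is illustrative only and not the application's implementation.

    # Illustrative sketch: map inertial head-tracker yaw/pitch to a virtual view
    # direction, and activate a panel's live feed by proximity. The function
    # names and the 1 m activation distance are assumptions for illustration.
    import math

    def view_vector(yaw_deg, pitch_deg):
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        return (math.cos(pitch) * math.sin(yaw),
                math.sin(pitch),
                math.cos(pitch) * math.cos(yaw))

    def feed_active(student_pos, panel_pos, max_dist_m=1.0):
        return math.dist(student_pos, panel_pos) <= max_dist_m

    print(view_vector(90.0, 0.0))                          # looking roughly along +x
    print(feed_active((0.0, 0.0, 0.0), (0.5, 0.0, 0.5)))   # True: within 1 m of the panel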

[0013] The instructor and student systems are integrated through network communication methods, and the software has distributed functionality allowing users from multiple sites to interact within the same model, including voice communication 130. The instructor and student systems receive live tactical feeds 140. Some of the tactical feeds are from primary sources or training systems. The feeds are overlaid on the control panels in the virtual environment. The live tactical feeds are part of a curriculum specific to the training goals of the system, primarily the rapid acquisition of expertise in operations and maintenance of combat control systems on submarines 150.

[0014] The advantage of the present invention is that it reduces costs for testing and allows for simulations involving fire and flooding that cannot be performed with physical mock-ups. The present invention represents a unique combination of several virtual reality technologies in the specific application of combat system training. Unlike conventional 3D walkthrough applications, this application provides the user with the ability to actually interact with and control the scene using data fed from an actual combat control system trainer.
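Paragraph [0013] integrates the instructor and student systems over a network, with live tactical feeds overlaid on the virtual panels. A minimal sketch of how such exchanges might be represented as tagged messages; the message schema is invented for illustration and is not specified by the application.

    # Illustrative sketch: tagged messages a distributed trainer might exchange.
    # The schema ("tactical_feed", "take_control", field names) is invented.
    import json

    def make_message(kind, sender, payload):
        return json.dumps({"kind": kind, "sender": sender, "payload": payload})

    def handle_message(raw, panels):
        msg = json.loads(raw)
        if msg["kind"] == "tactical_feed":
            # overlay the incoming feed frame on the named virtual panel
            panels[msg["payload"]["panel"]] = msg["payload"]["frame"]
        elif msg["kind"] == "take_control":
            # instructor takes over the named student's view
            return msg["payload"]["student"]
        return None

    panels = {}
    handle_message(make_message("tactical_feed", "trainer",
                                {"panel": "fire_control", "frame": "contact bearing 045"}),
                   panels)
    print(panels)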

[0015] While it is apparent that the illustrative embodiments of the invention disclosed herein fulfill the objectives of the present invention, it is appreciated that numerous modifications and other embodiments may be devised by those skilled in the art. Additionally, feature(s) and/or element(s) from any embodiment may be used singly or in combination with other embodiment(s). Therefore, it will be understood that the appended claims are intended to cover all such modifications and embodiments, which would come within the spirit and scope of the present invention.


ABSTRACT

The invention as disclosed is a system that uses a combined real and virtual display interaction methodology to generate the visual appearance of submarine combat control rooms and allow interaction with mixed real and virtual control panels for the purposes of training in the operation of submarine combat control systems.

FIG. 1 (flow diagram of the method):

10  MODELING AN ACTUAL SUBMARINE COMBAT CONTROL SYSTEM ENVIRONMENT USING DESIGN DATA FROM AN EXISTING SUBMARINE
20  MODELING CONTROL PANEL DISPLAYS AND INDIVIDUAL CONSOLES
30  INSERTING THE CONTROL PANEL DISPLAYS AND CONSOLES INTO THE MODEL OF THE ACTUAL SUBMARINE COMBAT CONTROL SYSTEM
40  OVERLAYING THE CONTROL PANEL DISPLAYS WITH STREAMS OF LIVE TACTICAL DATA
50  DEVELOPING A SCENE GRAPH THAT PROPERLY DISPLAYS THE MODEL OF THE ACTUAL SUBMARINE COMBAT CONTROL SYSTEM
60  DEVELOPING A TRAINING INSTRUCTOR SYSTEM FOR A TRAINING INSTRUCTOR TO NETWORK WITH STUDENTS AT LOCAL AND REMOTE SITES
70  REGISTERING THE GESTURAL INPUT DEVICE TO DETECT A CHANGE IN TRACKING TO CREATE A REALISTIC THREE DIMENSIONAL MODEL OF THE ACTUAL SUBMARINE COMBAT CONTROL SYSTEM
80  PROVIDING THE TRAINING INSTRUCTOR WITH THE ABILITY TO MONITOR AND MODIFY ALL COMMAND CENTER OPERATIONS
90  DEVELOPING A STUDENT SYSTEM FOR STUDENTS USING A COMBINED REAL AND VIRTUAL METHODOLOGY
100  PROVIDING MULTIPLE CONTROLS TO ALLOW THE STUDENTS TO NAVIGATE A SPACE WITHIN THE MODEL, PERFORM A TASK ON A CONTROL CONSOLE, AND INTERACT WITH OTHER STUDENTS
110  PROVIDING A REAL INTERFACE PANEL MOCK-UP TO ALLOW STUDENTS TO INTERACT WITH THE VIRTUAL CONTROL PANELS BY USING THE REAL INTERFACE PANEL MOCK-UP
120  PROVIDING A MEANS FOR VERBAL COMMUNICATION BETWEEN THE STUDENTS AND THE TRAINING INSTRUCTOR THROUGH VOICE OVER INTERNET PROTOCOL
130  INTEGRATING THE TRAINING INSTRUCTOR SYSTEM AND THE STUDENT SYSTEMS THROUGH A COMMUNICATION NETWORK AND DISTRIBUTED SOFTWARE
140  POPULATING THE REAL INTERFACE DEVICE THAT SERVES AS A MODEL CONTROL PANEL WITH ACTUAL TACTICAL DATA FROM A SIMULATOR AND FROM LIVE FEEDS THROUGH A NEAR REAL TIME CONNECTION
150  DEVELOPING A CURRICULUM SPECIFIC TO THE ACTUAL SUBMARINE COMBAT CONTROL SYSTEM THAT ALLOWS THE RAPID ACQUISITION OF EXPERTISE IN OPERATIONS AND MAINTENANCE OF A SUBMARINE COMBAT CONTROL SYSTEM

FIG. 1
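Read together, the reference numerals of FIG. 1 group naturally into the subsystems described in the detailed description. A brief illustrative sketch of that grouping, for orientation only; the phase names are assumptions, not labels used in the application.

    # Illustrative sketch: FIG. 1's reference numerals grouped by subsystem.
    PHASES = {
        "environment model": (10, 20, 30, 40, 50),
        "instructor system": (60, 70, 80),
        "student system": (90, 100, 110, 120),
        "integration and curriculum": (130, 140, 150),
    }

    for phase, steps in PHASES.items():
        print(phase + ":", ", ".join(str(step) for step in steps))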