Team Breaking Bat Architecture Design Specification. Virtual Slugger


Department of Computer Science and Engineering
The University of Texas at Arlington

Team Breaking Bat
Architecture Design Specification
Virtual Slugger

Team Members:
Sean Gibeault
Brandon Auwaerter
Ehidiamen Ojielu
Roshan Lamichhane
Geoff Graham

Last Updated: 8/25/2014

Table of Contents

REVISION HISTORY
LIST OF TABLES
LIST OF FIGURES
1. INTRODUCTION
   1.1 PRODUCT CONCEPT
   1.2 PRODUCT SCOPE
   1.3 KEY REQUIREMENTS
2. META ARCHITECTURE
   2.1 ARCHITECTURAL VISION
   2.2 GUIDING PRINCIPLES
   2.3 ASSUMPTIONS
   2.4 DESIGN TRADE OFFS
3. ARCHITECTURE OVERVIEW
   3.1 INPUT LAYER
   3.2 PROCESSING LAYER
   3.3 STORAGE LAYER
   3.4 OUTPUT LAYER
4. INPUT LAYER
   4.1 DESCRIPTION
   4.2 PURPOSE
   4.3 FUNCTION
   4.4 DEPENDENCIES
   4.5 DATA
   4.6 SUBSYSTEMS
5. PROCESSING LAYER
   5.1 DESCRIPTION
   5.2 PURPOSE
   5.3 FUNCTION
   5.4 DEPENDENCIES
   5.5 DATA
   5.6 SUBSYSTEMS
6. OUTPUT LAYER
   6.1 DESCRIPTION
   6.2 PURPOSE
   6.3 FUNCTION
   6.4 DEPENDENCIES
   6.5 DATA
   6.6 SUBSYSTEMS
7. STORAGE LAYER
   7.1 DESCRIPTION
   7.2 PURPOSE
   7.3 FUNCTION
   7.4 DEPENDENCIES
   7.5 DATA
   7.6 SUBSYSTEMS
8. REQUIREMENTS MAPPING
9. RELATIONSHIP MAPPING
   9.1 SECTION OVERVIEW
   9.2 DATA FLOW DEFINITION
   9.3 PRODUCER-CONSUMER RELATIONSHIP MATRIX
   9.4 ANALYSIS
10. TESTING CONSIDERATION
   10.1 OVERVIEW
   10.2 INPUT LAYER
   10.3 PROCESSING LAYER
   10.4 STORAGE LAYER
   10.5 OUTPUT LAYER
11. OPERATING SYSTEM DEPENDENCIES
   11.1 INPUT LAYER
   11.2 PROCESSING LAYER
   11.3 STORAGE LAYER
   11.4 OUTPUT LAYER

Document Revision History

Revision 0.1 - 07/28/2014 - First Draft - Initial integration
Revision 1.0 - 08/12/2014 - Gate Review Draft - Peer review and first draft feedback
Revision 2.0 - 08/25/2014 - Baseline - Draft feedback

List of Tables

1-1  Key Requirements
4-1  Oculus Interlayer Interface Orientation
4-2  Oculus Interlayer Interface Tracking Data
4-3  Oculus Public Interface Orientation
4-4  Oculus Public Interface Tracking Data
4-5  X-Box Controller Interlayer Interface Button Event
4-6  X-Box Controller Public Interface Button Listener
4-7  Baseball Bat Interlayer Interface Acceleration
4-8  Baseball Bat Interlayer Interface Orientation
4-9  Baseball Bat Public Interface Orientation
4-10 Baseball Bat Public Interface Orientation
4-11 Kinect Interlayer Interface
4-12 Kinect Public Interface
5-1  Xbox Controller Interlayer Interface
5-2  Baseball Bat Interlayer Interface Tracking Data
5-3  Kinect Interlayer Interface Tracking Data
5-4  Oculus Rift Interlayer Interface Tracking Data
5-5  Menu Controller Interlayer Interface
5-6  Load Settings Interlayer Interface
5-7  Load Settings Public Interface
5-8  Save Settings Public Interface
5-9  Physics Data Interlayer Interface
5-10 Physics Data Interlayer Interface
5-11 Feedback File Public Interface
5-12 Asset Interlayer Interface Acceleration
5-13 Stats Analyzer Interlayer Interface Orientation
5-14 Output Controller Interlayer Interface Acceleration
5-15 Game Engine Interlayer Interface
5-16 Game Engine Public Interface
5-17 Sound Interlayer Interface
5-18 Sound Interlayer Interface
5-19 Graphics Public Interface
5-20 Sound Public Interface
6-1  Graphics Driver Interlayer Interface
6-2  Graphics Driver Interlayer Interface
6-3  Graphics Driver Public Interface
6-4  Sound Driver Interlayer Interface
6-5  Sound Driver Public Interface
7-1  Levels and Models Interlayer Interface
7-2  Settings File Interlayer Interface
7-3  Feedback Interlayer Interface
8-1  Requirements Mapping
9-2  Data Flow

List of Figures

3-1  Detailed Architecture
4-1  Input Layer
5-1  Processing Layer
6-1  Output Layer
7-1  Data Layer
9-1  System Architecture Relationship Mapping
9-2  Producer-Consumer Relationship Matrix

1. Introduction

1.1 Product Concept

The Virtual Slugger is a virtual reality batting simulator that combines cutting-edge technologies such as the Oculus Rift, Kinect, and other sensors to simulate a realistic batting experience inside a virtual world. Users will be able to practice their batting skills by telling the system which type of batter they would like to be, which types of pitches they are working on, whether they bat left-handed or right-handed, and what difficulty level they would like. The system will simulate a batting experience based on the input they specified and will track the user's swing as they try to hit the virtual ball. This data will be analyzed, and the system will provide statistics and advice on how to improve the swing.

The Virtual Slugger is designed to simulate a real game situation that baseball players will encounter. The demo will run on a computer and will launch from the game engine into the main menu. The user will input which type of batter they would like to be, such as a power hitter, a lead-off hitter, or a combination of both. The user will also be able to choose which type of pitches they would like to work on, or choose a random order of pitches. Once the game starts, the batter will face a virtual pitcher until they either hit the ball or strike out. Once there is a break in the game (hit or strikeout), the system will give the user feedback on their swing and advice on how to improve, for example, trying to hit the ball lower or higher depending on which type of hitter they have selected.

The Virtual Slugger's intended users are all levels of baseball players, ranging from amateur to pro. The intended consumer would be aspiring baseball players or professionals that want to practice their swing. Additional consumers would be baseball teams or organizations that would like to try new technology to improve batting averages.

1.2 Product Scope

The purpose of Virtual Slugger is to provide any enthusiastic baseball hitter with an interactive virtual batting practice session and offer tips in order to improve the user's hitting skills. The user will be placed inside a virtual environment where he/she will face off against a virtual pitcher. The pitcher will throw a virtual baseball and the user will try to hit the ball. The user will be swinging a real bat, and our system will keep track of the bat location and bat speed in order to offer feedback to the user. The system will be built with several pieces of hardware, including an Oculus Rift, a Kinect, and bat sensors. The Oculus Rift is a virtual reality headset the user will wear while attempting a hit. The Kinect and the bat sensors will track the user's swing, and the system will analyze that data to offer tips. The users of this system will choose what type of hitter they want to become, such as a power hitter, an on-base percentage hitter, or a mixture of both. After the user makes their selection, the system will track their swing and where they are hitting the baseball and offer unique tips based on the given input.

1.3 Key Requirements

3.1 Switch Hitting Capability: The user will have the option of choosing to bat left-handed or right-handed. This will change the view in the Virtual Slugger system to accommodate both batting styles.

3.2 Setting Configuration: Virtual Slugger should allow the user, before batting, to configure settings such as the height of the user, the speed of pitches, and the type of batter. These configurations will personalize settings to determine the user's strike zone and help give feedback after batting.

3.3 Hit Virtual Baseball: Virtual Slugger should allow the user to see a virtual baseball (pitch) thrown and allow the user to swing a real baseball bat to try to make contact with the virtual baseball.

3.4 Real World Bat: Virtual Slugger should allow the user to use a real baseball bat in order to make the batting experience as realistic as possible.

3.6 Feedback: The Virtual Slugger should provide feedback on what to modify in order to improve the user's batting following a batting session. Different feedback will be delivered based on the batting style chosen, along with statistics from the session. Feedback will be given at the end of the user's at bat and will include, but not be limited to, changes to the batter's stance, the arc of the swing, and head movement.

3.7 Real Time Statistics: The Virtual Slugger should gather real-time statistics during the user's batting session. These statistics will be available for review after each session.

3.8 Learn Fundamentals of Hitting: The user will learn the fundamentals of batting via one of three types of batting style they choose to train for. These three batting styles consist of a bopper, which trains to hit the ball with power; a rabbit, which trains to hit the ball on the ground; and a dirtbag, who hits the ball for line drives. Based on the batting style chosen, a summary of the batting style will be available along with feedback after batting sessions.

3.9 Switch Pitching: The Virtual Slugger should provide the user an option to choose whether the pitcher is left-handed or right-handed.

4.1 Virtual Reality Device: The product will include a virtual reality headset.

4.3 Xbox Controller: An Xbox controller will be provided for users to interact with the menus and select preferences.

4.4 Sensors: Sensors will be provided with the product and used with the bat to monitor its motion.

5.1 Latency: Latency is currently an issue with virtual reality devices. However, the user should experience as little latency as possible to ensure accuracy when hitting pitches.

5.2 Graphics: The user should experience smooth gameplay, animations, and interactions when using the Virtual Slugger.

Table 1-1 Key Requirements

2. Meta Architecture

This section describes the guiding principles and key assumptions that team Breaking Bat will use when developing the Virtual Slugger. It highlights the guiding principles the team followed in making the architectural design, and it addresses the assumptions made during the architecture design.

2.1 Architectural Vision

The Virtual Slugger consists of a complex but intuitive input system that allows the user to seamlessly interact with the game engine. We wanted the user's experience to be as realistic as possible, so our primary input device is a real-world bat. This will be augmented with a virtual reality device, the Oculus Rift, which will immerse the user inside a virtual world. The architectural design for the Virtual Slugger consists of four key layers: the input layer, processing layer, storage layer, and output layer. Input will be received from sensors such as accelerometers, gyroscopes, the Kinect, the Xbox controller, and the Oculus Rift. The processing layer handles the processing of the inputs and renders the scenes in the Virtual Slugger accordingly. The game engine will be located in the processing layer. The storage layer will store files such as levels/models, settings, statistics, and feedback. The output layer will handle sending audio and video output to the appropriate devices.

2.2 Guiding Principles

2.2.1 Reliability

The Virtual Slugger depends on real-time statistics in order to provide proper feedback to the user; therefore, it needs to do so reliably, with minimal delay or latency when using the system. This means the architecture design needs to be as thin as possible to prevent data from being sent through more layers than necessary.

2.2.2 Interoperability

The Virtual Slugger will be portable between computers with various types of hardware and should work independently of the computer hardware. In order to achieve this, the architecture design must avoid communicating with the hardware devices directly, doing so instead through interfaces and controllers. This abstraction will be used to communicate effectively with the drivers for sound and video output, as well as to get input from the Oculus Rift, Xbox controller, and sensors.

2.2.3 Usability

The Virtual Slugger will be intuitive and easy to use. A user who has minimal knowledge of the technology used in the Virtual Slugger should still be able to understand how to use it. In order to achieve this, a thorough yet simple user interface will be created.
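As a minimal illustration of the interface-based abstraction described in 2.2.2, an input device could be hidden behind a small abstract class so the rest of the system never talks to hardware directly. The class and method names below are illustrative assumptions, not part of this specification:

    // Sketch only: a hypothetical device abstraction, not the project's actual API.
    #include <iostream>
    #include <memory>
    #include <string>
    #include <vector>

    // Every physical input device sits behind the same interface, so the
    // processing layer never depends on a specific driver or vendor SDK.
    class InputDevice {
    public:
        virtual ~InputDevice() = default;
        virtual std::string name() const = 0;
        virtual void poll() = 0;   // read the latest hardware state
    };

    // One concrete device; a Kinect, Xbox controller, or bat sensor would
    // follow the same pattern, wrapping its own SDK calls inside poll().
    class OculusDevice : public InputDevice {
    public:
        std::string name() const override { return "Oculus Rift"; }
        void poll() override { /* Oculus SDK calls would go here */ }
    };

    int main() {
        std::vector<std::unique_ptr<InputDevice>> devices;
        devices.push_back(std::make_unique<OculusDevice>());
        for (auto& d : devices) {
            d->poll();
            std::cout << "polled " << d->name() << "\n";
        }
    }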

2.3 Assumptions

2.3.1 Game Engine

Team Breaking Bat will use an existing game engine to assist in the development of the Virtual Slugger. The choice has been narrowed down to Unreal Engine 4, but this is subject to change.

2.3.2 Device Drivers

Drivers for devices such as graphics and sound cards will already be available for use. The Virtual Slugger will only need to interface with these drivers.

2.3.3 Operating System

The primary operating system for the Virtual Slugger will be Windows 8.

2.3.4 3D and Sound Assets

The game engine will be able to accept the FBX file format, which can be converted from other formats regardless of the editor used to create them.

2.4 Design Trade Offs

A highly modular approach was taken in the architectural design of the Virtual Slugger. This ensures components are largely independent while still able to communicate with each other, which enables efficient data flow and accurate processing. Local data storage was chosen over remote data storage to eliminate issues such as connectivity and security; local storage also gives us the advantage of providing feedback as quickly as possible. We have also chosen Unreal Engine 4 (UE4) over Unity 4 because UE4 offers a rich set of tools that will decrease development time.

3. Architecture Overview

The architecture of the Virtual Slugger consists of four layers: the Input Layer, Processing Layer, Storage Layer, and Output Layer. The organization and relationships of these four layers are shown in Figure 3-1 below.

3.1 Input Layer

The input layer gathers data that comes from the Oculus Rift, Kinect, Xbox controller, accelerometer, and gyroscope. The data collected in the Input Layer is transferred to the Processing Layer for further processing. The Oculus SDK gathers information on head movement, the Xbox controller works as the button listener, the accelerometer gathers information on the speed of the bat swing, and the gyroscope gathers information on the orientation of the bat.

3.2 Processing Layer

The processing layer is responsible for processing the information gathered through the Oculus Rift, Kinect, Xbox controller, accelerometer, and gyroscope. The processing layer controls the game engine. It consists of the Input Controller, Output Controller, and Game Engine. It also includes the Asset Loader, Settings Controller, and Statistics Analyzer.

3.3 Storage Layer

The storage layer is responsible for storing all of the data needed for the game. It consists of the Levels and Models files, Settings files, and Feedback files. The processing layer will be able to connect to the Storage Layer to load level assets, player models, and settings, and to query feedback for the user.

3.4 Output Layer

The output layer is responsible for providing information to the user in a reasonable manner. It consists of the Graphics Driver and Sound Driver. The Graphics Driver provides the display that is loaded from the processing layer, and the Sound Driver provides sound to the system.

3.5 Detailed Architecture

Figure 3-1 Detailed Architecture
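The four-layer organization in Figure 3-1 can be summarized as a simple pipeline: the Input and Storage Layers both feed the Processing Layer, whose result is consumed by the Output Layer. The sketch below is only a schematic view under assumed type and function names, not the actual engine code:

    #include <iostream>
    #include <string>

    // Hypothetical stand-ins for the data passed between layers.
    struct RawInput   { std::string source; };   // produced by the Input Layer
    struct StoredData { std::string assets; };   // produced by the Storage Layer
    struct GameState  { std::string summary; };  // produced by the Processing Layer

    RawInput   readInputLayer()    { return {"bat + oculus + kinect + controller"}; }
    StoredData loadStorageLayer()  { return {"levels, models, settings"}; }

    GameState process(const RawInput& in, const StoredData& stored) {
        return {in.source + " rendered with " + stored.assets};
    }

    void outputLayer(const GameState& state) { std::cout << state.summary << "\n"; }

    int main() {
        // Input and Storage both feed Processing; Output consumes its result.
        outputLayer(process(readInputLayer(), loadStorageLayer()));
    }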

4. Input Layer

4.1 Description

Figure 4-1 Input Layer

The Input Layer provides a platform to gather input from the user through the Oculus Rift, the Xbox controller, the Kinect, and the bat sensors, and to transport this input to the Input Controller located in the Processing Layer.

4.2 Purpose

The purpose of the Input Layer is to capture head-motion events from the Oculus Rift, button events from the Xbox controller, image-processing data from the Kinect, and accelerometer/gyroscope data from the bat sensors, and then transport this data to the Input Controller in the Processing Layer.

4.3 Function

The Input Layer is solely responsible for retrieving input from the user and providing the hardware interface between the user and the Virtual Slugger system. Each input device provides the system with a different kind of data, and this layer transports that data from the devices to the system.

4.4 Dependencies

The Input Layer is dependent on the input devices: the Oculus Rift, the Xbox controller, the Kinect, and the baseball bat sensors. It is also dependent on the user's input, but it is not dependent on any other layer.

4.5 Data

The data in this layer is divided up among the different input components. The Oculus will provide gyroscopic event data to track the rotation of the viewer's head, as well as some image-processing data to track the head's spatial movement. The X-Box controller will take basic button-press events and will be used mainly for navigating the system's menus. The baseball bat sensors (accelerometer/gyroscope) will constantly stream accelerometer and gyroscope data to the system to track and model the bat in the virtual environment. Finally, the Kinect will send three-dimensional images to the system to gather extra data about the user's swing.

4.6 Subsystems

4.6.1 Oculus

4.6.1.1 Description

This subsystem is responsible for acquiring any user input from the Oculus Rift. These inputs are rotational angles about the x, y, and z axes, which are generated from head-motion events.

4.6.1.2 Function

The Oculus SDK takes the inputs and translates them to a format readable by the Input Controller in the Processing Layer.

4.6.1.3 Interlayer Interfaces

sendorientation: Sends the orientation of the Oculus to the Input Controller. Data: Orientation. (Table 4-1 Oculus Interlayer Interface Orientation)

sendtrackingdata: Sends the position of the Oculus to the Input Controller. Data: Oculus position. (Table 4-2 Oculus Interlayer Interface Tracking Data)

4.6.1.4 Public Interfaces

getorientation: Gets the orientation of the Oculus. Data: Orientation. (Table 4-3 Oculus Public Interface Orientation)

gettrackingdata: Gets the position of the Oculus. Data: Oculus position. (Table 4-4 Oculus Public Interface Tracking Data)

4.6.2 X-Box Controller

4.6.2.1 Description

This subsystem is responsible for retrieving user input from the Xbox controller. These inputs are button-press events which will be used to navigate the menu system.

4.6.2.2 Function

The X-Box controller will primarily be used to allow the user to navigate through the menu system easily.

4.6.2.3 Interlayer Interface

buttonevent: Sends a button event to the Input Controller. Data: Button pressed. (Table 4-5 X-Box Controller Interlayer Interface Button Event)

4.6.2.4 Public Interfaces

buttonlistener: Listens for a buttonevent to be triggered. Data: Button pressed. (Table 4-6 X-Box Controller Public Interface Button Listener)

4.6.3 Baseball Bat Sensors

4.6.3.1 Description

This subsystem is responsible for retrieving the data from the baseball bat and modeling it in virtual reality.

4.6.3.2 Function

The baseball bat will primarily be used to allow the user to swing at the pitch while the system tracks the swing data. This will be the user's primary source of interaction with the system.

4.6.3.3 Interlayer Interface

sendacceleration: Sends accelerometer data to the system. Data: Accelerometer data. (Table 4-7 Baseball Bat Interlayer Interface Acceleration)

sendbatorientation: Sends orientation data to the system. Data: Orientation of the bat. (Table 4-8 Baseball Bat Interlayer Interface Orientation)

4.6.3.4 Public Interfaces

getacceleration: Receives accelerometer data. Data: Accelerometer data. (Table 4-9 Baseball Bat Public Interface Orientation)

getbatorientation: Gets the orientation of the bat. Data: Orientation of the bat. (Table 4-10 Baseball Bat Public Interface Orientation)

4.6.4 Kinect

4.6.4.1 Description

This subsystem is responsible for retrieving the image-processing data from the Kinect.

4.6.4.2 Function

The Kinect will be used to track the arc and position of the user as they go through the motion of swinging the bat. This data will be used to provide additional information on the user's swing.

4.6.4.3 Interlayer Interface

sendkinectdata: Sends the data from the Kinect. Data: Kinect data. (Table 4-11 Kinect Interlayer Interface)

4.6.4.4 Public Interfaces

getkinectdata: Gets the data from the Kinect. Data: Kinect data. (Table 4-12 Kinect Public Interface)
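To make the kinds of data listed in 4.5 concrete, the sketch below shows hypothetical structures for one bat-sensor sample and one Oculus orientation reading. The field names and units are assumptions for illustration, not the actual sensor protocol:

    #include <cstdio>

    // Assumed raw sample from the bat-mounted accelerometer/gyroscope.
    struct BatSample {
        float ax, ay, az;      // acceleration, m/s^2
        float gx, gy, gz;      // angular velocity, deg/s
        double timestamp;      // seconds since the session started
    };

    // Assumed head orientation from the Oculus SDK (rotation about x, y, z).
    struct HeadOrientation {
        float pitch, yaw, roll;   // degrees
    };

    // The Input Layer only packages samples and hands them to the Input Controller.
    void sendToInputController(const BatSample& s, const HeadOrientation& h) {
        std::printf("bat accel=(%.2f, %.2f, %.2f) head yaw=%.1f\n", s.ax, s.ay, s.az, h.yaw);
    }

    int main() {
        BatSample sample{0.1f, 9.8f, 0.2f, 5.0f, 120.0f, 2.0f, 0.016};
        HeadOrientation head{2.0f, 45.0f, 0.0f};
        sendToInputController(sample, head);
    }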

5. Processing Layer

5.1 Description

Figure 5-1 Processing Layer

The Processing Layer receives data from the Input Layer and formats it so the Output Layer can interpret the data. This layer also communicates with the Storage Layer in order to load saved files, settings, and configurations. The Processing Layer is mostly comprised of the game engine, which takes the inputs listed in section 4.6 and renders them in the game. The subsystems of the Processing Layer are the Input Controller, Game Engine, Asset Loader, Settings Controller, Stats Analyzer, and Output Controller.

5.2 Purpose

The purpose of the Processing Layer is to handle all the input data, process that data, and then send it to the Output Layer. The Processing Layer also retrieves and saves files, configurations, and settings located in the Storage Layer.

5.3 Function

The Processing Layer has two functions. The first function is to retrieve all the input data, process that data, and send it to the Output Layer in real time. This will be performed by the many subsystems of the Processing Layer. The second function is to retrieve and save data to the Storage Layer.

5.4 Dependencies

The Processing Layer is dependent on the Input Layer and the Storage Layer. It needs the data being sent from the Input Layer in order for the user to have an interactive gameplay session. The Processing Layer also needs the Storage Layer to load the game environment, player models, and user-saved settings and configurations.

5.5 Data

The data in this layer consists of all the data received from the Input Layer and the Storage Layer. This data is broken up into two parts: real-time data and stored data. The real-time data comes from the Input Layer and has priority over the stored data; the Processing Layer needs to handle it in real time in order to give a good user experience. The stored data consists of the data received from the Storage Layer: the player models, levels, and user settings.

5.6 Subsystems

5.6.1 Input Controller

5.6.1.1 Description

This subsystem is responsible for acquiring all the real-time data delivered from the Input Layer. This subsystem will then process the data and relay it to either the Settings Controller or the Game Engine.

5.6.1.2 Function

The Input Controller takes real-time data and sends that data to the correct subsystem.

5.6.1.3 Interlayer Interfaces

getcontrollerinput: Receives Xbox controller input. Data: Xbox controller input. sendcontrollerinput: Sends controller data to the Settings Controller. Data: Controller data. (Table 5-1 Xbox Controller Interlayer Interface)

getbatdata: Receives accelerometer and gyroscope data. Data: Accelerometer and gyroscope data. sendbatdata: Sends accelerometer and gyroscope data to the Game Engine. Data: Baseball bat position and speed. (Table 5-2 Baseball Bat Interlayer Interface Tracking Data)

getkinectdata: Receives Kinect data. Data: Kinect data. sendkinectdata: Sends Kinect data to the Game Engine. Data: Image data. (Table 5-3 Kinect Interlayer Interface Tracking Data)

getoculusriftdata: Receives Oculus Rift data. Data: Oculus Rift data. sendoculusriftdata: Sends coordinates to the Game Engine. Data: Coordinates. (Table 5-4 Oculus Rift Interlayer Interface Tracking Data)

getmenuselection: Receives the controller input. Data: Controller action. (Table 5-5 Menu Controller Interlayer Interface)

5.6.1.4 Public Interfaces

None.

5.6.2 Settings Controller

5.6.2.1 Description

This subsystem is responsible for retrieving and saving user settings.

5.6.2.2 Function

The load-settings module will receive the user's selection from the menu controller and then request the settings file from the Storage Layer.

5.6.2.3 Interlayer Interface

getmenuselection: Gets the menu selection from the Input Controller. Data: User selection. (Table 5-6 Load Settings Interlayer Interface)

5.6.2.4 Public Interfaces

getsettingsfile: Retrieves the settings file. Data: File. (Table 5-7 Load Settings Public Interface)

savesettingsfile: Saves the settings file. Data: File. (Table 5-8 Save Settings Public Interface)

5.6.3 Statistics Analyzer

5.6.3.1 Description

This subsystem is responsible for analyzing the swing data from the Game Engine subsystem and outputting the results to the Game Engine and the feedback file.

5.6.3.2 Function

The physics engine will send the swing data from the bat to the Statistics Analyzer. The Statistics Analyzer will then determine what feedback needs to be given to the user. It will save this feedback in the feedback file and output the tips to the Game Engine so the user will be able to see them.

5.6.3.3 Interlayer Interface

getphysicsdata: Gets the physics data on the swing. Data: Swing data. (Table 5-9 Physics Data Interlayer Interface)

sendgraphicstip: Sends the tip to the graphics subsystem. Data: Swing feedback. (Table 5-10 Physics Data Interlayer Interface)

5.6.3.4 Public Interfaces

savefeedback: Saves the feedback to the feedback file. Data: Swing feedback. (Table 5-11 Feedback File Public Interface)

5.6.4 Game Engine

5.6.4.1 Description

This subsystem is the main component in the Processing Layer. It is responsible for running the game environment and for receiving and processing all the input from our peripherals. It will also output the sound and graphics to the Output Controller.

5.6.4.2 Function

The physics engine will model real-world physics in the game environment. This data will then be sent to the graphics and sound subsystems.

5.6.4.3 Interlayer Interface

getassets: Gets all the models and levels from the Asset Loader. Data: Models and levels. (Table 5-12 Asset Interlayer Interface Acceleration)

getstats: Gets the stats from the Stats Analyzer. Data: User stats. sendstats: Sends the data that the Stats Analyzer will use to analyze the swing. Data: Accelerometer and gyroscope data. (Table 5-13 Stats Analyzer Interlayer Interface Orientation)

sendgraphicsandsound: Sends graphics and sound data to the Output Controller. Data: Graphics and sound data. (Table 5-14 Output Controller Interlayer Interface Acceleration)

5.6.4.4 Public Interfaces

None.

5.6.5 Asset Loader

5.6.5.1 Description

This subsystem is responsible for retrieving all the models and levels that the Game Engine will need to render.

5.6.5.2 Function

The Asset Loader will communicate with the Storage Layer to retrieve the models and levels that are located on the local drive of the computer running the game. It will fetch this data and send it to the Game Engine.

5.6.5.3 Interlayer Interface

sendgameenginedata: Sends models and levels to the Game Engine. Data: All assets the Game Engine needs. (Table 5-15 Game Engine Interlayer Interface)

5.6.5.4 Public Interfaces

getgameenginedata: Gets models and levels for the Game Engine. Data: Models and levels. (Table 5-16 Game Engine Public Interface)
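A minimal sketch of the Asset Loader's role follows. The FBX assumption comes from 2.3.4; the file path, record type, and function name are illustrative only:

    #include <fstream>
    #include <iostream>
    #include <iterator>
    #include <string>
    #include <vector>

    // Hypothetical asset record handed to the Game Engine.
    struct Asset {
        std::string path;
        std::vector<char> bytes;   // raw FBX or level data
    };

    // getgameenginedata (Table 5-16): fetch a model or level from local storage.
    bool loadAsset(const std::string& path, Asset& out) {
        std::ifstream file(path, std::ios::binary);
        if (!file) return false;                       // asset missing on disk
        out.path = path;
        out.bytes.assign(std::istreambuf_iterator<char>(file), {});
        return true;
    }

    int main() {
        Asset stadium;
        if (loadAsset("assets/stadium.fbx", stadium))  // illustrative path
            std::cout << "loaded " << stadium.path << " (" << stadium.bytes.size() << " bytes)\n";
        else
            std::cout << "asset not found; the game engine would report an error\n";
    }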

5.6.6 Output Controller

5.6.6.1 Description

This subsystem is responsible for retrieving the graphics and sound data from the Game Engine and sending it to the Output Layer.

5.6.6.2 Function

The Output Controller is the only component in the Processing Layer that communicates with the Output Layer. Its function is to take the sound and graphics data and route it to the sound and graphics drivers, respectively.

5.6.6.3 Interlayer Interface

getsound: Gets the sound data from the Game Engine. Data: Sound data. (Table 5-17 Sound Interlayer Interface)

getgraphics: Gets the graphics data from the Game Engine. Data: Graphics. (Table 5-18 Sound Interlayer Interface)

5.6.6.4 Public Interfaces

sendgraphics: Interaction with the Output Controller to get graphics data. Data: Graphics data. (Table 5-19 Graphics Public Interface)

sendsound: Interaction with the Output Controller to get sound data. Data: Sound data. (Table 5-20 Sound Public Interface)
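The Statistics Analyzer described in 5.6.3 turns raw swing data into a tip for the user. As a rough illustration only, the peak-speed heuristic, thresholds, and type names below are assumptions, not the specified algorithm:

    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    // Assumed per-frame swing sample delivered by the Game Engine (getphysicsdata).
    struct SwingSample { float speed; float launchAngleDeg; };

    // savefeedback / sendgraphicstip: derive one human-readable tip from a swing.
    std::string analyzeSwing(const std::vector<SwingSample>& swing, bool powerHitter) {
        if (swing.empty()) return "No swing detected.";
        float peakSpeed = 0.0f;
        for (const auto& s : swing) peakSpeed = std::max(peakSpeed, s.speed);
        float contactAngle = swing.back().launchAngleDeg;

        // Illustrative thresholds only.
        if (peakSpeed < 25.0f)                    return "Swing faster through the zone.";
        if (powerHitter && contactAngle < 10.0f)  return "Try to hit the ball higher for power.";
        if (!powerHitter && contactAngle > 20.0f) return "Try to hit the ball lower, on a line.";
        return "Good swing; keep your mechanics consistent.";
    }

    int main() {
        std::vector<SwingSample> swing{{10.0f, 5.0f}, {22.0f, 8.0f}, {31.0f, 6.0f}};
        std::cout << analyzeSwing(swing, /*powerHitter=*/true) << "\n";
    }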

6. Output Layer

6.1 Description

Figure 6-1 Output Layer

The Output Layer provides the processed input to the user. This is done using the Oculus Rift, the VR headset that provides video feedback; speakers that provide sound feedback; and the display of the user's PC, which will be a replication of what is shown in the VR headset.

6.2 Purpose

The purpose of the Output Layer is to provide processed information to the user in a reasonable fashion using audio and video.

6.3 Function

The Output Layer is responsible for sending processed input back to the user. This will be done using the Oculus Rift, the PC display, and speakers.

6.4 Dependencies

The Output Layer is dependent on output devices such as the Oculus Rift, speakers, and the PC display. The Processing Layer will provide the graphics and sound data.

6.5 Data

The data in this layer can be broken into two categories: audio and video. Video contains information on how the user's input affects the virtual environment. Audio contains information on the sound produced by all actions in the virtual environment.

6.6 Subsystems

6.6.1 Graphics Driver

6.6.1.1 Description

This subsystem is responsible for displaying the processed input to the user through the Oculus Rift and replicating it on the user's PC display.

6.6.1.2 Function

This subsystem will interact with the Oculus Rift and the PC display.

6.6.1.3 Interlayer Interfaces

displayonheadset: Sends the appropriate graphical information required for streaming video on the VR headset. Requires: Resolution. Returns: Video stream. (Table 6-1 Graphics Driver Interlayer Interface)

displayonmonitor: Sends the appropriate graphical information required for streaming video on a display monitor. Requires: Resolution. Returns: Video stream. (Table 6-2 Graphics Driver Interlayer Interface)

6.6.1.4 Public Interfaces

getdisplay: Shows the video stream to the end user. Data: Video. (Table 6-3 Graphics Driver Public Interface)

6.6.2 Sound Driver

6.6.2.1 Description

This subsystem is responsible for providing sound output to the user.

6.6.2.2 Function

The Sound Driver interacts with the speakers to provide the sound of actions occurring in the virtual environment.

6.6.2.3 Interlayer Interface

getsound: Sends sound data to the Sound Driver. Requires: Quality. Returns: Sound. (Table 6-4 Sound Driver Interlayer Interface)

6.6.2.4 Public Interfaces

putsound: Sends sound to the user through the speakers. Data: Sound. (Table 6-5 Sound Driver Public Interface)
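A sketch of the output routing described in this section follows. The frame and buffer types and the function bodies are placeholders, since the real work is done by the Oculus and audio drivers:

    #include <iostream>
    #include <vector>

    struct Frame { int width, height; };             // placeholder rendered image
    struct SoundBuffer { std::vector<short> pcm; };  // placeholder audio samples

    // displayonheadset / displayonmonitor (Tables 6-1 and 6-2): the same frame
    // is sent to the Rift and mirrored on the PC display.
    void displayOnHeadset(const Frame& f) { std::cout << "headset " << f.width << "x" << f.height << "\n"; }
    void displayOnMonitor(const Frame& f) { std::cout << "monitor " << f.width << "x" << f.height << "\n"; }

    // putsound (Table 6-5): hand the mixed audio to the speakers.
    void putSound(const SoundBuffer& b) { std::cout << "played " << b.pcm.size() << " samples\n"; }

    int main() {
        Frame frame{1920, 1080};
        SoundBuffer crack{std::vector<short>(4410, 0)};   // 0.1 s of silence at 44.1 kHz
        displayOnHeadset(frame);
        displayOnMonitor(frame);
        putSound(crack);
    }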

7. Storage Layer

7.1 Description

Figure 7-1 Storage Layer

The Storage Layer stores the settings, levels, models, and feedback that may be provided to a user.

7.2 Purpose

The Storage Layer manages all relevant data on the hard drive. This includes methods to write and read files.

7.3 Function

When the user saves their settings, a file is created on the hard drive that allows for retrieval when the user wants to resume. When the user exits the virtual reality simulation and returns to the main menu, the Storage Layer is responsible for saving game settings.

7.4 Dependencies

The Storage Layer depends on the Processing Layer for sending and receiving data.

7.5 Data

The data processed will be the statistics of the user's batting session and any feedback based on these statistics. The Processing Layer will also receive data from the Levels and Models, Settings, and Feedback files.

7.6 Subsystems

7.6.1 Levels and Models

7.6.1.1 Description

The Levels and Models subsystem stores the levels and models that are used by the Asset Loader.

7.6.1.2 Function

The Asset Loader receives the levels and models needed to run the simulation.

7.6.1.3 Interlayer Interface(s)

getlevelsmodels: Gets the levels and models needed for the simulation. Data: Levels and models. (Table 7-1 Levels and Models Interlayer Interface)

7.6.2 Settings File

7.6.2.1 Description

The Settings File subsystem stores the settings generated from the user input.

7.6.2.2 Function

The Settings Controller receives the settings created by the user input. These settings are loaded into the Settings File subsystem.

7.6.2.3 Interlayer Interface(s)

getsettings: Gets the settings from the saved file. Returns: Configuration based on the settings. storesettings: Stores settings input by the user. Requires: User input. Returns: None. (Table 7-2 Settings File Interlayer Interface)
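One plausible, but not specified, on-disk form for the Settings File is a plain key=value text file. The sketch below shows getsettings and storesettings against that assumed format; the file name and keys are illustrative:

    #include <fstream>
    #include <iostream>
    #include <map>
    #include <string>

    using Settings = std::map<std::string, std::string>;

    // storesettings: write every key=value pair, one per line.
    bool storeSettings(const std::string& path, const Settings& s) {
        std::ofstream out(path);
        if (!out) return false;
        for (const auto& kv : s) out << kv.first << "=" << kv.second << "\n";
        return true;
    }

    // getsettings: read the same format back into a map.
    Settings getSettings(const std::string& path) {
        Settings s;
        std::ifstream in(path);
        std::string line;
        while (std::getline(in, line)) {
            auto eq = line.find('=');
            if (eq != std::string::npos) s[line.substr(0, eq)] = line.substr(eq + 1);
        }
        return s;
    }

    int main() {
        storeSettings("settings.txt", {{"handedness", "left"}, {"pitch_speed", "75"}});
        std::cout << "pitch_speed=" << getSettings("settings.txt")["pitch_speed"] << "\n";
    }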

7.6.3 Feedback File

7.6.3.1 Description

The Feedback File subsystem stores any feedback generated based on the user's statistics gathered in the batting session.

7.6.3.2 Function

The Statistics Analyzer receives the feedback stored in the Feedback File based on the data gathered.

7.6.3.3 Interlayer Interface(s)

getfeedback: Gets the feedback generated based on statistics. Requires: Statistics from the batting session. Returns: Feedback to improve the batter's performance. (Table 7-3 Feedback Interlayer Interface)

8. Requirements Mapping

Number  Requirement  Input Layer  Processing Layer  Storage Layer  Output Layer
3.1  Switch Hitting Capability  X X
3.2  Setting Configuration
3.3  Hit Virtual Baseball  X X X
3.4  Real World Bat  X X
3.5  Virtual Reality Batting Environment  X X X
3.6  Feedback  X X X
3.7  Real Time Statistics
3.8  Learn Fundamentals of Hitting  X X X X X X X
3.9  Switch Pitching  X X
3.10  Mode Selection  X X
4.1  Virtual Reality Device  X X
4.3  Xbox Controller  X
4.4  Sensors  X
4.5  Software  X
4.6  Computer  X
5.1  Latency  X X X X
5.2  Graphics  X X
6.1  Simulator Sickness  X
8.1  American English Standard  X
8.2  User Friendly Interface  X

Table 8-1 Requirements Mapping

9. Relationship Mapping

9.1 Section Overview

Figure 9-1 System Architecture Relationship Mapping

This section goes into detail on the data flows throughout the architectural design. Sections in yellow represent the Input Layer, sections in green represent the Processing Layer, sections in blue represent the Storage Layer, and sections in gray represent the Output Layer.

9.2 Data Flow Definition

Data Element Id: Description
I1: Xbox controller input goes into the computer.
I2: The Kinect processes the image and sends data to the computer.
I3: The accelerometer sensor on the bat sends data to the computer.
I4: The gyroscope sensor on the bat sends data to the computer.
I5: Oculus head-tracking sensor data is sent to the computer.
P1: Input data from the Kinect, accelerometer, gyroscope, and Oculus is sent to the Game Engine.
P2: Input from the Xbox controller is sent to the Settings Controller.
P3: The Game Engine sends requests to the Asset Loader, which sends back the correct levels/models.
P4: The Game Engine can edit some of the settings file from the pause menu and also pulls user settings from the Settings Controller.
P5: The Settings Controller tells the Stats Analyzer the user-configured settings, such as what hitting style the user is trying to improve at.
P6: The Game Engine sends statistics on the user's swing to the Statistics Analyzer, which returns feedback data to be displayed.
P7: The Asset Loader sends requests for levels and models to the Levels/Models storage.
P8: The Settings Controller interacts with the Settings file in the Storage Layer.
P9: The Statistics Analyzer queries the Feedback file based on user swing data.
P10: The Output Controller sends graphics data to the Graphics Driver.
P11: The Output Controller sends sound data to the Sound Driver.
S1: Level and model files are sent back to the Asset Loader.
S2: Settings data are saved and updated and sent back to the Settings Controller.
S3: The Feedback storage sends its data to the Stats Analyzer based on the user's swing.
O1: The Graphics Driver renders the image on the monitor.
O2: The Graphics Driver renders the image on the Oculus.
O3: The Sound Driver outputs the audio to the speakers.

Table 9-2 Data Flow

9.3 Producer-Consumer Relationship Matrix

Figure 9-2 Producer-Consumer Relationship Matrix

9.4 Analysis

From analyzing our Producer-Consumer Relationship Matrix, we can see that our most complex layer is the Processing Layer. This layer is the heart of our system and connects all the other layers. We have 14 producers for this layer; 9 of those are consumed within the layer, and 5 more flows are consumed by this layer from the other layers. Our second most complex module is the Input Controller, because we have 5 different types of user input coming from the Input Layer; this module consumes all 5 inputs and produces 1 output that it passes on within the Processing Layer. The Storage Layer communicates with the Processing Layer, receiving file requests and returning data to the appropriate processing modules. The Output Layer finalizes and prepares the sound and graphics data to be displayed on the monitor, the Oculus, and the computer speakers.
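The counts quoted in 9.4 can be reproduced mechanically from the data-flow list in 9.2 by treating each flow as a producer-layer/consumer-layer pair. The sketch below illustrates the idea with a few of the flows; the pairs shown are examples, not the full matrix:

    #include <iostream>
    #include <map>
    #include <string>
    #include <vector>

    struct Flow { std::string id, producer, consumer; };

    int main() {
        // A handful of the flows from Table 9-2, expressed as producer -> consumer.
        std::vector<Flow> flows{
            {"I1",  "Input",      "Processing"},
            {"P1",  "Processing", "Processing"},
            {"P10", "Processing", "Output"},
            {"S1",  "Storage",    "Processing"},
            {"O1",  "Output",     "Output"},
        };

        std::map<std::string, int> produced, consumed;
        for (const auto& f : flows) { ++produced[f.producer]; ++consumed[f.consumer]; }

        for (const auto& kv : produced)
            std::cout << kv.first << ": produces " << kv.second
                      << ", consumes " << consumed[kv.first] << "\n";
    }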

10. Testing Consideration

10.1 Overview

This section is dedicated to ensuring that each layer will be validated as established in the overall architecture. Each layer will be checked to ensure that it is real-time responsive, portable, reliable, and intuitive.

10.2 Input Layer

The Input Layer is responsible for managing all input required by the system. The Input Layer is also the layer in which data begins to flow throughout the overall architecture. Validating the Input Layer requires that every device or subsystem of the layer is properly tested to ensure the flow of information begins at these devices/subsystems.

10.3 Processing Layer

The Processing Layer controls the flow of information into and out of the layer. The systems inside this layer will need to be properly tested in order to ensure validation. First, the Input Controller will need to be tested to verify that it properly receives data from the Input Layer. Next, the Output Controller will need to be configured properly in order to send data to the Output Layer. Lastly, the Asset Loader, Settings Controller, and Stats Analyzer need to properly send data to and retrieve data from the Storage Layer.

10.4 Storage Layer

In order to validate the Storage Layer, the Levels/Models, Settings File, and Feedback subsystems must be tested. Reliability is an important part of each subsystem, and further testing will be performed in order to ensure that the Storage Layer can hold the data needed to create the environment as well as store proper feedback for the user.

10.5 Output Layer

The Output Layer will be validated through the Graphics Driver and Sound Driver subsystems. Real-time responsiveness, portability, and reliability will be checked in this layer.
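As an example of the layer-by-layer validation described above, input-layer components could be exercised with synthetic data before real hardware is attached. The harness below is a sketch under assumed type names, not part of the test plan:

    #include <cassert>
    #include <iostream>
    #include <vector>

    // Assumed minimal stand-ins for the real subsystems under test.
    struct BatSample { float ax, ay, az; };

    // A fake Input Controller that simply records what it receives.
    struct RecordingInputController {
        std::vector<BatSample> received;
        void getBatData(const BatSample& s) { received.push_back(s); }
    };

    int main() {
        RecordingInputController controller;

        // Feed a short synthetic swing instead of live sensor data.
        for (int i = 0; i < 10; ++i)
            controller.getBatData({0.0f, 9.8f + i, 0.0f});

        // Per 10.2, the flow of information begins at the devices/subsystems:
        // every synthetic sample must arrive at the Input Controller.
        assert(controller.received.size() == 10);
        std::cout << "input layer test passed\n";
    }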

11. Operating System Dependencies

11.1 Input Layer

The Input Layer will receive input from the Oculus Rift DK2 and the Xbox controller. The Oculus Rift is supported on multiple operating systems, and the Xbox controller is a plug-and-play device on Windows. The Virtual Slugger will depend on this integration provided by the Windows operating system.

11.2 Processing Layer

The Processing Layer will rely on the Unreal game engine, which is supported across multiple platforms. There will be no dependency on any particular operating system.

11.3 Storage Layer

The data for the game will be stored on a hard disk, and access to those files will be abstracted by the Unreal game engine. There will be no dependency on any particular operating system.

11.4 Output Layer

The Oculus Rift and audio output will also be handled by the Unreal game engine. There will be no dependency on any particular operating system.