Mobile SDK Version 1.7


Copyrights and Trademarks

© 2017 Oculus VR, LLC. All Rights Reserved.

OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their respective owners. Certain materials included in this publication are reprinted with the permission of the copyright holder.

Contents

Mobile SDK Getting Started Guide
  Mobile Development with Unity and Unreal
  System and Hardware Requirements
  Device Setup
  Android Development Software Setup for Windows
    Java Development Kit (JDK)
    Android Studio Installation
    Installing Additional Packages and Tools
    Android Native Development Kit (NDK)
    Setting up your System to Detect your Android Device
  Android Development Software Setup for Mac OS X
    Xcode
    Java Development Kit (JDK)
    Android Studio Installation
    Installing Additional Packages and Tools
    Android Native Development Kit (NDK)
Mobile Development Basics
  Oculus Signature File (osig) and Application Signing
  Developer Mode: Running Apps Outside of the Gear VR Headset
  Android Studio Basics
Native Development Overview
  Native Source Code
  Native Samples
  Android Manifest Settings
  Universal Menu and Reserved User Interactions
    Reserved User Interactions
    Universal Menu
Native Engine Integration
  VrApi
    Lifecycle and Rendering
    Frame Timing
    Latency Examples
  VrApi Input API
  Asynchronous TimeWarp (ATW)
    TimeWarp Minimum Vsyncs
    Consequences of not rendering at 60 FPS
  Power Management
    Fixed Clock Level API
    Power Management and Performance
    Power State Notification and Mitigation Strategy
  Advanced Rendering
    Multi-View
Native Application Framework
  Creating New Apps with the Framework Template
  UI and Input Handling
  Native SoundEffectContext
  Runtime Threads
Other Native Libraries
Media and Assets
  Mobile VR Media Overview
    Panoramic Stills
    Panoramic Videos
    Movies on Screens
    Movie Meta-data
    Oculus 360 Photos and Videos Meta-data
    Media Locations
    Native VR Media Applications
  Models
    Oculus Cinema Theater Creation
    FBX Converter
Mobile Best Practices
  Rendering Guidelines
    Mobile VR Performance
    Frame Rate
    Scenes
    Resolution
  User Interface Guidelines
    Stereoscopic UI Rendering
    The Infinity Problem
    Depth In-Depth
    Gazing Into Virtual Reality
  Adreno Hardware Profile
Testing and Troubleshooting
  Tools and Procedures
    Android System Properties
    Screenshot and Video Capture
    Oculus Remote Monitor
      Setup
      Basic Usage
      Using Oculus Remote Monitor to Identify Common Issues
    OVR Metrics Tool
  Android Debugging
    Adb
    Logcat
  Application Performance Analysis
    Basic Performance Stats through Logcat
    SysTrace
    NDK Profiler
    Snapdragon Profiler
  Native Debugging
    Native Debugging with Android Studio
    Native Debugging with ndk-gdb
Mobile Native SDK Migration Guide
  Migrating to Mobile SDK (version-specific migration guides)
Release Notes
  Mobile SDK Release Notes (version-specific entries)
  System Activities/VrApi Release Notes (version-specific entries)
  System Driver Release Notes (version-specific entries)
  Oculus Remote Monitor Release Notes (version-specific entries)
Mobile SDK Documentation Archive

Mobile SDK Getting Started Guide

The Oculus Mobile SDK includes libraries, tools, and resources for native development for Oculus Go and Samsung Gear VR.

SDK Contents

- VrApi for third-party engine integration (not required for Unity or Unreal).
- Native application framework for building high-performance VR applications from scratch.
- Additional libraries providing support for GUI, locale, and other functionality.
- Native project sample applications and source to provide a reference model for creating your own VR applications.
- Tools and resources to assist with native development.

Mobile SDK Intro Documentation

- Getting Started Guide: A one-time guide to environment setup.
- Mobile Development Basics: Information every developer should know about Oculus mobile development. Every developer should read through this guide.

Native Developers

Most of the Mobile SDK guide is written for native developers. Complete the setup described in the Getting Started Guide, then move on to the Native Development Overview on page 24.

Unity and Unreal Developers

Mobile developers working with Unity and Unreal should begin with Mobile Development with Unity and Unreal on page 7, as setup and development differ substantially from native setup and development.

Platform Features

Mobile applications may use our Platform SDK (available separately from our Downloads page) to add features related to security (e.g., entitlements), community (e.g., rooms, matchmaking), revenue (e.g., in-app purchases), and engagement (e.g., leaderboards). For more information, see our Platform SDK documentation.

Application Submission

For information on preparing to submit your mobile VR application to Oculus for distribution through the Oculus Store, see our Publishing Guide.

Thank you for joining us at the forefront of virtual reality!

Questions? Visit our developer support forums. Our Support Center can also be accessed online.

Mobile Development with Unity and Unreal

Unity and Unreal provide built-in support for Oculus mobile development. If you wish to use the mobile SDK with other engines, please see Native Engine Integration on page 29.

Unity Mobile Development

Unity 5.1 and later provides built-in VR support for Oculus Go and Gear VR, enabled by checking a box in Player Settings. The Oculus Mobile SDK is not required.

We provide a Utilities for Unity 5 unitypackage that includes supplemental scripts, scenes, and prefabs to assist development. The Utilities package is available from our Downloads page.

For more information, see Preparing for Mobile Development in our Oculus Unity Guide.

Unreal Mobile Development

Unreal versions 4.10 and later provide built-in support for Oculus Go and Gear VR. The Oculus Mobile SDK is not required.

The Android SDK is required for mobile development with Unreal. However, most Unreal developers do not need to install Android Studio or the NDK. Unreal developers should follow the instructions in our Device Setup on page 9 guide, and install the Java Development Kit (JDK) and Android SDK before beginning development.

For more information, see Preparing for Unreal Mobile Development in our Unreal Developer Guide.

System and Hardware Requirements

Please begin by making sure that you are using supported hardware and devices for this release of the Oculus Mobile SDK.

Operating System Requirements

The Oculus Mobile SDK currently supports the following operating systems:

- Windows 7/8/10
- Mac OS X (x86 only)

Minimum System Requirements

The following computer system requirements for the Oculus Mobile SDK are based on the Android SDK system requirements:

- 2.0+ GHz processor
- 2 GB system RAM

Supported VR Headsets

- Oculus Go
- Samsung Gear VR

Supported Devices

- Samsung Galaxy S8+
- Samsung Galaxy S8
- Samsung Galaxy S7 Edge
- Samsung Galaxy S7
- Samsung Galaxy Note FE
- Samsung Galaxy Note 5
- Samsung Galaxy S6 Edge+
- Samsung Galaxy S6 Edge
- Samsung Galaxy S6
- Gear VR Innovator v2: Samsung Galaxy S6 Edge, Samsung Galaxy S6

Target Device Requirements

- API Level 21 (Android 5.0) or later

Accessories

Samsung Gear VR Controller

The Gear VR Controller orientation-tracked input device is the primary Gear VR controller going forward. We recommend that developers take advantage of its capabilities if it makes sense to do so with your application or game.

Bluetooth Gamepad

In some cases, Bluetooth gamepads are a better fit than the Gear VR Controller and they may still be used, though they may limit your addressable market. A gamepad is necessary for testing the sample applications which come with this release. Compatible gamepads must have the following features:

- Wireless Bluetooth connection (BT3.0)
- Compatible with Android devices
- Start and Select buttons

Typical controls include:

- One analog stick
- Action buttons (4)
- Trigger buttons (2)

Bluetooth Keyboard (optional)

It is useful (but not required) to have a Bluetooth keyboard during development. The Logitech K810 is known to function well.

Device Setup

This section provides information on how to set up your supported device for running, debugging, and testing your mobile application. Please review the System and Hardware Requirements above for the list of supported devices for this SDK release.

Note: This information is accurate at the time of publication. We cannot guarantee the consistency or reliability of third-party applications discussed in these pages, nor can we offer support for any of the third-party applications we describe.

Configuring Your Device for Debugging

In order to test and debug applications on your Android device, you will need to enable specific developer options on the device:

1. Configure Developer Options in Settings
   - Enable USB Debugging
   - Allow mock locations
   - Verify apps via USB
2. Configure Display Options in Settings
   - Disable lock screen
   - Set Display Timeout (optional)

Developer Options

Note: Depending on which mobile device you are using, options and menu names may vary slightly.

Developer options may be found under: Settings -> System -> Developer options. Developer options may be hidden by default. If so, you can expose these options with the following steps:

1. Locate the Build number option in Settings.
   - Android M and later: Go to Settings -> System -> About device -> Software Info.
   - Earlier Android versions: Go to Settings -> System -> About device.
2. Scroll down to Build number.
3. Press Build number seven times. You should be informed that Developer options has been enabled.

Once you have found Developer options, enable the following:

USB Debugging: This will allow the tools to install and launch deployed apps over USB. You should see the screen shown in the accompanying figure.

Note: If the above screen does not appear, ensure that your computer recognizes the device when it is connected. If not, you may need to pull down the notifications bar on your phone, find the USB connection setting, and set USB to Software installation (it may be set to Charging by default). If you still do not see the pictured screen, try a different USB cable. If your phone is recognized by your computer but you do not see the above screen, try toggling USB Debugging off then back on.

Check Always allow this computer and hit OK.

To purge the authorized whitelist for USB Debugging, press Revoke USB debugging authorizations from the Developer options menu and press OK.

Allow mock locations: This will allow you to send mock location information to the device (convenient for apps which use Location Based Services).

Verify apps via USB: This will check installed apps from ADB/ADT for harmful behavior.

Display Options

The following display options are found in: Home -> Apps -> Settings -> Sound and Display.

Lock screen/Screen Security/Screen lock: Set to None so that the Home screen is instantly available, without swipe or password. Useful to quickly get in and out of the phone.

Display/Screen timeout: Set the time to your desired duration. Useful if you are not actively accessing the device but wish to keep the screen awake longer than the default 30 seconds.

See Android Debugging for more information.

Android Development Software Setup for Windows

In order to develop Android applications, you must have the following software installed on your system:

1. Java Development Kit (JDK)
2. Android Studio Development Bundle
3. Android Native Development Kit (NDK)

Java Development Kit (JDK)

The Java Development Kit is a prerequisite for Android Studio and Gradle. The latest version which has been tested with this release is JDK 8u91, available from the Java Archive Downloads page. The latest JDK version is also available for download.

Once downloaded and installed, add the environment variable JAVA_HOME and set its value to the JDK install location. For example, the value may be C:\Program Files\Java\jdk1.8.0_91, if you have installed the x64 version. Based on the default installation path of Java SE 8u91, the correct syntax when using set from the command line is:

set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_91

Note: The JAVA_HOME value must be your actual path, which may differ from these examples.

Additionally, add the JDK to the value of your PATH, e.g., C:\Program Files\Java\jdk1.8.0_91\bin

Android Studio Installation

Android Studio is the recommended IDE and install manager for the Android SDK tools. Download the Android Studio bundle from the Android Studio download page.

The Android Studio Development Bundle includes the basic tools you need to begin developing Java Android applications:

- Android Studio IDE
- Android SDK tools
- Latest Android Platform
- Latest System Image for Emulator

Follow Android's installation instructions.

To run any of the standalone build scripts that come with the Mobile SDK, you need to set the following environment variables:

Set the environment variable ANDROID_HOME to your Android SDK location. The correct syntax for setting your environment variable is:

set ANDROID_HOME=<path to SDK>/android/sdk

Note: To find the location of the SDK, launch Android Studio and select File > Project Structure > SDK Location: "Android SDK location".

Add the SDK tools and platform tools directories to your PATH:

set PATH=%PATH%;%ANDROID_HOME%\tools;%ANDROID_HOME%\platform-tools

Installing Additional Packages and Tools

Android Studio

You must download additional packages required by the Mobile SDK via the Android SDK Manager, found in Tools > Android > SDK Manager. Android Studio may prompt you to take this step automatically the first time you launch it.

The following packages are required for native development:

- Android SDK, API level 21 or later
- Android Build Tools
- LLDB

Android Native Development Kit (NDK)

The Android Native Development Kit (NDK) is a toolset that allows you to implement parts of your app using native code languages such as C and C++. It is used extensively by the sample applications included with this release.

Note: You may install the NDK during the Android Studio installation process, but we recommend installing it manually to be sure that your command-line environment is set up properly and agrees with your Studio setup.

The last version of the NDK known to work with the Mobile SDK is r14b.

1. Download the appropriate version of the NDK from the following location: downloads/index.html.
2. Save the zip to the directory where you would like to install it, e.g., C:\Dev\Android\android-ndk-r14b\.
3. Once downloaded, extract its contents into the parent directory.
4. Add the environment variable ANDROID_NDK_HOME, and set the value to your Android NDK location. For example:

   set ANDROID_NDK_HOME=C:\Dev\Android\android-ndk-r14b

5. Add the NDK location to your PATH. For example:

   set PATH=%PATH%;%ANDROID_NDK_HOME%

Setting up your System to Detect your Android Device

You must set up your system to detect your Android device over USB in order to run, debug, and test your application on an Android device.

If you are developing on Windows, you may need to install a USB driver for adb after installing the Android SDK. For an installation guide and links to OEM drivers, see the Android OEM USB Drivers document. Samsung Android drivers may be found on their developer site.

Windows may automatically detect the correct device and install the appropriate driver when you connect your device to a USB port on your computer.

Access the Device Manager through the Windows Control Panel. If the device was automatically detected, it will show up under Portable Devices in the Device Manager. Otherwise, look under Other Devices in the Device Manager and select the device to manually update the driver.

To verify that the driver successfully recognized the device, open a command prompt and type the command:

adb devices

Note: You will need to successfully set up your Android development environment in order to use this command. For more information, see the next section: Android Development Environment Setup.

If the device does not show up, verify that the device is turned on with enough battery power, and that the driver is installed properly.

Android Development Software Setup for Mac OS X

In order to develop Android applications, you must have the following software installed on your system:

1. Xcode
2. Java Development Kit (JDK)
3. Android Studio Development Bundle
4. Android Native Development Kit (NDK)

Your Samsung device may display a notification recommending you install Android File Transfer, a handy application for transferring files between OS X and Android.

Xcode

Before installing any Android development tools, you must install Xcode. Once installation is complete, some of the following steps (such as installing the JDK) may be unnecessary. To download Xcode, visit Apple's developer website.

Java Development Kit (JDK)

Install the JDK if it is not already present on your system. If you have already installed Xcode, this step may be unnecessary. The latest version which has been tested with this release is JDK 8u91, available from the Java Archive Downloads page. The latest JDK version is also available for download.

Android Studio Installation

Android Studio is the recommended IDE and install manager for the Android SDK tools. Download the Android Studio bundle from the Android Studio download page.

The Android Studio Development Bundle includes the basic tools you need to begin developing Java Android applications:

- Android Studio IDE
- Android SDK tools
- Latest Android Platform
- Latest System Image for Emulator

Follow Android's installation instructions for Android Studio.

To run any of the standalone build scripts that come with the Mobile SDK, you need to set the following environment variables:

Set the environment variable ANDROID_HOME to your Android SDK location. The correct syntax for setting your environment variable is:

export ANDROID_HOME=<path to SDK>/android/sdk

Note: To find the location of the SDK, launch Android Studio and select File > Project Structure > SDK Location: "Android SDK location".

Add the SDK tools and platform tools directories to your PATH:

export PATH=$PATH:$ANDROID_HOME/tools:$ANDROID_HOME/platform-tools

Installing Additional Packages and Tools

Android Studio

You must download additional packages required by the Mobile SDK via the Android SDK Manager, found in Tools > Android > SDK Manager. Android Studio may prompt you to take this step automatically the first time you launch it.

The following packages are required for native development:

- Android SDK, API level 21 or later
- Android Build Tools
- LLDB

Android Native Development Kit (NDK)

The Android Native Development Kit (NDK) is a toolset that allows you to implement parts of your app using native code languages such as C and C++. It is used extensively by the sample applications which come with this release.

The latest version which has been tested with this release is NDK r14b; it is available for download from the Android NDK downloads page.

Once downloaded, extract the NDK to your home/dev folder (~/dev).

Note: Do not place the Android NDK or SDK folders inside one another. They must be placed into separate folders to avoid any conflict between the two packages.

Set the ANDROID_NDK_HOME environment variable to the directory where you have installed your NDK. For example:

export ANDROID_NDK_HOME=~/dev/android-ndk-r14b

Then add the NDK directory to your PATH:

export PATH=$PATH:$ANDROID_NDK_HOME

Mobile Development Basics

This guide reviews basic development steps you'll need to know, such as application signing and required support for the Universal Menu and volume handling.

For instructions on how to install an APK to your mobile device, see "Using adb to Install Applications" in Adb on page 107.

Oculus Signature File (osig) and Application Signing

Oculus mobile apps require two distinct signatures at different stages of development:

- Oculus Signature File (required during development, remove for submission)
- Android Application Signature (required for submission)

Oculus Signature File (osig)

During development, your application must be signed with an Oculus-issued Oculus Signature File, or osig. This signature comes in the form of a file that you include in your application in order to access protected low-level VR functionality on your mobile device. Each signature file is tied to a specific device, so you will need to generate osig files for each device that you use for development. When your application is submitted and approved, Oculus will modify the APK so that it can be used on all devices.

Please see our osig self-service portal for more information and instructions on how to request an osig for development.

Android Application Signing

Android uses a digital certificate (also called a keystore) to cryptographically validate the identity of application authors. All Android applications must be digitally signed with such a certificate in order to be installed and run on an Android device. All developers must create their own unique digital signature and sign their applications before submitting them to Oculus for approval. For more information and instructions, please see Android's "Signing your Applications" documentation.

Make sure to save the certificate file you use to sign your application. Every subsequent update to your application must be signed with the same certificate file, or it will fail.

Note: Your application must be signed by an Android certificate before you submit it to the Oculus Store.

Android Application Signing and Unity

Unity automatically signs Android applications with a temporary debug certificate by default. Before building your final release build, create a new Android keystore by following the Sign Your App Manually instructions in Android's Sign your Applications guide. Then assign it with the Use Existing Keystore option, found in Edit > Project Settings > Player > Publishing Options. For more information, see the Android section of Unity's documentation.

Android Application Signing and Unreal

Once you add an osig to the appropriate Unreal directory, it will be added automatically to every APK that you build. You will need one osig for each mobile device.

To add your osig to Unreal for development:

1. Download an osig as described above.
2. Navigate to the directory <Unreal-directory>\Engine\Build\Android\Java\.
3. Create a new directory inside \Engine\Build\Android\Java\ and name it assets. The name must not be capitalized.
4. Copy your osig to this directory.

When you are ready to build an APK for submission to release, we recommend that you exclude the osig from your APK. To do so, select Edit > Project Settings > Android, scroll down to Advanced APKPackaging, and verify that Remove Oculus Signature Files from Distribution APK is checked.

Before building your final release build, create a new Android keystore by following the Sign Your App Manually instructions in Android's Sign your Applications guide. Once you have generated your distribution keystore, go to Edit > Project Settings > Platforms > Android, scroll down to Distribution Signing, and enter the required information.

Developer Mode: Running Apps Outside of the Gear VR Headset

To run a VR application without loading your mobile device into a headset, you must enable Developer Mode. It is often convenient during development to run VR applications without needing to insert your device into the Gear VR headset, which can be a time-consuming process.

When Developer Mode is enabled, any Oculus application on your mobile device will launch with distortion and stereoscopic rendering applied without inserting the device into a headset. Devices in Developer Mode implement limited orientation tracking using phone sensors. Orientation tracking may be disabled with the appropriate setting in System Properties or Local Preferences (see Android System Properties on page 83 for details).

Enable Developer Mode

To enable Developer Mode, you will need to provide an Oculus Signature File (osig) to access protected functionality on the Samsung device. You may do this in two ways:

1. Install and run an APK signed with an osig on the phone you wish to run in Developer Mode. If you have already done this, then no further action is necessary.
2. Alternatively, you may download an osig from the Oculus Signature File Generator and copy it to the phone directory /sdcard/oculus/.

Once you have provided an osig, follow these steps to set your phone to Developer Mode:

1. Go to Settings > Application Manager.
2. Select Gear VR Service.
3. Select Manage Storage.
4. Click on VR Service Version several times until the Developer Mode toggle shows up.
5. Toggle Developer Mode.

Note: If you do not see the Developer Mode toggle switch after tapping VR Service Version several times, close Gear VR Service and relaunch it, and you should see it.

Android Studio Basics

This guide introduces the Android Studio IDE and reviews some basic features.

Getting Started with Oculus Native Samples: Import Gradle Project

1. If this is the first time you are launching Android Studio, select Open an existing Android Studio project. If you have launched Android Studio before, click File > Open instead.
2. Open any build.gradle project file from the Mobile SDK VrSamples folders. For example, VrSamples/Native/VrCubeWorld_Framework/Projects/Android/build.gradle.

3. When asked if you would like the project to use the Gradle wrapper, click OK.

Note: If this is your first time opening a native project in Android Studio, you are likely to be asked to install some dependencies. Follow the on-screen instructions to install all the required dependencies before you continue.

Troubleshooting Gradle Sync Errors

Here are some possible solutions if Android Studio reports a Gradle sync or configuration error:

- The most common cause of such an error is that the Android SDK or NDK locations are wrong. Verify that the SDK and NDK locations are specified correctly in File > Project Structure. If either is wrong or missing, you cannot continue until you fill in the correct path.
- On macOS, Android Studio sometimes reports a missing SDK location error even when the correct paths are listed in the Project Structure dialog box. To correct this problem, copy the local.properties file from your project folder up to the root of your Oculus Mobile SDK folder.

Project Overview

Android Studio displays project files in the Android view by default. We recommend changing it to the Project view, which provides a good overview of the entire directory structure and highlights imported project directories in bold.

Select Target Configuration, Build, and Run

You can build and run your application on your device directly from Android Studio. This will compile your project, build the APK, copy it to the phone over USB or Wi-Fi, and prepare it for launching.

If you are developing for Gear VR and your phone is set to Developer Mode (see Developer Mode for instructions), your application can launch without being inserted into your Gear VR headset. Otherwise, when the process completes you will be prompted to insert your mobile device into the headset to launch the application.

Make sure you have followed the configuration steps in the Mobile SDK Setup Guide to ensure your device is configured appropriately.

Select the target configuration you wish to build before building by selecting Edit Configurations in the project menu in the Android Studio toolbar.

Note: Before you can run the application, you must create and then copy the oculussig file for your device to the assets folder of your project. See Application Signing for more information.

To build and run your app:

1. Click Run in the toolbar.
2. The Select Deployment Target dialog box appears. This is sometimes set to an emulator by default.
3. Select a device listed under Connected Devices instead.
4. If your device asks you to Allow USB debugging, click OK.

Troubleshooting: If USB debugging does not seem to be working:

1. Go to Developer Options on your phone.

2. Toggle USB Debugging off and then back on.

Syncing Projects

If you edit a *.gradle file or install an update to the Oculus Mobile SDK which includes updated Gradle projects, click Sync Project with Gradle Files to update your Android Studio project files.

Native Development Overview

Welcome to the Native Development Guide. This guide describes the libraries, tools, samples, and other material provided with this SDK for native development of mobile VR applications.

While native software development is comparatively rudimentary, it is closer to the metal and allows implementing very high performance virtual reality experiences without the overhead of elaborate environments such as you would find with a typical game engine. It is not feature rich, but it provides the basic infrastructure you will need to get started with your own high-performance virtual reality experience.

This SDK includes several sample projects which provide an overview of the native source code. See Native Samples on page 25 for details.

Note: This guide is intended to provide a high-level orientation and discussion of native development with the mobile SDK. Be sure to review the header files for any libraries you use for more extensive, in-depth discussion.

Native Source Code

This section describes mobile native source code development.

The native SDK provides four basic native libraries:

- VrApi: the minimal API for VR.
- VrAppFramework: the application framework used by native apps.
- VrAppSupport: support for GUI, locale handling, sound, etc.
- LibOVRKernel: a low-level Oculus library for containers, mathematical operations, etc.

VrApi provides the minimum required API for rendering scenes in VR. Applications may query VrApi for orientation data, and submit textures to apply distortion, sensor fusion, and compositing. The VrApi Input API allows developers to query the state of connected devices, such as the Gear VR Controller. Developers working with a third-party engine other than Unity or Unreal will use VrApi to integrate the mobile SDK. For detailed information, see Native Engine Integration on page 29.

The VrAppFramework handles VrApi integration and provides a wrapper around Android activity that manages the Android lifecycle. The VrAppFramework is the basis for several of our samples and first-party applications, including Oculus Video and Oculus 360 Photos. If you are not using Unity, Unreal, or another integration and you would like a basic framework to help get started, we recommend that you have a look. See Native Application Framework on page 48 for more information.

VrAppSupport and LibOVRKernel provide minimal functionality to applications using VrAppFramework, such as GUI, sound, and locale management. See Other Native Libraries on page 52 for more information.

The VrAppInterface (part of VrAppFramework) has a clearly-defined lifecycle, which may be found in VrAppFramework/Include/App.h.

LibOVRKernel and VrAppFramework ship with full source as well as pre-built libs and aar files. The VrApi is shipped as a set of public include files and a pre-built shared library. Providing the VrApi in a separate shared library allows the VrApi implementation to be updated after an application has been released, making it easy to apply hot fixes, implement new optimizations, and add support for new devices without requiring applications to be recompiled with a new SDK. The VrApi is periodically updated automatically on Samsung phones; for release notes, see System Activities/VrApi Release Notes.

See the VrSamples/Native/VrCubeWorld projects for examples of how to integrate VrApi into third-party engines, as well as how to use VrAppFramework. Please see Native Samples on page 25 for more details.

Main Components

- VrApi (source folder: VrApi): The Virtual Reality API provides a minimal set of entry points for enabling VR rendering in native applications and third-party engines.
- VrApi Includes (source folder: VrApi/Includes): Header files for the VrApi library.
- Application Framework (source folder: VrAppFramework/Src): Framework and support code for native applications. Includes code for rendering, user interfaces, sound playback, and more. It is not meant to provide the functionality of a full-fledged engine, but it does provide structure and a lot of useful building blocks for building native applications.
- VrAppFramework Includes (source folder: VrAppFramework/Include): Header files for the VrAppFramework library.
- Native Samples (source folder: VrSamples/Native): Sample projects illustrating use of VrApi and VrAppFramework.
- LibOVRKernel (source folder: LibOVRKernel/Src): The Oculus library.

Native Samples

The mobile SDK includes a set of sample projects that prove out virtual reality application development on the Android platform and demonstrate high-performance virtual reality experiences on mobile devices.

Sample Applications and Media

The sample applications included with the SDK are provided as a convenience for development purposes. Some of these apps are similar to apps available for download from the Oculus Store. Due to the potential for conflict with these versions, we do not recommend running these sample apps on the same device on which you have installed your retail experience.

If you are using the Samsung Note 4, please take care to secure the retail media content bundled with the SM-R320. It will be difficult if not impossible to replace.

Note: Due to limitations of Android ndk-build, your Oculus Mobile SDK must not contain spaces in its path. If you have placed your Oculus Mobile SDK in a path or folder name that contains spaces, you must move or rename the folder before you build our samples.

The following samples can be found in \VrSamples\Native\:

- Oculus 360 Photos: A viewer for panoramic stills.
- Oculus 360 Videos: A viewer for panoramic videos.
- Oculus Cinema: Plays 2D and 3D movies in a virtual movie theater.
- VrController: A simple scene illustrating use of the VrApi Input API.
- VR Cube World: A sample scene with colorful cubes illustrating construction of basic native apps using different tools provided by the SDK. There are three versions of VR Cube World:
  - VrCubeWorld_SurfaceView is closest to the metal. This sample uses a plain Android SurfaceView and handles all Android Activity and Android Surface lifecycle events in native code. This sample does not use the application framework or LibOVRKernel; it only uses the VrApi. It provides a good example of how to integrate the VrApi in an existing engine. The MULTI_THREADED define encapsulates the code that shows how the VrApi can be integrated into an engine with a separate renderer thread.
  - VrCubeWorld_NativeActivity uses the Android NativeActivity class to avoid manually handling all the lifecycle events. This sample does not use the application framework or LibOVRKernel; it uses the VrApi only. It provides a good example of how to integrate the VrApi in an existing engine that uses a NativeActivity. The MULTI_THREADED define encapsulates the code that shows how the VrApi can be integrated into an engine with a separate renderer thread.

  - VrCubeWorld_Framework uses the Oculus Mobile Application Framework. When starting a new application from scratch without using a pre-existing engine, the application framework can make development significantly easier. The application framework presents a much simplified application lifecycle, allowing a developer to focus on the virtual reality experience without having to deal with many of the platform-specific details.

Installation

To install these sample applications and associated data to your mobile device, perform the following steps:

1. Connect to the device via USB.
2. Run installtophone.bat from your Oculus Mobile SDK directory, e.g.: C:\Dev\Oculus\Mobile\installtophone.bat.
3. Issue the following commands from C:\Dev\Oculus\Mobile\:

   adb push sdcard_sdk /sdcard/
   adb install -r *.apk

4. Alternately, you may copy the files directly onto your mobile device using Windows Explorer, which may be faster in some cases.

Android Manifest Settings

Configure your manifest with the necessary VR settings, as shown in the following manifest segment.

Note: These manifest requirements are intended for development and differ from our submission requirements. Before submitting your application, please be sure to follow the manifest requirements described by our Publishing Guide.

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="<packageName>"
    android:versionCode="1"
    android:versionName="1.0"
    android:installLocation="auto">
  <application android:theme="@android:style/Theme.Black.NoTitleBar.Fullscreen">
    <meta-data android:name="com.samsung.android.vr.application.mode" android:value="vr_only"/>
    <activity
        android:screenOrientation="landscape"
        android:launchMode="singleTask"
        android:configChanges="screenSize|screenLayout|orientation|keyboardHidden|keyboard|navigation">
    </activity>
  </application>
  <uses-sdk android:minSdkVersion="21" android:targetSdkVersion="21" />
  <uses-feature android:glEsVersion="0x00030000" />
</manifest>

- Replace <packageName> with your actual package name, such as "com.oculus.cinema".
- The Android theme should be set to the solid black theme for comfort during application transitioning: Theme.Black.NoTitleBar.Fullscreen
- The vr_only meta-data tag should be added for VR mode detection.
- The required screen orientation is landscape: android:screenOrientation="landscape"
- It is recommended that your configChanges are as follows: android:configChanges="screenSize|screenLayout|orientation|keyboardHidden|keyboard|navigation"
- The minSdkVersion and targetSdkVersion are set to the API level supported by the device. For the current set of devices, the API minimum level and target level are 21.
- Do not add the noHistory attribute to your manifest.

Application submission requirements may require additional adjustments to the manifest. Please refer to Application Manifests for Release Versions in our Publishing Guide.

Universal Menu and Reserved User Interactions

This section describes the Universal Menu and reserved user interactions, including the volume, back, and home buttons.

Reserved User Interactions

Back button, home button, and volume button behaviors must conform to specific requirements.

Volume Button Interactions

Volume adjustment on the Samsung device is handled automatically. The volume control dialog display is also handled automatically by the VrApi. Do not implement your own volume display handling, or users will see two juxtaposed displays. You may override automatic volume display handling if necessary by setting VRAPI_FRAME_FLAG_INHIBIT_VOLUME_LAYER as an ovrFrameParms flag, as sketched at the end of this section.

Back Button Interactions

Back button presses are of three types: long-press, short-press, and aborted long-press. A long-press occurs when the back button is held longer than 0.75 seconds. A short-press is 0.25 seconds or shorter. If a single press of the back button is longer than 0.25 seconds but shorter than 0.75 seconds, it results in an aborted long-press and cancels the Universal Menu timer. A timer animation is displayed from 0.25 to 0.75 seconds, indicating to the user that a long-press is underway.

Long-press on the back button is reserved, and always opens the Universal Menu. As of Mobile SDK 1.0.4, this behavior is handled automatically by the VrApi. You should no longer implement your own long-press back button handling, including the gaze timer.

Short-press back button behavior is determined by the application. It is typically (but not necessarily) treated as a generic back action appropriate to the application's current state. Back actions usually prompt apps to navigate one level up in an interface hierarchy. For example, a short-press on the back button may bring up the application's menu. In another application, a short-press may act as a generic back navigation in the UI hierarchy until the root is reached, at which point it may bring up an application-specific menu, or enter the Universal Menu with a confirmation dialog, allowing the user to exit the application to Oculus Home.

In applications built with VrAppFramework or Unity, if no satisfactory stateful condition is identified by the application, the short-press opens the Universal Menu with a confirmation dialog allowing the user to exit the app and return to Oculus Home. Applications built with other engines must implement this handling; see the VrCubeWorld_NativeActivity sample for an example.

Home Button Interactions

A Home button press always opens a dialog to return the user to Oculus Home. As of Mobile SDK 1.0.4, this behavior is handled automatically by the VrApi.
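The following is a minimal sketch of suppressing the automatic volume display for a submitted frame, assuming the ovrFrameParms-based submission path described later in this guide; the helper name and the surrounding state (ovr, java, frameIndex) are placeholders for your application's own frame code, not part of the VrApi.

#include "VrApi.h"
#include "VrApi_Helpers.h"

// Sketch: suppress the system volume layer for one submitted frame so the
// application can draw its own volume indicator instead.
static void SubmitFrameWithoutVolumeLayer( ovrMobile * ovr, const ovrJava * java, long long frameIndex )
{
    ovrFrameParms parms = vrapi_DefaultFrameParms( java, VRAPI_FRAME_INIT_DEFAULT,
                                                   vrapi_GetTimeInSeconds(), NULL );
    parms.FrameIndex = frameIndex;

    // Ask the compositor not to render its own volume bar on top of the scene.
    parms.Flags |= VRAPI_FRAME_FLAG_INHIBIT_VOLUME_LAYER;

    // ... fill in parms.Layers[] with this frame's eye textures as usual ...

    vrapi_SubmitFrame( ovr, &parms );
}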

Universal Menu

The Universal Menu provides access to the user's Oculus Profile, friends list, notifications, Settings and Utilities options, and a shortcut to Oculus Home. The Settings submenu provides volume and brightness controls, a reorientation command, and more. The Utilities submenu provides access to the screenshot and video capture functions and the Pass-Through Camera.

The Universal Menu is part of the Oculus System Activities application, which is installed to the user's device along with Oculus Home and the Oculus App. The Universal Menu is activated when the user initiates the relevant reserved button interactions described in Reserved User Interactions on page 27.

Universal Menu Implementation

Native apps

In native apps, the application is responsible for hooking the back key short-presses by overloading VrAppInterface::OnKeyEvent() and deciding when the user is at the root of the application's UI, at which point it should ignore the back key event by returning false. This will allow VrAppFramework to handle the back key and start the Universal Menu quit confirmation dialog. A minimal sketch of this short-press handling appears at the end of this section.

To display the Universal Menu quit confirmation dialog, the app should call:

vrapi_ShowSystemUI( &app->java, VRAPI_SYS_UI_CONFIRM_QUIT_MENU );

Unity apps

See OVRPlatformMenu.cs in the Utilities for Unity 5 for sample execution. For more information, see Design Considerations: Universal Menu in our Unity 5 guide.
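For applications built without VrAppFramework or Unity, the sketch below outlines the required short-press handling, loosely following the pattern of the VrCubeWorld_NativeActivity sample. How the press duration and UI-root state reach this function depends on your engine's key-event plumbing and is assumed here; the header containing vrapi_ShowSystemUI may also vary by SDK version.

#include <stdbool.h>

#include "VrApi.h"
#include "VrApi_SystemUtils.h"

// Sketch: handle a back-button release. A press shorter than 0.25 seconds is a
// short-press; long-presses are handled automatically by the VrApi. At the root
// of the application's UI the short-press opens the Universal Menu quit
// confirmation; otherwise the application performs its own "back" navigation.
static void OnBackKeyUp( const ovrJava * java, double pressDurationSeconds, bool atUiRoot )
{
    const double SHORT_PRESS_SECONDS = 0.25;

    if ( pressDurationSeconds <= SHORT_PRESS_SECONDS )
    {
        if ( atUiRoot )
        {
            // No app-level back action left: show the quit confirmation dialog.
            vrapi_ShowSystemUI( java, VRAPI_SYS_UI_CONFIRM_QUIT_MENU );
        }
        else
        {
            // Application-specific back navigation (close a menu, go up one level, etc.).
        }
    }
}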

Native Engine Integration

This guide describes how to integrate the mobile native SDK with a game engine using VrApi.

VrApi

VrApi provides the minimum required API for rendering scenes in VR. Applications may query VrApi for orientation data, and submit textures to apply distortion, sensor fusion, and compositing.

We have provided the source for VrCubeWorld_NativeActivity and VrCubeWorld_SurfaceView, simple sample applications using VrApi, to serve as references. Please see Native Samples on page 25 for more details.

Lifecycle and Rendering

Multiple Android activities that live in the same address space can cooperatively use the VrApi. However, only one activity can be in "VR mode" at a time. The following explains when an activity is expected to enter and leave VR mode.

Android Activity lifecycle

An Android Activity can only be in VR mode while the activity is in the resumed state. The following shows how VR mode fits into the Android Activity lifecycle:

    VrActivity::onCreate()
      VrActivity::onStart()
        VrActivity::onResume()
          vrapi_EnterVrMode()
          vrapi_LeaveVrMode()
        VrActivity::onPause()
      VrActivity::onStop()
    VrActivity::onDestroy()

Android Surface lifecycle

An Android Activity can only be in VR mode while there is a valid Android Surface. The following shows how VR mode fits into the Android Surface lifecycle:

    VrActivity::surfaceCreated()
      VrActivity::surfaceChanged()
        vrapi_EnterVrMode()
        vrapi_LeaveVrMode()
    VrActivity::surfaceDestroyed()

Note that the lifecycle of a surface is not necessarily tightly coupled with the lifecycle of an activity. These two lifecycles may interleave in complex ways. Usually surfaceCreated() is called after onResume() and surfaceDestroyed() is called between onPause() and onDestroy(). However, this is not guaranteed and, for instance, surfaceDestroyed() may be called after onDestroy() or even before onPause().

An Android Activity is only in the resumed state with a valid Android Surface between surfaceChanged() or onResume(), whichever comes last, and surfaceDestroyed() or onPause(), whichever comes first. In other words, a VR application will typically enter VR mode from surfaceChanged() or onResume(), whichever comes last, and leave VR mode from surfaceDestroyed() or onPause(), whichever comes first. A minimal sketch of this enter/leave logic follows.
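Below is a minimal sketch of that enter/leave decision for a NativeActivity-style application that tracks its own resumed flag and current ANativeWindow. The ovrAppState structure and its field names are illustrative (patterned after the VrCubeWorld samples), not a required API.

#include <stdbool.h>
#include <android/native_window.h>
#include <EGL/egl.h>

#include "VrApi.h"
#include "VrApi_Helpers.h"

// Illustrative application state; names follow the VrCubeWorld samples.
typedef struct
{
    ovrJava         Java;
    ovrMobile *     Ovr;            // non-NULL while in VR mode
    ANativeWindow * NativeWindow;   // valid between surfaceCreated and surfaceDestroyed
    bool            Resumed;        // true between onResume and onPause
    EGLDisplay      Display;
    EGLContext      Context;
} ovrAppState;

// Call after every lifecycle event (onResume, onPause, surfaceChanged, surfaceDestroyed).
static void HandleVrModeChanges( ovrAppState * app )
{
    if ( app->Resumed && app->NativeWindow != NULL )
    {
        if ( app->Ovr == NULL )
        {
            // Resumed with a valid surface: enter VR mode, handing the window to the VrApi.
            ovrModeParms parms = vrapi_DefaultModeParms( &app->Java );
            parms.Flags |= VRAPI_MODE_FLAG_NATIVE_WINDOW;
            parms.Display = (size_t)app->Display;
            parms.WindowSurface = (size_t)app->NativeWindow;
            parms.ShareContext = (size_t)app->Context;
            app->Ovr = vrapi_EnterVrMode( &parms );
        }
    }
    else if ( app->Ovr != NULL )
    {
        // Paused or lost the surface: leave VR mode.
        vrapi_LeaveVrMode( app->Ovr );
        app->Ovr = NULL;
    }
}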

Android VR lifecycle

This is a high-level overview of the rendering pipeline used by VrApi. For more information, see VrApi/Include/VrApi.h.

1. Initialize the API.
2. Create an EGLContext for the application.
3. Get the suggested resolution to create eye texture swap chains with vrapi_GetSystemPropertyInt( &java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_WIDTH ).
4. Allocate a texture swap chain for each eye with the application's EGLContext.
5. Get the suggested FOV to set up a projection matrix.
6. Set up a projection matrix based on the suggested FOV. Note that this is an infinite projection matrix for the best precision.
7. Android Activity/Surface lifecycle loop:
   a. Acquire the ANativeWindow from the Android Surface.
   b. Enter VR mode once the Android Activity is in the resumed state with a valid ANativeWindow.
   c. Frame loop, possibly running on another thread.
   d. Get the HMD pose, predicted for the middle of the time period during which the new eye images will be displayed. The number of frames predicted ahead depends on the pipeline depth of the engine and the synthesis rate. The better the prediction, the less black will be pulled in at the edges.
   e. Advance the simulation based on the predicted display time.
   f. Render eye images and set up ovrFrameParms using ovrTracking2.
   g. Render to textureId using the ViewMatrix and ProjectionMatrix from ovrTracking2. Insert a fence using eglCreateSyncKHR.
   h. Submit the frame with vrapi_SubmitFrame.
   i. Leave VR mode when the Android Activity is paused or the Android Surface is destroyed or changed.
   j. Destroy the texture swap chains. Make sure to delete the swap chains before the application's EGLContext is destroyed.
8. Shut down the API.

Integration

The API is designed to work with an Android Activity using a plain Android SurfaceView, where the Activity lifecycle and the Surface lifecycle are managed completely in native code by sending the lifecycle events (onResume, onPause, surfaceChanged, etc.) to native code.

The API does not work with an Android Activity using a GLSurfaceView. The GLSurfaceView class manages the window surface and EGLSurface, and the implementation of GLSurfaceView may unbind the EGLSurface before onPause() gets called. As such, there is no way to leave VR mode before the EGLSurface disappears. Another problem with GLSurfaceView is that it creates the EGLContext using eglChooseConfig(). The Android EGL code pushes in multisample flags in eglChooseConfig() if the user has selected the "force 4x MSAA" option in settings. Using a multisampled front buffer is completely wasted for TimeWarp rendering.

Alternately, an Android NativeActivity can be used to avoid manually handling all the lifecycle events. However, it is important to select the EGLConfig manually without using eglChooseConfig() to make sure the front buffer is not multisampled.

The vrapi_GetSystemProperty* functions can be called at any time from any thread. This allows an application to set up its renderer, possibly running on a separate thread, before entering VR mode.

On Android, an application cannot just allocate a new window/frontbuffer and render to it. Android allocates and manages the window/frontbuffer and (after the fact) notifies the application of the state of affairs through lifecycle events (surfaceCreated / surfaceChanged / surfaceDestroyed).

The application (or third-party engine) typically handles these events. Since the VrApi cannot just allocate a new window/frontbuffer, and the VrApi does not handle the lifecycle events, the VrApi somehow has to take over ownership of the Android surface from the application.

To allow this, the application can explicitly pass the EGLDisplay, EGLContext, EGLSurface or ANativeWindow to vrapi_EnterVrMode(), where the EGLSurface is the surface created from the ANativeWindow. The EGLContext is used to match the version and config for the context used by the background TimeWarp thread. This EGLContext, and no other context, can be current on the EGLSurface.

If, however, the application does not explicitly pass in these objects, then vrapi_EnterVrMode() must be called from a thread with an OpenGL ES context current on the Android window surface. The context of the calling thread is then used to match the version and config for the context used by the background TimeWarp thread. The TimeWarp will also hijack the Android window surface from the context that is current on the calling thread. On return, the context from the calling thread will be current on an invisible pbuffer, because the TimeWarp takes ownership of the Android window surface. Note that this requires the config used by the calling thread to have an EGL_SURFACE_TYPE with EGL_PBUFFER_BIT.

Before getting sensor input, the application also needs to know when the images that are going to be synthesized will be displayed, because the sensor input needs to be predicted ahead for that time. As it turns out, it is not trivial to get an accurate predicted display time. Therefore the calculation of this predicted display time is part of the VrApi. An accurate predicted display time can only really be calculated once the rendering loop is up and running and submitting frames regularly. In other words, before getting sensor input, the application needs an accurate predicted display time, which in turn requires the renderer to be up and running. As such, it makes sense that sensor input is not available until vrapi_EnterVrMode() has been called. However, once the application is in VR mode, it can call vrapi_GetPredictedDisplayTime() and vrapi_GetPredictedTracking() at any time from any thread.

How Eye Images are Synchronized

The VrApi allows for one frame of overlap, which is essential on tiled mobile GPUs. Because there is one frame of overlap, the eye images have typically not completed rendering by the time they are submitted to vrapi_SubmitFrame(). To allow the TimeWarp to check whether the eye images have completed rendering, the application can explicitly pass in a sync object (CompletionFence) for each eye image through vrapi_SubmitFrame(). Note that these sync objects must be EGLSyncKHR because the VrApi still supports OpenGL ES 2.0.

If, however, the application does not explicitly pass in sync objects, then vrapi_SubmitFrame() must be called from the thread with the OpenGL ES context that was used for rendering, which allows vrapi_SubmitFrame() to add a sync object to the current context and check if rendering has completed.

Note that even if no OpenGL ES objects are explicitly passed through the VrApi, vrapi_EnterVrMode() and vrapi_SubmitFrame() can still be called from different threads. vrapi_EnterVrMode() needs to be called from a thread with an OpenGL ES context that is current on the Android window surface.
This does not need to be the same context that is also used for rendering. vrapi_SubmitFrame() needs to be called from the thread with the OpenGL ES context that was used to render the eye images. If this is a different context than the context used to enter VR mode, then for stereoscopic rendering this context *never* needs to be current on the Android window surface.

Eye images are passed to vrapi_SubmitFrame() as "texture swap chains" (ovrTextureSwapChain). These texture swap chains are allocated through vrapi_CreateTextureSwapChain(). This is important to allow these textures to be allocated in special system memory. When using a static eye image, the texture swap chain does not need to be buffered and the chain only needs to hold a single texture. When the eye images are dynamically updated, the texture swap chain needs to be buffered. When the texture swap chain is passed to vrapi_SubmitFrame(), the application also passes in the chain index to the most recently updated texture.
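The sketch below ties these pieces together for dynamically updated eye images: it allocates a buffered swap chain per eye, fetches predicted tracking for the frame, and submits the frame through ovrFrameParms. The function and field names follow the VrApi headers shipped with this SDK generation, but the frame-loop scaffolding (frameIndex, layer index 0, the omitted eye rendering) is illustrative rather than prescriptive; verify the details against the VrCubeWorld samples.

#include <stdbool.h>

#include "VrApi.h"
#include "VrApi_Helpers.h"

// Sketch: allocate one buffered texture swap chain per eye at the suggested resolution.
static ovrTextureSwapChain * CreateEyeSwapChain( const ovrJava * java )
{
    const int width  = vrapi_GetSystemPropertyInt( java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_WIDTH );
    const int height = vrapi_GetSystemPropertyInt( java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_HEIGHT );
    // 'true' requests a buffered chain because the eye images are updated every frame.
    return vrapi_CreateTextureSwapChain( VRAPI_TEXTURE_TYPE_2D, VRAPI_TEXTURE_FORMAT_8888,
                                         width, height, 1, true );
}

// Sketch: one iteration of the frame loop after VR mode has been entered.
static void RenderAndSubmitFrame( ovrMobile * ovr, const ovrJava * java,
                                  ovrTextureSwapChain * chains[2], int chainIndex,
                                  long long frameIndex )
{
    // Predict when this frame will be displayed and get the tracking state for that time.
    const double predictedDisplayTime = vrapi_GetPredictedDisplayTime( ovr, frameIndex );
    const ovrTracking2 tracking = vrapi_GetPredictedTracking2( ovr, predictedDisplayTime );

    ovrFrameParms parms = vrapi_DefaultFrameParms( java, VRAPI_FRAME_INIT_DEFAULT,
                                                   vrapi_GetTimeInSeconds(), NULL );
    parms.FrameIndex = frameIndex;

    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        // ... render the scene for this eye into chains[eye] at chainIndex, using
        //     tracking.Eye[eye].ViewMatrix and tracking.Eye[eye].ProjectionMatrix ...

        parms.Layers[0].Textures[eye].ColorTextureSwapChain = chains[eye];
        parms.Layers[0].Textures[eye].TextureSwapChainIndex = chainIndex;
        parms.Layers[0].Textures[eye].TexCoordsFromTanAngles =
            ovrMatrix4f_TanAngleMatrixFromProjection( &tracking.Eye[eye].ProjectionMatrix );
        parms.Layers[0].Textures[eye].HeadPose = tracking.HeadPose;
    }

    // Called from the rendering context, so the VrApi inserts its own completion fence.
    vrapi_SubmitFrame( ovr, &parms );
}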

Frame Timing

It is critical in VR that we never show the user a stale frame.

vrapi_SubmitFrame() controls the synthesis rate through an application-specified ovrFrameParms::MinimumVsyncs. It also determines the point where the calling thread gets released, currently the halfway point of the predicted display refresh cycle. vrapi_SubmitFrame() only returns when both these conditions are met:

- the previous eye images have been consumed by the asynchronous time warp (ATW) thread, and
- at least the specified minimum number of V-syncs have passed since the last call to vrapi_SubmitFrame().

The ATW thread consumes new eye images and updates the V-sync counter halfway through a display refresh cycle. This is the first time ATW can start updating the first eye, covering the first half of the display. As a result, vrapi_SubmitFrame() returns and releases the calling thread at the halfway point of the display refresh cycle.

Once vrapi_SubmitFrame() returns, synthesis has a full display refresh cycle to generate new eye images up to the next midway point. At the next halfway point, the time warp has half a display refresh cycle (up to V-sync) to update the first eye. The time warp then effectively waits for V-sync and then has another half a display refresh cycle (up to the next-next halfway point) to update the second eye. The asynchronous time warp uses a high-priority GPU context and will eat away cycles from synthesis, so synthesis does not have a full display refresh cycle worth of actual GPU cycles. However, the asynchronous time warp tends to be very fast, leaving most of the GPU time for synthesis.

Instead of just using the latest sensor sampling, synthesis uses predicted sensor input for the middle of the time period during which the new eye images will be displayed. This predicted time is calculated using vrapi_GetPredictedDisplayTime(). The number of frames predicted ahead depends on the pipeline depth, the extra latency mode, and the minimum number of V-syncs in between eye image rendering.

Less than half a display refresh cycle before each eye image will be displayed, ATW will get new predicted sensor input using the very latest sensor sampling. ATW then corrects the eye images using this new sensor input. In other words, ATW always corrects the eye images even if the predicted sensor input for synthesis is not perfect. However, the better the prediction for synthesis, the less black will be pulled in at the edges by the asynchronous time warp.

The application can improve the prediction by fetching the latest predicted sensor input right before rendering each eye, and passing a possibly different sensor state for each eye to vrapi_SubmitFrame(). However, it is very important that both eyes use a sensor state that is predicted for the exact same display time, so both eyes can be displayed at the same time without causing intra-frame motion judder. While the predicted orientation can be updated for each eye, the position must remain the same for both eyes, or the position would seem to judder "backwards in time" if a frame is dropped.

Ideally the eye images are only displayed for the MinimumVsyncs display refresh cycles that are centered about the eye image predicted display time. In other words, a set of eye images is first displayed at Predicted Display Time - (MinimumVsyncs / 2) display refresh cycles. The eye images should never be shown before this time because that can cause intra-frame motion judder.
Ideally the eye images are also not shown after Predicted Display Time + (MinimumVsyncs / 2) display refresh cycles, but this may happen if synthesis fails to produce new eye images in time.
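The following is a minimal sketch of the prediction pattern described above, using the VrApi calls named in this section. The frame-counter bookkeeping and the FrameIndex member of ovrFrameParms are assumptions to verify against VrApi.h; see the VrCubeWorld samples for the complete pattern.

    // Sketch only: FrameIndex handling is an assumption; the layer setup is elided.
    static long long FrameIndex = 1;

    void RenderVrFrame( ovrMobile * ovr, const ovrJava * java )
    {
        // Predict for the middle of the period during which the new eye images are displayed.
        const double predictedDisplayTime = vrapi_GetPredictedDisplayTime( ovr, FrameIndex );

        ovrFrameParms frameParms = vrapi_DefaultFrameParms( java, VRAPI_FRAME_INIT_DEFAULT,
                                                            vrapi_GetTimeInSeconds(), NULL );
        frameParms.FrameIndex = FrameIndex;    // assumed field; see VrApi.h

        for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
        {
            // Optionally re-sample the prediction per eye, but always for the same display time.
            const ovrTracking eyeTracking = vrapi_GetPredictedTracking( ovr, predictedDisplayTime );
            frameParms.Layers[0].Textures[eye].HeadPose = eyeTracking.HeadPose;
            // ... render the eye image into its texture swap chain using eyeTracking ...
        }

        vrapi_SubmitFrame( ovr, &frameParms );
        FrameIndex++;
    }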

Latency Examples

MinimumVsyncs = 1, ExtraLatencyMode = off: Expected single-threaded simulation latency is 33 milliseconds; ATW reduces this to 8-16 milliseconds.

MinimumVsyncs = 1, ExtraLatencyMode = on: Expected single-threaded simulation latency is 49 milliseconds; ATW reduces this to 8-16 milliseconds.

MinimumVsyncs = 2, ExtraLatencyMode = off: Expected single-threaded simulation latency is 58 milliseconds; ATW reduces this to 8-16 milliseconds.

MinimumVsyncs = 2, ExtraLatencyMode = on: Expected single-threaded simulation latency is 91 milliseconds; ATW reduces this to 8-16 milliseconds.

VrApi Input API

This document describes using the Samsung Gear VR Controller with the VrApi Input API. The VrApi Input API allows applications linked to VrApi to enumerate and query the state of devices connected to a Gear VR. Once a device is enumerated, its current state can be queried each frame using the input API.

The Input API is defined in VrApi/Src/VrApi_Input.h. For sample usage, see VrSamples/Native/VrController. For a discussion of best practices, see Gear VR Controller Best Practices in Oculus Best Practices.

Input Devices

The Input API supports the Gear VR Headset and the Gear VR Controller. Gear VR Headset controls include the touchpad and Back button short-press.

The Gear VR Controller is an orientation-tracked input device. Gear VR positions the controller relative to the user by using a body model to estimate the controller's position. Left-handedness versus right-handedness is specified by users during controller pairing and is used to visualize the controller on the appropriate side of the user's body in VR.

The Home, Back button long-press, volume up and volume down buttons on the controller and headset are reserved for system use and will not appear in the button state on either input device.

Enumerating Devices

Enumerated devices may be the Gear VR Headset or the Gear VR Controller. In order to find a device, an application should call vrapi_EnumerateInputDevices. This function takes a pointer to the ovrMobile context, an index, and a pointer to an ovrInputCapabilityHeader structure. If a device exists for the specified index, the ovrInputCapabilityHeader's Type and DeviceID members are set upon return.

Once a device is enumerated, its full capabilities can be queried with vrapi_GetInputDeviceCapabilities. This function also takes a pointer to an ovrInputCapabilityHeader structure, but the caller must pass a structure that is appropriate for the ovrControllerType that was returned by vrapi_EnumerateInputDevices. For instance, if vrapi_EnumerateInputDevices returns a Type of ovrControllerType_TrackedRemote when passed an index of 0, then the call to vrapi_GetInputDeviceCapabilities should pass a pointer to the Header field inside of an ovrInputTrackedRemoteCapabilities structure. For example:

    ovrInputCapabilityHeader capsHeader;
    if ( vrapi_EnumerateInputDevices( ovr, 0, &capsHeader ) >= 0 )
    {
        if ( capsHeader.Type == ovrControllerType_TrackedRemote )
        {
            ovrInputTrackedRemoteCapabilities remoteCaps;
            remoteCaps.Header = capsHeader;    // copy the enumerated Type and DeviceID
            if ( vrapi_GetInputDeviceCapabilities( ovr, &remoteCaps.Header ) >= 0 )
            {
                // remote is connected
            }
        }
    }

After successful enumeration, the ovrInputCapabilityHeader structure that was passed to vrapi_EnumerateInputDevices will have its DeviceID field set to the device ID of the enumerated controller. From this point on, the device state can be queried by calling vrapi_GetInputTrackingState as described below.

Device Connection and Disconnection

Devices are considered connected once they are enumerated through vrapi_EnumerateInputDevices, and when vrapi_GetInputTrackingState and vrapi_GetCurrentInputState return valid results. vrapi_EnumerateInputDevices does not do any significant work and may be called each frame to check if a device is present or not.

Querying Device Input State

The state of the controller and the Gear VR Headset can be queried via the vrapi_GetCurrentInputState function. This function takes a device ID and a pointer to an ovrInputStateHeader structure. Before calling it, fill in the header's Type field with the type of device that is associated with the passed device ID. Make sure the structure passed in is not just a header, but the appropriate structure for the device type. For instance, when querying a controller, pass an ovrInputStateTrackedRemote structure with the Header.Type field set to ovrControllerType_TrackedRemote.

    ovrInputStateTrackedRemote remoteState;
    remoteState.Header.Type = ovrControllerType_TrackedRemote;
    if ( vrapi_GetCurrentInputState( ovr, controllerDeviceID, &remoteState.Header ) >= 0 )
    {
        // act on device state returned in remoteState
    }

vrapi_GetCurrentInputState returns the controller's current button and trackpad state.
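Putting enumeration and state queries together, a simple per-frame poll might look like the following sketch. The OnControllerInput() handler is a hypothetical application function, not part of the SDK.

    // Sketch of a per-frame poll using the calls described above.
    void PollInputDevices( ovrMobile * ovr )
    {
        for ( uint32_t deviceIndex = 0; ; deviceIndex++ )
        {
            ovrInputCapabilityHeader capsHeader;
            if ( vrapi_EnumerateInputDevices( ovr, deviceIndex, &capsHeader ) < 0 )
            {
                break;  // no more connected devices
            }
            if ( capsHeader.Type == ovrControllerType_TrackedRemote )
            {
                ovrInputStateTrackedRemote remoteState;
                remoteState.Header.Type = ovrControllerType_TrackedRemote;
                if ( vrapi_GetCurrentInputState( ovr, capsHeader.DeviceID, &remoteState.Header ) >= 0 )
                {
                    OnControllerInput( remoteState );   // hypothetical application handler
                }
            }
        }
    }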

Querying Device Tracking State

To query the orientation tracking state of a device, call vrapi_GetInputTrackingState and pass it a predicted pose time. Passing a predicted pose time of 0 will return the most recently-sampled pose.

    ovrTracking trackingState;
    // predictedDisplayTime comes from vrapi_GetPredictedDisplayTime(), or pass 0 for the latest pose.
    if ( vrapi_GetInputTrackingState( ovr, controllerDeviceID, predictedDisplayTime, &trackingState ) >= 0 )
    {
        // use trackingState.HeadPose.Pose for the controller's orientation and modeled position
    }

VrApi implements an arm model that uses the controller's orientation to synthesize a plausible hand position each frame. The tracking state will return this position in the Position field of the predicted tracking state's HeadPose.Pose member. Controller handedness may be queried using vrapi_GetInputDeviceCapabilities as described in Enumerating Devices above.

Applications that implement their own arm models are free to ignore this position and calculate a position based on the Orientation field that is returned in the predicted tracking state's pose.

Recentering the Controller

Users may experience some orientation drift in the yaw axis, causing the physical controller's orientation to go out of alignment with its VR representation. To synchronize the physical controller's orientation with the VR representation, users should:

1. Point the controller in the direction of the forward axis of their headset, and
2. Press and hold the Home button for one second.

When a recenter occurs, the VrApi arm model is notified and the arm model's shoulders are repositioned to align to the headset's forward vector. This is necessary because the shoulders do not automatically rotate with the head. Applications that implement their own arm models can poll the device input state's RecenterCount field to determine when the controller is recentered. RecenterCount increments only when a recenter is performed. We recommend recentering arm models based on the head pose when this field changes, as in the sketch below.
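A minimal sketch of detecting a recenter by polling RecenterCount, assuming the ovrInputStateTrackedRemote layout described above; the lastRecenterCount bookkeeping and the RecenterMyArmModel() helper are illustrative assumptions, not SDK functions.

    // Sketch only: call once per frame after querying the remote's input state.
    static uint32_t lastRecenterCount = 0;

    void CheckForRecenter( const ovrInputStateTrackedRemote & remoteState, const ovrTracking & headTracking )
    {
        if ( remoteState.RecenterCount != lastRecenterCount )
        {
            lastRecenterCount = remoteState.RecenterCount;
            // Re-align the application's own arm model to the current head pose.
            RecenterMyArmModel( headTracking.HeadPose.Pose.Orientation );  // hypothetical helper
        }
    }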

Headset Emulation

Emulation mode is convenient for applications that have not been rebuilt to use the new controller API. When enabled, Gear VR Controller touch values use the same mapping as headset touch values, and applications cannot distinguish headset inputs from controller inputs.

Headset emulation for the controller can be toggled on or off by calling vrapi_SetRemoteEmulation. It is enabled by default. When emulation is enabled, applications that load a new VrApi with Gear VR Controller support will receive input from the controller through the Android Activity's dispatchKeyEvent and dispatchTouchEvent methods.

New applications and applications that are specifically updated to use the controller should use the VrApi Input API to enumerate the controller and query its state directly. Applications may also want to enumerate the headset and query its state through the same API.

Gear VR Controller Swiping Gestures

For Gear VR Controllers, the user interface of your VR experience should follow these natural scrolling and swiping gestures:

- Swipe up: Pull content upward. Equivalent to scrolling down.
- Swipe down: Pull content downward. Equivalent to scrolling up.
- Swipe left: Pull content left or go to the next item or page.
- Swipe right: Pull content right or go to the previous item or page.

Asynchronous TimeWarp (ATW)

Asynchronous TimeWarp (ATW) transforms stereoscopic images based on the latest head-tracking information to significantly reduce the motion-to-photon delay, reducing latency and judder in VR applications.

Overview

In a basic VR game loop, the following occurs:

1. The software requests your head orientation.
2. The CPU processes the scene for each eye.
3. The GPU renders the scenes.
4. The Oculus Compositor applies distortion and displays the scenes on the headset.

The following shows a basic example of a game loop:

Figure 1: Basic Game Loop

When frame rate is maintained, the experience feels real and is enjoyable. When it doesn't happen in time, the previous frame is shown, which can be disorienting. The following graphic shows an example of judder during the basic game loop:

Figure 2: Basic Game Loop with Judder

When you move your head and the world doesn't keep up, this can be jarring and break immersion. ATW is a technique that shifts the rendered image slightly to adjust for changes in head movement. Although the image is modified, your head does not move much, so the change is slight.

Additionally, to smooth issues with the user's computer, game design or the operating system, ATW can help fix "potholes" or moments when the frame rate unexpectedly drops. The following graphic shows an example of frame drops when ATW is applied:

Figure 3: Game Loop with ATW

At the refresh interval, the Compositor applies TimeWarp to the last rendered frame. As a result, a TimeWarped frame will always be shown to the user, regardless of frame rate. If the frame rate is very bad, flicker will be noticeable at the periphery of the display, but the image will still be stable.

ATW is automatically applied by the Oculus Compositor; you do not need to enable or tune it. Although ATW reduces latency, make sure that your application or experience maintains frame rate.

Discussion

Stereoscopic eye views are rendered to textures, which are then warped onto the display to correct for the distortion caused by wide-angle lenses in the headset.

To reduce the motion-to-photon delay, updated orientation information is retrieved for the headset just before drawing the time warp, and a transformation matrix is calculated that warps eye textures from where they were at the time they were rendered to where they should be at the time they are displayed. Many people are skeptical on first hearing about this, but for attitude changes, the warped pixels are almost exactly correct. A sharp rotation will leave some pixels black at the edges, but this turns out to be minimally distracting.

The time warp is taken a step farther by making it an "interpolated time warp." Because the video is scanned out at a rate of about 120 scan lines a millisecond, scan lines farther to the right have a greater latency than lines to the left. On a sluggish LCD this doesn't really matter, but on a crisp switching OLED, users may feel like the world is subtly stretching or shearing when they turn quickly. This is corrected by predicting the head attitude at the beginning of each eye, a prediction of < 8 milliseconds, and the end of each eye, < 16 milliseconds. These predictions are used to calculate time warp transformations, and the warp is interpolated between these two values for each scan line drawn.

The time warp may be implemented on the GPU by rendering a full screen quad with a fragment program that calculates warped texture coordinates to sample the eye textures. However, for improved performance the time warp renders a uniformly tessellated grid of triangles over the whole screen where the texture coordinates are set up to sample the eye textures. Rendering a grid of triangles with warped texture coordinates basically results in a piecewise linear approximation of the time warp.

If the time warp runs asynchronous to the stereoscopic rendering, then it may also be used to increase the perceived frame rate and to smooth out inconsistent frame rates. By default, the time warp currently runs asynchronously for both native and Unity applications.

TimeWarp Minimum Vsyncs

The TimeWarp MinimumVsyncs parameter default value is 1 for a 60 FPS target. Setting it to 2 will reduce the maximum application frame rate to no more than 30 FPS. The asynchronous TimeWarp thread will continue to render new frames with updated head tracking at 60 FPS, but the application will only have an opportunity to generate 30 new stereo pairs of eye buffers per second. You can set higher values for experimental purposes, but the only sane values for shipping apps are 1 and 2.

There are two cases where you might consider explicitly setting this:

- If your application can't hold 60 FPS most of the time, it might be better to clamp at 30 FPS all the time, rather than have the app smoothness or behavior change unpredictably for the user. In most cases, we believe that simplifying the experience to hold 60 FPS is the correct decision, but there may be exceptions.
- Rendering at 30 application FPS will save a significant amount of power and reduce the thermal load on the device.
Some applications may be able to hit 60 FPS, but run into thermal problems quickly, which can have catastrophic performance implications -- it may be necessary to target 30 FPS if you want to be able to play for extended periods of time. See Power Management for more information regarding thermal throttle mitigation strategies.
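The following sketch shows one way a native app that fills in ovrFrameParms directly could request the 30 FPS cap; the surrounding frame setup is assumed from the examples elsewhere in this guide, and engine integrations (Unity, Unreal) expose their own setting instead.

    // Sketch only: cap the application at 30 FPS while TimeWarp keeps re-projecting at 60 FPS.
    ovrFrameParms frameParms = vrapi_DefaultFrameParms( &java, VRAPI_FRAME_INIT_DEFAULT,
                                                        vrapi_GetTimeInSeconds(), NULL );
    frameParms.MinimumVsyncs = 2;
    // ... fill in layers, poses, and swap chain indices as usual, then:
    vrapi_SubmitFrame( ovr, &frameParms );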

Consequences of not rendering at 60 FPS

These consequences apply whether you have explicitly set MinimumVsyncs or your app is simply going that slow by itself.

If the viewpoint is far away from all geometry, nothing is animating, and the rate of head rotation is low, there will be no visual difference. When any of these conditions are not present, there will be greater or lesser artifacts to balance.

If the head rotation rate is high, black at the edges of the screen will be visibly pulled in by a variable amount depending on how long it has been since an eye buffer was submitted. This still happens at 60 FPS, but because the total time is small and constant from frame to frame, it is almost impossible to notice. At lower frame rates, you can see it snapping at the edges of the screen. There are two mitigations for this:

1. Instead of using either "now" or the time when the frame will start being displayed as the point where the head tracking model is queried, use a time that is at the midpoint of all the frames that the eye buffers will be shown on. This distributes the "unrendered area" on both sides of the screen, rather than piling up on one.
2. Coupled with that, increasing the field of view used for the eye buffers gives it more cushion off the edges to pull from. For native applications, we currently add 10 degrees to the FOV when the frame rate is below 60. If the resolution of the eye buffers is not increased, this effectively lowers the resolution in the center of the screen. There may be value in scaling the FOV dynamically based on the head rotation rates, but you would still see an initial pop at the edges, and changing the FOV continuously results in more visible edge artifacts when mostly stable.

TimeWarp does not currently attempt to compensate for changes in position, only attitude. We don't have real position tracking in mobile yet, but we do use a head / neck model that provides some eye movement based on rotation, and apps that allow the user to navigate around explicitly move the eye origin. These values will not change at all between eye updates, so at 30 eye FPS, TimeWarp would be smoothly updating attitude each frame, but movement would only change every other frame.

Walking straight ahead with nothing really close by works rather better than might be expected, but sidestepping next to a wall makes it fairly obvious. Even just moving your head when very close to objects makes the effect visible. There is no magic solution for this. We do not have the performance headroom on mobile to have TimeWarp do a depth buffer informed reprojection, and doing so would create new visual artifacts in any case. There is a simplified approach that we may adopt that treats the entire scene as a single depth, but work on it is not currently scheduled. It is safe to say that if your application has a significant graphical element nearly stuck to the view, like an FPS weapon, that it is not a candidate for 30 FPS.

Turning your viewpoint with a controller is among the most nauseating things you can do in VR, but some games still require it. When handled entirely by the app, this winds up being like a position change, so a low-frame-rate app would have smooth "rotation" when the user's head was moving, but chunky rotation when they use the controller. To address this, TimeWarp has an "ExternalVelocity" matrix parameter that can allow controller yaw to be smoothly extrapolated on every rendered frame.
We do not currently have a Unity interface for this. In-world animation will be noticeably chunkier at lower frame rates, but in-place doesn't wind up being very distracting. Objects on trajectories are more problematic, because they appear to be stuttering back and forth as they move, when you track them with your head. For many apps, monoscopic rendering may still be a better experience than 30 FPS rendering. The savings are not as large, but it is a clear tradeoff without as many variables.

If you go below 60 FPS, Unity apps may be better off without the multi-threaded renderer, which adds a frame of latency. 30 FPS with GPU pipeline and multi-threaded renderer is getting to be a lot of latency, and while TimeWarp will remove all of it for attitude, position changes, including the head model, will feel very lagged.

Note that this is all bleeding edge, and some of this guidance is speculative.

Power Management

Power management is a crucial consideration for mobile VR development. A current-generation mobile device is amazingly powerful for something that you can stick in your pocket - you can reasonably expect to find four 2.6 GHz CPU cores and a 600 MHz GPU. Fully utilized, they can deliver more performance than an Xbox 360 or PS3 in some cases.

A governor process on the device monitors an internal temperature sensor and tries to take corrective action when the temperature rises above certain levels to prevent malfunctioning or scalding surface temperatures. This corrective action consists of lowering clock rates. If you run hard into the limiter, the temperature will continue climbing even as clock rates are lowered, and CPU clocks may drop all the way down to 300 MHz. The device may even panic under extreme conditions. VR performance will catastrophically drop along the way.

The default clock rate for VR applications is 1.8 GHz on two cores, and 389 MHz on the GPU. If you consistently use most of this, you will eventually run into the thermal governor, even if you have no problem at first. A typical manifestation is poor app performance after ten minutes of good play. If you filter logcat output for "thermal" you will see various notifications of sensor readings and actions being taken. (For more on logcat, see Android Debugging: Logcat.)

A critical difference between mobile and PC/console development is that no optimization is ever wasted. Without power considerations, if you have the frame ready in time, it doesn't matter if you used 90% of the available time or 10%. On mobile, every operation drains the battery and heats the device. Of course, optimization entails effort that comes at the expense of something else, but it is important to note the tradeoff.

Fixed Clock Level API

The Fixed Clock Level API allows the application to set a fixed CPU level and a fixed GPU level. On current devices, the CPU and GPU clock rates are completely fixed to the application-set values until the device temperature reaches the limit, at which point the CPU and GPU clocks will change to the power save levels. This change can be detected (see Power State Notification and Mitigation Strategy below). Some apps may continue operating in a degraded fashion, perhaps by changing to 30 FPS or monoscopic rendering. Other apps may display a warning screen saying that play cannot continue.

The fixed CPU level and fixed GPU level set by the Fixed Clock Level API are abstract quantities, not MHz / GHz, so some effort can be made to make them compatible with future devices. For current hardware, the levels can be 0, 1, 2, or 3 for CPU and GPU. 0 is the slowest and most power efficient; 3 is the fastest and hottest. Typically the difference between the 0 and 3 levels is about a factor of two. Not all clock combinations are valid for all devices. For example, the highest GPU level may not be available for use with the two highest CPU levels.
If an invalid clock level combination is provided, the system will not acknowledge the request and clock settings will go into dynamic mode. VrApi asserts and issues a warning in this case.

Note: Use caution with combinations (2,3) and (3,3) because they are likely to lead quickly to overheating. For most apps, we recommend ensuring that they run well at (2,2).
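As a rough sketch of how a native app might request fixed clock levels: the ovrPerformanceParms member of ovrFrameParms and the vrapi_DefaultPerformanceParms() helper shown below are assumptions to verify against VrApi_Types.h for this SDK version, and VrAppFramework apps normally set the levels through ovrSettings in VrAppInterface::Configure() instead.

    // Sketch only: field and helper names are assumptions; check VrApi_Types.h.
    void SetFixedClockLevels( ovrFrameParms * frameParms )
    {
        ovrPerformanceParms perfParms = vrapi_DefaultPerformanceParms();
        perfParms.CpuLevel = 2;    // abstract levels 0..3, not MHz / GHz
        perfParms.GpuLevel = 2;
        frameParms->PerformanceParms = perfParms;   // takes effect when the frame is submitted
    }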

Power Management and Performance

There are no magic settings in the SDK to fix power consumption - this is critical. The length of time your application will be able to run before running into the thermal limit depends on two factors: how much work your app is doing, and what the clock rates are. Changing the clock rates all the way down only yields about a 25% reduction in power consumption for the same amount of work, so most power saving has to come from doing less work in your app.

If your app can run at the (0,0) setting, it should never have thermal issues. This is still two cores at around 1 GHz and a 240 MHz GPU, so it is certainly possible to make sophisticated applications at that level, but Unity-based applications might be difficult to optimize for this setting.

There are effective tools for reducing the required GPU performance:

- Don't use chromatic aberration correction on TimeWarp.
- Don't use 4x MSAA.
- Reduce the eye target resolution.
- Using 16-bit color and depth buffers may help.

It is probably never a good trade to go below 2x MSAA; you should reduce the eye target resolution instead. These all entail quality tradeoffs which need to be balanced against steps you can take in your application:

- Reduce overdraw (especially blended particles) and complex shaders.
- Always make sure textures are compressed and mipmapped.

In general, CPU load seems to cause more thermal problems than GPU load. Reducing the required CPU performance is much less straightforward. Unity apps should always use the multithreaded renderer option, since two cores running at 1 GHz do work more efficiently than one core running at 2 GHz.

If you find that you just aren't close, then you may need to set MinimumVsyncs to 2 and run your game at 30 FPS, with TimeWarp generating the extra frames. Some things work out okay like this, but some interface styles and scene structures highlight the limitations. For more information on how to set MinimumVsyncs, see the TimeWarp technical note.

In summary, our general advice:

- If you are making an app that will probably be used for long periods of time, like a movie player, pick very low levels. Ideally use (0,0), but it is possible to use more graphics if the CPUs are still mostly idle, perhaps up to (0,2).
- If you are okay with the app being restricted to ten-minute chunks of play, you can choose higher clock levels. If it doesn't work well at (2,2), you probably need to do some serious work.
- With the clock rates fixed, observe the reported FPS and GPU times in logcat. The GPU time reported does not include the time spent resolving the rendering back to main memory from on-chip memory, so it is an underestimate. If the GPU times stay under 12 ms or so, you can probably reduce your GPU clock level. If the GPU times are low, but the frame rate isn't 60 FPS, you are CPU limited.
- Always build optimized versions of the application for distribution. Even if a debug build performs well, it will draw more power and heat up the device more than a release build.
- Optimize until it runs well.

For more information on how to improve your Unity application's performance, see Best Practices and Performance Targets in our Unity documentation.

Power State Notification and Mitigation Strategy

The mobile SDK provides power level state detection and handling. Power level state refers to whether the device is operating at normal clock frequencies or if the device has risen above a thermal threshold and thermal throttling (power save mode) is taking place. In power save mode, CPU and GPU frequencies will be switched to power save levels. The power save levels are equivalent to setting the fixed CPU and GPU clock levels to (0, 0). If the temperature continues to rise, clock frequencies will be set to minimum values which are not capable of supporting VR applications.

Once we detect that thermal throttling is taking place, the app has the choice to either continue operating in a degraded fashion or to immediately exit to the Oculus Menu with a head-tracked error message.

In the first case, when the application first transitions from normal operation to power save mode, the following will occur:

- The Universal Menu will be brought up to display a dismissible warning message indicating that the device needs to cool down.
- Once the message is dismissed, the application will resume in 30Hz TimeWarp mode with correction for chromatic aberration disabled.
- If the device clock frequencies are throttled to minimum levels after continued use, a non-dismissible error message will be shown and the user will have to undock the device.

In this mode, the application may choose to take additional app-specific measures to reduce performance requirements. For native applications, you may use the following AppInterface call to detect if power save mode is active: GetPowerSaveActive(). For Unity, you may use the following plugin call: OVR_IsPowerSaveActive(). See OVR/Script/Util/OVRModeParms.cs for further details.

In the second case, when the application transitions from normal operation to power save mode, the Universal Menu will be brought up to display a non-dismissible error message and the user will have to undock the device to continue. This mode is intended for applications which may not perform well at reduced levels even with 30Hz TimeWarp enabled.

You may use the following calls to enable or disable the power save mode strategy:

- For native apps, set settings.ModeParms.AllowPowerSave in VrAppInterface::Configure() to true for power save mode handling, or false to immediately show the head-tracked error message.
- For Unity, you may enable or disable power save mode handling via OVR_VrModeParms_SetAllowPowerSave(). See OVR/Script/Util/OVRModeParms.cs for further details.
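The following sketch shows how an app might degrade quality when throttling is detected, assuming the GetPowerSaveActive() accessor described above is reachable from the framework's app object; the quality toggles are illustrative application flags, not SDK settings.

    // Sketch only: call once per frame; member names are hypothetical.
    void MyApp::AdjustQualityForThermals()
    {
        if ( app->GetPowerSaveActive() )
        {
            // For example, drop to cheaper rendering while the device cools down.
            UseMonoscopicRendering = true;
            ReduceEyeBufferResolution = true;
        }
    }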

Advanced Rendering

This section describes advanced rendering features available through the SDK.

Multi-View

Overview

With stock OpenGL, stereo rendering is achieved by rendering to the two eye buffers sequentially. This typically doubles the application and driver overhead, despite the fact that the command streams and render states are almost identical.

The GL_OVR_multiview extension addresses the inefficiency of sequential multi-view rendering by adding a means to render to multiple elements of a 2D texture array simultaneously. Using the multi-view extension, draw calls are instanced into each corresponding element of the texture array. The vertex program uses a new ViewID variable to compute per-view values, typically the vertex position and view-dependent variables like reflection.

The formulation of the extension is high level in order to allow implementation freedom. On existing hardware, applications and drivers can realize the benefits of a single scene traversal, even if all GPU work is fully duplicated per-view. But future support could enable simultaneous rendering via multi-GPU, tile-based architectures could sort geometry into tiles for multiple views in a single pass, and the implementation could even choose to interleave at the fragment level for better texture cache utilization and more coherent fragment shader branching.

The most obvious use case in this model is to support two simultaneous views: one view for each eye. However, multi-view can also be used for foveated rendering, where two views are rendered per eye, one with a wide field of view and the other with a narrow one. The nature of wide field of view planar projection is that the sample density can become unacceptably low in the view direction. By rendering two inset eye views per eye, the required sample density is achieved in the center of projection without wasting samples, memory, and time by oversampling in the periphery.

Basic Usage

The GL_OVR_multiview extension is not a turn-key solution that can simply be enabled to support multi-view rendering in an application. An application must explicitly support GL_OVR_multiview to get the benefits. The GL_OVR_multiview extension is used by the application to render the scene, and the VrApi is unaware of its use. The VrApi supports sampling from the layers of a texture array, but is otherwise completely unaware of how the application produced the texture data, whether multi-view rendering was used or not.

However, because of various driver problems, an application is expected to query the VrApi to find out whether or not multi-view is properly supported on a particular combination of device and OS. For example:

    vrapi_GetSystemPropertyInt( &Java, VRAPI_SYS_PROP_MULTIVIEW_AVAILABLE ) == VRAPI_TRUE;

Restructuring VrAppFramework Rendering For Multi-view

The following section describes how to convert your application to be multi-view compliant based on the VrAppFramework multi-view setup. In order to set up your rendering path to be multi-view compliant, your app should specify a list of surfaces and render state back to App Frame(). Immediate GL calls inside the app main render pass are not compatible with multi-view rendering and are not allowed.

The first section below describes how to transition your app from rendering with DrawEyeView and instead return a list of surfaces back to the application framework. The section after that describes multi-view rendering considerations and how to enable it in your app.

Return Surfaces From Frame

Set up the Frame Result: Apps should set up the ovrFrameResult which is returned by Frame with the following steps:

1. Set up the ovrFrameParms - storage for which should be maintained by the application.
2. Set up the FrameMatrices - this includes the CenterEye and the View and Projection matrices for each eye.
3. Generate a list of render surfaces and append them to the frame result Surfaces list.
   a. Note: The surface draw order will be the order of the list, from lowest index (0) to highest index.
   b. Note: Do not free any resources which the surfaces in the list rely on while the Frame render is in flight.
4. Optionally, specify whether to clear the color or depth buffer with clear color.

45 Mobile Native Engine Integration 45 OvrSceneView Example An example using the OvrSceneView library scene matrices and surface generation follows: ovrframeresult OvrApp::Frame( const ovrframeinput & vrframe ) {... // fill in the frameresult info for the frame. ovrframeresult res; // Let scene construct the view and projection matrices needed for the frame. Scene.GetFrameMatrices( vrframe.fovx, vrframe.fovy, res.framematrices ); // Let scene generate the surface list for the frame. Scene.GenerateFrameSurfaceList( res.framematrices, res.surfaces ); // Initialize the FrameParms. FrameParms = vrapi_defaultframeparms( app->getjava(), VRAPI_FRAME_INIT_DEFAULT, vrapi_gettimeinseconds(), NULL ); for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ ) { FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrframe.colortextureswapchain[eye]; FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrframe.depthtextureswapchain[eye]; FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrframe.textureswapchainindex; FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrframe.texcoordsfromtanangles; FrameParms.Layers[0].Textures[eye].HeadPose = vrframe.tracking.headpose; } FrameParms.ExternalVelocity = Scene.GetExternalVelocity(); FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION; res.frameparms = (ovrframeparmsextbase *) & FrameParms; return res; } Custom Rendering Example First, you need to make sure any immediate GL render calls are represented by an ovrsurfacedef. In the DrawEyeView path, custom surface rendering was typically done by issuing immediate GL calls. glactivetexture( GL_TEXTURE0 ); glbindtexture( GL_TEXTURE_2D, BackgroundTexId ); gldisable( GL_DEPTH_TEST ); gldisable( GL_CULL_FACE ); GlProgram & prog = BgTexProgram; gluseprogram( prog.program ); gluniform4f( prog.ucolor, 1.0f, 1.0f, 1.0f, 1.0f ); glbindvertexarray( globegeometry.vertexarrayobject ); gldrawelements( globegeometry.primitivetype, globegeometry.indexcount, globegeometry.indicestype, NULL ); gluseprogram( 0 ); glactivetexture( GL_TEXTURE0 ); glbindtexture( GL_TEXTURE_2D, 0 ); Instead, with the multi-view compliant path, an ovrsurfacedef and GlProgram would be defined at initialization time as follows. static ovrprogramparm BgTexProgParms[] = { { "Texm", ovrprogramparmtype::float_matrix4 }, { "UniformColor", ovrprogramparmtype::float_vector4 }, { "Texture0", ovrprogramparmtype::texture_sampled }, }; BgTexProgram= GlProgram::Build( BgTexVertexShaderSrc, BgTexFragmentShaderSrc, BgTexProgParms, sizeof( BgTexProgParms) / sizeof( ovrprogramparm ) ); GlobeSurfaceDef.surfaceName = "Globe"; GlobeSurfaceDef.geo = BuildGlobe();

46 46 Native Engine Integration Mobile GlobeSurfaceDef.graphicsCommand.Program = BgTexProgram; GlobeSurfaceDef.graphicsCommand.GpuState.depthEnable = false; GlobeSurfaceDef.graphicsCommand.GpuState.cullEnable = false; GlobeSurfaceDef.graphicsCommand.UniformData[0].Data = &BackGroundTexture; GlobeSurfaceDef.graphicsCommand.UniformData[1].Data = &GlobeProgramColor; At Frame time, the uniform values can be updated, changes to the gpustate can be made, and the surface(s) added to the render surface list. Note: This manner of uniform parm setting requires the application to maintain storage for the uniform data. Future SDKs will provide helper functions for setting up uniform parms and materials. An example of setting up FrameResult using custom rendering follows: ovrframeresult OvrApp::Frame( const ovrframeinput & vrframe ) {... // fill in the frameresult info for the frame. ovrframeresult res; // calculate the scene matrices for the frame. res.framematrices.centerview = vrapi_getcentereyeviewmatrix( &app->getheadmodelparms(), &vrframe.tracking, NULL ); for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ ) { res.framematrices.eyeview[eye] = vrapi_geteyeviewmatrix( &app->getheadmodelparms(), &CenterEyeViewMatrix, eye ); res.framematrices.eyeprojection[eye] = ovrmatrix4f_createprojectionfov( vrframe.fovx, vrframe.fovy, 0.0f, 0.0f, 1.0f, 0.0f ); } // Update uniform variables and add needed surfaces to the surface list. BackGroundTexture = GlTexture( BackgroundTexId, 0, 0 ); GlobeProgramColor = Vector4f( 1.0f, 1.0f, 1.0f, 1.0f ); res.surfaces.pushback( ovrdrawsurface( &GlobeSurfaceDef ) ); // Initialize the FrameParms. FrameParms = vrapi_defaultframeparms( app->getjava(), VRAPI_FRAME_INIT_DEFAULT, vrapi_gettimeinseconds(), NULL ); for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ ) { FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrframe.colortextureswapchain[eye]; FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrframe.depthtextureswapchain[eye]; FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrframe.textureswapchainindex; FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrframe.texcoordsfromtanangles; FrameParms.Layers[0].Textures[eye].HeadPose = vrframe.tracking.headpose; } FrameParms.ExternalVelocity = Scene.GetExternalVelocity(); FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION; res.frameparms = (ovrframeparmsextbase *) & FrameParms; return res; } Specify the Render Mode: In your app Configure(), specify the appropriate render mode. To configure the app to render using the surfaces returned by Frame, set the following: settings.rendermode = RENDERMODE_STEREO; Multi-view Render Path Before enabling the multi-view rendering path, you will want to make sure your render data is multi-view compatible. This involves:

Position Calculation

App render programs should no longer specify Mvpm directly and should instead calculate gl_Position using the system-provided TransformVertex() function, which accounts for the correct view and projection matrix for the current ViewID.

Per-Eye View Calculations

Apps will need to take into consideration per-eye-view calculations. Examples follow:

Per-Eye Texture Matrices: In the DrawEyeView path, the texture matrix for the specific eye was bound at the start of each eye render pass. For multi-view, an array of texture matrices indexed by VIEW_ID should be used. Note: Due to a driver issue with the Adreno 420 and version 300 programs, a uniform array of matrices should be contained inside a uniform buffer object.

Stereo Images: In the DrawEyeView path, the image specific to the eye was bound at the start of each eye render pass. For multi-view, while an array of textures indexed by VIEW_ID would be preferable, not all supported platforms support texture arrays. Instead, specify both textures in the fragment shader, with the selection determined by VIEW_ID.

External Image Usage

Applications which make use of image_external, i.e. video rendering applications, must take care when constructing image_external shader programs. Not all drivers support image_external as version 300. The good news is that drivers which fully support multi-view will support image_external in the version 300 path, which means image_external programs will work correctly when the multi-view path is enabled. However, for drivers which do not fully support multi-view, these shaders will be compiled as version 100. These shaders must continue to work in both paths, i.e., version-300-only constructs should not be used and the additional extension specification requirements, listed above, should be met. For some cases, the cleanest solution may be to only use image_external during Frame to copy the contents of the external image to a regular texture2d, which is then used in the main app render pass (which could eat into the multi-view performance savings).

Enable Multi-view Rendering

Finally, to enable the multi-view rendering path, set the render mode in your app Configure() to the following:

    settings.RenderMode = RENDERMODE_MULTIVIEW;

Native Application Framework

The VrAppFramework provides a wrapper around the Android activity that manages the Android lifecycle. We have provided the simple sample scene VrCubeWorld_Framework to illustrate using the application framework with GuiSys, OVR_Locale, and SoundEffectContext. Please see Native Samples on page 25 for details.

Creating New Apps with the Framework Template

This section will get you started with creating new native applications using VrAppFramework.

Template Project Using the Application Framework

VrTemplate is the best starting place for creating your own mobile app using VrAppFramework. The VrTemplate project is set up for exploratory work and as a model for setting up similar native applications using the Application Framework. We include the Python script make_new_project.py (for Mac OS X) and make_new_project.bat (with wrapper for Windows) to simplify renaming the project name set in the template.

Usage Example: Windows

To create your own mobile app based on VrTemplate, perform the following steps:

1. Run <install path>\VrSamples\Native\VrTemplate\make_new_project.bat, passing the name of your new app and your company as arguments. For example: make_new_project.bat VrTestApp YourCompanyName
2. Your new project will now be located in <install path>\VrSamples\Native\VrTestApp. The package name will be set to com.yourcompanyname.vrtestapp.
3. Copy your oculussig file to the Project/assets/ folder of your project. (See Application Signing for more information.)
4. Navigate to your new project directory. With your Android device connected, execute the build.bat located inside your test app directory to verify everything is working.
5. build.bat should build your code, install it to the device, and launch your app on the device. One parameter controls the build type:
   - clean - cleans the project's build files
   - debug - builds a debug application
   - release - builds a release application
   - -n - skips installation of the application to the phone

The Java file VrSamples/Native/VrTestApp/java/com/YourCompanyName/vrtestapp/MainActivity.java handles loading the native library that was linked against VrApi, then calls nativeSetAppInterface() to allow the C++ code to register the subclass of VrAppInterface that defines the application. See VrAppFramework/Include/App.h for comments on the interface.

The standard Oculus convenience classes for string, vector, matrix, array, et cetera are available in the Oculus LibOVRKernel, located at LibOVRKernel\Src\. You will also find convenience code for OpenGL ES texture, program, geometry, and scene processing that is used by the demos.

UI and Input Handling

This guide describes resources for handling UI and input for native apps using VrAppFramework.

Native User Interface

Applications using the application framework have access to the VrGUI interface code. The VrGUI system is contained in VrAppSupport/VrGui/Src. Menus are represented as a generic hierarchy of menu objects. Each object has a local transform relative to its parent and local bounds.

VrGUI may be used to implement menus in a native application, such as the Folder Browser control used in the Oculus 360 Photos SDK and Oculus 360 Videos SDK (found in FolderBrowser.cpp).

VrGUI allows for functional improvements by implementing a component model. When a menu object is created, any number of components can be specified. These components derive from a common base class, defined in VRMenuComponent.h, that handles events sent from the parent VRMenu. Components can be written to handle any event listed in VRMenuEvent.h. Events are propagated to child objects either by broadcasting to all children or by dispatching them along the path from the menu's root to the currently focused object. When handling an event, a component can consume it by returning MSG_STATUS_CONSUMED, which will halt further propagation of that event instance. See VRMenuEventHandler.cpp for implementation details. Examples of reusable components can be found in the native UI code, including DefaultComponent.h, ActionComponents.h and ScrollBarComponent.h.

Input Handling

Input to the application is intercepted in the Java code in VrActivity.java in the dispatchKeyEvent() method. If the event is NOT of type ACTION_DOWN or ACTION_UP, the event is passed to the default dispatchKeyEvent() handler. If this is a volume up or down action, it is handled in Java code. Otherwise the key is passed to the buttonEvent() method, which passes the event to nativeKeyEvent(). nativeKeyEvent() posts the message to an event queue, which is then handled by the AppLocal::Command() method.

In previous versions of the SDK, AppLocal::Command() called AppLocal::KeyEvent() with the event parameters. However, this was later changed to buffer the events into the InputEvents array. Up to 16 events can be buffered this way, per frame. Each time the VrThreadFunction executes a frame loop, these buffered events are passed into VrFrameBuilder::AdvanceVrFrame(), which composes the VrFrame structure for that frame. The InputEvents array is then cleared. After composition, the current VrFrame is passed to FrameworkInputProcessing(), which iterates over the input event list and passes individual events to the native application via VrAppInterface::OnKeyEvent().

Native applications using the VrAppFramework should overload this method in their implementation of VrAppInterface to receive key events. The application is responsible for sending events to the GUI or other systems from its overloaded VrAppInterface::OnKeyEvent(), and returning true if the event is consumed at any point. If OnKeyEvent() returns true, VrAppFramework assumes the application consumed the event and will not act upon it. If the application passes an input event to the VrGUI System, any system menus or menus created by the application have a chance to consume it in their OnEvent_Impl implementation by returning MSG_STATUS_CONSUMED, or pass it to other menus or systems by returning MSG_STATUS_ALIVE.
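A minimal sketch of the overload described above follows; the exact OnKeyEvent() signature, the KEY_EVENT_* and OVR_KEY_* names, and the GuiSys member should all be verified against App.h and the input headers in your SDK version.

    // Sketch only: signature and enum names are assumptions to check against App.h.
    bool MyAppInterface::OnKeyEvent( const int keyCode, const int repeatCount, const KeyEventType eventType )
    {
        // Give the GUI the first chance to consume the event.
        if ( GuiSys->OnKeyEvent( keyCode, repeatCount, eventType ) )
        {
            return true;    // consumed -- the framework will not act on it
        }
        if ( keyCode == OVR_KEY_A && eventType == KEY_EVENT_DOWN )   // hypothetical binding
        {
            // ... application-specific handling ...
            return true;
        }
        return false;       // let the framework handle it (for example, back key behavior)
    }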

Native SoundEffectContext

Use SoundEffectContext to easily play sound effects and replace sound assets without recompilation.

SoundEffectContext consists of a simple sound asset management class, SoundAssetMapping, and a sound pool class, SoundPool. The SoundAssetMapping is controlled by a JSON file in which sounds are mapped as key-value pairs, where a value is the actual path to the .wav file. For example:

    "sv_touch_active" : "sv_touch_active.wav"

In code, we use the key to play the sound, which SoundEffectContext then resolves to the actual asset. For example:

    soundEffectContext->Play( "sv_touch_active" );

The string sv_touch_active is first passed to SoundEffectContext, which resolves it to an absolute path, as long as the corresponding key was found during initialization. The following two paths specify whether the sound file is in the res/raw folder of VrAppFramework (e.g., sounds that may be played from any app, such as default sounds or Universal Menu sounds), or the assets folder of a specific app:

    res/raw/sv_touch_active.wav
    assets/sv_touch_active.wav

If SoundEffectContext fails to resolve the passed-in string within the SoundEffectContext::Play function, the string is passed to SoundPooler.play in Java. In SoundPooler.play, we first try to play the passed-in sound from res/raw, and if that fails, from the current assets folder. If that also fails, we attempt to play it as an absolute path. The latter allows for sounds to be played from the phone's internal memory or SD card.

The JSON file loaded by SoundAssetMapping determines which assets are used with the following scheme:

1. Try to load sound_assets.json in the Oculus folder on the sdcard: sdcard/Oculus/sound_assets.json
2. If we fail to find the above file, we load the following two files in this order: res/raw/sound_assets.json, then assets/sound_assets.json

The loading of sound_assets.json in the first case allows for a definition file and sound assets to be placed on the SD card in the Oculus folder during sound development. The sounds may be placed into folders if desired, as long as the relative path is included in the definition. For example, if we define the following in sdcard/Oculus/sound_assets.json:

    "sv_touch_active" : "SoundDev/my_new_sound.wav"

we would replace all instances of that sound being played with our new sound within the SoundDev folder.

The loading of the two asset definition files in the second step allows for overriding the framework's built-in sound definitions, including disabling sounds by redefining their asset as the empty string. For example:

    "sv_touch_active" : ""

The above key-value pair, if defined in an app's sound_assets.json (placed in its asset folder), will disable that sound completely, even though it is still played by other VrAppSupport code such as VrGUI.

You will find the sound effect source code in VrAppSupport/VrSound/.

Runtime Threads

The UI thread is the launch thread that runs the normal Java code.

The VR thread is spawned by the UI thread and is responsible for the initialization, the regular frame updates, and for drawing the eye buffers. All of the AppInterface functions are called on the VR thread. You should put any heavyweight simulation code in another thread, so this one basically just does drawing code and simple frame housekeeping. Currently this thread can be set to the real-time SCHED_FIFO mode to get more deterministic scheduling, but the time spent in this thread may have to be limited.

Non-trivial applications should create additional threads -- for example, music player apps run the decode and analysis in threads, app launchers load JPG image tiles in threads, et cetera. Whenever possible, do not block the VR thread on any other thread. It is better to have the VR thread at least update the view with new head tracking, even if the world simulation hasn't finished a time step.

The Talk To Java (TTJ) thread is used by the VR thread to issue Java calls that aren't guaranteed to return almost immediately, such as playing sound pool sounds or rendering a toast dialog to a texture.

Sensors have their own thread so they can be updated at 500 Hz.

Other Native Libraries

This guide describes other native libraries included with the Mobile SDK.

Overview

Additional supplemental libraries included with this SDK include:

- VrAppSupport
- LibOVRKernel

VrAppSupport

The VrGui library contains functionality for a fully 3D UI implemented as a scene graph. Each object in the scene graph can be functionally extended using components. This library has dependencies on VrAppFramework and must be used in conjunction with it. See the CinemaSDK, Oculus360PhotosSDK and Oculus360Videos SDKs for examples.

The VrLocale library is a wrapper for accessing localized string tables. The wrapper allows for custom string tables to be used seamlessly with Android's own string tables. This is useful when localized content is not embedded in the application package itself. This library depends on VrAppFramework.

The VrModel library implements functions for loading 3D models in the .ovrscene format. These models can be exported from .fbx files using the FbxConverter utility (see FBXConverter). This library depends on rendering functionality included in VrAppFramework.

The VrSound library implements a simple wrapper for playing sounds using the android.media.SoundPool class. This library does not provide low latency or 3D positional audio and is only suitable for playing simple UI sound effects.

LibOVRKernel

LibOVRKernel is a reduced set of libraries consisting primarily of low-level functions, containers, and mathematical operations. It is an integral part of VrAppFramework, but may generally be disregarded by developers using VrApi.

Media and Assets

This guide describes how to work with still images, videos, and other media for use with mobile VR applications.

Mobile VR Media Overview

Author all media, such as panoramas and movies, at the highest-possible resolution and quality, so they can be resampled to different resolutions in the future. This topic entails many caveats and tradeoffs.

Panoramic Stills

Use 4096x2048 equirectangular projection panoramas for both games and 360 photos. 1024x1024 cube maps are for games, and 1536x1536 cube maps are for viewing in 360 Photos with the overlay code.

Panoramic Videos

The Qualcomm H.264 video decoder is spec driven by the ability to decode 4k video at 30 FPS, but it appears to have some headroom above that. The pixel rate can be flexibly divided between resolution and frame rate, so you can play a 3840x FPS or a 2048x FPS. The Android software layer appears to have an arbitrary limit of 2048 rows on video decode, so you may not choose to encode, say, a 4096x FPS.

The compressed bit rate does not appear to affect the decoding rate; panoramic videos at 60 Mb/s decode without problems, but most scenes should be acceptable at 20 Mb/s or so.

The conservative specs for panoramic video are: 60 FPS or 60 FPS stereo. If the camera is stationary, 30 FPS video may be considered.

The GALAXY S7 decoder allows 4K at 10-bit color, 60 FPS (HEVC) or 8-bit, 60 FPS (VP9). The GALAXY S8 decoder allows 4k at 60 FPS.

Oculus 360 Video is implemented using equirectangular mapping to render panoramic videos. Top-bottom, bottom-top, left-right, and right-left stereoscopic video support is implemented using the following naming convention for videos:

- _TB.mp4: Top / bottom stereoscopic panoramic video
- _BT.mp4: Bottom / top stereoscopic panoramic video
- _LR.mp4: Left / right stereoscopic panoramic video
- _RL.mp4: Right / left stereoscopic panoramic video
- Default: Non-stereoscopic video if width does not match height; otherwise loaded as top / bottom stereoscopic video
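As an illustration of the naming convention above, a player application could classify a file with a small helper like the following; the enum and function names are hypothetical and not part of the SDK.

    // Hypothetical helper illustrating the stereo-layout naming convention above.
    #include <string>
    #include <cstring>

    enum VideoLayout { LAYOUT_MONO, LAYOUT_TOP_BOTTOM, LAYOUT_BOTTOM_TOP, LAYOUT_LEFT_RIGHT, LAYOUT_RIGHT_LEFT };

    static bool HasSuffix( const std::string & name, const char * suffix )
    {
        const size_t n = strlen( suffix );
        return name.size() >= n && name.compare( name.size() - n, n, suffix ) == 0;
    }

    VideoLayout LayoutFromFileName( const std::string & fileName, int width, int height )
    {
        if ( HasSuffix( fileName, "_TB.mp4" ) ) { return LAYOUT_TOP_BOTTOM; }
        if ( HasSuffix( fileName, "_BT.mp4" ) ) { return LAYOUT_BOTTOM_TOP; }
        if ( HasSuffix( fileName, "_LR.mp4" ) ) { return LAYOUT_LEFT_RIGHT; }
        if ( HasSuffix( fileName, "_RL.mp4" ) ) { return LAYOUT_RIGHT_LEFT; }
        // Default: square videos are treated as top / bottom stereo, others as mono.
        return ( width == height ) ? LAYOUT_TOP_BOTTOM : LAYOUT_MONO;
    }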

Movies on Screens

Comfortable viewing size for a screen is usually less than 70 degrees of horizontal field of view, which allows the full screen to be viewed without turning your head significantly.

For video playing on a surface in a virtual world using the recommended 1024x1024 eye buffers, anything over 720x480 DVD resolution is wasted, and if you don't explicitly build mipmaps for it, it will alias and look worse than a lower resolution video.

With the TimeWarp overlay plane code running in Oculus Cinema on the 1440 devices, 1280x720 HD resolution is a decent choice. The precise optimum depends on seating position and may be a bit lower, but everyone understands 720P, so it is probably best to stick with that. Use more bit rate than a typical web stream at that resolution, as the pixels will be magnified so much. The optimal bit rate is content dependent, and many videos can get by with less, but 5 Mb/s should give good quality.

1080P movies play, but the additional resolution is wasted and power consumption is needlessly increased.

3D movies should be encoded full side-by-side with a 1:1 pixel aspect ratio. Content mastered at 1920x1080 compressed side-by-side 3D should be resampled to 1920x540 full side-by-side resolution.

Movie Meta-data

When loading a movie from the sdcard, Oculus Cinema looks for a sidecar file with metadata. The sidecar file is simply a UTF8 text file with the same filename as the movie, but with the extension .txt. It contains the title, format (2D/3D), and category.

    {
        "title": "Introducing Henry",
        "format": "2D",
        "category": "trailers",
        "theater": ""
    }

Title is the name of the movie. Oculus Cinema will use this value instead of the filename to display the movie title.

Format describes how the film is formatted. If left blank, it will default to 2D (unless the movie has 3D in its pathname). Format may be one of the following values:

- 2D: Full screen 2D movie
- 3D: 3D movie with left and right images formatted side-by-side
- 3DLR: 3D movie with left and right images formatted side-by-side
- 3DLRF: 3D movie with left and right images formatted side-by-side full screen (for movies that render too small in 3DLR)
- 3DTB: 3D movie with left and right images formatted top-and-bottom
- 3DTBF: 3D movie with left and right images formatted top-and-bottom full screen (for movies that render too small in 3DTB)

Category can be one of the following values:

- Blank: Movie accessible from My Videos tab in Oculus Cinema

- Trailers: Movie accessible from Trailers tab in Oculus Cinema
- Multiscreen: Movie accessible from Multiscreen tab in Oculus Cinema

Oculus 360 Photos and Videos Meta-data

The retail version of 360 Photos stores all its media attribution information in a meta file that is packaged into the apk. This allows the categories to be dynamically created and/or altered without the need to modify the media contents directly. For the SDK 360 Photos, the meta file is generated automatically using the contents found in Oculus/360Photos.

The meta data has the following structure in a meta.json file which is read in from the assets folder:

    {
        "Categories": [
            { "name": "Category1" },
            { "name": "Category2" }
        ],
        "Data": [
            {
                "title": "Media title",
                "author": "Media author",
                "url": "relative/path/to/media",
                "tags": [
                    { "category": "Category2" }
                ]
            },
            {
                "title": "Media title 2",
                "author": "Media author 2",
                "url": "relative/path/to/media2",
                "tags": [
                    { "category": "Category" },
                    { "category": "Category2" }
                ]
            }
        ]
    }

For both the retail and SDK versions of 360 Videos, the meta data structure is not used and instead the categories are generated based on what's read in from the media found in Oculus/360Videos.

Media Locations

The SDK comes with three applications for viewing stills and movies. The Oculus Cinema application can play both regular 2D movies and 3D movies. The Oculus 360 Photos application can display 360 degree panoramic stills and the Oculus 360 Videos application can display 360 degree panoramic videos. These applications have the ability to automatically search for media in specific folders on the device. Oculus 360 Photos uses metadata which contains the organization of the photos it loads in addition to allowing the data to be dynamically tagged and saved out to its persistent cache. This is how the Favorite feature works, allowing the user to mark photos as favorites without any change to the media storage itself. The following table indicates where to place additional media for these applications.

2D Movies
    Folders: Movies\; DCIM\; Oculus\Movies\My Videos
    Application: Oculus Cinema

3D Movies
    Folders: Movies\3D; DCIM\3D; Oculus\Movies\My Videos\3D
    Application: Oculus Cinema

360 degree panoramic stills
    Folders: Oculus\360Photos (in the app assets\meta.json)
    Application: Oculus 360 Photos

360 degree panoramic videos
    Folders: Oculus\360Videos
    Application: Oculus 360 Videos

Movie Theater.ovrscene
    Folders: Oculus\Cinema\Theaters
    Application: Oculus Cinema

Media                          Folders                                              Application
2D Movies                      Movies\ ; DCIM\ ; Oculus\Movies\My Videos            Oculus Cinema
3D Movies                      Movies\3D ; DCIM\3D ; Oculus\Movies\My Videos\3D     Oculus Cinema
360 degree panoramic stills    Oculus\360Photos (in the app assets\meta.json)       Oculus 360 Photos
360 degree panoramic videos    Oculus\360Videos                                     Oculus 360 Videos
Movie theater .ovrscene        Oculus\Cinema\Theaters                               Oculus Cinema

Native VR Media Applications

Oculus Cinema

Oculus Cinema uses the Android MediaPlayer class to play videos, both conventional (from /sdcard/movies/ and /sdcard/dcim/) and side-by-side 3D (from /sdcard/movies/3d and /sdcard/dcim/3d), in a virtual movie theater scene (from /sdcard/oculus/cinema/theaters). See Mobile VR Media Overview for more details on supported image and movie formats.

Before entering a theater, Oculus Cinema allows the user to select different movies and theaters.

New theaters can be created in Autodesk 3DS Max, Maya, or Luxology MODO, then saved as one or more Autodesk FBX files, and converted using the FBX converter that is included with this SDK. See the FBX Converter guide for more details on how to create and convert new theaters.

The FBX converter is launched with a variety of command-line options to compile these theaters into models that can be loaded in the Oculus Cinema application. To avoid having to re-type all the command-line options, it is common practice to use a batch file that launches the FBX converter with all the command-line options. This package includes two such batch files, one for each example theater:

SourceAssets/scenes/cinema.bat
SourceAssets/scenes/home_theater.bat

Each batch file converts one of the FBX files with its associated textures into a model which can be loaded by the Oculus Cinema application. Each batch file also automatically pushes the converted FBX model to the device and launches the Oculus Cinema application with the theater.

The FBX file for a theater should include several specially named meshes. One of the meshes should be named screen. This mesh is the surface onto which the movies will be projected. Read the FBX Converter documentation to learn more about tags.

Up to 8 seats can be set up by creating up to 8 tags named cameraposX, where X is in the range [1, 8].

A theater is typically one big mesh with two textures: one texture with baked static lighting for when the theater lights are on, and another texture that is modulated based on the movie when the theater lights are off. The lights are gradually turned on or off by blending between these two textures. To save battery, the theater is rendered with only one texture when the lights are completely on or completely off. The texture with baked static lighting is specified in the FBX as the diffuse color texture. The texture that is modulated based on the movie is specified as the emissive color texture.

The two textures are typically 4096x4096 with 4 bits/texel in ETC2 format. Using larger textures may not work on all devices. Using multiple smaller textures results in more draw calls and may not allow all geometry to be statically sorted to reduce the cost of overdraw. The theater geometry is statically sorted to guarantee front-to-back rendering on a per-triangle basis, which can significantly reduce the cost of overdraw. Read the FBX Converter guide to learn about optimizing the geometry for the best rendering performance.

In addition to the mesh and textures, Oculus Cinema currently requires a 350x280 icon for the theater selection menu. This is included in the scene with a command-line parameter since it is not referenced by any geometry, or it can be loaded as a .png file with the same filename as the ovrscene file.

Oculus 360 Photos

Oculus 360 Photos is a viewer for panoramic stills. The SDK version of the application presents a single category of panorama thumbnail panels which are loaded in from Oculus/360Photos on the SDK sdcard. Gazing towards the panels and then swiping forward or back on the touchpad will scroll through the content. When viewing a panorama still, touch the touchpad again to bring up the panorama menu, which displays the attribution information if properly set up. Additionally, the top button or tapping the back button on the touchpad will bring back the thumbnail view. The bottom button will tag the current panorama as a Favorite, which adds a new category at the top of the thumbnail views with the panorama you tagged. Pressing the Favorite button again will untag the photo and remove it from Favorites. Gamepad navigation and selection is implemented via the left stick and d-pad used to navigate the menu; the single-dot button selects and the 2-dot button backs out a menu. See Mobile VR Media Overview for details on creating custom attribution information for panoramas.

Oculus 360 Videos

Oculus 360 Videos works similarly to 360 Photos, as they share the same menu functionality. The application also presents a thumbnail panel of the movies read in from Oculus/360Videos, which can be gaze-selected to play. Touch the touchpad to pause the movie and bring up a menu. The top button will stop the movie and bring up the movie selection menu. The bottom button restarts the movie. Gamepad navigation and selection is implemented via the left stick and d-pad used to navigate the menu; the single-dot button selects and the 2-dot button backs out a menu. See Mobile VR Media Overview for details on the supported image and movie formats.

VrScene

By default VrScene loads the Tuscany scene from the Oculus demos, which can be navigated using a gamepad.
However, VrScene accepts Android Intents to view different .ovrscene files, so it can also be used as a generic scene viewer during development. New scenes can be created in Autodesk 3DS Max, Maya, or Luxology MODO, then saved as one or more Autodesk FBX files, and converted using the FBX converter that is included with this SDK. See the FBX Converter document for more details on creating and converting new FBX scenes.
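As a quick way to test media in these applications, files can be copied to the folders listed in the Media Locations table with adb. A minimal sketch (the filenames here are hypothetical):

adb push MyMovie.mp4 /sdcard/Movies/
adb push MyPanorama.jpg /sdcard/Oculus/360Photos/
adb push My360Video.mp4 /sdcard/Oculus/360Videos/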

Models

This section describes creating custom theater environments for Oculus Cinema and using the Oculus FBX Converter tool.

Oculus Cinema Theater Creation

This section describes how to create and compile a movie theater FBX for use with Oculus Cinema.

Figure 4: The cinema scene in MODO and its item names

Steps to Create a Theater:

Create a theater mesh.
Create a screen mesh.
Create two theater textures (one with the lights on and one with the lights off).
Create meshes that define the camera/view positions.
Create meshes that use an additive material (optional, for dim lights).
Run the FBX Converter tool.

Detailed Instructions

1. Create a theater mesh

Nothing special here. Just create a theater mesh, but here are some tips for how to optimize the mesh.

As a rule, keep the polycount as low as physically possible. Rendering is a taxing operation, and savings in this area will benefit your entire project. As a point of reference, the "cinema" theater mesh has a polycount of 42,000 triangles. Review the Performance Guidelines for detailed information about per-scene polycount targets.

Polycount optimization can be executed in a variety of ways. For example, in our "cinema" theater in Oculus Cinema, all the chairs near player positions are high poly while chairs in the distance are low poly. It is important to identify and fix any visual faceting caused by this optimization by placing cameras at defined seating positions in your modeling tool. If certain aspects of the scene look low poly, then add new vertices in targeted areas of those silhouettes to give the impression that everything in the scene is higher poly than it actually is. This process takes a bit of effort, but it is definitely a big win in terms of visual quality and performance. You may also consider deleting any polys that never face the user when a fixed camera position is utilized. Developing a script in your modeling package should help make quick work of this process. For our cinema example case, this process reduced the polycount by 40%.

Figure 5: Poly optimizations of chairs in the cinema mesh.

If you've optimized your mesh and still land above the recommended limits, you may be able to push more polys by cutting the mesh into 9 individual meshes arranged in a 3x3 grid. If you can only watch movies from the center grid cell, hopefully some of the meshes for the other grid cells will not draw when they are out of your current view. Mesh visibility is determined by a bounding box; if you can see the bounding box of a mesh, then you are drawing every single one of its triangles. The figure below illustrates where cut points could be placed if the theater mesh requires this optimization. Note that the cut at the back of the theater is right behind the last camera position. This keeps triangles behind this cut point from drawing while the user is looking forward.

Figure 6: Mesh cuts optimized for camera positions.

Another optimization that will help rendering performance is polygon draw order. The best way to get your polygons sorted is to move your theater geometry such that the average of all the camera positions is near 0,0,0, then utilize the -sort origin command-line option of the FBX Converter when compiling this mesh (see Step 6 for more information).

Material: The more materials you apply to your mesh, the more you slow down the renderer. Keep this number low. We apply one material per scene, and it is basically one 4k texture (well, it is actually two, because you have to have one for when the lights are on and one for when the lights are off -- this is covered in Step 3 below).

Textures: You will need to add the two textures to the material that show the theater with the lights on and with the lights off. To do that, add the texture with the lights on and set its mode to diffuse color (in MODO), and set the mode of the texture with the lights off to luminous color.

2. Create a screen mesh

Mesh: Create a quad and have its UVs use the full 0-1 space, or it won't be drawing the whole movie screen.

Mesh Name: Name the mesh "screen".

Material: Apply a material to it called "screen". You may add a tiny 2x2 image called "screen.png" to the material, but it is not absolutely necessary.

3. Create two theater textures (one with the lights on and one with the lights off)

Create a CG scene in MODO that is lit as if the lights were turned on. This scene is then baked to a texture. Next, turn off all room lights in the MODO scene and add one area light coming from the screen. This scene is baked out as the second texture. In the cinema that is included with the SDK, the material is named "cinema". The lights-on image is named "cinema_a.png" and the lights-off image is named "cinema_b.png".

Figure 7: The cinema_a.png and cinema_b.png scene textures for the "cinema" theater.

4. Create meshes that define the camera/view positions

A camera/view position for one of the possible seats in the theater is defined by creating a mesh that consists of a single triangle with a 90 degree corner. The vert at the 90 degree corner is used for the camera position. Name the mesh camerapos1.

When you process the scene using the FBX Converter, use the -tag command-line parameter to specify which meshes are used to create the camera positions, and use the -remove parameter to remove them from the scene so that they are not drawn. For example, when converting cinema.fbx, we use -tag screen camerapos1 camerapos2 camerapos3 -remove camerapos1 camerapos2 camerapos3 on the command line to convert the camera position meshes to tags and remove them.

Max number of seating positions: The current Cinema application supports up to 8 camera positions.

Figure 8: Define camera positions by creating right-triangle meshes, placing the 90-degree vertex at the camera position.

5. Create meshes that use an additive material (optional, for when the lights are off)

Different real-world movie theaters leave different lights on while movies are playing. They may dimly light stair lights, walkway lights, or wall lights. To recreate these lights in the virtual theater, bake them to some polys that are drawn additively.

To make an additive material: create a material and append _additive to the material name, then add an image and assign it to luminous color.

Figure 9: The cinema_additive_b.png lighting texture for the "cinema" theater.

6. Run the FBX Converter tool

Note: For more information on the FBX Converter tool, see FBX Converter.

To avoid retyping all the FBX Converter command-line options, it is common practice to create a batch file that launches the FBX Converter. For the example cinema, the batch file is placed in the folder above where the FBX file is located:

\OculusSDK\Mobile\Main\SourceAssets\scenes\cinema\cinema.fbx
\OculusSDK\Mobile\Main\SourceAssets\scenes\cinema.bat

The cinema.bat batch file contains the following:

FbxConvertx64.exe -o cinema -pack -cinema -stripmodonumbers -rotate 90 -scale 0.01 -flipv -attrib position uv0 -sort origin -tag screen camerapos1 camerapos2 camerapos3 -remove camerapos1 camerapos2 camerapos3 -render cinema\cinema.fbx -raytrace screen -include cinema\icon.png

Note: This is a single line in the batch file that we've wrapped here for the sake of clarity.

Here is an explanation of the different command-line options used to compile the cinema.

FbxConvertx64.exe
The FBX Converter executable.

-o cinema
The -o option specifies the name of the .ovrscene file.

-pack
Makes the FBX Converter automatically run the cinema_pack.bat batch file that packages everything into the .ovrscene file.

-cinema
The .ovrscene file is automatically loaded into Cinema instead of VrScene when the device is connected.

-stripmodonumbers
MODO often appends {2} to item names because it does not allow any duplicate names. This option strips those numbers.

-rotate 90
Rotates the whole theater 90 degrees about Y because the cinema was built to look down +Z, while Cinema looks down +X.

-scale 0.01
Scales the theater by a factor of 100 (when MODO saves an FBX file, it converts meters to centimeters).

-flipv
Flips the textures vertically, because they are flipped when they are compressed.

-attrib position uv0
Makes the FBX Converter remove all vertex attributes except for the position and first texture coordinate.

-sort origin
Sorts all triangles front-to-back from 0,0,0 to improve rendering performance.

-tag screen camerapos1 camerapos2 camerapos3
Creates tags for the screen and view positions in the theater. These tags are used by Cinema.

-remove camerapos1 camerapos2 camerapos3
Keeps the view position meshes from being rendered.

-render cinema\cinema.fbx
Specifies the FBX file to compile.

-raytrace screen
Allows gaze selection on the theater screen.

-include cinema\icon.png
Includes an icon for the theater in the .ovrscene file.

These are most of the options you'll need for FbxConvert.exe. Additional options exist, but they are not typically needed to create a theater. For a complete list of options and more details on how to create and convert FBX files, see FBX Converter.

An example of another command-line option is -discrete <mesh1> [<mesh2>...]. Use this when you cut your mesh into chunks to reduce the number of polys being drawn when any of these meshes are offscreen. By default, the FBX Converter optimizes the geometry for rendering by merging the geometry into as few meshes as it can. If you cut the scene up into 9 meshes, the FBX Converter will merge them back together again unless you use the -discrete command-line option.

7. Copy your .ovrscene files to sdcard/oculus/cinema/theaters

Theater Design Principles

Theaters are static scenes that we want users to spend dozens of hours in. It is therefore especially important to design them to very high standards. In this section, we offer some guidelines to assist with theater design.

Dimensions and Perspective

The viewpoint should be centered on the screen. The screen should be exactly 16:9 aspect ratio, exactly flat, and exactly axial. The screen in the SDK source files for Oculus Cinema covers 60 degrees of horizontal FOV. It is more comfortable to have a larger screen that is farther away, because the eyes will have less focus/vergence mismatch. The sample Oculus Cinema screen is five meters from the viewpoint, which requires the screen to be six meters wide. Three meters away and 3.46 x 1.95 meters, or four meters away and 4.62 x 2.60 meters, may be more reasonable actual targets (the arithmetic behind these sizes is worked through at the end of this section). The large screen would go nearly to the floor to remain centered on the viewpoint unless the couch area is raised up.

Do not over-optimize geometry for the viewpoint for any model that will also be used for PC with position tracking. For example, don't remove occluded back sides; just make them cruder and lower texture detail if you need to optimize.

Aliasing

All of the geometry in the scene should be designed to minimize edge aliasing. Aliasing will show up at every silhouette edge and at every texture seam. With a nearly constant viewpoint, it should be possible to place texture seams on the back sides of objects. Venetian blinds are almost the worst case for silhouette edges: many long edges. Surfaces that are very nearly coplanar with the viewpoint are even worse than silhouette edges, causing the entire surface to go from not-drawn to a scattering of pixels using the deepest mipmap as the user's head moves around. This can be worked around in the geometric design. You have a large enough triangle budget that there shouldn't be any starkly visible polygon edges. Keeping the use of complex curved surfaces to a minimum in the design is important.

Color and Texture

Don't use solid black in any environment source textures. OLED displays have technical problems with very deep blacks, so design rooms with fairly bright surfaces. The black leather chairs in the current home theater are problematic for this.

You may compress all textures with ETC2 or ASTC texture compression to conserve power -- we recommend using ASTC compression. Make sure that no alpha channel is encoded.

If you place text on any surfaces, make it as close to parallel with the view plane as possible, and make it large enough that it is easy to read. It should either be headlines or a tiny blur, nothing in between at a barely-legible state.

Cut the texture sizes to only what is needed for the sized-to-view-position texture layouts; don't expand them to fit a 2x 4k x 4k budget, since additional resolution will be completely wasted. In theory, arbitrary texture sizes are fine, but we recommend staying on power of two boundaries. There is no need to remain square; 4k x 2k and so on are fine.
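The screen sizes quoted under Dimensions and Perspective follow directly from the 60 degree horizontal FOV. For a flat 16:9 screen centered on a viewpoint at distance d:

width  = 2 * d * tan(FOV / 2) = 2 * d * tan(30 degrees)
height = width * 9 / 16

d = 5 m:  width = 2 * 5 * 0.577 = 5.77 m (roughly the six-meter-wide sample screen)
d = 3 m:  width = 2 * 3 * 0.577 = 3.46 m, height = 3.46 * 9/16 = 1.95 m
d = 4 m:  width = 2 * 4 * 0.577 = 4.62 m, height = 4.62 * 9/16 = 2.60 m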

FBX Converter

A tool to convert FBX files into geometry for rendering, collision detection and gaze selection in virtual reality experiences.

Overview

The FBX Converter reads one or more Autodesk FBX files and creates a file with models stored in JSON format, accompanied by a binary file with raw data. The JSON file created by the FBX Converter contains the following:

A render model. This model has a list of textures, a list of joints, a list of tags, and a list of surfaces. Each surface has a material with a type and references to the textures used. The material textures may include a diffuse, specular, normal, emissive and reflection texture. Each surface also has a set of indexed vertices. The vertices may include the following attributes: position, normal, tangent, binormal, color, uv0, uv1, joint weights, joint indices. Two sets of UVs are supported, one set for a diffuse/normal/specular texture and a separate set for an optional emissive texture.

A wall collision model. This model is used to prevent an avatar from walking through walls. This is a list of polytopes where each polytope is described by a set of planes. The polytopes are not necessarily bounded or convex.

A ground collision model. This model determines the floor height of the avatar. This model is also no more than a list of polytopes.

A ray trace model. This is a triangle soup with a Surface Area Heuristic (SAH) optimized KD-tree for fast ray tracing. Meshes or triangles will have to be annotated to allow interactive focus tracking or gaze selection.

Textures for the render model can be embedded in an FBX file and will be extracted by the FBX Converter. Embedded textures are extracted into a folder named <filename>.fbm/, which is a sub-folder of the folder where the FBX file <filename>.fbx is located. Instead of embedding textures, they can also simply be stored in the same folder as the FBX file. The following source texture formats are supported: BMP, TGA, PNG, JPG. For the best quality, a lossy compression format like JPG should be avoided.

The JSON file with models and the binary file are stored in a temporary folder named <filename>_tmp/, which is a sub-folder of the folder where the FBX Converter is launched, where <filename> is the output file name specified with the -o command-line option. The FBX Converter will also create a <filename>_pack.bat batch file in the folder where the FBX Converter is launched.

This batch file is used to compress the render model textures to a platform-specific compression format. A texture will be compressed to ETC2 (with or without alpha) for OpenGL ES mobile platforms, and to S3TC (either DXT1/BC1 or DXT5/BC3) for the PC. The batch file uses file time stamps to only compress textures for which there is not already a compressed version that is newer than the source texture. The -clean command-line option may be used to force recompression of all textures.

The batch file will also copy the JSON model file, the binary file with raw data and the platform-specific compressed textures into a folder named <filename>_tmp/pack/, which is a sub-folder of the aforementioned <filename>_tmp/ folder. 7-Zip is then used to zip up the 'pack' folder into a single package that can be loaded by the application. The -pack command-line option can be used to automatically execute the <filename>_pack.bat batch file from the FBX Converter.

Coordinate System

The Oculus SDK uses the same coordinate system as the default coordinate system in 3D Studio Max or Luxology MODO.
This is a right-handed coordinate system with:

+X right
-X left
+Y up
-Y down
+Z backward
-Z forward

The Oculus SDK uses the metric system for measurements, where one unit is equal to one meter. 3D Studio Max and Luxology MODO do not use any specific unit of measure, but one unit in either application maps to one unit in the Oculus SDK. However, when the data from Luxology MODO is saved to an FBX file, all units are automatically multiplied by one hundred. In other words, the unit of measure in the FBX file ends up being centimeters. Therefore, a scale of 1/100 is always specified on the FBX Converter command-line when converting FBX files from Luxology MODO: -scale 0.01

The FBX Converter supports several command-line options to transform the FBX geometry (translate, rotate, scale, et cetera). The transformations will be applied to the geometry in the order in which they are listed on the command-line.

Materials

Each render model surface stored in the JSON models file has a material. Such a material has a type and references to the textures used. The material textures may include a diffuse, specular, normal, emissive and reflection texture. These textures are retrieved from the FBX file as:

'DiffuseColor'
'NormalMap'
'SpecularColor'
'EmissiveColor'
'ReflectionColor'

Most modeling tools will map similarly named textures to the above textures in the FBX file. For instance, using Luxology MODO, the 'Emissive color' texture is mapped to the 'EmissiveColor' texture in the FBX file.

During rendering the diffuse texture is multiplied with the emissive texture as follows:

color = DiffuseColor(uv0) * EmissiveColor(uv1) * 1.5

Surface reflections look into a cube map (or environment map). The textures for the 6 cube map sides should be named:

<name>_right.<ext>
<name>_left.<ext>
<name>_up.<ext>
<name>_down.<ext>
<name>_backward.<ext>
<name>_forward.<ext>

The reflection texture 'ReflectionColor' should be set to one of these 6 textures used to create the cube map. The FBX Converter automatically picks up the other 5 textures and combines all 6 textures into a cube map. The normal map that is used to calculate the surface reflection is expected to be in local (tangent) space. During rendering, the color of reflection mapped materials is calculated as follows:

surfaceNormal = normalize( NormalMap(uv0).x * tangent + NormalMap(uv0).y * binormal + NormalMap(uv0).z * normal )
reflection = dot( eyeVector, surfaceNormal ) * 2.0 * surfaceNormal - eyeVector
color = DiffuseColor(uv0) * EmissiveColor(uv1) * SpecularColor(uv0) * ReflectionColor(reflection)

The material type is one of the following:

1. opaque
2. perforated
3. transparent
4. additive

The first three material types are based on the alpha channel of the diffuse texture. The -alpha command-line option must be used to enable the 'perforated' and 'transparent' material types. Without the -alpha command-line option, the alpha channel of the diffuse texture will be removed.

The 'additive' material type cannot be derived from the textures. An additive texture is specified by appending _additive to the material name in the FBX file.

Animations

There is currently not a full-blown animation system, but having vertices weighted to joints is still very useful to programmatically move geometry while rendering as few surfaces as possible. Think about things like the buttons and joystick on the arcade machines in VrArcade. An artist can set up the vertex weighting for skinning, but the FBX Converter also has an option to rigidly bind the vertices of an FBX source mesh to a single joint. In this case the joint name will be the name of the FBX source mesh. The meshes that need to be rigidly skinned to a joint are specified using the -skin command-line option. There is currently a limit of 16 joints per FBX file.

The FBX Converter can also apply some very basic parametric animations to joints. These simple animations are specified using the -anim command-line option. The types of animation are rotate, sway and bob. One of these types is specified directly following the -anim command-line option. Several parameters that define the animation are specified after the type. For rotate and sway, these parameters are pitch, yaw and roll in degrees per second. For bob, the parameters are x, y and z in meters per second. Following these parameters, a time offset and scale can be specified. The time offset is typically used to animate multiple joints out of sync, and the time scale can be used to speed up or slow down the animation. Last but not least, one or more joints are specified to which the animation should be applied.

When a mesh is rigidly skinned to a joint using the -skin command-line option, the FBX Converter stores the mesh node transform on the joint. This mesh node transform is used as the frame of reference (pivot and axes) for animations.

Tags

A tag is used to define a position and frame of reference in the world. A tag can, for instance, be used to define a screen or a view position in a cinema. A tag can also be used to place objects in the world.

The -tag command-line option is used to turn one or more FBX meshes from the render model into tags. The name of a tag will be the name of the mesh. The position and frame of reference are derived from the first triangle of the mesh and are stored in a 4x4 matrix. The position is the corner of the triangle that is most orthogonal. The edges that come out of this corner define the first two basis vectors of the frame of reference. These basis vectors are not normalized in order to maintain the dimensions of the frame. The third basis vector is the triangle normal vector.

Multiple tags can be created by specifying multiple FBX mesh names after the -tag command-line option. The -tag command-line option does not remove the listed meshes from the render model. The -remove command-line option can be used to remove the meshes from the render model.

Command-Line Interface

The FBX Converter is a command-line tool. To run the FBX Converter, open a Windows Command Prompt, which can be found in the Windows Start menu under All Programs -> Accessories.
A command prompt can also be opened by typing cmd in the Windows Run prompt in the Start menu. Once a command prompt has been opened, we recommend launching the FBX Converter from the folder where the source FBX files are located.

The FBX Converter comes with the following tools:

FbxConvertx64.exe (from Oculus VR)
TimeStampx64.exe (from Oculus VR)
PVRTexTool/* (version 3.4, from the PowerVR SDK version 3.3)
7Zip/* (version 9.20)

FbxConvertx64.exe is the executable that is launched by the user. The other executables are directly or indirectly used by the FbxConvertx64.exe executable.

Options

The FBX Converter supports the following command-line options:

-o <output> : Specify the name for the .ovrscene file. Specify this name without extension.
-render <model.fbx> : Specify the model used for rendering.
-collision <model.fbx|meshes> : Specify a model or meshes for wall collision.
-ground <model.fbx|meshes> : Specify a model or meshes for floor collision.
-raytrace <model.fbx|meshes> : Specify a model or meshes for focus tracking.
-translate <x> <y> <z> : Translate the models by x,y,z.
-rotate <degrees> : Rotate the models about the Y axis.
-scale <factor> : Scale the models by the given factor.
-swapxz : Swap the X and Z axis.
-flipu : Flip the U texture coordinate.
-flipv : Flip the V texture coordinate.
-stripmodonumbers : Strip duplicate name numbers added by MODO.
-sort <+|-><X|Y|Z|origin> : Sort geometry along an axis or from the origin.
-expand <dist> : Expand collision walls by this distance. Defaults to 0.5.
-remove <mesh1> [<mesh2>...] : Remove these source meshes for rendering.
-atlas <mesh1> [<mesh2>...] : Create a texture atlas for these meshes.
-discrete <mesh1> [<mesh2>...] : Keep these meshes separate for rendering.
-skin <mesh1> [<mesh2>...] : Skin these source meshes rigidly to a joint.
-tag <mesh1> [<mesh2>...] : Turn the first triangles of these meshes into tags.
-attrib <attr1> [<attr2>...] : Only keep these attributes: [position, normal, tangent, binormal, color, uv0, uv1, auto].
-anim rotate <pitch> <yaw> <roll> <timeoffset> <timescale> <joint1> [<joint2>...] : Apply a parametric rotate animation to the listed joints.
-anim sway <pitch> <yaw> <roll> <timeoffset> <timescale> <joint1> [<joint2>...] : Apply a parametric sway animation to the listed joints.
-anim bob <X> <Y> <Z> <timeoffset> <timescale> <joint1> [<joint2>...] : Apply a parametric bob animation to the listed joints.
-ktx : Compress textures to KTX files (default).

-pvr : Compress textures to PVR files.
-dds : Compress textures to DDS files.
-alpha : Keep texture alpha channels if present.
-clean : Delete previously compressed textures.
-include <file1> [<file2>...] : Include these files in the package.
-pack : Automatically run the <output>_pack.bat file.
-zip <x> : 7-Zip compression level (0=none, 9=ultra).
-fulltext : Store binary data as text in the JSON file.
-nopush : Do not push to the device in the batch file.
-notest : Do not run a test scene from the batch file.
-cinema : Launch VrCinema instead of VrScene.
-expo : Launch VrExpo instead of VrScene.

The -collision, -ground and -raytrace command-line options may either specify a separate FBX file or a list of meshes from the FBX file specified with the -render command-line option. If the collision and ray-trace meshes are in the same FBX file as the meshes to be rendered, but the collision and ray-trace surfaces should not be rendered, then these meshes can be removed for rendering using the -remove command-line option.

Note that the -collision, -ground, -raytrace, -remove, -atlas, -discrete, -skin and -tag command-line options accept wild cards like * and ?. For instance, to make all surfaces discrete use: -discrete *

Batch Execution

Instead of typing all the command-line options on the command prompt, it is common practice to use a batch file to launch the FBX Converter with a number of options. This allows for quick iteration on the assets while consistently using the same settings. The following is the contents of the batch file that was used to convert the FBX for the home theater:

FbxConvertx64.exe -o home_theater -pack -stripmodonumbers -rotate 180 -scale 0.01 -translate <x> <y> <z> -swapxz -flipv -sort origin -tag screen -render home_theater\home_theater.fbx -raytrace screen

Troubleshooting

The FBX Converter prints several things on the screen, such as configuration options, warnings and errors. Warnings (e.g., missing textures) are printed in yellow, and errors (e.g., missing executables) are printed in red.

Optimization

The FBX Converter implements various command-line options that can be used to optimize the geometry for rendering.

Reducing Draw Calls

The FBX Converter automatically merges FBX meshes that use the same material such that they will be rendered as a single surface. At some point it may become necessary to automatically break up surfaces for culling granularity. However, currently it is more important to reduce the number of draw calls due to significant driver overhead on mobile platforms. Source meshes that need to stay separate for some reason can be flagged using the -discrete command-line option of the FBX Converter.

To further reduce the number of draw calls, or to statically sort all geometry into a single surface, the FBX Converter can also create one or more texture atlases using the -atlas option. This option takes a list of FBX source meshes that need to have their textures combined into an atlas. Multiple atlases can be created by specifying the -atlas command-line option multiple times with different mesh names. Textures that are placed in an atlas cannot be tiled (repeated) on a mesh, and the texture coordinates of the source mesh need to all be in the [0, 1] range.

Reducing Vertices

During conversion, the FBX Converter displays the total number of triangles and the total number of vertices of the render geometry. The number of vertices is expected to be in the same ballpark as the number of triangles. Having over two times more vertices than triangles may have performance implications.

The number of unique vertices can be reduced by removing vertex attributes that are not necessary for rendering. Unused vertex attributes are generally wasteful, and removing them may increase rendering performance just by improving GPU vertex cache usage.

An FBX file may store vertex attributes that are not used for rendering. For instance, vertex normals may be stored in the FBX file, but they will not be used for rendering unless there is some form of specular lighting. The FBX file may also store a second set of texture coordinates that are not used when there are no emissive textures. The -attrib command-line option of the FBX Converter can be used to keep only those attributes that are necessary to correctly render the model. For instance, if the model only renders a diffuse texture with baked lighting, then all unnecessary vertex attributes can be removed by using -attrib position uv0.

The -attrib command-line option also accepts the auto keyword. By using the auto keyword, the FBX Converter will automatically determine which vertex attributes need to be kept based on the textures specified per surface material. The auto keyword can be specified in combination with other vertex attributes. For instance, -attrib auto color will make sure that the color attribute is always kept, while the other vertex attributes are only kept if they are needed to correctly render based on the specified textures.

The following list shows how the FBX Converter determines which attributes to keep when the auto keyword is specified:

position : always automatically kept
normal : kept if a NormalMap or SpecularColor texture is specified
tangent : kept if a NormalMap texture is specified
binormal : kept if a NormalMap texture is specified
uv0 : kept if a DiffuseColor or SpecularColor texture is specified
uv1 : kept if an EmissiveColor texture is specified
color : never automatically kept

Reducing Overdraw

To be able to render many triangles, it is important to minimize overdraw as much as possible. For scenes or models that do have overdraw, it is very important that the opaque geometry is rendered front-to-back to significantly reduce the number of shading operations. Scenes or models that will only be displayed from a single viewpoint, or a limited range of view points, can be statically sorted to guarantee front-to-back rendering on a per-triangle basis.

The FBX Converter has a -sort command-line option to statically sort all the geometry. The -sort option first sorts all the vertices, then it sorts all the triangles based on the smallest vertex index. In addition to sorting all the triangles, this also results in improved GPU vertex cache usage.
The -sort option can sort all geometry along one of the coordinate axes or it can sort all geometry from the origin. Sorting along an axis is useful for diorama-like scenes. Sorting from the origin is useful for theater-like scenes with a full 360 view.

Sorting along an axis is done by specifying + or - and one of the coordinate axes (X, Y or Z). For instance, to sort all geometry front-to-back along the X axis use: -sort +X

Sorting from the origin can be done by specifying + or - origin. For instance, to sort all geometry front-to-back from the origin use: -sort +origin

For sorting from the origin to be effective, the origin of the FBX model or scene must be the point from which the model or scene will be viewed. Keep in mind that sorting from the origin happens after any translations have been applied to the FBX geometry using the -translate command-line option. In other words, when using the -sort +origin command-line option in combination with the -translate option, the scene will be sorted from the translated position instead of the original FBX origin.

Scenes that can be viewed from multiple vantage points may need to be manually broken up into reasonably sized blocks of geometry that will be dynamically sorted front-to-back at run-time. If the meshes the scene is broken up into use the same material, then the -discrete command-line option can be used to keep the meshes separate for rendering.
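As an illustration of how these optimization options combine, a conversion command for a hypothetical MODO scene that has been cut into separately-culled blocks might look like the following. The scene and mesh names here are made up; the options themselves are the ones documented above:

FbxConvertx64.exe -o myscene -pack -stripmodonumbers -scale 0.01 -flipv -attrib auto -sort +origin -discrete block_* -render myscene\myscene.fbx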

Mobile Best Practices

Welcome to the Mobile Best Practices Guide. In this guide, we'll review best practices for rendering and for dealing with user interface components.

Rendering Guidelines

This section contains guidelines for VR application development in the unique domain of mobile development.

Mobile VR Performance

Be conservative on performance. Even though two threads are dedicated to the VR application, a lot happens on Android systems that we can't control, and performance has more of a statistical character than we would like. Some background tasks even use the GPU occasionally. Pushing right up to the limit will undoubtedly cause more frame drops, and make the experience less pleasant.

Under these performance constraints, you aren't going to be able to pull off graphics effects that people haven't already seen years ago on other platforms, so don't try to compete there. The magic of a VR experience comes from interesting things happening in well-composed scenes, and the graphics should largely try not to call attention to themselves. Even if you consistently hold 60 FPS, more aggressive drawing consumes more battery power, and subtle improvements in visual quality generally aren't worth taking 20 minutes off the battery life for a title.

Keep rendering straightforward. Draw everything to one view, in a single pass for each mesh. Tricks with resetting the depth buffer and multiple camera layers are bad for VR, regardless of their performance issues. If the geometry doesn't work correctly when it is all rendered into a single view (FPS hands, et cetera), then it will cause perception issues in VR, and you should fix the design.

You can't handle a lot of blending for performance reasons. If you have limited navigation capabilities in the title and can guarantee that the effects will never cover the entire screen, then you will be okay.

Don't use alpha tested / pixel discard transparency -- the aliasing will be awful, and performance can still be problematic. Coverage from alpha can help, but designing a title that doesn't require a lot of cut out geometry is even better.

Most VR scenes should be built to work with 16 bit depth buffer resolution and 2x MSAA. If your world is mostly pre-lit to compressed textures, there will be little difference between 16 and 32 bit color buffers.

Favor modest "scenes" instead of "open worlds". There are both theoretical and pragmatic reasons why you should, at least in the near term. The first generation of titles should be all about the low hanging fruit, not the challenges.

The best-looking scenes will be uniquely textured models. You can load quite a lot of textures -- on the order of a hundred megabytes of textures is okay. With global illumination baked into the textures, or data actually sampled from the real world, you can make reasonably photorealistic scenes that still run 60 FPS stereo. The contrast with much lower fidelity dynamic elements may be jarring, so there are important stylistic decisions to be made.

Panoramic photos make excellent and efficient backdrops for scenes. If you aren't too picky about global illumination, allowing them to be swapped out is often nice. Full image-based lighting models aren't performance-practical for entire scenes, but are probably okay for characters that can't cover the screen.

Frame Rate

Asynchronous TimeWarp, along with other technologies, allows the Oculus Go and Gear VR to provide a smooth and judder-free experience at 60 FPS, regardless of how fast or slow the application is rendering. This does not mean that performance is no longer a concern, but it gives a lot more margin in normal operation, and improves the experience for applications that do not hold perfectly at 60 FPS.

If an application does not consistently run at 60 FPS, then animated objects move more choppily, rapid head turns pull some black in at the edges, player movement doesn't feel as smooth, and gamepad turning looks especially bad. However, Asynchronous TimeWarp does not require emptying the GPU pipeline, which makes it easier to hold 60 FPS than it would be otherwise.

Drawing anything that is stuck to the view will look bad if the frame rate is not held at 60 FPS, because it will only move on eye frame updates, instead of on every video frame. Don't make heads-up displays. If something needs to stay in front of the player, like a floating GUI panel, leave it stationary most of the time, and have it quickly rush back to center when necessary, instead of dragging it continuously with the head orientation.

Scenes

Per-scene targets:

50k to 100k triangles
50k to 100k vertices
50 to 100 draw calls

An application may be able to render more triangles by using very simple vertex and fragment programs, minimizing overdraw, and reducing the number of draw calls down to a dozen. However, lots of small details and silhouette edges may result in visible aliasing despite MSAA.

It is good to be conservative! The quality of a virtual reality experience is not just determined by the quality of the rendered images. Low latency and high frame rates are just as important in delivering a high quality, fully immersive experience, if not more so.

Keep an eye on the vertex count, because vertex processing is not free on a mobile GPU with a tiling architecture. The number of vertices in a scene is expected to be in the same ballpark as the number of triangles. In a typical scene, the number of vertices should not exceed twice the number of triangles. To reduce the number of unique vertices, remove vertex attributes that are not necessary for rendering.

Textures are ideally stored with 4 bits per texel in ETC2 format for improved rendering performance and an 8x storage space reduction over 32-bit RGBA textures. Loading up to 512 MB of textures is feasible, but the limited storage space available on mobile devices needs to be considered. For a uniquely textured environment in an application with limited mobility, it is reasonable to load 128 MB of textures.

Baking specular and reflections directly into the textures works well for applications with limited mobility. The aliasing from dynamic shader-based specular on bump-mapped surfaces is often a net negative in VR, but simple, smooth shapes can still benefit from dynamic specular in some cases.

Dynamic lighting with dynamic shadows is usually not a good idea. Many of the good techniques require using the depth buffer, which is particularly expensive on mobile GPUs with a tiling architecture. Rendering a shadow buffer for a single parallel light in a scene is feasible, but baked lighting and shadowing usually results in better quality.

To be able to render many triangles, it is important to reduce overdraw as much as possible.
In scenes with overdraw, it is important that the opaque geometry is rendered front-to-back to significantly reduce the number of shading operations. Scenes that will only be displayed from a single viewpoint can be statically sorted to guarantee front-to-back rendering on a per-triangle basis. Scenes that can be viewed from multiple vantage points may need to be broken up into reasonably sized blocks of geometry that will be sorted front-to-back dynamically at run-time.

Resolution

Due to distortion from the optics, the perceived size of a pixel on the screen varies across the screen. Conveniently, the highest resolution is in the center of the screen where it does the most good, but even with a 2560x1440 screen, pixels are still large compared to a conventional monitor or mobile device at typical viewing distances.

With the current screen and optics, central pixels cover about 0.06 degrees of visual arc, so you would want a 6000 pixel long band to wrap 360 degrees around a static viewpoint. Away from the center, the gap between samples would be greater than one, so mipmaps should be created and used to avoid aliasing. For general purpose rendering, this requires 90 degree FOV eye buffers of at least 1500x1500 resolution, plus the creation of mipmaps. While the system is barely capable of doing this with trivial scenes at maximum clock rates, thermal constraints make this unsustainable.

Most game-style 3D VR content should target 1024x1024 eye buffers. At this resolution, pixels will be slightly stretched in the center and only barely compressed at the edges, so mipmap generation is unnecessary. If you have lots of performance headroom, you can experiment with increasing this a bit to take better advantage of the display resolution, but it is costly in power and performance.

Dedicated viewer apps (e-book reader, picture viewers, remote monitor view, et cetera) that really do want to focus on peak quality should consider using the TimeWarp overlay plane to avoid the resolution compromise and double-resampling of distorting a separately rendered eye view. Using an sRGB framebuffer and source texture is important to avoid edge crawling effects in high contrast areas when sampling very close to optimal resolution.

User Interface Guidelines

Graphical User Interfaces (GUIs) in virtual reality present unique challenges that can be mitigated by following the guidelines in this document. This is not an exhaustive list, but it provides some guidance and insight for first-time implementers of Virtual Reality GUIs (VRGUIs).

Stereoscopic UI Rendering

If any single word can help developers understand and address the challenges of VRGUIs, it is "stereoscopic". In VR, everything must be rendered from two points of view -- one for each eye. When designing and implementing VRGUIs, frequent consideration of this fact can help bring problems to light before they are encountered in implementation. It can also aid in understanding the fundamental constraints acting on VRGUIs. For example, stereoscopic rendering essentially makes it impossible to implement an orthographic Heads Up Display (HUD), one of the most common GUI implementations for 3D applications -- especially games.

The Infinity Problem

Neither orthographic projections nor HUDs in themselves are completely ruled out in VR, but their standard implementation, in which the entire HUD is presented via the same orthographic projection for each eye view, generally is.

Projecting the HUD in this manner requires the user to focus on infinity when viewing the HUD. This effectively places the HUD behind everything else that is rendered, as far as the user's brain is concerned. This can confuse the visual system, which perceives the HUD to be further away than all other objects, despite remaining visible in front of them. This generally causes discomfort and may contribute to eyestrain.

Orthographic projection should be used on individual surfaces that are then rendered in world space and displayed at a reasonable distance from the viewer. The ideal distance varies, but is usually between 1 and 3 meters. Using this method, a normal 2D GUI can be rendered and placed in the world, and the user's gaze direction can be used as a pointing device for GUI interaction.

In general, an application should drop the idea of orthographically projecting anything directly to the screen while in VR mode. It will always be better to project onto a surface that is then placed in world space, though this provides its own set of challenges.

Depth In-Depth

Placing a VRGUI in world space raises the issue of depth occlusion. In many 3D applications, it is difficult, even impossible, to guarantee that there will always be enough space in front of the user's view to place a GUI without it being coincident with, or occluded by, another rendered surface. If, for instance, the user toggles a menu during play, it is problematic if the menu appears behind the wall that the user is facing. It might seem that rendering without depth testing should solve the problem, but that creates a problem similar to the infinity problem, in which stereoscopic separation suggests to the user that the menu is further away than the wall, yet the menu draws on top of the wall.

There are some practical solutions to this:

Render the VRGUI surfaces in two passes, once with depth pass and once with depth fail, using a special shader with the fail pass that stipples or blends with any surface that is closer to the view point than the VRGUI (a sketch of the render state for this approach appears at the end of this section).

Project the VRGUI surfaces onto geometry that is closer than the ideal distance. This may not solve all problems if the geometry can be so close that the VRGUI is out of the user's view, but it may give them an opportunity to back away, while fitting well with any game that presupposes the VRGUIs are physical projections of light into world space.

Move the user to another scene when the VRGUI interface comes up. This might be as simple as fading the world to black as the interface fades in.

Stop rendering the world stereoscopically, i.e., render both eye views with the same view transform, while the VRGUI is visible, then render the GUI stereoscopically but without depth testing. This will allow the VRGUI surfaces to have depth while the world appears as a two-dimensional projection behind it.

Treat VRGUIs as actual in-world objects. In fantasy games, a character might bring up a book or scroll, upon which the GUI is projected. In a modern setting, this might be a smart phone held in the character's hand. In other instances, a character might be required to move to a specific location in the world -- perhaps a desktop computer -- to interact with the interface. In all these cases, the application would still need to support basic functionality, such as the ability to exit and return to the system launcher, at all times.

It is generally not practical to disallow all VRGUI interaction and rendering if there is not enough room for the VRGUI surfaces, unless you have an application that never needs to display any type of menu (even configuration menus) when rendering the world view.
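The two-pass depth approach above can be expressed with ordinary OpenGL ES depth and blend state. The sketch below is illustrative only, not code from the SDK; drawVRGUISurfaces() is a hypothetical helper that issues the GUI draw calls.

#include <GLES3/gl3.h>

/* Hypothetical helper that issues the draw calls for the VRGUI quads. */
extern void drawVRGUISurfaces(void);

void renderVRGUIWithDepthHint(void)
{
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_FALSE);              /* the GUI should not write depth */

    /* Pass 1: normal rendering where the GUI is not occluded. */
    glDepthFunc(GL_LEQUAL);
    drawVRGUISurfaces();

    /* Pass 2: where the GUI fails the depth test (something is closer),
       draw it again, blended down so the occluding surface shows through.
       A reduced-alpha uniform or a stipple pattern would be set on the
       shader here; that detail is application specific. */
    glDepthFunc(GL_GREATER);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    drawVRGUISurfaces();

    glDisable(GL_BLEND);
    glDepthMask(GL_TRUE);
}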
Gazing Into Virtual Reality

There is more than one way to interact with a VRGUI, but gaze tracking may be the most intuitive. The direction of the user's gaze can be used to select items in a VRGUI as if it were a mouse or a touch on a touch device. A mouse is a slightly better analogy because, unlike a touch device, the pointer can be moved around the interface without first initiating a down event.

Like many things in VR, the use of gaze to place a cursor has a few new properties to consider. First, when using the gaze direction to select items, it is important to have a gaze cursor that indicates where gaze has to be directed to select an item. The cursor should, like all other VR surfaces, be rendered stereoscopically. This gives the user a solid indication of where the cursor is in world space. In testing, implementations of gaze selection without a cursor or crosshair have been reported as more difficult to use and less grounded.

Second, because gaze direction moves the cursor, and because the cursor must move relative to the interface to select different items, it is not possible to present the viewer with an interface that is always within view. In one sense, this is not possible with traditional 2D GUIs either, since the user can always turn their head away from the screen, but there are differences. With a normal 2D GUI, the user does not necessarily expect to be able to interact with the interface when not looking at the device that is presenting it, but in VR the user is always looking at the device -- they just may not be looking at the VRGUI. This can allow the user to lose the interface and not realize it is still available and consuming their input, which can further result in confusion when the application doesn't handle input as the user expects (because they do not see a menu in their current view).

There are several approaches to handling this issue:

Close the interface if it goes outside of some field of view from the user's perspective. This may be problematic for games if the interface pauses play, as play would just resume when the interface closes.

Automatically drag the interface with the view as the user turns, either keeping the gaze cursor inside of the interface controls, or keeping it at the edge of the screen where it is still visible to the user.

Place an icon somewhere on the periphery of the screen that indicates the user is in menu mode, and then allow this icon to always track with the view.

Another frequently unexpected issue with using a gaze cursor is how to handle default actions. In modern 2D GUIs, a button or option is often selected by default when a GUI dialog appears -- possibly the OK button on a dialog where a user is usually expected to proceed without changing current settings, or the CANCEL button on a dialog warning that a destructive action is about to be taken. However, in VRGUIs, the default action is dictated by where the gaze cursor is pointing on the interface when it appears. OK can only be the default in a VRGUI if the dialog pops up with OK under the gaze cursor. If an application does anything other than place a VRGUI directly in front of the viewer (such as placing it on the horizon plane, ignoring the current pitch), then it is not practical to have the concept of a default button, unless there is an additional form of input such as a keyboard that can be used to interact with the VRGUI.

Adreno Hardware Profile

The Adreno has a sizable (512k to 1 MB) on-chip memory that framebuffer operations are broken up into tiles to fit. Unlike the PowerVR or Mali tile-based GPUs, the Adreno has a variable bin size based on the bytes per pixel needed for the buffers -- a 4x MSAA, 32 bit color, 4x MRT, 32 bit depth render target will require 40 times as many tiles as a 16-bit depth-only rendering.

Vertex shaders are run at least twice for each vertex, once to determine in which bins the drawing will happen, and again for each bin that a triangle covers. For binning, the regular vertex shader is stripped down to only the code relevant to calculating the vertex positions.
To avoid polluting the vertex cache with unused attributes, rendering with separate attribute arrays may provide some benefit. The binning is done on a per-triangle basis, not a per-draw call basis, so there is no benefit to breaking up large surfaces. Because scenes are rendered twice for stereoscopic view, and because the binning process doubles it again (at least), vertex processing is more costly than you might expect. Avoiding any bin fills from main memory and unnecessary buffer writes is important for performance. The VrApi framework handles this optimally, but if you are doing it yourself, make sure you invalidate color buffers before using them, and discard depth buffers before flushing the eye buffer rendering. Clears still cost some performance, so invalidates should be preferred when possible. There is no dedicated occlusion hardware like one would find in PowerVR chips, but early Z rejection is performed, so sorting draw calls to roughly front-to-back order is beneficial.
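On OpenGL ES 3.0, the invalidate/discard advice above maps to glInvalidateFramebuffer. A minimal sketch, assuming an application-managed eye framebuffer object (eyeFbo is a placeholder name, not an SDK symbol):

#include <GLES3/gl3.h>

void renderEye(GLuint eyeFbo)
{
    glBindFramebuffer(GL_FRAMEBUFFER, eyeFbo);

    /* Invalidate the color buffer before rendering so the tiler does not
       load stale contents from main memory into the on-chip bins. */
    const GLenum beginAttachments[] = { GL_COLOR_ATTACHMENT0 };
    glInvalidateFramebuffer(GL_FRAMEBUFFER, 1, beginAttachments);

    /* ... issue the eye buffer draw calls here ... */

    /* Discard the depth buffer before flushing the eye rendering so it is
       never resolved back out to main memory. */
    const GLenum endAttachments[] = { GL_DEPTH_ATTACHMENT };
    glInvalidateFramebuffer(GL_FRAMEBUFFER, 1, endAttachments);

    glFlush();
}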

Texture compression offers significant performance benefits. Favor ETC2 compressed texture formats, but there is still sufficient performance to render scenes with 32-bit uncompressed textures on every surface if you really want to show off smooth gradients.

glGenerateMipmap() is fast and efficient; you should build mipmaps even for dynamic textures (and of course for static textures). Unfortunately, on Android, many dynamic surfaces (video, camera, UI, etc.) come in as SurfaceTextures / samplerExternalOES, which don't have mip levels at all. Copying to another texture and generating mipmaps there is inconvenient and costs a notable overhead, but is still worth considering.

sRGB correction is free on texture sampling, but has some cost when drawing to an sRGB framebuffer. If you have a lot of high-contrast imagery, being gamma correct can reduce aliasing in the rendering. Of course, smoothing sharp contrast transitions in the source artwork can also help.

2x MSAA runs at full speed on chip, but it still increases the number of tiles, so there is some performance cost. 4x MSAA runs at half speed, and is generally not fast enough unless the scene is very undemanding.

Testing and Troubleshooting

Welcome to the testing and troubleshooting guide. This guide describes tools and procedures necessary for testing and troubleshooting mobile VR applications.

Tools and Procedures

Android System Properties

Use Android System Properties to set device configuration options for testing and debugging. Local Preferences are being deprecated and should no longer be used for setting debug options.

Note: System Properties reset when the device reboots. All commands are case sensitive.

Basic Usage

Set a value: adb shell setprop <name> <value>
Example: adb shell setprop debug.oculus.frontbuffer 0
Result: Disables front buffer rendering.

Get the current value: adb shell getprop <name>
Example: adb shell getprop debug.oculus.gpulevel
Result: Returns the currently-configured GPU level, e.g., 1.

Get all current values: adb shell getprop | fgrep "debug.oculus"
Result: Returns all debug.oculus System Properties set on the device.
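For example, a minimal sketch of a pre-test setup that pins the clock levels and allows Oculus Remote Monitor to connect, using the properties described in the table below (the specific values chosen here are only illustrative):

adb shell setprop debug.oculus.cpulevel 2
adb shell setprop debug.oculus.gpulevel 2
adb shell setprop debug.oculus.enablecapture 1
adb shell getprop | fgrep "debug.oculus"

Because System Properties reset on reboot, these settings must be reapplied after the device restarts.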

The following System Properties are available:

debug.oculus.enablecapture (0, 1): Enables support for Oculus Remote Monitor to connect to the application.
debug.oculus.cpulevel (0, 1, 2, 3): Changes the fixed CPU level.
debug.oculus.gpulevel (0, 1, 2, 3): Changes the fixed GPU level.
debug.oculus.asynctimewarp (0, 1): Set to 0 to disable Asynchronous TimeWarp and enable Synchronous TimeWarp. Default is 1 (ATW is enabled).
debug.oculus.frontbuffer (0, 1): Disable/enable front buffer rendering.
debug.oculus.gputimings (0, 1, 2): Turns on GPU timings in logcat (off by default due to instability). A setting of 2 is necessary on phones using the Mali GPU.
debug.oculus.simulateundock (-1, 0, 1, ..., 10): Simulates an undocking event after the specified number of seconds.
debug.oculus.enablevideocapture (0, 1): When enabled, each enterVrMode generates a new mp4 file in /sdcard/oculus/videoshots/. Videos are full resolution, undistorted, single-eye, with full compositing support. Defaults are 1024 resolution at 5 Mb/s.
debug.oculus.phonesensors (0, 1): Set to 0 to disable the limited orientation tracking provided by phone sensors while in Developer Mode (enabled by default).

Screenshot and Video Capture

Full-resolution, undistorted, single-eye, full-layer-support 2D screenshots and video capture for VR apps are available through the Universal Menu. Video capture may also be enabled with an Android System Property (see Video Capture using Android System Properties below).

Output File Details

Video Capture writes 1024 resolution, 5 Mb per second mp4 files to the /sdcard/oculus/videoshots/ directory. Capture files do not currently include audio. Screenshot writes 1024x1024 jpg files to the /sdcard/oculus/screenshots/ directory.

Note: Despite the occurrence of sdcard in this directory path, this is an internal device storage path.

Requirements

- Android 5.0 or later
- Target applications must be built against Mobile SDK or later.
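Once you have captures on the device, a simple sketch of copying them to your development machine with adb, assuming the default output directories listed above:

adb shell ls /sdcard/oculus/videoshots/
adb pull /sdcard/oculus/videoshots/ .
adb pull /sdcard/oculus/screenshots/ .

adb pull copies directories recursively, so this retrieves every capture on the device; delete the files on the device afterwards if you want to reclaim storage.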

Using the Universal Menu

1. Launch the application that you would like to take a screenshot of.
2. Open the Universal Menu and select the Utilities menu as shown with the gaze cursor.
3. To take a screenshot, select Screenshot. The Universal Menu will close, returning immediately to the running application. The screenshot will execute five seconds after you select the option, giving you a moment to position your view within the scene properly. The countdown is indicated by a blinking red dot in the upper left of your view. The output jpg will be written to the directory indicated above.

4. To take a video capture, select Capture Video. The Universal Menu will close, returning immediately to the running application. Video recording begins immediately upon returning to the application, and continues until the user returns to the Universal Menu, switches applications, or exits VR mode. The output mp4 will be written to the directory indicated above.

Video Capture using Android System Properties

Note: Screenshot is not available with Android System Properties.

To enable video capture, set debug.oculus.enablevideocapture to 1 with the following command:

adb shell setprop debug.oculus.enablevideocapture 1

When enabled, every enterVrMode creates a new mp4 file. For example, if you launch an app from Home, you may find one video file for your Home session, one for the app you launch, one for System Activities if you long-press, and so forth.

To help ensure that there is no disruption to the play experience while recording, you may wish to force the GPU level up and chromatic correction off:

adb shell setprop debug.oculus.enablevideocapture 1
adb shell setprop debug.oculus.gpulevel 3

The FOV is reduced to 80 degrees, so you are unlikely to see any black pull-in at the edges.

Note: Make sure to disable video capture when you are done, or you will fill your phone up with video of every VR session you run!

Oculus Remote Monitor

The Oculus Remote Monitor client connects to VR applications running on remote devices to capture, store, and analyze data streams. Oculus Remote Monitor is compatible with any Oculus mobile application built with Unity, Unreal Engine, or Native development tools.

Download the Oculus Remote Monitor client:

- Oculus Remote Monitor for Windows
- Oculus Remote Monitor for macOS

Setup

Enable Capture Server

Note: Some of the menu locations described in this process might differ slightly depending on your Samsung model.

To enable the Capture Server on a device:

1. Enable Developer Mode and USB Debugging on your device.
2. Plug your device into your computer via USB. Check your device to see if it is requesting permission to "Allow USB debugging", and accept if so.
3. Open Oculus Remote Monitor and navigate to the Settings Panel (gear icon) in the upper-right.
4. Verify that the ADB Path is set to your local installation of adb (included with the Android SDK). For example: C:\Users\$username\AppData\Local\Android\Sdk\platform-tools\adb.exe
5. Under the Settings (gear) tab, you should now see the device ID of your connected device in the Device Settings list. Select your device ID and click the Enable Capture check box to the right.

After completing those steps, the Capture Server will run automatically whenever a VR application runs.

Note: If you reboot your device, the Device Settings are reset and you must click the Enable Capture check box again.

Network Setup

Oculus Remote Monitor uses a UDP-broadcast-based auto-discovery mechanism to locate remote hosts, and then a TCP connection to access the capture stream. For this reason, the host and the client must be on the same subnet, and the network must not block or filter UDP broadcasts or TCP connections.

If you are on a large corporate network that may have such restrictions, we recommend setting up a dedicated network or tethering directly to your device. Furthermore, frame buffer capturing is extremely bandwidth intensive. If your signal strength is low or you have a lot of interference or traffic on your network, you may need to disable Capture Frame Buffer before connecting to improve capture performance.

Oculus Remote Monitor uses UDP port 2020 and TCP ports 3030-3040.

Application Setup

Because we use a direct network connection, the following permission is required in your application's AndroidManifest.xml:

<!-- Network access needed for OVRMonitor -->
<uses-permission android:name="android.permission.INTERNET" />

Basic Usage

Server Browser

If the host and client are on the same subnet and the network is configured correctly (see Network Setup above), Oculus Remote Monitor automatically discovers any compatible applications running on the network.

To begin capturing and viewing data:

1. Click the Server Browser icon.
2. Select the host you want to monitor.
3. Select the session settings you want to monitor.
4. Click Connect.

Open Capture File

Each time you connect to a host, the Oculus Remote Monitor automatically compresses and saves the incoming data stream to disk under a unique filename in the format package-yyyymmdd-hhmmss-id.dat. The default recordings directory for these files is your Documents folder, under the OVRMonitorRecordings subfolder. You can change this directory in the Client Settings panel.

To open saved capture files, click the Open Capture File icon.

Frame Buffer

The Frame Buffer Viewer provides a mechanism for inspecting the frame buffer as the data is received in real time, which is particularly useful for monitoring play test sessions.

To view the most recent frame buffer, click the Viewport Stream icon.

When enabled, the Capture library will stream a downscaled pre-distortion eye buffer across the network. We use downscaling rather than using a higher-quality compression scheme to reduce overhead on the host device as much as possible. This reduces image quality substantially, but still provides valuable visual context as to what is happening on screen at any given moment. The current default is 192x192 compressed. The Monitor application recompresses the Frame Buffer further to save memory and disk space when dealing with large capture sets.


Performance Overview

The Performance Overview provides a high-level summary of application performance. It plots a graphical summary of the VrApi messages and error conditions against a timeline.

To view the overview:

1. Click the Performance Overview icon.
2. Move the pointer over the performance overview to reveal details of the collected data.
3. Double-click anywhere in the overview to open the Profiler Data view at that precise point in the timeline.


Frame Buffer: Screen captures of the pre-distorted frame buffer. Move the pointer over this section to view the screenshots captured at that point in time.

Frames per Second:
- Delivered Frames. A well-performing application continuously delivers 60 frames per second.
- Screen Tears. The number of tears per second. A well-performing application does not exhibit any tearing.
- Early Frames. The number of frames that are completed a whole frame early.
- Stale Frames. The number of stale frames per second. A well-performing application does not exhibit any stale frames.
For more information, see Basic Performance Stats through Logcat.

Head-Pose Prediction Latency (ms): The number of milliseconds between the latest sensor sampling for tracking and the anticipated display time of new eye images.

Performance Levels: The CPU and GPU clock levels and associated clock frequencies, set by the application. Lower clock levels result in less heat and less battery drain.

Thermal (°C): Temperatures in degrees Celsius. Well-optimized applications do not cause the temperature to rise quickly. There is always room for more optimization, which allows lower clock levels to be used, which, in turn, reduces the amount of heat that is generated.

Available Memory (GB): The amount of available memory, displayed every second. It is important to keep a reasonable amount of memory available to prevent Android from killing backgrounded applications, like Oculus Home.

Console: VrApi reports various messages and error conditions to Android's logcat as well as to the Oculus Remote Monitor, which provides the thread and timestamp of each message. The Logging Viewer provides raw access to this data. To view the log, click the Log icon. Warnings and errors are color-coded to stand out, and unlike logcat, thread IDs are tracked so you know exactly when and where each message occurred.

Remote Variables

Applications may expose user-adjustable parameters and variables in their code. Nearly any constant in your code may be turned into a knob that can be updated in real time during a play test.

To view and adjust the available remote variables, click the Remote Variables icon.

VrApi exposes CPU and GPU Levels to allow developers to quickly identify their required clock rates. Applications are also free to expose their own options that you can select or adjust.

ShowTimeWarpTextureDensity: This experimental feature toggles a visualization mode that colors the screen based on the texel:pixel ratio when running TimeWarp (green indicates a 1:1 ratio; dark green < 1:1 ratio; red > 2:1 ratio).

Settings

To view and adjust the settings, click the Settings icon.

ADB Path: Configurable at any time. If it does not point to a valid executable when Monitor runs, Monitor will attempt to locate a valid copy of adb by checking your environment variables. If that fails, Monitor will search under ANDROID_HOME.

Recordings Directory: Specifies the location in which Monitor will automatically store capture files when connected to a remote host. The default is the current user's Documents directory under OVRMonitorRecordings.

Frame Buffer Compression Quality: Used for client-side recompression of the frame buffer. This helps offload compression load from the host while allowing for significant savings in memory usage on the client. Lower-quality settings provide greater memory savings, but may result in blocking artifacts in the frame buffer viewer.

Device Settings: Allows you to toggle capture support without manually editing your .oculusprefs file.

Profiler Data

The Profiler Data view provides both real-time and offline inspection of the following data streams on a single, contiguous timeline:

- CPU/GPU events
- Sensor readings
- Console messages, warnings, and errors
- Frame buffer captures

VrApi also has a number of other events embedded to help diagnose VR-specific scheduling issues.

To view the data streams, click the Profiler Data icon.

The Profiler Data view has a number of controls to help you analyze the timeline:

- The space bar toggles between real-time timeline scrolling and freezing at a specific point in time. This lets you quickly alternate between watching events unfold in real time and pausing to focus on a point of interest without stopping or restarting.
- The mouse wheel zooms the view in and out.
- Right-click to save a screenshot of the current view or to hide a data stream. To un-hide hidden data streams, click Show Hidden Data Streams.
- Click and drag to pan the timeline forwards or backwards in time.


The Performance Data Viewer screen shows a selected portion of the application timeline:

Frame Buffer: Provides screen captures of the pre-distorted frame buffer, timestamped the moment immediately before the frame was handed off to the TimeWarp context. The left edge of each screenshot represents the point in time at which it was captured from the GPU.

VSync: Displays notches on every driver v-sync event.

GPU Context: GPU Zones inserted into the OpenGL command stream via Timer Queries are displayed in a similar manner as CPU events. Each row corresponds to a different OpenGL context. Typical VR applications will have two contexts: one for TimeWarp, and one for application rendering. Note that on tiler GPUs, these events should be regarded as rough estimates rather than absolute data.

CPU Thread: Hierarchical visualization of the wall clock time of various functions inside VrApi, along with OpenGL draw calls inside the host application. Log messages are displayed on their corresponding CPU thread as icons. Mouse over each icon to display the corresponding message (blue circles), warning (yellow squares), or error (red diamonds).

Sensor: General sensor data visualizer. CPU and GPU clocks are visualized in the screenshot shown above, but other data may be displayed here, such as thermal sensors, IMU data, et cetera.


Using Oculus Remote Monitor to Identify Common Issues

Tearing

Tearing occurs in VR applications any time TimeWarp rendering fails to render ahead of scanout. VrApi attempts to detect this with a GPU Sync Object to determine when the GPU completes rendering distortion for a given eye. If for any reason it does not complete in time, VrApi prints a warning to logcat, which Oculus Remote Monitor picks up:

V-sync %d: Eye %d, CPU latency %f, GPU latency %f, Total latency %f

If you are running on a Samsung GALAXY S6, you may also see tearing events by looking at the GPU Context that is running WarpToScreen.

Example: In this example, because the refresh rate of the display is 60 Hz, the ideal running time of WarpToScreen is 16.66 ms, but a scheduling/priority issue in the application caused the second eye to be executed 10 ms late, pushing WarpToScreen to run for 26.66 ms. The actual eye distortion draw calls are barely visible as two distinct notches under each WarpToScreen event on the GPU.
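Outside of Remote Monitor, a quick way to check for tearing during a repro is to watch the once-per-second VrApi stats line in logcat (described under Basic Performance Stats through Logcat) and look at the Tear counter; a minimal sketch:

adb logcat -c
adb logcat -s VrApi

A non-zero Tear value in the stats line confirms that tearing occurred during that second.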

Missed Frames

VrApi reports the Frame Index that was submitted inside vrapi_SubmitFrame, as well as the Frame Index that is currently being TimeWarped. This allows you to easily identify missed frames and to track latency between app render and distortion. Every time a Frame Index is reported from VrApi, Oculus Remote Monitor marks it on the timeline for the associated thread with a vertical grey line. If the Frame Index arrives out of order, the line is changed to red, helping you quickly identify problem areas.

In the above example, the TimeWarp thread has two out-of-order frames visible. This typically happens when the GPU was unable to finish rendering a frame from the application in time for TimeWarp to begin sampling from it. Zooming in, we can investigate the cause by looking at the actual Frame Index value.

Just as we thought, the same Frame Index is sampled twice, which results in the first red line. But the next frame is able to catch up, which means it jumps ahead two frames, resulting in the second red line along with a frame that is never displayed. This was all probably caused by the application failing to complete GPU work on time.

Application OpenGL Performance Issues

Oculus Remote Monitor is capable of capturing OpenGL calls across the entire process (enabled with the Graphics API option). Application-specific performance issues can therefore be spotted at times. In the example below, an application was mapping the same buffer several times a frame for a particle effect. On a Note 4 running KitKat, the GL driver triggered a sync point on the second glUnmapBuffer call, causing it to take 2.73 ms -- without the sync point, this same call takes around 0.03 ms. After spotting this issue, the developer was able to quickly fix the buffer usage and reclaim that CPU time.

Unlocked CPU Clocks

VrApi attempts to lock the CPU and GPU clocks at particular frequencies to ensure some level of execution speed and scheduling guarantees. These are configurable via the CPULevel and GPULevel available in the API. When in VR Developer Mode, the clocks may occasionally unlock when out of the headset for too long. When this happens, the CPU/GPU Clock Sensors go from extremely flat to extremely noisy, typically causing many performance issues like tearing and missed frames, as seen below:

Note that on a Samsung GALAXY S6, we allow the clocks to boost slightly under certain conditions, but only by a small amount in typical cases, and they should never drop below the requested level. It is also fairly common for some cores to go completely on and offline occasionally.

Network Bandwidth Stalls

VrCapture uses a fixed-size FIFO internally to buffer events before they are streamed over the network. If this buffer fills faster than we can stream its contents, we are left in a tricky situation. If your network connection stalls long enough for any reason, it eventually causes the host application to stall as well. This is easily spotted in Oculus Remote Monitor -- look for around two seconds of extremely long events on the OVR::Capture thread followed by other threads stalling as well. We provide a large internal buffer, so a few network hitches shouldn't affect your application, but if they persist long enough, a random event inside your application will eventually stall until the Capture thread is able to flush the buffer.

In the example below, several seconds of poor network connectivity (seen as long AsyncStream_Flush events) eventually caused the MapAndCopy event on the application's render thread to stall until it was eventually released by the Capture thread:

If you find it difficult to capture reliably because of this issue, we recommend disabling Frame Buffer Capturing before connecting, as this feature consumes the bulk of the bandwidth required.

Thermal Limits

If the CPU/GPU Levels your application requests are too high, the internal SoC and battery temperatures will rise slowly, yet uncontrollably, until they hit the thermal limit. When this happens, GearVR Service will terminate the application and display a thermal warning until the device cools down. It may take quite a long time to encounter this scenario during testing. Monitoring thermals in Oculus Remote Monitor is a great way to quickly see if your application causes the device temperature to rise perpetually.
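As a rough supplementary check between Remote Monitor sessions, you can also read the battery temperature that Android itself reports; this is not the SoC sensor that Remote Monitor plots, but it rises along with sustained load. A minimal sketch (the value on the temperature line is in tenths of a degree Celsius, so 345 means 34.5 C):

adb shell dumpsys battery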

Mouse over the Sensor graph to see the precise readout at any given time. We recommend keeping an eye on it. If the temperature exceeds the device's first thermal trip point, the graph turns bright red; this typically occurs a few minutes before GearVR Service shuts the application down, and should serve as a stern warning that you should probably lower the CPU/GPU Levels.

OVR Metrics Tool

OVR Metrics Tool is an application that provides performance metrics for Oculus mobile applications. OVR Metrics Tool reports application frame rate, heat, GPU and CPU throttling values, and the number of tears and stale frames per second. It is available for download from our Downloads page.

OVR Metrics Tool can be run in two modes. In Report Mode, it displays a performance report about a VR session after it is complete. Report data may be easily exported as CSV data and PNG graphs. In Performance HUD Mode, OVR Metrics Tool renders performance graphs as a VR overlay over any running Oculus application.

OVR Metrics Tool may be used with any Oculus mobile application, including those built with Unity, Unreal, or our native mobile SDK.

Installation

Install OVRMetricsTool.apk on any Gear VR-compatible Android phone. For more details, see Using adb to Install Applications in our adb guide or the "Install an App" section of Google's adb documentation.

Report Mode Usage

Run your VR application and conduct a session you wish to gather data for. Note that data will be logged from every application you run, including Oculus Home.

After you have finished your session and exited VR, open OVR Metrics Tool from your phone's desktop launcher and click the log entry that corresponds to your application. You will see a series of graphs describing the performance of your session. Use the buttons at the bottom of the report to save the information or share an image of the report.
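For reference, installing the tool is a standard adb install, as described in Using adb to Install Applications later in this guide; a minimal sketch, assuming the APK is in your current directory:

adb install -r OVRMetricsTool.apk

The -r flag simply replaces any previously installed copy.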

Performance HUD Usage

Connect to your device using adb (for general instructions, see our adb guide), then enable Performance HUD Mode with the following commands:

adb shell setprop debug.oculus.notifpackage com.oculus.ovrmonitormetricsservice
adb shell setprop debug.oculus.notifclass com.oculus.ovrmonitormetricsservice.panelservice
adb shell am force-stop com.oculus.ovrmonitormetricsservice

After setting these properties, restart OVR Metrics Tool. Note that these properties must be set after each phone restart.

Once Performance HUD Mode is enabled, you can customize the HUD itself using the Options screen in the OVR Metrics Tool toolbar menu.

To disable Performance HUD Mode, restart your phone or send the following two adb commands:

adb shell setprop debug.oculus.notifpackage ''
adb shell setprop debug.oculus.notifclass ''

Android Debugging

This section describes utilities, tips, and best practices for improving debugging for any application on Android platforms. Most of these tips apply to both native and Unity applications.

Adb

This guide describes how to perform common tasks using adb.

Android Debug Bridge (adb) is included in the Android SDK and is the main tool used to communicate with an Android device for debugging. We recommend familiarizing yourself with it by reading the official Android documentation. For a list of available commands and options, make a connection as described below and enter:

adb help

Connecting to a Device with adb

Using adb from the OS shell, it is possible to connect to and communicate with an Android device either directly through USB, or via TCP/IP over a Wi-Fi connection. You must install the Android SDK and appropriate device drivers to use adb (see Device Setup for more information).

To connect a device via USB, plug the device into the PC with a compatible USB cable. After connecting, open up an OS shell and type:

adb devices

If the device is connected properly, adb will show the device id list, such as:

List of devices attached
ce0551e7        device

adb cannot be used if no device is detected. If your device is not listed, the most likely problem is that you do not have the correct Samsung USB driver -- see Setting up your System to Detect your Android Device for more information. You may also wish to try another USB cable and/or port.

Connecting adb via Wi-Fi

Connecting to a device via USB is generally faster than using a TCP/IP connection, but a TCP/IP connection is sometimes indispensable, such as when debugging Gear VR behavior that only occurs when the phone is placed in the headset.

To connect via TCP/IP, first determine the IP address of the device and make sure the device is already connected via USB. You can find the IP address of the device under Settings -> About device -> Status. Then issue the following commands:

adb tcpip <port>
adb connect <ipaddress>:<port>

For example:

> adb tcpip 5555
restarting in TCP mode port: 5555
> adb connect <ipaddress>:5555
connected to <ipaddress>:5555

The device may now be disconnected from the USB port. As long as adb devices shows only a single device, all adb commands will be issued for the device via Wi-Fi.

To stop using the Wi-Fi connection, issue the following adb command from the OS shell:

adb disconnect

Using adb to Install Applications

To install an APK on your mobile device using adb, connect to the target device and verify connection using adb devices as described above. Then install the APK with the following command:

adb install <apk-path>

Use the -r option to overwrite an existing APK of the same name already installed on the target device. For example:

adb install -r C:\Dev\Android\MyProject\VrApp.apk

For more information, see the Installing an Application section of Android's Android Debug Bridge guide.

Connection Troubleshooting

Note that depending on the particular device, detection may be finicky from time to time. In particular, on some devices, connecting while a VR app is running or when adb is waiting for a device may prevent the device from being reliably detected. In those cases, try ending the app and stopping adb using Ctrl-C before reconnecting the device. Alternatively, the adb service can be stopped using the following command, after which the adb service will be automatically restarted when executing the next command:

adb kill-server

Multiple devices may be attached at once, and this is often valuable for debugging client/server applications. Also, when connected via Wi-Fi and also plugged in via USB, adb will show a single device as two devices.

In the multiple-device case, adb must be told which device to work with using the -s switch. For example, with two devices connected, the adb devices command might show:

List of devices attached
ce0551e7            device
<ipaddress>:5555    device

The listed devices may be two separate devices, or one device that is connected both via Wi-Fi and plugged into USB (perhaps to charge the battery). In this case, all adb commands must take the form:

adb -s <device id> <command>

where <device id> is the id reported by adb devices. So, for example, to issue a logcat command to the device connected via TCP/IP:

adb -s <ipaddress>:5555 logcat -c

and to issue the same command to the device connected via USB:

adb -s ce0551e7 logcat -c

Logcat

The Android SDK provides the logcat logging utility, which is essential for determining what an application and the Android OS are doing. To use logcat, connect the Android device via USB or Wi-Fi, launch an OS shell, and type:

adb logcat

If the device is connected and detected, the log will immediately begin outputting to the shell. In most cases, this raw output is too verbose to be useful. Logcat solves this by supporting filtering by tags. To see only a specific tag, use:

adb logcat -s <tag>

This example:

adb logcat -s VrApp

will show only output with the VrApp tag.

In the native VrAppFramework code, messages can generally be printed using the LOG() macro. In most source files this is defined to pass a tag specific to that file. Log.h defines a few additional logging macros, but all resolve to calling __android_log_print().

Simply issue the above command, give the shell a moment to copy the buffered output to the log file, and then end adb (Ctrl+C in a Windows cmd prompt or OS X Terminal prompt). Then search the log for backtrace: to locate the stack trace beginning with the crash.

If too much time has elapsed and the log does not show the backtrace, a full dump state of the crash should still exist. Use the following command to redirect the entire dumpstate to a file:

adb shell dumpstate > dumpstate.log

Copying the full dumpstate to a log file usually takes significantly longer than simply capturing the currently buffered logcat log, but it may provide additional information about the crash.

Getting a Better Stack Trace

The backtrace in a logcat capture or dumpstate generally shows the function where the crash occurred, but does not provide line numbering. To get more information about a crash, the Android Native Development Kit (NDK) must be installed. When the NDK is installed, the ndk-stack utility can be used to parse the logcat log or dumpstate for more detailed information about the state of the stack.

To use ndk-stack, issue the following:

ndk-stack -sym <path to symbol file> -dump <source log file> > stack.log

For example, this command:

ndk-stack -sym VrNative\Oculus360Photos\obj\local\armeabi-v7a -dump crash.log > stack.log

uses the symbol information for Oculus 360 Photos to output a more detailed stack trace to a file named stack.log, using the backtrace found in crash.log.

Application Performance Analysis

A guide to performance analysis during mobile VR application development.

While this document is geared toward native application development, most of the tools presented here are also useful for improving performance in Android applications developed in Unity or other engines.

Basic Performance Stats through Logcat

A simple way to get some basic performance numbers is to use logcat with a filter for VrApi. Sample usage:

adb logcat -s VrApi

A line resembling this example will be displayed every second:

I/VrApi (26422): FPS=60,Prd=49ms,Tear=0,Early=60,Stale=0,VSnc=1,Lat=1,CPU4/GPU=2/1,1000/350MHz,OC=FF,TA=F0/F0/F0/F0,SP=F/F/N/N,Mem=1026MHz,Free=1714MB,PSM=0,PLS=0,Temp=31.5C/27.

FPS: Frames Per Second. An application that performs well will continuously display 60 FPS.

Prd: The number of milliseconds between the latest sensor sampling for tracking and the anticipated display time of new eye images. For a single-threaded application this time will normally be 33 milliseconds. For an application with a separate renderer thread (like Unity) this time will be 49 milliseconds. New eye images are not generated for the time the rendering code is executed, but are instead generated for the time they will be displayed.

When an application begins generating new eye images, the time they will be displayed is predicted. The tracking state (head orientation, et cetera) is also predicted ahead for this time.

Tears: The number of tears per second. A well-behaved and well-performing application will display zero tears. Tears can be related to Asynchronous TimeWarp, which takes the last completed eye images and warps them onto the display. The time warp runs on a high-priority thread using a high-priority OpenGL ES context. As such, the time warp should be able to preempt any application rendering and warp the latest eye images onto the display just in time for the display refresh. However, when there are a lot of heavyweight background operations, or the application renders many triangles to a small part of the screen, or uses a very expensive fragment program, then the time warp may have to wait for this work to be completed. This may result in the time warp not executing in time for the display refresh, which, in turn, may result in a visible tear line across one or both eyes.

Early: The number of frames that are completed a whole frame early.

Stale: The number of stale frames per second. A well-behaved application performing well displays zero stale frames. New eye images are generated for a predicted display time. If, however, the new eye images are not completed by this time, then the time warp may have to re-project and display a previous set of eye images. In other words, the time warp displays a stale frame. Even though the time warp re-projects old eye images to make them line up with the latest head orientation, the user may still notice some degree of intra-frame motion judder when displaying old images.

Vsnc: The value of MinimumVsyncs, which is the number of V-Syncs between displayed frames. This value directly controls the frame rate. For instance, MinimumVsyncs = 1 results in 60 FPS and MinimumVsyncs = 2 results in 30 FPS.

CPU0/GPU: The CPU and GPU clock levels and associated clock frequencies, set by the application. Lower clock levels result in less heat and less battery drain.

F/F [Thread Affinity]: This field describes the thread affinity of the main thread (first hex nibble) and renderer thread (second hex nibble). Each bit represents a core, with 1 indicating affinity and 0 indicating no affinity. For example, F/1 indicates the main thread can run on any of the lower four cores, while the renderer thread can only run on the first core. In practice, F/F and F0/F0 are common results.

Free: The amount of available memory, displayed every second. It is important to keep a reasonable amount of memory available to prevent Android from killing backgrounded applications, like Oculus Home.

PLS: Power Level State, where "0" = normal, "1" = throttled, and "2" = undock required.

Temp: Temperature in degrees Celsius. Well-optimized applications do not cause the temperature to rise quickly. There is always room for more optimization, which allows lower clock levels to be used, which, in turn, reduces the amount of heat that is generated.

SysTrace

SysTrace is the profiling tool that comes with the Android Developer Tools (ADT) Bundle. SysTrace can record detailed logs of system activity that can be viewed in the Google Chrome browser. With SysTrace, it is possible to see an overview of what the entire system is doing, rather than just a single app.
This can be invaluable for resolving scheduling conflicts or finding out exactly why an app isn't performing as expected.

Under Windows, the simplest method for using SysTrace is to run the monitor.bat file that was installed with the ADT Bundle. This can be found in the ADT Bundle installation folder (e.g., C:\android\adt-bundle-windows-x86_) under the sdk/tools folder. Double-click monitor.bat to start Android Debug Monitor.
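If you prefer the command line, the same kind of trace can usually be captured with the systrace script included with the Android SDK; this is a sketch under the assumption that your SDK ships the legacy Python-based systrace tool in platform-tools/systrace, and the trace categories listed here are only an example:

cd <android-sdk>/platform-tools/systrace
python systrace.py --time=10 -o mytrace.html sched gfx view

Open the resulting mytrace.html in Google Chrome, just as with traces captured from the Monitor UI.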

Select the desired device in the left-hand column and click the icon highlighted in red above to toggle Systrace logging. A dialog will appear enabling selection of the output .html file for the trace. Once the trace is toggled off, the trace file can be viewed by opening it up in Google Chrome.

You can use the WASD keys to pan and zoom while navigating the HTML doc. For additional keyboard shortcuts, refer to the Systrace documentation.

NDK Profiler

Use NDK Profiler to generate gprof-compatible profile information.

The Android NDK profiler is a port of gprof for Android. The latest version, which has been tested with this release, is 3.2, and can be downloaded from the android-ndk-profiler project page.

Once downloaded, unzip the package contents to your NDK sources path, e.g.: C:\Dev\Android\android-ndk-r9c\sources.

Add the NDK pre-built tools to your PATH, e.g.: C:\Dev\Android\android-ndk-r9c\toolchains\arm-linux-androideabi-4.8\prebuilt\windows-x86_64\bin.

Android Makefile Modifications

1. Compile with profiling information and define NDK_PROFILE:
   LOCAL_CFLAGS := -pg -DNDK_PROFILE
2. Link with the ndk-profiler library:
   LOCAL_STATIC_LIBRARIES := android-ndk-profiler
3. Import the android-ndk-profiler module:
   $(call import-module,android-ndk-profiler)

Source Code Modifications

Add calls to monstartup and moncleanup to your Init and Shutdown functions. Do not call monstartup or moncleanup more than once during the lifetime of your app.

#if defined( NDK_PROFILE )
extern "C" void monstartup( char const * );
extern "C" void moncleanup();
#endif

extern "C"
{

void Java_oculus_VrActivity2_nativeSetAppInterface( JNIEnv * jni, jclass clazz )
{
#if defined( NDK_PROFILE )
    setenv( "CPUPROFILE_FREQUENCY", "500", 1 ); // interrupts per second, default 100
    monstartup( "libovrapp.so" );
#endif
    app->init();
}

void Java_oculus_VrActivity2_nativeShutdown( JNIEnv *jni )
{
    app->shutdown();
#if defined( NDK_PROFILE )
    moncleanup();
#endif
}

} // extern "C"

Manifest File Changes

You will need to add permission for your app to write to the SD card. The gprof output file is saved in /sdcard/gmon.out.

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

Profiling your App

To generate profiling data, run your app and trigger the moncleanup function call by pressing the Back button on your phone. Based on the state of your app, this will be triggered by OnStop() or OnDestroy(). Once moncleanup has been triggered, the profiling data will be written to your Android device at /sdcard/gmon.out.

Copy the gmon.out file to the folder where your project is located on your PC using the following command:

adb pull /sdcard/gmon.out

To view the profile information, run the gprof tool, passing to it the non-stripped library, e.g.:

arm-linux-androideabi-gprof obj/local/armeabi/libovrapp.so

For information on interpreting the gprof profile information, see the GNU gprof output documentation.

Snapdragon Profiler

The Qualcomm Snapdragon Profiler allows developers to analyze performance on Android devices with Snapdragon processors over USB, including CPU, GPU, memory, power, and thermal performance.

Samsung Snapdragon phones running VrDriver and later auto-detect when Snapdragon Profiler is running, and configure themselves for best capture.

Basic Usage

1. Download and install the Snapdragon Profiler from Qualcomm.
2. Attach a Snapdragon-based S7 with Android M or N via USB, with VR developer mode enabled (for instructions, see Developer Mode: Running Apps Outside of the Gear VR Headset).
3. Run Snapdragon Profiler. It is important to do this before starting the app you want to use.
4. Select Connect to a Device. If you do this right after starting, you may need to wait a few seconds for the phone icon to turn green (driver files are being transferred).
5. Click connect.
6. Run the VR app that you want to profile. Note that you will see poor performance because of optimizations related to the performance testing; it will not affect your session.
7. Select either trace or snapshot capture modes.
8. In the Data Sources panel, select the app name. Note that only applications with the OpenGL requirement set in the manifest will show up. If the application has the required manifest setting but does not appear, try rebooting the phone and restarting the Snapdragon Profiler.
9. For traces, enable OpenGL ES -> Rendering stages for the most useful information, then click start capture, wait a second or two, and click stop capture.
10. For snapshots, you can capture a frame of commands without any extra options checked. The capture process can take tens of seconds if there is a lot of texture data to transfer.
11. We recommend shutting down and restarting the Snapdragon Profiler between sets of tests.
12. Disconnect the Snapdragon Profiler before unplugging your phone, so it can clean up. Don't forget this step!

Native Debugging

This guide provides basic recommendations for working with the Oculus Mobile SDK in Android Studio and Gradle, and is intended to supplement the relevant Android Studio documentation.

Native Debugging with Android Studio

This section introduces debugging our sample native apps in Android Studio.

Note: Native support in Android Studio is still under development. Some developers have reported problems using run-as with the Note 4 with Lollipop (5.0.x) and S6 with 5.0.0, which may cause issues with debugging. If you have problems debugging, try updating to the latest system software, and please let us know on the Oculus Forums.

The default configurations created during project import only support Java debugging. Select Edit Configurations in the Configurations drop-down menu in the Android Studio toolbar. Create a new Android Native configuration as shown below.

In the General tab of the Run/Debug Configuration dialog box, assign your configuration a name, select the target module, and select the target device mode.

In the Native tab of the Run/Debug Configuration dialog box, add symbol paths.

Note that ndk-build places stripped libraries inside the libs/ directory. You must point the symbol search paths at the obj/local/<arch> directory. This is also not a recursive search path, so you must put the full path to the obj/local/armeabi-v7a directory.



More information

Enhanced Push-to-Talk Application for Android

Enhanced Push-to-Talk Application for Android AT&T Business Mobility Enhanced Push-to-Talk Application for Android Land Mobile Radio (LMR) Version Release 8.3 Table of Contents Introduction and Key Features 2 Application Installation & Getting Started

More information

iphoto Getting Started Get to know iphoto and learn how to import and organize your photos, and create a photo slideshow and book.

iphoto Getting Started Get to know iphoto and learn how to import and organize your photos, and create a photo slideshow and book. iphoto Getting Started Get to know iphoto and learn how to import and organize your photos, and create a photo slideshow and book. 1 Contents Chapter 1 3 Welcome to iphoto 3 What You ll Learn 4 Before

More information

Hytera. PD41X Patrol Management System. Installation and Configuration Guide

Hytera. PD41X Patrol Management System. Installation and Configuration Guide Hytera PD41X Patrol Management System Installation and Configuration Guide Documentation Version: 01 Release Date: 03-2015 Copyright Information Hytera is the trademark or registered trademark of Hytera

More information

Brightness and Contrast Control Reference Guide

Brightness and Contrast Control Reference Guide innovation Series Scanners Brightness and Contrast Control Reference Guide A-61506 Part No. 9E3722 CAT No. 137 0337 Using the Brightness and Contrast Control This Reference Guide provides information and

More information

LOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR

LOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR LOOKING AHEAD: UE4 VR Roadmap Nick Whiting Technical Director VR / AR HEADLINE AND IMAGE LAYOUT RECENT DEVELOPMENTS RECENT DEVELOPMENTS At Epic, we drive our engine development by creating content. We

More information

Printer Software Guide

Printer Software Guide Printer Software Guide (For Canon CP Printer Solution Disk Version 4) Macintosh 1 Contents Safety Precautions...3 Read This First...4 About the Manuals...4 Printing Flow Diagram...5 Printing...7 Starting

More information

Save System for Realistic FPS Prefab. Copyright Pixel Crushers. All rights reserved. Realistic FPS Prefab Azuline Studios.

Save System for Realistic FPS Prefab. Copyright Pixel Crushers. All rights reserved. Realistic FPS Prefab Azuline Studios. User Guide v1.1 Save System for Realistic FPS Prefab Copyright Pixel Crushers. All rights reserved. Realistic FPS Prefab Azuline Studios. Contents Chapter 1: Welcome to Save System for RFPSP...4 How to

More information

IVI STEP TYPES. Contents

IVI STEP TYPES. Contents IVI STEP TYPES Contents This document describes the set of IVI step types that TestStand provides. First, the document discusses how to use the IVI step types and how to edit IVI steps. Next, the document

More information

Proprietary and restricted rights notice

Proprietary and restricted rights notice Proprietary and restricted rights notice This software and related documentation are proprietary to Siemens Product Lifecycle Management Software Inc. 2012 Siemens Product Lifecycle Management Software

More information

Unreal. Version 1.8.0

Unreal. Version 1.8.0 Unreal Version 1.8.0 2 Introduction Unreal Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights

More information

MUSC 1331 Lab 3 (Northwest) Using Software Instruments Creating Markers Creating an Audio CD of Multiple Sources

MUSC 1331 Lab 3 (Northwest) Using Software Instruments Creating Markers Creating an Audio CD of Multiple Sources MUSC 1331 Lab 3 (Northwest) Using Software Instruments Creating Markers Creating an Audio CD of Multiple Sources Objectives: 1. Learn to use Markers to identify sections of a sequence/song/recording. 2.

More information

SKF Shaft Alignment Tool Horizontal machines app

SKF Shaft Alignment Tool Horizontal machines app SKF Shaft Alignment Tool Horizontal machines app Short flex couplings Instructions for use Table of contents 1. Using the Horizontal shaft alignment app... 2 1.1 How to change the app language...2 1.2

More information

Projects Connector User Guide

Projects Connector User Guide Version 4.3 11/2/2017 Copyright 2013, 2017, Oracle and/or its affiliates. All rights reserved. This software and related documentation are provided under a license agreement containing restrictions on

More information

This guide provides information on installing, signing, and sending documents for signature with

This guide provides information on installing, signing, and sending documents for signature with Quick Start Guide DocuSign for Dynamics 365 CRM 5.2 Published: June 15, 2017 Overview This guide provides information on installing, signing, and sending documents for signature with DocuSign for Dynamics

More information

Exploring Virtual Reality (VR) with ArcGIS. Euan Cameron Simon Haegler Mark Baird

Exploring Virtual Reality (VR) with ArcGIS. Euan Cameron Simon Haegler Mark Baird Exploring Virtual Reality (VR) with ArcGIS Euan Cameron Simon Haegler Mark Baird Agenda Introduction & Terminology Application & Market Potential Mobile VR with ArcGIS 360VR Desktop VR with CityEngine

More information

Celtx Studios Owner's Manual January 2011

Celtx Studios Owner's Manual January 2011 January 2011 Get the most out of Celtx Studios with the latest version of Celtx - available free at http://celtx.com Screen captures are made using Windows OS. Some image dialogs differ slightly on Mac

More information

RAZER RAIJU TOURNAMENT EDITION

RAZER RAIJU TOURNAMENT EDITION RAZER RAIJU TOURNAMENT EDITION MASTER GUIDE The Razer Raiju Tournament Edition is the first Bluetooth and wired controller to have a mobile configuration app, enabling control from remapping multi-function

More information

Owner s Manual. Page 1 of 23

Owner s Manual. Page 1 of 23 Page 1 of 23 Installation Instructions Table of Contents 1. Getting Started! Installation via Connect! Activation with Native Instruments Service Center 2. Pulse Engines Page! Pulse Engine Layers! Pulse

More information

An Escape Room set in the world of Assassin s Creed Origins. Content

An Escape Room set in the world of Assassin s Creed Origins. Content An Escape Room set in the world of Assassin s Creed Origins Content Version Number 2496 How to install your Escape the Lost Pyramid Experience Goto Page 3 How to install the Sphinx Operator and Loader

More information

Portfolio. Swaroop Kumar Pal swarooppal.wordpress.com github.com/swarooppal1088

Portfolio. Swaroop Kumar Pal swarooppal.wordpress.com github.com/swarooppal1088 Portfolio About Me: I am a Computer Science graduate student at The University of Texas at Dallas. I am currently working as Augmented Reality Engineer at Aireal, Dallas and also as a Graduate Researcher

More information

Chanalyzer by MetaGeek USER GUIDE page 1

Chanalyzer by MetaGeek USER GUIDE page 1 Chanalyzer 5 Chanalyzer by MetaGeek USER GUIDE page 1 Chanalyzer 5 spectrum analysis software Table of Contents Introduction What is Wi-Spy? What is Chanalyzer? Installation Choose a Wireless Network Interface

More information

Nikon View DX for Macintosh

Nikon View DX for Macintosh Contents Browser Software for Nikon D1 Digital Cameras Nikon View DX for Macintosh Reference Manual Overview Setting up the Camera as a Drive Mounting the Camera Camera Drive Settings Unmounting the Camera

More information

i800 Series Scanners Image Processing Guide User s Guide A-61510

i800 Series Scanners Image Processing Guide User s Guide A-61510 i800 Series Scanners Image Processing Guide User s Guide A-61510 ISIS is a registered trademark of Pixel Translations, a division of Input Software, Inc. Windows and Windows NT are either registered trademarks

More information

Enhanced Push-to-Talk Application for iphone

Enhanced Push-to-Talk Application for iphone AT&T Business Mobility Enhanced Push-to-Talk Application for iphone Standard Version Release 8.3 Table of Contents Introduction and Key Features 2 Application Installation & Getting Started 2 Navigating

More information

Experiment 02 Interaction Objects

Experiment 02 Interaction Objects Experiment 02 Interaction Objects Table of Contents Introduction...1 Prerequisites...1 Setup...1 Player Stats...2 Enemy Entities...4 Enemy Generators...9 Object Tags...14 Projectile Collision...16 Enemy

More information

APNT#1166 Banner Engineering Driver v How To Guide

APNT#1166 Banner Engineering Driver v How To Guide Application Note #1166: Banner Engineering Driver v1.10.02 How To Guide Introduction This Application Note is intended to assist users in using the GP-Pro EX Version 2..X\2.10.X Banner Engineering Corp.

More information

HMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University

HMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University HMD based VR Service Framework July 31 2017 Web3D Consortium Kwan-Hee Yoo Chungbuk National University khyoo@chungbuk.ac.kr What is Virtual Reality? Making an electronic world seem real and interactive

More information

Live Agent for Administrators

Live Agent for Administrators Salesforce, Spring 18 @salesforcedocs Last updated: January 11, 2018 Copyright 2000 2018 salesforce.com, inc. All rights reserved. Salesforce is a registered trademark of salesforce.com, inc., as are other

More information

TOON RACER v1.3. Documentation: 1.3. Copyright Sperensis Applications Page 1

TOON RACER v1.3. Documentation: 1.3. Copyright Sperensis Applications   Page 1 TOON RACER v1.3 Documentation: 1.3 Copyright Sperensis Applications www.sperensis.com Page 1 Unity 5.x Upgrade 4 Contents Re-skin UI Interface of MenuScene and PhysicsCar 5 Re-Skin Environment 6 Player

More information

Operating Instructions Pocket Pictor For use with Pocket Pc s

Operating Instructions Pocket Pictor For use with Pocket Pc s Introduction Operating Instructions Pocket Pictor For use with Pocket Pc s The compact size and low power consumption of Pocket PC s make them ideal for use in the field. Pocket Pictor is designed for

More information

Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM

Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM PRODUCT MANUAL CAWTS03 v3.13 Android ABOUT CASE AIR The Case Air Wireless Tethering System connects and transfers images instantly from your camera

More information

S! Applications & Widgets

S! Applications & Widgets S! Appli...-2 Using S! Applications... -2 Mobile Widget... -3 Customizing Standby Display (Japanese)... -3 Additional Functions... -6 Troubleshooting... - S! Applications & Widgets -1 S! Appli Using S!

More information

Virtual components in assemblies

Virtual components in assemblies Virtual components in assemblies Publication Number spse01690 Virtual components in assemblies Publication Number spse01690 Proprietary and restricted rights notice This software and related documentation

More information

PaperCut PaperCut Payment Gateway Module - Blackboard Quick Start Guide

PaperCut PaperCut Payment Gateway Module - Blackboard Quick Start Guide PaperCut PaperCut Payment Gateway Module - Blackboard Quick Start Guide This guide is designed to supplement the Payment Gateway Module documentation and provides a guide to installing, setting up and

More information

Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM

Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM PRODUCT MANUAL CAWTS03 v3.14 Windows ABOUT CASE AIR The Case Air Wireless Tethering System connects and transfers images instantly from your camera

More information

USER MANUAL PFF-1010BLACK

USER MANUAL PFF-1010BLACK USER MANUAL PFF-1010BLACK www.denver-electronics.com Before connecting, operating or adjusting this product, please read this user s manual carefully and completely. ENGLISH 1 2 3 4 5 USB RESET DC 1. POWER

More information

BITKIT. 8Bit FPGA. Updated 5/7/2018 (C) CraftyMech LLC.

BITKIT. 8Bit FPGA. Updated 5/7/2018 (C) CraftyMech LLC. BITKIT 8Bit FPGA Updated 5/7/2018 (C) 2017-18 CraftyMech LLC http://craftymech.com About The BitKit is an 8bit FPGA platform for recreating arcade classics as accurately as possible. Plug-and-play in any

More information

OCULUS VR, LLC. Oculus Developer Guide SDK Version 0.4

OCULUS VR, LLC. Oculus Developer Guide SDK Version 0.4 OCULUS VR, LLC Oculus Developer Guide SDK Version 0.4 Date: October 24, 2014 2014 Oculus VR, LLC. All rights reserved. Oculus VR, LLC Irvine CA Except as otherwise permitted by Oculus VR, LLC ( Oculus

More information

Designing in the context of an assembly

Designing in the context of an assembly SIEMENS Designing in the context of an assembly spse01670 Proprietary and restricted rights notice This software and related documentation are proprietary to Siemens Product Lifecycle Management Software

More information

Unity Game Development Essentials

Unity Game Development Essentials Unity Game Development Essentials Build fully functional, professional 3D games with realistic environments, sound, dynamic effects, and more! Will Goldstone 1- PUBLISHING -J BIRMINGHAM - MUMBAI Preface

More information

Easy Input Helper Documentation

Easy Input Helper Documentation Easy Input Helper Documentation Introduction Easy Input Helper makes supporting input for the new Apple TV a breeze. Whether you want support for the siri remote or mfi controllers, everything that is

More information

Shader "Custom/ShaderTest" { Properties { _Color ("Color", Color) = (1,1,1,1) _MainTex ("Albedo (RGB)", 2D) = "white" { _Glossiness ("Smoothness", Ran

Shader Custom/ShaderTest { Properties { _Color (Color, Color) = (1,1,1,1) _MainTex (Albedo (RGB), 2D) = white { _Glossiness (Smoothness, Ran Building a 360 video player for VR With the release of Unity 5.6 all of this became much easier, Unity now has a very competent media player baked in with extensions that allow you to import a 360 video

More information

User Guide: PTT Radio Application - ios. User Guide. PTT Radio Application. ios. Release 8.3

User Guide: PTT Radio Application - ios. User Guide. PTT Radio Application. ios. Release 8.3 User Guide PTT Radio Application ios Release 8.3 December 2017 Table of Contents Contents 1. Introduction and Key Features... 5 2. Application Installation & Getting Started... 6 Prerequisites... 6 Download...

More information

Chanalyzer Lab. Chanalyzer Lab by MetaGeek USER GUIDE page 1

Chanalyzer Lab. Chanalyzer Lab by MetaGeek USER GUIDE page 1 Chanalyzer Lab Chanalyzer Lab by MetaGeek USER GUIDE page 1 Chanalyzer Lab spectrum analysis software Table of Contents Control Your Wi-Spy What is a Wi-Spy? What is Chanalyzer Lab? Installation 1) Download

More information

User Manual. User Manual. Version Last change : March Page 1 ID station User Manual

User Manual. User Manual. Version Last change : March Page 1 ID station User Manual User Manual Version 7.4.3 Last change : March 2017 Page 1 Introduction This is the user manual of the new fastid, the biometric ID and passport photo system. This user guide helps you in everyday use.

More information

Quick Immunity Sequencer

Quick Immunity Sequencer Part No. Z1-003-152, IB006433 Nov. 2006 USERʼS MANUAL PCR-LA Series Application Software SD003-PCR-LA Quick Immunity Sequencer Ver. 1.0 Use of This Manual Please read through and understand this User s

More information

Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM

Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM Case Air Wireless TETHERING AND CAMERA CONTROL SYSTEM PRODUCT MANUAL CAWTS03 v3.13 Mac OS ABOUT CASE AIR The Case Air Wireless Tethering System connects and transfers images instantly from your camera

More information

CONTENTS 1. PACKAGE CONTENTS / SYSTEM REQUIREMENTS REGISTRATION / TECHNICAL SUPPORT DEVICE LAYOUT... 6

CONTENTS 1. PACKAGE CONTENTS / SYSTEM REQUIREMENTS REGISTRATION / TECHNICAL SUPPORT DEVICE LAYOUT... 6 Control goes beyond pure power, it requires absolute adaptability. Complete with the features of a full-fledged console controller, the Razer Serval elevates your android gaming experience to a whole new

More information

Wireless Handy Scanner

Wireless Handy Scanner User Guide Works with iscanair Go Scanner App Wireless Handy Scanner For smartphones, tablets, and computers Wi-Fi 802.11g/n supported All trademarks are the property of their respective owners and all

More information

Practical Assignment 1: Arduino interface with Simulink

Practical Assignment 1: Arduino interface with Simulink !! Department of Electrical Engineering Indian Institute of Technology Dharwad EE 303: Control Systems Practical Assignment - 1 Adapted from Take Home Labs, Oklahoma State University Practical Assignment

More information

User Manual. cellsens 1.16 LIFE SCIENCE IMAGING SOFTWARE

User Manual. cellsens 1.16 LIFE SCIENCE IMAGING SOFTWARE User Manual cellsens 1.16 LIFE SCIENCE IMAGING SOFTWARE Any copyrights relating to this manual shall belong to OLYMPUS CORPORATION. We at OLYMPUS CORPORATION have tried to make the information contained

More information