Unreal. Version 1.19


Introduction

Copyrights and Trademarks
© 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their respective owners. Certain materials included in this publication are reprinted with the permission of the copyright holder.

Contents

Unreal Engine...5
Visual Studio Quick Start...9
Getting Started
Unreal Mobile Development
Unreal Input
Touch...23
Gear VR Controller
Haptics for Rift Controllers
Guardian System Boundary Component...28
Unreal VR Audio
Advanced Rendering Features
Unreal Forward Shading Renderer (PC)
VR Compositor Layers
Multi-View (Mobile)
Hybrid Monoscopic Rendering (Mobile)
Adaptive Pixel Density (Rift Only)
Application Lifecycle Handling
Oculus Platform Features and Online Subsystems...45
Unreal Console Variables and Commands
Blueprints...51
Unreal Loading Screens
Unreal Mixed Reality Capture...59
Unreal Samples
Testing and Performance Analysis in Unreal
Release Notes
Oculus Unreal Engine Integration Release Notes (per-version entries)...93

Oculus Unreal Engine Integration Release Notes (per-version entries, including 1.12)
Oculus Unreal Documentation Archive...103

Unreal Engine

Unreal is distributed with Oculus plugins which make it easy to develop for the Oculus Rift, Oculus Go, and Samsung Gear VR. Applications targeting Oculus devices automatically apply stereoscopic rendering, orientation tracking (Rift, Oculus Go, Gear VR), and positional tracking (Rift only).

Unreal Engine Distributions

Epic provides a binary distribution of UE4 through the Launcher, and a source distribution available from their GitHub repository. These distributions all contain Oculus support, but they may be a version or two behind the latest SDK.

Oculus also distributes UE4 through our own GitHub repository. These distributions are always up to date with the latest Oculus SDKs. We support the current release of UE4, the previous release of UE4, and any preview of the next release of UE4. This development sequence is illustrated in the following chart:

Note that our features ship first to the GitHub versions we maintain in our own repository.

Which distribution should I use?

For beginning developers, we recommend the binary distribution of the Unreal engine available through the Launcher. It is the most stable, and does not require pulling from GitHub or compiling the engine source code. It is typically a few months behind the latest Oculus SDK features.

For professional developers who would like access to the latest SDK features, we recommend the source distributions hosted on Oculus's private GitHub repository here: UnrealEngine.git. To access this repository, you must hold a UE4 license and be subscribed to the private EpicGames/UnrealEngine repository (see UE4 on GitHub for details). If you are not subscribed and logged into your GitHub account, you will get a 404 error.

The standard source distribution is hosted on Epic's private GitHub repository here: EpicGames/UnrealEngine. It is typically about a month behind the Oculus branch in feature support. This branch is available to developers who are subscribed to the private EpicGames/UnrealEngine repository. Note that if you are not subscribed and logged into your GitHub account, you will get a 404 error. For more information on accessing this repository, see UE4 on GitHub.

OculusVR and Legacy Plugins

Beginning with Oculus integration 1.15, all Oculus functionality is provided through the OculusVR plugin. The OculusVR plugin is included with the latest Oculus versions of Unreal source 4.15 and later. All new features will be developed using the OculusVR plugin going forward. Legacy versions of Unreal, including Unreal versions 4.15 and earlier using Oculus integration 1.14 or earlier, provide functionality through the OculusRiftHMD, GearVR, OculusInput, and OculusFunctionLibrary plugins.

What does this guide cover?

Unless otherwise noted, this documentation covers features included in Unreal versions available from Oculus's private GitHub repository.
Note that API changes may occur when these branches are merged back into Epic's version of the engine. For Epic's documentation about Oculus development, go to Platforms/VR/.

Oculus Resources for the Unreal Developer

Platform SDK

The Oculus Platform supports features related to security, community, revenue, and engagement, such as entitlement checking, matchmaking, in-app purchases, VoIP, and cloud saves. For more information on the Platform Unreal plugin, see our Platform Developer Guide.

Avatar SDK

The Oculus Avatar SDK includes an Unreal package to assist developers with implementing first-person hand presence for the Rift and Touch controllers. It includes avatar hand and body assets viewable by other users in social applications for Rift and mobile. The first-person hand models and third-person hand and body models supported by the Avatar SDK automatically pull the avatar configuration choices the user has made in Oculus Home to provide a consistent sense of identity across applications. For more information, see our Avatar SDK Developer Guide.

Audio Resources

The Oculus Audio SDK includes spatialization plugins (OSPs) that provide HRTF spatialization and reverb modeling for audio editing tools commonly used with Unreal, including Audiokinetic Wwise and FMOD Studio. See Unreal VR Audio on page 31 for more information.

Additional Resources

Facebook 360 Capture SDK

This sample SDK allows game and virtual reality developers to easily and quickly integrate 360 photo/video capture capability into their applications. It is available for use with Unreal VR applications, and may be downloaded from the Facebook GitHub repository.

Unreal/Oculus SDK Version Compatibility

The Oculus Unreal SDK requires Windows 7 or later.

Table 1: Oculus GitHub Repository (columns: Branch Tag, OVRPlugin, Platform SDK, Audio SDK, PC SDK, Mobile SDK; branches 4.18 and 4.15)

Table 2: Epic GitHub Repository (columns: Branch Tag, OVRPlugin, Platform SDK, Audio SDK; branch 4.18)


Visual Studio Quick Start

The following section describes how to download, compile, and launch UE4 from Oculus's GitHub repository using Visual Studio 2015 or later. Professional developers who wish to take full advantage of the features available with our Unreal integration should download and build the source code. For more information and an overview of other options for working with Unreal, see Unreal Engine on page 5.

This guide assumes you have installed Visual Studio 2015 and are familiar with its use. Several of these steps may take some time to complete - some of them typically take over an hour, depending on your computer and the speed of your Internet connection.

Note: Visual Studio 2013 is no longer supported.

1. Clone or download the UE4 source code from the Oculus GitHub repository here: Oculus-VR/UnrealEngine. To access this repository, you must hold a UE4 license and have access to the private Epic GitHub repository (see UE4 on GitHub for details).
2. If you downloaded a zip archive of the code, extract the archive contents to a directory where you would like to build the engine, e.g., C:\Unreal\4.13-oculus. We recommend installing to a short path, or you may have errors in the next step with files that exceed the Windows length limit for file names. Alternately, you can map your install directory as a Windows drive to reduce the path length.
3. Run Setup.bat. You may need to Run as Administrator.
4. Run GenerateProjectFiles.bat. You may need to Run as Administrator.
5. Launch UE4.sln to open the project solution in Visual Studio.
6. In the menu bar, select Build > Configuration Manager. Verify that Active solution configuration is set to Development Editor, and that Active solution platform is set to Win64.
7. In the Solution Explorer, right-click UE4 under Engine and select Set as Startup Project. Then select Build in the same context menu.
8.
To launch the engine:
With command-line arguments: Right-click UE4 in the Solution Explorer and select Properties. In the UE Property Pages dialog, select Configuration Properties > Debugging on the left. Enter any desired configuration options in the Command Arguments field on the top.
Without command-line arguments: In the Solution Explorer on the right, right-click UE4 under Engine and, in the context menu, select Debug > Start New Instance to launch the engine.

9. Select the project you would like to open, or specify a new project. If you are creating a new project, don't forget to specify a project name.
10. At this point, the engine will close and a new instance of Visual Studio will launch with your selected or new project. Repeat step 6 to launch the engine with the specified project.

For Epic's instructions on building the Unreal Engine from source, see the Building Unreal Engine from Source guide.

Getting Started

This guide reviews the basics for beginning Oculus development in Unreal.

Overview

The easiest way to get started with Oculus development in Unreal is to experiment with the HMD support provided by default as a Player Start. If Unreal detects an Oculus Rift runtime (e.g., the Oculus app is installed), it will automatically provide orientation and positional tracking and stereoscopic rendering for any game. Standalone PC executables will automatically display in the Oculus Rift when run in full-screen mode if a Rift is detected on your system. You may build a project normally and preview it in the Engine by selecting the VR Preview mode in the Play button pulldown options.

Note: In order to play an in-development application, you will need to enable running applications from unknown sources in the Oculus app settings, available through the gear icon in the upper-right. Select Settings > General and toggle Unknown Sources on to allow. The first time you run an application that you have not downloaded from the Oculus Store, you will need to launch it directly. Once you have run an application from an unknown source at least once, it will then appear in the Library section of Home and the Oculus app, and may be launched normally, as long as Unknown Sources is enabled.

To preview a level in the Rift while editing with the engine, select the dropdown arrow near the Play button and select VR Preview.

To play a scene in the Rift with a standalone PC game, simply maximize the game window and it will be mirrored to the Rift display.

Note: While Unreal is running, you will not be able to access Oculus Home by putting on the headset.

Adding a VR Camera

Using the default camera set to VR Preview is a good way to get a quick sense of VR development with minimal overhead, but for actual development, we recommend adding a Camera actor to your map and enabling Oculus support by selecting the camera in the Viewport and checking the Lock to Hmd checkbox in the Details tab (available in UE4 4.11 and later).

Placing a camera in the scene allows you to control the orientation of the camera view when the game loads, so that you can control the exact perspective that will be visible to the user. This is not possible with the Player Start described above.

An additional benefit to using the Camera actor is that you can attach meshes, and they will update their position following the HMD view with very little latency. This is generally the best way to add head-locked elements such as cockpit details. Note that we generally discourage head-locked UI elements, but it can be an attractive feature when used carefully.

HMD Pose Tracking Origin

The HMD pose is reported relative to a tracking origin, which may be set to two values in your HeadMountedDisplay settings:

Eye Level: The initial pose of the HMD when the camera activates.
Floor Level: The origin set by the user during Rift calibration, typically near the center of their playable area. Unlike eye level, the floor-level origin's position is on the floor, at Y = 0.

When you recenter the tracking origin, the behavior is different for eye and floor level.
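A minimal sketch of the two recentering behaviors, using a hypothetical Pose struct rather than the actual Unreal API (following this section's convention that Y is the vertical axis, with the floor at Y = 0):

```cpp
#include <cassert>

// Hypothetical pose type for illustration only; not the Unreal API.
// Y is the vertical axis, matching the floor-at-Y=0 convention above.
struct Pose {
    float X, Y, Z;          // position
    float Pitch, Yaw, Roll; // rotation
};

// Eye level: adopt the head's full position (X, Y, Z) and yaw, but never
// pitch or roll, so the virtual horizon stays level with the real one.
Pose RecenterEyeLevel(const Pose& Head) {
    return { Head.X, Head.Y, Head.Z, 0.0f, Head.Yaw, 0.0f };
}

// Floor level: adopt X, Z, and yaw, but pin the origin's height (Y) to the
// calibrated floor so it stays consistent with the real-world floor.
Pose RecenterFloorLevel(const Pose& Head, float FloorY) {
    return { Head.X, FloorY, Head.Z, 0.0f, Head.Yaw, 0.0f };
}
```

Both variants zero pitch and roll in the new origin, reflecting the rule that recentering never tilts the app's virtual horizon.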

Eye level moves the origin's position (X, Y, and Z) and the yaw (Y rotation) to match the current head pose. Pitch and roll (X and Z rotation) are not modified, because that would make the app's virtual horizon mismatch the real-world horizon, disorienting the user.

Floor level moves the origin's X and Z position, but leaves the Y position alone to keep it consistent with the real-world floor. Rotation is handled the same way as eye level.

For more information, see:
Tracking in our Intro to VR
IHeadMountedDisplay in the Unreal API Guide

VR Template

Unreal Engine v4.13 and later include a Virtual Reality Blueprint project template, which may be selected when creating a New Project.

VR Template contains two maps, accessible through the Content Browser in Content > VirtualRealityBP > Maps.

The HMD Locomotion Map is a simple level that demonstrates teleportation. Set your travel destination with the blue gaze-controlled circle. Once the teleport target is set, press the spacebar or gamepad button to teleport. You may optionally control the orientation you will be facing on arrival with the gamepad primary joystick.

The Motion Controller Map also demonstrates teleportation control, in this case using tracked Touch controllers. Point in the direction you'd like to travel with the Touch controller and control the destination orientation with your gaze. Then press the A-button to teleport.

Use the trigger buttons to pick up and manipulate the small blue blocks in the level.


Unreal Mobile Development

This guide covers environment setup, project configuration, and development for the Oculus mobile platform using Unreal.

Overview

Unreal versions 4.10 and later provide built-in support for Oculus mobile development. Projects configured for VR automatically apply stereoscopic rendering and orientation tracking. Once your environment is set up with the appropriate tools and your project settings are configured properly, you may build virtual reality projects targeting Android, load APKs on your mobile device, and play them on the Oculus Mobile platform.

Note: Mobile applications are subject to more computational limitations compared to Rift applications, and this should be taken into consideration beginning in the early design phase.

Oculus Mobile SDK

Oculus provides a mobile SDK with native C/C++ libraries for Android development, in addition to supporting mobile development for game engines such as Unreal. It is not necessary for Unreal developers to download or install the Mobile SDK, but you may wish to look through our Mobile SDK Developer Guide for general information on mobile development, such as application signing, basic mobile development tools such as adb and Oculus Remote Monitor, and performance guidelines.

We recommend that Unreal mobile developers review the following sections:
Mobile SDK Getting Started Guide
Mobile Development Basics
Mobile Best Practices
Testing and Troubleshooting

Developers interested in lower-level details about how the mobile VR rendering pipeline is handled by our native libraries may wish to download the mobile SDK and review the headers, particularly for the VrApi, which is responsible for returning headset orientation data and applying distortion, sensor fusion, and compositing.
Mobile Environment Setup

There are two ways to set up your environment for mobile development for Unreal: 1) using the Codeworks for Android installation package (recommended), or 2) using the standard Android SDK.

Epic strongly recommends using the Codeworks for Android installation package rather than the stock Android SDK. The Codeworks installer is bundled with the Unreal installation in Engine\Extras\AndroidWorks. Using this package simplifies Unreal setup and is usually necessary to successfully build mobile projects. However, it requires uninstalling any standard Android SDK installation, and may interfere with development using other engines or IDEs that require the standard Android SDK. To install Codeworks for Android 1R4, follow the instructions in Epic's Required Android Setup guide.

Note: You will find the required version of Codeworks here: Engine\Extras\AndroidWorks\Win64.

To prepare for mobile development using the standard Android SDK, follow the instructions in our Mobile Android Development Software Setup for Windows guide to install the Java Development Kit (JDK), Native Development Kit, and Android SDK. Unreal developers do not need to install Android Studio.

Our Unreal SDK does not currently support OS X or Linux.

Once you have installed the required Android tools, follow the setup instructions described in the Device Setup guide in our Mobile SDK documentation. In this process you will enable Developer Options on your mobile device and make a few device configuration settings.

Application Signing

Oculus mobile applications must be signed by two different signatures at different stages of development. An Oculus Signature File (osig) is required for your application to run on Oculus Go and Gear VR, and an Android distribution keystore is required for submission to the Oculus Store.

Once you add an osig to the appropriate Unreal directory, it will be added automatically to every APK that you build. You will need one osig for each mobile device.

To add your osig to Unreal for development:
1. Follow the instructions in Application Signing to download your osig.
2. Navigate to the directory <Unreal-directory>\Engine\Build\Android\Java\.
3. Create a new directory inside \Engine\Build\Android\Java\ and name it assets. The name must not be capitalized.
4. Copy your osig to this directory.

When you are ready to build an APK for submission to release, we recommend that you exclude the osig from your APK. To do so, select Edit > Project Settings > Android, scroll down to Advanced APKPackaging, and verify that Remove Oculus Signature Files from Distribution APK is checked.

Before building your final release build, create a new Android keystore by following the Sign Your App Manually instructions in Android's Sign Your Applications guide. Once you have generated your distribution keystore, go to Edit > Project Settings > Platforms > Android, scroll down to Distribution Signing, and enter the required information. See the Application Signing section of the Mobile SDK documentation for more information.

Configure Unreal for Android SDK

If you do not have a project prepared but would like to try out the process, you may create a scene with starter content such as the C++ First Person project.

Once you have installed the Android SDK and required tools, configure Unreal for Android development.

1. Select Edit > Project Settings.
2. In the Project Settings menu on the left, go to Platforms and select Android SDK (not Android - we will configure that later).
3. Configure all fields in SDKConfig with the appropriate paths to the necessary tools. Note: if you installed Codeworks for Android, these fields should be populated automatically.

Project Configuration

This section describes how to configure any C++ or Blueprints Unreal project targeting desktop or mobile for use with Oculus mobile devices. If you do not have a project prepared but would like to try out the process, you may create a scene with starter content such as the C++ First Person project.

1. Select Edit > Plugins.
2. Select Virtual Reality.
3. Verify that the Enabled check box is checked for OculusVR. If you need to select it, click Restart Now in the lower-right afterward. Close the Plugins configuration.
4. Select Edit > Project Settings.
5. Fill in any other desired information in Project > Description.
6. In the Project Settings menu on the left, go to Platforms and select Android.
7. Under APKPackaging, set the Minimum SDK Version to 21 and set the Target SDK Version.
8. Scroll down to Advanced APKPackaging. Then:
a. Check Configure the AndroidManifest for deployment to GearVR.
b. Verify that Remove Oculus Signature Files from Distribution APK is unchecked, unless you are building a package to release.
9. In Engine > Rendering, uncheck Mobile HDR in the Mobile section. Restart the project if prompted to do so.
10. Close the Project Settings configuration window.

Building and Running Projects

1. Connect your Samsung phone to your PC via USB.
2. Open a shell terminal and verify that your device is communicating with your PC using adb devices.
Note that depending on the device you are using, you may need to configure your connection for software installation. For more information, see Adb in our Mobile SDK Developer Guide.

3. We recommend using ASTC compression, though ETC2 will also work. Select File > Package Project > Android > Android (ASTC).
4. When prompted, browse to the destination folder where you would like your APK to be installed.
5. Once the build process is completed, navigate to the destination folder. Run the .bat file beginning with Install_ to install the application to your phone.
6. Once the build process is complete, your application will be visible on your mobile device under Apps. Note that it will not be visible in Oculus Home.
7. Click the application to launch. You will be prompted to insert it into your Gear VR headset.

Note: You may also set your phone to Developer Mode, which allows you to run VR applications on your mobile device without inserting it into an HMD.

Launching a Project Directly onto Your Phone

You may also directly build and launch an application to your phone without saving the APK locally.

1. Connect your phone to your PC by USB.
2. Select the Launch menu from the Unreal toolbar and select your phone under Devices. If you do not see your phone listed, verify your USB connection, and check if you need to set your connection to Connected as an Installer - some Samsung phones default to a Charge Only connection.
3. Your application will build and install to your Android device. When the build is complete, you will be prompted to insert the phone into your headset to launch.

Workflow

For Gear VR development, we recommend setting your phone to Developer Mode, so you can run VR apps on your phone without inserting it into the Gear VR headset. This allows you to quickly review your progress on the device without much overhead.

You may also launch the engine with a configuration option to use the GearVR plugin and Mobile renderer on the PC. This allows you to preview mobile development from the desktop using an Oculus Rift. To do so, disable the OculusRift and SteamVR plugins for your project, and add -OpenGL -FeatureLevelES2 to your command line.

To run your app during development, we generally recommend connecting your phone via USB, selecting the Launch pulldown menu, and then selecting your phone under the listed devices. Particularly if you are modifying maps and shaders but not changing the code, this is often much faster than building an APK.

It is possible to preview mobile applications in the Oculus Rift during development. However, it is not generally useful to do so, because Rift applications are subject to substantially different performance requirements (see "Performance Targets" in Testing and Performance Analysis in Unreal on page 80). You may find it easiest to use the 2D preview in Unreal, and then either build an APK or use Launch on Device when you need to view the app in VR.

Note: Mobile Content Scale Factor is not currently supported for Oculus development.
Blueprints

We provide Blueprints for common Gear VR operations such as querying battery level and headphone connection status. For more information, see Blueprints on page 51.

Advanced Rendering Features for Mobile

The Oculus SDK offers advanced mobile rendering features such as multi-view and hybrid monoscopic rendering. Under some conditions, they can significantly improve performance. For more information, see Advanced Rendering Features on page 32.

Unreal Input

This section describes input handling for Oculus devices.

Touch

Oculus Touch is the standard tracked controller for Rift. Touch may be used to emulate Microsoft XInput API gamepads without any code changes. However, you must account for the missing logical and ergonomic equivalences between the two types of controllers. For more information, see Emulating Gamepad Input with Touch in our PC SDK Developer Guide.

Gear VR Controller

The Gear VR Controller is an orientation-tracked input device available through Unreal as a Motion Controller. For a discussion of best practices, see Gear VR Controller Best Practices in Oculus Best Practices.

For instructions on how to add a Motion Controller component to your Pawn or Character, see Motion Controller Component Setup in Epic's Unreal documentation. Unreal has also provided a detailed training tutorial called Setting Up VR Motion Controllers.

Gear VR positions the controller relative to the user by using a body model to estimate the controller's position. Whether the controller is visualized on the left or right side of the body is determined by left-handedness versus right-handedness, which is specified by users during controller pairing.

Orientation tracking is handled automatically by the Motion Controller Component. If you need to query the controller orientation, you can query the Motion Controller rotation.

Add the GearVRControllerComponent to create a MotionController with a Gear VR Controller mesh as a child (4.15 and later). The Gear VR Controller mesh may be found in Plugins/GearVR/Content/.

Motion Controller Components must be specified as either left or right controllers when they are added, and each Gear VR Controller button mapping has a left/right equivalent. However, any button click sends both left and right events, so the setting you choose when adding the Motion Controller component has no effect.
Input Sample

You will find an example of Gear VR Controller input in our Input sample, available in the directory <install>/Samples/Oculus. Please see the sample and its Level Blueprint for a full illustration of how to use the controller in your game, including the button mappings.

Gear VR Controller Swiping Gestures

For Gear VR Controllers, the user interface of your VR experience should follow these natural scrolling and swiping gestures:

Swipe up: Pull content upward. Equivalent to scrolling down.
Swipe down: Pull content downward. Equivalent to scrolling up.
Swipe left: Pull content left or go to the next item or page.
Swipe right: Pull content right or go to the previous item or page.
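As a compact summary of these conventions, the following sketch (hypothetical names, not an Unreal or Oculus API) maps each swipe direction to the navigation action it should trigger:

```cpp
#include <cassert>
#include <string>

// Illustrative mapping of the swipe guidelines above; ESwipe and
// ActionForSwipe are hypothetical names used only for this sketch.
enum class ESwipe { Up, Down, Left, Right };

// Returns the scroll/navigation action paired with each swipe gesture.
std::string ActionForSwipe(ESwipe Swipe) {
    switch (Swipe) {
        case ESwipe::Up:    return "scroll-down";   // content pulled upward
        case ESwipe::Down:  return "scroll-up";     // content pulled downward
        case ESwipe::Left:  return "next-item";     // content pulled left
        case ESwipe::Right: return "previous-item"; // content pulled right
    }
    return "";
}
```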

Haptics for Rift Controllers

This guide describes how to use Unreal Blueprints to control haptic effects on Touch or Xbox controllers. You may use the standard Play Haptic Effect Blueprint to send a specified haptic curve to the Oculus Touch or Xbox controller. For more information, see Unreal's Play Haptic Effect guide.

Haptics in Unreal Engine 4.13

Play Haptic Effect may be configured to play haptic waves based on three types of input. Right-click the Content Browser to bring up the context menu, select Miscellaneous, and select one of the following three options:

Haptic Feedback Buffer: Plays a buffer of bytes 0-255.
Haptic Feedback Curve: Draw the haptic linear curve you wish to play using the Haptic Curve Editor.
Haptic Feedback Soundwave: Select a mono audio file to be converted into a haptic effect of corresponding amplitude.

The following Blueprint illustrates a simple haptics sequence on the Oculus Touch controller using Play Haptic Effect. This example sends vibrations using Play Haptic Effect when the left controller grip button is pressed. When the button is released, Stop Haptic Effect sends a stop command to the Touch controller. When the left controller X button is pressed, a constant vibration is sent by Set Haptics by Value until the button is released. Note that Set Haptics by Value calls are limited to 30 Hz; additional calls will be disregarded.
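The 30 Hz limit on Set Haptics by Value can be modeled as a simple throttle: calls arriving sooner than 1/30 s after the last accepted call are dropped. This is an illustrative sketch of that behavior, not the engine's actual implementation:

```cpp
#include <cassert>

// Hypothetical throttle modeling the documented 30 Hz cap on
// Set Haptics by Value; not engine code.
class HapticRateLimiter {
public:
    explicit HapticRateLimiter(float MaxHz) : MinInterval(1.0f / MaxHz) {}

    // Returns true if a haptics call at TimeSeconds would be accepted;
    // calls closer together than MinInterval are disregarded.
    bool TryCall(float TimeSeconds) {
        if (TimeSeconds - LastAccepted < MinInterval) {
            return false; // too soon after the previous accepted call
        }
        LastAccepted = TimeSeconds;
        return true;
    }

private:
    float MinInterval;
    float LastAccepted = -1.0e9f; // "long ago": first call always accepted
};
```

In practice this means driving Set Haptics by Value from a high-frequency tick gains nothing; updating the value at 30 Hz or less avoids silently dropped calls.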


Haptics in Unreal Engine 4.12

In addition to Play Haptic Effect, Unreal 4.12 adds Play Haptic Soundwave. The following Blueprint illustrates a simple haptics sequence on the Oculus Touch controller using Play Haptic Effect and Play Haptic Soundwave. This example sends vibrations using Play Haptic Effect when the left controller grip button is pressed. When the button is released, Play Haptic Soundwave sends a second vibration to the controller. When the left controller X button is pressed, a constant vibration is sent by Set Haptics by Value until the button is released. Note that Set Haptics by Value calls are limited to 30 Hz; additional calls will be disregarded.
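Conceptually, Play Haptic Soundwave reduces a mono wave to a series of uint8 amplitudes (0-255) multiplied by a scale factor. A standalone sketch of that conversion, with a hypothetical helper name and a Scale parameter like the Blueprint input:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

// Downsample a mono wave (samples in [-1, 1]) into amplitude bytes 0-255,
// then apply a scale factor, as Play Haptic Soundwave is described to do.
// WaveToHapticBytes is a hypothetical helper, not the engine implementation.
std::vector<uint8_t> WaveToHapticBytes(const std::vector<float>& Samples,
                                       size_t Stride, float Scale) {
    std::vector<uint8_t> Bytes;
    for (size_t i = 0; i < Samples.size(); i += Stride) {
        float Amplitude = std::fabs(Samples[i]);                 // 0..1
        float Scaled = std::min(Amplitude * Scale * 255.0f, 255.0f);
        Bytes.push_back(static_cast<uint8_t>(Scaled));           // clamp + quantize
    }
    return Bytes;
}
```

Louder passages in the source wave therefore produce stronger vibration, which is why a mono file with clear amplitude envelopes translates best into a haptic effect.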


APlayerController::PlayHapticSoundWave takes a mono soundwave as an argument. It downsamples the wave into a series of bytes that serially describe the amplitude of the wave (uint8 values 0-255). Each byte is then multiplied by the factor specified in Scale (max = 255), and haptic vibrations are sent to the targeted Oculus Touch controller. Each controller must be targeted individually. Call Stop Haptic Effect to stop haptic playback.

Haptics Sample

The TouchSample, available from our Unreal GitHub repository, illustrates basic use of Oculus Touch, including haptics control using PlayHapticEffect() and PlayHapticSoundWave(). For more information, see Unreal Samples on page 72.

Guardian System Boundary Component

OculusBoundaryComponent exposes an API for interacting with the Oculus Guardian System. During Touch setup, users define an interaction area by drawing a perimeter called the Outer Boundary in space with the controller. An axis-aligned bounding box called the Play Area is calculated from this perimeter.

When tracked devices approach the Outer Boundary, the Oculus runtime automatically provides visual cues to the user demarcating the Outer Boundary. This behavior may not be disabled or superseded by applications, though the Guardian System visualization may be disabled via user configuration in the Oculus app.

Additional handling may be implemented by applications using the class UOculusBoundaryComponent. Possible use cases include pausing the game if the user leaves the Play Area, placing geometry in the world

based on boundary points to create a natural integrated barrier with in-scene objects, disabling UI when the boundary is being rendered to avoid visual discomfort, et cetera. All UOculusBoundaryComponent public methods are available as Blueprints. Please see OculusBoundaryComponent.h for additional details.

Basic Use

Boundary types are Boundary_Outer and Boundary_PlayArea. Device types are HMD, LTouch, RTouch, Touch (i.e., both controllers), and All.

Applications may query the interaction between devices and the Outer Boundary or Play Area by using UOculusBoundaryComponent::GetTriggeredPlayAreaInfo() or UOculusBoundaryComponent::GetTriggeredOuterBoundaryInfo().

Applications may also query arbitrary points relative to the Play Area or Outer Boundary using UOculusBoundaryComponent::CheckIfPointWithinOuterBounds() or UOculusBoundaryComponent::CheckIfPointWithinPlayArea(). This may be useful for determining the location of particular Actors in a scene relative to boundaries so, for example, they are spawned within reach, et cetera.

Results are returned as a struct called FBoundaryTestResult, which includes the following members:

Member | Type | Description
IsTriggering | bool | Returns true if the device or point triggers the queried boundary type.
DeviceType | ETrackedDeviceType | Device type triggering the boundary.
ClosestDistance | float | Distance between the device or point and the closest point of the test area.
ClosestPoint | FVector | The location in tracking space of the closest boundary point to the queried device or point.
ClosestPointNormal | FVector | The normal of the closest boundary point to the queried device or point.

All dimensions, points, and vectors are returned in Unreal world coordinate space.

Applications may request that boundaries be displayed or hidden using RequestOuterBoundaryVisible(). Note that the Oculus runtime will override application requests under certain conditions.
For example, setting Boundary Area visibility to false will fail if a tracked device is close enough to trigger the boundary's automatic display. Setting the visibility to true will fail if the user has disabled the visual display of the boundary system.

Applications may query the current state of the boundary system using UOculusBoundaryComponent::IsOuterBoundaryDisplayed() and UOculusBoundaryComponent::IsOuterBoundaryTriggered(). You may bind delegates using the object OnOuterBoundaryTriggered.
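The semantics of a point query against an axis-aligned Play Area can be illustrated with a small standalone sketch. This is plain C++, not engine code: the struct and function names are illustrative assumptions that mirror the IsTriggering and ClosestDistance fields of FBoundaryTestResult, not the actual engine implementation.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative model only: the Play Area is treated as an axis-aligned box
// at floor level, centered at the tracking origin, with the given half-extents.
struct PointTestResult
{
    bool  IsTriggering;    // true if the point lies outside the Play Area
    float ClosestDistance; // distance to the nearest edge of the box
};

PointTestResult TestPointAgainstPlayArea(float px, float py,
                                         float halfWidth, float halfDepth)
{
    // Signed distances from the point to each pair of box faces.
    float dx = std::fabs(px) - halfWidth;
    float dy = std::fabs(py) - halfDepth;
    bool outside = (dx > 0.0f) || (dy > 0.0f);
    float dist;
    if (outside)
    {
        // Outside: Euclidean distance to the box.
        float cx = std::max(dx, 0.0f);
        float cy = std::max(dy, 0.0f);
        dist = std::sqrt(cx * cx + cy * cy);
    }
    else
    {
        // Inside: distance to the nearest edge.
        dist = -std::max(dx, dy);
    }
    return {outside, dist};
}
```

A real application would instead call CheckIfPointWithinPlayArea() and read the returned FBoundaryTestResult; the sketch only shows the geometry behind such a test.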

Additional Features

You may set the boundary color of the automated Guardian System visualization (alpha is unaffected) using UOculusBoundaryComponent::SetOuterBoundaryColor(). Use UOculusBoundaryComponent::ResetOuterBoundaryColor() to reset to the default settings.

UOculusBoundaryComponent::GetOuterBoundaryPoints() and UOculusBoundaryComponent::GetPlayAreaPoints() return an array of up to 256 3D points that define the Boundary Area or Play Area in clockwise order at floor level.

You may query the dimensions of a Boundary Area or Play Area using UOculusBoundaryComponent::GetOuterBoundaryDimensions() or UOculusBoundaryComponent::GetPlayAreaDimensions(), which return a vector containing the width, height, and depth in tracking space units, with height always returning 0.

Boundary Sample

BoundarySample, available from our Unreal GitHub repository, illustrates the use of the Boundary Component API for interacting with our Guardian System. For more information, see Unreal Samples on page 72.
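The dimensions described above can be thought of as the axis-aligned extent of the floor-level boundary points. The following standalone sketch derives width and depth from a set of points, with height fixed at 0 as the documentation states; the Vec3 type and function name are illustrative, not engine API.

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float X, Y, Z; };

// Illustrative only: compute (width, height, depth) from floor-level boundary
// points, with height always 0, mirroring what GetPlayAreaDimensions() is
// documented to return.
Vec3 DimensionsFromPoints(const std::vector<Vec3>& points)
{
    if (points.empty()) return {0.0f, 0.0f, 0.0f};
    float minX = points[0].X, maxX = points[0].X;
    float minY = points[0].Y, maxY = points[0].Y;
    for (const Vec3& p : points)
    {
        minX = std::min(minX, p.X); maxX = std::max(maxX, p.X);
        minY = std::min(minY, p.Y); maxY = std::max(maxY, p.Y);
    }
    return {maxX - minX, 0.0f, maxY - minY}; // width, height (always 0), depth
}
```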

Unreal VR Audio

Audio is critical for creating a persuasive VR experience and can contribute strongly to the user's sense of immersion.

Audio Basics

When Unreal PC applications are launched, if the OculusVR plugin (or OculusRift plugin in legacy versions) is enabled and the Oculus VR Runtime Service is installed, then the application will automatically override the default Windows graphics and audio devices and target the Rift. The Oculus VR Runtime Service is installed with the Oculus app.

Unless your application is intended to run in VR, do not enable the OculusVR plugin. Otherwise, it is possible that audio and/or video will be incorrectly targeted to the Oculus Rift when the application is run. Alternatively, users can disable loading all HMD plugins by specifying "-nohmd" on the command line.

Audio Spatialization

The Oculus Audio SDK includes spatialization plugins (OSPs) that provide HRTF spatialization and reverb modeling for audio editing tools commonly used with Unreal, including Audiokinetic Wwise and FMOD Studio. FMOD supports both Rift and mobile development, and Wwise supports Rift development. For more details on integrating our spatialization plugins with FMOD and Wwise for use in Unreal, see the Rift Audio section of our Rift developer guide.

We recommend using FMOD or Wwise with the appropriate OSP, which will provide access to our full spatialization feature set as well as the full functionality of the audio tools themselves.

Epic also offers built-in audio spatialization for Rift with HRTF spatialization only. Epic's 4.15-preview branch offers preview support for Gear VR, which does not support some features such as ducking and filtering, and which may contain bugs. For an example of how to implement it, see AmbientSound Spatialize in the Unreal Audio Content Example.

For more information on our plugins, and for general information about audio design for VR, please see our Audio SDK developer guide.

Advanced Rendering Features

This section describes important rendering options and tools that can significantly improve your application.

Unreal Forward Shading Renderer (PC)

Unreal Engine 4.14 introduced an experimental forward shading renderer optimized for VR. We recommend that all future PC titles use the forward shading renderer.

The primary advantage of the forward renderer is that it is substantially faster than the deferred renderer. It outputs lit pixels directly, whereas the deferred renderer outputs material properties to a set of buffers, and then reads them back to apply lighting in a separate pass.

Because the forward renderer does not render to intermediate buffers, it can take advantage of MSAA. We recommend using MSAA for anti-aliasing, as it not only reduces motion blur, but often provides a significant savings in GPU utilization over TAA.

Not all features from the deferred renderer are available in the forward renderer, but many of these features require tradeoffs that disproportionately impact VR development. For example, screen space effects may introduce stereo disparities that can be uncomfortable for users. Given the substantial advantages the forward shading renderer offers VR developers, we anticipate that it will become the target for future VR optimizations.

For more information on the forward shading renderer, see Epic's New: Forward Shading Renderer with MSAA.

VR Compositor Layers

In some versions of Unreal, you may add transparent or opaque quadrilateral, cubemap, or cylindrical overlays to your level as compositor layers.

TimeWarp compositor layers are rendered at the same frame rate as the compositor, rather than at the application frame rate. They are less prone to judder, and are raytraced through the lenses, which improves the clarity of textures displayed on them.

We recommend using compositor layers for text and for headlocked elements.
Text rendered on compositor layers is more legible, and headlocked layers remain headlocked when Asynchronous TimeWarp interpolates dropped frames. Gaze cursors and UIs are good candidates for rendering as quadrilateral compositor layers. Cylinders may be useful for smooth-curve UI interfaces. Cubemaps may be used for startup scenes or skyboxes.

We recommend using a cubemap compositor layer for your loading scene, so it will always display at a steady minimum frame rate, even if the application performs no updates whatsoever. This can significantly improve application startup times.

Quadrilateral, cylinder, and cubemap layers are supported in Unreal versions 4.13 and later.

By default, VR compositor layers are always displayed on top of all other objects in the scene. You may set compositor layers to respect depth positioning by enabling Supports Depth. If you are using multiple layers, use the Priority setting to control the depth order in which the layers will be displayed, with lower values indicating greater priority (e.g., 0 is before 1).

Note that enabling Supports Depth may affect performance, so use it with caution and be sure to assess its impact.

To create an overlay:

1. Create an Actor and add it to the level. If you don't want the Actor to be visible in the scene, use an Empty Actor.
2. Select the Actor, select Add Component, and choose Stereo Layer.
3. Under the Stereo Layer options, set Stereo Layer Type to Quad Layer or Cylinder Layer.
4. Set Stereo Layer Type to Face Locked, Torso Locked, or World Locked.

5. Set the overlay dimensions in world units in Quad Stereo Layer Properties or Cylinder Stereo Layer Properties.
6. Select Supports Depth in Stereo Layer to set your compositor layer to not always appear on top of other scene geometry. Note that this may affect performance.
7. Configure Texture and additional attributes as appropriate.

The Actor you slave the component to will be fixed at the center of the quad or the cylinder.

You may add up to three VR compositor layers to mobile applications, and up to fifteen to Rift applications.

Layers Sample

The LayerSample, available from our Unreal GitHub repository, illustrates the use of VR Compositor Layers to display a UMG UI. For more information, see Unreal Samples on page 72.

Multi-View (Mobile)

Multi-view is an advanced rendering feature for Oculus Go and Gear VR, available in experimental form in Unreal 4.14 and later. If your application is CPU-bound, we strongly recommend using multi-view to improve performance.

In typical OpenGL stereo rendering, each eye buffer must be rendered in sequence, doubling application and driver overhead. When multi-view is enabled, objects are rendered once to the left eye buffer, then duplicated to the right buffer automatically with appropriate modifications for vertex position and view-dependent variables such as reflection.

Multi-view is currently supported by Note5, S6, S7, and S7 Edge phones using ARM Exynos processors and running Android M or N. It is also supported on S7 and S7 Edge phones using Qualcomm processors and running Android N. Multi-view requires OpenGL ES 2, and does not currently support OpenGL ES 3.1.

Although multi-view can substantially reduce CPU overhead, keep in mind that applications submitted to the Oculus Store must maintain minimum frame rate per our requirements, even on devices that do not support multi-view.

Enabling Multi-View

Open Edit > Project Settings > Engine > Rendering.
In the VR section, enable Mobile Multi-View.

For Exynos devices, verify that Support OpenGL ES2 is checked in the Build section in Platforms > Android, and that Support OpenGL ES3 is not selected.

Direct Multi-View

Unreal versions 4.15 and higher using Oculus integration 1.12 or later support an enhanced version of multi-view that reduces the number of full-screen copies that must be rendered. In most cases, it provides a substantial reduction in GPU overhead.

In Unreal 4.15, to enable Direct Multi-View, add the following to DefaultEngine.ini:

[Oculus.Settings]
bDirectMultiView=true

In Unreal 4.16 and later, when you enable multi-view as described above, you will have a checkbox option to set it to Direct Multi-View.

Hybrid Monoscopic Rendering (Mobile)

Hybrid monoscopic rendering (available in Unreal 4.15 or later) renders objects close to the viewer in stereoscopic 3D, while rendering all objects that lie past a culling plane only once.

Overview

Enabling hybrid monoscopic rendering can produce performance improvements by drawing many objects only once instead of twice at a slight stereo disparity. In one test scenario, we measured a 25% decrease in rendering times on Epic's SunTemple sample. In most cases, the visual effect of rendering objects monoscopically with default settings is unnoticeable for typical users.

When the hybrid monoscopic rendering feature is enabled, a third monoscopic camera is placed between the left and right cameras on the same plane. In mobile applications, the eye cameras have symmetric frusta, so the monoscopic camera shares the same projection matrix as the stereoscopic cameras. The UE4 mobile forward renderer then does the following:

1. Renders non-transparent content with the stereo cameras.
2. Shifts and combines the output to create a monoscopic occlusion mask, which pre-populates the monoscopic depth buffer.
3. Renders non-transparent content with the monoscopic camera.
4. Composites the monoscopic camera's result into the stereo buffers.
5. Renders all transparent content and performs all post processing in stereo.

To separate stereo and mono content, hybrid monoscopic rendering uses a depth buffer. All pixels beyond the culling plane are discarded in the stereo view by clearing the stereo depth buffer to that depth. The monoscopic projection near plane is also initialized at that depth to discard fragments that have already been rendered in stereo. For additional technical details, see Hybrid Mono Rendering in UE4 and Unity in our Developer Blog.

Enabling Hybrid Monoscopic Rendering

Open Edit > Project Settings > Engine > Rendering. In the VR section, enable Monoscopic Far Field (Experimental).
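The stereo/mono split described in the overview can be sketched as a simple classification of an object's depth bounds against the culling distance. This is a standalone illustration of the concept, not engine code; the enum and function names are assumptions, and the 750-unit default comes from the Mono Culling Distance setting described below.

```cpp
enum class RenderPath { StereoOnly, MonoOnly, Both };

// Illustrative only: an object whose bounds lie entirely nearer than the
// culling plane renders in stereo, one entirely beyond it renders mono,
// and one straddling the plane is rendered by both cameras.
RenderPath ClassifyAgainstCullingPlane(float nearestDepth, float farthestDepth,
                                       float cullingDistance = 750.0f)
{
    if (farthestDepth <= cullingDistance) return RenderPath::StereoOnly;
    if (nearestDepth >= cullingDistance)  return RenderPath::MonoOnly;
    return RenderPath::Both;
}
```

Objects that fall into the "Both" case are exactly the ones the Best Practices section below suggests auditing, since straddling meshes are drawn by all three cameras.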

Culling Plane

When hybrid monoscopic rendering is enabled, a split plane parallel to the z-axis is added to the level, and objects falling on the far side of the plane are rendered using the monoscopic camera. Objects straddling the culling plane are rendered by both the monoscopic and stereoscopic cameras - see Best Practices below for more information.

The distance of the culling plane may be configured in Settings > World Settings > VR > Mono Culling Distance (the default setting is 750.0, or 7.5 meters).

Console Commands

When hybrid monoscopic rendering is enabled, you may set its mode with the following console command:

vr.MonoscopicFarFieldMode [0-4]

Mode  Description
0     Off
1     On (default)
2     Stereo only
3     Stereo only with no culling plane
4     Mono only

Best Practices

This section describes how to implement and tune hybrid monoscopic rendering. We recommend reading it carefully - the substantial performance improvements this feature provides when properly implemented make it worth spending some time to get it working properly. However, if it is improperly used, it may actually decrease performance.

Due to limitations of the frustum culling algorithm we use to cull meshes, large objects around the scene such as environment cubemaps or skyboxes are still drawn to the stereoscopic cameras that have a close culling plane, even if no pixel of those objects makes it past the far plane. Those draw calls are unnecessary and add bandwidth and vertex costs to the GPU and CPU. To prevent unnecessary draw calls in the stereo buffer, we strongly recommend that you identify objects that are farther than the culling plane but still render, and force them to render in the monoscopic buffer only.

To identify all objects rendered to the stereoscopic cameras, set your level to display objects drawn to the stereoscopic buffer only with the console command vr.MonoscopicFarFieldMode 2. You will only see objects that lie on the near side of the culling plane, like this:

Figure 1: Stereo Only

Next, set your level to display all draw calls to the stereoscopic buffer with the console command vr.MonoscopicFarFieldMode 3. This will display the stereo buffer without culling:

Figure 2: Stereo Only with No Culling

Any mesh that appears when vr.MonoscopicFarFieldMode is set to 3 but not when it is set to 2 is redundantly rendered and should be set to render in the monoscopic buffer only. In the pictures above, that includes anything visible in the second image that does not appear in the first image. Notice in this example that the mountains in the scene are drawn to the stereoscopic buffer - this is a typical case. Any very large object that the user is inside of will typically be a candidate for forcing to mono only.

To force any mesh to render in the monoscopic buffer only, select the mesh and check the Render in Mono option under Rendering.

Hybrid monoscopic rendering is usually a better fit for levels that contain a lot of distant content, such as outdoor scenes. In a small room where nearly everything lies before or straddles the culling plane, it will probably not improve performance. Note that you may enable or disable hybrid monoscopic rendering dynamically in an application, enabling it in levels where it provides a performance improvement, and disabling it where it does not.

One way to quickly assess the performance impact is to use the console command stat RHI to configure your application to display the number of triangles being drawn in your level in real time. Then run the level with hybrid monoscopic rendering disabled and compare it to running the same level with the feature enabled, with different values for the culling plane, with different meshes tagged with Render in Mono, and so forth.

Adaptive Pixel Density (Rift Only)

Adaptive Pixel Density allows applications to scale down the application viewport as GPU resources exceed 85% utilization, and to scale up as they become more available. This feature is currently available for Rift development only.

For GPU-bound applications, this feature has the potential to improve visual quality. The following charts illustrate pixel density (gold) and frames per second (blue) on a demo application with Adaptive Pixel Density off and on, respectively.


To enable Adaptive Pixel Density, use the console command hmd pdadaptive on or the console variable vr.oculus.pixeldensity.adaptive on. You may specify a minimum and maximum scaling factor (default 1 = normal density). See Unreal Console Variables and Commands on page 48 for more information.
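The behavior described above can be sketched as a small control loop. This standalone C++ is illustrative only: the 85% threshold comes from the description above, but the step size, update cadence, and function name are assumptions, not the runtime's actual algorithm.

```cpp
#include <algorithm>

// Illustrative only: one step of an adaptive pixel-density controller.
// Scale down while GPU utilization exceeds 85%, scale back up when there is
// headroom, clamped to [minDensity, maxDensity] as configured by the app.
float StepPixelDensity(float current, float gpuUtilization,
                       float minDensity = 0.5f, float maxDensity = 1.0f,
                       float step = 0.05f)
{
    float next = (gpuUtilization > 0.85f) ? current - step : current + step;
    return std::min(std::max(next, minDensity), maxDensity);
}
```

Note that the real runtime also rate-limits changes (see the two-second minimum delay mentioned below); a production controller would not adjust density every frame.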

To enable Adaptive Pixel Density on startup, specify the appropriate settings in Engine/Config/BaseEngine.ini. For example:

[Oculus.Settings]
PixelDensityMin=0.5
PixelDensityMax=1.0
PixelDensityAdaptive=true

See Loading Console Variables in Epic's Console Manager: Console Variables in C++ for more information.

If you do not want some Actors within your level (e.g., text displays) to be scaled, they should be drawn using separate VR Compositor Layers, which are not scaled by pixel density. For more information, see VR Compositor Layers on page 32.

Note: To minimize the perceived artifacts from changing resolution, there is a two-second minimum delay between every resolution change.

Application Lifecycle Handling

Applications may query for status changes such as losing input focus or the presence of system overlays, and take appropriate action, such as pausing.

Input Focus

With Oculus integration 1.18 or later, applications may query the Has Input Focus flag in the Oculus Library, which returns false if the application loses input focus, as when a Dash menu UI overlay is being rendered. In most cases, when input focus is lost, the appropriate action is to hide the tracked controllers in the game. For an illustration of a typical case, see the VRCharacter Blueprint in our Touch sample.

You may also wish to hide any Actors that may occlude the overlay (i.e., everything within 2-3 meters of the player) when Has Input Focus returns false.

Declaring Applications to be Lifecycle-Aware

There are two ways to signal the Oculus runtime that an application supports input focus loss, so Dash may be rendered over a paused application instead of in an empty scene. The first is to set bSupportsDash=true under [Oculus.Settings] in Engine.ini (the default is false). The second is to launch the application with the parameter -oculus-focus-aware. Do not set bSupportsDash=true unless your application properly handles input focus loss.
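Based on the description above, the first option corresponds to an Engine.ini entry like the following (the section and key names come from the text above; the placement within your project's Engine.ini is up to you):

```
[Oculus.Settings]
bSupportsDash=true
```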

Oculus Platform Features and Online Subsystems

The Oculus Platform supports several features related to security, community, revenue, and engagement, including matchmaking, in-app purchase, entitlement checking, VoIP, cloudsaves, and more.

Unreal developers may access Oculus Platform features in two ways:

1. A subset of Platform functionality is built into the engine and is available through the Unreal Online Subsystem interface, and
2. The Platform SDK includes an Unreal plugin providing full access to all Oculus Platform features.

Some Platform features support Blueprints, such as Entitlement Checking. For more information and instructions, see the Platform section of Blueprints on page 51.

For instructions on using the Oculus Platform Plugin for Unreal, please see our Platform SDK documentation and the readme included in the Unreal folder of the Platform SDK archive (available from our Unreal Downloads page).

Unreal Online Subsystems

Several Oculus Platform features may be accessed through Unreal's Online Subsystems (OSS) interface in Unreal Engine 4.12 and later. For more information on available UE4 versions and compatibility, see Unreal Engine on page 5.

For each supported feature, only a subset of functionality is exposed by the Online Subsystems interface. Developers who wish to use Platform features not available through Online Subsystems should code against the native C headers located in Engine\Source\ThirdParty\Oculus\LibOVRPlatform and include them in their build.

The following features are available through the Online Subsystems:

Feature        Online Subsystem Interface
Achievements   Achievements
Top Players    Leaderboards
Friends        Leaderboards
Rooms          Sessions
Matchmaking    Sessions
Cloudsaves     Cloudsaves
Identity       Identity
Entitlements   Identity
VoIP           Voice

Feature        Online Subsystem Interface
P2P Networking Netdriver

To enable the OSS plugin, select Edit > Plugin Settings > Built-In > Online Platform > Online Subsystem Oculus, and check Enabled.

To access Oculus Platform features through Online Subsystems, you must adjust your DefaultEngine.ini file to use Oculus:

[OnlineSubsystem]
DefaultPlatformService=Oculus

[OnlineSubsystemOculus]
bEnabled=true
OculusAppId=<app_id_here>

For more information on Unreal Subsystems, see Epic's Online Subsystems Overview.

Entitlement Checking

Entitlement checking may be used to protect apps from unauthorized distribution. Unreal developers using Blueprints can use the Verify Entitlements Blueprint under Oculus > Entitlement. For more information and instructions, see the Platform section of Blueprints on page 51 and Entitlements: App Authorizations in our Platform guide. Applications should always have entitlement checks enabled for submissions to the Oculus Store.

VoIP

Our Online Subsystems implementation of VoIP extends the Unreal class IOnlineVoice. See Unreal's IOnlineVoice reference for more information.

To enable VoIP in DefaultEngine.ini:

[OnlineSubsystem]
DefaultPlatformService=Oculus
bHasVoiceEnabled=true

[Voice]
bEnabled=true

To connect to someone else:

Online::GetVoiceInterface()->RegisterRemoteTalker(<FUniqueNetIdOculus>);

To disconnect from someone:

Online::GetVoiceInterface()->UnregisterRemoteTalker(<FUniqueNetIdOculus>);

To see if a remote user is talking:

Online::GetVoiceInterface()->IsRemotePlayerTalking(<FUniqueNetIdOculus>);

To unmute yourself:

Online::GetVoiceInterface()->StartNetworkedVoice(0);

To mute yourself:

Online::GetVoiceInterface()->StopNetworkedVoice(0);

P2P

Our Online Subsystem implementation of P2P extends the Unreal class UNetDriver. See Unreal's UNetDriver reference for more information.

To enable P2P in DefaultEngine.ini:

[/Script/Engine.GameEngine]
+NetDriverDefinitions=(DefName="GameNetDriver",DriverClassName="OnlineSubsystemOculus.OculusNetDriver",DriverClassNameFallback="OnlineSubsystemUtils.IpNetDriver")

[/Script/OnlineSubsystemOculus.OculusNetDriver]
NetConnectionClassName="OnlineSubsystemOculus.OculusNetConnection"

To act as the host of a game, open the map as a listen server. To act as the client of the game, open a connection to <oculus_id_of_host>.oculus.

Unreal Console Variables and Commands

This document reviews useful console variables and commands available for Unreal development.

Unreal Console Variables (1.15 or later)

Oculus integration 1.15 and later replace our previous console command model with console variables. For a complete description of available console variables, see UE4-Oculus.txt in the root folder of your Unreal installation.

Legacy Unreal Console Commands (1.14 or earlier)

These commands are available in Unreal versions using Oculus integration 1.14 or earlier. See Unreal Engine on page 5 for more information.

Rift: Press the Tab key while your game is running to bring up the console.

Gear VR: To bring up the console on a mobile device, set it to developer mode (instructions: developer.oculus.com/documentation/mobilesdk/latest/concepts/mobile-troublesh-device-run-app-outside/), launch the application, and tap the screen with four fingers.

To specify console commands to be loaded on startup, in most cases you should add them to Engine/Config/ConsoleVariables.ini. See Loading Console Variables in Epic's Console Manager: Console Variables in C++ for more information. For more information, see "Useful VR Console Commands" in Unreal's VR Cheat Sheet.

Table 3: Configuration Commands

Command                    Description
stereo on|off              Enables/Disables stereo mode.
stereo e=0.064             Eye distance (m).
stereo w2m=100             Overrides default world-units-to-meters scale.
stereo ncp=10 fcp=10000    Overrides near clipping and/or far clipping planes for stereo rendering (in cm).
stereo cs=1 ps=1           Overrides camera and position scale.
stereo show                Shows current IPD and head model offset.
stereo reset               Resets stereo settings.
hmd enable|disable         Enables/Disables HMD.
hmd pd [0..3.0]            Sets pixel density in the center (default is 1.0).
hmd pdadaptive on|off      Enables/Disables adaptive pixel density (see Adaptive Pixel Density (Rift Only) on page 39 for details).

Command                    Description
hmd pdmax [ ]              Sets maximum adaptive pixel density (ignored if hmd pdadaptive is off).
hmd pdmin [ ]              Sets minimum adaptive pixel density (ignored if hmd pdadaptive is off).
hmd sp [ ]                 Overrides screenpercentage for stereo mode. (Deprecated; use 'hmd pd xxx' instead.)
hmd hqdistortion on|off    Enables/Disables high-quality distortion.
hmd mirror on|off          Enables/Disables mirroring to the desktop window.
hmd mirror mode [0..4]     Sets mirror mode: 0=Distorted, 1=Undistorted, 2=SingleEye, 3=SingleEye Letterboxed, 4=SingleEye Cropped.
hmdpos on|off|toggle       Enables/Disables/Toggles positional tracking.
hmdpos enforce on|off      Enables/Disables head tracking even if not in stereo (for testing purposes).
hmdpos reset {yaw}         Resets position and rotation; applies yaw (in degrees) if provided.
hmdpos resetrot {yaw}      Resets rotation only; applies yaw (in degrees) if provided.
hmdpos resetpos            Resets position only.
hmdpos show                Outputs status of positional tracking to log.
hmdpos floor|eye           Selects tracking origin.

Table 4: Misc Commands

Command                    Description
hmd stats                  Shows HMD-related stats.
hmd grid                   Toggles lens-centered grid.
hmd updateongt on|off      Enables/Disables updating HMD pose on game thread. On by default.
hmd updateonrt on|off      Enables/Disables updating HMD pose on render thread, for lower latency. On by default.
hmd cubemap [gearvr] [xoff=n] [yoff=n] [zoff=n] [yaw=n]
                           Generates a cube map image of your application. May be used for VR app previews in the Store. Cube map PNGs will be saved in the directory GameDir/Saved/Cubemaps. gearvr: If specified, cube map size will be 6x1024x1024; otherwise it will be 6x2048x2048. xoff, yoff, zoff: Offset from the current player's location. yaw: Overrides yaw rotation (degrees).

Command                               Description
hmd setint PerfHUDMode [0..4]         Selects performance HUD mode; set to 0 to disable.
hmd setint DebugHudStereoMode [0..3]  Selects debug HUD stereo mode; set to 0 to disable.
hmdversion                            Prints the Oculus SDK version used and Oculus Plugin info.

Blueprints

Oculus Blueprints are available for developer use. To access these Blueprints, you must enable the Online Subsystem Oculus in Edit > Plugin > Online Platform.

The Blueprint class Oculus Library provides Blueprints for querying and controlling loading screens, HMD pose, user profile, raw sensor data, position scale, and more.

Blueprints available for mobile development include:

- Set CPU And GPU Levels - passes floats to Gear VR Function Library

Several Blueprints useful for mobile VR development are available in Epic's Optional Mobile Features Blueprint Library, such as:

- Are Headphones Plugged In
- Get Battery Level
- Get Battery Temperature
- Get Volume State

Input

The Input Blueprint provides a control interface for Touch, the Oculus remote, and the Gear VR Controller.

- Gamepad events return Oculus Remote button events (bool) and Oculus Touch button events (bool)
- Gamepad Values - return Oculus Touch values (floats)

Oculus Library - provides a number of Blueprints for querying and controlling the splash screen, HMD pose, loading icon, user profile, raw sensor data, position scale, and more.

Gear VR Controller clickpad events are reported in the Input Blueprint as thumbstick events. A clickpad click is reported as a press followed by a release. You may query the x- and y-axis of the thumbstick to determine the location of clickpad presses. The Gear VR Controller Back button may be queried with Motion Controller Face Button 1.

The Is Device Tracked? Blueprint may be used to infer the handedness of a Gear VR Controller. If a connected input device returns true for Right Controller and false for Left Controller, it is a right-handed controller.

Oculus Touch Haptics

The following Blueprints are not Oculus-specific, but may be used to control haptics for the Xbox or Oculus Touch controllers. For detailed information, see Haptics for Rift Controllers on page 24.

- Play Haptic Effect
- Stop Haptic Effect
- Set Haptics by Value
- Play Haptic Soundwave

Platform

To access these Blueprints, you must enable the Online Subsystem Oculus in Edit > Plugin > Online Platform.

- Verify Entitlements - verifies the user is authorized to run the application.
- Get Oculus Identity - returns Oculus ID and Oculus username as strings; provides flow control for On Success/On Failure.

In this simple example of the Verify Entitlement Blueprint, the application will verify that the user has an entitlement when the application launches, and quit if the check fails. In reality, we recommend providing an appropriate error message before shutting down.

This example of the Get Oculus Identity Blueprint illustrates how to print the user's Oculus ID to the log or screen.

Mobile (pre-1.16 integration)

Beginning with Oculus Unreal Integration 1.16, our Unreal integration provides Blueprints based on a common Oculus API for both Rift and mobile development. Mobile-specific Blueprints based on our previous Gear VR plugin were removed, and most functionality was merged into the shared API.

Unreal Loading Screens

We strongly recommend adding a loading screen to your Rift or mobile application. Loading screens are required by the Oculus Store.

A simple loading screen presenting the user with a rotating texture can be easily added to your game. You will need to specify a texture for display, either with a provided Blueprint or by modifying the DefaultEngine.ini configuration file. In both methods, the loading screen is drawn to a VR Compositor Layer, and is guaranteed to render consistently at the required minimum frame rate.

We recommend favoring the configuration file approach over the Blueprint, as Blueprints must be initialized before your loading screen can be displayed, and there may be a delay after application launch before they are visible.

In our Unreal source using Oculus SDK 1.14 and earlier, a rotating arrow loading screen was provided by default. Versions using 1.16 and later do not provide a default loading screen.

Create a Loading Screen with a Configuration File

Add an entry similar to the following example to your project's DefaultEngine.ini file. The values given in this example work well for a typical icon.

[Oculus.Splash.Settings]
TexturePath=/Game/LoadingIconTexture.LoadingIconTexture
DistanceInMeters=X=6.0 Y=0.0 Z=0.0
SizeInMeters=X=0.25 Y=0.25
RotationDeltaInDegrees=1.0
RotationAxis=X=0.0 Y=0.0 Z=1.0

Create a Loading Screen using a Blueprint

Oculus provides Blueprints to show, hide, and remove loading screens and transition screens. The Add Loading Icon Blueprint requires specifying the path of a texture to be added, and will then display that texture in the middle of the screen during loading. For greater control over how the icon will be displayed, use the Add Loading Splash Screen Blueprint:

For more information on Oculus Blueprints, see Blueprints on page 51.

Unreal Mixed Reality Capture

This guide describes how to add and configure mixed reality capture support for your Unreal application. Mixed reality capture is supported for Rift applications only.

Introduction

Mixed reality capture places real-world people and objects in VR. It allows live video footage of a Rift user to be composited with the output from a game to create combined video that shows the player in a virtual scene. (Courtesy of Medium and artist Dominic Qwek - 

Live video footage may be captured with a stationary or tracked camera. For more information and complete setup instructions, see the Mixed Reality Capture Setup Guide.

Once an application is configured by developers to use mixed reality capture, users can launch it with the feature enabled and control several relevant settings via Engine.ini or command-line parameters. See "Command-line Parameter Reference" and [Oculus.Settings.MixedReality] in Engine.ini below for more information.

Compositing the Scene

Mixed reality capture supports two methods for combining application output and video footage: direct composition and external composition.

In direct composition mode, your mixed reality capture application streams the real-world footage from your camera directly into your scene, and displays the composited image in the application itself. Direct composition mode requires the use of a green screen for video capture, and the composited image may exhibit some latency from the video stream. We currently recommend it for proof-of-concept, troubleshooting, and hobby use.

For more polished composition, we recommend using external composition mode. In this mode, the application outputs two windows. The MirrorWindow displays the application, as shown below. The second window displays the foreground content from the video stream on the left against a green background, and the background content on the right.

Figure 3: External Composition Mode

In external composition mode, third-party composition software such as OBS Studio or XSplit is required to clip the green screen, combine the images, and compensate for camera latency. For more information on how to composite a scene, see the Mixed Reality Capture Setup Guide.

Preparation

You must run the CameraTool prior to launching your mixed reality capture application to configure the external camera and VR Object. See the Mixed Reality Capture Setup Guide for setup information.
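Returning to external composition for a moment: the two-view casting window described above is combined by your compositing software, not by Unreal. The following plain-Python sketch (an illustration only, not part of the Oculus integration) shows the idea, treating the left half of a casting-window row as foreground-over-green and the right half as the matching background:

```python
# Each "frame row" is a list of RGB tuples; GREEN marks the green
# backdrop behind the foreground content in the left half.
GREEN = (0, 255, 0)

def composite(casting_row):
    """Combine one row of the two-view casting window.

    Left half: foreground pixels over green.
    Right half: the matching background pixels.
    """
    half = len(casting_row) // 2
    fg, bg = casting_row[:half], casting_row[half:]
    # Keep the foreground pixel unless it is the green key color.
    return [f if f != GREEN else b for f, b in zip(fg, bg)]

# Tiny example: one foreground pixel next to a keyed-out green pixel.
row = [(200, 50, 50), GREEN, (10, 10, 80), (10, 10, 80)]
combined = composite(row)
print(combined)  # foreground pixel kept, green replaced by background
```

Real compositors additionally soften edges and delay the game view to match camera latency, which is why OBS Studio or XSplit is recommended over a hand-rolled approach.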

Create a Simple VR Application (Optional)

If you wish to experiment with this feature but do not have a VR application prepared, create a basic application with the following steps.

1. Create a new project with the Virtual Reality Blueprint template.
2. Open the MotionControllerMap in Content > VirtualRealityBP > Maps.
3. Configure the Virtual Reality Blueprint for mixed reality capture.
   a. (Optional) To use a Touch controller as an input device with the Virtual Reality Blueprint, open the Project Settings menu, select Engine - Input, and expand Action Mappings. Modify the settings as shown below.
   b. (Optional) Open Edit Preference in the General - Miscellaneous panel and uncheck Use Less CPU when in Background. This prevents applications with mixed reality capture enabled from entering a low-FPS mode when switching to the composition software.
   c. The MotionControllerPawn Blueprint contains a bug which incorrectly sets the Tracking Origin to "Eye Level". Add a link between "Default" and "Set Tracking Origin (Floor Level)" as shown:

Add Mixed Reality Capture Support

To enable mixed reality capture, you will add a camera actor to your map and, optionally, set it as the origin for your tracking space. See the Blueprint Reference below for detailed information on the various components and settings.


1. In the Level editor:
   a. Add an instance of Oculus MR Casting Camera Actor to the map, which will be used in mixed reality capture. The composition parameters for this view may be set in the Oculus MR section of the OculusMR_CastingCameraActor instance.
2. In the Level Blueprint editor:
   a. Set VRPawn's VROrigin (or the tracking origin component in your map) as the OculusMR_CastingCameraActor1's TrackingReferenceComponent. If you plan to use the first PlayerController's position as the tracking reference, you may skip this step. See the Blueprint Reference below for more information.
   b. Bind the OculusMR_CastingCameraActor1 to the first TrackedCamera, which was configured with the CameraTool. The final Blueprint should look like this:

3. (Optional) Check the Casting Auto Start checkbox in the Oculus MR section of OculusMR_CastingCameraActor1 to configure the engine to automatically open the Casting Window on launch. This option is useful for debugging.

To check that everything is working properly, launch the map in VR Preview mode and verify that the Casting Window opens with the mixed reality capture content.

Sample Scene

A trivial sample map with mixed reality capture enabled is available in our GitHub repository (access instructions here) in Samples/Oculus/MixedRealitySample.

[Oculus.Settings.MixedReality] in Engine.ini

You may override any local mixed reality capture settings by specifying launch settings in the [Oculus.Settings.MixedReality] section of your Engine.ini file.

To easily write mixed reality capture settings to Engine.ini, configure your project with the desired settings and launch the application with the -save_mxr_settings parameter. This creates an [Oculus.Settings.MixedReality] section with the current configuration settings in Saved/Config/Windows/Engine.ini. Any user may edit the Engine.ini file to change these settings later.

To launch an application using the settings specified in Engine.ini, use the launch parameter -load_mxr_settings.

Launch Commands

Once you package the sample scene, you may launch it with these parameters. You may set your application to launch with mixed reality capture enabled in Blueprints for debugging purposes only; the Blueprint setting bCastingAutoStart is automatically disabled when you build your package.

// launch in direct composition mode
MixedRealitySample.exe -vr -mxr_open_direct_composition

// launch in direct composition mode with MirrorWindow projection
MixedRealitySample.exe -vr -mxr_open_direct_composition -mxr_project_to_mirror_window

// launch in MultiView mode
MixedRealitySample.exe -vr -mxr_open_multiview

// launch in MultiView mode with MirrorWindow projection
MixedRealitySample.exe -vr -mxr_open_multiview -mxr_project_to_mirror_window

// launch using settings in the [Oculus.Settings.MixedReality] section of Engine.ini (overrides local settings)
-load_mxr_settings

// save current settings to the [Oculus.Settings.MixedReality] section of Engine.ini
-save_mxr_settings

Mixed Reality Capture Features

The following features work for direct composition.

Chroma Key

Chroma key settings allow fine-tuned control of how the video and application streams are composited. Use these settings to set the reference color of the green screen and to control the various thresholds at which video pixels are included or excluded from the final frame.

Dynamic Lighting

When Dynamic Lighting is enabled, video captured by the physical camera is illuminated in the composited scene by light effects and flashes within the application. For example, a player would briefly be brightly lit during an explosion in the game. Lighting is applied to video on a flat plane parallel to the camera unless a depth-sensing camera (ZED camera) is used, in which case pixel depth is used to generate a per-pixel normal for the lighting process.

Virtual Green Screen

When enabled, Virtual Green Screen crops video footage that falls outside of the Guardian System Outer Boundary or Play Area configured by the user. The Outer Boundary is the actual perimeter drawn by the user during Touch setup, while the Play Area is a rectangle calculated from the Outer Boundary. Note that the Outer Boundary and Play Area are two-dimensional shapes in the x and z axes, while the virtual green screen is a 3D volume whose caps are set at +/- 10 meters by default.
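The chroma key thresholds mentioned above (a similarity cutoff plus a smoothing band, detailed in the property reference below) typically combine as bands of color distance from the key color. This plain-Python sketch is an illustration of that decision logic only, not the engine's actual GPU shader:

```python
def chroma_alpha(dist, similarity, smooth_range):
    """Map a pixel's color distance from the key color to an alpha.

    dist < similarity                  -> fully hidden (alpha 0)
    within the smoothing band          -> semi-transparent ramp
    beyond similarity + smooth_range   -> fully opaque (alpha 1)
    """
    if dist < similarity:
        return 0.0
    if dist < similarity + smooth_range:
        return (dist - similarity) / smooth_range
    return 1.0

# Pixels close to the key color vanish; edge pixels fade in smoothly.
print(chroma_alpha(0.05, similarity=0.1, smooth_range=0.1))  # hidden
print(chroma_alpha(0.50, similarity=0.1, smooth_range=0.1))  # opaque
```

Raising the similarity threshold hides more of the green screen; widening the smoothing band trades sharp edges for softer blending, which matches the tuning advice in the property descriptions below.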
Blueprint Reference

AOculusMR_CastingCameraActor Properties

bCastingAutoStart: Starts mixed reality capture casting automatically when the level starts. This option is for debugging only, and is automatically disabled when the game is launched as a standalone package. Use launch commands to launch applications with mixed reality capture enabled.

bProjectToMirrorWindow: Set to true to cast to the MirrorWindow. This can simplify window switching, especially on a single-monitor configuration. By default the scene is cast to a standalone window, which offers the most precision in the composition. When set to true, the standalone casting window is automatically minimized on startup, as the content is now cast to the MirrorWindow.

CompositionMethod: MultiView (default): the casting window includes the background and foreground views for external composition. DirectComposition: the game scene is composited with the camera frame directly.

ClippingReference: Specifies the distance from the camera to the mixed reality capture casting background/foreground boundary. Set to CR_TrackingReference to use the distance to the Tracking Reference (recommended for stationary experiences). Set to CR_Head to use the distance to the HMD (default, recommended for room-scale experiences).

TrackedCamera: Information about the tracked camera this object is bound to.

TrackingReferenceComponent: (Optional) If the application uses a VROrigin component to set the tracking space origin, specify that component here. Otherwise the system uses the location of the first PlayerController as the tracking reference.

bFollowTrackingReference: If true, the casting camera automatically follows the movement of the tracking reference.

bUseTrackedCameraResolution: If true, the casting viewports use the same resolution as the camera used in the calibration process.

WidthPerView: When bUseTrackedCameraResolution is false, sets the width of each casting viewport (foreground, background, or direct composited).

HeightPerView: When bUseTrackedCameraResolution is false, sets the height of each casting viewport (foreground, background, or direct composited).

CapturingCamera: When CompositionMethod is set to DirectComposition, indicates which physical camera device provides the video frame for compositing. Options are Web Camera 0, Web Camera 1, and ZED camera.

CastingLatency: When CompositionMethod is set to MultiView, sets the latency of the casting output. This setting may be used to help sync with the camera latency in the external composition application.

HandPoseStateLatency: Adds a delay in seconds (max. 0.5) to the tracked controllers in the composited scene to correct for camera latency.

ChromaKeyColor: Specifies the approximate color of your green screen as the compositing reference. The default value is Color.green, which matches a typical green screen under good lighting conditions.

ChromaKeySimilarity: When the distance between the pixel color and ChromaKeyColor is less than ChromaKeySimilarity, the pixel is hidden. Increase this value if the green screen is partially visible, and reduce it if the person in the scene partially disappears.

ChromaKeySmoothRange: Defines a small range of color distance between the pixel and the green screen in which the video frame pixel is rendered as semi-transparent. Increase this value to make the person's image smoother, and decrease it to sharpen.

ChromaKeySpillRange: Defines a small range of color distance between the pixel and the green screen in which the video frame pixel is desaturated. Increase this value to reduce green edges around the person's image. Decrease it if the person's image looks overly desaturated.

VirtualGreenScreenType: Options are Off, OuterBoundary, or PlayArea.

UseDynamicLighting: Set to true to activate Dynamic Lighting.

DepthQuality: Sets the quality level of the depth image for dynamic lighting to Low, Medium, or High. Higher values are smoother and more accurate, but more costly for performance.

DynamicLightingDepthSmoothFactor: Larger values make dynamic lighting effects smoother, but values that are too large make the lighting look flat.

DynamicLightingDepthVariationClampingValue: Sets the maximum depth variation across edges (smaller values set smoother edges).

Methods

BindToTrackedCameraIndexIfAvailable: Binds the casting camera to the calibrated external camera. If there is no calibrated external camera, the TrackedCamera parameters must be set up to match the CastingCameraActor placement. This provides an easy way to directly place a stationary casting camera in the level.

RequestTrackedCameraCalibration: When bFollowTrackingReference is false, manually call this method to move the casting camera to follow the tracking reference (i.e., the player).

OpenCastingWindow: Opens the casting window.

CloseCastingWindow: Closes the casting window.

ToggleCastingWindow: Toggles the casting window.

HasCastingWindowOpened: Checks whether the casting window has already been opened.

FTrackedCamera Properties

Index: >=0: the index of the external camera. -1: do not bind to any external camera (i.e., set up to match the manual CastingCameraActor placement).

Name: The external camera name set through the CameraTool.

FieldOfView: Horizontal FOV, in degrees.

SizeX, SizeY: Resolution of the camera frame.

AttachedTrackedDevice: The tracking node the external camera is bound to. None: stationary camera. HMD, LTouch, RTouch: the HMD or left/right Touch device. DeviceObjectZero: the VR Object.

CalibratedRotation, CalibratedOffset: The relative pose of the camera to the attached tracking device.

UserRotation, UserOffset: (Optional) Provide a user pose to fine-tune the relative camera pose at runtime. Equals the absolute pose in the tracking space if AttachedTrackedDevice == None. Use to match the manual CastingCameraActor placement in the level when Index == -1.

UOculusMRFunctionLibrary Methods

GetAllTrackedCamera: Retrieves an array of all tracked cameras which were calibrated through the CameraTool.

Command-line Parameter Reference

-mxr_open: Automatically open the casting window in the preset composition mode.

-mxr_open_multiview: Automatically open the casting window in MultiView composition mode.

-mxr_open_direct_composition: Automatically open the casting window in DirectComposition mode.

-mxr_project_to_mirror_window: Project the casting output to the MirrorWindow.

-save_mxr_settings: Create an [Oculus.Settings.MixedReality] section with the current configuration settings in Saved/Config/Windows/Engine.ini.

-load_mxr_settings: Load mixed reality settings from the [Oculus.Settings.MixedReality] section in Saved/Config/Windows/Engine.ini.

Console Commands

mr.autoopencasting [0 1 2]: Automatically open the casting window: 0=Off, 1=MultiView, 2=DirectComposition.

mr.chromatolerancea <float>: [Green-screen removal] When CompositionMethod is set to DirectComposition, sets how heavily to weight non-green values in a pixel. For example, if the character image looks too transparent, you may increase this value to make it more opaque.

mr.chromatoleranceb <float>: [Green-screen removal] When CompositionMethod is set to DirectComposition, sets how heavily to weight the green value. If mid-range greens don't appear to be cut out, increasing B or decreasing A may help.

mr.chromashadows <float>: [Green-screen removal] When CompositionMethod is set to DirectComposition, the shadow threshold helps mitigate shadow casting issues by eliminating very dark pixels.

mr.chromaalphacutoff <float>: [Green-screen removal] When CompositionMethod is DirectComposition, alpha cutoff is evaluated after chroma-key evaluation and before the bleed test to fully discard pixels with a low alpha value.

mr.castinglatency <float>: The casting latency in MultiView mode.

mr.projecttomirrorwindow [0 1]: Set to 1 to cast to the MirrorWindow. This can simplify window switching, especially on a single-monitor configuration. By default the scene is cast to a standalone window, which offers the most precision in the composition.

mr.mixedreality_override [0 1]: Use the Mixed Reality console variables.

mr.mixedreality_chromakeycolor_r: Chroma Key Color R.

mr.mixedreality_chromakeycolor_g: Chroma Key Color G.

mr.mixedreality_chromakeycolor_b: Chroma Key Color B.

mr.mixedreality_chromakeysimilarity: When the distance between the pixel color and ChromaKeyColor is less than ChromaKeySimilarity, the pixel is hidden. Increase this value if the green screen is partially visible, and reduce it if the person in the scene partially disappears.

mr.mixedreality_chromakeysmoothrange: Defines a small range of color distance between the pixel and the green screen in which the video frame pixel is rendered as semi-transparent. Increase this value to make the person's image smoother, and decrease it to sharpen.

mr.mixedreality_chromakeyspillrange: Defines a small range of color distance between the pixel and the green screen in which the video frame pixel is desaturated. Increase this value to reduce green edges around the person's image. Decrease it if the person's image looks overly desaturated.
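As described earlier, launching with -save_mxr_settings writes the current configuration into Saved/Config/Windows/Engine.ini, which any user can then hand-edit before launching with -load_mxr_settings. The exact key names are whatever your engine version serializes; the fragment below is purely illustrative, using names drawn from the property reference above:

[Oculus.Settings.MixedReality]
; Illustrative only - run with -save_mxr_settings to generate the real keys.
ChromaKeyColor=(R=0,G=255,B=0,A=255)
ChromaKeySimilarity=0.6
CastingLatency=0.02
VirtualGreenScreenType=OuterBoundary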

Unreal Samples

Oculus provides samples which illustrate basic VR concepts in Unreal, such as Touch, haptics, and the Boundary Component API for interacting with the Guardian System.

Samples are available from the Oculus Unreal GitHub repository. To access this repository, you must hold a UE4 license and be subscribed to the private EpicGames/UnrealEngine repository (see for details).

All samples require a compatible version of the Unreal Engine which supports the illustrated features. To explore samples, we generally recommend using the Unreal versions that we ship from our GitHub repository, which always include the latest features. For a review of which Unreal versions support which features, see Unreal Engine on page 5.

Boundary Sample

BoundarySample is a Blueprint sample that illustrates the use of the Boundary Component API for interacting with our Guardian System. This API allows developers to query the Guardian System trigger state, return the closest boundary point to a queried device or point, and more.

In this sample, the Guardian System Boundary Area is visualized on the ground in orange-yellow, and the Play Area is visualized on the ground in purple. Two spheres track the Touch controllers. When they approach the Boundary Area, they project a colored arrow toward the nearest sample point in the Guardian Area.
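The closest-boundary-point query the sample visualizes boils down to a nearest-point search over the boundary's sample points. This standalone plain-Python sketch illustrates the geometry only; in Unreal the Boundary Component API returns this data directly from the runtime:

```python
import math

def closest_boundary_point(point, boundary_points):
    """Return (closest sample point, distance) from a queried 2D
    position to a Guardian boundary approximated by sample points."""
    best = min(boundary_points, key=lambda b: math.dist(point, b))
    return best, math.dist(point, best)

# A square play area sampled at its corners; query from near one corner.
boundary = [(0, 0), (2, 0), (2, 2), (0, 2)]
pt, d = closest_boundary_point((0.2, 0.1), boundary)
print(pt, round(d, 3))
```

The sample's arrow is simply the direction from the tracked controller position toward the returned point, scaled by the remaining distance.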


NewGameMode and VRCharacter are used to initialize the scene and make the scene display at the appropriate height, et cetera.

Controls and Behavior

These behaviors are controlled by the BoundarySampleMap, which you can use as a reference for implementing related functionality.

RequestOuterBoundaryVisible(): Press 'V' on keyboard / right Touch button A to request boundary on. Press 'X' / right Touch button B to cancel.

SetOuterBoundaryColor(): Press 'G' on keyboard / left Touch button X to set the boundary to green. Press 'P' / left Touch button Y to make it purple.

IsOuterBoundaryDisplayed(): If so, the cylinder is not visible.

IsOuterBoundaryTriggered(): If so, the cone is not visible (the cone should remain if boundaries are requested on by the application).

GetTriggeredOuterBoundaryInfo(): Visualize normals.

GetTriggeredPlayAreaInfo(): Visualize normals.

OnOuterBoundaryTriggered Event: Cube made visible.

OnOuterBoundaryReturned Event: Cube made invisible.

Layer Sample

LayerSample is a Blueprint sample that illustrates the use of VR Compositor Layers to display a UMG UI. This sample includes two spheres that track with the Touch controllers and two UMG widgets rendered as VR Compositor layers. One is rendered as a quad layer and the other as a cylinder layer.

Actor_Blueprint illustrates rendering a UMG widget into a stereo layer. The widget is first rendered into a Material, then the SlateUI texture is pulled from the Material into the stereo layer. This is the UMG widget that is rendered to the quad and cylindrical layers in the sample. Open MenuBlueprint to open the UMG widget in the UMG Editor.

NewGameMode and VRCharacter are used to initialize the scene and make the scene display at the appropriate height, et cetera.

Input Sample

InputSample illustrates basic use of Oculus controllers, including Touch and the Gear VR Controller. It illustrates tracking, thumbstick control, and haptics control using PlayHapticEffect() and PlayHapticSoundWave().

Two spheres track with the Touch controllers. The right controller thumbstick may be used to control the position of the light gray box. Touch capacitance sensors detect whether the right thumb is in a touch or near-touch position, which controls the height of the box.

Press and hold the left Touch grip button to play a haptics clip. Press and hold the left Touch X button to create a haptics effect by setting the haptics value directly.

You will find the Haptics control Blueprint and the Thumbstick control Blueprint in the sample Level Blueprint. NewGameMode and VRCharacter are used to initialize the scene and make the scene display at the appropriate height, et cetera.

Mixed Reality Capture Sample

A trivial sample map with mixed reality capture enabled is available in our GitHub repository (access instructions here) in Samples/Oculus/MixedRealitySample. Select the OculusMR_CastingCameraActor1 instance to see how it is configured for the Level. For more information, see Unreal Mixed Reality Capture on page 59.

Legacy Touch Sample

This sample is available with legacy versions using Oculus integration 1.14 or earlier. For more information, see Unreal Engine on page 5.

TouchSample illustrates basic use of Oculus Touch, including controller tracking and thumbstick control. It also illustrates haptics control using PlayHapticEffect() and PlayHapticSoundWave().

Two spheres track with the Touch controllers. The right controller thumbstick may be used to control the position of the light gray box. Touch capacitance sensors detect whether the right thumb is in a touch or near-touch position, which controls the height of the box.

Press and hold the left Touch grip button to play a haptics clip. Press and hold the left Touch X button to create a haptics effect by setting the haptics value directly.

You will find the Haptics control Blueprint and the Thumbstick control Blueprint in the sample Level Blueprint. NewGameMode and VRCharacter are used to initialize the scene and make the scene display at the appropriate height, et cetera.

Testing and Performance Analysis in Unreal

This guide describes basic testing and performance analysis for Oculus development in Unreal.

VR application debugging is a matter of getting insight into how the application is structured and executed, gathering data to evaluate actual performance, evaluating it against expectations, then methodically isolating and eliminating problems.

When analyzing or debugging, it is crucial to proceed in a controlled way so that you know specifically which change results in a different outcome. Focus on bottlenecks first. Only compare apples to apples, and change one thing at a time (e.g., resolution, hardware, quality, configuration). Always be sure to profile, as systems are full of surprises. We recommend starting with simple code and optimizing as you go - don't try to optimize too early.

Performance Targets

Before debugging performance problems, establish clear targets to use as a baseline for calibrating your performance. These targets can give you a sense of where to aim, and what to look at if you're not making frame rate or are having performance problems. Below are some general guidelines for establishing your baselines, given as approximate ranges unless otherwise noted.

Mobile
60 FPS (required by Oculus Store)
50-100 draw calls per frame
50,000-100,000 triangles or vertices per frame

PC
90 FPS (required by Oculus Store)
500-1,000 draw calls per frame
1-2 million triangles or vertices per frame

For more information, see:
PC SDK Developer Guide
Mobile VR Application Development
Oculus Store Submission Requirements: Oculus Rift App Requirements, Mobile Virtual Reality Check (VRC) Guidelines

Rift Performance HUD

The Oculus Performance Heads-Up Display (HUD) is an important, easy-to-use tool for viewing timings for render, latency, and performance headroom in real time as you run an application in the Oculus Rift.
The HUD is easily accessible through the Oculus Debug Tool provided with the PC SDK. You may activate it in the Viewport by pressing the ~ key. For more details, see the Performance Heads-Up Display and Oculus Debug Tool sections of the Oculus Rift Developers Guide.
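The frame-rate targets listed above translate directly into the per-frame time budgets you will be reading off the Performance HUD. This is plain arithmetic, not engine API, sketched here for reference:

```python
def frame_budget_ms(fps):
    """CPU+GPU time available per frame at a given target frame rate."""
    return 1000.0 / fps

# Oculus Store targets: 90 FPS on Rift, 60 FPS on mobile.
for platform, fps in [("Rift", 90), ("Mobile", 60)]:
    print(f"{platform}: {frame_budget_ms(fps):.2f} ms per frame")
```

If the HUD shows your application frame time consistently above the budget for your platform, you are dropping frames and should start isolating bottlenecks as described above.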

Rift Compositor Mirror

The compositor mirror is an experimental tool for viewing exactly what appears in the headset, with Asynchronous TimeWarp and distortion applied. The compositor mirror is useful for development and troubleshooting without having to wear the headset. Everything that appears in the headset will appear, including Oculus Home, Guardian boundaries, in-game notifications, and transition fades.

The compositor mirror is compatible with any game or experience, regardless of whether it was developed using the native PC SDK or a game engine. For more details, see the Compositor Mirror section of the PC SDK Guide.

Debug Console (Mobile)

If your phone is set to Developer Mode, you may bring up a debug console for VR apps by pressing the screen with four fingers simultaneously while the application is running. Enter stat unit in the console for information about your application's frame rate and CPU performance.

Oculus Remote Monitor (Mobile)

The Oculus Remote Monitor client connects to VR applications running on remote mobile devices to capture, store, and display the streamed-in data. The VrCapture library is automatically included in Unreal projects, so setup and use of the Oculus Remote Monitor is easy.

Oculus Remote Monitor is available from our Downloads page. For more information about setup, features, and use, see Oculus Remote Monitor in our Mobile SDK guide.

The Frame Buffer Viewer provides a mechanism for inspecting the frame buffer as the data is received in real time, which is particularly useful for monitoring play test sessions. When enabled, the Capture library will stream a downscaled pre-distortion eye buffer across the network.

The Performance Data Viewer provides real-time and offline inspection of the following on a single, contiguous timeline: CPU/GPU events; sensor readings; console messages, warnings, and errors; and frame buffer captures.

The Logging Viewer provides raw access to various messages and errors tracked by thread IDs.

Nearly any constant in your code may be turned into a knob that can be updated in real time during a play test.

OVR Metrics Tool (Mobile)

OVR Metrics Tool reports application frame rate, heat, GPU and CPU throttling values, and the number of tears and stale frames per second. It is available for download from our Downloads page.

OVR Metrics Tool can be run in two modes. In Report Mode, it displays a performance report about a VR session after it is complete. Report data may be easily exported as CSV and PNG graphs. In Performance HUD Mode, OVR Metrics Tool renders performance graphs as a VR overlay over any running Oculus application.

For more information, see OVR Metrics Tool in our Mobile SDK Guide.

Graphics Debugging

Mali Graphics Debugger

If you have a Mali phone, such as a GALAXY S6, you can use the Mali Graphics Debugger support built into Unreal: open Project Settings, select the Android option on the left, and set Graphics Debugger to Mali Graphics Debugger.

Note that because there are no swap buffers in VR, Gear VR does not currently support frame delimiters. Consequently, application frames will be displayed as different render passes of the same frame.

RenderDoc

Version 1.16 and later of the Oculus branch of Unreal provides RenderDoc support. Gear VR support in RenderDoc is an experimental feature, so you must download a nightly build of RenderDoc to access it. To verify that you have the right version, confirm that your RenderDoc installation includes an android/ subfolder in its root path.

To use RenderDoc, open Project Settings in Unreal and select the Android option on the left. Set Graphics Debugger to RenderDoc.

The RenderDocCmd.apk application must be installed on the target device before debugging. Run the following command from the android/apk/32 directory of your RenderDoc installation:

adb install -r RenderDocCmd.apk

If your phone is not auto-recognized as a Remote Server in the Tools > Manage Remote Server section, add it by typing adb:<device-id> in the Add section. To attach to an application, run it and locate it in Tools > Manage Remote. To capture a frame, click the Trigger Capture button. Save the capture locally and double-click it in RenderDoc to replay.

GDB Debugging (Mobile)

Oculus branches of Unreal add support for debugging mobile sessions using ndk-gdb, a small shell script wrapped around GNU GDB that is included with the Android NDK. Using ndk-gdb from the command line adds convenient features to your debugging workflow by allowing, for example, adding breakpoints, stepping through code, and inspecting variables with a command-line interface.

To use ndk-gdb for debugging:

1. Enable remote port forwarding to link your target mobile port to a PC port: adb forward tcp:$port tcp:$port
2. Set your device to Developer Mode (as described in our Mobile Developer Guide here).


More information

SteamVR Unity Plugin Quickstart Guide

SteamVR Unity Plugin Quickstart Guide The SteamVR Unity plugin comes in three different versions depending on which version of Unity is used to download it. 1) v4 - For use with Unity version 4.x (tested going back to 4.6.8f1) 2) v5 - For

More information

GearVR Starter Kit. 1. Preface

GearVR Starter Kit. 1. Preface GearVR Starter Kit 1. Preface Welcome to GearVR Starter Kit! This package is a base project for creating GearVR applications. It also contains additional features to ease creation of EscapeRoom/Adventure

More information

Magic Leap Soundfield Audio Plugin user guide for Unity

Magic Leap Soundfield Audio Plugin user guide for Unity Magic Leap Soundfield Audio Plugin user guide for Unity Plugin Version: MSA_1.0.0-21 Contents Get started using MSA in Unity. This guide contains the following sections: Magic Leap Soundfield Audio Plugin

More information

Modo VR Technical Preview User Guide

Modo VR Technical Preview User Guide Modo VR Technical Preview User Guide Copyright 2018 The Foundry Visionmongers Ltd Introduction 2 Specifications, Installation, and Setup 2 Machine Specifications 2 Installing 3 Modo VR 3 SteamVR 3 Oculus

More information

HMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University

HMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University HMD based VR Service Framework July 31 2017 Web3D Consortium Kwan-Hee Yoo Chungbuk National University khyoo@chungbuk.ac.kr What is Virtual Reality? Making an electronic world seem real and interactive

More information

Achieving High Quality Mobile VR Games

Achieving High Quality Mobile VR Games Achieving High Quality Mobile VR Games Roberto Lopez Mendez, Senior Software Engineer Carl Callewaert - Americas Director & Global Leader of Evangelism, Unity Patrick O'Luanaigh CEO, ndreams GDC 2016 Agenda

More information

Tobii Pro VR Analytics User s Manual

Tobii Pro VR Analytics User s Manual Tobii Pro VR Analytics User s Manual 1. What is Tobii Pro VR Analytics? Tobii Pro VR Analytics collects eye-tracking data in Unity3D immersive virtual-reality environments and produces automated visualizations

More information

Virtual Reality Mobile 360 Nanodegree Syllabus (nd106)

Virtual Reality Mobile 360 Nanodegree Syllabus (nd106) Virtual Reality Mobile 360 Nanodegree Syllabus (nd106) Join the Creative Revolution Before You Start Thank you for your interest in the Virtual Reality Nanodegree program! In order to succeed in this program,

More information

Unreal Studio Project Template

Unreal Studio Project Template Unreal Studio Project Template Product Viewer What is the Product Viewer project template? This is a project template which grants the ability to use Unreal as a design review tool, allowing you to see

More information

Virtual Universe Pro. Player Player 2018 for Virtual Universe Pro

Virtual Universe Pro. Player Player 2018 for Virtual Universe Pro Virtual Universe Pro Player 2018 1 Main concept The 2018 player for Virtual Universe Pro allows you to generate and use interactive views for screens or virtual reality headsets. The 2018 player is "hybrid",

More information

is currently only supported ed on NVIDIA graphics cards!! CODE DEVELOPMENT AB

is currently only supported ed on NVIDIA graphics cards!! CODE DEVELOPMENT AB NOTE: VR-mode VR is currently only supported ed on NVIDIA graphics cards!! VIZCODE CODE DEVELOPMENT AB Table of Contents 1 Introduction... 3 2 Setup...... 3 3 Trial period and activation... 4 4 Use BIMXplorer

More information

Exploring Virtual Reality (VR) with ArcGIS. Euan Cameron Simon Haegler Mark Baird

Exploring Virtual Reality (VR) with ArcGIS. Euan Cameron Simon Haegler Mark Baird Exploring Virtual Reality (VR) with ArcGIS Euan Cameron Simon Haegler Mark Baird Agenda Introduction & Terminology Application & Market Potential Mobile VR with ArcGIS 360VR Desktop VR with CityEngine

More information

Obduction User Manual - Menus, Settings, Interface

Obduction User Manual - Menus, Settings, Interface v1.6.5 Obduction User Manual - Menus, Settings, Interface As you walk in the woods on a stormy night, a distant thunderclap demands your attention. A curious, organic artifact falls from the starry sky

More information

4.5.1 Mirroring Gain/Offset Registers GPIO CMV Snapshot Control... 14

4.5.1 Mirroring Gain/Offset Registers GPIO CMV Snapshot Control... 14 Thank you for choosing the MityCAM-C8000 from Critical Link. The MityCAM-C8000 MityViewer Quick Start Guide will guide you through the software installation process and the steps to acquire your first

More information

Understanding OpenGL

Understanding OpenGL This document provides an overview of the OpenGL implementation in Boris Red. About OpenGL OpenGL is a cross-platform standard for 3D acceleration. GL stands for graphics library. Open refers to the ongoing,

More information

pcon.planner PRO Plugin VR-Viewer

pcon.planner PRO Plugin VR-Viewer pcon.planner PRO Plugin VR-Viewer Manual Dokument Version 1.2 Author DRT Date 04/2018 2018 EasternGraphics GmbH 1/10 pcon.planner PRO Plugin VR-Viewer Manual Content 1 Things to Know... 3 2 Technical Tips...

More information

STRUCTURE SENSOR QUICK START GUIDE

STRUCTURE SENSOR QUICK START GUIDE STRUCTURE SENSOR 1 TABLE OF CONTENTS WELCOME TO YOUR NEW STRUCTURE SENSOR 2 WHAT S INCLUDED IN THE BOX 2 CHARGING YOUR STRUCTURE SENSOR 3 CONNECTING YOUR STRUCTURE SENSOR TO YOUR IPAD 4 Attaching Structure

More information

Kismet Interface Overview

Kismet Interface Overview The following tutorial will cover an in depth overview of the benefits, features, and functionality within Unreal s node based scripting editor, Kismet. This document will cover an interface overview;

More information

Ball Color Switch. Game document and tutorial

Ball Color Switch. Game document and tutorial Ball Color Switch Game document and tutorial This template is ready for release. It is optimized for mobile (iphone, ipad, Android, Windows Mobile) standalone (Windows PC and Mac OSX), web player and webgl.

More information

Official Documentation

Official Documentation Official Documentation Doc Version: 1.0.0 Toolkit Version: 1.0.0 Contents Technical Breakdown... 3 Assets... 4 Setup... 5 Tutorial... 6 Creating a Card Sets... 7 Adding Cards to your Set... 10 Adding your

More information

Virtual Mix Room. User Guide

Virtual Mix Room. User Guide Virtual Mix Room User Guide TABLE OF CONTENTS Chapter 1 Introduction... 3 1.1 Welcome... 3 1.2 Product Overview... 3 1.3 Components... 4 Chapter 2 Quick Start Guide... 5 Chapter 3 Interface and Controls...

More information

Easy Input For Gear VR Documentation. Table of Contents

Easy Input For Gear VR Documentation. Table of Contents Easy Input For Gear VR Documentation Table of Contents Setup Prerequisites Fresh Scene from Scratch In Editor Keyboard/Mouse Mappings Using Model from Oculus SDK Components Easy Input Helper Pointers Standard

More information

Shader "Custom/ShaderTest" { Properties { _Color ("Color", Color) = (1,1,1,1) _MainTex ("Albedo (RGB)", 2D) = "white" { _Glossiness ("Smoothness", Ran

Shader Custom/ShaderTest { Properties { _Color (Color, Color) = (1,1,1,1) _MainTex (Albedo (RGB), 2D) = white { _Glossiness (Smoothness, Ran Building a 360 video player for VR With the release of Unity 5.6 all of this became much easier, Unity now has a very competent media player baked in with extensions that allow you to import a 360 video

More information

Mobile SDK. Version 1.12

Mobile SDK. Version 1.12 Mobile SDK Version 1.12 2 Introduction Mobile Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights

More information

VisualCAM 2018 TURN Quick Start MecSoft Corporation

VisualCAM 2018 TURN Quick Start MecSoft Corporation 2 Table of Contents About this Guide 4 1 About... the TURN Module 4 2 Using this... Guide 4 3 Useful... Tips 5 Getting Ready 7 1 Running... VisualCAM 2018 7 2 About... the VisualCAD Display 7 3 Launch...

More information

COMPASS NAVIGATOR PRO QUICK START GUIDE

COMPASS NAVIGATOR PRO QUICK START GUIDE COMPASS NAVIGATOR PRO QUICK START GUIDE Contents Introduction... 3 Quick Start... 3 Inspector Settings... 4 Compass Bar Settings... 5 POIs Settings... 6 Title and Text Settings... 6 Mini-Map Settings...

More information

REPORT ON THE CURRENT STATE OF FOR DESIGN. XL: Experiments in Landscape and Urbanism

REPORT ON THE CURRENT STATE OF FOR DESIGN. XL: Experiments in Landscape and Urbanism REPORT ON THE CURRENT STATE OF FOR DESIGN XL: Experiments in Landscape and Urbanism This report was produced by XL: Experiments in Landscape and Urbanism, SWA Group s innovation lab. It began as an internal

More information

My view in VR and controller keep moving or panning outside of my control when using Oculus Go.

My view in VR and controller keep moving or panning outside of my control when using Oculus Go. Applicable ASINs/Models Product sub group Problem My view in VR and controller keep moving or panning outside of my control when using Oculus Go. I'm having trouble connecting my Oculus Go to WiFi. How

More information

The purpose of this document is to outline the structure and tools that come with FPS Control.

The purpose of this document is to outline the structure and tools that come with FPS Control. FPS Control beta 4.1 Reference Manual Purpose The purpose of this document is to outline the structure and tools that come with FPS Control. Required Software FPS Control Beta4 uses Unity 4. You can download

More information

Mobile SDK. Version 1.7

Mobile SDK. Version 1.7 Mobile SDK Version 1.7 2 Introduction Mobile Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights

More information

Introduction. Modding Kit Feature List

Introduction. Modding Kit Feature List Introduction Welcome to the Modding Guide of Might and Magic X - Legacy. This document provides you with an overview of several content creation tools and data formats. With this information and the resources

More information

M-16DX 16-Channel Digital Mixer

M-16DX 16-Channel Digital Mixer M-16DX 16-Channel Digital Mixer Workshop Using the M-16DX with a DAW 2007 Roland Corporation U.S. All rights reserved. No part of this publication may be reproduced in any form without the written permission

More information

An Escape Room set in the world of Assassin s Creed Origins. Content

An Escape Room set in the world of Assassin s Creed Origins. Content An Escape Room set in the world of Assassin s Creed Origins Content Version Number 2496 How to install your Escape the Lost Pyramid Experience Goto Page 3 How to install the Sphinx Operator and Loader

More information

OCULUS VR, LLC. Oculus Developer Guide SDK Version 0.4

OCULUS VR, LLC. Oculus Developer Guide SDK Version 0.4 OCULUS VR, LLC Oculus Developer Guide SDK Version 0.4 Date: October 24, 2014 2014 Oculus VR, LLC. All rights reserved. Oculus VR, LLC Irvine CA Except as otherwise permitted by Oculus VR, LLC ( Oculus

More information

House Design Tutorial

House Design Tutorial House Design Tutorial This House Design Tutorial shows you how to get started on a design project. The tutorials that follow continue with the same plan. When you are finished, you will have created a

More information

RAZER CENTRAL ONLINE MASTER GUIDE

RAZER CENTRAL ONLINE MASTER GUIDE RAZER CENTRAL ONLINE MASTER GUIDE CONTENTS 1. RAZER CENTRAL... 2 2. SIGNING IN... 3 3. RETRIEVING FORGOTTEN PASSWORDS... 4 4. CREATING A RAZER ID ACCOUNT... 7 5. USING RAZER CENTRAL... 11 6. SIGNING OUT...

More information

Minolta Scanner Plugin

Minolta Scanner Plugin Minolta Scanner Plugin For a list of Minolta digitizers and Geomagic software products with which this plugin is compatible, see Release Notes for Geomagic Minolta Plugin 7.6.0.3. Copyright 2005, Raindrop

More information

Introduction to Autodesk Inventor for F1 in Schools (Australian Version)

Introduction to Autodesk Inventor for F1 in Schools (Australian Version) Introduction to Autodesk Inventor for F1 in Schools (Australian Version) F1 in Schools race car In this course you will be introduced to Autodesk Inventor, which is the centerpiece of Autodesk s Digital

More information

The ideal K-12 science microscope solution. User Guide. for use with the Nova5000

The ideal K-12 science microscope solution. User Guide. for use with the Nova5000 The ideal K-12 science microscope solution User Guide for use with the Nova5000 NovaScope User Guide Information in this document is subject to change without notice. 2009 Fourier Systems Ltd. All rights

More information

EZ360 User Manual (Oculus Go version)

EZ360 User Manual (Oculus Go version) EZ360 User Manual (Oculus Go version) You can also visit the FAQ at ez-360.com/faq EZ360, EZ360 Pro and EZ360 Lite are developed by VR-House. Content: What is EZ360? What is EZ360 Lite? How to download

More information

MC3 Motion Control System Shutter Stream Quickstart

MC3 Motion Control System Shutter Stream Quickstart MC3 Motion Control System Shutter Stream Quickstart Revised 7/6/2016 Carousel USA 6370 N. Irwindale Rd. Irwindale, CA 91702 www.carousel-usa.com Proprietary Information Carousel USA has proprietary rights

More information

EinScan-SE. Desktop 3D Scanner. User Manual

EinScan-SE. Desktop 3D Scanner. User Manual EinScan-SE Desktop 3D Scanner User Manual Catalog 1. 2. 3. 4. 5. 6. 7. 8. 1.1. 1.2. 1.3. 1.1. 1.2. 1.1. 1.2. 1.3. 1.1. 1.2. Device List and Specification... 2 Device List... 3 Specification Parameter...

More information

Motion Blur with Mental Ray

Motion Blur with Mental Ray Motion Blur with Mental Ray In this tutorial we are going to take a look at the settings and what they do for us in using Motion Blur with the Mental Ray renderer that comes with 3D Studio. For this little

More information

CONCEPTS EXPLAINED CONCEPTS (IN ORDER)

CONCEPTS EXPLAINED CONCEPTS (IN ORDER) CONCEPTS EXPLAINED This reference is a companion to the Tutorials for the purpose of providing deeper explanations of concepts related to game designing and building. This reference will be updated with

More information

RAZER GOLIATHUS CHROMA

RAZER GOLIATHUS CHROMA RAZER GOLIATHUS CHROMA MASTER GUIDE The Razer Goliathus Chroma soft gaming mouse mat is now Powered by Razer Chroma. Featuring multi-color lighting with inter-device color synchronization, the bestselling

More information

MINIMUM SYSTEM REQUIREMENTS

MINIMUM SYSTEM REQUIREMENTS Quick Start Guide Copyright 2000-2012 Frontline Test Equipment, Inc. All rights reserved. You may not reproduce, transmit, or store on magnetic media any part of this publication in any way without prior

More information

Mobile SDK. Version 1.14

Mobile SDK. Version 1.14 Mobile SDK Version 1.14 2 Introduction Mobile Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights

More information

Kings! Card Swiping Decision Game Asset

Kings! Card Swiping Decision Game Asset Kings! Card Swiping Decision Game Asset V 1.31 Thank you for purchasing this asset! If you encounter any errors / bugs, want to suggest new features/improvements or if anything is unclear (after you have

More information

DESIGNING GAMES FOR NVIDIA GRID

DESIGNING GAMES FOR NVIDIA GRID DESIGNING GAMES FOR NVIDIA GRID BEST PRACTICES GUIDE Eric Young, DevTech Engineering Manager for GRID AGENDA Onboard Games on to NVIDIA GRID GamePad Support! Configurable Game Settings Optimizing your

More information

Technical Guide. Updated June 20, Page 1 of 63

Technical Guide. Updated June 20, Page 1 of 63 Technical Guide Updated June 20, 2018 Page 1 of 63 How to use VRMark... 4 Choose a performance level... 5 Choose an evaluation mode... 6 Choose a platform... 7 Target frame rate... 8 Judge with your own

More information

This guide updated November 29, 2017

This guide updated November 29, 2017 Page 1 of 57 This guide updated November 29, 2017 How to use VRMark... 4 Choose a performance level... 5 Choose an evaluation mode... 6 Choose a platform... 7 Target frame rate... 8 Judge with your own

More information

Easy Input Helper Documentation

Easy Input Helper Documentation Easy Input Helper Documentation Introduction Easy Input Helper makes supporting input for the new Apple TV a breeze. Whether you want support for the siri remote or mfi controllers, everything that is

More information

6 System architecture

6 System architecture 6 System architecture is an application for interactively controlling the animation of VRML avatars. It uses the pen interaction technique described in Chapter 3 - Interaction technique. It is used in

More information

About the DSR Dropout, Surge, Ripple Simulator and AC/DC Voltage Source

About the DSR Dropout, Surge, Ripple Simulator and AC/DC Voltage Source About the DSR 100-15 Dropout, Surge, Ripple Simulator and AC/DC Voltage Source Congratulations on your purchase of a DSR 100-15 AE Techron dropout, surge, ripple simulator and AC/DC voltage source. The

More information

Experiment 02 Interaction Objects

Experiment 02 Interaction Objects Experiment 02 Interaction Objects Table of Contents Introduction...1 Prerequisites...1 Setup...1 Player Stats...2 Enemy Entities...4 Enemy Generators...9 Object Tags...14 Projectile Collision...16 Enemy

More information

Team Breaking Bat Architecture Design Specification. Virtual Slugger

Team Breaking Bat Architecture Design Specification. Virtual Slugger Department of Computer Science and Engineering The University of Texas at Arlington Team Breaking Bat Architecture Design Specification Virtual Slugger Team Members: Sean Gibeault Brandon Auwaerter Ehidiamen

More information

Mobile Virtual Reality what is that and how it works? Alexey Rybakov, Senior Engineer, Technical Evangelist at DataArt

Mobile Virtual Reality what is that and how it works? Alexey Rybakov, Senior Engineer, Technical Evangelist at DataArt Mobile Virtual Reality what is that and how it works? Alexey Rybakov, Senior Engineer, Technical Evangelist at DataArt alexey.rybakov@dataart.com Agenda 1. XR/AR/MR/MR/VR/MVR? 2. Mobile Hardware 3. SDK/Tools/Development

More information

House Design Tutorial

House Design Tutorial Chapter 2: House Design Tutorial This House Design Tutorial shows you how to get started on a design project. The tutorials that follow continue with the same plan. When you are finished, you will have

More information

Inserting and Creating ImagesChapter1:

Inserting and Creating ImagesChapter1: Inserting and Creating ImagesChapter1: Chapter 1 In this chapter, you learn to work with raster images, including inserting and managing existing images and creating new ones. By scanning paper drawings

More information

Instant Delay 1.0 Manual. by unfilteredaudio

Instant Delay 1.0 Manual. by unfilteredaudio Instant Delay 1.0 Manual by unfilteredaudio Introduction Instant Delay takes the Modern Instant mode from our hit delay/looper Sandman Pro and crosses it with our soft saturator and resonant filter from

More information

Oculus Rift Unity 3D Integration Guide

Oculus Rift Unity 3D Integration Guide Oculus Rift Unity 3D Integration Guide 2 Introduction Oculus Rift Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus

More information

Learn Unity by Creating a 3D Multi-Level Platformer Game

Learn Unity by Creating a 3D Multi-Level Platformer Game Learn Unity by Creating a 3D Multi-Level Platformer Game By Pablo Farias Navarro Certified Unity Developer and Founder of Zenva Table of Contents Introduction Tutorial requirements and project files Scene

More information

House Design Tutorial

House Design Tutorial Chapter 2: House Design Tutorial This House Design Tutorial shows you how to get started on a design project. The tutorials that follow continue with the same plan. When you are finished, you will have

More information

i800 Series Scanners Image Processing Guide User s Guide A-61510

i800 Series Scanners Image Processing Guide User s Guide A-61510 i800 Series Scanners Image Processing Guide User s Guide A-61510 ISIS is a registered trademark of Pixel Translations, a division of Input Software, Inc. Windows and Windows NT are either registered trademarks

More information

MANUAL. Invictus Guitar V1.0

MANUAL. Invictus Guitar V1.0 MANUAL Invictus Guitar V1.0 Copyright (c) Martin Britz 2017 Disclaimer Disclaimer The information in this document is subject to change without notice and does not represent a commitment on the part of

More information

Immersive Visualization and Collaboration with LS-PrePost-VR and LS-PrePost-Remote

Immersive Visualization and Collaboration with LS-PrePost-VR and LS-PrePost-Remote 8 th International LS-DYNA Users Conference Visualization Immersive Visualization and Collaboration with LS-PrePost-VR and LS-PrePost-Remote Todd J. Furlong Principal Engineer - Graphics and Visualization

More information

VR Capture & Analysis Guide. FCAT VR Frame Capture Analysis Tools for VR

VR Capture & Analysis Guide. FCAT VR Frame Capture Analysis Tools for VR VR Capture & Analysis Guide FCAT VR Frame Capture Analysis Tools for VR 1 TABLE OF CONTENTS Table of Contents... 2 FCAT VR... 4 Measuring the Quality of your VR Experience... 4 FCAT VR Capture...4 FCAT

More information

ReVRSR: Remote Virtual Reality for Service Robots

ReVRSR: Remote Virtual Reality for Service Robots ReVRSR: Remote Virtual Reality for Service Robots Amel Hassan, Ahmed Ehab Gado, Faizan Muhammad March 17, 2018 Abstract This project aims to bring a service robot s perspective to a human user. We believe

More information

Roadblocks for building mobile AR apps

Roadblocks for building mobile AR apps Roadblocks for building mobile AR apps Jens de Smit, Layar (jens@layar.com) Ronald van der Lingen, Layar (ronald@layar.com) Abstract At Layar we have been developing our reality browser since 2009. Our

More information

Getting started 1 System Requirements... 1 Software Installation... 2 Hardware Installation... 2 System Limitations and Tips on Scanning...

Getting started 1 System Requirements... 1 Software Installation... 2 Hardware Installation... 2 System Limitations and Tips on Scanning... Contents Getting started 1 System Requirements......................... 1 Software Installation......................... 2 Hardware Installation........................ 2 System Limitations and Tips on

More information

Momo Software Context Aware User Interface Application USER MANUAL. Burak Kerim AKKUŞ Ender BULUT Hüseyin Can DOĞAN

Momo Software Context Aware User Interface Application USER MANUAL. Burak Kerim AKKUŞ Ender BULUT Hüseyin Can DOĞAN Momo Software Context Aware User Interface Application USER MANUAL Burak Kerim AKKUŞ Ender BULUT Hüseyin Can DOĞAN 1. How to Install All the sources and the applications of our project is developed using

More information

1. Creating geometry based on sketches 2. Using sketch lines as reference 3. Using sketches to drive changes in geometry

1. Creating geometry based on sketches 2. Using sketch lines as reference 3. Using sketches to drive changes in geometry 4.1: Modeling 3D Modeling is a key process of getting your ideas from a concept to a read- for- manufacture state, making it core foundation of the product development process. In Fusion 360, there are

More information

Moving Web 3d Content into GearVR

Moving Web 3d Content into GearVR Moving Web 3d Content into GearVR Mitch Williams Samsung / 3d-online GearVR Software Engineer August 1, 2017, Web 3D BOF SIGGRAPH 2017, Los Angeles Samsung GearVR s/w development goals Build GearVRf (framework)

More information

RAZER RAIJU TOURNAMENT EDITION

RAZER RAIJU TOURNAMENT EDITION RAZER RAIJU TOURNAMENT EDITION MASTER GUIDE The Razer Raiju Tournament Edition is the first Bluetooth and wired controller to have a mobile configuration app, enabling control from remapping multi-function

More information

Owner s Manual. Page 1 of 23

Owner s Manual. Page 1 of 23 Page 1 of 23 Installation Instructions Table of Contents 1. Getting Started! Installation via Connect! Activation with Native Instruments Service Center 2. Pulse Engines Page! Pulse Engine Layers! Pulse

More information

Annex IV - Stencyl Tutorial

Annex IV - Stencyl Tutorial Annex IV - Stencyl Tutorial This short, hands-on tutorial will walk you through the steps needed to create a simple platformer using premade content, so that you can become familiar with the main parts

More information

Photoshop CS2. Step by Step Instructions Using Layers. Adobe. About Layers:

Photoshop CS2. Step by Step Instructions Using Layers. Adobe. About Layers: About Layers: Layers allow you to work on one element of an image without disturbing the others. Think of layers as sheets of acetate stacked one on top of the other. You can see through transparent areas

More information

Omniverse Setup Instructions

Omniverse Setup Instructions Omniverse Setup Instructions Hello Omni customer, Please follow the steps outlined below to get your Omni ready for Omniverse! Let us know if you have questions or issues at any time at support@virtuix.com.

More information

SolidWorks Part I - Basic Tools SDC. Includes. Parts, Assemblies and Drawings. Paul Tran CSWE, CSWI

SolidWorks Part I - Basic Tools SDC. Includes. Parts, Assemblies and Drawings. Paul Tran CSWE, CSWI SolidWorks 2015 Part I - Basic Tools Includes CSWA Preparation Material Parts, Assemblies and Drawings Paul Tran CSWE, CSWI SDC PUBLICATIONS Better Textbooks. Lower Prices. www.sdcpublications.com Powered

More information

Brightness and Contrast Control Reference Guide

Brightness and Contrast Control Reference Guide innovation Series Scanners Brightness and Contrast Control Reference Guide A-61506 Part No. 9E3722 CAT No. 137 0337 Using the Brightness and Contrast Control This Reference Guide provides information and

More information

Welcome to Corel DESIGNER, a comprehensive vector-based package for technical graphic users and technical illustrators.

Welcome to Corel DESIGNER, a comprehensive vector-based package for technical graphic users and technical illustrators. Workspace tour Welcome to Corel DESIGNER, a comprehensive vector-based package for technical graphic users and technical illustrators. This tutorial will help you become familiar with the terminology and

More information

Designing in the context of an assembly

Designing in the context of an assembly SIEMENS Designing in the context of an assembly spse01670 Proprietary and restricted rights notice This software and related documentation are proprietary to Siemens Product Lifecycle Management Software

More information

Unity Game Development Essentials

Unity Game Development Essentials Unity Game Development Essentials Build fully functional, professional 3D games with realistic environments, sound, dynamic effects, and more! Will Goldstone 1- PUBLISHING -J BIRMINGHAM - MUMBAI Preface

More information

0FlashPix Interoperability Test Suite User s Manual

0FlashPix Interoperability Test Suite User s Manual 0FlashPix Interoperability Test Suite User s Manual Version 1.0 Version 1.0 1996 Eastman Kodak Company 1996 Eastman Kodak Company All rights reserved. No parts of this document may be reproduced, in whatever

More information

Getting Started. with Easy Blue Print

Getting Started. with Easy Blue Print Getting Started with Easy Blue Print User Interface Overview Easy Blue Print is a simple drawing program that will allow you to create professional-looking 2D floor plan drawings. This guide covers the

More information

Drum Leveler. User Manual. Drum Leveler v Sound Radix Ltd. All Rights Reserved

Drum Leveler. User Manual. Drum Leveler v Sound Radix Ltd. All Rights Reserved 1 Drum Leveler User Manual 2 Overview Drum Leveler is a new beat detection-based downward and upward compressor/expander. By selectively applying gain to single drum beats, Drum Leveler easily achieves

More information

Kodiak Corporate Administration Tool

Kodiak Corporate Administration Tool AT&T Business Mobility Kodiak Corporate Administration Tool User Guide Release 8.3 Table of Contents Introduction and Key Features 2 Getting Started 2 Navigate the Corporate Administration Tool 2 Manage

More information

House Design Tutorial

House Design Tutorial House Design Tutorial This House Design Tutorial shows you how to get started on a design project. The tutorials that follow continue with the same plan. When you are finished, you will have created a

More information