Home Sweet Virtual Home


KTH DT2140
Home Sweet Virtual Home
Niklas Blomqvist, Carlos Egusquiza, Annika Strålfors
Supervisor: Christopher Peters
January 20, 2015

ABSTRACT

Multimodal interaction is an important area in human-computer interaction. This project explores how the Oculus Rift works together with voice recognition and standard controls in a virtual environment. Users are able to walk around in two rooms and change certain parameters of the furniture. A study was conducted and user experience data was collected through a survey in order to evaluate the application. None of the users experienced motion sickness, and all thought that navigating the menu and the rooms with Oculus Rift head tracking together with the standard controls worked well. Voice input was not perceived as a completely natural modality for interacting with the program.

CONTENTS

1 Introduction
2 Related work
3 Implementation
3.1 Initial thoughts
3.2 Change of plans
3.3 Final implementation
4 Results
4.1 Application
4.2 Evaluation
5 Discussion
6 Conclusion
References

1 INTRODUCTION

Multimodal interaction is an expanding area in human-computer interaction in which users can interact with technology through several input and output modalities. This project aimed to combine head tracking with voice recognition and standard controls in order to make an application where users could explore a virtual room through the use of an Oculus Rift. A virtual representation of a room in a house environment was created in Unity together with a GUI. Through the Oculus Rift, the user could explore and manipulate certain objects in order to see how they would look in the real-world room. The objects consisted of different pieces of furniture, for which the user could change texture, color and model. A future use of the application might be commercial: web furniture shops could let users download models of different furniture and import them into a virtual representation of a room in their house, in order to see how the furniture would look in the context of a real-world room.

The Oculus Rift is a head-mounted display (HMD) which will most likely become available for commercial use in the near future. It has great potential in many usability areas, but not much research has been done on it since it is fairly new. Motion sickness is a big issue when developing virtual environments used with an HMD. In scenarios where the user's eyes are occupied, voice input has shown superiority over, for example, keyboard entry in many applications [1]. Furthermore, voice recognition has proved useful in certain environments, e.g. to store and forward messages, for alerts in busy environments, and for blind or motor-impaired users. However, some researchers argue that voice input is inefficient compared to physical controls in tasks that require problem-solving activities, due to how the brain processes information [2].
This project wanted to explore how well Oculus Rift head tracking and voice recognition could be used together for navigation in the virtual room. This report consists of a description of the application, followed by an evaluation in which users got to test the application and answer a survey. The results are presented with a subsequent discussion.

2 RELATED WORK

The role that voice input plays in human-computer interaction is significant under certain circumstances: when the user is disabled, when pronunciation is the subject matter of the computer use, when natural language interaction is preferred, when only a limited keyboard and/or screen is available, or when the user's hands or eyes are busy. When using an Oculus Rift, the user's eyes will be looking into the virtual environment and the user won't be able to see a standard control such as the keyboard. Voice could therefore be an optimal input method over the keyboard in this scenario. In many applications where the user's input is constrained, voice leads to faster task performance and fewer errors than keyboard entry [1]. However, the use of voice recognition as an input modality in human-computer interaction is a challenging area with slow development compared to visual interfaces. Spoken language between humans is a very effective and natural way to interact for most people. As addressed in the article The limits of speech recognition, voice recognition can

create severe limitations when implemented in human-computer systems [2]. Speech is difficult to review and edit, it is a slow way of representing information, and it interferes significantly with other cognitive tasks. As discussed in the article, one reason might be that humans perceive it as more difficult to talk and think at the same time, compared to using standard controls, e.g. a keyboard or mouse, in parallel with thought processes. The reason is that physical activity is processed in other parts of the brain than problem solving, while speaking and listening are processed in the same brain area as the short-term and working memory. Using speech as an input modality in applications that demand a lot of attention can therefore be problematic. In order to create effective application interaction, designers need to set realistic goals for speech-based human-computer interaction and acknowledge the limitations of voice recognition when it is performed in parallel with problem solving.

The Oculus Rift is probably the leading low-cost HMD at this point. There are, as mentioned before, several other HMDs, but most of these are intended for use in labs and are much more expensive. A study comparing the Oculus Rift development kit 1 (DK1) with the high-cost Nvis SX60 was made in 2014 [3]. The Oculus Rift has a lower resolution (640 x 800 per eye compared to the Nvis's 1280 x 1024 per eye) but a higher field of view (FoV). The Oculus Rift is also lighter, weighing about 480 g against the Nvis's 1250 g. There are some other differentiating factors between the two systems. Two tests were conducted, followed by a survey about motion sickness. In the first test, they compared the Oculus Rift to the Nvis SX60 in the task of egocentric distance estimation. The second test was divided into three tasks: sorting, searching and viewing.
The Oculus Rift was used with two Razer Hydra controllers, where movement is controlled with the analog stick on the right controller, and the environment was built with the pro license version of the Unity3D game engine. The Nvis SX60 system was used with a CyberGlove II for grasping and a Logitech Freedom wireless joystick for movement. The results showed that the low-cost Oculus Rift outperformed the high-cost Nvis SX60 in both the distance estimation task and the sorting, searching and viewing tasks. In the egocentric distance test, the results were obtained by calculating the ratio of distance walked to true distance. The Oculus Rift users scored about 0.9 (where 1 is perfect estimation) below 10 meters and about 1.1 beyond that, whilst the Nvis SX60 users scored around 0.55 for all distances. The task completion time for the sorting and searching tasks was lower with the Oculus Rift, and the participants felt more present in the virtual environment viewed in the Oculus Rift. Two participants had to disconnect from the test due to motion sickness when using the Oculus Rift. The authors suggest this might be due to the low resolution, but state that there are many factors at play in determining visual discomfort in 3D displays. The new Oculus Rift (DK2), which is available now, has a higher resolution of 960 x 1080 per eye.

The development of new low-cost interface technologies like the Oculus Rift has renewed interest in virtual environments, especially for private entertainment use. Side effects such as nausea from cybersickness (sickness experienced by users of head-steered virtual reality systems) are a major issue [4]. Simulator sickness and motion sickness have previously been well studied, yet many of the issues are still unresolved. A study suggests that a big factor in why they are still unresolved is the methods used to measure cybersickness.
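The egocentric-distance score in the study above is simply the ratio of walked to true distance. As a minimal sketch (the trial values below are made-up illustrations, not data from the study):

```python
def estimation_ratio(distance_walked, true_distance):
    """Ratio of distance walked to true distance; 1.0 is perfect estimation,
    values below 1.0 indicate underestimation, above 1.0 overestimation."""
    return distance_walked / true_distance

# Illustrative trials only:
print(estimation_ratio(4.5, 5.0))    # walked short of the target
print(estimation_ratio(11.0, 10.0))  # overshot the target
```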
More precise tools to objectively measure cybersickness are needed in order to solve some of these issues. Most methods used to measure cybersickness today are subjective and based on questionnaires; while useful for providing a snapshot of participant experience, this method is prone to problems and does not provide uninterrupted real-time monitoring while the participant is in the virtual environment.

Studies have shown that distances in virtual reality environments are often underestimated compared to distance perception in the real world [5]. In the real world, distances up to 25 meters have been shown to be perceived quite accurately. In one of these studies, two main explanations for why distance perception in virtual environments tends to be underestimated have been proposed. The first suggestion is that graphical information is missing from the rendering of the scene. The second is that it depends on the immersive display technology. In the study, an experiment with action tasks related to egocentric distance estimation was conducted. The participants viewed targets on the ground from different distances (2 m, 3.5 m and 5 m) and were then instructed to walk towards the previous targets without vision. The participants were instructed to memorize the target and its surroundings before their vision was blacked out. There were three different environments: a real hallway, a virtual 360-degree panorama photograph of the hallway displayed in a head-mounted display (HMD), and a computer graphics rendering of the hallway displayed in an HMD. After the participants had observed the target, their vision was blacked out (either with a blindfold or by clearing the HMD display) and they were instructed to walk towards the previous target and stop when they perceived that they had covered the distance to it. The results showed a significant difference between distance perception in the real world and in the two virtual representations of the hallway. Distance perception was most accurate under the real-world conditions.
The panorama photograph of the hallway resulted in slightly better distance perception than the computer graphics rendering, but the difference was very small compared to results from previous research. A possible explanation for this is the complexity of the graphical model and the high resolution of the display system. The results from the study confirmed that distance perception is more accurate under real-world conditions and suggest that, due to the small differences between the estimations under the virtual conditions, the compression of egocentric distance in virtual environments is likely caused by the HMD system.

When reconstructing virtual environments, cameras with a built-in depth sensor can be used. The consumer-grade Kinect camera has the potential to be used in mapping applications where the accuracy requirements are less strict [6]. It determines the depth of different objects by triangulation, which is done by the sensor's main components: an IR laser emitter, an IR camera and an RGB camera. The random error in the depth data is minimal when the object is close to the sensor. Increasing the distance to the sensor makes the random error increase quadratically. The depth resolution also decreases quadratically, reaching its lowest at the maximum range of 5 m.
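The quadratic growth of the Kinect's random depth error can be sketched with a toy model. This is an illustration only: the coefficient `k` below is an assumed round value, not a calibrated one, chosen so the error reaches the few-centimetre order of magnitude near the 5 m maximum range:

```python
def kinect_depth_error(z_m, k=0.0016):
    """Toy model of random depth error (in meters) as k * z^2.

    k is an illustrative, uncalibrated coefficient: with k = 0.0016 the
    modeled error is about 1.6 mm at 1 m and about 4 cm at 5 m, matching
    the quadratic growth described in the text, not the sensor's exact spec.
    """
    return k * z_m ** 2

for z in (1.0, 2.5, 5.0):
    print(f"{z:.1f} m -> {kinect_depth_error(z) * 100:.2f} cm")
```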

3 IMPLEMENTATION

3.1 INITIAL THOUGHTS

Our goals were different in the planning stage of the project. We had planned to create a virtual environment by scanning an existing room with the help of the Kinect. The virtual room would be navigated with a Wii Remote in combination with voice commands. The main goal was multimodality: it is important to let the user decide which modality he or she wants to use. The user would have the possibility to change the color, model or texture of the room's furniture. One important aspect of this setup was that, since the modeled room would be an existing room, we would be able to analyse distance perception in the virtual environment.

3.2 CHANGE OF PLANS

We found it difficult to find free scanning software for the Kinect. The ones we found were very limited and required a lot of computational power, preferably computers with GPU acceleration. The laptops available could not perform these tasks efficiently. These difficulties led to a change of plans. Instead, we chose to use existing 3D models downloaded from an online source, TurboSquid. This solved part of the problem, and we were able to focus on functionality. Another difference is that the Wii Remote was not used for navigation, mostly due to lack of time. Navigation is instead done with the keyboard: the keys W, A, S and D for walking in each direction, and Q and E for rotating.

3.3 FINAL IMPLEMENTATION

The 3D models were imported into Unity3D, where they were put together into a virtual environment. Unity3D is a well-known game engine. It makes it easier to implement physics (such as collision and gravity) into the game, so the developers don't have to code everything from scratch. The application uses the Microsoft Speech API for the voice commands. There are currently four voice commands that can be used: select, bedroom, kitchen, and close.
Their functionality is self-explanatory. Environments created in Unity can be visualized with the help of the Oculus Rift. This is done by attaching two cameras (one for each eye) to the player character. The player character is just an invisible capsule-shaped collider, which prevents it from walking through the other physical objects in the room.

4 RESULTS

Here are the results from the data collected in the evaluation, together with snapshots of the final product.

4.1 APPLICATION

Below are some snapshots from the application. The end product consisted of two rooms in which the user could navigate using standard controls and furnish the room through both standard controls and voice recognition. A full video presentation of the application is available in the reference section [7].

4.2 EVALUATION

Below are the results from the survey evaluation. Each diagram corresponds to one of the questions in the survey. The total number of participants was five, and the participants were instructed to answer the survey directly after they had tried out the application. As seen in the first question, none of the participants experienced any nausea or motion sickness. The participants also consistently felt that navigating the menu through head tracking worked well. All participants felt that the combination of Oculus Rift head tracking and standard controls for navigation also worked well. Furthermore, 40 % of the participants felt that voice worked as a natural modality to interact with

the program, while 60 % felt unsure. In the question regarding which modality the participants preferred to use, standard controls or voice recognition, 60 % preferred standard controls while 40 % favoured voice recognition for interaction with the program.

5 DISCUSSION

This project did not put much emphasis on the evaluation, but focused more on implementing a multimodal program. Nevertheless, we did a smaller evaluation to get some insight into the feel of the program and how the different modalities worked with each other.

None of our participants felt motion sick during testing, although this is normally a big problem with HMDs. We used the latest Oculus Rift (DK2), which has a high resolution and field of view, both of which have been shown to decrease motion sickness, so this could be one factor in why no one felt motion sick. Participants tested the program for only about two to three minutes, and being in a calm environment could also be a factor in why no one felt motion sick.

Navigation in the menu using head tracking worked well for all participants, which was somewhat expected; a lot of current Oculus Rift implementations use this technique. Navigating the room with the Oculus Rift and the keyboard also worked well for all participants, but would probably not work as well in more complex scenarios. The Oculus Rift occupies your eyes, so finding keys on the keyboard is a big problem for someone who is not used to a keyboard. Because of this, we only added one button for clicking things, and our initial idea of integrating the program with a Wii Remote or similar controller would probably have been the better option, since it is easier to find buttons on such a controller than on the keyboard without looking. Also, if you remove your hand from the designated control area (around the WASD keys), you would probably need to remove your headgear to find the location again; this is not a problem with a Wii Remote.

Most participants also preferred the standard control (the keyboard) over voice input. This could be because of our scenario, where clicking something could be done with either the keyboard or voice. Saying select does not feel as natural as saying, for example, change texture or change color to red. When we used too many voice inputs, the Windows speech service recognized them poorly, which resulted in us keeping only four: select, kitchen, bedroom and close. Another factor in why the keyboard was preferred could be that choosing something in a menu by clicking the boxes feels natural with a keyboard and less natural by voice, though in another scenario the feeling could be reversed. Movement was also done with the keyboard, so the participants were already using it before entering a menu. Lastly, most participants did not know whether voice felt like a natural modality for interacting with the program, while the rest felt it was natural. This is again probably because of our specific scenario: you already use the keyboard for navigation, and clicking in a menu feels more natural with a keyboard than with voice because that is how we normally do it. Having more natural voice commands, or increasing the complexity of what the user can do in the environment, would probably make voice feel more natural for interacting with the program.

During the course of this project we hit some bumps in the road, but nothing we couldn't manage to solve. Initially, it was hard for us to figure out what kind of project we wanted to do, and to help narrow it down we started by choosing which technology we wanted to use. Early on, we knew that we wanted to work with the Oculus Rift in some kind of environment.
We had thought about re-creating a real environment and then comparing it with the real one, thus also having the opportunity to get some valuable human perception data from the survey. We did some research and had a meeting with our supervisor Chris, after which we decided to try to integrate a Kinect into our project, to make a virtual environment from a real one with the help of the Kinect's built-in depth sensor. After much research and experimentation, we decided to skip these parts: we would not have had time to build an environment and still focus on the multimodality aspects, which should be at the center of the project. We had also intended to make all models ourselves in Maya, but decided to skip this too because of the project's time limit, as well as the Wii Remote integration, using a keyboard instead. We were very optimistic from the start about what we wanted to accomplish and had to scale it down as the project progressed. Nevertheless, what we accomplished in the end is something we're proud of. We worked with tools and environments we hadn't worked with before and successfully created something of value. We got to create something in Unity3D with keyboard, voice and head motion tracking as inputs, and have learnt a great deal about these techniques.

In addition to being optimistic about the goals of the project and pressed for time, we encountered another unforeseen problem. We had talked with another group that had access to the Visualization Studio during the winter holiday, so that we could also have access while it was closed. But the member with the valid passcard went on holiday, and neither we nor the rest of their group could get access to the Visualization Studio during these two weeks. Only

having access to an Oculus Rift in the studio, we had to do most of the work without one and then, in the last couple of days, integrate it and try it out with an Oculus Rift. This left us very little time to polish the program beyond our intended functionality.

What worked well within the group was our cooperation. We met up when we needed to and talked on either Facebook or Skype to update each other on progress and goals. It was never a problem for us to change the goals of the project when needed. For a small group, these kinds of quick changes are easier to carry out than for a bigger group, so another strength was our ability to be agile.

It's hard to come up with things we would have done differently if given the opportunity, but one would probably be to have a backup plan for access to the Visualization Studio. We lost valuable testing time with the Oculus Rift because of the two weeks without access during the winter holiday. If we had had a functioning program before the last couple of days before submitting the report, we would also have had more time to evaluate the program and test it on more people. Another thing, in hindsight, would have been to spend less time on research and more time on brainstorming and coming up with ideas in the beginning. We lost a lot of time researching and trying out technologies we didn't use in the end. We used up a couple of days trying out the Kinect, researching its functionality and how to reconstruct real environments. We did learn a lot in this area, but because of the project's time limit, it took time from the rest of the work. We did spend time brainstorming, but after we had come up with somewhat of an idea we pursued it instead of coming up with more ideas to use as backups. We went from re-creating a shopping mall, to re-creating the Visualization Studio, to re-creating a group member's house, and then decided to skip constructing a virtual environment from a real one.
Instead of going from one idea to the next, we could have worked on several ideas in parallel, so that we wouldn't waste as much time on each one. When researching environments and ideas for the Oculus Rift, we found that a lot of similar work is under development. The Oculus Rift will probably be available commercially this year, and many companies want to have products usable with it. We noticed that our idea to re-create a real environment from a house and then use it for house viewings or re-furnishing isn't unique; products and programs like these are under development. It was fun to work on something so new, and to discuss and come up with ideas for the potential obstacles these environments and technologies struggle with.

Probably the biggest thing we'll take with us from this project is the insight into many different technologies and platforms. We had barely worked with any of the tools and technologies before, so we learnt a lot about them. If we were to continue our work, we would try to integrate some sort of reconstruction technique for real environments, to make our program more up to date and usable in scenarios like house viewings and re-furnishing. We would also add functionality and more natural voice commands, and research a better voice recognition service. Most, if not all, of our weaknesses could be addressed if we had more time to work on them, but some require a lot more time; working with reconstruction of real environments, for example, would be a project in itself. This being a multimodality course, we focused as much as possible on the multimodal aspects rather than on making our program usable as a commercial product or on getting good data from a human perception point of view.

6 CONCLUSION

We are certain that virtual environments will become increasingly popular in the future. In order to increase usability, developers need to focus on multimodality. We have tried to do so by letting the user decide between voice recognition and a keyboard, both of which could accomplish the same tasks in the virtual environment. One important detail to keep in mind is that voice recognition (or at least the Microsoft Speech API) is not very precise at times and can return a lot of false positives. These limitations led to us having only four voice commands. This is something we could have worked on if we had had more time, maybe by changing the voice recognition API.

We are all very happy with the result. It was fun developing and testing the virtual environment for the Oculus Rift, and we all agree that our product is something that could be useful for a lot of people if we continued working on it.

REFERENCES

[1] P. R. Cohen and S. L. Oviatt. The role of voice input for human-machine interaction.
[2] B. Shneiderman. The limits of speech recognition. Communications of the ACM, Vol. 43, No. 9.
[3] M. K. Young, G. B. Gaylor, S. M. Andrus and B. Bodenheimer. A Comparison of Two Cost-Differentiated Virtual Reality Systems for Perception and Action Tasks.
[4] S. Davis, K. Nesbitt and E. Nalivaiko. A Systematic Review of Cybersickness.
[5] P. Willemsen and A. A. Gooch. Perceived Egocentric Distance in Real, Image-based and Traditional Virtual Environments.
[6] K. Khoshelham and S. O. Elberink. Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications.
[7] Video presentation of the program.


More information

Virtual Universe Pro. Player Player 2018 for Virtual Universe Pro

Virtual Universe Pro. Player Player 2018 for Virtual Universe Pro Virtual Universe Pro Player 2018 1 Main concept The 2018 player for Virtual Universe Pro allows you to generate and use interactive views for screens or virtual reality headsets. The 2018 player is "hybrid",

More information

In the end, the code and tips in this document could be used to create any type of camera.

In the end, the code and tips in this document could be used to create any type of camera. Overview The Adventure Camera & Rig is a multi-behavior camera built specifically for quality 3 rd Person Action/Adventure games. Use it as a basis for your custom camera system or out-of-the-box to kick

More information

Arcaid: Addressing Situation Awareness and Simulator Sickness in a Virtual Reality Pac-Man Game

Arcaid: Addressing Situation Awareness and Simulator Sickness in a Virtual Reality Pac-Man Game Arcaid: Addressing Situation Awareness and Simulator Sickness in a Virtual Reality Pac-Man Game Daniel Clarke 9dwc@queensu.ca Graham McGregor graham.mcgregor@queensu.ca Brianna Rubin 11br21@queensu.ca

More information

The purpose of this document is to outline the structure and tools that come with FPS Control.

The purpose of this document is to outline the structure and tools that come with FPS Control. FPS Control beta 4.1 Reference Manual Purpose The purpose of this document is to outline the structure and tools that come with FPS Control. Required Software FPS Control Beta4 uses Unity 4. You can download

More information

User Interface Software Projects

User Interface Software Projects User Interface Software Projects Assoc. Professor Donald J. Patterson INF 134 Winter 2012 The author of this work license copyright to it according to the Creative Commons Attribution-Noncommercial-Share

More information

Classifying 3D Input Devices

Classifying 3D Input Devices IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu Motivation The mouse and keyboard

More information

Students: Bar Uliel, Moran Nisan,Sapir Mordoch Supervisors: Yaron Honen,Boaz Sternfeld

Students: Bar Uliel, Moran Nisan,Sapir Mordoch Supervisors: Yaron Honen,Boaz Sternfeld Students: Bar Uliel, Moran Nisan,Sapir Mordoch Supervisors: Yaron Honen,Boaz Sternfeld Table of contents Background Development Environment and system Application Overview Challenges Background We developed

More information

Macquarie University Introductory Unity3D Workshop

Macquarie University Introductory Unity3D Workshop Overview Macquarie University Introductory Unity3D Workshop Unity3D - is a commercial game development environment used by many studios who publish on iphone, Android, PC/Mac and the consoles (i.e. Wii,

More information

VR/AR Concepts in Architecture And Available Tools

VR/AR Concepts in Architecture And Available Tools VR/AR Concepts in Architecture And Available Tools Peter Kán Interactive Media Systems Group Institute of Software Technology and Interactive Systems TU Wien Outline 1. What can you do with virtual reality

More information

Heads up interaction: glasgow university multimodal research. Eve Hoggan

Heads up interaction: glasgow university multimodal research. Eve Hoggan Heads up interaction: glasgow university multimodal research Eve Hoggan www.tactons.org multimodal interaction Multimodal Interaction Group Key area of work is Multimodality A more human way to work Not

More information

What was the first gestural interface?

What was the first gestural interface? stanford hci group / cs247 Human-Computer Interaction Design Studio What was the first gestural interface? 15 January 2013 http://cs247.stanford.edu Theremin Myron Krueger 1 Myron Krueger There were things

More information

Development Outcome 2

Development Outcome 2 Computer Games: F917 10/11/12 F917 10/11/12 Page 1 Contents Games Design Brief 3 Game Design Document... 5 Creating a Game in Scratch... 6 Adding Assets... 6 Altering a Game in Scratch... 7 If statement...

More information

VIRTUAL MUSEUM BETA 1 INTRODUCTION MINIMUM REQUIREMENTS WHAT DOES BETA 1 MEAN? CASTLEFORD TIGERS HERITAGE PROJECT

VIRTUAL MUSEUM BETA 1 INTRODUCTION MINIMUM REQUIREMENTS WHAT DOES BETA 1 MEAN? CASTLEFORD TIGERS HERITAGE PROJECT CASTLEFORD TIGERS HERITAGE PROJECT VIRTUAL MUSEUM BETA 1 INTRODUCTION The Castleford Tigers Virtual Museum is an interactive 3D environment containing a celebratory showcase of material gathered throughout

More information

Image Manipulation Unit 34. Chantelle Bennett

Image Manipulation Unit 34. Chantelle Bennett Image Manipulation Unit 34 Chantelle Bennett I believe that this image was taken several times to get this image. I also believe that the image was rotated to make it look like there is a dead end at

More information

Oculus Rift Getting Started Guide

Oculus Rift Getting Started Guide Oculus Rift Getting Started Guide Version 1.23 2 Introduction Oculus Rift Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC.

More information

Virtual Reality Game using Oculus Rift

Virtual Reality Game using Oculus Rift CN1 Final Report Virtual Reality Game using Oculus Rift Group Members Chatpol Akkawattanakul (5422792135) Photpinit Kalayanuwatchai (5422770669) Advisor: Dr. Cholwich Nattee Dr. Nirattaya Khamsemanan School

More information

Active Stereo Vision. COMP 4102A Winter 2014 Gerhard Roth Version 1

Active Stereo Vision. COMP 4102A Winter 2014 Gerhard Roth Version 1 Active Stereo Vision COMP 4102A Winter 2014 Gerhard Roth Version 1 Why active sensors? Project our own texture using light (usually laser) This simplifies correspondence problem (much easier) Pluses Can

More information

Essential Step Number 4 Hi this is AJ and welcome to Step Number 4, the fourth essential step for change and leadership. And, of course, the fourth free webinar for you. Alright, so you ve learned Steps

More information

First Tutorial Orange Group

First Tutorial Orange Group First Tutorial Orange Group The first video is of students working together on a mechanics tutorial. Boxed below are the questions they re discussing: discuss these with your partners group before we watch

More information

Easy Input For Gear VR Documentation. Table of Contents

Easy Input For Gear VR Documentation. Table of Contents Easy Input For Gear VR Documentation Table of Contents Setup Prerequisites Fresh Scene from Scratch In Editor Keyboard/Mouse Mappings Using Model from Oculus SDK Components Easy Input Helper Pointers Standard

More information

Keywords: Innovative games-based learning, Virtual worlds, Perspective taking, Mental rotation.

Keywords: Innovative games-based learning, Virtual worlds, Perspective taking, Mental rotation. Immersive vs Desktop Virtual Reality in Game Based Learning Laura Freina 1, Andrea Canessa 2 1 CNR-ITD, Genova, Italy 2 BioLab - DIBRIS - Università degli Studi di Genova, Italy freina@itd.cnr.it andrea.canessa@unige.it

More information

LOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR

LOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR LOOKING AHEAD: UE4 VR Roadmap Nick Whiting Technical Director VR / AR HEADLINE AND IMAGE LAYOUT RECENT DEVELOPMENTS RECENT DEVELOPMENTS At Epic, we drive our engine development by creating content. We

More information

Oculus Rift Getting Started Guide

Oculus Rift Getting Started Guide Oculus Rift Getting Started Guide Version 1.7.0 2 Introduction Oculus Rift Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC.

More information

Chapter 1 Virtual World Fundamentals

Chapter 1 Virtual World Fundamentals Chapter 1 Virtual World Fundamentals 1.0 What Is A Virtual World? {Definition} Virtual: to exist in effect, though not in actual fact. You are probably familiar with arcade games such as pinball and target

More information

Trial code included!

Trial code included! The official guide Trial code included! 1st Edition (Nov. 2018) Ready to become a Pro? We re so happy that you ve decided to join our growing community of professional educators and CoSpaces Edu experts!

More information

Chapter 2 Understanding and Conceptualizing Interaction. Anna Loparev Intro HCI University of Rochester 01/29/2013. Problem space

Chapter 2 Understanding and Conceptualizing Interaction. Anna Loparev Intro HCI University of Rochester 01/29/2013. Problem space Chapter 2 Understanding and Conceptualizing Interaction Anna Loparev Intro HCI University of Rochester 01/29/2013 1 Problem space Concepts and facts relevant to the problem Users Current UX Technology

More information

University of California, Santa Barbara. CS189 Fall 17 Capstone. VR Telemedicine. Product Requirement Documentation

University of California, Santa Barbara. CS189 Fall 17 Capstone. VR Telemedicine. Product Requirement Documentation University of California, Santa Barbara CS189 Fall 17 Capstone VR Telemedicine Product Requirement Documentation Jinfa Zhu Kenneth Chan Shouzhi Wan Xiaohe He Yuanqi Li Supervised by Ole Eichhorn Helen

More information

TOUCH & FEEL VIRTUAL REALITY. DEVELOPMENT KIT - VERSION NOVEMBER 2017

TOUCH & FEEL VIRTUAL REALITY. DEVELOPMENT KIT - VERSION NOVEMBER 2017 TOUCH & FEEL VIRTUAL REALITY DEVELOPMENT KIT - VERSION 1.1 - NOVEMBER 2017 www.neurodigital.es Minimum System Specs Operating System Windows 8.1 or newer Processor AMD Phenom II or Intel Core i3 processor

More information

understanding sensors

understanding sensors The LEGO MINDSTORMS EV3 set includes three types of sensors: Touch, Color, and Infrared. You can use these sensors to make your robot respond to its environment. For example, you can program your robot

More information

Picks. Pick your inspiration. Addison Leong Joanne Jang Katherine Liu SunMi Lee Development Team manager Design User testing

Picks. Pick your inspiration. Addison Leong Joanne Jang Katherine Liu SunMi Lee Development Team manager Design User testing Picks Pick your inspiration Addison Leong Joanne Jang Katherine Liu SunMi Lee Development Team manager Design User testing Introduction Mission Statement / Problem and Solution Overview Picks is a mobile-based

More information

Controlling Viewpoint from Markerless Head Tracking in an Immersive Ball Game Using a Commodity Depth Based Camera

Controlling Viewpoint from Markerless Head Tracking in an Immersive Ball Game Using a Commodity Depth Based Camera The 15th IEEE/ACM International Symposium on Distributed Simulation and Real Time Applications Controlling Viewpoint from Markerless Head Tracking in an Immersive Ball Game Using a Commodity Depth Based

More information

Overall approach, including resources required. Session Goals

Overall approach, including resources required. Session Goals Participants Method Date Session Numbers Who (characteristics of your play-tester) Overall approach, including resources required Session Goals What to measure How to test How to Analyse 24/04/17 1 3 Lachlan

More information

A Kinect-based 3D hand-gesture interface for 3D databases

A Kinect-based 3D hand-gesture interface for 3D databases A Kinect-based 3D hand-gesture interface for 3D databases Abstract. The use of natural interfaces improves significantly aspects related to human-computer interaction and consequently the productivity

More information

DESIGN A SHOOTING STYLE GAME IN FLASH 8

DESIGN A SHOOTING STYLE GAME IN FLASH 8 DESIGN A SHOOTING STYLE GAME IN FLASH 8 In this tutorial, you will learn how to make a basic arcade style shooting game in Flash 8. An example of the type of game you will create is the game Mozzie Blitz

More information

SPIDERMAN VR. Adam Elgressy and Dmitry Vlasenko

SPIDERMAN VR. Adam Elgressy and Dmitry Vlasenko SPIDERMAN VR Adam Elgressy and Dmitry Vlasenko Supervisors: Boaz Sternfeld and Yaron Honen Submission Date: 09/01/2019 Contents Who We Are:... 2 Abstract:... 2 Previous Work:... 3 Tangent Systems & Development

More information

CS 354R: Computer Game Technology

CS 354R: Computer Game Technology CS 354R: Computer Game Technology http://www.cs.utexas.edu/~theshark/courses/cs354r/ Fall 2017 Instructor and TAs Instructor: Sarah Abraham theshark@cs.utexas.edu GDC 5.420 Office Hours: MW4:00-6:00pm

More information

Introduction To Immersive Virtual Environments (aka Virtual Reality) Scott Kuhl Michigan Tech

Introduction To Immersive Virtual Environments (aka Virtual Reality) Scott Kuhl Michigan Tech Introduction To Immersive Virtual Environments (aka Virtual Reality) Scott Kuhl Michigan Tech Hobbies: Photography Hobbies: Biking Two summers ago: 120 miles over 5 days in rural NW Ireland Last summer:

More information

interactive laboratory

interactive laboratory interactive laboratory ABOUT US 360 The first in Kazakhstan, who started working with VR technologies Over 3 years of experience in the area of virtual reality Completed 7 large innovative projects 12

More information

Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment

Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment Helmut Schrom-Feiertag 1, Christoph Schinko 2, Volker Settgast 3, and Stefan Seer 1 1 Austrian

More information

In this project you ll learn how to create a platform game, in which you have to dodge the moving balls and reach the end of the level.

In this project you ll learn how to create a platform game, in which you have to dodge the moving balls and reach the end of the level. Dodgeball Introduction In this project you ll learn how to create a platform game, in which you have to dodge the moving balls and reach the end of the level. Step 1: Character movement Let s start by

More information

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7

More information

A New Simulator for Botball Robots

A New Simulator for Botball Robots A New Simulator for Botball Robots Stephen Carlson Montgomery Blair High School (Lockheed Martin Exploring Post 10-0162) 1 Introduction A New Simulator for Botball Robots Simulation is important when designing

More information

Revision for Grade 6 in Unit #1 Design & Technology Subject Your Name:... Grade 6/

Revision for Grade 6 in Unit #1 Design & Technology Subject Your Name:... Grade 6/ Your Name:.... Grade 6/ SECTION 1 Matching :Match the terms with its explanations. Write the matching letter in the correct box. The first one has been done for you. (1 mark each) Term Explanation 1. Gameplay

More information

M-16DX 16-Channel Digital Mixer

M-16DX 16-Channel Digital Mixer M-16DX 16-Channel Digital Mixer Workshop Using the M-16DX with a DAW 2007 Roland Corporation U.S. All rights reserved. No part of this publication may be reproduced in any form without the written permission

More information

What My Content Was Like Four Years Ago

What My Content Was Like Four Years Ago It s challenging to create content that gets people to take action. However, tons of creators are publishing content every day or every week that helps/entertains people, and they are making a living off

More information

Software Requirements Specification

Software Requirements Specification ÇANKAYA UNIVERSITY Software Requirements Specification Simulacrum: Simulated Virtual Reality for Emergency Medical Intervention in Battle Field Conditions Sedanur DOĞAN-201211020, Nesil MEŞURHAN-201211037,

More information

Input devices and interaction. Ruth Aylett

Input devices and interaction. Ruth Aylett Input devices and interaction Ruth Aylett Contents Tracking What is available Devices Gloves, 6 DOF mouse, WiiMote Why is it important? Interaction is basic to VEs We defined them as interactive in real-time

More information

ncloth Simulation in Maya ncloth Simulation Report Cart 434 Advance 3D Studio By Umer Usman Instructor: Stephan Menzies

ncloth Simulation in Maya ncloth Simulation Report Cart 434 Advance 3D Studio By Umer Usman Instructor: Stephan Menzies 1 ncloth Simulation in Maya ncloth Simulation Report Cart 434 Advance 3D Studio By Umer Usman Instructor: Stephan Menzies 2 Index 1. Abstract 2. Introduction 3. Research 4. Tests a. Adding Backplate b.

More information

Oculus Rift Development Kit 2

Oculus Rift Development Kit 2 Oculus Rift Development Kit 2 Sam Clow TWR 2009 11/24/2014 Executive Summary This document will introduce developers to the Oculus Rift Development Kit 2. It is clear that virtual reality is the future

More information

Advancements in Gesture Recognition Technology

Advancements in Gesture Recognition Technology IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka

More information

Orbital Delivery Service

Orbital Delivery Service Orbital Delivery Service Michael Krcmarik Andrew Rodman Project Description 1 Orbital Delivery Service is a 2D moon lander style game where the player must land a cargo ship on various worlds at the intended

More information

Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces

Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Huidong Bai The HIT Lab NZ, University of Canterbury, Christchurch, 8041 New Zealand huidong.bai@pg.canterbury.ac.nz Lei

More information

VR-Plugin. for Autodesk Maya.

VR-Plugin. for Autodesk Maya. VR-Plugin for Autodesk Maya 1 1 1. Licensing process Licensing... 3 2 2. Quick start Quick start... 4 3 3. Rendering Rendering... 10 4 4. Optimize performance Optimize performance... 11 5 5. Troubleshooting

More information

Requirements Specification. An MMORPG Game Using Oculus Rift

Requirements Specification. An MMORPG Game Using Oculus Rift 1 System Description CN1 An MMORPG Game Using Oculus Rift The project Game using Oculus Rift is the game application based on Microsoft Windows that allows user to play the game with the virtual reality

More information

School Based Projects

School Based Projects Welcome to the Week One lesson. School Based Projects Who is this lesson for? If you're a high school, university or college student, or you're taking a well defined course, maybe you're going to your

More information

PRINTING & SHARING IMAGES IN LIGHTROOM

PRINTING & SHARING IMAGES IN LIGHTROOM Photzy PRINTING & SHARING IMAGES IN LIGHTROOM Quick Guide Written by Kent DuFault PRINTING & SHARING IMAGES IN LIGHTROOM // PHOTZY.COM 1 Photzy recently received this email from one of our followers: I

More information

Shopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction

Shopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction Shopping Together: A Remote Co-shopping System Utilizing Spatial Gesture Interaction Minghao Cai 1(B), Soh Masuko 2, and Jiro Tanaka 1 1 Waseda University, Kitakyushu, Japan mhcai@toki.waseda.jp, jiro@aoni.waseda.jp

More information

PASS IT ON PROBLEM SOLUTION OVERVIEW INTERFACE DESIGNS

PASS IT ON PROBLEM SOLUTION OVERVIEW INTERFACE DESIGNS PASS IT ON Alistair Inglis: Designer & User Testing Haley Sayres: Manager & Documentation Rebecca Wang: Developer & User Testing Thomas Zhao: Developer & User Testing Pass It On: Improve offline interactions.

More information

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Joan De Boeck, Karin Coninx Expertise Center for Digital Media Limburgs Universitair Centrum Wetenschapspark 2, B-3590 Diepenbeek, Belgium

More information

Apple ARKit Overview. 1. Purpose. 2. Apple ARKit. 2.1 Overview. 2.2 Functions

Apple ARKit Overview. 1. Purpose. 2. Apple ARKit. 2.1 Overview. 2.2 Functions Apple ARKit Overview 1. Purpose In the 2017 Apple Worldwide Developers Conference, Apple announced a tool called ARKit, which provides advanced augmented reality capabilities on ios. Augmented reality

More information

IMGD 4000 Technical Game Development II Interaction and Immersion

IMGD 4000 Technical Game Development II Interaction and Immersion IMGD 4000 Technical Game Development II Interaction and Immersion Robert W. Lindeman Associate Professor Human Interaction in Virtual Environments (HIVE) Lab Department of Computer Science Worcester Polytechnic

More information

HCI Midterm Report CookTool The smart kitchen. 10/29/2010 University of Oslo Gautier DOUBLET ghdouble Marine MATHIEU - mgmathie

HCI Midterm Report CookTool The smart kitchen. 10/29/2010 University of Oslo Gautier DOUBLET ghdouble Marine MATHIEU - mgmathie HCI Midterm Report CookTool The smart kitchen 10/29/2010 University of Oslo Gautier DOUBLET ghdouble Marine MATHIEU - mgmathie Summary I. Agree on our goals (usability, experience and others)... 3 II.

More information

Quality of Experience for Virtual Reality: Methodologies, Research Testbeds and Evaluation Studies

Quality of Experience for Virtual Reality: Methodologies, Research Testbeds and Evaluation Studies Quality of Experience for Virtual Reality: Methodologies, Research Testbeds and Evaluation Studies Mirko Sužnjević, Maja Matijašević This work has been supported in part by Croatian Science Foundation

More information

Instruction Manual. Pangea Software, Inc. All Rights Reserved Enigmo is a trademark of Pangea Software, Inc.

Instruction Manual. Pangea Software, Inc. All Rights Reserved Enigmo is a trademark of Pangea Software, Inc. Instruction Manual Pangea Software, Inc. All Rights Reserved Enigmo is a trademark of Pangea Software, Inc. THE GOAL The goal in Enigmo is to use the various Bumpers and Slides to direct the falling liquid

More information

Virtual Reality I. Visual Imaging in the Electronic Age. Donald P. Greenberg November 9, 2017 Lecture #21

Virtual Reality I. Visual Imaging in the Electronic Age. Donald P. Greenberg November 9, 2017 Lecture #21 Virtual Reality I Visual Imaging in the Electronic Age Donald P. Greenberg November 9, 2017 Lecture #21 1968: Ivan Sutherland 1990s: HMDs, Henry Fuchs 2013: Google Glass History of Virtual Reality 2016:

More information

Easy Input Helper Documentation

Easy Input Helper Documentation Easy Input Helper Documentation Introduction Easy Input Helper makes supporting input for the new Apple TV a breeze. Whether you want support for the siri remote or mfi controllers, everything that is

More information

Until now, I have discussed the basics of setting

Until now, I have discussed the basics of setting Chapter 3: Shooting Modes for Still Images Until now, I have discussed the basics of setting up the camera for quick shots, using Intelligent Auto mode to take pictures with settings controlled mostly

More information

Seaman Risk List. Seaman Risk Mitigation. Miles Von Schriltz. Risk # 2: We may not be able to get the game to recognize voice commands accurately.

Seaman Risk List. Seaman Risk Mitigation. Miles Von Schriltz. Risk # 2: We may not be able to get the game to recognize voice commands accurately. Seaman Risk List Risk # 1: Taking care of Seaman may not be as fun as we think. Risk # 2: We may not be able to get the game to recognize voice commands accurately. Risk # 3: We might not have enough time

More information

Foreword Thank you for purchasing the Motion Controller!

Foreword Thank you for purchasing the Motion Controller! Foreword Thank you for purchasing the Motion Controller! I m an independent developer and your feedback and support really means a lot to me. Please don t ever hesitate to contact me if you have a question,

More information

Alternative Interfaces. Overview. Limitations of the Mac Interface. SMD157 Human-Computer Interaction Fall 2002

Alternative Interfaces. Overview. Limitations of the Mac Interface. SMD157 Human-Computer Interaction Fall 2002 INSTITUTIONEN FÖR SYSTEMTEKNIK LULEÅ TEKNISKA UNIVERSITET Alternative Interfaces SMD157 Human-Computer Interaction Fall 2002 Nov-27-03 SMD157, Alternate Interfaces 1 L Overview Limitation of the Mac interface

More information

Background - Too Little Control

Background - Too Little Control GameVR Demo - 3Duel Team Members: Jonathan Acevedo (acevedoj@uchicago.edu) & Tom Malitz (tmalitz@uchicago.edu) Platform: Android-GearVR Tools: Unity and Kinect Background - Too Little Control - The GearVR

More information

Network Institute Tech Labs

Network Institute Tech Labs Network Institute Tech Labs Newsletter Spring 2016 It s that time of the year again. A new Newsletter giving you some juicy details on exciting research going on in the Tech Labs. This year it s been really

More information

Blend Photos With Apply Image In Photoshop

Blend Photos With Apply Image In Photoshop Blend Photos With Apply Image In Photoshop Written by Steve Patterson. In this Photoshop tutorial, we re going to learn how easy it is to blend photostogether using Photoshop s Apply Image command to give

More information

What you see is not what you get. Grade Level: 3-12 Presentation time: minutes, depending on which activities are chosen

What you see is not what you get. Grade Level: 3-12 Presentation time: minutes, depending on which activities are chosen Optical Illusions What you see is not what you get The purpose of this lesson is to introduce students to basic principles of visual processing. Much of the lesson revolves around the use of visual illusions

More information

The 8 th International Scientific Conference elearning and software for Education Bucharest, April 26-27, / X

The 8 th International Scientific Conference elearning and software for Education Bucharest, April 26-27, / X The 8 th International Scientific Conference elearning and software for Education Bucharest, April 26-27, 2012 10.5682/2066-026X-12-103 DEVELOPMENT OF A NATURAL USER INTERFACE FOR INTUITIVE PRESENTATIONS

More information

A Guide to Virtual Reality for Social Good in the Classroom

A Guide to Virtual Reality for Social Good in the Classroom A Guide to Virtual Reality for Social Good in the Classroom Welcome to the future, or the beginning of a future where many things are possible. Virtual Reality (VR) is a new tool that is being researched

More information

Overview. The Game Idea

Overview. The Game Idea Page 1 of 19 Overview Even though GameMaker:Studio is easy to use, getting the hang of it can be a bit difficult at first, especially if you have had no prior experience of programming. This tutorial is

More information