Integration of jartoolkit and enjine: extending with AR the potential use of a didactic game engine

Fernando Tsuda, Paula M. Hokama, Thiago M. Rodrigues, João L. Bernardes Jr.
Escola Politécnica da Universidade de São Paulo, São Paulo, SP
{fernando.tsuda, paula.hokama, thiagomoreira13}@gmail.com, joao.bernardes@poli.usp.br

Figure 1. The "We A.R. Dancin'" Game

ABSTRACT
This paper describes the integration of a didactic game engine (enjine) and an Augmented Reality API (jartoolkit) to increase the potential uses of the engine both as a teaching tool and as a testbed for new technologies, in this case low-cost AR games. The tools, the methodology and the implemented solution are analyzed and several tests are described, showing results regarding the feasibility and playability of such a system.

Keywords
Augmented Reality, Electronic Games, Game Engines, jartoolkit.

1. INTRODUCTION
Augmented Reality (AR) is a technology that allows the combination of real and virtual elements in a scene, with which the user can interact in real time with the use of 3D registration [4]. To make this interaction possible, high-cost equipment is commonly used, such as HMDs (Head Mounted Displays) to show the scene with virtual objects and position sensors, such as GPS (Global Positioning System) receivers or magnetic sensors, that indicate the user's location in the environment. However, it is also possible to use AR with low-cost devices such as conventional video monitors, webcams, PDAs or cell phones. In this case, the user is often detected in the environment with the use of computer vision techniques. This technology can be used in several applications, such as medicine, building and equipment maintenance, or the entertainment area in general. This work's goal fits in this last category, specifically the area of electronic games using AR, whose development can be facilitated by integrating a game engine with an AR library. In this work, the term Augmented Reality is used to designate any system that satisfies Azuma's definition [4], without the distinctions presented by Milgram [11].
The electronic games area has considerable social, economic and technological importance. In the social area, games allow the interaction of different people even in distant locations, a fact strengthened by the growth of multiplayer online games where thousands of players interact in a common virtual environment. Augmented Reality allows a richer interaction between these people, either by introducing the real image of the player in the virtual environment or by augmenting the real world with virtual elements of the game. In the economic area, the worldwide electronic games segment has, in the past years, generated yearly revenues of approximately US$ 26 billion, exceeding the profit from ticket sales by the Hollywood movie industry [8]. And in the technological area, games motivate the creation of new hardware, software and architectures capable of dealing with the increase in their complexity that occurs due to the use, for instance, of higher definition graphics and sound, which strengthens the realistic aspect of current games. As examples of technological advances in hardware greatly motivated by games, the increase of GPU (Graphics Processing Unit) capabilities and the creation of a unit responsible for processing the physics calculations inside a game (called PPU, or Physics Processing Unit) can be mentioned. Games are also one of the most interesting areas in which to explore Augmented Reality.
Not only do games usually not demand the high precision required in medical or maintenance areas, for instance, they also benefit from AR through the development of games with original playability and greater player immersion. Combining games and AR not only allows better (more natural, for instance) interfaces with the player but also stimulates the use of AR technology, so that it can be more easily popularized and used in other areas, including applications that require more precision.

The ludic characteristic of games facilitates the acceptance and testing of innovative user interfaces, which may still be in prototype stages, much like many AR systems [17]. Electronic games also have a great potential for application in the didactic area, either in the form of educational games or through the development of games as a pedagogical strategy. This latter paradigm has been successfully applied at Escola Politécnica da USP to teach Computer Graphics courses [20]. To this end, a didactic game engine called enjine [12] was developed, based on Java and the Java 3D graphics API. Currently, enjine has a secondary objective, aside from its didactic use: to serve as a test platform for new technology in games. Its simple and structured architecture aids in both objectives. However, like most existing game engines, enjine does not offer Augmented Reality resources. An alternative to develop AR games with enjine is to use the jartoolkit library [7]. Integrating these two tools, however, is not a simple task, since the architecture of one ignores the other, generating conflicts in the access to graphics resources and difficulties in sharing data between them. Thus the work presented here has as its objective the integration of jartoolkit and enjine, to simplify the development of games that use the concept of Augmented Reality. This project brings two major contributions, each related to one of enjine's main objectives. The first is to increase the potential of didactic applications of enjine, for instance to teach concepts of Augmented Reality and Image Processing, subjects that can be part of an undergraduate Computer Graphics course or even constitute postgraduate courses by themselves, as is the case at Escola Politécnica. The other is to supply a framework that facilitates the exploration of new alternatives to create low-cost AR games, increasing enjine's potential as a test platform for new technologies in games (in this case, AR). To test the integration of these tools and at the same time build a proof of concept of the viability of low-cost AR games, some prototypes were developed, among which the dance game We A.R. Dancin' (some images of this game are shown in Figure 1) was the most elaborate and deserves a more detailed analysis in the next sections. In the following sections some similar works and the tools used in this project will be detailed. After that, the specification and methodology used in the integration of enjine with jartoolkit will be discussed. Finally, the results obtained will be presented and the We A.R. Dancin' game will be described in more detail.

2. RELATED WORK
One of the aspects that makes the entry of new consumers in the games market difficult is the growing complexity of handling traditional input devices [16]. Games have an increasingly larger quantity of commands to input through a keyboard or a joystick with a great number of buttons (currently 8 action buttons and 3 directional buttons), whose functions the player must learn and memorize for each game [9]. Thus, new forms of input have been used as an attempt to create a more instinctive playability.
Some examples from the game industry can be mentioned: controllers that look like musical instruments, such as electric guitars or bongo drums; carpets on which the user performs a "dance", controlling the game with his feet; cameras such as the EyeToy, an accessory for Sony's PlayStation console with which the user plays by making moves in front of the camera; and the WiiMote, the Nintendo Wii controller, a recent attempt by Nintendo to attract new players through an innovative way to play, in which the controller's movement is detected and converted into a corresponding game action. It must also be mentioned that the EyeToy can be used in low-cost AR games, in a way similar to the work presented in this text, since, aside from being capable of detecting movements, it can also capture the real image of the user and insert it in the virtual environment of the game using only the camera, the TV and the console. Figure 2 shows the EyeToy and the WiiMote.

Figure 2. The EyeToy, a camera for Sony's PlayStation console, and the WiiMote, the Nintendo Wii's controller

Most academic works resembling what is proposed here for Augmented Reality games have a peculiarity: their architectures are built around the requirements of AR first, and the characteristics of games come in second place, if at all. In this project, the starting point was the game engine, enjine, and AR is treated as another form of input (exploring enjine's abstract input layer) and output (integrating the video stream from the camera with the virtual elements) for the game. An example of an architecture that also uses ARToolKit and has been used to develop Augmented Reality games, but is actually a platform for general AR applications, is StudierStube [18]. Although initially this platform was limited to specific forms of data input and output, such as HMDs and a Personal Interaction Panel [15], it now supports a greater variety of interaction devices and the use of computer vision. It has even been adapted for use in mobile devices [22]. Another framework for developing games using AR, which supports the use of ARToolKit (used in the first application created with this library) and is even oriented to the development of educational games, is described by Santiago et al. [14]. This framework is limited to the development of adventure games with AR and uses the concept of hypermedia graphs to link game events and entities and represent their evolution. Focusing on AR games instead of the tools for their development, there are several works in the related literature. Bernardes et al. [5] list and try to classify some of them and also discuss tangible interfaces (where a specific physical object may be associated with each virtual object so that, by handling this physical counterpart directly, the user may handle the virtual object in an instinctive manner). Most AR games developed with ARToolKit use this kind of interface, with the markers acting as the tangible real elements. Among the games that use a conventional video monitor as display, two can be mentioned: a game that aids in learning Kanji [21] and CamBall [24], a Ping-Pong game where the marker is located at the paddles.

Another game where the markers are used as a tangible interface, augmented with the use of buttons, is a Chinese Checkers game [6], but it cannot be considered an AR game, since it consists only of virtual elements. Other games use the markers as non-tangible interfaces, simply to facilitate the identification of features of interest. An example is the Invisible Train [22], which uses StudierStube on a mobile device. Another example is GeFighters [19] (which, for the same reason as the Chinese Checkers game, cannot be considered an AR application, but which also uses fiducial markers and computer vision as the means of player interaction), a fighting game where the movements of markers attached to the player's hands control the avatar (while, curiously, foot movements control the fighting moves such as punches). GeFighters uses a platform to manage the input devices, called CIDA, which translates the actions from different devices into game actions, similarly to what happens in enjine's abstract input layer. The integration of enjine with jartoolkit allows the use of the markers as either tangible interfaces or features of interest; however, none of the prototypes developed for testing in this work uses markers as tangible interfaces yet.

3. SOFTWARE TOOLS
3.1 enjine
The enjine [10][12] is a game engine developed at the Interactive Technologies Laboratory (Interlab-USP) using the Java language and Java 3D. Its main goal is to serve as a didactic tool to aid in teaching subjects such as Computer Graphics and Software Engineering through the quick implementation of games. Thus, despite its simple structure (because the students have little time to learn it), enjine allows the creation of relatively complex games in short periods of time. A second purpose for enjine is to act as a test platform for new technologies in games, since its simple architecture has proven suitable for this function. This work contributes to both purposes, as will be discussed later. Conceptually, enjine can be divided into three layers, as shown in Figure 3. Classes from the Java API and other libraries such as Java 3D are located in the lowest tier. The middle layer contains several "core" enjine modules, responsible for services such as graphical rendering of objects, command input, sound output and other functions related to game management. The topmost layer contains a framework that makes the use of enjine's services for specific types of games more transparent to the programmer. The class SinglePlayerGame, for instance, facilitates the creation of games that will run on a common desktop computer.

Figure 3. enjine layers

enjine's different modules are implemented as packages, as shown in Figure 4. The Core package contains the Game, GameState and GameObject classes, which represent game entities with different characteristics and can aggregate the classes Updater, to update the object; Collidable, to detect collisions between objects; Viewable, responsible for the object's visual representation; and Messenger, to allow the exchange of messages between objects.

Figure 4. enjine packages

The IO package contains the Graphics, Input and Sound packages. The Graphics package is responsible for the rendering of game objects, using the Java 3D API. The Sound package is responsible for the basic sound functionality in the game.
The Input package handles user input in enjine and is based on an abstract layer that separates input devices from the corresponding game actions. Each device (such as a joystick, mouse or keyboard) is represented by a subclass of InputDevice, which has a series of InputSensor objects representing device elements such as a joystick's buttons or a keyboard's keys. To link these sensors to a game action (InputAction) such as "jump" or "move right", for instance, there is a class called InputManager. Figure 5 illustrates the relation between the classes that belong to enjine's Input module, and a small sketch of this abstraction is given below.

Figure 5. enjine input system
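To make this abstraction more concrete, the listing below gives a minimal, self-contained sketch of such an input layer. It only mirrors the concepts described above (devices, sensors, actions and a manager that binds them); the class and method names are illustrative assumptions and do not reproduce enjine's actual API.

import java.util.HashMap;
import java.util.Map;

// Self-contained sketch of an abstract input layer in the spirit of enjine's
// Input package. All names and signatures here are hypothetical illustrations,
// not enjine's real API.
public class InputLayerSketch {

    // Abstract game actions, independent of any physical device.
    enum InputAction { JUMP, MOVE_RIGHT }

    // A sensor represents one element of a device (a key, a button, an axis).
    interface InputSensor {
        boolean isPressed();
    }

    // Maps actions to sensors, so game logic only ever asks about actions.
    static class InputManager {
        private final Map<InputAction, InputSensor> bindings = new HashMap<>();

        void bind(InputAction action, InputSensor sensor) {
            bindings.put(action, sensor);
        }

        boolean isActive(InputAction action) {
            InputSensor sensor = bindings.get(action);
            return sensor != null && sensor.isPressed();
        }
    }

    public static void main(String[] args) {
        // Fake "keyboard key" sensors standing in for real device elements.
        InputSensor spaceKey = () -> true;   // pretend the key is held down
        InputSensor rightKey = () -> false;

        InputManager manager = new InputManager();
        manager.bind(InputAction.JUMP, spaceKey);
        manager.bind(InputAction.MOVE_RIGHT, rightKey);

        // The game loop queries actions, never the keyboard directly.
        System.out.println("JUMP active? " + manager.isActive(InputAction.JUMP));
        System.out.println("MOVE_RIGHT active? " + manager.isActive(InputAction.MOVE_RIGHT));
    }
}

The point of the layer is that game logic only queries actions, so a marker-based device can later be bound to the same actions without changing that logic.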

3.2 ARToolKit and jartoolkit
ARToolKit [2] is a library that provides functions to facilitate the development of Augmented Reality applications. It works based on computer vision techniques that calculate the real position and orientation of the camera in relation to fiducial markers, allowing programmers to show virtual objects on them. Figure 6 shows how a typical application using ARToolKit works.

Figure 6. Operation of an application developed with ARToolKit

However, the use of ARToolKit is restricted to users of the C and C++ languages. To allow the use of ARToolKit's functions from the Java language, jartoolkit was developed. It consists of classes that access ARToolKit's functions using JNI (the Java Native Interface). jartoolkit's architecture divides ARToolKit's functions into two main classes:
JARToolKit: the class that encapsulates the ARToolKit functions responsible for tracking. For the time being, only the basic functions are implemented, focusing on the Windows version of the library;
JARFrameGrabber: the class that encapsulates all the functions necessary to access the video input from the camera, using the DirectShow API.
The integration between the Augmented Reality functionality and Java 3D, in turn, is made through a component called "jartoolkit for Java 3D", with several classes that facilitate the inclusion of AR-related scene graph nodes in Java 3D. The main class, called JARToolKit3D, is responsible for the allocation and initialization of the main instances used, such as JARToolKit and JARFrameGrabber. Other important classes are ARPatternTransformGroup, which extends Java 3D's TransformGroup class, holds the marker information and represents the transform group that will be influenced by the marker's movement; and ARBehavior, which extends Java 3D's Behavior and is responsible for updating the images obtained from the camera, identifying the marker patterns and rendering virtual objects on them (the general shape of such a class is sketched at the end of this subsection).
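As an illustration of that shape, the sketch below shows a Java 3D Behavior that wakes up on every rendered frame, which is the standard pattern an ARBehavior-like class can follow. The jartoolkit-specific steps are only indicated as comments, since their exact calls are not reproduced here; this is not the actual ARBehavior code.

import java.util.Enumeration;
import javax.media.j3d.Behavior;
import javax.media.j3d.WakeupOnElapsedFrames;

// Generic sketch of a per-frame Java 3D Behavior, in the spirit of the
// ARBehavior class described in the text. The AR-specific work is left as
// comments because the jartoolkit calls are not reproduced here.
public class ARBehaviorSketch extends Behavior {

    private final WakeupOnElapsedFrames everyFrame = new WakeupOnElapsedFrames(0);

    @Override
    public void initialize() {
        // Schedule this behavior to run on every rendered frame.
        wakeupOn(everyFrame);
    }

    @Override
    public void processStimulus(Enumeration criteria) {
        // 1) Grab the next video frame (e.g., through a frame grabber object).
        // 2) Detect the registered marker patterns in that frame.
        // 3) Copy each marker's transformation into its transform group
        //    (the role played by ARPatternTransformGroup in the text).
        // 4) Update the background image shown behind the virtual objects.

        // Re-arm the wakeup condition so the behavior keeps running.
        wakeupOn(everyFrame);
    }
}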
4. METHODOLOGY
To develop this project, UML (the Unified Modeling Language) was used to model its components, making it possible to visualize the work's products and services with the use of standardized diagrams. The integration with jartoolkit gives enjine the capacity to recognize commands given through fiducial markers and a video camera, as well as to reproduce the video stream obtained from the camera, characteristics that allow the creation of games that use Augmented Reality technology. To specify this part of the project, it does not make sense to use regular use cases, because its functionality is not presented directly to actors but instead to the games that use the integrated platform. Thus, an abstract actor was created to represent the game that will be developed using enjine. Initially, it was necessary to study Augmented Reality libraries. Besides the already mentioned ARToolKit and jartoolkit, other available Augmented Reality libraries were studied, such as ARToolKit Plus [3] and ARTag [1], as well as Intel's computer vision library, OpenCV [13]. The choice of jartoolkit was based mainly on its implementation in the Java language, which facilitates the integration with enjine.
The study of the demo applications supplied with jartoolkit (such as the one shown in Figure 7) and of their respective source code showed that the position and orientation of a marker are stored in a 4x4 transformation matrix, with the first three rows and columns containing the information about the rotation (and about the scale too, but in this case it is not used) and the last column containing the marker's translation. This matrix is normally used to apply the same transformation to some virtual object, giving the user the sensation that the object is on the marker (a generic sketch of reading such a matrix is given after Figure 7).

Figure 7. Demo application of jartoolkit
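The listing below is that generic sketch (an illustration only, not jartoolkit code; a row-major matrix layout is assumed): it extracts the translation from the last column and applies the full transformation to a point of a virtual object, which is what makes the object appear attached to the marker.

// Generic sketch of interpreting a 4x4 marker matrix laid out in row-major
// order: the upper-left 3x3 block holds the rotation, the last column holds
// the marker translation.
public class MarkerMatrixSketch {

    // Translation (x, y, z) is simply the last column of the matrix.
    static double[] translation(double[][] m) {
        return new double[] { m[0][3], m[1][3], m[2][3] };
    }

    // Transforms a 3D point of the virtual object by the marker matrix,
    // so the object follows the marker's pose.
    static double[] transformPoint(double[][] m, double[] p) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            out[i] = m[i][0] * p[0] + m[i][1] * p[1] + m[i][2] * p[2] + m[i][3];
        }
        return out;
    }

    public static void main(String[] args) {
        // Example: a marker 100 mm to the right and 1600 mm in front of the
        // camera, with no rotation.
        double[][] m = {
            { 1, 0, 0,  100 },
            { 0, 1, 0,    0 },
            { 0, 0, 1, 1600 },
            { 0, 0, 0,    1 }
        };
        double[] t = translation(m);
        double[] corner = transformPoint(m, new double[] { 10, 10, 0 });
        System.out.printf("marker translation: (%.0f, %.0f, %.0f)%n", t[0], t[1], t[2]);
        System.out.printf("transformed corner: (%.0f, %.0f, %.0f)%n",
                corner[0], corner[1], corner[2]);
    }
}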

With the demo applications, some tests were carried out to validate the use of jartoolkit in games, measuring values such as the maximum distance between the markers and the camera and the maximum speed of marker movement that would still allow its position to be registered by the system. To aid in this task, the apparatus shown in Figure 8 was built: an electric motor pulls the marker by a string, at a controlled speed, in front of the camera, and the camera height can be adjusted. Despite some problems, described ahead, jartoolkit proved adequate for the project.

Figure 8. Apparatus mounted to validate jartoolkit

Regarding enjine, the study was focused on the classes located in the Input and Graphics packages.

5. ARCHITECTURE
The studies described earlier revealed the need to add to enjine's scene graph jartoolkit's branch graph responsible for detecting the marker in the scene, to allow the acquisition of the marker transformation matrix. Just adding that branch to enjine's scene graph, however, is not sufficient for the values within that matrix to be recognized as command inputs. Another need identified was the reproduction, inside the game, of the video stream obtained from the camera. These problems were solved with the creation of a middleware component, containing the following classes and functionalities:
ARInstance: the class that initializes and accesses the main functions of the JARToolKit3D class. Among its main functions are the registration of the markers that will be used in the game and the creation of the scene graph responsible for the capture and reading of the images obtained from the camera. Only one instance of this class is allowed to exist (it is a Singleton);
ARInput: the class that represents an input device based on the images obtained from the camera. It inherits from InputDevice. In this implementation, each marker is considered a distinct device and must have an ARPatternTransformGroup that will be added to the scene graph of the game, allowing its identification in the captured image. This class is also responsible for converting the values from the transformation matrix into the six commands that will be used in the game (translation along the x, y and z axes and rotation around those axes, representing, in this way, the six degrees of freedom of an object), as shown in Figure 9 (a sketch of this conversion is given after Figure 10);
ARTimerTask: extending Java's TimerTask class, it is responsible for getting the transformation matrix from the branch of the scene graph that recognizes the marker in the image and for transferring it to the ARInput device at intervals defined by the game's programmer;
ARDisplay: extending enjine's GameDisplay class, it was created to allow the reproduction of the video stream from the camera as a background and to add the scene graph that represents the AR devices, with the use of ARInstance methods.

Figure 9. Sequence of value transfer from the transformation matrix to the AR device

To facilitate the creation of Augmented Reality games, a class called ARGame was created inside the Framework package, inheriting from Game. It was developed to include the attributes and methods necessary to instantiate the interface between enjine and jartoolkit correctly and to make this task transparent to the programmer. Figure 10 shows a class diagram representing the classes of the middleware developed in this work and their relations to jartoolkit and enjine.

Figure 10. Middleware classes and their relations to enjine and jartoolkit
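The listing below sketches, under assumptions, the kind of conversion performed by ARInput: turning a marker's 4x4 matrix into six values, three translations and three rotations, which can then feed the corresponding input sensors. The matrix layout and the Z-Y-X angle decomposition are illustrative choices, not the project's actual implementation.

// Illustrative sketch, in the spirit of the ARInput class described above,
// of converting a marker's 4x4 transformation matrix (row-major, rotation in
// the upper-left 3x3 block, translation in the last column) into six values.
public class SixDofSketch {

    // Returns { tx, ty, tz, rotX, rotY, rotZ } extracted from the matrix,
    // assuming no scale and a Z-Y-X decomposition of the rotation block.
    static double[] toSixDof(double[][] m) {
        double tx = m[0][3], ty = m[1][3], tz = m[2][3];
        double rotY = Math.asin(-m[2][0]);
        double rotX = Math.atan2(m[2][1], m[2][2]);
        double rotZ = Math.atan2(m[1][0], m[0][0]);
        return new double[] { tx, ty, tz, rotX, rotY, rotZ };
    }

    public static void main(String[] args) {
        double[][] marker = {
            { 1, 0, 0,   50 },
            { 0, 1, 0,  -20 },
            { 0, 0, 1, 1600 },
            { 0, 0, 0,    1 }
        };
        // Each of the six values could then be written to one sensor of the
        // AR input device that represents this marker.
        for (double value : toSixDof(marker)) {
            System.out.println(value);
        }
    }
}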

During the initial tests (which will be described later) to validate the integrated tools, the need was perceived to develop routines for the calibration of the camera, so that the position of the marker in the scene could be identified correctly. Until then, the virtual objects generated on the markers were incorrectly registered, causing the following effects:
1) In the perspective image obtained from the camera, the values of the coordinates on the x and y axes vary according to the distance between the marker and the camera. For instance, if at a distance of 1 meter the value of x in the position matrix varies from -100 to +100 from one extremity of the screen to the other, at 3 meters these values grow proportionally, to -300 and +300;
2) For a given distance, the camera is not correctly calibrated for all points in the xy plane. For points near the center of the screen the object stays on the marker, but as the marker moves away from the center, the object moves with a different speed.
To solve these two problems, the second one was addressed first. Analyzing the virtual object's displacement in relation to the marker in the tests, it was noticed that this variation was linear along each of the x and y axes. With the tests, the values of the constants (the angular coefficient k1 and the displacement of the line k2) in the equation of the line that represents this variation could be determined empirically:

y = k1 x + k2    (1)

This function, however, can only perform the calibration on the xy plane located at the distance used in the tests (z = 1600 mm, adopted as a reference value). Thus, the solution that allows this calibration function to be adjusted to any distance is to project (during the calibration of the z axis) all points obtained from the camera onto the reference distance before the calibration of the x and y axes. Figure 11 shows the sequence necessary to obtain the calibration of each point; a simple numerical sketch of this sequence is given at the end of this section.

Figure 11. Complete process of calibration

This method of calibration was satisfactory to adjust virtual objects located in places with fixed z values. However, the tests with variation of this value (detailed ahead) showed that the process is not satisfactory in that case: the object does not follow the marker correctly. Thus, to solve this new problem, a new class extending enjine's Camera class was created, and the transformation values used by the camera in jartoolkit were adopted. This solution produces results with fewer failures and deviations and proved satisfactory. The cost of its adoption, however, was the limitation of allowing only a single instance of the Camera class in the game, because of the fixed values applied in the camera calibration. This limits enjine's ability to create several instances of the camera and position them at different points.
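The listing below is that numerical sketch of the calibration sequence: a point measured at depth z is first projected onto the reference plane (z = 1600 mm) and the linear correction of equation (1) is then applied to each axis. The constants used are placeholders, not the values determined empirically in the project.

// Minimal sketch of the two-step calibration described above. The correction
// constants below are placeholders chosen for illustration only.
public class CalibrationSketch {

    static final double Z_REF = 1600.0; // reference depth in millimeters

    // Placeholder linear-correction constants (k1, k2) for the x and y axes.
    static final double K1_X = 1.05, K2_X = -2.0;
    static final double K1_Y = 1.08, K2_Y =  1.5;

    // Returns the calibrated (x, y) for a marker position measured at depth z.
    static double[] calibrate(double x, double y, double z) {
        // Step 1: project the point onto the reference plane.
        double scale = Z_REF / z;
        double xRef = x * scale;
        double yRef = y * scale;
        // Step 2: apply the linear correction determined at the reference plane.
        double xCal = K1_X * xRef + K2_X;
        double yCal = K1_Y * yRef + K2_Y;
        return new double[] { xCal, yCal };
    }

    public static void main(String[] args) {
        // A marker seen at (120, -40) while 3200 mm away from the camera.
        double[] c = calibrate(120, -40, 3200);
        System.out.printf("calibrated position: (%.1f, %.1f)%n", c[0], c[1]);
    }
}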
6. RESULTS AND TESTS
The results obtained with the integration were verified with several tests, which are described below.

6.1 jartoolkit Tests
The tests with jartoolkit were carried out with the aid of the equipment shown in Figure 8. Initially, the results of these tests were not satisfactory, because the marker was recognized only when moving at very low speeds (around 0.1 m/s). In addition, small variations in the lighting on the marker affected the test results, as could be verified by the presence or absence of people near the marker, even when to the naked eye the lighting was not altered (there were no shadows directly on the marker, for instance). Since the configuration of the computer used was relatively recent (a Pentium M at 1.73 GHz, 512 MB of RAM, an on-board video card, Java version 5.0 Update 5 and Java 3D 1.3.1), the first conclusion was that the problem was caused by jartoolkit itself, which was developed over an old version of ARToolKit: the latest jartoolkit release, and consequently the ARToolKit version it uses, dates from July 2004, while the newest ARToolKit version available when the tests were made was considerably more recent, dating from June. However, two factors showed that jartoolkit could be used in the project:
The use of dedicated video cards instead of on-board cards. The execution of the same program on Pentium IV HT computers, with 3.40 GHz, 1 GB of RAM and an ATI FireGL V3100 GPU with 128 MB of dedicated memory, showed that the marker could be moved at high speed: this configuration allowed the virtual object to follow a marker moving at approximately 1 m/s;
The use of newer versions of the same tools. For instance, using Java version 5.0 Update 8 and a newer Java 3D release on the first computer where the tests were made allowed the marker to be moved at greater speeds (approximately 0.4 m/s). This factor contributed decisively to the use of jartoolkit in the project.

6.2 jartoolkit integration with enjine
The integration tests with jartoolkit and enjine were carried out in two ways: command input tests with the markers, using games already developed with enjine; and the creation of a game prototype using this new type of interaction. In the first test, the game City Run was used, a racing game developed by students in the computer graphics course and available on enjine's website [10]. In that game, the playability was changed so that it could accept commands from a marker in the video stream. This way, the car could be manipulated by moving the marker up, down, left and right, disregarding the movement in depth and the rotations of the marker.

One of the interesting results of the tests made with this prototype was that the presence of many graphic elements (in this case, the game environment) on the screen can apparently confuse the user's control, because it covers the video and, consequently, the perception of the marker's position on screen, which serves as feedback to the user. Additionally, the need for horizontal inversion of the video coming from the camera was identified, so that the user could act as if he were facing a mirror; this showed a marked improvement in usability during the tests. With regard to the marker's movement in this prototype, when the marker moved up the system detected this movement, but its amplitude was not stored, that is, the car would always move forward with the same intensity. At this moment, the need for different interpretations of the marker's movement was identified. Interaction could be made in several ways: according to the marker's absolute position, according to the marker's position relative to some point of interest, or according to the position relative to the marker's former position (i.e., analyzing its movement). This approach allows the creation of games that use the camera and the markers as a flexible form of input, and that do not even need to be Augmented Reality applications (these three interpretation modes are sketched after the list of findings below). Figure 12 shows one of the prototype's screens.

Figure 12. Test using adaptations of a game already developed with enjine

Therefore, with this first test, the following facts could be identified:
Validation of the detection of the user's movements and of their association, in fact, with a specific action in an existing game;
Validation of the insertion of the video stream in the game screen, together with the rest of the scenery;
Identification of the need to invert the images from the camera;
Identification of the superposition of the virtual scenery over the real scenery in the video stream, and of the disturbance it can cause to the user by hiding the marker's movement.
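The listing below sketches the three interpretation modes mentioned above for a single axis. It is an illustration of the idea only; names and signatures are not taken from the project's code.

// Illustrative sketch of the three ways of interpreting a marker's position:
// as an absolute value, relative to a point of interest, or relative to the
// marker's previous position (i.e., as movement).
public class MarkerInterpretationSketch {

    enum Mode { ABSOLUTE, RELATIVE_TO_POINT, RELATIVE_TO_PREVIOUS }

    // Computes the input value for one axis from the marker's current position.
    static double interpret(Mode mode, double current, double pointOfInterest,
                            double previous) {
        switch (mode) {
            case ABSOLUTE:             return current;
            case RELATIVE_TO_POINT:    return current - pointOfInterest;
            case RELATIVE_TO_PREVIOUS: return current - previous;  // movement
            default: throw new IllegalArgumentException("unknown mode");
        }
    }

    public static void main(String[] args) {
        double current = 120, pointOfInterest = 100, previous = 90;
        System.out.println(interpret(Mode.ABSOLUTE, current, pointOfInterest, previous));
        System.out.println(interpret(Mode.RELATIVE_TO_POINT, current, pointOfInterest, previous));
        System.out.println(interpret(Mode.RELATIVE_TO_PREVIOUS, current, pointOfInterest, previous));
    }
}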
For the second prototype, a game similar to Pong was created. It was chosen because it needs few graphic elements on the screen and is easy to develop. It consists of moving the marker up and down, making the cubes that represent the platforms (which in the original game are used to bounce the ball back) move in the corresponding direction, aiming to hit the ball so that it is sent back to the adversary. In the tests made with this prototype, the goals were to verify the use of more than one marker at the same time in a game and to learn the process of game creation using enjine. Even though the rules of the game were not fully implemented (the score, for example), the goal of moving the objects with different markers was accomplished. In this test it was also possible to test the rotations, through the values of the angles obtained from the matrix, but without practical effects on the game rules implemented so far. However, the attempt to apply the rotations using one TransformGroup for each angle of each marker was not accomplished successfully, due to the difficulty of determining the right order in which to apply the transformations. It is worth mentioning that this prototype was already developed using the absolute position of the markers as the form of input. Figure 13 shows a screenshot of the Pong prototype.

Figure 13. Pong prototype

In this second test, the following facts could be identified:
Validation of the different interpretations developed for the marker movement (in this case, the virtual object follows the absolute value of the marker's y coordinate but remains at a fixed x coordinate);
Validation of the use of more than one marker at the same time on screen;
Validation of the association between the marker and the object rotations, although in this particular application this still presented some problems (see the sketch after this list).
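The rotation difficulty mentioned above comes from the fact that three separate per-axis transforms do not commute. Combining the three angles into a single matrix with an explicit order, as in the assumption-based sketch below, avoids having to guess the order in which stacked TransformGroups are applied; a matrix built directly from the angles is, in fact, the approach later adopted for We A.R. Dancin'. The order chosen here (Z, then Y, then X) is an illustrative assumption.

import javax.media.j3d.Transform3D;

// Sketch of combining three rotation angles into one Java 3D Transform3D with
// an explicit, fixed order, instead of stacking one TransformGroup per angle.
public class RotationOrderSketch {

    static Transform3D combinedRotation(double rotX, double rotY, double rotZ) {
        Transform3D rx = new Transform3D();
        Transform3D ry = new Transform3D();
        Transform3D rz = new Transform3D();
        rx.rotX(rotX);
        ry.rotY(rotY);
        rz.rotZ(rotZ);

        // The multiplication order defines the order in which rotations apply.
        Transform3D result = new Transform3D(); // starts as identity
        result.mul(rz);
        result.mul(ry);
        result.mul(rx);
        return result;
    }

    public static void main(String[] args) {
        Transform3D t = combinedRotation(0.1, 0.2, 0.3);
        System.out.println(t);
        // The resulting Transform3D can then be set on a single TransformGroup
        // via setTransform(t).
    }
}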

At this point, a more complete test was judged necessary, to verify the behavior of a 3D object in relation to the marker's movement along the z axis (depth). To this end, a test application similar to jartoolkit's demo (Figure 7) was developed, in which a cube should follow the marker, with enjine interpreting the marker movement, and thus the cube movement, from the position matrix given by jartoolkit. In this test, it was identified that the camera calibration procedure developed at first and discussed earlier, in which the translation values from the position matrix were projected onto the reference plane, was functional only for games whose actions involve no depth variation of the markers. This is why new solutions were sought that would allow the creation of 3D games meeting the minimum requirements of Augmented Reality. As discussed before, the solution adopted was the creation of a class extending the Camera class and using jartoolkit's calibration values, an alternative that, despite being limited, allows the scope of the project to remain without drastic changes. A better calibration procedure is planned as future work.

6.3 We A.R. Dancin'
This game was developed as a second stage of the project, to put to the test some game design concepts applied to an AR game and to validate the integrated tools. First, analyzing the results of a brainstorm undertaken by the project's participants, a dance game was chosen for implementation, since this idea would allow focusing the work on the game logic and diminishing the effort spent on more artistic areas (in which the development team lacked expertise), such as plot development and the modeling of complex graphical objects. Another advantage, resulting from the tests, is the possibility of offering the player the sensation of immersion in an environment that contains virtual objects without these objects interfering with the visual feedback to the player (i.e., not occluding the video stream severely). The following requirements were defined for the game:
The player must use only the markers to select all objects shown by the game on the screen, touching them, thus avoiding the use of keyboard or mouse;
The possibility of two players dividing the screen, allowing multiplayer matches besides single player games;
Use of the hands and legs to execute the movements. Thus, each player must use four markers, one for each limb, on which three-dimensional objects are rendered;
The game presents objects that must be touched by the player with a specific limb (one of the hands or one of the legs). For this, different colors and symbols are used to indicate to the player which limb he must use.

Figure 14. Screen from the game "We A.R. Dancin'"

To facilitate the playability, the virtual objects presented by the game have their depth and scale adjusted dynamically according to the marker's position (a sketch of this adjustment is given at the end of this section). For instance, if the marker that represents the player's left hand is 2 meters away from the camera, the objects related to this limb have their positions adjusted to this depth, but this is imperceptible to the player. In this manner, the game can be played independently of the distance between the player and the camera. Although it does not influence the playability, the rotation problem faced previously was solved by setting on the TransformGroup a matrix already built from the angles obtained from the middleware; the matrix used is the same one used internally by ARToolKit. During informal tests with several users (between 20 and 30, of widely varying ages) and with the use of better equipment than in the previous tests (an Intel Core 2 Duo at 2.0 GHz, 2 GB of RAM and a video camera with higher resolution), the game was received with enthusiasm, and it was possible to play with nearly imperceptible delay and with correct recognition of the markers. During the game, up to 8 markers could be used, allowing the participation of two players simultaneously. While in a few frames some marker might not have been detected (or not detected with great precision), this happened rarely enough to be nearly imperceptible to the players. No case of incorrect marker recognition (recognizing one marker as another) was detected during these tests.
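The listing below is a rough sketch, under assumed formulas, of that dynamic adjustment: the object is placed at the marker's measured depth and scaled proportionally, so that its apparent size on screen does not change, which is why the adjustment is imperceptible to the player. The reference depth and the scaling rule are illustrative assumptions, not the game's actual code.

// Illustrative sketch of adjusting a virtual object's depth and scale to the
// distance at which its marker is detected, so the game plays the same
// regardless of how far the player stands from the camera.
public class DepthScaleSketch {

    static final double REFERENCE_DEPTH = 1600.0; // mm, assumed design depth

    // Returns { depth, scale } for an object tied to a marker seen at markerDepth.
    static double[] adjustForMarker(double markerDepth) {
        // Place the object at the marker's depth...
        double depth = markerDepth;
        // ...and scale it so its apparent size on screen stays constant
        // (apparent size is inversely proportional to depth).
        double scale = markerDepth / REFERENCE_DEPTH;
        return new double[] { depth, scale };
    }

    public static void main(String[] args) {
        // Player's left-hand marker detected 2000 mm away from the camera.
        double[] adjusted = adjustForMarker(2000.0);
        System.out.printf("object depth = %.0f mm, scale factor = %.2f%n",
                adjusted[0], adjusted[1]);
    }
}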
7. CONCLUSIONS
The integration of jartoolkit and enjine did increase the potential use of enjine to teach several Augmented Reality concepts through the development of original games. It can illustrate, for instance, the concept of 3D registration or the merging of real and virtual elements in a rather practical way. More complex games may even be developed to study concepts of Image Processing, for instance to remove the background from an image. The implementation of the game We A.R. Dancin' validated the proposed solution satisfactorily. The creation of games with Augmented Reality offers players the possibility to interact with the game in more instinctive (and original) ways and was very well received in this case. Despite the success of the implementation, the project offers several opportunities for future work. Among the various possible improvements identified, some are related to new internal functions for enjine, such as the possibility of removing the background from the image obtained from the camera (as mentioned earlier), allowing the integration of the image of the player, obtained in real time, with the game, as the player's avatar or the game's main character, with the player immersed in the virtual game world. In the case of We A.R. Dancin', it is instead the virtual objects that are included in the real environment of the player. Another possibility is the capacity to render the real elements with depth (currently the video stream is just a background image, always occluded by the virtual objects), allowing the occlusion of virtual objects and even reducing the disturbance caused when complex game worlds occlude the real elements and reduce the feedback to the user, and thus the playability. In the final solution adopted, the possibility of creating several camera instances inside the game was limited, because of the solution chosen to calibrate the camera. An important future work, therefore, is the improvement of this calibration.

Regarding the use of jartoolkit itself, after the intense study of its architecture and the analysis made to identify the most reasonable way to integrate it with enjine, it was possible to identify points where the performance of the system could be improved. As an example, the interface between jartoolkit and Java 3D (implemented by the "jartoolkit for Java 3D" component) could be removed and implemented directly as part of enjine, aiming to optimize its performance, especially considering that jartoolkit is no longer updated. The development of new Augmented Reality games using the integrated tools and exploring new game designs and forms of interaction is, obviously, an important possibility for future work as well.

REFERENCES
[1] ARTAG. Available at artag/. Last visited in December.
[2] ARTOOLKIT 2.33 manual. Available at: ufrj.br/grva/realidade_aumentada. Last visited in December.
[3] ARTOOLKIT PLUS. Available at: handheld_ar/artoolkitplus.php. Last visited in December.
[4] AZUMA, R. T. A Survey of Augmented Reality. Teleoperators and Virtual Environments, vol. 6, Aug.
[5] BERNARDES, J.; Dias, J. & Tori, R. Exploring Mixed Reality User Interfaces for Electronic Games. In: BRAZILIAN GAMES AND DIGITAL ENTERTAINMENT WORKSHOP, 4, Anais... SBC, v. 1.
[6] COOPER, N.; Keatley, A.; Dahlquist, M.; Mann, S.; Slay, H.; Zucco, J.; Smith, R. & Thomas, B. Augmented Reality Chinese Checkers. In: ACM SIGCHI ADVANCES IN COMPUTER ENTERTAINMENT, Anais... ACM.
[7] GEIGER, C. et al. JARToolKit: A Java Binding for ARToolKit. In: The First IEEE International Augmented Reality Toolkit Workshop.
[8] HAUSE, K. What to Play Next: Gaming Forecast. In: Conference on Human Factors in Computing Systems.
[9] INFOEXAME. News article on the stagnation of the worldwide games market. Info Exame Magazine, Feb.
[10] INTERLAB. enjine: Engine for Games in Java.
[11] MILGRAM, P.; Takemura, H.; Utsumi, A. & Kishino, F. Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum. SPIE Vol. 2351, Telemanipulator and Telepresence Technologies, 1994.
[12] NAKAMURA, R.; Bernardes, J. & Tori, R. enjine: Architecture and Application of an Open-Source Didactic Game Engine. SBGames.
[13] OPENCV. Available at logy/computing/opencv/. Last visited in December.
[14] SANTIAGO, J.; Romero, L. & Correia, N. A Framework for Exploration and Gaming in Mixed Reality. In: Pervasive Computing Environments Workshop, Pervasive 2004 Conference.
[15] SCHMALSTIEG, D.; Fuhrmann, A.; Hesina, G.; Szalavari, Zs.; Encarnação, L. M.; Gervautz, M. & Purgathofer, W. The Studierstube Augmented Reality Project. PRESENCE: Teleoperators and Virtual Environments, Vol. 11, No. 1, MIT Press.
[16] SIGGRAPH 2005: Beyond the Gamepad panel session. Transcription available at features/ / kane_01.shtml. Last visited in December.
[17] STARNER, T. et al. Mind-Warping: Towards Creating a Compelling Collaborative Augmented Reality Game. In: INTERNATIONAL CONFERENCE ON INTELLIGENT USER INTERFACES, 5, Anais... ACM.
[18] STUDIERSTUBE Augmented Reality Project. studierstube.icg.tu-graz.ac.at/. Last visited in December.
[19] TEIXEIRA, J.; Farias, T.; Moura, G.; Lima, J.; Pessoa, S. & Teichrieb, V. GeFighters: an Experiment for Gesture-based Interaction Analysis in a Fighting Game. SBGames.
[20] TORI, R.; Bernardes Jr., J. L. & Nakamura, R. Teaching Introductory Computer Graphics Using Java 3D, Games and Customized Software: a Brazilian Experience. Anais do ACM SIGGRAPH 2006 Educators Program.
[21] WAGNER, D. & Barakonyi, I. Augmented Reality Kanji Learning. In: IEEE AND ACM INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY, 2, Anais... ACM.
[22] WAGNER, D.; Pintaric, T.; Ledermann, F. & Schmalstieg, D. Towards Massively Multi-User Augmented Reality on Handheld Devices. In: INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING, 3, Anais...
[23] WAGNER, D. & Schmalstieg, D. First Steps Towards Handheld Augmented Reality.
[24] WOODWARD, C.; Honkamaa, P.; Jäppinen, J. & Pyökkimies, E. CamBall: Augmented Networked Table Tennis Played with Real Rackets. In: ACM SIGCHI ADVANCES IN COMPUTER ENTERTAINMENT, Anais... ACM.


Usability and Playability Issues for ARQuake Usability and Playability Issues for ARQuake Bruce Thomas, Nicholas Krul, Benjamin Close and Wayne Piekarski University of South Australia Abstract: Key words: This paper presents a set of informal studies

More information

Virtual Reality in E-Learning Redefining the Learning Experience

Virtual Reality in E-Learning Redefining the Learning Experience Virtual Reality in E-Learning Redefining the Learning Experience A Whitepaper by RapidValue Solutions Contents Executive Summary... Use Cases and Benefits of Virtual Reality in elearning... Use Cases...

More information

Abstract. Keywords: Multi Touch, Collaboration, Gestures, Accelerometer, Virtual Prototyping. 1. Introduction

Abstract. Keywords: Multi Touch, Collaboration, Gestures, Accelerometer, Virtual Prototyping. 1. Introduction Creating a Collaborative Multi Touch Computer Aided Design Program Cole Anagnost, Thomas Niedzielski, Desirée Velázquez, Prasad Ramanahally, Stephen Gilbert Iowa State University { someguy tomn deveri

More information

Haptic presentation of 3D objects in virtual reality for the visually disabled

Haptic presentation of 3D objects in virtual reality for the visually disabled Haptic presentation of 3D objects in virtual reality for the visually disabled M Moranski, A Materka Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, Lodz, POLAND marcin.moranski@p.lodz.pl,

More information

Understanding OpenGL

Understanding OpenGL This document provides an overview of the OpenGL implementation in Boris Red. About OpenGL OpenGL is a cross-platform standard for 3D acceleration. GL stands for graphics library. Open refers to the ongoing,

More information

Fig.1 AR as mixed reality[3]

Fig.1 AR as mixed reality[3] Marker Based Augmented Reality Application in Education: Teaching and Learning Gayathri D 1, Om Kumar S 2, Sunitha Ram C 3 1,3 Research Scholar, CSE Department, SCSVMV University 2 Associate Professor,

More information

Contact info.

Contact info. Game Design Bio Contact info www.mindbytes.co learn@mindbytes.co 856 840 9299 https://goo.gl/forms/zmnvkkqliodw4xmt1 Introduction } What is Game Design? } Rules to elaborate rules and mechanics to facilitate

More information

Oculus Rift Getting Started Guide

Oculus Rift Getting Started Guide Oculus Rift Getting Started Guide Version 1.23 2 Introduction Oculus Rift Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC.

More information

GLOSSARY for National Core Arts: Media Arts STANDARDS

GLOSSARY for National Core Arts: Media Arts STANDARDS GLOSSARY for National Core Arts: Media Arts STANDARDS Attention Principle of directing perception through sensory and conceptual impact Balance Principle of the equitable and/or dynamic distribution of

More information

DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface

DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface Hrvoje Benko and Andrew D. Wilson Microsoft Research One Microsoft Way Redmond, WA 98052, USA

More information

Virtual- and Augmented Reality in Education Intel Webinar. Hannes Kaufmann

Virtual- and Augmented Reality in Education Intel Webinar. Hannes Kaufmann Virtual- and Augmented Reality in Education Intel Webinar Hannes Kaufmann Associate Professor Institute of Software Technology and Interactive Systems Vienna University of Technology kaufmann@ims.tuwien.ac.at

More information

Oculus Rift Getting Started Guide

Oculus Rift Getting Started Guide Oculus Rift Getting Started Guide Version 1.7.0 2 Introduction Oculus Rift Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC.

More information

Craig Barnes. Previous Work. Introduction. Tools for Programming Agents

Craig Barnes. Previous Work. Introduction. Tools for Programming Agents From: AAAI Technical Report SS-00-04. Compilation copyright 2000, AAAI (www.aaai.org). All rights reserved. Visual Programming Agents for Virtual Environments Craig Barnes Electronic Visualization Lab

More information

Artificial Life Simulation on Distributed Virtual Reality Environments

Artificial Life Simulation on Distributed Virtual Reality Environments Artificial Life Simulation on Distributed Virtual Reality Environments Marcio Lobo Netto, Cláudio Ranieri Laboratório de Sistemas Integráveis Universidade de São Paulo (USP) São Paulo SP Brazil {lobonett,ranieri}@lsi.usp.br

More information

Occlusion based Interaction Methods for Tangible Augmented Reality Environments

Occlusion based Interaction Methods for Tangible Augmented Reality Environments Occlusion based Interaction Methods for Tangible Augmented Reality Environments Gun A. Lee α Mark Billinghurst β Gerard J. Kim α α Virtual Reality Laboratory, Pohang University of Science and Technology

More information

AUGMENTED VIRTUAL REALITY APPLICATIONS IN MANUFACTURING

AUGMENTED VIRTUAL REALITY APPLICATIONS IN MANUFACTURING 6 th INTERNATIONAL MULTIDISCIPLINARY CONFERENCE AUGMENTED VIRTUAL REALITY APPLICATIONS IN MANUFACTURING Peter Brázda, Jozef Novák-Marcinčin, Faculty of Manufacturing Technologies, TU Košice Bayerova 1,

More information

The Design & Development of RPS-Vita An Augmented Reality Game for PlayStation Vita CMP S1: Applied Game Technology Duncan Bunting

The Design & Development of RPS-Vita An Augmented Reality Game for PlayStation Vita CMP S1: Applied Game Technology Duncan Bunting The Design & Development of RPS-Vita An Augmented Reality Game for PlayStation Vita CMP404.2016-7.S1: Applied Game Technology Duncan Bunting 1302739 1 - Design 1.1 - About The Game RPS-Vita, or Rock Paper

More information

Online Game Quality Assessment Research Paper

Online Game Quality Assessment Research Paper Online Game Quality Assessment Research Paper Luca Venturelli C00164522 Abstract This paper describes an objective model for measuring online games quality of experience. The proposed model is in line

More information

Mixed Reality technology applied research on railway sector

Mixed Reality technology applied research on railway sector Mixed Reality technology applied research on railway sector Yong-Soo Song, Train Control Communication Lab, Korea Railroad Research Institute Uiwang si, Korea e-mail: adair@krri.re.kr Jong-Hyun Back, Train

More information

Design and Development of a Marker-based Augmented Reality System using OpenCV and OpenGL

Design and Development of a Marker-based Augmented Reality System using OpenCV and OpenGL Design and Development of a Marker-based Augmented Reality System using OpenCV and OpenGL Yap Hwa Jentl, Zahari Taha 2, Eng Tat Hong", Chew Jouh Yeong" Centre for Product Design and Manufacturing (CPDM).

More information

Interactive intuitive mixed-reality interface for Virtual Architecture

Interactive intuitive mixed-reality interface for Virtual Architecture I 3 - EYE-CUBE Interactive intuitive mixed-reality interface for Virtual Architecture STEPHEN K. WITTKOPF, SZE LEE TEO National University of Singapore Department of Architecture and Fellow of Asia Research

More information

Remote Collaboration Using Augmented Reality Videoconferencing

Remote Collaboration Using Augmented Reality Videoconferencing Remote Collaboration Using Augmented Reality Videoconferencing Istvan Barakonyi Tamer Fahmy Dieter Schmalstieg Vienna University of Technology Email: {bara fahmy schmalstieg}@ims.tuwien.ac.at Abstract

More information

AUTOMATION OF 3D MEASUREMENTS FOR THE FINAL ASSEMBLY STEPS OF THE LHC DIPOLE MAGNETS

AUTOMATION OF 3D MEASUREMENTS FOR THE FINAL ASSEMBLY STEPS OF THE LHC DIPOLE MAGNETS IWAA2004, CERN, Geneva, 4-7 October 2004 AUTOMATION OF 3D MEASUREMENTS FOR THE FINAL ASSEMBLY STEPS OF THE LHC DIPOLE MAGNETS M. Bajko, R. Chamizo, C. Charrondiere, A. Kuzmin 1, CERN, 1211 Geneva 23, Switzerland

More information

Software Requirements Specification

Software Requirements Specification ÇANKAYA UNIVERSITY Software Requirements Specification Simulacrum: Simulated Virtual Reality for Emergency Medical Intervention in Battle Field Conditions Sedanur DOĞAN-201211020, Nesil MEŞURHAN-201211037,

More information

ARCHITECTURE AND MODEL OF DATA INTEGRATION BETWEEN MANAGEMENT SYSTEMS AND AGRICULTURAL MACHINES FOR PRECISION AGRICULTURE

ARCHITECTURE AND MODEL OF DATA INTEGRATION BETWEEN MANAGEMENT SYSTEMS AND AGRICULTURAL MACHINES FOR PRECISION AGRICULTURE ARCHITECTURE AND MODEL OF DATA INTEGRATION BETWEEN MANAGEMENT SYSTEMS AND AGRICULTURAL MACHINES FOR PRECISION AGRICULTURE W. C. Lopes, R. R. D. Pereira, M. L. Tronco, A. J. V. Porto NepAS [Center for Teaching

More information

Creating a 3D Assembly Drawing

Creating a 3D Assembly Drawing C h a p t e r 17 Creating a 3D Assembly Drawing In this chapter, you will learn the following to World Class standards: 1. Making your first 3D Assembly Drawing 2. The XREF command 3. Making and Saving

More information

Unity Certified Programmer

Unity Certified Programmer Unity Certified Programmer 1 unity3d.com The role Unity programming professionals focus on developing interactive applications using Unity. The Unity Programmer brings to life the vision for the application

More information

The Control of Avatar Motion Using Hand Gesture

The Control of Avatar Motion Using Hand Gesture The Control of Avatar Motion Using Hand Gesture ChanSu Lee, SangWon Ghyme, ChanJong Park Human Computing Dept. VR Team Electronics and Telecommunications Research Institute 305-350, 161 Kajang-dong, Yusong-gu,

More information

Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine)

Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine) Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine) Presentation Working in a virtual world Interaction principles Interaction examples Why VR in the First Place? Direct perception

More information

Marco Cavallo. Merging Worlds: A Location-based Approach to Mixed Reality. Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO

Marco Cavallo. Merging Worlds: A Location-based Approach to Mixed Reality. Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO Marco Cavallo Merging Worlds: A Location-based Approach to Mixed Reality Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO Introduction: A New Realm of Reality 2 http://www.samsung.com/sg/wearables/gear-vr/

More information

INTRODUCTION TO GAME AI

INTRODUCTION TO GAME AI CS 387: GAME AI INTRODUCTION TO GAME AI 3/31/2016 Instructor: Santiago Ontañón santi@cs.drexel.edu Class website: https://www.cs.drexel.edu/~santi/teaching/2016/cs387/intro.html Outline Game Engines Perception

More information

COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES.

COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES. COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES. Mark Billinghurst a, Hirokazu Kato b, Ivan Poupyrev c a Human Interface Technology Laboratory, University of Washington, Box 352-142, Seattle,

More information

Enhanced Virtual Transparency in Handheld AR: Digital Magnifying Glass

Enhanced Virtual Transparency in Handheld AR: Digital Magnifying Glass Enhanced Virtual Transparency in Handheld AR: Digital Magnifying Glass Klen Čopič Pucihar School of Computing and Communications Lancaster University Lancaster, UK LA1 4YW k.copicpuc@lancaster.ac.uk Paul

More information

ISO JTC 1 SC 24 WG9 G E R A R D J. K I M K O R E A U N I V E R S I T Y

ISO JTC 1 SC 24 WG9 G E R A R D J. K I M K O R E A U N I V E R S I T Y New Work Item Proposal: A Standard Reference Model for Generic MAR Systems ISO JTC 1 SC 24 WG9 G E R A R D J. K I M K O R E A U N I V E R S I T Y What is a Reference Model? A reference model (for a given

More information

CHAPTER 1. INTRODUCTION 16

CHAPTER 1. INTRODUCTION 16 1 Introduction The author s original intention, a couple of years ago, was to develop a kind of an intuitive, dataglove-based interface for Computer-Aided Design (CAD) applications. The idea was to interact

More information

Introduction to Computer Games

Introduction to Computer Games Introduction to Computer Games Doron Nussbaum Introduction to Computer Gaming 1 History of computer games Hardware evolution Software evolution Overview of Industry Future Directions/Trends Doron Nussbaum

More information

Augmented reality for machinery systems design and development

Augmented reality for machinery systems design and development Published in: J. Pokojski et al. (eds.), New World Situation: New Directions in Concurrent Engineering, Springer-Verlag London, 2010, pp. 79-86 Augmented reality for machinery systems design and development

More information

ARK: Augmented Reality Kiosk*

ARK: Augmented Reality Kiosk* ARK: Augmented Reality Kiosk* Nuno Matos, Pedro Pereira 1 Computer Graphics Centre Rua Teixeira Pascoais, 596 4800-073 Guimarães, Portugal {Nuno.Matos, Pedro.Pereira}@ccg.pt Adérito Marcos 1,2 2 University

More information

Benefits of using haptic devices in textile architecture

Benefits of using haptic devices in textile architecture 28 September 2 October 2009, Universidad Politecnica de Valencia, Spain Alberto DOMINGO and Carlos LAZARO (eds.) Benefits of using haptic devices in textile architecture Javier SANCHEZ *, Joan SAVALL a

More information

Application of 3D Terrain Representation System for Highway Landscape Design

Application of 3D Terrain Representation System for Highway Landscape Design Application of 3D Terrain Representation System for Highway Landscape Design Koji Makanae Miyagi University, Japan Nashwan Dawood Teesside University, UK Abstract In recent years, mixed or/and augmented

More information

Digital Media & Computer Games 3/24/09. Digital Media & Games

Digital Media & Computer Games 3/24/09. Digital Media & Games Digital Media & Games David Cairns 1 Digital Media Use of media in a digital format allows us to manipulate and transmit it relatively easily since it is in a format a computer understands Modern desktop

More information

ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems

ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems Wayne Piekarski and Bruce H. Thomas Wearable Computer Laboratory School of Computer and Information Science

More information

EXPERIMENTAL FRAMEWORK FOR EVALUATING COGNITIVE WORKLOAD OF USING AR SYSTEM IN GENERAL ASSEMBLY TASK

EXPERIMENTAL FRAMEWORK FOR EVALUATING COGNITIVE WORKLOAD OF USING AR SYSTEM IN GENERAL ASSEMBLY TASK EXPERIMENTAL FRAMEWORK FOR EVALUATING COGNITIVE WORKLOAD OF USING AR SYSTEM IN GENERAL ASSEMBLY TASK Lei Hou and Xiangyu Wang* Faculty of Built Environment, the University of New South Wales, Australia

More information

Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems

Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems Fumihisa Shibata, Takashi Hashimoto, Koki Furuno, Asako Kimura, and Hideyuki Tamura Graduate School of Science and

More information

Control a 2-Axis Servomechanism by Gesture Recognition using a Generic WebCam

Control a 2-Axis Servomechanism by Gesture Recognition using a Generic WebCam Tavares, J. M. R. S.; Ferreira, R. & Freitas, F. / Control a 2-Axis Servomechanism by Gesture Recognition using a Generic WebCam, pp. 039-040, International Journal of Advanced Robotic Systems, Volume

More information

Networked Virtual Environments

Networked Virtual Environments etworked Virtual Environments Christos Bouras Eri Giannaka Thrasyvoulos Tsiatsos Introduction The inherent need of humans to communicate acted as the moving force for the formation, expansion and wide

More information

- applications on same or different network node of the workstation - portability of application software - multiple displays - open architecture

- applications on same or different network node of the workstation - portability of application software - multiple displays - open architecture 12 Window Systems - A window system manages a computer screen. - Divides the screen into overlapping regions. - Each region displays output from a particular application. X window system is widely used

More information

Pangolin: Concrete Architecture of SuperTuxKart. Caleb Aikens Russell Dawes Mohammed Gasmallah Leonard Ha Vincent Hung Joseph Landy

Pangolin: Concrete Architecture of SuperTuxKart. Caleb Aikens Russell Dawes Mohammed Gasmallah Leonard Ha Vincent Hung Joseph Landy Pangolin: Concrete Architecture of SuperTuxKart Caleb Aikens Russell Dawes Mohammed Gasmallah Leonard Ha Vincent Hung Joseph Landy Abstract For this report we will be looking at the concrete architecture

More information