AR²kanoid: Augmented Reality ARkanoid


AR²kanoid: Augmented Reality ARkanoid

B. Smith and R. Gosine
C-CORE and Memorial University of Newfoundland

Abstract

AR²kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular '80s game Arkanoid. The system consists of a camera, a computer and virtual reality goggles. The player's head position is computed from the placement of circles on a playing area. This position, along with camera information, is then used to reconstruct a virtual view of the player's world. This virtual world is overlaid on the image of the real world to give the user an augmented reality view of the game. Solutions to problems with implementation, bandwidth and position prediction are also discussed.

1 Introduction

The advancement of computing technology, as well as the decreasing cost of silicon-based hardware, has led to the development of many near-real-life games, and to the incorporation of many gaming companies. These games try to create a world synthetically. This virtual world exists only inside a computer, and may be like nothing we have ever seen, yet it tries to mimic many aspects of the real world. Computer graphics has progressed greatly over the years, and many of the virtual objects in these worlds appear very much like the real thing. The next step in computer gaming is actually using some real objects, and playing in the real world. This requires mixing the real and virtual worlds, or augmenting the real world.

This paper is organized as follows. Related work and an overview of the system are presented first. Next, we describe the implementation components: camera calibration, camera position tracking, 3D graphics generation, collision detection, user input-output and the overall algorithm. The paper concludes with a list of suggested future work.

2 System Overview

Like the original Arkanoid, shown in Figure 1, the player has a paddle with which he must deflect a moving ball at some bricks.
The idea behind the game is the same, only now the player wears a head-mounted display with a camera attached and looks at a surface with a circle pattern on it (see Figure 1). The bricks, paddle and puck are all created with OpenGL and overlaid on the real three-dimensional world as if they were actually there. Positional information relating the virtual world (bricks, paddle and puck) to the real world (ice surface) is extracted from the circles on the ice surface (center ice). Unlike other systems [2, 3, 4, 5], AR²kanoid uses one computer for all tasks: the computer vision tasks, the 3D rendering tasks and the mixing of the real and virtual worlds. There are two main computer-vision-related tasks: calibration of the camera (done once, before the game can be started), and extraction of the camera position and orientation using this calibration.
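The circle extraction just mentioned (detailed in Section 3.2) boils down to per-channel thresholding. The following pure-Python sketch illustrates the idea; the threshold value and the toy two-by-two frame are illustrative, not taken from the paper, and the real routine also ANDs the result with the ice-surface mask:

```python
def red_circle_mask(image, thresh=128):
    """Per-pixel sketch of the red-circle test: a pixel counts as 'red'
    when its red channel is bright and its green and blue channels are dark.
    `image` is a list of rows of (r, g, b) tuples; `thresh` is illustrative."""
    mask = []
    for row in image:
        mask.append([(r >= thresh and g < thresh and b < thresh)
                     for (r, g, b) in row])
    return mask

# Toy frame: two red-ish pixels, one grey, one green.
frame = [[(200, 30, 20), (90, 90, 90)],
         [(10, 200, 10), (250, 40, 60)]]
print(red_circle_mask(frame))  # -> [[True, False], [False, True]]
```

Only the two bright-red pixels survive; in the real system the surviving blobs would then be grouped into circles and their centers computed.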

3 Implementation

Figure 1: Original Arkanoid vs AR²kanoid.

The system has five main components: camera calibration, camera position tracking, 3D graphics generation, object collision detection, and user input-output.

To get video for object registration and virtual object overlaying, a FireWire camera is used. The camera used in AR²kanoid is a Pyro FireWire webcam. This particular camera was chosen for several reasons. First, it can grab 640x480 color images at 30 frames per second, which is comparable to more expensive camera systems. Second, a frame grabber is not needed: the camera attaches to the PC via a FireWire (IEEE 1394) port.

To perform many of the image processing techniques required here, OpenCV is used. OpenCV is an open-source computer vision library and one of the most capable free image processing packages available. Like many open-source projects, OpenCV has no direct support, but there is a mailing list which can be used to solve most problems. OpenCV is easy to use, fast and free.

3.1 Camera Calibration

To calibrate the camera, the calibration routine is supplied with several views of a planar model object, or pattern, of known geometry. For every view, the points on the model plane and their projections onto the image are passed to the calibration routine. OpenCV's camera calibration has been modified to accept a pattern of circles. Both an intrinsic and an extrinsic camera calibration are completed according to Zhang [6, 7]. The intrinsic camera parameters specify the camera characteristics: focal length, principal point, effective pixel size, and the radial distortion coefficient of the lens. The extrinsic camera parameters describe the spatial relationship between the camera and the real world: the rotation matrix and the translation vector. Together they specify the transformation between the camera and world reference frames.
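To make the intrinsic/extrinsic split concrete, here is a minimal pinhole-projection sketch: the extrinsics (R, t) move a world point into the camera frame, and the intrinsics (focal lengths and principal point) map it to a pixel. All numeric values are illustrative, not calibration results from the paper, and lens distortion is ignored:

```python
def project(point_w, R, t, fx, fy, cx, cy):
    """Project a 3-D world point to pixel coordinates with the pinhole model:
    camera coords X_c = R * X_w + t, then u = fx*X/Z + cx, v = fy*Y/Z + cy."""
    # Extrinsics: rotate, then translate into the camera frame.
    xc = [sum(R[i][j] * point_w[j] for j in range(3)) + t[i] for i in range(3)]
    x, y, z = xc
    # Intrinsics: perspective divide, focal scaling, principal-point shift.
    return (fx * x / z + cx, fy * y / z + cy)

# Identity rotation, camera 2 units from the pattern plane (illustrative).
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0, 0, 2]
u, v = project([0.5, 0.25, 0], R, t, fx=600, fy=600, cx=320, cy=240)
print(u, v)  # -> 470.0 315.0
```

Calibration is the inverse problem: given many such point-to-pixel correspondences from views of the known pattern, Zhang's method solves for fx, fy, cx, cy and, per view, R and t.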

3.2 Camera Position Tracking

As the user moves towards or away from the ice surface, the size of the bricks, puck and paddle should change, just as it would in the real world. The user will usually follow the direction of the puck up and down the ice. This means that as the puck moves up the ice towards the bricks, the user's head tilts upwards for a better view of where the puck is going; the same happens when the puck moves down the ice towards the goal line. This changes the perspective of any real objects the user is viewing, and it has to be taken into account for the virtual objects as well.

The image processing for AR²kanoid works as follows. A color image is sent from the camera to the image processing system. This three-channel color image is converted to a grayscale image, then thresholded and filtered to give the ice surface. Next, all three color channels (red, green and blue) are extracted. These are thresholded and ANDed together with the ice surface, as shown in Equation 1, to detect the red circles:

    RedCircle = Red AND (NOT Blue) AND (NOT Green) AND IceSurface    (1)

The centers of these circles are computed and passed to the calibration routine, which then computes the extrinsic parameters of the circle pattern, that is, the position and orientation of the pattern.

3.3 3D Graphics Generation

Virtual reality refers to the situation in which a user is immersed in a virtually created environment to give the illusion of a real world. For example, many car-racing video games are based on this: when playing, the user feels as if he is really driving a car at high speed when in reality he is just playing a game. The most common software package used to generate such a virtual world is OpenGL [1]. OpenGL is a software package used to generate two-dimensional and three-dimensional graphics. Several versions of OpenGL are available; version 1.1 is used in AR²kanoid. The high-level explanation of OpenGL is very simple.
Draw the points of the objects you want to create, move them where you want them to be in the scene, then position a camera from where you want to view them. The bricks, puck, paddle and ice surface were all generated using OpenGL.

3.3.1 Creating the Bricks

The bricks are probably the most complicated virtual objects created in this game, because they have a brick-like texture stretched over their shape. Each brick is defined by eight three-dimensional points, one for each corner. Combinations of four of these corners define a side, and the brick-like texture is stretched over each side.

3.3.2 Creating the Puck

A round, disk-shaped puck is created using a cylinder with a disk for the top. The cylinder and disk are not perfect circles; they are really polygons with a high number of sides, in this case twenty. The more sides, the more circle-like the shapes will be, but the longer they will take to render. Twenty gave a happy medium between the two.

3.3.3 Creating the Paddle

The paddle is created identically to a brick, except that it has no texture attached. This may change in a future version.

3.4 Collision Detection

Collision detection is a major part of any video game. It is especially important in this game, since a collision of the puck with another object results in something happening. The various objects that the puck can collide with, along with the results, are shown in Table 1. In AR²kanoid, each object is modeled as a rectangle, and the intersection of any two rectangles results in a collision.

    Object                 Result of Collision with Puck
    Brick                  Brick explodes and puck changes direction
    Paddle                 Puck changes direction
    Left and right walls   Puck changes left-right direction
    Top wall               Puck changes up-down direction
    Goal line              Number of pucks decrements, play resets

    Table 1: Result of the puck colliding with a particular object

3.5 User Input and Output

In order for users to be immersed in a game, and for them to feel that it is real, they have to be able to communicate with the game. This is done through input and output devices. In AR²kanoid, the user sends information into the game by moving the joystick or by moving his virtual reality goggles with the camera attached. Information is sent to the user via earphones (sound) and through the camera/virtual-reality-goggles display.

DirectX is a technology produced by Microsoft to allow faster device communication on its Windows operating system; Microsoft provides a free SDK for anyone to use it. AR²kanoid uses DirectX 8.1 for the joystick control and for the sound output. Joystick control allows the user to move the paddle to deflect the puck. Several sounds are used in AR²kanoid; they are shown in Table 2.
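The rectangle-intersection test used for collision detection above can be sketched in a few lines. This is a generic axis-aligned overlap check, not the paper's actual code, and the puck, brick and wall coordinates are made up for illustration:

```python
def rects_collide(a, b):
    """Axis-aligned rectangle overlap test. Each rectangle is
    (x, y, width, height); touching edges count as a collision here."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax <= bx + bw and bx <= ax + aw and ay <= by + bh and by <= ay + ah

puck  = (4, 4, 2, 2)    # illustrative coordinates
brick = (5, 5, 3, 1)
wall  = (20, 0, 1, 10)
print(rects_collide(puck, brick))  # -> True
print(rects_collide(puck, wall))   # -> False
```

Each frame, the puck's rectangle would be tested against every live brick, the paddle, the walls and the goal line, and the matching Table 1 response applied.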
3.6 Mixing Physical Reality and Virtual Reality

The video camera and the virtual reality goggles give the user a picture of the real world. OpenGL drawings give the user a virtual world. Overlaying the OpenGL drawings on the images of the real world augments the user's perception of the real world. In AR²kanoid, the bridge between the real world and the virtual world is the center-circle pattern.
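The paper does not spell out how the rendered frame is keyed over the camera frame; one simple possibility is a colour-key composite, sketched below in pure Python. The transparent colour and the one-row toy frames are assumptions for illustration:

```python
def overlay(real, virtual, transparent=(0, 0, 0)):
    """Composite a rendered 'virtual' frame over a camera 'real' frame:
    wherever the virtual frame holds the transparent colour, the camera
    pixel shows through; elsewhere the virtual pixel replaces it.
    Frames are equal-sized lists of rows of (r, g, b) tuples."""
    return [[v if v != transparent else r
             for r, v in zip(rrow, vrow)]
            for rrow, vrow in zip(real, virtual)]

camera = [[(10, 10, 10), (20, 20, 20)]]
render = [[(0, 0, 0), (200, 0, 0)]]   # black = nothing drawn there
print(overlay(camera, render))  # -> [[(10, 10, 10), (200, 0, 0)]]
```

Because the virtual objects are positioned using the pose recovered from the center-circle pattern, the composited bricks, puck and paddle appear anchored to the real ice surface.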

    Sound             Event
    Puck floating     Game started, puck hovering
    Puck deflection   Puck colliding with a wall or the paddle
    Explosion         Puck colliding with a brick
    Score             Puck colliding with the goal line

    Table 2: Sound resulting from a particular event

3.7 Overall Algorithm

The algorithm flow is as follows. A color image is grabbed from the camera and fed to the image processing and augmented reality overlay systems. The image processing system detects the ice surface and the circle pattern in the image. Next, the orientation of the ice surface with respect to the camera is computed from the orientation and size of the circle pattern. This relative orientation matrix is then used to arrange the bricks, puck and paddle in the virtual (OpenGL) world, which is subsequently overlaid on the original image to give an augmented reality view to the user. This image is then displayed via the virtual reality display system.

4 Conclusions and Future Work

This paper has introduced AR²kanoid, an augmented reality video game. Results are very promising, but more work remains. Due to the limitations of the computer hardware (FireWire port speed, CPU speed, bus speed), a small image (160x120) is extracted from the camera, used for processing, and then resized to fit the screen. This makes the positional information quantized and the surface appear jumpy. Currently, a Kalman filter is used to remedy this; in future versions, an improved Kalman filter and, ultimately, a higher-resolution image should be obtained from the camera. Second, more realistic OpenGL drawings of the bricks, paddle and puck are required. A feature that I would personally like to add is virtual lighting corresponding to the actual lighting, so that the shadows look real. This, however, is a whole research area in itself.

References

[1] Wright R S, Sweet M 1996 OpenGL SuperBible. Waite Group Press.
[2] Azuma R 1997 A survey of augmented reality. Presence: Teleoperators and Virtual Environments, vol. 6, no. 4.
[3] Piekarski W, Gunther B, Thomas B 1999 Integrating virtual and augmented realities in an outdoor application. 2nd International Workshop on Augmented Reality (IWAR 1999), 20-21.
[4] Piekarski W, Thomas B 2001 Tinmith-evo-5: an architecture for supporting mobile augmented reality environments. 2nd International Symposium on Augmented Reality (ISAR 2001), 29-30.
[5] Feiner S, MacIntyre B, Hollerer T, Webster A 1997 A touring machine: prototyping 3D mobile augmented reality systems for exploring the urban environment. Proceedings 1st International Symposium on Wearable Computers, 74-81.
[6] Zhang Z 1999 Flexible camera calibration by viewing a plane from unknown orientations. International Conference on Computer Vision (ICCV '99), 666-673.
[7] Zhang Z 2000 A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, 1330-1334.