The Construction of a Virtual Lego Set Chris Armer


The Construction of a Virtual Lego Set

Chris Armer

BSc Computing
2005/2006

The candidate confirms that the work submitted is their own and the appropriate credit has been given where reference has been made to the work of others.

I understand that failure to attribute material which is obtained from another source may be considered as plagiarism.

(Signature of student)

Summary

The purpose of this project is to explore what benefits, if any, exist from adding haptic feedback to a VR system. It focuses on the perception of manufacturing tolerances felt between objects presented in a scene and the methodology involved in producing such a system. The report details the research carried out to gain an understanding of the problem area and the resources required to tackle such a problem. The process is covered from initial research, through the implementation stage, to the final evaluation of the addressed problem.

Contents

1 Introduction
1.1 Motivation
1.2 Aims
1.3 Minimum Requirements
1.4 Project Outline
2 Background Reading
2.1 Assessing the Problem
2.1.1 Previous Report
2.2 APIs
2.2.1 GHOST SDK
2.2.2 ReachIn API
2.2.3 OpenHaptics
2.2.4 Decision
3 Lego Set
3.1 Rationale
3.2 History
3.3 Types of Brick
4 Analysis
4.1 Resources
4.2 Design Methodologies
4.2.1 Waterfall
4.2.2 Iterative Method
4.3 Evaluation Methodologies
4.3.1 Questionnaires
4.3.2 Interviews
4.3.3 Task Based
4.3.4 Walkthroughs
4.4 Schedule
5 Design Process
5.1 VRML Programming
5.1.1 Basics
5.2 ReachIn API
5.2.1 Setup
5.2.2 Haptic Rendering
6 Implementation
6.1 Iterations
6.2 Process
7 Evaluation
7.1 Introduction
7.2 Users
7.3 Tasks
7.4 Process
7.5 Results
8 Conclusion
8.1 Goals
8.2 Achievements
8.3 Limitations
8.4 Future Work
8.5 Summary
9 Bibliography
Appendix A: Project Reflection
Appendix B: Early Brick Research
Appendix C: Timesheet
Appendix D: API Installation Instructions
Appendix E: Tasks
Appendix F: Interview

1. Introduction

This chapter aims to give the reader an understanding of the motivation and aims behind the project, along with the specified minimum requirements to be attained. A brief summary of the remaining chapters of the report is also laid out.

1.1 Motivation

The chosen title of this project is The Construction of a Virtual Lego Set to Explore the Benefits of Haptic Feedback over Non-Haptic Interaction. The idea is to compare completion times of a range of tasks between haptic and non-haptic applications to see what (if any) benefit haptic feedback has. This will be detailed further below and in the subsequent chapters of this report.

Even before I started my university career I had a keen interest in anything computer related, and specifically the 3D worlds that video games inhabit. This has been fed by the countless games that have been released in the last 10+ years, but came to a head when the tools used to create these worlds became available to the end user. Programs such as Hammer (formerly WorldCraft) and more recently full SDKs (software development kits) have allowed me to get an insight into the professional world of game design and how this is accomplished.

When the option arose at university, I jumped at the chance to take modules which could help me further this interest, namely graphics and HCI modules. These options, along with furthering my quest for knowledge, would also help to focus my degree towards an eventual career in these areas. When the time came to choose a project title my main focus was the graphics-related titles offered by Mr Ruddle, which led me to choose the project you are currently reading.

1.2 Aims

The main aim of the project is to investigate the benefits of tying haptic feedback into a visual application (in this case a 3D virtual reality Lego set). This will be achieved by producing two versions of a Lego set, one with and one without haptic interaction enabled, and comparing user task times to see if the haptic feedback is beneficial.

A side aim, as mentioned above, is a formative one of broadening my knowledge of the realm of computer graphics, which is achieved by following the process outlined in this report and producing the final application.

1.3 Minimum requirements

The minimum requirements for this project are as follows:

1. Review of literature into haptic / VR interactions
2. Stereo viewing with the ReachIn Display System achieved
3. Non-haptic Lego set functional
4. Haptic Lego set functional
5. Evaluate with at least 4 users

More information on these minimum requirements can be found in the mid-term report found at the back of this report.

1.4 Project outline

Chapter 2 outlines the problem to be tackled and the research avenues available for me to follow. Programming environments are discussed and a decision is made based on this research.

Chapter 3 presents an in-depth background to my choice of task to develop into the application, along with examples, and the scope of the project is also presented.

Chapter 4 details the specifics of what is required for the project, namely the resources needed, and presents the design and evaluation methodologies available to me to follow for the project. A schedule for the succeeding work is drawn up and discussed.

Chapter 5 describes the design for producing the finished application and the programming needed to achieve this. VRML is introduced to the reader, along with how the ReachIn API uses these files to create 3D applications.

Chapter 6 follows the implementation carried out to produce the finished application. Any discrepancies from the plan in Chapter 5 are addressed and explained.

Chapter 7 covers the evaluation of the haptic application in depth. The tasks are introduced and detailed and the results are produced and analysed.

Chapter 8 presents an overall evaluation of the project. The goals are re-iterated and compared against the achievements of the project, and program extensions are suggested.

2. Background Reading

This chapter records the process I went through in analysing the problem and carrying out the required background reading.

2.1 Assessing the Problem

After doing some general research into the problem and its associated technologies, I found the best place to start would be to acquire and read any previous projects in the same area. This led me to Svein Iversen's 2004 project Integrating Force Feedback with Virtual Reality, which gave a very good insight into the process I will be following and the associated technologies I will be working with. Along with this I found the programmers' guides for the three rival APIs that are available to me. These, paired with the manuals for the two pieces of hardware I will be using (namely the ReachIn Display System and the Phantom), will give me a sound understanding of the hardware to be used for the project.

2.1.1 Previous Report

A document that I referred to a great deal within the research process was the report completed by Svein Iversen called Integrating Force Feedback with Virtual Reality. This is because it is very similar in aim and process to my chosen report. It has been mentioned that the report gave me a good insight into the overall process; here I will qualify that somewhat. The overall aim of Iversen's project was to see if the inclusion of haptic feedback within a virtual reality application could facilitate the perception of tolerances in virtual assemblies. This was carried out by producing an application where users had to successfully mate a block with 2 pegs with another block with corresponding holes. The application, as mine will, had versions with and without haptic feedback, allowing for an accurate comparison between the same tasks being completed with and without force feedback.

Looking at the conclusion of Iversen's report we can see that the overall objective and minimum requirements set out at the start of the project have been achieved (Iversen, 2004): "the prototype has shown that the integration of force feedback with virtual reality can facilitate the perception of tolerances in virtual assemblies", i.e. the process he followed produced a (mainly) successful application and would be a good model to follow.

2.2 APIs

As mentioned above, there are three APIs that I could use to code the main project: the GHOST SDK, the ReachIn API and the OpenHaptics toolkit. All provide separate facilities for shape-based surface rendering and abstract haptic effect or force field rendering (Matthew Hutchins). All of the APIs have their own advantages and disadvantages, which I will go into in more detail in the following sections, but I will also be focusing on ease of coding, as how quickly and easily working prototypes are produced will have a major bearing on my choice of API. I will also be looking into the code involved in producing a hello world type of program, as this is usually a good indication of the complexity of writing in a language.

Two of the above APIs, namely the GHOST SDK and the ReachIn API, both work around the premise of a scene graph but apply it in different ways. From Wikipedia: A scene graph is an object oriented structure that arranges the logical and often (but not necessarily) spatial representation of a graphical scene. For the purposes of this project, using a scene graph allows for easy abstraction of a scene, allowing the programmer to focus more on what is being modelled rather than how it is being modelled.
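As an illustration of the idea, independent of any of the three APIs, a scene graph can be thought of as a tree of nodes whose rendering is a depth-first traversal, with each node's transform applied to all of its children. The minimal Python sketch below is purely illustrative; the class and method names are invented for this example and do not belong to the GHOST SDK or the ReachIn API.

# Minimal illustrative scene graph: each node carries a translation and
# rendering is a depth-first traversal of the tree, accumulating offsets.
class SceneNode:
    def __init__(self, name, translation=(0.0, 0.0, 0.0)):
        self.name = name
        self.translation = translation
        self.children = []

    def add_child(self, child):
        self.children.append(child)

    def render(self, parent_offset=(0.0, 0.0, 0.0)):
        # Combine the parent's offset with this node's own translation.
        offset = tuple(p + t for p, t in zip(parent_offset, self.translation))
        print(f"draw {self.name} at {offset}")
        for child in self.children:
            child.render(offset)

# A tiny scene: a root separator with a sphere and a haptic device node.
root = SceneNode("root")
root.add_child(SceneNode("sphere", translation=(0.0, 0.1, 0.0)))
root.add_child(SceneNode("phantom"))
root.render()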

2.2.1 GHOST SDK

The GHOST SDK (General Haptic Open Software Toolkit) is "a powerful software toolkit that eases the task of developing touch-enabled applications" (SensAble website), produced by SensAble (the company who produce the Phantom). It is written using C++ and consists of a set of 9 main classes and a large number of subclasses, which together make the SDK highly extensible. As is shown below, the amount of code required to produce a simple program using GHOST is considerable, especially considering that when run it doesn't actually produce any visual output.

#include <stdlib.h>
#include <gstBasic.h>
#include <gstSphere.h>
#include <gstPHANToM.h>
#include <gstSeparator.h>
#include <gstScene.h>

main()
{
    gstScene *scene = new gstScene;
    gstSeparator *root = new gstSeparator;
    gstSphere *sphere = new gstSphere;
    sphere->setRadius(20);
    gstPHANToM *phantom = new gstPHANToM("phantom name");
    root->addChild(phantom);
    root->addChild(sphere);
    scene->setRoot(root);
    scene->startServoLoop();
    while (!scene->getDoneServoLoop())
    {
        // Do application loop here.
        // Could poll for keyboard input
        // and when user inputs to quit
        // application call
        // scene->stopServoLoop();
    }
}

2.2.2 ReachIn API

The ReachIn API is a C++ API based on the VRML scene graph model and as such it can be programmed using VRML and Python (as a scripting language) as well as C++ (ReachIn API). One of the ReachIn API's most crucial features is its ability to render an object both graphically and with haptics enabled, allowing for greater accuracy with projects both in production and running. Compared with the GHOST SDK, writing a simple program using the ReachIn API is far easier, requiring only a few lines of code. Below is a VRML file which produces a cube on screen.

File: hello.wrl

#VRML V2.0 utf8
Group {
  children [
    Shape {
      appearance Appearance {}
      geometry Box { size 0.1 0.1 0.1 }
    }
  ]
}

The relative ease of writing the program is due both to the simplicity of the VRML syntax and the intuitive way the API uses these files to create a scene. It would take only one extra line added to the above code to tie in haptic feedback, as seen below.

File: touchable.wrl

#VRML V2.0 utf8
Group {
  children [
    Shape {
      appearance Appearance {
        surface FrictionalSurface {}   # Only this line added!
      }
      geometry Box { size 0.1 0.1 0.1 }
    }
  ]
}

2.2.3 OpenHaptics

The full name of this API is the SensAble 3D Touch SDK OpenHaptics and, as the name suggests, it is another API from the company who make the Phantom. It is mainly coded in the C programming language, but some utilities and the source code examples are C++ based. It is modelled on the OpenGL API, with extra functions added in to integrate force feedback. The logic behind this is to allow developers who are familiar with the widely known OpenGL API to tie haptic feedback into their projects.

It includes two different and interchangeable methods for producing haptic applications: the Haptic Device API (HDAPI) and the Haptic Library API (HLAPI). The HDAPI gives low level access to the haptic device and requires the programmer to render forces directly, calculating the required functions by hand. The HLAPI gives higher level rendering and sits on top of the HDAPI, which allows significant reuse of existing OpenGL code and greatly simplifies the synchronisation of the graphics and haptic threads. Due to the nature of the SDK, both APIs can be used together as well as separately, allowing greater control over exactly what is being modelled.

Below we can see a simple program which produces a simple shape. It is almost exclusively OpenGL with just a few extra functions added in to tie in haptics (the hl-prefixed calls).

// display method for HelloHaptics program
void display(void)
{
    hlBeginFrame();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glColor3f(1.0, 0.0, 0.0);
    hlBeginShape(HL_SHAPE_DEPTH_BUFFER, myShapeId);
    glBegin(GL_POLYGON);
    glVertex3f(0.25, 0.25, 0.0);
    glVertex3f(-0.25, 0.25, 0.0);
    glVertex3f(-0.25, -0.25, 0.0);
    glVertex3f(0.25, -0.25, 0.0);
    glEnd();
    hlEndShape();
    glFlush();
    hlEndFrame();
}

This is pretty simple to write due to my previous experience with OpenGL, but it does not allow the rapid prototyping offered by the ReachIn API.

2.2.4 Decision

Based on what I have learned above, I have decided to use the ReachIn API for this project. This is due to the ease of programming that comes from the integration of Python, VRML and C++ into one unit; the co-location of the ReachIn Display and the API gives it the advantage over the GHOST SDK, while the use of the scene graph and higher level modelling gives it the advantage over the OpenHaptics SDK.

3. Lego Set

The name LEGO is a brand name which (like Hoover) has become the standard descriptor of a whole range of products, namely self-locking building bricks. These bricks have become a staple of every child's upbringing from 1949 to the present day, due to their relative cheapness and the infinite possibilities that arise from playing with a set for any length of time. This section outlines the rationale behind choosing to model the Lego system and gives the history behind the product that is currently available.

3.1 Rationale

I decided to use Lego bricks to form the basis of my application for a number of reasons. First and foremost is the ease of use of the bricks. There are variations of Lego bricks that are available for babies, so the basic use can't be too difficult. This means that I will be avoiding complex interactions which would be hard to code, as well as removing all ambiguity on the part of the user as to how they are to interact with the scene.

Second is the shape. As all Lego bricks are of the same general shape, once one of the bricks has been modelled it is a relatively simple task to edit this model to produce the different sizes and shapes I will be requiring. The shape could also aid in putting the user at ease as to the use of the system. The aim of the system is that a user sitting at the station with the program running in front of them would intuitively know what to do to interact with the scene, as everyone has used Lego bricks at some point in their lives.

Another benefit of the Lego system is the sheer number of things that can be accomplished with just a small amount of Lego bricks. Anything can be produced, from a simple wall to working robots and space stations. The vast majority of the options available are out of scope for this project, so I will just be focusing on the simple tasks to put to the user. These tasks are laid out in detail in Chapter 7.

3.2 History

The Lego brick that we all know started its life as the brainchild of a carpenter from Billund, Denmark, Ole Kirk Christiansen. In 1947 Christiansen received a sample of Kiddicraft Self-Locking Building Bricks from an English child psychologist called Mr Hilary Harry Fisher Page and, inspired by these, started producing his own version, which he called Automatic Binding Bricks.

At the time of their release, wooden toys were the standard, and in the years following the launch the Lego Group (as Christiansen's company was called) experienced poor sales, as these plastic competitors were poorly received. The original bricks were sold just as brick sets starting from 1949, but it wasn't until 1955 that the first toy system of a town was introduced and the brand took off; around this time the toy was exported for the first time.

The first types of Lego brick were made by injection moulding cellulose acetate, and this system was used right up until 1963, when the material was replaced with Acrylonitrile Butadiene Styrene, or ABS, which is a much stronger material and so safer to use as a child's toy. The design and shape of the bricks has changed so little in the years since that the bricks made in 1963 can still interlock with the bricks produced today. They have gone through a number of iterations to become the bricks we know today, as can be seen in Appendix B.

3.3 Types of brick

There is a great range of Lego bricks that I could choose to use for this project, but I am only going to need a selection of the simple bricks. The bricks I will use are listed below. The number stands for the count of studs, i.e. 2x1 means 2 studs wide by 1 stud long.

Figure 3.1: A 2x1 Brick

Figure 3.2: A 2x2 Brick

Figure 3.3: A 2x3 Brick

Figure 3.4: A 2x4 Brick

Figure 3.5: A 2x8 Brick

4. Analysis

This chapter aims to outline the resources that were required to produce the application, present the different design and evaluation methodologies available and make a justified decision on which was the best for this project. A projected schedule is laid down and detailed.

4.1 Resources

The resources required for the completion of the application range from specialised hardware and software to highly common software, and are detailed below.

The specialist hardware required is the ReachIn Display System. This consists of a Phantom Premium haptic device from SensAble paired with a stereo monitor system from ReachIn. The Phantom device is a mounted mechanical arm which is co-located with the projection area of the ReachIn Display, to allow for true co-location between image and arm, and it provides position sensing and feedback in 6 DOF (degrees of freedom). The ReachIn Display consists of a monitor supported on a frame which projects onto a semi-transparent mirror, coupled with a pair of CrystalEyes shutter glasses, which together create a 3D illusion in the space between the mirror and the desk (SensAble Technologies, 2006; ReachIn Technologies, 2006).

The specialist software required is the ReachIn API from ReachIn Technologies. This is installed on the workstation where the hardware is located and is required to test all written code. The API is based upon the VRML scene graph model and supports scripting in a variety of programming languages, focusing on C++ and Python. Other software required for the development of the application includes a web browser with a VRML viewer installed (e.g. Firefox with Cortona installed) and the Vim text editor.

4.2 Design Methodologies

There are a number of design methodologies that could be followed for developing the application, each of which has its own advantages and disadvantages.

4.2.1 Waterfall

The waterfall model is a software development model "in which development is seen as flowing steadily downwards through the phases of requirements analysis, design, implementation, testing, integration, and maintenance" (Wikipedia waterfall page, 2006). This is an established method dating back to 1970 which has separate stages of the process which are complete in themselves, and the next stage can only commence when the previous stage has completed. It has the advantage that a set time is spent on each process and many faults can be picked up in the early stages of the process which would be expensive to rectify at a later date.

Figure 4.1: The Waterfall method

The disadvantage of this process, however, is that there is no identifiable output until every stage is completed, so a great deal of time and money could be invested in a project before it was found that it did not meet specifications.

4.2.2 Iterative method

In comparison to the waterfall method, the development process of the iterative method is completed in a number of short cycles, each of which produces an output that builds on the previous iteration. An iteration is said to be complete when a process similar to the one described by the waterfall method is completed, but on a much shorter scale. The finished output of each iteration is analysed and this information is used to improve the output in the next iteration. It has the advantage that what the final output will be like is known early on in the project, so it can be changed or abandoned before too many resources are expended on it. The disadvantage of this method is a problem of limits: it is sometimes hard to know when to stop iterating and release the application, rather than performing another iteration to address that last bug.

For a number of reasons the iterative method has been chosen for use in this project. If the waterfall method was chosen and there was a flaw in the design, then there is a chance of it not being noticed until too far down the line for it to be feasible to address or begin again. If the iterative method was used then this would not be a problem, as there would be at least two iterations in the process, and the time limit imposed by the end of term would mean that it would not be feasible to carry out too many iterations. Another reason for this method being chosen over the waterfall method is that all tasks to be completed need to be fully known beforehand if the scheduling is to work and the application is to be available on time. The task chosen for the project has some unknowns in it, such as not knowing exactly whether a specific design idea will work, as a direct comparison to the suggested application has not been found by the author; the timing would therefore be reliant on a number of things working which were not guaranteed to do so. This amount of ambiguity would make successful planning within the waterfall method impossible.

4.3 Evaluation methods

It is very important that the correct method of evaluation is chosen for the task in hand, as acquiring the wrong results from the right application is just as useless as getting no results at all. With the problem in mind there are a few evaluation techniques that could be valid.

4.3.1 Questionnaires

Depending on which questions are asked and how they are worded, questionnaires can either be very useful or a waste of time. Pointed questions with a finite number of options are good for getting comparative quantitative data but block out opportunities for users to give general opinions.

4.3.2 Interviews

Interviews are a more efficient way of getting the right sort of qualitative data than asking an open-ended question in a questionnaire. This is due to the developer being available to guide the evaluator to stay on topic and to help remove any ambiguity that may be present in the questions.

4.3.3 Task based

This is a good method to use if quantitative data such as times are required for comparisons. An evaluator is given a set task within an application and has to complete it as quickly and efficiently as possible, with the developer present for help if needed.

4.3.4 Walkthroughs

These are common techniques which have the developer guiding an evaluator through a system step by step and highlighting the aspects that need to be considered. This is good as the results obtained will be precisely what is needed, but it eliminates any chance of a wider examination of the system.

The evaluation of the application will use a mix of all of the above techniques to get precisely the results required while at the same time allowing for general opinions to arise. The first method that will be used will be a series of similar tasks in the same environment, some of which will be walkthroughs while the remainder will be task based. The easier tasks will be shown to the evaluator to ensure everyone taking part understands the application fully before beginning the task-based evaluation of the remaining harder tasks. This will be followed by a guided interview with individual evaluators immediately after the final task has been finished. This will give the evaluator the chance to voice their opinions on the application while the developer still acquires the quantitative data needed for a comparable evaluation.

4.4 Schedule

The suggested schedule can be found in Appendix C and shows graphically the timescales I intend to have the different sections completed by, along with any other important dates within the Department which would affect my progress. A detailed breakdown of each section is as follows.

Research (Complete by 8/01/06)
- Read Integrating Force Feedback with Virtual Reality by Svein Iversen (2004)
- Research into current VR worlds and how they are manipulated
- Research and compare the two ReachIn Display System APIs and decide which to use
- Research into the Phantom and how it works/is calibrated. Use examples to get a feel for it
- Use simple programs to explore normal and stereo viewing with the ReachIn Display System

1st Iteration (Complete by 19/02/06)
- Produce a non-haptic Lego set
- Tie in feedback for haptic version

1st Evaluation (Complete by 5/03/06)
- Evaluate using expert guidelines based evaluation
- Review results of evaluation

2nd Iteration (Complete by 31/03/06)
- Bearing in mind the evaluation, develop a 2nd non-haptic version
- Tie in haptics again

2nd Evaluation (Complete by 16/04/06)
- Individual demonstrations then a focus group
- Evaluate results of 2nd evaluation
- Write final report

From the third week in February until the due date of the 2nd of May, time should be taken out to write the report while carrying out the second iteration and evaluation. This is to avoid an inferior report letting the project down.

5. Design Process

This chapter intends to detail the processes followed to turn an idea for a program into a 3D application with haptic feedback. An introduction to the base technologies is given, programming languages are introduced and simple programs are explained. Furthermore, the implementation process is presented.

5.1 VRML Programming

The first thing to be aware of when talking about VRML (which stands for Virtual Reality Modelling Language) is that it has now been made redundant in most situations by its successor X3D, "an Open Standards XML-enabled 3D file format to enable real-time communication of 3D data across all applications and network applications" (Web3D, 2006). This means that the number of web resources is diminishing all the time, but due to the similarities of the two technologies there will still be some available. This, however, is not a great concern here, as VRML is what this project focuses on.

VRML makes it easy to quickly and universally model 3D scenes to be displayed by web browsers with the correct plug-in. There are numerous clients available, but the one used for this project was the Cortona client available from ParallelGraphics (Web3D, 2006).

There are a number of ways of creating and editing VRML files, each with its advantages and disadvantages, namely using a plain text editor such as Notepad or Vim, or a proprietary editor, an example of which is VrmlPad. This is a specialised VRML editor from ParallelGraphics which includes a wide range of features designed to make programming in VRML as easy as possible. These features include the likes of smart AutoComplete, dynamic error detection and syntax highlighting (ParallelGraphics VrmlPad). These features do make VRML coding far easier than just using a plain text editor, but the software must be purchased to make full use of it. The trial software was used for the first VRML files produced, but due to the problems mentioned below it was replaced with the Vim text editor. This is a free editor with good syntax highlighting, which is all that was required to aid the production of working code once familiarity with the language was obtained.

The VRML language in itself is quite a powerful tool for producing 3D objects and scenes, but it must be observed that to integrate a VRML file into the ReachIn API, care needs to be taken over which nodes are used. This is because the API and the VRML language don't share all common nodes; that is, some nodes are implemented differently in one of the two, or even not at all. When learning VRML this was not an important factor, but when files were written specifically to be added into the API this had to be taken into account. The ReachIn API Programmer's Guide includes a list of all nodes for both environments, detailing the differences between them, so this was referenced throughout the implementation.

5.1.1 Basics

Due to the way that VRML is implemented it is very easy to produce visual output, and specific nodes have been included to aid in this. Nodes such as Box, Cone and Sphere allow the easy generation of basic shapes, while nodes like IndexedFaceSet and IndexedLineSet allow the generation of more complicated shapes through coordinate geometry (a sketch of this coordinate geometry is given below). There are many sources available for learning VRML; examples of which were used within this project are The VRML 2.0 Handbook (Hartman, J & Wernecke, J, 1996) and Floppy's VRML97 tutorial (VapourTeck, 2006). Both have all the information required to create detailed VRML scenes and behaviour within these scenes. The first iteration of the final application began as the modelling of a simple brick in pure VRML, as can be seen in figure 5.1.

Figure 5.1: First VRML file
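As an illustration of that coordinate geometry, the short Python sketch below generates the kind of point list and face index list that an IndexedFaceSet needs to describe a cuboid brick body. It is purely illustrative: the dimensions are placeholder values chosen for this example, not the measurements used for the bricks in the final application.

# Generate the point list and face (coordIndex) list that a VRML
# IndexedFaceSet would need to describe a simple cuboid brick body.
def cuboid(width, height, depth):
    w, h, d = width / 2.0, height / 2.0, depth / 2.0
    # Eight corner coordinates, centred on the origin.
    points = [(x, y, z) for x in (-w, w) for y in (-h, h) for z in (-d, d)]
    # Six faces, each given as four indices into the point list.
    faces = [
        (0, 1, 3, 2), (4, 6, 7, 5),   # the two x-facing sides
        (0, 4, 5, 1), (2, 3, 7, 6),   # bottom and top
        (0, 2, 6, 4), (1, 5, 7, 3),   # the two z-facing sides
    ]
    return points, faces

# Placeholder dimensions in metres, roughly the proportions of a 2x4 brick.
points, faces = cuboid(0.016, 0.0096, 0.032)
for face in faces:
    print(", ".join(str(i) for i in face) + ", -1")   # -1 ends a face in VRML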

5.2 ReachIn API

The ReachIn API makes use of VRML files to display scenes within the ReachIn Display System, although before this is possible a number of files need to be edited. This section sets out how to produce a 3D image of what is drawn within a VRML file.

5.2.1 Setup

To begin with, the files that are to be displayed need to be located somewhere the API can access, and the API needs to be told that they are there. The details needed to make a VRML file run within the API can be found in Appendix D. If the following command is then entered at a command prompt, the VRML file should be displayed on the ReachIn Display.

Command: reachinload <filename> e.g. reachinload test

5.2.2 Haptic rendering

The above displays an image on screen, but at present there is no haptic rendering occurring in the scene, so the stylus has no effect on the image. To achieve this, some of the ReachIn API's specific nodes need to be utilised. The Appearance node within the API utilises a field that is not present within VRML, namely the SimpleSurface node. This is the simplest method within the API of implementing haptic rendering, and it contains two fields which control its behaviour: the stiffness field, which controls the repulsive stiffness of the surface it refers to, and the damping field, which controls the collision damping of the surface. These two fields simulate the feel of the object (a sketch of the kind of force model such fields control is given at the end of this section). Other nodes have been defined within the API to give different tactile feels, but they were not used for this application so will not be detailed here.

This simple brick, which is now projected within the ReachIn Display, and the haptic feedback it produces are the building blocks of all applications within this programming environment. The following section presents the implementation of these simple examples to produce a complex and fully interactive application.
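The exact force law the API applies internally is not reproduced here, but stiffness and damping parameters of this kind conventionally control a spring-damper model of the contact force. The Python sketch below is purely illustrative of that convention; it is not the ReachIn API's own code, and the numeric values are arbitrary.

# Illustrative spring-damper contact model of the kind that stiffness and
# damping parameters conventionally control: the deeper and faster the
# stylus presses into a surface, the harder it is pushed back out.
def contact_force(penetration_depth, penetration_velocity, stiffness, damping):
    if penetration_depth <= 0.0:
        return 0.0                               # stylus not inside the surface: no force
    spring = stiffness * penetration_depth       # repulsive stiffness term
    damper = damping * penetration_velocity      # collision damping term
    return spring + damper                       # magnitude of the restoring force

# Example: stylus 2 mm inside the surface, still moving inwards at 10 mm/s.
print(contact_force(0.002, 0.010, stiffness=800.0, damping=5.0))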

6. Implementation

This section details the implementation of the application from start to finish, compares it with the planned design from the previous chapter and explains any discrepancies between the two plans.

6.1 Iterations

The original plan was to complete two full iterations and conduct an evaluation on the first implementation when it was completed. This was heavily dependent on the programming stage beginning at a certain time (from the time chart in the appendix this was the 20th January) to allow both iterations to complete fully. I was unable to give the project the time it required, mainly due to the personal reason outlined in Appendix A, which was eventually resolved at the end of February. By this time the first evaluation was supposed to have been completed, and in fact the application was not far into the first iteration. As about half of the development time was lost, there was no choice but to resort to a shortened waterfall method of design, following the plans for the first iteration but aiming to complete on or around the date originally quoted for the end of the second iteration.

One of the reasons that the iterative method of design was chosen in the first place was a number of unknowns with regard to programming ideas within the application. When these unknowns were reached in the implementation, the pre-conceived ideas to overcome the problem failed to work correctly or at all, and the process was held up further while alternative methods were found. This is detailed below in the implementation.

27 The next milestone that was tackled was adding some haptics and interaction into the currently static scene. This was done by following the code of that made up the application of the 2004 Msc project by Iversen, S and the example code available on the ReachIn website (ReachIn technologies, 2006). This finally left me with a scene of 6 bricks which were touchable and the whole scene was interacted when the stylus button was depressed. This was a step in the right direction and was followed up with trying to separate the bricks up so one could be moved and another could be static and the tolerances between would be felt. Again following code laid out my Iversen, S in his Ideal.wrl file, I produced a file where one brick was constantly attached to the stylus and all the others were static on scene (but translatable using the stylus button). The beginnings of the program were there, the tolerances between bricks were noticeable but it would be a while before there was a noticeable Lego set produced. This is the stage where I started looking into some scripting to dynamically change what was shown on screen. After more practices with the code examples on the ReachIn website and the tutorials found on the Floppy s Web3D guide helped me to gain a basic understanding of scripting. This is where a major problem was experienced I had got up to this stage in the process expecting to be writing the scripting in JavaScript, a language I am versed in. This was not the case after a closer look at the API Programmers Guide and I had to choose between C++ and Python. Python was chosen after a look at equivalent programs in the Programmers guide and it was a case of trial and error with a little help gained from the small python scripting section within the Programmers guide. From this point the Masters project was of little use to me for help with the coding as scripting was not touched upon in any great length. This is where I had to use a textbook I acquired in my first year and have subsequently found a number of online versions to help me (How to think like a Computer Scientist, Learning with Python, Downey, A et al.2002). Trail and error based upon ideas present to me from the Python book eventually got me with a workable script which was starting to display the desired actions. I could dynamically select which brick to select on screen using arrays to store references to coordinates of the relevant bricks and comparing the coordinates with the position of the stylus at button press. I found that after much bad luck, I could not dynamically draw just 1 brick into the CollidingController node of the program so I had to think of a fix. This was worked around by drawing each brick twice within the program, once in the CollidingController section of the program (so it was all attached to the stylus all the time) and again in the scene. I simulated the bricks moving onto the stylus from the scene by finding the closest brick to the stylus on button press then scaling the corresponding brick in the scene to (basically making it invisible) and maximising the Chris Armer 23

28 corresponding brick on the stylus. This process was reversed when the button was released, and while it gave the impression of selecting a brick from the scene, it is really jus maximising and minimising the right bricks at the right time to give that impression.. What was needed next was a way to be able to store the new coordinates so when a brick was moved it stayed where it was released rather than jumping back to where it was. This was achieved by routing the stylus coordinated at button release to the VRML file brick coordinates. This now gave the impression of a working Lego set just without brick interactions. This was the last big task that needed tackling for the application to do what it was intended and to be able to evaluate it (and so hit all the minimum requirements). This was done by experimenting with the LayeredGroup node of the ReachIn API with each brick in the scene inhabiting its own GraspableTransform node. This made all of the bricks on the scene have haptic feedback and I just needed something to be able touch these bricks with. I had to think of a way to dynamically change the bounding box of the stylus depending on which brick was selected to give the impression of the selected brick being able to interact with all others. I had to manually define a bounding box within the VRML file but make it have no influence on the screen. Once I had one of these bounding boxes for each brick I routed the correspoinding bounding box into the bounding box section for the stylus on button press and emptied it again on button release. With all this complete the application gives the impression of doing what It was supposed to all bricks start on the scene and the stylus is empty and has to force on any of the bricks. When the stylus is moved close to a brick and the button is pressed, it disappears from the scene (just minimises it) and appears on the stylus (maximises an already minimised image) and the corresponding bounding box is selected and this brick now interacts with the scene intuitively and as a Lego set should. When the button is released the brick on the stylus moves to the current coordinates of the stylus and the process can be repeated. With numerous iterations of this process all brick can be selected and moved, interacts with all bricks and the tasks (and any other the user wishes) can be achieved. The application that was the result of following this implementation and which was evaluated in the next chapter can be found at the following address. To install, download the archive and read the included Readme file. Chris Armer 24
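To summarise the selection mechanics described above, the following plain-Python sketch re-states the logic. It is not the actual scripting used in the application (the real version routes VRML fields within the ReachIn API); the class, function and field names here are invented purely for illustration.

# Plain-Python restatement of the brick selection trick described above.
# Every brick exists twice: once in the scene and once attached to the
# stylus; "selecting" a brick hides one copy and shows the other.
import math

VISIBLE = (1.0, 1.0, 1.0)   # normal scale
HIDDEN = (0.0, 0.0, 0.0)    # scaled to zero, i.e. invisible

class Brick:
    def __init__(self, name, position, bounding_box):
        self.name = name
        self.position = position          # where the scene copy sits
        self.scene_scale = VISIBLE        # copy drawn in the scene
        self.stylus_scale = HIDDEN        # copy drawn on the stylus
        self.bounding_box = bounding_box  # used for collisions when held

def nearest_brick(bricks, stylus_position):
    # Compare each brick's stored coordinates with the stylus position.
    return min(bricks, key=lambda b: math.dist(b.position, stylus_position))

def on_button_press(bricks, stylus):
    brick = nearest_brick(bricks, stylus["position"])
    brick.scene_scale = HIDDEN            # brick "leaves" the scene
    brick.stylus_scale = VISIBLE          # and "appears" on the stylus
    stylus["bounding_box"] = brick.bounding_box
    stylus["held"] = brick

def on_button_release(stylus):
    brick = stylus.pop("held")
    brick.position = stylus["position"]   # brick stays where it was released
    brick.scene_scale = VISIBLE
    brick.stylus_scale = HIDDEN
    stylus["bounding_box"] = None

# Example usage with two hypothetical bricks and one press/release cycle.
bricks = [Brick("2x4 red", (0.00, 0.00, 0.00), "box-2x4"),
          Brick("2x2 blue", (0.05, 0.00, 0.00), "box-2x2")]
stylus = {"position": (0.04, 0.00, 0.01), "bounding_box": None}
on_button_press(bricks, stylus)
stylus["position"] = (0.08, 0.02, 0.00)
on_button_release(stylus)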

7. Evaluation

This chapter outlines the need for and the process of evaluating the application. The method followed and the tasks required are identified, and the results of the evaluation are laid down with a discussion of their validity.

7.1 Introduction

Any research process is useless unless it is evaluated fully; the need for in-depth evaluation is paramount if the topic is ever to be fully understood. With this in mind, a great deal of time has been expended on designing and implementing an objective and measurable evaluation process for the application. The proposed techniques yield quantitative results that can be used for direct comparison as well as qualitative results, such as user testimonials, which can be used to get user opinions on the application. Factors that could influence the results were identified and the tasks designed to minimise their effects.

7.2 Users

A number of users were selected to take part in the experiment and care was taken to make the results as unbiased as possible. This was tackled by using a cross-section of users from different backgrounds and genders for each version of the application. To prevent any kind of previous knowledge from influencing the results, users both with and without computer experience were selected for each version of the application, but care was taken to ensure that no users had extensive knowledge of 3D or haptic applications. To tackle the possibility of task learning influencing the results, different users were chosen for the haptic and non-haptic versions of the application. In the end 2 males evaluated the haptic version, with a male and a female evaluating the non-haptic version.

7.3 Tasks

As mentioned earlier in the report, Lego was chosen as a viable basis for this project as it allows a number of simple and easily identifiable tasks to be presented to the user for completion. These tasks were chosen to be simple enough that the only factor influencing the user's completion time would be the interactivity of the application, not the understanding of the tasks. It was decided that task completion time would be recorded for each task, as this would give a quantitative result which could be directly compared with the equivalent task in the alternative application version. It was understood that allowing a user to evaluate both versions of the application would introduce an element of task learning affecting the times of the second set of results, so it was decided that each user would only evaluate one version of the application to minimise this. This would still allow learning to affect the times of the later tasks within an evaluation, but as this was unavoidable and had similar effects on all results it was deemed acceptable. As mentioned in the research, a wide selection of objects could be created with a small range and number of bricks, but for the purposes of this project only a small number of simple tasks were chosen for the evaluation; they are detailed below.

Task 1: Build a Tower

This task required the user to select any bricks to create a tower. It aimed to allow the users to become familiar with the mechanics of the application through a simple task.

Figure 7.1: An example of task 1

Task 2: Build a Colour Co-ordinated Stack

This task was a simple extension of the first one, with the user creating a single stack of similarly coloured bricks.

Figure 7.2: An example of task 2

Task 3: Build a Shape Co-ordinated Stack

This was the last task included for familiarising the user with the program. A stack of similarly shaped bricks was to be produced.

Figure 7.3: An example of task 3

Task 4: Build a Neat Wall

This task required users to select a number of bricks (of any size or colour) and build a wall of greater length than any one brick. Users were told that the base had to include at least 2 bricks to avoid the first task being repeated.

Figure 7.4: An example of task 4

Task 5: Build an Archway

This required the user to produce two small towers and then join them with a long brick on top.

Figure 7.5: An example of task 5

Task 6: Build a Pyramid

This required users to use a large number of bricks to produce a pyramid. It is meant to be quite taxing and to require feeling the tolerances between multiple bricks. Users were again asked to make the base at least 2 bricks long to avoid the task becoming too simple.

Figure 7.6: An example of task 6

7.4 Process

Prior to the user participating in the evaluation, all materials that were needed were made available and the system and all hardware were checked to be in working order to ensure identical starting points. This included the application running correctly, the CrystalEyes glasses working correctly and the task sheet being present (a copy of which can be found in Appendix E). Before the evaluation began, all hardware and tasks were explained to the user and then the application was run. The user was given 2 minutes to familiarise themselves with the application before the tasks began. The user was then asked to complete all tasks in turn, with the system reset after the completion of each task. This was to ensure that all results were fair and had the same start point. The first three tasks served the purpose of giving the user further experience of using the application with a task in mind, and the last three tasks were timed and recorded. A task was considered to have started when the application was fully loaded and the user had selected the first brick, and considered finished when the last brick relating to the task was released. Each user answered a quick interview after they had completed the evaluation to detail their opinions on the application.

Interview

As the aim of the project was to explore whether haptic feedback within an application is beneficial, I decided that on top of the quantitative results from timing each of the tasks, it would be useful to find out what the users' opinions and suggestions were upon completing all tasks. To achieve this I decided to conduct a short guided interview with each user shortly after they finished the last task. This followed the format set out in Appendix F and included open-ended questions to keep the user on topic without hounding them into a particular response. Alongside these open-ended questions were a small number of direct ones, which served the purpose of rating the application to aid in comparisons between users. The results of these questions and the timings of all tasks can be found in the next section.

7.5 Results

This section will present the quantitative results of the evaluation, followed by the qualitative results of the guided interview. The evaluation of the application focused mainly on the time taken to complete the tasks set to the user but, as mentioned above, also has a section devoted to the qualitative opinions of those taking part. For the following tables, users A and D evaluated the non-haptic version, which left users B and C to evaluate the haptic version of the application.

Figure 7.7 presents the time taken by each user for each task and it shows that users B and C have smaller completion times across the board. It is noticeable that for the majority of users, the time to complete task 5 is less than that of task 4. This could be due to task learning having an effect on the efficiency of each user's manipulation of the scene. The completion time for task 6 for each user shows no real correlation with their other results, with one of the users in each version finding it harder and one finding it easier. This can be explained by noting that each user comes from a different background and so has a different amount of experience with computer interaction; therefore different users will become proficient with the system at differing paces. This information is also shown in table 7.1.

Figure 7.7: Time taken per task per user

Figure 7.8 shows that the average time for each task is always less for the haptic version than for the non-haptic version, showing that the addition of haptics to an application aids object manipulation. Also evident from the figure is the trend of all times getting smaller as the tasks progressed. This was expected and is a direct effect of the user learning to interact with the scene more efficiently.

Figure 7.8: Average time taken per task

User    Task 4 Time (seconds)    Task 5 Time (seconds)    Task 6 Time (seconds)
A
B
C
D

Table 7.1: Results of Evaluation

Taking into account the previously established point that haptic interaction was beneficial, it is noticeable that users B and C were evaluating the haptic version. This is confirmed in table 7.2, which shows the total completion time for each user as well as the average time taken by each user to complete a task.

User    Total User Time (seconds)    Average User Time (seconds)
A
B
C
D

Table 7.2: Total and Average user times

This information is shown graphically in figure 7.9 and shows conclusively that users B and C had a shorter total time for task completion. This information, paired with the results shown in figure 7.8, supports the conclusion that the haptics benefited the application.

Figure 7.9: Average time for all tasks

As previously stated, a guided interview was carried out with each user soon after the last task was completed. The results are shown in table 7.3 and show that users B and C, who used the haptic version of the application, found the bricks very easy to place and could feel the tolerances between bricks while manipulating the scene. The results for users A and D show that some difficulty was experienced with the non-haptic version of the application by one of the users, and this correlates with the largest total time displayed above. This user, however, felt that the addition of haptic feedback would help complete the tasks, but the final user disagreed with this.

User    Haptics    Q1    Q2    Q3 (H)    Q4 (H)    Q4 (N)    Q5
A       No         3     No    N/A       N/A       Yes       4
B       Yes        4     Yes   Yes       Yes       N/A       5
C       Yes        5     Yes   Yes       Yes       N/A       5
D       No         5     Yes   N/A       N/A       No        5

Table 7.3: Results from guided interview

Finally, the users were asked for any comments about the system and the evaluation tasks, including likes/dislikes and suggestions for improvement; these are listed below. The first list contains comments that were made, unprompted, by more than one user, and the second list contains the other comments that were valid.

- The suggestion that came from every user who took part in the evaluation was to be able to have the choice of moving a group of bricks that had already been put together.
- The next most frequent suggestion was including the capability of rotating the scene as a whole to get a better view of the bricks being manipulated.

- More accurate collision detection for the larger bricks would help speed up the process.
- The system was very easy to use and well put together.
- The system could benefit from dual stylus manipulation.
- An option to dynamically change the number of bricks in the scene would reduce the amount of clutter within the application.

Results evaluation

From the results detailed above it can be seen that the tasks outlined for the evaluation took less time to complete with haptics enabled, and so the total time per user for all tasks was also less. Thus we can conclude that the task of adding haptic feedback to a 3D application is a worthwhile one. This is confirmed by the opinions of the users who evaluated that version of the application.

This result was also reached by the previous report into this area, which was used as a basis for this project: "The prototype has shown that the integration of force feedback with virtual reality can facilitate the perception of tolerances in virtual assemblies" (Iversen, 2004).

8. Conclusion

This chapter details the project findings, re-iterates the goals which were defined at the beginning of the report and compares them to the achievements actually attained. Limitations of the current implementation are discussed and suggestions for improvements are made.

8.1 Goals

The goals of the project are based upon the minimum requirements and aims of the project, which can be found in Sections 1.2 and 1.3, and are detailed below:

- The main aim was defined as producing two versions of the same application so that direct comparisons could be drawn between them.
- The side aim was to broaden the author's knowledge of the topic(s) covered.
- A review of literature covering the topics of haptic and virtual reality interactions was to be completed.
- Stereo viewing within the system was to be achieved.
- A non-haptic Lego set was to be functional.
- A haptic version of the above application was also required to be functional.
- An evaluation of the system was to be conducted with a minimum of four users.
- A possible extension was that of including a user menu for the ease of users accessing frequently needed tasks.

8.2 Achievements

The required applications were produced and do meet the requirements needed to justify them as a working Lego set. Namely, the set is visually recognisable as what it was intended to be, it is intuitive to use and tolerances can be felt between bricks within the scene. It has been recognised that these interactions are in need of adjustment to make the tolerances feel more real, but within the project scope they were deemed to be acceptable. If more time had been available then the code would have included the aforementioned modification along with a number of others, but this was impossible within the current time constraints.

Based upon the list of goals specified, it can be said that these were realised to some extent. A literature review into the required areas can be found within the project report, although greater depth here would have aided the design process, namely with configuring the interactions between objects in the scene. Stereo viewing was achieved from the outset of this project, as it was an option within the ReachIn environment settings, but it was required to add the third dimension needed. The differences between the haptic and non-haptic versions of the application were minimal but essential for the minimum requirements to be met. Details of how this was achieved can be found within Chapter 5. This was achieved to a level such that the evaluation could take place. The evaluation was completed with the required minimum of four users, and the results can be found in Chapter 7. The possible extension of including a user menu was also achieved, but its use is very minimal. At the time of writing the menu could only be used for exiting the program, but as this was a more elegant route than force-exiting the program with the Escape key it was left active.

8.3 Limitations

Although the application works and all the minimum requirements were achieved, it is still not without its faults. This section aims to address these problems and suggest ways they could be rectified.

Overall the general coding style within the application is not as good as it could be. Due to time constraints towards the end of the process, it was decided that inelegant but working code was preferable to an incomplete solution, so rudimentary solutions were used. In general this does not have a major bearing upon the finished application apart from in two areas, namely performance and brick interactions. Due to the final implementation, all the bricks within the application had to be rendered twice, which when no brick was selected was unnoticeable, but when a brick was being held the motion did seem somewhat slower. This became more apparent when more bricks were added to the scene. This was also the final reason why 14 was the number of bricks that were finally modelled, a trade-off between system performance and application usability. The second area where this caused problems is the interactions between bricks. As point-based feedback was used, if the points were not positioned correctly then undesired behaviour became apparent and negatively affected the efficiency of the application. The same reasoning, that a buggy but finished product was preferable to an incomplete one, applies here.

The extension that was implemented was added late in the process, which is the reason for its limited capabilities. It remained within the application as what it did offer was a better option than forcing the program to quit using the keyboard, but its potential was not fully realised.

The literature review that is present in Chapter 2 is minimalist at best, but it does give a basis on which to proceed, even if it isn't the most efficient or informed way of proceeding through a report. This was due partly to personal reasons, which are mentioned in Appendix A, and partly to an underestimation of its importance coupled with bad timekeeping. The project was possible with the review that was carried out, but the project as a whole would have benefited from the proper preliminary work being completed satisfactorily.

8.4 Future work

The produced application fulfils all of the minimum requirements set out by this report but is by no means complete. There are a number of extensions that could be applied to the program to increase or improve its functionality. Some of the following were drawn up by the author, but some equally valid points were brought up by the users who evaluated the system.

The code as it stands does work, but it is very inefficiently written and includes a number of hacks to make it work as intended. Additional time could be spent going through the code and rewriting sections of it to make it more efficient; good examples could be the brick selection implementation or a number of the loops.

An area which could be improved upon is limiting the file size of the main program. This could be addressed by modularising all the code and referencing it within the Python file using the ReachIn API WriteVRMLFromFile function. All the brick VRML files would be kept separate and the import calls embedded within loops, so that only the bricks needed are imported and drawn, unlike the current implementation. This would also combat overall file size, as code would not need to be duplicated but simply imported twice (a rough sketch of this idea is given below).

The most frequent suggestion made by the users which could be implemented is the ability to select a group of bricks and manipulate them in 6 DOF, as is currently possible for only one brick at a time. It was mentioned that this would make the tasks quicker and easier to complete. It could be achieved by exploring further into the different GraspableTransform nodes within the LayeredGroup node.

Another suggestion was to be able to rotate the perspective on the full scene to get a better view of the selection of bricks currently being modified. This could possibly be achieved by applying a global rotation to all objects in the scene.

As mentioned previously, if more time had been available then the collision detection that is currently in place would have been tweaked to give better results. This could have been avoided if further research had been carried out.

The 3D menu was used in the current implementation of the application, but its uses are minimal at the time of writing. There are possibilities available to use this menu for many other things. Two prime examples could be resetting the scene, by implementing a quit and reload of the scene, and including an undo option to return the last brick to its previous location. The second option would not be a major task, as the current application already makes it easy to save a brick's previous position.
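Returning to the modularisation suggestion above, a minimal sketch of how the per-brick VRML files could be imported in a loop is given here. WriteVRMLFromFile is the ReachIn API function named in the text, but its exact signature is an assumption; the brick file names, the FakeAPI stand-in and the SceneRoot helper are hypothetical and exist only so the sketch runs on its own, so this should be read as an illustration of the idea rather than the project's actual code.

# Rough sketch of the modularisation idea: keep each brick in its own VRML
# file and import only what the scene needs, rather than duplicating the
# VRML text inside one large world file.  WriteVRMLFromFile is the ReachIn
# API function named in the text, but its signature is assumed here; the
# FakeAPI and SceneRoot stubs exist only so the sketch runs on its own.

BRICK_FILES = {
    "brick_2x2": "bricks/brick_2x2.wrl",   # hypothetical file layout
    "brick_2x4": "bricks/brick_2x4.wrl",
    "brick_1x6": "bricks/brick_1x6.wrl",
}

class FakeAPI:
    def WriteVRMLFromFile(self, path):
        # Stand-in for the real ReachIn call, which would parse the VRML
        # file and return the corresponding scene-graph node.
        return {"source": path}

class SceneRoot:
    def __init__(self):
        self.children = []
    def add(self, node):
        self.children.append(node)

def load_bricks(scene_root, wanted, api):
    """Import only the bricks the current task needs; each file is read
    on demand, so no VRML has to be duplicated in the main world file."""
    for name in wanted:
        node = api.WriteVRMLFromFile(BRICK_FILES[name])
        scene_root.add(node)

if __name__ == "__main__":
    root = SceneRoot()
    load_bricks(root, ["brick_2x2", "brick_2x4"], FakeAPI())
    print(len(root.children))   # 2 bricks imported, none duplicated

If two copies of the same brick were needed, the loop could simply import the file twice with different transforms, keeping the duplication in one line of code rather than in repeated VRML text.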

Another possible extension that came from the evaluation is the option of dynamically adding or removing bricks from the scene from within the program. This could be driven from the 3D menu and would help reduce on-screen clutter by only displaying the bricks the user is currently using.

An extension that would greatly expand the usability, and arguably make the application almost completely intuitive to pick up and use, would be to port it to a dual Phantom set-up. Having two points of interaction would greatly improve upon the current system, as two bricks could be moved at once and interactions felt in both hands at once. This would almost certainly reduce task times further than adding haptic feedback alone.

The final extension suggested here is one that was considered as a possible extension within this report but disregarded due to the extra time required to implement it. If the system was made multi-user, then two people working together could reduce task times even further.

8.5 Summary

The overall objective of the project was to investigate the benefits of including haptic feedback within a 3D manufacturing application. The main function of the application was for the user to experience tolerance clashes between objects within the scene and for these haptic feedback cues to aid in the chosen manufacturing tasks. The research carried out focused on the different APIs available for development and on a report documenting a very similar earlier project. The Lego set produced allowed an artificial means of testing haptic tolerances resulting from object interaction within a 3D scene. This application allowed an evaluation which showed that integrating haptic feedback into an application is an advantage. Despite problems with the implementation, the minimum requirements stated at the beginning of the report have been achieved and the original problem has been solved.

9. Bibliography

ParallelGraphics (2006) Cortona VRML Client 5.0 [Accessed 30th April 2006]

Downey, A. et al. (2002) How to Think Like a Computer Scientist: Learning With Python

SensAble Technologies, Inc. (1996) GHOST SDK Programmers Guide, Version 4

SensAble Technologies, Inc. (2002) GHOST API Reference Manual

Hartman, J. & Wernecke, J. (1996) The VRML 2.0 Handbook: Building Moving Worlds on the Web, 1st ed. USA, Silicon Graphics Inc.

Itkowitz, B. et al. The OpenHaptics Toolkit: A Library for Adding 3D Touch Navigation and Haptics to Graphics Applications [Accessed 8th December 2005]

Iversen, S. (2004) Integrating Force Feedback With Virtual Reality

Lego Company (2006) Lego Timeline page [Accessed 30th April 2006]

ParallelGraphics (2004) VRMLPad [Accessed 8th December 2005]

Pressman, R. (2000) Software Engineering: A Practitioner's Approach

PenZilla.net (2006) PenZilla.net's Python Tutorial [Accessed 30th April 2006]

ReachIn Technologies (2002) ReachIn API 3.2 Programmers Guide

ReachIn Technologies (2006) ReachIn API Getting Started Guide [Accessed 8th December 2005]

ReachIn Technologies (2006) ReachIn Technologies Homepage [Accessed 28th April 2006]

SensAble Technologies (2006) SensAble Technologies Homepage [Accessed 20th December 2005]

Technica (2005) Technica Homepage [Accessed 30th April 2006]

Web3D Consortium (2006) Web3D Consortium Homepage [Accessed 1st May 2006]

Vapour Technology (2006) Floppy's Web3D Guide [Accessed 1st May 2006]

Wikipedia (2005) Wikipedia encyclopedia [Accessed 1st May 2005]

Wikipedia (2005) Wikipedia Lego page [Accessed 1st May 2006]

Wikipedia (2005) Wikipedia Waterfall Method information [Accessed 30th April 2006]

Wikipedia (2005) Wikipedia Iterative Method information [Accessed 30th April 2006]

Appendix A: Project Reflection

This project has been a learning experience for me and has taught me valuable lessons, both technical and personal, that will serve me well in my coming years in the industry. My background in this area upon starting this project back in September had been restricted to what had been covered in the Graphics and HCI modules I took within the school and what I had come across in the general Internet experiences of a Computing student (such as researching a little into any new technologies I came across on websites and through word of mouth amongst my colleagues). This background, along with my passionate interest in computer games and the 3D worlds in which they occur, equipped me with the drive to choose a graphical project, namely this one. Following is a list of some of the lessons I feel I have learned throughout this process, along with opinions on how they have benefited me or could have been avoided if the process were to be repeated in the future by myself or other students.

Personal issues

The factor most detrimental to my progress in the project was a major personal issue which created an extremely stressful environment for me to work in. Concurrent with my university work was an ongoing paternity claim which seriously disrupted my ability to concentrate and led to a loss of focus. This was eventually resolved successfully, and since that date I have been trying to catch up the work I couldn't do earlier in the process, but it has left me short of time towards the end.

Timekeeping

One of the major problems that plagued me throughout the project was my poor timekeeping. This, along with the previous point, made this project far more difficult for me than it should have been. The groundwork (research and initial plan) should be completed at your first convenience, as everything else becomes easier if you know what you are doing and why. Because I left these late, I was at a disadvantage as to what to do and why when I actually did start giving the project 100% after the paternity case was resolved.

Report Write-up

It would be (and was for me) tempting for students to leave the report write-up until the end of the process. It does leave time to focus more single-mindedly on the implementation if you are running late on your schedule, but this is actually a false gain. If you keep up with the write-up of each section as you complete it, the contents will be fresh in your mind, your supervisor can comment on your work, and the final write-up will take far less time.

Enjoyment

As mentioned in the first point of this section, if you are enjoying working on the project then the work will come a lot more easily to you. I found this was the case and would happily carry on the application side after the project has been submitted, but the little time I had left for the implementation meant that the project's potential was not realised.

All in all, despite the myriad problems that have affected me and my work through this process, I have found it quite enjoyable. If I had the opportunity to repeat the project I would address the way I managed myself and (personal issues allowing) give the project 100% from the very start, giving it the time it deserved. It has been a positive experience working in the research area of haptics and using hardware I would not usually have access to. I would like to hope that future students will find this work useful and be able to use it as a base for their own work in the future.

Appendix B: Lego Brick Iterations

All images on this page are used in accordance with the Lego Company's fair play policy.

The Lego brick did not always look the way it does now, although it has remained almost the same for 48 years. The design has been through a number of iterations, which are detailed below.

Automatic Binding Brick (1949)
This was the first type of Lego brick to be manufactured on a large scale and can be seen on the right. It was created by vacuum forming a material called cellulose acetate, which was used in all of the early bricks.

Automatic Binding Brick ( )
The main difference between these bricks and the earlier version is the logo printed on the underside of 2x2 and larger bricks.

Lego Mursten ( )
The change in this version was that the logo was in a different style and only appeared on bricks of size 2x4 and larger.

Lego System i Leg ( )
This was the first iteration of the brick that had the company logo on each of the studs. The interior was also indented to mark the position of the studs on the top.

Lego Brick (1958 to present)
This version presented the most changes to the standard design, but it has remained the standard to this day. The slits for windows that had been present in all previous bricks were removed, and central hollow tubes were added to make the connections sturdier.

Appendix C: Timesheet

Appendix D: ReachIn API setup instructions

To get VRML files working with the ReachIn API, follow the steps laid out below:

1. Create a directory, e.g. C:\Reachin\API\resources\test

2. Create an index.urn file in that directory, e.g.:

   #URN1.0
   test +/test.wrl
   urn:inet:reachin.se:/test/* +/*

   where test is the name of the file in the first line and the directory in the second.

3. Edit the index.urn in C:\Reachin\API\resources to link the file, e.g. add this line:

   urn:inet:reachin.se:/* +/*

4. Run with the file name at a command prompt, e.g.: ReachinLoad test

Appendix E: User Evaluation Tasks

Build a Tower
This task required the user to select bricks of a certain size to create a tower. It aimed to allow the users to become familiar with the mechanics of the application through a simple task.

Build a Colour Co-ordinated Stack
This task was a simple extension of the first one, with the user creating stacks of similarly coloured bricks.

Build a Shape Co-ordinated Stack
This was another task for familiarising the user with the program. Stacks of similarly shaped bricks were to be produced.

Build a Neat Wall
This task required users to select a number of bricks (of any size or colour) and build a wall of greater length than any single brick.

Build an Archway
This task required the user to produce two small towers and then join them with a long brick on top.

Build a Pyramid
This task required users to use a large number of bricks to produce a pyramid. It was meant to be quite taxing and to require feeling tolerances between multiple bricks.
