Improving orientation and mobility skills through virtual environments for people who are blind: past research and future potential


O Lahav
School of Education, Tel-Aviv University, P.O. Box 39040, Tel-Aviv, ISRAEL
lahavo@post.tau.ac.il, muse.tau.ac.il/orly/

ABSTRACT

This paper describes and examines 21 virtual environments developed specifically to support people who are blind in collecting spatial information before arrival at a new location, and to help people who are newly blind practice orientation and mobility skills during rehabilitation. The paper highlights weaknesses and strengths of virtual environments that have been developed in the past 15 years as orientation and mobility aids for people who are blind. These results have the potential to influence future research and development of a new orientation and mobility aid that could enhance navigation abilities.

1. INTRODUCTION

A basic task such as navigation requires a coordinated combination of sensory and cognitive skills. Unfortunately, people who are blind face great difficulties in performing such tasks. Research on orientation and mobility (O&M) skills of people who are blind in known and unknown spaces (Passini and Proulx, 1988; Ungar et al., 1996) indicates that support for the acquisition of spatial mapping and orientation skills should be supplied at two main levels: perceptual and conceptual. In this paper we use the term O&M to refer to the field dealing with systematic techniques by which blind persons orient themselves to their environment and move about independently (Blasch et al., 1997). At the perceptual level, information perceived via other senses should compensate for the deficiency in the visual channel. Thus, the haptic, audio, and olfactory channels become powerful suppliers of information about unknown environments. At the conceptual level, the focus is on supporting the development of appropriate strategies for efficient mapping of the space and the generation of navigation paths. According to Jacobson (1993), people who are blind tend to explore an indoor environment through a perimeter-recognition strategy, followed by a grid-scanning strategy.

Over the years, secondary O&M aids have been developed to help people who are blind explore real spaces. There are currently more than 146 electronic O&M systems and devices (Roentgen et al., 2008). These secondary aids are not a replacement for primary aids such as the long cane and the dog guide. We can divide these aids into two groups: (i) preplanning aids provide the user with information before arrival in an environment, for example verbal descriptions, tactile maps, physical models, digital audio, and tactile screens; and (ii) in-situ aids provide the user with information about the environment while in the space, for example obstacle detectors, tactile vision substitution systems, sensors embedded in the environment, and Global Positioning Systems (GPS). There are a number of limitations to the use of these preplanning and in-situ aids. For example, the limited dimensions of tactile maps and models may result in poor resolution of the provided spatial information; they are difficult to manufacture and to keep up to date; and they are rarely available. Because of these limitations, people who are blind are less likely to use preplanning aids in everyday life. The major limitation of the in-situ aids is that the user must gather the spatial information while in the explored space. There is also a safety issue, since the in-situ aids are based mostly on auditory feedback, which in the real space can reduce users' attention and isolate them from the surrounding space.

Using virtual environments (VEs) has the potential to improve the abilities of people with sensorial, physical, mental, and learning disabilities (Schultheis and Rizzo, 2001; Standen et al., 2001). Interaction in the VE by special-needs populations presents both benefits and limitations. The benefits of the VE mainly include the user's independent interaction and activity in the VE. Users receive immediate feedback suited to their sensory and cognitive abilities. The VE allows the user to practice without fear, time limitations, or the need for the participation of a professional. In addition, VE technology allows the professional to manage the amount of information and sensorial stimuli that users receive during their interaction within the VE. These unique capabilities of the VE technology fulfill the need to design a flexible and adaptive learning or rehabilitation program for each client according to his or her special needs and abilities. Moreover, VE technology can assist professionals in gathering information about their clients' interactions, which can help in designing future learning and rehabilitation programs. On the other hand, the VE has some limitations. The VE is not a replica of, or a replacement for, real-space interactions and activities. Furthermore, most rehabilitation centers and schools cannot afford these expensive technologies. Additionally, some systems under development are still too heavy, bulky, or complicated for use outside the laboratory environment.

Technologically advanced virtual devices enable individuals who are blind to learn by using haptic and audio feedback to detect artificial representations of reality. The most recent generations of haptic devices transmit feeling through direct contact with the virtual object (e.g., SensAble Phantom Desktop, Immersion Corp.'s CyberForce, Novint Falcon, and Nintendo's Wii). Stemming from the development of these devices, applications have been researched and developed especially for people who are blind, including texture identification and shape recognition (Semwal and Evans-Kamp, 2000; Sjöström and Rassmus-Gröhn, 1999), mathematical learning environments (Karshmer and Bledsoe, 2002; Yu et al., 2001; Van Scoy et al., 2000; Van Scoy et al., 2005), and acquisition of spatial information.

This paper describes and examines VEs that have been developed to enable people who are blind to improve their O&M skills. These VEs fall mainly into two groups: (i) systems that support the acquisition of a cognitive map (Evett et al., 2009; González-Mora, 2003; Iglesias et al., 2004; Kurniawan et al., 2004; Lahav and Mioduser, 2004; Lahav et al., 2011; Lécuyer et al., 2003; Max and González, 1997; Merabet and Sánchez, 2009; Ohuchi et al., 2006; Pokluda and Sochor, 2003; Sánchez and Lumbreras, 2000; Simonnet et al., 2010; Torres-Gil et al., 2010; Zelek et al., 2003); and (ii) systems that are used as O&M rehabilitation aids (D'Atri et al., 2007; González-Mora et al., 2006; Inman et al., 2000; Lahav et al., 2011; Lécuyer et al., 2003; Max and González, 1997; Seki and Ito, 2003; Seki and Sato, 2011; Tzovaras et al., 2004).

2. METHOD

2.1 Sample Selection

This study analyzed 21 peer-reviewed papers selected on the basis of research topic: VEs for people who are blind, with O&M as the subject matter. The first group of papers was found through search engines for scientific journals and conferences; other papers were selected through snowball sampling, using bibliography items to find further papers. No papers were excluded on the basis of methodological or result quality. To assess the validity of the database, three evaluators (the researcher and two graduate students) analyzed all the papers. Each paper was analyzed twice. Each of the two graduate-student evaluators received 11 papers, randomly selected from our list, to be characterized according to the variables. To maximize the common framework of analysis, the graduate-student evaluators and the researcher met several times to discuss the variables and experimentally apply them to a number of papers. The author and the two evaluators coded all 34 questions. Interjudge reliability was 97.4%, and the coding was therefore regarded as valid.
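For readers who want a concrete picture of the reliability check described above, the percent-agreement figure can be computed as in the following minimal Python sketch; the coding values and field names here are hypothetical and are not the study's actual coding sheets.

```python
# Sketch: percent agreement between two coders on the same set of coded questions.
# The codes below are invented for illustration; they are not the study's data.

def percent_agreement(coder_a, coder_b):
    """Return the share (in %) of items on which both coders gave the same code."""
    assert len(coder_a) == len(coder_b), "both coders must code every item"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# Hypothetical codes assigned by two evaluators to one paper.
coder_a = ["audio", "prototype", "single", "local", "indoor", "simple"]
coder_b = ["audio", "prototype", "single", "remote", "indoor", "simple"]

print(f"Interjudge agreement: {percent_agreement(coder_a, coder_b):.1f}%")  # 83.3%
```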

3 device, and body movement), virtual object type (static, dynamic and static and dynamic), operation of the virtual object (rotation), and allowing scaling (increase or decrease object or area size). Research Dimension. This dimension included five categories: (i) Research type included three variables: clinic research; type of research (preliminary and usability), and research goal (acquire cognitive map, O&M rehabilitation trainee). (ii) Participant category included four variables: participants visual ability, number of participants, age, and gender. (iii) Target space category included three variables: VE representing real space, space complexity (simple and complex), and space location (indoor, outdoor). (iv) Research task category also included four variables: length of exposure to VE, type of exploration, construction of cognitive map after exploring VE, and orientation tasks in the real space. (v) Data collection this category included only one variable: whether the developed system included a user log. 2.3 Collecting Data Instrument For the collection of the data we used a protocol research that included all the research categories and variables described above. 2.4 Procedure This study included three stages. At the first stage the researcher collected the peer review target papers using academic search engines; other papers were selected through snowball sampling. In the second stage a protocol research was developed, which included all the research categories and variables. During the third stage the researcher and two graduate students analyzed each paper twice according to the research protocol. 2.5 Data Analysis To evaluate the research papers we used spreadsheet (Excel) and SPSS software mainly for cross-tab analyzing. 3. RESULTS The research results are described below along the three research data dimensions. 3.1 Descriptive Information Dimension The first paper was published in 1997 (Max and González, 1997). Most of the researchers were from academic institutions (82%); only 43% of the groups included interdisciplinary researchers, such as technology disciplines (e.g., computer science, engineering, and industrial science), social sciences (e.g., education, psychology, and rehabilitation), and from the health sciences (e.g., medicine, neuropsychobiology, and physical therapy). Most of the paper authors were from the EU research community (67%). Worldwide, governments are the major funders (62%) with only 10% of funding from private industrial companies. 3.2 System Dimension Most of the research groups developed software and used shelf hardware (77%). All VEs were developed through the prototype stage and were targeted to single users in a local mode. The most frequent system modality was auditory (53%); 43% of the VEs were multimodal (audio and haptic). Per examination of input and output user devices, users operated one or more devices, e.g., tracking system (48%), joystick (15%), game controller (15%), Phantom (19%), keyboard (29%), and head-mounted display (15%). Ten VEs integrated haptic feedback and used one or more types of haptic feedback. Ninety-five percent of the VEs included audio feedback, 40% integrated a surrounding audio system, 25% used a mono system, and only 15% included a stereo system. In type of audio feedback, 85% used sound localization, 40% echolocation and obstacle perception, 20% user footsteps, and 15% oral virtual guide. 
3. RESULTS

The research results are described below along the three research dimensions.

3.1 Descriptive Information Dimension

The first paper was published in 1997 (Max and González, 1997). Most of the researchers were from academic institutions (82%); only 43% of the groups included interdisciplinary researchers, drawn from technology disciplines (e.g., computer science, engineering, and industrial science), the social sciences (e.g., education, psychology, and rehabilitation), and the health sciences (e.g., medicine, neuropsychobiology, and physical therapy). Most of the paper authors were from the EU research community (67%). Worldwide, governments are the major funders (62%), with only 10% of funding coming from private industrial companies.

3.2 System Dimension

Most of the research groups developed software and used off-the-shelf hardware (77%). All the VEs were developed to the prototype stage and were targeted at single users in a local mode. The most frequent system modality was auditory (53%); 43% of the VEs were multimodal (audio and haptic). Examination of the input and output devices showed that users operated one or more devices, e.g., tracking system (48%), joystick (15%), game controller (15%), Phantom (19%), keyboard (29%), and head-mounted display (15%). Ten VEs integrated haptic feedback and used one or more types of haptic feedback. Ninety-five percent of the VEs included audio feedback: 40% integrated a surround audio system, 25% used a mono system, and only 15% included a stereo system. Regarding the type of audio feedback, 85% used sound localization, 40% echolocation and obstacle perception, 20% user footsteps, and 15% an oral virtual guide. The interaction-type analysis shows that most of the virtual components were static (91%); very few VEs allowed the users to manipulate the VE's objects or its space.

3.3 Research Dimension

Most papers included clinical research (82%); 67% reported preliminary research and 29% described usability experiments. Seventy-two percent of the studies included people who were congenitally blind or late blind, while 24% of the studies included sighted participants who were asked to wear blindfolds during the experiments. Less than half of the examined papers (43%) included fewer than ten participants. Sixty-seven percent of the research participants were adults. The VEs represented real spaces (67%), simple spaces (67%), and indoor areas (82%). Most of the simple spaces were represented in the auditory-modality systems, unlike the multimodal VEs, which represented mainly complex spaces.

The research results confirm the potential and the effectiveness of VEs as O&M aids. In all the clinical research, participants were asked to explore the new space by using the VE systems. These results show that most of the participants (60%-100%) explored the VE successfully. Some of the VE systems used a haptic device, such as a virtual cane (D'Atri et al., 2007; Lahav and Mioduser, 2004; Lahav et al., 2011; Lécuyer et al., 2003; Pokluda and Sochor, 2003; Tzovaras et al., 2004). These research participants reported that the different virtual canes were useful both for active exploration and for passive guidance. On the other hand, some of the participants reported that they disliked being moved passively by the virtual cane (Pokluda and Sochor, 2003). Only one system included a fly mode, and its users were able to determine height by the height of the directional beacons (Max and González, 1997). Most of the researchers noted that the avatar's speed of motion in the VE needed to be adjusted to meet individual needs. Furthermore, Seki and Sato (2011) found that the stress pulse ratio of the virtual-training group improved in terms of walking stress, as it did for the group trained in the real space. They suggested that the VE was perceived by the user as a safe training environment and thus could reduce the stress experienced by the novice trainee, as opposed to the stress experienced when training in the real space. In addition, Ohuchi et al. (2006) found that physically turning right or left in a multimodal VE caused participants to become disoriented. The multimodal systems mostly focused on acquiring a cognitive map. Accurate spatial descriptions of the explored spaces were given after exploring the VE (Evett et al., 2009; Lahav and Mioduser, 2004; Lahav et al., 2011; Max and González, 1997; Ohuchi et al., 2006; Pokluda and Sochor, 2003). Participants were able to simulate environment size differences successfully (Kurniawan et al., 2004; Tzovaras et al., 2004). Similar results were found among adults and children who were totally blind or had residual vision, but different results were found among children with residual vision and medium cognitive achievement, who were unable to create a spatial cognitive map (Sánchez and Lumbreras, 2000). In the real space, most of the participants (70%-100%) were able to transfer and apply spatial information that was acquired during their VE exploration (Evett et al., 2009; Lahav et al., 2011).

4. CONCLUSIONS

In the past 15 years, 21 VEs have been researched and developed for the use of people who are blind. Each research group designed and developed a unique solution, either an O&M preplanning system to help people who are blind gather new spatial information or an O&M rehabilitation simulator. The encouraging research results have important implications for the continuation of research and development. Hopefully, these promising results will influence future research and development focusing on O&M skills and cognitive spatial behavior in the VE.

From the implementation side, the use of affordable VEs as an O&M aid can directly influence users' quality of daily life, including professional education, employment, social life, and the rehabilitation of people who are newly blind. We hope that this paper will expand awareness of the use of the VE as an O&M aid among research and development groups, users, rehabilitation services, and other public service providers. Unfortunately, despite the encouraging results, these VEs are not available today outside research labs. The research results showed that some of the VEs ask the user to operate several devices at the same time, which can affect the user's ability to work independently or increase his or her cognitive load in gathering and analyzing extensive information. Future applications will need to maintain a balance between user-friendly systems and audio and haptic representations. Further research is needed to continue the work of Simonnet et al. (2010) and to examine if and how the VE's spatial exploration methods, allocentric or geocentric representations, influence the user's spatial model. This topic has been examined less often and might influence the user's ultimate ability and outcome in using a VE. Additionally, research must proceed to examine the real-life scenarios in which this type of O&M aid is most needed, such as outdoor and complex spaces.

In the meantime, handheld device technologies are increasingly being used by people who are blind. Until three years ago, users who are blind carried a variety of devices, including a cell phone, GPS, note taker, color identifier, drug label reader, and music or audio book player. Today one handheld device offers all of these technologies and more. Two years ago, Google announced a new Android application called Intersection Explorer (Google Co., 2010), a preplanning application that allows people who are blind to explore the layout of streets on Google Maps by using touch to move along a street and to receive auditory directions. Tactile handheld devices have been developed (Fukushima and Kajimoto, 2011; Youngseong and Eunsol, 2010), which allow users who are blind to gather tactile feedback on the back of the handheld device.

Encouraged by these research results, we suggest integrating an O&M aid application based on multimodal interfaces into a handheld device. The handheld device's screen will fit the user's palm, enabling collection of all the tactile information. This unique application will allow users to explore a space in advance, preplan a new path, install landmarks, apply these landmarks through GPS in the real space, share this information with multiple users, and use different spatial layers through GPS (such as the user's landmarks, public transportation, and road construction). These and other new technologies hold important potential to improve the quality of life of people who are blind.
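To make the layered-landmark idea above concrete, the following Python sketch shows one possible, entirely hypothetical, way such records could be represented and filtered; it is not part of any system described in this paper.

```python
# Purely illustrative sketch: landmarks tagged with a GPS position and a spatial layer,
# so a preplanning application could filter what is announced in the real space.
from dataclasses import dataclass

@dataclass
class Landmark:
    name: str
    lat: float   # GPS latitude
    lon: float   # GPS longitude
    layer: str   # e.g. "user", "public_transportation", "road_construction"

landmarks = [
    Landmark("Main entrance", 32.0771, 34.7818, "user"),
    Landmark("Bus stop 18", 32.0774, 34.7822, "public_transportation"),
    Landmark("Sidewalk work", 32.0769, 34.7815, "road_construction"),
]

# Announce only the layers the user has switched on.
active_layers = {"user", "public_transportation"}
for lm in landmarks:
    if lm.layer in active_layers:
        print(f"{lm.name} ({lm.layer})")
```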
Acknowledgements: This research was supported by a grant from the European Commission, Marie Curie International Reintegration Grants (Grant No. FP7-PEOPLE IRG). I thank the two graduate students, H. Gedalevitz and I. Milman, who helped to evaluate the research papers.

5. REFERENCES

B B Blasch, W R Wiener and R L Welsh (1997), Foundations of Orientation and Mobility, American Foundation for the Blind, New York.
E D'Atri, C M Medaglia, A Serbanati, U B Ceipidor, E Panizzi and A D'Atri (2007), A system to aid blind people in the mobility: A usability test and its results, Proc. Second International Conference on Systems (ICONS'07), Sainte-Luce.
L Evett, S Battersby, A Ridley and D Brown (2009), An interface to virtual environments for people who are blind using Wii technology: Mental models and navigation, J of Assistive Technologies, 3, 2.
S Fukushima and H Kajimoto (2011), Palm touch panel: Providing touch sensation through the device, Proc. ITS '11, the ACM International Conference on Interactive Tabletops and Surfaces, New York.
J L González-Mora (2003), VASIII: Development of an interactive device based on virtual acoustic reality oriented to blind rehabilitation, Jornada de Seguimiento de Proyectos en Tecnologías Informáticas.
J L González-Mora, A F Rodriguez-Hernández, E Burunat, F Martin and M A Castellano (2006), Seeing the world by hearing: Virtual acoustic space (VAS), a new space perception system for blind people, Proc. IEEE International Conference on Information and Communication Technologies, Damascus.
Google Co. (2010), Google Intersection Explorer application. Retrieved from
R Iglesias, S Casado, T Gutierrez, J L Barbero, C A Avizzano, S Marcheschi and M Bergamasco (2004), Computer graphics access for blind people through a haptic and audio virtual environment, Proc. 3rd IEEE International Workshop on Haptic, Audio and Visual Environments and Their Applications (HAVE 2004), Ottawa.
D P Inman, K Loge and A Cram (2000), Teaching orientation and mobility skills to blind children using computer generated 3-D sound environments, Proc. ICAD 2000, Atlanta.
W H Jacobson (1993), The Art and Science of Teaching Orientation and Mobility to Persons with Visual Impairments, American Foundation for the Blind, New York.
A I Karshmer and C Bledsoe (2002), Access to mathematics by blind students: Introduction to the special thematic session, Proc. International Conference on Computers Helping People with Special Needs (ICCHP), Linz.
S H Kurniawan, A Sporka, V Nemec and P Slavik (2004), Design and user evaluation of a spatial audio system for blind users, Proc. 5th International Conference on Disability, Virtual Reality and Associated Technologies, Oxford.
O Lahav and D Mioduser (2004), Exploration of unknown spaces by people who are blind, using a multisensory virtual environment (MVE), Journal of Special Education Technology, 19, 3.
O Lahav, D Schloerb, S Kummar and M A Srinivasan (2011), A virtual map to support people who are blind to navigate through real spaces, Journal of Special Education Technology, 26, 4.
A Lécuyer, P Mobuchon, C Mégard, J Perret, C Andriot and J P Colinot (2003), HOMERE: A multimodal system for visually impaired people to explore virtual environments, Proc. IEEE Virtual Reality 2003 (VR'03), Los Angeles.
M L Max and J R González (1997), Blind persons navigate in virtual reality (VR); hearing and feeling communicates reality, In Medicine Meets Virtual Reality (K S Morgan et al., Eds), IOS Press, San Diego.
L B Merabet and J Sánchez (2009), Audio-based navigation using virtual environments: Combining technology and neuroscience, AER Journal: Research and Practice in Visual Impairment and Blindness, 2, 3.
M Ohuchi, Y Iwaya, Y Suzuki and T Munekata (2006), Cognitive-map formation of blind persons in virtual sound environment, Proc. 12th International Conference on Auditory Display, London.
R Passini and G Proulx (1988), Wayfinding without vision: An experiment with congenitally blind people, Environment and Behavior, 20.
L Pokluda and J Sochor (2003), Spatial haptic orientation for visually impaired people, Proc. Eurographics Ireland Chapter Workshop Series, University of Ulster, Coleraine.
U R Roentgen, G J Gelderblom, M Soede and L P de Witte (2008), Inventory of electronic mobility aids for persons with visual impairments: A literature review, J Visual Impairment and Blindness, 102, 11.
J Sánchez and M Lumbreras (2000), Usability and cognitive impact of the interaction with 3D virtual interactive acoustic environments by blind children, Proc. 3rd International Conference on Disability, Virtual Reality and Associated Technologies, Alghero.
M T Schultheis and A A Rizzo (2001), The application of virtual reality technology for rehabilitation, Rehabilitation Psychology, 46, 3.
Y Seki and K Ito (2003), Study on acoustical training system of obstacle perception for the blind, Proc. Assistive Technology Research Series 11, Assistive Technology: Shaping the Future, Dublin.
Y Seki and T Sato (2011), A training system of orientation and mobility for blind people using acoustic virtual reality, IEEE Transactions on Neural Systems and Rehabilitation Engineering, 19, 1.
S K Semwal and D L Evans-Kamp (2000), Virtual environments for visually impaired, Proc. 2nd International Conference on Virtual Worlds, Paris.
M Simonnet, S Vieilledent, D R Jacobson and J Tisseau (2010), The assessment of non visual maritime cognitive maps of a blind sailor: A case study, J of Maps, 2010.
C Sjöström and K Rassmus-Gröhn (1999), The sense of touch provides new computer interaction techniques for disabled people, Technology and Disability, 10.
P J Standen, D J Brown and J J Cromby (2001), The effective use of virtual environments in the education and rehabilitation of students with intellectual disabilities, British J of Education Technology, 32, 3.
M A Torres-Gil, O Casanova-Gonzalez and J L Gonzalez-Mora (2010), Applications of virtual reality for visually impaired people, WSEAS Transactions on Computers, 2, 9.
D Tzovaras, G Nikolakis, G Fergadis, S Malasiotis and M Stavrakis (2004), Design and implementation of haptic virtual environments for the training of the visually impaired, IEEE Transactions on Neural Systems and Rehabilitation Engineering, 12, 2.
S Ungar, M Blades and S Spencer (1996), The construction of cognitive maps by children with visual impairments, In The Construction of Cognitive Maps (J Portugali, Ed), Kluwer Academic, Netherlands.
F Van Scoy, T Kawai, M Darrah and C Rash (2000), Haptic display of mathematical functions for teaching mathematics to students with vision disabilities: Design and proof of concept, Proc. Haptic Human-Computer Interface: First International Workshop, Glasgow.
F Van Scoy, D McLaughlin and A Fullmer (2005), Auditory augmentation of haptic graphs: Developing a graphic tool for teaching precalculus skills to blind students, Proc. ICAD 05, Eleventh Meeting of the International Conference on Auditory Display, Limerick.
K Youngseong and Y Eunsol (2010), Voim. Retrieved from:
W Yu, R Ramloll and S A Brewster (2001), Haptic graphs for blind computer users, Proc. Haptic Human-Computer Interaction: First International Workshop, Lecture Notes in Computer Science, Glasgow.
J S Zelek, S Bromley, D Asmar and D Thompson (2003), A haptic glove as a tactile-vision sensory substitution for wayfinding, J Visual Impairment and Blindness, 97, 10.


ENHANCED HUMAN-AGENT INTERACTION: AUGMENTING INTERACTION MODELS WITH EMBODIED AGENTS BY SERAFIN BENTO. MASTER OF SCIENCE in INFORMATION SYSTEMS BY SERAFIN BENTO MASTER OF SCIENCE in INFORMATION SYSTEMS Edmonton, Alberta September, 2015 ABSTRACT The popularity of software agents demands for more comprehensive HAI design processes. The outcome of

More information

Immersive Training. David Lafferty President of Scientific Technical Services And ARC Associate

Immersive Training. David Lafferty President of Scientific Technical Services And ARC Associate Immersive Training David Lafferty President of Scientific Technical Services And ARC Associate Current Situation Great Shift Change Drive The Need For Training Conventional Training Methods Are Expensive

More information

Glasgow eprints Service

Glasgow eprints Service Yu, W. and Kangas, K. (2003) Web-based haptic applications for blind people to create virtual graphs. In, 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 22-23 March

More information

Haptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces

Haptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces In Usability Evaluation and Interface Design: Cognitive Engineering, Intelligent Agents and Virtual Reality (Vol. 1 of the Proceedings of the 9th International Conference on Human-Computer Interaction),

More information

Providing external memory aids in haptic visualisations for blind computer users

Providing external memory aids in haptic visualisations for blind computer users Providing external memory aids in haptic visualisations for blind computer users S A Wall 1 and S Brewster 2 Glasgow Interactive Systems Group, Department of Computing Science, University of Glasgow, 17

More information

VIRTUAL REALITY Introduction. Emil M. Petriu SITE, University of Ottawa

VIRTUAL REALITY Introduction. Emil M. Petriu SITE, University of Ottawa VIRTUAL REALITY Introduction Emil M. Petriu SITE, University of Ottawa Natural and Virtual Reality Virtual Reality Interactive Virtual Reality Virtualized Reality Augmented Reality HUMAN PERCEPTION OF

More information

Salient features make a search easy

Salient features make a search easy Chapter General discussion This thesis examined various aspects of haptic search. It consisted of three parts. In the first part, the saliency of movability and compliance were investigated. In the second

More information

Do-It-Yourself Object Identification Using Augmented Reality for Visually Impaired People

Do-It-Yourself Object Identification Using Augmented Reality for Visually Impaired People Do-It-Yourself Object Identification Using Augmented Reality for Visually Impaired People Atheer S. Al-Khalifa 1 and Hend S. Al-Khalifa 2 1 Electronic and Computer Research Institute, King Abdulaziz City

More information

Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment

Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment Helmut Schrom-Feiertag 1, Christoph Schinko 2, Volker Settgast 3, and Stefan Seer 1 1 Austrian

More information

Job Description. Commitment: Must be available to work full-time hours, M-F for weeks beginning Summer of 2018.

Job Description. Commitment: Must be available to work full-time hours, M-F for weeks beginning Summer of 2018. Research Intern Director of Research We are seeking a summer intern to support the team to develop prototype 3D sensing systems based on state-of-the-art sensing technologies along with computer vision

More information

SPATIAL information is not fully available to visually

SPATIAL information is not fully available to visually 170 IEEE TRANSACTIONS ON LEARNING TECHNOLOGIES, VOL. 5, NO. 2, APRIL-JUNE 2012 Spatial Learning Using Locomotion Interface to Virtual Environment Kanubhai K. Patel and Sanjaykumar Vij Abstract The inability

More information

Augmented Home. Integrating a Virtual World Game in a Physical Environment. Serge Offermans and Jun Hu

Augmented Home. Integrating a Virtual World Game in a Physical Environment. Serge Offermans and Jun Hu Augmented Home Integrating a Virtual World Game in a Physical Environment Serge Offermans and Jun Hu Eindhoven University of Technology Department of Industrial Design The Netherlands {s.a.m.offermans,j.hu}@tue.nl

More information

Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp

Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp. 105-124. http://eprints.gla.ac.uk/3273/ Glasgow eprints Service http://eprints.gla.ac.uk

More information

Specification of symbols used on Audio-Tactile Maps for individuals with blindness

Specification of symbols used on Audio-Tactile Maps for individuals with blindness Specification of symbols used on Audio-Tactile Maps for individuals with blindness D2.3 Production of AT-Maps Prepare by : Contributors Konstantinos Charitakis All partners Work Package : No 2 Email: Form:

More information