Haptic presentation of 3D objects in virtual reality for the visually disabled
M Moranski, A Materka
Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, Lodz, POLAND
marcin.moranski@p.lodz.pl, andrzej.materka@p.lodz.pl

ABSTRACT

This paper presents initial research on the haptic perception of 3D objects in a virtual reality environment, aimed at helping visually disabled persons learn new routes and identify obstacles. The study spans a number of fields, from the highly technical, such as scene segmentation and obstacle detection algorithms, to psychological aspects, such as the effectiveness of utilizing haptic information. The authors constructed a prototype system for the tactile presentation of real objects in virtual reality.

1. INTRODUCTION

Research concerning the application of haptic force feedback devices for the blind is worth further development, as sight cannot be substituted by the auditory information channel alone. Information about the living environment can in this case be complemented by the sense of touch. Furthermore, designing a tactile system dedicated to people with visual disabilities would allow them to access information from a virtual 3D world simply by touching it. Such a system should consist of three major elements: a camera (which provides information about the distance to obstacles in a scene, i.e. a depth map), a computer (on which the depth map is segmented and the virtual scene is created) and a haptic device (the interface for the tactile presentation of the acquired scene).

A haptic device is the interface used for communication between a human and a virtual reality. Thanks to the force feedback it produces, a user can feel the shape, density and texture of 3D objects created in a virtual world. The touching experience when using this interface is quite close to reality. Haptic perception incorporates both kinaesthetic sensing (i.e. of the position and movement of joints and limbs) and tactile sensing (i.e.
through the skin) (Loomis and Lederman, 1986). The most popular haptic devices available are the Phantom (SensAble Corp.) and a range of touch manipulators from Force Dimension (Force Dimension Corp.). These systems define one contact point at a time between the observer and the virtual object. They do not stimulate the cutaneous receptors responding to temperature, pressure and pain. The mentioned devices have great potential, and they have been considered for use by the blind to familiarize themselves with obstacles inside buildings and to learn new routes and shapes. However, their high cost limits their availability to the average user. Since haptic force feedback technology entered the world of computer games, a new low-cost device, the Novint Falcon (Novint Corp.), has appeared on the market. Although the device has only 3 degrees of freedom, compared to the 6 of the Phantom, this is enough for 3D object presentation. The aim of this experimental study is to present a prototype system which allows real scenes to appear automatically in a virtual reality (by means of a time-of-flight 3D camera) and to be accessed in haptic form using the Novint Falcon.

2. RELATED WORK

Research concerning the application of stationary haptic force feedback devices for navigating the blind can be divided into two categories: building virtual maps, and creating simulators where real obstacles and objects are presented virtually. Both are made for learning new routes, and they seem to have potential as tools which the blind can use to acquire knowledge about a place before an intended first visit. The majority of projects focus on checking whether advanced, expensive devices with a proven quality of performance can be used for such purposes. In (Jansson et al., 1999), two independent studies investigating problems concerning the use of haptic virtual environments by blind people are described. Two
devices, a PHANToM 1.5 and an Impulse Engine 3000, were used to render virtual textures and 3D objects. The experiments proved that objects rendered by these devices can be effectively perceived by both blind and blindfolded sighted observers. However, the investigated scenarios were very simple. In another publication (Jansson, 1998), the usefulness of a haptic force feedback device (the PHANToM) for presenting information without visual guidance was also confirmed. The author tried to answer the following questions: how well do blindfolded observers' perceptions of the roughness of real and virtual sandpapers agree, and can the 3D forms of virtual objects be judged accurately, with short exploration times, down to a size of 5 mm? Blindfolded sighted observers judged the roughness of real and virtual sandpapers to be nearly the same. The presented experiments were concluded with the statement that a haptic device can present useful information without vision. Considerations on using tactile maps for the blind have also been published. The paper (Kostopoulos et al., 2007) describes a framework for map image analysis and for the presentation of semantic information to blind users using alternative modalities (i.e. haptics and audio). The resulting haptic-audio representation of the map is used by the blind for navigation and path planning purposes. However, the available literature in the field lacks concrete findings concerning the use of the Novint Falcon game controller.

3. THE PROTOTYPE SYSTEM FOR HAPTIC PRESENTATION OF 3D SCENES

The haptic presentation system was built to enable blind people to interact by touch with real 3D objects recreated in virtual reality. The prototype consists of an SR3000 camera (Mesa Imaging AG), a laptop and a Novint Falcon haptic interface (see Fig. 1).

Figure 1. Diagram of the designed system.
The camera provides information about the distance to obstacles in a scene by calculating the time of flight of the emitted and reflected light. A 2.5D depth map is produced at the output. The camera is connected to the remote computer. Data processing on the laptop is divided into two stages: scene segmentation and virtual scene modeling (see Fig. 2).

3.1 Segmentation

On the laptop, the depth map is segmented in order to extract all obstacles from the acquired scene. This process gathers the information (i.e. the location and size of objects) that is used to create the virtual scene. First, the point cloud representing the scene is processed in order to find points corresponding to planes. The plane-finding procedure is based on normal vector estimation at each point. For a given point p0 = (x0, y0, z0) with normal vector

    n = {a, b, c}                                    (1)

the equation of the corresponding plane is given as

    a(x - x0) + b(y - y0) + c(z - z0) = 0            (2)

The input point cloud is processed as follows:
- The input point cloud is ordered in a k-d tree data structure (K-D Tree).
- A normal vector at each point is estimated from the surrounding point neighborhood. For this purpose the k nearest neighbours are found, and these are used to calculate the normal vector (Rusu, 2009).
- Points which have the same normal vectors are grouped together (a certain deviation angle between normal vectors is allowed).
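The plane equation (2) and the normal-based grouping step can be illustrated with a minimal, stdlib-only sketch. This is not the paper's implementation (which follows Rusu, 2009, with k-d tree acceleration); the function names and the brute-force grouping strategy are illustrative assumptions:

```python
import math

def plane_distance(p, p0, n):
    """Distance from point p to the plane through p0 with normal n = (a, b, c),
    i.e. |a(x - x0) + b(y - y0) + c(z - z0)| / |n|, cf. equation (2)."""
    a, b, c = n
    norm = math.sqrt(a * a + b * b + c * c)
    return abs(a * (p[0] - p0[0]) + b * (p[1] - p0[1]) + c * (p[2] - p0[2])) / norm

def group_by_normal(points, normals, max_angle_deg=10.0):
    """Group points whose estimated normals deviate from a group's reference
    normal by less than max_angle_deg (angle taken from the dot product)."""
    groups = []
    for p, n in zip(points, normals):
        placed = False
        for g in groups:
            ref = g["normal"]
            dot = sum(x * y for x, y in zip(n, ref))
            dot /= math.sqrt(sum(x * x for x in n)) * math.sqrt(sum(x * x for x in ref))
            angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
            if angle < max_angle_deg:
                g["points"].append(p)
                placed = True
                break
        if not placed:
            groups.append({"normal": n, "points": [p]})
    return groups
```

For example, points whose normals point up (a floor) and points whose normals point sideways (a wall) end up in two separate groups, each of which is then handed to the plane-fitting step.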
- A RANSAC (Random Sample Consensus) algorithm is applied to each group of points in order to find planes (a given group may include a few parallel planes, so a distance threshold between points and a plane, and a minimal number of points forming a plane, are assumed).
- The points of the found planes are filtered out of the input cloud.

Figure 2. Diagram of the designed system's working principle.

Next, a clustering algorithm (Rusu, 2009) is used in order to find the points representing objects. Two points from the point cloud belong to the same object when the distance d between them is shorter than or equal to an assumed distance threshold d_th.

3.2 Scene modeling for the tactile presentation

The Novint Falcon haptic game controller is used for the presentation of the virtual scenario. Using his or her sense of touch, the blind user accesses information about the content of the observed scene. The procedure of virtual scene modeling is as follows:
- The found planes are recreated in virtual reality (the background and the ground planes).
- The real obstacles are substituted by 3D boxes whose sizes and locations correspond to the real sizes and locations of the objects. The location of each box is given by the centroid calculated for the point cloud representing the obstacle. The size of each box corresponds to the maximum distance between the points representing the obstacle along the X and Y axes.
- The haptic and graphic rendering algorithms are applied to the created scene and the Novint Falcon device is activated.

For the tactile presentation, an open-source haptics software development platform, H3D, is used (H3D API). The procedures of segmentation and scene modeling are presented in Fig. 3.

4. EXPERIMENTS WITH THE BLIND USERS

Experiments were designed and performed in order to examine the usability of the tactile presentation of the real environment in virtual reality, using the designed prototype system with blind users (see Fig. 4).
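The clustering and box-substitution steps described above can be sketched in pure Python. This is a simplified illustration, not the system's code (which uses the PCL-style algorithms of Rusu, 2009, and H3D for rendering); the brute-force neighbour search and the helper names are assumptions:

```python
import math
from collections import deque

def euclidean_clusters(points, d_th):
    """Group points into clusters: two points belong to the same object when
    the distance d between them is <= the threshold d_th. Brute-force BFS;
    the prototype accelerates neighbour search with a k-d tree."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue = deque([seed])
        cluster = [seed]
        while queue:
            i = queue.popleft()
            for j in list(unvisited):
                if math.dist(points[i], points[j]) <= d_th:
                    unvisited.remove(j)
                    queue.append(j)
                    cluster.append(j)
        clusters.append([points[i] for i in cluster])
    return clusters

def bounding_box(cluster):
    """Substitute an obstacle cluster by a box: the location is the cluster
    centroid, the size is the maximum extent along the X and Y axes."""
    n = len(cluster)
    centroid = tuple(sum(p[k] for p in cluster) / n for k in range(3))
    size_x = max(p[0] for p in cluster) - min(p[0] for p in cluster)
    size_y = max(p[1] for p in cluster) - min(p[1] for p in cluster)
    return centroid, (size_x, size_y)
```

Each resulting (centroid, size) pair would then be rendered as a virtual box for the haptic device to present.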
4.1 Aim of research

The performed experiments had the following goals:
- Assess the feasibility of applying the tactile presentation system for the blind and visually impaired.
- Examine the usability and potential of the force feedback device, the Novint Falcon, for 3D virtual object presentation without the use of vision.
- Collect the blind participants' opinions about their requirements and preferences concerning the design of such a system, and learn its potential application areas.

4.2 Participants

The group of participants consisted of eight blind people: two women and six men. Six of them were born blind and the others lost their sight at different times during their lives. They were chosen as representatives of different educational and occupational backgrounds. They also represent different abilities of tactile perception of the surrounding environment.

Figure 3. The virtual scene modeling process: a) the 2.5D depth map of the scene, b) the segmented scene (grey: found planes, black: found obstacles), c) the reconstructed scene for the tactile presentation (the obstacles are replaced by cubes; see the text).

4.3 Procedure and evaluation

The experiments were divided into 4 stages.

Figure 4. Prototype system tested by the blind.

Stage 1 - Training phase. Participants were informed about the prototype system, and then its functionality was explored. The practice period with the device was adjusted according to the ability of each participant (15-30 minutes).
Stage 2 - Scene content recognition. Experiments were performed for scenes with different numbers of obstacles (between 2 and 5). Participants were asked to say how many obstacles were present in each scene.

Stage 3 - Distance estimation to each obstacle from a chosen point of observation. This stage was divided into two scenarios. In the first scenario, three scenes with different locations of one object were presented. In the second scenario, one scene consisting of 3 objects was presented. In both cases, participants were asked to estimate the objects' distances to a chosen point of observation (e.g. the background wall).

Stage 4 - Estimation of obstacle height. Three scenes with different numbers of objects (between 2 and 5) were presented. Users were asked to estimate the heights of the objects in each scene (the height of an object was estimated in relation to the other objects).

The objects in each scene were located on the ground, against a background wall up to 7.5 m away (the SR3000 camera measurement range). The exploration time of every scene was measured. Every participant decided when to finish exploring a given scene. After the exploration was finished, he or she was asked to describe the scene. A task was carried out successfully when the blind person correctly identified all obstacles in the presented scene.

4.4 Results

The results of the second stage are presented in Fig. 5 and Fig. 6. The outcomes of the third stage are shown in Fig. 7. The outputs of the last stage are presented in Fig. 8 and Fig. 9.

Figure 5. Results of the second stage of experiments.

Figure 6. Exploration time of a scene for the second stage (all users).
Figure 7. The outcomes of the third stage of experiments.

Figure 8. The results of estimation of object heights.

Figure 9. Exploration time of a scene for the fourth stage.

4.5 Discussion

Diverse exploration times of the scenes were measured. They depended on the tactile perception skills of the participants and on the chosen way of exploring the scene. The exploration time did not lengthen proportionally to the complexity of the scenes, because the participants learnt how to use the haptic interface efficiently in order to identify the scene's content. In the second stage, the worst result was obtained for the scene with 5 objects (two objects were identified as one, because they were located close to each other). In the third stage, two ways of estimating an obstacle's location were compared. In both cases participants were able to find the locations of objects in relation to a chosen point of observation, but in the case where all obstacles were located in one scene the process was faster (the distances could be compared directly, without switching between scenes). In the last stage, for scenes containing two or three objects, nobody had a problem to properly
estimate the heights of the objects. For the scene with five objects, three blind persons failed to correctly recognize the objects' heights.

When all the experiments were completed, the blind participants expressed their opinions about the system and its usability in real-life scenarios. They were impressed by the system's performance. In their opinion, there are several potential applications where such a system could help the blind in everyday activities. They gave many hints about improving the system. The first suggestion was to add vocal information about the 3D position of the probe (the virtual finger in the system). This would be very helpful in order not to get lost in the virtual environment. Furthermore, as the volunteers suggested, the sonification of some of the scene points or objects could also be very useful (the presented research concerns the perception of 3D objects by touch alone). Special focus is also required when creating virtual objects. They should be as similar to the real ones as possible (size, stiffness, texture, density). When the scene consists of many objects that differ in size, the smaller ones should be created specifically so that they can be noticed.

5. CONCLUSIONS

In this article, a prototype system for the tactile presentation of real objects in virtual reality was described. The system's usability was examined by blind participants. The performed experiments have shown that the system can be applied for the blind, and they emphasized the challenges that have yet to be overcome. This kind of application has to meet special requirements in order to be safe and reliable. The challenging issue is to present a real-world scenario in a virtual reality.
Many requirements need to be met, mainly: choosing the scale of the virtual objects in relation to the real ones; solving the problem of losing oneself in VR; and solving the problem of presenting scenes consisting of many objects/details (each of a different size), as the system's resolution is finite. All of the above are subjects of scientific research, in terms of both technical and psychological aspects.

Acknowledgements: This work has been supported by Ministry of Science and Higher Education of Poland research grant no. N R. The described study is carried out in cooperation with the Lodz Chapter of the Polish Society for the Blind.

6. REFERENCES

J Loomis and S Lederman (1986), Tactual perception. In Handbook of Perception and Human Performance (K Boff & J Thomas, Eds), Wiley/Interscience, New York.
Sensable Corp., last accessed: 30 June.
Force Dimension Corp., last accessed: 30 June.
Novint Corp.: home.novint.com, last accessed: 30 June.
G Jansson et al. (1999), Haptic virtual environments for blind people: exploratory experiments with two devices, International Journal of Virtual Reality, 4, 3.
G Jansson (1998), Can a haptic force feedback display provide visually impaired people with useful information about texture roughness and 3D form of virtual objects?, The International Journal of Virtual Reality, 4.
K Kostopoulos et al. (2007), Haptic access to conventional 2D maps for the visually impaired, Journal on Multimodal User Interfaces, 2, 1.
Mesa Imaging AG, last accessed: 30 June.
K-D Tree, last accessed: 30 June.
R Rusu (2009), Semantic 3D Object Maps for Everyday Manipulation in Human Living Environments, PhD Dissertation, Institut für Informatik, Technische Universität München.
H3D API, last accessed: 30 June.
More informationFlexible Active Touch Using 2.5D Display Generating Tactile and Force Sensations
This is the accepted version of the following article: ICIC Express Letters 6(12):2995-3000 January 2012, which has been published in final form at http://www.ijicic.org/el-6(12).htm Flexible Active Touch
More informationThe CHAI Libraries. F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, D. Morris L. Sentis, E. Vileshin, J. Warren, O. Khatib, K.
The CHAI Libraries F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, D. Morris L. Sentis, E. Vileshin, J. Warren, O. Khatib, K. Salisbury Computer Science Department, Stanford University, Stanford CA
More informationTEACHING HAPTIC RENDERING SONNY CHAN, STANFORD UNIVERSITY
TEACHING HAPTIC RENDERING SONNY CHAN, STANFORD UNIVERSITY MARCH 4, 2012 HAPTICS SYMPOSIUM Overview A brief introduction to CS 277 @ Stanford Core topics in haptic rendering Use of the CHAI3D framework
More informationPERFORMANCE IN A HAPTIC ENVIRONMENT ABSTRACT
PERFORMANCE IN A HAPTIC ENVIRONMENT Michael V. Doran,William Owen, and Brian Holbert University of South Alabama School of Computer and Information Sciences Mobile, Alabama 36688 (334) 460-6390 doran@cis.usouthal.edu,
More informationBeyond Visual: Shape, Haptics and Actuation in 3D UI
Beyond Visual: Shape, Haptics and Actuation in 3D UI Ivan Poupyrev Welcome, Introduction, & Roadmap 3D UIs 101 3D UIs 201 User Studies and 3D UIs Guidelines for Developing 3D UIs Video Games: 3D UIs for
More informationVR based HCI Techniques & Application. November 29, 2002
VR based HCI Techniques & Application November 29, 2002 stefan.seipel@hci.uu.se What is Virtual Reality? Coates (1992): Virtual Reality is electronic simulations of environments experienced via head mounted
More informationthese systems has increased, regardless of the environmental conditions of the systems.
Some Student November 30, 2010 CS 5317 USING A TACTILE GLOVE FOR MAINTENANCE TASKS IN HAZARDOUS OR REMOTE SITUATIONS 1. INTRODUCTION As our dependence on automated systems has increased, demand for maintenance
More informationUMI3D Unified Model for Interaction in 3D. White Paper
UMI3D Unified Model for Interaction in 3D White Paper 30/04/2018 Introduction 2 The objectives of the UMI3D project are to simplify the collaboration between multiple and potentially asymmetrical devices
More informationA Kinect-based 3D hand-gesture interface for 3D databases
A Kinect-based 3D hand-gesture interface for 3D databases Abstract. The use of natural interfaces improves significantly aspects related to human-computer interaction and consequently the productivity
More informationResearch Seminar. Stefano CARRINO fr.ch
Research Seminar Stefano CARRINO stefano.carrino@hefr.ch http://aramis.project.eia- fr.ch 26.03.2010 - based interaction Characterization Recognition Typical approach Design challenges, advantages, drawbacks
More informationDifferences in Fitts Law Task Performance Based on Environment Scaling
Differences in Fitts Law Task Performance Based on Environment Scaling Gregory S. Lee and Bhavani Thuraisingham Department of Computer Science University of Texas at Dallas 800 West Campbell Road Richardson,
More informationDevelopment of K-Touch TM Haptic API for Various Datasets
Development of K-Touch TM Haptic API for Various Datasets Beom-Chan Lee 1 Jong-Phil Kim 2 Jongeun Cha 3 Jeha Ryu 4 ABSTRACT This paper presents development of a new haptic API (Application Programming
More informationHAPTIC VIRTUAL ENVIRON- MENTS FOR BLIND PEOPLE: EXPLORATORY EXPERIMENTS WITH TWO DEVICES
8 THE INTERNATIONAL JOURNAL OF VIRTUAL REALITY Vol. 3, No. 4 HAPTIC VIRTUAL ENVIRON- MENTS FOR BLIND PEOPLE: EXPLORATORY EXPERIMENTS WITH TWO DEVICES G Jansson 1, H Petrie 2, C Colwell 2, D Kornbrot 2,
More informationMarkerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces
Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Huidong Bai The HIT Lab NZ, University of Canterbury, Christchurch, 8041 New Zealand huidong.bai@pg.canterbury.ac.nz Lei
More informationThe Impact of Unaware Perception on Bodily Interaction in Virtual Reality. Environments. Marcos Hilsenrat, Miriam Reiner
The Impact of Unaware Perception on Bodily Interaction in Virtual Reality Environments Marcos Hilsenrat, Miriam Reiner The Touchlab Technion Israel Institute of Technology Contact: marcos@tx.technion.ac.il
More informationFeelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces
Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces Katrin Wolf Telekom Innovation Laboratories TU Berlin, Germany katrin.wolf@acm.org Peter Bennett Interaction and Graphics
More informationVR for Microsurgery. Design Document. Team: May1702 Client: Dr. Ben-Shlomo Advisor: Dr. Keren Website:
VR for Microsurgery Design Document Team: May1702 Client: Dr. Ben-Shlomo Advisor: Dr. Keren Email: med-vr@iastate.edu Website: Team Members/Role: Maggie Hollander Leader Eric Edwards Communication Leader
More information3D interaction techniques in Virtual Reality Applications for Engineering Education
3D interaction techniques in Virtual Reality Applications for Engineering Education Cristian Dudulean 1, Ionel Stareţu 2 (1) Industrial Highschool Rosenau, Romania E-mail: duduleanc@yahoo.com (2) Transylvania
More informationUbiquitous Computing Summer Episode 16: HCI. Hannes Frey and Peter Sturm University of Trier. Hannes Frey and Peter Sturm, University of Trier 1
Episode 16: HCI Hannes Frey and Peter Sturm University of Trier University of Trier 1 Shrinking User Interface Small devices Narrow user interface Only few pixels graphical output No keyboard Mobility
More informationChapter 2 Introduction to Haptics 2.1 Definition of Haptics
Chapter 2 Introduction to Haptics 2.1 Definition of Haptics The word haptic originates from the Greek verb hapto to touch and therefore refers to the ability to touch and manipulate objects. The haptic
More informationEvaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface
Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface Xu Zhao Saitama University 255 Shimo-Okubo, Sakura-ku, Saitama City, Japan sheldonzhaox@is.ics.saitamau.ac.jp Takehiro Niikura The University
More informationMultisensory Virtual Environment for Supporting Blind. Persons' Acquisition of Spatial Cognitive Mapping. a Case Study I
1 Multisensory Virtual Environment for Supporting Blind Persons' Acquisition of Spatial Cognitive Mapping a Case Study I Orly Lahav & David Mioduser Tel Aviv University, School of Education Ramat-Aviv,
More informationSPIDERMAN VR. Adam Elgressy and Dmitry Vlasenko
SPIDERMAN VR Adam Elgressy and Dmitry Vlasenko Supervisors: Boaz Sternfeld and Yaron Honen Submission Date: 09/01/2019 Contents Who We Are:... 2 Abstract:... 2 Previous Work:... 3 Tangent Systems & Development
More informationHaptic Perception & Human Response to Vibrations
Sensing HAPTICS Manipulation Haptic Perception & Human Response to Vibrations Tactile Kinesthetic (position / force) Outline: 1. Neural Coding of Touch Primitives 2. Functions of Peripheral Receptors B
More informationMethods for Haptic Feedback in Teleoperated Robotic Surgery
Young Group 5 1 Methods for Haptic Feedback in Teleoperated Robotic Surgery Paper Review Jessie Young Group 5: Haptic Interface for Surgical Manipulator System March 12, 2012 Paper Selection: A. M. Okamura.
More informationArticle. Reference. A comparison of three nonvisual methods for presenting scientific graphs. ROTH, Patrick, et al.
Article A comparison of three nonvisual methods for presenting scientific graphs ROTH, Patrick, et al. Abstract This study implemented three different methods for presenting scientific graphs to visually
More informationEnabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation
Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation Yuhang Zhao1, 2, Cynthia L. Bennett1, 3, Hrvoje Benko1, Edward Cutrell1, Christian Holz1,
More informationChapter 1 - Introduction
1 "We all agree that your theory is crazy, but is it crazy enough?" Niels Bohr (1885-1962) Chapter 1 - Introduction Augmented reality (AR) is the registration of projected computer-generated images over
More informationVisuaLax: Visually Relaxing Augmented Reality Application Using Music and Visual Therapy
DOI: 10.7763/IPEDR. 2013. V63. 5 VisuaLax: Visually Relaxing Augmented Reality Application Using Music and Visual Therapy Jeremiah Francisco +, Benilda Eleonor Comendador, Angelito Concepcion Jr., Ron
More informationPROPRIOCEPTION AND FORCE FEEDBACK
PROPRIOCEPTION AND FORCE FEEDBACK Roope Raisamo and Jukka Raisamo Multimodal Interaction Research Group Tampere Unit for Computer Human Interaction Department of Computer Sciences University of Tampere,
More informationBuddy Bearings: A Person-To-Person Navigation System
Buddy Bearings: A Person-To-Person Navigation System George T Hayes School of Information University of California, Berkeley 102 South Hall Berkeley, CA 94720-4600 ghayes@ischool.berkeley.edu Dhawal Mujumdar
More informationMobile Audio Designs Monkey: A Tool for Audio Augmented Reality
Mobile Audio Designs Monkey: A Tool for Audio Augmented Reality Bruce N. Walker and Kevin Stamper Sonification Lab, School of Psychology Georgia Institute of Technology 654 Cherry Street, Atlanta, GA,
More informationAutomatic Online Haptic Graph Construction
Automatic Online Haptic Graph Construction Wai Yu, Kenneth Cheung, Stephen Brewster Glasgow Interactive Systems Group, Department of Computing Science University of Glasgow, Glasgow, UK {rayu, stephen}@dcs.gla.ac.uk
More informationA Multimodal Locomotion User Interface for Immersive Geospatial Information Systems
F. Steinicke, G. Bruder, H. Frenz 289 A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems Frank Steinicke 1, Gerd Bruder 1, Harald Frenz 2 1 Institute of Computer Science,
More informationBenefits of using haptic devices in textile architecture
28 September 2 October 2009, Universidad Politecnica de Valencia, Spain Alberto DOMINGO and Carlos LAZARO (eds.) Benefits of using haptic devices in textile architecture Javier SANCHEZ *, Joan SAVALL a
More informationTouch Your Way: Haptic Sight for Visually Impaired People to Walk with Independence
Touch Your Way: Haptic Sight for Visually Impaired People to Walk with Independence Ji-Won Song Dept. of Industrial Design. Korea Advanced Institute of Science and Technology. 335 Gwahangno, Yusong-gu,
More informationSeminar: Haptic Interaction in Mobile Environments TIEVS63 (4 ECTS)
Seminar: Haptic Interaction in Mobile Environments TIEVS63 (4 ECTS) Jussi Rantala Tampere Unit for Computer-Human Interaction (TAUCHI) School of Information Sciences University of Tampere, Finland Contents
More informationDesign and Evaluation of 3D Multimodal Virtual Environments for Visually Impaired People
Design and Evaluation of 3D Multimodal Virtual Environments for Visually Impaired People Ying Ying Huang Doctoral Thesis in Human-Computer Interaction KTH, Stockholm, Sweden 2010 Avhandling som med tillstånd
More informationMSMS Software for VR Simulations of Neural Prostheses and Patient Training and Rehabilitation
MSMS Software for VR Simulations of Neural Prostheses and Patient Training and Rehabilitation Rahman Davoodi and Gerald E. Loeb Department of Biomedical Engineering, University of Southern California Abstract.
More informationFM Knowledge Modelling and Management by Means of Context Awareness and Augmented Reality
FM Knowledge Modelling and Management by Means of Context Awareness and Augmented Reality Janek Götze University of Applied Sciences Zwickau janek.goetze@fh-zwickau.de +49 375 536 3448 Daniel Ellmer University
More informationVR-OOS System Architecture Workshop zu interaktiven VR-Technologien für On-Orbit Servicing
www.dlr.de Chart 1 > VR-OOS System Architecture > Robin Wolff VR-OOS Workshop 09/10.10.2012 VR-OOS System Architecture Workshop zu interaktiven VR-Technologien für On-Orbit Servicing Robin Wolff DLR, and
More informationTouch. Touch & the somatic senses. Josh McDermott May 13,
The different sensory modalities register different kinds of energy from the environment. Touch Josh McDermott May 13, 2004 9.35 The sense of touch registers mechanical energy. Basic idea: we bump into
More informationComparison between audio and tactile systems for delivering simple navigational information to visually impaired pedestrians
British Journal of Visual Impairment September, 2007 Comparison between audio and tactile systems for delivering simple navigational information to visually impaired pedestrians Dr. Olinkha Gustafson-Pearce,
More information