Multi-modal Exploration of Large Scientific Data Using Virtual Reality
Dr. Pierre Boulanger, Department of Computing Science, University of Alberta

Shuttle Radar Topography Mission (SRTM)
On February 11, 2000, the Shuttle Radar Topography Mission (SRTM) was launched into space as part of the payload of the Shuttle Endeavour. Using a new radar sweeping technique, most of the Earth's surface was digitized in 3D in approximately 10 days. SRTM acquired enough data during its mission to produce a near-global, high-resolution database of the Earth's topography.

- Terrain model of Mount St. Helens
- Terrain model after compression and hole filling
- Low-altitude airflow over the Mount St. Helens terrain model rendering
- Aburra Valley, Colombia: CFD model of convective winds, simulations based on Landsat IR data
Virtual Analysis of a Francis Turbine at La Herradura in Colombia
The main objective of the DIFRANCI Project is to apply a condition-assessment methodology, following a holistic approach, to the maintenance of the Francis turbines of the "La Herradura" hydropower plant in Colombia. The project is a collaboration between:
- Empresas Públicas de Medellín
- Colombian Agency for Science and Technology
- EAFIT University, Medellín, Colombia
- EPFL, Lausanne, Switzerland
- UofA, Edmonton, Canada

Digitizing the Turbine Using a Hand-Held Scanner
- Scanning using a Handy Scan scanner from Creaform 3D
- Final scanned, reverse-engineered model
- Scanned model of the turbine
- Extracted FEM mesh

Rapid Virtual Prototyping
- Once a 3D model is created, virtual prototyping allows product testing without the need to build a physical prototype.
- Allows shape and functional optimization.
- Allows tracking of the complete life cycle of a product.
- Rapid virtual prototyping requires a powerful computing infrastructure, especially if it is interactive.

CFD Simulation of the Francis Turbine: Particle Flow
Pressure Variations vs. Time: Results of the CFD Analysis of Wall Pressure
Comparison of the computed pressure recovery coefficient with the experimental values, for the medium-size mesh at ψ = 1.15, in the case of the FLINDT draft tube.

The UofA/EAFIT Virtual Wind Tunnel

Definition of Interactive CFD
- Need: a set of tools that allows the designer to interactively test the behavior of a design under various flow conditions.
- Task: the main task performed by an interactive CFD system is to solve fluid-flow simulations for an object, given the possible scenarios defined by the user, and to display the results interactively.
- User/Consumer: the target market is designers interested in testing their design's behavior under fluid-flow conditions in order to optimize and validate it.

Interactive CFD User Needs
UofA/EAFIT Virtual Wind Tunnel Architecture
Standard Scientific Visualization/Simulation Pipeline
Remote side: simulation parameters drive a simulator that feeds a data repository. Desktop side: data flows through Filter, Map, and Render stages to produce an image, connected to the remote side by a high-speed network.

Problems with Current Practice
- It is a hard and expensive process to determine the dominant parameters in a CFD simulation model.
- Simulation runs cannot be steered, resulting in useless computations. It is like working blind.
- The current pipeline does not allow collaboration: simulation data is always analyzed and shared by a group of engineers after the fact.

Even Better: A Collaborative Visualization/Simulation Steering Environment
Simulation parameters flow from users (User 1 through User n) via data clients (each running its own Filter, Map, and Render stages) to a simulation server over a local high-speed network, with massive storage, 3D graphic rendering, haptic rendering, 3D sound rendering, and input sensors (UofA Advanced Collaborative Immersive Environments: the new VizRoom, AMMI Lab).

Definition of Virtual Reality
A virtual reality system is an interface between a human and a machine capable of creating a real-time sensory experience of real and artificial worlds through the various human sensory channels: vision, audition, touch, smell, and taste.

Definition of Real-Time
Real-time, in virtual reality, means that the computer system can detect the input of the user and react to it fast enough that the reaction appears instantaneous. [Burdea, 1993]
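The Filter, Map, and Render stages of the standard pipeline can be sketched as three composable functions. This is a purely illustrative toy, not code from any actual toolkit; all names are invented for the example.

```python
# Minimal sketch of the filter -> map -> render stages of the
# standard visualization pipeline. All names are illustrative.

def filter_stage(samples, threshold):
    """Keep only data samples at or above a threshold of interest."""
    return [s for s in samples if s >= threshold]

def map_stage(samples, vmin, vmax):
    """Map each scalar to a normalized visual attribute (e.g., glyph size)."""
    span = (vmax - vmin) or 1.0
    return [(s - vmin) / span for s in samples]

def render_stage(attributes):
    """Stand-in for rendering: produce a crude one-line text 'image'."""
    return "".join(".:*#"[min(3, int(a * 4))] for a in attributes)

raw = [0.1, 0.5, 0.9, 0.3, 0.7]
image = render_stage(map_stage(filter_stage(raw, 0.3), 0.0, 1.0))
print(image)  # prints *#:*
```

Steering, in this picture, amounts to changing `threshold` or the mapping bounds while the simulator is still producing `raw` data, instead of only after the run completes.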
How Is VR Different from CG?
- Objects in the environment have a strong sense of spatial presence, creating the effect that the objects exist independently of the user.
- Control of interaction within the environment is often through direct manipulation of objects, as in the real world.
- The computer interface is hidden, in the sense that the user interacts with objects in the environment rather than with a computer that controls objects in the environment.
- The user is immersed in the environment, i.e., the user experiences the environment from within.

Immersion = Presence
Presence is a state of consciousness in which the human actor has a sense of being in the location specified by the displays. The unique feature of "virtual reality" systems is that they are general-purpose presence-transforming machines.

Multimodal vs. Multimedia
- Multi-modal systems use more than one sense or mode of interaction, e.g. the visual and aural senses: a text processor may speak the words as well as echoing them to the screen.
- Multi-media systems use more than one medium to communicate information, e.g. a computer-based teaching system may use video, animation, text, and still images (different media, all using the visual mode of interaction) and may also use sounds, both speech and non-speech.

Multimodal Human-Computer Interaction
Human and computer exchange multimedia output and multimodal input through an interface; each side performs its own information processing, with an internal perception/action feedback loop on the human side and an internal decision loop for AI agents on the computer side.

Levels of Interactivity
- Reactive: little user control over the content's structure, with program-directed options and feedback.
- Co-active: user control over sequence, timing, and style.
- Pro-active: the user controls both structure and content at most levels.
First Level of Function Segregation: Virtual Wind Tunnel Interaction Map

Data Exploration and the Mapping Problem
Visualization maps n-dimensional data into a virtual world rendered through the visual, haptic, and sound channels (Westgrid, 4K x 2K display).

Visualization Toolkits
- Computation/analysis + visualization: NIH Image and its PC version Scion; Matlab
- Programming toolkits: The Visualization ToolKit (VTK), Insight ToolKit (ITK), VisAD and Vis5D (also the visualization spreadsheet), SCIRun
- Graphical programming toolkits: Open Data Explorer (OpenDX), ParaView, Advanced Visual Systems (AVS/Express), Amira, Slicer

Multivariate Display Techniques
- Glyphs: 2D scalar
- Heterogeneous techniques: 2D and 3D texture, spot noise (van Wijk), Healey, Ware
- Layering: 2D scalar (slivers, DDS; 3D?); 2D scalar, vector, tensor (Laidlaw, Crawfis)
- Problem reduction: dimensional reduction, cluster analysis, smart particles
- Time and space multiplexing: mapping different fields over time
- Magic
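The "mapping problem" above is, at its simplest, a transfer function from a raw scalar to glyph attributes. A minimal sketch, with an invented blue-to-red ramp and size range (not taken from any of the toolkits listed):

```python
# Illustrative transfer function: map a scalar field value to glyph
# attributes (size and RGB color). The ranges and ramp are assumptions.

def scalar_to_glyph(value, vmin, vmax):
    """Map a scalar to a (size, rgb) glyph attribute pair."""
    t = (value - vmin) / ((vmax - vmin) or 1.0)
    t = max(0.0, min(1.0, t))      # clamp to [0, 1]
    size = 0.5 + 1.5 * t           # glyph radius in world units
    color = (t, 0.0, 1.0 - t)      # blue (low) -> red (high)
    return size, color

size, color = scalar_to_glyph(75.0, 0.0, 100.0)
print(size, color)  # prints 1.625 (0.75, 0.0, 0.25)
```

Every multivariate technique in the list (glyphs, texture, layering) is ultimately a richer version of this same scalar-to-attribute mapping, extended to vectors and tensors.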
Glyph: Flow Probe, Multiple Views in Space

Definition of Haptics
Gibson (1966) defines the haptic system as "the sensibility of the individual to the world adjacent to his body by use of his body." The haptic perceptual system is unusual in that it can include sensory receptors from the whole body, and it is closely linked to the movement of the body, so it can have a direct effect on the world being perceived.

Human Haptics
Two complementary channels:
- Tactile: strictly responsible for variations in the cutaneous stimuli; presents the spatial distribution of forces.
- Kinesthetic (proprioception): refers to the human perception of one's own body position and motion; presents only the net force information.

Tactile Display: CyberTouch from Immersion
Skin sensation is essential for many manipulation and exploration tasks, for example medical palpation. Tactile display devices stimulate the skin to generate these sensations of contact. The skin responds to several distributed physical quantities:
1. High-frequency vibrations: surface texture, slip, impact, and puncture.
2. Small-scale shape or pressure distribution.
3. Thermal properties.
CyberTouch is a tactile feedback option for Immersion's CyberGlove. It features small vibrotactile stimulators on each finger and on the palm of the CyberGlove. Each stimulator can be individually programmed to vary the strength of the touch sensation. The array of stimulators can generate simple sensations such as pulses or sustained vibration, and they can be used in combination to produce complex tactile feedback patterns.
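The "pulse vs. sustained vibration" patterns mentioned above can be sketched as amplitude envelopes sampled at a control rate. This is a hypothetical illustration: the update rate, pattern names, and per-finger layout are assumptions, not the CyberTouch API.

```python
import math

# Hypothetical per-stimulator amplitude envelopes for a vibrotactile
# glove. RATE and all parameters are illustrative assumptions.

RATE = 1000  # assumed control updates per second

def pulse(duration_s, width_s):
    """Single pulse: full amplitude for width_s seconds, then off."""
    n, on = int(duration_s * RATE), int(width_s * RATE)
    return [1.0 if i < on else 0.0 for i in range(n)]

def vibration(duration_s, freq_hz, strength):
    """Sustained sinusoidal vibration envelope at a given strength."""
    n = int(duration_s * RATE)
    return [strength * 0.5 * (1 + math.sin(2 * math.pi * freq_hz * i / RATE))
            for i in range(n)]

# One pattern per stimulator: thumb gets a 20 ms pulse, index a 50 Hz buzz.
patterns = {"thumb": pulse(0.1, 0.02), "index": vibration(0.1, 50, 0.8)}
```

Complex tactile sensations then come from summing or sequencing such envelopes across the stimulator array.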
Other Tactile Displays
- Thermal properties: we infer material composition and temperature differences. Thermal display devices are usually based on Peltier thermoelectric coolers.
- Many other tactile display modalities exist: electro-rheological devices for conveying compliance, electro-cutaneous stimulators, ultrasonic friction displays, and rotating disks for creating slip sensations.

Haptic Rendering of Surfaces: Models of the Probe
State of the art: it is difficult to simulate proprioception on the entire body. Probe models range from a point (fingertip, wrist) to a line segment and a full 3D object (arm plus two fingers, foot).

VR Architecture of the CoRSAIRe/CFD Environment at LIMSI
CFD Exploration Using the LIMSI CoRSAIRe System [Menelas 2010]
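For the simplest probe model, a point, surface haptic rendering is commonly done with the classic penalty (spring) method: push the probe out of the surface with a force proportional to penetration depth. A generic sketch follows; it is an illustration of the standard technique, not the specific CoRSAIRe implementation, and the stiffness value is an assumption.

```python
# Penalty-based haptic rendering of a point probe against a flat
# virtual wall occupying z < 0. K is an illustrative stiffness.

K = 500.0  # virtual wall stiffness in N/m (assumed value)

def wall_force(probe_z, wall_z=0.0):
    """Force along +z pushing the probe out of the wall."""
    penetration = wall_z - probe_z
    if penetration <= 0.0:
        return 0.0            # probe in free space: no force
    return K * penetration    # spring force along the wall normal
```

The same idea, applied on all six faces of the data volume, gives the "virtual walls" that keep a haptic device inside the data field boundary.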
CFD Exploration Using the CoRSAIRe System

Haptic Data Rendering I [Pao and Lawrence, 1998]
- Torque nulling
- Transverse damping

Haptic Data Rendering II (AMMI Lab)
Multi-modal interface for CFD, visual and sound [Pao and Lawrence, 1998; van Reimersdahl et al., 2003]
- Relative drag
- Feature shift

Types of Modalities: Multi-Modal Exploration of CFD Flow
Modalities used for the interface:
- Visual: mono/stereo/CAVE
- Haptic: perception of fluid flow, object manipulation, setting of boundary conditions
- Sonification of fluids
Project Background: Multi-Modal Rendering System Structure
- Input: a fluid field with velocity vectors, pressure, and other data, changing with time.
- Output: sound characterizing the given fluid field.
  - Ambient: global to the whole field.
  - Local: at the point or area of interaction; particles in the specific subset area around the pointer contribute to the sound.
The system is built around a Max/MSP program (the main program runs as a Max/MSP object providing the sound solution), a data server, a haptic program driving the haptic device, and a visualization program producing the image.

Haptic Rendering
Reads from the haptic device and sends pointer information to the sound and visual programs:
- Pointer position and orientation (converted to the data field dimensions).
- Buttons: control of the interaction sphere diameter (the local region).
Renders force feedback:
- Virtual walls: a force disallowing movement of the device outside the data field boundary.
- Other feedback is possible, e.g. a force proportional to the flow density and its direction.

Visual Rendering
Displays the vector field, the virtual pointer (microphone), and the interaction sphere, using the SGI OpenGL Performer library for the graphical representation.

Sound Rendering I
Calculates the velocity vector at the position s of the virtual microphone, depending on the interaction sphere radius, using Schaeffer's interpolation scheme:

  p_s(t) = [ sum over all nodes m of p_m(t) / r_m^2 ] / [ sum over all nodes n of 1 / r_n^2 ]

- Small sphere: interpolate from the vertices of the grid cell.
- Large sphere: interpolate from all the vertices inside the influence sphere.
This yields the velocity value and angle at that position.

Sound Rendering II
Two candidate output mappings, for both angle and velocity:
- Output = value / max value
- Output = (value / max value)^(5/3)
plus the virtual microphone direction. The relationship between loudness level S and amplitude a is S ~ a^(3/5) [B. Gold]. Thus the function between data values and amplitude should be a = const * (data value)^(5/3), so that S ~ data value.
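The sound-rendering steps above can be sketched end to end: inverse-distance-squared (Schaeffer-style) interpolation of velocity at the virtual microphone, followed by the mapping of normalized velocity v to frequency in [500, 1500] Hz and of v^(5/3) to amplitude in [0.5, 1.0]. This is a minimal sketch under those stated mappings, not the actual Max/MSP implementation.

```python
# Sketch of the sonification pipeline: interpolate velocity magnitude
# at the virtual microphone, then map it to sound parameters.

def interpolate(mic_pos, nodes):
    """nodes: list of ((x, y, z), velocity_magnitude) inside the sphere."""
    num = den = 0.0
    for pos, v in nodes:
        r2 = sum((a - b) ** 2 for a, b in zip(mic_pos, pos))
        if r2 == 0.0:
            return v              # microphone sits exactly on a node
        num += v / r2             # inverse-distance-squared weighting
        den += 1.0 / r2
    return num / den

def sound_params(v):
    """Map normalized velocity v in [0, 1] to (frequency_hz, amplitude)."""
    freq = 500.0 + 1000.0 * v         # v -> [500, 1500] Hz
    amp = 0.5 + 0.5 * v ** (5.0 / 3)  # v^(5/3) -> [0.5, 1.0]
    return freq, amp

v = interpolate((0, 0, 0), [((1, 0, 0), 0.2), ((0, 2, 0), 0.8)])
freq, amp = sound_params(v)
```

In the study's terms, this is the "after interpolation" variant (one sound at the interpolated value); the "before interpolation" variant would instead call `sound_params` once per node and mix the results.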
Sound Rendering III
White-band noise is modified in amplitude and frequency to simulate a wind effect:
- Frequency ~ v(t), where v in [0, 1] is mapped to [500, 1500] Hz.
- Amplitude ~ v(t)^(5/3), where v^(5/3) in [0, 1] is mapped to [0.5, 1].

Sonification Types
- Positive vs. negative amplitude modulation: the velocity value is mapped to either an increase or a decrease in the amplitude of the sound.
- Amplitude vs. frequency modulation: the highest velocity value is mapped to either the loudest or the highest-pitched noise.
- Before vs. after interpolation: many separate sounds, one for each vertex in the local area, vs. one sound at the interpolated value at the position of the virtual pointer.

Hypothesis
According to multimodal theory, adding sound rendering to visual rendering should improve exploration of CFD flow fields.
- Testability: our experiments determine whether the hypothesis holds for an eddy localization task.
- Simplicity: this hypothesis is simple and can easily be tested.
- Null hypothesis: the combination of visual and auditory rendering does not improve eddy localization efficiency.

Variables
Independent variables:
- Eddy localization in the flow field
- Starting point in the volume
- Sound mapping: frequency or amplitude modulation
- Sound rendering: positive or negative amplitude modulation
- Interpolation type: Schaeffer's or multi-sound sources
- Virtual microphone radius R
Dependent variables:
- Eddy localization error
- Time and length of the trajectory to reach the eddy

Interface Evaluation Procedure
1. Design the experiment.
2. Conduct the experiment.
3. Collect the data.
4. Analyze the data.
5. Draw your conclusions and establish hypotheses.
6. Redesign and do it again.

Usability Experimental Setup
Visual and/or audio cues; haptics for navigation.
Usability Study: Experimental Setup
- Participants were asked to locate vortex centers.
- 40 fields of 25x25x25, with random vortex locations.
- The goal is marked by a red arrow / a specific sound.
- 15 warm-up trials and 36 experimental trials.
- Random start point, random setup.

Usability Study: Results
Worst results for the audio-alone system:
- participants are slower in locating the goal;
- participants are less efficient in exploring the volume;
- participants are less precise in locating the goal.
Multimodal vs. visual-only interface (importance of the sound rendering parameters): a specific system setup helps to improve performance.
- Equal or better results for the multi-modal system.
- Participants explore less space.
- Participants are much faster in locating the goal position.
The best configuration: positive amplitude modulation with a large radius and before-interpolation sonification.

Usability Study: Conclusions
- The multimodal system is more efficient at localizing eddies than either a pure visual or a pure audio system.
- Specific mapping parameters influence system performance.
- Different audio parameters are better for the audio-only than for the multi-modal interface.
- Different audio parameters might be better for different conditions.

Multi-modal Exploration of Electric Fields [Menelas 2010]
Molecule Docking [Menelas 2010]

Solar TErrestrial RElations Observatory (STEREO)
NASA picture, May 2006. STEREO data pictures by the Johns Hopkins Applied Physics Laboratory.

Usable Data from STEREO
- Stereo pairs of the Sun in the visible and ultraviolet bands
- 3-D reconstructions of the corona and of the Sun's main body
- 3-D distribution of the plasma characteristics of solar energetic particles
- Local vector magnetic field
- Plasma characteristics of protons, alpha particles, and heavy ions
- Tracing of the generation and evolution of traveling radio disturbances

NASA STEREO Database
STEREO Multi-Modal Interface System for Data-Driven Simulations:
- Sun corona simulation
- Sun particle emission simulation
- Sun electro-magnetic emission simulation
- Multi-modal data exploration interface
- Earth electro-magnetic disturbances simulation
Next! The GPU Revolution
- SGI: petaflops in a rack (SGI Prism XL, Tesla M2050/M2070 GPU computing modules)
- True real-time simulation and visualization

Next Seminar (Dec. 1): Recent Developments of Closed-Loop Simulation and Visualization Interfaces Using GPU
More informationWhat is Virtual Reality? What is Virtual Reality? An Introduction into Virtual Reality Environments. Stefan Seipel
An Introduction into Virtual Reality Environments What is Virtual Reality? Technically defined: Stefan Seipel stefan.seipel@hig.se VR is a medium in terms of a collection of technical hardware (similar
More informationUsing VR and simulation to enable agile processes for safety-critical environments
Using VR and simulation to enable agile processes for safety-critical environments Michael N. Louka Department Head, VR & AR IFE Digital Systems Virtual Reality Virtual Reality: A computer system used
More informationConstruction of visualization system for scientific experiments
Construction of visualization system for scientific experiments A. V. Bogdanov a, A. I. Ivashchenko b, E. A. Milova c, K. V. Smirnov d Saint Petersburg State University, 7/9 University Emb., Saint Petersburg,
More informationDevelopment of intelligent systems
Development of intelligent systems (RInS) Robot sensors Danijel Skočaj University of Ljubljana Faculty of Computer and Information Science Academic year: 2017/18 Development of intelligent systems Robotic
More informationVR-programming. Fish Tank VR. To drive enhanced virtual reality display setups like. Monitor-based systems Use i.e.
VR-programming To drive enhanced virtual reality display setups like responsive workbenches walls head-mounted displays boomes domes caves Fish Tank VR Monitor-based systems Use i.e. shutter glasses 3D
More informationMulti-Modal User Interaction
Multi-Modal User Interaction Lecture 4: Multiple Modalities Zheng-Hua Tan Department of Electronic Systems Aalborg University, Denmark zt@es.aau.dk MMUI, IV, Zheng-Hua Tan 1 Outline Multimodal interface
More informationComparing Two Haptic Interfaces for Multimodal Graph Rendering
Comparing Two Haptic Interfaces for Multimodal Graph Rendering Wai Yu, Stephen Brewster Glasgow Interactive Systems Group, Department of Computing Science, University of Glasgow, U. K. {rayu, stephen}@dcs.gla.ac.uk,
More informationWhy interest in visual perception?
Raffaella Folgieri Digital Information & Communication Departiment Constancy factors in visual perception 26/11/2010, Gjovik, Norway Why interest in visual perception? to investigate main factors in VR
More informationTouch Feedback in a Head-Mounted Display Virtual Reality through a Kinesthetic Haptic Device
Touch Feedback in a Head-Mounted Display Virtual Reality through a Kinesthetic Haptic Device Andrew A. Stanley Stanford University Department of Mechanical Engineering astan@stanford.edu Alice X. Wu Stanford
More informationINDE/TC 455: User Interface Design
INDE/TC 455: User Interface Design Module 13.0 Interface Technology 1 Three more interface considerations What is the best allocation of responsibility between the human and the tool? What is the best
More informationJournal of Theoretical and Applied Mechanics, Sofia, 2014, vol. 44, No. 1, pp ROBONAUT 2: MISSION, TECHNOLOGIES, PERSPECTIVES
Journal of Theoretical and Applied Mechanics, Sofia, 2014, vol. 44, No. 1, pp. 97 102 SCIENTIFIC LIFE DOI: 10.2478/jtam-2014-0006 ROBONAUT 2: MISSION, TECHNOLOGIES, PERSPECTIVES Galia V. Tzvetkova Institute
More informationAural and Haptic Displays
Teil 5: Aural and Haptic Displays Virtuelle Realität Wintersemester 2007/08 Prof. Bernhard Jung Overview Aural Displays Haptic Displays Further information: The Haptics Community Web Site: http://haptic.mech.northwestern.edu/
More informationEE631 Cooperating Autonomous Mobile Robots. Lecture 1: Introduction. Prof. Yi Guo ECE Department
EE631 Cooperating Autonomous Mobile Robots Lecture 1: Introduction Prof. Yi Guo ECE Department Plan Overview of Syllabus Introduction to Robotics Applications of Mobile Robots Ways of Operation Single
More informationTouch. Touch & the somatic senses. Josh McDermott May 13,
The different sensory modalities register different kinds of energy from the environment. Touch Josh McDermott May 13, 2004 9.35 The sense of touch registers mechanical energy. Basic idea: we bump into
More informationINDE/TC 455: User Interface Design
INDE/TC 455: User Interface Design Autumn 2008 Class #21 URL:courses.washington.edu/ie455 1 TA Moment 2 Class #20 Review Review of flipbooks 3 Assignments for Class #22 Individual Review modules: 5.7,
More informationEffective Iconography....convey ideas without words; attract attention...
Effective Iconography...convey ideas without words; attract attention... Visual Thinking and Icons An icon is an image, picture, or symbol representing a concept Icon-specific guidelines Represent the
More informationHaplug: A Haptic Plug for Dynamic VR Interactions
Haplug: A Haptic Plug for Dynamic VR Interactions Nobuhisa Hanamitsu *, Ali Israr Disney Research, USA nobuhisa.hanamitsu@disneyresearch.com Abstract. We demonstrate applications of a new actuator, the
More informationTouch & Haptics. Touch & High Information Transfer Rate. Modern Haptics. Human. Haptics
Touch & Haptics Touch & High Information Transfer Rate Blind and deaf people have been using touch to substitute vision or hearing for a very long time, and successfully. OPTACON Hong Z Tan Purdue University
More informationCS277 - Experimental Haptics Lecture 2. Haptic Rendering
CS277 - Experimental Haptics Lecture 2 Haptic Rendering Outline Announcements Human haptic perception Anatomy of a visual-haptic simulation Virtual wall and potential field rendering A note on timing...
More informationLearning the Proprioceptive and Acoustic Properties of Household Objects. Jivko Sinapov Willow Collaborators: Kaijen and Radu 6/24/2010
Learning the Proprioceptive and Acoustic Properties of Household Objects Jivko Sinapov Willow Collaborators: Kaijen and Radu 6/24/2010 What is Proprioception? It is the sense that indicates whether the
More informationLecture 23: Robotics. Instructor: Joelle Pineau Class web page: What is a robot?
COMP 102: Computers and Computing Lecture 23: Robotics Instructor: (jpineau@cs.mcgill.ca) Class web page: www.cs.mcgill.ca/~jpineau/comp102 What is a robot? The word robot is popularized by the Czech playwright
More informationSPIDERMAN VR. Adam Elgressy and Dmitry Vlasenko
SPIDERMAN VR Adam Elgressy and Dmitry Vlasenko Supervisors: Boaz Sternfeld and Yaron Honen Submission Date: 09/01/2019 Contents Who We Are:... 2 Abstract:... 2 Previous Work:... 3 Tangent Systems & Development
More informationINTERACTION AND SOCIAL ISSUES IN A HUMAN-CENTERED REACTIVE ENVIRONMENT
INTERACTION AND SOCIAL ISSUES IN A HUMAN-CENTERED REACTIVE ENVIRONMENT TAYSHENG JENG, CHIA-HSUN LEE, CHI CHEN, YU-PIN MA Department of Architecture, National Cheng Kung University No. 1, University Road,
More informationARMY RDT&E BUDGET ITEM JUSTIFICATION (R2 Exhibit)
Exhibit R-2 0602308A Advanced Concepts and Simulation ARMY RDT&E BUDGET ITEM JUSTIFICATION (R2 Exhibit) FY 2005 FY 2006 FY 2007 FY 2008 FY 2009 FY 2010 FY 2011 Total Program Element (PE) Cost 22710 27416
More informationPROPRIOCEPTION AND FORCE FEEDBACK
PROPRIOCEPTION AND FORCE FEEDBACK Roope Raisamo and Jukka Raisamo Multimodal Interaction Research Group Tampere Unit for Computer Human Interaction Department of Computer Sciences University of Tampere,
More informationHaptic Feedback to Guide Interactive Product Design
Mechanical Engineering Conference Presentations, Papers, and Proceedings Mechanical Engineering 2-2009 Haptic Feedback to Guide Interactive Product Design Andrew G. Fischer Iowa State University Judy M.
More informationHaptic interaction. Ruth Aylett
Haptic interaction Ruth Aylett Contents Haptic definition Haptic model Haptic devices Measuring forces Haptic Technologies Haptics refers to manual interactions with environments, such as sensorial exploration
More information¾ B-TECH (IT) ¾ B-TECH (IT)
HAPTIC TECHNOLOGY V.R.Siddhartha Engineering College Vijayawada. Presented by Sudheer Kumar.S CH.Sreekanth ¾ B-TECH (IT) ¾ B-TECH (IT) Email:samudralasudheer@yahoo.com Email:shri_136@yahoo.co.in Introduction
More informationDigitizing Color. Place Value in a Decimal Number. Place Value in a Binary Number. Chapter 11: Light, Sound, Magic: Representing Multimedia Digitally
Chapter 11: Light, Sound, Magic: Representing Multimedia Digitally Fluency with Information Technology Third Edition by Lawrence Snyder Digitizing Color RGB Colors: Binary Representation Giving the intensities
More informationHaptic Cues: Texture as a Guide for Non-Visual Tangible Interaction.
Haptic Cues: Texture as a Guide for Non-Visual Tangible Interaction. Figure 1. Setup for exploring texture perception using a (1) black box (2) consisting of changeable top with laser-cut haptic cues,
More informationGeo-Located Content in Virtual and Augmented Reality
Technical Disclosure Commons Defensive Publications Series October 02, 2017 Geo-Located Content in Virtual and Augmented Reality Thomas Anglaret Follow this and additional works at: http://www.tdcommons.org/dpubs_series
More information9/12/2011. Training Course Remote Sensing Basic Theory & Image Processing Methods September 2011
Training Course Remote Sensing Basic Theory & Image Processing Methods 19 23 September 2011 Introduction to Remote Sensing Michiel Damen (September 2011) damen@itc.nl 1 Overview Some definitions Remote
More informationNeuro-Fuzzy and Soft Computing: Fuzzy Sets. Chapter 1 of Neuro-Fuzzy and Soft Computing by Jang, Sun and Mizutani
Chapter 1 of Neuro-Fuzzy and Soft Computing by Jang, Sun and Mizutani Outline Introduction Soft Computing (SC) vs. Conventional Artificial Intelligence (AI) Neuro-Fuzzy (NF) and SC Characteristics 2 Introduction
More informationAcoustic resolution. photoacoustic Doppler velocimetry. in blood-mimicking fluids. Supplementary Information
Acoustic resolution photoacoustic Doppler velocimetry in blood-mimicking fluids Joanna Brunker 1, *, Paul Beard 1 Supplementary Information 1 Department of Medical Physics and Biomedical Engineering, University
More informationHaptic Technology- Comprehensive Review Study with its Applications
Haptic Technology- Comprehensive Review Study with its Applications Tanya Jaiswal 1, Rambha Yadav 2, Pooja Kedia 3 1,2 Student, Department of Computer Science and Engineering, Buddha Institute of Technology,
More information