HEARING IMAGES: INTERACTIVE SONIFICATION INTERFACE FOR IMAGES


ICSRiM, University of Leeds, School of Music and School of Computing, Leeds LS2 9JT, UK

Abstract

The paper describes ongoing research into a method for the interactive sonification of 2D image data. The method uses an existing games-console device, Nintendo's wiimote controller [11], to provide a means of interacting with the image to aid exploration, giving the user sonic and haptic feedback. The paper presents a method for the segmentation and analysis of regions within an image to produce global and local descriptors of the specified region.

INTRODUCTION

Sonification methods present information through sound (particularly non-speech sound), so that users obtain an understanding of the data or processes under investigation by listening [6]. Factors such as the end application and the nature of the data under scrutiny play a major role in determining the best mapping and synthesis methods for the sonification process. A particular challenge lies in representing data which is typically time independent (such as an image) within a modality which cannot exist without time (audio). In some instances the mapping in the time domain can be implicit; where it is not, the time domain must be constructed elsewhere within the system. The system described in this paper addresses the sonification of time-independent 2D image data and, in particular, the construction of the time domain through the mapping of user actions to feedback. The current body of research looks at the sonification of the irregular shapes that make up organic images. The algorithms presented define regions within the image, extract features from these regions and construct usable parameters from which sonic and haptic feedback can be produced. The system is driven by the user's actions; continued feedback therefore comes as a result of continued interaction.
An emphasis on interaction for exploring images should induce benefits such as learnability and subtlety of use [13], coming from increased familiarity with the system. Work presented in [22] looks at issues of mapping time-independent data onto the time domain (i.e., image to sound). Two methods of playback are defined: scanning (an automatic sonification route) and probing (a user-controlled sonification route). Previous attempts have utilized both of these methods in many different forms. Particular examples include a probing method in which sound parameters are mapped onto a spatial domain [7], and automatic scan methods based upon the raster scan technique (pixel by pixel) [21] and a left-to-right method (column by column) [10].

Continuing efforts have focused upon interactivity as a central concern of sonification system design. Saue [16] introduces the concept of walking through data sets in order to determine global, intermediate, local and point data features. Hellstrom et al. [4] implement the mouse as a virtual microphone to explore data spaces. Pauletto and Hunt [12] have created a toolkit in which the user uses the mouse to interact with a data space, navigating sonified data in real time. Continuous feedback, or the creation of context between modes, can also aid the user in the discrimination of sound [1]. The implementation of the highest level of real-time continuous interaction has been found to produce the most pleasing, efficient and fastest method of analysing data [13]. Work presented in [4] highlights qualities which can be gained from the user being tightly embedded within an interactive control loop, citing increased levels of 'control intimacy', a quality seen in the manipulation of musical instruments. Interfaces with sonic feedback have most commonly been realized in the form of the mouse [2, 3, 12], with additional forms including the keyboard [15] and tablet [5]. Devices which offer multimodal feedback are less common. A successful binding of auditory and haptic feedback will provide the user with a multimodal description of the data under scrutiny. Work is presented in [20] in which a mobile device alerts the user to messages through combinations of sonic and haptic feedback that approximate physical objects 'dropping' into the device. Features such as the weight, material and size of the object can all be approximated through the combination of feedback across the two modalities to convey message parameters such as size, urgency, etc. Understanding of real-world objects through physical manipulation and sonic and haptic feedback is particularly prominent with acoustic instruments.
An expert violinist is constantly gauging a number of parameters through feedback both haptic (string tension, bow tension, etc.) and sonic (pitch, timbre, etc.), and it is this feedback which enables the highest level of control over the instrument. The following points are produced in [4] as guidelines for human-machine interface design based upon the acoustic instrument example:

Physical interaction for sonic response.

Increased learning times producing a higher level of performance.

An interface reacting to physical interaction in a well-known way.

Sonification system design may benefit from the same design ethos as electronic musical instrument design. Learning how to manipulate the image data to extract different types of response will require the user to become familiar with an interactive system, inducing a learning curve and the added benefits that come with it.

DESIGN AND DEVELOPMENT

This section provides a system overview and describes the overall system in terms of its modular components and the individual algorithms which comprise it. The methods for region selection, feature extraction and the construction of parameters for sonic and haptic feedback are then explained.

System Overview

The system comprises three separate components, as seen in Figure 1. The user input method is Nintendo's wiimote [11], an input device containing a +/-3g 8-bit 3-axis accelerometer, a 1024x768 infra-red camera capable of 4-point tracking, and an additional 11 discrete buttons (including a 4-way directional pad). The input device is connected via Bluetooth to a PC running Max/MSP Jitter [9], a visual programming environment used for image processing and analysis, audio synthesis, and communication with the wiimote device. Max/MSP externals have been created in C++ to produce fast image processing and analysis modules which can be integrated within the same development environment as the sonic and haptic feedback modules. Audio feedback is provided through stereo headphones, and haptic feedback through communication with the wiimote's built-in force feedback motor.

Figure 1. System Data Flow

Region Extraction

The system currently implements an area selection algorithm based upon image segmentation. By identifying connected areas of similar pixel colour values we can segment the image and analyse these sections separately from the image as a whole. In the system, the wiimote is pointed at the computer monitor to scan over the image; the centre pixel is determined using the infra-red camera and point tracking hardware built into the wiimote. Depressing the A button then initiates a procedure to segment the neighbouring area around the single pixel being pointed at. The neighbourhood of pixels is defined using the following algorithm: a flood fill [18] is performed from the centre pixel coordinates on the original image, and the flooded area is converted to a binary image whose foreground represents the connected flooded areas.
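As an illustrative sketch of the flood-fill and binarization steps just described (written in Python rather than the paper's C++ externals; the greyscale image representation and the `tolerance` threshold are assumptions of this example, not parameters taken from the paper):

```python
def flood_fill_mask(image, seed, tolerance=8):
    """Binary mask of the 4-connected region of pixels whose values lie
    within `tolerance` of the seed pixel. `image` is a list of rows of
    greyscale values; `seed` is a (row, column) pair."""
    h, w = len(image), len(image[0])
    sy, sx = seed
    target = image[sy][sx]
    mask = [[0] * w for _ in range(h)]
    stack = [(sy, sx)]                     # stack-based fill, as in [18]
    while stack:
        y, x = stack.pop()
        if not (0 <= y < h and 0 <= x < w):
            continue                       # outside the image
        if mask[y][x] or abs(image[y][x] - target) > tolerance:
            continue                       # already filled, or too dissimilar
        mask[y][x] = 1                     # mark pixel as foreground
        stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    return mask
```

An explicit stack is used rather than recursion so that a large selected region cannot overflow the call stack, which is the point of the stack-based algorithm cited as [18].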

The image is then resized to find a bounding box containing the selected region, which is analysed to provide feedback to the user. A chain code algorithm [8] is implemented to produce a set of integers denoting the movement taken from one pixel to the next in tracing the region perimeter: a raster scan is performed to find the first instance of a flooded pixel, and from this position a 4-neighbourhood chain code algorithm traces the path of the border clockwise around the shape perimeter. Where a connecting pixel is unavailable in the 4-neighbourhood setup, the remaining 4 diagonal neighbourhoods are checked (clockwise from the bottom right) and the path is continued. The chain code completing the shape perimeter is the chain code recorded.

Global Shape Describing Parameters

From the region extraction process we have obtained data describing the net area of the region and the region perimeter. From these we can derive a number of new parameters which serve as global shape descriptors for the region. An intermediate mapping step converts the raw acquired data into a set of parameters which aim to describe the shape in more usable terms. These are obtained as follows [14, 17] (equations omitted here), where x is the series of x coordinates within the chain code path, y is the series of y coordinates within the chain code path, and chaincodelength is the number of steps taken in the chain code algorithm implementation.

Local Shape Exploration Parameters

The system aims to provide both a global representation of the selected region and a local representation describing smaller areas within the region, accessible through direct interaction. Exploring the region with the wiimote cursor constructs local parameters which give the user a means of further investigating the region under scrutiny. The following methods have been implemented to produce local parameters.

Distance from centre of gravity. The distance of the user's position within the shape from the region's centre of gravity can form a new parameter for feedback (Figure 2).

Figure 2. Distance Fields from Centre of Gravity

Local perimeter section. Variable-size sections of the perimeter can be accessed using the wiimote (Figure 3). The shape perimeter can be sonified with respect to angularity and path (from the existing chain code segment), with a scan angle determined by the wiimote cursor position and tilt-sensing capability (motion detection from the accelerometers [19]).

Figure 3. Scanning of Local Perimeter Section

Shape perimeter points. Points which comprise the perimeter can be sonified with respect to their horizontal and vertical position and magnitude relative to the wiimote cursor. Figure 4 shows how these parameters are constructed.
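A minimal sketch of how such global and local parameters might be computed from the recorded chain code and the cursor position (again in Python rather than the paper's C++ externals; the direction encoding, the shoelace area formula and the compactness measure are assumptions of this sketch, standing in for the paper's own equations):

```python
import math

# Assumed 4-direction chain code encoding in image coordinates (y grows
# downwards): 0 = right, 1 = down, 2 = left, 3 = up.
MOVES = {0: (1, 0), 1: (0, 1), 2: (-1, 0), 3: (0, -1)}

def global_descriptors(chain, start=(0, 0)):
    """Global shape descriptors derived from a closed chain code path."""
    xs, ys = [start[0]], [start[1]]
    for step in chain:
        dx, dy = MOVES[step]
        xs.append(xs[-1] + dx)
        ys.append(ys[-1] + dy)
    perimeter = len(chain)                 # unit steps along the border
    # Shoelace formula gives the area enclosed by the traced path.
    area = abs(sum(xs[i] * ys[i + 1] - xs[i + 1] * ys[i]
                   for i in range(perimeter))) / 2.0
    # Centre of gravity of the border points.
    centroid = (sum(xs[:-1]) / perimeter, sum(ys[:-1]) / perimeter)
    # Compactness: 1.0 for a circle, growing as the shape gets more irregular.
    compactness = perimeter ** 2 / (4 * math.pi * area) if area else math.inf
    return {"perimeter": perimeter, "area": area,
            "centroid": centroid, "compactness": compactness}

def local_descriptors(cursor, centroid, border_points):
    """Local parameters relative to the wiimote cursor position."""
    cx, cy = cursor
    return {
        # Distance from the region's centre of gravity.
        "centroid_distance": math.hypot(cx - centroid[0], cy - centroid[1]),
        # Offset and magnitude of each perimeter point from the cursor.
        "relative_points": [(px - cx, py - cy, math.hypot(px - cx, py - cy))
                            for px, py in border_points],
    }
```

Any of the returned values can then feed the sonic and haptic mappings described in the following section; compactness here simply illustrates one plausible shape descriptor of the kind the intermediate mapping step produces.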

Figure 4. Determining Position of Relative Perimeter Points

User Configurable Mapping

The construction of global and local parameters allows for the categorization of input types. Similarly, we can group the feedback methods into the following categories: high-level musical parameters such as melody, rhythm and harmony; low-level sonic parameters such as pitch, dynamics and frequency components; and the amplitude and pulse frequency of the wiimote's rumble capability. A user-configurable mapping strategy allows the user to experiment with the perceptual effect of mapping input and output parameters singly and also across category types.

CONCLUSIONS AND FURTHER WORK

The interface currently implements a method for user-driven selection and parameter extraction of regions comprising images loaded into the system. The regions are analysed to produce global descriptors from which we can build sonifications and haptic responses. Local descriptors are constructed through user interaction, providing an additional category of parameters from which we can produce additional localised feedback. Current work is focused on the investigation of mapping strategies in order to find effective ways of capitalizing on the segregation of input parameters we have obtained. Experimentation with the mapping of different parameter types to different sonic and musical descriptors across the two modalities should provide interesting and hopefully categorical results. Work in this area will move the system towards defining the most effective default settings for the new user, offering both generic settings and supporting user personalization. Continuing work on the project can now include the development of modules for colour and texture analysis, the grouping and context of segments, and new interaction methods to facilitate these modules.

References

[1] FERNSTROM, M and Brazil, E: Human-Computer Interaction Design based on Interactive Sonification - Hearing Actions or Instruments/Agents. Proc. of the 2004 International Workshop on Interactive Sonification, Bielefeld University, Germany, 8th January 2004.
[2] HELLSTROM, S. O and Winberg, F: Qualitative aspects of the auditory direct manipulation: A case study of the towers of Hanoi. Proc. of the 7th Int. Conf. on Auditory Display.
[3] HOLMES, J: Interacting with an information space using sound: Accuracy and patterns. Proc. of the International Conference on Auditory Display, Limerick, Ireland.
[4] HUNT, A and Hermann, T: The Importance of Interaction in Sonification. Proc. of ICAD 04, Tenth Meeting of the International Conference on Auditory Display, Sydney, Australia, July 6-9.
[5] KILDAL, J and Brewster, S: Providing a size-independent overview of non-visual tables. Proc. of the International Conference on Auditory Display, London, England.
[6] KRAMER, G (Ed.): Auditory Display - Sonification, Audification, and Auditory Interfaces. Addison-Wesley.
[7] LEE, Z., Berger, J and Yeo, W. S: Mapping Sound to Image in Interactive Multimedia Art. Retrieved 15th January 2008 from:
[8] LIU, Y. K and Zalik, B: An Efficient Chain Code with Huffman Coding. Pattern Recognition, Volume 38, Issue 4, April 2005.
[9] Max/MSP Jitter. Graphical Real-Time Programming Environment.
[10] MEIJER, P. B. L: Vision Technology for the Totally Blind. Retrieved 20th December 2007 from:
[11] Nintendo. Nintendo Wiimote.
[12] PAULETTO, S and Hunt, A: A Toolkit for Interactive Sonification. Proc. of the 10th Int. Conf. on Auditory Display.
[13] PAULETTO, S and Hunt, A: Interacting with Sonifications: An Evaluation. Proc. of the 13th International Conference on Auditory Display, Montreal, Canada, June 26-29.
[14] RUSS, J. C: The Image Processing Handbook, Third Ed. CRC Press.
[15] STOCKMAN, T., Hind, G and Frauenberger, C: Interactive Sonification and Spreadsheets. Proc. of the International Conference on Auditory Display, Limerick, Ireland.
[16] SAUE, S: A Model for Interaction in Exploratory Sonification Displays. Proc. of ICAD 2000.
[17] TROUILLOT, X., Jourlin, M and Pinoli, J. C: Geometric Parameters Computation with Freeman Code. Retrieved 20th April 2008 from:
[18] WEISFELD, S: Stack Based Flood Fill Algorithm. Retrieved 15th March 2008 from:

[19] Wiili.org Wii Linux. Motion Analysis. Retrieved 1st May 2008 from:
[20] WILLIAMSON, J., Murray-Smith, R and Hughes, S: Shoogle: Excitatory Multimodal Interaction on Mobile Devices.
[21] YEO, W. S and Berger, J: Application of Raster Scanning Method to Image Sonification, Sound Visualization, Sound Analysis and Synthesis. Proc. of the 9th Int. Conference on Digital Audio Effects, Montreal, Canada, September 18-20.
[22] YEO, W. S and Berger, J: A Framework for Designing Image Sonification Methods. Proc. of ICAD 05, Eleventh Meeting of the International Conference on Auditory Display, Limerick, Ireland, July 6-9.


More information

Creating a 3D environment map from 2D camera images in robotics

Creating a 3D environment map from 2D camera images in robotics Creating a 3D environment map from 2D camera images in robotics J.P. Niemantsverdriet jelle@niemantsverdriet.nl 4th June 2003 Timorstraat 6A 9715 LE Groningen student number: 0919462 internal advisor:

More information

Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp

Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp. 105-124. http://eprints.gla.ac.uk/3273/ Glasgow eprints Service http://eprints.gla.ac.uk

More information

SONIFYING ECOG SEIZURE DATA WITH OVERTONE MAPPING: A STRATEGY FOR CREATING AUDITORY GESTALT FROM CORRELATED MULTICHANNEL DATA

SONIFYING ECOG SEIZURE DATA WITH OVERTONE MAPPING: A STRATEGY FOR CREATING AUDITORY GESTALT FROM CORRELATED MULTICHANNEL DATA Proceedings of the th International Conference on Auditory Display, Atlanta, GA, USA, June -, SONIFYING ECOG SEIZURE DATA WITH OVERTONE MAPPING: A STRATEGY FOR CREATING AUDITORY GESTALT FROM CORRELATED

More information

EXPERIMENTAL BILATERAL CONTROL TELEMANIPULATION USING A VIRTUAL EXOSKELETON

EXPERIMENTAL BILATERAL CONTROL TELEMANIPULATION USING A VIRTUAL EXOSKELETON EXPERIMENTAL BILATERAL CONTROL TELEMANIPULATION USING A VIRTUAL EXOSKELETON Josep Amat 1, Alícia Casals 2, Manel Frigola 2, Enric Martín 2 1Robotics Institute. (IRI) UPC / CSIC Llorens Artigas 4-6, 2a

More information

Responsive Sensate Environments: Past and Future Directions

Responsive Sensate Environments: Past and Future Directions Responsive Sensate Environments: Past and Future Directions Designing Space as an Interface with Socio-Spatial Information CAAD Futures 2005, Wien 21 June, 2005 Kirsty Beilharz Key Centre of Design Computing

More information

Access Invaders: Developing a Universally Accessible Action Game

Access Invaders: Developing a Universally Accessible Action Game ICCHP 2006 Thursday, 13 July 2006 Access Invaders: Developing a Universally Accessible Action Game Dimitris Grammenos, Anthony Savidis, Yannis Georgalis, Constantine Stephanidis Human-Computer Interaction

More information

MELODIOUS WALKABOUT: IMPLICIT NAVIGATION WITH CONTEXTUALIZED PERSONAL AUDIO CONTENTS

MELODIOUS WALKABOUT: IMPLICIT NAVIGATION WITH CONTEXTUALIZED PERSONAL AUDIO CONTENTS MELODIOUS WALKABOUT: IMPLICIT NAVIGATION WITH CONTEXTUALIZED PERSONAL AUDIO CONTENTS Richard Etter 1 ) and Marcus Specht 2 ) Abstract In this paper the design, development and evaluation of a GPS-based

More information

Evaluating the Effectiveness of Auditory and Tactile Surface Graphs for the Visually Impaired

Evaluating the Effectiveness of Auditory and Tactile Surface Graphs for the Visually Impaired Evaluating the Effectiveness of Auditory and Tactile Surface Graphs for the Visually Impaired James A. Ferwerda; Rochester Institute of Technology; Rochester, NY USA Vladimir Bulatov, John Gardner; ViewPlus

More information

VIRTUAL REALITY PLATFORM FOR SONIFICATION EVALUATION

VIRTUAL REALITY PLATFORM FOR SONIFICATION EVALUATION VIRTUAL REALITY PLATFORM FOR SONIFICATION EVALUATION Thimmaiah Kuppanda 1, Norberto Degara 1, David Worrall 1, Balaji Thoshkahna 1, Meinard Müller 2 1 Fraunhofer Institute for Integrated Circuits IIS,

More information

Acoustic Rendering as Support for Sustained Attention during Biomedical Procedures

Acoustic Rendering as Support for Sustained Attention during Biomedical Procedures Acoustic Rendering as Support for Sustained Attention during Biomedical Procedures Emil Jovanov, Dusan Starcevic University of Belgrade Belgrade, Yugoslavia Kristen Wegner, Daniel Karron Computer Aided

More information

Multisensory virtual environment for supporting blind persons acquisition of spatial cognitive mapping, orientation, and mobility skills

Multisensory virtual environment for supporting blind persons acquisition of spatial cognitive mapping, orientation, and mobility skills Multisensory virtual environment for supporting blind persons acquisition of spatial cognitive mapping, orientation, and mobility skills O Lahav and D Mioduser School of Education, Tel Aviv University,

More information

Sonic Interaction Design: New applications and challenges for Interactive Sonification

Sonic Interaction Design: New applications and challenges for Interactive Sonification Sonic Interaction Design: New applications and challenges for Interactive Sonification Thomas Hermann Ambient Intelligence Group CITEC Bielefeld University Germany Keynote presentation DAFx 2010 Graz 2010-09-07

More information

VOICE OF SISYPHUS: AN IMAGE SONIFICATION MULTIMEDIA INSTALLATION

VOICE OF SISYPHUS: AN IMAGE SONIFICATION MULTIMEDIA INSTALLATION VOICE OF SISYPHUS: AN IMAGE SONIFICATION MULTIMEDIA INSTALLATION Ryan McGee, Joshua Dickinson, and George Legrady Experimental Visualization Lab Media Arts and Technology University of California, Santa

More information

DO YOU HEAR A BUMP OR A HOLE? AN EXPERIMENT ON TEMPORAL ASPECTS IN THE RECOGNITION OF FOOTSTEPS SOUNDS

DO YOU HEAR A BUMP OR A HOLE? AN EXPERIMENT ON TEMPORAL ASPECTS IN THE RECOGNITION OF FOOTSTEPS SOUNDS DO YOU HEAR A BUMP OR A HOLE? AN EXPERIMENT ON TEMPORAL ASPECTS IN THE RECOGNITION OF FOOTSTEPS SOUNDS Stefania Serafin, Luca Turchet and Rolf Nordahl Medialogy, Aalborg University Copenhagen Lautrupvang

More information

Preeti Rao 2 nd CompMusicWorkshop, Istanbul 2012

Preeti Rao 2 nd CompMusicWorkshop, Istanbul 2012 Preeti Rao 2 nd CompMusicWorkshop, Istanbul 2012 o Music signal characteristics o Perceptual attributes and acoustic properties o Signal representations for pitch detection o STFT o Sinusoidal model o

More information

Title: A Comparison of Different Tactile Output Devices In An Aviation Application

Title: A Comparison of Different Tactile Output Devices In An Aviation Application Page 1 of 6; 12/2/08 Thesis Proposal Title: A Comparison of Different Tactile Output Devices In An Aviation Application Student: Sharath Kanakamedala Advisor: Christopher G. Prince Proposal: (1) Provide

More information

SECTION I - CHAPTER 2 DIGITAL IMAGING PROCESSING CONCEPTS

SECTION I - CHAPTER 2 DIGITAL IMAGING PROCESSING CONCEPTS RADT 3463 - COMPUTERIZED IMAGING Section I: Chapter 2 RADT 3463 Computerized Imaging 1 SECTION I - CHAPTER 2 DIGITAL IMAGING PROCESSING CONCEPTS RADT 3463 COMPUTERIZED IMAGING Section I: Chapter 2 RADT

More information

Tutorial Day at MobileHCI 2008, Amsterdam

Tutorial Day at MobileHCI 2008, Amsterdam Tutorial Day at MobileHCI 2008, Amsterdam Text input for mobile devices by Scott MacKenzie Scott will give an overview of different input means (e.g. key based, stylus, predictive, virtual keyboard), parameters

More information

Automatic Online Haptic Graph Construction

Automatic Online Haptic Graph Construction Automatic Online Haptic Graph Construction Wai Yu, Kenneth Cheung, Stephen Brewster Glasgow Interactive Systems Group, Department of Computing Science University of Glasgow, Glasgow, UK {rayu, stephen}@dcs.gla.ac.uk

More information

Electric Audio Unit Un

Electric Audio Unit Un Electric Audio Unit Un VIRTUALMONIUM The world s first acousmonium emulated in in higher-order ambisonics Natasha Barrett 2017 User Manual The Virtualmonium User manual Natasha Barrett 2017 Electric Audio

More information

Head Tracker Range Checking

Head Tracker Range Checking Head Tracker Range Checking System Components Haptic Arm IR Transmitter Transmitter Screen Keyboard & Mouse 3D Glasses Remote Control Logitech Hardware Haptic Arm Power Supply Stand By button Procedure

More information

SGN Audio and Speech Processing

SGN Audio and Speech Processing Introduction 1 Course goals Introduction 2 SGN 14006 Audio and Speech Processing Lectures, Fall 2014 Anssi Klapuri Tampere University of Technology! Learn basics of audio signal processing Basic operations

More information

- applications on same or different network node of the workstation - portability of application software - multiple displays - open architecture

- applications on same or different network node of the workstation - portability of application software - multiple displays - open architecture 12 Window Systems - A window system manages a computer screen. - Divides the screen into overlapping regions. - Each region displays output from a particular application. X window system is widely used

More information

Buddy Bearings: A Person-To-Person Navigation System

Buddy Bearings: A Person-To-Person Navigation System Buddy Bearings: A Person-To-Person Navigation System George T Hayes School of Information University of California, Berkeley 102 South Hall Berkeley, CA 94720-4600 ghayes@ischool.berkeley.edu Dhawal Mujumdar

More information

Issues and Challenges of 3D User Interfaces: Effects of Distraction

Issues and Challenges of 3D User Interfaces: Effects of Distraction Issues and Challenges of 3D User Interfaces: Effects of Distraction Leslie Klein kleinl@in.tum.de In time critical tasks like when driving a car or in emergency management, 3D user interfaces provide an

More information

Advancements in Gesture Recognition Technology

Advancements in Gesture Recognition Technology IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka

More information

Learning the Proprioceptive and Acoustic Properties of Household Objects. Jivko Sinapov Willow Collaborators: Kaijen and Radu 6/24/2010

Learning the Proprioceptive and Acoustic Properties of Household Objects. Jivko Sinapov Willow Collaborators: Kaijen and Radu 6/24/2010 Learning the Proprioceptive and Acoustic Properties of Household Objects Jivko Sinapov Willow Collaborators: Kaijen and Radu 6/24/2010 What is Proprioception? It is the sense that indicates whether the

More information

Impulse noise features for automatic selection of noise cleaning filter

Impulse noise features for automatic selection of noise cleaning filter Impulse noise features for automatic selection of noise cleaning filter Odej Kao Department of Computer Science Technical University of Clausthal Julius-Albert-Strasse 37 Clausthal-Zellerfeld, Germany

More information

An Efficient Color Image Segmentation using Edge Detection and Thresholding Methods

An Efficient Color Image Segmentation using Edge Detection and Thresholding Methods 19 An Efficient Color Image Segmentation using Edge Detection and Thresholding Methods T.Arunachalam* Post Graduate Student, P.G. Dept. of Computer Science, Govt Arts College, Melur - 625 106 Email-Arunac682@gmail.com

More information

Haptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces

Haptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces In Usability Evaluation and Interface Design: Cognitive Engineering, Intelligent Agents and Virtual Reality (Vol. 1 of the Proceedings of the 9th International Conference on Human-Computer Interaction),

More information

Conversational Gestures For Direct Manipulation On The Audio Desktop

Conversational Gestures For Direct Manipulation On The Audio Desktop Conversational Gestures For Direct Manipulation On The Audio Desktop Abstract T. V. Raman Advanced Technology Group Adobe Systems E-mail: raman@adobe.com WWW: http://cs.cornell.edu/home/raman 1 Introduction

More information

Sound source localization and its use in multimedia applications

Sound source localization and its use in multimedia applications Notes for lecture/ Zack Settel, McGill University Sound source localization and its use in multimedia applications Introduction With the arrival of real-time binaural or "3D" digital audio processing,

More information

From Encoding Sound to Encoding Touch

From Encoding Sound to Encoding Touch From Encoding Sound to Encoding Touch Toktam Mahmoodi King s College London, UK http://www.ctr.kcl.ac.uk/toktam/index.htm ETSI STQ Workshop, May 2017 Immersing a person into the real environment with Very

More information

HEAD. Advanced Filters Module (Code 5019) Overview. Features. Module with various filter tools for sound design

HEAD. Advanced Filters Module (Code 5019) Overview. Features. Module with various filter tools for sound design HEAD Ebertstraße 30a 52134 Herzogenrath Tel.: +49 2407 577-0 Fax: +49 2407 577-99 email: info@head-acoustics.de Web: www.head-acoustics.de ASM 19 Data Datenblatt Sheet Advanced Filters Module (Code 5019)

More information

Blind navigation with a wearable range camera and vibrotactile helmet

Blind navigation with a wearable range camera and vibrotactile helmet Blind navigation with a wearable range camera and vibrotactile helmet (author s name removed for double-blind review) X university 1@2.com (author s name removed for double-blind review) X university 1@2.com

More information

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Joan De Boeck, Karin Coninx Expertise Center for Digital Media Limburgs Universitair Centrum Wetenschapspark 2, B-3590 Diepenbeek, Belgium

More information

HUMAN-COMPUTER INTERACTION: OVERVIEW ON STATE OF THE ART TECHNOLOGY

HUMAN-COMPUTER INTERACTION: OVERVIEW ON STATE OF THE ART TECHNOLOGY HUMAN-COMPUTER INTERACTION: OVERVIEW ON STATE OF THE ART TECHNOLOGY *Ms. S. VAISHNAVI, Assistant Professor, Sri Krishna Arts And Science College, Coimbatore. TN INDIA **SWETHASRI. L., Final Year B.Com

More information

THE WII REMOTE AS AN INPUT DEVICE FOR 3D INTERACTION IN IMMERSIVE HEAD-MOUNTED DISPLAY VIRTUAL REALITY

THE WII REMOTE AS AN INPUT DEVICE FOR 3D INTERACTION IN IMMERSIVE HEAD-MOUNTED DISPLAY VIRTUAL REALITY IADIS International Conference Gaming 2008 THE WII REMOTE AS AN INPUT DEVICE FOR 3D INTERACTION IN IMMERSIVE HEAD-MOUNTED DISPLAY VIRTUAL REALITY Yang-Wai Chow School of Computer Science and Software Engineering

More information

Multi-Modal User Interaction

Multi-Modal User Interaction Multi-Modal User Interaction Lecture 4: Multiple Modalities Zheng-Hua Tan Department of Electronic Systems Aalborg University, Denmark zt@es.aau.dk MMUI, IV, Zheng-Hua Tan 1 Outline Multimodal interface

More information

Proceedings of Meetings on Acoustics

Proceedings of Meetings on Acoustics Proceedings of Meetings on Acoustics Volume 19, 2013 http://acousticalsociety.org/ ICA 2013 Montreal Montreal, Canada 2-7 June 2013 Architectural Acoustics Session 1pAAa: Advanced Analysis of Room Acoustics:

More information

APPLICATION OF COMPUTER VISION FOR DETERMINATION OF SYMMETRICAL OBJECT POSITION IN THREE DIMENSIONAL SPACE

APPLICATION OF COMPUTER VISION FOR DETERMINATION OF SYMMETRICAL OBJECT POSITION IN THREE DIMENSIONAL SPACE APPLICATION OF COMPUTER VISION FOR DETERMINATION OF SYMMETRICAL OBJECT POSITION IN THREE DIMENSIONAL SPACE Najirah Umar 1 1 Jurusan Teknik Informatika, STMIK Handayani Makassar Email : najirah_stmikh@yahoo.com

More information

Unit 1.1: Information representation

Unit 1.1: Information representation Unit 1.1: Information representation 1.1.1 Different number system A number system is a writing system for expressing numbers, that is, a mathematical notation for representing numbers of a given set,

More information

Artex: Artificial Textures from Everyday Surfaces for Touchscreens

Artex: Artificial Textures from Everyday Surfaces for Touchscreens Artex: Artificial Textures from Everyday Surfaces for Touchscreens Andrew Crossan, John Williamson and Stephen Brewster Glasgow Interactive Systems Group Department of Computing Science University of Glasgow

More information

TWO-DIMENSIONAL FOURIER PROCESSING OF RASTERISED AUDIO

TWO-DIMENSIONAL FOURIER PROCESSING OF RASTERISED AUDIO TWO-DIMENSIONAL FOURIER PROCESSING OF RASTERISED AUDIO Chris Pike, Department of Electronics Univ. of York, UK chris.pike@rd.bbc.co.uk Jeremy J. Wells, Audio Lab, Dept. of Electronics Univ. of York, UK

More information

AUDIO-ENHANCED COLLABORATION AT AN INTERACTIVE ELECTRONIC WHITEBOARD. Christian Müller Tomfelde and Sascha Steiner

AUDIO-ENHANCED COLLABORATION AT AN INTERACTIVE ELECTRONIC WHITEBOARD. Christian Müller Tomfelde and Sascha Steiner AUDIO-ENHANCED COLLABORATION AT AN INTERACTIVE ELECTRONIC WHITEBOARD Christian Müller Tomfelde and Sascha Steiner GMD - German National Research Center for Information Technology IPSI- Integrated Publication

More information

ISONIC PA AUT Spiral Scan Inspection of Tubular Parts Operating Manual and Inspection Procedure Rev 1.00 Sonotron NDT

ISONIC PA AUT Spiral Scan Inspection of Tubular Parts Operating Manual and Inspection Procedure Rev 1.00 Sonotron NDT ISONIC PA AUT Spiral Scan Inspection of Tubular Parts Operating Manual and Inspection Procedure Rev 1.00 Sonotron NDT General ISONIC PA AUT Spiral Scan Inspection Application was designed on the platform

More information

JUGGLING SOUNDS. Till Bovermann 1, Jonas Groten 2, Alberto de Campo 1, Gerhard Eckel 1

JUGGLING SOUNDS. Till Bovermann 1, Jonas Groten 2, Alberto de Campo 1, Gerhard Eckel 1 JUGGLING SOUNDS Till Bovermann 1, Jonas Groten 2, Alberto de Campo 1, Gerhard Eckel 1 Institute of Electronic Music and Acoustics 1 Joanneum Research 2 University of Music and Dramatic Arts Institute of

More information

GEOMETRIC SHAPE DETECTION WITH SOUNDVIEW. Department of Computer Science 1 Department of Psychology 2 University of British Columbia Vancouver, Canada

GEOMETRIC SHAPE DETECTION WITH SOUNDVIEW. Department of Computer Science 1 Department of Psychology 2 University of British Columbia Vancouver, Canada GEOMETRIC SHAPE DETECTION WITH SOUNDVIEW K. van den Doel 1, D. Smilek 2, A. Bodnar 1, C. Chita 1, R. Corbett 1, D. Nekrasovski 1, J. McGrenere 1 Department of Computer Science 1 Department of Psychology

More information

The analysis of multi-channel sound reproduction algorithms using HRTF data

The analysis of multi-channel sound reproduction algorithms using HRTF data The analysis of multichannel sound reproduction algorithms using HRTF data B. Wiggins, I. PatersonStephens, P. Schillebeeckx Processing Applications Research Group University of Derby Derby, United Kingdom

More information

Introduction to HCI. CS4HC3 / SE4HC3/ SE6DO3 Fall Instructor: Kevin Browne

Introduction to HCI. CS4HC3 / SE4HC3/ SE6DO3 Fall Instructor: Kevin Browne Introduction to HCI CS4HC3 / SE4HC3/ SE6DO3 Fall 2011 Instructor: Kevin Browne brownek@mcmaster.ca Slide content is based heavily on Chapter 1 of the textbook: Designing the User Interface: Strategies

More information

Stamp detection in scanned documents

Stamp detection in scanned documents Annales UMCS Informatica AI X, 1 (2010) 61-68 DOI: 10.2478/v10065-010-0036-6 Stamp detection in scanned documents Paweł Forczmański Chair of Multimedia Systems, West Pomeranian University of Technology,

More information