

The development of a computer-based, physically modelled musical instrument with haptic feedback, for the performance and composition of electroacoustic music

S. Rimell, D.M. Howard, A.D. Hunt, P.R. Kirk, A.M. Tyrrell
Music Technology Research Group, Dept of Electronics, University of York

Abstract

Conventional computer-based instruments that make use of a piano-style electronic keyboard are creatively limiting for a number of reasons. Expressive compositional and performance potential in such instruments is restricted due to the indirect relationship between the musician's gesture and the sound produced. Tactile feedback cues that are so important to the performer of acoustic instruments are also lacking, furthering the effect of physically isolating the musician from their instrument. In addition, electronically produced timbres commonly sound unnatural, sterile and lacking in physical causality. This paper describes the development of a computer-based musical instrument called Cymatic that addresses these problems. Cymatic uses techniques of physical modelling to synthesise various musical components, including strings, membranes and volumes (and even shapes comprising more than three dimensions!), which can be joined together to form complex instruments. Various excitations can be applied to the virtual instrument to set it resonating, including bowing, plucking, microphone input, wave oscillators and random forces, producing a stereo output in real time. The graphical user interface displays an animation of the resonating structure as it is being played, providing the user with the impression of working with real physical musical components. User input is currently achieved via a force-feedback joystick, the axes and buttons of which can be mapped to any of the instrument's parameters, providing real-time, multiparametric gestural control. Tactile feedback is transmitted back to the joystick controller, allowing the user to feel and interact with the vibrations of the instrument. This reconnects the tactile feedback path that is missing from most computer-based musical instruments, increasing the sense of physical connection and causality between the musician and instrument.
Musical examples will demonstrate the natural and organic-sounding nature of Cymatic's physical model, and its creative potential as a performance and compositional tool will be discussed.

Introduction

Compared to traditional acoustic instruments, the computer in principle offers a virtual infinity of musical potential, representing a form of tabula rasa which can be configured to suit the individual's needs and aesthetic interests. In practice, however, computer-based instruments have yet to attain creative equality with their physical counterparts. Despite the musical and creative freedom it offers, current technology places serious limitations on the expressive potential of computer-based instruments, particularly in the realm of live performance. Computer instruments are often criticised as being cold, lifeless or lacking in expression, whereas physical instruments may be described as warm, intimate or organic. Techniques of physical modelling have gone some way to addressing these criticisms by modelling the sound creation mechanism rather than attempting to emulate its waveform. By intrinsically linking the sound creation mechanism with the sound produced, physically modelled instruments more closely resemble their acoustic cousins, accurately and naturally recreating the aperiodic and transient sonorities more commonly associated with timbres of physical causality. Manipulating physical parameters (such as string length, tension and mass) is also more intuitive for musicians than dealing with arbitrary and unpredictable low-level parameters such as amplitude modulation, oscillator frequency and phasing (as with more conventional synthesis methods). However, the fact remains that however accurately an instrument can be modelled inside a computer, it remains a separate and untouchable entity for the musician, who is physically divorced from the instrument by the physical-virtual divide. The most common method of attempting to bridge this void is to employ the piano-style MIDI keyboard to control the synthesised instrument's parameters.
The keyboard is a familiar and flexible interface for the control of sound, but falls short in a number of respects when it comes to permitting total creative control over multiple instrumental parameters. A typical keyboard offers only one degree of freedom, and its underlying MIDI protocol treats notes as discrete events bound by an onset and an offset. While this may be sufficient to control synthesis models which share its interface (such as the piano, organ or harpsichord), timbres that evolve between the note onset and offset (e.g. those of stringed and wind instruments) are less well served. MIDI functions such as aftertouch, modulation and pitch bend have gone some way to addressing this problem, but still do not offer anything like the expressive potential of many acoustic instruments.
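The note-as-discrete-event framing can be seen directly in the MIDI byte stream. The following sketch (channel, pitch and controller values chosen purely for illustration) shows how a note is bounded by two events, with anything that evolves in between carried by separate controller messages such as pitch bend:

```python
# A MIDI note is two discrete events: note-on and note-off.
# Status byte 0x90 | channel = note-on, 0x80 | channel = note-off.
def note_on(channel, pitch, velocity):
    return bytes([0x90 | channel, pitch, velocity])

def note_off(channel, pitch):
    return bytes([0x80 | channel, pitch, 0])

# Everything between onset and offset must be expressed via separate
# messages, e.g. pitch bend: a 14-bit value, centre = 8192 (no bend),
# sent as two 7-bit data bytes (least significant first).
def pitch_bend(channel, value):
    return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

middle_c_on = note_on(0, 60, 100)
no_bend = pitch_bend(0, 8192)
```

A continuously bowed or blown timbre therefore has to be approximated by a stream of such controller messages alongside the single note-on/note-off pair.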

Also, regardless of the expressive potential of a particular computer interface, the musician still remains physically detached from the sound source, which resides in the virtual domain. After audition itself, the haptic senses provide the most important means for observing the behaviour of musical instruments [1], but as computers have evolved to prioritise visual stimuli over tactile control, the haptic senses have been left sadly undernourished. Both tactile (vibrational and textural) and proprioceptive (awareness of one's body state and the forces upon it) cues are vital in combination with aural feedback to create a complex and realistic musical impression [2]. Haptic feedback has also been shown to improve the playability of computer instruments [3], helping the user to gain internal models of the instrument's functionality. Furthermore, computers release the musician from the 'one gesture, one acoustic event' paradigm [4] which characterises acoustic instruments, alienating the musician's gesture from the sound produced. These factors lead to an overall isolation of the musician from the instrument, exaggerating perceptions of computer instruments as cold, lifeless and lacking in expression. Figure 1 shows a simplified diagrammatic representation of the contrasting input and feedback paths of typical acoustic and computer instruments. Many attempts have been made to incorporate haptic feedback into computer instrument controllers. Electronic keyboards have been developed to more closely replicate the feel of real pianos [5] and to provide tactile feedback [6]. Haptic feedback bows have been made to simulate the feel and forces involved in playing bowed instruments [7], and finger-fitted vibrational devices have been utilised in open-air gestural musical instruments [8]. However, haptic control devices have so far been generally restrictive in their potential for application across different computer instruments, and inaccessible to the musical masses.

Figure 1: Diagrammatic representation showing input and feedback paths for a) acoustic instruments, b) typical computer instruments, c) the proposed computer instrument incorporating tactile feedback (i.e. Cymatic).

Cymatic: an overview

Cymatic is a new computer-based musical instrument in development which attempts to address the problems described above. Taking inspiration from sound synthesis environments such as TAO [9], Mosaic [10] and CORDIS-ANIMA [11], Cymatic makes use of a mass-spring physical modelling paradigm, which is implemented as a library of individual components in 1, 2, 3 or even more dimensions, providing strings, membranes, volumes, and structures of higher dimensionality that could never be physically realised. These structures can be joined together to form a complex instrument by arbitrarily connecting individual components. The instrument can be excited at any desired mass in the model by plucking, bowing, arbitrary synthesised waveforms or via an external audio signal.
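The essence of the mass-spring paradigm can be sketched for the 1D (string) case: a chain of point masses coupled by springs, a pluck applied as an initial displacement at one mass, and the sampled displacement at a "virtual microphone" mass taken as the audio output. This is a minimal illustration only, not the actual Cymatic implementation, and all parameter values are invented:

```python
# Minimal 1D mass-spring "string": fixed ends, plucked at one mass,
# displacement sampled at a virtual-microphone mass each time step.
# (Sketch under assumed parameter values, not Cymatic's real code.)
def simulate_plucked_string(n_masses=30, stiffness=800.0, mass=0.001,
                            damping=0.9999, fs=44100, n_samples=2000,
                            pluck_at=10, mic_at=20):
    dt = 1.0 / fs
    y = [0.0] * n_masses          # displacement of each mass
    v = [0.0] * n_masses          # velocity of each mass
    y[pluck_at] = 0.005           # pluck: initial displacement
    out = []
    for _ in range(n_samples):
        # Update velocities from spring forces (end masses stay fixed),
        # then positions: semi-implicit Euler, stable for this dt.
        for i in range(1, n_masses - 1):
            force = stiffness * (y[i - 1] - 2.0 * y[i] + y[i + 1])
            v[i] = damping * (v[i] + (force / mass) * dt)
        for i in range(1, n_masses - 1):
            y[i] += v[i] * dt
        out.append(y[mic_at])     # sampled displacement = audio output
    return out

signal = simulate_plucked_string()
```

The per-sample cost grows with the number of masses, which is why a fixed processing budget trades instrument size against sampling frequency, as described below.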

The size, shape, tension, mass and damping characteristics of each component can be user-defined and can be varied dynamically in real time during synthesis, which is one way of creating an instrument that could never be created in the real physical world. The resulting sound can be heard by placing any number of virtual microphones at user-defined points on the instrument; the output is the sampled displacement of the mass-spring cell to which each microphone is attached. Cymatic currently runs on a Silicon Graphics O2 workstation running Irix 6.5. The acoustic output can be produced in real time for a modest-sized structure of around 60 masses (at a 44.1 kHz sampling frequency) or off-line for a larger structure. The user has some degree of control over the trade-off between real-time operation and instrument size by selecting the sampling frequency, which can be set at various values between 8 kHz and 48 kHz. Cymatic can display real-time OpenGL animations of the resonating structures. These animations are a useful form of visual feedback which, when combined with auditory and haptic feedback, creates a complex and realistic impression of working with real physical instruments. Animations can also provide useful feedback as to the authenticity of the physically modelled excitations and can be used to visually demonstrate acoustic principles.

The control interface

MIDI controller inputs offer real-time control of Cymatic, allowing variation of each of the physical parameters of the instrument, i.e. mass, tension, damping, excitation (force, velocity and excitation point) and virtual microphone point. Any device capable of transmitting MIDI controller messages can be employed to control Cymatic. Currently, multi-parametric MIDI input is provided via a Microsoft Sidewinder Force Feedback Pro joystick [12] and a Logitech iFeel mouse [13].
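In practice this kind of controller mapping amounts to scaling 7-bit MIDI controller values (0-127) onto each physical parameter's range. A hypothetical sketch, with parameter names and ranges invented for illustration rather than taken from Cymatic:

```python
# Map 7-bit MIDI controller values (0-127) onto physical parameter
# ranges. Names and ranges are illustrative, not Cymatic's actual
# internal values.
PARAMETER_RANGES = {
    "tension":   (10.0, 2000.0),   # arbitrary units
    "mass":      (0.0001, 0.01),
    "damping":   (0.99, 1.0),
    "bow_force": (0.0, 5.0),
}

def cc_to_parameter(name, cc_value):
    """Linearly scale a MIDI CC value onto the named parameter's range."""
    lo, hi = PARAMETER_RANGES[name]
    return lo + (cc_value / 127.0) * (hi - lo)

# e.g. one joystick axis, transmitted as a CC, drives string tension:
tension = cc_to_parameter("tension", 64)
```

Because each joystick axis or button arrives as an independent controller number, several such mappings can run simultaneously, giving the multi-parametric control described above.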
The combination of these controllers not only allows simultaneous control of up to six physical parameters but also provides tactile and proprioceptive feedback to the user. The iFeel mouse is a conventional optical mouse containing a vibrotactile device that can create numerous tactile sensations. The Microsoft joystick boasts the ability to output up to six forces simultaneously, enabling the programmer to create realistic force effects such as vibration, recoil, damping and friction that characterise real instruments. Both control devices are interfaced with a Windows PC (as they are not SGI-compatible), where their movement data is converted by software to MIDI in order to remotely control Cymatic. The iFeel mouse produces tactile sensations using Immersion's software (Immersion TouchSense Entertainment [14]), which converts Cymatic's audio output directly into frequency- and amplitude-related tactile sensations. The haptic capabilities of the Microsoft joystick can be programmed via MIDI: Cymatic feeds the appropriate haptic commands to the joystick while the instrument is playing in order to recreate the feel and tactile response of a real instrument. The effect of each controller is user-defined, so the control interface can be configured to suit the instrument and excitation that Cymatic is running. For example, the velocity of the mouse could be used to control the bow velocity of a Cymatic bowed-string instrument while the joystick controls the force of the bow, the excitation point, and the tension and mass of the string. Vibrational forces corresponding to the frequency and amplitude of the sound output would also be felt by both hands, and forces corresponding to the bow force and position could be felt by the joystick hand. The joystick can also be used as a virtual drumstick for exciting a Cymatic membrane instrument, with the velocity of the y-axis mapped to the force of the drumstick; recoil forces from the joystick allow the virtual membrane to be felt by the user. Cymatic's real-time audio input also offers interesting creative possibilities: an audio input from any sound source can be used to excite a virtual instrument while the joystick and mouse manipulate its physical characteristics.

Conclusions and future work

Cymatic is not intended to accurately model real instruments; rather, its intention is to allow the musician to intuitively and physically interact with new instruments which may not be practical or possible to conceive in the physical world.
Its tactile and gestural interfaces reconnect the haptic feedback path that is missing from most computer instruments, immersing the musician more fully in the playing process and enhancing the perception of manipulating actual physical instruments. It is intended that the synthesis method and control interface will make Cymatic a warmer, more organic and more expressive instrument than conventional computer-based instruments. Empirical research will be carried out in the near future to guide Cymatic's development towards these goals. Currently, Cymatic's only limiting factor is processing power. Due to the heavy demands of the mass-spring physical modelling technique, Cymatic is currently restricted to running a maximum of around 60 mass-spring cells in real time at 44.1 kHz (though many more at lower sampling rates). Work is underway to adapt the code to run on an Origin 8-node parallel computer, which should increase the speed of the model by up to 16 times, potentially limiting Cymatic's complexity and creative capacity only by the musician's imagination.

Acknowledgements

The authors thank the Engineering and Physical Sciences Research Council for their support of this work under grant number GR/M94137.

Address for correspondence:

Stuart Rimell:
David Howard:
Andy Hunt:
Ross Kirk:
Andy Tyrrell:

Music Technology Research Group, Dept of Electronics, University of York, York YO10 5DD, UK

References

[1] Cook, P.R. (1999). Music, Cognition & Computerised Sound: An Introduction to Psychoacoustics. MIT Press, London. p. 229.
[2] MacLean, K.E. (2000). Designing With Haptic Feedback.
[3] O'Modhrain, M.S. and Chafe, C. (2000). Incorporating Haptic Feedback Into Interfaces For Music Applications. Proceedings of ISORA World Automation Conference.
[4] Wessel, D. and Wright, M. (2000). Problems and Prospects for Intimate Musical Control of Computers. cnmat.cnmat.berkeley.edu/research/chi2000/wessel.pdf
[5] Gillespie, B. (1992). Proc. ICMC, San Jose, CA. pp.
[6] Cadoz, C., Luciani, A. and Florens, J.L. (1984). Responsive Input Devices and Sound Synthesis by Simulation of Instrumental Mechanisms: The Cordis System. Computer Music Journal 8(3): pp.
[7] Nichols, C. (2001). The vBow: Haptic Feedback and Sound Synthesis of a Virtual Violin Bow Controller.
[8] Rovan, J. (2000). Typology of Tactile Sounds and their Synthesis in Gesture-Driven Computer Music Performance. In Trends in Gestural Control of Music. Wanderley, M. and Battier, M. (eds). Editions IRCAM, Paris.
[9] Pearson, M. and Howard, D.M. (1995). A musician's approach to physical modelling. Proc. ICMC, pp.
[10] Morrison, J.D. and Adrien, J-M. (1993). MOSAIC: A Framework for Modal Synthesis. Computer Music Journal 17(1).
[11] Cadoz, C., Luciani, A. and Florens, J.L. (1993). CORDIS-ANIMA: A Modelling System for Sound and Image Synthesis, the General Formalism. Computer Music Journal 17(1): 19-29.
[12]
[13]
[14]


More information

The Resource-Instance Model of Music Representation 1

The Resource-Instance Model of Music Representation 1 The Resource-Instance Model of Music Representation 1 Roger B. Dannenberg, Dean Rubine, Tom Neuendorffer Information Technology Center School of Computer Science Carnegie Mellon University Pittsburgh,

More information

MEASURING THE BOW PRESSING FORCE IN A REAL VIOLIN PERFORMANCE

MEASURING THE BOW PRESSING FORCE IN A REAL VIOLIN PERFORMANCE MEASURING THE BOW PRESSING FORCE IN A REAL VIOLIN PERFORMANCE Enric Guaus, Jordi Bonada, Alfonso Perez, Esteban Maestre, Merlijn Blaauw Music Technology Group, Pompeu Fabra University Ocata 1, 08003 Barcelona

More information

Quarterly Progress and Status Report. Observations on the transient components of the piano tone

Quarterly Progress and Status Report. Observations on the transient components of the piano tone Dept. for Speech, Music and Hearing Quarterly Progress and Status Report Observations on the transient components of the piano tone Askenfelt, A. journal: STL-QPSR volume: 34 number: 4 year: 1993 pages:

More information

Speech, Hearing and Language: work in progress. Volume 12

Speech, Hearing and Language: work in progress. Volume 12 Speech, Hearing and Language: work in progress Volume 12 2 Construction of a rotary vibrator and its application in human tactile communication Abbas HAYDARI and Stuart ROSEN Department of Phonetics and

More information

Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp

Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp. 105-124. http://eprints.gla.ac.uk/3273/ Glasgow eprints Service http://eprints.gla.ac.uk

More information

Using VR and simulation to enable agile processes for safety-critical environments

Using VR and simulation to enable agile processes for safety-critical environments Using VR and simulation to enable agile processes for safety-critical environments Michael N. Louka Department Head, VR & AR IFE Digital Systems Virtual Reality Virtual Reality: A computer system used

More information

The Effect of Frequency Shifting on Audio-Tactile Conversion for Enriching Musical Experience

The Effect of Frequency Shifting on Audio-Tactile Conversion for Enriching Musical Experience The Effect of Frequency Shifting on Audio-Tactile Conversion for Enriching Musical Experience Ryuta Okazaki 1,2, Hidenori Kuribayashi 3, Hiroyuki Kajimioto 1,4 1 The University of Electro-Communications,

More information

MODELING BILL S GAIT: ANALYSIS AND PARAMETRIC SYNTHESIS OF WALKING SOUNDS

MODELING BILL S GAIT: ANALYSIS AND PARAMETRIC SYNTHESIS OF WALKING SOUNDS MODELING BILL S GAIT: ANALYSIS AND PARAMETRIC SYNTHESIS OF WALKING SOUNDS PERRY R. COOK Princeton University Dept. of Computer Science (also Music), 35 Olden St., Princeton, NJ, USA, 08544 prc@cs.princeton.edu

More information

Standing Waves. Lecture 21. Chapter 21. Physics II. Course website:

Standing Waves. Lecture 21. Chapter 21. Physics II. Course website: Lecture 21 Chapter 21 Physics II Standing Waves Course website: http://faculty.uml.edu/andriy_danylov/teaching/physicsii Lecture Capture: http://echo360.uml.edu/danylov201415/physics2spring.html Standing

More information

StringTone Testing and Results

StringTone Testing and Results StringTone Testing and Results Test Objectives The purpose of this audio test series is to determine if topical application of StringTone to strings of electric and acoustic musical instruments is effective

More information

Coney Island: Combining jmax, Spat and VSS for Acoustic Integration of Spatial and Temporal Models in a Virtual Reality Installation

Coney Island: Combining jmax, Spat and VSS for Acoustic Integration of Spatial and Temporal Models in a Virtual Reality Installation Coney Island: Combining jmax, Spat and VSS for Acoustic Integration of Spatial and Temporal Models in a Virtual Reality Installation Robin Bargar* (rbargar@ncsa.uiuc.edu), Francois Dechelle (Dechelle@ircam.fr),

More information

A Pilot Study: Introduction of Time-domain Segment to Intensity-based Perception Model of High-frequency Vibration

A Pilot Study: Introduction of Time-domain Segment to Intensity-based Perception Model of High-frequency Vibration A Pilot Study: Introduction of Time-domain Segment to Intensity-based Perception Model of High-frequency Vibration Nan Cao, Hikaru Nagano, Masashi Konyo, Shogo Okamoto 2 and Satoshi Tadokoro Graduate School

More information

Fitur YAMAHA ELS-02C. An improved and superbly expressive STAGEA. AWM Tone Generator. Super Articulation Voices

Fitur YAMAHA ELS-02C. An improved and superbly expressive STAGEA. AWM Tone Generator. Super Articulation Voices Fitur YAMAHA ELS-02C An improved and superbly expressive STAGEA Generating all the sounds of the world AWM Tone Generator The Advanced Wave Memory (AWM) tone generator incorporates 986 voices. A wide variety

More information

RS380 MODULATION CONTROLLER

RS380 MODULATION CONTROLLER RS380 MODULATION CONTROLLER The RS380 is a composite module comprising four separate sub-modules that you can patch together or with other RS Integrator modules to generate and control a wide range of

More information

An Evaluation of Input Devices for Timbre Space Navigation

An Evaluation of Input Devices for Timbre Space Navigation An Evaluation of Input Devices for Timbre Space Navigation ROEL VERTEGAAL MPhil Dissertation Department of Computing University of Bradford 1994 An Evaluation of Input Devices for Timbre Space Navigation

More information

INVESTIGATING BINAURAL LOCALISATION ABILITIES FOR PROPOSING A STANDARDISED TESTING ENVIRONMENT FOR BINAURAL SYSTEMS

INVESTIGATING BINAURAL LOCALISATION ABILITIES FOR PROPOSING A STANDARDISED TESTING ENVIRONMENT FOR BINAURAL SYSTEMS 20-21 September 2018, BULGARIA 1 Proceedings of the International Conference on Information Technologies (InfoTech-2018) 20-21 September 2018, Bulgaria INVESTIGATING BINAURAL LOCALISATION ABILITIES FOR

More information

Using Simple Force Feedback Mechanisms as Haptic Visualization Tools.

Using Simple Force Feedback Mechanisms as Haptic Visualization Tools. Using Simple Force Feedback Mechanisms as Haptic Visualization Tools. Anders J Johansson, Joakim Linde Teiresias Research Group (www.bigfoot.com/~teiresias) Abstract Force feedback (FF) is a technology

More information

APPENDIX B Setting up a home recording studio

APPENDIX B Setting up a home recording studio APPENDIX B Setting up a home recording studio READING activity PART n.1 A modern home recording studio consists of the following parts: 1. A computer 2. An audio interface 3. A mixer 4. A set of microphones

More information

In Phase. Out of Phase

In Phase. Out of Phase Superposition Interference Waves ADD: Constructive Interference. Waves SUBTRACT: Destructive Interference. In Phase Out of Phase Superposition Traveling waves move through each other, interfere, and keep

More information

THE DISPLACED BOW AND APHISMS: ABSTRACT PHYSICALLY INFORMED SYNTHESIS METHODS FOR COMPOSITION AND INTERACTIVE PERFORMANCE.

THE DISPLACED BOW AND APHISMS: ABSTRACT PHYSICALLY INFORMED SYNTHESIS METHODS FOR COMPOSITION AND INTERACTIVE PERFORMANCE. THE DISPLACED BOW AND APHISMS: ABSTRACT PHYSICALLY INFORMED SYNTHESIS METHODS FOR COMPOSITION AND INTERACTIVE PERFORMANCE Georg Essl Computer & Information Science & Engineering University of Florida gessl@cise.ufl.edu

More information

SIMULATION MODELING WITH ARTIFICIAL REALITY TECHNOLOGY (SMART): AN INTEGRATION OF VIRTUAL REALITY AND SIMULATION MODELING

SIMULATION MODELING WITH ARTIFICIAL REALITY TECHNOLOGY (SMART): AN INTEGRATION OF VIRTUAL REALITY AND SIMULATION MODELING Proceedings of the 1998 Winter Simulation Conference D.J. Medeiros, E.F. Watson, J.S. Carson and M.S. Manivannan, eds. SIMULATION MODELING WITH ARTIFICIAL REALITY TECHNOLOGY (SMART): AN INTEGRATION OF

More information

Band-Limited Simulation of Analog Synthesizer Modules by Additive Synthesis

Band-Limited Simulation of Analog Synthesizer Modules by Additive Synthesis Band-Limited Simulation of Analog Synthesizer Modules by Additive Synthesis Amar Chaudhary Center for New Music and Audio Technologies University of California, Berkeley amar@cnmat.berkeley.edu March 12,

More information

User Interface Agents

User Interface Agents User Interface Agents Roope Raisamo (rr@cs.uta.fi) Department of Computer Sciences University of Tampere http://www.cs.uta.fi/sat/ User Interface Agents Schiaffino and Amandi [2004]: Interface agents are

More information

Principles of Musical Acoustics

Principles of Musical Acoustics William M. Hartmann Principles of Musical Acoustics ^Spr inger Contents 1 Sound, Music, and Science 1 1.1 The Source 2 1.2 Transmission 3 1.3 Receiver 3 2 Vibrations 1 9 2.1 Mass and Spring 9 2.1.1 Definitions

More information

Direction-Dependent Physical Modeling of Musical Instruments

Direction-Dependent Physical Modeling of Musical Instruments 15th International Congress on Acoustics (ICA 95), Trondheim, Norway, June 26-3, 1995 Title of the paper: Direction-Dependent Physical ing of Musical Instruments Authors: Matti Karjalainen 1,3, Jyri Huopaniemi

More information

Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces

Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces Katrin Wolf Telekom Innovation Laboratories TU Berlin, Germany katrin.wolf@acm.org Peter Bennett Interaction and Graphics

More information

Graphical User Interfaces for Blind Users: An Overview of Haptic Devices

Graphical User Interfaces for Blind Users: An Overview of Haptic Devices Graphical User Interfaces for Blind Users: An Overview of Haptic Devices Hasti Seifi, CPSC554m: Assignment 1 Abstract Graphical user interfaces greatly enhanced usability of computer systems over older

More information

Advanced Audiovisual Processing Expected Background

Advanced Audiovisual Processing Expected Background Advanced Audiovisual Processing Expected Background As an advanced module, we will not cover introductory topics in lecture. You are expected to already be proficient with all of the following topics,

More information

HEARING IMAGES: INTERACTIVE SONIFICATION INTERFACE FOR IMAGES

HEARING IMAGES: INTERACTIVE SONIFICATION INTERFACE FOR IMAGES HEARING IMAGES: INTERACTIVE SONIFICATION INTERFACE FOR IMAGES ICSRiM University of Leeds School of Music and School of Computing Leeds LS2 9JT UK info@icsrim.org.uk www.icsrim.org.uk Abstract The paper

More information

Laboratory Assignment 4. Fourier Sound Synthesis

Laboratory Assignment 4. Fourier Sound Synthesis Laboratory Assignment 4 Fourier Sound Synthesis PURPOSE This lab investigates how to use a computer to evaluate the Fourier series for periodic signals and to synthesize audio signals from Fourier series

More information

A Java Virtual Sound Environment

A Java Virtual Sound Environment A Java Virtual Sound Environment Proceedings of the 15 th Annual NACCQ, Hamilton New Zealand July, 2002 www.naccq.ac.nz ABSTRACT Andrew Eales Wellington Institute of Technology Petone, New Zealand andrew.eales@weltec.ac.nz

More information

Lab 12. Vibrating Strings

Lab 12. Vibrating Strings Lab 12. Vibrating Strings Goals To experimentally determine relationships between fundamental resonant of a vibrating string and its length, its mass per unit length, and tension in string. To introduce

More information

Benefits of using haptic devices in textile architecture

Benefits of using haptic devices in textile architecture 28 September 2 October 2009, Universidad Politecnica de Valencia, Spain Alberto DOMINGO and Carlos LAZARO (eds.) Benefits of using haptic devices in textile architecture Javier SANCHEZ *, Joan SAVALL a

More information

Multi-point vibrotactile feedback for an expressive musical interface

Multi-point vibrotactile feedback for an expressive musical interface Proceedings of the International Conference on New Interfaces for Musical Expression, Baton Rouge, LA, USA, May 3-June 3, 205 Multi-point vibrotactile feedback for an expressive musical interface Stefano

More information

Falcon Singles - Oud for Falcon

Falcon Singles - Oud for Falcon Falcon Singles - Oud for Falcon 2016 Simon Stockhausen Installation As there is no default location for 3rd party sound libraries for Falcon, you can just install the folder Oud which you extracted from

More information

The CHAI Libraries. F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, D. Morris L. Sentis, E. Vileshin, J. Warren, O. Khatib, K.

The CHAI Libraries. F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, D. Morris L. Sentis, E. Vileshin, J. Warren, O. Khatib, K. The CHAI Libraries F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, D. Morris L. Sentis, E. Vileshin, J. Warren, O. Khatib, K. Salisbury Computer Science Department, Stanford University, Stanford CA

More information

Using low cost devices to support non-visual interaction with diagrams & cross-modal collaboration

Using low cost devices to support non-visual interaction with diagrams & cross-modal collaboration 22 ISSN 2043-0167 Using low cost devices to support non-visual interaction with diagrams & cross-modal collaboration Oussama Metatla, Fiore Martin, Nick Bryan-Kinns and Tony Stockman EECSRR-12-03 June

More information

DR BRIAN BRIDGES SOUND SYNTHESIS IN LOGIC II

DR BRIAN BRIDGES SOUND SYNTHESIS IN LOGIC II DR BRIAN BRIDGES BD.BRIDGES@ULSTER.AC.UK SOUND SYNTHESIS IN LOGIC II RECAP... Synthesis: artificial sound generation Variety of methods: additive, subtractive, modulation, physical modelling, wavetable

More information

Physics-Based Sound Synthesis

Physics-Based Sound Synthesis 1 Physics-Based Sound Synthesis ELEC-E5620 - Audio Signal Processing, Lecture #8 Vesa Välimäki Sound check Course Schedule in 2017 0. General issues (Vesa & Fabian) 13.1.2017 1. History and future of audio

More information

Computer Haptics and Applications

Computer Haptics and Applications Computer Haptics and Applications EURON Summer School 2003 Cagatay Basdogan, Ph.D. College of Engineering Koc University, Istanbul, 80910 (http://network.ku.edu.tr/~cbasdogan) Resources: EURON Summer School

More information

SGN Audio and Speech Processing

SGN Audio and Speech Processing Introduction 1 Course goals Introduction 2 SGN 14006 Audio and Speech Processing Lectures, Fall 2014 Anssi Klapuri Tampere University of Technology! Learn basics of audio signal processing Basic operations

More information

VIRTUAL REALITY FOR NONDESTRUCTIVE EVALUATION APPLICATIONS

VIRTUAL REALITY FOR NONDESTRUCTIVE EVALUATION APPLICATIONS VIRTUAL REALITY FOR NONDESTRUCTIVE EVALUATION APPLICATIONS Jaejoon Kim, S. Mandayam, S. Udpa, W. Lord, and L. Udpa Department of Electrical and Computer Engineering Iowa State University Ames, Iowa 500

More information

INTRODUCTION TO COMPUTER MUSIC PHYSICAL MODELS. Professor of Computer Science, Art, and Music. Copyright by Roger B.

INTRODUCTION TO COMPUTER MUSIC PHYSICAL MODELS. Professor of Computer Science, Art, and Music. Copyright by Roger B. INTRODUCTION TO COMPUTER MUSIC PHYSICAL MODELS Roger B. Dannenberg Professor of Computer Science, Art, and Music Copyright 2002-2013 by Roger B. Dannenberg 1 Introduction Many kinds of synthesis: Mathematical

More information

IDENTIFYING AND COMMUNICATING 2D SHAPES USING AUDITORY FEEDBACK. Javier Sanchez

IDENTIFYING AND COMMUNICATING 2D SHAPES USING AUDITORY FEEDBACK. Javier Sanchez IDENTIFYING AND COMMUNICATING 2D SHAPES USING AUDITORY FEEDBACK Javier Sanchez Center for Computer Research in Music and Acoustics (CCRMA) Stanford University The Knoll, 660 Lomita Dr. Stanford, CA 94305,

More information

Sound rendering in Interactive Multimodal Systems. Federico Avanzini

Sound rendering in Interactive Multimodal Systems. Federico Avanzini Sound rendering in Interactive Multimodal Systems Federico Avanzini Background Outline Ecological Acoustics Multimodal perception Auditory visual rendering of egocentric distance Binaural sound Auditory

More information

Strings: Guitar, Harp, Piano and Harpsichord

Strings: Guitar, Harp, Piano and Harpsichord Strings: Guitar, Harp, Piano and Harpsichord 80/20 A stringed instrument uses standing waves on a string to provide the frequency generation. f 1 f 2 f 3 f 4 ~ ~ String Standing Waves f n A Standing Wave

More information

Beyond Visual: Shape, Haptics and Actuation in 3D UI

Beyond Visual: Shape, Haptics and Actuation in 3D UI Beyond Visual: Shape, Haptics and Actuation in 3D UI Ivan Poupyrev Welcome, Introduction, & Roadmap 3D UIs 101 3D UIs 201 User Studies and 3D UIs Guidelines for Developing 3D UIs Video Games: 3D UIs for

More information

Virtual Experiments as a Tool for Active Engagement

Virtual Experiments as a Tool for Active Engagement Virtual Experiments as a Tool for Active Engagement Lei Bao Stephen Stonebraker Gyoungho Lee Physics Education Research Group Department of Physics The Ohio State University Context Cues and Knowledge

More information

ETHERA EVI MANUAL VERSION 1.0

ETHERA EVI MANUAL VERSION 1.0 ETHERA EVI MANUAL VERSION 1.0 INTRODUCTION Thank you for purchasing our Zero-G ETHERA EVI Electro Virtual Instrument. ETHERA EVI has been created to fit the needs of the modern composer and sound designer.

More information

Lecture 2: Acoustics

Lecture 2: Acoustics ELEN E4896 MUSIC SIGNAL PROCESSING Lecture 2: Acoustics 1. Acoustics, Sound & the Wave Equation 2. Musical Oscillations 3. The Digital Waveguide Dan Ellis Dept. Electrical Engineering, Columbia University

More information

Building a bimanual gesture based 3D user interface for Blender

Building a bimanual gesture based 3D user interface for Blender Modeling by Hand Building a bimanual gesture based 3D user interface for Blender Tatu Harviainen Helsinki University of Technology Telecommunications Software and Multimedia Laboratory Content 1. Background

More information