Motion Origami. Daniel Bartoš


Motion Origami
Daniel Bartoš
FAMU, Prague, Czech Republic

Abstract. The Motion Origami project explores live-performance strategies focused on gesture-based control of sound. The sound processing involves granular sound synthesis in a Max/MSP patch. Motion Origami is designed as an interactive music instrument for a live-performance scenario and makes use of either the live audio input or pre-recorded sound material. This live-interface prototype explores gesture-based music composition and performance techniques: the sound transformations are driven by hand gestures, while the use of a motion-tracking device lets the performer build up a specific experience and virtuosity.

Keywords: Leap Motion sensor, gesture recognition, motion tracking, music expression, granular synthesis, Max/MSP

Introduction

The name of the Motion Origami project is inspired by the Japanese art tradition of origami folding. The actual process of paper folding is reflected in a specific compositional strategy, which uses captured hand gestures. In other words, the performing musician is able to 'fold' sounds with his or her own hand gestures into new sound objects. Simple 'folds' therefore result in complex soundscapes during the performance.

Figure 1: Motion Origami Max/MSP patch with the Leap Motion

1 Motion Origami; the project presentation can be accessed online at

This paper describes an original Max/MSP patch that uses the Smooth Overlapping Granular Synthesis object sogs~ 2 as its main sound-transforming element. The novelty of this approach lies not in the production of new code or a particular design, but in making creative use of existing, advanced audio-processing tools in sound and music performance. The project shows how a live-interface prototype can be turned into an original compositional tool. As such, the Motion Origami project represents a recipe for approaching the design of an experimental music instrument, and shows an approach similar to the rapid-prototyping technique applied to the domain of advanced Max/MSP audio processing.

Leap Motion

The Leap Motion 3 sensor was designed for touch-less tracking of hand gestures and their movements. In conjunction with the Leap Motion SDK, the sensor becomes a sophisticated controller that delivers complex information about hand movements in real time. Hand gestures are captured with high precision, along with the individual finger positions, rotations and fingertip accelerations. The Leap Motion sensor was introduced to the market in mid-2013 and swiftly found its place among other devices designed for body-movement and gesture tracking, such as the Kinect 4 and the Wii controller 5. Such devices usually work with general body tracking and can be repurposed; the Leap Motion, on the contrary, is designed to capture hand gestures and movements only. In fact, the Leap Motion sensor can be thought of as a special interactive space 6 of a predefined volume of air. Common music applications of the Leap Motion sensor are primarily based on imitations of the physical interfaces of existing musical instruments. This is true especially of the following projects: Air-Keys, Crystal Piano, Air Harp, etc. (Han 2014).
The main reason for this is that the virtual keyboard represents an ideal setup for testing system latency (Silva 2013), as a low-latency response is one of the most important elements of any real-time music-performance application. The latency of the Leap Motion, as advertised by the manufacturer, is anywhere from 5 ms to 20 ms, but this particular figure obviously depends on the whole system configuration and the components used. Another category of existing Leap Motion applications is represented by various hybrid instruments; a specific selection of such projects is described in the paper Lessons Learned in Exploring the Leap Motion (Han, Gold 2014).

The Motion Origami Body & Sound Interaction

The theme of physical interaction and sound processing is thoroughly investigated in the paper Sound Design as Human Matter Interaction (Wei 2013), where the most important keyword is the term material computation 7. In an extended sense, we can think of the Leap Motion sensor as constituting an interactive space of its own, where the calculations and interaction take place. Gesture recognition in conjunction with music performance is

2 Max/MSP object sogs~ (Smooth Overlap Granular Synthesis) by Norbert Schnell, IRCAM Centre Pompidou; more information can be accessed online at
3 Leap Motion; the complete sensor and SDK specification can be accessed online at
4 Microsoft Kinect is a sensor that uses a depth map to create a 3D representation of space. Developed by Microsoft for the Xbox 360 game console; the Kinect specification can be accessed online at
5 The Wii Remote is part of the Nintendo Wii console developed by Nintendo Co., Ltd.
The Wii specification can be accessed online at
6 An interactive space of 8 cubic feet (0.22 m³), as stated by the manufacturer; more information can be accessed online at
7 'Realtime, continuously responsive environments for the design of technical systems for human-computer interaction design.' Ibid., p. 200.

also being explored by IRCAM's ISMM research team 8, while the concept of physical body & sound interaction is present, for example, in the projects of Marco Donnarumma 9, who uses a set of repurposed bio-sensors.

Figure 2: Motion Origami, the individual patch sections explained

In the case of the Motion Origami Max/MSP patch, the performer's physical gesture interaction is the primary source of the resulting sound transformations: the performer creates new sound objects with the captured hand gestures. The hand gestures are recognised in the interactive space above the sensor and turned into a control mechanism coded in the Max/MSP patch. A single recognised hand gesture initialises the audio buffer with the incoming audio; the buffer then acts as the starting point for building the granular-synthesis soundscape. The hand gestures control the parameters of the granular-synthesis engine, as well as the timing of the buffer's re-initialisation with new audio material during the live performance. The hand gestures also control the wet & dry ratio of the audio signal input and the multichannel audio distribution via the Ambisonics engine 10.

Motion Origami Patch Implementation

The Motion Origami patch is programmed in Max/MSP 11. Data from the Leap Motion sensor are captured by the SwirlyLeap object 12. The updated version of the patch uses the current and well-documented IRCAM leapmotion

8 IRCAM ISMM team {SOUND MUSIC MOVEMENT} INTERACTION; more information can be accessed online at
9 Marco Donnarumma; project presentations can be accessed online at
10 Ambisonics Tools from the Institute for Computer Music and Sound Technology (ICST), Zurich University of the Arts, can be accessed online at
11 Max/MSP, a visual programming environment by Cycling '74; more information can be accessed online at
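The wet & dry ratio mentioned above is a single gesture-driven value that balances the live input against the granulated buffer output. A minimal sketch of one common way to implement such a control (an equal-power crossfade; this is an assumption, not the patch's actual implementation) could look like this:

```python
import math

# Hypothetical sketch of a wet & dry control: an equal-power crossfade
# between the live input ("dry") and the granulated buffer output ("wet"),
# driven by a single 0..1 value derived from a hand gesture.

def wet_dry_mix(dry_sample, wet_sample, ratio):
    """Equal-power mix; ratio 0.0 = all dry, 1.0 = all wet."""
    ratio = max(0.0, min(1.0, ratio))
    dry_gain = math.cos(ratio * math.pi / 2)  # fades out as ratio rises
    wet_gain = math.sin(ratio * math.pi / 2)  # fades in as ratio rises
    return dry_sample * dry_gain + wet_sample * wet_gain
```

The equal-power curves keep the perceived loudness roughly constant across the fade, which matters when the gesture sweeps the ratio continuously during a performance.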

object 13. The Smooth Overlapping Granular Synthesis object sogs~ 14 was chosen because it offers simple and creative control over the audio captured into the audio buffer and can be used for specific navigation and exploration of that buffer. The sogs~ object also mimics the paper-folding technique in the sense that the original paper surface is substituted by a 2D plane made of two individual parameters: the grain position and the grain size. The data from the Leap Motion sensor are mapped to drive the sogs~ object with these two selected parameters: the performer then navigates the space of the audio buffer defined by the grain size and the grain position respectively.

Figure 3: Motion Origami, the recognised gestures illustrated

The wet & dry mix ratio, which is also mapped to hand gestures, offers detailed control over the sound merging and accents the actual 'sound folds'. These 'sound folds' can build up a certain level of complexity thanks to the fact that the live audio source is coupled with the audio material stored in the buffer. Although the audio is modified in the granulation process, it shares the same spectral and tonal characteristics with the original sound source. This in turn creates elaborate sound transformations, which can be thought of as the introduced 'sound-folding' process. The patch recognises a specific gesture that is required to start the initialisation of the audio buffer; in this way the audio buffer is filled with new audio material. The buffer initialisation starts with a closed-hand gesture, which can be paraphrased as the 'grab the sound' gesture. At this very moment the buffer is filled with the live sound input and becomes available to the sogs~ object. Subsequent hand gestures control various aspects of the granular-synthesis engine: a horizontal hand swipe controls the grain position selection, while vertical hand movement controls the time length of a grain.
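The two-parameter mapping described above can be sketched as a pair of scaled ranges. This is a minimal illustration, not the original Max/MSP code; all coordinate and parameter ranges here are hypothetical (the Leap Motion reports palm coordinates in millimetres):

```python
# Hypothetical sketch of the gesture-to-grain mapping: horizontal palm
# position selects the grain position in the buffer, vertical palm
# position sets the grain size, as in the sogs~-driven patch described above.

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def map_hand_to_grain(palm_x_mm, palm_y_mm, buffer_ms=2000.0):
    """Return (grain_position_ms, grain_size_ms) for a granular engine."""
    # Horizontal swipe (assumed -200..200 mm) navigates the whole buffer.
    grain_position = scale(palm_x_mm, -200.0, 200.0, 0.0, buffer_ms)
    # Vertical movement (assumed 80..400 mm above the sensor) sets grain length.
    grain_size = scale(palm_y_mm, 80.0, 400.0, 10.0, 500.0)
    return grain_position, grain_size
```

Clamping at the range edges keeps the engine parameters stable when the hand briefly leaves the tracked volume.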
Moreover, the overall palm position above the sensor in the x-y plane defines the sound-source position in the multichannel Ambisonics space and adds a multichannel spatialisation layer to the performance. The other variables recognised by the Leap Motion, such as yaw, pitch and roll, are alternatively mapped to extended FX processing (reverb, distortion, etc.), depending on the performance scenario.

12 SwirlyLeap, a Max/MSP object accessing the Leap Motion API, written by Tom Ritchford from New York. The project can be accessed online at
13 The well-documented Max/MSP object leapmotion by IRCAM; more information can be accessed online at
14 Max/MSP object sogs~ (Smooth Overlap Granular Synthesis) by Norbert Schnell, IRCAM Centre Pompidou; more information can be accessed at
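The palm-to-space mapping described above amounts to converting the palm's planar offset into a polar coordinate that an Ambisonics encoder can consume. A hedged sketch, with assumed coordinate ranges and naming (the ICST tools take azimuth/distance-style parameters, but the exact message format is not reproduced here):

```python
import math

# Hypothetical sketch of the spatialisation mapping: the palm position in
# the plane over the sensor becomes an (azimuth, distance) pair for an
# Ambisonics encoder. The 250 mm radius is an assumption for illustration.

def palm_to_ambisonic(palm_x_mm, palm_y_mm, max_radius_mm=250.0):
    """Return (azimuth_degrees, distance_normalised) from the palm offset."""
    # Azimuth 0 when the palm is straight ahead, positive to the right.
    azimuth = math.degrees(math.atan2(palm_x_mm, palm_y_mm))
    # Distance normalised to 0..1 and clamped at the edge of the space.
    distance = min(1.0, math.hypot(palm_x_mm, palm_y_mm) / max_radius_mm)
    return azimuth, distance
```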

Figure 4: Motion Origami, a detail of the subpatch controlling the audio buffer initialisation

Conclusions & Performance

The most inviting application of the Motion Origami patch 15 is a vocal performance, or simply a music instrument that leaves enough space for interaction with the sensor itself. The beauty of this live-performance approach lies in the fact that the performer can interact with his/her captured music material and add multiple layers of expression solely by using hand gestures. A new layer of improvisation can be introduced as new themes and phrases emerge: the performer then interacts with new music material based on the sound qualities of the original music instrument or the vocal performance. The performer can control the following parameters in the Motion Origami live interface: the time-based selection of a phrase sampled into the buffer; the grain size and its position in the buffer; the wet & dry mix ratio; and the Ambisonics sound-source position (if applicable). Using gestures in music composition and performance proves to be very intuitive. The sensor alone has to be 'learned' to be operated properly, and this fact delivers a specific virtuosity over time. The Leap Motion sensor with the Motion Origami patch opens up an exciting new field of music composition and sound processing coupled with immediate gestural interaction. The biggest challenge in gesture-based performance is the recognition of quantized gestures (Potter 2013). While parametric control of the various patch elements presents no technical problem tracking-wise, the recognition of quantized, unique gestures proved difficult throughout the development phase of the patch. For example, while playing on a virtual keyboard, one can limit the key strokes to a specific scale and thereby limit mis-triggered notes. But when it comes to evoking a specific functionality (sampling initialisation, sound-bank toggle, etc.)
the gestures have to be recognised with exceptionally high precision, as these decisions form an integral part of the performance itself. This aspect of live performance imposes specific constraints that have to be considered in live-performance scenarios using the Leap Motion sensor. The quality of the tracking also depends on the ambient light conditions and the overall sensor setup; for example, direct light reaching the sensor's surface can introduce inconsistency in the tracking.

15 Motion Origami; the project presentation can be accessed online at
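One common mitigation for mis-triggered one-shot gestures, sketched here as an assumption rather than taken from the patch, is to require the gesture to persist over several consecutive tracking frames before it fires, and to rearm only after the hand opens again:

```python
# Hypothetical debouncing scheme for quantized gestures such as the
# closed-hand 'grab the sound' trigger: the gesture must be held for a
# number of consecutive frames, fires exactly once, and rearms only
# after the hand opens.

class GestureDebouncer:
    def __init__(self, frames_required=10):
        self.frames_required = frames_required
        self.count = 0
        self.armed = True

    def update(self, hand_is_closed):
        """Feed one tracking frame; return True once per held gesture."""
        if hand_is_closed:
            self.count += 1
            if self.armed and self.count >= self.frames_required:
                self.armed = False  # fire once, then wait for the hand to open
                return True
        else:
            self.count = 0
            self.armed = True
        return False
```

At the Leap Motion's frame rates, ten frames correspond to a fraction of a second, trading a small amount of latency for a large reduction in false triggers.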

Overall, the Leap Motion is very suitable for various intuitive music-composition and performance scenarios. Occasional tracking errors can be overcome with good patch design and a restricted use of quantized gestures, leaving the quantized gestures to be serviced by traditional hardware controllers. Having said that, the Leap Motion sensor excels in intuitive gesture-interaction performance and gesture-based music-composition strategies.

Acknowledgements. The Motion Origami project was funded by the research grant scheme Sensors as Music Instruments at HAMU (Music Academy in Prague), run by Michal Rataj. Special thanks go to Michal Rataj 16 and Matthew Ostrowski 17 for their support, their help with the patch design and their ideas about performance and music composition.

References

Han, Jihyun; Gold, Nicolas (2014). Lessons Learned in Exploring the Leap Motion Sensor for Gesture-based Instrument Design. In Caramiaux, B.; Tahiroğlu, K.; Fiebrink, R.; Tanaka, A. (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (NIME'14), London. Goldsmiths, University of London.

Potter, Leigh Ellen; Araullo, Jake; Carter, Lewis (2013). The Leap Motion Controller: A View on Sign Language. In Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide. New York: ACM.

Silva, Eduardo S.; de Abreu, Jader Anderson O.; de Almeida, Janiel Henrique P.; Teichrieb, Veronica; Ramalho, Geber L. (2013). A Preliminary Evaluation of the Leap Motion Sensor as Controller of New Digital Musical Instruments. In Proceedings of the Brazilian Symposium on Computer Music, Brasilia. SBCM.

Schwarz, Diemo (2006). Concatenative Sound Synthesis: The Early Years. Journal of New Music Research 35(1): 3-22 (March).
Wei, Sha Xin; Freed, Adrian; Navab, Navid (2013). Sound Design As Human Matter Interaction. In Proceedings of CHI '13 Extended Abstracts on Human Factors in Computing Systems, Paris. New York: ACM.

16 Michal Rataj, music composer and researcher at HAMU (Music Academy in Prague). More information can be accessed online at
17 Matthew Ostrowski, expert on sensors and multimedia programming based at Harvestworks NYC, who offered valuable insights into Leap Motion programming. More information can be accessed online at and


More information

Portfolio. Swaroop Kumar Pal swarooppal.wordpress.com github.com/swarooppal1088

Portfolio. Swaroop Kumar Pal swarooppal.wordpress.com github.com/swarooppal1088 Portfolio About Me: I am a Computer Science graduate student at The University of Texas at Dallas. I am currently working as Augmented Reality Engineer at Aireal, Dallas and also as a Graduate Researcher

More information

User Interaction and Perception from the Correlation of Dynamic Visual Responses Melinda Piper

User Interaction and Perception from the Correlation of Dynamic Visual Responses Melinda Piper User Interaction and Perception from the Correlation of Dynamic Visual Responses Melinda Piper 42634375 This paper explores the variant dynamic visualisations found in interactive installations and how

More information

Taking an Ethnography of Bodily Experiences into Design analytical and methodological challenges

Taking an Ethnography of Bodily Experiences into Design analytical and methodological challenges Taking an Ethnography of Bodily Experiences into Design analytical and methodological challenges Jakob Tholander Tove Jaensson MobileLife Centre MobileLife Centre Stockholm University Stockholm University

More information

The Mixed Reality Book: A New Multimedia Reading Experience

The Mixed Reality Book: A New Multimedia Reading Experience The Mixed Reality Book: A New Multimedia Reading Experience Raphaël Grasset raphael.grasset@hitlabnz.org Andreas Dünser andreas.duenser@hitlabnz.org Mark Billinghurst mark.billinghurst@hitlabnz.org Hartmut

More information

COMPUTER GAME DESIGN (GAME)

COMPUTER GAME DESIGN (GAME) Computer Game Design (GAME) 1 COMPUTER GAME DESIGN (GAME) 100 Level Courses GAME 101: Introduction to Game Design. 3 credits. Introductory overview of the game development process with an emphasis on game

More information

Digitalising sound. Sound Design for Moving Images. Overview of the audio digital recording and playback chain

Digitalising sound. Sound Design for Moving Images. Overview of the audio digital recording and playback chain Digitalising sound Overview of the audio digital recording and playback chain IAT-380 Sound Design 2 Sound Design for Moving Images Sound design for moving images can be divided into three domains: Speech:

More information

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Sowmya Somanath Department of Computer Science, University of Calgary, Canada. ssomanat@ucalgary.ca Ehud Sharlin Department of Computer

More information

Best Of BOLDER Collection Granular Owner s Manual

Best Of BOLDER Collection Granular Owner s Manual Best Of BOLDER Collection Granular Owner s Manual Music Workstation Overview Welcome to the Best Of Bolder Collection: Granular This is a collection of samples created with various software applications

More information

A Gesture Control Interface for a Wave Field Synthesis System

A Gesture Control Interface for a Wave Field Synthesis System A Gesture Control Interface for a Wave Field Synthesis System ABSTRACT Wolfgang Fohl HAW Hamburg Berliner Tor 7 20099 Hamburg, Germany fohl@informatik.haw-hamburg.de This paper presents the design and

More information

Xdigit: An Arithmetic Kinect Game to Enhance Math Learning Experiences

Xdigit: An Arithmetic Kinect Game to Enhance Math Learning Experiences Xdigit: An Arithmetic Kinect Game to Enhance Math Learning Experiences Elwin Lee, Xiyuan Liu, Xun Zhang Entertainment Technology Center Carnegie Mellon University Pittsburgh, PA 15219 {elwinl, xiyuanl,

More information

The reactable*: A Collaborative Musical Instrument

The reactable*: A Collaborative Musical Instrument The reactable*: A Collaborative Musical Instrument Martin Kaltenbrunner mkalten@iua.upf.es Sergi Jordà sjorda@iua.upf.es Günter Geiger ggeiger@iua.upf.es Music Technology Group Universitat Pompeu Fabra

More information

Humera Syed 1, M. S. Khatib 2 1,2

Humera Syed 1, M. S. Khatib 2 1,2 A Hand Gesture Recognition Approach towards Shoulder Wearable Computing Humera Syed 1, M. S. Khatib 2 1,2 CSE, A.C.E.T/ R.T.M.N.U, India ABSTRACT: Human Computer Interaction needs computer systems and

More information

Advancements in Gesture Recognition Technology

Advancements in Gesture Recognition Technology IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka

More information

TOUCH & FEEL VIRTUAL REALITY. DEVELOPMENT KIT - VERSION NOVEMBER 2017

TOUCH & FEEL VIRTUAL REALITY. DEVELOPMENT KIT - VERSION NOVEMBER 2017 TOUCH & FEEL VIRTUAL REALITY DEVELOPMENT KIT - VERSION 1.1 - NOVEMBER 2017 www.neurodigital.es Minimum System Specs Operating System Windows 8.1 or newer Processor AMD Phenom II or Intel Core i3 processor

More information

USING IDEA MATERIALIZATION TO ENHANCE DESIGN CREATIVITY

USING IDEA MATERIALIZATION TO ENHANCE DESIGN CREATIVITY INTERNATIONAL CONFERENCE ON ENGINEERING DESIGN, 27-30 JULY 2015, POLITECNICO DI MILANO, ITALY USING IDEA MATERIALIZATION TO ENHANCE DESIGN CREATIVITY Georgiev, Georgi V.; Taura, Toshiharu Kobe University,

More information

PRESS RELEASE CREATE, MIX, CONTROL YOUR MUSIC WITH A SIMPLE HAND GESTURE

PRESS RELEASE CREATE, MIX, CONTROL YOUR MUSIC WITH A SIMPLE HAND GESTURE PRESS RELEASE CREATE, MIX, CONTROL YOUR MUSIC WITH A SIMPLE HAND GESTURE Specktr? what is that? Specktr is a wireless MIDI controller based on a glove that creates an outstanding and intuitive experience.

More information

Tangible Message Bubbles for Childrenʼs Communication and Play

Tangible Message Bubbles for Childrenʼs Communication and Play Tangible Message Bubbles for Childrenʼs Communication and Play Kimiko Ryokai School of Information Berkeley Center for New Media University of California Berkeley Berkeley, CA 94720 USA kimiko@ischool.berkeley.edu

More information

The Use of Avatars in Networked Performances and its Significance

The Use of Avatars in Networked Performances and its Significance Network Research Workshop Proceedings of the Asia-Pacific Advanced Network 2014 v. 38, p. 78-82. http://dx.doi.org/10.7125/apan.38.11 ISSN 2227-3026 The Use of Avatars in Networked Performances and its

More information

GrainTrain: A Hand-drawn Multi-touch Interface for Granular Synthesis

GrainTrain: A Hand-drawn Multi-touch Interface for Granular Synthesis GrainTrain: A Hand-drawn Multi-touch Interface for Granular Synthesis Anıl Çamcı Department of Performing Arts Technology University of Michigan acamci@umich.edu ABSTRACT We describe an innovative multi-touch

More information

Space Mouse - Hand movement and gesture recognition using Leap Motion Controller

Space Mouse - Hand movement and gesture recognition using Leap Motion Controller International Journal of Scientific and Research Publications, Volume 7, Issue 12, December 2017 322 Space Mouse - Hand movement and gesture recognition using Leap Motion Controller Nifal M.N.M, Logine.T,

More information

The Sound of Touch. Keywords Digital sound manipulation, tangible user interface, electronic music controller, sensing, digital convolution.

The Sound of Touch. Keywords Digital sound manipulation, tangible user interface, electronic music controller, sensing, digital convolution. The Sound of Touch David Merrill MIT Media Laboratory 20 Ames St., E15-320B Cambridge, MA 02139 USA dmerrill@media.mit.edu Hayes Raffle MIT Media Laboratory 20 Ames St., E15-350 Cambridge, MA 02139 USA

More information

HAPTICS AND AUTOMOTIVE HMI

HAPTICS AND AUTOMOTIVE HMI HAPTICS AND AUTOMOTIVE HMI Technology and trends report January 2018 EXECUTIVE SUMMARY The automotive industry is on the cusp of a perfect storm of trends driving radical design change. Mary Barra (CEO

More information

Designing Toys That Come Alive: Curious Robots for Creative Play

Designing Toys That Come Alive: Curious Robots for Creative Play Designing Toys That Come Alive: Curious Robots for Creative Play Kathryn Merrick School of Information Technologies and Electrical Engineering University of New South Wales, Australian Defence Force Academy

More information

INTERACTIVE SKETCHING OF THE URBAN-ARCHITECTURAL SPATIAL DRAFT Peter Kardoš Slovak University of Technology in Bratislava

INTERACTIVE SKETCHING OF THE URBAN-ARCHITECTURAL SPATIAL DRAFT Peter Kardoš Slovak University of Technology in Bratislava INTERACTIVE SKETCHING OF THE URBAN-ARCHITECTURAL SPATIAL DRAFT Peter Kardoš Slovak University of Technology in Bratislava Abstract The recent innovative information technologies and the new possibilities

More information

Immersive Real Acting Space with Gesture Tracking Sensors

Immersive Real Acting Space with Gesture Tracking Sensors , pp.1-6 http://dx.doi.org/10.14257/astl.2013.39.01 Immersive Real Acting Space with Gesture Tracking Sensors Yoon-Seok Choi 1, Soonchul Jung 2, Jin-Sung Choi 3, Bon-Ki Koo 4 and Won-Hyung Lee 1* 1,2,3,4

More information

CHAPTER 1. INTRODUCTION 16

CHAPTER 1. INTRODUCTION 16 1 Introduction The author s original intention, a couple of years ago, was to develop a kind of an intuitive, dataglove-based interface for Computer-Aided Design (CAD) applications. The idea was to interact

More information

MOBAJES: Multi-user Gesture Interaction System with Wearable Mobile Device

MOBAJES: Multi-user Gesture Interaction System with Wearable Mobile Device MOBAJES: Multi-user Gesture Interaction System with Wearable Mobile Device Enkhbat Davaasuren and Jiro Tanaka 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8577 Japan {enkhee,jiro}@iplab.cs.tsukuba.ac.jp Abstract.

More information

International Journal of Informative & Futuristic Research ISSN (Online):

International Journal of Informative & Futuristic Research ISSN (Online): Reviewed Paper Volume 2 Issue 6 February 2015 International Journal of Informative & Futuristic Research An Innovative Approach Towards Virtual Drums Paper ID IJIFR/ V2/ E6/ 021 Page No. 1603-1608 Subject

More information

Leimu: Gloveless Music Interaction Using a Wrist Mounted Leap Motion

Leimu: Gloveless Music Interaction Using a Wrist Mounted Leap Motion Leimu: Gloveless Music Interaction Using a Wrist Mounted Leap Motion Dom Brown Dom.Brown@uwe.ac.uk Nathan Renney nathanrenney@hotmail.com Adam Stark Mi.mu Limited London, UK adam@mimugloves.com Chris Nash

More information

Fabrication of the kinect remote-controlled cars and planning of the motion interaction courses

Fabrication of the kinect remote-controlled cars and planning of the motion interaction courses Available online at www.sciencedirect.com ScienceDirect Procedia - Social and Behavioral Sciences 174 ( 2015 ) 3102 3107 INTE 2014 Fabrication of the kinect remote-controlled cars and planning of the motion

More information

DERIVATION OF TRAPS IN AUDITORY DOMAIN

DERIVATION OF TRAPS IN AUDITORY DOMAIN DERIVATION OF TRAPS IN AUDITORY DOMAIN Petr Motlíček, Doctoral Degree Programme (4) Dept. of Computer Graphics and Multimedia, FIT, BUT E-mail: motlicek@fit.vutbr.cz Supervised by: Dr. Jan Černocký, Prof.

More information

ETHERA EVI MANUAL VERSION 1.0

ETHERA EVI MANUAL VERSION 1.0 ETHERA EVI MANUAL VERSION 1.0 INTRODUCTION Thank you for purchasing our Zero-G ETHERA EVI Electro Virtual Instrument. ETHERA EVI has been created to fit the needs of the modern composer and sound designer.

More information

WK-7500 WK-6500 CTK-7000 CTK-6000 BS A

WK-7500 WK-6500 CTK-7000 CTK-6000 BS A WK-7500 WK-6500 CTK-7000 CTK-6000 Windows and Windows Vista are registered trademarks of Microsoft Corporation in the United States and other countries. Mac OS is a registered trademark of Apple Inc. in

More information

Localized Space Display

Localized Space Display Localized Space Display EE 267 Virtual Reality, Stanford University Vincent Chen & Jason Ginsberg {vschen, jasong2}@stanford.edu 1 Abstract Current virtual reality systems require expensive head-mounted

More information

Remote Shoulder-to-shoulder Communication Enhancing Co-located Sensation

Remote Shoulder-to-shoulder Communication Enhancing Co-located Sensation Remote Shoulder-to-shoulder Communication Enhancing Co-located Sensation Minghao Cai and Jiro Tanaka Graduate School of Information, Production and Systems Waseda University Kitakyushu, Japan Email: mhcai@toki.waseda.jp,

More information

A Symphony of Colour. Christopher Le Brun: A Symphony of Colour, A3 Editorial. June 1, 2017.

A Symphony of Colour. Christopher Le Brun: A Symphony of Colour, A3 Editorial. June 1, 2017. A Symphony of Colour Acclaimed British artist, Christopher Le Brun, returns to Berlin with a solo presentation filling Arndt Art Agency s spaces with a symphony of colour. The title, Now Turn the Page,

More information

Augmented Reality And Ubiquitous Computing using HCI

Augmented Reality And Ubiquitous Computing using HCI Augmented Reality And Ubiquitous Computing using HCI Ashmit Kolli MS in Data Science Michigan Technological University CS5760 Topic Assignment 2 akolli@mtu.edu Abstract : Direct use of the hand as an input

More information

Design and Implementation of an Intuitive Gesture Recognition System Using a Hand-held Device

Design and Implementation of an Intuitive Gesture Recognition System Using a Hand-held Device Design and Implementation of an Intuitive Gesture Recognition System Using a Hand-held Device Hung-Chi Chu 1, Yuan-Chin Cheng 1 1 Department of Information and Communication Engineering, Chaoyang University

More information

Virtual Reality Instruments capable of changing Dimensions in Real-time

Virtual Reality Instruments capable of changing Dimensions in Real-time Steven Gelineck Niels Böttcher Linda Martinussen Stefania Serafin Aalborg University Copenhagen, Denmark E-mail: heinztomatketchup@hotmail.com, ilz@jenkamusic.dk, lm-grafik@lm-grafik.dk sts@media.aau.dk

More information

Building a gesture based information display

Building a gesture based information display Chair for Com puter Aided Medical Procedures & cam par.in.tum.de Building a gesture based information display Diplomarbeit Kickoff Presentation by Nikolas Dörfler Feb 01, 2008 Chair for Computer Aided

More information

Interior Design with Augmented Reality

Interior Design with Augmented Reality Interior Design with Augmented Reality Ananda Poudel and Omar Al-Azzam Department of Computer Science and Information Technology Saint Cloud State University Saint Cloud, MN, 56301 {apoudel, oalazzam}@stcloudstate.edu

More information

TACTILE COMPOSITION SYSTEMS FOR COLLABORATIVE FREE SOUND

TACTILE COMPOSITION SYSTEMS FOR COLLABORATIVE FREE SOUND TACTILE COMPOSITION SYSTEMS FOR COLLABORATIVE FREE SOUND Dan Livingstone Computer Music Research School of Computing, Communications and Electronics, University of Plymouth, Drakes Circus Plymouth PL148AA

More information

Realtime Software Synthesis for Psychoacoustic Experiments David S. Sullivan Jr., Stephan Moore, and Ichiro Fujinaga

Realtime Software Synthesis for Psychoacoustic Experiments David S. Sullivan Jr., Stephan Moore, and Ichiro Fujinaga Realtime Software Synthesis for Psychoacoustic Experiments David S. Sullivan Jr., Stephan Moore, and Ichiro Fujinaga Computer Music Department The Peabody Institute of the Johns Hopkins University One

More information

UUIs Ubiquitous User Interfaces

UUIs Ubiquitous User Interfaces UUIs Ubiquitous User Interfaces Alexander Nelson April 16th, 2018 University of Arkansas - Department of Computer Science and Computer Engineering The Problem As more and more computation is woven into

More information

tactile.motion: An ipad Based Performance Interface For Increased Expressivity In Diffusion Performance

tactile.motion: An ipad Based Performance Interface For Increased Expressivity In Diffusion Performance tactile.motion: An ipad Based Performance Interface For Increased Expressivity In Diffusion Performance Bridget Johnson Michael Norris Ajay Kapur New Zealand School of Music michael.norris@nzsm.ac.nz New

More information

Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data

Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Hrvoje Benko Microsoft Research One Microsoft Way Redmond, WA 98052 USA benko@microsoft.com Andrew D. Wilson Microsoft

More information