User Interaction and Perception from the Correlation of Dynamic Visual Responses

Melinda Piper 42634375

This paper explores the varied dynamic visualisations found in interactive installations and how they affect user interaction and perception through their correlation with the interface and additional functional outputs. It is presented in the context of assessing future development ideas for the MozART project, created by students at The University of Queensland, and explores a range of research concepts that could be applied in a continuation of the project. MozART is a physical interactive computing project designed to encourage the exploration of musical creativity through a tangible spandex screen, which acts as an instrumental device for controlling musical output through interactivity. Pushing on the spandex material is detected by a Microsoft Kinect depth sensor and results in the generation of musical and visual outputs, so the installation essentially behaves as a musical instrument through the digital augmentation and attenuation of musical effects and production. MozART makes use of auditory, physical and visual modes, which makes it a multi-modal interactive installation. The visuals are projected from behind the spandex screen and reflect the movement and positioning of interactions on the screen in real time. While the current version of MozART displays visualisations that reflect different collated musical outputs and represent several interactive inputs, there is the prospect of redesigning the project so that the visual output better reflects the more diverse range of musical compositions that can be created: the visual output could take into account a multitude of additional interactions so that the result more distinctly reflects the instrumental music produced overall, while remaining predictable enough to be aligned with the production of specific pieces. Visuals designed for interactive pieces must elicit rich interactions from the user that can be easily understood in conjunction with their correlated sensory outputs.
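As a brief illustration of the interaction pipeline described above, the following is a minimal sketch of how a push on the spandex surface might be detected from Kinect depth data. The detect_pushes function, the baseline-frame comparison and the threshold value are assumptions made for illustration, not MozART's actual implementation.

# A minimal sketch, assuming depth frames arrive as 2D lists of millimetre values
# from whatever Kinect driver MozART actually uses. Names and thresholds are
# illustrative assumptions, not taken from the project's code base.

PUSH_THRESHOLD_MM = 40  # assumed minimum deformation to count as a touch

def detect_pushes(baseline_frame, current_frame, threshold=PUSH_THRESHOLD_MM):
    """Return (x, y, deformation) points where the spandex is pushed past the threshold.

    baseline_frame is a depth frame captured while the screen is at rest;
    current_frame is the latest frame from the sensor.
    """
    pushes = []
    for y, (base_row, cur_row) in enumerate(zip(baseline_frame, current_frame)):
        for x, (base, cur) in enumerate(zip(base_row, cur_row)):
            deformation = abs(cur - base)  # sign depends on which side the sensor faces
            if deformation > threshold:
                pushes.append((x, y, deformation))
    return pushes

Each detected point would then be forwarded to both the audio and the visual pipelines, and it is this coupling between the two that the rest of the paper discusses.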

With a multitude of possible visual representations available, a visual element presented for interaction must build upon the context and experience the installation offers the user, as well as provide a form that assists with ease of use and understanding of the interactions. MozART was designed so that the visual elements are affected by different interaction data sent from the Kinect, including position, depth, speed and alive time. This effectively presents a visual representation of the user's interaction and correlates directly with the changes made to the musical output; the physical interaction with the installation and its audio response are therefore directly related to the visuals. The user eventually comes to understand how the music is produced, and the interaction design and interactive visuals are perceived to be so closely tied that they are believed to be one and the same in their connections (Caldis, C. 2014). Moving beyond this, there is the opportunity to take multiple instances of data and create patterns of visual output from repeated combined interactions. An additional piece of functionality would be multi-touch capability, which would influence the visuals based on the number of interactions at any given time, potentially opening up an entirely new level of complexity in the visual output and interactivity (Kastbjerg, S. 2013). The visuals are formed from a particle library that creates particles designed for a fluid system. Interactive effects on the MozART visuals are linked to factors such as particle colour, which is affected by the position of the interaction; particle momentum, which is influenced by the speed of the interactive movement; particle length, which is affected by the depth of the push; and particle elimination, which occurs once the alive time of an interaction passes a specific threshold. Combining these points, so that each collected data point is taken into account, would produce a combined and unique visual component or effect. This would move towards an approach more customary to traditional music visualisers, whereby the generation of different musical notes produces a visual display based on the overall musical output. It would also support a system that presents complexity to the user in a way that appears complex only through interpretation, not understanding (Blythe, M. 2005), keeping the system easy to predict and effectively play while supporting a more multifaceted visual creativity. However, William Hsu notes that in audio-visual systems the visuals should not over-determine the narrative of the performance (Hsu, W. 2009), which is something to consider when approaching the possibility of adding further visual effects and creations.
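The interaction-to-particle mapping described above could be sketched roughly as follows. The Interaction fields, the normalisation ranges, the alive-time limit and the blending of simultaneous touches are illustrative assumptions rather than values taken from the MozART particle library.

# A sketch of the mapping from interaction data (position, depth, speed, alive
# time) to the particle parameters named in the text, plus a simple treatment of
# combined multi-touch input. Concrete ranges and names are assumptions.

from dataclasses import dataclass

@dataclass
class Interaction:
    x: float           # horizontal position on the screen, normalised 0..1
    y: float           # vertical position on the screen, normalised 0..1
    depth: float       # how far the spandex is pushed, in mm
    speed: float       # movement speed of the push, in screen units per second
    alive_time: float  # how long this interaction has persisted, in seconds

ALIVE_TIME_LIMIT = 5.0  # assumed threshold after which particles are culled

def particle_params(i: Interaction) -> dict:
    """Map one interaction to the particle parameters described in the text."""
    return {
        "hue": i.x,                               # position drives particle colour
        "momentum": i.speed,                      # speed drives particle momentum
        "length": min(i.depth / 100.0, 1.0),      # depth drives particle length
        "cull": i.alive_time > ALIVE_TIME_LIMIT,  # long-lived interactions are eliminated
    }

def combined_params(interactions: list[Interaction]) -> dict:
    """Blend several simultaneous interactions (a simple multi-touch treatment)."""
    params = [particle_params(i) for i in interactions]
    n = max(len(params), 1)
    return {
        "hue": sum(p["hue"] for p in params) / n,
        "momentum": sum(p["momentum"] for p in params) / n,
        "length": max((p["length"] for p in params), default=0.0),
        "cull": all(p["cull"] for p in params) if params else False,
        "touch_count": len(params),  # concurrent touches could scale visual complexity
    }

Averaging is only one way to blend simultaneous touches; taking maxima or layering each touch's parameters separately would give a busier, more obviously multi-touch-driven output.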

The Cube is an audio-visual interactive installation that aims to provide the user with a multitude of ways to interact with the parameters of a system. It also highlights the importance of transforming a constant fluctuation of information in a way that engages users in the same way a performance might (Didakis, S. 2007). Regardless of the simplicity of The Cube's controller, a divergent dynamic mapping to a variety of control parameters has been implemented that makes the interaction even more meaningful (Didakis, S. 2007), allowing inputs to be cross-mapped to produce a combined visual output with expressive results. Richard Dreyfuss describes a method for creating interactive graphical music displays in which linking the musical hierarchy of an unfolding piece to specific visual cues allows the user to experience a geo-spatial history of the performance as it unfolds and to be graphically informed of recurring patterns in the music through recurring visualised events (Dreyfuss, R. 2009). This provides a musical performance that is informed by previous interactions and musical output, culminating in a display that presents interactions as a completed and planned piece rather than a disjointed interaction from the user. Enabling this for MozART would open up the possibility of extending its performance capabilities and allowing it to become more focused as an instrument for musical composition. One problem with real-time music visuals is that they are supplied with insufficient information to inform the visuals to an emotional effect, which leads to impersonal imagery. While computers cannot hope to understand musical information to the same extent as a human, there are ways in which the digital music output (or the corresponding interaction by the user) can be used to create expressive and emotional visual content (Bowens, K. 2008). This can be achieved by analysing pertinent musical elements (such as volume, pitch or multi-user contribution) and their resulting overall output, essentially mapping which combinations of interactions result in a musical output that demands a specialised visual cue. This would allow for an extension of the project in which the user and viewer can gain new emotional insight into the music produced through its visual association. It would be important to develop a framework establishing which mappings matter for the project, discovering which relationships between inputs result in an appropriate response in the musical output.
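One way such a mapping framework might begin is as a small set of rules that classify the current musical output into a visual cue, as in the sketch below. The feature names, thresholds and cue labels are hypothetical and would need to be replaced by mappings discovered through the investigation described above, not values from MozART or Bowens (2008).

# A sketch of a rule-based mapping from analysed musical elements (volume, pitch,
# multi-user contribution) to a specialised visual cue. All thresholds and labels
# are illustrative assumptions.

def classify_visual_cue(volume: float, pitch: float, performer_count: int) -> str:
    """Return a label for the visual treatment suited to the current musical output.

    volume and pitch are assumed to be normalised to the range 0..1.
    """
    if performer_count > 1 and volume > 0.7:
        return "dense_swarm"   # many contributors, loud output: crowded, fast particles
    if volume > 0.7 and pitch > 0.6:
        return "bright_burst"  # loud, high-pitched output: sharp, high-energy cue
    if volume < 0.3:
        return "calm_drift"    # quiet output: sparse, slowly drifting particles
    return "neutral_flow"      # default treatment for mid-range output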

High-energy musical output is best suited to visuals crowded with particles that are constantly moving and drawing over themselves (Bowens, K. 2008). This opens up the possibility of applying changes to aspects of MozART such as the resting particles. When the output is of high intensity (once the combination of input data that produces this effect has been identified through the mapping), having these resting particles move rapidly when that mapped input is detected would produce a sense of liveliness in the emotional impact of the visuals. Figure 1 displays some examples from a particle animation scheme and their resulting mood-type associations, taken from Bowens (2008).
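A minimal sketch of that resting-particle change, assuming a normalised intensity value derived from the mapped inputs and invented speed constants, might look like this:

# Idle particles drift slowly by default; when the mapped inputs indicate a
# high-intensity passage they are advanced faster so the whole field reads as
# more lively. Constants and the particle representation are assumptions.

HIGH_INTENSITY = 0.75  # assumed normalised intensity above which output is "high energy"
RESTING_SPEED = 0.02
LIVELY_SPEED = 0.25

def update_resting_particles(particles, intensity: float, dt: float) -> None:
    """Advance idle particles, moving them faster during high-intensity output.

    particles is assumed to be a list of dicts with "x", "y", "dx", "dy" entries.
    """
    speed = LIVELY_SPEED if intensity >= HIGH_INTENSITY else RESTING_SPEED
    for p in particles:
        p["x"] += p["dx"] * speed * dt
        p["y"] += p["dy"] * speed * dt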

When addressing the possibility of additional visualisation effects, it is important to maintain a level of output that does not become overcrowded, producing a visual display where overlapping elements restrict the user's ability to follow real-time interactions or result in displeasing visual aesthetics. In conclusion, with an investigation into the mapping and hierarchy of interactive inputs it is possible to extend the visual element of MozART so that it presents more expressive and engaging visuals. While this would require extensive planning of the overall visual output, and an investigation into the results produced by combinations of musical outputs, applying this type of visual mapping would bring the chance to construct musical pieces accompanied by a backdrop of visualisations suited to the combination of musical outputs, further promoting MozART not only as a digital musical instrument but also as intuitive performance art.

References:

Bergstrom, T. & Karahalios, K. (2007). Seeing More: Visualizing Audio Cues. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 4663(2), pp. 29-42.

Blythe, M., Overbeeke, K., Monk, A. & Wright, P. (2005). Funology: From Usability to Enjoyment. Kluwer Academic Publishers, Vol. 3, pp. 26-36.

Bowens, K. (2008). Interactive Musical Visualization Based on Emotional and Color Theory. The University of Texas at Austin.

Caldis, C. (2014). Data Sonification Artworks: A Music and Design Investigation of Multimodal Interactive Installations. University of the Witwatersrand, Wits School of Arts, pp. 55-65.

Didakis, S. (2007). The Cube: An Audiovisual Interactive Installation. Proceedings of SMC'07, 4th Sound and Music Computing Conference, Lefkada, Greece.

Dreyfuss, R., Dubois, R. & Kiehl, J. (2009). Interactive tool and appertaining method for creating a graphical music display. United States Patent No. US 7,601,904 B2.

Hsu, W. (2009). Designing Interactive Audiovisual Systems for Improvising Ensembles. Department of Computer Science, San Francisco State University, San Francisco.

Kastbjerg, S., Jensen, J. & Nielsen, M. (2013). A study in engaging collaborative audiovisual experiences using a malleable interactive surface. Aalborg University Copenhagen, pp. 77-73.