
Tactilis Mensa: Interactive Interface to the Art Collection Ecosystem

A creative work submitted in partial fulfilment of the requirements for the award of the degree

BACHELOR OF CREATIVE ARTS (HONOURS)

in

MEDIA ARTS

UNIVERSITY OF WOLLONGONG

by

Liam Fiddler BCA BCompSc

Faculty of Creative Arts

2009

Abstract

Tactilis Mensa: Interactive Interface to the UoW Art Collection Ecosystem is a prototype for an alternative art collection browsing system. Utilising a combination of touch-screen and light-pen technologies (as opposed to the more traditional mouse and keyboard), it aims to engage users both physically and mentally as they interact with the data. The work actively visualises the relationships between tagged artworks and encourages users to narrow or expand their search path through an alternative interface. Free from the limitations of an ordinary 'real-object' gallery exhibition, this virtual exhibition enables a potential shift in power from the curator to the audience: visitors browse the works in the order that they choose, not in a physically predetermined order.

Declaration

I certify that this creative submission is entirely my own work except where I have given fully documented references to the work of others, and that the material contained in this creative submission has not been submitted for formal assessment in any formal course.

Table of Contents

Abstract
Declaration
Table of Contents
Concept
Build Process
Digital Process
Scope for Further Expansion
References

Concept

The prototype is an interface to the UoW Art Collection Ecosystem (UoW ACE). The UoW ACE is a concept lattice system built over a relational database which allows navigation of objects through neighbouring tags, concepts and perspectives. Conceptually, the act of adding or removing neighbouring tags corresponds to kinesthetic actions - the intent of the user translated into physical locomotion. In exploring this concept it was important that the work involve the user in a tactile, tangible manner. Forgoing a totally immersive experience, the novel interface on an interactive surface provides a relatively close approximation of the natural actions it attempts to mimic.

The physical form of the prototype is inspired by the Multi-Touch Interaction Research work of researcher Jeff Han (2006), the Reactable project (Jordà, Alonso, Geiger & Kaltenbrunner, 2005) and the mæve installation (Nagel 2008). The deceptive simplicity of Jeff Han's frustrated total internal reflection (FTIR) surfaces and mæve's appropriation of the rear diffuse illumination (Rear-DI) model were used as references for the engineering and scientific theory which drives the work. Meanwhile, the visual styling and method of interaction demonstrated in the Reactable project influenced the look-and-feel of the work.

As a prototype gallery interface, the table was designed to look like a modern plinth: 90 degree angles, a lightly textured white exterior and slightly raised feet to create a shadow-line around the base. Initial sketches of the table featured an angled table-top, so that the surface would be directed towards the user like a lectern. As the prototype progressed it became clear that doing so would prevent multiple users from interacting with the data simultaneously; they would all have to stand on the same side in order to comfortably access the touch-panel. The finished work therefore has a horizontal surface, allowing multiple users to move around the table freely.

The graphical user interface (GUI) has been designed to allow clear and unencumbered navigation between works. Using attraction and repulsion physics to simulate a physical space, each tag is synesthetically personified - a form of organic information design. The GUI was developed in a series of iterations, each attempting to reduce the complexity of traversing the concept lattices. In the finished interface the user simply drags a tag into or out of the search bubble to expand or refine their search.

Visually the GUI borrows heavily from the early work of Ben Fry, particularly anemone (1999), which was the initial inspiration for representing the lattice framework. The fluid movement of the data objects in anemone influenced the representation of tags in Tactilis Mensa.
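As a rough illustration of this attraction/repulsion model, the fragment below is a minimal Processing sketch using the traer.physics library that drives the finished GUI (discussed under Digital Process). It holds a handful of tag particles around a fixed 'search bubble' centre with springs while repelling them from one another; the particle count, masses, spring strengths and distances are illustrative values, not those used in the work.

    // Minimal sketch of the tag layout physics (illustrative values only).
    import traer.physics.*;

    ParticleSystem physics;
    Particle centre;                       // fixed particle at the heart of the search bubble
    Particle[] tags = new Particle[8];     // stand-ins for the draggable tags

    void setup() {
      size(800, 800);
      physics = new ParticleSystem(0, 0.2);               // no gravity, a little drag
      centre = physics.makeParticle(1.0, width/2, height/2, 0);
      centre.makeFixed();                                  // the bubble's centre never moves
      for (int i = 0; i < tags.length; i++) {
        tags[i] = physics.makeParticle(1.0, random(width), random(height), 0);
        // A spring holds each tag at roughly a fixed radius from the centre...
        physics.makeSpring(centre, tags[i], 0.2, 0.4, 150);
        // ...while mutual repulsion (negative attraction) spreads the tags apart.
        for (int j = 0; j < i; j++) {
          physics.makeAttraction(tags[i], tags[j], -5000, 20);
        }
      }
    }

    void draw() {
      physics.tick();                                      // advance the simulation one step
      background(255);
      fill(0);
      for (Particle t : tags) {
        ellipse(t.position().x(), t.position().y(), 30, 30);
      }
    }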

Build Process

Tactilis Mensa was built using the following materials:

5 sheets of 16mm MDF
1 sheet of 8mm perspex/plexiglass
1 semi-opaque plastic tablecloth
1 front-reflection mirror
6 infra-red (IR) light emitting diode (LED) illuminators
1 small form factor PC (2.6GHz Core 2 Duo, 3GB RAM)
1 LCD projector (SVGA, 2200 lumens)
1 webcam (60 fps, IR-enabled, 960 x 720px resolution)
2 layers of over-exposed camera film
1 ventilation fan (9", 12V)
2 ventilation covers
1 infra-red light-pen

The build process took place over several months. Externally, Tactilis Mensa is made from MDF and features a rear-projection screen for the tabletop. The rear-projection screen is simply a semi-opaque plastic tablecloth stretched over the perspex surface, which serves to diffuse the projected image and the IR light. The projector is placed horizontally on the floor of the table, and the front-reflection mirror is used to direct the image up onto the surface. Six IR illuminators are strategically placed inside the table to spread IR light evenly across the surface. This ensures that wherever the user places their finger on the surface, enough IR light will be reflected back towards the camera.

Figure: rear diffuse illumination (Rear-DI) setup (Zacosham 2008)

Two layers of over-exposed camera film are taped over the webcam lens and the webcam is positioned to monitor the entire surface. The exposed negatives block out light in the visible spectrum (i.e. the projected image) while still allowing the IR light through. The webcam itself is a Logitech Quickcam Pro 9000, a camera which can produce 60 fps video at high resolution and does not require any modification to detect IR light.

The light-pen was created by replacing the LED in a keychain torch with one that emits IR beams.

Digital Process

The underlying software architecture comprises a number of different applications working in tandem. When a user touches the screen or shines the light-pen on the surface, some IR light is reflected back into the table and appears brighter than the surrounding area; this blob is then seen by the webcam. Community Core Vision (CCV), an open-source application for blob detection and tracking, receives the live webcam feed, processes each frame and determines where on a Cartesian grid each blob occurs. CCV sends the captured data using the TUIO protocol to 127.0.0.1 (localhost) on port 3333. TUIO is an open framework, common protocol and API for tangible multi-touch applications; the "TUIO" name represents a combination of acronyms for Tangible User Interface and Open Sound Control.

A custom Processing sketch was developed to receive and interpret the TUIO signals and behave accordingly. This sketch provides the graphical user interface (GUI) for the work. Two libraries for Processing are used in the GUI: tuiozones and traer.physics. tuiozones is a small library which provides easy access to decoded TUIO messages, allowing the sketch to access the abstracted data and interpret it in a meaningful way. traer.physics provides the physics engine used to control the amount of attraction/repulsion between tags, as well as the springs which hold the tags and artwork thumbnails at the appropriate distance from the search bubble's centre particle.
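The prototype relies on the tuiozones library, whose API is not reproduced here. As a hedged sketch of the same plumbing, the fragment below instead uses Martin Kaltenbrunner's reference TUIO client for Processing to show how cursor events arriving from CCV on port 3333 can be received and turned into screen coordinates; everything other than the TUIO classes themselves (the cursor map and drawing code) is illustrative.

    // Minimal TUIO receiver; the finished work uses tuiozones rather than
    // the plain TUIO client shown here.
    import TUIO.*;
    import java.util.concurrent.ConcurrentHashMap;

    TuioProcessing tuioClient;
    // Positions of the active cursors, keyed by TUIO session ID. Callbacks arrive
    // on the client's own thread, so a concurrent map is used.
    ConcurrentHashMap<Long, PVector> cursors = new ConcurrentHashMap<Long, PVector>();

    void setup() {
      size(1024, 768);
      // Listens on the default TUIO port, 3333 - the port CCV sends its blob data to.
      tuioClient = new TuioProcessing(this);
    }

    void draw() {
      background(0);
      fill(255);
      // Draw a marker for every finger or light-pen blob currently on the surface.
      for (PVector p : cursors.values()) {
        ellipse(p.x, p.y, 20, 20);
      }
    }

    // Callbacks invoked by the TUIO client as blobs appear, move and disappear.
    void addTuioCursor(TuioCursor tcur) {
      cursors.put(tcur.getSessionID(), new PVector(tcur.getScreenX(width), tcur.getScreenY(height)));
    }
    void updateTuioCursor(TuioCursor tcur) {
      cursors.put(tcur.getSessionID(), new PVector(tcur.getScreenX(width), tcur.getScreenY(height)));
    }
    void removeTuioCursor(TuioCursor tcur) {
      cursors.remove(tcur.getSessionID());
    }
    // Fiducial-object and refresh callbacks are required by the client but unused here.
    void addTuioObject(TuioObject tobj) { }
    void updateTuioObject(TuioObject tobj) { }
    void removeTuioObject(TuioObject tobj) { }
    void refresh(TuioTime bundleTime) { }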

The PC inside the table runs an Apache installation configured with PHP5 and PostgreSQL. These services provide access to a local version of the UoW ACE back-end webservice. Each time a tag is dragged into or out of the search bubble, a call is made to the local webservice; the returned XML data indicates which tags can be used to further refine the search and which artwork thumbnails should be displayed.
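The exact URL, parameters and XML schema of the UoW ACE webservice are not documented in this report, so the fragment below is only a hypothetical sketch of that round trip, using Processing's built-in XML class; the endpoint path and element names (relatedTag, artwork, thumbnail) are invented placeholders.

    // Hypothetical call to the local ACE webservice; the URL, parameter and
    // element names are placeholders rather than the real UoW ACE schema.
    void refineSearch(String[] activeTags) {
      String url = "http://127.0.0.1/ace/search.php?tags=" + join(activeTags, ",");
      XML response = loadXML(url);                        // Apache/PHP5 returns XML
      // Tags that could further narrow the current result set.
      for (XML t : response.getChildren("relatedTag")) {
        println("candidate tag: " + t.getContent());
      }
      // Thumbnails of the artworks matching the current tag combination.
      for (XML a : response.getChildren("artwork")) {
        println("show thumbnail: " + a.getString("thumbnail"));
      }
    }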

Scope for Further Expansion

As a technical prototype, there are a few modifications that could be made to improve Tactilis Mensa or provide a better fit for the concept. Within the scope of the Honours project many of these expansions are unlikely to be implemented; however, they provide some insight into the thought processes behind the work and highlight potential pitfalls for anyone looking to embark on a similar endeavour.

On a conceptual level the work aims to explore the development of a multi-user interface, and particular attention was paid to designing the physical form to allow for this usage, but there are two key areas where this could be improved:

1. The GUI application is directional in nature. The text always faces the 'bottom' edge of the table and the artwork images are always presented in the same orientation. To be truly multi-user friendly the works should be able to be rotated. This is a software implementation issue and can easily be addressed in the future (a minimal sketch of the idea appears at the end of this section).

2. A rectangular table provides scope for four users to gather comfortably around the table. Alternative designs were sketched around the idea of a cylindrical object, allowing more people to easily access all edges of the surface. Limitations in projector throw ratios and the relative complexity of such a build prevented this design from being feasible, though a larger budget and timeframe might allow the idea to grow.

The ambient lighting in a room has a tendency to confuse the blob detection software. If it were possible to completely prevent external light from entering the table then calibration would be simple and a universal solution would apply. Unfortunately, because a small amount of IR light is emitted by the projector and more light enters through the projection surface and the ventilation holes, the application needs to be calibrated every time the prototype is used. Additionally, depending on the amount of IR light inside the table, the exposure time of the webcam may need to be increased, lowering the framerate and causing a visible delay when the user tries to drag an object. Disabling the illuminators and using the light-pen alone provides the most responsive and accurate interface, but it prevents touches from being recognised, creating a less tactile experience.
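As a rough sketch of the adjustment described in point 1 above, labels and thumbnails drawn in Processing can be rotated per element so that each item faces whichever edge of the table its user occupies; the drawTagLabel name and the source of the facing angle are illustrative, not part of the existing sketch.

    // Hypothetical per-item rotation so that labels can face any edge of the table.
    // 'facing' might come from the orientation of the drag gesture or a chosen edge.
    void drawTagLabel(String label, float x, float y, float facing) {
      pushMatrix();
      translate(x, y);            // move the origin to the tag's position
      rotate(facing);             // 0 faces the 'bottom' edge; HALF_PI faces an adjacent edge
      textAlign(CENTER, CENTER);
      text(label, 0, 0);
      popMatrix();
    }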

References

Fry, B 1999, anemone, accessed 20/10/2009, http://benfry.com/anemone/

Han, J 2006, Jeff Han, NYU Courant Institute of Mathematical Sciences, February, accessed 20/10/2009, http://cs.nyu.edu/~jhan/

Jordà, Alonso, Geiger & Kaltenbrunner 2005, Reactable, accessed 20/10/2009, http://www.reactable.com/

Nagel, T 2008, mæve installation, accessed 20/10/2009, http://portal.mace-project.eu/maeve/

Ramos, A, Titoulenko, A, Moore, C, Wallin, D, Muller, L, Khoshabeh, R, Hartman, S, Sandler, S & Hansen, T 2009, Community Core Vision, software, accessed 11/05/2009, http://ccv.nuigroup.com/

Reas, C & Fry, B 2009, Processing, software, accessed 11/05/2009, http://www.processing.org/

Zacosham 2008, Rearditouch, JPEG, accessed 20/10/2009, http://wiki.nuigroup.com/image:rearditouch.jpg