What was the first gestural interface?

stanford hci group / cs247 Human-Computer Interaction Design Studio
What was the first gestural interface?
15 January 2013, http://cs247.stanford.edu
The Theremin; Myron Krueger.

Myron Krueger: "There were things I resented about computers. I resented that I had to sit down to use them... that it was denying that I had a body... that it wasn't perceptual, it was all symbolic. I started thinking that artists and musicians had the best relationships to their tools. As early as '74, the computer could see you." (Krueger, 1988)

P2: Shadow Boxing
Experience a computer; don't learn to use it.
Draw inspiration from prior work:
- Manipulate the physical environment to enhance experience or sensing (bright lights, audio, ...).
- Manipulate the virtual environment (add virtual objects).
- Explore potentially ambiguous input/output relationships, without deep recognition: for example, optical flow, regions of activity, etc.
- Add sensor channels: depth camera, microphone, ...

TOPICS
- Natural User Interfaces
- Deixis & Proxemics
- Gesture Input Technology
- Gesture Design

Natural User Interfaces

What makes an input method natural?
[Slide shows the top 8 image-search results for "natural interaction" vs. the top 8 results for "natural".]

A reasonable working definition? This is an ill-posed question!
"A user interface is natural if the experience of using a system matches expectations, such that it is always clear to the user how to proceed, and that few steps (with a minimum of physical and cognitive effort) are required to complete common tasks." (Hinckley & Wigdor)
Wait, isn't this just usability by another name?
"It is a common mistake to attribute the naturalness of a product to the underlying input technology. A touch screen, or any other input method for that matter, is not inherently natural." (Hinckley & Wigdor)
Fluent experiences depend on the context and expectations of the user, often relying on prior learning and skill acquisition.
What do we do with gestures and body posture?

Deixis
"You, here, now!"
Deixis: referencing the world. We continuously reference elements in the world in ambiguous ways, yet for the most part we seem to convey our intentions quite well.
Deixis: reference by means of an expression whose interpretation is relative to the (usually) extralinguistic context. Example: "Smell this flower."
Common methods of physical reference: pointing and placing [Clark 2003]

Reference by pointing
Reference by orientation and eye gaze
Reference by placement
"Put That There"
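
As a sense of how an interface might operationalize reference by pointing: cast a ray along the user's arm and intersect it with the display plane to recover the referenced location. The sketch below is generic ray-plane geometry; the joint layout and display placement are assumptions for illustration.

```python
import numpy as np

def pointing_target(origin, direction, plane_point, plane_normal):
    """Intersect a pointing ray with a display plane.

    All arguments are 3-vectors (metres). Returns the 3D intersection point,
    or None if the ray is parallel to the plane or points away from it.
    """
    direction = direction / np.linalg.norm(direction)
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-6:
        return None
    t = float(np.dot(plane_normal, plane_point - origin)) / denom
    if t < 0:  # the display is behind the pointing direction
        return None
    return origin + t * direction

if __name__ == "__main__":
    # Illustrative setup: shoulder and hand positions from a body tracker;
    # the display is the vertical plane z = 0 facing the user.
    shoulder = np.array([0.0, 1.4, 2.0])
    hand = np.array([0.2, 1.3, 1.5])
    hit = pointing_target(shoulder, hand - shoulder,
                          plane_point=np.zeros(3),
                          plane_normal=np.array([0.0, 0.0, 1.0]))
    print("pointing at display location:", hit)
```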

Proxemics

Proxemics is the study of measurable distances between people as they interact. [Hall 1966]
Taxonomy of distance:
- Intimate: embracing, touching, or whispering
- Personal: interaction among friends / family
- Social: interactions among acquaintances
- Public: distance used for public speaking
(Marquardt et al., 2011; Vogel & Balakrishnan, 2004)

Incorporating Deixis & Proxemics
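
To make the taxonomy concrete, here is a minimal Python sketch of how a proxemics-aware system (in the spirit of Vogel & Balakrishnan's ambient displays) might map a sensed distance to a zone and an interaction mode. The numeric thresholds are the approximate values commonly attributed to Hall, and the function names and mode labels are illustrative, not from any particular toolkit.

```python
# Minimal sketch: map a sensed person-to-display distance (metres) to one of
# Hall's proxemic zones and pick an interaction mode for an ambient display.
# Thresholds are approximate values commonly cited for Hall's zones.

INTIMATE_M = 0.45
PERSONAL_M = 1.2
SOCIAL_M = 3.6

def proxemic_zone(distance_m: float) -> str:
    """Classify a person-to-display distance into a proxemic zone."""
    if distance_m < INTIMATE_M:
        return "intimate"
    if distance_m < PERSONAL_M:
        return "personal"
    if distance_m < SOCIAL_M:
        return "social"
    return "public"

def interaction_mode(distance_m: float) -> str:
    """Example policy: closer users get richer, more explicit interaction."""
    return {
        "intimate": "direct touch input",
        "personal": "personal details + gestures",
        "social": "subtle notifications",
        "public": "ambient overview only",
    }[proxemic_zone(distance_m)]

if __name__ == "__main__":
    for d in (0.3, 0.9, 2.5, 5.0):
        print(f"{d:>4} m -> {proxemic_zone(d):8s} -> {interaction_mode(d)}")
```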

Gesture Input Technologies

Kinect Sensor: RGB camera, infrared camera, infrared projector, microphones, motor, USB.

How Kinect works: depth cameras via structured light.
Structured-light 3D scanning with structured IR light: cheap, fast, and accurate, but suffers from missing pixels (on surfaces that are not IR reflective) and shadows.
[Slide compares an RGB image with the corresponding depth image, coded from far to near.]
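
To make concrete what a depth camera such as the Kinect actually delivers, here is a small NumPy sketch that back-projects a depth image into 3D points using a standard pinhole camera model. The intrinsics are illustrative placeholders, not the Kinect's calibrated values, and the "missing pixel" handling simply masks zero-depth readings.

```python
import numpy as np

# Illustrative pinhole intrinsics (placeholders, not calibrated Kinect values).
FX, FY = 580.0, 580.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point for a 640x480 depth image

def depth_to_points(depth_m: np.ndarray) -> np.ndarray:
    """Back-project an HxW depth image (metres) to an HxWx3 array of 3D points.

    Pixels with depth 0 (the "missing pixels" caused by shadows or surfaces
    that are not IR reflective) come out at the origin and can be masked out.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.dstack([x, y, z])

if __name__ == "__main__":
    # Fake 480x640 depth frame: a flat wall 2 m away with a simulated dropout.
    depth = np.full((480, 640), 2.0)
    depth[100:150, 200:260] = 0.0
    points = depth_to_points(depth)
    valid = depth > 0
    print("valid pixels:", int(valid.sum()), "of", depth.size)
    print("centre point (x, y, z):", points[240, 320])
```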

RGB vs. Depth for Human Pose Estimation
RGB: only works when well lit; background clutter; scale unknown; confounded by clothing and skin colour.
Depth: works in low light; the person pops out from the background; scale is known; uniform texture; but shadows and missing pixels.
Pose estimation is much easier with depth. Kinect tracks 20 body joints in real time.

Skeletal Tracking (Kinect SDK): input depth image -> inferred body parts and overlaid joint hypotheses (top, side, and front views) -> 3D joint hypotheses.

Data streams: RGB and depth images, skeletal tracking, audio (Microsoft Speech Platform).
Constraints:
- Latency: data analysis introduces lag
- 86 cm to 4 m range
- Not outdoors (too much IR noise)
- Not too close to other Kinects (IR interference)
- Tracks 1-2 people only; full bodies must be in view (?)
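
Given per-frame joint positions like those a skeletal tracker reports, even a very simple rule can serve as a gesture detector. The sketch below checks whether the right hand is raised above the head; the frame format, joint names, and threshold are assumptions for illustration, not the Kinect SDK's actual data structures.

```python
from typing import Dict, Tuple

# A tracked frame is assumed to be a mapping from joint name to an (x, y, z)
# position in metres, roughly what a skeletal tracker reports per person.
Joints = Dict[str, Tuple[float, float, float]]

def hand_raised(joints: Joints, margin_m: float = 0.10) -> bool:
    """True if the right hand is at least margin_m above the head.

    Assumes a y-up coordinate frame; joint names and the margin are
    illustrative choices, not Kinect SDK constants.
    """
    try:
        head_y = joints["head"][1]
        hand_y = joints["hand_right"][1]
    except KeyError:
        return False  # joints not tracked in this frame
    return hand_y > head_y + margin_m

if __name__ == "__main__":
    frame: Joints = {
        "head": (0.0, 1.60, 2.2),
        "hand_right": (0.25, 1.78, 2.1),
    }
    print("hand raised:", hand_raised(frame))
```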

Leap Motion

Designing Gestural Interfaces
Designing gestural UIs: a designer must consider (a) the physical sensor.

Input Device Properties:
- Property sensed: position, force, angle, joints
- States sensed: contact, hover, ...
- Precision: accuracy of selection
- Latency: delay in property/state sensing
- Acquisition time: getting the pen, moving the hand to the mouse
- False input: accidental touches

Of clutches and live mics

  Device    Property          State tracked
  Mouse     2D position       Hover, button press
  Stylus    2D position       Hover, contact
  Touch     2D position       Contact
  Gesture   2D/3D position    ??

In-air gestures may involve a live mic, increasing the chances of false positives and false negatives.
Clutch: differentiate actions intended to drive the computing system from those that are not.

Managing a live mic:
- Reserved actions: design gestures that will not be triggered unless specifically desired by the user.
- Reserved clutches: use a special gesture to indicate that the system should now monitor for input commands.
- Multi-modal input: use another modality, such as buttons or voice input, to engage tracking by the system.

Designing gestural UIs: a designer must consider (a) the physical sensor, (b) the feedback presented to the user, (c) ergonomic and industrial design, (d) the interplay between all interaction techniques and among all devices in the surrounding context, and (e) the learning curve.

Gesture Design Exercise
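
One way to read the clutch idea concretely: a reserved clutch gesture switches the recognizer between a passive "live mic" state, in which movement is ignored, and an engaged state, in which gestures are interpreted as commands. The sketch below is a generic state machine, not any particular SDK's API; the event names are hypothetical.

```python
from enum import Enum, auto

class Mode(Enum):
    PASSIVE = auto()  # live mic: the sensor sees everything, but input is ignored
    ENGAGED = auto()  # clutch engaged: movements are interpreted as commands

class GestureClutch:
    """Minimal clutch: a reserved engage/disengage gesture gates all commands.

    Event names ("engage_pose", "swipe_left", ...) are illustrative; they
    would come from whatever recognizer the system actually uses.
    """

    def __init__(self):
        self.mode = Mode.PASSIVE

    def on_event(self, event: str):
        if self.mode is Mode.PASSIVE:
            if event == "engage_pose":  # reserved clutch gesture
                self.mode = Mode.ENGAGED
            return None                 # everything else is ignored
        if event == "disengage_pose":
            self.mode = Mode.PASSIVE
            return None
        return f"command:{event}"       # engaged: interpret as a command

if __name__ == "__main__":
    clutch = GestureClutch()
    events = ["swipe_left", "engage_pose", "swipe_left",
              "disengage_pose", "swipe_left"]
    for ev in events:
        print(ev, "->", clutch.on_event(ev))
```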

How to design gestures?
Observation: generate potential gestures by observing (and participating in) situated activity.
Participatory design: have representative users generate potential gestures for you.
One methodology [Wobbrock et al. 2009]:
1. Show the participant the start and end states of the UI.
2. The participant performs a gesture for that effect.
3. Analyze the collected gestures across the population (see the agreement-score sketch after the exercise description below).
You must still consider the interplay with task and context!

Design Exercise
Context: virtual post-its; the primary interface elements are movable, resizable squares.
Your task: design a consistent touch-gesture vocabulary for a set of operations. You may assume that:
(a) Users can use both of their hands.
(b) The system identifies the hands/fingers being used.
(c) You may introduce additional widgets or graphical elements as part of your vocabulary.

Design Exercise Overview:
- 5 min: individually develop your own gestures
- 15 min: share with your table, revise as a group
- 15 min: share with the class
Consider: learnability, mechanics of repeated use, consistency/compatibility across operations.
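
For step 3 of the elicitation methodology, the collected gestures are usually binned into similarity groups and summarized with an agreement score; one common formulation, in the spirit of Wobbrock et al. (2009), sums for each referent the squared fractions of participants whose proposals fell into the same group. The sketch below computes that quantity over made-up example data; the gesture labels are hypothetical.

```python
from collections import Counter
from typing import Dict, List

def agreement(proposals: Dict[str, List[str]]) -> float:
    """Average per-referent agreement over a set of elicited gestures.

    For each referent, sum (group size / number of participants) squared over
    the similarity groups, then average across referents.
    """
    per_referent = []
    for referent, gestures in proposals.items():
        n = len(gestures)
        groups = Counter(gestures)  # gestures already binned by similarity
        per_referent.append(sum((size / n) ** 2 for size in groups.values()))
    return sum(per_referent) / len(per_referent)

if __name__ == "__main__":
    # Hypothetical elicitation data: five participants per referent, with
    # proposals already labelled into similarity groups by a human coder.
    data = {
        "move": ["drag", "drag", "drag", "drag", "flick"],
        "enlarge": ["pinch_out", "pinch_out", "two_finger_stretch",
                    "pinch_out", "double_tap"],
    }
    print(f"overall agreement: {agreement(data):.2f}")
```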

User-Designed Gestures
Operations to cover: select element, move, rotate, select multiple, shrink, enlarge, pan (scroll) workspace, zoom workspace, delete, cut, copy, paste, undo, redo, invoke menu, menu item 1, menu item 2, menu item 3.
[Slide shows a worksheet grid with a placeholder square for each operation.]

Final Thoughts
- Leverage the unique opportunities provided by a particular input technology. Don't shoehorn new modalities where old techniques excel.
- Consider perceptual vs. symbolic input.
- Prevent accidental (vs. intentional) input via unambiguous design and/or clutching.
- Respect existing conventions of spatial reference and the social use of space.