Touch & Gesture HCID 520 User Interface Software & Technology

Natural User Interfaces

What was the first gestural interface?

Myron Krueger: "There were things I resented about computers. I resented that I had to sit down to use them... that it was denying that I had a body... that it wasn't perceptual, it was all symbolic. I started thinking that artists and musicians had the best relationships to their tools. As early as '74, the computer could see you." (Krueger, 1988)

[O'Sullivan]

http://worrydream.com/abriefrantonthefutureofinteractiondesign/

compared to "Pictures Under Glass"

What makes an input method natural?

Top 8 images for natural interaction.

What makes an input method natural? An ill-posed question

A reasonable definition? A user interface is natural if: "The experience of using a system matches expectations, such that it is always clear to the user how to proceed, and that few steps (with a minimum of physical and cognitive effort) are required to complete common tasks." (Hinckley & Wigdor)
Q: Is this just usability by another name?

"It is a common mistake to attribute the naturalness of a product to the underlying input technology. A touch screen, or any other input method for that matter, is not inherently natural." (Hinckley & Wigdor)
Fluent experiences depend on the context and expectations of the user, often relying on prior learning and skill acquisition.

Touch Input
Different types of sensors:
Resistive: pressure connects conductive and resistive circuit layers. Cheap; supports single touch.
Capacitive: a layer holds electric charge, which a touch changes at the contact point. Supports multi-touch.
Surface acoustic wave: measures changes to ultrasonic waves travelling across the surface. Expensive and sensitive.
Optical imaging: uses IR light and cameras to track touches (which appear as shadows). Supports multi-touch; see the sketch below.
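To make the optical approach concrete, here is a minimal sketch of camera-based touch detection, assuming OpenCV and a hypothetical get_ir_frame() capture source. Whether touches read as bright blobs or dark shadows depends on the illumination setup, so the threshold direction may need flipping.

```python
# Minimal sketch of optical (camera-based) touch detection: touches show up
# as blobs in an IR camera image. get_ir_frame() is a hypothetical capture
# function standing in for a real IR camera driver; assumes a rear-lit setup
# where touches appear bright (invert the threshold for shadow-based setups).
import cv2
import numpy as np

def detect_touches(ir_frame: np.ndarray, min_area: float = 30.0):
    """Return (x, y) centroids of touch blobs in a grayscale IR frame."""
    blurred = cv2.GaussianBlur(ir_frame, (5, 5), 0)  # suppress sensor noise
    _, mask = cv2.threshold(blurred, 200, 255, cv2.THRESH_BINARY)  # keep bright blobs
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:           # ignore specks
            m = cv2.moments(c)
            touches.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return touches

# Usage (hypothetical capture loop):
#   while True:
#       points = detect_touches(get_ir_frame())
```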

In-Air Gestures

Kinect Sensor
Hardware: RGB camera, infrared projector, infrared camera, microphone array, tilt motor, USB connection.

Depth Cameras
Structured IR light: cheap, fast, and accurate, but suffers from missing pixels (surfaces that do not reflect IR) and shadows.
(Figure: RGB frame beside the corresponding depth map, shown near to far, with missing pixels and shadows visible.)

How Kinect Works: Structured-Light 3D Scanning. The underlying triangulation relation is sketched below.
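As background (not from the slides), structured light recovers depth by triangulation: the projector and IR camera form a stereo pair, so a projected dot observed with disparity d, at focal length f (in pixels) and baseline b, lies at depth:

```latex
% Standard structured-light triangulation (background; notation assumed here):
% b = projector-camera baseline, f = focal length in pixels,
% d = observed disparity of a projected IR dot.
\[
  z = \frac{f \, b}{d}
\]
```

Disparity shrinks with distance, which is why depth precision degrades toward the far end of the sensing range.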

RGB vs. Depth for Pose Estimation
RGB: only works when well lit; background clutter; scale unknown; confounded by clothing and skin colour.
Depth: works in low light; the person pops from the background; scale is known; uniform texture; but has shadows and missing pixels.
Pose estimation is much easier with depth; a minimal background-subtraction sketch follows below.
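The "person pops from the background" advantage is easy to see in code: with depth, foreground segmentation reduces to a per-pixel comparison against a background depth model, independent of lighting, clothing, or skin colour. A minimal numpy sketch, with hypothetical frame arrays:

```python
# Depth-based foreground extraction: compare each pixel against a background
# depth model. depth_mm and background_mm are hypothetical HxW arrays of
# depth in millimetres, with 0 marking missing pixels (shadows or
# non-IR-reflective surfaces).
import numpy as np

def foreground_mask(depth_mm: np.ndarray,
                    background_mm: np.ndarray,
                    min_diff_mm: float = 100.0) -> np.ndarray:
    """True where the scene is at least min_diff_mm closer than the background."""
    valid = depth_mm > 0   # ignore missing pixels
    return valid & (depth_mm < background_mm - min_diff_mm)

# background_mm would be captured once from the empty scene, e.g.:
#   background_mm = frames_of_empty_room.mean(axis=0)
```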

Human Pose Estimation: Kinect tracks 20 body joints (x, y, z, θ) in real time.

Skeletal Tracking
Pipeline: input depth image → inferred body parts with overlaid joint hypotheses → 3D joint hypotheses (shown from front, top, and side views).

Kinect API
Input: image data streams (RGB and depth), skeletal tracking, audio (Microsoft Speech Platform).
Constraints:
Latency: data analysis introduces lag.
Range: 86 cm to 4 m.
Not usable outdoors (too much ambient IR noise).
Interference when placed too close to other Kinects (overlapping IR patterns).
Tracks only 1-2 people, whose full bodies must be in view.
A sketch of how these constraints surface in code follows below.
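A hedged sketch of a skeletal-tracking polling loop, written against a hypothetical Python binding (the real Kinect SDK is C#/C++; `kinect`, `get_skeleton_frame`, and the joint names are all assumptions, not the actual API):

```python
# Hypothetical polling loop over a Kinect-style skeletal tracking API,
# illustrating the constraints above: usable depth range and the limit of
# two fully visible tracked people.
MIN_RANGE_M, MAX_RANGE_M = 0.86, 4.0   # sensor's usable depth range
MAX_TRACKED_PEOPLE = 2

def poll_skeletons(kinect):
    """Return skeletons that are in range with their full body in view."""
    skeletons = kinect.get_skeleton_frame()          # hypothetical call
    usable = []
    for s in skeletons[:MAX_TRACKED_PEOPLE]:
        head, foot = s.joint("head"), s.joint("foot_left")   # hypothetical names
        in_range = MIN_RANGE_M <= head.z <= MAX_RANGE_M
        full_body = head.tracked and foot.tracked    # full body must be in view
        if in_range and full_body:
            usable.append(s)
    return usable
```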

Gesture Design

Designing Gestural UIs A designer must consider: (a) the physical sensor

Input Device Properties
Property sensed: position, force, angle, joints
States sensed: contact, hover
Precision: accuracy of selection
Latency: delay in property/state sensing
Acquisition time: e.g., grabbing a pen, moving the hand to the mouse
False input: accidental touches

On clutches and live mics
Device  | Property       | State Tracked
Mouse   | 2D position    | hover, button-press
Stylus  | 2D position    | hover, contact
Touch   | 2D position    | contact
Gesture | 2D/3D position | ??
In-air gestures may involve a "live mic," increasing the chances of false positives and false negatives.
Clutch: differentiates actions intended to drive the computing system from those that are not.

Managing a live mic
Reserved actions: design gestures that will not be triggered unless specifically desired by the user.
Reserved clutches: use a special gesture to indicate that the system should now monitor for input commands.
Multi-modal input: use another modality, such as buttons or voice input, to engage tracking by the system.
A sketch of the reserved-clutch strategy follows below.
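A minimal sketch of a reserved clutch: the system forwards gestures only while the clutch is engaged. The engage gesture chosen here (hand held above the head for half a second) and all names are illustrative assumptions, not from the slides.

```python
# Reserved-clutch state machine for a live-mic gesture system: downstream
# gesture recognition runs only while the clutch is engaged. The engage
# gesture (hand above head, held briefly) is an illustrative assumption.
import time

class GestureClutch:
    def __init__(self, hold_seconds: float = 0.5):
        self.hold_seconds = hold_seconds
        self.engaged = False
        self._raised_since = None

    def update(self, hand_y: float, head_y: float, now: float = None) -> bool:
        """Feed one tracking frame; returns True while the clutch is engaged."""
        now = time.monotonic() if now is None else now
        if hand_y > head_y:                   # hand held above the head
            if self._raised_since is None:
                self._raised_since = now
            if now - self._raised_since >= self.hold_seconds:
                self.engaged = True           # deliberate hold: start listening
        else:
            self._raised_since = None
            self.engaged = False              # lowering the hand releases
        return self.engaged

# Usage: only pass frames to the recognizer while clutch.update(...) is True,
# cutting false positives from incidental movement.
```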

Designing Gestural UIs
A designer must consider:
(a) the physical sensor
(b) the feedback presented to the user
(c) ergonomic and industrial design
(d) the interplay between all interaction techniques and among all devices in the surrounding context
(e) the learning curve

How to design gestures?
Observation: generate potential gestures by observing (and participating in) situated activity.
Participatory design: have representative users generate potential gestures for you.
One methodology [Wobbrock et al. 2009]:
1. Show the participant the start and end states of the UI.
2. The participant performs a gesture for that effect.
3. Analyze the collected gestures from the population (see the sketch after this list).
Designers must still consider the interplay with task and context.
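Step 3 is typically quantified with the agreement score from Wobbrock et al. 2009: for each referent, group identical proposals and sum the squared proportion of each group. A minimal sketch; the gesture labels are whatever categories the analyst assigns when grouping the collected gestures:

```python
# Agreement score from Wobbrock et al. 2009: for one referent r with
# proposals P_r, A_r = sum over groups of identical proposals P_i of
# (|P_i| / |P_r|)^2. A_r = 1.0 means every participant proposed the
# same gesture; values near 1/|P_r| mean near-total disagreement.
from collections import Counter

def agreement(proposals: list[str]) -> float:
    """Agreement score for one referent, given its list of gesture labels."""
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# Example: 10 participants propose a gesture for "delete" (labels hypothetical).
labels = ["drag_offscreen"] * 6 + ["scribble"] * 3 + ["tap_trash"]
print(agreement(labels))   # 0.36 + 0.09 + 0.01 = 0.46
```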

User-Designed Gestures

Discussion

Discussion Questions
Kristen: I found it interesting how the study was based around an effect/cause testing model (having users perform actions they think would result in the shown effect). Is this a popular method in other areas of HCI?
Lauren: I think showing users an effect, and then asking them to perform its cause, is an incredibly intelligent way to gather data on natural, intuitive gestures. This method will undoubtedly decrease the cognitive load placed on the user and allow them to act naturally while interacting with the device, instead of requiring the user to memorize and recall several different learned gestures, creating a much more enjoyable user experience.

Discussion Questions
Sara: How might gestures differ significantly across cultures (especially in ones that read from right to left, or that tend to prefer using the right hand instead of the left, for example)?
Taysser: This reading says that users are not designers; therefore, care must be taken to elicit user behavior profitable for design. What effect does design have on developing a user-defined gesture set? This reading doesn't provide an example that clarifies this statement.

Discussion Questions
Stuart: Morris, Saponas, & Tan posit that a glasses-shaped display is "the best candidate for always-available output in the near-term future." This paper was written in 2011, and since then, Google Glass has flopped tremendously. What went wrong? Why are glasses-based optical interfaces not as great an idea as they thought?
Rick: With the announcement of Microsoft HoloLens, it seems that users can interact with virtual objects in their field of vision. I am very curious whether the interaction mode of HoloLens will be more similar to a touch system or an in-air gesturing system.

Discussion Questions
Acacio: Although the problems mentioned are somewhat obvious, I found a hidden conclusion: multi-modal inputs make the most sense from a usability standpoint and have been the most successful in real-world usage. In an ideal situation, each mode complements the other, adding functionality, as the article mentions in the case of the keyboard and mouse. Another advance I have found hard to live without is my Apple Magic Mouse, which incorporates touch as well. It has a clutch: while the mouse is actively moving, touch input is disabled. This article opened my eyes to looking at these things around me.

The Kinematic Chain

Yves Guiard: Kinematic Chain
Asymmetry in bimanual activities: "Under standard conditions, the spontaneous writing speed of adults is reduced by some 20% when instructions prevent the non-preferred hand from manipulating the page."
The non-dominant hand (NDH) provides a frame of reference for the dominant hand (DH).
The NDH operates at a coarse temporal and spatial scale; the DH operates at a finer scale. An illustrative sketch follows below.
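In UI terms, Guiard's principle suggests letting the NDH set a coarse reference frame that the DH works within, as in a drawing app where the NDH pans and rotates the canvas while the DH draws. An illustrative sketch; all names and the scenario are assumptions, not from the slides:

```python
# Illustrative mapping of Guiard's kinematic chain onto a bimanual UI:
# the non-dominant hand coarsely positions/orients the page (like Guiard's
# handwriting example), and the dominant hand draws finely within it.
import math

class BimanualCanvas:
    def __init__(self):
        self.origin = (0.0, 0.0)   # page position, set by the NDH (coarse)
        self.angle = 0.0           # page orientation in radians, set by the NDH
        self.strokes = []          # points drawn by the DH (fine)

    def ndh_frame(self, x: float, y: float, angle: float):
        """Non-dominant hand: reposition and orient the page."""
        self.origin, self.angle = (x, y), angle

    def dh_draw(self, x: float, y: float):
        """Dominant hand: record a point in page coordinates (the NDH frame)."""
        dx, dy = x - self.origin[0], y - self.origin[1]
        c, s = math.cos(-self.angle), math.sin(-self.angle)
        self.strokes.append((dx * c - dy * s, dx * s + dy * c))
```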

Proxemics

Proxemics is the study of measurable distances between people as they interact. [Hall 1966]
Taxonomy of distance:
Intimate: embracing, touching, or whispering
Personal: interaction among friends and family
Social: interaction among acquaintances
Public: distance used for public speaking
(A minimal zone classifier follows below.)
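A minimal classifier mapping a sensed interpersonal distance to Hall's zones, as a proxemic interface might. The metric thresholds are approximate conversions of Hall's published distances (about 18 inches, 4 feet, and 12 feet), an assumption not stated in the slides:

```python
# Classify a sensed interpersonal distance into Hall's proxemic zones.
# Thresholds are approximate metric conversions of Hall's distances.
def proxemic_zone(distance_m: float) -> str:
    if distance_m < 0.45:
        return "intimate"   # embracing, touching, whispering
    if distance_m < 1.2:
        return "personal"   # friends / family
    if distance_m < 3.7:
        return "social"     # acquaintances
    return "public"         # public speaking

print(proxemic_zone(2.0))   # "social"
```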

We continuously reference elements in the world in ambiguous ways, yet for the most part we seem to convey our intentions quite well.
Deixis: reference by means of an expression whose interpretation is relative to the (usually) extralinguistic context.
Common methods of physical reference: pointing & placing. [Clark 2003]
Proxemic interaction: Vogel & Balakrishnan 2004; Marquardt et al. 2011.

Final Thoughts
Leverage the unique opportunities provided by a particular input technology. Don't shoehorn new modalities where old techniques excel.
Consider perceptual vs. symbolic input.
Prevent accidental (vs. intentional) input via unambiguous design and/or clutching.
Respect existing conventions of spatial reference and social use of space.