GUI and Gestures CS334 Fall 2013 Daniel G. Aliaga Department of Computer Science Purdue University
User Interfaces Human Computer Interaction Graphical User Interfaces History 2D interfaces VR/AR Interfaces 3D interfaces Adaptive User Interfaces Tangible User Interfaces Haptic User Interfaces Gestures
Human Computer Interaction Human-computer interaction is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them.
Graphical User Interfaces 1970s: Xerox PARC (Palo Alto Research Center) Smalltalk project developed the Star GUI on the Alto system 1982: Steve Jobs saw it and was impressed; Apple hired many Xerox people and built its own version of the GUI for the failed Lisa computer 1984: Apple released the Macintosh computer with the WIMP interface (Windows, Icons, Menus, Pointer)
VR/AR User Interfaces
VR/AR User Interface Optical See-Through HMD
Adaptive User Interface "An Adaptive System is a knowledge-based system which automatically alters aspects of the system functionality and interface in order to accommodate the differing preferences and requirements of individual system users" (Benyon 1990) Examples of adaptive behavior: Selection of interaction techniques and communication channels Task-dependent presentation of forms and menus Task- or user-dependent information presentation Adaptive help
Tangible User Interfaces
Haptic User Interfaces
Gestures Any instance where an individual engages in physical movements whose intent is recognized and used to communicate desired actions or ideas Kinds of gestures Gesticulation: gestures that accompany speech Autonomous: gestures that function independently of speech
Gestures Gesture Recognition Track-based: recognize gestures from motions of hands and body Pen-based: recognize gestures from strokes of (2D) input devices
Gestures History 1963: First pen-based system, the RAND tablet 1992: Apple Newton, first widespread pen-based system 1980s/1990s/2000s: Instrumented gloves Arms
Gestures From the 60s: teleoperated arms for handling radioactive material Now: for protein docking
Gestures: today Kinect Wii Smart Phones/Tablets
Gestures Types of gestures Sign language Body motions and configurations Hand motions and configurations Facial motions and expressions Pen-based symbols Circle and copy Scratch-out Make object Etc
Gestures How do Pen-based gestures differ? In shape? In timing? In strokes?
Single Stroke Gestures What are the challenges? Fast, robust recognition Invariance to rotation/scale/translation
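One common way to get the invariances listed above is to normalize a stroke before extracting features: translate it to its centroid, rotate it to a canonical orientation, and rescale it. This is a sketch of that preprocessing idea (in the style of later stroke recognizers), not an algorithm from the lecture itself:

```python
import math

def normalize(points, size=100.0):
    """Normalize a stroke (list of (x, y) samples) for
    translation/rotation/scale invariance:
    1. translate the centroid to the origin,
    2. rotate so the first point lies at angle 0 from the centroid,
    3. scale the bounding box to size x size."""
    n = len(points)
    cx = sum(x for x, y in points) / n
    cy = sum(y for x, y in points) / n
    # 1. translate centroid to origin
    pts = [(x - cx, y - cy) for x, y in points]
    # 2. rotate so the first point sits on the positive x-axis
    theta = math.atan2(pts[0][1], pts[0][0])
    cos_t, sin_t = math.cos(-theta), math.sin(-theta)
    pts = [(x * cos_t - y * sin_t, x * sin_t + y * cos_t) for x, y in pts]
    # 3. scale the bounding box to size x size (guard against zero extent)
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    return [(x * size / w, y * size / h) for x, y in pts]
```

After this step, two copies of the same gesture drawn at different positions, sizes, or orientations map to nearly identical point sequences.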
Single Stroke Gesture Recognition Basic Algorithm: Given C classes of gestures and E_c training examples per class Compute a set of F features per gesture that uniquely identify it Given a new gesture to recognize: Compute its F features Find the class with the most similar set of features Done ("Specifying Gestures by Example", Dean Rubine, SIGGRAPH 1991)
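The basic algorithm above can be sketched end to end. Here "most similar set of features" is taken to mean smallest Euclidean distance to a class's mean feature vector; that is a simple stand-in for Rubine's linear classifier (covered on the later slides), and `featurize` is any feature-extraction function you supply:

```python
def recognize(stroke, templates, featurize):
    """Classify a stroke against per-class training examples.

    templates:  dict mapping class name -> list of example strokes
    featurize:  function mapping a stroke -> list of F feature values

    Returns the class whose mean feature vector is closest
    (Euclidean distance) to the stroke's feature vector."""
    f = featurize(stroke)
    best_class, best_dist = None, float("inf")
    for cls, examples in templates.items():
        feats = [featurize(e) for e in examples]
        num_features = len(f)
        # mean feature vector over this class's training examples
        mean = [sum(v[i] for v in feats) / len(feats)
                for i in range(num_features)]
        d = sum((a - b) ** 2 for a, b in zip(f, mean)) ** 0.5
        if d < best_dist:
            best_class, best_dist = cls, d
    return best_class
```

With only a handful of well-separated features per class, even this nearest-mean rule recognizes gestures reliably; the linear discriminator on the following slides weights each feature instead of treating them equally.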
Some Results
Features Each gesture is a sequence of sampled points g_p = (x_p, y_p, t_p) Feature vector is f = (f_1, ..., f_F) Choose the gesture class c that maximizes v_c = w_c0 + Σ_{i=1..F} w_ci f_i Example features on board
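As a concrete illustration, here is a subset of Rubine's features (his full set has 13) computed from a stroke of (x, y, t) samples, plus the per-class score v_c from the slide's formula. The particular features chosen here are a sketch of the idea, not the complete published set:

```python
import math

def rubine_features(stroke):
    """A few Rubine-style features for a stroke of (x, y, t) samples:
    f1, f2: cosine/sine of the initial direction
    f3, f4: length and angle of the bounding-box diagonal
    f5:     distance from first to last point
    f6:     total path length"""
    xs = [p[0] for p in stroke]
    ys = [p[1] for p in stroke]
    # initial direction (from the first point to the third, as Rubine does)
    dx = stroke[2][0] - stroke[0][0]
    dy = stroke[2][1] - stroke[0][1]
    d = math.hypot(dx, dy) or 1.0
    f1, f2 = dx / d, dy / d
    # bounding-box diagonal length and angle
    bw, bh = max(xs) - min(xs), max(ys) - min(ys)
    f3 = math.hypot(bw, bh)
    f4 = math.atan2(bh, bw)
    # distance from first to last point
    f5 = math.hypot(stroke[-1][0] - stroke[0][0],
                    stroke[-1][1] - stroke[0][1])
    # total path length
    f6 = sum(math.hypot(stroke[i + 1][0] - stroke[i][0],
                        stroke[i + 1][1] - stroke[i][1])
             for i in range(len(stroke) - 1))
    return [f1, f2, f3, f4, f5, f6]

def score(features, w0, w):
    """v_c = w_c0 + sum_{i=1..F} w_ci * f_i  (the slide's formula)."""
    return w0 + sum(wi * fi for wi, fi in zip(w, features))
```

A gesture is assigned to whichever class c yields the largest `score` under that class's weights.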
Training Goal: Obtain a classifier that recognizes each gesture Options: Use an iterative scheme to refine weights Use a closed formula with assumptions Classical linear discriminator
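The second option above, the closed formula, is what Rubine's paper uses: the weights of the linear discriminator come directly from each class's mean feature vector and a pooled covariance estimate, with no iterative refinement. A sketch with NumPy (the pseudo-inverse is my addition, to guard against a singular covariance):

```python
import numpy as np

def train(examples):
    """Closed-form training of the linear classifier.

    examples: dict mapping class name -> list of feature vectors
    Returns:  dict mapping class name -> (w_c0, w_c) weight pair"""
    classes = list(examples)
    means = {c: np.mean(np.asarray(examples[c], float), axis=0)
             for c in classes}
    num_features = len(next(iter(means.values())))
    # pooled covariance: sum per-class scatter, divide by total dof
    scatter = np.zeros((num_features, num_features))
    total = 0
    for c in classes:
        dev = np.asarray(examples[c], float) - means[c]
        scatter += dev.T @ dev
        total += len(examples[c])
    cov = scatter / (total - len(classes))
    inv = np.linalg.pinv(cov)   # pseudo-inverse in case cov is singular
    weights = {}
    for c in classes:
        w = inv @ means[c]              # w_cj from the inverse covariance
        w0 = -0.5 * (w @ means[c])      # w_c0 = -1/2 sum_j w_cj mean_cj
        weights[c] = (w0, w)
    return weights

def classify(f, weights):
    """Pick the class maximizing v_c = w_c0 + w_c . f."""
    f = np.asarray(f, float)
    return max(weights, key=lambda c: weights[c][0] + weights[c][1] @ f)
```

Because the covariance is shared across classes, the resulting decision boundaries are linear in the features, which is exactly the classical linear discriminator the slide names.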
Linear Discriminator (see board)
Recognition For each new stroke, compute its feature vector and find the gesture class that maximizes the weighted sum of its features Classic linear discriminator