On-line Gesture Recognition


1 On-line Gesture Recognition Luis A. Leiva (luileito@prhlt.upv.es), PRHLT Research Center, Departamento de Sistemas Informáticos y Computación, Universitat Politècnica de València. Slides licensed under CC BY 4.0: http://creativecommons.org/licenses/by/4.0/

2 Presentation Outline Introduction 1 Preliminaries 19 Some Techniques 31 Recap 59 References 63 Slides available at On-line Gesture Recognition 1

3 Introduction On-line Gesture Recognition 2

4 Definitions Gesture /ˈdʒɛstʃə(r)/ noun: the use of motions of the limbs or body as a means of expression. Synonyms: signal, sign, motion, indication, gesticulation. Gestures are hand-drawn strokes that do things. Lipscomb (1991) On-line Gesture Recognition 3

5 Definitions 1. Off-line gesture recognition: post-hoc, processed after user interaction; static data, no temporal info available. 2. On-line gesture recognition: real-time, direct manipulation; sequential, time-series data. On-line Gesture Recognition 4

6 Historical Precedents Sketchpad, Sutherland (1963). RAND tablet, Davis and Ellis (1964). On-line Gesture Recognition 5

7 Gestures Today Minority Report. Image by 20th Century Fox & DreamWorks On-line Gesture Recognition 6

8 Input Devices Wii. Image by Nintendo Co., Ltd. On-line Gesture Recognition 7

9 Input Devices T(ether). Image by Massachusetts Institute of Technology On-line Gesture Recognition 8

10 Input Devices Kinect for Xbox 360. Image by Microsoft Corporation On-line Gesture Recognition 9

11 Input Devices Humantenna. Image by Microsoft Research On-line Gesture Recognition 10

12 Input Devices Skinput. Image by Carnegie Mellon University On-line Gesture Recognition 11

13 Input Devices Myo. Image by Thalmic Labs On-line Gesture Recognition 12

14 Input Devices Leap motion. Image by Leap Motion, Inc. On-line Gesture Recognition 13

15 Input Devices Air Clicker. Image by Yanko Design On-line Gesture Recognition 14

16 Input Devices TapTap. Image by Woodenshark LLC On-line Gesture Recognition 15

17 Input Devices Pen Tail gestures, by Tian et al. (2012) On-line Gesture Recognition 16

18 Advantages Natural communication, Expressiveness, Ergonomics, Usability, Fun. On-line Gesture Recognition 17

19 Disadvantages May break fundamental interaction principles: Discoverability, Reliability, Scalability, etc. Ambiguity: non-deterministic decoding Lack of standards Cultural issues On-line Gesture Recognition 18

20 Trade-offs accuracy design setup performance On-line Gesture Recognition 19

21 Preliminaries On-line Gesture Recognition 20

22 Interaction Paradigms Mid-air Onscreen On-line Gesture Recognition 21

23 Definition stroke = pointer down → pointer move → pointer up. $s = \{(x_1, y_1, t_1), \ldots, (x_j, y_j, t_j), \ldots, (x_N, y_N, t_N)\}$ On-line Gesture Recognition 22
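
As a concrete illustration of the definition above, a stroke can be stored as a plain list of time-stamped points collected between pointer-down and pointer-up. This is only a sketch; the class and field names below are assumptions, not from the slides.

```python
# Illustrative sketch: a stroke as time-stamped points collected
# between pointer-down and pointer-up (names are assumptions).
from dataclasses import dataclass
from typing import List

@dataclass
class Point:
    x: float
    y: float
    t: float   # timestamp, e.g. milliseconds since pointer-down

Stroke = List[Point]

# s = {(x_1, y_1, t_1), ..., (x_N, y_N, t_N)}
s: Stroke = [Point(10, 12, 0), Point(14, 15, 16), Point(20, 21, 33)]
```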

24 A Taxonomy 1. zero-order gestures On-line Gesture Recognition 23

25 A Taxonomy 2. first-order gestures (unistrokes) On-line Gesture Recognition 24

26 A Taxonomy 3. higher-order gestures (multistrokes) On-line Gesture Recognition 25

27 Processing Pipeline Input gesture → Capture → Preprocessing → Feature extraction → Recognition → Output command (design choices: feature selection, classifier selection) On-line Gesture Recognition 26

28 Capture Event-based vs. polling (constant frequency). Sampling rate matters! [figure: the same stroke sampled at low vs. high frequency] On-line Gesture Recognition 27

29 Preprocessing input capture segmentation noise removal resampling normalization *optional steps On-line Gesture Recognition 28
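
Of the preprocessing steps listed above, resampling is the one most recognizers depend on: the stroke is walked and points are emitted at equal arc-length intervals. The sketch below follows that generic recipe under the assumption that strokes are lists of (x, y) tuples; it is not the original code behind any particular recognizer.

```python
import math

def path_length(pts):
    """Total arc length of a polyline given as (x, y) tuples."""
    return sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))

def resample(points, n=64):
    """Resample a stroke into n points spaced evenly along its path."""
    pts = [tuple(p) for p in points]
    interval = path_length(pts) / (n - 1)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            f = (interval - acc) / d
            q = (pts[i - 1][0] + f * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + f * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)   # keep measuring from the newly inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:        # guard against floating-point round-off
        out.append(pts[-1])
    return out[:n]
```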

30 Features Feature Engineering is an art! On-line Gesture Recognition 29

31 Recognition Techniques Hashing: dictionary lookup, zone coding, chain codes Parametric: linear fitting, corner detection Matching: DTW, k-nn, dollar family Statistical: GLM, RF, ANN, HMM, CRFs Ad-hoc: knowledge-based, decision trees, FSM On-line Gesture Recognition 30

32 Bonus: Continuous Recognition OctoPocus, by Bau and Mackay (2008) On-line Gesture Recognition 31

33 Some Techniques On-line Gesture Recognition 32

34 Marking Menus by Kurtenbach (1991) On-line Gesture Recognition 33

35 Marking Menus Blender. Image by Blender Foundation On-line Gesture Recognition 34

36 Linear Fitting $\hat{y} = a + bx$, minimize $R^2 = \sum_{i=1}^{N} r_i^2$. Vertical offsets: $r_i = y_i - (a + b x_i)$. Perpendicular offsets: $r_i = \dfrac{y_i - (a + b x_i)}{\sqrt{1 + b^2}}$. On-line Gesture Recognition 35
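
A quick way to obtain the fit and both residual definitions above is an ordinary least-squares fit (which minimizes the vertical offsets) followed by the perpendicular-offset correction. A minimal NumPy sketch, assuming x and y are arrays of stroke coordinates:

```python
import numpy as np

def fit_line(x, y):
    """Fit y ~ a + b*x by ordinary least squares; return both residual types."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    b, a = np.polyfit(x, y, 1)                   # slope, intercept
    r_vertical = y - (a + b * x)                 # vertical offsets
    r_perp = r_vertical / np.sqrt(1 + b ** 2)    # perpendicular offsets
    return a, b, r_vertical, r_perp
```

Note that a true perpendicular-offset (total least squares) fit would change a and b as well; here the perpendicular residuals are only evaluated for the vertical-offset fit.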

37 Corner Detection PDL, ShortStraw, Firefox's QuickGestures, etc. $G = s_1, \ldots, s_n, \ldots, s_N$ with $s_n \in \{L, R, U, D\}$ On-line Gesture Recognition 36
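
The chain-code idea behind G = s_1 … s_N can be sketched by quantizing every segment of the stroke into its dominant direction and collapsing repeats. This is an illustrative sketch, not the PDL, ShortStraw, or QuickGestures implementation.

```python
def chain_code(points):
    """Encode a stroke (list of (x, y) tuples) as directions in {L, R, U, D}."""
    code = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            code.append('R' if dx >= 0 else 'L')
        else:
            code.append('D' if dy >= 0 else 'U')   # screen coords: y grows downward
    # collapse consecutive repeats, e.g. R R R U U -> R U
    return [c for i, c in enumerate(code) if i == 0 or c != code[i - 1]]
```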

38 Graffiti & Unistrokes Comparison by Castellucci and MacKenzie (2008) On-line Gesture Recognition 37

39 Graffiti & Unistrokes On-line Gesture Recognition 38

40 Rubine recognizer by Rubine (1991) On-line Gesture Recognition 39

41 Rubine recognizer Linear classifier using N = 13 stroke features: $f(\mathbf{g}) = w_0 + \mathbf{w}^\top \mathbf{g}$, with $\mathbf{w} = \Sigma^{-1}\boldsymbol{\mu}$ and $w_0 = -\frac{1}{2}\sum_{i=1}^{N} w_i \mu_i$. Weight estimation: perceptron, LSBF, LDA, SVM, logistic regression, etc. On-line Gesture Recognition 40
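
A Rubine-style linear classifier can be trained by estimating a mean feature vector per class and a covariance matrix shared across classes, then deriving the weights and bias shown above. The sketch below is a simplified illustration, not Rubine's original estimator; it assumes features_by_class maps each class label to an array of per-example feature vectors.

```python
import numpy as np

def train_rubine_style(features_by_class):
    """Weights for a linear classifier: w_c = inv(Sigma) @ mu_c and
    w_c0 = -0.5 * w_c @ mu_c, with Sigma pooled over all classes."""
    mus = {c: np.mean(np.asarray(F, float), axis=0)
           for c, F in features_by_class.items()}
    centered = np.vstack([np.asarray(F, float) - mus[c]
                          for c, F in features_by_class.items()])
    inv_sigma = np.linalg.pinv(np.cov(centered, rowvar=False))  # pseudo-inverse for stability
    return {c: (inv_sigma @ mu, -0.5 * mu @ inv_sigma @ mu) for c, mu in mus.items()}

def classify(weights, g):
    """Pick the class with the highest linear score w_c . g + w_c0."""
    g = np.asarray(g, float)
    return max(weights, key=lambda c: weights[c][0] @ g + weights[c][1])
```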

42 Shapewriting SHARK 2, by Kristensson and Zhai (2004) On-line Gesture Recognition 41

43 Shapewriting [figure: sokgraphs of "hello" and "there" traced over a QWERTY keyboard layout] $\hat{W} = \arg\max_W P(W \mid g) = \arg\max_W \frac{P(g \mid W)\, P(W)}{P(g)} = \arg\max_W P(g \mid W)\, P(W)$ On-line Gesture Recognition 42

44 Euclidean Matching point-wise distances: $D(g,t) = \frac{1}{|g|}\sum_{i=1}^{|g|} \lVert g_i - t_i \rVert$ On-line Gesture Recognition 43
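
With both strokes resampled to the same number of points, the distance above is simply the mean of point-wise Euclidean distances. A minimal sketch, assuming strokes are lists of (x, y) tuples:

```python
import math

def euclidean_match(g, t):
    """Average point-wise distance between two equally resampled strokes."""
    assert len(g) == len(t), "resample both strokes to the same length first"
    return sum(math.dist(p, q) for p, q in zip(g, t)) / len(g)
```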

45 Elastic Matching dynamic programming: $D(g,t) = \min_{W} \frac{1}{|W|}\sum_{k=1}^{|W|} w_k$ On-line Gesture Recognition 44

46 Elastic Matching Warping path $W = w_1, \ldots, w_k, \ldots, w_K$ over the $g \times t$ cost matrix; $\gamma(i,j) = d(i,j) + \min\{\gamma(i-1,j),\, \gamma(i,j-1),\, \gamma(i-1,j-1)\}$ On-line Gesture Recognition 45
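
The recurrence above translates directly into a dynamic-programming table. A short, unoptimized sketch (no windowing or slope constraints), assuming strokes are lists of (x, y) tuples:

```python
import math

def dtw(g, t):
    """DTW cost: gamma(i,j) = d(i,j) + min(gamma(i-1,j), gamma(i,j-1), gamma(i-1,j-1))."""
    inf = float("inf")
    gamma = [[inf] * (len(t) + 1) for _ in range(len(g) + 1)]
    gamma[0][0] = 0.0
    for i in range(1, len(g) + 1):
        for j in range(1, len(t) + 1):
            d = math.dist(g[i - 1], t[j - 1])
            gamma[i][j] = d + min(gamma[i - 1][j], gamma[i][j - 1], gamma[i - 1][j - 1])
    return gamma[len(g)][len(t)]
```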

47 The Dollar Family: $1 recognizer by Wobbrock et al. (2007) On-line Gesture Recognition 46

48 The Dollar Family: $1 recognizer Preprocessing: resampling, rotation to the indicative angle, scaling, translation to (0,0). $D(g,t) = \min_{-\pi/4 \le \theta \le \pi/4} \frac{1}{N}\sum_{i=1}^{N} \lVert g_i - t_i(\theta) \rVert$ On-line Gesture Recognition 47
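
After preprocessing, the recognizer searches for the rotation that minimizes the average point-wise distance. The sketch below does a brute-force sweep over the ±45° range for clarity; the actual $1 recognizer uses golden-section search, so treat this only as an illustration. Both strokes are assumed to be resampled to the same N points, given as (x, y) tuples.

```python
import math

def rotate(points, theta, cx, cy):
    """Rotate (x, y) points by theta radians around (cx, cy)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(cx + (x - cx) * c - (y - cy) * s,
             cy + (x - cx) * s + (y - cy) * c) for x, y in points]

def best_angular_distance(g, t, step=math.radians(2)):
    """Minimum average point-wise distance over rotations of t in [-45, +45] degrees."""
    cx = sum(x for x, _ in t) / len(t)
    cy = sum(y for _, y in t) / len(t)
    best, theta = float("inf"), -math.pi / 4
    while theta <= math.pi / 4:
        d = sum(math.dist(p, q) for p, q in zip(g, rotate(t, theta, cx, cy))) / len(g)
        best = min(best, d)
        theta += step
    return best
```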

49 The Dollar Family: $1 recognizer On-line Gesture Recognition 48

50 The Dollar Family: $N recognizer by Anthony and Wobbrock (2010) On-line Gesture Recognition 49

51 The Dollar Family: $N recognizer $N is $1 with combinatorial overhead: $O(n\,s\,2^s)$ per template. Memory is exhausted with 20 templates (n = 32 pts) on a quad-core computer with 4 GB RAM. On-line Gesture Recognition 50

52 The Dollar Family: $P recognizer by Vatavu et al. (2012) On-line Gesture Recognition 51

53 The Dollar Family: $P recognizer Variation of the Hungarian algorithm: $D(g,t) = \min\{D(g \to t),\, D(t \to g)\}$. Hausdorff alternatives: $D(g,t) = \max_i \min_j \lVert g_i - t_j \rVert$ and $D(g,t) = \frac{1}{N}\sum_{i=1}^{N} \min_j \lVert g_i - t_j \rVert$ On-line Gesture Recognition 52
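
The point-cloud distances above can be sketched with a greedy nearest-neighbour assignment, a cheap stand-in for the full $P matching (which also weights points by match order and restarts from several points), plus a modified-Hausdorff variant. Illustrative only; the greedy matcher assumes both clouds were resampled to the same number of points.

```python
import math

def directed_cloud_distance(a, b):
    """Greedy one-way matching cost from cloud a to cloud b (same size assumed)."""
    used = [False] * len(b)
    total = 0.0
    for p in a:
        j = min((j for j in range(len(b)) if not used[j]),
                key=lambda j: math.dist(p, b[j]))
        used[j] = True
        total += math.dist(p, b[j])
    return total

def cloud_distance(g, t):
    return min(directed_cloud_distance(g, t), directed_cloud_distance(t, g))

def modified_hausdorff(g, t):
    """Mean of nearest-neighbour distances from g to t."""
    return sum(min(math.dist(p, q) for q in t) for p in g) / len(g)
```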

54 The Dollar Family: $P recognizer On-line Gesture Recognition 53

55 The Dollar Family: Protractor by Li (2010) On-line Gesture Recognition 54

56 The Dollar Family: Protractor Closed-form solution, minimum angular distance: $D(g,t) = \dfrac{1}{\arccos(a \cos\hat{\theta} + b \sin\hat{\theta})}$, $\hat{\theta} = \arctan\frac{b}{a}$, $a = \sum_{i=1}^{N} (x_{g_i} x_{t_i} + y_{g_i} y_{t_i})$, $b = \sum_{i=1}^{N} (x_{g_i} y_{t_i} - y_{g_i} x_{t_i})$ On-line Gesture Recognition 55
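
The closed-form solution needs only two sums over the preprocessed gesture vectors (resampled, translated to the origin, and normalised to unit length, as Protractor assumes). A minimal sketch; it returns the angular distance, of which the slide's D(g,t) is the reciprocal.

```python
import math

def protractor_distance(g, t):
    """Optimal angular distance between two preprocessed gestures given as (x, y) tuples."""
    a = sum(xg * xt + yg * yt for (xg, yg), (xt, yt) in zip(g, t))
    b = sum(xg * yt - yg * xt for (xg, yg), (xt, yt) in zip(g, t))
    theta = math.atan2(b, a)                 # optimal rotation angle
    cos_sim = a * math.cos(theta) + b * math.sin(theta)
    cos_sim = max(-1.0, min(1.0, cos_sim))   # clamp; valid for unit-length vectors
    return math.acos(cos_sim)                # smaller is better; score = 1 / distance
```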

57 The Dollar Family: Protractor On-line Gesture Recognition 56

58 MinGestures for MIUIs Disambiguate gestures from handwritten text. Gesture labels with example action → result: Substitute (Lorem Ipsum → Lorem Ipsan), Reject (Lorem Ipsum → Lorem...), Merge (Lorem Ipsum → LoremIpsum), Undo (Lorem → Lorem Ipsum), Insert (Lorem Ipsum → Lorem et Ipsum), Split (Lorem → Lor em), Validate (Lorem Ipsum → Lorem Ipsum), Delete (Lorem Ipsum → Lorem), Redo (Lorem Ipsum → Lorem), Help (Lorem Ipsum → <help event>). by Leiva et al. (2014) On-line Gesture Recognition 57

59 MinGestures for MIUIs Three disambiguating features: $\bar{x} = \sum_{i=2}^{N} \max(x_{i-1} - x_i,\, 0)$, $\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} (x_i - y_i)^2}$, $\varphi = \frac{\max(x) - \min(x)}{\max(y) - \min(y)}$. Classification rule: $\theta = \hat{y} - b\bar{x} \pm \epsilon$ On-line Gesture Recognition 58

60 MinGestures for MIUIs [table: error rates in %, for E-pen and Mouse input on training and test sets, comparing the $1 recognizer, Marking Menus, Modified $, Rubine, and MinGestures] On-line Gesture Recognition 59

61 Recap On-line Gesture Recognition 60

62 Takeaways Gestures shortcut tedious commands Many recognition techniques for many input devices Trade-offs: design, setup, performance, accuracy Gestures should be fast and simple: 1. For humans to perform and recall 2. For computers to recognize On-line Gesture Recognition 61

63 Open Problems 1. Integration: free-form gestures in context 2. Error analysis and recovery: When should the recognizer ask the user? How much to ask? 3. Segmentation: automatic gesture parts identification 4. Generation: grammar-based, kinematic theory, VAEs, etc. On-line Gesture Recognition 62

64 On-line Gesture Recognition 63

65 Bibliography I. E. Sutherland. Sketchpad: A man-machine graphical communication system. Tech. Report 296, Lincoln Laboratory, MIT, 1963. M. Davis and T. Ellis. The RAND tablet: A man-machine graphical communication device. In Proc. AFIPS, 1964. D. H. Rubine. The Automatic Recognition of Gestures. PhD thesis, Carnegie Mellon University, 1991. S. Zhai, P. O. Kristensson, C. Appert, T. H. Andersen, and X. Cao. Foundational issues in touch-surface stroke gesture design: an integrative review. Foundations and Trends in Human-Computer Interaction, 5(2). C. C. Tappert, C. Y. Suen, and T. Wakahara. The state of the art in on-line handwriting recognition. IEEE T. PAMI, 12(8). R. Plamondon and S. N. Srihari. On-line and off-line handwriting recognition: a comprehensive survey. IEEE T. PAMI, 22(1). J. S. Lipscomb. A trainable gesture recognizer. Patt. Recogn., 24(9). On-line Gesture Recognition 64

66 G. P. Kurtenbach. The design and evaluation of marking menus. PhD thesis, University of Toronto. O. Bau and W. E. Mackay. OctoPocus: A dynamic guide for learning gesture-based command sets. In Proc. UIST, 2008. S. J. Castellucci and I. S. MacKenzie. Graffiti vs. Unistrokes: An empirical comparison. In Proc. CHI, 2008. P. O. Kristensson and S. Zhai. SHARK²: A large vocabulary shorthand writing system for pen-based computers. In Proc. UIST, 2004. L. A. Leiva, V. Alabau, V. Romero, A. H. Toselli, and E. Vidal. Context-aware gestures for mixed-initiative text editing UIs. Interact. Comput., 27(1). J. O. Wobbrock, A. D. Wilson, and Y. Li. Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. In Proc. UIST, 2007. Y. Li. Protractor: A fast and accurate gesture recognizer. In Proc. CHI, 2010. L. Anthony and J. O. Wobbrock. A lightweight multistroke recognizer for user interface prototypes. In Proc. GI, 2010. R.-D. Vatavu, L. Anthony, and J. O. Wobbrock. Gestures as point clouds: A $P recognizer for user interface prototypes. In Proc. ICMI, 2012. On-line Gesture Recognition 65

67 F. Tian, F. Lu, Y. Jiang, X. Zhang, X. Cao, G. Dai, and H. Wang. An exploration of pen tail gestures for interactions. Int. J. Human-Comput. Stud., 71, 2012. On-line Gesture Recognition 66

68 Videography Sketchpad. RAND tablet. Minority Report. Nintendo Wii. T(ether). MS Kinect. Humantenna. Skinput. Myo. Leap Motion. On-line Gesture Recognition 67

69 TapTap. Intugine. Wacom gestures. OctoPocus. Marking Menus. Shapewriting. $1 recognizer. $N recognizer. $P recognizer. On-line Gesture Recognition 68
