On-line Gesture Recognition


Luis A. Leiva, PRHLT Research Center, Departamento de Sistemas Informáticos y Computación, Universitat Politècnica de València

Presentation Outline: Introduction, Preliminaries, Some Techniques, Recap, References. Slides available at

Introduction

Definition. Gesture /ˈdʒɛstʃə/ noun: the use of motions of the limbs or body as a means of expression. Synonyms: signal, sign, motion, indication, gesticulation. "Gestures are hand-drawn strokes that do things." (Lipscomb, 1991)

Definition. Off-line gesture recognition: post hoc, processed after user interaction; static data, no temporal information available. On-line gesture recognition: real-time, direct manipulation; sequential, time-series data.

Historical Precedents. Sketchpad, by Sutherland (1963); the RAND tablet, by Davis and Ellis (1964).

Gestures Today. Minority Report. Image by 20th Century Fox & DreamWorks.

Input Devices. Wii. Image by Nintendo Co., Ltd.

Input Devices. T(ether). Image by Massachusetts Institute of Technology.

Input Devices. Kinect for Xbox 360. Image by Microsoft Corporation.

Input Devices. Humantenna. Image by Microsoft Research.

Input Devices. Skinput. Image by Carnegie Mellon University.

Input Devices. Myo. Image by Thalmic Labs.

Input Devices. Leap Motion. Image by Leap Motion, Inc.

Input Devices. Air Clicker. Image by Yanko Design.

Input Devices. TapTap. Image by Woodenshark LLC.

Input Devices. Pen Tail gestures, by Tian et al. (2012).

Advantages: natural communication, expressiveness, ergonomics, usability, fun.

Disadvantages. May break fundamental interaction principles (discoverability, reliability, scalability, etc.). Ambiguity: non-deterministic decoding. Lack of standards. Cultural issues.

Trade-offs: accuracy, design, setup, performance.

Preliminaries

Interaction Paradigms: mid-air vs. on-screen.

Definition. A stroke is the pointer-event sequence pointer down, pointer move, pointer up, recorded as time-stamped points: s = {(x_1, y_1, t_1), ..., (x_j, y_j, t_j), ..., (x_N, y_N, t_N)} (see the capture sketch after the taxonomy slides).

A Taxonomy: zero-order gestures.

A Taxonomy: first-order gestures (unistrokes).

A Taxonomy: higher-order gestures (multistrokes).
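
Picking up the stroke definition above, a minimal sketch (my own illustration, not code from the slides) of capturing a stroke as the time-stamped point sequence s = {(x_1, y_1, t_1), ..., (x_N, y_N, t_N)}; the callback names are assumed generic pointer events.

    # Minimal stroke capture: collect time-stamped points between pointer down and up.
    from dataclasses import dataclass, field

    @dataclass
    class Point:
        x: float
        y: float
        t: float  # timestamp, e.g. in milliseconds

    @dataclass
    class Stroke:
        points: list = field(default_factory=list)
        active: bool = False

        def pointer_down(self, x, y, t):
            self.active = True
            self.points = [Point(x, y, t)]

        def pointer_move(self, x, y, t):
            if self.active:
                self.points.append(Point(x, y, t))

        def pointer_up(self, x, y, t):
            if self.active:
                self.points.append(Point(x, y, t))
                self.active = False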

Processing Pipeline: input gesture, capture, preprocessing, feature extraction, recognition, output command, plus classifier selection and feature selection.

Capture: event-based vs. polling (constant frequency). [Figure: the same stroke sampled at low vs. high frequency.] Sampling rate matters!

Preprocessing: input capture, segmentation, noise removal, resampling, normalization (some steps are optional). A resampling sketch follows below.

Features.
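
A minimal sketch of the resampling step mentioned above, in the spirit of the preprocessing used by the $-family recognizers; the number of points n = 64 is an assumed default, not a value from the slides.

    # Resample a stroke [(x, y), ...] to n points spaced evenly along its path.
    import math

    def path_length(points):
        return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

    def resample(points, n=64):
        interval = path_length(points) / (n - 1)
        pts = list(points)
        new_points = [pts[0]]
        d_accum = 0.0
        i = 1
        while i < len(pts):
            d = math.dist(pts[i - 1], pts[i])
            if d > 0 and d_accum + d >= interval:
                t = (interval - d_accum) / d
                q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                     pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
                new_points.append(q)
                pts.insert(i, q)  # the interpolated point starts the next segment
                d_accum = 0.0
            else:
                d_accum += d
            i += 1
        while len(new_points) < n:  # guard against floating-point rounding at the end
            new_points.append(pts[-1])
        return new_points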

Recognition Techniques. Hashing: dictionary lookup, chain codes. Parametric: linear fitting, corner detection. Matching: DTW, k-NN, the dollar family. Statistical: linear classifiers, NNs, HMMs, CRFs. Ad hoc: knowledge-based, decision trees, FSMs.

Continuous Recognition. OctoPocus, by Bau and Mackay (2008).

Some Techniques

Marking Menus, by Kurtenbach (1991).

Marking Menus. Blender. Image by Blender Foundation.

Linear Fitting. Fit ŷ = a + b·x by minimizing R² = Σ_{i=1}^{N} r_i². With vertical offsets, r_i = y_i − (a + b·x_i); with perpendicular offsets, r_i = (y_i − (a + b·x_i)) / √(1 + b²).
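
A minimal sketch (my own illustration) of the vertical-offset case above: the closed-form ordinary least-squares fit of y ≈ a + b·x and its residuals.

    # Ordinary least squares: minimize the sum of squared vertical offsets.
    def fit_line(xs, ys):
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        sxx = sum((x - mean_x) ** 2 for x in xs)
        b = sxy / sxx              # slope
        a = mean_y - b * mean_x    # intercept
        residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
        return a, b, residuals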

Corner Detection. PDL, ShortStraw, Firefox's QuickGestures, etc. The stroke is encoded as a direction string G = s_1, ..., s_n, ..., s_N with s_n ∈ {L, R, U, D} (a chain-code sketch follows below).

Graffiti & Unistrokes. Comparison by Castellucci and MacKenzie (2008).
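
A minimal sketch of the direction-string encoding above; quantizing each segment to its dominant axis is one simple choice among several, not necessarily the one used by the recognizers named on the slide.

    # Turn a stroke [(x, y), ...] into a 4-direction chain code over {L, R, U, D}.
    def chain_code(points):
        code = []
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            dx, dy = x1 - x0, y1 - y0
            if abs(dx) >= abs(dy):
                code.append('R' if dx >= 0 else 'L')
            else:
                code.append('D' if dy >= 0 else 'U')  # screen coordinates: y grows downwards
        return ''.join(code)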

Graffiti & Unistrokes. Demo: mousegesture/gesturedemo.swf

Rubine Recognizer, by Rubine (1991).

Rubine Recognizer. A linear classifier over F = 13 stroke features: c = f(wᵀg) = w_0 + Σ_{i=1}^{F} w_i·f_i, with bias w_0 = −(1/2) Σ_{i=1}^{F} w_i·µ_i, where µ_i is the mean value of feature i. Weight estimation: perceptron, LSBF, LDA, SVM, logistic regression, etc. (A scoring sketch appears below.)

Shapewriting. SHARK², by Kristensson and Zhai (2004).
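
Returning to the Rubine-style classifier above, a minimal sketch of the scoring step only; the training that produces the per-class weights (perceptron, LDA, etc.) is assumed to have happened elsewhere.

    # Evaluate a linear classifier: each class keeps weights [w_0, w_1, ..., w_F].
    def classify(features, weights):
        """features: [f_1, ..., f_F]; weights: {class_name: [w_0, w_1, ..., w_F]}."""
        def score(w):
            return w[0] + sum(wi * fi for wi, fi in zip(w[1:], features))
        return max(weights, key=lambda c: score(weights[c]))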

Shapewriting. [Figure: sokgraphs of "hello" and "there" traced over a QWERTY keyboard.] Decoding follows Bayes' rule: Ŵ = arg max_W P(W | g) = arg max_W P(g | W)·P(W) / P(g) = arg max_W P(g | W)·P(W).

Euclidean Matching. Point-wise distances between gesture g and template t: D(g, t) = (1/|g|) Σ_{i=1}^{|g|} ‖g_i − t_i‖.
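
A minimal sketch of the Euclidean (point-wise) matching just defined; it assumes both gestures have already been resampled to the same number of points.

    # Mean point-to-point Euclidean distance between two equal-length gestures.
    import math

    def euclidean_distance(g, t):
        assert len(g) == len(t), "resample both gestures to the same length first"
        return sum(math.dist(p, q) for p, q in zip(g, t)) / len(g)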

Elastic Matching. Dynamic programming over a warping path: D(g, t) = min_W (1/|W|) Σ_{k=1}^{|W|} w_k.

Elastic Matching. The warping path W = w_1, ..., w_k, ..., w_K aligns points g_i with points t_j; the cumulative cost follows the recurrence γ(i, j) = d(i, j) + min{γ(i−1, j), γ(i, j−1), γ(i−1, j−1)}.
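
A minimal sketch of the DTW recurrence above, returning the cumulative cost γ(N, M); dividing by the path length, as in the slide's D(g, t), is left out for brevity.

    # Dynamic time warping between two gestures given as lists of (x, y) points.
    import math

    def dtw(g, t):
        n, m = len(g), len(t)
        INF = float('inf')
        gamma = [[INF] * (m + 1) for _ in range(n + 1)]
        gamma[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = math.dist(g[i - 1], t[j - 1])  # local cost d(i, j)
                gamma[i][j] = d + min(gamma[i - 1][j], gamma[i][j - 1], gamma[i - 1][j - 1])
        return gamma[n][m]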

The Dollar Family: $1 Recognizer, by Wobbrock et al. (2007).

The Dollar Family: $1 Recognizer. Preprocessing: resampling, rotation to the indicative angle, scaling, translation to (0,0). Matching: D(g, t) = min_{−π/4 ≤ θ ≤ π/4} (1/N) Σ_{i=1}^{N} ‖g_i − t_i(θ)‖.
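
A minimal sketch of the matching step above: search the rotation angle θ ∈ [−π/4, π/4] that minimizes the mean point-to-point distance. The original $1 uses a golden-section search and rotates the candidate rather than the template; a coarse grid over θ is used here only to keep the sketch short.

    import math

    def rotate(points, theta, cx, cy):
        cos_t, sin_t = math.cos(theta), math.sin(theta)
        return [(cx + (x - cx) * cos_t - (y - cy) * sin_t,
                 cy + (x - cx) * sin_t + (y - cy) * cos_t) for x, y in points]

    def distance_at_best_angle(g, t, steps=45):
        # Rotate t about its centroid and keep the smallest mean distance to g.
        cx = sum(x for x, _ in t) / len(t)
        cy = sum(y for _, y in t) / len(t)
        best = float('inf')
        for k in range(steps + 1):
            theta = -math.pi / 4 + k * (math.pi / 2) / steps
            d = sum(math.dist(p, q) for p, q in zip(g, rotate(t, theta, cx, cy))) / len(g)
            best = min(best, d)
        return best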

The Dollar Family: $1 Recognizer.

The Dollar Family: $N Recognizer, by Anthony and Wobbrock (2010).

The Dollar Family: $N Recognizer. $N = $1 with combinatorial overhead: O(n·s·2^s) per template with s strokes of n points. Memory was exhausted with 20 templates (n = 32 points) on a quad-core computer with 4 GB of RAM.

The Dollar Family: $P Recognizer, by Vatavu et al. (2012).

The Dollar Family: $P Recognizer. Point-cloud matching via a variation of the Hungarian algorithm: D(g, t) = min{ d(g → t), d(t → g) }. Hausdorff alternatives: D(g, t) = max_i min_j ‖g_i − t_j‖ and D(g, t) = (1/N) Σ_{i=1}^{N} min_j ‖g_i − t_j‖.

The Dollar Family: $P Recognizer.
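
A minimal sketch of the two Hausdorff-style alternatives listed above, treating gestures as unordered point clouds; this is not the $P matching itself, only the simpler baselines named on the slide.

    # Directed and modified (mean) Hausdorff distances between point clouds.
    import math

    def directed_hausdorff(g, t):
        return max(min(math.dist(p, q) for q in t) for p in g)

    def modified_hausdorff(g, t):
        return sum(min(math.dist(p, q) for q in t) for p in g) / len(g)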

The Dollar Family: Protractor, by Li (2010).

The Dollar Family: Protractor. Closed-form solution, minimum angular distance: D(g, t) = 1 / arccos(a·cos θ̂ + b·sin θ̂), with θ̂ = arctan(b/a), a = Σ_{i=1}^{N} (x_{g_i}·x_{t_i} + y_{g_i}·y_{t_i}), and b = Σ_{i=1}^{N} (x_{g_i}·y_{t_i} − y_{g_i}·x_{t_i}).
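
A minimal sketch of the closed-form match above between two equal-length gestures; the vectors are normalized to unit length inside the function, while Protractor's remaining preprocessing (resampling, translation, optional rotation invariance) is assumed to have been done beforehand.

    import math

    def _unit(points):
        mag = math.sqrt(sum(x * x + y * y for x, y in points)) or 1.0
        return [(x / mag, y / mag) for x, y in points]

    def protractor_score(g, t):
        g, t = _unit(g), _unit(t)
        a = sum(xg * xt + yg * yt for (xg, yg), (xt, yt) in zip(g, t))
        b = sum(xg * yt - yg * xt for (xg, yg), (xt, yt) in zip(g, t))
        theta = math.atan2(b, a)                           # optimal rotation angle
        cos_sim = max(-1.0, min(1.0, a * math.cos(theta) + b * math.sin(theta)))
        dist = math.acos(cos_sim)                          # minimum angular distance
        return 1.0 / dist if dist > 0 else float('inf')    # higher score = better match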

The Dollar Family: Protractor.

MinGestures for MIUIs, by Leiva et al. (2014). Disambiguate gestures from text with high accuracy and performance. Gesture set (action → result):
Substitute: Lorem Ipsum → Lorem Ipsan
Split: Lorem → Lor em
Reject: Lorem Ipsum → Lorem...
Validate: Lorem Ipsum → Lorem Ipsum
Merge: Lorem Ipsum → LoremIpsum
Delete: Lorem Ipsum → Lorem
Undo: Lorem → Lorem Ipsum
Redo: Lorem Ipsum → Lorem
Insert: Lorem Ipsum → Lorem et Ipsum
Help: Lorem Ipsum → <help event>
Demo:

MinGestures for MIUIs. Disambiguating features: RMSE = √((1/N) Σ_{i=1}^{N} (x_i − y_i)²); the accumulated backward horizontal movement Σ_{i=2}^{N} max(x_{i−1} − x_i, 0); the aspect ratio ϕ = (max(x) − min(x)) / (max(y) − min(y)). Classification rule: θ = ŷ − b·x ± ε.

MinGestures for MIUIs. [Table: error rates in % for the $1 recognizer, Marking Menus, a modified $-family recognizer, Rubine, and MinGestures, with e-pen and mouse input, on training and test sets.]
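
A heavily hedged sketch of two of the disambiguating features listed above, as I read the reconstructed formulas (the aspect ratio ϕ and the accumulated backward horizontal movement); this is my own illustration, not the authors' implementation.

    def aspect_ratio(points):
        xs = [x for x, _ in points]
        ys = [y for _, y in points]
        return (max(xs) - min(xs)) / ((max(ys) - min(ys)) or 1.0)  # guard against zero height

    def backward_x(points):
        xs = [x for x, _ in points]
        return sum(max(xs[i - 1] - xs[i], 0.0) for i in range(1, len(xs)))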

Recap

Takeaways. Gestures aim to improve human-computer interaction. Many recognition techniques exist, suited to different input devices. Recognition trade-offs: design, setup, performance, accuracy. Gestures should be simple: for humans to perform and recall, and for computers to recognize.

TFM Proposals
1. Integration: free-form gestures in context.
2. Error analysis & recovery: when should the recognizer ask the user, and how much should it ask?
3. Segmentation: automatic identification of gesture parts.
4. Generation: grammar-based, kinematic theory, etc.

Bibliography

L. Anthony and J. O. Wobbrock. A lightweight multistroke recognizer for user interface prototypes. In Proc. GI, 2010.
O. Bau and W. E. Mackay. OctoPocus: A dynamic guide for learning gesture-based command sets. In Proc. UIST, 2008.
S. J. Castellucci and I. S. MacKenzie. Graffiti vs. Unistrokes: An empirical comparison. In Proc. CHI, 2008.
M. Davis and T. Ellis. The RAND tablet: A man-machine graphical communication device. In Proc. AFIPS, 1964.
P. O. Kristensson and S. Zhai. SHARK²: A large vocabulary shorthand writing system for pen-based computers. In Proc. UIST, 2004.
G. P. Kurtenbach. The design and evaluation of marking menus. PhD thesis, University of Toronto, 1991.
L. A. Leiva, V. Alabau, V. Romero, A. H. Toselli, and E. Vidal. Context-aware gestures for mixed-initiative text editing UIs. Interacting with Computers. To appear.
Y. Li. Protractor: A fast and accurate gesture recognizer. In Proc. CHI, 2010.
J. S. Lipscomb. A trainable gesture recognizer. Pattern Recognition, 24(9), 1991.
R. Plamondon and S. N. Srihari. On-line and off-line handwriting recognition: A comprehensive survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(1), 2000.
D. H. Rubine. The Automatic Recognition of Gestures. PhD thesis, Carnegie Mellon University, 1991.
I. E. Sutherland. Sketchpad: A man-machine graphical communication system. Tech. Report 296, Lincoln Laboratory, MIT, 1963.
C. C. Tappert, C. Y. Suen, and T. Wakahara. The state of the art in on-line handwriting recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(8), 1990.
F. Tian, F. Lu, Y. Jiang, X. Zhang, X. Cao, G. Dai, and H. Wang. An exploration of pen tail gestures for interactions. Int. J. Human-Computer Studies, 71, 2012.
R.-D. Vatavu, L. Anthony, and J. O. Wobbrock. Gestures as point clouds: A $P recognizer for user interface prototypes. In Proc. ICMI, 2012.
J. O. Wobbrock, A. D. Wilson, and Y. Li. Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. In Proc. UIST, 2007.
S. Zhai, P. O. Kristensson, C. Appert, T. H. Andersen, and X. Cao. Foundational issues in touch-surface stroke gesture design: An integrative review. Foundations and Trends in Human-Computer Interaction, 5(2), 2012.

Videography. $1 gesture recognition. Humantenna. Leap Motion. Marking Menus. Myo. OctoPocus. Pen Tail. RAND tablet. Shapewriting. Sutherland's Sketchpad. Skinput. Wacom gestures.
