3D Data Navigation via Natural User Interfaces


3D Data Navigation via Natural User Interfaces. Francisco R. Ortega, PhD Candidate and GAANN Fellow. Co-Advisors: Dr. Rishe and Dr. Barreto. Committee Members: Dr. Raju, Dr. Clarke, and Dr. Zeng. GAANN Fellowship Supervisor: Dr. Milani.

"The most profound technologies are those that disappear." Mark Weiser, "The Computer for the 21st Century" (1999), father of ubiquitous computing.

WIMP (Windows, Icons, Menus, Pointers) Paradigm. The most successful input paradigm.

Bill Buxton's Three-State Model: Stylus Example

Bill Buxton's Three-State Model: Mouse Example
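As a minimal illustration (not from the talk), Buxton's three states can be sketched as a small state machine: state 0 is out of range, state 1 is tracking, and state 2 is dragging. A mouse only moves between states 1 and 2, while a stylus uses all three; the names below are ours.

```cpp
// Buxton's three-state model of graphical input, sketched as a tiny FSM.
// State 0 = out of range, State 1 = tracking, State 2 = dragging.
enum class InputState { OutOfRange, Tracking, Dragging };

struct ThreeStateModel {
    InputState state = InputState::OutOfRange;

    void onEnterRange() { if (state == InputState::OutOfRange) state = InputState::Tracking; }
    void onLeaveRange() { state = InputState::OutOfRange; }  // stylus only; a mouse never leaves range
    void onButtonDown() { if (state == InputState::Tracking) state = InputState::Dragging; }
    void onButtonUp()   { if (state == InputState::Dragging) state = InputState::Tracking; }
};

int main() {
    ThreeStateModel stylus;
    stylus.onEnterRange();  // stylus enters hover range: 0 -> 1
    stylus.onButtonDown();  // tip touches the surface: 1 -> 2
    stylus.onButtonUp();    // tip lifts: 2 -> 1
    stylus.onLeaveRange();  // stylus leaves sensing range: 1 -> 0
    return stylus.state == InputState::OutOfRange ? 0 : 1;
}
```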

Fitts's Law
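Not on the slide, but for reference: the Shannon formulation of Fitts's law commonly used in HCI predicts movement time as MT = a + b log2(D/W + 1), where D is the distance to the target, W is the target width, and a and b are empirically fitted constants.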

Post-WIMP. The set of components that will be required for correct interaction in this new paradigm: multi-touch, vision tracking, speech recognition, pen-based interaction, virtual devices, etc.

NUI and Other Devices

Software: Visual Studio 2010, C++, Ogre3D, OpenGL, Windows 7, and additional libraries.

Motivation: Find methods that allow users to interact with technology seamlessly. Add building blocks to Post-WIMP. Contribute an easily adaptable framework to software without full gesture support. Help domain-specific problems via multi-touch (MT), e.g., MRI in the neural sciences.

Example: 3D-2D-3D

Domain-Specific Example: MRI, with 3D views and MRI data slices.

BACKGROUND: Understanding Touch, Multi-Touch Interaction, Gesture Recognition, Virtual Devices, Design Guidelines, Current Open Systems / Frameworks.

Understanding Touch Interactions. Oblique finger orientation without sensors. Rotations and translations: combined or separate? One or two hands? One hand is better for simpler tasks (rotations); two hands are better for separable tasks (anchor gestures).

Multi-Touch Techniques (Hancock et al., 2007): one-, two-, and three-touch algorithms; a generic two-touch sketch follows below.
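The transcript does not give Hancock et al.'s algorithm itself, so the following is only a generic two-touch rotate-scale-translate (RST) computation of the kind such techniques build on; all names are hypothetical.

```cpp
// Generic two-touch RST: rotation and scale come from the change in the
// vector between the two contacts; translation from the midpoint delta.
#include <cmath>

struct Point { float x, y; };
struct RST { float angle; float scale; Point translation; };

// p1/p2: contacts in the previous frame; q1/q2: the same contacts now.
RST twoTouchRST(Point p1, Point p2, Point q1, Point q2) {
    float pdx = p2.x - p1.x, pdy = p2.y - p1.y;
    float qdx = q2.x - q1.x, qdy = q2.y - q1.y;
    RST out;
    out.angle = std::atan2(qdy, qdx) - std::atan2(pdy, pdx);  // relative rotation
    out.scale = std::hypot(qdx, qdy) / std::hypot(pdx, pdy);  // pinch zoom factor
    out.translation = { (q1.x + q2.x - p1.x - p2.x) * 0.5f,   // midpoint delta
                        (q1.y + q2.y - p1.y - p2.y) * 0.5f };
    return out;
}

int main() {
    // Second finger sweeps from (1,0) to (0,1): a 90-degree rotation.
    RST r = twoTouchRST({0, 0}, {1, 0}, {0, 0}, {0, 1});
    return r.angle > 0 ? 0 : 1;
}
```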

Gesture Recognition Methods: Hidden Markov Models (Sezgin and Davis, 2005), neural networks (Pittman, 1991), feature-based classifiers (Rubine, 1991), template matching (Kara and Stahovich, 2005), geometric recognizers (Wobbrock et al., 2007), etc.

Gesture Recognizer Example: the $1 and $N algorithms are examples of a combination of geometric recognizers and template matching.

The $1 Algorithm (Wobbrock et al., 2007): Resample the points in the path (N = 64). Find the indicative angle between the centroid and the gesture's first point. Rotate the gesture so the indicative angle is 0. Scale the gesture to a reference square. Compute the distance between the candidate gesture and each stored template.
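A condensed C++ sketch of the rotation and matching steps, assuming paths already resampled to 64 points; resampling, scaling, and the golden-section search over candidate angles are omitted, and the names are ours rather than the paper's.

```cpp
// Core $1-style preprocessing and matching: rotate to the indicative angle,
// then take the average point-to-point distance to each stored template.
#include <cmath>
#include <vector>

struct Pt { double x, y; };
using Path = std::vector<Pt>;  // assumed already resampled to N = 64 points

Pt centroid(const Path& p) {
    Pt c{0, 0};
    for (const Pt& q : p) { c.x += q.x; c.y += q.y; }
    c.x /= p.size(); c.y /= p.size();
    return c;
}

// Rotate the path so the angle from the centroid to the first point is zero.
Path rotateToZero(const Path& p) {
    Pt c = centroid(p);
    double t = -std::atan2(p[0].y - c.y, p[0].x - c.x);  // indicative angle, negated
    Path out;
    for (const Pt& q : p)
        out.push_back({ c.x + (q.x - c.x) * std::cos(t) - (q.y - c.y) * std::sin(t),
                        c.y + (q.x - c.x) * std::sin(t) + (q.y - c.y) * std::cos(t) });
    return out;
}

// Average pairwise distance: the candidate matches the closest template.
double pathDistance(const Path& a, const Path& b) {
    double d = 0;
    for (size_t i = 0; i < a.size(); ++i)
        d += std::hypot(a[i].x - b[i].x, a[i].y - b[i].y);
    return d / a.size();
}

int main() {
    Path candidate = {{0, 0}, {1, 0}, {2, 0}, {3, 1}};
    Path aligned = rotateToZero(candidate);
    return pathDistance(candidate, aligned) >= 0 ? 0 : 1;
}
```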

The $N Algorithm (an extension of $1 to multistroke gestures): Anthony and Wobbrock, 2010.

Virtual Devices: a new set of devices that either emulate physical devices or find new ways of interaction that are not possible with current physical devices. Examples: virtual keyboard, virtual sphere.

Virtual Keyboard

Virtual Sphere (Chen et al., 1988): simulates a 3D trackball. X axis: left-right; Y axis: top-down; Z axis: circular fashion.
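A minimal sketch of a Virtual Sphere style mapping, assuming the common convention that left-right drags rotate about the vertical axis, up-down drags about the horizontal axis, and circular drags along the rim about Z; the function and parameter names are hypothetical, not the talk's implementation.

```cpp
// Map a touch drag to trackball-style rotations about X, Y, and Z.
struct Rotation { float rx, ry, rz; };

Rotation virtualSphere(float dx, float dy, bool onRim,
                       float circularDelta, float sensitivity) {
    Rotation r{0, 0, 0};
    if (onRim) {
        r.rz = circularDelta * sensitivity;  // circular motion spins about Z
    } else {
        r.ry = dx * sensitivity;  // left-right drag rotates about the vertical axis
        r.rx = dy * sensitivity;  // up-down drag rotates about the horizontal axis
    }
    return r;
}

int main() {
    Rotation r = virtualSphere(5.0f, 0.0f, false, 0.0f, 0.01f);
    return r.ry != 0.0f ? 0 : 1;  // a horizontal drag produced a Y rotation
}
```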

Design Guidelines I (Bowman et al., 2005): Use existing manipulation techniques unless an application will benefit greatly from creating new ones (e.g., a swipe gesture). Perform task analysis when selecting a 3D interaction technique (how precise was the rotation?). Match the interaction to the device; consider the specific device properties.

Design Guidelines II (Bowman et al., 2005): Reduce wasted motion; amplify the user's hand motion. Reduce the number of degrees of freedom (DOF) whenever possible; constrain 3D interaction when possible. There is no single best 3D manipulation technique.

Set-up / Methods / Implementation. Main display: 3M 22" M2256PW multi-touch monitor. Windows 7 MT drivers and library. Recognition and navigation algorithms.

Camera Setup

Gestures [GestureWorks]: anchor rotate, two-hand (5), divide, 4-finger rotate, 5-finger hold.

Basic Gestures

Gesture Action: Gesture Rotation

Beyond Basic

Point Clouds

How does it work? Receive the trace collection from the operating system as generic touch events (up, move, and down). Detect trace events, using a map to keep collections of traces (tap, double tap, and drag). Delegate responsibility to the Trace Manager, which is aware of the different recognizers and allows both event and polling techniques. A simplified sketch follows below.
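This trace-collection step might look roughly like the following; the type and class names are assumed for illustration, not taken from the framework.

```cpp
// Group raw OS touch events into per-finger traces keyed by touch id.
#include <map>
#include <vector>

enum class TouchEvent { Down, Move, Up };
struct TouchPoint { int id; float x, y; TouchEvent type; };
using Trace = std::vector<TouchPoint>;

class TraceCollector {
    std::map<int, Trace> traces_;  // active traces keyed by OS touch id
public:
    void onTouch(const TouchPoint& tp) {
        switch (tp.type) {
        case TouchEvent::Down: traces_[tp.id] = { tp }; break;  // start a new trace
        case TouchEvent::Move: traces_[tp.id].push_back(tp); break;
        case TouchEvent::Up:
            traces_[tp.id].push_back(tp);
            // A completed trace would be handed to the Trace Manager here.
            traces_.erase(tp.id);
            break;
        }
    }
};

int main() {
    TraceCollector tc;
    tc.onTouch({1, 10.f, 20.f, TouchEvent::Down});  // one finger: down, move, up
    tc.onTouch({1, 12.f, 22.f, TouchEvent::Move});
    tc.onTouch({1, 14.f, 24.f, TouchEvent::Up});
    return 0;
}
```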

How does it work? (cont.) The Decision Manager collects possible gestures from the recognizers and selects one using priority criteria. The Action Manager then selects the action with the correct values: the name of the gesture, a callback function (or raised event), and a transformation matrix. A sketch of the decision step follows below.
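The priority-based selection could be sketched as below; the GestureCandidate structure and decide function are our assumptions, not the framework's actual interfaces.

```cpp
// Pick the highest-priority candidate gesture reported by the recognizers.
#include <algorithm>
#include <string>
#include <vector>

struct GestureCandidate {
    std::string name;  // e.g. "anchor rotate"
    int priority;      // ranking criteria among competing recognizers
    float matrix[16];  // transformation produced by the gesture
};

const GestureCandidate* decide(const std::vector<GestureCandidate>& candidates) {
    if (candidates.empty()) return nullptr;
    return &*std::max_element(candidates.begin(), candidates.end(),
        [](const GestureCandidate& a, const GestureCandidate& b) {
            return a.priority < b.priority;
        });
}

int main() {
    std::vector<GestureCandidate> c = {
        {"drag", 1, {}}, {"anchor rotate", 3, {}}, {"zoom", 2, {}}
    };
    const GestureCandidate* best = decide(c);
    return (best && best->name == "anchor rotate") ? 0 : 1;
}
```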

Overview of the Framework. System diagram: a system-level view of our proposed framework. Class diagram: the current classes that are part of our framework. MT FSM: the multi-touch finite state machine.

System Diagram

Evaluation. Pilot study (3-5 subjects) to correct bias produced by the instruments, the experimental procedures, and environmental factors.

Main Experiment (30 subjects). Repeated-measures design comparing our approach to an existing approach. Will measure: execution time, the gesture actions needed to accomplish a task (e.g., the user navigates within a 3D maze); accuracy of movement, the steps taken to finish and the path taken to accomplish the task; and a Likert scale measuring perceived usability.

Preliminary Results

Preliminary Results. Support for other devices for comparison: Wiimote, 3D mouse, gamepad.

Questions? For references, please email me at forte007@fiu.edu. Thank you.