Touch & Gesture. HCID 520 User Interface Software & Technology
Natural User Interfaces
What was the first gestural interface?
Myron Krueger: "There were things I resented about computers. I resented that I had to sit down to use them... that it was denying that I had a body... that it wasn't perceptual, it was all symbolic. I started thinking that artists and musicians had the best relationships to their tools. As early as '74, the computer could see you." (Krueger, 1988)
[O'Sullivan]
Compared to: "Pictures Under Glass"
What makes an input method natural?
[Image search: top results for "natural interaction"]
An ill-posed question.
A reasonable definition? A user interface is natural if "the experience of using a system matches expectations, such that it is always clear to the user how to proceed, and that few steps (with a minimum of physical and cognitive effort) are required to complete common tasks." (Hinckley & Wigdor) Q: Is this just usability by another name?
"It is a common mistake to attribute the naturalness of a product to the underlying input technology. A touch screen, or any other input method for that matter, is not inherently natural." (Hinckley & Wigdor) Fluent experiences depend on the context and expectations of the user, often relying on prior learning and skill acquisition.
Touch Input: different types of sensors
- Resistive: pressure connects conductive and resistive circuits. Cheap; supports single touch.
- Capacitive: a layer holds electric charge, changed by touch at the contact point. Supports multi-touch.
- Surface acoustic wave: measures changes to ultrasonic waves across the surface. Expensive; sensitive.
- Optical imaging: uses IR light and cameras to track touches (which appear as shadows). Supports multi-touch.
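As a concrete illustration of the resistive case above, the pressed point closes a voltage divider on each axis, so position falls out of two ADC readings. This is a minimal sketch; the function name, ADC resolution, and screen size are hypothetical, not from the slides.

```python
# Hypothetical 4-wire resistive touchscreen readout: pressure connects
# the conductive and resistive layers, and each axis forms a voltage
# divider whose ADC reading is proportional to the contact position.
def resistive_touch_position(adc_x, adc_y, adc_max=4095,
                             width=800, height=480):
    """Map raw ADC readings to screen coordinates (single touch only)."""
    x = adc_x / adc_max * width
    y = adc_y / adc_max * height
    return (x, y)
```

A reading at full scale on one axis maps to the far edge of the screen on that axis, e.g. `resistive_touch_position(4095, 0)` gives `(800.0, 0.0)`.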
In-Air Gestures
Kinect Sensor
[Hardware diagram: RGB camera, infrared projector, infrared camera, microphone array, motorized tilt, USB]
Depth Cameras
Structured IR light: cheap, fast, accurate; but suffers from missing pixels (surfaces that are not IR-reflective) and shadows.
[Figure: RGB image vs. depth image (near to far), showing a shadow and missing pixels]
How Kinect Works: Structured Light 3D Scanner
RGB vs. Depth for Pose Estimation
RGB: only works when well lit; background clutter; scale unknown; varies with clothing and skin colour.
Depth: works in low light; the person pops from the background; scale is known; texture is uniform; but shadows and missing pixels. Much easier with depth.
Human Pose Estimation: Kinect tracks 20 body joints (x, y, z, θ) in real time.
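Once joints are tracked as 3D points, simple geometry yields higher-level features. A minimal sketch, computing the angle at a middle joint (e.g. the elbow angle from shoulder, elbow, and wrist positions); the function and the shoulder/elbow/wrist framing are illustrative assumptions, not part of any Kinect API.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, formed by 3D points a-b-c
    (e.g. shoulder-elbow-wrist from a tracked skeleton)."""
    v1 = tuple(ai - bi for ai, bi in zip(a, b))  # vector b -> a
    v2 = tuple(ci - bi for ci, bi in zip(c, b))  # vector b -> c
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))
```

For a bent arm with perpendicular segments this returns 90 degrees; for a fully extended arm, 180.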
Skeletal Tracking
[Figure: input depth image → inferred body parts → overlaid joint hypotheses → 3D joint hypotheses, shown from top, front, and side views]
Kinect API
Input: image data streams (RGB and depth images), skeletal tracking, audio (Microsoft Speech Platform).
Constraints: latency (data analysis introduces lag); 86 cm to 4 m range; not outdoors (too much ambient IR); not too close to other Kinects (IR interference); tracks only 1-2 people, whose full bodies must be in view (?)
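The range constraint above suggests discarding depth readings outside the reliable window before further processing. A small sketch under stated assumptions: depth values in millimetres, with 0 marking an invalid pixel (a common depth-camera convention, not something the slides specify).

```python
# Mask depth readings outside the sensor's reliable 86 cm - 4 m range.
# Values are assumed to be millimetres; 0 marks an invalid/missing pixel.
MIN_MM, MAX_MM = 860, 4000

def mask_out_of_range(depth_row):
    """Zero out depth samples that fall outside the usable range."""
    return [d if MIN_MM <= d <= MAX_MM else 0 for d in depth_row]
```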
Gesture Design
Designing Gestural UIs. A designer must consider: (a) the physical sensor.
Input Device Properties
- Property sensed: position, force, angle, joints
- States sensed: contact, hover
- Precision: accuracy of selection
- Latency: delay in property/state sensing
- Acquisition time: time to get the pen, or move the hand to the mouse
- False input: accidental touches
On clutches and live mics
Device  | Property       | State Tracked
Mouse   | 2D position    | Hover, button-press
Stylus  | 2D position    | Hover, contact
Touch   | 2D position    | Contact
Gesture | 2D/3D position | ??
In-air gestures may involve a live mic, increasing the chances of false positives and false negatives. Clutch: a mechanism to differentiate actions intended to drive the computing system from those that are not.
Managing a live mic
- Reserved actions: design gestures that will not be triggered unless specifically desired by the user.
- Reserved clutches: use a special gesture to indicate that the system should now monitor for input commands.
- Multi-modal input: use another modality, such as buttons or voice input, to engage tracking by the system.
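The reserved-clutch strategy amounts to a small state machine: recognized gestures are discarded until an engage gesture is seen, and discarded again after disengage. A minimal sketch; the class and the gesture names ("raise_hand", "lower_hand") are hypothetical.

```python
# Sketch of a reserved clutch for an in-air gesture system: the "mic"
# is always live, but only gestures seen while engaged drive the system.
class ClutchedRecognizer:
    def __init__(self, engage="raise_hand", disengage="lower_hand"):
        self.engaged = False
        self.engage, self.disengage = engage, disengage

    def feed(self, gesture):
        """Return the gesture if it should drive the system, else None."""
        if gesture == self.engage:
            self.engaged = True   # clutch in: start monitoring
            return None
        if gesture == self.disengage:
            self.engaged = False  # clutch out: back to ignoring input
            return None
        return gesture if self.engaged else None
```

Before the clutch engages, an incidental "swipe" is ignored; after it engages, the same motion is delivered as a command.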
Designing Gestural UIs
A designer must consider: (a) the physical sensor; (b) the feedback presented to the user; (c) ergonomic and industrial design; (d) the interplay between all interaction techniques and among all devices in the surrounding context; (e) the learning curve.
How to design gestures?
- Observation: generate potential gestures by observing (and participating in) situated activity.
- Participatory design: have representative users generate potential gestures for you.
One methodology [Wobbrock et al. 2009]: (1) show the participant the start and end states of the UI; (2) the participant performs a gesture for that effect; (3) analyze the collected gestures from the population.
Must still consider interplay with task/context.
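Step (3) of the elicitation methodology is often quantified with an agreement score: for one referent (effect), group identical proposed gestures and sum the squared fraction each group represents. This is a sketch of the commonly cited formulation from Wobbrock et al.; the grouping-by-equality of gesture labels is a simplifying assumption (real studies must first decide which proposals count as "the same" gesture).

```python
from collections import Counter

def agreement(proposals):
    """Agreement score for one referent: sum over groups of identical
    proposed gestures of (group size / total proposals) squared.
    Ranges from 1/n (all different) to 1.0 (unanimous)."""
    n = len(proposals)
    return sum((c / n) ** 2 for c in Counter(proposals).values())
```

For example, if three of four participants propose "pinch" and one proposes "tap", the score is (3/4)² + (1/4)² = 0.625.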
User-Designed Gestures
Discussion
Discussion Questions
Kristen: I found it interesting how the study was based around an effect/cause testing model (having users perform actions they think would result in the shown effect). Is this a popular method in other areas of HCI?
Lauren: I think showing users an effect, and then asking them to perform its cause, is an incredibly intelligent way to gather data on natural, intuitive gestures. This method will undoubtedly decrease the cognitive load placed on the user and allow them to act naturally while interacting with the device, instead of requiring the user to memorize and recall several different learned gestures, creating a much more enjoyable user experience.
Discussion Questions
Sara: How might gestures differ significantly across cultures (especially in ones that read from right to left, or that tend to prefer using the right hand instead of the left, for example)?
Taysser: This reading says that "users are not designers; therefore, care must be taken to elicit user behavior profitable for design." What effect does design have on developing a user-defined gesture set? The reading doesn't provide an example that clarifies this statement.
Discussion Questions
Stuart: Morris, Saponas, & Tan posit that a glasses-shaped display is "the best candidate for always-available output in the near-term future." This paper was written in 2011, and since then, Google Glass has flopped tremendously. What went wrong? Why are glasses-based optical interfaces not as great an idea as they thought?
Rick: With the announcement of Microsoft HoloLens, it seems that users can interact with virtual objects in their field of vision. I am very curious whether the interaction mode of HoloLens will be more similar to a touch system or to an in-air gesturing system.
Discussion Questions
Acacio: Although the problems mentioned are somewhat obvious, I found a hidden conclusion: multi-modal inputs make the most sense from a usability standpoint and have been the most successful in real-world usage. In an ideal situation, each mode complements the other, adding functionality, as the article mentions in the case of the keyboard and mouse. Another advance that I have found hard to live without is my Apple Magic Mouse, which incorporates touch as well. It has a clutch: while the mouse is actively moving, touch input is disabled. This article opened my eyes to looking at these things around me.
The Kinematic Chain
Yves Guiard: Kinematic Chain
Asymmetry in bimanual activities: "Under standard conditions, the spontaneous writing speed of adults is reduced by some 20% when instructions prevent the non-preferred hand from manipulating the page." The non-dominant hand (NDH) provides a frame of reference for the dominant hand (DH); the NDH operates at a coarse temporal and spatial scale, while the DH operates at a finer scale.
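Guiard's asymmetry maps directly onto bimanual interface code: the NDH positions a coarse frame of reference (the page, a palette), and DH input is interpreted relative to that frame. A minimal 2D sketch; the function name and the page-and-pen scenario are illustrative assumptions.

```python
# Sketch of a kinematic-chain mapping: the non-dominant hand holds a
# frame of reference (e.g. the page), and the dominant hand's position
# is expressed relative to that frame for fine-scale work.
def dh_in_ndh_frame(dh_pos, ndh_frame_origin):
    """Dominant-hand position relative to the frame set by the NDH."""
    return (dh_pos[0] - ndh_frame_origin[0],
            dh_pos[1] - ndh_frame_origin[1])
```

Moving the "page" with the NDH changes the frame, so the same fine DH motion lands in the same place on the page, which is exactly why preventing the NDH from manipulating the page slows writing.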
Proxemics
Proxemics is the study of measurable distances between people as they interact. [Hall 1966]
Taxonomy of distance:
- Intimate: embracing, touching, or whispering
- Personal: interaction among friends / family
- Social: interactions among acquaintances
- Public: distance used for public speaking
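A proxemics-aware system (e.g. one driven by a depth camera) can classify sensed interpersonal distance into Hall's zones. A small sketch; the metre thresholds are the commonly cited approximations for Hall's taxonomy, not values given in the slides.

```python
# Classify an interpersonal distance into Hall's proxemic zones.
# Thresholds (metres) are approximate, commonly cited boundaries.
def proxemic_zone(distance_m):
    if distance_m < 0.45:
        return "intimate"   # embracing, touching, whispering
    if distance_m < 1.2:
        return "personal"   # friends / family
    if distance_m < 3.6:
        return "social"     # acquaintances
    return "public"         # public speaking
```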
We continuously reference elements in the world in ambiguous ways, yet for the most part we seem to convey our intentions quite well. Deixis: reference by means of an expression whose interpretation is relative to the (usually) extralinguistic context. Common methods of physical reference: pointing and placing [Clark 2003]. See also Vogel & Balakrishnan 2004 and Marquardt et al. 2011 on incorporating proxemics.
Final Thoughts
- Leverage the unique opportunities provided by a particular input technology. Don't shoehorn new modalities where old techniques excel.
- Consider perceptual vs. symbolic input.
- Prevent accidental (vs. intentional) input via unambiguous design and/or clutching.
- Respect existing conventions of spatial reference and social use of space.
IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu But First Who are you? Name Interests
More informationINDE/TC 455: User Interface Design
INDE/TC 455: User Interface Design Autumn 2008 Class #21 URL:courses.washington.edu/ie455 1 TA Moment 2 Class #20 Review Review of flipbooks 3 Assignments for Class #22 Individual Review modules: 5.7,
More informationApple ARKit Overview. 1. Purpose. 2. Apple ARKit. 2.1 Overview. 2.2 Functions
Apple ARKit Overview 1. Purpose In the 2017 Apple Worldwide Developers Conference, Apple announced a tool called ARKit, which provides advanced augmented reality capabilities on ios. Augmented reality
More informationPerception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision
11-25-2013 Perception Vision Read: AIMA Chapter 24 & Chapter 25.3 HW#8 due today visual aural haptic & tactile vestibular (balance: equilibrium, acceleration, and orientation wrt gravity) olfactory taste
More informationUniversidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática. Interaction in Virtual and Augmented Reality 3DUIs
Universidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática Interaction in Virtual and Augmented Reality 3DUIs Realidade Virtual e Aumentada 2017/2018 Beatriz Sousa Santos Interaction
More informationInput devices and interaction. Ruth Aylett
Input devices and interaction Ruth Aylett Contents Tracking What is available Devices Gloves, 6 DOF mouse, WiiMote Why is it important? Interaction is basic to VEs We defined them as interactive in real-time
More informationEvaluation of Input Devices for Musical Expression: Borrowing Tools from HCI
Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI Marcelo Mortensen Wanderley Nicola Orio Outline Human-Computer Interaction (HCI) Existing Research in HCI Interactive Computer
More informationPaper Prototyping Kit
Paper Prototyping Kit Share Your Minecraft UI IDEAs! Overview The Minecraft team is constantly looking to improve the game and make it more enjoyable, and we can use your help! We always want to get lots
More informationChapter 2 Introduction to Haptics 2.1 Definition of Haptics
Chapter 2 Introduction to Haptics 2.1 Definition of Haptics The word haptic originates from the Greek verb hapto to touch and therefore refers to the ability to touch and manipulate objects. The haptic
More informationControlling vehicle functions with natural body language
Controlling vehicle functions with natural body language Dr. Alexander van Laack 1, Oliver Kirsch 2, Gert-Dieter Tuzar 3, Judy Blessing 4 Design Experience Europe, Visteon Innovation & Technology GmbH
More informationComparison of Head Movement Recognition Algorithms in Immersive Virtual Reality Using Educative Mobile Application
Comparison of Head Recognition Algorithms in Immersive Virtual Reality Using Educative Mobile Application Nehemia Sugianto 1 and Elizabeth Irenne Yuwono 2 Ciputra University, Indonesia 1 nsugianto@ciputra.ac.id
More informationHUMAN COMPUTER INTERFACE
HUMAN COMPUTER INTERFACE TARUNIM SHARMA Department of Computer Science Maharaja Surajmal Institute C-4, Janakpuri, New Delhi, India ABSTRACT-- The intention of this paper is to provide an overview on the
More informationUbiquitous Computing. Spring 2010
Ubiquitous Computing Spring 2010 - Making Sense of Sensing Systems: Five questions for designers and Researchers - Distributed mediation of ambiguous context an aware environments - RFID: A key to Automating
More informationMimetic Interaction Spaces : Controlling Distant Displays in Pervasive Environments
Mimetic Interaction Spaces : Controlling Distant Displays in Pervasive Environments Hanae Rateau Universite Lille 1, Villeneuve d Ascq, France Cite Scientifique, 59655 Villeneuve d Ascq hanae.rateau@inria.fr
More informationinteractive laboratory
interactive laboratory ABOUT US 360 The first in Kazakhstan, who started working with VR technologies Over 3 years of experience in the area of virtual reality Completed 7 large innovative projects 12
More informationTangible User Interfaces
Tangible User Interfaces Seminar Vernetzte Systeme Prof. Friedemann Mattern Von: Patrick Frigg Betreuer: Michael Rohs Outline Introduction ToolStone Motivation Design Interaction Techniques Taxonomy for
More informationENHANCED HUMAN-AGENT INTERACTION: AUGMENTING INTERACTION MODELS WITH EMBODIED AGENTS BY SERAFIN BENTO. MASTER OF SCIENCE in INFORMATION SYSTEMS
BY SERAFIN BENTO MASTER OF SCIENCE in INFORMATION SYSTEMS Edmonton, Alberta September, 2015 ABSTRACT The popularity of software agents demands for more comprehensive HAI design processes. The outcome of
More informationZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field
ZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field Figure 1 Zero-thickness visual hull sensing with ZeroTouch. Copyright is held by the author/owner(s). CHI 2011, May 7 12, 2011, Vancouver, BC,
More informationDeveloping a VR System. Mei Yii Lim
Developing a VR System Mei Yii Lim System Development Life Cycle - Spiral Model Problem definition Preliminary study System Analysis and Design System Development System Testing System Evaluation Refinement
More informationSMART ELECTRONIC GADGET FOR VISUALLY IMPAIRED PEOPLE
ISSN: 0976-2876 (Print) ISSN: 2250-0138 (Online) SMART ELECTRONIC GADGET FOR VISUALLY IMPAIRED PEOPLE L. SAROJINI a1, I. ANBURAJ b, R. ARAVIND c, M. KARTHIKEYAN d AND K. GAYATHRI e a Assistant professor,
More informationSketching Interface. Larry Rudolph April 24, Pervasive Computing MIT SMA 5508 Spring 2006 Larry Rudolph
Sketching Interface Larry April 24, 2006 1 Motivation Natural Interface touch screens + more Mass-market of h/w devices available Still lack of s/w & applications for it Similar and different from speech
More informationPortfolio. Swaroop Kumar Pal swarooppal.wordpress.com github.com/swarooppal1088
Portfolio About Me: I am a Computer Science graduate student at The University of Texas at Dallas. I am currently working as Augmented Reality Engineer at Aireal, Dallas and also as a Graduate Researcher
More informationTracking. Alireza Bahmanpour, Emma Byrne, Jozef Doboš, Victor Mendoza and Pan Ye
Tracking Alireza Bahmanpour, Emma Byrne, Jozef Doboš, Victor Mendoza and Pan Ye Outline of this talk Introduction: what makes a good tracking system? Example hardware and their tradeoffs Taxonomy of tasks:
More informationR (2) Controlling System Application with hands by identifying movements through Camera
R (2) N (5) Oral (3) Total (10) Dated Sign Assignment Group: C Problem Definition: Controlling System Application with hands by identifying movements through Camera Prerequisite: 1. Web Cam Connectivity
More informationSketching Interface. Motivation
Sketching Interface Larry Rudolph April 5, 2007 1 1 Natural Interface Motivation touch screens + more Mass-market of h/w devices available Still lack of s/w & applications for it Similar and different
More informationChapter 2 Understanding and Conceptualizing Interaction. Anna Loparev Intro HCI University of Rochester 01/29/2013. Problem space
Chapter 2 Understanding and Conceptualizing Interaction Anna Loparev Intro HCI University of Rochester 01/29/2013 1 Problem space Concepts and facts relevant to the problem Users Current UX Technology
More informationMulti-touch Technology 6.S063 Engineering Interaction Technologies. Prof. Stefanie Mueller MIT CSAIL HCI Engineering Group
Multi-touch Technology 6.S063 Engineering Interaction Technologies Prof. Stefanie Mueller MIT CSAIL HCI Engineering Group how does my phone recognize touch? and why the do I need to press hard on airplane
More informationProjection Based HCI (Human Computer Interface) System using Image Processing
GRD Journals- Global Research and Development Journal for Volume 1 Issue 5 April 2016 ISSN: 2455-5703 Projection Based HCI (Human Computer Interface) System using Image Processing Pankaj Dhome Sagar Dhakane
More informationControlling Humanoid Robot Using Head Movements
Volume-5, Issue-2, April-2015 International Journal of Engineering and Management Research Page Number: 648-652 Controlling Humanoid Robot Using Head Movements S. Mounica 1, A. Naga bhavani 2, Namani.Niharika
More informationAugmented and Virtual Reality
CS-3120 Human-Computer Interaction Augmented and Virtual Reality Mikko Kytö 7.11.2017 From Real to Virtual [1] Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS
More information