
E90 Project Proposal
6 December 2006
Paul Azunre, Thomas Murray, David Wright

Table of Contents

Abstract
Introduction
Technical Discussion
    Tracking Input
    Haptic Feedback
Project Implementation
    List of Tasks
    The Critical Path Method
Materials List

Abstract

A physical computer interface system that combines haptic feedback and user motion input in three dimensions is proposed. The interface will be developed to be suitable for use by visually impaired users, although other applications of the system may be investigated. The academic goal is to apply knowledge of a broad range of topics, including signal processing, control theory, digital systems, and computer vision, to create a functional product.

Introduction

This project seeks to implement a 3-D capable haptic computer interface. The system will track and recognize user motion and provide appropriate tactile responses through the haptic interface. The underlying system is potentially applicable to a number of novel interface paradigms; the project will primarily focus on developing an interface suitable for visually impaired users, but may also explore alternative, haptically enhanced interfaces for sighted users.

Typical graphical computer interfaces are not well suited to visually impaired users. While alternative interfaces such as screen readers and keyboard access exist, they are not always an ideal solution, especially when used with a program designed without consideration for their capabilities. Being able to navigate a graphical interface positionally, as it was intended to be used, may prove useful for some applications. A combination of haptic and audio feedback, coupled with an input device that relies on absolute positioning (as opposed to the relative positioning used by mice and trackballs), may prove to be a practical interface.

The project will use a rather atypical method for obtaining input: a three-dimensional camera. While perhaps not the most practical solution, this will allow for some interesting applications in recognizing three-dimensional gestures and should allow for a flexible system that can be modified for other uses. This proposal outlines the technical background of the project and provides a list of major tasks and a schedule for project completion.

Technical Discussion

The project consists of two largely distinct technical challenges. The first is to implement the input system, which collects and processes user input. The second is to provide meaningful feedback to the user through the haptic interface.

Tracking Input

While the nature of the sensory feedback provided to the user will change significantly depending on the application, a means of capturing three-dimensional user input will be integral to virtually all possible applications. At the core of the three-dimensional interface system is the ability to capture and recognize commands given by the user's motion and interpret them correctly.
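To illustrate the distinction between relative and absolute positioning, the following minimal Python sketch contrasts the two mappings. All names and values here are hypothetical; the actual camera interface has not yet been specified.

    # Hypothetical illustration: relative vs. absolute pointer positioning.
    # Screen resolution and workspace bounds are assumed values.

    SCREEN_W, SCREEN_H = 1280, 1024     # assumed display resolution
    WORK_X = (-0.25, 0.25)              # assumed tracked workspace bounds (meters)
    WORK_Y = (-0.20, 0.20)

    def relative_update(cursor, dx, dy):
        """Mouse/trackball style: the cursor moves by a displacement."""
        x = min(max(cursor[0] + dx, 0), SCREEN_W - 1)
        y = min(max(cursor[1] + dy, 0), SCREEN_H - 1)
        return (x, y)

    def absolute_update(hand_x, hand_y):
        """Camera style: each tracked hand position maps to exactly one screen point."""
        u = (hand_x - WORK_X[0]) / (WORK_X[1] - WORK_X[0])
        v = (hand_y - WORK_Y[0]) / (WORK_Y[1] - WORK_Y[0])
        return (int(u * (SCREEN_W - 1)), int(v * (SCREEN_H - 1)))

    # Example: the same hand position always lands on the same pixel.
    print(absolute_update(0.0, 0.0))    # center of workspace -> center of screen

For a visually impaired user, the absolute mapping is what makes positional navigation possible: returning the hand to the same place in the workspace always returns the cursor to the same interface element.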

Before any interpretation of a user's movements can be made, it is first necessary to translate the information into a form that can be processed by the computer. One solution to this problem employs a camera, developed by Canesta Incorporated, which provides a sense of pixel depth in addition to the two-dimensional information contained in ordinary pixels. Canesta's technology incorporates an infrared light source and a CMOS pixel array on a single chip. The system (Figure 1) works on the principle of Light Detection and Ranging (LIDAR), which is similar to radar except that the target's range is calculated from the time-of-flight of light as opposed to radio waves.

Figure 1. Overview of Canesta's EPT system. Source: http://www.canesta.com/html/sensors.htm

The three-dimensional camera is composed of a CMOS pixel array that captures a two-dimensional representation of a scene based on the intensity of visible light. In addition, an infrared source on the chip emits a signal s(t) = sin(2π f_m t), modulated at the frequency f_m. The phase, φ, of the reflected signal, r(t) = R sin(2π f_m t − φ), depends on the range, d, of the reflecting surface as related by φ(d) = 2π f_m (2d/c). Canesta's CMOS pixel design allows this phase shift to be measured without the complex calculations and filtering required by traditional signal processing techniques. Each pixel contains two photo-sensitive gates which are modulated by the same signal as the light source. Depending on the phase shift, more photons will be incident on one gate than on the other. The resulting voltage difference is proportional to the phase shift, which is in turn proportional to the target's range.
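As a concreteness check on the relation above, the short Python sketch below inverts φ = 2π f_m (2d/c) to recover range from a measured phase. The modulation frequency and the gate-voltage model are assumptions for illustration, not Canesta's actual sensor parameters.

    import math

    C = 3.0e8        # speed of light (m/s)
    F_M = 44.0e6     # assumed modulation frequency (Hz); the real value is sensor-specific

    def range_from_phase(phi):
        """Invert phi = 2*pi*f_m*(2d/c) to recover the target range d (meters)."""
        return phi * C / (4.0 * math.pi * F_M)

    def phase_from_gates(v_a, v_b):
        """Toy model: map the normalized gate-voltage difference to a phase in [0, pi].
        The actual pixel response curve is sensor-specific; this is only illustrative."""
        diff = (v_a - v_b) / (v_a + v_b)      # normalized difference in [-1, 1]
        return (1.0 - diff) * math.pi / 2.0

    # Example: a 0.1 rad phase shift corresponds to roughly 5.4 cm of range.
    print(range_from_phase(0.1))

Note that with a single modulation frequency the range is only unambiguous up to c/(2 f_m), which at the assumed 44 MHz is about 3.4 m; this comfortably covers a desktop workspace.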

Canesta's camera technology presents a powerful solution to the problem of sensing three-dimensional movements, such as stylus motion and hand gestures. There are other methods of sensing a third, depth dimension, with triangulation from two cameras being the most robust. However, that option requires a rigid setup, because the relative positions of the multiple cameras are critical to the accuracy of the system. With Canesta's single-sensor system, it should be possible to set a few markers to define an input area without careful calibration, allowing the working environment to be resized for a particular use or moved relatively easily. Additionally, the algorithms for determining which pixels correspond to each other in a two-camera triangulation setup are computationally intensive, while Canesta's solution requires little post-processing.

Haptic Feedback

The system is designed to provide haptic, or tactile, feedback to the user. For visually impaired users, feedback needs to accomplish two tasks: providing cues as to where objects are located, and identifying objects when they are encountered. The first task is the more challenging, as simple vibrations or audible signals are not likely to suffice. The proposal is to use a motor-controlled joystick to provide a sense of direction as well as distance to a nearby object. The stick will bend in the direction of the nearest object, and the angle will indicate the distance to the object, with a greater angle indicating a greater distance. Such a device will require a control system; a simple diagram of the basic feedback loop is shown in Figure 2. The cues could be made context sensitive, allowing a user to request the nearest radio button, for example; the system would then ignore other interface elements and guide the user to the requested object. This joystick will be operated by the hand not being used for input.

Figure 2. Feedback loop for a joystick controller.
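A minimal sketch of such a loop is given below, using simple proportional control. The gain, update rule, and deflection limits are assumed values for illustration; the actual controller will be designed and simulated in Task E.

    import math

    MAX_ANGLE = math.radians(30.0)   # assumed maximum stick deflection
    MAX_DIST = 0.5                   # assumed distance (m) at which deflection saturates
    KP = 0.8                         # assumed proportional gain

    def target_deflection(dx, dy):
        """Map the offset to the nearest object into a commanded stick deflection.
        The heading follows the object; the magnitude grows with distance,
        saturating at MAX_ANGLE."""
        dist = math.hypot(dx, dy)
        magnitude = MAX_ANGLE * min(dist / MAX_DIST, 1.0)
        heading = math.atan2(dy, dx)
        return magnitude, heading

    def control_step(current_angle, commanded_angle):
        """One iteration of the proportional loop: drive the motor toward the command."""
        error = commanded_angle - current_angle
        return current_angle + KP * error

    # Example: an object 0.2 m to the right; the stick converges toward ~12 degrees.
    magnitude, heading = target_deflection(0.2, 0.0)
    angle = 0.0
    for _ in range(10):
        angle = control_step(angle, magnitude)

Proportional control is likely sufficient here because the stick only needs to settle near the commanded angle, not track fast trajectories; if overshoot proves distracting in practice, a derivative term could be added during Task E.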

Haptic feedback could also be used to identify interface elements through vibrators that create a sense of texture. These vibrators could be located in the off-hand joystick device or in a stylus held in the input hand. Auditory feedback will likely be used as well to aid identification.

Apart from the obvious utility to visually impaired users, haptic feedback may provide some advantage in certain circumstances to sighted users [1]. Touch is an important sense, and using it to enhance the user experience may be possible. The project may investigate this potential in the course of usability studies.

[1] Gabriel Robles-De-La-Torre, "The Importance of the Sense of Touch in Virtual and Real Environments," IEEE MultiMedia 13(3), Special issue on Haptic User Interfaces for Multimedia Systems, pp. 24-30.

Project Implementation

An effort has been made to identify all the important tasks required for successful implementation of the project. The following is a list of the identified activities with a brief description of each.

List of Tasks

Task A: Researching existing solutions. This task will involve researching existing interface solutions for visually impaired users and current uses of haptic interfaces.

Task B: Obtaining the camera system. This task will involve writing and sending a project proposal to Canesta Inc. to obtain a camera development kit.

Task C: Getting to know the camera API and equipment. Upon receipt of the camera, its development environment will be studied. This task will provide the hardware and software familiarity that will be required throughout the project.

Task D: Position tracking in 2-D. First, a basic capability will be implemented to verify the system's functionality. This will involve tracking the position of a stylus or a finger within a two-dimensional coordinate system.

Task E: Developing a prototype feedback controller in MATLAB.

This task will involve developing and simulating a haptic feedback system in MATLAB. The algorithm details will be worked out at this stage.

Task F: Programming a microcontroller. The working MATLAB simulation will be transferred from software into a hardware implementation.

Task G: Designing a PCB. A printed circuit board for the haptic controller and motors will be designed at this stage.

Task H: Manufacturing the PCB. The PCB design from Task G will be sent out for manufacturing.

Task I: Developing 3-D functionality. Three-dimensional capabilities will be implemented at this stage. This will involve developing gesture recognition routines to identify taps, grabs, and other signals (a sketch of one such routine follows this list).

Task J: Familiarizing with accessibility APIs / writing a test program. If possible, the system will interface with the existing accessibility APIs of a modern operating system. If this proves too difficult, a test program will be written to provide a graphical interface with which a user can interact using the system.

Task K: Integrating systems and debugging. All components will be integrated into one functional unit.

Task L: Testing the system. The overall functionality of the system will be tested with interesting test cases.

Task M: Performing usability studies. Experiments will be conducted with a number of subjects using the developed system to interact with a test program. This will provide insight into whether the developed system is an improvement over existing solutions.

Task N: Presenting mid-semester progress. Progress made by this point in the semester will be presented to an evaluation committee.

Task O: Writing a draft report. A draft of the final project report will be written at this stage.

Task P: Writing the final report. The final report for the project will be written at this stage.

Task Q: Final presentation. All progress made will be presented to an open audience.
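As an illustration of the kind of routine Task I calls for, here is a minimal, hypothetical tap detector operating on a stream of tracked fingertip depths. The thresholds and window size are assumed values; the real routines will be tuned against actual camera data.

    def detect_tap(depths, dip=0.03, window=8):
        """Return True if the depth trace shows a quick push-and-return motion.

        depths: recent fingertip distances from the camera (meters), oldest first.
        dip:    assumed minimum depth excursion (meters) that counts as a tap.
        window: assumed number of frames a tap may span.
        """
        if len(depths) < window:
            return False
        recent = depths[-window:]
        baseline = recent[0]
        closest = min(recent)
        # Depth decreases as the finger pushes toward the sensor, then returns
        # close to its starting value.
        pushed = baseline - closest >= dip
        returned = abs(recent[-1] - baseline) < dip / 2.0
        return pushed and returned

    # Example: a fingertip at ~0.50 m dips to 0.45 m and returns.
    trace = [0.50, 0.50, 0.49, 0.46, 0.45, 0.47, 0.49, 0.50]
    print(detect_tap(trace))   # True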

The Critical Path Method

A critical path method (CPM) analysis was performed on the identified tasks. Table 1 lists, for each task, its dependencies and the expected duration and required effort. This information was used to construct the CPM diagram in Figure 3 and the Gantt chart in Figure 4.

Table 1. Tasks with estimated duration and effort.

Activity  Needs      Feeds  Duration (weeks)  Effort (man-hours)
A         -          B      1                 15
B         A          C      1                 30
C         B          D, I   1.5               30
D         C          K      1.5               24
E         B          F      3                 30
F         E          G      1                 24
G         F          H      1                 20
H         G          K      1                 1
I         C          K      2                 30
J         B          L      1                 30
K         I, D, H    L      0.5               12
L         K, J       M      0.5               15
M         L          O, Q   0.5               10
N         -          -      1                 20
O         M          P      2                 60
P         O          -      2                 60
Q         M          -      1                 20
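For reference, the forward pass of the CPM analysis can be reproduced with a few lines of Python. The dependency data below is transcribed directly from Table 1.

    # Forward pass of the critical path method over the Table 1 dependencies.
    # Each entry maps a task to (duration in weeks, list of prerequisite tasks).
    tasks = {
        "A": (1, []),          "B": (1, ["A"]),       "C": (1, ["B"]),
        "D": (1.5, ["C"]),     "E": (3, ["B"]),       "F": (1, ["E"]),
        "G": (1, ["F"]),       "H": (1, ["G"]),       "I": (2, ["C"]),
        "J": (1, ["B"]),       "K": (0.5, ["I", "D", "H"]),
        "L": (0.5, ["K", "J"]), "M": (0.5, ["L"]),    "N": (1, []),
        "O": (2, ["M"]),       "P": (2, ["O"]),       "Q": (1, ["M"]),
    }

    finish = {}
    def earliest_finish(task):
        """Earliest finish = duration + latest earliest-finish among prerequisites."""
        if task not in finish:
            duration, needs = tasks[task]
            finish[task] = duration + max(
                (earliest_finish(n) for n in needs), default=0.0)
        return finish[task]

    for task in tasks:
        earliest_finish(task)
    print(max(finish, key=finish.get), max(finish.values()))   # -> P 13.5

Under the Table 1 estimates, the longest chain (A, B, E, F, G, H, K, L, M, O, P) finishes at 13.5 weeks, placing the hardware feedback chain on the critical path.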

Figure 3. A CPM diagram for the project. The critical path is shown in red. Note that tasks A, B, and N, which occur during vacation or are not directly related to the project's progress, were not assigned early start times.

Figure 4. A Gantt chart for the project.

Materials List

Figure 5. Diagram of the envisioned product.

CanestaVision development kit. Cost: TBD
Microcontroller for haptic feedback. Cost: TBD (possibly a free sample)
Motors and vibrators for haptic feedback. Cost: TBD, ~$50
PCB for the feedback system. Cost: ~$60 (ExpressPCB)

The most expensive component of this project is the Canesta camera. Canesta has provided discounts on the camera for educational purposes. The camera will be a reusable component: after project completion, it will be possible to use it in other Engineering Department courses and projects.