Projection Based HCI (Human Computer Interface) System using Image Processing


GRD Journals - Global Research and Development Journal, Volume 1, Issue 5, April 2016, ISSN: 2455-5703

Projection Based HCI (Human Computer Interface) System using Image Processing
Pankaj Dhome, Sagar Dhakane, Abhijit Jagtap, Amol Funde, Prof. V. M. Joshi

Abstract
We use a camera attached to the projector to locate a frame-shaped marker embedded in the projected image. Interactive public displays act as an innovative medium that promotes richer communication between people and information. In this project we propose and implement interactive content for a vision-based digital public display. Virtual objects, laser-point detection and a projection installation are used to attract the user's attention. A preliminary study showed positive feedback on the interactive content designed for the public display. The project realizes an effective low-cost touch interface using only a single camera and a projector: it embeds a small shape in the image generated by the user application (e.g. a touch-screen menu with icons) and detects touch by measuring the geometric distortion in the camera-captured image.

Keywords - Interactive, Direct-Touch Surface, Orientation-Aware Interface, Human Computer Interface

I. INTRODUCTION
Since the computer was invented, Human-Computer Interaction (HCI) technology has kept evolving: first came the mechanical keyboard, then the mouse, then the trackball, and then touch-screen technology reached the market, and the field is still developing today. Because this technology now affects our day-to-day life to such an extent, we decided to contribute something new to develop it further for the future. In this project we introduce a new algorithm and its implementation, building on the work of the professors and researchers who worked in this area before us.

II.
METHODOLOGY AND IMPLEMENTATION
This chapter discusses the different system architectures considered, their benefits, and the algorithms used for laser-point detection, in order to meet the project's specifications.

A. System Architecture
The model of this project is box shaped, with all hardware components fitted carefully inside so that we get a highly efficient working system. The box does not allow external light to enter: the whole system works only on the projector light and the infrared light that passes through the acrylic glass. All rights reserved by www.grdjournals.com 10

B. Position of Projector in Model
Fig. 1: Position of Projector in Model

III. PROJECT DESIGN - MODULES
This project uses various input, output and processing devices. Each hardware component has its own role and process; some of them are described below.

A. Hardware
1) Amkette Trueview Camera
A digital camera encodes images and videos digitally and stores them for later reproduction. Most cameras sold today are digital, and digital cameras are incorporated into many devices, ranging from PDAs and mobile phones to vehicles. Digital and film cameras share an optical system, typically a lens with a variable diaphragm that focuses light onto an image pickup device. The diaphragm and shutter admit the correct amount of light to the imager, just as with film, but the pickup device is electronic rather than chemical. Unlike film cameras, however, digital cameras can display images on a screen immediately after they are recorded, and can store and delete images from memory.

2) Projector OMANI M9000
A projector, or image projector, is an optical device that projects an image (or moving images) onto a surface, commonly a projection screen. Most projectors create an image by shining light through a small transparent lens, but some newer types can project the image directly, for example with lasers. A virtual retinal display, or retinal projector, projects an image directly onto the retina instead of using an external projection screen.

B. Software
1) MATLAB
MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks. It allows matrix manipulation, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, Java and Fortran. MATLAB offers a vast number of toolboxes for various fields; this project mainly used the toolboxes required by the application.

2) Block Diagram
The following diagram shows the block diagram of the Projection Based 2-Way Interaction with the Operating System using MATLAB.

Fig. 2: Working of the HCI system
The block diagram above shows how this HCI system works and how data and interrupt controls pass through the system.

C. Input/Output Section
1) Projection Glass
This glass is specially designed for the projection window. It has special diffusing characteristics, so it illuminates itself when the projection rays from the projector strike it and displays the whole GUI screen on the piece of diffused glass.

2) Projector
In this project the projector acts as the output device and plays an important role: it projects the application GUI onto the piece of diffused glass.

3) Camera
The camera acts as the input sensor, taking continuous input from the physical world. It captures a number of snapshots per second and transfers the image data to the image-processing section.

D. Image Processing Section
This software acts as an environment in which the image-processing algorithms can be implemented in a very simple manner.

1) Image Processing Algorithm
This block consists of different algorithmic functions that detect the laser point in the continuous stream of frames. A good algorithm makes the program run effectively and keeps it short.

2) Touch Coordinate Processing
After the laser tip has been detected in the input frame from the camera, this block determines which key or link was selected.

3) Operating System Interrupt
Once the pressed key is known, this block passes the associated interrupt, or set of instructions, to the operating system so that the required application or program runs as output.
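The touch-coordinate and operating-system-interrupt blocks can be pictured as a simple hit test: the detected laser position is compared against the on-screen key regions, and the matching key's action is dispatched. The paper's implementation is in MATLAB; the Python sketch below is only an illustration, and the key layout and action names are hypothetical examples, not taken from the paper.

```python
# Hypothetical GUI layout: each key is (name, x0, y0, x1, y1) in frame pixels.
KEYS = [
    ("open_browser", 0,   0, 100,  50),
    ("play_media",   0,  60, 100, 110),
    ("shutdown",     0, 120, 100, 170),
]

def hit_test(x, y, keys=KEYS):
    """Return the name of the key whose rectangle contains (x, y), else None."""
    for name, x0, y0, x1, y1 in keys:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def dispatch(key):
    """Stand-in for raising the OS interrupt / launching the linked program."""
    return f"launch:{key}" if key else "no-op"

# A laser touch detected at (40, 75) falls inside the play_media region:
action = dispatch(hit_test(40, 75))   # -> "launch:play_media"
```

A touch outside every key region maps to a no-op, which is why the detection stage must report "no laser point" reliably rather than guessing a position.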

IV. FLOWCHART & ALGORITHM
Fig. 3: Flow Chart & Algorithm

A. Algorithm
1) System initialization
2) Capture frame
3) Perform preprocessing
4) Detect laser points
5) After the laser point detection
6) Perform link analysis
7) Initialize projection on acrylic glass
8) Stop

B. Description
1) System Initialization
At this step all components start up. The system checks every component's status; if any component fails, the system corrects the error or notifies the user about the failure.

2) Capture Frame
At this stage the camera takes a snapshot of the application GUI projected on the acrylic glass and transmits it to the system to be processed with the MATLAB software.

3) Perform Preprocessing
The frame obtained from the previous stage does not contain enough detail, so we perform preprocessing operations on the captured frame to bring out information useful for laser-point detection. At this stage we apply the following operations:
- Contrast stretching
- Segmentation

4) Detect Laser Points
After preprocessing, the detail in the frame is enhanced to a great extent, and we have to determine whether laser points are present in the frame. For that purpose we use a high-pass filtering method, a technique for finding out whether there are laser points in the frame.

5) After the Laser Point Detection
If the previous stage shows no sign of a laser point, the camera initializes itself and takes a new snapshot of the model environment. There are different methods for finding a pixel position in an image frame; using such an algorithm we locate the laser point's touch position. The output of this stage is shown below.
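The preprocessing and detection steps above are implemented by the authors in MATLAB; as a language-neutral illustration, here is a minimal pure-Python sketch of contrast stretching, threshold segmentation, and locating the laser point as the centroid of the bright region. The threshold value and the list-of-lists frame format are assumptions for the sketch, not details from the paper.

```python
def contrast_stretch(frame):
    """Linearly rescale pixel intensities to the full 0-255 range."""
    lo, hi = min(map(min, frame)), max(map(max, frame))
    if hi == lo:                      # flat frame: nothing to stretch
        return [[0] * len(row) for row in frame]
    return [[(p - lo) * 255 // (hi - lo) for p in row] for row in frame]

def detect_laser_point(frame, threshold=200):
    """Segment pixels brighter than `threshold` (assumed value) and
    return the centroid (row, col) of the bright blob, or None."""
    bright = [(r, c) for r, row in enumerate(frame)
                     for c, p in enumerate(row) if p >= threshold]
    if not bright:
        return None                   # no laser point in this frame
    mean_r = sum(r for r, _ in bright) / len(bright)
    mean_c = sum(c for _, c in bright) / len(bright)
    return (mean_r, mean_c)

# A tiny synthetic 5x5 grayscale frame with a dim "laser dot" at (1, 3):
frame = [[10, 10, 10, 10, 10],
         [10, 10, 10, 90, 10],
         [10, 10, 10, 10, 10],
         [10, 10, 10, 10, 10],
         [10, 10, 10, 10, 10]]
stretched = contrast_stretch(frame)    # dot becomes 255, background 0
point = detect_laser_point(stretched)  # -> (1.0, 3.0)
```

Note how the contrast stretching is what makes the dim dot cross the detection threshold: before stretching, the dot's intensity (90) would fall below the assumed threshold of 200.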

Fig. 4: After the laser point detection

6) Perform Link Analysis
After getting the position of the laser-point touch, we determine which type of link the user has selected and which interrupt should be generated for these touch coordinates, so that the proposed system delivers maximum output efficiency.

7) Initialize Projection on Acrylic Glass
After executing the interrupt service routine for the laser-point touch, the system should present the resulting output on the acrylic glass, so that the interaction of the laser point with the glass is real time, i.e. fast.

8) Stop

ACKNOWLEDGEMENT
The authors are thankful to the Department of Electronics and Telecommunication, and especially thank the college faculty for their collaboration in the realization of the system.

V. CONCLUSION
The proposed algorithm can be applied to various direct-touch surfaces to serve their own applications. These designs and inferences can be useful for interaction with a variety of direct-touch devices that generate laser-detection information, whether using our general algorithm or other, more specialized sensing technologies.

VI. FUTURE SCOPE
As this HCI technology has a wide range of applications, it can be further improved by adding thumb orientation and controls based on hand gestures. This would make the HCI more effective and could serve people as a personal virtual assistant.

VII. APPLICATIONS
- In institutions, as a tablet PC/PC
- Industrial applications
- As a notice board
- At railway platforms

Fig. 5: Application