DATA GLOVES USING VIRTUAL REALITY


Raghavendra S.N
Assistant Professor, Department of Information Science and Engineering, Sri Venkateshwara College of Engineering, Bangalore
raghavendraewit@gmail.com

ABSTRACT

This paper surveys glove systems and their applications. The technological revolution has had an enormous positive effect, and a major part of it comes from bringing our digital and physical worlds together. This integrative link between the two worlds leads us to virtual reality. The objective of this paper is to develop a way to make input systems more interactive with the reality around them. By providing users with an assistive device that removes the need for painful and frustrating movement to stay close to their devices, lives can be made easier: the environment is brought right before their eyes, in their own hand, through data feedback from the device while it is controlled by the hand. An Arduino Leonardo board is used to interface with the hardware and mimic the user's hand input on screen. The paper also analyzes the characteristics of these devices, provides a road map of the evolution of the technology, and discusses the limitations of current technology and trends at the frontiers of research. A foremost goal of this paper is to give readers who are new to the area a basis for understanding glove-system technology and how it can be applied, while offering specialists an updated picture of the breadth of applications in several engineering areas.

Index Terms: Data gloves, RF module, FSR sensors, Piezoelectric, Joystick.

I. INTRODUCTION

Hands are used for interacting with and manipulating our environment in a huge number of everyday tasks.
It is therefore not surprising that a considerable amount of research effort has been devoted to developing technologies for studying interaction and manipulation and for augmenting our ability to perform such tasks. The development of the most popular devices for hand-movement acquisition, glove-based systems, started about 30 years ago and continues to engage a growing number of researchers. Pressure sensors are widely used in many applications, and hand tracking is among the most popular of them. In a hand-tracking system, a data glove is commonly used as the input device, equipped with pressure sensors that detect touches and movements of the fingers and a communication unit that relays those touches and movements to a computer. Human-machine interaction keeps moving closer to natural and intuitive user interfaces. Human beings have good grasping and manipulating ability with their hands; although interfaces such as the keyboard and mouse have in general been sufficient for the necessary interaction, one wonders how other styles of interaction would perform in the same tasks. In this paper, we focus on real-time input and output of data from the data glove and on capturing the inputs accurately. A hand data glove is an electronic device equipped with sensors that sense the inputs of the hand and fingers individually and pass the movements to a computer continuously as analog and/or digital signals. Our motivation for writing this paper is the observation that pertinent information on such devices, including measurement performance, is scattered across the engineering and scientific literature and, even when located, is hard to compare. This makes it difficult for a novice to determine whether and how well a particular glove suits a particular application.
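As a concrete illustration of the kind of per-finger data such a glove streams to a host computer, the following sketch models one sample and its serialization for a radio link. This is a hypothetical layout for illustration only; the field names, 10-bit value range and packet format are assumptions, not the protocol used in this project.

```python
from dataclasses import dataclass
import struct

@dataclass
class GloveSample:
    """One reading from a five-finger pressure-sensing data glove.
    Each value is assumed to be a raw 10-bit ADC count (0-1023),
    as an AVR microcontroller's analog input would return for an
    FSR voltage divider."""
    thumb: int
    index: int
    middle: int
    ring: int
    little: int

    def pack(self) -> bytes:
        # Serialize as five little-endian uint16 values (10 bytes total).
        return struct.pack("<5H", self.thumb, self.index,
                           self.middle, self.ring, self.little)

    @classmethod
    def unpack(cls, payload: bytes) -> "GloveSample":
        return cls(*struct.unpack("<5H", payload))

sample = GloveSample(thumb=12, index=780, middle=45, ring=33, little=900)
assert GloveSample.unpack(sample.pack()) == sample
```

A stream of such fixed-size packets is one simple way a glove-side transmitter and a PC-side receiver could stay in sync without any framing protocol.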

II. LITERATURE SURVEY

A. Glove systems

Historically, gloves have been an interesting focus of research for human-computer interaction. The first gloves started appearing in the late 1970s. Tracking a hand generally involves calculating some of its properties: position, orientation and pose. There are several documented methods for position tracking when using glove-based input:

i. Optical tracking (using marker systems or silhouette analysis)
ii. Magnetic tracking
iii. Acoustic tracking
iv. Circuitry tracking

Many gloves have been developed for use with a computer, each with its own merits and drawbacks for a wide variety of applications:

i. Sayre Glove: Richard Sayre postulated that a glove using flexible tubes (not fiber optics), with a light source at one end and a photocell at the other, could measure the extent to which a finger is bent.
ii. MIT LED Glove: The MIT Architecture Machine Group used a camera focused on an LED-studded glove to track limb position for real-time computer graphics animation. This glove, however, was designed and used for motion capture rather than as a control device.
iii. Digital Data Entry Glove: This glove used hard-wired circuitry consisting of bend, touch and inertial sensors. Although not commercially developed, the system was designed to recognize 80 unique combinations of sensor readings, mapped to a subset of the 96 printable ASCII characters, from the gestures defined in the Single Hand Manual Alphabet for the American Deaf.
iv. Dexterous Hand Master: This input device, developed by MIT for the control of the Dexterous Hand robot, was far more accurate than the data glove [28], with 20 degrees of freedom (4 per finger) measured by Hall-effect sensors acting as potentiometers at the joints.
v. Power Glove: Nintendo, inspired by the VPL Data Glove, designed a glove for its gaming consoles that was constructed from molded plastic and Lycra to allow flexible movement, with one resistive-ink sensor per finger for flex detection. The glove used an acoustic unit mounted on the hand, together with a television-mounted acoustic sensor, to track the glove in three-dimensional space to within a quarter of an inch, with further trackers to determine the rotation of the hand.
vi. CyberGlove: The CyberGlove was designed to translate American Sign Language into verbal English. It was constructed from 22 thin-foil strain gauges sewn into thin fabric. The analog signals are processed and converted into a digital stream that is sent to a computer over a serial connection. The observed performance of the glove was smooth and stable, while retaining accuracy to within 1° of flexion.
vii. Space Glove: Virtual Entertainment Systems, a company in arcade-game development, built a glove for use with its arcade games that measures the flexion of the fingers, with sensors measuring 1 degree of freedom per finger and 2 degrees of freedom for the thumb. This, in conjunction with a magnetic tracker on the back of the glove that tracks its position in three-dimensional space, is used as the gaming interface to other Virtual Entertainment Systems products.

B. Gesture recognition

The use of gestures in computer software gives the user the ability to interact with a computer in a more natural and intuitive fashion. The problem of pattern recognition usually denotes a discrimination or classification of events. A gesture recognizer uses the spatiotemporal changes that occur as the gesture progresses for its discrimination/classification process. A recognizer generally has three components:

i. Encoding: the representation of the gesture
ii. Classification: the injection of the supported gestures into the recognizer, using ideal situations and randomization or by example
iii. Recognition: the matching of observations to gestures

III. RELATED WORK

Hand-tracking interface using the Nintendo Wii Remote: Nintendo Wii Remotes were used to track hands in three dimensions for a specific task, molecular visualization. The Wii Remote's camera properties were explored, and a hand-tracking interface with six degrees of freedom was implemented and tested. The investigators found that this technique was an acceptable method for the visualization of complex models and could be extended to other computer-aided design (CAD) applications.

IV. PROJECT DESIGN: MODELLING AND CONTROL

A. Data glove

This is the key device used in our research to edit the finger input. The data glove used is a sensor glove: the pressure and orientation of each finger are detected by flat pressure sensors. The glove uses an Arduino Uno board with an Atmel 8-bit AVR RISC-based microcontroller and an RF transmitter. Although the information that can be captured is quite rough, and the glove cannot capture the precise input of the fingers, we found that the information provided by this glove is enough to achieve our goals in this research.

B. Force sensing technology

Force sensors are based on the piezoelectric materials mentioned above.
However, since a piezoelectric material is sensitive to the pressure applied to its surface, a well-controlled contact area becomes the key factor in force measurement. One common approach is to build a pressure sensor into a mechanism such that the force to be measured is applied to a button raised above the base, and hence transmitted to the sensing surface by the button, which has a fixed area. The pressure measurement can then be multiplied by this fixed area to give the applied force. This is the basic idea of a one-dimensional load cell: by controlling the contact area, a pressure sensor can be used to give force estimates. For multi-directional load cells, the force along each axis is measured by the differential signal from one set of piezoelectric materials; a load cell consisting of several such sets is capable of measuring multi-axis forces, both translational and rotational. Another approach is to ensure that the sensing area is larger than the contact area, so that the pressure can be read from the pressure sensor while the contact area is measured by other means, such as a video camera; the force can then be calculated from the two measurements. Some newer approaches even estimate force directly from video images with a pre-trained decision model.

A critical point of pressure and force sensing of solid contact is that the sensor should have minimal effect on the original shape and deformation characteristics of the surface; otherwise the measured contact differs from the original. Moreover, for pressure sensing, the sensitive area should be smaller than the contact area, so that the sensor is fully engaged and gathers as much information as possible from the contact. For force sensing, on the other hand, it is important that the contact area is not larger than the sensitive area, so that no force falls outside the sensitive region, where it could not be detected by the sensor. Pressure sensors therefore suffer from modest accuracy when used as force sensors, since the location of contact yields a significant difference in the results. Load cells perform much better in this respect, but their housings make them impractical for wearable devices, which are becoming increasingly popular in presence-related research. However, as sensor density increases, the sensor size becomes much smaller than the contact area, so the force can be estimated more accurately by summing the force values on the individual sensor cells.
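The two force-estimation strategies above, multiplying a pressure reading by a fixed button area and summing per-cell forces on a dense array, can be sketched numerically. The sensor geometry and pressure values here are illustrative assumptions, not measurements from this project.

```python
def force_from_load_cell(pressure_pa: float, button_area_m2: float) -> float:
    """One-dimensional load cell: the force is applied through a button
    of fixed, known area, so F = P * A."""
    return pressure_pa * button_area_m2

def force_from_array(cell_pressures_pa, cell_area_m2: float) -> float:
    """Dense sensor array: each cell is much smaller than the contact,
    so the total force is the sum of the per-cell forces."""
    return sum(p * cell_area_m2 for p in cell_pressures_pa)

# 50 kPa applied through a 1 cm^2 button gives about 5 N.
print(force_from_load_cell(50e3, 1e-4))

# Four 0.25 cm^2 cells reading 40-60 kPa under one fingertip
# contact also sum to about 5 N.
print(force_from_array([40e3, 60e3, 50e3, 50e3], 0.25e-4))
```

Note how the array version needs no knowledge of the overall contact area: that is exactly the advantage of high sensor density described above.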

V. WORKING PRINCIPLE

A. Procedure

The flow chart of the procedure used to give the input data is shown in the figure. The method we propose in this research can be divided into an input stage and a reproduction stage. The procedure in each stage is as follows.

In the input stage, the FSR defines the relationship between the pressure of the fingers and the given input. First, the FSR captures the human input by recording the input pressure on the given sensors, as shown in the figure. The pressures of the index finger and the middle finger correspond to the left and right motion of the pointer on the screen; similarly, the pressures of the ring finger and the little finger correspond to the up and down motion of the pointer. The corresponding parameters of the human input, such as the duration of the pressure and the range of the generalized coordinates, are also obtained. The two motions are synchronized by matching the timing of the peaks and troughs of the pressure curves in the ATmega328 microcontroller, which sends the desired instruction to the RF transmitter; the RF transmitter sends the signals to an RF receiver connected to the system.

In the reproduction stage, the ATmega32u4 microcontroller on the Arduino Leonardo board converts these incoming signals into the corresponding input data. The user can change the input assigned to a sensor, which is represented as an ASCII code, and the motion of the pointer will reproduce similar but different motions. For example, if the original clip is a left motion, it is possible to generate a right motion with the same sensor by changing the ASCII code associated with that sensor.

VI. ADVANTAGES & LIMITATIONS

The following are the advantages of the project:

i. A more realistic input device
ii. Easy to operate
iii. Sign-language education
iv. Helpful in education and presentations
v. Cost-efficient compared to other glove input devices

The following are the limitations of the project:

i. Every good project has limitations; the limitation of this design lies in the effectiveness of the sensor.
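The input-stage mapping described in the working principle, with the index and middle fingers driving left/right motion and the ring and little fingers driving up/down motion, can be sketched as follows. The threshold and gain are illustrative assumptions, not the calibrated values used on the actual glove.

```python
def pointer_delta(index_p, middle_p, ring_p, little_p,
                  threshold=200, gain=0.05):
    """Map four raw FSR readings (assumed 0-1023 ADC counts) to a
    pointer step (dx, dy). A finger contributes only above a contact
    threshold, so resting fingers do not move the pointer, and
    opposing fingers cancel each other out."""
    def drive(p):
        return max(0, p - threshold) * gain

    dx = drive(middle_p) - drive(index_p)   # index = left, middle = right
    dy = drive(ring_p) - drive(little_p)    # ring = up, little = down
    return dx, dy

# A strong index-finger press moves the pointer left.
print(pointer_delta(600, 0, 0, 0))   # -> (-20.0, 0.0)
```

On the real hardware the equivalent of this function would run in the reproduction stage on the Leonardo, which can present itself to the PC as a USB mouse and emit these deltas directly.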

VII. APPLICATIONS

The following are the applications of the project:

i. Design and manufacturing: Using a computer screen or a head-mounted display, the user, who can be located either on site or remotely, can give input. Compared to traditional interfaces such as keyboards and mice, glove-based systems allow a more natural interaction with the environment.
ii. Information visualization: Glove-based systems can potentially improve the naturalness of the user's interaction with the data, thereby enhancing the effectiveness of traditional data-visualization techniques.
iii. Robotics: Glove-based systems can potentially make robot programming, a central issue in robotics, more natural and easier, particularly when methods based on tele-operation or automatic programming are used.
iv. Wearable and portable computers: The introduction of gloves as controllers for consumer electronics, and in particular as text-entry and pointing devices for portable and wearable computers, is one of the most recent developments in glove-based system applications.
v. Art and entertainment: The attraction between glove-based systems and the entertainment industry has been long-standing. Gloves have been used for video games and the animation of computer-generated characters, as well as in movie productions.

VIII. CONCLUSION AND FUTURE WORK

In this paper, we proposed a new method to edit sensor input using the data glove. Using the human input data, it is possible to build various new applications with the data glove. Even though we limited the input to sensor data, another option for improving the results is to generate the motions. For example, if the system has a facility such as gesture recognition over a motion database, and can automatically select suitable stored motions for the newly generated motion when the animator controls the glove in a way similar to those motions, then the results have a chance of looking more natural and realistic.
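The future-work idea of automatically selecting a similar stored motion could, for instance, start from simple nearest-neighbour matching of pressure traces. This is a hypothetical sketch: the gesture database, trace lengths and distance measure are assumptions, not part of the implemented system.

```python
def trace_distance(a, b):
    """Mean squared difference between two equal-length pressure traces."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def closest_gesture(database, live_trace):
    """Return the name of the stored gesture whose pressure trace is
    nearest to the live glove input."""
    return min(database, key=lambda name: trace_distance(database[name], live_trace))

# Toy database of single-sensor pressure traces (ADC counts over time).
database = {
    "swipe_left":  [0, 200, 600, 800, 400, 0],
    "swipe_right": [800, 600, 300, 100, 0, 0],
    "tap":         [0, 900, 0, 0, 0, 0],
}
print(closest_gesture(database, [0, 180, 640, 790, 350, 0]))  # -> swipe_left
```

A production system would need time alignment (for example, dynamic time warping) so that the same gesture performed faster or slower still matches, but the selection logic would have this shape.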
The corresponding joints of the fingers and those of the whole body were strictly fixed in this research. Although this approach is sufficient for the inputs introduced in this paper, it would be far more convenient for animators if the best-matching joints were found automatically by comparing the basic human motion with the motion of the fingers and the wrist. This can be considered the next step for this research.

REFERENCES

[1] G. J. Grimes. Digital data entry glove interface device, November 1983.
[2] D. J. Sturman and D. Zeltzer. A survey of glove-based input. IEEE Computer Graphics and Applications, January/February 1994.
[3] R. Watson. A survey of gesture recognition techniques. Technical Report TCD-CS-93-11, Trinity College, Dublin, July 1993.
[4] M. W. Krueger. Artificial Reality. Addison-Wesley, 2nd edition, 1990.
[5] P. Wellner. Interacting with paper on the DigitalDesk. Communications of the ACM, 36(7):87-96, July 1993.
[6] T. G. Zimmerman et al. A hand gesture interface device. In Proceedings of Human Factors in Computing Systems and Graphics Interface, pages 189-192, New York, 1987. ACM Press.