Available online at www.sciencedirect.com

ScienceDirect
Procedia Computer Science 50 (2015) 503-510

2nd International Symposium on Big Data and Cloud Computing (ISBCC '15)

Virtualizing Electrical Appliances and Gadgets Input with Human Gestures

Aravind. B (1), Ajay. S (2), Priyadarshini (3)
(1, 2) MS Software Engineering, School of Computer Science and Engineering; (3) Associate Professor
VIT University, Chennai, India
aravind.b2010@vit.ac.in, ajay.s2010@vit.ac.in, priyadarshini.j@vit.ac.in

Abstract

A Natural User Interface (NUI) is a medium of interaction between a user and a machine through a natural entity (air), in the form of user gestures recognized by the machine using gesture recognition. Gesture recognition is the detection of human bodily motion and behaviour. Advances have been made using improved cameras and hardware devices such as the Kinect, a motion-sensing device developed by Microsoft for gaming, which is used in this paper to virtualize input. The paper describes in detail how domestic electrical appliances can be controlled by simple human gestures using the Microsoft Kinect. Systems exist that sense human motion to control electrical appliances, but none of them provide user-defined commands together with gesture recognition. This system is distinctive in how it detects and interprets human gestures.

Keywords: Virtualization; Natural User Interface; Gesture Recognition; Motion Sensing; Skeleton Tracking; Human Machine Interaction; Microsoft Kinect; Xbox 360 Kinect; IR sensor.

1. INTRODUCTION

Virtualization is the concept of performing an object's action in the absence of its actual source, where the source may be hardware, an operating system, a storage device, or a computer network. Virtualization is the key notion of this project. When a person wants to switch on a fan, he or she would ordinarily have to flip the switch; here the same effect is achieved virtually, just by waving a hand (a human gesture) in front of a sensor.

1877-0509 © 2015 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of the scientific committee of the 2nd International Symposium on Big Data and Cloud Computing (ISBCC '15). doi:10.1016/j.procs.2015.04.022

Application development in human-machine interaction and natural user interfaces has surged since the release of Microsoft's motion-sensing gaming device, the Kinect, and its Software Development Kit [1][6][7]. Users who migrate towards new technology always expect flexibility in the way they use their systems and machinery. At present, many techniques are being introduced and researched to minimize or simplify human-machine interaction [2]. This paper aims to remove those complexities and to attain maximum accuracy in controlling electrical appliances with gestures.

Human gestures are an important element of human communication and an attribute of human action, informally known as body language. Many methods are in use to track human gestures [3]. To achieve maximum accuracy and make the system distinctive, several methods were attempted; the best case proved to be user-defined actions (gestures) to control the system. For example, a person might switch the lights on just by touching his or her head, and the triggering gesture can vary according to the user's requirements. The Microsoft Kinect is used in the system to track human joints and gestures; the stream of input data to the Kinect is the live action of the user's gestures. Once the human skeleton is identified, the system keeps track of the gestures and matches them against the user-defined gestures. When the two gestures match, the switch is tripped on.

2. LITERATURE RESEARCH

MIT Media Lab is working on SixthSense [10], a gestural interface device comprising a neck-worn pendant that contains both a data projector and a camera. Head-worn versions were also built at MIT Media Lab in 1997 that combined cameras and illumination systems for interactive photographic art and included gesture recognition (e.g. finger tracking using coloured tape on the fingers). SixthSense is a name for extra information supplied by a wearable computer, such as the device called "WuW" (Wear your World) by Pranav Mistry et al., building on the concept of the Telepointer, a neck-worn projector and camera combination first proposed and reduced to practice by MIT Media Lab student Steve Mann [9]. One big limitation of that project is that it is entirely wearable, which is not practical in all circumstances. This paper overcomes the wearable limitation: the user need not wear any gadget to stimulate the input stream.

3. MICROSOFT KINECT SENSOR

Kinect is a motion-sensing device by Microsoft for the Xbox 360 video game console and Windows PCs. Built around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without touching a game controller, through a natural user interface using gestures and spoken commands [1], [7].

Figure 1: Microsoft Kinect (Xbox 360 gaming console sensor)
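Before detailing the hardware, it may help to see how a .NET application attaches to the sensor. The following is a minimal C# sketch; the paper names only "the .NET platform", so the use of the Kinect for Windows SDK v1.x here is an assumption:

```csharp
// Minimal sketch: attaching to a Kinect and starting its streams,
// assuming the Kinect for Windows SDK v1.x (Microsoft.Kinect assembly).
using System;
using Microsoft.Kinect;

class SensorSetup
{
    static void Main()
    {
        // Pick the first connected sensor, if any.
        KinectSensor sensor = null;
        foreach (KinectSensor candidate in KinectSensor.KinectSensors)
        {
            if (candidate.Status == KinectStatus.Connected) { sensor = candidate; break; }
        }
        if (sensor == null) { Console.WriteLine("No Kinect found."); return; }

        // Enable the colour, depth and skeleton streams described in this section.
        sensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30);
        sensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
        sensor.SkeletonStream.Enable();

        sensor.Start();   // frames now arrive at 30 Hz via the *FrameReady events
        Console.ReadKey();
        sensor.Stop();
    }
}
```

With the streams enabled, the application subscribes to the SDK's frame-ready events, which later sketches in this paper's flow build on.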

The basic parts of the Kinect (Figure 1) are:

- an RGB camera,
- a 3D depth-sensing system,
- a multi-array microphone, and
- a motorized tilt.

The Kinect interacts with the system by understanding human gestures; software can use video, sound, and gesture-recognition-driven interactions. The sensor contains a high-quality video camera that can provide up to 1280x1024 resolution at 30 frames per second. The depth sensor uses an IR projector and an IR camera to measure the depth of objects in the scene in front of the sensor. The sensor also contains four microphones, which can perform background-noise cancelling.

Kinect captures the surrounding world in 3D by combining information from the depth sensors and a standard RGB camera. The result of this combination is an RGBD image with 640x480 resolution, where each pixel is assigned colour information and depth information (some depth-map pixels contain no data, so the depth map is never complete). In ideal conditions the resolution of the depth information can be as fine as 3 mm [4], using 11-bit values. Kinect works at a frequency of 30 Hz for both the RGB and depth cameras. On the left side of the Kinect is an infrared laser source that emits light at a wavelength of 830 nm. Information is encoded in light patterns that are deformed as the light reflects from objects in front of the Kinect. From these deformations, captured by the sensor to the right of the RGB camera, a depth map is created, as shown in Figure 2. According to PrimeSense, this is not the time-of-flight method used in other 3D cameras [5].

Figure 2: Depth, skeleton and VGA view
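To make the depth description concrete, the sketch below decodes one depth frame into per-pixel distances in millimetres. It assumes the SDK v1.x depth packing, in which each 16-bit pixel carries a player index in its low bits and the depth value above them; the event wiring continues from the initialization sketch in Section 3.

```csharp
// Hedged sketch: decoding one 640x480 depth frame into millimetre distances
// (assumes Kinect for Windows SDK v1.x packing: depth bits above a player index).
using Microsoft.Kinect;

static void OnDepthFrameReady(object sender, DepthImageFrameReadyEventArgs e)
{
    using (DepthImageFrame frame = e.OpenDepthImageFrame())
    {
        if (frame == null) return;  // frames may be skipped; the map is never complete

        short[] raw = new short[frame.PixelDataLength];
        frame.CopyPixelDataTo(raw);

        // Each pixel: low bits = player index, upper bits = depth in millimetres.
        int centre = (frame.Height / 2) * frame.Width + frame.Width / 2;
        int depthMm = raw[centre] >> DepthImageFrame.PlayerIndexBitmaskWidth;
        System.Console.WriteLine("Depth at image centre: " + depthMm + " mm");
    }
}
// Wiring (after DepthStream.Enable): sensor.DepthFrameReady += OnDepthFrameReady;
```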

4. PROPOSED WORK

The proposed work is to control electrical appliances with simple human gestures and motion-sensing technology, with the help of Microsoft's Kinect sensor. The work began with the vision of minimizing human-machine interaction and bringing the natural user interface to the forefront. The initial phase of the project controlled a simple PowerPoint presentation with gestures such as moving a hand from left to right, or from right to left, to move between slides. The system is trained to understand human gesture movements, and the electrical appliances are controlled based on body gestures. The system is designed to reliably identify 20 human joints (head, hand_right, hand_left, and so on), as shown in Figure 3.

Figure 3: Skeleton understanding by Kinect with its joints

Figure 4: Coordinate axes

The position of each joint is given as an offset from the Kinect sensor, as shown in Figure 4:

- X: left/right
- Y: up/down
- Z: away from the sensor

The values are given in millimetres. The electrical devices are connected to a relay with four points (1, 2, 3, and 5), as shown in Figure 5.
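As a sketch of how these joint offsets might be read in the application (again assuming the Kinect for Windows SDK v1.x, which reports skeleton positions in metres, hence the conversion to the paper's millimetre convention):

```csharp
// Hedged sketch: reading a tracked skeleton's joint positions.
// The SDK reports positions in metres; multiply by 1000 for millimetres.
using Microsoft.Kinect;

static void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    using (SkeletonFrame frame = e.OpenSkeletonFrame())
    {
        if (frame == null) return;

        Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
        frame.CopySkeletonDataTo(skeletons);

        foreach (Skeleton s in skeletons)
        {
            if (s.TrackingState != SkeletonTrackingState.Tracked) continue;

            // One of the 20 joints of Figure 3; X/Y/Z follow the axes of Figure 4.
            SkeletonPoint hand = s.Joints[JointType.HandRight].Position;
            System.Console.WriteLine(
                "Right hand offset: X={0:F0} mm, Y={1:F0} mm, Z={2:F0} mm",
                hand.X * 1000, hand.Y * 1000, hand.Z * 1000);
        }
    }
}
// Wiring (after SkeletonStream.Enable): sensor.SkeletonFrameReady += OnSkeletonFrameReady;
```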

Figure 5: Relay connection

On the relay:

- point 2 is grounded,
- point 1 is connected to the switch,
- point 3 to the power source, and
- point 5 to the accessories or appliances.

Point 1 connects to switched power through the output port of the microcontroller, an ATmega8 from ATMEL. The microcontroller switches these values to trip the accessories or appliances ON or OFF, and it is connected to the computer by means of a serial port.

Human gestures: input stream
  -> Kinect: receive gestures
  -> Application: match gestures
  -> Microcontroller: relay activation
  -> Relay: trip switch
  -> Appliances: ON/OFF

Figure 6: Sequential process flow
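On the PC side, the stages of Figure 6 between gesture matching and relay activation might look like the hedged C# sketch below. The gesture predicate, COM port name, baud rate, and device code are illustrative assumptions; the paper does not publish its actual values.

```csharp
// Hedged sketch: match a user-defined gesture, then send a per-device
// hexadecimal code over the serial link to the ATmega8.
using System.IO.Ports;
using Microsoft.Kinect;

class GestureDispatcher
{
    // Hypothetical values: port settings and code are not given in the paper.
    const byte FanOnCode = 0x0A;
    static readonly SerialPort Port = new SerialPort("COM3", 9600);

    static void Dispatch(Skeleton s)
    {
        SkeletonPoint hand = s.Joints[JointType.HandRight].Position;
        SkeletonPoint head = s.Joints[JointType.Head].Position;

        // Example user-defined gesture: right hand raised above the head.
        bool gestureMatched = hand.Y > head.Y;

        if (gestureMatched)
        {
            if (!Port.IsOpen) Port.Open();
            Port.Write(new byte[] { FanOnCode }, 0, 1);  // microcontroller trips the relay
        }
    }
}
```

In practice, Dispatch would be called from the skeleton-frame callback shown earlier, once per tracked skeleton.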

Following the sequential process flow of Figure 6, the microcontroller gets its input from a hyper terminal. The Kinect is connected to the central processing unit of the computer through a USB port, and a program developed on the .NET platform takes control of the Kinect and its input. Once a person steps into the room (i.e. once a skeleton frame is recognized), execution starts, and a stream of input is continuously received from the Kinect by the program; the human gestures are identified at time intervals and matched against the user-defined gestures. When the streamed gesture matches a user-defined gesture, a unique code for the corresponding device, represented in hexadecimal, is sent to the microcontroller; the microcontroller receives the input, connects to the respective relay, and the specific appliance switches ON.

5. PERFORMANCE ANALYSIS

A series of experiments was conducted to evaluate the system's returns and limitations. In a test with a sample of 100 trials across different electrical appliances, accuracy of up to 90 percent was achieved. The table below shows a few random tests.

Table 1: Random sample tests

Appliance     No. of users    Response time (s)    Result
Tube light    1               1.2                  Pass
Fan           1               1.3                  Pass
AC            1               1.2                  Pass
Television    1               1.1                  Pass

Case studies

1. Conference hall

During a conference, delegates often wish to present PowerPoint slides. Rather than advancing every slide by clicking a keyboard button, the presenter can signal with a hand (left or right), as desired, in front of the device (Kinect) to move each slide.

2. Emergency exit

Laboratories can become dangerous at any unexpected moment, and in an emergency one cannot always be in the area in time to operate equipment and resolve the problem. The device (Kinect) provides a way to operate from outside and solve issues by injecting the right input signals via human gestures.

3. Physically challenged

A physically disabled person in a room may be unable to reach a fan switch that is merely some distance away. The device (Kinect) delivers a new platform for him or her to direct the fan.

6. CONCLUSION

This paper is aimed not only at minimizing the complexity of human-machine interaction; a primary focus of the project is also to benefit the physically challenged and the elderly. The system is designed to understand human gesture behaviour and convert it into a procedural input stream. The user is given the autonomy to set his or her own gestures to trip the switches, and the system is fault tolerant: it can identify the commands of a maximum of two individual persons. It is also a revolution in the field of natural user interfaces, providing users with an intuitive interface and a real-time system that allows them to interact naturally with their day-to-day machines.

7. FUTURE WORK

There is a genuine need for further development in the fields of natural user interfaces and human-machine interaction. The project can lead to next-generation developments such as:

- controlling electrical appliances through remote procedures;
- easy embedding of the device in household and business applications;
- tracking the maximum number of individuals and accepting unique commands from each of them;
- controlling a graphical user interface through the natural user interface;
- adding speech recognition to the system for better accuracy; the sensor is capable of recognizing the human voice, which will be an added advantage in future work.

ACKNOWLEDGMENT

This project was selected by the University for display at the Microsoft I Spark project exhibition and was widely appreciated by Mr. Phani Kondupedi of Microsoft and Mr. Ramaprasanna Chellamuthu (former Microsoft employee, now founder and CEO of GotoPal Pvt. Ltd). The project also won first prize at the National Science Day 2012 project display at the University and earned appreciation from Mr. Viswanathan G. (Chancellor, VIT University, India).

REFERENCES

[1] Kinect camera, http://www.xbox.com/en-us/kinect/default.htm.
[2] Fakhreddine Karray et al., "Human-Computer Interaction: Overview on State of the Art", International Journal on Smart Sensing and Intelligent Systems, Vol. 1, No. 1, March 2008.
[3] Matthew Tang, "Hand Gesture Recognition Using Microsoft's Kinect", March 16, 2011.
[4] The Bilibot website. [Online]. Available: http://ww.bilibot.com.
[5] "Kinect: The company behind the tech explains how it works". [Online]. Available: http://www.joystiq.com/2010/06/19/kinect-how-it-works-from-the-company-behind-the-tech
[6] Jungong Han, "Enhanced Computer Vision with Microsoft Kinect Sensor: A Review", IEEE Transactions on Cybernetics.
[7] Microsoft Kinect SDK, http://www.microsoft.com/en-us/kinectforwindows/.
[8] Enrique Ramos Melgar and Ciriaco Castro Diez, Arduino and Kinect Projects, 2012.
[9] Steve Mann with Hal Niedzviecki, Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer, Random House Inc., 304 pages, 2001. ISBN 0385658257 (hardcover).
[10] SixthSense by Pranav Mistry. [Online]. Available: http://www.pranavmistry.com/projects/sixthsense/