KINECT HANDS-FREE

Rituj Beniwal, Department of Electrical Engineering, Indian Institute of Technology, Kanpur
Pranjal Giri, Agrim Bari, Raman Pratap Singh, Akash Jain, Department of Aerospace Engineering, Indian Institute of Technology, Kanpur
Atharva Mulmuley, Department of Chemical Engineering, Indian Institute of Technology, Kanpur

Contents
1 Abstract
2 Introduction
2.1 Motivation
3 Working
4 Implementation
4.1 Hardware and Software
5 Limitations and Future Scope
6 Links and References
6.1 GitHub
6.2 Readings

1. Abstract

This project describes the design and implementation of a speech and gesture recognition system that controls computer input using the Microsoft Kinect. The system focuses on identifying the natural gestures that occur during human-computer interaction, keeping the user experience as fluid as possible. It uses a hidden Markov model (HMM) to classify the performed gestures, in conjunction with a support vector machine (SVM) that segments gestures in real time. Fusing the two models lets the system classify gestures as they are being performed, instead of waiting until they are complete. Speech commands give the user an additional level of precision and control over the system.
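The report does not include source code, so the following is only a minimal sketch of one way the described SVM/HMM fusion could be wired together in Python: a per-frame SVM decides whether a gesture is in progress, and per-gesture HMMs score the growing segment so a provisional label is available before the gesture finishes. The gesture names, feature dimension, model settings, and placeholder training data are all illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM
from sklearn.svm import SVC

# Hypothetical dimensions and labels; the report does not specify its feature set.
N_FEATURES = 12                                    # per-frame skeleton features
GESTURES = ["swipe_left", "swipe_right", "push"]   # illustrative gesture labels

# One HMM per gesture, plus an SVM that answers, per frame,
# "is a gesture currently being performed?"
gesture_models = {g: GaussianHMM(n_components=5) for g in GESTURES}
segmenter = SVC(probability=True)

# Placeholder training on random data, only so the sketch runs end to end;
# a real system would train on recorded, labelled gesture examples.
rng = np.random.default_rng(0)
for m in gesture_models.values():
    m.fit(rng.normal(size=(50, N_FEATURES)))
segmenter.fit(rng.normal(size=(40, N_FEATURES)), [0, 1] * 20)

def classify_stream(frames, threshold=0.5):
    """Fuse the two models: the SVM segments the stream in real time, and
    the HMMs re-score the partial segment as it grows, so a label is
    available while the gesture is still being performed."""
    segment = []
    for frame in frames:                           # frame: (N_FEATURES,) vector
        p_gesture = segmenter.predict_proba(frame.reshape(1, -1))[0, 1]
        if p_gesture >= threshold:
            segment.append(frame)
            obs = np.vstack(segment)
            # Log-likelihood of the partial segment under each gesture HMM.
            scores = {g: m.score(obs) for g, m in gesture_models.items()}
            yield max(scores, key=scores.get)      # provisional classification
        elif segment:
            segment = []                           # gesture ended; reset

for label in classify_stream(rng.normal(size=(30, N_FEATURES))):
    print("provisional gesture:", label)
```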

2. Introduction

2.1 Motivation

In today's world, technology pervades nearly every aspect of the average person's daily life. People interact with computers as frequently as they do with other people, and they should be able to communicate with computers as naturally as they do with other humans. Speech is perhaps the most comfortable form of communication between humans: it is quick, efficient, and lets people express themselves with great freedom, limited only by their own vocabulary. Since the dawn of computing half a century ago, people have dreamed of conversing with robots and other artificial intelligences as easily as they do with other humans. Unfortunately, the keyboard and mouse remain the primary means of interfacing with computers to this day. While effective in many situations, they are limiting and not a particularly natural means of interaction. Gesture recognition is a current area of research that tries to address this problem. Everyone is familiar with gestural interaction with other humans: it occurs naturally during speech as a way for people to express themselves, and gestures are a form of body language essential to communicating ideas effectively alongside spoken language. People already gesture when communicating with other humans, so why not use the same mode of communication to interact naturally with computers?

3. Working

The Kinect performs 3D depth sensing by emitting a structured "point cloud" pattern of infrared (IR) light and calculating depth from the images taken with its IR sensor. Because this IR pattern originates at a single point in the Kinect sensor, the pattern disperses in proportion to the distance the light has travelled. By measuring the offset between the expected location of the IR pattern at a calibrated distance and its actual observed location, the Kinect can calculate the depth at each point of the projected pattern (see the sketch below). Using this depth image, the Kinect identifies foreground objects and determines people and their poses by comparing the detected body to millions of stored examples of body poses. It then uses a randomized decision forest to map the body depth image to body parts, from which the skeleton representation is built.
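The offset-to-depth step described above is ordinary triangulation. The short sketch below illustrates the calculation; the baseline, focal length, and reference distance are commonly cited approximations for the Kinect v1, not values taken from the report.

```python
import numpy as np

# Approximate Kinect v1 parameters (commonly cited values, not from the report).
BASELINE_M = 0.075   # distance between IR projector and IR camera, metres
FOCAL_PX = 580.0     # IR camera focal length, pixels
Z_REF_M = 2.0        # hypothetical calibration distance of the reference pattern

def depth_from_offset(offset_px):
    """Triangulate depth from the pixel offset between where a projected IR
    dot is expected (at the calibration distance) and where it is actually
    observed. Sign convention: positive offset = object closer than reference.

    From similar triangles: 1/Z = 1/Z_ref + offset / (f * b).
    """
    offset_px = np.asarray(offset_px, dtype=float)
    return 1.0 / (1.0 / Z_REF_M + offset_px / (FOCAL_PX * BASELINE_M))

# A dot shifted by 10 px maps to roughly 1.4 m; an unshifted dot stays at 2.0 m.
print(depth_from_offset([0.0, 10.0, -5.0]))
```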

4. Implementation

4.1 Hardware and Software

1. Microsoft Kinect 1.0, developed by Microsoft, is the main hardware component of our project. The Kinect is a versatile device that has been put to use in many spheres of life. It has several built-in sensors:
   1. an RGB camera,
   2. an infrared (IR) depth sensor, and
   3. a microphone array.
   Our project mainly makes use of these three sensors: the IR sensor and RGB camera data are used to process body movements (gesture recognition), and the microphone array captures speech commands. A sketch after this list shows how such skeleton data might be turned into features.

2. Kinect for Windows SDK v1.8 provides the tools and APIs (application programming interfaces), both native and managed, needed to develop Kinect-enabled applications for Microsoft Windows.

3. Microsoft Visual Studio 2015 is an integrated development environment (IDE) from Microsoft. It is used to develop computer programs for Microsoft Windows, as well as websites, web applications, and web services.

In our project, these software systems provide the drivers to get data from the Kinect and let us process that data in our own code.
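The SDK reports a tracked user's skeleton as an array of 3D joint positions (via its NUI_SKELETON_POSITION_* enumeration in the native API). The report does not describe its exact feature set, so the sketch below is only an illustration of how one frame of joint data might be normalized into the kind of translation- and scale-invariant feature vector that the models in Section 1 could consume; the joint indices here are placeholders, not the SDK's real values.

```python
import numpy as np

# Placeholder joint indices; a real application would use the SDK's
# NUI_SKELETON_POSITION_* values (e.g. HAND_RIGHT, SHOULDER_CENTER).
HIP_CENTER, SHOULDER_CENTER, HAND_RIGHT, HAND_LEFT = 0, 1, 2, 3

def frame_features(joints):
    """Normalize one skeleton frame, given as an (n_joints, 3) array of
    x, y, z positions in metres, into a per-frame pose feature vector.

    Subtracting the hip centre removes the user's position in the room,
    and dividing by the hip-to-shoulder distance removes body size, so
    the same gesture maps to similar features for different users.
    """
    joints = np.asarray(joints, dtype=float)
    origin = joints[HIP_CENTER]
    scale = np.linalg.norm(joints[SHOULDER_CENTER] - origin)
    normalized = (joints - origin) / scale
    # Concatenate the joints of interest into a single flat vector.
    return np.concatenate([normalized[HAND_RIGHT], normalized[HAND_LEFT]])

# Toy frame: hips 2 m from the camera, shoulders 0.5 m up, hands out in front.
frame = [[0, 0, 2.0], [0, 0.5, 2.0], [0.3, 0.2, 1.6], [-0.3, 0.2, 1.6]]
print(frame_features(frame))
```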

5. Limitations and Future Scope

The data sent by the Kinect 1.0 is not accurate enough to resolve facial details or finger movements, so we have not been able to implement facial recognition or finger-gesture tracking. The project can be continued in the future using the Kinect v2.0 to add facial recognition, delicate finger gestures, and smoother tracking, yielding a fully interactive, user-friendly environment.

6. Links and References

6.1 GitHub

https://github.com/pranjalgiri/kinect-handsfree

6.2 Readings

http://users.dickinson.edu/~jmac/selected-talks/kinect.pdf
http://pterneas.com/2014/01/27/implementing-kinect-gestures/

https://www.youtube.com/watch?v=uq9sejxziug
https://github.com/ftsrg/publication-pages/wiki/realtime-gesture-recognition-with-jnect-and-esper
https://github.com/marsyangkang/kinectpowerpointcontrol/blob/master/kinectpowerpointcontrol.sln
https://dspace.mit.edu/bitstream/handle/1721.1/85410/870310033-MIT.pdf?sequence=2