Designing a Lightweight Gesture Recognizer Based on the Kinect Version 2
Int'l Conf. IP, Comp. Vision, and Pattern Recognition, IPCV'15

Designing a Lightweight Gesture Recognizer Based on the Kinect Version 2

Leonidas Deligiannidis
Wentworth Institute of Technology, Dept. of Computer Science and Networking, 550 Huntington Av., Boston, MA, USA
deligiannidisl@wit.edu

Hamid R. Arabnia
University of Georgia, Dept. of Computer Science, 415 GSRC, Athens, GA, USA
hra@cs.uga.edu

Abstract - We present a lightweight gesture recognizer utilizing Microsoft's Kinect version 2 sensor. Our recognizer seems robust enough for many applications and does not require any training. Because this version of the Kinect sensor is equipped with a higher-resolution depth camera than its predecessor, it can track some of the user's fingers, and the Kinect SDK is able to provide state information about the user's hands. Armed with this capability, we were able to design our gesture recognizer. New gestures can be specified programmatically at the moment, but we are also working on a graphical user interface (GUI) that would allow a user to define new gestures. We show how we built the recognizer and demonstrate its usage via two applications we designed. The first is a simple picture manipulation application. For the second, we designed a 3-DOF robotic arm that can be controlled using gestures.

Keywords: Kinect 2, Gesture Recognition.

1. Introduction

In late 2010, Microsoft Corporation introduced the first version of a gaming device, the Kinect, which could be used along with its Xbox gaming console. Recently, Microsoft released the second version of the Kinect [1], for their new Xbox One console, which is faster and provides higher-resolution video and depth feeds. The Kinect is a motion-sensing input device and can connect to a PC via a Universal Serial Bus (USB) adapter. The Kinect sensor consists of a video camera and a depth camera.
The depth camera provides depth information for each pixel using an infrared (IR) projector and an IR camera. The sensor also has a multi-array microphone that can detect the direction from which spoken commands are issued. The primary purpose of the Kinect sensor is to enable game players to interact and play games without holding a physical game controller. This innovation changed the way we play and interact with games. Players can now use natural commands such as tilting to the left/right, raising their hands, jumping, etc., to issue commands. The Kinect sensor enables this by continuously tracking the players' body movements and gestures, as well as listening for verbal commands. These capabilities, along with its low and affordable price, made the Kinect sensor an attractive device to researchers. Using the freely available SDK [1] for the Kinect, we can now design programs that incorporate the functionality of the Kinect sensor in our research. The fact that the device does not need to be trained or calibrated makes it easy and simple to use in many environments other than those it was originally designed for. For example, in [2] the functionality of the sensor has been extended to detect and recognize objects and obstacles so that visually impaired people can avoid them. Because of its contactless nature of interaction [3], the Kinect found its way into operating rooms where non-sterilizable devices cannot be used [4]. Its depth camera can be used to scan and construct 3D maps and objects [5]. An effective technique to control applications such as Google Earth and Bing Maps 3D utilizing a small, yet easy to remember, set of hand gestures is illustrated in [6]. The robotics community [7] adopted the Kinect sensor so that users can interact with robots in a more natural way.
Some applications require the tracking of the fingers [8], which was not supported by the original sensor [9][10][11] but is now, in a limited way, with the second generation of the sensor. There are several methods for detecting and tracking fingers, but this is a hard problem mainly because of the resolution of the depth sensor; this is also true for the second generation of the camera. Some methods, such as [12], work well as long as the orientation of the hands does not vary. Other techniques require specialized instruments and arrangements such as an infrared camera [13], a stereo camera [14], a fixed background [15], or track-able markers on hands and fingers [16]. Other systems need a training phase [17] to recognize gestures such as clapping, waving, shaking the head, etc.
2. Gestures

Interacting with an application utilizing the Kinect sensor requires the sensor to actively track the motion of the user. Even though playing a game may require only large movements of the user's body, limbs, or hands, interpreted as commands such as jumping, leaning, waving, ducking, etc., other applications may require more precise input. Specifically, an application should be able to classify a posture as an event as well as a gesture, and should be able to differentiate between the two. A gesture normally has a beginning and an end. A posture is a static positioning of the user and her arms, legs, etc., whereas a gesture is dynamic by nature. A user should be able to indicate the beginning and possibly the ending of a gesture. A gesture such as waving is a dynamic motion of one's hand(s) but doesn't have to be precise. Other gestures need to be more precise, as when the user's arm is tracked to control the movement of a remote robotic arm; but more importantly, the initiation and the termination of the gesture can be of greater importance. A system that actively tracks the movements of a user and misinterprets the user's actions as actual commands will soon be abandoned by the user, as it confuses her. Initiating the termination of a gesture is very valuable too, as it can be used to cancel or stop the current tracking state. As reported in [18], the main difficulties in designing a gesture recognizer are "temporal segmentation ambiguity", which deals with finding the beginning and the ending of a gesture, and "spatial-temporal variability", which deals with tolerance in the initiation and termination of a gesture, since each person performs the same gestures differently. The Kinect version 1 depth sensor had a resolution low enough that finger detection was difficult to achieve, and impossible from a distance.
The Kinect 2 has a higher-resolution depth camera, and finger detection is provided by the SDK. Finger detection is still limited, but at least the SDK reports postures of the hand based on the fingers' arrangement. For example, the Kinect 2 can distinguish, in any orientation, whether the hand is Open, Closed, or Lasso. Lasso is defined by closing the hand and extending the index finger (like pointing at an object). However, because of the still-low resolution of the depth sensor, it is recommended that the user extend both the index and the middle fingers, touching each other, to indicate the Lasso posture. If the user is close to the camera, extending the index finger alone is enough. Having two hands, where each hand can perform 3 different postures, we have 9 different combinations we can use to indicate the beginning and ending of a gesture. Additional postures can be defined by, for example, hiding one hand behind the back while performing a posture with the other hand; the Kinect SDK provides tracking-state information for each joint, in addition to its location and orientation in space. Depending on the posture and the gesture, the user must be aware of the position of the Kinect sensor. For example, if the user performs the Lasso posture and points at the Kinect, it is possible that the Kinect will report a false posture, as the Lasso posture seen from the front looks very similar to the Closed hand posture.

3. Gesture Recognizer

The Kinect SDK provides an API through which the skeleton information of up to 6 people can be reported. It tracks a human body, even partially when some joints are hidden, and reports the position and orientation of each of its 25 joints. It also reports whether a joint's information is accurate: whether the joint is being tracked, or is not visible in the current frame and its value is inferred.

Figure 1. The four joints needed by the gesture recognizer to define the gestures involving the right hand.
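The nine start/stop posture combinations mentioned above are easy to enumerate. A minimal sketch in Java (the enum and method names here are illustrative, not part of the Kinect SDK):

```java
// Illustrative sketch: enumerating the 9 two-hand posture combinations
// described in the text. HandState mirrors the three states the SDK
// reports; the class and method names are hypothetical, not SDK API.
public class PostureCombos {
    public enum HandState { OPEN, CLOSED, LASSO }

    // Encode a (left, right) posture pair as a single id in 0..8.
    public static int comboId(HandState left, HandState right) {
        return left.ordinal() * HandState.values().length + right.ordinal();
    }

    public static void main(String[] args) {
        int count = 0;
        for (HandState l : HandState.values())
            for (HandState r : HandState.values()) {
                System.out.println(comboId(l, r) + ": " + l + "/" + r);
                count++;
            }
        System.out.println("combinations = " + count); // 9, as noted in the text
    }
}
```

Each combination id can then serve as a distinct marker for the beginning or ending of a gesture.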
In the Kinect 2, which has a higher-resolution depth sensor, the hand states are also reported; the main states are Open, Closed, and Lasso: open palm, fist, and fist with the index finger extended. Based on the upper-body joint positions and the state of the hands, we designed a gesture recognizer engine. The engine takes as input the joint information from the Kinect and determines which gesture/posture is being performed. The main advantages of this recognizer are that it is lightweight, rotation invariant, and does not require any training; in addition, it can be configured for many different gestures. The configuration, however, is done programmatically at this time, but we are developing a graphical user interface tool with which no programming will be needed to define new gestures.
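The overall shape of such an engine can be sketched in a few lines of Java. This is a hypothetical skeleton, not the paper's implementation: the `Frame` type stands in for the joint and hand-state snapshot the Kinect SDK delivers each frame, and the names are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the recognizer engine described above. Each
// configured gesture inspects the per-frame snapshot and says whether
// it currently matches; the engine reports the first match.
public class GestureEngine {
    // Minimal stand-in for the data the Kinect SDK delivers each frame.
    public static class Frame {
        public final String leftHandState, rightHandState; // "Open", "Closed", "Lasso"
        public Frame(String l, String r) { leftHandState = l; rightHandState = r; }
    }

    public interface Gesture {
        String name();
        boolean matches(Frame frame); // posture/gesture test for one frame
    }

    private final List<Gesture> gestures = new ArrayList<>();

    // New gestures are registered programmatically, as in the paper.
    public void register(Gesture g) { gestures.add(g); }

    // Returns the name of the first matching gesture, or null.
    public String recognize(Frame frame) {
        for (Gesture g : gestures)
            if (g.matches(frame)) return g.name();
        return null;
    }

    public static void main(String[] args) {
        GestureEngine engine = new GestureEngine();
        engine.register(new Gesture() {
            public String name() { return "grab"; }
            public boolean matches(Frame f) {
                return f.leftHandState.equals("Closed")
                    && f.rightHandState.equals("Closed");
            }
        });
        System.out.println(engine.recognize(new Frame("Closed", "Closed"))); // prints "grab"
        System.out.println(engine.recognize(new Frame("Open", "Open")));     // prints "null"
    }
}
```

A GUI gesture editor, as planned in the text, would only need to construct and register `Gesture` objects instead of requiring them to be coded by hand.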
Based on only a few joints, we divide the user space into five areas. Figure 1 shows the four joints needed to recognize gestures for the right hand: the left and right shoulders, the right elbow, and the right hand. The left elbow and the left hand are needed for the gestures involving the left hand, but for simplicity we only show here the joints involved in recognizing gestures of the right hand.

Figure 2. Calculating the 3 vectors needed by the recognizer: a vector defining the shoulder line, a vector perpendicular to the shoulder-line vector, and a vector defining the elbow-hand direction. The H-SS vector is only used in the robotic arm application we will discuss later.

If we treat the positions of these joints as vectors, we can define a vector RS-LS as shown in figure 2. Then we define a vector that is perpendicular to RS-LS, shown as perp(RS-LS). We can also calculate the vector RH-RE, which is the vector defined by the right elbow and the right hand. The Head and Spine_Shoulder joints are only used to control the roll in the robotic arm application that we will discuss later. Using the dot product operation on these vectors, we can calculate in which area the right hand is located, as shown in figure 3. The hand can be in 3 different areas: a) above the shoulder line and to the left, b) above the shoulder line and to the right, and c) below the shoulder line and to the right.

Figure 3. Using the dot product of vectors, we can calculate in which area the right hand is.

Two vector subtractions are needed to calculate the shoulder-line and elbow-hand vectors. From the shoulder-line vector, we can easily construct a vector perpendicular to it as well. Then, two dot product operations are needed to determine in which area the hand is. Figure 4 shows the five areas we define for both hands. Even though a user could move his right hand to the area
defined for the left hand, we don't consider motions like these as valid, as these movements obstruct the view of the user and are anatomically awkward to perform. Using this technique, one can define other areas for tracking the hands, such as using the waist line or the spine line, depending on the application's needs.

Figure 4. The five areas defined by the shoulder line and the two hands.

Figure 5. The first gesture is used to increase the size of the picture, the second gesture is used to decrease the size of the picture, and the last gesture is used to rotate the selected picture.

4. Picture Control Application

The first application we designed based on our gesture recognizer was a picture manipulation application. The application is written in Java. Using the Java Native Interface (JNI), we call our compiled C++ functions, which communicate with the Kinect and deliver the joint information and the hand states to our Java gesture recognizer. As shown in figure 5, the user moves his hands near his head and closes them to grab a picture, and then pulls his hands apart to increase the size of the picture. Opening his hands stops the current operation. Similarly, if the user grabs the picture with his hands apart (by closing his hands) and moves them close to each other, the size of the picture decreases; this operation is similar to what most users are familiar with on mobile devices, except that there they use two fingers instead of two hands. The last gesture is used to rotate an image. To activate and control the rotation of the picture, the user's left hand moves close to the body in the Open posture, and the right hand performs the Lasso posture. The orientation of the picture is then controlled by the right hand's continuous rotations.

5. Robotic Arm Control

We designed a second application that uses our gesture recognizer to control a robotic arm.
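Both applications rest on the area test of section 3 (figures 2 and 3): two vector subtractions, one perpendicular construction, and two dot products. A sketch of that test in Java, working in 2D; the coordinate convention (x toward the user's left, y up) and the numeric area labels are assumptions, not from the paper:

```java
// Sketch of the dot-product area test described in section 3, using 2D
// joint positions. RS->LS is the shoulder line, (-y, x) of it is the
// perpendicular "up" vector, and the hand is located relative to the
// right shoulder. Coordinate frame assumed: x to the user's left, y up.
public class HandArea {
    static float dot(float ax, float ay, float bx, float by) {
        return ax * bx + ay * by;
    }

    // Classify the right hand: 0 = above the shoulder line and to the left,
    // 1 = above and to the right, 2 = below and to the right.
    static int classify(float lsx, float lsy,   // left shoulder
                        float rsx, float rsy,   // right shoulder
                        float rhx, float rhy) { // right hand
        float shx = lsx - rsx, shy = lsy - rsy; // shoulder-line vector RS->LS
        float px = -shy, py = shx;              // perpendicular ("up") vector
        float hx = rhx - rsx, hy = rhy - rsy;   // right shoulder -> hand
        boolean above = dot(hx, hy, px, py) > 0;   // which side of the shoulder line
        boolean left  = dot(hx, hy, shx, shy) > 0; // toward the left shoulder?
        if (above && left) return 0;
        if (above) return 1;
        return 2;
    }

    public static void main(String[] args) {
        // Shoulders at (1,0) and (-1,0); hand straight above the right shoulder.
        System.out.println(classify(1, 0, -1, 0, -1, 1));  // prints 1 (above, right)
        System.out.println(classify(1, 0, -1, 0, 0, 1));   // prints 0 (above, left)
        System.out.println(classify(1, 0, -1, 0, -1, -1)); // prints 2 (below)
    }
}
```

Because every comparison is made against the user's own shoulder line, the test is rotation invariant, as claimed in section 3.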
Figure 6 shows a top view of the robotic arm, which consists of three heavy-duty servo motors, a USB servo controller from Phidgets.com, and a 5V/5A power supply to power the servo motors. The servo motors are physically connected to each other to give the arm three degrees of freedom, as shown in figure 7. Figure 7 also shows how the robotic arm is connected to a PC and the Kinect sensor. The sensor is connected to the PC via a proprietary Kinect-to-USB adapter. Via another USB port, the PC is connected to the servo controller of the robotic arm. The application receives joint and hand-state information from the Kinect; the gesture recognizer component interprets these as commands and instructs the servo controller to rotate the appropriate servo motors by a specified amount. Figure 8 shows the gestures implemented to control the robotic arm. The top two gestures (in figure 8) are used to disengage and engage the servo motors, respectively. The second set of gestures is used to control the bottom servos to rotate the arm right and left, respectively. The third set of gestures is used to control the top servo motor and move the arm up and down. The last gesture is used to instruct the arm to follow the user's right hand. As the user moves his arm left-right and up-down, the robotic arm mimics these movements by controlling the three servo motors simultaneously and in real time. By leaning the head left and right, we change the roll of the arm ±10 degrees. As shown in figure 2, we construct the H-SS vector. By taking the dot product of the H-SS and the RS-LS vectors, we can determine the direction and the amount
of the roll (which is implemented by rotating the middle servo motor).

Figure 6. The robotic arm (top view), showing the three servo motors attached to each other to provide 3 degrees of freedom. Next to the servo assembly is the Phidgets servo controller, which receives its commands via its USB port. At the other end, the 5V/5A power supply is shown, which provides power to the three servo motors.

Figure 7. The robotic arm (side view), and how it is connected to the controlling PC and the Kinect camera. The Kinect camera is connected to the PC via a proprietary adapter. There is also a USB connection between the PC and the robotic arm's servo controller.
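The roll computation just described (dot product of H-SS with RS-LS, mapped into ±10 degrees) can be sketched as follows. Treating the cosine between the two vectors as the lean amount, and scaling it linearly into degrees, are assumptions about the mapping; the text only fixes the ±10-degree range.

```java
// Sketch of the head-lean roll control described above. When the head
// is upright, H-SS is roughly perpendicular to the shoulder line and
// the dot product is ~0; leaning toward one shoulder drives it toward
// +/-1 after normalization. The linear scaling into degrees is assumed.
public class RollControl {
    // Roll in degrees, clamped to [-10, 10].
    static double rollDegrees(double hx, double hy,   // H-SS vector (head minus Spine_Shoulder)
                              double sx, double sy) { // RS-LS shoulder-line vector
        double dot = hx * sx + hy * sy;
        double norms = Math.hypot(hx, hy) * Math.hypot(sx, sy);
        if (norms == 0) return 0; // degenerate frame: no roll command
        // Cosine of the angle between the vectors: ~0 upright, +/-1 at full lean.
        double lean = dot / norms;
        return Math.max(-10.0, Math.min(10.0, lean * 10.0));
    }
}
```

The resulting angle would then be sent to the middle servo through the Phidgets controller, which the sketch leaves out.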
Figure 8. The gestures used to control the robotic arm. The top two gestures are used to disengage and engage the servo motors, respectively. The next set of gestures is used to rotate the arm left-right. The next set of gestures is used to move the tip of the arm up-down. The last gesture, at the bottom, is used to allow the robotic arm to follow the user's right hand. As the user moves his hand, the robotic arm mimics these movements in real time.

6. Conclusion

Skeleton tracking with joint position and hand-state information from the Kinect version 2 sensor can be very useful input to a gesture recognizer. With such a gesture recognizer, we can interact with software applications and other hardware devices without using a tangible controller. We illustrated our gesture recognizer in this paper by presenting a couple of applications utilizing it. Because this new version of the Kinect reports hand-state information, we can design many different applications that require gestures. We wish to develop a graphical interface with which one would be able to define gestures and associated actions via a GUI instead of doing the same programmatically.

7. References

[1] Microsoft Corporation's Kinect version 2 home page. Retrieved March.
[2] Atif Khan, Febin Moideen, Juan Lopez, Wai L. Khoo and Zhigang Zhu. KinDectect: Kinect Detecting Objects. In K. Miesenberger et al. (Eds.), Computers Helping People with Special Needs, Lecture Notes in Computer Science (LNCS) Volume 7383, 2012, Springer-Verlag Berlin Heidelberg.
[3] K. Montgomery, M. Stephanides, S. Schendel, and M. Ross. User interface paradigms for patient-specific surgical planning: lessons learned over a decade of research. Computerized Medical Imaging and Graphics, 29(5).
[4] Luigi Gallo, Alessio Pierluigi Placitelli, Mario Ciampi. Controller-free exploration of medical image data: experiencing the Kinect.
24th International Symposium on Computer-Based Medical Systems (CBMS), June, pp. 1-6.
[5] Peter Henry, Michael Krainin, Evan Herbst, Xiaofeng Ren, Dieter Fox. RGB-D mapping: Using Kinect-style depth cameras for dense 3D modeling of indoor environments. The International Journal of Robotics Research, 0(0), 1-17, March.
[6] Maged N. Kamel Boulos, Bryan J. Blanchard, Cory Walker, Julio Montero, Aalap Tripathy, Ricardo Gutierrez-Osuna. Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation. International Journal of Health Geographics, 2011, 10:45.
[7] Wei-Chen Chiu, Ulf Blanke, Mario Fritz. Improving the Kinect by Cross-Modal Stereo. In Jesse Hoey, Stephen McKenna and Emanuele Trucco (Eds.), Proceedings of the British Machine Vision Conference, BMVA Press, September.
[8] Guanglong Du, Ping Zhang, Jianhua Mai and Zeling Li. Markerless Kinect-Based Hand Tracking for Robot Teleoperation. International Journal of Advanced Robotic Systems, Vol. 9(36), May.
[9] Zhou Ren, Junsong Yuan, Jingjing Meng, Zhengyou Zhang. Robust Part-Based Hand Gesture Recognition Using Kinect Sensor. IEEE Transactions on Multimedia, Vol. 15, No. 5, pp. 1-11, Aug.
[10] Jagdish L. Raheja, Ankit Chaudhary, Kunal Singal. Tracking of Fingertips and Centre of Palm using KINECT. In Proceedings of the 3rd IEEE International Conference on Computational Intelligence, Modelling and Simulation, Malaysia, Sep. 2011.
[11] Valentino Frati, Domenico Prattichizzo. Using Kinect for hand tracking and rendering in wearable haptics.
IEEE World Haptics Conference, June, Istanbul, Turkey.
[12] Yang, D., Jin, L.W., Yin, J., et al. An effective robust fingertip detection method for finger writing character recognition system. Proceedings of the Fourth International Conference on Machine Learning and Cybernetics, Guangzhou, China, 2005.
[13] Oka, K., Sato, Y., Koike, H. Real-time Tracking of Multiple Fingertips and Gesture Recognition for Augmented Desk Interface Systems. Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition (FGR'02), Washington, D.C., USA, May 2002.
[14] Ying, H., Song, J., Ren, X., Wang, W. Fingertip Detection and Tracking Using 2D and 3D Information. Proceedings of the Seventh World Congress on Intelligent Control and Automation, Chongqing, China, 2008.
[15] Crowley, J. L., Berard, F., Coutaz, J. Finger Tracking As an Input Device for Augmented Reality. Proceedings of the International Workshop on Automatic Face and Gesture Recognition, Zurich, Switzerland, 1995.
[16] Raheja, J. L., Das, K., Chaudhary, A. An Efficient Real Time Method of Fingertip Detection. Proceedings of the 7th International Conference on Trends in Industrial Measurements and Automation (TIMA 2011), CSIR Complex, Chennai, India, 6-8 Jan. 2011.
[17] K. K. Biswas, Saurav Kumar Basu. Gesture Recognition using Microsoft Kinect. Proceedings of the 5th International Conference on Automation, Robotics and Applications, Dec. 6-8, 2011, Wellington, New Zealand.
[18] Caifeng Shan. Gesture Control for Consumer Electronics. In Ling Shao et al. (Eds.), Multimedia Interaction and Intelligent User Interfaces, Advances in Pattern Recognition, Springer London, 2010.
More informationAvailable online at ScienceDirect. Procedia Computer Science 50 (2015 )
Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 50 (2015 ) 503 510 2nd International Symposium on Big Data and Cloud Computing (ISBCC 15) Virtualizing Electrical Appliances
More informationRecognizing Military Gestures: Developing a Gesture Recognition Interface. Jonathan Lebron
Recognizing Military Gestures: Developing a Gesture Recognition Interface Jonathan Lebron March 22, 2013 Abstract The field of robotics presents a unique opportunity to design new technologies that can
More informationTeam Description Paper: HuroEvolution Humanoid Robot for Robocup 2010 Humanoid League
Team Description Paper: HuroEvolution Humanoid Robot for Robocup 2010 Humanoid League Chung-Hsien Kuo 1, Hung-Chyun Chou 1, Jui-Chou Chung 1, Po-Chung Chia 2, Shou-Wei Chi 1, Yu-De Lien 1 1 Department
More informationComparison of Head Movement Recognition Algorithms in Immersive Virtual Reality Using Educative Mobile Application
Comparison of Head Recognition Algorithms in Immersive Virtual Reality Using Educative Mobile Application Nehemia Sugianto 1 and Elizabeth Irenne Yuwono 2 Ciputra University, Indonesia 1 nsugianto@ciputra.ac.id
More informationIntegration of Hand Gesture and Multi Touch Gesture with Glove Type Device
2016 4th Intl Conf on Applied Computing and Information Technology/3rd Intl Conf on Computational Science/Intelligence and Applied Informatics/1st Intl Conf on Big Data, Cloud Computing, Data Science &
More informationWhat was the first gestural interface?
stanford hci group / cs247 Human-Computer Interaction Design Studio What was the first gestural interface? 15 January 2013 http://cs247.stanford.edu Theremin Myron Krueger 1 Myron Krueger There were things
More informationThe 8 th International Scientific Conference elearning and software for Education Bucharest, April 26-27, / X
The 8 th International Scientific Conference elearning and software for Education Bucharest, April 26-27, 2012 10.5682/2066-026X-12-103 DEVELOPMENT OF A NATURAL USER INTERFACE FOR INTUITIVE PRESENTATIONS
More informationKINECT HANDS-FREE. Rituj Beniwal. Department of Electrical Engineering Indian Institute of Technology, Kanpur. Pranjal Giri
KINECT HANDS-FREE Rituj Beniwal Pranjal Giri Agrim Bari Raman Pratap Singh Akash Jain Department of Aerospace Engineering Indian Institute of Technology, Kanpur Atharva Mulmuley Department of Chemical
More informationDevelopment of a Robotic Vehicle and Implementation of a Control Strategy for Gesture Recognition through Leap Motion device
RESEARCH ARTICLE OPEN ACCESS Development of a Robotic Vehicle and Implementation of a Control Strategy for Gesture Recognition through Leap Motion device 1 Dr. V. Nithya, 2 T. Sree Harsha, 3 G. Tarun Kumar,
More informationAugmented Reality using Hand Gesture Recognition System and its use in Virtual Dressing Room
International Journal of Innovation and Applied Studies ISSN 2028-9324 Vol. 10 No. 1 Jan. 2015, pp. 95-100 2015 Innovative Space of Scientific Research Journals http://www.ijias.issr-journals.org/ Augmented
More informationCost Oriented Humanoid Robots
Cost Oriented Humanoid Robots P. Kopacek Vienna University of Technology, Intelligent Handling and Robotics- IHRT, Favoritenstrasse 9/E325A6; A-1040 Wien kopacek@ihrt.tuwien.ac.at Abstract. Currently there
More informationRobot manipulation based on Leap Motion - For small and medium sized enterprises Ulrica Agell
DEGREE PROJECT FOR MASTER OF SCIENCE WITH SPECIALIZATION IN ROBOTICS DEPARTMENT OF ENGINEERING SCIENCE UNIVERSITY WEST Robot manipulation based on Leap Motion - For small and medium sized enterprises Ulrica
More informationTraining NAO using Kinect
Training NAO using Kinect Michalis Chartomatsidis, Emmanouil Androulakis, Ergina Kavallieratou University of the Aegean Samos, Dept of Information & Communications Systems, Greece kavallieratou@aegean.gr
More informationChangjiang Yang. Computer Vision, Pattern Recognition, Machine Learning, Robotics, and Scientific Computing.
Changjiang Yang Mailing Address: Department of Computer Science University of Maryland College Park, MD 20742 Lab Phone: (301)405-8366 Cell Phone: (410)299-9081 Fax: (301)314-9658 Email: yangcj@cs.umd.edu
More informationCS415 Human Computer Interaction
CS415 Human Computer Interaction Lecture 10 Advanced HCI Universal Design & Intro to Cognitive Models October 30, 2017 Sam Siewert Summary of Thoughts on Intelligent Transportation Systems Collective Wisdom
More informationHumanoid robot. Honda's ASIMO, an example of a humanoid robot
Humanoid robot Honda's ASIMO, an example of a humanoid robot A humanoid robot is a robot with its overall appearance based on that of the human body, allowing interaction with made-for-human tools or environments.
More informationResearch Seminar. Stefano CARRINO fr.ch
Research Seminar Stefano CARRINO stefano.carrino@hefr.ch http://aramis.project.eia- fr.ch 26.03.2010 - based interaction Characterization Recognition Typical approach Design challenges, advantages, drawbacks
More informationA Smart Home Design and Implementation Based on Kinect
2018 International Conference on Physics, Computing and Mathematical Modeling (PCMM 2018) ISBN: 978-1-60595-549-0 A Smart Home Design and Implementation Based on Kinect Jin-wen DENG 1,2, Xue-jun ZHANG
More informationE90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright
E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7
More informationA Study on Motion-Based UI for Running Games with Kinect
A Study on Motion-Based UI for Running Games with Kinect Jimin Kim, Pyeong Oh, Hanho Lee, Sun-Jeong Kim * Interaction Design Graduate School, Hallym University 1 Hallymdaehak-gil, Chuncheon-si, Gangwon-do
More informationBeyond Actuated Tangibles: Introducing Robots to Interactive Tabletops
Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Sowmya Somanath Department of Computer Science, University of Calgary, Canada. ssomanat@ucalgary.ca Ehud Sharlin Department of Computer
More informationA SURVEY ON GESTURE RECOGNITION TECHNOLOGY
A SURVEY ON GESTURE RECOGNITION TECHNOLOGY Deeba Kazim 1, Mohd Faisal 2 1 MCA Student, Integral University, Lucknow (India) 2 Assistant Professor, Integral University, Lucknow (india) ABSTRACT Gesture
More informationThe Control of Avatar Motion Using Hand Gesture
The Control of Avatar Motion Using Hand Gesture ChanSu Lee, SangWon Ghyme, ChanJong Park Human Computing Dept. VR Team Electronics and Telecommunications Research Institute 305-350, 161 Kajang-dong, Yusong-gu,
More informationVOICE CONTROL BASED PROSTHETIC HUMAN ARM
VOICE CONTROL BASED PROSTHETIC HUMAN ARM Ujwal R 1, Rakshith Narun 2, Harshell Surana 3, Naga Surya S 4, Ch Preetham Dheeraj 5 1.2.3.4.5. Student, Department of Electronics and Communication Engineering,
More informationProseminar Roboter und Aktivmedien. Outline of today s lecture. Acknowledgments. Educational robots achievements and challenging
Proseminar Roboter und Aktivmedien Educational robots achievements and challenging Lecturer Lecturer Houxiang Houxiang Zhang Zhang TAMS, TAMS, Department Department of of Informatics Informatics University
More informationControlling Humanoid Robot Using Head Movements
Volume-5, Issue-2, April-2015 International Journal of Engineering and Management Research Page Number: 648-652 Controlling Humanoid Robot Using Head Movements S. Mounica 1, A. Naga bhavani 2, Namani.Niharika
More informationCognitive robots and emotional intelligence Cloud robotics Ethical, legal and social issues of robotic Construction robots Human activities in many
Preface The jubilee 25th International Conference on Robotics in Alpe-Adria-Danube Region, RAAD 2016 was held in the conference centre of the Best Western Hotel M, Belgrade, Serbia, from 30 June to 2 July
More informationTouch & Gesture. HCID 520 User Interface Software & Technology
Touch & Gesture HCID 520 User Interface Software & Technology Natural User Interfaces What was the first gestural interface? Myron Krueger There were things I resented about computers. Myron Krueger
More informationPinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data
Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Hrvoje Benko Microsoft Research One Microsoft Way Redmond, WA 98052 USA benko@microsoft.com Andrew D. Wilson Microsoft
More informationInteracting within Virtual Worlds (based on talks by Greg Welch and Mark Mine)
Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine) Presentation Working in a virtual world Interaction principles Interaction examples Why VR in the First Place? Direct perception
More informationZJUDancer Team Description Paper Humanoid Kid-Size League of Robocup 2015
ZJUDancer Team Description Paper Humanoid Kid-Size League of Robocup 2015 Yu DongDong, Liu Yun, Zhou Chunlin, and Xiong Rong State Key Lab. of Industrial Control Technology, Zhejiang University, Hangzhou,
More informationInternational Conference on Advances in Mechanical Engineering and Industrial Informatics (AMEII 2015)
International Conference on Advances in Mechanical Engineering and Industrial Informatics (AMEII 2015) Equipment body feeling maintenance teaching system Research Based on Kinect Fushuan Wu 1, a, Jianren
More informationBaset Adult-Size 2016 Team Description Paper
Baset Adult-Size 2016 Team Description Paper Mojtaba Hosseini, Vahid Mohammadi, Farhad Jafari 2, Dr. Esfandiar Bamdad 1 1 Humanoid Robotic Laboratory, Robotic Center, Baset Pazhuh Tehran company. No383,
More informationHUMAN MACHINE INTERFACE
Journal homepage: www.mjret.in ISSN:2348-6953 HUMAN MACHINE INTERFACE Priyesh P. Khairnar, Amin G. Wanjara, Rajan Bhosale, S.B. Kamble Dept. of Electronics Engineering,PDEA s COEM Pune, India priyeshk07@gmail.com,
More informationMedical Robotics. Part II: SURGICAL ROBOTICS
5 Medical Robotics Part II: SURGICAL ROBOTICS In the last decade, surgery and robotics have reached a maturity that has allowed them to be safely assimilated to create a new kind of operating room. This
More informationA Dynamic Fitting Room Based on Microsoft Kinect and Augmented Reality Technologies
A Dynamic Fitting Room Based on Microsoft Kinect and Augmented Reality Technologies Hsien-Tsung Chang, Yu-Wen Li, Huan-Ting Chen, Shih-Yi Feng, Tsung-Tien Chien Department of Computer Science and Information
More informationProviding The Natural User Interface(NUI) Through Kinect Sensor In Cloud Computing Environment
IJIRST International Journal for Innovative Research in Science & Technology Volume 1 Issue 7 December 2014 ISSN (online): 2349-6010 Providing The Natural User Interface(NUI) Through Kinect Sensor In Cloud
More informationLocalized Space Display
Localized Space Display EE 267 Virtual Reality, Stanford University Vincent Chen & Jason Ginsberg {vschen, jasong2}@stanford.edu 1 Abstract Current virtual reality systems require expensive head-mounted
More informationDevelopment of an Intuitive Interface for PC Mouse Operation Based on Both Arms Gesture
Development of an Intuitive Interface for PC Mouse Operation Based on Both Arms Gesture Nobuaki Nakazawa 1*, Toshikazu Matsui 1, Yusaku Fujii 2 1 Faculty of Science and Technology, Gunma University, 29-1
More informationPerception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision
11-25-2013 Perception Vision Read: AIMA Chapter 24 & Chapter 25.3 HW#8 due today visual aural haptic & tactile vestibular (balance: equilibrium, acceleration, and orientation wrt gravity) olfactory taste
More informationService Robots in an Intelligent House
Service Robots in an Intelligent House Jesus Savage Bio-Robotics Laboratory biorobotics.fi-p.unam.mx School of Engineering Autonomous National University of Mexico UNAM 2017 OUTLINE Introduction A System
More informationTeam KMUTT: Team Description Paper
Team KMUTT: Team Description Paper Thavida Maneewarn, Xye, Pasan Kulvanit, Sathit Wanitchaikit, Panuvat Sinsaranon, Kawroong Saktaweekulkit, Nattapong Kaewlek Djitt Laowattana King Mongkut s University
More informationAugmented Keyboard: a Virtual Keyboard Interface for Smart glasses
Augmented Keyboard: a Virtual Keyboard Interface for Smart glasses Jinki Jung Jinwoo Jeon Hyeopwoo Lee jk@paradise.kaist.ac.kr zkrkwlek@paradise.kaist.ac.kr leehyeopwoo@paradise.kaist.ac.kr Kichan Kwon
More informationAugmented Desk Interface. Graduate School of Information Systems. Tokyo , Japan. is GUI for using computer programs. As a result, users
Fast Tracking of Hands and Fingertips in Infrared Images for Augmented Desk Interface Yoichi Sato Institute of Industrial Science University oftokyo 7-22-1 Roppongi, Minato-ku Tokyo 106-8558, Japan ysato@cvl.iis.u-tokyo.ac.jp
More informationTouch & Gesture. HCID 520 User Interface Software & Technology
Touch & Gesture HCID 520 User Interface Software & Technology What was the first gestural interface? Myron Krueger There were things I resented about computers. Myron Krueger There were things I resented
More informationAdvancements in Gesture Recognition Technology
IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka
More informationAn Evaluation of Automatic License Plate Recognition Vikas Kotagyale, Prof.S.D.Joshi
An Evaluation of Automatic License Plate Recognition Vikas Kotagyale, Prof.S.D.Joshi Department of E&TC Engineering,PVPIT,Bavdhan,Pune ABSTRACT: In the last decades vehicle license plate recognition systems
More informationAugmented and Virtual Reality
CS-3120 Human-Computer Interaction Augmented and Virtual Reality Mikko Kytö 7.11.2017 From Real to Virtual [1] Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS
More informationHand Gesture Recognition Using Radial Length Metric
Hand Gesture Recognition Using Radial Length Metric Warsha M.Choudhari 1, Pratibha Mishra 2, Rinku Rajankar 3, Mausami Sawarkar 4 1 Professor, Information Technology, Datta Meghe Institute of Engineering,
More informationMSMS Software for VR Simulations of Neural Prostheses and Patient Training and Rehabilitation
MSMS Software for VR Simulations of Neural Prostheses and Patient Training and Rehabilitation Rahman Davoodi and Gerald E. Loeb Department of Biomedical Engineering, University of Southern California Abstract.
More informationHuman Computer Interaction by Gesture Recognition
IOSR Journal of Electronics and Communication Engineering (IOSR-JECE) e-issn: 2278-2834,p- ISSN: 2278-8735.Volume 9, Issue 3, Ver. V (May - Jun. 2014), PP 30-35 Human Computer Interaction by Gesture Recognition
More informationJane Li. Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute
Jane Li Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute State one reason for investigating and building humanoid robot (4 pts) List two
More informationStereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays. Habib Abi-Rached Thursday 17 February 2005.
Stereo-based Hand Gesture Tracking and Recognition in Immersive Stereoscopic Displays Habib Abi-Rached Thursday 17 February 2005. Objective Mission: Facilitate communication: Bandwidth. Intuitiveness.
More informationWHITE PAPER Need for Gesture Recognition. April 2014
WHITE PAPER Need for Gesture Recognition April 2014 TABLE OF CONTENTS Abstract... 3 What is Gesture Recognition?... 4 Market Trends... 6 Factors driving the need for a Solution... 8 The Solution... 10
More informationZJUDancer Team Description Paper Humanoid Kid-Size League of Robocup 2014
ZJUDancer Team Description Paper Humanoid Kid-Size League of Robocup 2014 Yu DongDong, Xiang Chuan, Zhou Chunlin, and Xiong Rong State Key Lab. of Industrial Control Technology, Zhejiang University, Hangzhou,
More information