Design and Implementation of an Intuitive Gesture Recognition System Using a Hand-held Device
Hung-Chi Chu 1, Yuan-Chin Cheng 1

1 Department of Information and Communication Engineering, Chaoyang University of Technology, Taiwan {hcchu, s }@cyut.edu.tw

Abstract. As sensing technologies have developed, the operating interfaces of information equipment have changed: traditional instruction-based operation has evolved into intuitive operation, offering users a convenient experience without the need to learn in advance. Several kinds of intuitive user interface functions have already been built into smartphones, such as automatic portrait/landscape switching and face-up/face-down sensing. However, such functions cannot handle complicated motion sensing. This study therefore developed a new intuitive gesture recognition system, in which the G-sensor inside a phone records gestures, and the recorded gestures are identified by our gesture recognition algorithm.

Keywords: gesture recognition, G-sensor, accelerometer.

1 Introduction

In recent years, there have been many studies on gesture recognition technologies in various domains, such as computer vision [1], intelligent gloves [2], inertial motion tracking systems [3], and mobile devices [4, 5]. The equipment used differs across these technologies. Studies based on computer vision use infrared sensing to measure body movements and gestures, e.g. with the Wii Remote [6, 7]. Studies of intelligent gloves use multiple sensors (accelerometers) to measure fine-grained gestures [2]. The accelerometer is also commonly used in studies of inertial systems, together with gyroscopes and magnetometers for motion detection. In mobile devices, the accelerometer serves as the sensing element for identifying gestures [3, 4].
However, computer vision technology, intelligent gloves, and inertial motion tracking systems require large amounts of data and special data acquisition equipment, and are therefore unsuitable for mobile devices. Among mobile devices, smartphones have developed rapidly: they have become smaller, gained more functions, and their hardware is equipped with diversified sensors, such as G-sensors, gyroscopes, magnetometers, and light sensors. Smartphone operating systems include Android, Windows Mobile, iOS 4, etc.; among them, the Android system released by Google has become a popular application target platform in recent years. This system has the following advantages.
(1) Open operating system: users can add functions or modify the operating system according to individual needs, and the operating system is provided with open source code, which can be modified without authorization. (2) High freedom of software development: any user can design application programs for this operating system, and the application programs can be shared or sold through the Android Market service. (3) Integration with Google services: the operating system can be integrated with cloud services such as Gmail and Google Maps. Based on the above advantages, users can develop application programs for this system. However, there are several technical challenges in the basic interactions of gesture recognition on mobile phones. First, gesture recognition lacks a standardized and widely used gesture vocabulary. Secondly, interactions must be handled spontaneously: when a user inputs a gesture on a smartphone, it must be matched against the predefined gestures in the vocabulary, and the action corresponding to the gesture must be executed immediately. Thirdly, the smartphone platform is tightly constrained in cost and system resources, including computing power and battery capacity. In addition, most gesture tracking recognition methods use an accelerometer to capture three-axis acceleration variation values, obtain the total displacement from these values by computing Euclidean distances, and then use the three-axis acceleration variation values and the total displacement as the conditions for gesture tracking. However, if gesture tracking is identified using only these two conditions, the failure rate of gesture tracking recognition increases as errors accumulate.
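As a minimal sketch of the baseline approach described above, the total displacement can be taken as the accumulated Euclidean norm of the per-interval three-axis variation values (the function name is illustrative, not from the paper):

```python
import math

def total_displacement(deltas):
    """Accumulate the Euclidean norm of per-interval (dx, dy, dz) variations."""
    return sum(math.sqrt(dx * dx + dy * dy + dz * dz) for dx, dy, dz in deltas)

# Two intervals: a (3, 4, 0) variation and a (0, 0, 1) variation.
print(total_displacement([(3, 4, 0), (0, 0, 1)]))  # -> 6.0
```

Because every interval's measurement error enters this sum, small per-interval errors accumulate into the total, which is the error-accumulation problem noted above.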
Although intuitive user interface systems have been applied to smartphones in recent years, complicated automatic motion sensing and control cannot yet be performed. Therefore, this study developed a new intuitive user interface system, in which a G-sensor inside a phone records gestures and the gestures are identified by the proposed gesture recognition algorithm. The proposed algorithm addresses the abovementioned challenges and improves the precision and success rate of gesture recognition.

2 Related works

In general inertial motion tracking systems, the motion direction is measured by sensors [3]. The workable sensors include accelerometers, gyroscopes, and magnetometers: the accelerometer measures acceleration, the gyroscope measures rotation rate, and the magnetometer measures the direction of motion. Inertial Measurement Units (IMUs) are therefore composed of these three components, and changes in the direction of motion are measured by the gyroscope or magnetometer. However, the inertial measurement approach has two disadvantages: the gyroscope is expensive, and the magnetometer is easily disturbed by other electronic equipment, which results in errors. Therefore, [3] used three three-axis accelerometers to measure movements; the algorithm proposed in that system is combined with an extrapolation method, the Least Squares Method (LSM), and a Least Squares Problem (LSP) formulation, and the calculated error value is minimized using Lagrange multipliers. A highly efficient recognition algorithm called uWave, which uses only a single three-axis accelerometer to identify gestures, is proposed in [4, 5]. The algorithm requires only one training sample per gesture mode as the identification condition, which allows users to employ personalized gestures in real operation. The core of the uWave algorithm is dynamic time warping (DTW) [8], which has been extensively studied and used in voice recognition systems. The uWave algorithm consists of acceleration quantization, dynamic time warping, and template adaptation; the dynamic time warping step aligns two time series to determine the optimal correspondence between time points, and the error magnitude of the optimal correspondence is used as the condition for identifying gestures. In [2], five two-axis accelerometers are used in an intelligent glove for gesture recognition; the accelerometers are mounted at the five fingertips of the glove, allowing finger movements to be precisely identified. However, considering the cost and size of mobile equipment, the hardware cannot be equipped with several accelerometers. Furthermore, most previous studies of gesture recognition focused on detecting the outline of hand movements rather than finger movements. In addition, many studies of gesture recognition have used computer vision technology [1], such as the Wii Remote [6, 7] and Kinect [9]. The body sensing principle of the Wii Remote includes two functions: direction positioning and motion sensing. Direction positioning refers to tracking and fixing coordinates using infrared sensing.
The operation mode is as follows: LEDs inside the optical sensor bar emit infrared rays, and the infrared CMOS sensor at the front end of the Wii Remote receives the infrared spots from the optical sensor bar to determine the position of, and distance between, the optical sensor bar and the remote controller. In addition, the relative position of the user is fixed using the received infrared spots; based on this infrared indoor location technology, direction positioning can be obtained. Motion sensing detects movements and rotations in three-dimensional space: the gradient and travel direction are judged from the voltage variation values of the x-axis, y-axis, and z-axis collected by the Wii Remote's built-in ADXL330 accelerometer. Body sensing operation is thus attained by combining direction positioning with motion sensing. The Kinect, the interface device of the Microsoft Xbox 360, has three kinds of lenses: an RGB color camera, an infrared transmitter, and an infrared CMOS camera. The infrared transmitter and the infrared CMOS camera form a 3D depth sensor, which is the major component the Kinect uses to detect the user's movements. With these components, the Kinect can capture three elements at one time: a color image, a 3D depth image, and a sound signal. The Kinect sensing technology produces depth images based on PrimeSensor technology [10]. Kinect operation comprises movement tracking, voice recognition, and a built-in motor. In movement tracking, the infrared lens built into the Kinect transmits pulsed infrared light, the infrared CMOS camera receives the infrared rays reflected within the scan limit, and the PS1080 sensing chip analyzes the readings to judge the user's position. The sensing chip marks the depth field of all scanned objects, using different colors for different distances between the user and the Kinect, i.e. the user is marked in different colors according to distance. The user's body is then separated from background objects, the user's pose is judged by the image identification system, and a 3D depth image is formed. The 3D depth image data are converted into a skeleton drawing, and the user's movement is identified by the skeleton tracking system.

3 Gesture recognition system

3.1 System structure

Figure 1 is the system operation environment structure diagram. Users are provided with a mobile phone that has various sensing elements, such as a light sensor, a G-sensor, and a magnetometer; in this study, the G-sensor is used. The server is equipped with a database and the gesture recognition algorithm. The database stores several sets of gestures and the corresponding execution actions, while the gesture recognition algorithm processes the tracking data sent by the user and identifies the gestures. The user inputs gesture tracking data via the G-sensor inside the phone, and the data are sent to the server through a Wi-Fi or 3G/3.5G network. The server then computes the identification condition data using the gesture recognition algorithm and reads the several preset sets of gestures from the database to identify the gesture. Finally, the recognition result is sent back to the user and displayed on the phone, and the action corresponding to the gesture is executed.

Figure 1. System operation environment structure diagram

3.2 Gesture recognition algorithm

In this algorithm, the tracking data are collected at intervals of 200 ms; after collection, the recorded tracking data are divided into x-axis, y-axis, and z-axis parts.
The gesture recognition calculation is executed on the three axes' data and comprises five steps: (1) data initialization, (2) re-recording of single-axis acceleration, (3) calculation of interval acceleration variation, (4) calculation of interval displacement, and (5) gesture recognition.
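The collection step above can be sketched as follows: readings arrive as (x, y, z) tuples every 200 ms and are split into per-axis series before the five steps run (the function name is illustrative, not the paper's code):

```python
def split_axes(samples):
    """Split a list of (x, y, z) G-sensor readings into three per-axis lists."""
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    zs = [s[2] for s in samples]
    return xs, ys, zs

# Two 200 ms samples:
xs, ys, zs = split_axes([(0.1, 0.0, 9.8), (0.3, -0.1, 9.7)])
```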
(1) Data initialization: the fundamental purpose is to arrange the captured three-axis data. For gesture data captured by the smartphone's Android system, the acceleration data decrease as the device shifts rightwards along the x-axis and increase as it shifts leftwards; likewise, the data decrease as the device moves up along the y-axis and increase as it moves down. Data initialization therefore converts the data so that they increase when shifting rightwards and decrease when shifting leftwards. (2) Record single-axis acceleration again: the fundamental purpose is to re-record the data after the initialization in step one. (3) Calculate interval acceleration variation: suppose the x-axis acceleration samples are {0.1, 0.3, 0.4, 0.2, -0.2}, as shown in Figure 2.

Figure 2. Acceleration sample values of the x-axis

The interval acceleration variation is given by

Δa_t = a_t − a_(t−1) , (1)

so the x-axis acceleration variation at 200 ms is Δa = 0.3 − 0.1 = 0.2; the rest may be deduced by analogy. (4) Calculate interval displacement: the interval displacement is

s_t = (1/2) · Δa_t · Δt² , (2)

from which the displacement at each time point can be obtained. (5) Gesture recognition: when new tracking data have been processed through the abovementioned steps, the interval displacements are summed and used as the judgment condition. The gesture comparison is carried out by reading the several preset sets of gesture tracking conditions from the database, based on the error rate

E = |S_new − S_pre| / S_pre , (3)
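Since the original equations were not preserved in this copy, the following sketch assumes Eq. (1) is the first difference of consecutive samples, Eq. (2) the kinematic term ½·Δa·Δt², and Eq. (3) the relative displacement error; all names are illustrative:

```python
DT = 0.2  # sampling interval: 200 ms

def interval_variation(samples):
    """Assumed Eq. (1): first difference of consecutive acceleration samples."""
    return [samples[i] - samples[i - 1] for i in range(1, len(samples))]

def interval_displacement(variations, dt=DT):
    """Assumed Eq. (2): displacement contributed by each 200 ms interval."""
    return [0.5 * dv * dt * dt for dv in variations]

def error_rate(s_new, s_pre):
    """Assumed Eq. (3): relative error between total displacements."""
    return abs(s_new - s_pre) / s_pre

def recognize(s_new, templates):
    """Step (5): return the preset gesture with the smallest error rate."""
    return min(templates, key=lambda g: error_rate(s_new, templates[g]))

# Worked example from Figure 2 (x-axis samples):
x = [0.1, 0.3, 0.4, 0.2, -0.2]
dv = interval_variation(x)                      # first value: 0.3 - 0.1 = 0.2
s_total = sum(abs(s) for s in interval_displacement(dv))
```

With hypothetical template displacements such as {"down": 1.0, "circle": 2.5}, a new track whose total displacement is 1.07 would be matched to "down", since its error rate (7%) is the minimum.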
where E is the error rate, S_new is the total displacement of the new track, and S_pre is the total displacement of the preset track. The error values between the new track and all preset tracks in the database are calculated using Eq. (3). Finally, the preset gesture that yields the minimum error value with respect to the new track is taken as the recognized gesture.

3.3 System operation flow chart

Figure 3 shows the system operation flow. When a user sends gesture tracking data to the server, Step 1 of the gesture recognition algorithm is executed first, and the gesture data are initialized. Step 2 is then executed, and the initialized data are re-recorded. The interval acceleration variation is calculated in Step 3, and the interval displacement in Step 4. Finally, gesture recognition is executed: all preset gesture data in the database are read and compared with the new gesture data. If the error value between the new gesture data and a gesture in the database is the minimum, the execution action corresponding to that gesture is sent to the user and executed. If the comparison fails, the result is sent directly to the user.

Figure 3. System operation flow

4 Experimental results and analysis

4.1 Experimental results

In the experiment, the equipment for capturing gesture tracking data is an HTC Desire running the Android 2.3 operating system, released by Google on December 7, 2010. This version has a revised UI and added hardware support [11], and it provides a new API (TYPE_LINEAR_ACCELERATION) for capturing the value of the G-sensor. The main difference between this API and version 2.2 is that the captured acceleration value excludes the gravity component. The G-sensor chip built into the phone is a BMA150, and the captured acceleration value is within ±9.8g.
The server host is a DELL OptiPlex 745MT with an Intel Core 2 Quad Q8400 CPU and 4 GB of memory. Figure 4 shows the gestures captured in the experiment; the gesture data are captured and the gesture tracking diagrams are redrawn according to the four gestures. In this figure, 1 represents moving down and 2 represents a circle drawn clockwise.

Figure 4. Gestures (dot is start, arrowhead is end)

Figure 5 (a) shows the redrawn two-axis tracking diagram after calculation and comparison of gesture 1. As seen in Figure 5, the error value of the two-axis track is 6.92%; the track lines in the two-axis plane differ only slightly, owing to a little shaking or slight error in movement when capturing the tracking data. The error value in the three-axis tracking diagram is 11.06%, so there is a significant error between the y-axis values of the movement track and those of the initial track, as shown in Figure 5 (b). This is because, when a mobile phone is used to capture gesture data, the acceleration value is slightly affected by gravity, which influences the track data. Therefore, in the tracking data of gesture 1 there is a slight error in the tracks, as the y-axis is influenced by gravity.

Figure 5. (a) (x,y) tracks of gesture 1, (b) (x,y,z) tracks of gesture 1

Figure 6 (a) shows the two-axis tracks after calculation and comparison of gesture 2; the error value of the redrawn two-axis track is 5.90%, a slight error resulting from shaking or error in movement when capturing the gesture data. The error value in the three-axis track diagram is 17.01%, so the movement track also has slight errors, as the initial track does, as shown in Figure 6 (b). The track error increases because the track is slightly affected by gravity and by slight errors in movement.
Figure 6. (a) (x,y) tracks of gesture 2, (b) (x,y,z) tracks of gesture 2

4.2 Experimental analysis

According to the experimental results, the acceleration value captured by the new API of the Android 2.3 operating system is still slightly influenced by gravity; the API cannot completely exclude the gravity component. The track data error increases as the acceleration values of the x, y, and z axes are slightly influenced by gravity. The gesture tracking data of gestures 3 and 4 used in the experiment are likewise slightly influenced by gravity, so their error increases. Table 1 shows the error rates of the gestures analyzed by Eq. (4) after ten tests, where Ē is the mean error rate, E_max is the maximum error rate, and E_min is the minimum error rate. The mean error rates of the gesture tracking data captured for gestures 1 and 2 are 13.76% and 13.42%, respectively. Since the new Android 2.3 API cannot completely eliminate the effect of gravity, the acceleration values of the x, y, and z axes are slightly affected, and the failure rate of gesture recognition therefore increases. The mean error rate is

Ē = (1/n) · Σ_{i=1}^{n} E_i , (4)

where E_i is the total error rate of the i-th test calculated by Eq. (3) and n is the number of tests.

Table 1. Error rate

Gesture | Ē | E_max | E_min
1 | 13.76% | 26.3% | 8.25%
2 | 13.42% | 17.01% | 8.73%
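Reading Eq. (4) as the arithmetic mean of the per-test error rates, the Table 1 statistics can be sketched as follows (the sample values are hypothetical, not the experiment's raw data):

```python
def summarize_errors(errors):
    """Mean, maximum, and minimum error rate over n repeated tests."""
    n = len(errors)
    return sum(errors) / n, max(errors), min(errors)

# Ten hypothetical per-test error rates (from Eq. 3) for one gesture:
tests = [0.12, 0.09, 0.26, 0.10, 0.14, 0.11, 0.08, 0.16, 0.13, 0.15]
mean_e, max_e, min_e = summarize_errors(tests)
```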
5 Conclusions

This study developed a new intuitive gesture recognition system based on the Android operating system. The system realizes intuitive gesture recognition through the gesture recognition algorithm designed in this study, and users can bind functions to gestures, thus realizing intuitive gesture operation. One challenge observed in the experiment is that the acceleration value captured by the hand-held device was slightly affected by gravity, causing the error value to increase. Follow-up work will therefore attempt to solve this problem and reduce the gravity effect in order to increase the recognition success rate.

Acknowledgement

This work was supported in part by the National Science Council, Taiwan, under grant NSC E MY3.

References

[1] Y. Wu and T. S. Huang, Vision-Based Gesture Recognition: A Review, in: Proceedings of the International Gesture Workshop on Gesture-Based Communication in Human-Computer Interaction, Springer-Verlag.
[2] J. K. Perng, B. Fisher, S. Hollar, K. S. J. Pister, Acceleration Sensing Glove (ASG), The Third International Symposium on Wearable Computers.
[3] C.-W. Yi, C.-M. Su, W.-T. Chai, J.-L. Huang, T.-C. Chiang, G-Constellations: G-Sensor Motion Tracking Systems, IEEE 71st Vehicular Technology Conference (VTC Spring).
[4] J. Liu, Z. Wang, L. Zhong, J. Wickramasuriya, V. Vasudevan, uWave: Accelerometer-based Personalized Gesture Recognition and Its Applications, IEEE International Conference on Pervasive Computing and Communications.
[5] J. Liu, L. Zhong, J. Wickramasuriya, V. Vasudevan, uWave: Accelerometer-based personalized gesture recognition and its applications, Pervasive and Mobile Computing, Volume 5, Issue 6, December 2009.
[6] T. Petric, A. Gams, A. Ude, L. Zlajpah, Real-time 3D marker tracking with a WIIMOTE stereo vision system: Application to robotic throwing, IEEE 19th International Workshop on Robotics in Alpe-Adria-Danube Region (RAAD).
[7] P.-W. Chen, K.-S. Ou, K.-S. Chen, IR Indoor Localization and Wireless Transmission for Motion Control in Smart Building Applications based on Wiimote Technology, in: Proceedings of the SICE Annual Conference, Taiwan.
[8] C. S. Myers, L. R. Rabiner, A comparative study of several dynamic time-warping algorithms for connected word recognition, The Bell System Technical Journal, vol. 60.
[9] Kinect for Xbox 360.
[10] PrimeSense Supplies 3-D-Sensing Technology to Project Natal for Xbox 360.
[11] Wikipedia, Android.
More informationMaster Thesis Presentation Future Electric Vehicle on Lego By Karan Savant. Guide: Dr. Kai Huang
Master Thesis Presentation Future Electric Vehicle on Lego By Karan Savant Guide: Dr. Kai Huang Overview Objective Lego Car Wifi Interface to Lego Car Lego Car FPGA System Android Application Conclusion
More informationMulti-sensory Tracking of Elders in Outdoor Environments on Ambient Assisted Living
Multi-sensory Tracking of Elders in Outdoor Environments on Ambient Assisted Living Javier Jiménez Alemán Fluminense Federal University, Niterói, Brazil jjimenezaleman@ic.uff.br Abstract. Ambient Assisted
More informationThe 8 th International Scientific Conference elearning and software for Education Bucharest, April 26-27, / X
The 8 th International Scientific Conference elearning and software for Education Bucharest, April 26-27, 2012 10.5682/2066-026X-12-103 DEVELOPMENT OF A NATURAL USER INTERFACE FOR INTUITIVE PRESENTATIONS
More information3D Interaction using Hand Motion Tracking. Srinath Sridhar Antti Oulasvirta
3D Interaction using Hand Motion Tracking Srinath Sridhar Antti Oulasvirta EIT ICT Labs Smart Spaces Summer School 05-June-2013 Speaker Srinath Sridhar PhD Student Supervised by Prof. Dr. Christian Theobalt
More informationDevelopment of a Robotic Vehicle and Implementation of a Control Strategy for Gesture Recognition through Leap Motion device
RESEARCH ARTICLE OPEN ACCESS Development of a Robotic Vehicle and Implementation of a Control Strategy for Gesture Recognition through Leap Motion device 1 Dr. V. Nithya, 2 T. Sree Harsha, 3 G. Tarun Kumar,
More informationImprovement of Accuracy in Remote Gaze Detection for User Wearing Eyeglasses Using Relative Position Between Centers of Pupil and Corneal Sphere
Improvement of Accuracy in Remote Gaze Detection for User Wearing Eyeglasses Using Relative Position Between Centers of Pupil and Corneal Sphere Kiyotaka Fukumoto (&), Takumi Tsuzuki, and Yoshinobu Ebisawa
More informationComparison of Head Movement Recognition Algorithms in Immersive Virtual Reality Using Educative Mobile Application
Comparison of Head Recognition Algorithms in Immersive Virtual Reality Using Educative Mobile Application Nehemia Sugianto 1 and Elizabeth Irenne Yuwono 2 Ciputra University, Indonesia 1 nsugianto@ciputra.ac.id
More informationDigital inertial algorithm for recording track geometry on commercial shinkansen trains
Computers in Railways XI 683 Digital inertial algorithm for recording track geometry on commercial shinkansen trains M. Kobayashi, Y. Naganuma, M. Nakagawa & T. Okumura Technology Research and Development
More informationInteractive Simulation: UCF EIN5255. VR Software. Audio Output. Page 4-1
VR Software Class 4 Dr. Nabil Rami http://www.simulationfirst.com/ein5255/ Audio Output Can be divided into two elements: Audio Generation Audio Presentation Page 4-1 Audio Generation A variety of audio
More informationKINECT HANDS-FREE. Rituj Beniwal. Department of Electrical Engineering Indian Institute of Technology, Kanpur. Pranjal Giri
KINECT HANDS-FREE Rituj Beniwal Pranjal Giri Agrim Bari Raman Pratap Singh Akash Jain Department of Aerospace Engineering Indian Institute of Technology, Kanpur Atharva Mulmuley Department of Chemical
More informationClassifying 3D Input Devices
IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu Motivation The mouse and keyboard
More informationDevelopment of an Intuitive Interface for PC Mouse Operation Based on Both Arms Gesture
Development of an Intuitive Interface for PC Mouse Operation Based on Both Arms Gesture Nobuaki Nakazawa 1*, Toshikazu Matsui 1, Yusaku Fujii 2 1 Faculty of Science and Technology, Gunma University, 29-1
More informationA Vehicular Visual Tracking System Incorporating Global Positioning System
A Vehicular Visual Tracking System Incorporating Global Positioning System Hsien-Chou Liao and Yu-Shiang Wang Abstract Surveillance system is widely used in the traffic monitoring. The deployment of cameras
More informationToward an Augmented Reality System for Violin Learning Support
Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp
More informationGesture Controlled Car
Gesture Controlled Car Chirag Gupta Department of ECE ITM University Nitin Garg Department of ECE ITM University ABSTRACT Gesture Controlled Car is a robot which can be controlled by simple human gestures.
More informationResearch on Hand Gesture Recognition Using Convolutional Neural Network
Research on Hand Gesture Recognition Using Convolutional Neural Network Tian Zhaoyang a, Cheng Lee Lung b a Department of Electronic Engineering, City University of Hong Kong, Hong Kong, China E-mail address:
More informationMotion Capture for Runners
Motion Capture for Runners Design Team 8 - Spring 2013 Members: Blake Frantz, Zhichao Lu, Alex Mazzoni, Nori Wilkins, Chenli Yuan, Dan Zilinskas Sponsor: Air Force Research Laboratory Dr. Eric T. Vinande
More informationTeam Description Paper: HuroEvolution Humanoid Robot for Robocup 2010 Humanoid League
Team Description Paper: HuroEvolution Humanoid Robot for Robocup 2010 Humanoid League Chung-Hsien Kuo 1, Hung-Chyun Chou 1, Jui-Chou Chung 1, Po-Chung Chia 2, Shou-Wei Chi 1, Yu-De Lien 1 1 Department
More information3D-Position Estimation for Hand Gesture Interface Using a Single Camera
3D-Position Estimation for Hand Gesture Interface Using a Single Camera Seung-Hwan Choi, Ji-Hyeong Han, and Jong-Hwan Kim Department of Electrical Engineering, KAIST, Gusung-Dong, Yusung-Gu, Daejeon, Republic
More informationMobile Motion: Multimodal Device Augmentation for Musical Applications
Mobile Motion: Multimodal Device Augmentation for Musical Applications School of Computing, School of Electronic and Electrical Engineering and School of Music ICSRiM, University of Leeds, United Kingdom
More informationBased on the ARM and PID Control Free Pendulum Balance System
Available online at www.sciencedirect.com Procedia Engineering 29 (2012) 3491 3495 2012 International Workshop on Information and Electronics Engineering (IWIEE) Based on the ARM and PID Control Free Pendulum
More informationCalibration-Based Auto White Balance Method for Digital Still Camera *
JOURNAL OF INFORMATION SCIENCE AND ENGINEERING 26, 713-723 (2010) Short Paper Calibration-Based Auto White Balance Method for Digital Still Camera * Department of Computer Science and Information Engineering
More informationMarkerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces
Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Huidong Bai The HIT Lab NZ, University of Canterbury, Christchurch, 8041 New Zealand huidong.bai@pg.canterbury.ac.nz Lei
More informationUsing Intelligent Mobile Devices for Indoor Wireless Location Tracking, Navigation, and Mobile Augmented Reality
Using Intelligent Mobile Devices for Indoor Wireless Location Tracking, Navigation, and Mobile Augmented Reality Chi-Chung Alan Lo, Tsung-Ching Lin, You-Chiun Wang, Yu-Chee Tseng, Lee-Chun Ko, and Lun-Chia
More informationIndoor navigation with smartphones
Indoor navigation with smartphones REinEU2016 Conference September 22 2016 PAVEL DAVIDSON Outline Indoor navigation system for smartphone: goals and requirements WiFi based positioning Application of BLE
More informationWirelessly Controlled Wheeled Robotic Arm
Wirelessly Controlled Wheeled Robotic Arm Muhammmad Tufail 1, Mian Muhammad Kamal 2, Muhammad Jawad 3 1 Department of Electrical Engineering City University of science and Information Technology Peshawar
More informationOmni-Directional Catadioptric Acquisition System
Technical Disclosure Commons Defensive Publications Series December 18, 2017 Omni-Directional Catadioptric Acquisition System Andreas Nowatzyk Andrew I. Russell Follow this and additional works at: http://www.tdcommons.org/dpubs_series
More informationResearch Seminar. Stefano CARRINO fr.ch
Research Seminar Stefano CARRINO stefano.carrino@hefr.ch http://aramis.project.eia- fr.ch 26.03.2010 - based interaction Characterization Recognition Typical approach Design challenges, advantages, drawbacks
More informationDevelopment of intelligent systems
Development of intelligent systems (RInS) Robot sensors Danijel Skočaj University of Ljubljana Faculty of Computer and Information Science Academic year: 2017/18 Development of intelligent systems Robotic
More informationMore Info at Open Access Database by S. Dutta and T. Schmidt
More Info at Open Access Database www.ndt.net/?id=17657 New concept for higher Robot position accuracy during thermography measurement to be implemented with the existing prototype automated thermography
More informationNon-Contact Gesture Recognition Using the Electric Field Disturbance for Smart Device Application
, pp.133-140 http://dx.doi.org/10.14257/ijmue.2014.9.2.13 Non-Contact Gesture Recognition Using the Electric Field Disturbance for Smart Device Application Young-Chul Kim and Chang-Hyub Moon Dept. Electronics
More informationAutomatic Docking System with Recharging and Battery Replacement for Surveillance Robot
International Journal of Electronics and Computer Science Engineering 1148 Available Online at www.ijecse.org ISSN- 2277-1956 Automatic Docking System with Recharging and Battery Replacement for Surveillance
More informationSPTF: Smart Photo-Tagging Framework on Smart Phones
, pp.123-132 http://dx.doi.org/10.14257/ijmue.2014.9.9.14 SPTF: Smart Photo-Tagging Framework on Smart Phones Hao Xu 1 and Hong-Ning Dai 2* and Walter Hon-Wai Lau 2 1 School of Computer Science and Engineering,
More informationPerception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision
11-25-2013 Perception Vision Read: AIMA Chapter 24 & Chapter 25.3 HW#8 due today visual aural haptic & tactile vestibular (balance: equilibrium, acceleration, and orientation wrt gravity) olfactory taste
More informationKeywords Mobile Phones, Accelerometer, Gestures, Hand Writing, Voice Detection, Air Signature, HCI.
Volume 5, Issue 3, March 2015 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Advanced Techniques
More information3D-Map Aided Multipath Mitigation for Urban GNSS Positioning
Summer School on GNSS 2014 Student Scholarship Award Workshop August 2, 2014 3D-Map Aided Multipath Mitigation for Urban GNSS Positioning I-Wen Chu National Cheng Kung University, Taiwan. Page 1 Outline
More informationSELF STABILIZING PLATFORM
SELF STABILIZING PLATFORM Shalaka Turalkar 1, Omkar Padvekar 2, Nikhil Chavan 3, Pritam Sawant 4 and Project Guide: Mr Prathamesh Indulkar 5. 1,2,3,4,5 Department of Electronics and Telecommunication,
More informationClassifying 3D Input Devices
IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu But First Who are you? Name Interests
More informationE90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright
E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7
More informationR (2) Controlling System Application with hands by identifying movements through Camera
R (2) N (5) Oral (3) Total (10) Dated Sign Assignment Group: C Problem Definition: Controlling System Application with hands by identifying movements through Camera Prerequisite: 1. Web Cam Connectivity
More informationOPEN CV BASED AUTONOMOUS RC-CAR
OPEN CV BASED AUTONOMOUS RC-CAR B. Sabitha 1, K. Akila 2, S.Krishna Kumar 3, D.Mohan 4, P.Nisanth 5 1,2 Faculty, Department of Mechatronics Engineering, Kumaraguru College of Technology, Coimbatore, India
More informationAn Embedded System for Tracking Human Motion and Humanoid Interfaces
ORIGINAL ARTICLE SPECIAL ISSUE: Intelligent Robotics (1/2) Guest edited by Yi-Hung Liu An Embedded System for Tracking Human Motion and Humanoid Interfaces Ming-June Tsai 1, Hung-Wen Lee 1, *, Trinh-Ngoc
More information3-Degrees of Freedom Robotic ARM Controller for Various Applications
3-Degrees of Freedom Robotic ARM Controller for Various Applications Mohd.Maqsood Ali M.Tech Student Department of Electronics and Instrumentation Engineering, VNR Vignana Jyothi Institute of Engineering
More informationThe Design and Implementation of Indoor Localization System Using Magnetic Field Based on Smartphone
The Design and Implementation of Indoor Localization System Using Magnetic Field Based on Smartphone Liu Jiaxing a, Jiang congshi a, Shi zhongcai a a International School of Software,Wuhan University,Wuhan,China
More informationClassification for Motion Game Based on EEG Sensing
Classification for Motion Game Based on EEG Sensing Ran WEI 1,3,4, Xing-Hua ZHANG 1,4, Xin DANG 2,3,4,a and Guo-Hui LI 3 1 School of Electronics and Information Engineering, Tianjin Polytechnic University,
More informationFace Registration Using Wearable Active Vision Systems for Augmented Memory
DICTA2002: Digital Image Computing Techniques and Applications, 21 22 January 2002, Melbourne, Australia 1 Face Registration Using Wearable Active Vision Systems for Augmented Memory Takekazu Kato Takeshi
More informationDesigning of a Shooting System Using Ultrasonic Radar Sensor
2017 Published in 5th International Symposium on Innovative Technologies in Engineering and Science 29-30 September 2017 (ISITES2017 Baku - Azerbaijan) Designing of a Shooting System Using Ultrasonic Radar
More informationSURVEILLANCE SYSTEMS WITH AUTOMATIC RESTORATION OF LINEAR MOTION AND OUT-OF-FOCUS BLURRED IMAGES. Received August 2008; accepted October 2008
ICIC Express Letters ICIC International c 2008 ISSN 1881-803X Volume 2, Number 4, December 2008 pp. 409 414 SURVEILLANCE SYSTEMS WITH AUTOMATIC RESTORATION OF LINEAR MOTION AND OUT-OF-FOCUS BLURRED IMAGES
More informationCONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM
CONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM Aniket D. Kulkarni *1, Dr.Sayyad Ajij D. *2 *1(Student of E&C Department, MIT Aurangabad, India) *2(HOD of E&C department, MIT Aurangabad, India) aniket2212@gmail.com*1,
More informationOughtToPilot. Project Report of Submission PC128 to 2008 Propeller Design Contest. Jason Edelberg
OughtToPilot Project Report of Submission PC128 to 2008 Propeller Design Contest Jason Edelberg Table of Contents Project Number.. 3 Project Description.. 4 Schematic 5 Source Code. Attached Separately
More informationResearch Article Handwriting Recognition in Free Space Using WIMU-Based Hand Motion Analysis
Journal of Sensors Volume 2016, Article ID 3692876, 10 pages http://dx.doi.org/10.1155/2016/3692876 Research Article Handwriting Recognition in Free Space Using WIMU-Based Hand Motion Analysis Shashidhar
More informationInstructions for the Experiment
Instructions for the Experiment Excitonic States in Atomically Thin Semiconductors 1. Introduction Alongside with electrical measurements, optical measurements are an indispensable tool for the study of
More informationLive Hand Gesture Recognition using an Android Device
Live Hand Gesture Recognition using an Android Device Mr. Yogesh B. Dongare Department of Computer Engineering. G.H.Raisoni College of Engineering and Management, Ahmednagar. Email- yogesh.dongare05@gmail.com
More informationAn Inertial Pen with Dynamic Time Warping Recognizer for Handwriting and Gesture Recognition L.M.MerlinLivingston #1, P.Deepika #2, M.
An Inertial Pen with Dynamic Time Warping Recognizer for Handwriting and Gesture Recognition L.M.MerlinLivingston #1, P.Deepika #2, M.Benisha #3 #1 Professor, #2 Assistant Professor, #3 Assistant Professor,
More information