High-Level Programming for Industrial Robotics: using Gestures, Speech and Force Control


Pedro Neto, J. Norberto Pires, Member, IEEE

Abstract — Today, most industrial robots are programmed using the typical teaching process. This paper presents a robotic system where the user can instruct and program a robot just by showing it what it should do, with a high level of abstraction from the robot language. This is done using the two most natural human interfaces (gestures and speech), a force control system and several code generation techniques. The performance of this system is compared with that of a similar system that, instead of gestures, uses a manual guidance system based on a force control strategy. Two different demonstrations with two different robots (MOTOMAN and ABB) are presented, showing that the developed systems can be customised for different users and robots.

Manuscript received September 15. This work was supported in part by the European Commission's Sixth Framework Program, as part of the Integrated Project SMErobot TM, and by the Portuguese Foundation for Science and Technology (FCT) (SFRH/BD/39218/2007). Pedro Neto is a PhD student in the Industrial Robotics Laboratory, Mechanical Engineering Department of the University of Coimbra, POLO-II, Coimbra, Portugal (e-mail: pedro.neto@robotics.dem.uc.pt). J. Norberto Pires is with the Industrial Robotics Laboratory, Mechanical Engineering Department of the University of Coimbra, POLO-II, Coimbra, Portugal (e-mail: jnp@robotics.dem.uc.pt).

I. INTRODUCTION

Programming an industrial robot by the typical teaching method is a tedious and time-consuming task that requires some technical expertise. In contrast to the highly intelligent robots described in science fiction, current industrial robots are non-intelligent machines that work in a controlled and well-known environment. Generally, robots are designed, equipped and programmed to perform specific tasks, and thus an unskilled worker will not be able to reprogram the robot to perform a different task. The goal is to create a methodology that helps users to control and program a robot with a high level of abstraction from the robot language; in other words, by making a robotic demonstration in terms of high-level behaviors (using gestures, speech, etc.), the user should be able to demonstrate to the robot what it should do. This type of learning is often known as programming by demonstration (PbD). Several approaches to PbD have been investigated, using different input devices, manipulators and learning strategies [1]-[5].

In fact, the demand for new and more natural human-machine interfaces (HMIs) has been increasing in recent years, and the field of robotics has followed this trend [6]. Speech recognition is seen as one of the most promising interfaces between humans and machines, because it is probably the most natural and intuitive way of communication between humans. For this reason, and given the high demand for more natural and intuitive HMIs, automatic speech recognition (ASR) systems have developed considerably in recent years. Today, these systems offer good performance and robustness, allowing, for example, the control of industrial robots in an industrial environment (in the presence of surrounding noise) [7]. Gestures are another natural form of communication between humans. In the robotics field, considerable work has been done to identify and recognize motions and gestures performed by humans.
This can be done using vision-based interfaces that detect human gestures [8], motion capture sensors [3], a combination of both (a vision system and a data glove) [1], or finger gesture recognition systems based on active tracking mechanisms [9]. Artificial neural networks (ANNs) have also been used extensively in the past for the recognition of gestures [10], with very good results.

In this work, a Wii Remote controller was used to capture human hand behaviors: manual postures (static hand positions) and gestures (dynamic hand positions). Basically, the Wii Remote is an inexpensive device that uses accelerometers to detect motion and user commands for game control. The motion data extracted from the three-axis accelerometer embedded in the Wii Remote are used as input to a statistical model and then fed to an ANN algorithm for posture and gesture recognition. This information serves as input to the robot control system used in this paper, which also incorporates speech recognition software allowing the user to manage the cell, acting on the robot and on the code generation system. To avoid excessive contact forces between the robot tool and the workpiece, and at the same time to detect and avoid collisions during robot operation, a force control system is used.

In summary, the robotic system used in this paper allows the user to control a robot and generate robot code using perhaps the two most natural human interfaces: gestures and speech. These features are also used for user feedback, i.e., the user receives warning sounds and spoken alerts (generated using a text-to-speech (TTS) interface) and tactile vibrations (the Wii Remote vibrates when the force control system detects excessive forces). In order to analyze the viability of the presented system, two test cases are presented and discussed. The first test case is a common pick-and-place operation, and in the second test case the robot is used to write some letters on paper. Finally, the performance of this system is compared with the performance of a similar system that, instead of gestures, uses a manual guidance system based on a force control strategy.
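To fix ideas before the detailed sections, the sketch below shows the skeleton of such a demonstration loop in Python. It is only an illustration of the data flow described above; every interface name is hypothetical, standing in for the MotomanLib, ASR and Wii Remote components presented next.

```python
"""Skeleton of the demonstration loop; every interface name is hypothetical."""

GESTURE_TO_DIRECTION = {                 # the six translation gestures (Sec. IV)
    "X+": (1, 0, 0), "X-": (-1, 0, 0),
    "Y+": (0, 1, 0), "Y-": (0, -1, 0),
    "Z+": (0, 0, 1), "Z-": (0, 0, -1),
}

def demonstration_step(wiimote, recognizer, robot, codegen):
    # Gestures drive robot motion while the B button is held (Section III).
    gesture = recognizer.classify(wiimote.acceleration())
    if wiimote.button_b_pressed() and gesture in GESTURE_TO_DIRECTION:
        robot.move_towards(GESTURE_TO_DIRECTION[gesture])
    # Spoken commands act on the cell and the code generator (Section II-C).
    command = recognizer.pop_voice_command()
    if command is not None:
        codegen.handle(command, robot)
```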

II. EXPERIMENTAL SETUP

A. System description

The experimental setup consists of an industrial robot MOTOMAN HP6 equipped with the NX100 controller, a Wii Remote controller to capture human hand behaviors, a headset microphone to capture the user's voice, a force/torque (F/T) sensor, and a computer running the application that manages the cell (Fig. 1).

Fig. 1. The robotic cell is basically composed of an industrial robot, a F/T sensor, two input devices (Wii Remote and headset), and a computer running the application that manages the cell.

The above-mentioned application receives data from the Wii Remote, interprets the received data and acts on the robot, using for this purpose the MotomanLib, a Data Link Library created in our laboratory to control and manage the robot remotely via Ethernet (Fig. 2). The application incorporates speech recognition software that recognizes the voice commands received from the headset and, depending on the commands received, acts on the robot or on the code generation system embedded in the application. Communication with the F/T sensor uses an ActiveX component named JR3PCI [11], which allows the application to continuously receive feedback from the F/T sensor; if any component of force or torque exceeds a set value, a command is sent that makes the Wii Remote vibrate (tactile feedback). This is a way of providing feedback about the cell state to the user, in addition to the sound feedback (alert sounds and a TTS system that reports the cell state and occurrences). Finally, the application also includes a section to train the statistical model and the ANN.

Fig. 2. A schematic representation of the cell, in terms of communication technology. The input devices work without wires (via Bluetooth), giving greater freedom to the user.

B. The Wii Remote

The demand for new interaction devices to improve the game experience has led to the development of new devices that allow the user to feel more immersed in the game. In contrast to traditional game pads or joysticks, the Wii Remote from Nintendo allows users to control the game using gestures as well as button presses. It uses a combination of motion sensing and infrared (IR) detection to sense its pose (rotations and translations) in 3D space. The Wii Remote has a 3-axis accelerometer, an IR camera with an object tracking system, and 11 buttons used as input features. In order to provide feedback to the user, the Wii Remote contains 4 LEDs, a rumble motor that can be activated to make the controller vibrate, and a speaker. The Wii Remote communicates with the Wii console or with a computer via a Bluetooth wireless link, reporting back data at 100 packets per second. The reported data can contain information about the controller state (acceleration, buttons, IR camera, etc.). Several studies have used the Wii Remote as an interaction device, particularly in the construction of interactive whiteboards, finger tracking systems, and the control of robots. Due to its characteristics and low price, the Wii Remote was selected to be integrated into the presented robotic system as the input device to capture human hand behaviors (postures and gestures). In order to extract relevant information from the Wii Remote, the motion sensor and the IR capabilities of the controller were explored, but after some experiments it was concluded that the IR capabilities of the Wii Remote were not usable.
The Wii Remote's IR sensor offers the possibility of locating IR light sources in the controller's field of view, but the viewing angle of the Wii Remote is too limited. Other problems arise with the placement of the IR source in the cell, the calibration of the IR sensor, the limited distance from the Wii Remote to the IR source that the user would have to maintain during the demonstration process, and detection problems when other infrared sources are nearby. Thus, the information provided by the motion sensor is used to achieve the goals. This motion sensor is a 3-axis accelerometer, the ADXL330 from Analog Devices, physically rated to measure accelerations over a range of at least ±3 g, with 10% sensitivity tolerance.

C. Speech recognition

ASR systems have been used with relative success in the control of machines [7]. In this system, during the robotic demonstration the user can use voice commands to act remotely on the robot or on the code generation system. For example, if the user wants to stop the robot motors, he should say ROBOT MOTORS OFF; if he wants to generate robot code, for example a command to move the robot linearly to the current pose, he should say COMPUTER MOVE LINE. It is important to note that each voice command must be identified with a confidence higher than 70%, otherwise it is rejected.
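The acceptance rule can be made concrete with a few lines of code. The sketch below is only illustrative — the actual system uses the Microsoft ASR engine (see the Acknowledgment), whose API is not reproduced here; the command set shown and the dispatcher objects are assumptions.

```python
CONFIDENCE_THRESHOLD = 0.70   # commands recognized below this level are rejected

def on_recognition(phrase, confidence, robot, codegen):
    """Dispatch one recognized utterance to the robot or the code generator."""
    if confidence <= CONFIDENCE_THRESHOLD:
        return "rejected"                             # user must repeat the command
    if phrase == "ROBOT MOTORS OFF":
        robot.motors_off()                            # acts on the robot
    elif phrase == "COMPUTER MOVE LINE":
        codegen.add_move_line(robot.current_pose())   # acts on the code generator
    return "accepted"
```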

D. Force control

Robotic manipulators are often in direct contact with their surrounding environment. For purely positioning tasks, such as robotic painting, where the forces of interaction with the environment are negligible, no force feedback information is required. However, in applications such as polishing or grinding, or even in the manipulation of objects, knowledge of the contact forces has a great influence on the quality and robustness of the process. In this paper, robotic force control is done using a F/T sensor that measures both force and torque along three perpendicular axes, allowing the user to have a better perception of the surrounding environment. The application continuously receives feedback from the F/T sensor; if any component of force or torque exceeds a set value, a command is sent that makes the Wii Remote vibrate (tactile feedback). If that component exceeds the set value by 10% or more, the robot stops (see section III-C, Security systems).

E. Code generation

In the construction of an algorithm to generate code, the keyword is to generalize and never particularize; in other words, the algorithm must be prepared to cover a wide range of variations in the process. In the developed system, the code generation algorithm receives instructions from the identified spoken commands, allowing the user to build the robot code step-by-step by speech during the demonstration (writing any type of variables, robot commands, etc.). Finally, after the robotic demonstration task is finished, the user can generate the entire robot program, upload it to the robot controller and run it.

III. CONTROL STRATEGY

A. Robot control

The robot is controlled remotely via Ethernet using the MOTOMAN IMOV function, which moves the robot linearly according to a specified pose increment $\Delta = (\Delta x, \Delta y, \Delta z, \Delta R_x, \Delta R_y, \Delta R_z)$. The first three components represent the robot translation along the X, Y, and Z axes, respectively, while the last three components represent the robot rotation about the X, Y, and Z axes, respectively. These components carry enough information to control the robot, and they must be identified by analyzing the behavior of the user's hand holding the Wii Remote. In this case it is completely unnecessary to extract precise displacements or rotations, because it is only necessary to know which pose increment components must be activated. In a first approach, the robot control strategy was to identify translation movements and rotations of the user's hand and, depending on these inputs, to send small pose increments continuously to the robot. However, it was quickly concluded that this approach was not viable, because the robot was constantly stopping and starting, presenting a high level of vibration. The solution reached was to send the robot a single pose increment that moves it to the limit of its field of operation. The robot movement is activated by pressing the Wii Remote B button and making a hand gesture or posture corresponding to the desired robot movement. The robot then starts moving immediately, and when the user releases the B button the robot stops. If the B button is never released, the robot continues the movement up to the limit of its field of operation.
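The following sketch condenses this control strategy together with the force supervision of subsection II-D. It is a minimal illustration, not the authors' implementation: the thresholds, interface objects and method names are assumptions, and increment_to_limit is the computation of the next subsection (sketched there).

```python
FORCE_LIMIT = 50.0    # example set value per force/torque component (assumed)
STOP_FACTOR = 1.10    # stop when a component exceeds the set value by >= 10%

def supervise_forces(ft_sample, wiimote, robot):
    """Force watchdog: vibrate at the set value, stop the robot at 110% of it."""
    worst = max(abs(c) for c in ft_sample)     # (Fx, Fy, Fz, Tx, Ty, Tz)
    if worst >= FORCE_LIMIT * STOP_FACTOR:
        robot.stop()                           # independent low-level stop
    elif worst >= FORCE_LIMIT:
        wiimote.rumble(True)                   # tactile feedback to the user
    else:
        wiimote.rumble(False)

def motion_step(wiimote, robot):
    """B-button gated motion: one increment towards the workspace limit."""
    if wiimote.button_b_pressed() and not robot.is_moving():
        ax, ay, az = wiimote.acceleration()
        delta = increment_to_limit(robot.position(), (ax, ay, az - 1.0))
        if delta is not None:
            robot.imov(delta + (0.0, 0.0, 0.0))  # pad rotations: 6-component IMOV
    elif not wiimote.button_b_pressed() and robot.is_moving():
        robot.stop()                             # releasing B stops the robot
```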
B. Field of operation of the robot - increment calculation

As mentioned above, according to the user's hand behavior the robot is moved to the limit of its field of operation or, more precisely, to a pose close to that limit. The field of operation of a 6-DOF robot manipulator is approximately a volume bounded by two spherical surfaces. It can therefore be considered that the field of operation of the robot used here is bounded by two spherical surfaces (1), both centered on the zero reference point of the robot, where $R_{ext}$ and $R_{int}$ are respectively the radii of the external and internal spherical surfaces:

$$R_{int}^2 \leq x^2 + y^2 + z^2 \leq R_{ext}^2 \quad (1)$$

Before starting any robot movement, the current robot position $P_0 = (x_0, y_0, z_0)$ is acquired. In order to calculate the pose increment, it is first necessary to determine the increment components that must be activated. This is done using the Wii Remote acceleration values $(a_x, a_y, a_z)$, which define the robot movement direction $\vec{u} = (a_x, a_y, a_z - 1)$ (see section IV). This direction vector, in conjunction with the current robot position $P_0$, defines a straight line (2) that intersects the external spherical surface at two points (Fig. 3). In a first approach, it is considered that only the external spherical surface limits the robot field of operation.

$$(x, y, z) = (x_0, y_0, z_0) + k\,(a_x, a_y, a_z - 1), \quad k \in \mathbb{R} \quad (2)$$

Substituting (2) into the equation of the external spherical surface in (1) gives a quadratic equation in $k$:

$$\|\vec{u}\|^2 k^2 + 2\,(P_0 \cdot \vec{u})\,k + \|P_0\|^2 - R_{ext}^2 = 0 \quad (3)$$

Extracting $k$ from (3), and considering only the positive root (the direction of the vector), the displacement from the current robot position to the intersection point on the external spherical surface is:

$$(\Delta x, \Delta y, \Delta z) = k\,(a_x, a_y, a_z - 1), \quad k \in \mathbb{R}^+ \quad (4)$$
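Numerically, this is a standard line-sphere intersection. The sketch below implements (1)-(4) for the external surface; the radius value and the gravity-compensated direction vector are assumptions consistent with the text, not values from the paper.

```python
import math

R_EXT = 1.40   # m; example radius of the external spherical surface (assumed)

def increment_to_limit(p0, u, r=R_EXT):
    """Displacement (4) from p0 along direction u to the sphere |p| = r.

    Solves the quadratic (3) obtained by substituting the line (2) into
    the external surface of (1); returns None if there is no intersection.
    """
    a = sum(c * c for c in u)                        # |u|^2
    b = 2.0 * sum(p * c for p, c in zip(p0, u))      # 2 (p0 . u)
    c = sum(p * p for p in p0) - r * r               # |p0|^2 - r^2
    disc = b * b - 4.0 * a * c
    if a == 0.0 or disc < 0.0:
        return None                                  # line misses the sphere
    k = (-b + math.sqrt(disc)) / (2.0 * a)           # positive root (k in R+)
    return tuple(k * c_i for c_i in u)               # (dx, dy, dz)

# Example: direction from raw accelerations, gravity removed from the Z axis.
ax, ay, az = 1.0, 0.0, 1.0                           # Wii Remote held flat, X+ push
u = (ax, ay, az - 1.0)
print(increment_to_limit((0.8, 0.0, 0.4), u))        # -> (0.541..., 0.0, 0.0)
```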

Thus, in terms of robot translation movements, the pose increment is $\Delta = (\Delta x, \Delta y, \Delta z, 0, 0, 0)$. Note that, for example, if it is found that the robot should be moved along the X axis in the negative direction, the vector $\vec{u}$ becomes $(-1, 0, 0)$, and then $\Delta = (-k, 0, 0, 0, 0, 0)$. An analogous approach was employed to obtain $k$ when the robot field of operation is limited by the internal spherical surface. In this case, if $k$ has no value (the intersection is impossible to calculate), it means that the straight line does not intersect the internal spherical surface, and it is the external spherical surface that limits the robot field of operation. In terms of rotation increments, since the robot rotation limit values and the current robot pose are known, it is easy to obtain the increments.

Fig. 3. The two spherical surfaces that define the robot field of operation. The current robot point and the acceleration vector components that define the robot movement direction are represented in the figure.

C. Security systems

When a human interacts directly with a robot in a coworker scenario, the security systems present in the robotic cell should have a high level of robustness in order to avoid accidents. The developed robotic cell contains a system that continually receives data from the Wii Remote (via Bluetooth); if this communication fails, the robot immediately stops. The same happens if the communication with the robot fails. The force control system also works as a security system: if any component of force or torque exceeds a set value, the user is alerted (the Wii Remote vibrates), and if that component exceeds the set value by 10% or more, the robot immediately stops. This is done with an independent system that acts directly at a low level of the control hierarchy (stopping the robot), independently of the software running on the robot.

IV. POSTURE AND GESTURE RECOGNITION

A. Modes of operation

The developed system has two distinct modes of operation that the user can select during the robot demonstration phase. In the first mode, the robot moves along the X, Y, and Z axes separately, while in the other mode the robot can move along the three axes at the same time. In both modes, the user has the option of nulling the translation increment given to any of the three axes. In terms of rotations, in both modes the rotation around each of the three axes is done separately, one axis at a time.

B. Robot translations

The accelerations extracted from the 3-axis accelerometer, $(a_x, a_y, a_z)$, are used to detect the user's hand gestures (Fig. 4).

Fig. 4. The developed system can recognize six different gestures (X+, X-, Y+, Y-, Z+, and Z-). Depending on the mode of operation, the user can move the robot along the X, Y, and Z axes separately or at the same time. Note that in both movements the Wii Remote is held horizontally.

Moving the Wii Remote along each of the three axes (in both directions), we can extract $(a_x, a_y, a_z)$ for each of the six different gestures (Fig. 5). When the Wii Remote is moved in the positive X direction (X+), the value of $a_x$ initially increases because the hand begins to move; then, when the hand begins to slow down, the positive value of $a_x$ turns negative. This inversion point is highlighted in the figure with a white dot and marks the beginning of the shaded area. The acceleration $a_y$ remains near zero and $a_z$ remains near one, because the Wii Remote is held horizontally (see C. Robot rotations). Similar reasoning applies to the other gestures (X-, Y+, Y-, Z+, and Z-).

Fig. 5. The measured accelerations along the three axes, for each of the six different gestures.

To interpret the acceleration values and recognize the hand movements, a statistical approach was used. For each of the six gestures, the arithmetic mean of the accelerations measured in the learning phase, $(\mu_x, \mu_y, \mu_z)$, is calculated (considering only the non-shaded area). After this, the standard deviation $(\sigma_x, \sigma_y, \sigma_z)$ is used to measure how widely spread the acceleration values are from each mean. In this way, a range of acceleration values defining each gesture is established. The above is done in the learning phase. During the robotic demonstration phase, a gesture is recognized when $\mu_x - \sigma_x \leq \bar{a}_x \leq \mu_x + \sigma_x$, $\mu_y - \sigma_y \leq \bar{a}_y \leq \mu_y + \sigma_y$, and $\mu_z - \sigma_z \leq \bar{a}_z \leq \mu_z + \sigma_z$, where $(\bar{a}_x, \bar{a}_y, \bar{a}_z)$ are the means of the acceleration values measured during the robotic demonstration phase. The experimental tests showed that this method achieves an average of 93% correctly recognized movements (see D. Learning phase and ANNs).
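In code, the training and recognition steps described above reduce to a per-axis range test. The following is a minimal sketch under the assumption that each gesture is trained with a list of (ax, ay, az) samples (around 20 demonstrations, per Section IV-D); it is not the authors' implementation.

```python
from statistics import mean, stdev

def train(samples_per_gesture):
    """Build per-gesture acceptance ranges [mu - sigma, mu + sigma] per axis.

    samples_per_gesture maps a label such as 'X+' to its training samples,
    e.g. [(ax, ay, az), ...]; at least two samples per gesture are needed.
    """
    model = {}
    for gesture, samples in samples_per_gesture.items():
        axes = list(zip(*samples))               # regroup samples per axis
        model[gesture] = [(mean(a) - stdev(a), mean(a) + stdev(a)) for a in axes]
    return model

def recognize(model, window):
    """Return the gesture whose per-axis range contains the window means."""
    means = [mean(axis) for axis in zip(*window)]
    for gesture, ranges in model.items():
        if all(lo <= m <= hi for m, (lo, hi) in zip(means, ranges)):
            return gesture
    return None                                  # movement not recognized
```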

To define the robot increment, these recognized gestures are then transformed into the direction vector $\vec{u}$; for example, if the movement (Y+) is detected, $\vec{u} = (0, 1, 0)$. In the second mode of operation, the robot is moved linearly along the direction that the user's hand demonstrates; in other words, the vector $\vec{u} = (a_x, a_y, a_z - 1)$ directly defines the robot movement direction. One is subtracted from the third component because the Wii Remote is held horizontally, reporting an acceleration of 1 g along the Z axis (see C. Robot rotations).

C. Robot rotations

The robot control system needs to have as input six different robot rotations (Rx+, Rx-, Ry+, Ry-, Rz+, and Rz-). If the Wii Remote is in free fall, it reports zero acceleration. But if the Wii Remote is held horizontally, it reports an acceleration along the Z axis: the acceleration due to gravity, which near the surface of the earth is approximately 9.8 m/s². Thus, even when the user is not accelerating the Wii Remote, a static measurement can determine the orientation of the Wii Remote (posture recognition). Analyzing Fig. 6-A, when the Wii Remote is held horizontally, it reports an acceleration along the Z axis in the positive direction, with $a_x \approx 0$ and $a_y \approx 0$. When the Wii Remote is rotated around the Y axis (Fig. 6-B), the gravity component shifts to the X axis; when the Wii Remote is rotated around the Y axis in the reverse direction (Fig. 6-C), that component appears with the opposite sign. A similar approach was used to detect the rotations around the X axis (Fig. 6-D, 6-E). However, for rotation around the Z axis (Fig. 6-F, 6-G), nothing can be concluded from a static measurement, because in both cases gravity remains along the Z axis. To solve this problem, an ANN was used to detect rotation movements around the Z axis (see D. Learning phase and ANNs).

Fig. 6. A - No rotation. B - Rotation around the Y axis in the negative direction (Ry-). C - (Ry+). D - (Rx-). E - (Rx+). F - (Rz+). G - (Rz-).

D. Learning phase and ANNs

Before starting the robotic demonstration, the user should train the system (learning phase). In terms of robot translations, this is done by demonstrating each movement (X+, X-, Y+, Y-, Z+, and Z-) several times. The experimental tests showed that 20 demonstrations are enough to achieve an average of 93% correctly recognized movements. For the robot rotations (Rx+, Rx-, Ry+, and Ry-), the same procedure was applied. ANNs have been applied in a wide range of applications, such as the recognition of gestures [10]. In order to detect rotation movements around the Z axis (Rz+ or Rz-), an ANN trained with a back-propagation algorithm was implemented. The input signals (acceleration data) are represented by a vector $\vec{x} = (a_x, a_y, a_z)$, and the output of neuron $j$ is given by (5), where $y_i$ is the output of neuron $i$, $w_{ij}$ is the weight of the link from neuron $i$ to neuron $j$, $b_j$ is the bias of neuron $j$, and $F$ is the activation function:

$$y_j = F\left(\sum_i w_{ij}\,y_i + b_j\right) \quad (5)$$

In the learning phase, the user provides the inputs and the corresponding outputs to the algorithm. The accuracy of the output depends heavily on the samples provided during the training phase and on the number of times the network is trained; the tests showed that in 2 minutes 80 demonstrations can be performed, obtaining 87% correctly recognized gestures.
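Equation (5) is the standard fully connected feed-forward neuron; a minimal forward pass implementing it is shown below. The sigmoid activation and the layer sizes are arbitrary choices for illustration — the paper does not specify them.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias, activation=sigmoid):
    """Equation (5): y_j = F(sum_i w_ij * y_i + b_j)."""
    return activation(sum(w * y for w, y in zip(weights, inputs)) + bias)

def layer(inputs, weight_rows, biases):
    # A fully connected layer applies (5) once per neuron.
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

# Toy forward pass: 3 acceleration inputs -> 4 hidden -> 2 outputs (Rz+, Rz-).
x = (0.1, -0.9, 0.3)
hidden = layer(x, [[0.2, -0.4, 0.1]] * 4, [0.0] * 4)
scores = layer(hidden, [[0.5] * 4, [-0.5] * 4], [0.0, 0.0])
print(scores)   # the larger score selects the rotation class
```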

V. TEST CASES

To assess the performance of our system, two different experimental tests were performed. The first is a common robotic pick-and-place operation, and in the second test case the robot is used to write some letters on paper (Fig. 7). In both test cases the results obtained were very promising, showing that an unskilled user can generate a robot program for a specific task in an intuitive way (using only gestures and speech) [12].

Fig. 7. A - Robotic pick-and-place operation. B - Robot writing letters.

VI. COMPARISON WITH A MANUAL GUIDANCE SYSTEM

The performance of the developed system was compared with that of a similar system that, instead of gestures, uses a manual guidance system based on a force control strategy to move the robot (Fig. 8) [13]. Both systems are intuitive and easy to use; however, to perform the same robotic demonstration the manual guidance system takes less time (about 30% less) and presents better robustness than the gesture-based system, which sometimes fails to recognize the hand postures and gestures.

Fig. 8. Due to its force control system, the ABB robot can be guided manually by the user.

VII. CONCLUSION

Due to the growing demand for natural HMIs, a robotic system that allows users to program an industrial robot using gestures and speech was proposed. The ASR has shown great robustness (even in the presence of surrounding noise), and the recognition of gestures presents promising results, which should be improved in the future.

ACKNOWLEDGMENT

The authors also want to acknowledge the help of the Portuguese Office of the Microsoft Language Development Centre, especially Professor Miguel Salles Dias, for their support with the Microsoft ASR and TTS engines and related APIs.

REFERENCES

[1] R. Dillmann, "Teaching and learning of robot tasks via observation of human performance," Robotics and Autonomous Systems, vol. 47, no. 2-3.
[2] M. Ehrenmann, R. D. Zöllner, O. Rogalla and R. Dillmann, "Programming service tasks in household environments by human demonstration," 11th IEEE International Workshop on Robot and Human Interactive Communication (ROMAN 2002), Berlin, Germany.
[3] J. Aleotti, A. Skoglund and T. Duckett, "Position teaching of a robot arm by demonstration with a wearable input device," International Conference on Intelligent Manipulation and Grasping (IMG04), Genoa, Italy, July 1-2.
[4] J. N. Pires, G. Veiga, and R. Araújo, "Programming-by-demonstration in the coworker scenario for SMEs," Industrial Robot, Emerald, 2008, submitted for publication.
[5] J. N. Pires, Industrial Robots Programming: Building Applications for the Factories of the Future, Springer, New York, USA.
[6] R. Cravotta, "Recognizing gestures. Blurring the line between humans and machines," EDN Europe. Available:
[7] J. N. Pires, "Robot-by-voice: Experiments on commanding an industrial robot using the human voice," Industrial Robot, An International Journal, vol. 32, no. 6, Emerald.
[8] I. Mihara, Y. Yamauchi, and M. Doi, "A real-time vision-based interface using motion processor and applications to robotics," Systems and Computers in Japan, vol. 34.
[9] S. Perrin, A. Cassinelli, and M. Ishikawa, "Gesture recognition using laser-based tracking system," Sixth IEEE International Conference on Automatic Face and Gesture Recognition.
[10] K. Murakami and H. Taguchi, "Gesture recognition using recurrent neural networks," in Proceedings of the ACM CHI '91 Conference on Human Factors in Computing Systems, New Orleans, USA.
[11] J. N. Pires, J. Ramming, S. Rauch, and R. Araújo, "Force/torque sensing applied to industrial robotic deburring," Sensor Review Journal, An International Journal, vol. 22, no. 3.
[12] Available:
[13] Available:
