Integrating Visual Feedback and Force Feedback in 3-D Collision Avoidance for a Dual-Arm Humanoid Robot

S. Charoenseang, A. Srikaew, D. M. Wilkes, and K. Kawamura
Center for Intelligent Systems, Vanderbilt University, Nashville, TN 37235, USA
{sammy, srikaewa, wilkes, kawamura}@vuse.vanderbilt.edu

Abstract

This paper presents a simple but effective method for integrating sensors and actuators in both real and virtual environments using an interactive simulator over the Internet. Our research explores the combination of visual feedback and force feedback to enhance our 3-D collision avoidance approach for a dual-arm humanoid robot. The robot control server uses 3-D vision-based position estimates of obstacles, such as a human hand or objects on the table, to compute a collision-free path in real time. The dual-arm robot is capable of performing multiple interactive tasks while avoiding collisions between its own arms or with other objects. The interactive simulator provides visual feedback to the user and sends the user's commands to the robot server and the stereo camera server via the Internet. Thus, it is well suited for robot teleassistance applications. Force feedback effects are generated, using an inexpensive commercially available joystick, to enhance the sensation of the real environment while interacting with the simulation. Finally, experimental results using our collision avoidance approach are presented to demonstrate the performance of this system.

Keywords: Collision Avoidance, Dual-Arm Humanoid Robot, Interactive Simulator, Color Tracking, 3-D Position Estimation

1 Introduction

Teleassistance and virtual reality techniques can significantly enhance current robotic applications. Recent research has explored techniques for integrating sensors and actuators in virtual and real environments [1][2][3].
This paper discusses a simple but effective and inexpensive way to integrate actuators and sensors in real and virtual environments for robotic applications on the PC platform. The 3-D collision avoidance problem is used to illustrate this technique. The architecture of this system uses an object-oriented model and a distributed network-based paradigm. The user is allowed to control a low-cost camera head over the Internet. The camera server reports back to the user how many objects are present, along with their locations. The user can use that information to build the virtual scene, including the type and location of each obstacle. An OpenGL-based simulator is used to render the virtual robotic environment at the user site. This simulator is also connected to the camera control server and the robot control server at the robot site. Multiple clients can connect to the camera server and robot server at the same time; operation of the robot is granted on a first-come, first-served basis. Once the virtual scene is built, the user can control the positions of the robot arms. The arm server also uses 3-D vision-based position estimates of obstacles, such as a human hand, to compute a collision-free path in real time. The simulator is updated via the network to provide visual feedback, and force feedback is sent to the force-feedback joystick at the user site. When the simulator is notified that a potential collision is about to occur, it generates sound feedback for the user. Speech recognition and synthesis are integrated into the simulator to provide a friendly user interface. Experimental results are presented to demonstrate the performance of the system.

2 System Architecture

To allow a user to build a virtual robotic environment at a remote site, various kinds of hardware and software modules are integrated to sense information from the real environment.
Multimedia-based feedback such as vision, force, and sound is also used to help the user intuitively operate a dual-arm humanoid from a remote site. The dual-arm robot used is shown in Figure 1. A simple real-time 3-D collision avoidance algorithm is also implemented for robot control. The system is developed for the PC platform under the Microsoft Windows 95 and Windows NT operating systems.

2.1 Dual-Arm Humanoid Robot

The dual-arm humanoid robot is actuated by rubbertuators, which are McKibben pneumatic artificial muscles, as shown in Figure 1 [4]. This dual-arm robot is a service robot developed to aid the elderly and the disabled, and also for use in holonic manufacturing. A PC-based controller developed in-house is used to control each 6-DOF arm. A 6-axis force/torque sensor is mounted at the wrist of each arm to acquire the external force and torque exerted on that arm.

Figure 1: The Dual-Arm Humanoid Robot and its Simulator

2.2 Interactive 3-D Robotic Simulator

A 3-D OpenGL-based robotic simulator is used to render a virtual robotic environment at the user site. 3-D graphical models of the dual-arm humanoid robot, a pan/tilt/verge camera head, and a robotic environment including obstacles are constructed via the network, as shown in Figure 1. The simulation of the dual-arm robot is updated by reading the joint angles from the robot control server over the Internet. Similarly, the camera control server updates the angles of the camera head. Moreover, a user can send control commands to the robot control server and the camera control server through this interactive simulator. The simulator integrates multimedia-based feedback such as force feedback, visual feedback, and sound feedback as follows:

- Force Feedback Joystick Control: A commercial force feedback joystick, the Microsoft SideWinder Force Feedback Pro, is programmed to generate force feedback effects for the user. When a potential collision is detected or the force/torque sensor reports external force data, all force information is processed and sent to the joystick to generate effects. Further, the user can use this 3-D joystick to control the position of the pan/tilt/verge camera head remotely.

- Speech Recognition and Synthesis: Simple speech recognition and synthesis are integrated to give the user a natural way to communicate with the simulator.
The user can issue pre-defined voice commands, and speech synthesis is used to prompt for and verify the user's input.

- Sound Feedback: When a potential collision between an arm and an obstacle, or between the two arms, is detected, sound feedback is generated simultaneously with force feedback at the user site.

- Live Video: Live video feedback is provided so that the user can monitor the real environment at the robot site.

2.3 Visual Tracking System

The objective of the visual tracking module is to provide information (such as position) about the obstacle to the robot. The tracking system is composed of a camera head with two color cameras, a Connectix color QuickCam, a color image acquisition board, and two Pentium PCs for color image processing, obstacle tracking, live video transmission, and camera control. Figure 2 shows the two color CCD cameras and the color QuickCam mounted on a pan/tilt/verge system. The obstacle tracker uses the 2-D positions of the obstacles in the image planes and subsequently controls the camera head to center the obstacle in the camera views. After the target is fixated, its 3-D position can be computed and used by the robot system to find the location of the obstacle relative to the robot coordinate system. Details of each module are discussed in the following sections.

Figure 2: Camera Head

Object Color-Based Tracking

The goal of the color segmentation module is to locate an obstacle in a color image and guide the camera head to center it in the camera views. To accomplish this, color models of the obstacle were created. Color segmentation is then performed, separating the pixels into possible obstacle-color pixels and non-obstacle-color pixels according to the color model. This color segmentation method was proposed by Barile [5]. In the segmentation stage, pixels from the input image are tested to determine whether they fall inside a predefined RGB color model space. From this process, a

binary mask image is created for use by later stages. In the binary mask image, white pixels represent colors that match the color model and black pixels represent colors that do not. A color model is created off-line by manually segmenting several images of the obstacle. In order to build the virtual scene, information about the workspace needs to be known; for example, the location of a box, a soda can, or an orange on the table. Each object is tracked using its color attribute, one object/color at a time. Consequently, a color model database is created to store color information for prospective objects in the workspace, such as a blue box, a green soda can, or an orange grapefruit. Once an object/color is segmented, the output images are used by the camera tracker to estimate the 3-D position of the object. Moreover, after the virtual scene is built, the camera tracker is responsible for tracking a human obstacle such as an arm or a hand. The color of these obstacles can be treated as a skin tone, so the task becomes skin-tone color tracking.

Camera Tracker and 3-D Object Estimator

Given the location of the skin-tone centroid in the mask image, the camera tracker moves the camera head to guide the centroid toward a dead zone in the camera view. The dead zone is defined to be a circular area in the center of the image view. The size of the dead zone indicates the accuracy of the camera fixation point: the smaller the area of the dead zone, the more accurate the fixation. The cameras move according to a direction vector given by the distance from the skin-tone centroid to the center of the image view. The amount of camera movement is proportional to the magnitude of the direction vector. The y-component of the direction vector controls the tilt motor and the x-component controls the left/right motors.
Once the target has reached the dead zone, the tracker is notified that the target has been fixated, and it stops the camera head. This behavior-based method allows the system to focus the robot's attention using active vision. The camera control scheme is shown in Figure 3.

Figure 3: Camera Control Scheme

Once the left and right cameras have fixated the target, the 3-D position of the obstacle can be determined geometrically. The coordinates of the obstacle in the camera frame can be determined as shown in Figure 4.

Figure 4: Camera Geometry

2.4 3-D Collision Avoidance

Humanoid robots have a unique opportunity to become useful assistants in homes, hospitals, and on factory floors [4]. Since they must interact with humans at many levels, safety is a major concern. Collisions between the robots and humans, or among the robots themselves, must be avoided. There have been several studies on operating multiple robots in an environment containing obstacles [6][7][8]. Most of these systems require complex path planning or address only collision avoidance among robots. Our approach presents a simple 3-D collision avoidance technique that can be applied to collision avoidance between the robot and external obstacles, or among robots [9].

Multiple-Point Virtual Impedance Control

The multiple-point virtual impedance control approach is used for 3-D collision avoidance. In a simple case, a virtual sphere is created to cover the end-effector of the robot arm, as shown in Figure 5. Each obstacle is also covered by a virtual sphere. After a potential collision is detected for the virtual sphere on the arm, we can compute the total force exerted on the arm as follows:

M_e dẌ_e + B_e dẊ_e + K_e dX_e = F_ext + F_v    (1)

Figure 5: Virtual Impedance Control Scheme

where M_e, B_e, and K_e are the desired inertia, damping, and stiffness matrices, respectively. The deviation dX_e is the difference between the current position X of the end-effector and the desired position X_d. The

velocity and acceleration of dX_e are dẊ_e and dẌ_e, respectively. F_v is a virtual force generated when the obstacle is within the virtual sphere on the arm. F_ext is an external force acquired using the force/torque sensor. With the above equation, we are able to compute a new desired position for controlling the robot's motion. In the current implementation, three virtual spheres are created for each arm, covering the elbow, the wrist, and the end-effector. When a potential collision between the arm and the obstacle is detected, a virtual force is generated to push the arm away from the obstacle. Further, the total computed force is used to generate a force feedback effect on the force feedback joystick at the user site.

3 Interface Design

Since there are several different software and hardware modules in this system, a well-designed interface is needed. Our research addresses a distributed network-based interface on the PC platform. This interface component makes it feasible to combine small modules into a large system. A small module is constructed based on the object-oriented paradigm in the Windows environment. Each module consists of a communication interface component and its own functionality, such as controlling hardware or running processing algorithms, as shown in Figure 6.

Figure 6: Basic Component Module

3.1 Communication Interface Component

A communication interface component provides the underlying interconnectivity for each module in the system. It is the Windows socket interface handler that manages the sending and receiving of data over the network. It is therefore also called a socket manager, and it is a multithreaded, message-passing handler. Multiple threads allow a socket manager to send and receive data concurrently and independently. A socket message is a C++ derived object, which can contain different kinds of data such as the robot's position or a camera head control command.
Further, a command header in the socket message is used to indicate what type of command has been sent or received. After interpreting the command header, the data inside the message are retrieved for further processing. Thus, the socket message is reusable. There are two types of socket managers: server and client socket managers. A server socket manager can handle multiple clients at the same time. A client socket manager is responsible for communicating with its specified server.

3.2 System Configuration

As described above, each module consists of a communication interface component and its specific functionality. Client-server socket managers are used to create communication links between modules. The configuration of the current system is shown in Figure 7.

Figure 7: System Configuration of a Dual-Arm Humanoid Robot

Robot Control Server

The robot control server is responsible for controlling the robot arms and performing 3-D collision avoidance. It sends pressure commands to the robot controller and reads the current robot joint angles at the same time. It also contains a derived object of the server socket manager for communicating with its clients. The robot control server receives socket messages, such as the arm joint angles and the positions and sizes of the virtual obstacles, from the clients. Further, it sends updates of the robot joint angles to the clients.

Camera Head Control Server

The camera head control server is responsible for sending and reading angle commands to and from the pan/tilt/verge camera head. A server socket manager object is derived to get the joint angles of the camera head from the clients. The current joint angles of the camera head are sent to its clients through socket messages.

Vision Manager Module

All image processing routines are handled by this module. It communicates with the camera head control server in order to move the camera head and track objects in the image scene.
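The 3-D position estimate that the vision manager obtains after fixation (Section 2.3) can be sketched geometrically. This assumes verging pinhole cameras separated by baseline b, with pan angles measured from the straight-ahead direction; it is a simplification of the actual camera geometry of Figure 4, which the paper does not reproduce in full.

```cpp
#include <cmath>

// Assumed fixation geometry: the left/right cameras sit at
// x = -b/2 and x = +b/2 on the baseline, each panned inward by
// thetaL / thetaR (radians, from straight ahead) so that both
// optical axes pass through the fixated target.
struct Point2 { double x, z; };

Point2 triangulate(double baseline, double thetaL, double thetaR) {
    // Left ray:  x = -b/2 + z * tan(thetaL)
    // Right ray: x = +b/2 - z * tan(thetaR)
    // Intersecting the two rays gives the target position.
    double z = baseline / (std::tan(thetaL) + std::tan(thetaR));
    double x = -baseline / 2.0 + z * std::tan(thetaL);
    return {x, z};
}
```

For a symmetric fixation (equal verge angles) the target lies on the midline, and depth shrinks as the cameras verge harder inward.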
In the scene-building phase, the sequence performed by this module is as follows:

1. Move the camera head to look at the table within the workspace.
2. Perform color segmentation with the current color model.
3. Fixate that object and estimate its 3-D position.
4. Change to the next color model and restart the sequence.

After all color models in the database have been searched, this module sends all of the objects' 3-D positions to the coordination module to begin the next phase. The module then changes to a free-running mode to perform obstacle (e.g., hand) tracking. If a hand moves into the workspace, the camera head fixates it, estimates the hand's 3-D position, and sends it to the coordination module as the obstacle information. Figure 8 shows the resulting segmentation of each object in the robot workspace used for tracking.

Figure 8: Robot Workspace and the Resulting Segmentation

4.1 Design Mode

In the design mode, a user can use a 3-D joystick to control the camera head. Live video of the real environment at the robot site is sent over the network. While the camera head scans the environment, it tracks multiple objects simultaneously. If it detects objects, information such as the number and locations of the objects is sent to the user. The user can use that information to render the virtual scene, specifying 3-D object models (box, cylinder, sphere) and their dimensions, as shown in Figure 9 (left). Further, the user can design 3-D virtual spheres with radii covering those objects, as shown in Figure 9 (right).

Figure 9: Virtual Robotic Environment in the Design Mode

Coordination Module

Since each robot arm has its own reference frame and performs multiple tasks independently and concurrently, a coordination module is needed. This module is used to coordinate the motions of both arms and communicate with the other component modules. A 3-D robot simulator with multimedia feedback is integrated into this module.
This simulator renders the virtual robot environment from the robot site and displays it at the user site. Two client socket manager objects are also created. The first is the robot client manager, which is responsible for communicating with the robot control server. The second is the camera head control client manager, which communicates with the camera head control server. A server socket manager is established to receive the positions of obstacles from the vision manager.

4 System Operation

There are three operation modes to help the user build the virtual robotic environment and control the dual-arm humanoid robot over the Internet.

4.2 Preview Mode

Once the virtual robotic environment is constructed, the user can specify a task, such as the trajectory for each arm. By running the 3-D interactive simulator, the user can preview the result of an operation before operating the real robot. Moreover, the user is allowed to reconfigure the system, such as the scene model and the starting and ending points. When a potential collision is detected, force feedback effects are generated on the force feedback joystick.

4.3 Operation Mode

When satisfied with the preview, the user can send the control commands to operate the robot. During the robot's operation, live video is sent to the user. The 3-D obstacle locator can be used to track an external dynamic obstacle, such as a human hand, so the robot system is robust to dynamic changes in the environment. When a potential collision is detected or an external force is measured by the force sensor, force feedback effects are generated on the force feedback joystick at the user site. The simulation of the virtual robotic environment is also updated based on the information obtained from the robot control server and the camera head control server.
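During operation, the collision-avoidance update of equation (1) runs in the robot control server. A one-dimensional sketch with explicit Euler integration is given below; the gains, time step, and the spring-like virtual-force law are illustrative assumptions, not the paper's actual values.

```cpp
// One-dimensional sketch of equation (1):
//   M_e * ddX_e + B_e * dX'_e + K_e * dX_e = F_ext + F_v
// integrated with an explicit Euler step to obtain a new desired
// position for the arm.

struct ImpedanceState {
    double dx;   // deviation dX_e = X - X_d
    double vel;  // time derivative of dX_e
};

// Assumed virtual-force law: a linear spring that pushes the arm
// away while the obstacle penetrates the virtual sphere of radius
// `radius` around the arm point; zero outside the sphere.
double virtualForce(double dist, double radius, double kv) {
    return (dist < radius) ? kv * (radius - dist) : 0.0;
}

// One Euler step of the impedance dynamics; the new desired
// position is X_d + s.dx after the step.
ImpedanceState step(ImpedanceState s, double fExt, double fV,
                    double Me, double Be, double Ke, double dt) {
    double acc = (fExt + fV - Be * s.vel - Ke * s.dx) / Me;
    s.vel += acc * dt;
    s.dx  += s.vel * dt;
    return s;
}
```

In the real system this runs per virtual sphere (elbow, wrist, end-effector) in 3-D, and the total computed force also drives the joystick's force feedback effect.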
Two experiments are set up to demonstrate the performance of this system. Figure 10 (top) shows collision avoidance between the dual-arm robot and five external obstacles. A virtual force is generated to push the arm away from

the obstacle.

Figure 10: Collision Avoidance between the Dual-Arm Robot and an External Obstacle (top); Collision Avoidance between the Two Arms (bottom)

The second experiment consists of moving the two arms toward each other. The result in Figure 10 (bottom) shows that new trajectories are generated to move both arms away from each other after a potential collision is detected.

5 Conclusions

This paper presented a simple but effective and inexpensive way to integrate sensors and actuators in real and virtual environments on the PC platform. With the interactive 3-D simulator, the user can control the robots and the camera head and build a virtual scene from a remote site. The advantages of the proposed system are:

1. It can be used in an off-line mode as a simulator/trainer for teleassistance.
2. It can be used on-line to implement a multimedia remote teleassisted robotic system.

Currently, this technique is implemented on the dual-arm humanoid robot in the Intelligent Robotics Lab at Vanderbilt University.

References

[1] C. Cooke and S. Stansfield, "Interactive Graphical Model Building using Telepresence and Virtual Reality," Proceedings of the 1994 IEEE Robotics and Automation Conference, San Diego, CA, May.
[2] Y. Kunii and H. Hashimoto, "Tele-teaching by Human Demonstration in Virtual Environment for Robotic Network System," Proceedings of the 1997 IEEE International Conference on Robotics and Automation.
[3] N. E. Miner and S. A. Stansfield, "An Interactive Virtual Reality Simulation System for Robot Control and Operator Training," Proceedings of the 1994 IEEE Robotics and Automation Conference, San Diego, CA, May.
[4] K. Kawamura, D. M. Wilkes, T. Pack, M. Bishay, and J. Barile, "Humanoids: Future Robots for Home and Factory," Proceedings of the First International Symposium on Humanoid Robots, Waseda University, Tokyo, Japan, October.
[5] J. Barile, M. Bishay, M. Cambron, R. Watson, R. A. Peters, and K. Kawamura, "Color-Based Initialization for Human Tracking with a Trinocular Camera System," Proceedings of the Fifth IASTED International Conference on Robotics and Manufacturing.
[6] T. Nagata, K. Honda, and Y. Teramoto, "Multirobot Plan Generation in a Continuous Domain: Planning by Use of Plan Graph and Avoiding Collisions Among Robots," IEEE Journal of Robotics and Automation, pp. 2-13, February.
[7] M. Fischer, "Efficient Path Planning Strategies for Cooperating Manipulators in Environments With Obstacles," Proceedings of the 1994 IEEE International Conference on Robotics and Automation.
[8] B. Cao, G. I. Dodds, and G. W. Irwin, "Implementation of Time-Optimal Smooth and Collision-Free Path Planning in a Two Robot Arm Environment," Proceedings of the 1995 IEEE International Conference on Robotics and Automation.
[9] S. Charoenseang, A. Srikaew, D. M. Wilkes, and K. Kawamura, "3-D Collision Avoidance for the Dual-Arm Humanoid Robot," to appear in the Proceedings of the IASTED International Conference on Robotics and Manufacturing, July.


More information

Incorporating a Software System for Robotics Control and Coordination in Mechatronics Curriculum and Research

Incorporating a Software System for Robotics Control and Coordination in Mechatronics Curriculum and Research Paper ID #15300 Incorporating a Software System for Robotics Control and Coordination in Mechatronics Curriculum and Research Dr. Maged Mikhail, Purdue University - Calumet Dr. Maged B. Mikhail, Assistant

More information

Los Alamos. DOE Office of Scientific and Technical Information LA-U R-9&%

Los Alamos. DOE Office of Scientific and Technical Information LA-U R-9&% LA-U R-9&% Title: Author(s): Submitted M: Virtual Reality and Telepresence Control of Robots Used in Hazardous Environments Lawrence E. Bronisz, ESA-MT Pete C. Pittman, ESA-MT DOE Office of Scientific

More information

KINECT CONTROLLED HUMANOID AND HELICOPTER

KINECT CONTROLLED HUMANOID AND HELICOPTER KINECT CONTROLLED HUMANOID AND HELICOPTER Muffakham Jah College of Engineering & Technology Presented by : MOHAMMED KHAJA ILIAS PASHA ZESHAN ABDUL MAJEED AZMI SYED ABRAR MOHAMMED ISHRAQ SARID MOHAMMED

More information

Perception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision

Perception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision 11-25-2013 Perception Vision Read: AIMA Chapter 24 & Chapter 25.3 HW#8 due today visual aural haptic & tactile vestibular (balance: equilibrium, acceleration, and orientation wrt gravity) olfactory taste

More information

The Control of Avatar Motion Using Hand Gesture

The Control of Avatar Motion Using Hand Gesture The Control of Avatar Motion Using Hand Gesture ChanSu Lee, SangWon Ghyme, ChanJong Park Human Computing Dept. VR Team Electronics and Telecommunications Research Institute 305-350, 161 Kajang-dong, Yusong-gu,

More information

Fast, Robust Colour Vision for the Monash Humanoid Andrew Price Geoff Taylor Lindsay Kleeman

Fast, Robust Colour Vision for the Monash Humanoid Andrew Price Geoff Taylor Lindsay Kleeman Fast, Robust Colour Vision for the Monash Humanoid Andrew Price Geoff Taylor Lindsay Kleeman Intelligent Robotics Research Centre Monash University Clayton 3168, Australia andrew.price@eng.monash.edu.au

More information

Moving Obstacle Avoidance for Mobile Robot Moving on Designated Path

Moving Obstacle Avoidance for Mobile Robot Moving on Designated Path Moving Obstacle Avoidance for Mobile Robot Moving on Designated Path Taichi Yamada 1, Yeow Li Sa 1 and Akihisa Ohya 1 1 Graduate School of Systems and Information Engineering, University of Tsukuba, 1-1-1,

More information

ROBOT-DISCOVERER: A ROLE MODEL FOR ANY INTELLIGENT AGENT. and Institute of Computer Science, Polish Academy of Sciences.

ROBOT-DISCOVERER: A ROLE MODEL FOR ANY INTELLIGENT AGENT. and Institute of Computer Science, Polish Academy of Sciences. ROBOT-DISCOVERER: A ROLE MODEL FOR ANY INTELLIGENT AGENT JAN M. _ ZYTKOW Department of Computer Science, UNC Charlotte, Charlotte, NC 28223, USA and Institute of Computer Science, Polish Academy of Sciences

More information

Traffic Control for a Swarm of Robots: Avoiding Group Conflicts

Traffic Control for a Swarm of Robots: Avoiding Group Conflicts Traffic Control for a Swarm of Robots: Avoiding Group Conflicts Leandro Soriano Marcolino and Luiz Chaimowicz Abstract A very common problem in the navigation of robotic swarms is when groups of robots

More information

Toward an Augmented Reality System for Violin Learning Support

Toward an Augmented Reality System for Violin Learning Support Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp

More information

Kinect Interface for UC-win/Road: Application to Tele-operation of Small Robots

Kinect Interface for UC-win/Road: Application to Tele-operation of Small Robots Kinect Interface for UC-win/Road: Application to Tele-operation of Small Robots Hafid NINISS Forum8 - Robot Development Team Abstract: The purpose of this work is to develop a man-machine interface for

More information

ReVRSR: Remote Virtual Reality for Service Robots

ReVRSR: Remote Virtual Reality for Service Robots ReVRSR: Remote Virtual Reality for Service Robots Amel Hassan, Ahmed Ehab Gado, Faizan Muhammad March 17, 2018 Abstract This project aims to bring a service robot s perspective to a human user. We believe

More information

Team Description 2006 for Team RO-PE A

Team Description 2006 for Team RO-PE A Team Description 2006 for Team RO-PE A Chew Chee-Meng, Samuel Mui, Lim Tongli, Ma Chongyou, and Estella Ngan National University of Singapore, 119260 Singapore {mpeccm, g0500307, u0204894, u0406389, u0406316}@nus.edu.sg

More information

Essential Understandings with Guiding Questions Robotics Engineering

Essential Understandings with Guiding Questions Robotics Engineering Essential Understandings with Guiding Questions Robotics Engineering 1 st Quarter Theme: Orientation to a Successful Laboratory Experience Student Expectations Safety Emergency MSDS Organizational Systems

More information

Haptic Rendering CPSC / Sonny Chan University of Calgary

Haptic Rendering CPSC / Sonny Chan University of Calgary Haptic Rendering CPSC 599.86 / 601.86 Sonny Chan University of Calgary Today s Outline Announcements Human haptic perception Anatomy of a visual-haptic simulation Virtual wall and potential field rendering

More information

1. INTRODUCTION: 2. EOG: system, handicapped people, wheelchair.

1. INTRODUCTION: 2. EOG: system, handicapped people, wheelchair. ABSTRACT This paper presents a new method to control and guide mobile robots. In this case, to send different commands we have used electrooculography (EOG) techniques, so that, control is made by means

More information

Virtual Grasping Using a Data Glove

Virtual Grasping Using a Data Glove Virtual Grasping Using a Data Glove By: Rachel Smith Supervised By: Dr. Kay Robbins 3/25/2005 University of Texas at San Antonio Motivation Navigation in 3D worlds is awkward using traditional mouse Direct

More information

A Feasibility Study of Time-Domain Passivity Approach for Bilateral Teleoperation of Mobile Manipulator

A Feasibility Study of Time-Domain Passivity Approach for Bilateral Teleoperation of Mobile Manipulator International Conference on Control, Automation and Systems 2008 Oct. 14-17, 2008 in COEX, Seoul, Korea A Feasibility Study of Time-Domain Passivity Approach for Bilateral Teleoperation of Mobile Manipulator

More information

Modeling and Experimental Studies of a Novel 6DOF Haptic Device

Modeling and Experimental Studies of a Novel 6DOF Haptic Device Proceedings of The Canadian Society for Mechanical Engineering Forum 2010 CSME FORUM 2010 June 7-9, 2010, Victoria, British Columbia, Canada Modeling and Experimental Studies of a Novel DOF Haptic Device

More information

Motion Control of a Three Active Wheeled Mobile Robot and Collision-Free Human Following Navigation in Outdoor Environment

Motion Control of a Three Active Wheeled Mobile Robot and Collision-Free Human Following Navigation in Outdoor Environment Proceedings of the International MultiConference of Engineers and Computer Scientists 2016 Vol I,, March 16-18, 2016, Hong Kong Motion Control of a Three Active Wheeled Mobile Robot and Collision-Free

More information

MSMS Software for VR Simulations of Neural Prostheses and Patient Training and Rehabilitation

MSMS Software for VR Simulations of Neural Prostheses and Patient Training and Rehabilitation MSMS Software for VR Simulations of Neural Prostheses and Patient Training and Rehabilitation Rahman Davoodi and Gerald E. Loeb Department of Biomedical Engineering, University of Southern California Abstract.

More information

The Haptic Impendance Control through Virtual Environment Force Compensation

The Haptic Impendance Control through Virtual Environment Force Compensation The Haptic Impendance Control through Virtual Environment Force Compensation OCTAVIAN MELINTE Robotics and Mechatronics Department Institute of Solid Mechanicsof the Romanian Academy ROMANIA octavian.melinte@yahoo.com

More information

Robot Task-Level Programming Language and Simulation

Robot Task-Level Programming Language and Simulation Robot Task-Level Programming Language and Simulation M. Samaka Abstract This paper presents the development of a software application for Off-line robot task programming and simulation. Such application

More information

Design and Control of the BUAA Four-Fingered Hand

Design and Control of the BUAA Four-Fingered Hand Proceedings of the 2001 IEEE International Conference on Robotics & Automation Seoul, Korea May 21-26, 2001 Design and Control of the BUAA Four-Fingered Hand Y. Zhang, Z. Han, H. Zhang, X. Shang, T. Wang,

More information

Design and Controll of Haptic Glove with McKibben Pneumatic Muscle

Design and Controll of Haptic Glove with McKibben Pneumatic Muscle XXVIII. ASR '2003 Seminar, Instruments and Control, Ostrava, May 6, 2003 173 Design and Controll of Haptic Glove with McKibben Pneumatic Muscle KOPEČNÝ, Lukáš Ing., Department of Control and Instrumentation,

More information

The Humanoid Robot ARMAR: Design and Control

The Humanoid Robot ARMAR: Design and Control The Humanoid Robot ARMAR: Design and Control Tamim Asfour, Karsten Berns, and Rüdiger Dillmann Forschungszentrum Informatik Karlsruhe, Haid-und-Neu-Str. 10-14 D-76131 Karlsruhe, Germany asfour,dillmann

More information

Space Research expeditions and open space work. Education & Research Teaching and laboratory facilities. Medical Assistance for people

Space Research expeditions and open space work. Education & Research Teaching and laboratory facilities. Medical Assistance for people Space Research expeditions and open space work Education & Research Teaching and laboratory facilities. Medical Assistance for people Safety Life saving activity, guarding Military Use to execute missions

More information

Robust Hand Gesture Recognition for Robotic Hand Control

Robust Hand Gesture Recognition for Robotic Hand Control Robust Hand Gesture Recognition for Robotic Hand Control Ankit Chaudhary Robust Hand Gesture Recognition for Robotic Hand Control 123 Ankit Chaudhary Department of Computer Science Northwest Missouri State

More information

Affordance based Human Motion Synthesizing System

Affordance based Human Motion Synthesizing System Affordance based Human Motion Synthesizing System H. Ishii, N. Ichiguchi, D. Komaki, H. Shimoda and H. Yoshikawa Graduate School of Energy Science Kyoto University Uji-shi, Kyoto, 611-0011, Japan Abstract

More information

Jane Li. Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute

Jane Li. Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute Jane Li Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute Use an example to explain what is admittance control? You may refer to exoskeleton

More information

Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot

Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Quy-Hung Vu, Byeong-Sang Kim, Jae-Bok Song Korea University 1 Anam-dong, Seongbuk-gu, Seoul, Korea vuquyhungbk@yahoo.com, lovidia@korea.ac.kr,

More information

A Very High Level Interface to Teleoperate a Robot via Web including Augmented Reality

A Very High Level Interface to Teleoperate a Robot via Web including Augmented Reality A Very High Level Interface to Teleoperate a Robot via Web including Augmented Reality R. Marín, P. J. Sanz and J. S. Sánchez Abstract The system consists of a multirobot architecture that gives access

More information

Robots Learning from Robots: A proof of Concept Study for Co-Manipulation Tasks. Luka Peternel and Arash Ajoudani Presented by Halishia Chugani

Robots Learning from Robots: A proof of Concept Study for Co-Manipulation Tasks. Luka Peternel and Arash Ajoudani Presented by Halishia Chugani Robots Learning from Robots: A proof of Concept Study for Co-Manipulation Tasks Luka Peternel and Arash Ajoudani Presented by Halishia Chugani Robots learning from humans 1. Robots learn from humans 2.

More information

LASER ASSISTED COMBINED TELEOPERATION AND AUTONOMOUS CONTROL

LASER ASSISTED COMBINED TELEOPERATION AND AUTONOMOUS CONTROL ANS EPRRSD - 13 th Robotics & remote Systems for Hazardous Environments 11 th Emergency Preparedness & Response Knoxville, TN, August 7-10, 2011, on CD-ROM, American Nuclear Society, LaGrange Park, IL

More information

Cognitive robots and emotional intelligence Cloud robotics Ethical, legal and social issues of robotic Construction robots Human activities in many

Cognitive robots and emotional intelligence Cloud robotics Ethical, legal and social issues of robotic Construction robots Human activities in many Preface The jubilee 25th International Conference on Robotics in Alpe-Adria-Danube Region, RAAD 2016 was held in the conference centre of the Best Western Hotel M, Belgrade, Serbia, from 30 June to 2 July

More information

Dipartimento di Elettronica Informazione e Bioingegneria Robotics

Dipartimento di Elettronica Informazione e Bioingegneria Robotics Dipartimento di Elettronica Informazione e Bioingegneria Robotics Behavioral robotics @ 2014 Behaviorism behave is what organisms do Behaviorism is built on this assumption, and its goal is to promote

More information

Vishnu Nath. Usage of computer vision and humanoid robotics to create autonomous robots. (Ximea Currera RL04C Camera Kit)

Vishnu Nath. Usage of computer vision and humanoid robotics to create autonomous robots. (Ximea Currera RL04C Camera Kit) Vishnu Nath Usage of computer vision and humanoid robotics to create autonomous robots (Ximea Currera RL04C Camera Kit) Acknowledgements Firstly, I would like to thank Ivan Klimkovic of Ximea Corporation,

More information

Design a Model and Algorithm for multi Way Gesture Recognition using Motion and Image Comparison

Design a Model and Algorithm for multi Way Gesture Recognition using Motion and Image Comparison e-issn 2455 1392 Volume 2 Issue 10, October 2016 pp. 34 41 Scientific Journal Impact Factor : 3.468 http://www.ijcter.com Design a Model and Algorithm for multi Way Gesture Recognition using Motion and

More information

Formation and Cooperation for SWARMed Intelligent Robots

Formation and Cooperation for SWARMed Intelligent Robots Formation and Cooperation for SWARMed Intelligent Robots Wei Cao 1 Yanqing Gao 2 Jason Robert Mace 3 (West Virginia University 1 University of Arizona 2 Energy Corp. of America 3 ) Abstract This article

More information

Masatoshi Ishikawa, Akio Namiki, Takashi Komuro, and Idaku Ishii

Masatoshi Ishikawa, Akio Namiki, Takashi Komuro, and Idaku Ishii 1ms Sensory-Motor Fusion System with Hierarchical Parallel Processing Architecture Masatoshi Ishikawa, Akio Namiki, Takashi Komuro, and Idaku Ishii Department of Mathematical Engineering and Information

More information

The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment-

The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment- The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment- Hitoshi Hasunuma, Kensuke Harada, and Hirohisa Hirukawa System Technology Development Center,

More information

MATLAB is a high-level programming language, extensively

MATLAB is a high-level programming language, extensively 1 KUKA Sunrise Toolbox: Interfacing Collaborative Robots with MATLAB Mohammad Safeea and Pedro Neto Abstract Collaborative robots are increasingly present in our lives. The KUKA LBR iiwa equipped with

More information

Hanuman KMUTT: Team Description Paper

Hanuman KMUTT: Team Description Paper Hanuman KMUTT: Team Description Paper Wisanu Jutharee, Sathit Wanitchaikit, Boonlert Maneechai, Natthapong Kaewlek, Thanniti Khunnithiwarawat, Pongsakorn Polchankajorn, Nakarin Suppakun, Narongsak Tirasuntarakul,

More information

Internet-based Teleoperation of a Robot Manipulator for Education

Internet-based Teleoperation of a Robot Manipulator for Education nternet-based Teleoperation of a Robot Manipulator for Education Xiaoli Yang, Qing Chen2, Dorina C. Petri$, Emil M. Petrid Lakehead Universiy, Thunder Bay, ON, Canada 2University of Ottawa, Ottawa, ON,

More information

Telematic Control and Communication with Industrial Robot over Ethernet Network

Telematic Control and Communication with Industrial Robot over Ethernet Network Telematic Control and Communication with Industrial Robot over Ethernet Network M.W. Abdullah*, H. Roth, J. Wahrburg Institute of Automatic Control Engineering University of Siegen Siegen, Germany *abdullah@zess.uni-siegen.de

More information

2. Introduction to Computer Haptics

2. Introduction to Computer Haptics 2. Introduction to Computer Haptics Seungmoon Choi, Ph.D. Assistant Professor Dept. of Computer Science and Engineering POSTECH Outline Basics of Force-Feedback Haptic Interfaces Introduction to Computer

More information

VLSI Implementation of Impulse Noise Suppression in Images

VLSI Implementation of Impulse Noise Suppression in Images VLSI Implementation of Impulse Noise Suppression in Images T. Satyanarayana 1, A. Ravi Chandra 2 1 PG Student, VRS & YRN College of Engg. & Tech.(affiliated to JNTUK), Chirala 2 Assistant Professor, Department

More information

Interactive Simulation: UCF EIN5255. VR Software. Audio Output. Page 4-1

Interactive Simulation: UCF EIN5255. VR Software. Audio Output. Page 4-1 VR Software Class 4 Dr. Nabil Rami http://www.simulationfirst.com/ein5255/ Audio Output Can be divided into two elements: Audio Generation Audio Presentation Page 4-1 Audio Generation A variety of audio

More information

Face Detector using Network-based Services for a Remote Robot Application

Face Detector using Network-based Services for a Remote Robot Application Face Detector using Network-based Services for a Remote Robot Application Yong-Ho Seo Department of Intelligent Robot Engineering, Mokwon University Mokwon Gil 21, Seo-gu, Daejeon, Republic of Korea yhseo@mokwon.ac.kr

More information

VIRTUAL REALITY Introduction. Emil M. Petriu SITE, University of Ottawa

VIRTUAL REALITY Introduction. Emil M. Petriu SITE, University of Ottawa VIRTUAL REALITY Introduction Emil M. Petriu SITE, University of Ottawa Natural and Virtual Reality Virtual Reality Interactive Virtual Reality Virtualized Reality Augmented Reality HUMAN PERCEPTION OF

More information

Cognition & Robotics. EUCog - European Network for the Advancement of Artificial Cognitive Systems, Interaction and Robotics

Cognition & Robotics. EUCog - European Network for the Advancement of Artificial Cognitive Systems, Interaction and Robotics Cognition & Robotics Recent debates in Cognitive Robotics bring about ways to seek a definitional connection between cognition and robotics, ponder upon the questions: EUCog - European Network for the

More information

FUNDAMENTALS ROBOT TECHNOLOGY. An Introduction to Industrial Robots, T eleoperators and Robot Vehicles. D J Todd. Kogan Page

FUNDAMENTALS ROBOT TECHNOLOGY. An Introduction to Industrial Robots, T eleoperators and Robot Vehicles. D J Todd. Kogan Page FUNDAMENTALS of ROBOT TECHNOLOGY An Introduction to Industrial Robots, T eleoperators and Robot Vehicles D J Todd &\ Kogan Page First published in 1986 by Kogan Page Ltd 120 Pentonville Road, London Nl

More information

Efficient Gesture Interpretation for Gesture-based Human-Service Robot Interaction

Efficient Gesture Interpretation for Gesture-based Human-Service Robot Interaction Efficient Gesture Interpretation for Gesture-based Human-Service Robot Interaction D. Guo, X. M. Yin, Y. Jin and M. Xie School of Mechanical and Production Engineering Nanyang Technological University

More information

Team Description Paper

Team Description Paper Tinker@Home 2016 Team Description Paper Jiacheng Guo, Haotian Yao, Haocheng Ma, Cong Guo, Yu Dong, Yilin Zhu, Jingsong Peng, Xukang Wang, Shuncheng He, Fei Xia and Xunkai Zhang Future Robotics Club(Group),

More information

Control a 2-Axis Servomechanism by Gesture Recognition using a Generic WebCam

Control a 2-Axis Servomechanism by Gesture Recognition using a Generic WebCam Tavares, J. M. R. S.; Ferreira, R. & Freitas, F. / Control a 2-Axis Servomechanism by Gesture Recognition using a Generic WebCam, pp. 039-040, International Journal of Advanced Robotic Systems, Volume

More information

Live Hand Gesture Recognition using an Android Device

Live Hand Gesture Recognition using an Android Device Live Hand Gesture Recognition using an Android Device Mr. Yogesh B. Dongare Department of Computer Engineering. G.H.Raisoni College of Engineering and Management, Ahmednagar. Email- yogesh.dongare05@gmail.com

More information

ARTIFICIAL ROBOT NAVIGATION BASED ON GESTURE AND SPEECH RECOGNITION

ARTIFICIAL ROBOT NAVIGATION BASED ON GESTURE AND SPEECH RECOGNITION ARTIFICIAL ROBOT NAVIGATION BASED ON GESTURE AND SPEECH RECOGNITION ABSTRACT *Miss. Kadam Vaishnavi Chandrakumar, ** Prof. Hatte Jyoti Subhash *Research Student, M.S.B.Engineering College, Latur, India

More information

FOCAL LENGTH CHANGE COMPENSATION FOR MONOCULAR SLAM

FOCAL LENGTH CHANGE COMPENSATION FOR MONOCULAR SLAM FOCAL LENGTH CHANGE COMPENSATION FOR MONOCULAR SLAM Takafumi Taketomi Nara Institute of Science and Technology, Japan Janne Heikkilä University of Oulu, Finland ABSTRACT In this paper, we propose a method

More information

GE 320: Introduction to Control Systems

GE 320: Introduction to Control Systems GE 320: Introduction to Control Systems Laboratory Section Manual 1 Welcome to GE 320.. 1 www.softbankrobotics.com 1 1 Introduction This section summarizes the course content and outlines the general procedure

More information

DEVELOPMENT OF A TELEOPERATION SYSTEM AND AN OPERATION ASSIST USER INTERFACE FOR A HUMANOID ROBOT

DEVELOPMENT OF A TELEOPERATION SYSTEM AND AN OPERATION ASSIST USER INTERFACE FOR A HUMANOID ROBOT DEVELOPMENT OF A TELEOPERATION SYSTEM AND AN OPERATION ASSIST USER INTERFACE FOR A HUMANOID ROBOT Shin-ichiro Kaneko, Yasuo Nasu, Shungo Usui, Mitsuhiro Yamano, Kazuhisa Mitobe Yamagata University, Jonan

More information

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp

More information

League <BART LAB AssistBot (THAILAND)>

League <BART LAB AssistBot (THAILAND)> RoboCup@Home League 2013 Jackrit Suthakorn, Ph.D.*, Woratit Onprasert, Sakol Nakdhamabhorn, Rachot Phuengsuk, Yuttana Itsarachaiyot, Choladawan Moonjaita, Syed Saqib Hussain

More information

John Henry Foster INTRODUCING OUR NEW ROBOTICS LINE. Imagine Your Business...better. Automate Virtually Anything jhfoster.

John Henry Foster INTRODUCING OUR NEW ROBOTICS LINE. Imagine Your Business...better. Automate Virtually Anything jhfoster. John Henry Foster INTRODUCING OUR NEW ROBOTICS LINE Imagine Your Business...better. Automate Virtually Anything 800.582.5162 John Henry Foster 800.582.5162 What if you could automate the repetitive manual

More information

IMPLEMENTING MULTIPLE ROBOT ARCHITECTURES USING MOBILE AGENTS

IMPLEMENTING MULTIPLE ROBOT ARCHITECTURES USING MOBILE AGENTS IMPLEMENTING MULTIPLE ROBOT ARCHITECTURES USING MOBILE AGENTS L. M. Cragg and H. Hu Department of Computer Science, University of Essex, Wivenhoe Park, Colchester, CO4 3SQ E-mail: {lmcrag, hhu}@essex.ac.uk

More information

A Virtual Reality Tool for Teleoperation Research

A Virtual Reality Tool for Teleoperation Research A Virtual Reality Tool for Teleoperation Research Nancy RODRIGUEZ rodri@irit.fr Jean-Pierre JESSEL jessel@irit.fr Patrice TORGUET torguet@irit.fr IRIT Institut de Recherche en Informatique de Toulouse

More information

Chapter 1 Introduction

Chapter 1 Introduction Chapter 1 Introduction It is appropriate to begin the textbook on robotics with the definition of the industrial robot manipulator as given by the ISO 8373 standard. An industrial robot manipulator is

More information

Gesture Recognition with Real World Environment using Kinect: A Review

Gesture Recognition with Real World Environment using Kinect: A Review Gesture Recognition with Real World Environment using Kinect: A Review Prakash S. Sawai 1, Prof. V. K. Shandilya 2 P.G. Student, Department of Computer Science & Engineering, Sipna COET, Amravati, Maharashtra,

More information

Nao Devils Dortmund. Team Description for RoboCup Matthias Hofmann, Ingmar Schwarz, and Oliver Urbann

Nao Devils Dortmund. Team Description for RoboCup Matthias Hofmann, Ingmar Schwarz, and Oliver Urbann Nao Devils Dortmund Team Description for RoboCup 2014 Matthias Hofmann, Ingmar Schwarz, and Oliver Urbann Robotics Research Institute Section Information Technology TU Dortmund University 44221 Dortmund,

More information

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7

More information

Remote Control Based Hybrid-Structure Robot Design for Home Security Applications

Remote Control Based Hybrid-Structure Robot Design for Home Security Applications Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems October 9-15, 2006, Beijing, China Remote Control Based Hybrid-Structure Robot Design for Home Security Applications

More information

Wednesday, October 29, :00-04:00pm EB: 3546D. TELEOPERATION OF MOBILE MANIPULATORS By Yunyi Jia Advisor: Prof.

Wednesday, October 29, :00-04:00pm EB: 3546D. TELEOPERATION OF MOBILE MANIPULATORS By Yunyi Jia Advisor: Prof. Wednesday, October 29, 2014 02:00-04:00pm EB: 3546D TELEOPERATION OF MOBILE MANIPULATORS By Yunyi Jia Advisor: Prof. Ning Xi ABSTRACT Mobile manipulators provide larger working spaces and more flexibility

More information