On-site Humanoid Navigation Through Hand-in-Hand Interface
Proceedings of the IEEE-RAS International Conference on Humanoid Robots

On-site Humanoid Navigation Through Hand-in-Hand Interface

Takashi Ogura, Atsushi Haneda, Kei Okada, Masayuki Inaba
Department of Mechano-Informatics, Graduate School of Information Science and Technology, The University of Tokyo
Hongo, Bunkyo-ku, Tokyo, Japan
{ogura, hane, k-okada, inaba}@jsk.t.u-tokyo.ac.jp

Abstract - To realize humanoid robots that support people in their daily lives, an on-site behavior navigation system is needed, in which the robot behaves autonomously and also reacts to the intentions of a human. This paper proposes a new on-site navigation interface for humanoid robots in which a human leads the robot hand-in-hand. Previous humanoid research has aimed at autonomous intelligence able to survive in a real environment without any help from a person. To use a humanoid robot in the real world, however, it is necessary to take the robot to various locations and make it manipulate and use various kinds of objects and tools; that is, on-site navigation through which a person can induce humanoid behavior is needed. In the experiments, an operator takes a humanoid robot to a kitchen and teaches it to hold a kettle using this interaction kernel, and the taught motion sequence is played back by planning in different environments.

Index Terms - Interaction, Motion Planning, Humanoid Software System, Teaching of Humanoids

Fig. 1. Online teaching for humanoid robots through the hand-in-hand interface.

I. INTRODUCTION

One of the most anticipated applications of humanoid robots is the support of daily life. What functions are necessary for that? Previous humanoid research has aimed at autonomous intelligence able to survive in a real environment without any help from a person.
To use a humanoid robot in the real world, however, it is necessary to take the robot to various locations and make it manipulate and use various kinds of objects and tools; in other words, on-site navigation through which a person can induce humanoid behavior is needed, as in Fig. 1. To realize humanoid robots that support people in their daily lives, we need an on-site behavior navigation system in which the robot behaves autonomously and also reacts to the intentions of a human. This paper proposes a new on-site navigation interface for humanoid robots in which a human leads the robot hand-in-hand. With this interface, the robot and the human look like a child and a parent. Previous on-site navigation systems can be classified into two groups: direct navigation approaches, in which a human controls the humanoid in a master-slave manner with some device, and approaches in which a human teaches by demonstration or dialogue. Our hand-in-hand interface belongs to the direct navigation approaches; however, it is more intuitive than other direct approaches such as those using joysticks [1], [2], and it is convenient because the person does not need to use any device. We show four experiments with the proposed hand-in-hand interface. The first is navigation of walking on flat ground: the operator leads the robot from inside an elevator to a corridor. The second is navigation of going upstairs and downstairs. Both are direct navigation that uses the change of hand position under impedance control as input. In the third experiment, the robot changes the result of path planning in response to interruptions from the operator; this shows that the operator can not only control walking directly but also change the result of path planning through navigation. Finally, a teaching system built on the hand-in-hand interface is described: the operator takes the robot to a kitchen by leading it by the hand and teaches manipulation of a kettle. II.
HUMANOID NAVIGATION INTERFACE For on-site humanoid navigation, two properties are important: briefness, meaning that a human can navigate the robot without any training, and adaptability, meaning that the robot can reproduce a memorized behavior in different places. This section discusses navigation methods for humanoid robots from this point of view.
A. Navigation Level

TABLE I
DEVICES FOR ON-LINE HUMANOID NAVIGATION

input type | devices
symbolized | microphones (voice recognition, sound), buttons, switches, keyboards
numerical  | joysticks, cameras (vision processing), master-slave devices, bodies (direct teaching)

For a humanoid robot that can perform various tasks, we first discuss what to teach, and only then how to teach it. Humanoid navigation can be classified into the following five levels:
- direct motion teaching
- assistance or modification of motions
- instruction of targets or behaviors
- teaching of the states or phase of the robot
- navigation of recognition for planning or motion

Motion teaching or modification needs numerical inputs, whereas instruction of targets or teaching of phases requires symbolized inputs. Since all of these levels are important for humanoids, the navigation system should accept both numerical and symbolized inputs.

B. Navigation Devices for On-site Navigation

Table I shows devices for on-line humanoid navigation, classified into symbolized input methods and numerical input methods. Numerical methods, such as a joystick or a master-slave cockpit [3], provide analog values, while symbolized methods, such as voice recognition [4] or vision processing [5], [6], deal with symbolic values. Among these, the methods in which the operator does not have to hold any device are voice recognition, vision processing, and direct teaching. Research on direct teaching of humanoids through their bodies is very rare, although it is intuitive.

C. Navigation Methods through Bodies

It is difficult to apply body-contact inputs over the whole body of a life-size humanoid, because it is large and has many degrees of freedom; doing so would be dangerous and troublesome. Therefore, we use only the hands as inputs and control the whole body from them. We call this interface the hand-in-hand interface.
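The requirement that a navigation system accept both input classes of Table I can be illustrated with a small sketch. This is a hypothetical front end, not the paper's software: numerical inputs (e.g. hand displacements from impedance control) update a continuous command, while symbolized inputs (e.g. recognized words) switch the robot's behavior. All names and values are invented.

```python
# Hypothetical sketch of a navigation front end accepting both
# input classes of Table I: numerical inputs drive continuous
# commands, symbolized inputs switch the behavior mode.
class NavigationInterface:
    def __init__(self):
        self.mode = "idle"
        self.velocity = (0.0, 0.0)

    def on_numerical(self, dx, dy, gain=1.0):
        """Numerical input, e.g. a hand displacement [m]."""
        if self.mode == "walk":
            self.velocity = (gain * dx, gain * dy)

    def on_symbolized(self, word):
        """Symbolized input, e.g. a recognized voice command."""
        if word == "start":
            self.mode = "walk"
        elif word == "stop":
            self.mode, self.velocity = "idle", (0.0, 0.0)

# A voice command switches the mode; a hand displacement then
# sets the walking velocity.
nav = NavigationInterface()
nav.on_symbolized("start")
nav.on_numerical(0.05, 0.0)
```

The point of the split is that a single dispatcher can merge both device classes of Table I without the operator carrying any equipment.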
If the humanoid robot has force sensors on its hands, the measured forces can be used as inputs and the robot controlled by interpreting them according to the situation. However, if the force is used directly, the operation gives no feedback to the operator. Making the arms compliant and using the positions of the hands as inputs makes feedback of the operation possible. The features of this interface are interaction as with a human, no devices and ease of use, and force-feedback control.

Fig. 2. Software components that compose the Humanoid Interaction Kernel and their configuration: force sensors, cameras, and a microphone feed impedance control, vision processes (color extraction, stereo vision of hand positions), and voice recognition; these drive the Cost Changer, motion planners (manipulation planner, path planner), Walking Navigator, online walking pattern generator, and motion stabilizer over the three-dimensional models of the robot and its environment.

a) Interaction as with a human: With this interface, the operator can lead the robot somewhere by the hand and teach things hand-in-hand, just as parents do with their children. It is therefore an intuitive interface.

b) No devices and easy: This interface needs no devices, and the operator can control the whole body by touching only the hands. This enables on-site navigation.

c) Force-feedback control: Impedance control realizes force feedback: if the operator pulls a hand too strongly, the robot reacts strongly. It is also superior in that the input forces are reflected in the positions of the hands.

III. IMPLEMENTATION OF THE NAVIGATION SYSTEM THROUGH THE HAND-IN-HAND INTERFACE

This section shows a concrete implementation of the hand-in-hand navigation system. Fig. 2 shows the software system configuration that composes the navigation system through the hand-in-hand interface.
The inputs are force sensors, cameras, and a microphone. Impedance control of the hand positions is the basic function of this system; the Cost Changer and the Walking Navigator use its output. The motion planners use three-dimensional models for motion generation. These models are built in EusLisp [7], a Lisp dialect with a three-dimensional modeling system and multi-threading. We used the Galatea toolkit [8] for the voice recognition and voice synthesis programs and made them controllable from EusLisp; the vision processes are also controlled from EusLisp.

IV. WALKING NAVIGATION THROUGH THE HAND-IN-HAND INTERFACE

A. Navigation of hand position and posture by impedance control

Being able to navigate the position and posture of the robot's hands online is a basic function, both for teaching and for navigating humanoid motions directly. Impedance control of the position and posture of the hands, using the 6-axis force sensors mounted on them, realizes this function.
The joints of humanoid robots are often position-controlled. Eq. (1) is the impedance-control equation for a position-control system:

M_d (y_t - 2y_{t-1} + y_{t-2}) / Δt² + D_d (y_t - y_{t-1}) / Δt + K_d (y_t - y_d) = F    (1)

where y_t is the position and posture of the hand at time t, y_{t-1} is y at time t - Δt, y_{t-2} is y at time t - 2Δt, and Δt is the digitizing unit time. Solving eq. (1) for y_t gives

y_t = [F + M_d (2y_{t-1} - y_{t-2}) / Δt² + D_d y_{t-1} / Δt + K_d y_d] / (M_d / Δt² + D_d / Δt + K_d).    (2)

B. Navigation of walking by change of hand position

This subsection shows the method for navigating walking by leading the humanoid robot's hands. It is superior to other methods in two respects. First, it is natural, because it closely resembles human behavior: the scene in which the operator takes the robot anywhere by its hand is like parents leading their child by the hand. Second, it is a device-less method: in an online system for daily life, the operator cannot be expected to use any device, since he or she will be an ordinary person without any knowledge of robot control. Both aspects are very important. Furthermore, it is important that the robot announces its state by voice synthesis when it starts walking, stops walking, or receives too large an input; this output is useful for keeping the interaction safe.

The simplest way to realize this method is to use the force sensors mounted on the hands. The force can be converted directly into the walking velocity, but this gives the operator no feedback: even if he or she pulls the hands strongly enough to upset the robot, there is no way to know it. If instead the change of position under impedance control is converted into the walking velocity, the required force becomes smaller and the operator can feel how fast he or she is navigating. Eq. (3) is an implementation of the walking-velocity decision:

V = (V_x, V_y, V_θ) = (K_x Σ_i Δy_{xi}, K_y Σ_i Δy_{yi}, K_θ Σ_i Δy_{yi} / Σ_i Δy_{xi})    (3)

where V is the walking velocity and K_x, K_y, and K_θ are scalar weights.
Δy_{xi} denotes the change of hand position in direction x for arm i, with i = 0, 1.

Fig. 3 shows the experiment with this navigation method. The humanoid starts inside an elevator; the operator holds the robot's hands and pulls to navigate it, taking it out of the elevator and around the corner.

C. Navigation of walking on stairs

This system can navigate going upstairs and downstairs in the same way as level walking: the input is the change of hand position under impedance control, as for walking on flat ground.

Fig. 3. Walking navigation by leading the robot's hand from inside an elevator to a corridor.

1) Known stairs: If the height and depth of the stairs are known, the input can be used as a switch: if Σ_i Δy_{zi} > Δz_up, go upstairs; else if Σ_i Δy_{zi} < Δz_down, go downstairs. An experimental result is shown in Fig. 4. When the operator lifts the robot's hands, it starts walking up the stairs; when the operator lowers them, it walks downstairs.

2) Unknown stairs: On unknown stairs, the operator must teach the height and depth of the steps by leading the position of the swing leg. Once taught, the robot can climb the stairs continuously, as with known stairs.

V. CHANGING THE PLAN BY INTERRUPTION

This section shows a system that not only navigates motion directly but also steers autonomous motion planning through the operator's navigation. Changing the search costs enables this mechanism; we explain the method through an experiment. Fig. 5 shows the experimental environment and its three-dimensional geometric model. The details of the real environment are not modeled, and there are obstacles on the right-hand side of the photograph that do not appear in the model, so the planner may generate a path through them. In such a situation, if the operator can redirect the path with little effort, navigation of the autonomous system is effective.
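The hand-displacement pipeline above can be sketched as a scalar toy version: eq. (2) for the compliant hand position, eq. (3) (without the turning term) for the walking velocity, and the stair switch of Sec. IV-C. All masses, gains, time steps, and thresholds below are invented for illustration; they are not the paper's values.

```python
# Toy per-axis sketch of the hand-in-hand input pipeline.
# Invented parameters throughout.

def impedance_step(F, y_prev, y_prev2, y_d, M_d=1.0, D_d=20.0, K_d=50.0, dt=0.005):
    """One step of eq. (2): solve eq. (1) for y_t."""
    num = (F + M_d * (2.0 * y_prev - y_prev2) / dt**2
             + D_d * y_prev / dt + K_d * y_d)
    den = M_d / dt**2 + D_d / dt + K_d
    return num / den

def walking_velocity(dx, dy, Kx=1.0, Ky=1.0):
    """Eq. (3) without the turning term: dx[i], dy[i] are the
    displacements of hand i from its rest position [m]."""
    return Kx * sum(dx), Ky * sum(dy)

def stair_command(dz, dz_up=0.03, dz_down=-0.03):
    """Stair switch of Sec. IV-C on the summed vertical displacement."""
    s = sum(dz)
    if s > dz_up:
        return "upstairs"
    if s < dz_down:
        return "downstairs"
    return "level"

# Pulling both hands 5 cm forward commands a forward walk; with zero
# applied force, the compliant hand settles back to its rest position.
vx, vy = walking_velocity([0.05, 0.05], [0.0, 0.0])
y_prev = y_prev2 = 0.1
for _ in range(2000):
    y_new = impedance_step(0.0, y_prev, y_prev2, 0.0)
    y_prev2, y_prev = y_prev, y_new
```

Converting displacement rather than raw force into velocity is what gives the operator feedback: the hand visibly deflects in proportion to how hard it is being led.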
The robot has an environmental model beforehand, as shown in Fig. 2, and searches for a path to the given target using it, as described in detail below. This section describes
the realization of path navigation by changing the search cost.

Fig. 4. Navigation of going upstairs and downstairs.

Fig. 5. An environment of everyday life and the 3D geometric map used in the experiment. Though the real world is similar to the 3D model, there are always modeling errors and many unpredictable things; there are humans and corrugated cartons in the circled area.

The left figure of Fig. 7 shows the three-dimensional model of the environment with the given start and goal points. The path planner generated a clockwise path, and the robot started walking (Fig. 8(1)). Fig. 8(2) shows the scene in which the operator navigates the path: the Cost Changer changes the search cost so that a new path is generated in the navigated direction (the costs are shown in Fig. 6), and the path is replanned (right figure of Fig. 7). The results are shown in Fig. 8.

The method uses A* search. When the operator navigates the robot at state x_navi, the search cost g is changed as in eq. (4):

g = Σ K(x) Δx,   K(x) = C - K exp(-‖x - x_navi‖² / (2σ²)) K_dir^navi    (4)

where x is the state of the robot, K(x) is the search-cost coefficient, and Δx is the distance moved in a unit step; the sum of K(x) Δx is the cost g. ‖x - x_navi‖² measures the distance from the state indicated by the operator's navigation, and K_dir^navi makes the coefficient smaller than its default value near that state.

Fig. 6. Change of the cost g by on-site navigation. The areas covered with obstacles are yellow. The two left figures are the f-cost maps used in the initial plan (θ = 0 and θ = π/2); they generate a clockwise path. Because of the human navigation, the maps are changed into the two right maps, so that a counter-clockwise path is created.

Fig. 7. Change of the path plan by on-site navigation. The left picture is the initial plan; the right is the plan induced by the human navigation (operator's navigation to the left, unknown obstacles, replanning from start to goal).

VI.
WALKING AND MANIPULATION TEACHING SYSTEM

A. Locomotion Planner

The Locomotion Planner makes a walking plan from the start position to the goal. Fig. 9 shows the planning sequence. The planner takes the geometric models of the robot and the environment together with the start and goal points. It first builds the bounding cylinder of the robot and constructs a grid map of the space the robot can pass through (Fig. 9(2)). It then searches for the shortest path for the cylinder by A* and generates the robot's walking path along it.

B. Manipulation Planner

The Manipulation Planner makes a whole-body motion plan from the environmental geometric model and the target position and posture. It generates motions using whole-body inverse kinematics with various types of constraints [9]; statically stable full-body postures and collision-free motions can be generated by this planner [10]. Fig. 10 shows the motion sequence of grasping the target (the handle of a red kettle).
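The two planner ingredients above, the grid map built from the robot's bounding cylinder (Sec. VI-A) and an A* search whose cost is lowered near an operator-navigated state in the spirit of eq. (4) (Sec. V), can be sketched together in a toy 2-D version. The grid size, cylinder radius, and cost constants are all invented; this is not the paper's implementation.

```python
import heapq, math

# Toy 2-D planning sketch: (1) mark grid cells whose center lies closer
# to an obstacle than the bounding-cylinder radius as blocked;
# (2) run A* with a step cost reduced near the navigated state x_navi.
# All constants are invented for illustration.

def blocked_cells(obstacles, n, cell, radius):
    """Cells within the cylinder radius of some obstacle point [m]."""
    blocked = set()
    for ix in range(n):
        for iy in range(n):
            cx, cy = (ix + 0.5) * cell, (iy + 0.5) * cell
            if any(math.hypot(cx - ox, cy - oy) < radius for ox, oy in obstacles):
                blocked.add((ix, iy))
    return blocked

def step_cost(x, x_navi, C=1.0, K=0.8, sigma=2.0, k_dir=1.0):
    """Cost coefficient, Gaussian-discounted near x_navi (cf. eq. (4))."""
    d2 = (x[0] - x_navi[0]) ** 2 + (x[1] - x_navi[1]) ** 2
    return C - K * math.exp(-d2 / (2.0 * sigma ** 2)) * k_dir

def astar(start, goal, blocked, x_navi, n):
    openq, g, came = [(0.0, start)], {start: 0.0}, {}
    while openq:
        _, x = heapq.heappop(openq)
        if x == goal:
            path = [x]
            while x in came:
                x = came[x]
                path.append(x)
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (x[0] + dx, x[1] + dy)
            if not (0 <= nb[0] < n and 0 <= nb[1] < n) or nb in blocked:
                continue
            ng = g[x] + step_cost(nb, x_navi)
            if ng < g.get(nb, float("inf")):
                g[nb], came[nb] = ng, x
                h = abs(goal[0] - nb[0]) + abs(goal[1] - nb[1])
                heapq.heappush(openq, (ng + 0.2 * h, nb))  # 0.2 = min step cost
    return None

# A wall across a 10x10 grid leaves gaps at top and bottom; navigating
# toward the top gap (x_navi near the top) pulls the plan through it.
wall = [(0.55, 0.1 * y + 0.05) for y in range(2, 8)]
blocked = blocked_cells(wall, n=10, cell=0.1, radius=0.04)
path = astar((0, 5), (9, 5), blocked, x_navi=(5, 0), n=10)
```

Because the heuristic is scaled by the minimum step cost, it stays admissible even after the Gaussian discount, so the replanned path is optimal under the modified cost field.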
Fig. 8. The scene of a real-world navigation experiment (panels (1)-(4)). The system planned a global path and the robot started walking toward its left. Because the human operator saw cardboard boxes on the path, he navigated the robot in the opposite direction. This changed the cost-coefficient vectors of that state and its surrounding states, and in consequence the robot changed its plan and its walking direction.

C. Evaluation experiment of the walking and manipulation teaching system

This section presents an experiment using the humanoid HRP-2 [11] that verifies the system discussed above. In a kitchen such as we use in daily life, the operator takes the robot in front of a kettle and teaches it to pour tea; after the teaching phase, the robot replays the whole-body motions using the path planner and the motion planner. Fig. 11 explains the software system for the experiment. The Motion Navigators are composed of a Head Navigator, a Walking Navigator, and an Arm Navigator; the Walking Navigator enables the navigation of walking by leading the robot's hand. The operator teaches a target point for locomotion by leading the hand of the humanoid robot, then teaches manipulation and grasping directly. After this teaching phase, the robot replays the motion using the planners explained in this paper. Fig. 14 shows the experimental environment and results: although the initial position of the robot was changed and an obstacle (a table) appeared, the robot could reach the target point taught in the teaching phase. The environment map and the start position are given, however. Next, the robot finds the kettle as shown in Fig. 13 and grasps it: after finding it by extracting the kettle's color, stereo vision calculates its 3D position, and the motion planner generates a grasping motion using inverse kinematics.

VII. CONCLUSION

We have described an on-site humanoid navigation interface that requires no devices and is intuitive.
This interface enabled us to navigate the walking of robots by leading

Fig. 9. Walking path planner. (1) The start point, the goal point, the geometric environment model, and the robot model are given. (2) A grid map of collision-free space is made. (3), (4) The path to the goal is searched using the simplified cylindrical robot model. The resulting path and motion sequence are shown in (5) and (6).

Fig. 10. Motion sequence of grasping the target.

Fig. 11. Experimental system: in the teaching phase, the Motion Navigators (Head, Walking, and Arm Navigators) with vision, voice, and force inputs feed the Purpose Interpreter and memory (environment model, target object, target position); in the playback phase, the Walking Planner and Manipulation Planner drive the head, legs, and arms using vision.
Fig. 12. Teaching phase: the operator leads the robot's hand to navigate walking and manipulation.

Fig. 14. Playback phase: the robot uses the planners to replay the taught motions.

Fig. 13. Image processing for manipulation. The stereo vision calculates the position of the target object.

its hands, to go upstairs, and to teach manipulation of a kettle. Beyond controlling the motions directly, the robot retains some autonomy, and the operator can guide its path. We then showed that the hand-in-hand interface makes it possible to compose an online teaching system for humanoids based on body interaction, one that can play back the taught motions even when the environment changes. This illustrates that many things can be taught with this interface as long as the robot knows the situation. However, combination with other interfaces, such as voice recognition, is needed to recognize changes of situation, and this is essential for the teaching of humanoid robots. The hand-in-hand interface, which humans do not feel to be artificial, will be one of the most important interfaces, comparable with voice.

REFERENCES

[1] E. S. Neo, K. Yokoi, S. Kajita, F. Kanehiro, and K. Tanie. Whole Body Teleoperation of a Humanoid Robot: Development of a Simple Master Device using Joysticks. In Proc. Int. Conf. on Intelligent Robots and Systems (IROS), 2002.
[2] Koichi Nishiwaki, Satoshi Kagami, Yasuo Kuniyoshi, Masayuki Inaba, and Hirochika Inoue. Online Generation of Humanoid Walking Motion based on a Fast Generation Method of Motion Pattern that Follows Desired ZMP. In Proc. Int. Conf. on Robotics and Automation (ICRA), 2002.
[3] H. Hasunuma, M. Kobayashi, H. Moriyama, T. Itoko, Y. Yanagihara, T. Ueno, K. Ohya, and K. Yokoi. A Tele-operated Humanoid Robot Drives a Lift Truck. In Proc. Int. Conf. on Robotics and Automation (ICRA), 2002.
[4] Kazuhiro Nakadai, Hiroshi G. Okuno, and Hiroaki Kitano. Robot Recognizes Three Simultaneous Speech By Active Audition.
In Proceedings of the 2003 IEEE International Conference on Robotics and Automation (ICRA 2003), 2003.
[5] Kei Okada, Yasuyuki Kino, Masayuki Inaba, and Hirochika Inoue. Visually-based Humanoid Remote Control System under Operator's Assistance and its Application to Object Manipulation. In Proceedings of the Third IEEE International Conference on Humanoid Robots (Humanoids 2003), 2003.
[6] Yoichi Sato, Makiko Saito, and Hideki Koike. Real-time input of 3D pose and gestures of a user's hand and its applications for HCI. In Proceedings of the 2001 IEEE Virtual Reality Conference (IEEE VR 2001), pages 79-86, 2001.
[7] T. Matsui. Multithread object-oriented language EusLisp for parallel and asynchronous programming in robotics. In Workshop on Concurrent Object-based Systems, IEEE 6th Symposium on Parallel and Distributed Processing.
[8] Shin'ichi Kawamoto, Hiroshi Shimodaira, Tsuneo Nitta, Takuya Nishimoto, Satoshi Nakamura, Katsunobu Itou, Shigeo Morishima, Tatsuo Yotsukura, Atsuhiko Kai, Akinobu Lee, Yoichi Yamashita, Takao Kobayashi, Keiichi Tokuda, Keikichi Hirose, Nobuaki Minematsu, Atsushi Yamada, Yasuharu Den, Takehito Utsuro, and Shigeki Sagayama. Galatea: Open-Source Software for Developing Anthropomorphic Spoken Dialog Agents. In Life-Like Characters: Tools, Affective Functions, and Applications, 2003.
[9] Katsu Yamane and Yoshihiko Nakamura. Natural Motion Animation through Constraining and Deconstraining at Will. IEEE Transactions on Visualization and Computer Graphics, 9(3).
[10] Kei Okada, Takashi Ogura, Atushi Haneda, Daisuke Kousaka, Hiroyuki Nakai, Masayuki Inaba, and Hirochika Inoue. Integrated System Software for HRP2 Humanoid. In Proc. of the International Conference on Robotics and Automation (ICRA 2004), 2004.
[11] K. Kaneko, F. Kanehiro, S. Kajita, H. Hirukawa, T. Kawasaki, M. Hirata, K. Akachi, and T. Isozumi. Humanoid Robot HRP-2. In Proc. Int. Conf. on Robotics and Automation (ICRA).
More informationHAND-SHAPED INTERFACE FOR INTUITIVE HUMAN- ROBOT COMMUNICATION THROUGH HAPTIC MEDIA
HAND-SHAPED INTERFACE FOR INTUITIVE HUMAN- ROBOT COMMUNICATION THROUGH HAPTIC MEDIA RIKU HIKIJI AND SHUJI HASHIMOTO Department of Applied Physics, School of Science and Engineering, Waseda University 3-4-1
More informationPerception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision
11-25-2013 Perception Vision Read: AIMA Chapter 24 & Chapter 25.3 HW#8 due today visual aural haptic & tactile vestibular (balance: equilibrium, acceleration, and orientation wrt gravity) olfactory taste
More informationService Robots in an Intelligent House
Service Robots in an Intelligent House Jesus Savage Bio-Robotics Laboratory biorobotics.fi-p.unam.mx School of Engineering Autonomous National University of Mexico UNAM 2017 OUTLINE Introduction A System
More informationDesign and Control of the BUAA Four-Fingered Hand
Proceedings of the 2001 IEEE International Conference on Robotics & Automation Seoul, Korea May 21-26, 2001 Design and Control of the BUAA Four-Fingered Hand Y. Zhang, Z. Han, H. Zhang, X. Shang, T. Wang,
More informationCooperative Transportation by Humanoid Robots Learning to Correct Positioning
Cooperative Transportation by Humanoid Robots Learning to Correct Positioning Yutaka Inoue, Takahiro Tohge, Hitoshi Iba Department of Frontier Informatics, Graduate School of Frontier Sciences, The University
More informationREBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL
World Automation Congress 2010 TSI Press. REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL SEIJI YAMADA *1 AND KAZUKI KOBAYASHI *2 *1 National Institute of Informatics / The Graduate University for Advanced
More informationBirth of An Intelligent Humanoid Robot in Singapore
Birth of An Intelligent Humanoid Robot in Singapore Ming Xie Nanyang Technological University Singapore 639798 Email: mmxie@ntu.edu.sg Abstract. Since 1996, we have embarked into the journey of developing
More informationHumanoid Robot HanSaRam: Recent Development and Compensation for the Landing Impact Force by Time Domain Passivity Approach
Humanoid Robot HanSaRam: Recent Development and Compensation for the Landing Impact Force by Time Domain Passivity Approach Yong-Duk Kim, Bum-Joo Lee, Seung-Hwan Choi, In-Won Park, and Jong-Hwan Kim Robot
More informationHaptics CS327A
Haptics CS327A - 217 hap tic adjective relating to the sense of touch or to the perception and manipulation of objects using the senses of touch and proprioception 1 2 Slave Master 3 Courtesy of Walischmiller
More informationPrediction of Human s Movement for Collision Avoidance of Mobile Robot
Prediction of Human s Movement for Collision Avoidance of Mobile Robot Shunsuke Hamasaki, Yusuke Tamura, Atsushi Yamashita and Hajime Asama Abstract In order to operate mobile robot that can coexist with
More informationAssociated Emotion and its Expression in an Entertainment Robot QRIO
Associated Emotion and its Expression in an Entertainment Robot QRIO Fumihide Tanaka 1. Kuniaki Noda 1. Tsutomu Sawada 2. Masahiro Fujita 1.2. 1. Life Dynamics Laboratory Preparatory Office, Sony Corporation,
More informationHumanoid robot. Honda's ASIMO, an example of a humanoid robot
Humanoid robot Honda's ASIMO, an example of a humanoid robot A humanoid robot is a robot with its overall appearance based on that of the human body, allowing interaction with made-for-human tools or environments.
More informationMULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT
MULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT F. TIECHE, C. FACCHINETTI and H. HUGLI Institute of Microtechnology, University of Neuchâtel, Rue de Tivoli 28, CH-2003
More informationAn Adaptive Action Model for Legged Navigation Planning
An Adaptive Action Model for Legged Navigation Planning Joel Chestnutt Koichi Nishiwaki James Kuffner Satoshi Kagami Robotics Institute Digital Human Research Center Carnegie Mellon University AIST Waterfront
More informationSensing Ability of Anthropomorphic Fingertip with Multi-Modal Sensors
Sensing Ability of Anthropomorphic Fingertip with Multi-Modal Sensors Yasunori Tada, Koh Hosoda, and Minoru Asada Adaptive Machine Systems, HANDAI Frontier Research Center, Graduate School of Engineering,
More informationBuilding Perceptive Robots with INTEL Euclid Development kit
Building Perceptive Robots with INTEL Euclid Development kit Amit Moran Perceptual Computing Systems Innovation 2 2 3 A modern robot should Perform a task Find its way in our world and move safely Understand
More informationVision Based Robot Behavior: Tools and Testbeds for Real World AI Research
Vision Based Robot Behavior: Tools and Testbeds for Real World AI Research Hirochika Inoue Department of Mechano-Informatics The University of Tokyo 7-3-1 Hongo, Bunkyo-ku, Tokyo, JAPAN Abstract Vision
More informationSteering a humanoid robot by its head
University of Wollongong Research Online Faculty of Engineering and Information Sciences - Papers: Part B Faculty of Engineering and Information Sciences 2009 Steering a humanoid robot by its head Manish
More informationCognitive robots and emotional intelligence Cloud robotics Ethical, legal and social issues of robotic Construction robots Human activities in many
Preface The jubilee 25th International Conference on Robotics in Alpe-Adria-Danube Region, RAAD 2016 was held in the conference centre of the Best Western Hotel M, Belgrade, Serbia, from 30 June to 2 July
More informationA Semi-Minimalistic Approach to Humanoid Design
International Journal of Scientific and Research Publications, Volume 2, Issue 4, April 2012 1 A Semi-Minimalistic Approach to Humanoid Design Hari Krishnan R., Vallikannu A.L. Department of Electronics
More informationMasatoshi Ishikawa, Akio Namiki, Takashi Komuro, and Idaku Ishii
1ms Sensory-Motor Fusion System with Hierarchical Parallel Processing Architecture Masatoshi Ishikawa, Akio Namiki, Takashi Komuro, and Idaku Ishii Department of Mathematical Engineering and Information
More informationInteracting within Virtual Worlds (based on talks by Greg Welch and Mark Mine)
Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine) Presentation Working in a virtual world Interaction principles Interaction examples Why VR in the First Place? Direct perception
More informationTasks prioritization for whole-body realtime imitation of human motion by humanoid robots
Tasks prioritization for whole-body realtime imitation of human motion by humanoid robots Sophie SAKKA 1, Louise PENNA POUBEL 2, and Denis ĆEHAJIĆ3 1 IRCCyN and University of Poitiers, France 2 ECN and
More informationRandomized Motion Planning for Groups of Nonholonomic Robots
Randomized Motion Planning for Groups of Nonholonomic Robots Christopher M Clark chrisc@sun-valleystanfordedu Stephen Rock rock@sun-valleystanfordedu Department of Aeronautics & Astronautics Stanford University
More informationKid-Size Humanoid Soccer Robot Design by TKU Team
Kid-Size Humanoid Soccer Robot Design by TKU Team Ching-Chang Wong, Kai-Hsiang Huang, Yueh-Yang Hu, and Hsiang-Min Chan Department of Electrical Engineering, Tamkang University Tamsui, Taipei, Taiwan E-mail:
More informationLearning Actions from Demonstration
Learning Actions from Demonstration Michael Tirtowidjojo, Matthew Frierson, Benjamin Singer, Palak Hirpara October 2, 2016 Abstract The goal of our project is twofold. First, we will design a controller
More informationMay Edited by: Roemi E. Fernández Héctor Montes
May 2016 Edited by: Roemi E. Fernández Héctor Montes RoboCity16 Open Conference on Future Trends in Robotics Editors Roemi E. Fernández Saavedra Héctor Montes Franceschi Madrid, 26 May 2016 Edited by:
More informationCurriculum Vitae. Ryuma Niiyama
Curriculum Vitae Ryuma Niiyama Office: Robot Locomotion Group Computer Science and Artificial Intelligence Lab Massachusetts Institute of Technology MIT 32-380, 32 Vassar Street Cambridge, MA 02139 USA.
More informationTeam Description 2006 for Team RO-PE A
Team Description 2006 for Team RO-PE A Chew Chee-Meng, Samuel Mui, Lim Tongli, Ma Chongyou, and Estella Ngan National University of Singapore, 119260 Singapore {mpeccm, g0500307, u0204894, u0406389, u0406316}@nus.edu.sg
More informationMulti-touch Interface for Controlling Multiple Mobile Robots
Multi-touch Interface for Controlling Multiple Mobile Robots Jun Kato The University of Tokyo School of Science, Dept. of Information Science jun.kato@acm.org Daisuke Sakamoto The University of Tokyo Graduate
More informationHuman-robot relation. Human-robot relation
Town Robot { Toward social interaction technologies of robot systems { Hiroshi ISHIGURO and Katsumi KIMOTO Department of Information Science Kyoto University Sakyo-ku, Kyoto 606-01, JAPAN Email: ishiguro@kuis.kyoto-u.ac.jp
More informationInteraction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping
Robotics and Autonomous Systems 54 (2006) 414 418 www.elsevier.com/locate/robot Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Masaki Ogino
More informationMotion Generation for Pulling a Fire Hose by a Humanoid Robot
Motion Generation for Pulling a Fire Hose by a Humanoid Robot Ixchel G. Ramirez-Alpizar 1, Maximilien Naveau 2, Christophe Benazeth 2, Olivier Stasse 2, Jean-Paul Laumond 2, Kensuke Harada 1, and Eiichi
More informationInitial Report on Wheelesley: A Robotic Wheelchair System
Initial Report on Wheelesley: A Robotic Wheelchair System Holly A. Yanco *, Anna Hazel, Alison Peacock, Suzanna Smith, and Harriet Wintermute Department of Computer Science Wellesley College Wellesley,
More informationBody Movement Analysis of Human-Robot Interaction
Body Movement Analysis of Human-Robot Interaction Takayuki Kanda, Hiroshi Ishiguro, Michita Imai, and Tetsuo Ono ATR Intelligent Robotics & Communication Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun,
More informationDevelopment of a Walking Support Robot with Velocity-based Mechanical Safety Devices*
2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) November 3-7, 2013. Tokyo, Japan Development of a Walking Support Robot with Velocity-based Mechanical Safety Devices* Yoshihiro
More informationEROS TEAM. Team Description for Humanoid Kidsize League of Robocup2013
EROS TEAM Team Description for Humanoid Kidsize League of Robocup2013 Azhar Aulia S., Ardiansyah Al-Faruq, Amirul Huda A., Edwin Aditya H., Dimas Pristofani, Hans Bastian, A. Subhan Khalilullah, Dadet
More informationCognition & Robotics. EUCog - European Network for the Advancement of Artificial Cognitive Systems, Interaction and Robotics
Cognition & Robotics Recent debates in Cognitive Robotics bring about ways to seek a definitional connection between cognition and robotics, ponder upon the questions: EUCog - European Network for the
More information* Intelli Robotic Wheel Chair for Specialty Operations & Physically Challenged
ADVANCED ROBOTICS SOLUTIONS * Intelli Mobile Robot for Multi Specialty Operations * Advanced Robotic Pick and Place Arm and Hand System * Automatic Color Sensing Robot using PC * AI Based Image Capturing
More informationAn Improved Path Planning Method Based on Artificial Potential Field for a Mobile Robot
BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 15, No Sofia 015 Print ISSN: 1311-970; Online ISSN: 1314-4081 DOI: 10.1515/cait-015-0037 An Improved Path Planning Method Based
More informationUnderstanding the Mechanism of Sonzai-Kan
Understanding the Mechanism of Sonzai-Kan ATR Intelligent Robotics and Communication Laboratories Where does the Sonzai-Kan, the feeling of one's presence, such as the atmosphere, the authority, come from?
More informationMulti-Agent Planning
25 PRICAI 2000 Workshop on Teams with Adjustable Autonomy PRICAI 2000 Workshop on Teams with Adjustable Autonomy Position Paper Designing an architecture for adjustably autonomous robot teams David Kortenkamp
More informationRunning Pattern Generation for a Humanoid Robot
Running Pattern Generation for a Humanoid Robot Shuuji Kajita (IST, Takashi Nagasaki (U. of Tsukuba, Kazuhito Yokoi, Kenji Kaneko and Kazuo Tanie (IST 1-1-1 Umezono, Tsukuba Central 2, IST, Tsukuba Ibaraki
More informationEE631 Cooperating Autonomous Mobile Robots. Lecture 1: Introduction. Prof. Yi Guo ECE Department
EE631 Cooperating Autonomous Mobile Robots Lecture 1: Introduction Prof. Yi Guo ECE Department Plan Overview of Syllabus Introduction to Robotics Applications of Mobile Robots Ways of Operation Single
More informationJane Li. Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute
Jane Li Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute (6 pts )A 2-DOF manipulator arm is attached to a mobile base with non-holonomic
More informationMotion Generation for Pulling a Fire Hose by a Humanoid Robot
2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids) Cancun, Mexico, Nov 15-17, 2016 Motion Generation for Pulling a Fire Hose by a Humanoid Robot Ixchel G. Ramirez-Alpizar 1, Maximilien
More informationBehaviour-Based Control. IAR Lecture 5 Barbara Webb
Behaviour-Based Control IAR Lecture 5 Barbara Webb Traditional sense-plan-act approach suggests a vertical (serial) task decomposition Sensors Actuators perception modelling planning task execution motor
More informationDevelopment of an Interactive Humanoid Robot Robovie - An interdisciplinary research approach between cognitive science and robotics -
Development of an Interactive Humanoid Robot Robovie - An interdisciplinary research approach between cognitive science and robotics - Hiroshi Ishiguro 1,2, Tetsuo Ono 1, Michita Imai 1, Takayuki Kanda
More informationContinuous Rotation Control of Robotic Arm using Slip Rings for Mars Rover
International Conference on Mechanical, Industrial and Materials Engineering 2017 (ICMIME2017) 28-30 December, 2017, RUET, Rajshahi, Bangladesh. Paper ID: AM-270 Continuous Rotation Control of Robotic
More informationFUNDAMENTALS ROBOT TECHNOLOGY. An Introduction to Industrial Robots, T eleoperators and Robot Vehicles. D J Todd. Kogan Page
FUNDAMENTALS of ROBOT TECHNOLOGY An Introduction to Industrial Robots, T eleoperators and Robot Vehicles D J Todd &\ Kogan Page First published in 1986 by Kogan Page Ltd 120 Pentonville Road, London Nl
More informationNCCT IEEE PROJECTS ADVANCED ROBOTICS SOLUTIONS. Latest Projects, in various Domains. Promise for the Best Projects
NCCT Promise for the Best Projects IEEE PROJECTS in various Domains Latest Projects, 2009-2010 ADVANCED ROBOTICS SOLUTIONS EMBEDDED SYSTEM PROJECTS Microcontrollers VLSI DSP Matlab Robotics ADVANCED ROBOTICS
More informationHumanoids. Lecture Outline. RSS 2010 Lecture # 19 Una-May O Reilly. Definition and motivation. Locomotion. Why humanoids? What are humanoids?
Humanoids RSS 2010 Lecture # 19 Una-May O Reilly Lecture Outline Definition and motivation Why humanoids? What are humanoids? Examples Locomotion RSS 2010 Humanoids Lecture 1 1 Why humanoids? Capek, Paris
More informationProposal for the Object Oriented Display : The Design and Implementation of the MEDIA 3
Proposal for the Object Oriented Display : The Design and Implementation of the MEDIA 3 Naoki KAWAKAMI, Masahiko INAMI, Taro MAEDA, and Susumu TACHI Faculty of Engineering, University of Tokyo 7-3- Hongo,
More informationINTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY
INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY T. Panayiotopoulos,, N. Zacharis, S. Vosinakis Department of Computer Science, University of Piraeus, 80 Karaoli & Dimitriou str. 18534 Piraeus, Greece themisp@unipi.gr,
More informationHMM-based Error Recovery of Dance Step Selection for Dance Partner Robot
27 IEEE International Conference on Robotics and Automation Roma, Italy, 1-14 April 27 ThA4.3 HMM-based Error Recovery of Dance Step Selection for Dance Partner Robot Takahiro Takeda, Yasuhisa Hirata,
More informationU ROBOT March 12, 2008 Kyung Chul Shin Yujin Robot Co.
U ROBOT March 12, 2008 Kyung Chul Shin Yujin Robot Co. Is the era of the robot around the corner? It is coming slowly albeit steadily hundred million 1600 1400 1200 1000 Public Service Educational Service
More informationsin( x m cos( The position of the mass point D is specified by a set of state variables, (θ roll, θ pitch, r) related to the Cartesian coordinates by:
Research Article International Journal of Current Engineering and Technology ISSN 77-46 3 INPRESSCO. All Rights Reserved. Available at http://inpressco.com/category/ijcet Modeling improvement of a Humanoid
More informationDevelopment and Evaluation of a Centaur Robot
Development and Evaluation of a Centaur Robot 1 Satoshi Tsuda, 1 Kuniya Shinozaki, and 2 Ryohei Nakatsu 1 Kwansei Gakuin University, School of Science and Technology 2-1 Gakuen, Sanda, 669-1337 Japan {amy65823,
More information2B34 DEVELOPMENT OF A HYDRAULIC PARALLEL LINK TYPE OF FORCE DISPLAY
2B34 DEVELOPMENT OF A HYDRAULIC PARALLEL LINK TYPE OF FORCE DISPLAY -Improvement of Manipulability Using Disturbance Observer and its Application to a Master-slave System- Shigeki KUDOMI*, Hironao YAMADA**
More informationGesture Recognition with Real World Environment using Kinect: A Review
Gesture Recognition with Real World Environment using Kinect: A Review Prakash S. Sawai 1, Prof. V. K. Shandilya 2 P.G. Student, Department of Computer Science & Engineering, Sipna COET, Amravati, Maharashtra,
More informationHigh-Level Programming for Industrial Robotics: using Gestures, Speech and Force Control
High-Level Programming for Industrial Robotics: using Gestures, Speech and Force Control Pedro Neto, J. Norberto Pires, Member, IEEE Abstract Today, most industrial robots are programmed using the typical
More information