
12

The robot voice-control system with interactive learning

Miroslav Holada, Martin Pelc
Technical University of Liberec
Czech Republic

1. Introduction

Nowadays, robots are penetrating human lives and carry out many tasks that would have been impossible for machines just a few decades ago. One can usually get an end-user, narrow- or single-purpose robotic solution for a specific task, with a user interface providing control within the scope of the given task. Another possibility is to get a more general-purpose robot and design a custom control system for particular tasks. The latter usually involves programming against the manufacturer's rudimentary robot movement interface and requires deeper knowledge of robotics, computer science, control systems and so on.

Providing a general-purpose control system for general-purpose robots which would not require most of the aforementioned knowledge poses a challenge. It would make robotics appealing to a wider audience and could speed up the development and application of robots. Such a system needs some learning capabilities as well as a friendly user interface for task teaching.

The goal of the work described in this chapter is a PC-software-based interactive system for general-purpose robot voice control. The chapter describes the designed prototype, its structure and the dialogue strategy in particular.

Interactive control of robots could be used in special situations, when a robot is working in dangerous areas and no programming beforehand is possible. It could also be used when supervised learning for the robot's later autonomous operation has to be done without knowledge of the robot's programming language.

Generally, robots are actuated by sets of control commands, sometimes by a manual control interface (such as a touchpad or joystick). The operator has to know the control commands, syntax rules and other properties necessary for successful robot control. The proposed system tries to simplify this robot programming and make it more user-friendly and easy to use. The system offers commands like "move left" or "elevate arm" that are translated and sent to the corresponding device (robot).

2. Project features

The project is based on former research. That research involved a voice-control dialogue system, speech recognition, vocabulary design and speech synthesis feedback for user command confirmation.

Together with a scene manager and a digital image processing module, these components form the core of the control system, as shown in figure 1. The key feature of the system is that it can learn a series of commands in order to autonomously perform certain tasks using the robot. Digital cameras will be used to navigate in the robot's working space. Supported by computer vision algorithms, the system should be able to find objects of interest on the scene (and keep track of them), including the robot itself, and allow the objects to be referred to in the user's commands. The main feature is that a user is not required to have knowledge about robot programming, computer vision, etc. The system should also offer straightforward robot movement control via either verbal commands or a graphical user interface (GUI). The system is being developed as a pure-software solution hosted on the Windows platform. The components of the system are described below.

Fig. 1. Functional layout of the system.

2.1 Scene manager

The scene manager forms a connection between the main program (engine) and the image processing part. It controls the image processing module and initiates image acquisition and processing. Using the processed image data, it updates the scene database, keeps track of objects found on the scene and provides the scene object and image data to the main engine. It is also aware of the robot's coordinate system and plans the robot's movement when requested by the engine.

The database itself consists of two types of data: the list of parametrized objects detected on the scene and the robot calibration data. The latter allows mutual image-space to robot-space coordinate translation, which is used in robot navigation. Each object detected on the scene is internally represented as a data object (class instance), and all the objects are stored in a dynamic list. Some of the attributes are: a unique object identifier, the object's shape descriptor, central point coordinates, bounding rectangle etc. Such data allows smooth object manipulation and serves as a basis for object collision avoidance along the manipulation trajectory.

The scene manager also combines the unprocessed camera image with the scene data to highlight detected objects and present them to the user via a GUI, as shown in figure 2. The user thus sees the computer's understanding of the scene and may correctly designate objects of interest in his or her commands.

Being in its early stages, the project currently works only with 2D data and relies on the user's aid for z-axis navigation. The system is expected to incorporate a second camera and 3D computer vision in the future to become fully 3D aware.
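Section 2.1 lists the per-object attributes but not their concrete representation. A minimal sketch of such a record and of the dynamic list holding it is given below; all names (SceneObject, SceneDatabase, the calibration callable) are hypothetical and chosen only for illustration, not taken from the actual implementation.

    from dataclasses import dataclass
    from typing import Callable, List, Tuple

    @dataclass
    class SceneObject:
        # Hypothetical per-object record, following the attributes listed in section 2.1.
        object_id: int                      # unique object identifier
        shape: str                          # shape descriptor, e.g. "circle" or "rectangle"
        center: Tuple[float, float]         # central point in image coordinates (pixels)
        bbox: Tuple[int, int, int, int]     # bounding rectangle (x, y, width, height)
        color: str = "unknown"              # basic colour label used when designating objects

    class SceneDatabase:
        """Dynamic list of detected objects plus robot calibration data (illustrative only)."""
        def __init__(self, image_to_robot: Callable[[float, float], Tuple[float, float]]):
            self.objects: List[SceneObject] = []
            self.image_to_robot = image_to_robot    # calibration: pixel (x, y) -> robot (X, Y)

        def update(self, detections: List[SceneObject]) -> None:
            self.objects = list(detections)         # the system keeps only the last detection result

        def robot_coordinates(self, obj: SceneObject) -> Tuple[float, float]:
            return self.image_to_robot(*obj.center)

Such a record is what the dialogue later refers to when the user designates, for example, "the first disk" or "the black disk".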

2.2 Speech processing

The voice interface between the operator and the controlled process is provided by a speech recogniser and a text-to-speech (TTS) synthesis system, both for the Czech language. The TTS system, named EPOS, was developed by ÚRE AV in Prague. It offers various male and female voices with many configurable settings (Hanika & Horak, 1999). The speech recognition is based on a proprietary isolated-word engine developed in previous projects (Nouza, 2000). The recogniser is speaker-independent, noise-robust and phoneme-based, with 3-state HMMs (Hidden Markov Models) and 32 Gaussians. It is suitable for large vocabularies (up to 10k words or short phrases) and allows us to apply various commands and their synonyms (Nouza & Nouza, 2004).

Both voice components are built into a distributed system named DUNDIS (Holada, 2004). The advantage of this solution is that the designed system needs to incorporate only a relatively simple software client. This client sends speech data to the recognition server, where speech recognition is executed. The TTS engine can work separately, but in this case it is linked to the recogniser because of the echo cancellation problem. The unwanted acoustic feedback (the computer hearing and recognizing what it speaks) is eliminated by half-duplexing the communication: the system either speaks or listens, but never both at the same time.
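The DUNDIS client API is not reproduced in this chapter, so the following sketch only illustrates the half-duplex rule described above; the client object and its recognize_once and speak methods are assumed placeholders, not the real interface.

    import time

    class HalfDuplexVoiceInterface:
        """Illustrative wrapper enforcing the speak-or-listen rule (client API assumed)."""
        def __init__(self, client):
            self.client = client        # assumed to provide recognize_once() and speak(text)
            self.speaking = False

        def listen(self):
            # Listen only while TTS output is silent, so the recogniser never
            # picks up the system's own speech (no acoustic echo).
            if self.speaking:
                return None
            return self.client.recognize_once()   # blocks until one isolated word or phrase

        def say(self, text):
            self.speaking = True        # microphone input is ignored meanwhile
            try:
                self.client.speak(text) # blocks until synthesis playback has finished
            finally:
                self.speaking = False
                time.sleep(0.1)         # short guard interval before listening again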

2.3 Image processing

The robot's working area is captured by a colour high-resolution digital camera (AVT Marlin F-146C, 1/2-inch CCD sensor). The camera is placed directly above the scene in a fixed position. We implemented a simple interactive method to synchronize the robot's coordinate system (XY) with the camera's pixel coordinates, and we are preparing modifications to compensate the geometric distortions introduced by the camera lens.

Figure 2 shows the overall view of the test workplace. The camera is placed above the scene and is partially visible at the top of the picture. The working scene consists of the robot's surroundings, most notably the white desk with disks that are placed on ribbons to prevent damage to the robot's tool (crashing directly into the desk).

Fig. 2. Overall view of the test workplace with robot, working scene and the camera.

The digital image processing methods are placed in a library which is served by the scene manager together with the object database. Figure 3 shows circular object detection using the reliable Hough transform (HT). HT is commonly used for line or circle detection but can be extended to identify positions of arbitrary parametrizable shapes. Such edge-based object detection is not too sensitive to imperfect input data or noise. Using a touch display or verbal commands it is possible to focus the robot on a chosen object (differentiated by its colour or numbering) and then tell the robot what to do. So far the system supports detection of basic geometric shapes (circle, rectangle) and basic colours.
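The chapter names the Hough transform but gives no code; a minimal OpenCV sketch of circle detection in the scene image might look as follows. The parameter values are illustrative guesses, not the settings used by the prototype.

    import cv2
    import numpy as np

    def detect_disks(image_path):
        """Find circular disks in a scene image with the Hough transform (illustrative sketch)."""
        image = cv2.imread(image_path)
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        gray = cv2.medianBlur(gray, 5)                  # suppress noise before edge-based detection
        circles = cv2.HoughCircles(
            gray, cv2.HOUGH_GRADIENT,
            dp=1, minDist=40,                           # minimum distance between disk centres (pixels)
            param1=100, param2=30,                      # Canny high threshold / accumulator threshold
            minRadius=10, maxRadius=80)                 # expected disk radii, scene-dependent
        detections = []
        if circles is not None:
            for x, y, r in np.round(circles[0]).astype(int):
                center = (int(x), int(y))
                detections.append({"center": center, "radius": int(r)})
                cv2.circle(image, center, int(r), (0, 255, 0), 2)   # highlight for the GUI view
        return image, detections

The highlighted image is the kind of view the scene manager presents in the GUI so that the operator can refer to the detected disks by colour or numbering.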

2.4 Robot description

For the purpose of debugging the system, a virtual robot device was designed which behaves like the real one but works only as a graphical computer simulation. For field tests a real robot had to be used, and we chose an industrial robot typically used in robotics lessons which was available to us. The prototype system uses a compact industrial general-purpose robotic arm (ABB IRB 140). The robot is a six-axis machine with fast acceleration, a wide working area and a high payload. It is driven by a high-performance industrial motion control unit (S4Cplus) which employs the RAPID programming language. The control unit offers extensive communication capabilities: FieldBus, two Ethernet channels and two RS-232 channels. The serial channel was chosen for communication between the robot and the developed control system running on a PC.

The robotic control software module simplifies the robot's use from the main engine's point of view. It abstracts away the aspects of physical communication and the robot's programming interface. It either accepts or refuses movement commands issued by the core engine (depending on the command's feasibility). When a command is accepted, it is carried out asynchronously, and the engine is notified once the command is completed.

Industrial robots have their own sophisticated control systems which allow arbitrary task programming. In our case, the RAPID-based ABB control system proved to be not very suitable for applications which require direct movement control by the computer program rather than by a RAPID program stored in the control system.

Fig. 3. The system's GUI with the scene view and highlighted objects.
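Section 2.4 describes a module that accepts or refuses a movement command and later notifies the engine of its completion. The sketch below only illustrates that contract; the method names, the feasibility check and the send routine are assumptions, while the real module talks to the RAPID interface over the RS-232 channel.

    import threading

    class RobotControlModule:
        """Illustrative accept/refuse plus asynchronous-completion contract (names assumed)."""
        def __init__(self, send_to_robot, is_feasible, on_done):
            self.send_to_robot = send_to_robot   # e.g. writes a movement request to the serial channel
            self.is_feasible = is_feasible       # checks reachability / collisions along the trajectory
            self.on_done = on_done               # engine callback fired when the movement finishes

        def issue(self, command):
            if not self.is_feasible(command):
                return False                     # command refused immediately
            threading.Thread(target=self._run, args=(command,), daemon=True).start()
            return True                          # command accepted, executed asynchronously

        def _run(self, command):
            self.send_to_robot(command)          # blocks until the robot reports completion
            self.on_done(command)                # notify the main engine exactly once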

2.5 Distributed computing

Most of the system's modules are developed and run on a standard PC to which the robot is connected. Since some of the software modules require significant computational power, the system's response time was far from satisfactory when the whole system ran on a single computer. Therefore, the most demanding computations (namely the object recognition and the voice recognition) were distributed to other high-performance computers via the network (TCP connections).

The solution with distributed components is advantageous especially for research and debugging. If any part of the system crashes, then after its restart the other parts are quickly reconnected without the need to reload and initialize them. Today's local networks are fast enough, so any introduced transfer delays are insignificant.

3. Dialogue strategy

The dialogue scenario contains four vocabularies. The first is composed of simple basic control commands like "move up", "stop" or "take it". They are necessary for basic robot control, and many synonyms may be defined for each action. The second group contains unused words and short phrases. It is the biggest group (vocabulary), with tens of thousands of items, and the names of new actions are drawn from this group. The names of the learned tasks defined by the user form the third group of words in the dialogue scenario. The fourth group contains titles of built-in activities like robot calibration, learning initialization or defining a new name for the most recent operation. Items from this group cannot be used in newly defined commands, though.

The process of learning a new function starts when the operator says the built-in command "beginning of learning". Any known commands issued afterwards are memorized until the "end of learning" command is given. A newly defined task then has to be given a name. The new name should consist of one previously unused vocabulary word or an unused combination of several words (for example "take it" + "and" + "move up"). This simple strategy allows the operator to define new robot tasks by voice alone, without a keyboard or mouse. It is possible to use any previously defined task to compose a new and more complex task (a small sketch of this learning strategy is given below, after figure 4).

Figure 3 shows a captured screen of the designed system's GUI. There are buttons representing the basic voice commands on the left side. The majority of the screen is taken up by the camera view showing highlighted significant points and detected objects.

Fig. 4. Scene capture and object detection: A) initial shot with the arm outside the view, B) the arm carrying out a task.

During the dialogue some basic logic rules have to be respected. When the objects on the working scene are being analysed, the robot's arm is moved out of the scene first (fig. 4) to avoid object confusion and occlusion.
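To make the learning strategy above concrete, the sketch below records known commands between "beginning of learning" and "end of learning" and stores them under a name taken from the unused-words vocabulary. The class, its state names and the execute callback are invented for illustration and do not reproduce the actual dialogue manager.

    class CommandLearner:
        """Illustrative sketch of the voice-only task learning described in section 3."""
        def __init__(self, basic_commands, unused_words):
            self.basic = set(basic_commands)    # group 1: built-in control commands
            self.unused = set(unused_words)     # group 2: words available as new task names
            self.learned = {}                   # group 3: learned task name -> recorded command list
            self.state = "idle"                 # idle / recording / naming
            self.buffer = []

        def hear(self, utterance, execute):
            if utterance == "beginning of learning":
                self.state, self.buffer = "recording", []
            elif utterance == "end of learning" and self.state == "recording":
                self.state = "naming"           # the system now asks for a previously unused name
            elif self.state == "naming" and utterance in self.unused:
                self.learned[utterance] = list(self.buffer)
                self.unused.discard(utterance)  # the chosen name is no longer "unused"
                self.state = "idle"
            elif utterance in self.basic or utterance in self.learned:
                steps = self.learned.get(utterance, [utterance])
                if self.state == "recording":
                    self.buffer.extend(steps)   # previously learned tasks may be nested in new ones
                for step in steps:
                    execute(step)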

The system stores the last detection result in the scene database. The whole dialogue system is event-driven. We can categorize the events into three fundamental branches: operator events, scene manager events and device events.

3.1 Operator events

Operator events usually occur in response to the operator's requests, for example commands which are supposed to cause the robot's movement, object detection, a new command definition or the detection of a new object. This kind of event can occur at any time, but the dialogue manager has to decide whether it was a relevant and feasible request or just a random speech recognition error. Although the acoustic conditions in robotic applications usually involve high background noise (servos, air pump), the speech recogniser usually works with an over 90% recognition score. If the operator says a wrong command or a command out of context (for example, the operator says "drop" but the robot does not hold anything), then the event manager asks him or her for a feasible command instead of the nonsensical one.

3.2 Scene manager events

This sort of event occurs when the scene manager detects a discrepancy in the scene, for example when the operator says "move up" and the robot's arm moves all the way up until the maximum range is reached. When this happens, a scene event is generated and the system indicates that the top position was reached. Another scene event occurs when the operator wants to pick up an object but the system does not know which one, because multiple objects were detected on the scene. This event generates a query to the operator for a proper object specification.

3.3 Device events

These events are produced by external sensors and other components and devices connected to the system. They are processed in the event manager, where the corresponding action is taken. The response manifests itself in the form of a request to the operator or, more often, causes a change in the robot's behaviour. The difference between scene manager events and device events is that scene events are generated by the system itself (based on a known scenario, robot geometry, object shape and position); they are computed and predictable. On the other hand, the time of device events cannot be predicted before they actually happen.
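The three event branches suggest a simple dispatcher. The sketch below is an assumed illustration of routing operator, scene-manager and device events to their handlers; it is not the actual event manager of the prototype.

    from queue import Queue

    class EventManager:
        """Illustrative dispatcher for the three event branches of sections 3.1-3.3."""
        def __init__(self):
            self.queue = Queue()
            self.handlers = {"operator": [], "scene": [], "device": []}

        def subscribe(self, branch, handler):
            self.handlers[branch].append(handler)

        def post(self, branch, payload):
            self.queue.put((branch, payload))    # e.g. ("operator", "take it")

        def run_once(self):
            branch, payload = self.queue.get()
            for handler in self.handlers[branch]:
                handler(payload)                 # e.g. ask for clarification or move the robot

    # Usage sketch: a scene event raised when the arm reaches its upper limit.
    # events = EventManager()
    # events.subscribe("scene", lambda msg: print("System:", msg))
    # events.post("scene", "top position reached")
    # events.run_once()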

3.4 Examples of dialogue

For simpler robot orientation and navigation, positions on the scene are virtualized. They are named after Greek letters, such as "position alpha" or "position beta". These virtual positions may be redefined to suit the operator's needs. A blind area may also be defined; it is completely omitted from image processing and anything in this area is ignored.

As an example (see figure 3), the robot can grab all the black disks and move them to some other place on the scene. This place is defined as position alpha, and the blind area is set up on the same coordinates. The operator then starts an example dialogue (operator's utterances are followed by the system's replies):

Operator: Start recording new command.
System: I'm recording.
Operator: Search black disks.
System: I'm searching... Four disks were found.
Operator: Move on first.
System: I'm moving... Done.
Operator: Take it.
System: Ok.
Operator: Move on position alpha.
System: I'm moving... Done.
Operator: Put it.
System: Ok.
Operator: Stop recording.
System: I stop the recording. Please, say the new command.
Operator: Search disks.
System: Done. New command is entered and named "Search disks". Is it right?
Operator: Yes.

Now the newly defined command may be used:

Operator: Repeat command.
System: Enter command.
Operator: Search disks.
System: OK.

The system repeats the command until no more disks are found:

System: No object found. Repeating done.

All the disks on the scene have now been transported to position alpha.

Fig. 5. a) The initial scene. b) The robot grabbing a target disk.

The robot finds the remaining three disks and puts them into the selected area. If no disk is found, the robot interrupts the execution of the given command and waits for a new command. This is shown in figures 5 and 6.
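The "Repeat command" exchange above repeats a learned task until the scene manager reports that nothing is left to handle. A hypothetical loop capturing that behaviour (the function and callback names are invented for illustration):

    def repeat_until_empty(task_steps, execute, find_objects, target="black disk"):
        """Illustrative replay loop for the 'Repeat command' behaviour shown above."""
        while True:
            remaining = find_objects(target)      # the scene manager re-detects objects on each pass
            if not remaining:
                return "No object found. Repeating done."
            for step in task_steps:
                execute(step)                     # e.g. move on first, take it, move on position alpha, put it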

Figure 7 shows an operation where the system finds a small red disk, grabs it and puts it onto a chosen black disk. The navigation of the arm relies only on the image processing results.

Fig. 6. The robot lifting a disk, moving it around and placing it in a desired position.

Fig. 7. The robot grabbing another disk and stacking up a pile.

4. Conclusion

The system is especially usable as an accessory robot control interface for assistive and secondary operations. The designed prototype cooperates with only one specific industrial robot (ABB) so far, but the robotic control module may easily be extended to support other robots (Katana, mobile robots, etc.) as well.

The system offers robot control and robot task programming even to people without explicit programming knowledge. It is sufficient for the operator to know the Czech voice interface of the presented system. The system is able to memorize issued commands and reproduce tasks. The designed dialogue strategy was verified using a real robot in real conditions.

The fusion of computer vision, voice recognition and robot control is quite challenging, but it looks promising. The development itself is rather complicated, as it requires knowledge from many different areas of science. The employed computer vision greatly simplifies robot navigation, as the user actually sees the system's understanding of the scene. This also allows for a much better utilization of voice control. Contemporary computer hardware seems adequate for the demanding operations involved, but the system may still require distributed computing (to achieve reasonable response times and user comfort). The presented prototype serves as a base for further development. The system is planned to use 3D vision as well as arbitrary object detection and description, to become fully 3D-aware and to need as little user aid as possible.

5. Acknowledgement

This work has been supported by the Grant Agency of the Czech Republic (grant no. 102/07/P455) and the internal grant IG FM TUL 2007/.

6. References

Nouza, J. (2000). A Czech Large Vocabulary Recognition System for Real-Time Applications. In: Text, Speech and Dialogue (eds. Sojka, Kopecek, Pala), Springer-Verlag, Heidelberg.

Nouza, J. & Nouza, T. (2004). A Voice Dictation System for a Million-Word Czech Vocabulary. In: Proc. of ICCCT 2004, Austin, USA.

Holada, M. (2004). The experiences and usability of distributed speech recognition system DUNDIS. In: Proc. of the 14th Czech-German Workshop on Speech Processing, Prague, Czech Republic.

Hanika, J. & Horak, P. (1999). Text to Speech Control Protocol. In: Proc. of the Int. Conf. Eurospeech'99, Budapest, Hungary.

Sonka, M., Hlaváč, V. & Boyle, R. D. (1998). Image Processing, Analysis and Machine Vision. PWS, Boston, USA.

Cerva, P. & Nouza, J. (2007). Design and Development of Voice Controlled Aids for Motor-Handicapped Persons. In: Proc. of the Conference of the International Speech Communication Association (Interspeech 2007).

