A VIRTUAL REALITY TELEOPERATOR INTERFACE FOR ASSEMBLY OF HYBRID MEMS PROTOTYPES

Proceedings of DETC'98, ASME Design Engineering Technical Conferences, September 13-16, 1998, Atlanta, GA. DETC98/MECH-5836

A VIRTUAL REALITY TELEOPERATOR INTERFACE FOR ASSEMBLY OF HYBRID MEMS PROTOTYPES

JOSEPH ALEX, Research Assistant, Mechanical Engineering, University of Illinois at Chicago, Chicago IL, jalex1@uic.edu
BARMESHWAR VIKRAMADITYA, Research Assistant, Mechanical Engineering, University of Illinois at Chicago, Chicago IL, bvikra1@uic.edu
BRADLEY J. NELSON, Assistant Professor, Mechanical Engineering, University of Illinois at Chicago, Chicago IL, bnelson@uic.edu

ABSTRACT
In this paper we describe a teleoperated microassembly workcell that integrates a VRML-based virtual microworld with visual servoing micromanipulation strategies. Java is used to program the VRML-based supervisory interface and to communicate with the microassembly workcell. This provides platform independence and allows remote teleoperation over the Internet. A key aspect of our approach entails the integration of teleoperation and visual servoing strategies. This allows a supervisor to guide the task remotely, while visual servoing strategies compensate for the imprecisely calibrated microworld. Results are presented that demonstrate system performance when a supervisor manipulates a microobject remotely. Though Internet delays impact the dynamic performance of the system, teleoperated relative parts placement with submicron precision is successfully demonstrated.

1. INTRODUCTION
With the development of increasingly complex microelectromechanical systems (MEMS), the assembly and testing of prototype devices requires increasingly sophisticated micromanipulation techniques. Although monolithic microfabrication has been a requirement for the commercial success of MEMS devices in the past, such as pressure sensors and accelerometers, the future of MEMS will require increasingly sophisticated assembly of hybrid components.
For example, if a microdevice must be made of different materials, has a complicated geometry, or is manufactured using incompatible processes, assembly is required. With these developments on the horizon, the need for sophisticated teleoperated micromanipulation environments for the assembly and testing of prototype devices is apparent. Our past work has demonstrated the advantages of applying visual servoing techniques to micropositioning of MEMS components for purposes of assembly (Vikramaditya and Nelson 97). Because of the high precision required for parts placement in assembling MEMS devices (often requiring submicron precision for relative parts placement), conventional open-loop precision assembly devices used in industry are inadequate (Slocum 92). However, we have shown that visual servoing strategies combined with high resolution optical systems are able to achieve the required submicron precision. This is due to the ability of closed-loop vision feedback to compensate not only for inaccurate sensor and manipulator kinematic models, but also for the difficulties associated with thermal expansion and the complex microphysics that are inherent to the microworld. In this paper, we propose a framework for visually servoed teleoperated micromanipulation that uses an expectation-based approach to task execution. A key aspect of our approach entails the integration of a virtual microworld with visual servoing strategies to perform the task in the real microworld. The virtual microworld is represented using VRML (Virtual Reality Modeling Language). A supervisor interacts within this virtual microworld by selecting and dragging objects to be manipulated. As objects are moved, the changing desired visual representation of the microworld is determined by the VRML environment. This constantly changing desired visual representation is transmitted as a vector of reference feature states to a visual servoing agent that executes the required visually servoed micromotion. The

interface between VRML, the supervisor, and the visually servoed microassembly workcell uses the Java programming language. This provides a significant level of portability across computing platforms for providing the graphical user interface, i.e. the virtual microworld, to the supervisor and for communication, thus allowing for remote teleoperation over the Internet. Experimental results are presented that demonstrate system performance. Though Internet delays do impact the dynamic performance of the system, we are able to demonstrate teleoperated relative parts placement with submicron precision. Within this paper, we first present previous work related to microassembly and teleoperation using the Internet. In Section 3, we describe our system framework, and in Section 4 we discuss our hardware and software implementation. Section 5 presents experimental results, and Section 6 concludes the paper.

2. BACKGROUND

2.1. Microassembly

Figure 1. Manipulating a 308µm dia. glass fiber into a 270µm wide V-groove for constructing an electron column for a miniature scanning electron microscope (Feinerman et al. 92).

Currently, microdevices requiring complex manipulation are assembled by hand using an optical microscope and probes or small tweezers, which is a crude form of teleoperated micromanipulation. For example, specially trained technicians use this technique to assemble precision optical and magnetic devices (Yamagata and Higuchi 95). In the Microfabrication Applications Laboratory, we have assembled many different microdevices by hand using optical microscopes. Some of the devices include miniature fiber optic assemblies, micropumps, and electron columns for miniature scanning electron microscopes (Feinerman et al. 92) (see Figure 1). A primary goal of this project is to develop more robust teleoperated micromanipulation strategies for these types of assembly tasks.
Many researchers are actively pursuing strategies for manipulating micron-sized objects for various applications. For example, researchers have used feedback from a scanning electron microscope (SEM) to teleoperatively guide micromanipulation (Sato et al. 95); look-and-move techniques for remote teleoperation of micro/milli-sized structures have been developed (Hannaford et al. 97); vision-based methods have been proposed (Koyano and Sato 96) (Sulzmann et al. 97) (Vikramaditya and Nelson 97); and microassembly workcells are being built (Menciassi et al. 97), to name a few of the efforts in this area. Our approach uses visual servoing techniques, as opposed to look-and-move, to guide and provide feedback on relative parts placement over large ranges. The visual servoing approach is integrated with a virtual microworld that provides a graphical user interface to the supervisor performing the task. The 3D representation of the microworld is also used to develop an expectation-based framework for micromanipulation, as will be discussed in Section 3.

2.2. Teleoperation Using the Web and Java

One of the first and most popular web-based teleoperation systems is described in (Goldberg et al. 95). This system allowed any remote web user to position a manipulator arm and discharge compressed air to uncover sand-covered objects. An image was sent to show the effect of the air discharge. Due to the limitations of the HTTP protocol at the time (Java was not used), only open-loop motion commands could be sent to the manipulator and single snapshots of the workcell were returned as feedback. In (DePasquale et al. 97), the issue of teleoperation of a robot using a Java applet running on a web page from anywhere on the Internet was first explored. A painting robot was used for demonstration purposes, though a VRML interface was not incorporated. The possibility of combining Java and VRML utilizing CGI (Common Gateway Interface) scripts is mentioned in (Hirukawa et al.
97), though a full implementation is not described. Our implementation of a remote teleoperation system is more general in that a Netscape web browser is used, rather than a specific software tool such as CGI. This paper describes our successful integration of a remote supervisor, VRML, and our micromanipulation workcell through the use of Java.

2.3. VRML

VRML 2.0, the Virtual Reality Modeling Language, is a scene description language that can be used to describe 3D models of objects and scenes, with the capability of interactive operations on them. These models can be viewed using a web browser with a plug-in for VRML 2.0. Navigation and viewpoint capabilities are built into VRML and, thus, it can be used as a graphical display engine. VRML inherently supports an event-driven model, which allows routing of the field values inside the

nodes to other values, thus changing the scene. To communicate with the real world, a programming language is needed to link VRML to external processes; for this purpose, Java is used. This requires that the VRML browser support the Java-VRML interface. The two existing interfaces are the Script Authoring Interface and the External Authoring Interface (EAI). We use the EAI to link Java and VRML due to implementation issues on our chosen supervisor interface development platform, a Silicon Graphics machine.

3. SYSTEM FRAMEWORK

Our overall system framework is based on the concept of an expectation- or verification-based approach to scene understanding (Dickmanns 92) (Roth and Jain 91). A key point of both the expectation- and verification-based approaches is that strong internal models of the recent world state are maintained. Neisser's view of the human perceptual cycle (Neisser 76), as Jain points out (Jain 89), is similar in many ways to a verification- or expectation-based approach. Figure 2 shows a modified representation of Neisser's perceptual cycle. This figure illustrates our view of the relationship between the VRML representation of the microworld, the real world, the visually servoed micromanipulator, and the supervisor. The counter-clockwise flow of information represents the cyclical nature of the system: sensory data updates the VRML representation, which accepts plans from a supervisor; VRML's desired world view (in terms of image-based visual features) guides the visually servoed micromanipulator, which in turn provides sensory data obtained from the real world back to VRML. This cycle illustrates the interaction between perception of the world, actions taken within this world by the visually servoed micromanipulator, and plans made about the world by the supervisor.

Figure 2. A modified perceptual cycle for visually servoed manipulators.

3.1. VRML - Supervisor Interface

The VRML world contains 3D models of objects and manipulators in the real world. Through a camera-lens model approximately equivalent to our microscope-CCD system, a virtual image of the scene is created; for our experiments a 50µm diameter polysilicon gear and a microgripper are modeled. The supervisor moves a microobject by clicking on its corresponding virtual object on the screen and dragging it. This motion in VRML creates a change in image plane coordinates for visual features located on the object. As fast as the browser and the Internet will allow (currently approximately 10Hz), these new desired feature states are transmitted to the visual servoing system located at the microassembly workcell.

3.2. Visual servoing with an optical microscope

The job of the visual servoing system is to accept a vector of desired feature states from VRML and determine a motion control command that will result in the desired image despite system disturbances. In formulating the visual servoing component of our system, the Jacobian mapping from real world task space to CCD sensor space is used. We desire a Jacobian for the camera-lens system of the form

    ẋ_S = J_v(φ) ẋ_T    (1)

where ẋ_S is a velocity vector in sensor space; J_v(φ) is the image Jacobian matrix, a function of the extrinsic and intrinsic parameters of the vision sensor as well as the number of features tracked and their locations on the image plane; and ẋ_T is a velocity vector in task space. For a microscope that is allowed to translate and rotate about the optical axis, the Jacobian for a single feature point is of the form

    J_i(φ) = [ m/s_x     0      x_s/Z_c   -(s_y/s_x) y_s ]
             [   0     m/s_y    y_s/Z_c    (s_x/s_y) x_s ]    (2)

where s_x and s_y are the pixel dimensions on the CCD; (x_s, y_s) are the feature coordinates on the image plane; Z_c is the distance from the optical center to the feature along the optical axis; and the total linear magnification m is given by

    m = h_2/h_1 = (g c)/(f_o' f_e')    (3)

where g is the optical tube length, and c is the distance that the CCD lies behind the posterior principal focal plane of the eyepiece, as shown in Figure 3. Generally several features are tracked.
Thus, for n feature points the Jacobian is of the form

    J_v(t) = [ J_1(t) ... J_n(t) ]^T    (4)

where J_i(t) is the Jacobian matrix for each feature, given by (2). A complete derivation of (2) can be found in our previously published work (Vikramaditya and Nelson 97). The state equation for the visual servoing system is created

by discretizing (1) and rewriting the discretized equation as

    x(k+1) = x(k) + T J_v(k) u(k)    (5)

where x(k) ∈ R^(2M) (M is the number of features being tracked); T is the sampling period of the vision system; and u(k) = [Ẋ_T Ẏ_T Ż_T ω_XT ω_YT ω_ZT]^T is the velocity in the task space of the manipulator end-effector. The Jacobian is written as J_v(k) in order to emphasize its time-varying nature due to the changing feature coordinates on the image plane. The intrinsic parameters of the camera-lens system are constant for the experimental results to be presented.

Optimal control techniques are used to arrive at the following expression for the control input:

    u(k) = [ (T J_v(k))^T Q (T J_v(k)) + L ]^(-1) (T J_v(k))^T Q [ x_D(k+1) - x(k) ]    (6)

The vector x_D(k+1) is the vector that is sent from the virtual desired image created by VRML. This vector represents the desired image that the supervisor wants the visual servoing system to achieve through motion of the probe stage or micromanipulator. The weighting matrices Q and L allow the user to place more or less emphasis on the feature error and the control input. Extensions to this system model and control derivations that account for system delays, modeling and control inaccuracies, and measurement noise have been experimentally investigated (Papanikolopoulos et al. 92).

The measurement of the motion of the features on the image plane, where the features are described by x(k), must be done continuously and quickly. The method used to measure this motion is based on an optical flow technique called Sum-of-Squared-Differences (SSD). The method assumes that the intensities around a feature point remain constant as that point moves across the image plane. The displacement of a point p_a = (x_S, y_S) to p_a' = (x_S + Δx, y_S + Δy) at the next time increment is determined by finding the displacement Δ = (Δx, Δy) which minimizes the SSD measure e(p_a, Δ). A pyramidal search scheme is used to reduce the search space. A more complete description of the algorithm and its implementation can be found in (Nelson et al. 93).

Figure 3. Ray diagram for a microscope optical system.

4. IMPLEMENTATION

4.1. Hardware Implementation

Experiments were conducted with the microassembly workstation shown in Figure 4. The workstation is centered around a Wentworth MP950 Integrated Circuit Probe Station with a Mitutoyo FS60 optical microscope. The probe station has been retrofitted for motion control using high precision Kollmorgen brushless DC motors and a Queensgate NPS3330 3DOF piezo-actuated nanopositioner. Image processing and visual servoing control calculations were performed with a vision system consisting of a digitizer and multiple TMS320C40 DSPs. The vision system is able to track up to five 16x16 feature templates at 30Hz. The hardware architecture of the visual servoing system is shown in Figure 5. Two micromanipulators have been integrated with the system, a Wentworth HOP 2000 and a Sutter MP285 micromanipulator. Both manipulators provide 3 DOF. Silicon micro vacuum grippers have been fabricated in our Microfabrication Applications Laboratory and are used for grasping objects with dimensions greater than 10µm.

Figure 4. Microassembly workcell.

4.2. Software Implementation

The communication implementation between the client and the visual servoing agent is shown in Figure 6. The main system components consist of the remotely located client and the PC-based visual servoing system that controls the microassembly workcell.

4.2.1. VRML

Figure 7 shows the VRML interface provided to the supervisor. Shown in the figure is the 3D virtual model of a 50µm polysilicon gear and a manipulation probe. A live view of the image

can also be transmitted over the Internet. The supervisor moves a microobject by clicking on its corresponding virtual object on the screen and dragging it. This motion in VRML is detected using a PlaneSensor node attached to the virtual image. The sensor changes the translation field value of the image and sends out the translation_changed eventOut. This particular eventOut triggers the callback function in the EAI applet. The function transforms the VRML coordinates to the pixel coordinates of the camera image using a camera-lens model and sends the data through the socket connection to the application running on the remote server. Due to security restrictions, an applet can open a socket connection only to the server from which it was downloaded. Here, the application running on the SGI accepts the connection and opens another socket connection to the PC, which transforms the pixel coordinates into actual motion commands for the controller. The controller sends the data to the workcell to effect the movement. As the micropart moves, its current position is tracked by the vision system and is sent back to the EAI applet through the application on the server. Upon receiving the pixel coordinates, the applet performs the transformation to VRML coordinates, and the model of the real micropart is updated in the VRML world by routing an eventIn to the translation field of the realgear node. Since the PlaneSensor outputs events at a nearly continuous rate (10 to 15 Hz), the real image follows the desired virtual image very closely. Initially, the positions of the images are set to the corresponding VRML coordinates using the Initialize button in the applet.

Figure 5. PC-based visual servoing hardware architecture.

Figure 6. Communication scheme for remote teleoperation using VRML and Java.

Figure 7. Supervisor interface as shown in VRML running under Netscape. The upper left window is a live image transmitted over the Internet by the microassembly workcell.

4.2.2. EAI

The Java-VRML interaction takes place through the EAI supported by CosmoPlayer 1.02 for IRIX. This gives the functionality to extend the features of VRML by adding the power of a programming language, Java. The EAI allows an external program to access the nodes in a VRML scene using the existing VRML event model. The communication takes place through the browser plug-in interface, which allows embedded objects on a web page to communicate with each other. We use a Java applet embedded in an HTML page, which also contains a VRML world. The interface allows access into the VRML scene by reading the last value sent from eventOuts of the nodes inside the scene upon notification that these events have occurred, and by modifying the fields of nodes inside the scene by sending events to their eventIns. For this, the applet has methods (user-defined functions) that are called when the specified eventOut occurs. A method is registered with an eventOut of a given node and is called when that eventOut event is generated. The framework provides a tool for the user to interact directly with the real world and receive feedback on the manipulated world. The ability of Java bytecode to run on different platforms makes it possible to view the HTML page and perform teleoperation across the Internet from a client-side machine with the plug-in.
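The coordinate mapping performed in the applet's callback can be illustrated with a short sketch. It is written in Python here for brevity (the workcell implements the equivalent mapping inside the Java applet), and the magnification, pixel sizes, and image center below are illustrative assumptions rather than calibrated values of the system.

```python
# Sketch of the bidirectional mapping used by the EAI applet:
# VRML task-plane coordinates (meters) are scaled through a simple
# camera-lens model into CCD pixel coordinates for the workcell, and
# tracked pixel coordinates are mapped back to update the VRML model
# of the real micropart. All numeric values are illustrative assumptions.

M = 20.0               # total linear magnification (assumed)
S_X = 10.0e-6          # CCD pixel width in meters (assumed)
S_Y = 10.0e-6          # CCD pixel height in meters (assumed)
CX, CY = 320.0, 240.0  # image center of an assumed 640x480 CCD

def vrml_to_pixel(x_t, y_t):
    """Map task-space coordinates (meters) to image-plane pixels."""
    return (CX + M * x_t / S_X, CY + M * y_t / S_Y)

def pixel_to_vrml(x_s, y_s):
    """Inverse map: tracked pixel coordinates back to task space."""
    return ((x_s - CX) * S_X / M, (y_s - CY) * S_Y / M)

if __name__ == "__main__":
    # A micropart displaced 10 um in x and 5 um in y from center:
    px = vrml_to_pixel(10e-6, 5e-6)
    print(px)                   # pixel coordinates sent to the workcell
    print(pixel_to_vrml(*px))   # round trip back to task space
```

The same pair of transforms serves both directions of the perceptual cycle: the forward map generates desired feature states from supervisor drags, and the inverse map turns tracked feature positions into updates of the realgear node.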

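Before turning to the experiments, the control loop of Section 3 can be exercised in simulation. The sketch below assumes a constant diagonal per-feature Jacobian and illustrative weighting matrices Q and L (the real system computes J_v(k) from the tracked feature locations at every sample); it iterates the discrete update of (5) under the control input of (6) and drives a feature toward a desired image location.

```python
import numpy as np

# Simulated feature regulation following eqs. (5)-(6).
# J, Q, L, T, and all feature values are illustrative assumptions.

T = 1.0 / 30.0              # vision sampling period, 30 Hz
J = np.array([[2.0, 0.0],   # constant per-feature image Jacobian,
              [0.0, 2.0]])  # pixels per micrometer of stage motion (assumed)
Q = np.eye(2)               # weight on feature error
L = 0.001 * np.eye(2)       # weight on control effort

def control(x, x_d):
    """Control input u(k) of eq. (6) for the current and desired features."""
    TJ = T * J
    return np.linalg.solve(TJ.T @ Q @ TJ + L, TJ.T @ Q @ (x_d - x))

x = np.array([100.0, 120.0])   # current feature location (pixels)
x_d = np.array([140.0, 90.0])  # desired feature location from VRML

for _ in range(50):
    u = control(x, x_d)        # stage velocity command
    x = x + T * (J @ u)        # discrete plant update, eq. (5)

print(np.round(x, 3))          # converges to the desired location x_d
```

With these gains the feature error shrinks by a constant factor each sample, so the loop converges geometrically; increasing L slows the response, which mirrors the trade-off between feature error and control effort described for (6).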
5. EXPERIMENTAL RESULTS

The workcell is remotely teleoperated over the Internet under the Netscape web browser running VRML. We have used a Silicon Graphics O2 and various PCs running Windows95 for providing a virtual environment to the supervisor. We have also performed remote teleoperation of the microassembly workcell in Chicago from West Lafayette, Indiana. Experimental results demonstrating system performance are shown in Figure 8. The top two plots show object location in x and y image coordinates. The solid curves represent VRML virtual image coordinates as a supervisor moves a virtual 3D object in an approximate circle, while the dashed curves show the delayed CCD image coordinates resulting from these commanded motions. A time delay of approximately 260ms was present when remotely teleoperating the system over the Internet while the remote host was located on the same hub as the microassembly workcell. From the plots one can see good tracking performance of the real image with respect to the desired image created within VRML. A 20x objective lens was used to collect these results. Using microfabricated calibration grids, we estimate that the relative positioning precision attainable with this optical configuration is approximately 0.4µm.

Figure 8. Experimental results showing object location in VRML pixel coordinates (solid) and CCD image coordinates (dashed) in x and y, and X (solid) and Y (dashed) motion of the object in the real world.

6. CONCLUSION

In order to develop more complex hybrid MEMS devices, teleoperated micromanipulation strategies must be enhanced. The integration of a complex virtual environment with visual servoing strategies provides a remote environment with the tools for developing complex micromanipulation strategies. By incorporating a VRML environment with a Java-based interface, we are developing a remotely teleoperated micromanipulation system that can be operated over a variety of platforms. To date, our system has demonstrated submicron precision in relative parts placement using a variety of remote platforms, including an SGI O2 and a variety of PCs running Windows95.

7. ACKNOWLEDGMENTS

This research was supported in part by the National Science Foundation through Grant Numbers IRI , CDA , and IRI , by the Office of Naval Research through Grant Number N , and by DARPA through Grant Number N .

8. REFERENCES

DePasquale, P., Lewis, J. and Stein, M. R., 1997, A Java Interface for Asserting Interactive Telerobotic Control, Proc. of SPIE, Vol. 3206.
Dickmanns, E. D., 1992, Expectation-based Dynamic Scene Understanding, in Active Vision, Blake, A. and Yuille, A., eds., The MIT Press, Cambridge.
Feinerman, A. D., Crewe, D. A., Perng, D. C., Shoaf, S. E. and Crewe, A. V., 1992, Sub-centimeter micromachined electron microscope, J. Vac. Sci. Technol. A, 10(4).
Goldberg, K., Mascha, M., Gentner, S. and Rothenberg, N., 1995, Desktop Teleoperation via the World Wide Web, Proc. IEEE Int. Conf. on Robotics and Automation.
Hannaford, B., Hewitt, J., Maneewarn, T., Venema, S., Appleby, M. and Ehresman, R., 1997, Telerobotic remote handling of protein crystals, Proc. IEEE Int. Conf. on Robotics and Automation.
Hirukawa, H., Matsui, T. and Onda, H., 1997, Prototypes of Teleoperation Systems via a Standard Protocol with a Standard Human Interface, Proc. IEEE Int. Conf. on Robotics and Automation.
Jain, R., 1989, Environment Models and Information Assimilation, Technical Report RJ 6866(65692), IBM-Yorktown Heights.
Koyano, K. and Sato, T., 1996, Micro object handling system with concentrated visual fields and new handling skills, Proc. IEEE Int. Conf. on Robotics and Automation.
Menciassi, A., Carroza, M. C., Ristori, C., Tiezzi, G. and Dario, P., 1997, A workstation for manipulation of micron sized objects, Proc. Int. Conf. on Advanced Robotics.
Neisser, U., 1976, Cognition and Reality, W.H. Freeman and Co., New York.
Nelson, B. J., Papanikolopoulos, N. P. and Khosla, P. K., 1993, Visual servoing for robotic assembly, in Visual Servoing: Real-Time Control of Robot Manipulators Based on Visual Sensory Feedback, Hashimoto, K., ed., World Scientific Publishing Co., River Edge, NJ.
Papanikolopoulos, N. P., Nelson, B. J. and Khosla, P. K., 1992, Full 3-d tracking using the controlled active vision paradigm, Proc. IEEE Int. Symp. on Intelligent Control (ISIC-92).
Roth, Y. and Jain, R., 1991, Verification versus Discovery in Vision-Based Systems, Technical Report CSE-TR, The University of Michigan.
Sato, T., Kameya, T., Miyazaki, H. and Hatamura, Y., 1995, Hand-eye System in the Nano Manipulation World, Proc. IEEE Int. Conf. on Robotics and Automation.
Slocum, A. H., 1992, Precision Machine Design, Prentice Hall.
Sulzmann, A., Breguet, J. M. and Jacot, J., 1997, Micromotor assembly using high accurate optical vision feedback for microrobot relative 3D displacement in submicron range, Proc. Int. Conf. on Solid-State Sensors and Actuators (Transducers '97).
Vikramaditya, B. and Nelson, B. J., 1997, Visually Guided Microassembly Using Optical Microscopes and Active Vision Techniques, Proc. IEEE Int. Conf. on Robotics and Automation.
Yamagata, Y. and Higuchi, T., 1995, A Micropositioning Device for Precision Automatic Assembly using Impact Force of Piezoelectric Elements, Proc. IEEE Int. Conf. on Robotics and Automation.


Tim Salter (ME) Jason Smith (ME) Ahmed Alfadhel (EE) Mike Neurohr (CE) Design Review Tim Salter (ME) Jason Smith (ME) Ahmed Alfadhel (EE) Mike Neurohr (CE) 1 Definition of Project Micro-goniophotometer gloss measurement device Evaluate the gloss characteristics of a sample/surface

More information

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp

More information

Web-Based Mobile Robot Simulator

Web-Based Mobile Robot Simulator Web-Based Mobile Robot Simulator From: AAAI Technical Report WS-99-15. Compilation copyright 1999, AAAI (www.aaai.org). All rights reserved. Dan Stormont Utah State University 9590 Old Main Hill Logan

More information

Leading in Desktop SEM Imaging and Analysis

Leading in Desktop SEM Imaging and Analysis Leading in Desktop SEM Imaging and Analysis Fast. Outstanding. Reliable SEM imaging and analysis. The Phenom: World s Fastest Scanning Electron Microscope With its market-leading Phenom desktop Scanning

More information

Introduction of New Products

Introduction of New Products Field Emission Electron Microscope JEM-3100F For evaluation of materials in the fields of nanoscience and nanomaterials science, TEM is required to provide resolution and analytical capabilities that can

More information

Development of Micro-manipulation System for Operation in Scanning Electron Microscope

Development of Micro-manipulation System for Operation in Scanning Electron Microscope Development of Micro-manipulation System for Operation in Scanning Electron Microscope H. Eda, L. Zhou, Y. Yamamoto, T. Ishikawa, T. Kawakami and J. Shimizu System Engineering Department, Ibaraki University,

More information

INCLINED PLANE RIG LABORATORY USER GUIDE VERSION 1.3

INCLINED PLANE RIG LABORATORY USER GUIDE VERSION 1.3 INCLINED PLANE RIG LABORATORY USER GUIDE VERSION 1.3 Labshare 2011 Table of Contents 1 Introduction... 3 1.1 Remote Laboratories... 3 1.2 Inclined Plane - The Rig Apparatus... 3 1.2.1 Block Masses & Inclining

More information

Introduction to Virtual Reality (based on a talk by Bill Mark)

Introduction to Virtual Reality (based on a talk by Bill Mark) Introduction to Virtual Reality (based on a talk by Bill Mark) I will talk about... Why do we want Virtual Reality? What is needed for a VR system? Examples of VR systems Research problems in VR Most Computers

More information

Teleoperated Robot Controlling Interface: an Internet of Things Based Approach

Teleoperated Robot Controlling Interface: an Internet of Things Based Approach Proc. 1 st International Conference on Machine Learning and Data Engineering (icmlde2017) 20-22 Nov 2017, Sydney, Australia ISBN: 978-0-6480147-3-7 Teleoperated Robot Controlling Interface: an Internet

More information

Chapter 2 Introduction to Haptics 2.1 Definition of Haptics

Chapter 2 Introduction to Haptics 2.1 Definition of Haptics Chapter 2 Introduction to Haptics 2.1 Definition of Haptics The word haptic originates from the Greek verb hapto to touch and therefore refers to the ability to touch and manipulate objects. The haptic

More information

An augmented reality interface for training robotics through the web

An augmented reality interface for training robotics through the web An augmented reality interface for training robotics through the web Carlos A. Jara, Francisco A. Candelas, Pablo Gil, Manuel Fernández and Fernando Torres Department of Physics, System Engineering and

More information

Putting It All Together: Computer Architecture and the Digital Camera

Putting It All Together: Computer Architecture and the Digital Camera 461 Putting It All Together: Computer Architecture and the Digital Camera This book covers many topics in circuit analysis and design, so it is only natural to wonder how they all fit together and how

More information

Robust Haptic Teleoperation of a Mobile Manipulation Platform

Robust Haptic Teleoperation of a Mobile Manipulation Platform Robust Haptic Teleoperation of a Mobile Manipulation Platform Jaeheung Park and Oussama Khatib Stanford AI Laboratory Stanford University http://robotics.stanford.edu Abstract. This paper presents a new

More information

The Haptic Impendance Control through Virtual Environment Force Compensation

The Haptic Impendance Control through Virtual Environment Force Compensation The Haptic Impendance Control through Virtual Environment Force Compensation OCTAVIAN MELINTE Robotics and Mechatronics Department Institute of Solid Mechanicsof the Romanian Academy ROMANIA octavian.melinte@yahoo.com

More information

Topics VRML. The basic idea. What is VRML? History of VRML 97 What is in it X3D Ruth Aylett

Topics VRML. The basic idea. What is VRML? History of VRML 97 What is in it X3D Ruth Aylett Topics VRML History of VRML 97 What is in it X3D Ruth Aylett What is VRML? The basic idea VR modelling language NOT a programming language! Virtual Reality Markup Language Open standard (1997) for Internet

More information

Guidance of a Mobile Robot using Computer Vision over a Distributed System

Guidance of a Mobile Robot using Computer Vision over a Distributed System Guidance of a Mobile Robot using Computer Vision over a Distributed System Oliver M C Williams (JE) Abstract Previously, there have been several 4th-year projects using computer vision to follow a robot

More information

Masatoshi Ishikawa, Akio Namiki, Takashi Komuro, and Idaku Ishii

Masatoshi Ishikawa, Akio Namiki, Takashi Komuro, and Idaku Ishii 1ms Sensory-Motor Fusion System with Hierarchical Parallel Processing Architecture Masatoshi Ishikawa, Akio Namiki, Takashi Komuro, and Idaku Ishii Department of Mathematical Engineering and Information

More information

Visual Servoing. Charlie Kemp. 4632B/8803 Mobile Manipulation Lecture 8

Visual Servoing. Charlie Kemp. 4632B/8803 Mobile Manipulation Lecture 8 Visual Servoing Charlie Kemp 4632B/8803 Mobile Manipulation Lecture 8 From: http://www.hsi.gatech.edu/visitors/maps/ 4 th floor 4100Q M Building 167 First office on HSI side From: http://www.hsi.gatech.edu/visitors/maps/

More information

ACTUATORS AND SENSORS. Joint actuating system. Servomotors. Sensors

ACTUATORS AND SENSORS. Joint actuating system. Servomotors. Sensors ACTUATORS AND SENSORS Joint actuating system Servomotors Sensors JOINT ACTUATING SYSTEM Transmissions Joint motion low speeds high torques Spur gears change axis of rotation and/or translate application

More information

Differences in Fitts Law Task Performance Based on Environment Scaling

Differences in Fitts Law Task Performance Based on Environment Scaling Differences in Fitts Law Task Performance Based on Environment Scaling Gregory S. Lee and Bhavani Thuraisingham Department of Computer Science University of Texas at Dallas 800 West Campbell Road Richardson,

More information

Image Recognition for PCB Soldering Platform Controlled by Embedded Microchip Based on Hopfield Neural Network

Image Recognition for PCB Soldering Platform Controlled by Embedded Microchip Based on Hopfield Neural Network 436 JOURNAL OF COMPUTERS, VOL. 5, NO. 9, SEPTEMBER Image Recognition for PCB Soldering Platform Controlled by Embedded Microchip Based on Hopfield Neural Network Chung-Chi Wu Department of Electrical Engineering,

More information

Elements of Haptic Interfaces

Elements of Haptic Interfaces Elements of Haptic Interfaces Katherine J. Kuchenbecker Department of Mechanical Engineering and Applied Mechanics University of Pennsylvania kuchenbe@seas.upenn.edu Course Notes for MEAM 625, University

More information

A simple embedded stereoscopic vision system for an autonomous rover

A simple embedded stereoscopic vision system for an autonomous rover In Proceedings of the 8th ESA Workshop on Advanced Space Technologies for Robotics and Automation 'ASTRA 2004' ESTEC, Noordwijk, The Netherlands, November 2-4, 2004 A simple embedded stereoscopic vision

More information

AN ARDUINO CONTROLLED CHAOTIC PENDULUM FOR A REMOTE PHYSICS LABORATORY

AN ARDUINO CONTROLLED CHAOTIC PENDULUM FOR A REMOTE PHYSICS LABORATORY AN ARDUINO CONTROLLED CHAOTIC PENDULUM FOR A REMOTE PHYSICS LABORATORY J. C. Álvarez, J. Lamas, A. J. López, A. Ramil Universidade da Coruña (SPAIN) carlos.alvarez@udc.es, jlamas@udc.es, ana.xesus.lopez@udc.es,

More information

3-Degrees of Freedom Robotic ARM Controller for Various Applications

3-Degrees of Freedom Robotic ARM Controller for Various Applications 3-Degrees of Freedom Robotic ARM Controller for Various Applications Mohd.Maqsood Ali M.Tech Student Department of Electronics and Instrumentation Engineering, VNR Vignana Jyothi Institute of Engineering

More information

SRV02-Series Rotary Experiment # 3. Ball & Beam. Student Handout

SRV02-Series Rotary Experiment # 3. Ball & Beam. Student Handout SRV02-Series Rotary Experiment # 3 Ball & Beam Student Handout SRV02-Series Rotary Experiment # 3 Ball & Beam Student Handout 1. Objectives The objective in this experiment is to design a controller for

More information

IOSR Journal of Engineering (IOSRJEN) e-issn: , p-issn: , Volume 2, Issue 11 (November 2012), PP 37-43

IOSR Journal of Engineering (IOSRJEN) e-issn: , p-issn: ,  Volume 2, Issue 11 (November 2012), PP 37-43 IOSR Journal of Engineering (IOSRJEN) e-issn: 2250-3021, p-issn: 2278-8719, Volume 2, Issue 11 (November 2012), PP 37-43 Operative Precept of robotic arm expending Haptic Virtual System Arnab Das 1, Swagat

More information

Team KMUTT: Team Description Paper

Team KMUTT: Team Description Paper Team KMUTT: Team Description Paper Thavida Maneewarn, Xye, Pasan Kulvanit, Sathit Wanitchaikit, Panuvat Sinsaranon, Kawroong Saktaweekulkit, Nattapong Kaewlek Djitt Laowattana King Mongkut s University

More information

NCCT IEEE PROJECTS ADVANCED ROBOTICS SOLUTIONS. Latest Projects, in various Domains. Promise for the Best Projects

NCCT IEEE PROJECTS ADVANCED ROBOTICS SOLUTIONS. Latest Projects, in various Domains. Promise for the Best Projects NCCT Promise for the Best Projects IEEE PROJECTS in various Domains Latest Projects, 2009-2010 ADVANCED ROBOTICS SOLUTIONS EMBEDDED SYSTEM PROJECTS Microcontrollers VLSI DSP Matlab Robotics ADVANCED ROBOTICS

More information

Optical Correlator for Image Motion Compensation in the Focal Plane of a Satellite Camera

Optical Correlator for Image Motion Compensation in the Focal Plane of a Satellite Camera 15 th IFAC Symposium on Automatic Control in Aerospace Bologna, September 6, 2001 Optical Correlator for Image Motion Compensation in the Focal Plane of a Satellite Camera K. Janschek, V. Tchernykh, -

More information

Distributed Virtual Learning Environment: a Web-based Approach

Distributed Virtual Learning Environment: a Web-based Approach Distributed Virtual Learning Environment: a Web-based Approach Christos Bouras Computer Technology Institute- CTI Department of Computer Engineering and Informatics, University of Patras e-mail: bouras@cti.gr

More information

Modeling and Experimental Studies of a Novel 6DOF Haptic Device

Modeling and Experimental Studies of a Novel 6DOF Haptic Device Proceedings of The Canadian Society for Mechanical Engineering Forum 2010 CSME FORUM 2010 June 7-9, 2010, Victoria, British Columbia, Canada Modeling and Experimental Studies of a Novel DOF Haptic Device

More information

SMARTSCAN Smart Pushbroom Imaging System for Shaky Space Platforms

SMARTSCAN Smart Pushbroom Imaging System for Shaky Space Platforms SMARTSCAN Smart Pushbroom Imaging System for Shaky Space Platforms Klaus Janschek, Valerij Tchernykh, Sergeij Dyblenko SMARTSCAN 1 SMARTSCAN Smart Pushbroom Imaging System for Shaky Space Platforms Klaus

More information

MEMS in ECE at CMU. Gary K. Fedder

MEMS in ECE at CMU. Gary K. Fedder MEMS in ECE at CMU Gary K. Fedder Department of Electrical and Computer Engineering and The Robotics Institute Carnegie Mellon University Pittsburgh, PA 15213-3890 fedder@ece.cmu.edu http://www.ece.cmu.edu/~mems

More information

* Intelli Robotic Wheel Chair for Specialty Operations & Physically Challenged

* Intelli Robotic Wheel Chair for Specialty Operations & Physically Challenged ADVANCED ROBOTICS SOLUTIONS * Intelli Mobile Robot for Multi Specialty Operations * Advanced Robotic Pick and Place Arm and Hand System * Automatic Color Sensing Robot using PC * AI Based Image Capturing

More information

VIRTUAL REALITY Introduction. Emil M. Petriu SITE, University of Ottawa

VIRTUAL REALITY Introduction. Emil M. Petriu SITE, University of Ottawa VIRTUAL REALITY Introduction Emil M. Petriu SITE, University of Ottawa Natural and Virtual Reality Virtual Reality Interactive Virtual Reality Virtualized Reality Augmented Reality HUMAN PERCEPTION OF

More information

AR 2 kanoid: Augmented Reality ARkanoid

AR 2 kanoid: Augmented Reality ARkanoid AR 2 kanoid: Augmented Reality ARkanoid B. Smith and R. Gosine C-CORE and Memorial University of Newfoundland Abstract AR 2 kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular

More information

Lab 7: Introduction to Webots and Sensor Modeling

Lab 7: Introduction to Webots and Sensor Modeling Lab 7: Introduction to Webots and Sensor Modeling This laboratory requires the following software: Webots simulator C development tools (gcc, make, etc.) The laboratory duration is approximately two hours.

More information

Randomized Motion Planning for Groups of Nonholonomic Robots

Randomized Motion Planning for Groups of Nonholonomic Robots Randomized Motion Planning for Groups of Nonholonomic Robots Christopher M Clark chrisc@sun-valleystanfordedu Stephen Rock rock@sun-valleystanfordedu Department of Aeronautics & Astronautics Stanford University

More information

Robot Simulation and Monitoring on Real Controllers (RoboSiM)

Robot Simulation and Monitoring on Real Controllers (RoboSiM) Robot Simulation and Monitoring on Real Controllers (RoboSiM) A. Speck Wilhelm-Schickard-Institut für Informatik Universität Tübingen D-72076 Tübingen, Germany E-mail: speck@informatik.uni-tuebingen.de

More information

Collaborative Virtual Environment for Industrial Training and e-commerce

Collaborative Virtual Environment for Industrial Training and e-commerce Collaborative Virtual Environment for Industrial Training and e-commerce J.C.OLIVEIRA, X.SHEN AND N.D.GEORGANAS School of Information Technology and Engineering Multimedia Communications Research Laboratory

More information

Lecture 20: Optical Tools for MEMS Imaging

Lecture 20: Optical Tools for MEMS Imaging MECH 466 Microelectromechanical Systems University of Victoria Dept. of Mechanical Engineering Lecture 20: Optical Tools for MEMS Imaging 1 Overview Optical Microscopes Video Microscopes Scanning Electron

More information

KMUTT Kickers: Team Description Paper

KMUTT Kickers: Team Description Paper KMUTT Kickers: Team Description Paper Thavida Maneewarn, Xye, Korawit Kawinkhrue, Amnart Butsongka, Nattapong Kaewlek King Mongkut s University of Technology Thonburi, Institute of Field Robotics (FIBO)

More information

Key-Words: - Neural Networks, Cerebellum, Cerebellar Model Articulation Controller (CMAC), Auto-pilot

Key-Words: - Neural Networks, Cerebellum, Cerebellar Model Articulation Controller (CMAC), Auto-pilot erebellum Based ar Auto-Pilot System B. HSIEH,.QUEK and A.WAHAB Intelligent Systems Laboratory, School of omputer Engineering Nanyang Technological University, Blk N4 #2A-32 Nanyang Avenue, Singapore 639798

More information

1.6 Beam Wander vs. Image Jitter

1.6 Beam Wander vs. Image Jitter 8 Chapter 1 1.6 Beam Wander vs. Image Jitter It is common at this point to look at beam wander and image jitter and ask what differentiates them. Consider a cooperative optical communication system that

More information

The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment-

The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment- The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment- Hitoshi Hasunuma, Kensuke Harada, and Hirohisa Hirukawa System Technology Development Center,

More information

Image Processing & Projective geometry

Image Processing & Projective geometry Image Processing & Projective geometry Arunkumar Byravan Partial slides borrowed from Jianbo Shi & Steve Seitz Color spaces RGB Red, Green, Blue HSV Hue, Saturation, Value Why HSV? HSV separates luma,

More information

Challenges of Precision Assembly with a Miniaturized Robot

Challenges of Precision Assembly with a Miniaturized Robot Challenges of Precision Assembly with a Miniaturized Robot Arne Burisch, Annika Raatz, and Jürgen Hesselbach Technische Universität Braunschweig, Institute of Machine Tools and Production Technology Langer

More information

ME 6406 MACHINE VISION. Georgia Institute of Technology

ME 6406 MACHINE VISION. Georgia Institute of Technology ME 6406 MACHINE VISION Georgia Institute of Technology Class Information Instructor Professor Kok-Meng Lee MARC 474 Office hours: Tues/Thurs 1:00-2:00 pm kokmeng.lee@me.gatech.edu (404)-894-7402 Class

More information

AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS. Nuno Sousa Eugénio Oliveira

AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS. Nuno Sousa Eugénio Oliveira AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS Nuno Sousa Eugénio Oliveira Faculdade de Egenharia da Universidade do Porto, Portugal Abstract: This paper describes a platform that enables

More information

Space Research expeditions and open space work. Education & Research Teaching and laboratory facilities. Medical Assistance for people

Space Research expeditions and open space work. Education & Research Teaching and laboratory facilities. Medical Assistance for people Space Research expeditions and open space work Education & Research Teaching and laboratory facilities. Medical Assistance for people Safety Life saving activity, guarding Military Use to execute missions

More information

SELF-BALANCING MOBILE ROBOT TILTER

SELF-BALANCING MOBILE ROBOT TILTER Tomislav Tomašić Andrea Demetlika Prof. dr. sc. Mladen Crneković ISSN xxx-xxxx SELF-BALANCING MOBILE ROBOT TILTER Summary UDC 007.52, 62-523.8 In this project a remote controlled self-balancing mobile

More information

DEVELOPMENT OF A ROBOID COMPONENT FOR PLAYER/STAGE ROBOT SIMULATOR

DEVELOPMENT OF A ROBOID COMPONENT FOR PLAYER/STAGE ROBOT SIMULATOR Proceedings of IC-NIDC2009 DEVELOPMENT OF A ROBOID COMPONENT FOR PLAYER/STAGE ROBOT SIMULATOR Jun Won Lim 1, Sanghoon Lee 2,Il Hong Suh 1, and Kyung Jin Kim 3 1 Dept. Of Electronics and Computer Engineering,

More information

Wireless crack measurement for control of construction vibrations

Wireless crack measurement for control of construction vibrations Wireless crack measurement for control of construction vibrations Charles H. Dowding 1, Hasan Ozer 2, Mathew Kotowsky 3 1 Professor, Northwestern University, Department of Civil and Environmental Eng.,

More information

Rotation/ scale invariant hybrid digital/optical correlator system for automatic target recognition

Rotation/ scale invariant hybrid digital/optical correlator system for automatic target recognition Rotation/ scale invariant hybrid digital/optical correlator system for automatic target recognition V. K. Beri, Amit Aran, Shilpi Goyal, and A. K. Gupta * Photonics Division Instruments Research and Development

More information

GUIDELINES FOR DESIGN LOW COST MICROMECHANICS. L. Ruiz-Huerta, A. Caballero Ruiz, E. Kussul

GUIDELINES FOR DESIGN LOW COST MICROMECHANICS. L. Ruiz-Huerta, A. Caballero Ruiz, E. Kussul GUIDELINES FOR DESIGN LOW COST MICROMECHANICS L. Ruiz-Huerta, A. Caballero Ruiz, E. Kussul Center of Applied Sciences and Technological Development, UNAM Laboratory of Mechatronics and Micromechanics,

More information

Sensor system of a small biped entertainment robot

Sensor system of a small biped entertainment robot Advanced Robotics, Vol. 18, No. 10, pp. 1039 1052 (2004) VSP and Robotics Society of Japan 2004. Also available online - www.vsppub.com Sensor system of a small biped entertainment robot Short paper TATSUZO

More information

FUNDAMENTALS ROBOT TECHNOLOGY. An Introduction to Industrial Robots, T eleoperators and Robot Vehicles. D J Todd. Kogan Page

FUNDAMENTALS ROBOT TECHNOLOGY. An Introduction to Industrial Robots, T eleoperators and Robot Vehicles. D J Todd. Kogan Page FUNDAMENTALS of ROBOT TECHNOLOGY An Introduction to Industrial Robots, T eleoperators and Robot Vehicles D J Todd &\ Kogan Page First published in 1986 by Kogan Page Ltd 120 Pentonville Road, London Nl

More information

Linear Motion Servo Plants: IP01 or IP02. Linear Experiment #0: Integration with WinCon. IP01 and IP02. Student Handout

Linear Motion Servo Plants: IP01 or IP02. Linear Experiment #0: Integration with WinCon. IP01 and IP02. Student Handout Linear Motion Servo Plants: IP01 or IP02 Linear Experiment #0: Integration with WinCon IP01 and IP02 Student Handout Table of Contents 1. Objectives...1 2. Prerequisites...1 3. References...1 4. Experimental

More information

Internet Control of Personal Robot between KAIST and UC Davis

Internet Control of Personal Robot between KAIST and UC Davis Internet Control of Personal Robot between KAIST and UC Davis Kuk-Hyun Han 1, Yong-Jae Kim 1, Jong-Hwan Kim 1 and Steve Hsia 2 1 Department of Electrical Engineering and Computer Science, Korea Advanced

More information

ROBOTIC MANIPULATION AND HAPTIC FEEDBACK VIA HIGH SPEED MESSAGING WITH THE JOINT ARCHITECTURE FOR UNMANNED SYSTEMS (JAUS)

ROBOTIC MANIPULATION AND HAPTIC FEEDBACK VIA HIGH SPEED MESSAGING WITH THE JOINT ARCHITECTURE FOR UNMANNED SYSTEMS (JAUS) ROBOTIC MANIPULATION AND HAPTIC FEEDBACK VIA HIGH SPEED MESSAGING WITH THE JOINT ARCHITECTURE FOR UNMANNED SYSTEMS (JAUS) Dr. Daniel Kent, * Dr. Thomas Galluzzo*, Dr. Paul Bosscher and William Bowman INTRODUCTION

More information

MATLAB is a high-level programming language, extensively

MATLAB is a high-level programming language, extensively 1 KUKA Sunrise Toolbox: Interfacing Collaborative Robots with MATLAB Mohammad Safeea and Pedro Neto Abstract Collaborative robots are increasingly present in our lives. The KUKA LBR iiwa equipped with

More information

On Application of Virtual Fixtures as an Aid for Telemanipulation and Training

On Application of Virtual Fixtures as an Aid for Telemanipulation and Training On Application of Virtual Fixtures as an Aid for Telemanipulation and Training Shahram Payandeh and Zoran Stanisic Experimental Robotics Laboratory (ERL) School of Engineering Science Simon Fraser University

More information

Robotics: Evolution, Technology and Applications

Robotics: Evolution, Technology and Applications Robotics: Evolution, Technology and Applications By: Dr. Hamid D. Taghirad Head of Control Group, and Department of Electrical Engineering K.N. Toosi University of Tech. Department of Electrical Engineering

More information

Vision-Based Robot Learning for Behavior Acquisition

Vision-Based Robot Learning for Behavior Acquisition Vision-Based Robot Learning for Behavior Acquisition Minoru Asada, Takayuki Nakamura, and Koh Hosoda Dept. of Mechanical Eng. for Computer-Controlled Machinery, Osaka University, Suita 565 JAPAN E-mail:

More information

AUTOMATIC INSPECTION SYSTEM FOR CMOS CAMERA DEFECT. Byoung-Wook Choi*, Kuk Won Ko**, Kyoung-Chul Koh***, Bok Shin Ahn****

AUTOMATIC INSPECTION SYSTEM FOR CMOS CAMERA DEFECT. Byoung-Wook Choi*, Kuk Won Ko**, Kyoung-Chul Koh***, Bok Shin Ahn**** AUTOMATIC INSPECTION SYSTEM FOR CMOS CAMERA DEFECT Byoung-Wook Choi*, Kuk Won Ko**, Kyoung-Chul Koh***, Bok Shin Ahn**** * Dept. of Electrical Engineering, Seoul Nat'l Univ. of Technology, Seoul, Korea

More information

SmartSenseCom Introduces Next Generation Seismic Sensor Systems

SmartSenseCom Introduces Next Generation Seismic Sensor Systems SmartSenseCom Introduces Next Generation Seismic Sensor Systems Summary: SmartSenseCom, Inc. (SSC) has introduced the next generation in seismic sensing technology. SSC s systems use a unique optical sensing

More information

Components for virtual environments Michael Haller, Roland Holm, Markus Priglinger, Jens Volkert, and Roland Wagner Johannes Kepler University of Linz

Components for virtual environments Michael Haller, Roland Holm, Markus Priglinger, Jens Volkert, and Roland Wagner Johannes Kepler University of Linz Components for virtual environments Michael Haller, Roland Holm, Markus Priglinger, Jens Volkert, and Roland Wagner Johannes Kepler University of Linz Altenbergerstr 69 A-4040 Linz (AUSTRIA) [mhallerjrwagner]@f

More information

Design of Infrared Wavelength-Selective Microbolometers using Planar Multimode Detectors

Design of Infrared Wavelength-Selective Microbolometers using Planar Multimode Detectors Design of Infrared Wavelength-Selective Microbolometers using Planar Multimode Detectors Sang-Wook Han and Dean P. Neikirk Microelectronics Research Center Department of Electrical and Computer Engineering

More information

LASER ASSISTED COMBINED TELEOPERATION AND AUTONOMOUS CONTROL

LASER ASSISTED COMBINED TELEOPERATION AND AUTONOMOUS CONTROL ANS EPRRSD - 13 th Robotics & remote Systems for Hazardous Environments 11 th Emergency Preparedness & Response Knoxville, TN, August 7-10, 2011, on CD-ROM, American Nuclear Society, LaGrange Park, IL

More information

BioInstrumentation Laboratory

BioInstrumentation Laboratory BioInstrumentation Laboratory Ian Hunter Vienna, May 22 2013 BioInstrumentation Lab, Mechanical Engineering, MIT - Robotic endoscopes - Needle-free drug delivery devices - Eye micro-surgery robots - High

More information

Multi-robot Formation Control Based on Leader-follower Method

Multi-robot Formation Control Based on Leader-follower Method Journal of Computers Vol. 29 No. 2, 2018, pp. 233-240 doi:10.3966/199115992018042902022 Multi-robot Formation Control Based on Leader-follower Method Xibao Wu 1*, Wenbai Chen 1, Fangfang Ji 1, Jixing Ye

More information

Graphical Simulation and High-Level Control of Humanoid Robots

Graphical Simulation and High-Level Control of Humanoid Robots In Proc. 2000 IEEE RSJ Int l Conf. on Intelligent Robots and Systems (IROS 2000) Graphical Simulation and High-Level Control of Humanoid Robots James J. Kuffner, Jr. Satoshi Kagami Masayuki Inaba Hirochika

More information

MICROMACHINED INTERFEROMETER FOR MEMS METROLOGY

MICROMACHINED INTERFEROMETER FOR MEMS METROLOGY MICROMACHINED INTERFEROMETER FOR MEMS METROLOGY Byungki Kim, H. Ali Razavi, F. Levent Degertekin, Thomas R. Kurfess G.W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta,

More information

648. Measurement of trajectories of piezoelectric actuators with laser Doppler vibrometer

648. Measurement of trajectories of piezoelectric actuators with laser Doppler vibrometer 648. Measurement of trajectories of piezoelectric actuators with laser Doppler vibrometer V. Grigaliūnas, G. Balčiūnas, A.Vilkauskas Kaunas University of Technology, Kaunas, Lithuania E-mail: valdas.grigaliunas@ktu.lt

More information

Proseminar Roboter und Aktivmedien. Outline of today s lecture. Acknowledgments. Educational robots achievements and challenging

Proseminar Roboter und Aktivmedien. Outline of today s lecture. Acknowledgments. Educational robots achievements and challenging Proseminar Roboter und Aktivmedien Educational robots achievements and challenging Lecturer Lecturer Houxiang Houxiang Zhang Zhang TAMS, TAMS, Department Department of of Informatics Informatics University

More information

Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with Disabilities

Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with Disabilities The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with

More information

Virtual Grasping Using a Data Glove

Virtual Grasping Using a Data Glove Virtual Grasping Using a Data Glove By: Rachel Smith Supervised By: Dr. Kay Robbins 3/25/2005 University of Texas at San Antonio Motivation Navigation in 3D worlds is awkward using traditional mouse Direct

More information

BLuAC5 Brushless Universal Servo Amplifier

BLuAC5 Brushless Universal Servo Amplifier BLuAC5 Brushless Universal Servo Amplifier Description The BLu Series servo drives provide compact, reliable solutions for a wide range of motion applications in a variety of industries. BLu Series drives

More information