A flexible microassembly system based on hybrid manipulation scheme for manufacturing photonics components

Int J Adv Manuf Technol (2006) 28: 379–386
DOI 10.1007/s00170-004-2360-8

ORIGINAL ARTICLE

Byungkyu Kim · Hyunjae Kang · Deok-Ho Kim · Jong-Oh Park

A flexible microassembly system based on hybrid manipulation scheme for manufacturing photonics components

Received: 27 March 2004 / Accepted: 28 July 2004 / Published online: 16 March 2005
© Springer-Verlag London Limited 2005

Abstract  In this paper, a flexible microassembly system based on a hybrid manipulation scheme is proposed for the assembly of photonics components such as lensed optical fiber ferrules and laser diode (LD) pumps. In order to achieve both high precision and dexterity in microassembly, we propose a hybrid microassembly system with sensory feedback of vision and force. The system consists of distributed six-degrees-of-freedom (DOF) micromanipulation units, a stereomicroscope, and a haptic interface for force feedback-based microassembly. A hybrid assembly method, which combines vision-based microassembly and scaled teleoperated microassembly with force feedback, is proposed. The feasibility of the proposed method is investigated via experimental studies on assembling micro-optoelectrical components. Experimental results show that the hybrid microassembly system is feasible for the assembly of photonic components in the commercial market, with better flexibility and efficiency.

Keywords  Fine alignment · Force feedback · Microassembly · Optoelectrical components · Vision

B. Kim (✉) · H. Kang · D.-H. Kim · J.-O. Park
Microsystems Research Center, Korea Institute of Science and Technology, P.O. Box 131, Cheongryang, Seoul 130-650, South Korea
E-mail: bkim@kist.re.kr
Tel.: +82-2-9585616
Fax: +82-2-9586910

1 Introduction

In recent years, commercial markets for micro-electromechanical systems (MEMS) products, for example inkjet printer heads, accelerometers, and optical MEMS, are growing rapidly. These products require little or no assembly, because silicon-based micromachining enables a monolithic fabrication process. However, the monolithic process is not feasible for microsystem products consisting of many MEMS subsystems. There are many commercial MEMS products that require flexible precision assembly, for example hard drive read/write head assemblies, fiber optic assemblies, sensors, and RF switches/filters. Currently, these products are assembled semi-automatically or manually due to the difficulties of assembly. A robotic microassembly system with both dexterity and high precision has much potential for the microassembly of hybrid MEMS components.

Several research groups have investigated optimal system design [1, 2] and manipulation methods [3, 4] based on sensory feedback for efficient and reliable microassembly. For example, a flexible microrobot-based microassembly station in which assembly is executed under vision control was proposed [2]. A 3D micromanipulation system based on vision-based human interfaces and haptic feedback was proposed [5]. A reconfigurable assembly system with a low-cost and flexible assembly concept was designed to handle micro parts with high precision [6]. As discussed in the literature [1–8], the fundamental requirements of a microassembly system are dexterity, high precision, and feedback-sensing capability, which enable monitoring and control of the assembly process.

In this paper, we propose a flexible microassembly system based on a hybrid manipulation scheme to achieve both high precision and dexterity in microassembly. The word "hybrid" means a combination of two different manipulation paradigms: scaled teleoperation and vision-based automation. Although these two paradigms are complementary in their benefits, they are often used independently and exclusively. The main benefit of teleoperation is dexterous assembly, whereas that of automation is fast assembly. Until recently, most assemblers have pursued dexterity (assembly complexity) and speed (assembly efficiency) separately. For example, most microscale products assembled by automation are simple in structure and do not need dexterous assembly planning by multi-DOF teleoperation. However, complexity and efficiency in assembly will be pursued together in the near future, because MEMS products are becoming more and more highly integrated and multi-functional. This paper reflects the current trend in microassembly to combine assembly complexity with assembly efficiency by incorporating teleoperation with automation in the control strategy of microassembly.

Fig. 1. Pump LD aligning and welding process

In particular, due to the increased demand for hybrid micro-optoelectromechanical systems (MOEMS) in telecommunications, our research focuses on manipulating and assembling optoelectrical components such as lensed optical fiber ferrules and a 980 nm LD pump (manufactured by COSET Inc.), as shown in Fig. 1. The goal of our research is to demonstrate the mutual benefits of the hybrid microassembly system by combining force feedback-based teleoperation with vision-guided automation, and thus to realize both dexterity and efficiency in microscale assembly.

The paper is structured as follows: In Sect. 2, the assembly process of optoelectrical components is introduced. In Sect. 3, the flexible microassembly system for photonics applications is briefly described. The proposed hybrid manipulation scheme based on sensory feedback of vision and force, which achieves both high precision and dexterity in microassembly, is described in Sects. 4 and 5. In Sect. 6, the application of the proposed hybrid microassembly method to the actual assembly process of optoelectrical components is described. Finally, conclusions are given in Sect. 7.

2 Hybrid assembly process of optoelectrical components

In this study, we propose a hybrid microassembly process for commercial optoelectrical components, which until now have been assembled manually by skilled workers in industry. Figure 1 shows the well-defined assembly process of optoelectrical components in the photonics industry, for example the aligning and welding of the laser diode (LD) pump. The whole assembly process is performed as follows: After part loading, the lensed optical fiber ferrule is inserted into the window holder of the pump module. Then, the angle of the lensed optical fiber is adjusted while monitoring the lens shape. The ferrule is welded onto the plate before the LD through the first and second fine alignments. Our research scope covers the insertion of the fiber ferrule into the window holder and the angle tuning of lensed optical fibers.

In Fig. 2, the hybrid assembly process we propose in this paper is compared with the conventional manual assembly process. The hybrid assembly process, which consists of manual microassembly, vision-based automated microassembly, and scaled teleoperated microassembly with force feedback, is carefully planned and scheduled. For example, the insertion process is carried out by teleoperation to prevent collision between the window of the LD pump module and the lensed optical fiber ferrule. The other processes are performed by vision-based automatic microassembly.

Fig. 2. Assembly process of optoelectrical components

3 Flexible microassembly system

3.1 System overview

Figure 3a depicts a schematic of the proposed microassembly system, which performs fine alignment tasks on micro-optoelectrical components. The system consists of the distributed 6-DOF micromanipulation unit, the multiple imaging unit, a haptic interface (model: 6-DOF PHANToM 1.5 Premium, SensAble Technologies, Inc.), and a graphical user interface for facilitating vision-based microassembly and scaled teleoperated microassembly with force feedback. To provide the operator with visual information, a stereomicroscope (model: MZ-12.5, Leica Inc.) with a long working distance is mounted with multiple CCD cameras. The images are analyzed on a PC by a video frame grabber (model: Matrox Genesis, Matrox Electronic Systems Ltd.) and then supplied to the operator. The PC modules of the control computer are connected via TCP/IP (Ethernet) to a higher-level control PC, which collects all relevant data to control the master-slave micromanipulation system and provides the operator with a graphical user interface. A photograph of the experimental microassembly system setup is shown in Fig. 3b.

Fig. 3. a A schematic of the flexible microassembly system. b Experimental microassembly system setup

3.2 Manipulation unit

The microassembly system consists of a 3-DOF micromanipulator, a 1-DOF high-resolution rotation stage, and a 2-DOF precision positioning system with a large workspace. Fine movements needed for microassembly are provided by the 3-DOF micromanipulator (model: MP-285, Sutter Instrument Co.) with a travel of 22 mm and a resolution of 0.04 µm in each axis. For the angle tuning of lensed optical fiber ferrules, a rotation mechanism is added on the 3-DOF micromanipulator. The 1-DOF rotation stage (model: M-037 DG, PI Inc.) is actuated by a microstep motor with a maximum resolution of 0.03°/step. The 2-DOF precision positioning system (model: M-410 DG, PI Inc.) provides planar motion with a large workspace and a resolution of 8.5 µm under the stereomicroscope, and changes the field of view of the optical vision system. The 6-DOF micromanipulation system, with its decomposition of manipulation DOF, is controlled by a PC-based control board.

3.3 Imaging unit

The vision system consists of optical microscopes with superior performance in recognizing 3D-shaped micro parts, multiple CCD cameras, and a video frame grabber. In order to recognize 3D micro parts visually and monitor the assembly process, the imaging unit integrates one optical stereomicroscope for the top view and two microscopes for the global view and the lateral view. The optical stereomicroscope we used provides a wide range of magnification, high depth of field, and a long working distance. The working distance of our stereomicroscope is 97 mm, which is ample for a robotic microassembly system; a typical optical microscope has a small working distance of around 10 mm, which often limits the workspace of the micromanipulator. For our specific assembly task we used multiple CCD cameras (model: Sony XC-55) to obtain two different fields of view: 11.2 × 8.4 mm (for the insertion task) and 1.41 × 1.05 mm (for the angle tuning process). The global-view optical microscope was used to monitor the assembly process with a large field of view. The visual data obtained from the multiple CCD cameras are synchronized both horizontally and vertically and fed back to the image processing board. A multi-threaded implementation in Visual C++ is used for real-time data transfer.

3.4 Graphical user interface

Our previous research [9] proposed the concept of an intelligent user interface to integrate sensory feedback information of vision and force and to monitor assembly processes in real time. Through this user interface, strategies for vision-based microassembly

and scaled teleoperated microassembly can be realized. Figure 4 shows the user interface used to monitor and control the assembly processes of optical fiber alignment and angle tuning, respectively. It consists of the vision engine, the graphic engine, the haptic engine, and the control engine for facilitating hybrid microassembly of optoelectrical components.

Fig. 4a,b. User interface for hybrid microassembly: a fiber alignment process; b angle tuning process

4 Vision-based microassembly

4.1 Vision-based automation

In consideration of speed and efficiency, the teleoperation scheme is not appropriate for simple tasks such as part transfer. In this study, damage-free tasks such as part transfer and the angle tuning process are performed by vision-based automation. Figure 5 illustrates the vision-based automation process for transferring an optical fiber. First, the optical fiber is moved from its initial position at part loading to the desired position so as to guarantee the safety of the optical fiber. This is based on object recognition and on calculating the distance between the window hole and the tip of the optical fiber.

Fig. 5. Vision-based automation for optical fiber transfer

4.2 Vision-based angle tuning process

Figure 6 shows contours of the tip of a single optical fiber during the angle tuning process. As the optical fiber is rotated, the triangular area under the bottom contour of the lens changes. Therefore, we control the angle of the lens by monitoring and identifying this triangular area, so as to obtain the maximum performance of the LD pump.

Fig. 6. The contour change of the area of the lens

Area calculation algorithm. In order to obtain the triangular area of the lens of an optical fiber ferrule, we use several image processing algorithms. First, images from the video frame grabber are filtered to remove noise. Then, the edge of the lens is detected using the Sobel operator [10]. Finally, we make a binary image. Figure 7a shows the parameters used to calculate the triangular area, which is given by

Area = Σ_{i=1}^{n} (C(0) − C(i)),   (1)

where n is the number of pixels of the bottom contour, C(i) is the y-coordinate of the ith pixel, and C(0) is the y-coordinate of the initial pixel.

When the lensed optical fiber ferrule rotates, the lens slopes upward or downward from the x-axis of the vision system due to the lack of alignment between the manipulator and the lens. This changes the shape of the end point of the lens, which in turn disturbs the calculation of the precise triangular area. To solve this problem, an area filter was designed that compares the slope of the contour of the lens and eliminates the unnecessary pixels, as shown in Fig. 7b.

Fig. 7. a Parameters used to calculate the triangular area of the lens. b Area filter

Figure 8 shows the experimental result of calculating the triangular area of the lens with and without the area filter. It indicates that the precise triangular area of the lens is obtained only when the area filter is used.

Fig. 8. The change of the triangular area of the lens as the optical fiber is rotated

Searching algorithm. It is known that the best performance of the LD pump module is obtained at the minimum value of the triangular area of the lens, because the optical transmission loss is lowest at that position. Therefore, in aligning the lensed optical fiber ferrule in the LD pump module, each optical fiber should be accurately aligned to the position of minimum triangular area. However, the rate of change of the triangular area is higher near its maximum value than near its minimum. Therefore, we find the position that gives the maximum value of the triangular area of the lens, and the rotation stage is then rotated 90° from the found position.

Fig. 9. Experimental result of the angle tuning algorithm
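The area computation of Eq. (1), the slope-based area filter, and the maximum-area search described above can be sketched in Python as follows. This is a minimal sketch under assumptions: the function names, the slope threshold, and the coarse-to-fine step schedule are illustrative, not the authors' implementation.

```python
def triangular_area(contour_y):
    """Eq. (1): Area = sum_{i=1..n} (C(0) - C(i)), where C(i) is the
    y-coordinate of the i-th pixel of the bottom contour of the lens."""
    c0 = contour_y[0]
    return sum(c0 - ci for ci in contour_y[1:])

def area_filter(contour_y, max_step=2):
    """Slope-based area filter (threshold assumed): truncate the contour
    where the local slope jumps abruptly, removing the spurious end-point
    pixels caused by tilt of the lens relative to the image x-axis."""
    kept = [contour_y[0]]
    for ci in contour_y[1:]:
        if abs(ci - kept[-1]) > max_step:
            break
        kept.append(ci)
    return kept

def find_max_area_angle(area_at, start_deg, step_deg=5.0, tol_deg=0.5):
    """Hill-climbing search for the rotation angle of maximum triangular
    area, with coarse steps halved on overshoot (fine control steps).
    In the real system, area_at(angle) would rotate the stage and measure
    the filtered triangular area from the live image."""
    angle = start_deg
    while step_deg >= tol_deg:
        if area_at(angle + step_deg) > area_at(angle):
            angle += step_deg          # keep climbing in this direction
        elif area_at(angle - step_deg) > area_at(angle):
            angle -= step_deg          # climb the other way
        else:
            step_deg /= 2.0            # overshoot: switch to a finer step
    return angle
```

The rotation stage would then be turned 90° from the returned angle, since the minimum-area (lowest-loss) alignment lies a quarter turn from the maximum, as described in the text.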
We applied the hill-climbing algorithm [10] to identify the lens alignment with the maximum triangular area. Then, the rotational stage is controlled to move to the position of minimum triangular area. Figure 9 shows the experimental result of our vision-based angle tuning process. The movement of the rotational stage is controlled adaptively with coarse and fine control steps according to the aligning error. After the tenth control step, the angle of the lens approaches the desired angle; the fine control step command is then applied to the rotational stage for higher accuracy.

5 Scaled teleoperated microassembly with force feedback

In this study, the scaled teleoperated microassembly with force feedback is applied when the assembly process requires the

Fig. 10. Block diagram of the scaled teleoperated microassembly

multi-DOF teaching, or when the automated process could cause breakage of an optical fiber. As shown in Fig. 10, the scaled teleoperated microassembly with force feedback is realized through the haptic interface with the help of the user interface described in Sect. 3.

5.1 Adaptive scaling factor tuning

When the lensed optical fiber ferrule is inserted into the window holder of the LD pump module, the tracking motion of the y-axis manipulator in response to the motion of the haptic device is presented in Fig. 11, for y-axis scale factors varied from 50 to 800 with the x-axis scale factor fixed at 50. Here, the scale factor is defined as the ratio between the moving distance of the haptic device and that of the actual micromanipulator. As the scale factor increases, we obtain stability of assembly and safety of the optical fiber, because the micromanipulator is less influenced by the operator's hand tremor. However, as the scale factor increases, the motion of the micromanipulator, and thus the speed of assembly, decreases. Therefore, we used a vision-based scale factor tuning algorithm to obtain both the safety of the optical fiber and the speed of the assembly task.

Figure 12 depicts the concept of vision-based scale factor tuning. The y-axis and z-axis scale factors are adjusted adaptively as the lensed optical fiber ferrule enters and leaves a specified range around the center of the window hole. In this application, the scale factor is doubled whenever the optical fiber passes near the center of the window hole. This prevents the optical fiber from straying far from the alignment position, and we obtain faster assembly completion. Figure 13 shows the result of the vision-based scale factor tuning algorithm, tested on the same task as in Fig. 11. Once the optical fiber is located around the center of the window hole, the position of the micromanipulator's y-axis converges into a specified range. Therefore, using the vision-based scale factor tuning algorithm, we obtain safety in aligning the optical fiber and increase the speed of the assembly task.

Fig. 11. The movement in the y-axis of the micromanipulator for various scale factors in the insertion task

Fig. 12. Concept of vision-based scale factor tuning
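The master-slave scaling and the doubling rule of Sect. 5.1 can be sketched in Python as follows. This is a minimal sketch: the bounds of 50 and 800 come from the scale factors explored in Fig. 11, while the reset-to-coarse rule and all function names are assumptions.

```python
def slave_step(haptic_step_mm, scale):
    """Scaled teleoperation: the micromanipulator moves the haptic
    displacement divided by the scale factor (ratio of master travel
    to slave travel)."""
    return haptic_step_mm / scale

def update_scale(scale, fiber_tip, hole_center, radius_px, s_min=50, s_max=800):
    """Vision-based tuning rule: double the scale factor (up to s_max)
    while the fiber tip stays within the specified range around the
    window-hole centre, damping the operator's tremor near contact;
    restore the coarse factor once it escapes (reset rule assumed).
    Positions are in image pixels."""
    dx = fiber_tip[0] - hole_center[0]
    dy = fiber_tip[1] - hole_center[1]
    inside = dx * dx + dy * dy <= radius_px * radius_px
    return min(scale * 2, s_max) if inside else s_min
```

At the upper bound of 800, a 1 mm haptic motion moves the micromanipulator only 1.25 µm, which is what suppresses the operator's tremor during the final approach.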

Fig. 13. Experimental result of vision-based scale factor tuning

5.2 Vision-based sound and force feedback

In alignment tasks of micro-optoelectrical components such as the LD pump module, the lensed optical fiber ferrules are fragile: they typically break at forces in the milli-newton (mN) range, which cannot be felt by a human operator. The assembly processes require precise alignment of a lensed optical fiber ferrule to within several µm, and damage due to collision between assembly parts occurs frequently during the assembly process. Therefore force feedback, as well as visual feedback, to the operator is of great importance in identifying contact. In our experiments, if contact occurs between a lensed optical fiber ferrule and an LD pump module, vision-based sound and force feedback to the operator prevents damage to the lensed optical fiber ferrule by adjusting the teaching command sent to the micromanipulator through the haptic interface.

As shown in Fig. 14, the user interface offers sound and force feedback to the operator using visual information in scaled teleoperation. First, after the position of each part is calculated from the visual information, the user interface constructs a virtual-reality model of the assembly environment to enhance the efficiency of the microassembly work. The user interface then judges collisions between parts using the geometrical information of each part in the virtual reality. As a first warning, in the case of one-point contact, the user interface gives a warning message, providing sound feedback to the operator. If area contact occurs, the user interface provides force feedback to the operator through the haptic device, and in doing so prevents damage to the optical fiber.

6 Applications

We achieved the optoelectrical component assembly task successfully using our flexible microassembly system based on hybrid manipulation schemes.
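The warning policy of Sect. 5.2, summarized in the flowchart of Fig. 14, can be sketched in Python as follows. This is a minimal sketch; the rule that classifies one contact point versus several is an assumption about how the virtual-reality collision check reports contact.

```python
def feedback_mode(contact_points):
    """Decide the operator warning from the collision check performed on
    the virtual-reality part models: no contact needs no warning, a single
    contact point triggers the first (sound) warning, and area contact
    (multiple points) triggers force feedback through the haptic device."""
    if not contact_points:
        return "none"
    if len(contact_points) == 1:
        return "sound"   # first warning: audible message to the operator
    return "force"       # area contact: render force on the haptic device
```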
We employed the vision-based automatic assembly technique for the part transfer and angle tuning of lensed optical fiber ferrules, and the scaled teleoperated microassembly with force feedback for the optical fiber insertion task. The optical fiber insertion task requires about 3 minutes, and the angle tuning process takes less than 1 minute. We can therefore save about 10 minutes per unit in the actual manufacturing process, compared to manual assembly, which also demands painstaking operator training.

Fig. 14. A flowchart of vision-based sound and force feedback

7 Conclusions

In this paper, a flexible microassembly system based on hybrid manipulation schemes has been proposed to achieve both high precision and dexterity in microassembly. The system consists of distributed 6-DOF micromanipulation units with a stereomicroscope and a haptic interface for force feedback-based microassembly. For the angle tuning task, the searching algorithm was tested and the task is completed within 1 minute. To reduce the operation time of the insertion task and to prevent breakage of the optical fiber, the scaled teleoperation technique with force feedback is employed; it results in task completion within 3 minutes without fracture of the lensed optical fiber ferrule. Experimental results suggest that the proposed system is promising for the assembly of photonic components in the commercial market

with better flexibility and efficiency. The proposed system is also applicable to the general peg-in-hole task and to angle tuning tasks. It may even be extendable to general optical fiber alignment, since the proposed system has dexterous functions that can carry out each task separately. We are currently conducting experiments to integrate a microgripper with an integrated force sensor in order to provide actual force feedback to the operator.

Acknowledgement  The authors would like to thank COSET Inc. for valuable discussions and for providing the commercial optoelectrical components. This work was supported by the 21st Century Frontier R&D Projects, under contract number MS-02-324-01, sponsored by the Ministry of Science and Technology, Korea.

References

1. Ando N, Korondi P, Hashimoto H (2001) Development of micromanipulator and haptic interface for networked micromanipulation. IEEE/ASME Trans Mechatron 6(4):417–427
2. Fatikow S, Seyfried J, Fahlbusch S, Buerkle A, Schmoeckel F (2000) A flexible microrobot-based microassembly station. J Intell Robot Syst 27:135–169
3. Kim DH, Kim B, Kang HJ (2004) Development of a piezoelectric polymer-based sensorized microgripper for micromanipulation and microassembly. Microsyst Technol 10(4):275–280
4. Zhou Y, Nelson BJ, Vikramaditya B (2000) Integrating optical force sensing and visual servoing for microassembly. J Intell Robot Syst 28(3):259–276
5. Kim DH, Kim K, Kim KY, Cha SM (2001) Dexterous teleoperation for micro parts handling based on haptic/visual interface. Proceedings of the 2001 IEEE International Symposium on Micromechatronics and Human Science, Nagoya, Japan, 2001, pp 211–217
6. Popa D, Kang BH, Sin J, Zou J (2002) Reconfigurable microassembly system for photonics applications. Proceedings of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, May 2002, pp 1495–1500
7. Nelson BJ, Zhou Y, Vikramaditya B (1998) Sensor-based microassembly of hybrid MEMS devices. IEEE Control Syst 18(6):35–45
8. Yang G, Gaines JA, Nelson BJ (2003) A supervisory wafer-level 3D microassembly system for hybrid MEMS fabrication. J Intell Robot Syst 37:43–68
9. Song EH, Kim DH, Kim K, Lee J (2001) Intelligent user interface for teleoperated microassembly. Proceedings of the 2001 International Conference on Control, Automation and Systems, Jeju, Korea, 17–21 October 2001, pp 784–788
10. Matrox Electronic Systems Ltd. (2001) Matrox Imaging Library version 7.0 user guide. Matrox Electronic Systems, Dorval, Canada