High-speed Gaze Controller for Millisecond-order Pan/tilt Camera


2011 IEEE International Conference on Robotics and Automation, Shanghai International Conference Center, May 9-13, 2011, Shanghai, China

High-speed Gaze Controller for Millisecond-order Pan/tilt Camera

Kohei Okumura, Hiromasa Oku and Masatoshi Ishikawa

Abstract— We developed an optical high-speed gaze controller, called the Saccade Mirror, and used it to realize a high-speed pan/tilt camera with a high-speed image processor. Generally, in a pan/tilt camera, the gaze is controlled mechanically by rotational actuators. However, because pan/tilt cameras will be expected to use high-speed image processing (1000 fps), sufficiently high-speed performance of the gaze controller, comparable to the high frame rate, cannot be obtained with the usual method. In our system, the camera itself was fixed, and an external Saccade Mirror subsystem was used for optical gaze control. An arbitrary gaze direction within ±30 deg could be achieved in less than 3.5 ms.

I. INTRODUCTION

The purpose of our study was to realize a high-speed pan/tilt camera that has the ability to change its gaze direction extremely quickly, at speeds comparable to the processing speed of a high-speed image processor, while still providing a sufficient angle of view. The term "gaze" used here means the direction in which images are captured by the camera, and "high-speed image processor" means a kind of camera (a vision sensor) that can realize both capturing and processing of more than 1000 images per second.

The background of this research follows. It is said that more than 80% of the sensory information that humans obtain comes from the visual organs. For robots as well, visual servo systems have been widely used to grasp changing external environments. Vision sensors with a frame rate of 30 fps, such as CCDs, are mainly used in visual information processing. However, this approach cannot realize high-speed visual control because the image sampling rate in visual information processing is restricted to at most 30 Hz.
In such cases, a high-speed image processor has been demonstrated to be quite useful [1]. On the other hand, in some applications, it is necessary to acquire images over a range wider than the camera's original angle of view. Hence, cameras whose gaze can be controlled, called pan/tilt cameras (or active vision systems), have been developed, mainly for purposes such as monitoring and inspection. In a general pan/tilt camera, the camera is mounted on a two-axis rotational platform including actuators (Fig. 1) [2]. If a high-speed image processor can control its gaze at a speed sufficiently high compared with its frame rate, various advanced applications are expected. Some examples are:
- Observation of high-speed dynamic objects without motion blur.
- Quasi-extension of the angle of view by combining multiple images taken from different gaze directions in real time.
- Displaying multiple images from different gaze directions in real time using only one vision sensor.

K. Okumura, H. Oku and M. Ishikawa are with the Dept. of Information Physics and Computing, Graduate School of Information Science and Technology, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan. kohei okumura@ipc.i.u-tokyo.ac.jp

This kind of system is expected to be valuable in many vision applications, including robot vision, medical services, and broadcasting. However, sufficient high-speed performance cannot be obtained even if the gaze direction of the high-speed image processor is controlled mechanically by rotational actuators. The time required to reach the desired gaze direction is usually more than 20 ms [3], whereas the imaging cycle of the high-speed image processor is less than 1 ms. This is a critical bottleneck in realizing a high-speed pan/tilt camera, particularly for a visual servo system. To solve this issue, we developed an external high-speed gaze controller, called the Saccade Mirror. This name is based on the rapid movement of the human eye known as a saccade.
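A back-of-the-envelope sketch makes the bottleneck above concrete. The target speed of 10 m/s is an illustrative assumption (not from the paper); the 20 ms and 3.5 ms settling times are the figures discussed in the text:

```python
# Back-of-the-envelope: how far a fast target drifts while the gaze is
# still settling. The 10 m/s target speed is an assumed example value.

def drift_mm(speed_m_s: float, settle_ms: float) -> float:
    """Distance (mm) covered by a target at speed_m_s during settle_ms."""
    return speed_m_s * settle_ms  # (m/s) * (ms) conveniently yields mm

# A target at 10 m/s during a ~20 ms mechanical gaze switch:
mechanical = drift_mm(10.0, 20.0)   # 200 mm
# The same target during a 3.5 ms Saccade Mirror switch:
saccade = drift_mm(10.0, 3.5)       # 35 mm

print(mechanical, saccade)          # 200.0 35.0
```

The two-orders-of-magnitude mismatch between the imaging cycle (1 ms) and the mechanical settling time is what the optical approach removes.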
Two-axis rotational mirrors, the critical parts of the system, respond in milliseconds to rapidly control the gaze direction optically (Fig. 1).

Fig. 1. Illustration of a general pan/tilt camera, and our method involving gaze control using two-axis rotational mirrors.

II. COMPONENTS OF SACCADE MIRROR

A. Two-axis Rotational Mirrors for High-speed Gaze Control

There have been several studies aimed at realizing a high-speed pan/tilt camera even with a conventional structure. However, the required level of performance has not been obtained. For example, Y. Nakabo has developed a 1 ms target tracking system, which is a good example of a pan/tilt camera with a high-speed image processor [3]. In this system, a high-speed image processor called Column Parallel Vision 1 (CPV1) is mounted on a two-axis rotational platform including high-speed actuators. Nevertheless, its response time for switching the gaze direction is more than 20 ms, much longer than the control cycle period of 1 ms. Also, the cutoff frequency of the gaze control in the visual feedback

system is approximately 10 Hz, whereas the regulation, or imaging, rate is 1 kHz. For the millisecond-order control that we aim for, the inertia of the rotating parts must be reduced as much as possible. We focused attention on two-axis rotational mirrors for gaze control, as described in the introduction section. In this case, because the only rotating parts are the two mirrors, the inertia of that subsystem can be considerably reduced. However, two-axis rotational mirrors have not often been adopted for pan/tilt cameras. They are mainly used for scanning non-diverging bundles of rays, such as laser beams. For example, J. Taylor et al. and H. Kim et al. both developed depth measurement systems (a kind of laser range finder) using a camera, a laser, and two-axis rotational mirrors [4], [5]. If we use rotational mirrors for the pan/tilt camera, one critical issue is that the obtained angle of view is significantly restricted because the mirrors are small and are constructed to rotate on two axes.

A model of the system geometry now follows. A pinhole camera model is shown in Fig. 2 [6]. The angle of view θ in Fig. 2 is defined as the angular extent of a given scene captured by the camera. Rays from the scene are concentrated at the pinhole. The point where the bundle of rays is concentrated is called the pupil. The relationship between θ and S(θ) in Fig. 2 is presented here. S(θ) is the area of a certain image plane a distance d away from the center of the pupil. If θ becomes larger, S(θ) obviously becomes larger as well. For gaze control with mirrors, if a wide angle of view is required, large mirrors will also be required. Thus, it is not easy to realize both a wide angle of view and high-speed performance at the same time, because large mirrors are difficult to drive quickly. This is even more difficult when using independent two-axis mirrors. We call the mirror near the pupil the pan-mirror and the other one the tilt-mirror.
If a certain wide controllable gaze range is needed, the required size of the tilt-mirror will be larger than that of the pan-mirror, because the area through which rays pass on the tilt-mirror varies depending on the driven angle of the pan-mirror.

Fig. 2. Angle of view in the model of a pinhole camera.

B. Pupil Shift Lenses for Achieving Practical Angle of View

We solved the issue described above by using positive lenses called pupil shift lenses [7]. The basic theory of the function of a pupil shift lens with a pinhole camera model is shown in Fig. 3. A bundle of rays can be refracted by the positive lens. Some of the rays that would otherwise be concentrated at B are also concentrated at the desired point A by the positive lens. Thinking of A as the pupil of the camera, this vision system including the positive lens has two pupils: the original one A and the shifted one B. This theory can be applied to our Saccade Mirror, enabling both a wide angle of view and high-speed performance to be realized at the same time. That is, if we need gaze control at a sufficiently high speed compared with the frame rate of a high-speed image processor, this combination should give a practical angle of view; or, if we need a practical angle of view with a certain vision system, this combination gives sufficiently high speed. For example, to obtain a 30 deg angle of view without pupil shift lenses, the inertia of the tilt-mirror is estimated to be more than one hundred times larger compared with the case where pupil shift lenses are used.

Fig. 3. Shift of the pupil using a positive lens.

III. DETAILED OPTICAL THEORY

A. Gaze and Visibility Model

Here, the gaze direction and appearance of an image through the two mirrors are considered mathematically. The two mirrors are mounted as shown in Fig. 4.

Fig. 4. Mirror alignment, and appearance of an image through the mirrors.

The pan-mirror surface (z = (tan α)x) includes the origin, and its rotational axis is the y-axis. The tilt-mirror surface (z = (tan β)y + l_m) includes O′ = (0, 0, l_m), and its rotational axis is parallel to the x-axis. The original gaze direction lies in the +x-axis direction (v = (1, 0, 0)^T). A matrix S_p that gives a line-symmetric transformation with respect to the pan-mirror surface and a matrix S_t that

gives a line-symmetric transformation with respect to the tilt-mirror surface are expressed as:

S_p = \begin{pmatrix} \cos 2\alpha & 0 & \sin 2\alpha \\ 0 & 1 & 0 \\ \sin 2\alpha & 0 & -\cos 2\alpha \end{pmatrix}, \quad S_t = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos 2\beta & \sin 2\beta \\ 0 & \sin 2\beta & -\cos 2\beta \end{pmatrix}  (1)

Position vectors a = (a_x, a_y, a_z)^T and b = (b_x, b_y, b_z)^T that satisfy x < 0 near the x-axis are considered. a′, the line-symmetric displacement mapping of a with respect to the two mirrors, is calculated as:

a' = S_t \left( S_p \begin{pmatrix} a_x \\ a_y \\ a_z \end{pmatrix} - \begin{pmatrix} 0 \\ 0 \\ l_m \end{pmatrix} \right) + \begin{pmatrix} 0 \\ 0 \\ l_m \end{pmatrix}  (2)

When a_1 = b_1 - 1, a_2 = b_2, and a_3 = b_3 are assumed, b - a = v. Thus, the gaze direction through the mirrors, v' = b' - a', is calculated using (2):

v' = \begin{pmatrix} -\sin\varphi_p \\ \cos\varphi_p \cos\varphi_t \\ \cos\varphi_p \sin\varphi_t \end{pmatrix}  (3)

where 2α = π/2 + φ_p and 2β = π/2 + φ_t. In addition, a general pan/tilt camera has a gaze vector v_g:

v_g = \begin{pmatrix} -\sin\varphi_p \cos\varphi_t \\ \cos\varphi_p \cos\varphi_t \\ \sin\varphi_t \end{pmatrix}  (4)

On the other hand, when a_1 = b_1, b_2 - a_2 = cos θ, and b_3 - a_3 = sin θ are assumed, b - a is a vector u = (0, cos θ, sin θ)^T on the camera image plane. Then, its mapping with respect to the mirrors is as follows:

u' = \begin{pmatrix} \sin\theta \cos\varphi_p \\ -\cos\theta \sin\varphi_t + \sin\theta \sin\varphi_p \cos\varphi_t \\ \cos\theta \cos\varphi_t + \sin\theta \sin\varphi_p \sin\varphi_t \end{pmatrix}  (5)

To compare u and u′, u′ is rotated so that v′ lies along the x-axis, using the matrix:

R = \begin{pmatrix} -\sin\varphi_p & \cos\varphi_p \cos\varphi_t & \cos\varphi_p \sin\varphi_t \\ -\cos\varphi_p & -\sin\varphi_p \cos\varphi_t & -\sin\varphi_p \sin\varphi_t \\ 0 & -\sin\varphi_t & \cos\varphi_t \end{pmatrix}  (6)

Then,

u'' = R u' = \begin{pmatrix} 0 \\ -\sin\theta \\ \cos\theta \end{pmatrix} = \begin{pmatrix} 0 \\ \cos(\theta + \pi/2) \\ \sin(\theta + \pi/2) \end{pmatrix}  (7)

is obtained. That is, the image through the mirrors is inclined at 90 deg, as shown in Fig. 4, and the camera should be tilted accordingly.

B. Design of Pupil Shift Lenses

An object more than 2 m away is assumed to be at infinity because the camera pupil is comparatively small (5-10 mm). Therefore, the pupil shift lenses are designed for an object at infinity. First, two positive lenses are mounted, separated by a distance equal to the sum of their two focal lengths, in order to keep both the input and output bundles of rays parallel. The lens on the input side is called the object lens, and that on the output side is called the collimator. One more lens, called a field lens, is placed at the common focal position of these two lenses in order to prevent vignetting. The effect of the field lens is shown in Figs. 5(a) and 5(b).

Fig. 5. Behavior of a bundle of rays through the lenses: (a) without a field lens, and (b) with a field lens.

C. Overall Optical Design

We now describe how each optical device is designed. The angle of view centered on the original camera pupil, α, and that centered on the shifted camera pupil, β, are generally different. Here, β means the real angle of view while the Saccade Mirror is used. The angles of view are related to the parameters of the pupil shift lenses as follows:

f_o : f_c = \tan\left(\frac{\alpha}{2}\right) : \tan\left(\frac{\beta}{2}\right) = d_\beta : d_\alpha  (8)

where f_o is the focal length of the object lens, f_c is that of the collimator, d_α is the diameter of the original camera pupil, and d_β is that of the shifted pupil [8]. These quantities are shown in Fig. 6.

Fig. 6. Focal lengths and angles of view.
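The gaze model of Sec. III-A can be sanity-checked numerically. The sketch below builds the two mirror reflections as Householder matrices (I − 2nnᵀ, for the plane normals implied by the stated mirror surfaces) and confirms that the folded gaze direction matches the closed form of Eq. (3); the mirror angles are arbitrary test values, not prototype settings:

```python
import numpy as np

# Numerical check of the Sec. III-A gaze model: pan-mirror plane
# z = tan(a)x (axis = y-axis), tilt-mirror plane z = tan(b)y + l_m
# (axis parallel to x). The translation by l_m cancels for direction
# vectors, so only the reflection matrices matter here.

def reflect(n):
    """Householder reflection I - 2nn^T about a plane with normal n."""
    n = np.asarray(n, float)
    n = n / np.linalg.norm(n)
    return np.eye(3) - 2.0 * np.outer(n, n)

a, b = 0.9, 0.8                                # mirror angles alpha, beta [rad]
S_p = reflect([np.sin(a), 0.0, -np.cos(a)])   # pan-mirror reflection
S_t = reflect([0.0, np.sin(b), -np.cos(b)])   # tilt-mirror reflection

# Gaze through both mirrors: v' = S_t S_p v with v = (1, 0, 0)^T.
v_prime = S_t @ S_p @ np.array([1.0, 0.0, 0.0])

# Closed form of Eq. (3), with 2a = pi/2 + phi_p and 2b = pi/2 + phi_t.
phi_p, phi_t = 2 * a - np.pi / 2, 2 * b - np.pi / 2
expected = np.array([-np.sin(phi_p),
                     np.cos(phi_p) * np.cos(phi_t),
                     np.cos(phi_p) * np.sin(phi_t)])

print(np.allclose(v_prime, expected))  # True
```

Each reflection is involutive (S_p S_p = I), which is a convenient extra check that the matrices were built correctly.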

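Eq. (8) ties the two focal lengths to both the field-angle conversion and the pupil-size conversion. The following worked numbers are illustrative assumptions (chosen only to land near the prototype's roughly 40 deg field), not the published design data:

```python
import math

# Worked numbers for the pupil-shift relation of Eq. (8):
#   f_o : f_c = tan(alpha/2) : tan(beta/2) = d_beta : d_alpha
# All values below are assumed for illustration.

f_o, f_c = 60.0, 80.0        # object-lens / collimator focal lengths [mm]
d_alpha = 10.0               # original camera pupil diameter [mm]
alpha = math.radians(30.0)   # camera's own full angle of view (assumed)

# Shifted pupil diameter and the angle of view through the system:
d_beta = d_alpha * f_o / f_c
beta = 2.0 * math.atan(math.tan(alpha / 2.0) * f_c / f_o)

print(round(d_beta, 2), round(math.degrees(beta), 1))  # 7.5 39.3
```

Note the trade embedded in the ratio f_o/f_c: widening the usable angle of view (β > α) simultaneously shrinks the shifted pupil, which is exactly what keeps the mirrors small and fast.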
Here, each device is assumed to be selected from commercially available devices. However, the design process cannot be uniquely determined; it depends on the intended purpose, use, situation (the limits of the obtainable devices, etc.), and so on. In our approach, we first determine the maximum angle of view, β, and the maximum controllable gaze angle, ϕ. The rotational mirrors should be selected at this point, because ϕ is determined by the rotational mirrors alone. Next, the positions of the two-axis mirrors are determined in consideration of the mirror size, β, and the shifted pupil position. Then, the position and diameter of the object lens are determined. An unfolded diagram of the optical system is shown in Fig. 7.

Fig. 7. Determining the mirror positions. We should avoid vignetting caused by the angles of the mirrors. This figure shows the case where both mirrors are driven by 15 degrees in both rotational directions.

With the paths of the bundles of rays in mind, the diameters of the collimator and the field lens, D_c and D_f, are determined. On the other hand, α can be calculated from D_c and the distance between the collimator and the camera pupil, l_c-p. Now both α and β are determined, and the focal lengths of the lenses, f_o, f_c, and f_f, can be calculated by (8). This process is shown in Fig. 8.

Fig. 8. Simulation for determining the positions and the sizes of the lenses.

IV. PROTOTYPE SACCADE MIRROR

In this section, we describe a prototype Saccade Mirror developed with some commercially available devices.

A. Optical Setup

For high-speed gaze control, we used rotational mirrors originally designed for laser scanning (galvanometer mirrors manufactured by GSI Group), and for the pupil shift lenses, we used three achromatic lenses (Edmund Optics). The setup is shown in Fig. 9, and the optical specifications were as follows:
- Gazing angle range, ϕ: ±30 deg
- Maximum beam aperture: 30 mm
- Maximum angle of view, β: 38.6 deg (measured), 40 deg (designed)

Fig. 9. The optical setup of the prototype.

B. Response Performance

The response time for switching the gaze from a certain direction to any desired direction was measured using a high-speed camera. For the sake of simplicity, only the pan-mirror was driven, and the gazing angle was set to the maximum angle possible (60 deg). For the desired gazing angle ϕ_in [deg], a ramp input that is a function of t [ms] was supplied to the prototype:

\varphi_{in} = \begin{cases} -30 & (t \le 0) \\ -30 + 30t & (0 < t < 2) \\ 30 & (2 \le t) \end{cases}  (9)

The results are shown in Fig. 10. Even when the mirrors were scanned quite rapidly, the desired image captured by the high-speed camera appeared stable. The response time was only 3.5 ms. Compared with general mechanical pan/tilt cameras (response time 20 ms), we successfully achieved the expected millisecond-order response.

Fig. 10. Response time for switching gaze; image sequences captured by a high-speed camera.

V. MILLISECOND-ORDER HIGH-SPEED PAN/TILT CAMERA

A. Setup for Target Tracking

We developed a millisecond-order high-speed pan/tilt camera using the prototype Saccade Mirror described above

and a high-speed image processor. We used the camera to implement a target tracking application (Fig. 11). We used the IDP-512e as the high-speed image processor [9] and a PC for both image processing and mirror control. The detailed specifications are as follows: PC: DELL Precision T7400; CPU: Xeon, 2.83 GHz; RAM: 3.25 GB; OS: Windows XP Professional, Service Pack 2.

Fig. 11. System setup for target tracking.

B. Tracking Algorithm

The tracking algorithm is described here. An image at a certain time is first captured by the high-speed image processor and transferred to the PC. Next, the target is distinguished precisely from the background using adequate thresholding of the HSV color image to obtain a binarized image, expressed as:

I(x, y) = \begin{cases} 1 & \text{(target)} \\ 0 & \text{(otherwise)} \end{cases}  (10)

Using I(x, y), the (p + q)-th image moment is defined as:

m_{pq} = \sum_x \sum_y x^p y^q I(x, y)  (11)

The center of mass in the image can be calculated as:

x_{cm} = \frac{\sum_x \sum_y x\, I(x,y)}{\sum_x \sum_y I(x,y)}, \quad y_{cm} = \frac{\sum_x \sum_y y\, I(x,y)}{\sum_x \sum_y I(x,y)}  (12)

Then, the mirrors are controlled to reduce the distance between G(x_cm, y_cm) and the center of the image to close to zero. This process is repeated every 1 ms to realize target tracking [10].

C. Frequency Response

The frequency response of the target tracking system was measured using a marker on a rotating fan as the target. A Bode plot is shown in Fig. 12. The cutoff frequency (−3 dB) was found to be around 100 Hz (pan gazing) or more than 100 Hz (tilt gazing). The phase delay at 100 Hz was approximately 180 deg (5 ms). In a previous study [3], the frequency response of a target tracking system implemented with a mechanical pan/tilt camera was obtained. The cutoff frequency was 10 Hz (pan gazing), and the phase delay at the cutoff frequency was 90 deg (25 ms). Thus, compared with the previous system, our target tracking system successfully attained a performance level almost ten times higher, in terms of the cutoff frequency.
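The binarization-and-moments step of the tracking algorithm (Eqs. (10)-(12)) can be sketched as follows. The 8x8 image and blob position are made-up test data; the real system instead thresholds HSV frames arriving from the high-speed image processor:

```python
import numpy as np

# Minimal sketch of the moment-based centroid step: given a binarized
# image I(x, y), compute m_pq and the center of mass
# (x_cm, y_cm) = (m_10 / m_00, m_01 / m_00).

def moment(I, p, q):
    """(p+q)-th image moment m_pq = sum_x sum_y x^p y^q I(x, y)."""
    ys, xs = np.nonzero(I)            # array rows are y, columns are x
    return int(np.sum(xs**p * ys**q))

I = np.zeros((8, 8), dtype=np.uint8)
I[2:4, 5:7] = 1                       # a 2x2 "target" blob (made-up data)

m00 = moment(I, 0, 0)
x_cm = moment(I, 1, 0) / m00
y_cm = moment(I, 0, 1) / m00
# The mirror command would then drive (x_cm, y_cm) toward the image
# center, here (3.5, 3.5) for an 8x8 frame.
print(x_cm, y_cm)                     # 5.5 2.5
```

Because only nonzero pixels enter the sums, the per-frame cost scales with the target size rather than the frame size, which is what makes a 1 ms loop plausible.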
However, the phase delay at the cutoff frequency in our system was larger than that in the previous system. The reasons for this difference can be considered as follows. If a Windows PC is used, a delay (2-3 ms) from the PC will inevitably appear regardless of the gazing method. This delay is actually not so long for conventional systems, whose response time is 20 ms. However, because the response time of our millisecond-order pan/tilt camera using the Saccade Mirror was 3.5 ms, as mentioned above, the delay of 2-3 ms from the PC became comparatively large.

Fig. 12. Bode plot of the target tracking system: (a) gain, and (b) phase delay.

D. Observation of Moving Ball

Our millisecond-order pan/tilt camera should be useful for observing or evaluating high-speed dynamic phenomena with high resolution and no motion blur. If only a high-speed camera is used, the entire range of the dynamic motion must be kept within the visual field of the camera using a wide-angle lens. Thus, the resolution of the desired area becomes relatively low. Moreover, motion blur also appears because the target moves at high speed. We focused on applications in the field of sports. Balls in baseball, football, or tennis are difficult to observe continuously because the velocity vectors of the balls fluctuate extremely rapidly during games. The ability to acquire dynamic images of fast-moving balls would make significant

contributions to broadcasting techniques, practice for players, product development for sports goods manufacturers, and so on. Therefore, as an exploratory experiment, we observed a dynamic rubber ball using our millisecond-order pan/tilt camera. The target suddenly came into the visual field, bounced against a table, and was finally hit by a racket. The image sequences are shown in Fig. 13. To keep the images of the target ball stationary, they are corrected so that the center of mass in each image is placed at the center. Although the instant where the ball bounces or is hit (change of the velocity vector) is conventionally difficult to capture, clearly observable image sequences could be successfully obtained with our system. The ball's estimated trajectory is also shown in Fig. 14.

Fig. 13. Image sequences of a dynamic rubber ball: (a) the ball falling from a hand, (b) bouncing against a table, (c) approaching a racket, (d) being hit by the racket, and (e) flying away.

Fig. 14. The ball's estimated trajectory.

VI. CONCLUSION

In this paper, we proposed an optical high-speed gaze controller, called the Saccade Mirror. The developed prototype successfully demonstrated a response time of only 3.5 ms. We also developed a millisecond-order pan/tilt camera using this prototype and a high-speed image processor, and we used it to implement a target tracking application.
The performance (in terms of cutoff frequency) was almost ten times higher than that of a conventional mechanical pan/tilt camera. We demonstrated the utility of the system for the field of sports with an exploratory experiment in which a fast-moving rubber ball was observed. In future work, we intend to improve the prototype and develop further systems, such as a higher-performance prototype and other practical applications.

REFERENCES
[1] T. Komuro, I. Ishii, M. Ishikawa and A. Yoshida: A digital vision chip specialized for high-speed target tracking, IEEE Trans. Electron Devices, Vol. 50, pp. 191-199, 2003.
[2] J. Aloimonos, I. Weiss and A. Bandyopadhyay: Active Vision, International Journal of Computer Vision, Vol. 1, No. 4, pp. 333-356, 1988.
[3] Y. Nakabo, M. Ishikawa, H. Toyoda and S. Mizuno: 1ms column parallel vision system and its application of high speed target tracking, Proc. of IEEE Int. Conf. Robotics and Automation, pp. 650-655, 2000.
[4] J. Taylor, J.-A. Beraldin, G. Godin, L. Cournoyer, R. Baribeau, F. Blais, M. Rioux and J. Domey: NRC 3D imaging technology for museum and heritage applications, The Journal of Visualization and Computer Animation, Vol. 14, pp. 121-138, 2003.
[5] H. Kim, C. Lin, C. Yoon, H. Lee and H. Son: A Depth Measurement System with the Active Vision of the Striped Lighting and Rotating Mirror, Progress in Pattern Recognition, Image Analysis and Applications, Vol. 3287, pp. 108-115, 2004.
[6] R. M. Haralick and L. G. Shapiro: Computer and Robot Vision, 1st edition, Addison-Wesley Longman Publishing Co., Inc., 1992.
[7] M. A. Paesler and P. J. Moyer: Near-Field Optics: Theory, Instrumentation, and Applications, John Wiley & Sons, Inc., 1996.
[8] H. Gross, W. Singer and M. Totzeck: Handbook of Optical Systems, Vol. 2: Physical Image Formation, WILEY-VCH Verlag GmbH & Co. KGaA, 2005.
[9] I. Ishii, T. Tatebe, Q. Gu, Y. Moriue, T. Takaki and K.
Tajima: 2000 fps Real-time Vision System with High-frame-rate Video Recording, Proc. of the 2010 IEEE International Conference on Robotics and Automation (ICRA2010), pp. 1536-1541, 2010.
[10] B. K. P. Horn: Robot Vision, The MIT Press, 1986.