Stroboscopic illumination scheme for seamless 3D endoscopy
Neil T. Clancy*1,2, Danail Stoyanov3, Guang-Zhong Yang1,4 and Daniel S. Elson1,2

1 Hamlyn Centre for Robotic Surgery, Imperial College London, UK; 2 Department of Surgery and Cancer, Imperial College London, UK; 3 Centre for Medical Image Computing, Department of Computer Science, University College London, UK; 4 Department of Computing, Imperial College London, UK

ABSTRACT

Intraoperative 3D imaging during minimally invasive surgery (MIS) is possible using structured lighting and has applications in the quantification of tissue morphology. However, projection schemes containing various patterns and colours can be disruptive to the surgeon's field of view. In this paper, a stroboscopic system is proposed in which structured lighting and white light images are interleaved during a high-speed camera acquisition so that the patterned light is not perceived and white light can be used solely for navigation and visual assessment. A beam chopper synchronised with the camera switches rapidly between the two lighting modes while still providing video rate display. A spectrally-encoded structured lighting system is provided by an optical fibre-based probe developed in our lab and is suitable for use in endoscopic biopsy channels. In this dual acquisition mode it is possible to display an augmented view so that the centroids of the structured lighting features are visible on the white light image. Sequential acquisition of varying exposure time images with the high speed camera also allowed the generation of high dynamic range images of the wavelength-encoded structured lighting pattern. Possible applications of this work include classification of polyp morphology as an indicator of pathology.

Keywords: Structured light, endoscopy, stroboscopic illumination, high dynamic range

1. INTRODUCTION

Tissue surface shape is of clinical interest as tissue morphology may be an indicator of pathology.
For example, visual inspection of colonic polyps has revealed correlations between their shape and whether or not they were neoplastic 1. Analysis of this shape in a quantitative manner using a shape-based classification scheme could aid biopsy site selection and reduce the rates of false positives and negatives. Quantification of tissue shape also has potential applications in MIS, where it could be employed to aid registration of pre-operative images, such as MRI, US or CT, with the live surgical view. A real-time updated 3D mesh of the tissue in the field of view could form a deformable framework onto which a 3D scan of a tumour could be overlaid, forming an augmented reality view 2, 3 to aid demarcation of tumour margins. Both the tissue inspection and image registration applications mentioned thus far are achieved endoscopically, where the small cross-sectional area of the instruments used limits the space available to a 3D imaging device. Furthermore, for inspection of colonic polyps, a flexible endoscope must be used, which puts additional demands on the size of the optics and their mechanical properties. A recent publication by our group has addressed these demands and presents a miniature fibre-based structured lighting probe with gradient refractive index optics within a maximum outer diameter of 1.9 mm 4. The probe is small enough to fit into an endoscopic biopsy port, and the illumination source is a supercontinuum laser with relatively high optical throughput (> 100 mW) at the distal end to create a pattern that is easily detected by the camera. The probe uses spectral encoding in the visible range to label each projected feature so that it may be identified during the 3D triangulation process. However, because the pattern is made up of multiple colours across the visible range (approximately nm), normal navigation and visualization becomes confusing.
One approach to this problem that has been used in non-biological scenarios is the technique of imperceptible structured light, which involves the alternate projection of two complementary patterns at a high framerate 5. These patterns may be simple stripes 6 or more complex 7. If the projection frequency is high enough, the human eye will perceive only one evenly illuminated area rather than two distinct patterns. However, to preserve the colour information in the scene being imaged, the sum of all of the coloured patterns must be white light for all pixels 8, which is not the case with the miniature multispectral structured lighting probe described earlier. In this paper, an interleaved dual illumination scheme is proposed in which a beam chopper alternately switches between white and structured light with endoscope-compatible optics. The measured frequency of the high precision chopper is used to trigger a high speed camera so that the computer display shows the scene as illuminated under white light, with the images under patterned light saved for segmentation and 3D processing. This stroboscopic system allows video rate display (15 Hz) at high resolution ( pixels). Although the labels assigned to the projected features in the wavelength-based segmentation are insensitive to the background optical properties, they may still be affected by varying reflectivity, becoming more or less intense on different surfaces. During surgery the reflectivity of tissue may change sharply between different tissue types, for example, white highly reflecting fat and dark highly absorbing liver. Since the projected pattern works in the visible range of the spectrum, different spots will be visible to varying degrees: blue is absorbed strongly by haemoglobin in comparison to red. This, coupled with the fact that the emission strength of the laser is weaker at the blue end, means that for any given surgical scene not all of the projected features will be clearly visible.

*n.clancy@imperial.ac.uk; imperial.ac.uk/roboticsurgery

Advanced Biomedical and Clinical Diagnostic Systems X, edited by Tuan Vo-Dinh, Anita Mahadevan-Jansen, Warren S. Grundfest, Proc. of SPIE Vol. 8214, 82140M (2012)
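The imperceptibility condition cited above (refs 5-8) can be illustrated with a minimal sketch: a pattern and its complement are projected on alternate frames, so their time-average is uniform white at every pixel. The array sizes and white level here are arbitrary illustrative choices; the point is the property that the narrowband multispectral probe cannot satisfy, which motivates the chopper-based scheme instead.

```python
import numpy as np

# Sketch of "imperceptible structured light": a pattern P and its
# complement W - P are projected on alternate frames, so the eye,
# integrating over time, perceives a uniform field rather than the
# pattern. Values here are arbitrary illustrative choices.
W = 255                                            # full-scale white level
rng = np.random.default_rng(0)
pattern = rng.integers(0, W + 1, size=(4, 4, 3))   # arbitrary RGB pattern
complement = W - pattern                           # complementary pattern

perceived = (pattern + complement) / 2             # temporal average
assert np.all(perceived == W / 2)                  # uniform everywhere
```

Because the supercontinuum probe's spots are near-monochromatic, no complementary pattern can make their sum white, so temporal interleaving with a separate white-light source is used here instead.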
In this paper, a high speed camera allows the acquisition of a number of frames of varying exposure time so that a high dynamic range image can be created in which the less bright blue dots are correctly exposed without saturating the red ones.

2. MATERIALS AND METHODS

2.1 Multispectral Structured Lighting Probe

The structured lighting probe used in this paper has been described in detail elsewhere 4 but a brief description is provided here. The broadband output of a supercontinuum laser is directed onto a prism which disperses the light into its constituent wavelengths. This dispersed light is then focused onto a linear array of μm core fibres at the proximal end of the fibre probe. These fibres are bundled incoherently and are packed into a circular arrangement at the distal end so that the result is a mixing of the laser wavelengths and a different wavelength per fibre, each with a full-width at half-maximum of approximately 5 nm. Finally, a GRIN lens is used to form a sharp image of the probe's end face over working distances from less than 2 cm to 20 cm. The colour camera interprets each of these spots, which effectively have a pure wavelength, as varying amounts of red, green and blue. Converting the RGB space to chromaticity coordinates allows calculation of the dominant wavelength detected at a particular pixel, which is independent of the background colour due to its narrow bandwidth. The calculated wavelength of a particular spot then serves as its label so that it may be identified and its position triangulated in 3D space using active stereo techniques used previously 9, 10.

2.2 Stroboscopic illumination

In this paper, a high speed camera synchronized with a stroboscopic illumination system is proposed that separates the white light and structured lighting sources onto alternate frames. The system is designed so that the white light frames are displayed live and the structured lighting images are saved for further processing and 3D triangulation.
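The wavelength-labelling idea of Section 2.1 rests on the fact that chromaticity coordinates depend only on the ratio of the colour channels, so a spot keeps the same label on bright or dark tissue. A minimal sketch of that invariance follows; the final lookup from chromaticity to dominant wavelength via the spectral locus is omitted, and the RGB values are invented for illustration only.

```python
import numpy as np

# Sketch of the chromaticity-based labelling described in Section 2.1.
# Reducing RGB to chromaticity removes overall intensity, so the label
# assigned to a spot is insensitive to surface reflectivity. The
# chromaticity -> dominant wavelength lookup is not shown.
def chromaticity(rgb):
    """Return (r, g) chromaticity; b = 1 - r - g is redundant."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb[:2] / rgb.sum()

# The same projected spot seen on reflective and on absorbing tissue:
spot_bright = chromaticity([200, 40, 10])    # high reflectivity
spot_dim = chromaticity([50, 10, 2.5])       # same spot, 4x darker
assert np.allclose(spot_bright, spot_dim)    # identical label either way
```

This is why the segmentation labels are described in the text as insensitive to background optical properties, even though the spots' absolute intensities vary strongly across tissue types.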
A schematic of the system in Figure 1 shows how an optical beam chopper (3501 Optical Chopper, New Focus, Inc., USA) is used to stroboscopically switch between white light and laser illumination. Both beam paths are directed through the chopper wheel so that they are out of phase by 90° and the chopping frequency is output as a square wave voltage signal (0-5 V). The laser output is collimated and so is chopped as it propagates in free space. The xenon light (Xenon 300, Karl Storz GmbH, Germany) is first coupled into a light cable (5 mm diameter optical fibre bundle; Karl Storz GmbH, Germany), which is then placed as close as possible to the beam chopper, with a second light cable on the other side. This can then be connected to the standard lighting input of a laparoscope or flexible endoscope. The chopping signal is then used to trigger a high-speed colour camera (Prosilica GX1050c, Allied Vision Technologies, Inc., USA). The camera streams colour image data via two gigabit ethernet cables, enabling a maximum speed of 112 fps. The acquisition sequence is detailed in Figure 2. On the rising edge of the trigger signal (t1), white light is allowed through the chopper and displayed on screen. On the falling edge (t2), laser light is transmitted and an image of the structured lighting pattern is saved for wavelength-based segmentation and 3D triangulation.
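The edge-based routing just described can be sketched as a small dispatch loop: each trigger edge identifies which light source was active, and the corresponding frame is sent either to the live display or to disk. This is an illustrative sketch only; the function name and list-based "display"/"save" stand-ins are hypothetical, not the acquisition software's actual API.

```python
# Hypothetical sketch of the interleaved acquisition logic: the chopper's
# square wave triggers the camera on both edges, and frames are routed by
# edge polarity -- rising edge = white light (live display), falling edge
# = structured light (saved for segmentation and 3D triangulation).
def route_frames(edges):
    """edges: sequence of 'rising'/'falling' trigger events.

    Returns (display, saved): frame indices sent to each destination.
    """
    display, saved = [], []
    for i, edge in enumerate(edges):
        if edge == "rising":
            display.append(i)   # white-light frame -> on-screen view
        else:
            saved.append(i)     # structured-light frame -> disk
    return display, saved

display, saved = route_frames(["rising", "falling"] * 3)
assert display == [0, 2, 4] and saved == [1, 3, 5]
```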
Figure 1. Stroboscopic illumination system. The optical chopper is placed in both the white light and laser beam paths so that they are alternately switched on and off.

Figure 2. Trigger timing schematic. The 5 V output signal from the chopper wheel is used to trigger acquisitions from the camera at both the rising and falling edges, t1 and t2 respectively. The chopper period is indicated by T, while the trigger latency, exposure and readout times define the upper limit of the acquisition framerate.

The final framerate of the system is dependent on the exposure time needed, the trigger latency period and the readout time, which is itself determined by the image height. According to the manufacturer's specifications, the trigger latency is approximately 1.5±0.5 μs and, for an image height of 1024 pixels, a framerate of 112 Hz is possible in free run mode. The relationship between image height and maximum framerate is approximately a negative exponential, with 200 Hz possible at an image height of 400 pixels.

2.3 High dynamic range imaging

The multispectral structured lighting probe generates a pattern of dots that are assigned unique wavelength labels. To form a high dynamic range image of the dot pattern with all colours correctly exposed, a number of low dynamic range images were acquired with varying exposure times. In this approach, RGB pixels in which saturation or low intensity values are detected in one particular image in the sequence are replaced by pixels from an image with a more suitable exposure time. To demonstrate the procedure, a sample of ex vivo tissue (lamb's liver) illuminated by the structured lighting pattern was used as the target. Two successive images were then acquired with exposure times of 1.3 ms and 4.0 ms. The high dynamic range image was created by combining these images using an algorithm in Matlab's Image Processing Toolbox (The MathWorks, Inc., USA).
The low exposure time image was used as a base image and its pixel values checked for agreement with the noise and saturation settings. Any pixel with a grayscale value below 10 was considered too noisy and replaced by the corresponding RGB triplet from the longer exposure image, while any saturated pixel (grayscale level greater than 250 in either the red, green or blue colour planes) was not included in the final image. The algorithm then smoothed the boundaries of the resulting image by solving Laplace's equation to remove discontinuities.

3. RESULTS

3.1 Stroboscopic illumination

Figure 3 shows the output from the stroboscopic system. White light illuminated frames provide a clear view of the tissue to the surgeon, while the structured lighting pattern is saved in the background. These images can then be used to reconstruct the surface shape of the tissue in view by finding the centroids of each spot using the wavelength-based segmentation algorithm and triangulating their 3D coordinates.

Figure 3. Synchronised stroboscopic imaging of liver ex vivo showing (a) the white light image displayed on screen, (b) the structured lighting image saved to disk and (c) the pattern centroids computed using the structured lighting algorithm 4 superimposed on the white light image.

After the projected spot coordinates were obtained, they could be overlaid onto the white light image, indicating the region covered by the structured lighting pattern to the surgeon without the distracting presence of the varying spot colours.

3.2 High dynamic range imaging

To maintain sufficient image brightness without saturating pixels, two images of different exposure times were acquired using the high speed camera. At an exposure time of 1.3 ms, Figure 4 (a) shows that the red dots are correctly resolved but the green and blue dots are much dimmer, barely visible above the noise. When the exposure time is increased to 4 ms, as in Figure 4 (b), these dots become visible but the red dots are now saturated. This overexposure makes these dots more difficult to distinguish from each other and, as a result, will introduce errors into the 3D reconstruction.
Figure 4 (c) shows the high dynamic range image created by using the most suitable pixels from each image, subject to the inclusion criteria. This is illustrated in Figures 4 (d) to (f), where the grayscale profiles across two sample spots are presented for the two low dynamic range images and the high dynamic range image. In Figure 4 (d) the blue dot has a maximum grayscale value that is just above the background noise, while the red spot signal is much stronger and occupies more of the grayscale range. When the exposure time is increased, as in Figure 4 (e), the blue spot is almost four times brighter than it was initially, but the red is now saturated. However, since the blue and green channels also contribute to the colour of this red spot, erroneous wavelength values will be calculated in the segmentation algorithm as the ratio of red to green and blue is no longer constant across its diameter. The high dynamic range profile in Figure 4 (f) combines the two images so that the blue spot is brighter and the red spot is not saturated and is correctly in proportion to its green and blue components.
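The exposure-fusion rule of Section 2.3 can be sketched as follows, under the simplifying assumptions that both frames are aligned 8-bit RGB arrays and that the paper's final boundary-smoothing step (solving Laplace's equation) is omitted. The pixel values in the usage example are invented for illustration.

```python
import numpy as np

# Sketch of the HDR merge described in Section 2.3: the short-exposure
# frame is the base image; pixels below the noise floor (grayscale < 10)
# are replaced from the long-exposure frame unless the replacement is
# saturated (> 250 in any colour plane). Laplace smoothing is omitted.
NOISE_FLOOR, SATURATION = 10, 250

def hdr_merge(short_exp, long_exp):
    short_exp = np.asarray(short_exp, dtype=float)
    long_exp = np.asarray(long_exp, dtype=float)
    gray = short_exp.mean(axis=-1)                    # per-pixel grayscale
    too_noisy = gray < NOISE_FLOOR                    # underexposed in base
    saturated = (long_exp > SATURATION).any(axis=-1)  # unusable replacement
    use_long = too_noisy & ~saturated
    out = short_exp.copy()
    out[use_long] = long_exp[use_long]
    return out

# One noisy pixel (a dim blue/green spot) and one well-exposed pixel:
short = np.array([[[5, 2, 1], [120, 60, 30]]])
long_ = np.array([[[40, 16, 8], [255, 200, 90]]])
merged = hdr_merge(short, long_)
assert np.all(merged[0, 0] == [40, 16, 8])    # noisy pixel replaced
assert np.all(merged[0, 1] == [120, 60, 30])  # good pixel kept
```

Note that the second pixel is kept from the base image twice over: it is above the noise floor, and its long-exposure counterpart is saturated in the red plane, which would distort the channel ratios used for wavelength labelling.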
Figure 4. High dynamic range imaging of ex vivo liver tissue. (a) Exposure time = 1.3 ms; red correctly exposed, blue and green noisy. (b) Exposure time = 4 ms; blue and green visible, red overexposed. (c) High dynamic range image; red, green and blue spots visible without saturation. (d)-(f) Grayscale profiles corresponding to the values along the white lines in (a)-(c).

4. DISCUSSION AND CONCLUSIONS

A stroboscopic illumination system for imperceptible structured lighting during MIS has been demonstrated. At full resolution ( pixels), it is possible to display white light illuminated frames at 15 Hz while saving structured lighting images for further processing. Using a previously developed segmentation algorithm, the different wavelength spots in the pattern were segmented from the background tissue and their centroids calculated. These can then be used as the input to the 3D reconstruction algorithm that triangulates their positions. The dual acquisition mode of the system allows the user to overlay the centroids on the white light image with minimal disruption to normal viewing. This is particularly advantageous as it gives the surgeon fine control over the pattern illumination area and allows checking for the presence of occluding tissue and whether or not the correct tissue is being measured. The high speed camera has also been shown to be capable of acquiring multiple images to generate a high dynamic range image of the projected multispectral pattern in order to correct for high absorption by blood in the blue and green range of the spectrum. The result is a low-noise image with features that are more easily segmented than would otherwise be possible in a single exposure low dynamic range image. A limitation of this method is that the low dynamic range images used must be spatially aligned. Therefore, any movement of the tissue must be small in comparison to the acquisition time of the image set.
At full resolution, this system is capable of saving images at 15 Hz, which is potentially quick enough to image the abdomen and bowel as proposed, where small movements related to breathing and peristalsis are present. However, a higher framerate may be necessary to avoid misregistration during sharp endoscope camera movements or faster moving tissue such as the beating heart. The stroboscopic system described here displays white light images and saves structured lighting images for offline processing. Future work will involve optimization of the acquisition software to allow online simultaneous processing of the structured lighting data without the need to save every image. The stability of the high dynamic range images will
also be investigated to validate the robustness of the projected feature segmentation after combination of the multiple low dynamic range images. Adjustment of the beam chopper duty cycle will enable the integration of the high dynamic range images and white light display with a decreased risk of misalignment errors. For example, a 75% duty cycle, rather than the 50% described here, would allow the successive acquisition of three structured light images before being interrupted by a white light image. Finally, the white light illumination side of the system could be made more efficient, as the current air gap proximity coupling loses a lot of light through reflection alone. Use of an ultra-bright LED source may be a viable alternative as it could be switched on and off electronically, removing the need for it to interact with the chopper.

ACKNOWLEDGEMENTS

Funding for this work was provided by an ERC grant, and UK EPSRC and Technology Strategy Board grants EP/E06342X/1 and DT/E011101/1. Danail Stoyanov would like to acknowledge the financial support of a Royal Academy of Engineering/EPSRC Fellowship.

REFERENCES

[1] Kato, S., Fu, K. I., Sano, Y., Fujii, T., Saito, Y., Matsuda, T., Koba, I., Yoshida, S., and Fujimori, T., "Magnifying colonoscopy as a non-biopsy technique for differential diagnosis of non-neoplastic and neoplastic lesions," World J. Gastroenterol., 12(9), (2006).
[2] Stoyanov, D., Mylonas, G. P., Lerotic, M., Chung, A. J., and Yang, G.-Z., "Intra-operative visualizations: perceptual fidelity and human factors," J. Disp. Technol., 4(4), (2008).
[3] Edwards, P. J., King, A. P., Hawkes, D. J., Fleig, O., Maurer, C. R. J., Hill, D. L., Fenlon, M. R., de Cunha, D. A., Gaston, R. P., Chandra, S., Mannss, J., Strong, A. J., Gleeson, M. J., and Cox, T. C., "Stereo augmented reality in the surgical microscope," Stud. Health Technol. Inform., 62, (1999).
[4] Clancy, N. T., Stoyanov, D., Maier-Hein, L., Groch, A., Yang, G.-Z., and Elson, D.
S., "Spectrally-encoded fibre-based structured lighting probe for intraoperative 3D imaging," Biomed. Opt. Express, 2(11), (2011).
[5] Raskar, R., Welch, G., Cutts, M., Lake, A., Stesin, L., and Fuchs, H., "The office of the future: a unified approach to image-based modeling and spatially immersive displays," Proceedings of ACM SIGGRAPH (1998).
[6] Fuchs, H., Cotting, D., Naef, M., and Gross, M., [Methods, systems, and computer program products for imperceptibly embedding structured light patterns in projected color images for display on planar and nonplanar surfaces], The University of North Carolina, USA, Patent No. US 7,182,465 B2 (2007).
[7] Molinier, T., Fofi, D., Salvi, J., and Gorria, P., "2D virtual texture on 3D real object with coded structured light," Proc. of SPIE-IS&T Electronic Imaging, 6813, 68130Q (2008).
[8] Dai, J., and Chung, R., "Head pose estimation by imperceptible structured light sensing," International Conference on Robotics and Automation (2011).
[9] Wu, T. T., and Qu, J. Y., "Optical imaging for medical diagnosis based on active stereo vision and motion tracking," Opt. Express, 15(16), (2007).
[10] Batlle, J., Mouaddib, E., and Salvi, J., "Recent progress in coded structured light as a technique to solve the correspondence problem: a survey," Pattern Recogn., 31(7), (1998).
Novel machine interface for scaled telesurgery S. Clanton, D. Wang, Y. Matsuoka, D. Shelton, G. Stetten SPIE Medical Imaging, vol. 5367, pp. 697-704. San Diego, Feb. 2004. A Novel Machine Interface for
More informationMultispectral. imaging device. ADVANCED LIGHT ANALYSIS by. Most accurate homogeneity MeasureMent of spectral radiance. UMasterMS1 & UMasterMS2
Multispectral imaging device Most accurate homogeneity MeasureMent of spectral radiance UMasterMS1 & UMasterMS2 ADVANCED LIGHT ANALYSIS by UMaster Ms Multispectral Imaging Device UMaster MS Description
More informationCvision 2. António J. R. Neves João Paulo Silva Cunha. Bernardo Cunha. IEETA / Universidade de Aveiro
Cvision 2 Digital Imaging António J. R. Neves (an@ua.pt) & João Paulo Silva Cunha & Bernardo Cunha IEETA / Universidade de Aveiro Outline Image sensors Camera calibration Sampling and quantization Data
More informationBringing Answers to the Surface
3D Bringing Answers to the Surface 1 Expanding the Boundaries of Laser Microscopy Measurements and images you can count on. Every time. LEXT OLS4100 Widely used in quality control, research, and development
More informationFig Color spectrum seen by passing white light through a prism.
1. Explain about color fundamentals. Color of an object is determined by the nature of the light reflected from it. When a beam of sunlight passes through a glass prism, the emerging beam of light is not
More informationModifications of the coherence radar for in vivo profilometry in dermatology
Modifications of the coherence radar for in vivo profilometry in dermatology P. Andretzky, M. W. Lindner, G. Bohn, J. Neumann, M. Schmidt, G. Ammon, and G. Häusler Physikalisches Institut, Lehrstuhl für
More informationFastest high definition Raman imaging. Fastest Laser Raman Microscope RAMAN
Fastest high definition Raman imaging Fastest Laser Raman Microscope RAMAN - 11 www.nanophoton.jp Observation A New Generation in Raman Observation RAMAN-11 developed by Nanophoton was newly created by
More informationColour correction for panoramic imaging
Colour correction for panoramic imaging Gui Yun Tian Duke Gledhill Dave Taylor The University of Huddersfield David Clarke Rotography Ltd Abstract: This paper reports the problem of colour distortion in
More informationExam 4. Name: Class: Date: Multiple Choice Identify the choice that best completes the statement or answers the question.
Name: Class: Date: Exam 4 Multiple Choice Identify the choice that best completes the statement or answers the question. 1. Mirages are a result of which physical phenomena a. interference c. reflection
More informationSingle-photon excitation of morphology dependent resonance
Single-photon excitation of morphology dependent resonance 3.1 Introduction The examination of morphology dependent resonance (MDR) has been of considerable importance to many fields in optical science.
More informationA Short History of Using Cameras for Weld Monitoring
A Short History of Using Cameras for Weld Monitoring 2 Background Ever since the development of automated welding, operators have needed to be able to monitor the process to ensure that all parameters
More informationScopis Hybrid Navigation with Augmented Reality
Scopis Hybrid Navigation with Augmented Reality Intelligent navigation systems for head surgery www.scopis.com Scopis Hybrid Navigation One System. Optical and electromagnetic measurement technology. As
More informationEnFocus Your Upgrade Path to High Performance Intrasurgical OCT
Your Upgrade Path to High Performance Intrasurgical OCT is FDA 510(k) Cleared > Ultra HD OCT extends your microscope s potential with intrasurgical OCT BRILLIANT IMAGES, SUB-SURFACE KNOWLEDGE is an intrasurgical
More informationattocfm I for Surface Quality Inspection NANOSCOPY APPLICATION NOTE M01 RELATED PRODUCTS G
APPLICATION NOTE M01 attocfm I for Surface Quality Inspection Confocal microscopes work by scanning a tiny light spot on a sample and by measuring the scattered light in the illuminated volume. First,
More informationRange Sensing strategies
Range Sensing strategies Active range sensors Ultrasound Laser range sensor Slides adopted from Siegwart and Nourbakhsh 4.1.6 Range Sensors (time of flight) (1) Large range distance measurement -> called
More informationIntroduction to the operating principles of the HyperFine spectrometer
Introduction to the operating principles of the HyperFine spectrometer LightMachinery Inc., 80 Colonnade Road North, Ottawa ON Canada A spectrometer is an optical instrument designed to split light into
More informationNew LEDs improve the quality of illumination of fullcolor holograms recorded with red 660 nm, green 532 nm and blue 440 nm lasers
New LEDs improve the quality of illumination of fullcolor holograms recorded with red 660 nm, green 532 nm and blue 440 nm lasers PHILIPPE GENTET, 1,* YVES GENTET, 2 JINBEOM JOUNG, 1 SEUNG-HYUN LEE 1 1
More informationWe attempted to separate the two dyes by acquiring images using a single excitation wavelength and just two emission wavelengths.
TN437: Spectral Separation of monochrome images using Volocity 4.0 Introduction Spectral Separation is a technique that allows the user to separate images containing data from more than one fluorochrome
More informationVixar High Power Array Technology
Vixar High Power Array Technology I. Introduction VCSELs arrays emitting power ranging from 50mW to 10W have emerged as an important technology for applications within the consumer, industrial, automotive
More informationCRISATEL High Resolution Multispectral System
CRISATEL High Resolution Multispectral System Pascal Cotte and Marcel Dupouy Lumiere Technology, Paris, France We have designed and built a high resolution multispectral image acquisition system for digitizing
More informationDesign of Temporally Dithered Codes for Increased Depth of Field in Structured Light Systems
Design of Temporally Dithered Codes for Increased Depth of Field in Structured Light Systems Ricardo R. Garcia University of California, Berkeley Berkeley, CA rrgarcia@eecs.berkeley.edu Abstract In recent
More informationSupplementary Figure S1: Schematic view of the confocal laser scanning STED microscope used for STED-RICS. For a detailed description of our
Supplementary Figure S1: Schematic view of the confocal laser scanning STED microscope used for STED-RICS. For a detailed description of our home-built STED microscope used for the STED-RICS experiments,
More informationIn-Vivo IMAGING SYSTEMS. A complete line of high resolution optical & X-ray systems for pre-clinical imaging
In-Vivo IMAGING SYSTEMS A complete line of high resolution optical & X-ray systems for pre-clinical imaging In-Vivo Imaging Systems Carestream is a strong, successful, multi-billion dollar, international
More informationSensitive measurement of partial coherence using a pinhole array
1.3 Sensitive measurement of partial coherence using a pinhole array Paul Petruck 1, Rainer Riesenberg 1, Richard Kowarschik 2 1 Institute of Photonic Technology, Albert-Einstein-Strasse 9, 07747 Jena,
More informationEndoscopic laser speckle contrast imaging system using a fibre image guide
Endoscopic laser speckle contrast imaging system using a fibre image guide Lipei Song* and Daniel Elson Hamlyn Centre for Robotic Surgery; Institute of Global Health Innovation and Department of Surgery
More informationOPTICAL SYSTEMS OBJECTIVES
101 L7 OPTICAL SYSTEMS OBJECTIVES Aims Your aim here should be to acquire a working knowledge of the basic components of optical systems and understand their purpose, function and limitations in terms
More informationInstructions for the Experiment
Instructions for the Experiment Excitonic States in Atomically Thin Semiconductors 1. Introduction Alongside with electrical measurements, optical measurements are an indispensable tool for the study of
More informationEvaluation of laser-based active thermography for the inspection of optoelectronic devices
More info about this article: http://www.ndt.net/?id=15849 Evaluation of laser-based active thermography for the inspection of optoelectronic devices by E. Kollorz, M. Boehnel, S. Mohr, W. Holub, U. Hassler
More informationX-Rays and endoscopes
X-Rays and endoscopes 1 What are X-rays? X-ray refers to electromagnetic radiation with a wavelength between 0.01nm - 10nm. increasing wavelength visible light ultraviolet x-ray increasing energy X-rays
More informationDesign of the Diffuse Optical Tomography Device
Design of the Diffuse Optical Tomography Device A thesis submitted in partial fulfillment of the requirements for the degree of Bachelor of Science degree in Physics from the College of William and Mary
More informationFor a long time I limited myself to one color as a form of discipline. Pablo Picasso. Color Image Processing
For a long time I limited myself to one color as a form of discipline. Pablo Picasso Color Image Processing 1 Preview Motive - Color is a powerful descriptor that often simplifies object identification
More informationOptimal Pupil Design for Confocal Microscopy
Optimal Pupil Design for Confocal Microscopy Yogesh G. Patel 1, Milind Rajadhyaksha 3, and Charles A. DiMarzio 1,2 1 Department of Electrical and Computer Engineering, 2 Department of Mechanical and Industrial
More informationStructured-Light Based Acquisition (Part 1)
Structured-Light Based Acquisition (Part 1) CS635 Spring 2017 Daniel G. Aliaga Department of Computer Science Purdue University Passive vs. Active Acquisition Passive + Just take pictures + Does not intrude
More informationQuantitative, spectrally- resolved intraoperative
1 Quantitative, spectrally- resolved intraoperative fluorescence imaging Pablo A. Valdés 1,2,3, Frederic Leblond 1, Valerie L. Jacobs 2, Brian C. Wilson 5, Keith D. Paulsen 1,2,4, & David W. Roberts 2,3,4
More informationToward an Augmented Reality System for Violin Learning Support
Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp
More informationNumerical simulation of a gradient-index fibre probe and its properties of light propagation
Numerical simulation of a gradient-index fibre probe and its properties of light propagation Wang Chi( ) a), Mao You-Xin( ) b), Tang Zhi( ) a), Fang Chen( ) a), Yu Ying-Jie( ) a), and Qi Bo( ) c) a) Department
More informationA novel tunable diode laser using volume holographic gratings
A novel tunable diode laser using volume holographic gratings Christophe Moser *, Lawrence Ho and Frank Havermeyer Ondax, Inc. 85 E. Duarte Road, Monrovia, CA 9116, USA ABSTRACT We have developed a self-aligned
More informationNd:YSO resonator array Transmission spectrum (a. u.) Supplementary Figure 1. An array of nano-beam resonators fabricated in Nd:YSO.
a Nd:YSO resonator array µm Transmission spectrum (a. u.) b 4 F3/2-4I9/2 25 2 5 5 875 88 λ(nm) 885 Supplementary Figure. An array of nano-beam resonators fabricated in Nd:YSO. (a) Scanning electron microscope
More informationThe First True Color Confocal Scanner on the Market
The First True Color Confocal Scanner on the Market White color and infrared confocal images: the advantages of white color and confocality together for better fundus images. The infrared to see what our
More informationDigital Pathology and Tissue-based Diagnosis. How do they differ?
Digital Pathology and Tissue-based Diagnosis. How do they differ? P. Hufnagl Institute of Pathology (Rudolf-Virchow-Haus). Humboldt University, Berlin? 10.12.2014 1 Structure of the talk Possible workflow
More informationBackground Subtraction Fusing Colour, Intensity and Edge Cues
Background Subtraction Fusing Colour, Intensity and Edge Cues I. Huerta and D. Rowe and M. Viñas and M. Mozerov and J. Gonzàlez + Dept. d Informàtica, Computer Vision Centre, Edifici O. Campus UAB, 08193,
More informationECEN. Spectroscopy. Lab 8. copy. constituents HOMEWORK PR. Figure. 1. Layout of. of the
ECEN 4606 Lab 8 Spectroscopy SUMMARY: ROBLEM 1: Pedrotti 3 12-10. In this lab, you will design, build and test an optical spectrum analyzer and use it for both absorption and emission spectroscopy. The
More informationBy Pierre Olivier, Vice President, Engineering and Manufacturing, LeddarTech Inc.
Leddar optical time-of-flight sensing technology, originally discovered by the National Optics Institute (INO) in Quebec City and developed and commercialized by LeddarTech, is a unique LiDAR technology
More information(N)MR Imaging. Lab Course Script. FMP PhD Autumn School. Location: C81, MRI Lab B0.03 (basement) Instructor: Leif Schröder. Date: November 3rd, 2010
(N)MR Imaging Lab Course Script FMP PhD Autumn School Location: C81, MRI Lab B0.03 (basement) Instructor: Leif Schröder Date: November 3rd, 2010 1 Purpose: Understanding the basic principles of MR imaging
More informationLecture 19: Depth Cameras. Kayvon Fatahalian CMU : Graphics and Imaging Architectures (Fall 2011)
Lecture 19: Depth Cameras Kayvon Fatahalian CMU 15-869: Graphics and Imaging Architectures (Fall 2011) Continuing theme: computational photography Cheap cameras capture light, extensive processing produces
More informationWhy and How? Daniel Gitler Dept. of Physiology Ben-Gurion University of the Negev. Microscopy course, Michmoret Dec 2005
Why and How? Daniel Gitler Dept. of Physiology Ben-Gurion University of the Negev Why use confocal microscopy? Principles of the laser scanning confocal microscope. Image resolution. Manipulating the
More informationIII III 0 IIOI DID IIO 1101 I II 0II II 100 III IID II DI II
(19) United States III III 0 IIOI DID IIO 1101 I0 1101 0II 0II II 100 III IID II DI II US 200902 19549A1 (12) Patent Application Publication (10) Pub. No.: US 2009/0219549 Al Nishizaka et al. (43) Pub.
More informationParallax-Free Long Bone X-ray Image Stitching
Parallax-Free Long Bone X-ray Image Stitching Lejing Wang 1,JoergTraub 1, Simon Weidert 2, Sandro Michael Heining 2, Ekkehard Euler 2, and Nassir Navab 1 1 Chair for Computer Aided Medical Procedures (CAMP),
More informationHUMAN Robot Cooperation Techniques in Surgery
HUMAN Robot Cooperation Techniques in Surgery Alícia Casals Institute for Bioengineering of Catalonia (IBEC), Universitat Politècnica de Catalunya (UPC), Barcelona, Spain alicia.casals@upc.edu Keywords:
More informationFSI Machine Vision Training Programs
FSI Machine Vision Training Programs Table of Contents Introduction to Machine Vision (Course # MVC-101) Machine Vision and NeuroCheck overview (Seminar # MVC-102) Machine Vision, EyeVision and EyeSpector
More informationHigh-sensitivity. optical molecular imaging and high-resolution digital X-ray. In-Vivo Imaging Systems
High-sensitivity optical molecular imaging and high-resolution digital X-ray In-Vivo Imaging Systems In vivo imaging solutions available in several packages Carestream Molecular Imaging offers a selection
More informationWireless In Vivo Communications and Networking
Wireless In Vivo Communications and Networking Richard D. Gitlin Minimally Invasive Surgery Wirelessly networked modules Modeling the in vivo communications channel Motivation: Wireless communications
More informationYou won t be able to measure the incident power precisely. The readout of the power would be lower than the real incident power.
1. a) Given the transfer function of a detector (below), label and describe these terms: i. dynamic range ii. linear dynamic range iii. sensitivity iv. responsivity b) Imagine you are using an optical
More informationWavefront sensing by an aperiodic diffractive microlens array
Wavefront sensing by an aperiodic diffractive microlens array Lars Seifert a, Thomas Ruppel, Tobias Haist, and Wolfgang Osten a Institut für Technische Optik, Universität Stuttgart, Pfaffenwaldring 9,
More informationScience 8 Unit 2 Pack:
Science 8 Unit 2 Pack: Name Page 0 Section 4.1 : The Properties of Waves Pages By the end of section 4.1 you should be able to understand the following: Waves are disturbances that transmit energy from
More information