Adaptive preprocessing of scanned documents


10th WSEAS Int. Conf. on MATHEMATICAL METHODS AND COMPUTATIONAL TECHNIQUES IN ELECTRICAL ENGINEERING (MMACTEE'08), Sofia, Bulgaria, May 2-4, 2008 (ISSN: 1790-5117, ISBN: 978-960-6766-60-)

Adaptive preprocessing of scanned documents

ROUMEN KOUNTCHEV, Radio-communications Dept., Technical University of Sofia, Bul. Kl. Ohridsky 8, Sofia 1000, BULGARIA
VLADIMIR TODOROV, T&K Engineering Co., Mladost 3, POB Sofia 7, BULGARIA
ROUMIANA KOUNTCHEVA, T&K Engineering Co., Mladost 3, POB Sofia 7, BULGARIA

Abstract: A new approach for the preprocessing of scanned documents, aimed at their efficient archiving with retained visual quality, is presented in this paper. The preprocessing is based on image filtration, which consists of Gaussian noise removal with an adaptive digital Wiener filter, followed by correction of the uneven background illumination with 2D digital filtration. The texts (graphics) are then extracted using a special method for image segmentation. The presented approach permits further adaptive processing of the extracted texts and their background, as a result of which higher compression with retained image quality is obtained. The method ensures very good results when old documents with a complicated image structure are processed and archived.

Key words: image filtration, image segmentation, image archiving, lossless compression

1 Introduction
In everyday practice, large amounts of paper documents have to be archived and stored. The ability to ensure fast access to the authentic information via the Internet is of great importance for up-to-date digital libraries, especially when historical archives are concerned. Current digital technologies permit the creation of electronic copies of documents and their archiving. For this, the document is scanned and the digital image obtained is compressed and saved. The compression is usually performed by transforming the image to the JPEG format. Efficient compression is of high importance, and the preprocessing affects the compression ratio to a high degree.
The standards for still image compression, JPEG and JPEG 2000 [1], are very efficient when natural halftone images are processed, but for high compression ratios the image quality of texts and graphics is significantly deteriorated. The efficient compression of document images requires a more flexible approach, adapted to the data specifics. More problems arise when compound images (comprising texts and pictures) are processed. The best approach is to process the two kinds of content in different ways. In many cases the compression of text and its background requires some kind of adaptive image filtration and segmentation. The most frequently used techniques for noise filtration are based on non-linear (adaptive median, rank, or morphological) filters [2]. Such filters are usually used to remove salt-and-pepper noise. Their main disadvantage is that their performance is unpredictable to some degree, and applying them to texts can result in inadmissible distortions. The processing then usually continues with some kind of adaptive segmentation based on recursive merging operations [3]. The aim of this paper is to develop a new approach for the preprocessing of grayscale images, as a result of which a higher compression ratio with retained image quality is obtained. The preprocessing comprises adaptive filtration, followed by image segmentation aimed at extracting the text from the image background. The work is arranged as follows: Section 2 presents the algorithms used for the image filtration; Section 3, the method for image segmentation; Section 4 presents some of the experimental results obtained; and the Conclusion (Section 5) points out the advantages of the method, its future development, and the possible application areas.

2 Image filtration
Two kinds of filtration, aimed at the efficient preprocessing of scanned documents depending on the image content, are proposed in this paper. The first one is used for noise removal.
The noise in scanned documents usually represents small irregularities of the white paper. For this operation, the locally adaptive filtration [4], which efficiently removes additive Gaussian noise, is most suitable. The texts (graphics) in the image are not affected, because their basic characteristics (dimensions, area, shape, contrast, etc.) are quite different from those of the noise. The performance of the locally adaptive digital Wiener filter, modified for this application, is presented below:

$$y(i,j) = \begin{cases} \mu(i,j) + \dfrac{\sigma^2(i,j) - \nu^2}{\sigma^2(i,j)}\,[x(i,j) - \mu(i,j)], & \text{if } \sigma^2(i,j) \ge \nu^2; \\ \mu(i,j), & \text{if } \sigma^2(i,j) < \nu^2, \end{cases} \qquad (1)$$

where x(i,j) and y(i,j) are the pixels of the original and the filtered image, respectively. Here

$$\mu(i,j) = \frac{1}{(2p+1)(2q+1)} \sum_{m=-p}^{p} \sum_{n=-q}^{q} x(i+m,\, j+n), \qquad (2)$$

$$\sigma^2(i,j) = \frac{1}{(2p+1)(2q+1)} \sum_{m=-p}^{p} \sum_{n=-q}^{q} [x(i+m,\, j+n) - \mu(i,j)]^2, \qquad (3)$$

$$\nu^2 = \frac{1}{M_1 M_2} \sum_{i=1}^{M_1} \sum_{j=1}^{M_2} \sigma^2(i,j). \qquad (4)$$

In these relations, μ(i,j) and σ²(i,j) represent respectively the mean value and the variance of the pixel (i,j) in a local window of size (2p+1)×(2q+1); p and q are positive integer numbers; M₁ and M₂ represent the size of the image matrix; and ν² is the noise variance.

The second step of the filtration is aimed at the correction of the uneven background illumination. This processing is also used when old documents are transferred to electronic form and their background compression requires special attention. In this case such an operation is necessary because, as a result, the next step (the text detection and segmentation) is simplified. The method for background processing offered here is based on a 2D nonlinear digital filter [5], modified for this application. The filter performance is presented by the equation:

$$z(i,j) = x(i,j) - MS_B\{x(i,j)\} + \mu. \qquad (5)$$

Here MS_B{x(i,j)} is a morphological smoothing filter, defined by the equation:

$$MS_B\{x(i,j)\} = \min(\max(\max(\min(x[i-m,\, j-n])))) \quad \text{for } \{m,n\} \in [B], \qquad (6)$$

where the term [B] is a rectangular structuring element represented by a matrix of size (2p+1)×(2q+1), whose elements are all equal to zero.
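The two filtration steps described above (noise removal with the locally adaptive Wiener filter, and morphological background correction) can be sketched in Python. This is an illustrative NumPy/SciPy re-implementation, not the authors' DefView software; the default window sizes and the use of the global image mean for μ in Eq. (5) are assumptions:

```python
import numpy as np
from scipy.ndimage import grey_closing, grey_opening, uniform_filter

def adaptive_wiener(x, win=5):
    """Locally adaptive Wiener filter, Eqs. (1)-(4)."""
    x = x.astype(np.float64)
    mu = uniform_filter(x, win)                  # local mean, Eq. (2)
    var = uniform_filter(x * x, win) - mu * mu   # local variance, Eq. (3)
    var = np.maximum(var, 0.0)
    noise = var.mean()                           # noise variance estimate, Eq. (4)
    # Gain is zero wherever var < noise, so those pixels get y = mu, as in Eq. (1)
    gain = np.maximum(var - noise, 0.0) / np.maximum(var, 1e-12)
    return mu + gain * (x - mu)

def background_correct(x, win=3):
    """Uneven background illumination correction, Eqs. (5)-(6)."""
    x = x.astype(np.float64)
    # Morphological smoothing MS_B: min/max filtering with a flat rectangular
    # structuring element (a grey-level opening followed by a closing)
    smooth = grey_closing(grey_opening(x, size=(win, win)), size=(win, win))
    return x - smooth + x.mean()                 # Eq. (5), with mu = global mean
```

Image borders are handled here by SciPy's default reflection mode, which plays a role similar to the replication padding the paper uses to avoid edge artifacts.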
This matrix is used to set a symmetrical window around the processed pixel (i,j), which afterwards is replaced by the pixel with the minimum or maximum brightness value. This value is used to substitute the value of x(i,j) in the consecutive stages of the filtration, performed recursively. The parameters p, q of the morphological smoothing filter are defined in accordance with the parameters of the processed background (size of the objects, etc.).

The described filtration usually causes some specific distortions at the image edges. In order to avoid this, the image matrix should be artificially enlarged [1] by adding pixels in both directions (horizontal and vertical). As a result, the original matrix of size M₁ × M₂ becomes of size (M₁ + 2p) × (M₂ + 2q). The easiest way is to add zeros in both directions, but the results obtained usually contain distortions, called zero-padding artifacts. In order to avoid this, image replication [5] of a size equal to that of the filter window side was used in the method implementation.

3 Image segmentation based on the modified triangle algorithm
The image segmentation is done on the basis of the image histogram analysis. The histograms of images filtered in accordance with the presented algorithms usually have only one maximum, which corresponds to the image background, so the algorithms based on recursive merging operations [3] are not suitable. For this reason, the text segmentation is based on the so-called triangle algorithm [5], modified for this application. The segmentation threshold is determined by the following operations:

1. Calculation of the image histogram H(x) for x = 0, 1, ..., Q-1, where Q is the number of grey levels;
2. In the image histogram H(x), three points are defined: a first point (H₀, x₀), the maximum of the histogram, which usually corresponds to the mean value of the corrected background; and two points (H₁, x₁) and (H₂, x₂), which are defined by the relations:

$$H(x_1) = H(x_2) = \psi H_0, \quad H(x_1) < H_0 > H(x_2), \qquad (7)$$

for x₁ < x₀ < x₂. The value of the parameter ψ is usually set to a small fraction of unity;

3. The equations of two straight lines are defined, which join the two pairs of points (H₁, x₁), (H₀, x₀) and (H₂, x₂), (H₀, x₀), respectively. These equations are defined by the well-known relation:

$$Ax + BH + C = 0, \qquad (8)$$

where for the first line:

$$A = H_1 - H_0, \quad B = x_0 - x_1, \quad C = x_1 H_0 - x_0 H_1; \qquad (9)$$

and for the second one:

$$A = H_2 - H_0, \quad B = x_0 - x_2, \quad C = x_2 H_0 - x_0 H_2. \qquad (10)$$

4. The distance to the first and the second line, respectively, is calculated for each point (H, x) of the histogram, using the relation:

$$D(x) = \frac{|Ax + BH(x) + C|}{\sqrt{A^2 + B^2}}, \qquad (11)$$

where A, B, and C are defined by the equation of the corresponding straight line.

5. The value θ of the variable x is defined, for which the distance D(θ) is maximum. This value is the calculated segmentation threshold, for which from Eq. (11) is obtained:

$$\frac{H_1 - H_0}{x_1 - x_0} = H(\theta_1 + 1) - H(\theta_1) \qquad (12)$$

for the range x₁ ≤ θ₁ ≤ x₀, and correspondingly:

$$\frac{H_2 - H_0}{x_2 - x_0} = H(\theta_2 + 1) - H(\theta_2) \qquad (13)$$

for the range x₀ ≤ θ₂ ≤ x₂. Here θ₁ and θ₂ are the image segmentation thresholds, used to separate respectively the dark text and the light parts in the image.

6. The image is then binarized in accordance with the relations:

$$p_1(i,j) = \begin{cases} 1, & \text{if } x(i,j) \le \theta_1; \\ 0, & \text{if } x(i,j) > \theta_1, \end{cases} \qquad (14)$$

$$p_2(i,j) = \begin{cases} 1, & \text{if } x(i,j) \ge \theta_2; \\ 0, & \text{if } x(i,j) < \theta_2. \end{cases} \qquad (15)$$

The binary image p₁(i,j) contains the detected and separated dark signs in the image (in this case, texts and graphics), and p₂(i,j) the light ones. The images obtained are saved and used for further processing. In case the image background is not white paper but some kind of picture (such images are usually obtained when archive documents are processed), the image processing, in accordance with the algorithm developed by the research team, continues with the following operations:

7. The binary image p₁(i,j), which contains the text only, is used as a mask, and each pixel of the original image which corresponds to the mask's dark parts is substituted by a preset gray level (usually the most frequent value, defined by the image histogram).
8. The obtained image is then processed with lossy compression, and the compressed image is saved.
9. The extracted text/graphics image is compressed with lossless compression, and the obtained compressed image is saved.
10. To recover the image, the two images are processed together so as to compose the original one.

The detailed presentation of operations 7-10 is not an object of this paper. For the compression, a special format developed by the research team is used.

4 Experimental results
The presented algorithms for filtration and segmentation of scanned documents were implemented in special software, DefView (Visual C, Microsoft environment), developed by the research team. For the experiments, different kinds of scanned images were used: printed documents, old documents, etc. The preprocessing was done with the two consecutive filtrations, or with only one of them. Some of the test images and the obtained results are shown below. In Fig. 1 is presented one of the test images: an enlarged part of a scanned printed document, grayscale (8 bpp). In the image are easily seen many dark points, which usually exist in scanned documents and, as a rule, represent small paper irregularities. These points are not seen when the image is in real size, but their presence affects significantly the

compression ratio obtained: a smaller number of such points enhances the compression ratio, which is very important for the efficient archiving of the processed images. Together with this, the image quality is very important as well. In order to achieve maximum quality, the example test image was compressed losslessly with software based on the JPEG 2000 standard (LuraTech Algovision, www.algovision-luratech.com) and with the special software for lossless compression developed by the research team (TKView) [6,7]. The scanned document was processed with the locally adaptive digital filter (Fig. 2). The filter parameters were: brightness threshold 65 and window width 5.

Fig. 1. Test image: enlarged part of a scanned document, grayscale, 8 bpp.
Fig. 2. The image from Fig. 1 after noise filtration.
Fig. 3. The image from Fig. 1 after lossy compression with JPEG 2000-based software.
Fig. 4. The image from Fig. 1 after high-quality JPEG compression (quality factor 90, MS Photo Editor).
Fig. 5. The image from Fig. 2 after JPEG compression (quality factor 90, MS Photo Editor).

The compression ratio (CR) results obtained after lossless compression with the two software products are presented in Table 1.

TABLE 1. Compression ratio results
Image  | TKView (LS): CR | File size | LuraTech: CR | File size
Fig. 1 | ,77             | 843       | ,0           | 4 00
Fig. 2 | 3,7             | 543       | ,56          | 3 9

After lossy JPEG 2000 compression with the same compression ratio (CR = 3,7), the background of the restored image (Fig. 3) has noticeable changes. The results obtained after lossy compression with MS Photo Editor (JPEG-based) software are presented in Table 2. The image from Fig. 1 after high-quality JPEG compression (quality factor 90, MS Photo Editor) is shown in Fig. 4. As a result of the high quality required, the file size was enlarged, and the visual image quality is not good (the number of visible dark points is increased). The result of the JPEG compression for the image in Fig. 2 is shown in Fig. 5. This image was compressed, but the compression ratio is smaller than that of TKView.

TABLE 2. Compression ratio results
Image  | DefView (LS): CR | File size | MS Photo Ed.: CR | File size
Fig. 1 | ,77              | 843       | 0,90             | 6 537
Fig. 2 | 3,7              | 543       | ,09              | 5469

Another test image is shown in Fig. 6. This is a part of a scanned old document. The image is of size 568 × 340 pixels, grayscale, 8 bpp. It was filtered with the two filters presented in the paper, and after that the image segmentation was done. The processing was performed with the special software DefView, developed by the research team. The result of the noise filtration is shown in Fig. 7 and the final result in Fig. 8. As a result, two pictures were obtained: the one in Fig. 8, which contains the extracted text, and a second one, with the image background, in Fig. 9. The two parts of the original image were processed individually: the part with the text with lossless, and the background with lossy compression.
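The threshold selection of the modified triangle algorithm (Section 3) can be sketched in Python. This is an illustrative NumPy version, not the DefView code: it takes the nearest histogram bins on each side of the peak that fall to ψH₀ or below as (H₁, x₁) and (H₂, x₂), and the default ψ = 0.2 is an assumed value:

```python
import numpy as np

def triangle_thresholds(img, psi=0.2):
    # Step 1: histogram of an 8-bit grayscale image
    h, _ = np.histogram(img, bins=256, range=(0, 256))
    x0 = int(np.argmax(h))          # (H0, x0): the histogram maximum
    level = psi * h[x0]
    # Step 2: nearest bins on each side of the peak at or below psi*H0
    left = np.nonzero(h[:x0] <= level)[0]
    right = np.nonzero(h[x0 + 1:] <= level)[0] + x0 + 1
    x1 = int(left[-1]) if left.size else 0
    x2 = int(right[0]) if right.size else 255

    def farthest(xa, xb):
        # Steps 3-5: distance from each histogram point to the straight
        # line through (xa, H(xa)) and (xb, H(xb)), Eqs. (8)-(11)
        A = float(h[xa] - h[xb])
        B = float(xb - xa)
        C = float(xa * h[xb] - xb * h[xa])
        xs = np.arange(min(xa, xb), max(xa, xb) + 1)
        d = np.abs(A * xs + B * h[xs] + C) / np.hypot(A, B)
        return int(xs[np.argmax(d)])

    theta1 = farthest(x1, x0)       # dark-text threshold, Eq. (12)
    theta2 = farthest(x0, x2)       # light-object threshold, Eq. (13)
    return theta1, theta2
```

Binarization then follows Eqs. (14)-(15): pixels at or below θ₁ form the dark-text mask, pixels at or above θ₂ the light-object mask.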

The compression ratios obtained for the two images (Figs. 8 and 9) are presented in Table 3. The lossy compression was performed with a special method for image compression, developed by the research team and based on the so-called Inverse Difference Pyramid (IDP) decomposition [8], implemented in TKView.

Fig. 6. A part of an old document.
Fig. 7. The image from Fig. 6 after noise filtration.
Fig. 8. The image from Fig. 6 after background illumination correction and segmentation.
Fig. 9. The background after text substitution with a gray value.
Fig. 10. The recovered test image from Fig. 6.

The filter parameters were: for filter 1, brightness threshold 48 and filter window width 5; for filter 2, filter window width 3.

TABLE 3. Compression results
Image  | Compression | CR    | PSNR [dB] | File size
Fig. 8 | Lossless    | 22,60 | Inf.      | 8 544
Fig. 9 | Lossy       | 6,2   | 28,4      | 31 100
Total size: 39 644

The compression ratio for Fig. 8 (the extracted text) is very high. The compression is lossless, and the image Peak Signal-to-Noise Ratio (PSNR) is infinity. The background image was compressed with lossy compression, and its PSNR is 28,4 dB. The restored image is shown in Fig. 10. The relatively low quality of the image background is not noticeable, because the image structure is suitable for this, and the text quality is not changed. The high CR for the extracted text image is a result of the algorithm for lossless data compression, developed by the research team. For comparison, the same image was compressed with JPEG 2000-based software (LuraTech Algovision). The results obtained are presented in Table 4 below.

TABLE 4. Compression results
Image  | Compression | CR   | PSNR [dB] | File size
Fig. 6 | Lossy       | 4,76 | 28,57     | 40 540

The compressed image size is more than 40 KB and the quality of the whole image is 28,57 dB, while the new method presented in this paper ensures lossless compression for the text in the image, and the compressed file size is smaller.
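The figures of merit reported in Tables 1-4 can be computed as in the following sketch (8-bit grayscale images and file sizes in bytes are assumed; CR is the ratio of uncompressed to compressed size):

```python
import numpy as np

def compression_ratio(uncompressed_bytes, compressed_bytes):
    # CR > 1 means the file shrank; CR < 1 (as for MS Photo Editor in
    # Table 2) means the "compressed" file is larger than the original.
    return uncompressed_bytes / compressed_bytes

def psnr(original, restored):
    # Peak Signal-to-Noise Ratio for 8-bit images; identical images
    # (lossless coding) give infinity, as reported in Table 3.
    diff = original.astype(np.float64) - restored.astype(np.float64)
    mse = np.mean(diff * diff)
    if mse == 0.0:
        return float("inf")
    return 10.0 * np.log10(255.0 ** 2 / mse)
```

For the 568 × 340 test image (193 120 bytes uncompressed), a 40 540-byte file gives CR ≈ 4,76, matching Table 4.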

5 Conclusion
The software implementation of the new approach for preprocessing of scanned documents, presented in this paper, proved the method's efficiency. The ability to perform adaptive segmentation and to compress the text and the background in the most suitable way ensures efficient compression of the processed documents. The two kinds of compression used for this (lossless and lossy) were developed by the research team and were used for the creation of a new image format. The main advantages of the method are:
- The method offers adaptive preprocessing of the main objects in scanned document images: the text and the background;
- The presented preprocessing ensures higher compression for the most popular compression algorithms (the experiments were performed with JPEG and JPEG 2000): as a result of the preprocessing, the CR was increased;
- The presented preprocessing offers better results for the image compression based on the IDP decomposition, because it suits very well the statistics of the compressed image data;
- The creation of two different pictures in the process of image processing is not noticeable for the user: the final result is one compressed file only. This problem is solved by the software implementation of the presented method;
- Compared with the most powerful method for still image compression, the JPEG 2000 standard, the new method for preprocessing of scanned documents, followed by the special compression developed by the research team, offers higher compression together with unchanged quality for the text in the image and visually retained quality for the image background.

The future development of the method will continue with research based on the use of the adaptive IDP decomposition together with lossless data compression, aimed at the processing of color images.
In this case the IDP decomposition will permit the brightness processing and the image segmentation to be done first, while the color information will be processed independently. The expected results are a higher compression ratio and image quality. The future application areas for the new method are:
- The creation of a special method for archiving of scanned documents, based on the new image format, aimed at application in digital libraries, presentations of training courses, distance learning, etc.;
- Efficient processing of color and multispectral images of any kind, aimed at the efficient archiving of image databases. The method for image filtration and segmentation could also be used for contour extraction of a sign-language interpreter and, as a result, for the preparation of special lectures for distance-learning courses aimed at deaf or hearing-impaired students; the research team has already obtained very good results in this area.

Acknowledgement: This paper was supported by the National Fund for Scientific Research of the Bulgarian Ministry of Education and Science (Contract VU-I 305/2007).

References:
[1] R. Gonzalez, R. Woods. Digital Image Processing. Prentice Hall, 2002.
[2] G. Arce. Nonlinear Signal Processing: A Statistical Approach. Wiley Interscience, 2005.
[3] J. Delon, A. Desolneux, J. L. Lisani, A. Petro. A Nonparametric Approach for Histogram Segmentation. IEEE Transactions on Image Processing, Vol. 16, No. 1, 2007, pp. 253-261.
[4] J. Lim. Two-Dimensional Signal and Image Processing. Prentice Hall, 1990.
[5] I. Young, J. Gerbrands, L. van Vliet. Image Processing Fundamentals. CRC Press, 2000.
[6] R. Kountchev, Vl. Todorov, R. Kountcheva. Lossless Compression of Graphics and Contour Images, Based on WHT and ARLC. Proc. of the Technical University of Sofia, Vol. 55, 2005, pp. 87-94.
[7] R. Kountchev, Vl. Todorov, M. Milanova, R. Kountcheva. Document Image Compression with IDP and Adaptive RLE. The 32nd Annual Conference of the IEEE Industrial Electronics Society (IECON'06), Paris, France, Nov. 2006.
[8] R. Kountchev, M. Milanova, C. Ford, R. Kountcheva. Multi-layer Image Transmission with Inverse Pyramidal Decomposition. In: Computational Intelligence for Modelling and Predictions, S. Halgamuge, L. Wang (Eds.), Vol. 2, Chapter No. 3, Springer-Verlag, Berlin Heidelberg, 2005.