Empirical Evaluation of Visible Spectrum Iris versus Periocular Recognition in Unconstrained Scenario on Smartphones

Kiran B. Raja*, R. Raghavendra*, Christoph Busch*
* Norwegian Biometric Laboratory, Gjøvik University College, Norway
Hochschule Darmstadt - CASED, Haardtring 100, 64295 Darmstadt, Germany
{kiran.raja, raghavendra.ramachandra, christoph.busch}@hig.no

Abstract — Smartphones are increasingly used as biometric sensors for many authentication applications, owing to their computational ability and the high-resolution cameras that can capture biometric information. The objective of this paper is to assess the performance of iris versus periocular recognition on smartphones under non-ideal conditions (changes of illumination, highly pigmented irises, shadows on the iris pattern) for real-life verification in the visible spectrum. We introduce various protocols for real-life verification scenarios using smartphones for iris and periocular recognition. Further, we also study the verification performance when the enrollment and probe data originate from different smartphones. From the extensive set of experiments conducted on a publicly available smartphone database, it can be observed that the information from the periocular region provides substantially better recognition accuracy in cross-sensor and varying-illumination scenarios than the iris under the same conditions.

I. INTRODUCTION

Key players in the smartphone market, such as Apple and Motorola, are advocating robust security by employing biometric characteristics for authentication, and the smartphones produced by these vendors have integrated biometric sensors. Along the same lines, academic research has investigated the use of smartphone cameras as biometric sensors. Raghavendra et al. [17] have successfully demonstrated the use of smartphones for real-life verification by exploiting 2D/3D face biometrics. In similar terms, other works have explored contactless capture of fingerprints for authentication [16], [21] and of knuckleprints for identification [3]. Another well-established biometric characteristic known for reliable performance is the unique texture pattern present in the iris. Traditional biometric systems employ Near-Infra-Red (NIR) light to resolve the texture of the iris. Recent advancements proving the feasibility of imaging the iris pattern in visible light have sparked interest in using different cameras for iris imaging [9], [15]. In the light of these developments in visible-spectrum iris recognition, smartphones are being actively probed for their potential use as iris imaging devices [5], [10]. The limited visibility of the iris texture in the visible spectrum is a major limiting factor for robust performance [8]. However, one has to note that when imaging the iris, the region surrounding the iris is also captured, and this constitutes the periocular region. There have been various works reporting the use of periocular information for reliable verification accuracy [18], [14], [2]. The supplementary information from the eye region can be used to boost the verification performance [18], [7].

[Fig. 2: Challenging samples in the iris image database; the second row shows the unwrapped iris of each sample (a)-(d).]

Figure 2 illustrates the challenges of unconstrained iris recognition on smartphones. Figure 2 (a)-(b) shows challenging samples in which the iris texture is not visible in the captured images.
Figure 2 (c)-(d) presents images with better visibility of the texture. The second row of Figure 2 shows the unwrapped irises, illustrating the visibility of the texture. In the presence of challenging samples such as those indicated in Figure 2 (a)-(b), it is difficult to obtain reliable verification performance. Since the iris is an integral part of the periocular region, it is interesting to study the verification performance of iris and periocular recognition independently, especially when the iris pattern is severely degraded, as in the case of the non-ideal acquisition illustrated in Figure 2. In this work, we provide an extensive evaluation of iris and periocular recognition under non-ideal conditions in the visible spectrum. Further, we also present an extensive analysis of cross-sensor comparisons and their implication on the verification performance. We have considered a publicly available and relatively large-scale (1600 images) database from BIPLab, captured using two different smartphones [5], [1]. This work also evaluates various score-level fusion schemes for iris and periocular features independently. The rest of the paper is organized as follows: Section II gives a brief overview of the proposed iris recognition framework.

Section III provides the details of the database, the experiments and the proposed evaluation protocols. Section IV discusses the results and Section V provides the concluding remarks from this work.

[Fig. 1: Iris and periocular recognition framework. The iris pipeline proceeds from the eye image through segmentation, normalization and feature extraction with a pool of BSIF filters to SRC-based comparison, match score and verification; the periocular pipeline applies the pool of BSIF filters and SRC comparison directly to the periocular region.]

II. PROPOSED FRAMEWORK FOR IRIS RECOGNITION

As illustrated in Figure 1, the iris recognition framework begins with the acquisition of the image. The acquired image usually contains regions of the face and the background, which often lead to wrong segmentation. In order to avoid segmentation failures, it is essential to locate the eye region first. In this work, we have employed the Haar cascade based eye detector implemented in Matlab [23]. The localized eye region is further processed to segment the iris and the pupil. Due to the unconstrained nature of the imaging, the diameter of the iris varies largely from one image to another. In order to overcome this, we have employed a saliency and diffusion based approach to approximate the radius of the iris [8]. The approximated iris radius is provided as a priori information to OSIRIS v4.1 [22]. The choice of OSIRIS v4.1 is based on the robustness of its performance even for visible-spectrum iris recognition [18], [9]. The segmented iris is further normalized to a fixed dimension of 512 × 64 pixels using Daugman's rubber-sheet model [4]. The normalized iris is then used to extract the texture features that are employed in verification. Figure 3 illustrates an iris segmented using OSIRIS v4.1 and normalized using Daugman's rubber-sheet model [4].

[Fig. 3: Segmentation and normalization of the iris.]
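To make the normalization step concrete, the following is a minimal Python/NumPy sketch of Daugman's rubber-sheet unwrapping to a 512 × 64 grid. The paper itself relies on OSIRIS v4.1 and Matlab; this sketch, including the function name and the assumption that the pupil and iris circles are already available from segmentation, is ours:

import numpy as np

def rubber_sheet_normalize(eye_img, pupil_xy, pupil_r, iris_xy, iris_r,
                           width=512, height=64):
    """Unwrap the iris annulus into a fixed-size rectangle (Daugman's
    rubber-sheet model). The circle parameters are assumed to come from
    a prior segmentation step (e.g. OSIRIS v4.1)."""
    thetas = np.linspace(0, 2 * np.pi, width, endpoint=False)
    radii = np.linspace(0, 1, height)
    out = np.zeros((height, width), dtype=eye_img.dtype)
    for j, theta in enumerate(thetas):
        # boundary points on the pupil and iris circles along this angle
        xp = pupil_xy[0] + pupil_r * np.cos(theta)
        yp = pupil_xy[1] + pupil_r * np.sin(theta)
        xi = iris_xy[0] + iris_r * np.cos(theta)
        yi = iris_xy[1] + iris_r * np.sin(theta)
        for i, r in enumerate(radii):
            # linear interpolation between the two boundaries
            x = int(round((1 - r) * xp + r * xi))
            y = int(round((1 - r) * yp + r * yi))
            if 0 <= y < eye_img.shape[0] and 0 <= x < eye_img.shape[1]:
                out[i, j] = eye_img[y, x]
    return out

The nested loops favor clarity over speed; a practical implementation would precompute the sampling grid and interpolate in one vectorized step.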
When the image of the eye is captured, the region close to the eye is also imaged, which provides the periocular information. As the information presented by the periocular region is unique, it can also be employed for verification [14]. Thus, the second recognition pipeline uses periocular features for verification, as depicted in Figure 1.

A. Feature Extraction and Verification

Various feature extraction techniques have been explored on this database in earlier works, including [4], [13], [12], [11], [19], [20], [18]. Out of all the feature extraction techniques employed on this database as benchmarks, the Local Binary Pattern with Sparse Reconstruction Classifier (LBP-SRC) has proven to be the best technique, with the lowest Equal Error Rate (EER), for same-sensor verification, while 2D Gabor filters have provided the lowest EER in the cross-sensor verification scenario. Thus, in this work we follow the same protocol, so that the presented results can be compared against the benchmark results obtained earlier on this database [8]. In this work, we explore Binarized Statistical Image Features (BSIF) for feature extraction by employing a pool of filters, along with SRC for classification, motivated by the results presented in earlier works on both visible iris and periocular information [7].

1) Pool of BSIF filters: BSIF features have been extensively studied for their performance in iris and periocular recognition in the visible spectrum and have demonstrated consistently robust performance [7]. Motivated by this superior performance, we have explored BSIF features for smartphone based iris and periocular recognition. Unlike previous approaches, in this work we propose to use a pool of filters of various dimensions. Specifically, we have employed filters of 9 × 9, 11 × 11, 13 × 13, 15 × 15 and 17 × 17 pixels to obtain the features. In order to strengthen the features for robust verification, we also explore bit-depth encodings ranging from 8 to 12 bits, as mentioned in [6]. The pool of filters formed with the various dimensions and bit encodings provides robust features.
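A sketch of how such a pool of BSIF features might be computed is shown below (Python; ours, not the authors' implementation). It assumes the pre-learned ICA filter files from the public BSIF distribution of Kannala and Rahtu [6] are available locally under the hypothetical path texturefilters/, with the variable name ICAtextureFilters used in that distribution:

import numpy as np
from scipy.io import loadmat
from scipy.signal import convolve2d

FILTER_SIZES = [9, 11, 13, 15, 17]   # filter dimensions k x k
BIT_DEPTHS = [8, 9, 10, 11, 12]      # number of binarized responses

def bsif_histogram(img, ica_filters):
    """Binarize the ICA filter responses at zero, pack them into a
    per-pixel code, then pool the codes into a normalized histogram."""
    n_bits = ica_filters.shape[2]
    code = np.zeros(img.shape, dtype=np.int64)
    for b in range(n_bits):
        resp = convolve2d(img, np.rot90(ica_filters[:, :, b], 2), mode='same')
        code += (resp > 0).astype(np.int64) << b
    hist, _ = np.histogram(code, bins=2 ** n_bits, range=(0, 2 ** n_bits))
    return hist / hist.sum()

def pool_of_bsif_features(img):
    """One histogram per (size, bit depth) pair. In the paper each of
    these is compared separately (with SRC) to yield the score matrix
    C_ij that is fused at score level in Section II-C."""
    feats = {}
    for k in FILTER_SIZES:
        for s in BIT_DEPTHS:
            mat = loadmat(f'texturefilters/ICAtextureFilters_{k}x{k}_{s}bit.mat')
            feats[(k, s)] = bsif_histogram(img, mat['ICAtextureFilters'])
    return feats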

Figure 4 presents the BSIF features extracted from the periocular region and the iris region with a filter of dimension 17 × 17 and a bit encoding of 8 bits. It can be observed from Figure 4 that the features extracted from the periocular region are richer than those extracted from the iris, owing to the low visibility of the iris texture.

[Fig. 4: Illustration of BSIF features obtained with the 17 × 17 filter and 8-bit encoding for the periocular and iris regions.]

III. EXPERIMENTS

A. Database

In order to evaluate the proposed protocols for assessing cross-camera and cross-illumination verification performance, we employ the smartphone iris database from BIPLab [1]. The dataset consists of images captured with two different smartphones, an iPhone 5 and a Galaxy S4. The images are captured using the frontal and rear cameras under two different illumination conditions, indoor and outdoor. The indoor illumination corresponds to artificial light, while the outdoor illumination represents varying daylight conditions. Although the database contains 75 unique irises from the two smartphones, we consider the 50 unique irises that were captured with both smartphones. The set of irises employed in this work is captured under all acquisition conditions, representing indoor and outdoor illumination from both the frontal and rear cameras of the smartphones. Further, each unique iris is represented by 4 samples, so the database employed in this work consists of 1600 images in total (50 subjects × 4 samples × 2 smartphones × 2 illuminations × 2 cameras).

B. Performance Evaluation Protocols

Since smartphones are carried everywhere and used under different illumination, we assess the verification performance under different illumination conditions. With the availability of smartphones offering better features at reduced prices, consumers tend to upgrade to newer devices. Changing smartphones inevitably leads to situations where the enrollment data for an application such as secure banking is captured with a phone different from the one used for verification; this situation essentially translates to a cross-sensor verification scenario. Based on these two arguments, in this work we mimic real-life verification by employing cross-sensor and cross-illumination verification. Importantly, we benchmark the performance of iris versus periocular features as biometric characteristics on smartphones in the unconstrained scenario, following the proposed protocols [8]. The experiments are conducted individually on the iris and periocular data obtained from BIPLab [1]. All results are reported in terms of the EER in order to make a fair comparison with the results published previously on the BIPLab database [8].

[Fig. 5: Images of one subject captured with each smartphone (frontal and rear cameras) under each illumination condition.]

Figure 5 presents the images acquired for a particular subject under different illumination conditions and with different cameras. The impact of the change of hardware and illumination can be clearly seen. The images in Figure 5 (a)-(d) were acquired with one smartphone and those in Figure 5 (e)-(h) with the other. It can be noted that the visibility of the iris varies greatly across the images, while the periocular region remains stable across illumination conditions.
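As a sanity check of the composition stated above, the 1600-image breakdown can be enumerated in a few lines (a sketch; the index sets and string labels are ours, following the database description):

from itertools import product

# 50 subjects x 4 samples x 2 smartphones x 2 illuminations x 2 cameras
SUBJECTS, SAMPLES = range(50), range(4)
PHONES = ['iPhone 5', 'Galaxy S4']
ILLUMINATION = ['indoor', 'outdoor']
CAMERAS = ['frontal', 'rear']

images = list(product(SUBJECTS, SAMPLES, PHONES, ILLUMINATION, CAMERAS))
assert len(images) == 1600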
Each of the protocols described below is analyzed to study the impact of such changes on the iris and periocular data.

1) Protocol 1: Performance Evaluation under Unconstrained Illumination: The first protocol evaluates the performance of iris and periocular recognition with images acquired under indoor and outdoor illumination. The images acquired under indoor illumination are treated as the reference set, and the probe images are those captured under outdoor illumination, for each camera of a particular phone. This protocol provides insight into the degradation of recognition performance under different illumination. Since each unique eye is accompanied by 4 samples, we treat three images as reference and one image as probe to obtain the comparison scores. We swap the probe and reference images so that each image serves once as the probe for the corresponding set of reference images. The experiment is repeated 5 times, and the minimum of the obtained scores is used to compute the final verification performance.

2) Protocol 2: Performance Evaluation of Unconstrained Hardware: This protocol considers the case where a consumer uses a specific smartphone under specific illumination for enrollment and a different smartphone or different illumination for verification. There can also be situations where a subject uses the frontal camera of a smartphone for enrollment and the rear camera for verification. This protocol measures the degradation of performance under cross-sensor verification in unconstrained conditions. A set of data acquired with a particular smartphone camera under fixed illumination is treated as the reference set, and another set of images is treated as the probe set. For instance, the images corresponding to the frontal camera of one smartphone under one illumination are treated as reference, and all images obtained under the other conditions are treated as probes. Owing to the availability of 4 samples for each iris, we employ all 4 samples as reference and compare one probe sample at a time. All 4 samples from the probe set are compared, and the minimum score is used to obtain the EER.

C. Fusion Schemes

To study the impact on recognition of the scores available from the pool of BSIF filters, we employ fusion based on the Min rule, the Max rule and the Sum rule, outlined below.

1) Min-Rule Fusion: The comparison scores C are obtained by employing filters of various dimensions and scales. The dimension of each filter is k × k with k = {9, 11, 13, 15, 17}, and the scales are indicated by s = {8, 9, 10, 11, 12}. The comparison score for computing the EER according to the Min rule is given by:

  $C_{\text{selected}} = \min_{i \in k,\, j \in s} \{ C_{ij} \}$   (1)

2) Max-Rule Fusion: With the same pool of filters, the comparison score for computing the EER according to the Max rule is given by:

  $C_{\text{selected}} = \max_{i \in k,\, j \in s} \{ C_{ij} \}$   (2)

3) Sum-Rule Fusion: In the third fusion scheme, we employ the sum rule, in which the comparison scores obtained from the pool of BSIF filters are accumulated. If the comparison scores are given by $C_{ij}$, the fused verification score $v$ is given by:

  $v = \sum_{i \in k,\, j \in s} C_{ij}$   (3)
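These three rules are straightforward to implement; below is a minimal Python/NumPy sketch (ours, not the authors' code) that fuses the 5 × 5 grid of comparison scores produced by the filter pool according to Eqs. (1)-(3):

import numpy as np

FILTER_SIZES = [9, 11, 13, 15, 17]   # k
BIT_DEPTHS = [8, 9, 10, 11, 12]      # s

def fuse_pool_scores(C, rule):
    """C[i, j] is the comparison score from the BSIF filter of size
    FILTER_SIZES[i] and bit depth BIT_DEPTHS[j]."""
    C = np.asarray(C, dtype=float)
    if rule == 'min':   # Eq. (1)
        return C.min()
    if rule == 'max':   # Eq. (2)
        return C.max()
    if rule == 'sum':   # Eq. (3)
        return C.sum()
    raise ValueError(f'unknown rule: {rule}')

# Example: fuse a random 5x5 score grid with the sum rule.
scores = np.random.rand(len(FILTER_SIZES), len(BIT_DEPTHS))
fused = fuse_pool_scores(scores, 'sum')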
IV. RESULTS AND DISCUSSION

All results are presented in two sections, pertaining to iris and to periocular recognition, to make a clear distinction between the two individual biometric characteristics.

A. Results on Iris Recognition

Table I presents the baseline results obtained on the database for iris recognition. The diagonal entries of each block present the verification performance for same-sensor images; for instance, iris images from the frontal camera of one smartphone under indoor illumination probed against images from the same condition. The key observation is that the EER is consistently lower whenever the same smartphone is used for enrollment and probe. The lowest EER of 2.46% corresponds to images obtained from the frontal camera of one of the smartphones under indoor illumination. Table I also presents the verification performance for comparisons of images from the frontal camera against the rear camera of the same smartphone, which gives an intuition of the performance degradation caused by a change of illumination or a change of camera. The recognition performance for such cross-camera or cross-illumination comparisons is considerably lower than that obtained for same-camera, same-illumination comparisons. Table I further reports the results obtained using the Min-Rule, Max-Rule and Sum-Rule for score-level fusion. Figure 6 provides the Detection Error Trade-off (DET) curves depicting the genuine match rate for iris recognition in the visible spectrum using smartphones. Although similar curves could be shown for periocular recognition, for the sake of simplicity we illustrate DET curves for the iris only. The fusion schemes improve the verification performance, and the Sum rule results in the largest improvement. The lowest EER is obtained on the iris images captured with the rear camera of a smartphone in the outdoor condition. Further, Table II presents the results of the various cross-sensor (cross-smartphone) comparisons for the iris data: the images enrolled using one smartphone in a particular scenario are probed against the images obtained with the other phone. It can be noted from the table that the verification accuracy is boosted when the scores are fused with the various fusion schemes.
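All results in the tables are reported as EER. For reference, the following is a small self-contained sketch (ours; the paper does not specify its EER computation) of estimating the EER from genuine and impostor comparison scores:

import numpy as np

def compute_eer(genuine, impostor):
    """Equal Error Rate: the operating point where the false match rate
    equals the false non-match rate. Assumes higher score = better match."""
    genuine = np.asarray(genuine, dtype=float)
    impostor = np.asarray(impostor, dtype=float)
    thresholds = np.unique(np.concatenate([genuine, impostor]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        fmr = np.mean(impostor >= t)   # impostor attempts accepted
        fnmr = np.mean(genuine < t)    # genuine attempts rejected
        if abs(fmr - fnmr) < best_gap:
            best_gap, eer = abs(fmr - fnmr), (fmr + fnmr) / 2
    return eer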

[Fig. 6: Detection Error Trade-off curves (genuine match rate versus false match rate) for the best iris performance obtained using the pool of BSIF filters, for the frontal and rear cameras of each smartphone.]

The Sum Rule provides the best performance under all conditions compared with the other fusion schemes, and the lowest EER for all the different comparisons is obtained under this scheme.

B. Results on Periocular Recognition

In similar terms to the experiments on iris recognition, we conducted extensive experiments using periocular information. Table III presents the baseline results for periocular recognition on the BIPLab database. The diagonal entries of each block present the verification performance for periocular images obtained from the same sensor. The pool of BSIF filters results in an EER of 0% consistently for all data where the same smartphone is used for both enrollment and probe. Unlike the iris images, the periocular images thus result in 0% EER. One critical observation from this set of experiments is that the low visibility of the iris pattern impairs the iris recognition performance. In addition to the low visibility, factors such as reflections in the outdoor scenario, low illumination in the indoor scenario and occlusions contribute to the lower verification accuracy. At the same time, the periocular features are not affected by factors such as reflections. Thus, the periocular information in the visible spectrum, with the pool of BSIF filters providing robust features, is observed to outperform verification based on iris features. Table III also presents the verification performance for comparisons of periocular images obtained from the frontal camera against the rear camera of the same smartphone; the performance is consistently higher when periocular information is used for verification as compared to the iris. Table IV presents the results of periocular recognition for the various cross-sensor (cross-smartphone) comparisons. For instance, the periocular images enrolled using the frontal camera of one smartphone in the indoor scenario are probed against the images obtained from the frontal camera of the other phone in the indoor scenario. Compared with the cross-sensor comparisons based on iris information, the periocular information consistently performs better, with a gain of at least 15% in terms of EER. The features extracted using the pool of BSIF filters remain similar across smartphones for the same subject. Further, the verification accuracy is boosted when the scores are fused using the various fusion schemes, and the Sum Rule again provides the best performance under all conditions. The lowest EER of 4.17% is obtained for periocular images from the rear camera of one smartphone under indoor illumination probed against periocular images from the rear camera of the other smartphone.
The superior performance of periocular images in the visible spectrum can be noted by comparing this EER with the corresponding iris EER (12.5%).

C. Remarks: Iris versus Periocular

Although the iris has carved a niche in biometrics for robust verification performance in the Near-Infra-Red domain, visible-spectrum iris recognition is still being explored to make it more robust. Considering that brown irises form the largest group in the world population, iris recognition schemes have to carefully tackle the problem of low or no visibility of the iris pattern. Further, factors such as external illumination, reflections from environmental light and shadows of the eyelashes lead to severe degradation of verification performance [15]. The problem is exacerbated when smartphones are used as biometric sensors for visible-spectrum iris recognition. While the challenges of reflections and occlusions remain, cross-sensor (cross-smartphone) comparisons can further lower the performance due to the different characteristics of the smartphones.

TABLE I: EER (%) obtained for iris recognition with different fusion schemes for the same smartphones (rows: reference images; columns: probe images).

Baseline
  Reference\Probe    Front    Rear    Front    Rear
  Front               2.46   35.42   43.35   45.50
  Rear               31.72    3.88   45.32   49.82
  Front              41.89   45.83    8.33   39.65
  Rear               39.10   39.65   33.84   14.58
  Front               3.68   37.50   41.64   43.75
  Rear               39.58    2.11   45.83   31.69
  Front              43.33   41.82    8.27   37.15
  Rear               49.67   29.63   43.37    1.84

Min-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front               2.39   39.85   45.63   47.81
  Rear               43.88    4.17   45.83   49.60
  Front              40.09   48.20   10.13   43.95
  Rear               43.73   42.29   37.74   14.61
  Front               4.19   40.80   43.75   44.06
  Rear               44.06    2.08   47.32   35.26
  Front              41.67   45.08    8.33   40.82
  Rear               47.87   34.95   43.66    2.08

Max-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front               2.48   35.13   41.67   43.77
  Rear               30.76    2.19   45.79   50.02
  Front              43.46   45.83    6.69   37.94
  Rear               41.45   41.93   35.39   14.58
  Front               2.08   35.28   43.75   43.37
  Rear               39.23    2.15   45.72   31.76
  Front              41.64   39.23    8.69   34.95
  Rear               48.45   31.25   41.58    1.91

Sum-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front               2.37   30.76   39.58   39.94
  Rear               31.49    1.66   41.89   45.90
  Front              39.43   43.57    6.25   36.06
  Rear               39.07   37.90   33.33    8.20
  Front               2.08   35.00   39.58   39.96
  Rear               37.63    1.60   43.75   29.19
  Front              39.41   41.27    6.47   31.25
  Rear               43.84   29.28   39.58    0.42

As observed from the presented results, periocular information can avoid the challenges presented by the iris to a better extent in the visible spectrum. The obtained EER further supports the argument for using periocular information for recognition on smartphones in the visible spectrum under unconstrained conditions. The EER is lowered by more than 15% on average when periocular information is used in the cross-sensor scenario. At the same time, the EER is reduced by at least 2% when periocular information is used for comparisons of data acquired with the same smartphone (same illumination and same camera), as compared with the performance obtained using iris information.

TABLE II: EER (%) obtained for iris recognition using cross-sensor (cross-smartphone) comparisons with different fusion schemes (rows: reference images; columns: probe images).

Baseline
  Reference\Probe    Front    Rear    Front    Rear
  Front              25.02   43.51   45.83   45.86
  Rear               35.42   12.52   46.17   35.39
  Front              43.97   41.67   27.08   37.52
  Rear               45.99   43.84   38.01   33.36

Min-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front              35.90   44.30   47.67   46.37
  Rear               39.58   20.83   45.79   41.38
  Front              45.86   50.42   35.20   41.89
  Rear               46.50   43.82   37.74   33.33

Max-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front              20.83   41.67   43.53   43.77
  Rear               41.29   10.42   47.67   31.25
  Front              43.75   43.55   27.06   39.58
  Rear               47.92   43.75   39.61   31.25

Sum-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front              21.17   33.33   41.67   43.75
  Rear               31.74   12.50   43.95   33.11
  Front              35.42   39.87   23.45   35.42
  Rear               42.00   41.69   33.33   31.01

D. Remarks for the Protocols

1) Remarks for Protocol 1: Protocol 1 is designed to study the impact of illumination on verification performance. Tables I and II provide the results obtained for iris recognition when the enrollment and probe images are acquired under different illumination. Similarly, Tables III and IV provide the results obtained for periocular recognition when the enrollment and probe images are captured under different illumination, i.e. indoor and outdoor. One of the conclusive inferences for smartphone based iris recognition in the visible spectrum is that illumination plays a key role in reducing the EER [8]. While this inference is validated for the iris in this set of experiments, periocular recognition is observed to reduce the impact to a much greater degree: the reduction of the EER is approximately 20% when periocular information is used for verification. The trends of the EER remain the same even in the cross-sensor (cross-smartphone) verification scenario.
Thus, the effect of illumination does not impact the periocular features to the extent that it impacts the iris features for verification in smartphone-based visible-spectrum recognition in the unconstrained scenario.

2) Remarks for Protocol 2: The key factor studied in Protocol 2 is the impact on verification performance of employing different hardware for capturing the enrollment and probe data. The factor of interoperability is studied by employing images from one smartphone as enrollment images and images from the other smartphone as probe images. Tables II and IV present the verification performance in terms of EER for iris and periocular information, respectively.

TABLE III: EER (%) obtained for periocular recognition with different fusion schemes for the same smartphones (rows: reference images; columns: probe images).

Baseline
  Reference\Probe    Front    Rear    Front    Rear
  Front               0.00   12.92   29.32   31.63
  Rear               12.50    0.00   31.09   23.12
  Front              24.49   25.00    0.00    8.71
  Rear               27.08   20.83    8.33    0.00
  Front               0.00   12.30   25.00   27.08
  Rear               10.93    0.00   33.31   19.26
  Front              27.08   27.42    0.00    6.03
  Rear               29.17   18.75   10.84    0.00

Min-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front               0.00   14.32   31.27   35.48
  Rear               14.72    0.00   35.39   27.55
  Front              29.57   32.71    0.00   10.20
  Rear               31.58   24.47   12.17    0.00
  Front               0.00   16.56   35.31   31.03
  Rear               18.95    0.00   33.33   32.91
  Front              37.17   33.22    0.00   10.15
  Rear               31.18   29.12   14.25    0.00

Max-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front               0.00   12.50   28.95   32.87
  Rear               12.68    0.00   31.05   22.92
  Front              22.89   25.00    0.00    8.36
  Rear               26.66   21.19    8.33    0.00
  Front               0.00   10.79   22.92   23.01
  Rear               10.39    0.00   33.33   18.75
  Front              27.08   27.08    0.00    6.27
  Rear               29.17   22.92   10.46    0.00

Sum-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front               0.00   10.20   22.94   27.08
  Rear               10.37    0.00   28.68   21.08
  Front              22.83   22.94    0.00    6.25
  Rear               25.00   18.48    6.25    0.00
  Front               0.00   10.42   22.83   19.17
  Rear               10.37    0.00   26.93   19.17
  Front              20.66   23.34    0.00    2.59
  Rear               24.76   20.66   10.39    0.00

The EER obtained when periocular information is used for verification on cross-sensor (cross-smartphone) data is much lower than the EER obtained using iris images.

TABLE IV: EER (%) obtained for periocular recognition using cross-sensor (cross-smartphone) comparisons with different fusion schemes (rows: reference images; columns: probe images).

Baseline
  Reference\Probe    Front    Rear    Front    Rear
  Front               6.45   12.19   33.33   29.10
  Rear               10.33    4.17   30.83   25.00
  Front              22.74   22.94    4.37   10.90
  Rear               28.81   20.83   10.73    6.29

Min-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front              10.51   14.45   35.39   37.43
  Rear               14.76    5.85   35.42   35.35
  Front              27.39   28.41    8.64   16.95
  Rear               33.67   27.55   14.63    8.69

Max-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front               8.38   10.42   33.07   27.08
  Rear               12.50    5.78   29.01   25.29
  Front              20.86   27.08    6.34   10.42
  Rear               27.06   20.83    8.36    5.96

Sum-Rule
  Reference\Probe    Front    Rear    Front    Rear
  Front               6.45   12.19   33.33   29.10
  Rear               10.33    4.17   30.83   25.00
  Front              22.74   22.94    4.37   10.90
  Rear               28.81   20.83   10.73    6.29

V. CONCLUSIONS

The challenges of using smartphones as biometric sensors for iris and periocular recognition in the visible spectrum have been studied extensively in this work. This work investigated the performance of the individual biometric characteristics (iris and periocular) captured using a smartphone camera with respect to different aspects: (1) the impact on verification performance of iris versus periocular information; (2) the use of multi-feature comparative score-level fusion to improve the performance of iris and periocular information; (3) the impact on verification performance of the change of illumination (indoor versus outdoor) when reference and probe are captured with the same smartphone; (4) the impact on verification when the enrollment and probe data are captured with different smartphones. The protocols proposed in this work make it easy to compare against the baseline results published earlier on this database [8]. The findings can be summarized as follows:

- The pool of BSIF filters, along with the fusion schemes employed in this work, can substantially boost the verification performance for both iris and periocular information.
- Unconstrained iris acquisition, which results in motion-blurred, off-angle and occluded irises under uncontrolled illumination, impairs the verification rate in the case of the iris, whereas the periocular information is not affected to a large extent.
- As compared to the iris, periocular information provides a lower EER in the cross-sensor and cross-illumination (indoor versus outdoor) scenarios.
- A decrease of 20% in the EER is obtained in the case of cross-sensor comparisons employing periocular information as compared to the iris.

ACKNOWLEDGEMENTS

The authors wish to express their thanks to Morpho (Safran Group) for supporting this work, and in particular to the Morpho Research & Technology team for the fruitful technical and scientific exchanges related to this work.

REFERENCES

[1] BIPLab, University of Salerno. http://biplab.unisa.it/miche/database/.
[2] S. Bharadwaj, H. S. Bhatt, M. Vatsa, and R. Singh. Periocular biometrics: When iris recognition fails. In Biometrics: Theory, Applications and Systems (BTAS), 2010 Fourth IEEE International Conference on, pages 1-6. IEEE, 2010.
[3] K. Cheng and A. Kumar. Contactless finger knuckle identification using smartphones. In Biometrics Special Interest Group (BIOSIG), 2012 BIOSIG - Proceedings of the International Conference of the, pages 1-6. IEEE, 2012.
[4] J. Daugman. How iris recognition works. Circuits and Systems for Video Technology, IEEE Transactions on, 14(1):21-30, 2004.
[5] M. De Marsico, C. Galdi, M. Nappi, and D. Riccio. FIRME: Face and iris recognition for mobile engagement. Image and Vision Computing, 2014.
[6] J. Kannala and E. Rahtu. BSIF: Binarized statistical image features. In Pattern Recognition (ICPR), 2012 21st International Conference on, pages 1363-1366. IEEE, 2012.
[7] Kiran B. Raja, R. Raghavendra, and C. Busch. Binarized statistical image features for robust iris and periocular recognition in visible spectrum. In Proceedings of the IEEE International Workshop on Biometrics and Forensics (IWBF), Malta. IEEE, 2014.
[8] Kiran B. Raja, R. Raghavendra, C. Busch, and S. Mondal. An empirical study of smartphone based iris recognition in visible spectrum. In Proceedings of the IEEE Conference on Security of Information and Networks (SIN '14), Glasgow. IEEE, 2014.
[9] Kiran B. Raja, R. Raghavendra, F. A. Cheikh, and C. Busch. Robust iris recognition using light field camera. In The Colour and Visual Computing Symposium 2013. IEEE, 2013.
[10] Kiran B. Raja, R. Raghavendra, V. K. Vemuri, and C. Busch. Smartphone based visible iris recognition using deep sparse filtering. Pattern Recognition Letters, 2014.
[11] J.-G. Ko, Y.-H. Gil, J.-H. Yoo, and K.-I. Chung. A novel and efficient feature extraction method for iris recognition. ETRI Journal, 29(3):399-401, 2007.
[12] L. Ma, T. Tan, Y. Wang, and D. Zhang. Personal identification based on iris texture analysis. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 25(12):1519-1533, 2003.
[13] L. Masek and P. Kovesi. Matlab source code for a biometric identification system based on iris patterns. The School of Computer Science and Software Engineering, The University of Western Australia, 2(4), 2003.
[14] U. Park, A. Ross, and A. K. Jain. Periocular biometrics in the visible spectrum: A feasibility study. In IEEE 3rd International Conference on Biometrics: Theory, Applications, and Systems (BTAS '09), pages 1-6. IEEE, 2009.
[15] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre. The UBIRIS.v2: A database of visible wavelength iris images captured on-the-move and at-a-distance. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 32(8):1529-1535, 2010.
[16] R. Raghavendra, C. Busch, and B. Yang. Scaling-robust fingerprint verification with smartphone camera in real-life scenarios. In Biometrics: Theory, Applications and Systems (BTAS), 2013 IEEE Sixth International Conference on, pages 1-8. IEEE, 2013.
[17] R. Raghavendra, Kiran B. Raja, A. Pflug, B. Yang, and C. Busch. 3D face reconstruction and multimodal person identification from video captured using smartphone camera. In Technologies for Homeland Security (HST), 2013 IEEE International Conference on, pages 552-557. IEEE, 2013.
[18] R. Raghavendra, Kiran B. Raja, B. Yang, and C. Busch. Combining iris and periocular recognition using light field camera. In 2nd IAPR Asian Conference on Pattern Recognition (ACPR 2013). IEEE, 2013.
[19] C. Rathgeb and A. Uhl. Secure iris recognition based on local intensity variations. In Image Analysis and Recognition, pages 266-275. 2010.
[20] C. Rathgeb and A. Uhl. Context-based biometric key generation for iris. IET Computer Vision, 5(6):389-397, 2011.
[21] C. Stein, C. Nickel, and C. Busch. Fingerphoto recognition with smartphone cameras. In 2012 BIOSIG - Proceedings of the International Conference of the Biometrics Special Interest Group (BIOSIG), pages 1-12. IEEE, 2012.
[22] G. Sutra, B. Dorizzi, S. Garcia-Salicetti, and N. Othman. A biometric reference system for iris: OSIRIS version 4.1. 2012.
[23] The MathWorks Inc. MATLAB, version 8.2.0 (R2013b) - Cascade Object Detector. Natick, Massachusetts, 2013.