Fusion of Shape and Texture for Unconstrained Periocular Authentication

Authors: D. R. Ambika, K. R. Radhika, D. Seshachalam


Unconstrained authentication is an important component of personal automated systems and human-computer interfaces. Existing solutions mostly use the face as the primary object of analysis, so their performance is largely determined by the extent of deformation in the facial region and by the amount of useful information that survives occlusion. The periocular region is a portion of the face that combines strong discriminative ability with resistance to deformation, and a reliable part of it remains visible even in occluded images. The present work demonstrates that a joint representation of periocular texture and periocular shape yields an effective expression- and pose-invariant description. The proposed methodology provides an effective and compact description of periocular texture and shape, and is tested over four benchmark datasets exhibiting varied acquisition conditions.
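The pipeline implied by the keywords — LBP variance (LBPV) for texture, Zernike moments for shape, then fusion — can be sketched as follows. The specific choices below (8-neighbour radius-1 LBP, moment order 4, and plain feature concatenation as the fusion rule) are illustrative assumptions for the sketch, not the authors' exact configuration.

```python
import numpy as np
from math import factorial

def lbpv_histogram(img, bins=256):
    """LBPV texture descriptor: an LBP histogram in which each pixel
    votes with its local variance (contrast) instead of a unit count.
    Simple 8-neighbour, radius-1 variant for illustration."""
    c = img[1:-1, 1:-1].astype(float)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    neigh = np.stack([img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx].astype(float)
                      for dy, dx in offsets])
    # 8-bit LBP code per pixel: one bit per neighbour >= centre
    codes = sum((neigh[i] >= c).astype(int) << i for i in range(8))
    var = neigh.var(axis=0)                       # local contrast weight
    hist = np.zeros(bins)
    np.add.at(hist, codes.ravel(), var.ravel())   # variance-weighted votes
    return hist / (hist.sum() + 1e-12)

def zernike_magnitudes(img, max_order=4):
    """|Z_nm| up to max_order over the unit disk inscribed in the image;
    magnitudes are rotation-invariant shape cues."""
    h, w = img.shape
    y, x = np.mgrid[:h, :w]
    cx, cy, r = (w - 1) / 2, (h - 1) / 2, min(h, w) / 2
    xs, ys = (x - cx) / r, (y - cy) / r
    rho, theta = np.hypot(xs, ys), np.arctan2(ys, xs)
    mask = rho <= 1.0
    f = img.astype(float) * mask
    feats = []
    for n in range(max_order + 1):
        for m in range(n + 1):
            if (n - m) % 2:            # R_nm defined only for n-m even
                continue
            R = sum((-1) ** s * factorial(n - s) /
                    (factorial(s) * factorial((n + m) // 2 - s)
                     * factorial((n - m) // 2 - s))
                    * rho ** (n - 2 * s) for s in range((n - m) // 2 + 1))
            Z = (n + 1) / np.pi * np.sum(f * R * np.exp(-1j * m * theta)) / mask.sum()
            feats.append(abs(Z))
    return np.array(feats)

def fused_descriptor(img):
    """Feature-level fusion: concatenate normalized texture and shape parts."""
    t = lbpv_histogram(img)
    s = zernike_magnitudes(img)
    return np.concatenate([t, s / (np.linalg.norm(s) + 1e-12)])
```

With `max_order=4` the shape part contributes 9 moment magnitudes, so the fused vector has 256 + 9 dimensions; in a verification setting such vectors would typically be compared by a distance measure or fed to a classifier.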

Keywords: Periocular authentication, Zernike moments, LBPV, shape and texture fusion.




[1] R. Raghavendra, K. B. Raja, and C. Busch, “On comparison score fusion of deep autoencoders and relaxed collaborative representation for smartphone based accurate periocular verification,” 19th International Conference on Information Fusion (FUSION), pp. 2221–2228, 2016.
[2] J. M. Smereka, B. V. K. V. Kumar, and A. Rodriguez, “Selecting discriminative regions for periocular verification,” in IEEE International Conference on Identity, Security and Behavior Analysis (ISBA), 2016.
[3] G. Santos, E. Grancho, M. V. Bernardo, and P. T. Fiadeiro, “Fusing iris and periocular information for cross-sensor recognition,” Pattern Recognition Letters, vol. 57, pp. 52–59, 2014.
[4] F. Alonso-Fernandez, A. Mikaelyan, and J. Bigun, “Comparison and fusion of multiple iris and periocular matchers using near-infrared and visible images,” 3rd International Workshop on Biometrics and Forensics (IWBF), March 2015.
[5] S. Bakshi, P. K. Sa, and B. Majhi, “Phase intensive global pattern for periocular recognition,” Annual IEEE India Conference (INDICON), pp. 1–5, December 2014.
[6] S. Bharadwaj, H. S. Bhatt, M. Vatsa, and R. Singh, “Periocular biometrics: When iris recognition fails,” IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), 2010.
[7] R. Jillela and A. Ross, “Matching face against iris images using periocular information,” IEEE International Conference on Image Processing (ICIP), 2014.
[8] M. Uzair, A. Mahmood, A. Mian, and C. McDonald, “Periocular region-based person identification in the visible, infrared and hyperspectral imagery,” Neurocomputing, vol. 149, pp. 854–867, August 2014.
[9] F. Juefei-Xu and M. Savvides, “Unconstrained periocular biometric acquisition and recognition using COTS PTZ camera for uncooperative and non-cooperative subjects,” 2012.
[10] M. Castrillón-Santana, J. Lorenzo-Navarro, and E. Ramón-Balmaseda, “On using periocular biometric for gender classification in the wild,” Pattern Recognition Letters, pp. 1–9, 2015.
[11] J. C. Monteiro and J. S. Cardoso, “Periocular recognition under unconstrained settings with universal background models,” Proceedings of the International Conference on Bio-inspired Systems and Signal Processing (BIOSIGNALS), 2015.
[12] E. Barroso, G. Santos, L. Cardoso, C. Padole, and H. Proença, “Periocular recognition: how much facial expressions affect performance?” Pattern Analysis and Applications, vol. 19, no. 2, pp. 517–530, 2016.
[13] J. M. Smereka, V. N. Boddeti, and B. V. K. V. Kumar, “Probabilistic deformation models for challenging periocular image verification,” IEEE Transactions on Information Forensics and Security, vol. 10, no. 9, pp. 1–17, 2015.
[14] S. Bakshi, P. K. Sa, and B. Majhi, “A novel phase-intensive local pattern for periocular recognition under visible spectrum,” Biocybernetics and Biomedical Engineering, vol. 35, no. 1, pp. 30–44, 2015.
[15] K. B. Raja, R. Raghavendra, and C. Busch, “Empirical evaluation of visible spectrum iris versus periocular recognition in unconstrained scenario on smartphones,” Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2014.
[16] K. B. Raja, R. Raghavendra, M. Stokkenes, and C. Busch, “Fusion of face and periocular information for improved authentication on smartphones,” 18th IEEE International Conference on Information Fusion (FUSION), 2015.
[17] D. R. Ambika, K. R. Radhika, and D. Seshachalam, “Periocular authentication based on FEM using Laplace–Beltrami eigenvalues,” Pattern Recognition, vol. 50, no. C, pp. 178–194, February 2016.
[18] Z. Guo, L. Zhang, and D. Zhang, “Rotation invariant texture classification using LBP variance (LBPV) with global matching,” Pattern Recognition, vol. 43, pp. 706–719, 2010.
[19] A. Khotanzad and Y. H. Hong, “Invariant image recognition by Zernike moments,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 5, pp. 489–497, 1990.
[20] E. Sariyanidi, V. Dagli, S. C. Tek, and M. Gökmen, “Local Zernike moments: A new representation for face recognition,” 19th IEEE International Conference on Image Processing (ICIP), 2012.
[21] J. Daugman, “How iris recognition works,” IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, no. 1, pp. 21–30, 2004.
[22] A. Wiliem, V. K. Madasu, W. W. Boles, and P. K. Yarlagadda, “A feature based face recognition technique using zernike moments,” in RNSA Security Technology Conference 2007, P. Mendis, J. Lai, E. Dawson, and H. Abbas, Eds. Australian Homeland Research Centre, 2007, pp. 341–355.
[23] B. E. Boser, I. M. Guyon, and V. N. Vapnik, “A training algorithm for optimal margin classifiers,” in Proceedings of the Fifth Annual Workshop on Computational Learning Theory (COLT), pp. 144–152, 1992.
[24] C. N. Padole and H. Proença, “Periocular recognition: Analysis of performance degradation factors,” in ICB, 2012, pp. 439–445.
[25] R. Goh, L. Liu, X. Liu, and T. Chen, “The CMU Face In Action (FIA) database,” in Proceedings of the Second International Conference on Analysis and Modelling of Faces and Gestures (AMFG), pp. 255–263, 2005.
[26] A. F. Sequeira, L. Chen, J. Ferryman, F. Alonso-Fernandez, J. Bigun, K. B. Raja, R. Raghavendra, C. Busch, and P. Wild, “Cross-eyed - cross-spectral iris/periocular recognition database and competition,” 15th International conference of the Biometrics Special Interest Group (BIOSIG 2016), 2016.
[27] H. Proença, S. Filipe, R. R. Santos, J. Oliveira, and L. A. Alexandre, “The UBIRIS.v2: A database of visible wavelength iris images captured on-the-move and at-a-distance,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, August 2010.