Journal of Multimedia Information System
Korea Multimedia Society
Section A

Non-contact Palmprint Attendance System on PC Platform

Yuxin Wu1, Lu Leng2,*, Huapeng Mao3
1School of Software and Microelectronics, Northwestern Polytechnical University, Xi’an, P. R. China, wuyuxin@mail.nwpu.edu.cn
2School of Software, Nanchang Hangkong University, Nanchang, P. R. China, leng@nchu.edu.cn
3Chengdu BIIDTE Technology Co., Ltd., Chengdu, P. R. China, maohp@qq.com
*Corresponding Author: Lu Leng, Nanchang Hangkong University, 0086-791-86453251, leng@nchu.edu.cn

© Copyright 2018 Korea Multimedia Society. This is an Open-Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Received: Sep 17, 2018; Accepted: Sep 25, 2018

Published Online: Sep 30, 2018

Abstract

In order to overcome the problems of contact palmprint recognition, a non-contact palmprint recognition system is developed on the personal computer (PC) platform. Three methods, namely “double-line-single-point” (DLSP), “double-assistant-crosshair” (DAC) and “none-assistant-graphic” (NAG), are implemented for palmprint localization to address the severe technical challenges of non-contact acquisition, including complex backgrounds, varying illumination, and uncontrollable hand locations and gestures. In NAG, hand segmentation and the cropping of the region of interest are performed without any assistant graphics. The convex hull of the hand contour helps detect the outside contour of the little finger as well as the valley bottom between the thumb and the index finger. The three palmprint localization methods have good operating efficiency and meet the performance requirements of a real-time system. Furthermore, an attendance system based on non-contact palmprint recognition is designed and developed on the PC platform.

Keywords: Non-contact Palmprint Recognition; Attendance Check System; Palmprint Localization; System Design and Development

I. INTRODUCTION

Information security is increasingly significant in the digital era. Traditional numeric passwords are commonly used for user authentication. Unlike possessions and passwords, biometric features, i.e., the inherent characteristics of the human body and behavior, are hardly lost, forgotten or cracked, so biometrics overcomes the defects of traditional authentication methods and has become a novel authentication technology with high security and reliability. Compared with other biometric modalities, palmprint has several remarkable advantages, including rich discriminant features, few restrictions, low application cost, and high privacy, so it has become an outstanding biometric modality [1].

Online palmprint recognition can be categorized into contact and contactless modes [2]. In the contact mode, users’ hands are in contact with the equipment surface. In contrast, in the contactless mode, users’ hands do not need to touch any equipment surface [3].

In a contact palmprint recognition system, both the locations and gestures of hands can be strictly controlled with pegs or other assistant devices, so the preprocessing is not difficult and accordingly the accuracy can be high [4]. Unfortunately, contact systems usually lead to hygiene concerns, limited flexibility, pollution of the acquisition sensor, cultural resistance, etc. [5]. Non-contact palmprint recognition systems overcome the aforementioned problems of contact systems [6]; however, non-contact systems suffer from several severe technical challenges, including complex backgrounds, varying illumination, and uncontrollable hand locations and gestures [7]. Ong et al. proposed a competitive hand valley detection method [8]. A Gaussian skin-color model was used to segment the non-contact palm region; then the hand boundary was detected and all the boundary points were judged according to some rules; finally, the four valley points between the five fingers were found. This method heavily depends on the skin-color segmentation results. Tang et al. designed a non-contact palmprint key-point localization method suitable for both open and closed fingers [9]. Leng et al. proposed triple-perpendicular-directional translation residuals for key-point localization with low computational complexity [10]. Aykut and Ekinci applied an active appearance model for palm detection [11]; however, its computational complexity is high. Javidnia et al. developed an efficient illumination normalization algorithm to restrain the illumination disturbance [12].

A non-contact palmprint recognition system is developed on the personal computer (PC) platform in this paper. In order to solve the severe technical challenges of non-contact palmprint systems, three methods are implemented for palmprint localization. Among them, the “double-line-single-point” (DLSP) [13] and “double-assistant-crosshair” (DAC) [14] methods both constrain the hand location and gesture during palmprint image acquisition, and they effectively help localize the palmprint key points and lines. In addition, a novel method named “none-assistant-graphic” (NAG) is designed for palmprint localization, in which hand segmentation and the cropping of the region of interest (ROI) are performed without any assistant graphics. The convex hull of the hand region helps detect the outside contour of the little finger as well as the valley bottom between the thumb and the index finger. The three palmprint localization methods have good operating efficiency and meet the performance requirements of a real-time system. Furthermore, an attendance system based on non-contact palmprint recognition is designed and developed on the PC platform.

The rest of this paper is organized as follows. In Section II, the three methods for palmprint localization are introduced. The system design and development are explained in Section III and Section IV, respectively. Finally, the conclusions are drawn in Section V.

II. PALMPRINT LOCALIZATION

The main task of preprocessing for palmprint recognition is to crop the ROI from the original image. Three methods, namely NAG, DLSP and DAC, are implemented in the system for palmprint localization.

2.1. None-assistant-graphic Method

No assistant graphics are employed for palm localization in the NAG method. Since it is not easy to control the stretch degree between the fingers, the four fingers, i.e., the index, middle, ring and little fingers, are held together, and only the thumb is stretched. The whole hand region is in the acquisition window, and the four fingertips all point upward. The results of the steps in NAG are shown in Fig. 1; the steps are performed as follows.

Fig. 1. Results of the steps in NAG.
(1) Color likelihood

The skin-color likelihood of each pixel is computed with a skin-color model.

(2) Segmentation

The skin-color likelihood map is binarized with the maximum inter-class variance method, i.e., Otsu’s method, as shown in Fig. 1(a).

(3) Hole-filling

The holes in the binarized image are filled to obtain the complete hand region, as shown in Fig. 1(b).

(4) Convex hull detection

The convex hull of the hand region is detected, as shown in Fig. 1(c).

(5) Convex-concave-point detection

The convex and concave points of the hand region are detected, as shown in Fig. 1(d).

Fig. 2 shows a part of Fig. 1(d). The convex points, A and B, are the crossover points of the convex hull and the edge contour. An outside region is enclosed by a pair of convex points and the contour segment between them; for example, one outside region is enclosed by the line segment AB and the contour segment between A and B. The point-line distance between each point on the contour segment and the line crossing the pair of convex points is computed. The point with the longest point-line distance in each outside region is considered the concave point of that region. The concave point with the longest point-line distance among all the concave points is considered the valley bottom point between the thumb and the index finger. In Fig. 2, C is a concave point, and it is also the valley bottom point.

Fig. 2. Convex-concave-point detection.
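Steps (1)-(5) can be reproduced with standard OpenCV primitives. The following is only an illustrative sketch rather than the exact implementation of the system: an HSV range threshold stands in for the skin-color model and Otsu binarization, morphological closing approximates the hole filling, and, under the NAG hand pose (four fingers held together, thumb stretched), the deepest convexity defect is taken as the thumb-index valley bottom.

#include <opencv2/opencv.hpp>
#include <vector>

// Sketch of NAG steps (1)-(5): skin segmentation, hole filling,
// convex hull and convexity-defect (concave point) detection.
cv::Point detectValleyBottom(const cv::Mat& bgr) {
    // (1)-(2) Skin-color likelihood and binarization; an HSV range
    // threshold stands in for the skin-color model and Otsu's method.
    cv::Mat hsv, skin;
    cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, cv::Scalar(0, 30, 60), cv::Scalar(25, 180, 255), skin);

    // (3) Approximate hole filling via morphological closing.
    cv::morphologyEx(skin, skin, cv::MORPH_CLOSE,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(15, 15)));

    // The largest external contour is taken as the hand region.
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(skin, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_NONE);
    if (contours.empty()) return cv::Point(-1, -1);
    size_t hand = 0;
    for (size_t i = 1; i < contours.size(); ++i)
        if (cv::contourArea(contours[i]) > cv::contourArea(contours[hand]))
            hand = i;

    // (4)-(5) Convex hull and convexity defects; the defect with the largest
    // depth (point-line distance) is the valley bottom between thumb and index.
    std::vector<int> hullIdx;
    cv::convexHull(contours[hand], hullIdx, false, false);
    std::vector<cv::Vec4i> defects;
    cv::convexityDefects(contours[hand], hullIdx, defects);

    cv::Point valley(-1, -1);
    float maxDepth = 0.f;
    for (const cv::Vec4i& d : defects) {
        float depth = d[3] / 256.f;            // fixed-point depth in pixels
        if (depth > maxDepth) {
            maxDepth = depth;
            valley = contours[hand][d[2]];     // farthest (concave) point
        }
    }
    return valley;
}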
(6) ROI localization

The external rectangle of the hand region is obtained, as shown in Fig. 3.

Fig. 3. External rectangle.

A coordinate system is established in which the positive directions of the x-axis and y-axis are rightward and downward, respectively. The four boundaries of the external rectangle, namely the left, right, top and bottom sides, are determined by four points: the vertically most convex point on the outside of the little finger, the valley bottom point between the thumb and the index finger, the top of the middle finger, and the bottom of the palm. The width of the external rectangle is W. The position of C, the valley bottom point between the thumb and the index finger, is (xC, yC). D is the top-left corner point of the ROI in Fig. 4, and its position (xD, yD) is computed by:

xD = xC − 0.8 × W
yD = yC − 0.3 × W
(1)

The side length of the ROI is 0.6 × W.

Fig. 4. ROI localization.
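Given the hand contour and the valley bottom point C, step (6) reduces to a few lines. The sketch below assumes the output of the previous sketch; the external rectangle is obtained as the bounding rectangle of the hand contour, whose sides coincide with the four extreme points described above.

#include <opencv2/opencv.hpp>
#include <vector>

// Sketch of NAG step (6): crop the square ROI from the external-rectangle
// width W and the valley bottom point C, following Eq. (1).
cv::Mat cropNagRoi(const cv::Mat& image,
                   const std::vector<cv::Point>& handContour,
                   const cv::Point& valleyC) {
    cv::Rect box = cv::boundingRect(handContour);    // external rectangle
    int W = box.width;
    int xD = valleyC.x - cvRound(0.8 * W);           // top-left corner D, Eq. (1)
    int yD = valleyC.y - cvRound(0.3 * W);
    int side = cvRound(0.6 * W);                     // ROI side length
    cv::Rect roi = cv::Rect(xD, yD, side, side)
                 & cv::Rect(0, 0, image.cols, image.rows);
    return image(roi).clone();
}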
2.2. Double-line-single-point Method

For comfort, palmprint images are captured with the built-in rear camera on a smart phone; on a PC, in contrast, users are apt to capture palmprint images with the front camera. Thus the directions of both assistant lines are adjusted to the vertical upward direction on the PC platform for comfort.

When the four fingers (index, middle, ring and little fingers) are held together, that is, there is almost no interspace among them, the outside boundaries of the four fingers are approximately two parallel straight lines. The placement of the right hand is shown in Fig. 5; the assistant graphics for the right hand can be flipped horizontally to obtain those for the left hand. The four fingers are held together naturally, while the thumb is stretched. The two vertical assistant lines are used for the alignment of the two outside boundaries of the four fingers, and the red assistant point should be aligned to the intersection of the outside boundary and the bottom line of the index finger.

Fig. 5. DLSP acquisition.
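The assistant graphics are drawn on each preview frame before it is displayed. The following sketch only illustrates the idea; the positions of the two vertical lines and of the red assistant point are illustrative values, not the calibrated positions used by DLSP.

#include <opencv2/opencv.hpp>

// Draw two vertical assistant lines and the red assistant point on a preview
// frame. xLeft, xRight and pointY are illustrative, not calibrated, values.
void drawDlspOverlay(cv::Mat& frame) {
    int xLeft  = frame.cols * 35 / 100;
    int xRight = frame.cols * 65 / 100;
    int pointY = frame.rows * 70 / 100;
    cv::line(frame, cv::Point(xLeft, 0), cv::Point(xLeft, frame.rows),
             cv::Scalar(0, 255, 0), 2);
    cv::line(frame, cv::Point(xRight, 0), cv::Point(xRight, frame.rows),
             cv::Scalar(0, 255, 0), 2);
    cv::circle(frame, cv::Point(xRight, pointY), 5, cv::Scalar(0, 0, 255), cv::FILLED);
}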
2.3. Double-assistant-crosshair Method

The centers of the two crosshairs in the preview screen are the two assistant points, which are also the centers of the two assistant boxes. The user should stretch the five fingers and align the two key points, i.e., the valley bottom point between the index and middle fingers and the one between the ring and little fingers, to the assistant points, as shown in Fig. 6. The square ROI is localized with the two assistant points.

Fig. 6. DAC acquisition.
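Since the two key points are aligned to the assistant points, the square ROI can be placed relative to the segment joining them. The sketch below illustrates this idea; the offset and scale factors are illustrative, while the exact geometry used by DAC is defined in [14].

#include <opencv2/opencv.hpp>
#include <cmath>

// Localize a square ROI from the two assistant points p1 and p2 (the two
// aligned finger-valley points). The offset and scale factors are illustrative.
cv::Mat cropDacRoi(const cv::Mat& image, cv::Point2f p1, cv::Point2f p2) {
    cv::Point2f mid = 0.5f * (p1 + p2);
    cv::Point2f dir = p2 - p1;
    float dist  = std::sqrt(dir.x * dir.x + dir.y * dir.y);
    float angle = std::atan2(dir.y, dir.x) * 180.f / CV_PI;

    // Rotate the image so that the two key points become horizontal.
    cv::Mat rot = cv::getRotationMatrix2D(mid, angle, 1.0);
    cv::Mat aligned;
    cv::warpAffine(image, aligned, rot, image.size());

    // Take a square below the key-point line; side and offset scale with dist.
    float side = 1.2f * dist, offset = 0.2f * dist;
    cv::Rect roi(cvRound(mid.x - side / 2), cvRound(mid.y + offset),
                 cvRound(side), cvRound(side));
    roi &= cv::Rect(0, 0, aligned.cols, aligned.rows);
    return aligned(roi).clone();
}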
2.4. Feature Extraction and Matching

The ROI is zoomed to a uniform size of 128×128. Some state-of-the-art palmprint features can then be extracted from the ROI and matched for palmprint recognition, such as Palm Code, Fusion Code, Competitive Code, Ordinal Code, Robust Line Orientation Code, Binary Orientation Co-occurrence Vector, Extended Binary Orientation Co-occurrence Vector, etc. [15]. The palmprint features can be selected according to the balance between accuracy and computational complexity.
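As an example, a Competitive-Code-style feature can be sketched as follows: each pixel of the 128×128 ROI is assigned the index of the Gabor orientation with the strongest (most negative) filter response, and two code maps are matched by their normalized angular distance. The Gabor parameters below are illustrative and not those of any particular published implementation.

#include <opencv2/opencv.hpp>
#include <algorithm>
#include <cstdlib>
#include <vector>

// Competitive-Code-style feature: per-pixel index of the dominant line
// orientation among six real Gabor filters. Parameters are illustrative.
cv::Mat competitiveCode(const cv::Mat& roiGray) {
    const int numOrient = 6;
    cv::Mat roi32f;
    roiGray.convertTo(roi32f, CV_32F);
    std::vector<cv::Mat> responses(numOrient);
    for (int k = 0; k < numOrient; ++k) {
        double theta = k * CV_PI / numOrient;
        cv::Mat kernel = cv::getGaborKernel(cv::Size(35, 35), 5.0, theta,
                                            12.0, 0.5, 0, CV_32F);
        cv::filter2D(roi32f, responses[k], CV_32F, kernel);
    }
    cv::Mat code(roiGray.size(), CV_8U);
    for (int y = 0; y < code.rows; ++y)
        for (int x = 0; x < code.cols; ++x) {
            int best = 0;                              // winner-take-all rule:
            for (int k = 1; k < numOrient; ++k)        // most negative response
                if (responses[k].at<float>(y, x) < responses[best].at<float>(y, x))
                    best = k;
            code.at<uchar>(y, x) = static_cast<uchar>(best);
        }
    return code;
}

// Normalized angular distance between two code maps (0 means identical).
double matchCodes(const cv::Mat& a, const cv::Mat& b) {
    const int numOrient = 6;
    double dist = 0;
    for (int y = 0; y < a.rows; ++y)
        for (int x = 0; x < a.cols; ++x) {
            int d = std::abs(a.at<uchar>(y, x) - b.at<uchar>(y, x));
            dist += std::min(d, numOrient - d);        // circular distance
        }
    return dist / (a.total() * (numOrient / 2));
}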

III. SYSTEM DESIGN

This section presents the design of the system, including the development environments, the functional module hierarchy, and the flow charts of registration and authentication.

3.1. Development Environments

The development environments are shown in Table 1.

Table 1. Development environments.
Hardware configuration
CPU: i5-4200U
Memory: 4 GB
Camera resolution: 640 × 480
Software configuration
Operating system: Windows 10
IDE: Visual Studio 2015
Development language: C++
Framework: OpenCV, QT
Database: MySQL
Development equipment: ASUS A555L
Commissioning equipment: ASUS A555L
3.2. Functional Module

The functional module hierarchy is shown in Fig. 7.

Fig. 7. Functional module hierarchy.
3.3. Flow Chart

The flow charts of registration/re-registration and authentication/attendance check are shown in Fig. 8 and Fig. 9, respectively. If a user re-registers, he/she has to input his/her ID that has already been registered in the system, and then pass the authentication to obtain the authority for the information update. During re-registration, a new palmprint template replaces the previous one in the database.

Fig. 8. Flow chart of registration/re-registration.
Fig. 9. Flow chart of authentication/attendance check.
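The decision logic of Fig. 8 can be summarized by the following control-flow sketch; userExists, verifyPalmprint and saveTemplate are hypothetical helpers standing for the database lookup, the authentication of Fig. 9, and the template storage, respectively.

#include <opencv2/opencv.hpp>
#include <string>

// Hypothetical helpers backed by the MySQL database (declarations only).
bool userExists(const std::string& id);
bool verifyPalmprint(const std::string& id);
void saveTemplate(const std::string& id, const cv::Mat& tpl, bool replace);

// Control-flow sketch of registration/re-registration (Fig. 8).
bool handleRegister(const std::string& id, const cv::Mat& newTemplate) {
    if (userExists(id)) {
        // Re-registration: the user must pass the authentication first.
        if (!verifyPalmprint(id))
            return false;                                  // no update allowed
        saveTemplate(id, newTemplate, /*replace=*/true);   // overwrite old template
    } else {
        saveTemplate(id, newTemplate, /*replace=*/false);  // new registration
    }
    return true;
}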

IV. SYSTEM DEVELOPMENT

4.1. User Graphical Interface
(1) Main interface

The main interface is shown in Fig. 10.

Fig. 10. Main interface.

The controls on the main interface are introduced in Table 2.

Table 2. Controls on the main interface.
Record: Query the attendance time records.
Register: Register new users.
Authenticate/Attendance check: Authenticate existing users and record their attendance time.
ID input box: Input the ID for registration/re-registration or authentication/attendance check.
Date box: Display the date.
Pull-down menu: Select an assistant localization method.

The three assistant methods are the alternatives for ROI localization, as shown in Fig. 11.

Fig. 11. Assistant methods for ROI localization.
(2) Registration/re-registration interface

As shown in Fig. 12, if a registered ID is input and the registration button is clicked on the main interface, it means that the user would like to re-register, i.e., to update his/her palmprint template; in this case, he/she has to pass the authentication first.

Fig. 12. Prompt box for re-registration.

Fig. 13 shows the registration interface. The “Capture” button captures the palmprint image. The left-right hand switch button is provided on the registration interface of DLSP. The “Open” and “Close” buttons launch and turn off the camera, respectively. Both the prompt box shown in Fig. 14 and a voice prompt notify the user once his/her registration is completed successfully.

Fig. 13. Registration/Re-registration interface.
Fig. 14. Prompt box for successful registration.
(3) Authentication/attendance check interface

The interface of authentication/attendance check is similar to that of registration/re-registration, so only the authentication/attendance check interface for NAG is shown in Fig. 15. The difference is that there is no “Capture” button on the authentication/attendance check interface. The frames of the video stream are automatically captured in real time for palmprint authentication until the authentication is passed or the timer expires. The processing time of each frame is about 0.3 seconds, so the real-time requirement is met.

Fig. 15. Authentication/attendance check interface.
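The automatic capture-and-match loop can be sketched as follows; localizeRoi, extractFeature and matchAgainstTemplate are hypothetical wrappers around the stages of Section II, and the 30-second timeout is an illustrative value.

#include <opencv2/opencv.hpp>
#include <chrono>
#include <string>

// Hypothetical wrappers around the localization, feature extraction and
// matching stages described in Section II (declarations only).
bool localizeRoi(const cv::Mat& frame, cv::Mat& roi);
cv::Mat extractFeature(const cv::Mat& roi);
bool matchAgainstTemplate(const std::string& id, const cv::Mat& feature);

// Capture frames until the authentication passes or the timer expires.
bool authenticate(const std::string& id, cv::VideoCapture& cam) {
    using clock = std::chrono::steady_clock;
    auto deadline = clock::now() + std::chrono::seconds(30);  // illustrative timeout
    cv::Mat frame, roi;
    while (clock::now() < deadline) {
        if (!cam.read(frame) || frame.empty())
            continue;
        if (!localizeRoi(frame, roi))              // about 0.3 s per frame
            continue;
        if (matchAgainstTemplate(id, extractFeature(roi)))
            return true;                           // authentication passed
    }
    return false;                                  // timer expired
}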

If the user’s palmprint template is not found in the database, the system prompts the user to register his/her palmprint template, as shown in Fig. 16. The authentication results are shown in Fig. 17. The attendance date and time are recorded and stored in the database if the authentication is passed.

Fig. 16. Prompt box for palmprint template registration.
Fig. 17. Prompt box for authentication results.
(4) Query record interface

Clicking the “Record” button on the main interface opens the pop-up query window. After an ID is input and the “Query” button is clicked, the attendance records, including attendance dates and times, are displayed, as shown in Fig. 18.

Fig. 18. Attendance record query.
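The record query can be implemented with the standard QtSql query API over the database connection described in the next subsection; the table and column names below (attendance, user_id, check_date, check_time) are hypothetical.

#include <QSqlQuery>
#include <QString>
#include <QStringList>
#include <QVariant>

// Query the attendance records of one user over the default database
// connection. Table and column names are hypothetical.
QStringList queryAttendance(const QString& userId) {
    QStringList records;
    QSqlQuery query;
    query.prepare("SELECT check_date, check_time FROM attendance WHERE user_id = :id");
    query.bindValue(":id", userId);
    if (query.exec())
        while (query.next())
            records << query.value(0).toString() + " " + query.value(1).toString();
    return records;
}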
(5) Database

Because the QT toolkit provides a convenient access interface and efficient data processing for the MySQL database, MySQL is used to store all user data, including IDs, palmprint templates, and attendance times and dates.

The QT toolkit provides the QtSql module, which enables the connection to and usage of various databases. First, the database driver is set through the QtSql database class; the instance defined by this class represents the connection to a database. After the driver instance is initialized, several configurations need to be set, including the IP address, database name, port, username and password for login. For the normal operation of the database, the login with the highest-privilege root account is used for the database connection in this system. The database is connected when the program is launched, and the user is notified by the prompt box shown in Fig. 19.

Fig. 19. Prompt box for successful database connection.
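The connection setup described above corresponds to the standard QtSql API; the host, port, database name and credentials below are placeholders.

#include <QSqlDatabase>
#include <QSqlError>
#include <QDebug>

// Connect to the MySQL database through QtSql when the program is launched.
// Host, port, database name and credentials are placeholders.
bool connectDatabase() {
    QSqlDatabase db = QSqlDatabase::addDatabase("QMYSQL");   // MySQL driver
    db.setHostName("127.0.0.1");
    db.setPort(3306);
    db.setDatabaseName("attendance");
    db.setUserName("root");          // root login, as used in this system
    db.setPassword("password");
    if (!db.open()) {
        qDebug() << "Database connection failed:" << db.lastError().text();
        return false;
    }
    return true;                     // the success prompt box can be shown here
}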

V. CONCLUSIONS AND FUTURE WORK

The developed non-contact palmprint recognition system works well on the PC platform. The users can flexibly select one of the three palmprint localization methods, namely DLSP, DAC and NAG, according to their preferences. The three palmprint localization methods have good operating efficiency and meet the performance requirements. The developed system can be used for both authentication and attendance check. The accuracy of NAG depends on the performance of the skin-color model, so it possibly fails in complex backgrounds. How to segment the hand region more accurately from a complex background will be intensively studied in future work. In addition, the computational complexity can be further reduced to improve the real-time performance.

Acknowledgement

This work was supported by National Natural Science Foundation of China (61741312, 61866028, 6181101039, 41801288, 61866025, 61763033, 61663031, 61662049), Key Program Project of Research and Development (Jiangxi Provincial Department of Science and Technology) (20171ACE50024, 20161BBE50085), Application Innovation Plan (Ministry of Public Security of P. R. China) (2017YYCXJXST048), Open Foundation of Key Laboratory of Jiangxi Province for Image Processing and Pattern Recognition (ET201680245, TX201604002).

REFERENCES

[1] A. Kumar, “Toward more accurate matching of contactless palmprint images under less constrained environments,” IEEE Transactions on Information Forensics and Security, vol. 14, no. 1, pp. 34-47, Jan. 2019.

[2] Y. Xu, L. K. Fei, J. Wen, and D. Zhang, “Discriminative and robust competitive code for palmprint recognition,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 48, no. 2, pp. 232-241, Feb. 2018.

[3] W. Jia, B. Zhang, J. T. Lu, Y. H. Zhu, Y. Zhao, W. M. Zuo, and H. B. Ling, “Palmprint recognition based on complete direction representation,” IEEE Transactions on Image Processing, vol. 26, no. 9, pp. 4483-4498, May 2017.

[4] L. Zhang, L. D. Li, A. Q. Yang, Y. Shen, and M. Yang, “Towards contactless palmprint recognition: a novel device, a new benchmark, and a collaborative representation based identification approach,” Pattern Recognition, vol. 69, pp. 199-212, Sep. 2017.

[5] L. Leng, M. Li, C. S. Kim, and X. Bi, “Dual-source discrimination power analysis for multi-instance contactless palmprint recognition,” Multimedia Tools and Applications, vol. 76, no. 1, pp. 333-354, Jan. 2017.

[6] L. Leng, M. Li, and A. B. J. Teoh, “Conjugate 2DPalmHash code for secure palm-print-vein verification,” in Proceedings of the 6th International Congress on Image and Signal Processing, Hangzhou, pp. 1694-1699, Dec. 2013.

[7] J. Almaghtuf and F. Khelifi, “Self-geometric relationship filter for efficient SIFT key-points matching in full and partial palmprint recognition,” IET Biometrics, vol. 7, no. 4, pp. 296-304, Jun. 2018.

[8] M. G. K. Ong, T. Connie, and A. B. J. Teoh, “Touch-less palmprint biometrics: novel design and implementation,” Image and Vision Computing, vol. 26, no. 12, pp. 1551-1560, Dec. 2008.

[9] Y. B. Tang, W. Bu, X. Q. Wu, and K. Q. Wang, “Key points localization methods of contactless palmprint recognition technology,” Intelligent Computer and Applications, vol. 1, no. 1, pp. 62-65, Jun. 2011.

[10] L. Leng, G. Liu, M. Li, and M. K. Khan, “Logical conjunction of triple-perpendicular-directional translation residual for contactless palmprint preprocessing,” in Proceedings of the 11th International Conference on Information Technology: New Generations, Las Vegas, pp. 523-528, Jun. 2014.

[11] M. Aykut and M. Ekinci, “Developing a contactless palmprint authentication system by introducing a novel ROI extraction method,” Image and Vision Computing, vol. 40, pp. 65-74, Aug. 2015.

[12] H. Javidnia, A. Ungureanu, C. Costache, and P. Corcoran, “Palmprint as a smartphone biometric,” in Proceedings of the IEEE International Conference on Consumer Electronics, Las Vegas, pp. 463-466, Mar. 2016.

[13] L. Leng, F. M. Gao, Q. Chen, and C. S. Kim, “Palmprint recognition system on mobile devices with double-line-single-point assistance,” Personal and Ubiquitous Computing, vol. 22, no. 1, pp. 93-104, Jan. 2018.

[14] K. S. Cao and L. Leng, “Preprocessing based on double-point auxiliary for palmprint recognition on mobile devices,” Journal of Optoelectronics·Laser, vol. 29, no. 2, pp. 205-211, Feb. 2018.

[15] L. Leng, A. B. J. Teoh, and M. Li, “Simplified 2DPalmHash code for secure palmprint verification,” Multimedia Tools and Applications, vol. 76, no. 6, pp. 8373-8398, Mar. 2017.