Hand Gesture Recognition for Real-time Liveness Detection in Mobile Phone Applications Using Deep Learning


Woramet Lertsiwanont
Thitirat Siriborvornratanakul

Abstract


Recently, vision-based face identification systems have become popular in mobile phone applications because they offer an easy and non-intrusive way of identifying people. Despite their popularity, these systems can be easily tricked by 2D printed images or 3D printed models of faces. To prevent such attacks, most vision-based face identification systems enhance their security with additional liveness detection methods, whose purpose is to check whether the face seen by the camera belongs to a living person or to a non-living artifact such as a 2D printed image, a 3D printed model, or a statue. In this research, we conduct a feasibility study on using real-time hand gestures for liveness detection in smartphone-based face identification systems. Our work includes not only developing a robust deep learning model for real-time hand gesture recognition, but also creating a smartphone-based prototype application. This prototype application was tested with 40 smartphone users across a wide range of ages, allowing us to evaluate in-production technical efficiency, user satisfaction, and user acceptance.
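The core idea, using a randomized gesture prompt as a challenge that a printed photo or 3D-printed head cannot answer, can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's actual deep learning classifier: it assumes hand-landmark input in the style of MediaPipe Hands (cited as reference [8]), i.e., 21 (x, y) keypoints in image coordinates with y growing downward, and the landmark indices and the tip-above-joint heuristic are assumptions for an upright hand.

```python
# Hypothetical challenge-response liveness probe on hand landmarks.
# Assumes MediaPipe-Hands-style input: 21 (x, y) points, y grows downward.
FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def count_extended_fingers(landmarks):
    """Count non-thumb fingers whose tip lies above its PIP joint
    (a crude 'finger is extended' heuristic for an upright hand)."""
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def liveness_challenge_passed(landmarks, requested_fingers):
    """Ask the user to show N fingers, then verify the response.

    A static photo or 3D-printed model cannot react to a randomized
    gesture prompt, which is the premise of gesture-based liveness."""
    return count_extended_fingers(landmarks) == requested_fingers
```

In a real pipeline the heuristic would be replaced by the trained gesture recognition model, and the prompted gesture would be randomized per authentication attempt so a replayed video of a fixed gesture also fails.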


Article Details

Section
Applied Science Research Articles

References

[1] T. Brewster. (2020, September). We broke into a bunch of Android phones with a 3D-printed head. [Online]. Available: https://www.forbes.com/sites/thomasbrewster/2018/12/13/we-broke-into-a-bunch-of-android-phones-with-a-3d-printed-head/#36a4a5521330

[2] M. Nguyen. (2017, September). Vietnamese researcher shows iPhone X face ID ‘hack’, Reuters. [Online]. Available: https://www.reuters.com/article/us-apple-vietnam-hack-idUSKBN1DE1TH

[3] C. Kerdvibulvech, “Human hand motion recognition using an extended particle filter,” in Proceedings AMDO 2014: Articulated Motion and Deformable Objects, 2014, pp. 77–81.

[4] W. Abadi, M. Fezari, and R. Hamdi, “Bag of visual words and Chi-squared kernel support vector machine: A way to improve hand gesture recognition,” in Proceedings of the International Conference on Intelligent Information Processing, Security and Advanced Communication, 2015, pp. 1–5.

[5] W. Wu, M. Shi, T. Wu, D. Zhao, S. Zhang, and J. Li, “Real-time hand gesture recognition based on deep learning in complex environments,” presented at the Chinese Control And Decision Conference (CCDC), Nanchang, China, 2019.

[6] P. S. Neethu, R. Suguna, and D. Sathish, “An efficient method for human hand gesture detection and recognition using deep learning convolutional neural networks,” Soft Computing, vol. 24, pp. 15239–15248, 2020.

[7] M. Rungruanganukul and T. Siriborvornratanakul, “Deep learning based gesture classification for hand physical therapy interactive program,” in Proceedings HCII 2020: Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management. Posture, Motion and Health, 2020, pp. 349–358.

[8] V. Bazarevsky and F. Zhang. (2020, September). On-device, real-time hand tracking with MediaPipe. [Online]. Available: https://ai.googleblog.com/2019/08/on-device-real-time-hand-tracking-with.html

[9] C. Kerdvibulvech, “A methodology for hand and finger motion analysis using adaptive probabilistic models,” EURASIP Journal on Embedded Systems, vol. 18, 2014.

[10] V. Bazarevsky, Y. Kartynnik, A. Vakunov, K. Raveendran, and M. Grundmann, “BlazeFace: Sub-millisecond neural face detection on mobile GPUs,” presented at CVPR Workshop on Computer Vision for Augmented and Virtual Reality, Long Beach, CA, USA, 2019.