CNN-Based Model for Identity Recognition on Social Networks
DOI:
https://doi.org/10.63282/3050-9416.ICAIDSCT26-110

Keywords:
Accuracy, CNN, Identity Recognition, Model Training, Social Networks

Abstract
Identity recognition on social networks has become an essential technology for enhancing user authentication, preventing impersonation, and enabling personalised services, but it also raises concerns around privacy, bias, and security. As social platforms host billions of images and videos, accurately and fairly identifying individuals in this vast multimedia space requires advanced machine learning techniques coupled with strong data protection measures. In this research, a secure, accurate, and fairness-aware facial identity recognition system is designed, implemented, and evaluated using a Convolutional Neural Network (CNN) as the core deep learning model. The system integrates robust security measures, namely SHA-256 hashing and Fernet symmetric encryption, to ensure end-to-end privacy protection in compliance with modern data regulations. Fairness in predictions was addressed with the RandomOverSampler technique to balance the dataset, yielding equitable performance across simulated demographic groups, each achieving 0.97 in both accuracy and F1-score. The custom CNN architecture, featuring multiple convolutional layers, batch normalization, dropout regularization, and dense layers, was trained for 100 epochs on the widely used Labeled Faces in the Wild (LFW) dataset with augmentation, achieving a testing accuracy of 98.7%. The model's accuracy consistently improved while its loss decreased, indicating robust learning without overfitting. These results demonstrate the system's advantages over traditional methods, offering a scalable, secure, and ethically aligned solution suitable for privacy-sensitive domains such as healthcare, online authentication, and secure access control.
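The privacy layer described above combines SHA-256 hashing with Fernet symmetric encryption (the latter provided by the `cryptography` package). A minimal, stdlib-only sketch of the hashing step is shown below; the function name, salt value, and identifier format are illustrative assumptions, not details from the paper:

```python
import hashlib

def pseudonymise(user_id: str, salt: str = "app-salt") -> str:
    """Return the SHA-256 digest of a salted user identifier, so raw
    identifiers never appear in stored records (salt is illustrative;
    in practice it would be a per-deployment secret)."""
    return hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()

digest = pseudonymise("alice@example.com")
print(len(digest))  # SHA-256 hex digests are always 64 characters
```

Fernet encryption of the underlying image data would follow the same pattern via `cryptography.fernet.Fernet`; it is omitted here to keep the sketch dependency-free.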
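The fairness step balances class counts by duplicating randomly chosen minority-class samples, which is the idea behind imbalanced-learn's `RandomOverSampler`. A self-contained sketch of that behaviour, with toy data standing in for the demographic groups, assuming simple duplication to the majority-class count:

```python
import random
from collections import Counter

def random_oversample(samples, labels, seed=42):
    """Duplicate randomly chosen minority-class samples until every
    class matches the majority-class count (plain random oversampling,
    as done by imbalanced-learn's RandomOverSampler)."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    out_x, out_y = [], []
    for y, xs in by_class.items():
        extra = [rng.choice(xs) for _ in range(target - len(xs))]
        for x in xs + extra:
            out_x.append(x)
            out_y.append(y)
    return out_x, out_y

# Toy imbalanced groups: 6 samples of group "A", 2 of group "B".
X = list(range(8))
y = ["A"] * 6 + ["B"] * 2
Xb, yb = random_oversample(X, y)
print(Counter(yb))  # both groups end up with 6 samples
```

Balancing before training prevents the majority group from dominating the loss, which is what yields the uniform 0.97 accuracy and F1-score across groups reported above.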