Abstract
This paper presents an algorithm to identify digital illiterates by analyzing age and emotion through facial recognition. Here, digital illiterates refers to people who struggle to use digital devices. The study assumed that older individuals who display surprised or angry facial expressions while using digital devices are more likely to be digitally illiterate. For age detection, the study used MTCNN (Multi-task Cascaded Convolutional Networks), and for emotion detection, it employed the VGG-Face model. MTCNN detects facial features and landmarks to preprocess images and distinguish facial characteristics. The VGG-Face model uses convolution operations to analyze facial images and classify emotional states. The dataset consisted of 3,000 facial images collected from the internet. The research team categorized the images into faces of individuals aged over 50, angry expressions, and surprised expressions: 411 individuals (13.7%) were aged over 50, 163 (5.4%) showed angry expressions, and 145 (4.8%) showed surprised expressions. Accuracy was estimated by comparing the results of the DeepFace algorithm with the research team's classifications. The DeepFace algorithm achieved 95.77% accuracy in detecting individuals aged over 50, 83.45% accuracy for surprise, and 76.07% for anger. The results demonstrate that digital illiterates can be identified from their age and emotional expressions, which could enable the development of personalized services that directly or indirectly support digital illiterates and improve digital accessibility.
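The pipeline described above can be approximated with the open-source deepface Python package, which bundles MTCNN as a face-detector backend and VGG-Face as its default analysis backbone. The sketch below is an assumption-laden illustration, not the authors' code: the 50-year age threshold and the angry/surprise emotion set are taken from the abstract, while the file name and helper function are hypothetical.

```python
# Minimal sketch (assumed, not the authors' implementation): flag a face as a
# potential digital illiterate when the estimated age exceeds 50 and the
# dominant emotion is anger or surprise, following the rule in the abstract.
# Requires: pip install deepface
from deepface import DeepFace

AGE_THRESHOLD = 50                       # "aged over 50" per the abstract
TARGET_EMOTIONS = {"angry", "surprise"}  # emotion labels used by DeepFace


def flag_digital_illiterate(img_path: str) -> list[dict]:
    """Return one record per detected face with an age/emotion-based flag."""
    # MTCNN is chosen as the detector backend to match the paper's setup;
    # emotion analysis runs on DeepFace's built-in facial attribute models.
    faces = DeepFace.analyze(
        img_path=img_path,
        actions=["age", "emotion"],
        detector_backend="mtcnn",
        enforce_detection=False,  # do not raise an error if no face is found
    )
    records = []
    for face in faces:  # analyze() returns a list of per-face dictionaries
        age = face["age"]
        emotion = face["dominant_emotion"]
        records.append({
            "age": age,
            "dominant_emotion": emotion,
            "flagged": age > AGE_THRESHOLD and emotion in TARGET_EMOTIONS,
        })
    return records


if __name__ == "__main__":
    # "sample_face.jpg" is a placeholder path for illustration only.
    for record in flag_digital_illiterate("sample_face.jpg"):
        print(record)
```

As a design note, thresholding a single frame in this way mirrors the abstract's assumption; in practice one would likely aggregate age and emotion estimates over multiple frames before flagging a user.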
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 1496-1503 |
| Number of pages | 8 |
| Journal | International Journal on Advanced Science, Engineering and Information Technology |
| Volume | 14 |
| Issue number | 5 |
| DOIs | |
| State | Published - 2024 |
Keywords
- Deep learning
- DeepFace
- digital illiterate
- emotion recognition
- face recognition