
Transfer Learning-Based Ethnicity Recognition Using Arbitrary Images Captured Through Diverse Imaging Sensors

Hasti Soudbakhsh, Sonjoy Ranjon Das, Bilal Hassan*, Muhammad Farooq Wasiq

*Corresponding author for this work

Research output: Contribution to journalArticlepeer-review

Abstract

Ethnicity recognition has become increasingly important for a wide range of applications, highlighting the need for accurate and robust predictive models. Despite advances in machine learning, ethnicity classification remains a challenging research problem due to variations in facial features, class imbalance, and generalization issues. Rather than a survey, this study provides a concise synthesis of prior work to motivate the problem and then introduces a novel experimental framework for ethnicity recognition, proposing an improved approach that leverages transfer learning to enhance classification performance. By including images captured with a variety of imaging sensors, from professional to consumer-grade devices and under a range of real-world conditions, the proposed methodology allows an examination of how these sensors affect the performance of facial recognition systems. The UTKFace dataset is used to train and validate the method, and an additional balanced Test Celebrities Faces dataset was created, representing five ethnic groups (Black, Asian, White, Indian, and Other); the "Other" class was excluded from the final evaluations to eliminate ambiguity and enhance stability. Both datasets were rigorously preprocessed for optimal feature extraction from the sensor-acquired images, and several pre-trained CNN (Convolutional Neural Network) models (VGG16, VGG19, ResNet50, DenseNet169, MobileNetV2, InceptionV3, and EfficientNetB4) were evaluated to identify an ideal hyperparameter configuration for optimal performance.
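The abstract does not specify the exact preprocessing pipeline, but a minimal sketch of the kind of preparation typically applied before feeding sensor-acquired images to a pre-trained CNN (resizing to the network's input resolution and scaling pixel values) might look as follows; the 224×224 target size and [0, 1] scaling are assumptions, not details taken from the paper:

```python
import tensorflow as tf

def preprocess(image, label):
    """Resize an RGB face image to the CNN input size and scale pixels to [0, 1].

    Assumed settings (not from the paper): 224x224 input, min-max scaling.
    """
    image = tf.image.resize(image, [224, 224])
    image = tf.cast(image, tf.float32) / 255.0
    return image, label

# Example: one synthetic 200x180 RGB image standing in for a sensor capture
img = tf.random.uniform([200, 180, 3], maxval=255.0)
out, lbl = preprocess(img, 0)
```

A function with this signature can be mapped directly over a `tf.data.Dataset` of (image, label) pairs.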
Experimental results indicate that the VGG19 model achieved 87% validation accuracy and a maximum test accuracy of 75% on the primary Celebrity Faces dataset; across all five ethnic groups, VGG19 reached an overall accuracy of 87%, with per-class accuracies ranging from 51% to over 90%. This work demonstrates that leveraging transfer learning on imaging-sensor-captured images enables robust ethnicity classification with high accuracy and improved training efficiency relative to full model retraining. Furthermore, systematic hyperparameter optimization enhances model generalization and mitigates overfitting. Comparative experiments with recent state-of-the-art methods (2023–2025) further confirm that our optimized VGG19 model achieves competitive performance, reinforcing the effectiveness of the proposed reproducible and fairness-aware evaluation framework.
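The transfer-learning setup described in the abstract (a pre-trained VGG19 backbone with a new classification head for the five ethnic groups) can be sketched in Keras roughly as below. This is an illustrative reconstruction, not the authors' actual code: the head architecture, optimizer, and learning rate are assumptions, and `weights=None` is used here to avoid downloading the ImageNet weights (in practice one would pass `weights="imagenet"` for transfer learning):

```python
import tensorflow as tf

# Pre-trained VGG19 backbone without its original classifier head.
# Use weights="imagenet" in practice; weights=None avoids a download here.
base = tf.keras.applications.VGG19(
    weights=None, include_top=False, input_shape=(224, 224, 3)
)
base.trainable = False  # freeze the backbone: only the new head is trained

# New classification head for the five ethnic-group classes.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(5, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```

Freezing the backbone is what yields the training-efficiency gain the abstract mentions relative to full model retraining: only the small head's weights are updated, while the convolutional features learned on ImageNet are reused as-is.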
Original language: English
Article number: 886
Number of pages: 25
Journal: Sensors
Volume: 26
Issue number: 3
DOIs
Publication status: Published - 28 Jan 2026
Externally published: Yes

Keywords

  • imaging sensor
  • ethnicity recognition
  • transfer learning
  • convolutional neural networks
  • deep learning
  • facial image classification
  • UTKFace dataset
  • image pre-processing
  • hyperparameter tuning
  • generalization
