Exploring Classification Models for Video Source Device Identification: A Study of CNN-SVM and Softmax Classifiers

Najmath Ottakath*, Younes Akbari, Somaya Almaadeed, Ahmed Bouridane, Fouad Khelifi

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review



Video source device identification plays a crucial role in video forensics, as the proliferation of video capturing devices has given rise to crimes involving videos that are challenging to trace. Relying on metadata extraction is insufficient, since metadata can be corrupted or manipulated to conceal the source of the crime. Another technique employed for source identification is noise pattern extraction, which generates a unique fingerprint for the video camera. However, this method is susceptible to capture faults and can produce diverse noise patterns for each video. Addressing these challenges requires distinctive features that remain consistent across all videos captured by the same camera, which has led to the adoption of computer vision techniques utilizing machine learning and deep learning. Classifiers play a crucial role in machine learning and data analysis, as they are responsible for categorizing or predicting results based on input data. Our experiments show that this task is sensitive to the choice of classifier, and that developing a good classifier or classifier-level fusion can improve results in practice across all datasets.
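The abstract contrasts an SVM classification head with a softmax classifier applied to learned features. A minimal sketch of such a comparison is shown below, using synthetic feature vectors as stand-ins for real CNN embeddings and scikit-learn's `SVC` and `LogisticRegression` (a softmax/multinomial classifier) as illustrative substitutes for the paper's actual models; the device count, feature dimensionality, and data are assumptions for illustration only:

```python
# Hypothetical sketch: comparing an SVM head vs. a softmax head on
# CNN-style feature vectors. The features are synthetic Gaussian
# clusters, one per simulated source device, not real embeddings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_devices, per_device, dim = 4, 50, 64

# One feature cluster per device (stand-in for per-camera CNN features)
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(per_device, dim))
               for i in range(n_devices)])
y = np.repeat(np.arange(n_devices), per_device)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# SVM head (RBF kernel) vs. softmax head (multinomial logistic regression)
svm_head = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
softmax_head = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("SVM accuracy:    ", svm_head.score(X_te, y_te))
print("Softmax accuracy:", softmax_head.score(X_te, y_te))
```

On well-separated features both heads perform similarly; the practical differences the paper studies arise on real, noisier camera features, where the choice of classifier (and classifier-level fusion) matters.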
Original language: English
Title of host publication: 2023 International Symposium on Networks, Computers and Communications (ISNCC)
Place of publication: Piscataway, US
Number of pages: 6
ISBN (Electronic): 9798350335590
ISBN (Print): 9798350335606
Publication status: Published - 23 Oct 2023
Event: ISNCC2023: International Symposium on Networks, Computers and Communications - Hamad Bin Khalifa University, Doha, Qatar
Duration: 23 Oct 2023 - 26 Oct 2023

Publication series

Name: International Symposium on Networks, Computers and Communications (ISNCC)
ISSN (Print): 2472-4386
ISSN (Electronic): 2768-0940


