Illumination-Based Data Augmentation for Robust Background Subtraction

Dimitrios Sakkos, Hubert P. H. Shum, Edmond S. L. Ho

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

A core challenge in background subtraction (BGS) is handling videos with sudden illumination changes between consecutive frames. In this paper, we tackle the problem from a data point of view, using data augmentation. Our method not only generates endless training data on the fly, but also applies semantic transformations of illumination that improve the generalisation of the model. It successfully simulates flashes and shadows by applying the Euclidean distance transform to a randomly generated binary mask. Such data allows us to effectively train an illumination-invariant deep learning model for BGS. Experimental results demonstrate the contribution of the synthetic data to the models' ability to perform BGS even when significant illumination changes take place.
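The flash/shadow idea described above can be sketched as follows. This is a hypothetical illustration, not the authors' released code: seed points of a random binary mask mark simulated light sources, the Euclidean distance transform yields a smooth field around them, and that field is turned into a multiplicative gain map (above 1 for a flash, below 1 for a shadow). Function names, the `strength` parameter, and the normalisation are assumptions for the sketch.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def illumination_map(h, w, n_seeds=3, strength=0.6, rng=None):
    """Smooth illumination field from the Euclidean distance transform
    of a randomly generated binary mask (sketch of the paper's idea)."""
    rng = np.random.default_rng(rng)
    mask = np.ones((h, w), dtype=np.uint8)
    ys = rng.integers(0, h, n_seeds)
    xs = rng.integers(0, w, n_seeds)
    mask[ys, xs] = 0  # seed points act as simulated light sources
    dist = distance_transform_edt(mask)  # distance to nearest seed
    dist = dist / dist.max()             # normalise to [0, 1]
    # Gain is largest (1 + strength) at the seeds and decays to ~1 far away.
    return 1.0 + strength * (1.0 - dist)

def augment(frame, flash=True, **kw):
    """Apply a simulated flash (brighten) or shadow (darken) to a frame."""
    gain = illumination_map(*frame.shape[:2], **kw)
    if not flash:
        gain = 1.0 / gain  # shadow: invert the gain to darken
    if frame.ndim == 3:
        gain = gain[..., None]  # broadcast over colour channels
    return np.clip(frame.astype(np.float64) * gain, 0, 255).astype(np.uint8)
```

Because the mask is resampled on every call, each training frame receives a different illumination pattern, which is what lets the augmentation produce effectively endless data on the fly.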
Original language: English
Title of host publication: Proceedings of the 2019 13th International Conference on Software, Knowledge, Information Management and Applications (SKIMA)
Subtitle of host publication: Island of Ulkulhas, Maldives, 26-28 August 2019
Place of publication: Piscataway, NJ
Publisher: IEEE
Number of pages: 8
ISBN (Electronic): 9781728127415, 9781728127408
ISBN (Print): 9781728127422
DOIs
Publication status: Published - Aug 2019
Event: 13th International Conference on Software, Knowledge, Information Management and Applications - Ulkulhas, Maldives
Duration: 26 Aug 2019 - 28 Aug 2019
http://skimanetwork.info/

Conference

Conference: 13th International Conference on Software, Knowledge, Information Management and Applications
Abbreviated title: SKIMA 2019
Country: Maldives
City: Ulkulhas
Period: 26/08/19 - 28/08/19
Internet address: http://skimanetwork.info/

