Evolving Deep DenseBlock Architecture Ensembles for Image Classification

Ben Fielding, Li Zhang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Automatic deep architecture generation is a challenging task, owing to the large number of controlling parameters inherent in the construction of deep networks. The combination of these parameters produces large, complex search spaces that are practically impossible to navigate properly without substantial resources for parallelisation. To deal with such challenges, in this research we propose a Swarm Optimised DenseBlock Architecture Ensemble (SODBAE) method, a joint optimisation and training process that explores a constrained search space over a skeleton DenseBlock Convolutional Neural Network (CNN) architecture. Specifically, we employ novel weight inheritance learning mechanisms, a DenseBlock skeleton architecture, and adaptive Particle Swarm Optimisation (PSO) with cosine search coefficients to devise networks while maintaining practical computational costs. Moreover, the architecture design takes advantage of recent advances in residual connections and dense connectivity, in order to yield CNN models with a much wider variety of structural variations. The proposed weight inheritance learning schemes perform joint optimisation and training of the architectures to reduce computational costs. Evaluated on the CIFAR-10 dataset, the proposed model outperforms other state-of-the-art methods in classification performance while demonstrating greater versatility in architecture generation.
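The abstract mentions adaptive PSO with cosine search coefficients. As a rough illustration of that idea (not the paper's exact formulation), the sketch below applies a cosine-annealed schedule to the cognitive and social acceleration coefficients of a standard PSO velocity update; the coefficient ranges, inertia weight, and schedule direction are illustrative assumptions.

```python
import math
import random

def cosine_coefficient(t, t_max, c_min=0.5, c_max=2.5):
    """Cosine-annealed acceleration coefficient, decaying from c_max to c_min.
    The range (0.5, 2.5) is an illustrative assumption, not the paper's values."""
    return c_min + 0.5 * (c_max - c_min) * (1 + math.cos(math.pi * t / t_max))

def pso_step(positions, velocities, pbest, gbest, t, t_max, w=0.7):
    """One PSO iteration with cosine-scheduled coefficients.

    The cognitive coefficient c1 decays over time (favouring exploration early),
    while the social coefficient c2 grows (favouring convergence late) -- a
    common adaptive-PSO pattern, assumed here for illustration.
    """
    c1 = cosine_coefficient(t, t_max)          # cognitive term: decays
    c2 = cosine_coefficient(t_max - t, t_max)  # social term: grows
    new_positions, new_velocities = [], []
    for x, v, p in zip(positions, velocities, pbest):
        r1, r2 = random.random(), random.random()
        v_new = [w * vi + c1 * r1 * (pi - xi) + c2 * r2 * (gi - xi)
                 for vi, xi, pi, gi in zip(v, x, p, gbest)]
        new_velocities.append(v_new)
        new_positions.append([xi + vi for xi, vi in zip(x, v_new)])
    return new_positions, new_velocities
```

In an architecture-search setting, each particle position would encode a candidate DenseBlock configuration; here positions are plain real-valued vectors to keep the sketch self-contained.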
Original language: English
Article number: 1880
Pages (from-to): 1-31
Number of pages: 31
Issue number: 11
Publication status: Published - 9 Nov 2020

