IoTNet: An Efficient and Accurate Convolutional Neural Network for IoT Devices

Tom Lawrence, Li Zhang

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)
35 Downloads (Pure)

Abstract

Two main approaches exist when deploying a Convolutional Neural Network (CNN) on resource-constrained IoT devices: either scale a large model down or use a small model designed specifically for resource-constrained environments. Small architectures typically trade accuracy for computational cost by performing convolutions as depth-wise convolutions rather than the standard convolutions used in large networks. Large models focus primarily on state-of-the-art performance and often struggle to scale down sufficiently. We propose a new model, namely IoTNet, designed for resource-constrained environments, which achieves state-of-the-art performance within the domain of small efficient models. IoTNet trades accuracy against computational cost differently from existing methods by factorizing standard 3 × 3 convolutions into pairs of 1 × 3 and 3 × 1 standard convolutions, rather than performing depth-wise convolutions. We benchmark IoTNet against state-of-the-art efficiency-focused models and scaled-down large architectures on data sets that best match the complexity of problems faced in resource-constrained environments. We compare model accuracy and the number of floating-point operations (FLOPs) performed as a measure of efficiency. We report state-of-the-art accuracy improvements over MobileNetV2 on CIFAR-10 of 13.43% with 39% fewer FLOPs, over ShuffleNet on Street View House Numbers (SVHN) of 6.49% with 31.8% fewer FLOPs, and over MobileNet on the German Traffic Sign Recognition Benchmark (GTSRB) of 5% with 0.38% fewer FLOPs.
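
The key efficiency idea in the abstract is that a standard 3 × 3 convolution can be replaced by a 1 × 3 convolution followed by a 3 × 1 convolution, which keeps full cross-channel mixing (unlike depth-wise convolutions) while reducing the per-position multiply-accumulate count from roughly 9·C_in to roughly 6·C_in per output channel. The sketch below illustrates this factorization in PyTorch; the module name, channel sizes, and the batch-normalisation/ReLU placement are illustrative assumptions, not the authors' exact IoTNet block.

```python
import torch
import torch.nn as nn


class FactorizedConv(nn.Module):
    """Replace one standard 3x3 convolution with a 1x3 followed by a 3x1
    standard convolution (both still mix across all input channels).

    Per output position and output channel, a 3x3 conv costs about
    9 * C_in multiply-adds, while the 1x3 + 3x1 pair costs about
    3 * C_in + 3 * C_out, i.e. roughly 6 * C_in when C_out == C_in.
    Module name and BN/ReLU placement are illustrative assumptions.
    """

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv_1x3 = nn.Conv2d(in_ch, out_ch, kernel_size=(1, 3),
                                  padding=(0, 1), bias=False)
        self.conv_3x1 = nn.Conv2d(out_ch, out_ch, kernel_size=(3, 1),
                                  padding=(1, 0), bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.conv_1x3(x)
        x = self.conv_3x1(x)
        return self.act(self.bn(x))


if __name__ == "__main__":
    # A CIFAR-10-sized input: batch of 1, 3 channels, 32x32 pixels.
    x = torch.randn(1, 3, 32, 32)
    block = FactorizedConv(3, 16)
    print(block(x).shape)  # torch.Size([1, 16, 32, 32])
```

Unlike a depth-wise separable block, each of the two factorized convolutions here still spans every input channel, which is the trade-off the abstract describes: somewhat more computation than depth-wise convolutions of the same width, but more representational mixing per layer.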
Original language: English
Article number: 5541
Journal: Sensors
Volume: 19
Issue number: 24
DOIs
Publication status: Published - 14 Dec 2019
