TY - JOUR
T1 - DK-CNNs
T2 - Dynamic kernel convolutional neural networks
AU - Liu, Jialin
AU - Chao, Fei
AU - Lin, Chih-Min
AU - Zhou, Changle
AU - Shang, Changjing
N1 - Funding Information:
The authors are very grateful to the anonymous reviewers for their constructive comments which have helped significantly in revising this work. This work was supported by the National Natural Science Foundation of China (No. 61673322, 61673326, and 91746103), the Fundamental Research Funds for the Central Universities (No. 20720190142), and the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No. 663830.
Publisher Copyright:
© 2020 Elsevier B.V.
PY - 2021/1/21
Y1 - 2021/1/21
N2 - This paper introduces dynamic kernel convolutional neural networks (DK-CNNs), an enhanced type of CNN that performs line-by-line scanning of regular convolution to generate a latent dimension of kernel weights. The proposed DK-CNN applies regular convolution to the DK weights, which depend on a latent variable, and discretizes the space of the latent variable to extend a new dimension; this process is named “DK convolution”. DK convolution increases the expressive capacity of the convolution operation without increasing the number of parameters, by searching for useful patterns within the newly extended dimension. In contrast to conventional convolution, which applies a fixed kernel to analyse changing features, DK convolution employs a dynamic kernel to analyse fixed features. In addition, DK convolution can replace a standard convolution layer in any CNN structure. The proposed DK-CNNs were compared with different network structures, with and without a latent dimension, on the CIFAR and FashionMNIST datasets. The experimental results show that DK-CNNs achieve better performance than regular CNNs.
AB - This paper introduces dynamic kernel convolutional neural networks (DK-CNNs), an enhanced type of CNN that performs line-by-line scanning of regular convolution to generate a latent dimension of kernel weights. The proposed DK-CNN applies regular convolution to the DK weights, which depend on a latent variable, and discretizes the space of the latent variable to extend a new dimension; this process is named “DK convolution”. DK convolution increases the expressive capacity of the convolution operation without increasing the number of parameters, by searching for useful patterns within the newly extended dimension. In contrast to conventional convolution, which applies a fixed kernel to analyse changing features, DK convolution employs a dynamic kernel to analyse fixed features. In addition, DK convolution can replace a standard convolution layer in any CNN structure. The proposed DK-CNNs were compared with different network structures, with and without a latent dimension, on the CIFAR and FashionMNIST datasets. The experimental results show that DK-CNNs achieve better performance than regular CNNs.
KW - Convolution kernel
KW - Convolutional neural networks
KW - Deep neural networks
UR - http://www.scopus.com/inward/record.url?scp=85093684582&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2020.09.005
DO - 10.1016/j.neucom.2020.09.005
M3 - Article
AN - SCOPUS:85093684582
SN - 0925-2312
VL - 422
SP - 95
EP - 108
JO - Neurocomputing
JF - Neurocomputing
ER -