DK-CNNs: Dynamic kernel convolutional neural networks

Jialin Liu, Fei Chao*, Chih Min Lin, Changle Zhou, Changjing Shang

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)
164 Downloads (Pure)

Abstract

This paper introduces dynamic kernel convolutional neural networks (DK-CNNs), an enhanced type of CNN in which a line-by-line scan of the regular convolution generates a latent dimension of kernel weights. The proposed DK-CNN applies regular convolution with DK weights that depend on a latent variable, and discretizes the space of that latent variable to extend a new dimension; this process is named "DK convolution". DK convolution increases the expressive capacity of the convolution operation without increasing the number of parameters, by searching for useful patterns within the newly extended dimension. In contrast to conventional convolution, which applies a fixed kernel to changing features, DK convolution applies a dynamic kernel to fixed features. In addition, DK convolution can replace a standard convolution layer in any CNN architecture. The proposed DK-CNNs were compared with different network structures, with and without the latent dimension, on the CIFAR and FashionMNIST datasets. The experimental results show that DK-CNNs achieve better performance than regular CNNs.
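The abstract's idea can be sketched as follows: make the kernel weights a function of a latent variable, discretize that variable into a grid of values, run a regular convolution once per value, and reduce over the resulting extra dimension. This is a minimal illustrative sketch, not the paper's exact formulation: the cosine modulation of the weights, the fixed per-weight phase offsets, and the max reduction over the latent dimension are all assumptions made here for concreteness.

```python
import numpy as np

def conv2d_valid(x, k):
    """Plain 'valid' 2-D cross-correlation (the 'regular convolution')."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def dk_conv2d(x, base_kernel, phase, num_phi=8):
    """DK convolution sketch: kernel weights depend on a latent variable phi,
    whose space is discretized into num_phi values, extending a new dimension
    that is then reduced (here by max) so the layer is a drop-in replacement
    for conv2d_valid. Cosine modulation and max reduction are assumptions."""
    phis = np.linspace(0.0, 2 * np.pi, num_phi, endpoint=False)
    # One kernel per discretized latent value -- same underlying parameters,
    # different phi, so the learnable parameter count does not grow with num_phi.
    stack = np.stack([conv2d_valid(x, base_kernel * np.cos(phase + p))
                      for p in phis])
    # Search the extended dimension for the most useful response.
    return stack.max(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((8, 8))
    k = rng.standard_normal((3, 3))             # shared base weights
    theta = rng.uniform(0, 2 * np.pi, (3, 3))   # fixed per-weight phase (assumption)
    y = dk_conv2d(x, k, theta)
    print(y.shape)  # same spatial shape as a regular 3x3 'valid' convolution: (6, 6)
```

The output keeps the shape of a regular convolution, so the layer can stand in for a standard convolution layer; at phi = 0 the stack contains exactly the regular convolution with the phase-modulated kernel, so the max over the latent dimension can never do worse than that single fixed kernel on any output position.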

Original language: English
Pages (from-to): 95-108
Number of pages: 14
Journal: Neurocomputing
Volume: 422
Early online date: 12 Sep 2020
Digital Object Identifiers (DOIs)
Status: Published - 21 Jan 2021
