
Biased Dropout and Crossmap Dropout: Learning towards effective Dropout regularization in convolutional neural network.

Alvin Poernomo, Dae-Ki Kang.

Abstract

Training a deep neural network with a large number of parameters often leads to the overfitting problem. Recently, Dropout has been introduced as a simple yet effective regularization approach to combat overfitting in such models. Although Dropout has shown remarkable results in many deep neural network cases, its actual effect on CNNs has not been thoroughly explored. Moreover, training a Dropout model significantly increases training time, as it takes longer to converge than a non-Dropout model with the same architecture. To deal with these issues, we propose Biased Dropout and Crossmap Dropout, two novel Dropout extensions based on the behavior of hidden units in a CNN model. Biased Dropout divides the hidden units in a given layer into two groups based on their magnitude and applies a different Dropout rate to each group. Hidden units with higher activation values, which contribute more to the network's final performance, are retained with a lower Dropout rate, while units with lower activation values are exposed to a higher Dropout rate to compensate. The second approach, Crossmap Dropout, is an extension of regular Dropout in the convolutional layer. The feature maps in a convolutional layer are strongly correlated with one another, particularly at identical pixel locations across the maps. Crossmap Dropout preserves this important correlation while breaking the correlation between adjacent pixels across all feature maps: it applies the same Dropout mask to every feature map, so that all units in equivalent positions in each feature map are either dropped or active together during training. Our experiments on various benchmark datasets show that our approaches provide better generalization than regular Dropout.
Moreover, our Biased Dropout converges faster during the training phase, suggesting that assigning noise appropriately to hidden units can lead to effective regularization.
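The two mechanisms described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' exact formulation: the median split, the particular rate values, and the function names are assumptions for exposition.

```python
import numpy as np

def biased_dropout(activations, p_high=0.3, p_low=0.7, rng=None):
    """Sketch of Biased Dropout: units are split into a high-magnitude
    and a low-magnitude group (here: by the median, an assumed criterion),
    and the low group is dropped more aggressively (p_low > p_high)."""
    rng = np.random.default_rng(rng)
    high = activations >= np.median(activations)
    # Per-unit keep probability: higher for high-magnitude units.
    keep_prob = np.where(high, 1.0 - p_high, 1.0 - p_low)
    mask = rng.random(activations.shape) < keep_prob
    # Inverted-dropout scaling keeps the expected activation unchanged.
    return activations * mask / keep_prob

def crossmap_dropout(fmaps, p=0.5, rng=None):
    """Sketch of Crossmap Dropout: one spatial mask is shared by all
    feature maps, so units at the same pixel location are dropped or
    kept together across maps. fmaps has shape (channels, H, W)."""
    rng = np.random.default_rng(rng)
    _, h, w = fmaps.shape
    mask = (rng.random((h, w)) < (1.0 - p)).astype(fmaps.dtype)
    # Broadcast the single (H, W) mask over the channel axis.
    return fmaps * mask[None, :, :] / (1.0 - p)
```

Sharing one mask across the channel axis is what distinguishes this from regular Dropout, which would draw an independent mask per feature map and thus break the cross-map correlation the paper aims to preserve.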
Copyright © 2018 Elsevier Ltd. All rights reserved.

Keywords:  Convolutional neural network; Dropout; Regularization

Year:  2018        PMID: 29715684     DOI: 10.1016/j.neunet.2018.03.016

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


  9 in total

1.  Comparison of Convolutional Neural Network Architectures and their Influence on Patient Classification Tasks Relating to Altered Mental Status.

Authors:  Kevin Gagnon; Tami L Crawford; Jihad Obeid
Journal:  Proceedings (IEEE Int Conf Bioinformatics Biomed)       Date:  2021-01-13

2.  An intelligent approach for Arabic handwritten letter recognition using convolutional neural network.

Authors:  Zahid Ullah; Mona Jamjoom
Journal:  PeerJ Comput Sci       Date:  2022-05-27

3.  Development of Machine-Learning Model to Predict COVID-19 Mortality: Application of Ensemble Model and Regarding Feature Impacts.

Authors:  Seung-Min Baik; Miae Lee; Kyung-Sook Hong; Dong-Jin Park
Journal:  Diagnostics (Basel)       Date:  2022-06-14

4.  Optimized splitting of mixed-species RNA sequencing data.

Authors:  Xuan Song; Hai Yun Gao; Karl Herrup; Ronald P Hart
Journal:  J Bioinform Comput Biol       Date:  2022-01-06       Impact factor: 1.204

5.  Few-shot pulse wave contour classification based on multi-scale feature extraction.

Authors:  Peng Lu; Chao Liu; Xiaobo Mao; Yvping Zhao; Hanzhang Wang; Hongpo Zhang; Lili Guo
Journal:  Sci Rep       Date:  2021-02-12       Impact factor: 4.379

6.  Development of machine learning model for diagnostic disease prediction based on laboratory tests.

Authors:  Dong Jin Park; Min Woo Park; Homin Lee; Young-Jin Kim; Yeongsic Kim; Young Hoon Park
Journal:  Sci Rep       Date:  2021-04-07       Impact factor: 4.379

7.  Tensor based stacked fuzzy neural network for efficient data regression.

Authors:  Jie Li; Jiale Hu; Guoliang Zhao; Sharina Huang; Yang Liu
Journal:  Soft comput       Date:  2022-08-17       Impact factor: 3.732

8.  A novel intelligent displacement prediction model of karst tunnels.

Authors:  Hao-Jiang Ding; Yun-Kang Rao; Tao Yang; Ming-Zhe Zhou; Hai-Ying Fu; Yan-Yan Zhao
Journal:  Sci Rep       Date:  2022-10-10       Impact factor: 4.996

9.  The diagnostic accuracy of artificial intelligence in thoracic diseases: A protocol for systematic review and meta-analysis.

Authors:  Yi Yang; Gang Jin; Yao Pang; Wenhao Wang; Hongyi Zhang; Guangxin Tuo; Peng Wu; Zequan Wang; Zijiang Zhu
Journal:  Medicine (Baltimore)       Date:  2020-02       Impact factor: 1.817
