
Information Dropout: Learning Optimal Representations Through Noisy Computation.

Alessandro Achille, Stefano Soatto.   

Abstract

The cross-entropy loss commonly used in deep learning is closely related to the defining properties of optimal representations, but does not enforce some of the key properties. We show that this can be solved by adding a regularization term, which is in turn related to injecting multiplicative noise in the activations of a Deep Neural Network, a special case of which is the common practice of dropout. We show that our regularized loss function can be efficiently minimized using Information Dropout, a generalization of dropout rooted in information theoretic principles that automatically adapts to the data and can better exploit architectures of limited capacity. When the task is the reconstruction of the input, we show that our loss function yields a Variational Autoencoder as a special case, thus providing a link between representation learning, information theory and variational inference. Finally, we prove that we can promote the creation of optimal disentangled representations simply by enforcing a factorized prior, a fact that has been observed empirically in recent work. Our experiments validate the theoretical intuitions behind our method, and we find that Information Dropout achieves a comparable or better generalization performance than binary dropout, especially on smaller models, since it can automatically adapt the noise to the structure of the network, as well as to the test sample.
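The mechanism the abstract describes can be sketched in a few lines: instead of binary dropout's Bernoulli mask, each activation is multiplied by log-normal noise whose scale can depend on the input, and a regularization term penalizes units that transmit information. The following is a minimal numpy illustration under simplifying assumptions, not the paper's implementation; the function names, the per-unit noise parameterization, and the reduced KL form (for a log-uniform prior, up to an additive constant) are assumptions made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def information_dropout(x, log_alpha, train=True):
    """Multiplicative log-normal noise whose scale alpha may depend on the input.

    Binary dropout multiplies activations by Bernoulli noise; here the
    multiplier is eps = exp(alpha * z) with z ~ N(0, 1), so small alpha
    leaves the activation nearly intact while large alpha drowns it in
    noise, limiting how much information the unit can transmit.
    """
    if not train:
        return x  # the noise has unit median, so pass activations through at test time
    alpha = np.exp(log_alpha)                            # noise scale per activation
    eps = np.exp(alpha * rng.standard_normal(x.shape))   # log-normal multiplier
    return x * eps

def kl_penalty(log_alpha):
    """Regularizer discouraging information transmission: for log-normal noise
    with a log-uniform prior, the per-unit KL term reduces (up to an additive
    constant) to -log(alpha), so minimizing it pushes alpha up wherever the
    task tolerates more noise."""
    return -np.sum(log_alpha)
```

In a real network `log_alpha` would be predicted from the input by a small learned layer, which is what lets the noise adapt to the data and to each test sample.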

Year:  2018        PMID: 29994167     DOI: 10.1109/TPAMI.2017.2784440

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


Cited by:  17 in total

1.  The Convex Information Bottleneck Lagrangian.

Authors:  Borja Rodríguez Gálvez; Ragnar Thobaben; Mikael Skoglund
Journal:  Entropy (Basel)       Date:  2020-01-14       Impact factor: 2.524

2.  [Review]  Interpreting encoding and decoding models.

Authors:  Nikolaus Kriegeskorte; Pamela K Douglas
Journal:  Curr Opin Neurobiol       Date:  2019-04-28       Impact factor: 6.627

3.  Partitioning variability in animal behavioral videos using semi-supervised variational autoencoders.

Authors:  Matthew R Whiteway; Dan Biderman; Yoni Friedman; Mario Dipoppa; E Kelly Buchanan; Anqi Wu; John Zhou; Niccolò Bonacchi; Nathaniel J Miska; Jean-Paul Noel; Erica Rodriguez; Michael Schartner; Karolina Socha; Anne E Urai; C Daniel Salzman; John P Cunningham; Liam Paninski
Journal:  PLoS Comput Biol       Date:  2021-09-22       Impact factor: 4.779

4.  [Review]  The overfitted brain: Dreams evolved to assist generalization.

Authors:  Erik Hoel
Journal:  Patterns (N Y)       Date:  2021-05-14

5.  Sparse Convolutional Denoising Autoencoders for Genotype Imputation.

Authors:  Junjie Chen; Xinghua Shi
Journal:  Genes (Basel)       Date:  2019-08-28       Impact factor: 4.096

6.  Utilizing Information Bottleneck to Evaluate the Capability of Deep Neural Networks for Image Classification.

Authors:  Hao Cheng; Dongze Lian; Shenghua Gao; Yanlin Geng
Journal:  Entropy (Basel)       Date:  2019-05-01       Impact factor: 2.524

7.  On the Information Bottleneck Problems: Models, Connections, Applications and Information Theoretic Views.

Authors:  Abdellatif Zaidi; Iñaki Estella-Aguerri; Shlomo Shamai Shitz
Journal:  Entropy (Basel)       Date:  2020-01-27       Impact factor: 2.524

8.  CEB Improves Model Robustness.

Authors:  Ian Fischer; Alexander A Alemi
Journal:  Entropy (Basel)       Date:  2020-09-25       Impact factor: 2.524

9.  On the Difference between the Information Bottleneck and the Deep Information Bottleneck.

Authors:  Aleksander Wieczorek; Volker Roth
Journal:  Entropy (Basel)       Date:  2020-01-22       Impact factor: 2.524

10.  Convergence Behavior of DNNs with Mutual-Information-Based Regularization.

Authors:  Hlynur Jónsson; Giovanni Cherubini; Evangelos Eleftheriou
Journal:  Entropy (Basel)       Date:  2020-06-30       Impact factor: 2.524
