
The Deterministic Information Bottleneck.

DJ Strouse, David J Schwab.

Abstract

Lossy compression and clustering fundamentally involve a decision about which features are relevant and which are not. The information bottleneck (IB) method by Tishby, Pereira, and Bialek (1999) formalized this notion as an information-theoretic optimization problem and proposed an optimal trade-off between throwing away as many bits as possible and selectively keeping those that are most important. In the IB, compression is measured by mutual information. Here, we introduce an alternative formulation that replaces mutual information with entropy, which we call the deterministic information bottleneck (DIB) and argue better captures this notion of compression. As suggested by its name, the solution to the DIB problem turns out to be a deterministic encoder, or hard clustering, as opposed to the stochastic encoder, or soft clustering, that is optimal under the IB. We compare the IB and DIB on synthetic data, showing that the IB and DIB perform similarly in terms of the IB cost function, but that the DIB significantly outperforms the IB in terms of the DIB cost function. We also empirically find that the DIB offers a considerable gain in computational efficiency over the IB, over a range of convergence parameters. Our derivation of the DIB also suggests a method for continuously interpolating between the soft clustering of the IB and the hard clustering of the DIB.
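For readers comparing the two methods, the objectives can be written out explicitly. With source X, compressed representation T, relevance variable Y, and encoder q(t|x), the IB and DIB minimize (as defined in the paper; $\beta > 0$ sets the compression trade-off):

    $L_{\mathrm{IB}}[q(t|x)] = I(X;T) - \beta\, I(T;Y)$
    $L_{\mathrm{DIB}}[q(t|x)] = H(T) - \beta\, I(T;Y)$

Since $I(X;T) = H(T) - H(T|X)$, the interpolation mentioned at the end of the abstract corresponds to the generalized objective

    $L_{\alpha}[q(t|x)] = H(T) - \alpha\, H(T|X) - \beta\, I(T;Y)$

with $\alpha = 1$ recovering the IB and $\alpha \to 0$ the DIB.

The DIB solution is a hard assignment f(x) chosen self-consistently with the cluster marginal q(t) and decoder q(y|t). Below is a minimal sketch of that fixed-point iteration for a discrete joint distribution, assuming p(x, y) is supplied as a NumPy array; the function and variable names here are illustrative, not taken from the authors' code.

    import numpy as np

    def dib(pxy, n_clusters, beta, n_iter=100, seed=0):
        """Hard-clustering DIB iteration on a discrete joint p(x, y).

        Repeats: f(x) = argmax_t [log q(t) - beta * KL(p(y|x) || q(y|t))],
        with q(t) and q(y|t) recomputed from the current assignment f.
        """
        rng = np.random.default_rng(seed)
        nx, ny = pxy.shape
        px = pxy.sum(axis=1)                     # marginal p(x)
        py_x = pxy / px[:, None]                 # conditional p(y|x)
        f = rng.integers(n_clusters, size=nx)    # random initial hard assignment
        for _ in range(n_iter):
            # Cluster marginal q(t) and decoder q(y|t) induced by f
            qt = np.array([px[f == t].sum() for t in range(n_clusters)])
            qy_t = np.zeros((n_clusters, ny))
            for t in range(n_clusters):
                if qt[t] > 0:
                    qy_t[t] = pxy[f == t].sum(axis=0) / qt[t]
            with np.errstate(divide="ignore"):
                log_qt = np.where(qt > 0, np.log(qt), -np.inf)  # empty clusters never win
            # KL(p(y|x) || q(y|t)) for every (x, t); small epsilon avoids log(0)
            kl = np.array([[np.sum(py_x[x] * (np.log(py_x[x] + 1e-12)
                                              - np.log(qy_t[t] + 1e-12)))
                            for t in range(n_clusters)] for x in range(nx)])
            f_new = np.argmax(log_qt[None, :] - beta * kl, axis=1)
            if np.array_equal(f_new, f):         # assignments stabilized
                break
            f = f_new
        return f

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        pxy = rng.random((8, 4))
        pxy /= pxy.sum()                         # normalize to a joint distribution
        print(dib(pxy, n_clusters=3, beta=5.0))

Sweeping beta then traces out a frontier of H(T) against I(T;Y). Replacing the argmax with the soft assignment q(t|x) proportional to q(t) exp(-beta * KL(p(y|x) || q(y|t))) recovers the standard IB update, which is the sense in which one can interpolate between the two methods.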

MeSH:

Year:  2017        PMID: 28410050     DOI: 10.1162/NECO_a_00961

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Related articles: 15 in total

1.  A high-bias, low-variance introduction to Machine Learning for physicists.

Authors:  Pankaj Mehta; Ching-Hao Wang; Alexandre G R Day; Clint Richardson; Marin Bukov; Charles K Fisher; David J Schwab
Journal:  Phys Rep       Date:  2019-03-14       Impact factor: 25.600

2.  The Convex Information Bottleneck Lagrangian.

Authors:  Borja Rodríguez Gálvez; Ragnar Thobaben; Mikael Skoglund
Journal:  Entropy (Basel)       Date:  2020-01-14       Impact factor: 2.524

3.  Adaptive coding for dynamic sensory inference.

Authors:  Wiktor F Młynarski; Ann M Hermundstad
Journal:  Elife       Date:  2018-07-10       Impact factor: 8.140

4.  Trading bits in the readout from a genetic network.

Authors:  Marianne Bauer; Mariela D Petkova; Thomas Gregor; Eric F Wieschaus; William Bialek
Journal:  Proc Natl Acad Sci U S A       Date:  2021-11-16       Impact factor: 11.205

5.  Gaussian Information Bottleneck and the Non-Perturbative Renormalization Group.

Authors:  Adam G Kline; Stephanie E Palmer
Journal:  New J Phys       Date:  2022-03-09       Impact factor: 3.729

6.  Pareto-Optimal Clustering with the Primal Deterministic Information Bottleneck.

Authors:  Andrew K Tan; Max Tegmark; Isaac L Chuang
Journal:  Entropy (Basel)       Date:  2022-05-30       Impact factor: 2.738

7.  A Generalized Information-Theoretic Framework for the Emergence of Hierarchical Abstractions in Resource-Limited Systems.

Authors:  Daniel T Larsson; Dipankar Maity; Panagiotis Tsiotras
Journal:  Entropy (Basel)       Date:  2022-06-09       Impact factor: 2.738

8.  The evolution of lossy compression.

Authors:  Sarah E Marzen; Simon DeDeo
Journal:  J R Soc Interface       Date:  2017-05       Impact factor: 4.118

9.  Utilizing Information Bottleneck to Evaluate the Capability of Deep Neural Networks for Image Classification.

Authors:  Hao Cheng; Dongze Lian; Shenghua Gao; Yanlin Geng
Journal:  Entropy (Basel)       Date:  2019-05-01       Impact factor: 2.524

10.  Universals of word order reflect optimization of grammars for efficient communication.

Authors:  Michael Hahn; Dan Jurafsky; Richard Futrell
Journal:  Proc Natl Acad Sci U S A       Date:  2020-01-21       Impact factor: 11.205

