
A Comparison of Variational Bounds for the Information Bottleneck Functional.

Bernhard C. Geiger; Ian S. Fischer

Abstract

In this short note, we relate the variational bounds proposed in Alemi et al. (2017) and Fischer (2020) for the information bottleneck (IB) and the conditional entropy bottleneck (CEB) functional, respectively. Although the two functionals were shown to be equivalent, it was empirically observed that optimizing bounds on the CEB functional achieves better generalization performance and adversarial robustness than optimizing those on the IB functional. This work tries to shed light on this issue by showing that, in the most general setting, no ordering can be established between these variational bounds, while such an ordering can be enforced by restricting the feasible sets over which the optimizations take place. The absence of such an ordering in the general setup suggests that the variational bound on the CEB functional is either more amenable to optimization or a relevant cost function for optimization in its own right, i.e., without justification from the IB or CEB functionals.
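As background for the equivalence mentioned in the abstract, the two functionals are commonly written as follows (standard notation assumed, not taken from this record: X the input, Y the target, Z the learned representation, with Markov chain Z - X - Y):

```latex
% IB functional (Tishby et al.): compression term I(X;Z) traded off
% against the relevance term I(Y;Z) via the parameter \beta.
\mathcal{L}_{\mathrm{IB}} = I(X;Z) - \beta\, I(Y;Z)

% CEB functional (Fischer, 2020): the compression term is conditioned
% on the target Y; trade-off parameter \gamma.
\mathcal{L}_{\mathrm{CEB}} = I(X;Z \mid Y) - \gamma\, I(Y;Z)

% Under the Markov chain Z - X - Y, the chain rule of mutual
% information gives
I(X;Z \mid Y) = I(X;Z) - I(Y;Z),

% so the two functionals coincide up to a reparameterization of the
% trade-off parameter:
\mathcal{L}_{\mathrm{CEB}} = I(X;Z) - (1+\gamma)\, I(Y;Z)
  = \mathcal{L}_{\mathrm{IB}}\big|_{\beta = 1+\gamma}.
```

The note's point is that this equivalence of the *functionals* does not by itself order the *variational bounds* used to optimize them in practice.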

Keywords:  deep learning; information bottleneck; neural networks

Year:  2020        PMID: 33286997     DOI: 10.3390/e22111229

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


  1 in total

1.  Information Bottleneck: Theory and Applications in Deep Learning.

Authors:  Bernhard C Geiger; Gernot Kubin
Journal:  Entropy (Basel)       Date:  2020-12-14       Impact factor: 2.524
