
Distributed Variational Representation Learning.

Iñaki Estella Aguerri, Abdellatif Zaidi.

Abstract

The problem of distributed representation learning is one in which multiple observations X1,…,XK are processed separately to learn as much information as possible about some source Y. We investigate this problem on information-theoretic grounds, through a generalization of Tishby's centralized Information Bottleneck (IB) method to the distributed setting. Specifically, K ≥ 2 encoders compress their observations X1,…,XK separately such that, collectively, the produced representations preserve as much information as possible about Y. We study both discrete memoryless (DM) and vector Gaussian data models. For the discrete model, we establish a single-letter characterization of the optimal tradeoff for a class of memoryless sources. For the vector Gaussian model, we provide an explicit characterization of the optimal complexity-relevance tradeoff. Furthermore, we develop a variational bound on the complexity-relevance tradeoff which generalizes the evidence lower bound (ELBO) to the distributed setting. We provide two algorithms to compute this bound: i) a Blahut-Arimoto type iterative algorithm which computes optimal complexity-relevance mappings by iterating over a set of self-consistent equations, and ii) a variational inference type algorithm in which the encoding mappings are parametrized by neural networks, the bound is approximated by Markov sampling, and the result is optimized with stochastic gradient descent. Numerical results on synthetic and real datasets demonstrate the effectiveness of the approaches and algorithms developed in this paper.
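The variational algorithm sketched in the abstract (item ii) can be illustrated compactly. The following is a minimal, hypothetical sketch only, not the authors' implementation: it uses a toy binary source with K = 2 noisy views, fixed-variance linear-Gaussian encoders with the reparameterization trick in place of the neural-network parametrization, and a logistic decoder. The loss combines a relevance term (cross-entropy of the decoder given both representations) with one KL "complexity" regularizer per encoder, in the spirit of a distributed generalization of the ELBO; all names, dimensions, and the tradeoff weight `beta` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumed setup): two noisy views X1, X2 of a binary source Y.
n, d = 256, 4
y = rng.integers(0, 2, size=n)
x1 = y[:, None] + 0.5 * rng.standard_normal((n, d))
x2 = y[:, None] + 0.5 * rng.standard_normal((n, d))

def kl_std_normal(mu, log_var):
    # KL( N(mu, diag(exp(log_var))) || N(0, I) ), computed per sample.
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=1)

def dvib_loss(x1, x2, y, W1, W2, w_dec, beta=1e-3):
    # Each encoder k maps its own observation X_k to a Gaussian
    # representation U_k via the reparameterization trick.
    mu1, mu2 = x1 @ W1, x2 @ W2
    lv1 = np.zeros_like(mu1)  # unit variance, fixed for simplicity
    lv2 = np.zeros_like(mu2)
    u1 = mu1 + np.exp(0.5 * lv1) * rng.standard_normal(mu1.shape)
    u2 = mu2 + np.exp(0.5 * lv2) * rng.standard_normal(mu2.shape)
    # Relevance term: -log q(y | u1, u2) under a logistic decoder that
    # sees both representations jointly.
    logits = np.concatenate([u1, u2], axis=1) @ w_dec
    p = 1.0 / (1.0 + np.exp(-logits))
    relevance = -(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    # Complexity terms: one KL regularizer per encoder.
    complexity = kl_std_normal(mu1, lv1) + kl_std_normal(mu2, lv2)
    return np.mean(relevance + beta * complexity)

d_u = 2
W1 = 0.1 * rng.standard_normal((d, d_u))
W2 = 0.1 * rng.standard_normal((d, d_u))
w_dec = 0.1 * rng.standard_normal(2 * d_u)
loss = dvib_loss(x1, x2, y, W1, W2, w_dec)
print(float(loss))
```

In the paper's algorithm this objective would be estimated by Monte Carlo (Markov) sampling over minibatches and minimized with stochastic gradient descent; the sketch above only evaluates the loss once to show its structure.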

Year:  2019        PMID: 31329108     DOI: 10.1109/TPAMI.2019.2928806

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


  4 in total

1.  A Generalized Information-Theoretic Framework for the Emergence of Hierarchical Abstractions in Resource-Limited Systems.

Authors:  Daniel T Larsson; Dipankar Maity; Panagiotis Tsiotras
Journal:  Entropy (Basel)       Date:  2022-06-09       Impact factor: 2.738

2.  Variational Information Bottleneck for Unsupervised Clustering: Deep Gaussian Mixture Embedding.

Authors:  Yiğit Uğur; George Arvanitakis; Abdellatif Zaidi
Journal:  Entropy (Basel)       Date:  2020-02-13       Impact factor: 2.524

3.  Distributed Quantization for Partially Cooperating Sensors Using the Information Bottleneck Method.

Authors:  Steffen Steiner; Abdulrahman Dayo Aminu; Volker Kuehn
Journal:  Entropy (Basel)       Date:  2022-03-22       Impact factor: 2.524

4.  Information Bottleneck Signal Processing and Learning to Maximize Relevant Information for Communication Receivers.

Authors:  Jan Lewandowsky; Gerhard Bauch; Maximilian Stark
Journal:  Entropy (Basel)       Date:  2022-07-14       Impact factor: 2.738

