
Advances in Variational Inference.

Cheng Zhang, Judith Bütepage, Hedvig Kjellström, Stephan Mandt.

Abstract

Many modern unsupervised or semi-supervised machine learning algorithms rely on Bayesian probabilistic models. These models are usually intractable and thus require approximate inference. Variational inference (VI) lets us approximate a high-dimensional Bayesian posterior with a simpler variational distribution by solving an optimization problem. This approach has been successfully applied to various models and large-scale applications. In this review, we give an overview of recent trends in variational inference. We first introduce standard mean field variational inference, then review recent advances focusing on the following aspects: (a) scalable VI, which includes stochastic approximations, (b) generic VI, which extends the applicability of VI to a large class of otherwise intractable models, such as non-conjugate models, (c) accurate VI, which includes variational models beyond the mean field approximation or with atypical divergences, and (d) amortized VI, which implements the inference over local latent variables with inference networks. Finally, we provide a summary of promising future research directions.
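The abstract describes VI as recasting posterior approximation as optimization: choose a simple variational family q and maximize the evidence lower bound (ELBO). A minimal sketch of this idea, using the reparameterization trick mentioned under "scalable VI" (stochastic gradients); the toy target density, sample size, learning rate, and step count are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "posterior" stand-in: an unnormalized Normal(2, 1) log density.
def log_p(z_minus_mean_grad=None, z=None):
    return -0.5 * (z - 2.0) ** 2

# Variational family q(z) = Normal(mu, exp(log_sigma)^2), mean-field in 1-D.
mu, log_sigma = 0.0, 0.0
lr = 0.05

for step in range(2000):
    eps = rng.standard_normal(64)          # base noise
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps                   # reparameterized samples z ~ q
    # Monte Carlo gradients of ELBO = E_q[log p(z)] + entropy(q)
    dlogp_dz = -(z - 2.0)                  # d/dz of the toy log density
    grad_mu = dlogp_dz.mean()
    grad_log_sigma = (dlogp_dz * eps * sigma).mean() + 1.0  # +1 from entropy
    mu += lr * grad_mu                     # stochastic gradient ascent
    log_sigma += lr * grad_log_sigma

print(mu, np.exp(log_sigma))               # mu ≈ 2.0, sigma ≈ 1.0
```

Because the target here is itself Gaussian, q can match it exactly; with an intractable posterior the same loop would instead converge to the closest member of the variational family.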

Year:  2018        PMID: 30596568     DOI: 10.1109/TPAMI.2018.2889774

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


Related articles:  16 in total

1.  A Deep Learning Algorithm for High-Dimensional Exploratory Item Factor Analysis.

Authors:  Christopher J Urban; Daniel J Bauer
Journal:  Psychometrika       Date:  2021-02-02       Impact factor: 2.500

2.  Analysing brain networks in population neuroscience: a case for the Bayesian philosophy.

Authors:  Danilo Bzdok; Dorothea L Floris; Andre F Marquand
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2020-02-24       Impact factor: 6.237

3.  Toward an idiomatic framework for cognitive robotics.

Authors:  Malte Rørmose Damgaard; Rasmus Pedersen; Thomas Bak
Journal:  Patterns (N Y)       Date:  2022-07-08

4.  Scalable Bayesian Approach for the Dina Q-Matrix Estimation Combining Stochastic Optimization and Variational Inference.

Authors:  Motonori Oka; Kensuke Okada
Journal:  Psychometrika       Date:  2022-09-12       Impact factor: 2.290

5.  A text data mining approach to the study of emotions triggered by new advertising formats during the COVID-19 pandemic.

Authors:  Angela Maria D'Uggento; Albino Biafora; Fabio Manca; Claudia Marin; Massimo Bilancia
Journal:  Qual Quant       Date:  2022-06-30

6.  Making thermodynamic models of mixtures predictive by machine learning: matrix completion of pair interactions.

Authors:  Fabian Jirasek; Robert Bamler; Sophie Fellenz; Michael Bortz; Marius Kloft; Stephan Mandt; Hans Hasse
Journal:  Chem Sci       Date:  2022-04-04       Impact factor: 9.969

7.  A Bayesian linear mixed model for prediction of complex traits.

Authors:  Yang Hai; Yalu Wen
Journal:  Bioinformatics       Date:  2020-12-17       Impact factor: 6.937

8.  [Review] Bayesian statistical learning for big data biology.

Authors:  Christopher Yau; Kieran Campbell
Journal:  Biophys Rev       Date:  2019-02-07

9.  Dynamic causal modelling of COVID-19.

Authors:  Karl J Friston; Thomas Parr; Peter Zeidman; Adeel Razi; Guillaume Flandin; Jean Daunizeau; Ollie J Hulme; Alexander J Billig; Vladimir Litvak; Rosalyn J Moran; Cathy J Price; Christian Lambert
Journal:  Wellcome Open Res       Date:  2020-08-07

10.  MOFA+: a statistical framework for comprehensive integration of multi-modal single-cell data.

Authors:  Ricard Argelaguet; Damien Arnol; Danila Bredikhin; Yonatan Deloro; Britta Velten; John C Marioni; Oliver Stegle
Journal:  Genome Biol       Date:  2020-05-11       Impact factor: 13.583
