
Bayesian Estimation of Latently-grouped Parameters in Undirected Graphical Models.

Jie Liu, David Page.

Abstract

In large-scale applications of undirected graphical models, such as social networks and biological networks, similar patterns occur frequently and give rise to similar parameters. In this situation, it is beneficial to group the parameters for more efficient learning. We show that even when the grouping is unknown, we can infer these parameter groups during learning via a Bayesian approach. We impose a Dirichlet process prior on the parameters. Posterior inference usually involves calculating intractable terms, and we propose two approximation algorithms, namely a Metropolis-Hastings algorithm with auxiliary variables and a Gibbs sampling algorithm with "stripped" Beta approximation (Gibbs_SBA). Simulations show that both algorithms outperform conventional maximum likelihood estimation (MLE). Gibbs_SBA's performance is close to Gibbs sampling with exact likelihood calculation. Models learned with Gibbs_SBA also generalize better than the models learned by MLE on real-world Senate voting data.
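The abstract describes placing a Dirichlet process prior on the parameters so that group structure is inferred during learning. As a rough, illustrative sketch only (not the paper's algorithm), the partition implied by such a prior can be sampled with the Chinese Restaurant Process: each parameter joins an existing group with probability proportional to the group's size, or starts a new group with probability proportional to the concentration α. The function name and α value here are illustrative assumptions.

```python
import random

def crp_assignments(n_params, alpha, seed=0):
    """Illustrative sketch: sample a partition of n_params parameters via
    the Chinese Restaurant Process with concentration alpha. Parameter i
    joins existing group k with probability proportional to the group's
    current size, or opens a new group with probability proportional to
    alpha. (Not the paper's inference algorithm.)"""
    rng = random.Random(seed)
    assignments = []   # group index assigned to each parameter
    group_sizes = []   # current size of each group
    for _ in range(n_params):
        # Weights: one entry per existing group, plus alpha for a new group.
        weights = group_sizes + [alpha]
        r = rng.random() * sum(weights)
        k, acc = 0, weights[0]
        while r > acc:
            k += 1
            acc += weights[k]
        if k == len(group_sizes):   # open a new group
            group_sizes.append(1)
        else:                       # join existing group k
            group_sizes[k] += 1
        assignments.append(k)
    return assignments

# Example: partition 10 parameters; a small alpha favors few, large groups.
print(crp_assignments(10, alpha=1.0))
```

In a full Bayesian treatment like the one the abstract sketches, these group assignments would be resampled jointly with the grouped parameter values (e.g., by Metropolis-Hastings or Gibbs moves), which is where the intractable likelihood terms and the paper's two approximation algorithms come in.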

Year:  2013        PMID: 25404848      PMCID: PMC4232941     

Source DB:  PubMed          Journal:  Adv Neural Inf Process Syst        ISSN: 1049-5258


References:  2 in total

1.  Training products of experts by minimizing contrastive divergence.

Authors:  Geoffrey E Hinton
Journal:  Neural Comput       Date:  2002-08       Impact factor: 2.026

2.  Connections between score matching, contrastive divergence, and pseudolikelihood for continuous-valued variables.

Authors:  Aapo Hyvärinen
Journal:  IEEE Trans Neural Netw       Date:  2007-09
