
Neural Network Renormalization Group.

Shuo-Hui Li, Lei Wang.

Abstract

We present a variational renormalization group (RG) approach based on a reversible generative model with hierarchical architecture. The model performs hierarchical change-of-variables transformations from the physical space to a latent space with reduced mutual information. Conversely, the neural network directly maps independent Gaussian noises to physical configurations following the inverse RG flow. The model has an exact and tractable likelihood, which allows unbiased training and direct access to the renormalized energy function of the latent variables. To train the model, we employ probability density distillation for the bare energy function of the physical problem, in which the training loss provides a variational upper bound of the physical free energy. We demonstrate practical usage of the approach by identifying mutually independent collective variables of the Ising model and performing accelerated hybrid Monte Carlo sampling in the latent space. Lastly, we comment on the connection of the present approach to the wavelet formulation of RG and the modern pursuit of information preserving RG.
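The training objective described above can be illustrated with a minimal sketch: a one-layer affine normalizing flow (standing in for the paper's hierarchical bijector, which is not reproduced here) maps Gaussian latents to samples, and the expectation of the bare energy plus the model's log-likelihood gives a variational upper bound on the exact free energy. The toy quadratic energy, the flow parameters, and all names below are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def bare_energy(x):
    # Toy "physical" energy: independent quadratic wells, E(x) = sum_i x_i^2 / 2.
    # Its exact free energy is F = -log Z = -(d/2) * log(2*pi).
    return 0.5 * np.sum(x**2, axis=-1)

def flow(z, log_s, m):
    # One-layer affine flow: x = exp(log_s) * z + m (elementwise).
    # Because the map is invertible with tractable Jacobian, the model
    # likelihood q(x) is exact: log q(x) = log N(z; 0, 1) - sum(log_s).
    x = np.exp(log_s) * z + m
    z_rec = (x - m) * np.exp(-log_s)
    log_q = (-0.5 * np.sum(z_rec**2 + np.log(2 * np.pi), axis=-1)
             - np.sum(log_s))
    return x, log_q

d, n = 4, 200_000
z = rng.standard_normal((n, d))         # independent Gaussian latents
log_s = np.full(d, np.log(1.5))         # deliberately mismatched scale
m = np.full(d, 0.3)                     # and shift, so the bound is loose
x, log_q = flow(z, log_s, m)

# Probability-density-distillation loss: E_q[E(x) + log q(x)] >= -log Z = F.
loss = np.mean(bare_energy(x) + log_q)
F_exact = -0.5 * d * np.log(2 * np.pi)
print(loss >= F_exact)                  # the loss upper-bounds the free energy
```

Training would minimize this loss with respect to the flow parameters; the gap closes exactly when the pushed-forward distribution matches the Boltzmann distribution of the bare energy.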

Year:  2018        PMID: 30636161     DOI: 10.1103/PhysRevLett.121.260601

Source DB:  PubMed          Journal:  Phys Rev Lett        ISSN: 0031-9007            Impact factor:   9.161


Related articles: 6 in total

1.  Computing Absolute Free Energy with Deep Generative Models.

Authors:  Xinqiang Ding; Bin Zhang
Journal:  J Phys Chem B       Date:  2020-11-03       Impact factor: 2.991

2.  An Integrated World Modeling Theory (IWMT) of Consciousness: Combining Integrated Information and Global Neuronal Workspace Theories With the Free Energy Principle and Active Inference Framework; Toward Solving the Hard Problem and Characterizing Agentic Causation.

Authors:  Adam Safron
Journal:  Front Artif Intell       Date:  2020-06-09

3.  Unsupervised Learning Methods for Molecular Simulation Data.

Authors:  Aldo Glielmo; Brooke E Husic; Alex Rodriguez; Cecilia Clementi; Frank Noé; Alessandro Laio
Journal:  Chem Rev       Date:  2021-05-04       Impact factor: 60.622

4.  Entropic Dynamics in Neural Networks, the Renormalization Group and the Hamilton-Jacobi-Bellman Equation.

Authors:  Nestor Caticha
Journal:  Entropy (Basel)       Date:  2020-05-23       Impact factor: 2.524

5.  Variationally Inferred Sampling through a Refined Bound.

Authors:  Víctor Gallego; David Ríos Insua
Journal:  Entropy (Basel)       Date:  2021-01-19       Impact factor: 2.524

6.  A cautionary tale for machine learning generated configurations in presence of a conserved quantity.

Authors:  Ahmadreza Azizi; Michel Pleimling
Journal:  Sci Rep       Date:  2021-03-18       Impact factor: 4.379

