
Four Generations of High-Dimensional Neural Network Potentials.

Jörg Behler

Abstract

Since their introduction about 25 years ago, machine learning (ML) potentials have become an important tool in the field of atomistic simulations. After the initial decade, in which neural networks were successfully used to construct potentials for rather small molecular systems, the development of high-dimensional neural network potentials (HDNNPs) in 2007 opened the way for the application of ML potentials in simulations of large systems containing thousands of atoms. To date, many other types of ML potentials have been proposed, continuously increasing the range of problems that can be studied. In this review, the methodology of the family of HDNNPs, including recent developments, is discussed using a classification scheme of four generations of potentials, which is also applicable to many other types of ML potentials. The first generation is formed by early neural network potentials designed for low-dimensional systems. High-dimensional neural network potentials established the second generation and are based on three key steps: first, the expression of the total energy as a sum of environment-dependent atomic energy contributions; second, the description of the atomic environments by atom-centered symmetry functions as descriptors fulfilling the requirements of rotational, translational, and permutational invariance; and third, the iterative construction of the reference electronic structure data sets by active learning. Third-generation HDNNPs additionally include long-range interactions, employing environment-dependent partial charges expressed by atomic neural networks. Fourth-generation HDNNPs, which are just emerging, can additionally capture nonlocal phenomena such as long-range charge transfer. The applicability and remaining limitations of HDNNPs are discussed along with an outlook on possible future developments.
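The second-generation ansatz described in the abstract can be illustrated with a minimal sketch: the total energy is a sum of atomic energies, each predicted from an atom-centered radial symmetry function of the Behler-Parrinello G² type. The function names, the toy "atomic network", and the parameter values (`eta`, `r_s`, `r_c`) below are illustrative assumptions, not the review's reference implementation, which uses many symmetry functions per element and trained feed-forward networks.

```python
import numpy as np

def cutoff(r, r_c):
    # Behler cutoff function f_c(r) = 0.5*(cos(pi*r/r_c) + 1) for r < r_c, else 0;
    # smoothly switches off neighbor contributions at the cutoff radius.
    fc = 0.5 * (np.cos(np.pi * r / r_c) + 1.0)
    return np.where(r < r_c, fc, 0.0)

def radial_symmetry_function(positions, i, eta, r_s, r_c):
    # G^2 descriptor for atom i: sum over neighbors j of
    # exp(-eta*(R_ij - r_s)^2) * f_c(R_ij).
    # Built from pair distances only, so it is invariant under rotation,
    # translation, and permutation of the neighboring atoms.
    r_ij = np.linalg.norm(positions - positions[i], axis=1)
    r_ij = np.delete(r_ij, i)  # exclude the self-distance R_ii = 0
    return np.sum(np.exp(-eta * (r_ij - r_s) ** 2) * cutoff(r_ij, r_c))

def total_energy(positions, atomic_nn, eta=0.5, r_s=0.0, r_c=6.0):
    # Second-generation HDNNP ansatz: E_total = sum_i E_i(G_i),
    # where atomic_nn stands in for a trained atomic neural network.
    return sum(
        atomic_nn(np.array([radial_symmetry_function(positions, i, eta, r_s, r_c)]))
        for i in range(len(positions))
    )
```

Because the descriptor depends only on interatomic distances within the cutoff sphere, translating, rotating, or relabeling the atoms leaves the predicted total energy unchanged, which is exactly the invariance requirement stated above.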


Year:  2021        PMID: 33779150     DOI: 10.1021/acs.chemrev.0c00868

Source DB:  PubMed          Journal:  Chem Rev        ISSN: 0009-2665            Impact factor:   60.622


Related articles: 20 in total

1.  Gaussian Process Regression for Materials and Molecules.

Authors:  Volker L Deringer; Albert P Bartók; Noam Bernstein; David M Wilkins; Michele Ceriotti; Gábor Csányi
Journal:  Chem Rev       Date:  2021-08-16       Impact factor: 60.622

2.  Deep learning study of tyrosine reveals that roaming can lead to photodamage.

Authors:  Julia Westermayr; Michael Gastegger; Dóra Vörös; Lisa Panzenboeck; Florian Joerg; Leticia González; Philipp Marquetand
Journal:  Nat Chem       Date:  2022-06-02       Impact factor: 24.274

3.  (Review) Bottom-up Coarse-Graining: Principles and Perspectives.

Authors:  Jaehyeok Jin; Alexander J Pak; Aleksander E P Durumeric; Timothy D Loose; Gregory A Voth
Journal:  J Chem Theory Comput       Date:  2022-09-07       Impact factor: 6.578

4.  Ab initio neural network MD simulation of thermal decomposition of a high energy material CL-20/TNT.

Authors:  Liqun Cao; Jinzhe Zeng; Bo Wang; Tong Zhu; John Z H Zhang
Journal:  Phys Chem Chem Phys       Date:  2022-05-18       Impact factor: 3.945

5.  Theoretical studies on triplet-state driven dissociation of formaldehyde by quasi-classical molecular dynamics simulation on machine-learning potential energy surface.

Authors:  Shichen Lin; Daoling Peng; Weitao Yang; Feng Long Gu; Zhenggang Lan
Journal:  J Chem Phys       Date:  2021-12-07       Impact factor: 3.488

6.  (Review) Dynamics of Heterogeneous Catalytic Processes at Operando Conditions.

Authors:  Xiangcheng Shi; Xiaoyun Lin; Ran Luo; Shican Wu; Lulu Li; Zhi-Jian Zhao; Jinlong Gong
Journal:  JACS Au       Date:  2021-11-04

7.  Machine learning potentials for complex aqueous systems made simple.

Authors:  Christoph Schran; Fabian L Thiemann; Patrick Rowe; Erich A Müller; Ondrej Marsalek; Angelos Michaelides
Journal:  Proc Natl Acad Sci U S A       Date:  2021-09-21       Impact factor: 11.205

8.  Artificial intelligence-enhanced quantum chemical method with broad applicability.

Authors:  Peikun Zheng; Roman Zubatyuk; Wei Wu; Olexandr Isayev; Pavlo O Dral
Journal:  Nat Commun       Date:  2021-12-02       Impact factor: 14.919

9.  A Differentiable Neural-Network Force Field for Ionic Liquids.

Authors:  Hadrián Montes-Campos; Jesús Carrete; Sebastian Bichelmaier; Luis M Varela; Georg K H Madsen
Journal:  J Chem Inf Model       Date:  2021-12-23       Impact factor: 4.956

10.  Self-consistent determination of long-range electrostatics in neural network potentials.

Authors:  Ang Gao; Richard C Remsing
Journal:  Nat Commun       Date:  2022-03-23       Impact factor: 14.919

