Ziyang Fu, Weiyi Liu, Chen Huang, Tao Mei.
Abstract
Materials are constantly evolving to meet increasing demand in many areas, yet they still face numerous practical constraints. The rational design and discovery of new materials can create a huge technological and social impact. However, such rational design and discovery require a holistic, multi-stage design process, covering material composition, material structure, and material properties, as well as process design and engineering. Exploring such a complex space with traditional scientific methods is not only blind but also wastes enormous time and resources. Machine learning (ML), which mines data to find correlations among material properties and to understand the chemical properties of materials, is being considered a new way to explore the materials field. This paper reviews some of the major recent advances and applications of ML in the field of materials property prediction and discusses the key challenges and opportunities in this cross-cutting area.
Keywords: deep learning; machine learning; materials science; performance prediction
Year: 2022 PMID: 36079994 PMCID: PMC9457802 DOI: 10.3390/nano12172957
Source DB: PubMed Journal: Nanomaterials (Basel) ISSN: 2079-4991 Impact factor: 5.719
Figure 1. An overview of the components of machine learning for materials’ performance prediction. Data are at the heart of performance prediction, including data acquisition, data generation, and preprocessing. Based on the correlations among the properties derived from the data, predictions can be made for novel material properties (nanomaterials, adsorbing materials, high-performance materials, etc.) and for degradation detection (decreasing battery capacity, catalysis material aging, etc.). It is worth pointing out that the properties that can be predicted are not limited to the tasks illustrated in this schematic, but also include other aspects such as atomic bonding energies, thermodynamic properties, etc.
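The data-to-prediction loop described in the figure can be sketched in a few lines. The toy dataset, the single descriptor, and the ordinary-least-squares model below are all illustrative assumptions standing in for real materials data and a full ML model:

```python
# Minimal sketch of the performance-prediction loop in Figure 1:
# acquire data, preprocess it, fit a model, predict a novel material's
# property. The (descriptor, property) pairs are hypothetical toy values.

# 1. Data acquisition: (descriptor, measured property) pairs
raw = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1), (None, 5.0)]

# 2. Preprocessing: drop incomplete records
data = [(x, y) for x, y in raw if x is not None and y is not None]

# 3. Model fitting: ordinary least squares for y = a*x + b
n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

# 4. Prediction of the property of a novel material from its descriptor
def predict(x):
    return a * x + b
```

In practice, each stage is far richer (database queries, feature engineering, nonlinear models), but the overall structure of the pipeline is the same.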
Popular databases in performance prediction of materials.
| Database Name | Material Categories | Features | URL |
|---|---|---|---|
| MatWeb | Metals, plastics, ceramics and composites | Tensile strength, breaking strength, Vicat softening point, etc. | |
| NIST | Metals, polymers, etc. | Thermochemical, thermophysical and ion energetics data | |
| AZO Materials | Alloy, rubber, plastics, etc. | Mechanical strength, element, molecular weight, etc. | |
| M-Base Company | Polymer | Tensile modulus, yield stress and strain, density, molding shrinkage, etc. | |
| Ceramic Industry | Ceramic materials | Forming method, sintering process/temperature, tensile strength, bulk resistivity, dielectric strength and elastic modulus, etc. | |
| NIST | Materials | Phase diagram, various thermodynamic and kinetic parameters, atomic spectra, physical parameters, etc. | |
| PoLyInfo | Polymer | Chemical formula, type of material, physical properties | |
| CAMPUS | Plastic | Molding shrinkage, breaking strength, Vicat softening point, structure, etc. | |
| Cole-Parmer | Materials | Chemical compatibility | |
| The Materials Project | Materials | Chemical formula, type of material, physical properties, etc. | |
| crystalstar | Crystal | Crystal structures of organic, inorganic and metal-organic compounds and minerals | |
| FactSage Database | Materials | Phase diagram and various thermodynamic and kinetic parameters |
Figure 2. (a) CNN architecture. Reprinted from Ref. [68]. (b) The AutoML+Matminer (Automatminer) pipeline. The pipeline can be applied to composition-only datasets, structure datasets, and datasets containing electronic band structure information. Once fit, the pipeline accepts one or more material primitives and returns a prediction of a material’s property. During auto featurization, the input dataset is populated with potentially relevant features using the Matminer library. Next, data cleaning and feature reduction stages prepare the feature matrices for input to an AutoML search algorithm. During training, the final stage searches ML pipelines for optimal configurations; during prediction, the best ML pipeline (according to internal validation score) is used to make predictions. Reprinted from Ref. [70].
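The auto-featurization step can be illustrated with a drastically simplified, hypothetical stand-in: deriving numeric descriptors from a composition string. This is not the Matminer API; Matminer generates hundreds of such features, and the small element tables below hold only illustrative values:

```python
import re

# Hypothetical simplification of Automatminer's auto-featurization step:
# turn a composition string into a numeric feature vector. The element
# tables below are tiny illustrative stand-ins for real property tables.
ATOMIC_MASS = {"Fe": 55.845, "O": 15.999, "Ti": 47.867}
ELECTRONEG = {"Fe": 1.83, "O": 3.44, "Ti": 1.54}

def parse_formula(formula):
    """Parse e.g. 'Fe2O3' into {'Fe': 2.0, 'O': 3.0}."""
    counts = {}
    for el, num in re.findall(r"([A-Z][a-z]?)(\d*\.?\d*)", formula):
        counts[el] = counts.get(el, 0.0) + (float(num) if num else 1.0)
    return counts

def featurize(formula):
    """Composition-weighted mean atomic mass and electronegativity."""
    counts = parse_formula(formula)
    total = sum(counts.values())
    mean_mass = sum(ATOMIC_MASS[e] * c for e, c in counts.items()) / total
    mean_en = sum(ELECTRONEG[e] * c for e, c in counts.items()) / total
    return [mean_mass, mean_en]
```

Feature vectors like these, computed for every material in a dataset, form the feature matrix that the cleaning, feature-reduction, and AutoML stages then operate on.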
Figure 3. (a) Schematic diagram of the biological neural network. (b) Flow chart of data processing within a neuron. Reprinted with permission from Ref. [75]. Copyright 2021 Elsevier.
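The data flow within a single artificial neuron (panel b) reduces to a weighted sum followed by a nonlinear activation. The inputs and weights below are toy values; the sigmoid is one common activation choice:

```python
import math

# Sketch of the data flow inside a single artificial neuron (Figure 3b):
# inputs are scaled by weights, summed with a bias, and passed through a
# nonlinear activation (here, a sigmoid). All values are illustrative.
def neuron(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias  # weighted sum
    return 1.0 / (1.0 + math.exp(-z))                       # sigmoid activation

y = neuron([0.5, -1.0], [2.0, 1.0], 0.0)  # z = 0.5*2 + (-1)*1 + 0 = 0, y = 0.5
```

A full ANN is many such neurons arranged in layers, with the outputs of one layer serving as the inputs to the next.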
Figure 4. (a) Left column illustrates the unit cells of four prominent zeolites (BEA, ABW, ACO, LTA). Right column shows the corresponding methane potential energy profiles (i.e., shapes) computed from molecular simulations (dark green: low energy, light green: higher but accessible energy, white: inaccessible regions). (b) Overall schematics of ESGAN. (c) Real energy shapes of zeolites from Deem’s database, and (d) generated energy shapes from ESGAN. Reprinted with permission from Ref. [82]. Copyright 2019 Royal Society of Chemistry.
Figure 5. (a) Artificial neural network structure. (b) A simplified version of the ANFIS architecture. Reprinted with permission from Ref. [83]. Copyright 2020 Elsevier.
Figure 6. ANN predictions of the flax reinforced polypropylene composite Young’s moduli E11 compared to experiments, orientation averaging (OA) results, and mean-field predictions using Digimat-MF (for three fiber volume fractions: 0.13, 0.21 and 0.29). Reprinted from Ref. [93].
Figure 7. (a) The framework of LSTM; (b) RUL prediction using a stacked LSTM model. Reprinted with permission from Ref. [100]. Copyright 2020 Elsevier.
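The gate structure in panel (a) can be sketched as a single scalar LSTM step. Real RUL models use vector hidden and cell states with learned weight matrices; the scalar weights and the dictionary layout here are purely illustrative assumptions:

```python
import math

# Toy scalar LSTM cell, sketching the gate structure in Figure 7a.
# w maps each gate name to [input weight, recurrent weight, bias];
# these scalars are illustrative, not learned parameters.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    # Each gate sees the current input x and the previous hidden state.
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate
    c = f * c_prev + i * g   # new cell state: forget old, admit new
    h = o * math.tanh(c)     # new hidden state
    return h, c

zero_w = {name: [0.0, 0.0, 0.0] for name in ("i", "f", "o", "g")}
h, c = lstm_step(1.0, 0.0, 2.0, zero_w)  # zero weights: every gate is 0.5
```

Stacking means feeding the hidden-state sequence of one LSTM layer as the input sequence of the next, with the final hidden state mapped to the remaining-useful-life estimate.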
Figure 8. (a) Choice of input to the model and the approach of increasing the input window with time. (b) The general architecture of the sequence-to-sequence neural network. Reprinted from Ref. [101].
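The expanding-input-window scheme in panel (a) is easy to sketch: at each step, the model input grows to include every observation seen so far. The series values and the `min_len` parameter below are arbitrary toy choices:

```python
# Sketch of the expanding-input-window scheme in Figure 8a: each
# training pair uses all observations up to time t as input and the
# value at t as the target. The series values are arbitrary toy data.
series = [10, 12, 11, 13, 14, 15]

def expanding_windows(values, min_len=2):
    """Yield (input_window, next_target) pairs with a growing window."""
    return [(values[:t], values[t]) for t in range(min_len, len(values))]

pairs = expanding_windows(series)
# pairs[0] == ([10, 12], 11); each later input includes all earlier points.
```

This contrasts with a fixed sliding window: here no early observations are ever discarded, so the model always conditions on the full history.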