Jing Wang1, Mohamed Arselene Ayari2,3, Amith Khandakar4, Muhammad E H Chowdhury4, Sm Ashfaq Uz Zaman5, Tawsifur Rahman4, Behzad Vaferi6.
Abstract
Biodegradable polymers have recently found significant applications in pharmaceutical processing and drug release/delivery. Composites based on poly(L-lactic acid) (PLLA) have been suggested to enhance the crystallization rate and relative crystallinity of pure PLLA polymers. Despite the large amount of experimental research conducted to date, the theoretical aspects of relative crystallinity have not been comprehensively investigated. Therefore, this research uses machine learning methods to estimate the relative crystallinity of biodegradable PLLA/PGA (polyglycolide) composites. Six different classes of artificial intelligence methods were employed to estimate the relative crystallinity of PLLA/PGA polymer composites as a function of crystallization time, temperature, and PGA content. Cumulatively, 1510 machine learning topologies, including 200 multilayer perceptron neural networks (MLPNN), 200 cascade feedforward neural networks (CFFNN), 160 recurrent neural networks (RNN), 800 adaptive neuro-fuzzy inference systems (ANFIS), and 150 least-squares support vector regressions (LSSVR), were developed, and their prediction accuracies were compared. The modeling results show that a single-hidden-layer CFFNN with 9 neurons is the most accurate method for estimating the 431 experimentally measured data points. This model predicts the experimental database with an average absolute percentage difference (AAPD) of 8.84%, a root mean squared error (RMSE) of 4.67%, and a correlation coefficient (R2) of 0.990082. The modeling results and relevancy studies show that relative crystallinity increases with PGA content and crystallization time, whereas the effect of temperature on relative crystallinity is too complex to be easily explained.
Keywords: biodegradable composite; machine learning methods; polyglycolide; polylactic acid; relative crystallinity
Year: 2022 PMID: 35160516 PMCID: PMC8840207 DOI: 10.3390/polym14030527
Source DB: PubMed Journal: Polymers (Basel) ISSN: 2073-4360 Impact factor: 4.329
Experimental data for the relative crystallinity of PLLA/PGA composites [12].
| Crystallization Time (min) | Crystallization Temperature (°C) | PGA Dosage (wt%) | Relative Crystallinity (%) | Number of Measurements |
|---|---|---|---|---|
| 0–50 | 90–125 | 0 | 0–100 | 103 |
| 0–40 | 85–125 | 2 | 0–100 | 80 |
| 0–35 | 85–125 | 4 | 0–100 | 100 |
| 0–35 | 85–125 | 6 | 0–100 | 85 |
| 0–25 | 85–125 | 8 | 0–100 | 63 |
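The per-composition measurement counts in the table sum to the 431 data points used for modeling; a quick sketch (variable names are illustrative) that encodes the experimental matrix and verifies the total:

```python
# Experimental design from the table above: (PGA wt%, time range (min),
# temperature range (°C), number of measurements).
dataset_summary = [
    (0, (0, 50), (90, 125), 103),
    (2, (0, 40), (85, 125), 80),
    (4, (0, 35), (85, 125), 100),
    (6, (0, 35), (85, 125), 85),
    (8, (0, 25), (85, 125), 63),
]

total = sum(row[-1] for row in dataset_summary)
print(total)  # 431, the dataset size reported in the abstract
```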
Figure 1. Histogram of experimental measurements for all crystallization times (A), crystallization temperatures (B), PGA contents (C), and relative crystallinities (D).
Physical meaning of the Spearman, Pearson, and Kendall indices.
| Index Value | Direction of Relevancy | Magnitude of Relevancy |
|---|---|---|
| −1 to <0 | Indirect | The magnitude of the indirect relationship increases as the index approaches −1 |
| 0 | No dependency | No dependency |
| >0 to +1 | Direct | The magnitude of the direct relationship increases as the index approaches +1 |
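The three relevancy indices in the table are the standard Pearson (linear), Spearman (rank), and Kendall (pairwise-order) correlation coefficients; a minimal SciPy sketch with illustrative input arrays:

```python
import numpy as np
from scipy import stats

# Illustrative inputs: crystallization time vs. relative crystallinity.
time = np.array([0, 5, 10, 15, 20, 25], dtype=float)
crystallinity = np.array([0, 20, 45, 70, 90, 100], dtype=float)

pearson_r, _ = stats.pearsonr(time, crystallinity)    # linear relevancy
spearman_r, _ = stats.spearmanr(time, crystallinity)  # monotonic (rank) relevancy
kendall_t, _ = stats.kendalltau(time, crystallinity)  # pairwise-order relevancy

# All three lie in [-1, +1]; positive values indicate a direct relationship.
print(pearson_r, spearman_r, kendall_t)
```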
Figure 2. Interdependency of relative crystallinity on time, temperature, and PGA dosage.
Summary of the trial-and-error process to find the best structural features of the machine learning methods.
| Machine Learning Method | Fixed Property | Adjustable Property | Number of Models |
|---|---|---|---|
| MLPNN | Number of hidden layers, i.e., two | Number of hidden neurons | 200 |
| CFFNN | Number of hidden layers, i.e., two | Number of hidden neurons | 200 |
| RNN | Number of hidden layers, i.e., two | Number of hidden neurons | 160 |
| LSSVR | Training algorithm, i.e., least-squares method | Kernel function | 150 |
| ANFIS2 | Membership function, i.e., subtractive clustering | Radius of cluster | 400 |
| ANFIS3 | Membership function, i.e., c-means clustering | Number of clusters | 400 |
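The trial-and-error process for the neural models amounts to retraining the same fixed architecture while sweeping the adjustable property. A hedged sketch of such a sweep, using scikit-learn's MLPRegressor on synthetic data as a stand-in for the paper's networks (the data-generating curve and ranges are illustrative, not the experimental set):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in for (time, temperature, PGA wt%) -> relative crystallinity (%).
X = rng.uniform([0, 85, 0], [50, 125, 8], size=(200, 3))
y = 100.0 / (1.0 + np.exp(-(X[:, 0] - 15.0) / 5.0))  # sigmoid-like crystallinity curve

results = {}
for n_neurons in range(1, 11):  # the adjustable property
    model = MLPRegressor(hidden_layer_sizes=(n_neurons,), activation="tanh",
                         max_iter=1000, random_state=0).fit(X, y)
    rmse = float(np.sqrt(np.mean((model.predict(X) - y) ** 2)))
    results[n_neurons] = rmse

best_n = min(results, key=results.get)  # neuron count with lowest training RMSE
```

In the paper this sweep is repeated per model class (and combined with a train/test split), which is how the counts in the last column accumulate to 1510 topologies.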
The most appropriate features for the machine learning methods determined through the trial-and-error process.
| Model | The Most Appropriate Characteristics | Collection | AAPD% | RAPE% | RMSE | R2 |
|---|---|---|---|---|---|---|
| MLPNN | Nine hidden neurons | Training | 11.13 | 7.38 | 4.95 | 0.988679 |
| | Hyperbolic tangent and logistic | Testing | 6.25 | 5.37 | 2.38 | 0.997467 |
| | Levenberg optimization algorithm | Overall | 10.39 | 7.07 | 4.65 | 0.990062 |
| CFFNN | Nine hidden neurons | Training | 8.74 | 6.68 | 4.54 | 0.990058 |
| | Hyperbolic tangent and logistic | Testing | 9.42 | 7.28 | 5.32 | 0.990337 |
| | Levenberg optimization algorithm | Overall | 8.84 | 6.76 | 4.67 | 0.990082 |
| RNN | Seven hidden neurons | Training | 10.92 | 9.81 | 4.00 | 0.992677 |
| | Hyperbolic tangent and logistic | Testing | 11.07 | 13.76 | 9.14 | 0.966081 |
| | Scaled conjugate gradient algorithm | Overall | 10.94 | 10.44 | 5.12 | 0.988174 |
| LSSVR | Gaussian kernel function | Training | 13.03 | 8.14 | 5.22 | 0.987382 |
| | | Testing | 14.13 | 8.78 | 4.33 | 0.992005 |
| | | Overall | 13.20 | 8.24 | 5.09 | 0.988064 |
| ANFIS2 | Hybrid optimization algorithm | Training | 8.54 | 5.27 | 4.41 | 0.991163 |
| | | Testing | 16.28 | 8.79 | 5.36 | 0.985432 |
| | | Overall | 9.71 | 5.74 | 4.57 | 0.990414 |
| ANFIS3 | Hybrid optimization algorithm | Training | 25.81 | 13.87 | 6.29 | 0.981923 |
| | | Testing | 19.01 | 18.39 | 7.78 | 0.971648 |
| | | Overall | 24.78 | 14.53 | 6.54 | 0.980306 |
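The AAPD, RMSE, and R2 columns above are standard error statistics (RAPE is not defined in this excerpt and is omitted); a minimal sketch of how they are typically computed, with function names chosen for illustration:

```python
import numpy as np

def aapd(y_true, y_pred):
    """Average absolute percentage difference (assumes nonzero measured values)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * float(np.mean(np.abs((y_true - y_pred) / y_true)))

def rmse(y_true, y_pred):
    """Root mean squared error, in the units of the measured quantity."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r_squared(y_true, y_pred):
    """Coefficient of determination (reported as R2 in the table)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

Note that zero-crystallinity measurements (present at time zero in the dataset) would have to be excluded or handled separately when computing AAPD.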
Figure 3. Ranking of the machine learning methods during model development, model validation, and their combination.
Investigating the effect of activation functions on the predictive performance of the CFFNN method (values are AAPD%).
| Hidden Layer | Output Layer | Training | Testing | Overall |
|---|---|---|---|---|
| Hyperbolic tangent | Logistic | 8.74 | 9.42 | 8.84 |
| Logistic | Logistic | 7.97 | 8.61 | 8.06 |
| Logistic | Hyperbolic tangent | 8.33 | 6.80 | 8.10 |
| Hyperbolic tangent | Hyperbolic tangent | 9.35 | 5.53 | 8.78 |
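The two activation functions being compared are the hyperbolic tangent and the logistic (sigmoid); a minimal sketch of their definitions and the identity relating them:

```python
import numpy as np

def logistic(x):
    """Logistic (sigmoid) activation: maps the reals to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Hyperbolic tangent activation: maps the reals to (-1, 1)."""
    return np.tanh(x)

# tanh is a rescaled, recentred logistic: tanh(x) = 2*logistic(2x) - 1.
x = np.linspace(-4, 4, 9)
assert np.allclose(tanh(x), 2 * logistic(2 * x) - 1)
```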
Figure 4. Results of the iterative procedure conducted using the Levenberg–Marquardt algorithm to train the CFFNN method.
Figure 5. Predicted versus actually measured values of relative crystallinity.
Figure 6. Histogram of the deviations between predicted and actual values of relative crystallinity (average error = 0.398%, standard deviation = 4.72%).
Figure 7. Kernel density estimation for the actual measurements and the CFFNN predictions.
Figure 8. Leverage analysis applied to detect reliable data as well as outliers.
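Leverage analysis of this kind is usually based on the diagonal of the hat matrix H = X(XᵀX)⁻¹Xᵀ, with a warning leverage h* = 3(p+1)/n; a sketch under those standard definitions (the design matrix here is synthetic and illustrative, not the paper's data):

```python
import numpy as np

def leverages(X):
    """Diagonal of the hat matrix H = X (X^T X)^{-1} X^T."""
    X = np.asarray(X, float)
    H = X @ np.linalg.inv(X.T @ X) @ X.T
    return np.diag(H)

# Illustrative design matrix: intercept + (time, temperature, PGA wt%).
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.uniform([0, 85, 0], [50, 125, 8], (50, 3))])

h = leverages(X)
h_star = 3 * X.shape[1] / X.shape[0]       # warning leverage, X.shape[1] = p + 1
outlier_candidates = np.where(h > h_star)[0]  # points outside the applicability domain
```

A useful sanity check is that the leverages sum to the number of model parameters (the rank of X), since H is a projection matrix.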
Figure 9. The effect of PGA dosage and crystallization time on the rate of crystallization at 125 °C.
Figure 10. The effect of PGA dosage on the relative crystallization of PLLA/PGA composites at 85 °C.
Figure 11. Isothermal relative crystallinity of the PLLA/PGA composite with 8 wt% of the fiber.