Juan M. Barrios, Pablo E. Romero.
Abstract
3D printing by fused deposition modeling (FDM) involves a multitude of control parameters, and it is difficult to predict a priori what surface finish a given combination of parameter values will produce. The objective of this work is to compare the models generated by decision tree algorithms (C4.5, random forest, and random tree) and to determine which best predicts the surface roughness of polyethylene terephthalate glycol (PETG) parts 3D-printed by FDM. The models were built from a training dataset of 27 instances with the following attributes: layer height, extrusion temperature, print speed, print acceleration, and flow rate. In addition, a dataset of 15 further instances was created to evaluate the models. The models generated by the random tree algorithm achieved the best results for predicting the surface roughness of FDM parts.
Keywords: C4.5; PETG; data mining; decision tree; fused deposition modeling (FDM); random forest; random tree; surface roughness
Year: 2019 PMID: 31409019 PMCID: PMC6721777 DOI: 10.3390/ma12162574
Source DB: PubMed Journal: Materials (Basel) ISSN: 1996-1944 Impact factor: 3.623
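The paper compares three Weka tree learners (J48/C4.5, random forest, random tree). As a rough illustration of how C4.5 chooses a split, the sketch below computes entropy and information gain for one candidate attribute; the class counts used are the per-layer-height counts readable from the training results table (14 Class1 / 13 Class2 overall for the parallel-direction roughness), and the helper names are my own, not the paper's code.

```python
from math import log2

def entropy(counts):
    """Shannon entropy (in bits) of a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

def information_gain(parent_counts, child_counts):
    """Entropy reduction when a node splits into children (C4.5's gain criterion)."""
    total = sum(parent_counts)
    weighted = sum(sum(ch) / total * entropy(ch) for ch in child_counts)
    return entropy(parent_counts) - weighted

# Class counts per layer-height level (0.16 / 0.20 / 0.24 mm), taken from the
# training results table for the roughness measured parallel to extrusion.
root = [14, 13]                          # [Class1, Class2] at the root
children = [[6, 3], [3, 6], [5, 4]]      # counts per layer-height level
gain = information_gain(root, children)
print(f"information gain of splitting on layer height: {gain:.4f} bits")
```

C4.5 evaluates this gain (normalized by split information) for every attribute and splits on the best one, which is why layer height tends to appear near the root of the trees in Figures 3 and 4.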
Figure 1. Stages of the methodology followed in the present work: 3D printing, surface roughness measurement, data mining, model generation, model testing, and comparison between algorithms.
Table 1. Factors and levels used in the design of experiments (DOE).
| Print Parameter | Level 1 | Level 2 | Level 3 |
|---|---|---|---|
| Layer height (LH), mm | 0.16 | 0.20 | 0.24 |
| Temperature (T), °C | 240 | 245 | 250 |
| Print speed (PS), mm/s | 40 | 50 | 60 |
| Print acceleration (PA), mm/s² | 500 | 1000 | 1500 |
| Flow rate (F), % | 90 | 100 | 110 |
Figure 2. Measurement of surface roughness parallel (a) and perpendicular (b) to the direction of extrusion.
Table 2. Training dataset: Taguchi L27 design of experiments with layer height (LH), print temperature (T), print speed (PS), print acceleration (PA), and flow rate (F).
| No. | LH (mm) | T (°C) | PS (mm/s) | PA (mm/s²) | F (%) |
|---|---|---|---|---|---|
| 1 | 0.16 | 240 | 40 | 500 | 90 |
| 2 | 0.16 | 240 | 40 | 500 | 100 |
| 3 | 0.16 | 240 | 40 | 500 | 110 |
| 4 | 0.16 | 245 | 50 | 1000 | 110 |
| 5 | 0.16 | 245 | 50 | 1000 | 90 |
| 6 | 0.16 | 245 | 50 | 1000 | 100 |
| 7 | 0.16 | 250 | 60 | 1500 | 100 |
| 8 | 0.16 | 250 | 60 | 1500 | 110 |
| 9 | 0.16 | 250 | 60 | 1500 | 90 |
| 10 | 0.20 | 240 | 50 | 1500 | 100 |
| 11 | 0.20 | 240 | 50 | 1500 | 110 |
| 12 | 0.20 | 240 | 50 | 1500 | 90 |
| 13 | 0.20 | 245 | 60 | 500 | 90 |
| 14 | 0.20 | 245 | 60 | 500 | 100 |
| 15 | 0.20 | 245 | 60 | 500 | 110 |
| 16 | 0.20 | 250 | 40 | 1000 | 110 |
| 17 | 0.20 | 250 | 40 | 1000 | 90 |
| 18 | 0.20 | 250 | 40 | 1000 | 100 |
| 19 | 0.24 | 240 | 60 | 1000 | 110 |
| 20 | 0.24 | 240 | 60 | 1000 | 90 |
| 21 | 0.24 | 240 | 60 | 1000 | 100 |
| 22 | 0.24 | 245 | 40 | 1500 | 100 |
| 23 | 0.24 | 245 | 40 | 1500 | 110 |
| 24 | 0.24 | 245 | 40 | 1500 | 90 |
| 25 | 0.24 | 250 | 50 | 500 | 90 |
| 26 | 0.24 | 250 | 50 | 500 | 100 |
| 27 | 0.24 | 250 | 50 | 500 | 110 |
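A defining property of the Taguchi L27 array above is balance: every level of every factor occurs equally often (here, 9 times in 27 runs), so factor effects can be estimated without a full 3⁵ = 243-run factorial. The helper below (a sketch; the function name is my own) checks this property for any candidate design, demonstrated on a small full-factorial stand-in built from three of the table's factors:

```python
from collections import Counter
from itertools import product

def is_balanced(design):
    """True if, for every factor (column), all levels occur equally often."""
    for column in zip(*design):
        level_counts = Counter(column).values()
        if len(set(level_counts)) != 1:
            return False
    return True

# A full-factorial 3^3 design (27 runs over three of the factors above) is
# trivially balanced; the L27 array achieves the same balance for all five
# factors in the same 27 runs.
full_factorial = list(product([0.16, 0.20, 0.24], [240, 245, 250], [40, 50, 60]))
print(is_balanced(full_factorial))  # → True
```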
Table 3. Test dataset: 15 additional instances, with layer height (LH), print temperature (T), print speed (PS), print acceleration (PA), and flow rate (F).
| No. | LH (mm) | T (°C) | PS (mm/s) | PA (mm/s²) | F (%) |
|---|---|---|---|---|---|
| 1 | 0.14 | 240 | 15 | 200 | 95 |
| 2 | 0.14 | 236 | 18 | 300 | 105 |
| 3 | 0.18 | 238 | 15 | 300 | 115 |
| 4 | 0.18 | 243 | 20 | 200 | 95 |
| 5 | 0.14 | 246 | 35 | 400 | 105 |
| 6 | 0.23 | 248 | 46 | 400 | 95 |
| 7 | 0.24 | 243 | 45 | 600 | 103 |
| 8 | 0.30 | 230 | 56 | 2000 | 100 |
| 9 | 0.30 | 250 | 60 | 1600 | 110 |
| 10 | 0.20 | 251 | 70 | 1500 | 102 |
| 11 | 0.20 | 249 | 85 | 1200 | 100 |
| 12 | 0.28 | 249 | 100 | 1200 | 98 |
| 13 | 0.28 | 237 | 25 | 1100 | 90 |
| 14 | 0.14 | 238 | 21 | 800 | 100 |
| 15 | 0.10 | 239 | 50 | 600 | 110 |
Table 4. Training dataset: measured surface roughness parallel (Ra∥) and perpendicular (Ra⊥) to the direction of extrusion, and the class assigned to each.
| Test | Ra∥ (µm) | Class | Ra⊥ (µm) | Class |
|---|---|---|---|---|
| 1 | 10.648 | Class2 | 12.240 | Class2 |
| 2 | 0.916 | Class1 | 6.464 | Class1 |
| 3 | 1.126 | Class1 | 9.160 | Class1 |
| 4 | 2.428 | Class1 | 32.994 | Class2 |
| 5 | 1.800 | Class1 | 5.504 | Class1 |
| 6 | 8.814 | Class2 | 10.922 | Class1 |
| 7 | 4.552 | Class2 | 23.650 | Class2 |
| 8 | 1.370 | Class1 | 14.458 | Class2 |
| 9 | 0.954 | Class1 | 5.414 | Class1 |
| 10 | 1.462 | Class1 | 23.470 | Class2 |
| 11 | 1.666 | Class1 | 9.050 | Class1 |
| 12 | 1.554 | Class1 | 10.074 | Class1 |
| 13 | 6.258 | Class2 | 20.088 | Class2 |
| 14 | 7.788 | Class2 | 15.368 | Class2 |
| 15 | 10.172 | Class2 | 12.560 | Class2 |
| 16 | 9.744 | Class2 | 10.186 | Class2 |
| 17 | 4.696 | Class2 | 5.462 | Class1 |
| 18 | 5.112 | Class2 | 5.330 | Class1 |
| 19 | 4.274 | Class2 | 10.668 | Class1 |
| 20 | 6.994 | Class1 | 8.214 | Class1 |
| 21 | 5.868 | Class2 | 6.056 | Class1 |
| 22 | 3.796 | Class2 | 8.680 | Class1 |
| 23 | 3.054 | Class1 | 5.720 | Class1 |
| 24 | 3.702 | Class1 | 6.804 | Class1 |
| 25 | 4.124 | Class1 | 19.654 | Class1 |
| 26 | 4.682 | Class1 | 8.964 | Class2 |
| 27 | 2.256 | Class2 | 7.122 | Class1 |
Figure 3. J48 (C4.5) decision tree for Ra measured parallel to the direction of extrusion.
Figure 4. J48 (C4.5) decision tree for Ra measured perpendicular to the direction of extrusion.
Table 5. Test dataset: measured surface roughness parallel (Ra∥) and perpendicular (Ra⊥) to the direction of extrusion, and the class assigned to each.
| Test | Ra∥ (µm) | Class | Ra⊥ (µm) | Class |
|---|---|---|---|---|
| 1 | 1.026 | Class1 | 4.462 | Class1 |
| 2 | 1.178 | Class1 | 2.656 | Class1 |
| 3 | 2.064 | Class1 | 3.97 | Class1 |
| 4 | 1.126 | Class1 | 7.192 | Class1 |
| 5 | 1.984 | Class1 | 9.24 | Class1 |
| 6 | 1.252 | Class1 | 8.276 | Class1 |
| 7 | 1.026 | Class1 | 4.462 | Class1 |
| 8 | 1.744 | Class1 | 7.732 | Class1 |
| 9 | 5.906 | Class2 | 11.408 | Class1 |
| 10 | 3.99 | Class1 | 12.082 | Class2 |
| 11 | 1.182 | Class1 | 6.392 | Class1 |
| 12 | 1.008 | Class1 | 6.622 | Class1 |
| 13 | 6.466 | Class2 | 14.002 | Class2 |
| 14 | 0.606 | Class1 | 6.828 | Class1 |
| 15 | 1.846 | Class1 | 4.758 | Class1 |
Table 6. Indicators comparing the models generated by the studied algorithms when predicting Ra parallel to the direction of extrusion.
| Indicator | J48 | Random Forest | Random Tree |
|---|---|---|---|
| Correctly Classified Instances | 60.00% | 66.67% | 80.00% |
| Incorrectly Classified Instances | 40.00% | 33.33% | 20.00% |
| Kappa statistic | −0.2162 | 0.1176 | 0.2857 |
| Mean absolute error | 0.4926 | 0.429 | 0.2 |
| Root mean squared error | 0.595 | 0.474 | 0.4472 |
| Relative absolute error | 106.60% | 92.84% | 43.28% |
| Root relative squared error | 128.39% | 102.29% | 96.50% |
Table 7. Detailed accuracy parameters (weighted average) achieved by each algorithm for the Ra-parallel prediction model.
| Detailed Accuracy (weighted av.) | J48 | Random Forest | Random Tree |
|---|---|---|---|
| True Positive (TP) Rate | 0.600 | 0.667 | 0.800 |
| False Positive (FP) Rate | 0.908 | 0.474 | 0.454 |
| Precision | 0.709 | 0.807 | 0.839 |
| Recall | 0.600 | 0.667 | 0.800 |
| F-measure | 0.650 | 0.716 | 0.816 |
| MCC | −0.237 | 0.139 | 0.294 |
| ROC Area | 0.154 | 0.692 | 0.673 |
| PRC Area | 0.697 | 0.868 | 0.819 |
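The weighted figures reported for the random tree (accuracy 80.00%, kappa 0.2857, precision 0.839, MCC 0.294) can all be reproduced from a single 2×2 confusion matrix. The paper does not print the matrix itself, so the one below is a reconstruction consistent with every reported value (rows = actual Class1/Class2, columns = predicted); the calculation is a plain-Python sketch:

```python
from math import sqrt

# Reconstructed confusion matrix consistent with the reported weighted metrics
# (rows: actual Class1, Class2; columns: predicted Class1, Class2).
cm = [[11, 2],
      [1,  1]]
n = sum(sum(row) for row in cm)

accuracy = (cm[0][0] + cm[1][1]) / n          # correctly classified instances

# Cohen's kappa: observed agreement corrected for chance agreement.
p_o = accuracy
p_e = sum((sum(cm[i]) / n) * ((cm[0][i] + cm[1][i]) / n) for i in range(2))
kappa = (p_o - p_e) / (1 - p_e)

# Matthews correlation coefficient, with Class1 treated as positive.
tp, fn, fp, tn = cm[0][0], cm[0][1], cm[1][0], cm[1][1]
mcc = (tp * tn - fp * fn) / sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))

print(round(accuracy, 4), round(kappa, 4), round(mcc, 3))  # → 0.8 0.2857 0.294
```

Kappa stays low here despite 80% accuracy because the test set is heavily skewed toward Class1, so much of the raw agreement is expected by chance.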
Table 8. Indicators comparing the models generated by the studied algorithms when predicting Ra perpendicular to the direction of extrusion.
| Indicator | J48 | Random Forest | Random Tree |
|---|---|---|---|
| Correctly Classified Instances | 73.33% | 80.00% | 86.67% |
| Incorrectly Classified Instances | 26.67% | 20.00% | 13.33% |
| Kappa statistic | −0.1538 | −0.0976 | 0.5946 |
| Mean absolute error | 0.2556 | 0.2888 | 0.1333 |
| Root mean squared error | 0.4645 | 0.3854 | 0.3651 |
| Relative absolute error | 66.17% | 74.57% | 34.52% |
| Root relative squared error | 116.01% | 96.26% | 91.20% |
Table 9. Detailed accuracy parameters (weighted average) achieved by each algorithm for the Ra-perpendicular prediction model.
| Detailed Accuracy (weighted av.) | J48 | Random Forest | Random Tree |
|---|---|---|---|
| True Positive (TP) Rate | 0.733 | 0.800 | 0.867 |
| False Positive (FP) Rate | 0.887 | 0.877 | 0.021 |
| Precision | 0.733 | 0.743 | 0.933 |
| Recall | 0.733 | 0.800 | 0.867 |
| F-measure | 0.733 | 0.770 | 0.883 |
| MCC | −0.154 | −0.105 | 0.650 |
| ROC Area | 0.385 | 0.481 | 0.923 |
| PRC Area | 0.745 | 0.797 | 0.916 |
Table 10. Time used by each algorithm to build and validate each model.
| Algorithm | Computing Time for Ra∥ Model (s) | Computing Time for Ra⊥ Model (s) |
|---|---|---|
| J48 | 0.11 | 0.19 |
| Random Forest | 0.05 | 0.34 |
| Random Tree | 0.01 | 0.01 |
Table 11. Strength of concordance for the kappa statistic.
| Kappa Statistic | Strength of Concordance |
|---|---|
| < 0.00 | Poor |
| 0.00–0.20 | Slight |
| 0.21–0.40 | Fair |
| 0.41–0.60 | Moderate |
| 0.61–0.80 | Substantial |
| 0.81–1.00 | Almost perfect |
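The band names above are the Landis–Koch strength-of-concordance scale. A small helper (the function name is my own) mapping a kappa value to its band:

```python
def kappa_strength(kappa):
    """Map Cohen's kappa to its Landis-Koch strength-of-concordance band."""
    if kappa <= 0.0:
        return "Poor"
    bands = [(0.20, "Slight"), (0.40, "Fair"), (0.60, "Moderate"),
             (0.80, "Substantial"), (1.00, "Almost perfect")]
    for upper, name in bands:
        if kappa <= upper:
            return name
    return "Almost perfect"

# The random tree kappas reported above, 0.2857 and 0.5946:
print(kappa_strength(0.2857), kappa_strength(0.5946))  # → Fair Moderate
```

On this scale, the best models in the paper reach only "Fair" and "Moderate" concordance, which is consistent with the small (15-instance) test set.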