| Literature DB >> 36015925 |
Julian Brunthaler, Patryk Grabski, Valentin Sturm, Wolfgang Lubowski, Dmitry Efrosinin.
Abstract
The last few decades have been characterised by a very active application of smart technologies in various fields of industry. This paper deals with industrial activities, such as injection molding, in which the manufacturing process must be monitored continuously to identify both the effective running time and the down-time periods. Two supervised machine learning algorithms are developed to automatically recognize the operating states of injection molding machines. The first algorithm uses features from descriptive statistics directly, while the second utilizes a convolutional neural network. The automatic state recognition system is equipped with a 3D accelerometer sensor whose datasets are used to train and verify the proposed algorithms. The novelty of our contribution is that accelerometer-based machine learning models are used to distinguish producing and non-producing periods by recognizing the key steps of an injection molding cycle. The first testing results show an overall balanced accuracy of approximately 72-92%, which illustrates the large potential of an accelerometer-based monitoring system. According to an ANOVA test, there are no significant statistical differences between the compared algorithms, but the results of the neural network exhibit higher variances in the defined accuracy metrics.
Keywords: accelerometer sensor; convolutional neural network; injection molding; machine-learning algorithm; state recognition
Year: 2022 PMID: 36015925 PMCID: PMC9413099 DOI: 10.3390/s22166165
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Figure 1. The left image shows an injection molding machine with a mounted 3-axis accelerometer (sensor marked by a red circle in the center of the image). The right picture shows the mounting of the same sensor in detail.
Figure 2. Excerpts from three different datasets (a–c). The injection molding cycles shown in (a,b) are rather short cycles of slightly more than 20 s; (c) shows a slightly longer cycle of about 40 s in length.
Figure 3. Depiction of the pre-processing step: (a) original 3D dataset; (b) transformed 1D acceleration-magnitude time series.
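The pre-processing step of Figure 3 collapses the three accelerometer axes into a single magnitude signal. A minimal sketch, assuming the standard Euclidean magnitude |a| = sqrt(ax² + ay² + az²) and using made-up sample values:

```python
import numpy as np

# Hypothetical 3-axis accelerometer samples (columns: x, y, z)
acc = np.array([[0.0, 3.0, 4.0],
                [1.0, 2.0, 2.0],
                [2.0, 0.0, 0.0]])

# Collapse the 3D signal into a 1D magnitude time series:
# |a| = sqrt(ax^2 + ay^2 + az^2), one value per sample
magnitude = np.linalg.norm(acc, axis=1)
print(magnitude)   # [5. 3. 2.]
```

This reduction makes the downstream features invariant to the sensor's mounting orientation, which matters when the sensor is attached to different machines.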
Figure 4. Our Inception module for the time series segmentation. Note that the three 1D-Conv blocks marked in green have different receptive fields.
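The idea behind the module in Figure 4 — parallel 1D convolutions with different receptive fields whose outputs are stacked as feature channels — can be illustrated with a toy NumPy sketch. The kernel sizes and random weights here are illustrative stand-ins, not the paper's learned filters:

```python
import numpy as np

def conv1d_same(x, kernel):
    # 1D convolution with 'same' padding: output length equals input length
    return np.convolve(x, kernel, mode="same")

def inception_block(x, kernel_sizes=(3, 7, 15)):
    """Toy Inception-style 1D module: parallel convolutions with
    different receptive fields, outputs stacked as feature channels."""
    rng = np.random.default_rng(0)
    branches = []
    for k in kernel_sizes:
        w = rng.standard_normal(k) / k            # random weights stand in for learned ones
        branches.append(np.maximum(conv1d_same(x, w), 0.0))  # ReLU activation
    return np.stack(branches)                     # shape: (n_branches, len(x))

signal = np.sin(np.linspace(0, 8 * np.pi, 200))   # synthetic 1D input
features = inception_block(signal)
print(features.shape)                             # (3, 200)
```

Combining several receptive fields lets the segmenter react to both short transients (e.g. mold closing) and longer phases of an injection cycle in a single layer.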
Experiment I: Confusion matrix of our tree-based algorithm.
| True \ Predicted | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| 1 | 84.6 | 2.60 | 0.106 | 12.4 | 0.229 |
| 2 | 8.13 | 53.9 | 8.49 | 0.518 | 28.9 |
| 3 | 0.217 | 14.4 | 79.6 | 4.37 | 1.40 |
| 4 | 15.8 | 0.368 | 0.0124 | 81.0 | 2.77 |
| 5 | 2.64 | 18.9 | 3.56 | 10.6 | 64.2 |
Experiment I: Confusion matrix of our neural network.
| True \ Predicted | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| 1 | 84.8 | 1.95 | 3.82 | 7.71 | 1.68 |
| 2 | 1.27 | 64.4 | 8.92 | 1.20 | 24.2 |
| 3 | 10.7 | 8.60 | 78.5 | 0.965 | 1.18 |
| 4 | 7.55 | 1.21 | 0.353 | 88.4 | 2.52 |
| 5 | 2.30 | 35.2 | 4.42 | 3.65 | 54.4 |
Figure 5. Experiment I: box-and-whisker chart of BACC values over our 15 datasets for both considered methods.
Figure 6. Experiment I: box-and-whisker chart of ACC values over our 15 datasets for both considered methods.
Figure 7. Experiment I: box-and-whisker chart of F-score values over our 15 datasets for both considered methods.
Summary of results from our Experiment I.
| Model | BACC | ACC | F-Score |
|---|---|---|---|
| Classic | 0.7268 | 0.6846 | 0.6829 |
| NN | 0.7411 | 0.6698 | 0.7002 |
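The BACC values in the summary can be recovered from the confusion matrices above: since each matrix row is normalized to percentages, balanced accuracy is simply the mean of the diagonal (the mean per-class recall). A sketch using the Experiment I tree-based matrix:

```python
import numpy as np

# Row-normalized confusion matrix (%) of the tree-based model, Experiment I
cm = np.array([
    [84.6,   2.60,  0.106,  12.4,   0.229],
    [8.13,  53.9,   8.49,    0.518, 28.9 ],
    [0.217, 14.4,  79.6,     4.37,   1.40],
    [15.8,   0.368, 0.0124, 81.0,    2.77],
    [2.64,  18.9,   3.56,   10.6,   64.2 ],
])

# Balanced accuracy = mean per-class recall = mean of the diagonal
# of a row-normalized confusion matrix
bacc = cm.diagonal().mean() / 100.0
print(round(bacc, 4))   # 0.7266, close to the reported BACC of 0.7268
```

The same calculation applied to the neural-network matrix yields 0.7411, exactly matching the NN row of the summary table.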
Statistical comparison of our results based on performance measures for each dataset.
| Perf. Measure | Balanced Acc. | Accuracy | F-Score |
|---|---|---|---|
| p-value (ANOVA) | 0.9871 | 0.7837 | 0.9281 |
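The comparison rests on a one-way ANOVA over the per-dataset performance measures of the two methods. A minimal pure-Python sketch of the underlying F statistic; the per-dataset BACC values below are illustrative stand-ins, not the paper's data:

```python
def one_way_anova_f(*groups):
    """F statistic of a one-way ANOVA: ratio of between-group
    to within-group mean squares."""
    k = len(groups)                                   # number of groups
    n = sum(len(g) for g in groups)                   # total sample count
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2
                    for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical per-dataset BACC values for the two methods (illustrative only)
classic = [0.71, 0.74, 0.70, 0.75, 0.73]
nn      = [0.69, 0.78, 0.72, 0.80, 0.71]
f_stat = one_way_anova_f(classic, nn)
print(f_stat)
```

A small F statistic (and correspondingly large p-value, as in the table above) indicates that the between-method variation is not distinguishable from the per-dataset noise.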
Experiment II: Confusion matrix of our tree-based algorithm.
| True \ Predicted | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| 1 | 94.7 | 1.34 | 0.651 | 3.83 | 0.0385 |
| 2 | 3.39 | 72.1 | 1.50 | 0.0316 | 23.0 |
| 3 | 0.0524 | 2.96 | 96.0 | 0.123 | 0.854 |
| 4 | 3.50 | 0.221 | 0. | 94.6 | 1.65 |
| 5 | 2.18 | 12.5 | 4.21 | 10.0 | 71.1 |
Experiment II: Confusion matrix of our neural network.
| True \ Predicted | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| 1 | 98.8 | 0.1845 | 0.524 | 0.1053 | 0.1415 |
| 2 | 0.231 | 85.7 | 0.412 | 0.121 | 13.5 |
| 3 | 0.0705 | 0.337 | 99.1 | 0.0303 | 0.492 |
| 4 | 0.187 | 0.133 | 0.0286 | 98.9 | 0.714 |
| 5 | 0.321 | 10.1 | 0.755 | 0.761 | 88.1 |
Summary of results from our Experiment II.
| Model | BACC | ACC | F-Score |
|---|---|---|---|
| Classic | 0.8571 | 0.8027 | 0.8152 |
| NN | 0.9413 | 0.9166 | 0.9338 |