Jui-Sheng Chou, Stela Tjandrakusuma, Chi-Yun Liu.
Abstract
Most building structures built today are made of concrete, owing to its many favorable properties. Compressive strength is a mechanical property of concrete that is directly related to structural safety, so predicting it can facilitate early planning of material quality management. A series of deep learning (DL) models suited to computer vision tasks, namely convolutional neural networks (CNNs), are used to predict the compressive strength of ready-mixed concrete. To demonstrate the efficacy of computer vision-based prediction, its effectiveness using imaged numerical data was compared with that of the deep neural network (DNN) technique, which uses conventional numerical data. Various DL prediction models were compared on the relevant concrete datasets and the best ones were identified. The best DL models were then optimized by fine-tuning their hyperparameters with a newly developed bio-inspired metaheuristic algorithm, the jellyfish search optimizer, to enhance accuracy and reliability. Analytical experiments indicate that the computer vision-based CNNs outperform the numerical data-based DNNs on all evaluation metrics except training time. Thus, bio-inspired optimization of computer vision-based convolutional neural networks is a potentially promising approach for predicting the compressive strength of ready-mixed concrete.
Year: 2022 PMID: 35958762 PMCID: PMC9359848 DOI: 10.1155/2022/9541115
Source DB: PubMed Journal: Comput Intell Neurosci
Figure 1. Ready-mixed concrete manufacturing process.
Figure 2. Simple ANN model architecture.
Figure 3. Deep neural network (DNN) model architecture.
Figure 4. Generic convolutional neural network (CNN) model architecture.
Figure 5. Convolutional layer multiplication process and plot of ReLU.
Figure 6. Example of max pooling and average pooling.
Figure 7. VGG16 and VGG19 models' architectures.
Figure 8. Residual connection.
Figure 9. ResNet model architecture.
Figure 10. Inception-v3 and Inception-ResNet-v2 models' architectures.
Figure 11. Xception model architecture.
Figure 12. MobileNet and MobileNetV2 models' architectures.
Figure 13. DenseNet model architecture.
Figure 14. Neural architecture search method.
Figure 15. NASNet-A mobile model architecture.
Figure 16. EfficientNet model architecture.
Figure 17. Phases of the jellyfish search algorithm.
Figure 18. Algorithmic flowchart of the jellyfish search algorithm.
Figure 19. Pseudocode of the jellyfish search algorithm.
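The jellyfish search phases shown in Figures 17–19 (ocean-current following, passive swarm motion, and active swarm motion, switched by a time-control function) can be sketched as follows. This is a minimal illustrative re-implementation, not the authors' code; the coefficients (distribution factor 3, motion scale 0.1) and the greedy replacement rule follow the published algorithm description.

```python
import numpy as np

def jellyfish_search(obj, bounds, n_pop=20, n_iter=100, seed=0):
    """Minimal jellyfish search (JS) sketch minimizing obj over box bounds."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], float)
    hi = np.array([b[1] for b in bounds], float)
    pop = lo + rng.random((n_pop, dim)) * (hi - lo)
    fit = np.array([obj(x) for x in pop])
    for t in range(1, n_iter + 1):
        best = pop[fit.argmin()]
        # time-control function: decides ocean current vs. swarm motion
        c = abs((1 - t / n_iter) * (2 * rng.random() - 1))
        for i in range(n_pop):
            if c >= 0.5:
                # phase 1: drift with the ocean current toward the best jellyfish
                trend = best - 3 * rng.random() * pop.mean(axis=0)
                cand = pop[i] + rng.random(dim) * trend
            elif rng.random() > 1 - c:
                # phase 2: passive motion around the current position
                cand = pop[i] + 0.1 * rng.random(dim) * (hi - lo)
            else:
                # phase 3: active motion toward (or away from) a random jellyfish
                j = rng.integers(n_pop)
                step = pop[j] - pop[i] if fit[j] < fit[i] else pop[i] - pop[j]
                cand = pop[i] + rng.random(dim) * step
            cand = np.clip(cand, lo, hi)  # keep candidates inside the search box
            f = obj(cand)
            if f < fit[i]:  # greedy replacement
                pop[i], fit[i] = cand, f
    return pop[fit.argmin()], fit.min()
```

In this study the decision variables would be the hyperparameters listed later (batch size, epochs, learning rate, dropout rate, and so on) and `obj` the validation error of the trained CNN.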
Performance metrics.
| Performance metric | Formula |
|---|---|
| Mean absolute error (MAE) | $\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n} \lvert y_i - y'_i \rvert$ |
| Mean squared error (MSE) | $\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n} (y_i - y'_i)^2$ |
| Root mean squared error (RMSE) | $\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (y_i - y'_i)^2}$ |
| Mean absolute percentage error (MAPE) | $\mathrm{MAPE} = \frac{1}{n}\sum_{i=1}^{n} \left\lvert \frac{y_i - y'_i}{y_i} \right\rvert \times 100\%$ |
| Synthesis index (SI) | $\mathrm{SI} = \frac{1}{m}\sum_{i=1}^{m} \frac{P_i - P_{\min,i}}{P_{\max,i} - P_{\min,i}}$ |
Note. n, number of predictions; y, actual value; y′, predicted value; m, number of performance metrics; P, value of the performance metric; Pmin, minimum value of performance metric; Pmax, maximum value of performance metric.
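These metrics can be computed directly from the definitions above. The sketch below assumes NumPy arrays; for the synthesis index, each metric is min-max normalized across the compared models, so SI = 0 would mean best on every metric.

```python
import numpy as np

def evaluate(y, y_pred):
    """MAE, MSE, RMSE, and MAPE for actual values y and predictions y_pred."""
    y, y_pred = np.asarray(y, float), np.asarray(y_pred, float)
    err = y - y_pred
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    mape = np.mean(np.abs(err / y)) * 100.0  # assumes no zero actual values
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "MAPE": mape}

def synthesis_index(metrics_by_model):
    """SI per model: average of min-max normalized metric values across models."""
    names = list(next(iter(metrics_by_model.values())).keys())
    out = {}
    for model, md in metrics_by_model.items():
        s = 0.0
        for k in names:
            vals = [m[k] for m in metrics_by_model.values()]
            pmin, pmax = min(vals), max(vals)
            s += (md[k] - pmin) / (pmax - pmin) if pmax > pmin else 0.0
        out[model] = s / len(names)
    return out
```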
Convolutional neural network-based models in the Keras Applications module.
| Model | Top 1 accuracy | Top 5 accuracy | Depth | Size (MB) | Parameters | Reference |
|---|---|---|---|---|---|---|
| VGG16 | 0.713 | 0.901 | 23 | 528 | 138,357,544 | [ |
| VGG19 | 0.713 | 0.900 | 26 | 549 | 143,667,240 | |
| ResNet50 | 0.749 | 0.921 | — | 98 | 25,636,712 | [ |
| ResNet101 | 0.764 | 0.928 | — | 171 | 44,707,176 | |
| ResNet152 | 0.766 | 0.931 | — | 232 | 60,419,944 | |
| ResNet50V2 | 0.760 | 0.930 | — | 98 | 25,613,800 | [ |
| ResNet101V2 | 0.772 | 0.938 | — | 171 | 44,675,560 | |
| ResNet152V2 | 0.780 | 0.942 | — | 232 | 60,380,648 | |
| InceptionV3 | 0.779 | 0.937 | 159 | 92 | 23,851,784 | [ |
| InceptionResNetV2 | 0.803 | 0.953 | 572 | 215 | 55,873,736 | [ |
| Xception | 0.790 | 0.945 | 126 | 88 | 22,910,480 | [ |
| MobileNet | 0.704 | 0.895 | 88 | 16 | 4,253,864 | [ |
| MobileNetV2 | 0.713 | 0.901 | 88 | 14 | 3,538,984 | [ |
| DenseNet121 | 0.750 | 0.923 | 121 | 33 | 8,062,504 | [ |
| DenseNet169 | 0.762 | 0.932 | 169 | 57 | 14,307,880 | |
| DenseNet201 | 0.773 | 0.936 | 201 | 80 | 20,242,984 | |
| NASNetMobile | 0.744 | 0.919 | — | 23 | 5,326,716 | [ |
| NASNetLarge | 0.825 | 0.960 | — | 343 | 88,949,818 | |
| EfficientNetB0 | — | — | — | 29 | 5,330,571 | [ |
| EfficientNetB1 | — | — | — | 31 | 7,856,239 | |
| EfficientNetB2 | — | — | — | 36 | 9,177,569 | |
| EfficientNetB3 | — | — | — | 48 | 12,320,535 | |
| EfficientNetB4 | — | — | — | 75 | 19,466,823 | |
| EfficientNetB5 | — | — | — | 118 | 30,562,527 | |
| EfficientNetB6 | — | — | — | 166 | 43,265,143 | |
| EfficientNetB7 | — | — | — | 256 | 66,658,687 | |
Variables in the datasets.
| Dataset variable | Dataset 1 | Dataset 2 | Dataset 3 |
|---|---|---|---|
| Design strength of concrete | — | — | ✓ |
| Target strength of concrete | — | — | ✓ |
| Slump test | — | — | ✓ |
| Chloride ion content | — | — | ✓ |
| Temperature | — | — | ✓ |
| Water-binder ratio | ✓ | ✓ | ✓ |
| Water content of concrete | ✓ | ✓ | ✓ |
| Cementitious material consumption | ✓ | — | ✓ |
| Cement ratio | ✓ | — | ✓ |
| Amount of cement | ✓ | ✓ | ✓ |
| Amount of slag powder | ✓ | ✓ | ✓ |
| Amount of fly ash | ✓ | ✓ | ✓ |
| Amount of fine aggregate | ✓ | ✓ | ✓ |
| Amount of coarse aggregate | ✓ | ✓ | ✓ |
| Sand ratio | ✓ | — | ✓ |
| Location (north) | ✓ | — | ✓ |
| Location (middle) | ✓ | — | ✓ |
| Location (south) | ✓ | — | ✓ |
| Compressive strength test | ✓ | ✓ | ✓ |
Note. Dataset 1 = industry recommendation; dataset 2 = suggested by research community; dataset 3 = all features considered. Variables in dataset 2 are frequently used to determine the compressive strength of concrete in the literature.
Number of data points in the datasets.
| Number of data points | Dataset 1 | Dataset 2 | Dataset 3 |
|---|---|---|---|
| Number of total samples | 6705 | 6705 | 5856 |
| Number of training samples | 6381 | 6381 | 5532 |
| Number of testing samples | 324 | 324 | 324 |
| Number of input variables | 13 | 7 | 18 |
| Number of output variables | 1 | 1 | 1 |
Descriptive statistics of variables from the datasets.
Dataset 1

| Variables | Unit | Minimum | Maximum | Average |
|---|---|---|---|---|
| Water-binder ratio | — | 0.25 | 0.87 | 0.52 |
| Water content of concrete | kg/m3 | 121.00 | 250.25 | 184.98 |
| Cementitious material consumption | kg/m3 | 209.00 | 690.00 | 361.15 |
| Cement ratio | % | 30.67 | 100.00 | 70.33 |
| Amount of cement | kg/m3 | 99.20 | 507.00 | 255.43 |
| Amount of slag powder | kg/m3 | 0.00 | 209.35 | 68.90 |
| Amount of fly ash | kg/m3 | 0.00 | 180.00 | 36.82 |
| Amount of coarse aggregate | kg/m3 | 344.24 | 1281.00 | 919.30 |
| Amount of fine aggregate | kg/m3 | 468.00 | 1376.96 | 860.22 |
| Sand ratio | % | 0.00 | 80.00 | 48.32 |
| Location (north) | — | 0.00 | 1.00 | 0.44 |
| Location (middle) | — | 0.00 | 1.00 | 0.12 |
| Location (south) | — | 0.00 | 1.00 | 0.44 |
| Compressive strength test | kgf/cm2 | 125.00 | 724.00 | 343.49 |

Dataset 2

| Variables | Unit | Minimum | Maximum | Average |
|---|---|---|---|---|
| Water-binder ratio | — | 0.25 | 0.87 | 0.52 |
| Water content of concrete | kg/m3 | 121.00 | 250.25 | 184.98 |
| Amount of cement | kg/m3 | 99.20 | 507.00 | 255.43 |
| Amount of slag powder | kg/m3 | 0.00 | 209.35 | 68.90 |
| Amount of fly ash | kg/m3 | 0.00 | 180.00 | 36.82 |
| Amount of fine aggregate | kg/m3 | 468.00 | 1376.96 | 860.22 |
| Amount of coarse aggregate | kg/m3 | 344.24 | 1281.00 | 919.30 |
| Compressive strength test | kgf/cm2 | 125.00 | 724.00 | 343.49 |

Dataset 3

| Variables | Unit | Minimum | Maximum | Average |
|---|---|---|---|---|
| Design strength of concrete | kgf/cm2 | 140.00 | 420.00 | 254.40 |
| Target strength of concrete | kgf/cm2 | 160.00 | 660.00 | 320.27 |
| Slump test | cm | 8.50 | 69.00 | 19.48 |
| Chloride ion content | % | 0.00 | 0.14 | 0.04 |
| Temperature | °C | 14.00 | 35.00 | 26.19 |
| Water-binder ratio | — | 0.25 | 0.83 | 0.52 |
| Water content of concrete | kg/m3 | 121.00 | 250.25 | 184.87 |
| Cementitious material consumption | kg/m3 | 209.00 | 690.00 | 363.42 |
| Cement ratio | % | 30.67 | 100.00 | 70.10 |
| Amount of cement | kg/m3 | 99.20 | 507.00 | 256.17 |
| Amount of slag powder | kg/m3 | 0.00 | 209.35 | 68.85 |
| Amount of fly ash | kg/m3 | 0.00 | 180.00 | 38.40 |
| Amount of fine aggregate | kg/m3 | 468.00 | 1376.96 | 860.88 |
| Amount of coarse aggregate | kg/m3 | 344.24 | 1281.00 | 916.43 |
| Sand ratio | % | 0.00 | 80.00 | 48.41 |
| Location (north) | — | 0.00 | 1.00 | 0.48 |
| Location (middle) | — | 0.00 | 1.00 | 0.13 |
| Location (south) | — | 0.00 | 1.00 | 0.39 |
| Compressive strength | kgf/cm2 | 162.00 | 724.00 | 344.82 |
Figure 20. Conversion example of numerical data to image data.
Figure 21. Input images and corresponding output labels.
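The numerical-to-image conversion of Figure 20 is not fully specified in this record; a plausible minimal sketch is shown below, assuming min-max scaling of each feature to [0, 255] and a tiled square grayscale layout. The exact pixel mapping in the paper may differ.

```python
import numpy as np

def row_to_image(features, side=32, lo=None, hi=None):
    """Convert one row of numerical features into a side x side grayscale image.

    Hypothetical layout: each feature is min-max scaled to [0, 255], then the
    scaled vector is repeated until it fills the image. lo/hi may be supplied
    (e.g. per-feature training-set bounds) so the scaling is consistent
    across rows.
    """
    x = np.asarray(features, float)
    lo = x.min() if lo is None else np.asarray(lo, float)
    hi = x.max() if hi is None else np.asarray(hi, float)
    span = np.where(hi - lo == 0, 1, hi - lo)  # guard against constant features
    scaled = np.round(255 * (x - lo) / span)
    flat = np.resize(scaled, side * side)      # repeat values to fill the image
    return flat.reshape(side, side).astype(np.uint8)
```

Images built this way can be fed to the CNNs above exactly like photographs, which is what enables the computer-vision treatment of tabular mix-design data.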
Experimental settings.
| Research task | Data type | Purpose | Method |
|---|---|---|---|
| Comparison of deep learning models | Numerical data and image data | Search for the best CNN model (using image data) and compare the best CNN model with a DNN model (using numerical data) | CNNs and DNN: VGG, ResNet, ResNetV2, InceptionV3, InceptionResNetV2, MobileNets, MobileNetV2, NASNet, EfficientNets, DenseNet |
| Construction of optimized deep learning models | Image data | Enhance the best-performing models with optimized hyperparameters | Optimizing deep learning models by jellyfish search algorithm |
Number of hidden nodes in each hidden layer of the deep neural network (DNN).
| Hidden layer | 1st | 2nd | 3rd | 4th | 5th | 6th | 7th | 8th–10th | 11th–20th | 21st–30th | 31st–40th | 41st–50th |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Number of hidden nodes | 4096 | 2048 | 1024 | 512 | 256 | 128 | 64 | 32 | 16 | 8 | 4 | 2 |
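The hidden-node schedule in the table can be generated programmatically; in Keras one would then stack `Dense` layers of these widths. The helper below is illustrative, not the authors' code.

```python
def dnn_hidden_nodes(n_layers=50):
    """Hidden-node widths per the table: 4096 halving down to 64 for layers
    1-7, then 32 for layers 8-10 and 16/8/4/2 for each following block of ten."""
    nodes = [4096, 2048, 1024, 512, 256, 128, 64]
    nodes += [32] * 3            # 8th-10th layers
    for width in (16, 8, 4, 2):  # 11th-20th, 21st-30th, 31st-40th, 41st-50th
        nodes += [width] * 10
    return nodes[:n_layers]
```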
Deep learning model performance on dataset 1.
| Model | Training time (h:m:s) | MAPE (%) | RMSE (kgf/cm2) | MAE (kgf/cm2) | SI |
|---|---|---|---|---|---|
| Xception | 2:31:01 | 14.0264 | 76.4217 | 57.7471 | 0.285 |
| VGG16 | 0:17:58 | 15.9598 | 85.6755 | 65.6588 | 0.249 |
| VGG19 | 0:21:16 | 14.8719 | 79.9609 | 61.1835 | 0.147 |
| ResNet50 | 0:18:28 | 15.0252 | 80.8499 | 62.5910 | 0.164 |
| ResNet101 | 0:32:30 | 14.3817 | 75.4145 | 57.5325 | 0.091 (3) |
| ResNet152 | 0:44:12 | 14.3462 | 78.5345 | 59.2920 | 0.144 |
| ResNet50V2 | 0:16:56 | 13.8000 | 73.7818 | 56.4419 | 0.027 (1) |
| ResNet101V2 | 0:29:17 | 14.7393 | 74.4318 | 58.7149 | 0.100 |
| ResNet152V2 | 0:43:11 | 16.2188 | 85.7025 | 67.9747 | 0.318 |
| InceptionV3 | 0:45:14 | 14.2727 | 75.2054 | 58.0613 | 0.111 |
| InceptionResNetV2 | 1:43:38 | 14.7849 | 77.9702 | 60.6185 | 0.264 |
| MobileNet | 0:10:35 | 15.1504 | 79.4783 | 62.0158 | 0.141 |
| MobileNetV2 | 0:12:09 | 17.2442 | 82.0049 | 63.9462 | 0.244 |
| DenseNet121 | 0:18:49 | 15.6411 | 82.3700 | 64.8698 | 0.213 |
| DenseNet169 | 0:24:41 | 15.3712 | 80.3540 | 63.4890 | 0.190 |
| DenseNet201 | 0:31:28 | 15.4292 | 80.7504 | 63.7175 | 0.207 |
| NASNetMobile | 0:35:50 | 15.8436 | 82.1353 | 64.5839 | 0.244 |
| EfficientNetB0 | 0:18:10 | 14.2585 | 75.3426 | 58.3330 | 0.069 (2) |
| EfficientNetB1 | 0:27:30 | 14.9977 | 80.3189 | 61.8780 | 0.169 |
| EfficientNetB2 | 0:28:38 | 15.4503 | 80.6015 | 63.4244 | 0.200 |
| EfficientNetB3 | 0:35:30 | 14.9118 | 80.7490 | 61.6706 | 0.181 |
| EfficientNetB4 | 0:45:16 | 14.6289 | 79.1150 | 61.1313 | 0.173 |
| EfficientNetB5 | 0:59:30 | 14.5750 | 76.2138 | 59.6796 | 0.164 |
| EfficientNetB6 | 1:13:08 | 15.0979 | 81.5700 | 62.4580 | 0.261 |
| EfficientNetB7 | 1:42:25 | 14.4783 | 74.6023 | 59.1117 | 0.218 |
| DNN | 0:00:46 | 21.4910 | 112.2759 | 87.9198 | 0.750 |
Deep learning model performance on dataset 2.
| Model | Training time (h:m:s) | MAPE (%) | RMSE (kgf/cm2) | MAE (kgf/cm2) | SI |
|---|---|---|---|---|---|
| Xception | 1:12:19 | 17.2430 | 89.4229 | 70.1275 | 0.493 |
| VGG16 | 0:10:26 | 18.3202 | 100.0902 | 73.6012 | 0.387 |
| VGG19 | 0:12:32 | 17.7289 | 98.4250 | 74.1325 | 0.373 |
| ResNet50 | 0:12:24 | 15.0208 | 78.1490 | 61.1607 | 0.105 |
| ResNet101 | 0:21:34 | 14.5379 | 74.5514 | 58.2454 | 0.086 (2) |
| ResNet152 | 0:30:17 | 16.5355 | 90.8698 | 67.9069 | 0.321 |
| ResNet50V2 | 0:11:28 | 15.7710 | 80.7428 | 63.8803 | 0.154 |
| ResNet101V2 | 0:20:19 | 15.1000 | 76.9871 | 60.4406 | 0.124 |
| ResNet152V2 | 0:28:43 | 14.4184 | 73.6713 | 57.4732 | 0.098 |
| InceptionV3 | 0:27:04 | 16.4996 | 87.7770 | 69.3648 | 0.302 |
| InceptionResNetV2 | 1:02:49 | 15.8074 | 84.5667 | 66.3267 | 0.371 |
| MobileNet | 0:06:18 | 15.0699 | 77.6579 | 61.3514 | 0.084 (1) |
| MobileNetV2 | 0:07:26 | 18.6955 | 83.8692 | 68.5122 | 0.264 |
| DenseNet121 | 0:12:41 | 15.3784 | 83.6683 | 64.4323 | 0.167 |
| DenseNet169 | 0:17:17 | 15.5968 | 85.7416 | 65.6683 | 0.209 |
| DenseNet201 | 0:22:02 | 14.9204 | 83.5047 | 62.5952 | 0.175 |
| NASNetMobile | 0:20:32 | 22.3714 | 115.6740 | 93.5611 | 0.745 |
| EfficientNetB0 | 0:12:27 | 15.0674 | 80.2897 | 61.9535 | 0.124 |
| EfficientNetB1 | 0:17:52 | 15.5417 | 81.8874 | 63.5534 | 0.174 |
| EfficientNetB2 | 0:18:12 | 14.5659 | 77.0040 | 59.4023 | 0.096 (3) |
| EfficientNetB3 | 0:21:51 | 15.4034 | 81.4909 | 63.2014 | 0.180 |
| EfficientNetB4 | 0:27:17 | 15.7882 | 83.7080 | 64.3034 | 0.228 |
| EfficientNetB5 | 0:36:30 | 14.9359 | 78.5506 | 60.4780 | 0.185 |
| EfficientNetB6 | 0:45:47 | 15.0286 | 78.3096 | 61.4788 | 0.225 |
| EfficientNetB7 | 1:01:55 | 15.3541 | 80.7350 | 62.9946 | 0.313 |
| DNN | 0:00:40 | 23.9215 | 119.4923 | 95.5619 | 0.750 |
Deep learning model performance on dataset 3.
| Model | Training time (h:m:s) | MAPE (%) | RMSE (kgf/cm2) | MAE (kgf/cm2) | SI |
|---|---|---|---|---|---|
| Xception | 2:59:21 | 13.0524 | 64.3952 | 51.9907 | 0.326 |
| VGG16 | 0:20:24 | 13.8902 | 69.6804 | 56.1444 | 0.165 |
| VGG19 | 0:23:28 | 14.1965 | 69.1497 | 55.3852 | 0.169 |
| ResNet50 | 0:20:26 | 13.9585 | 70.9084 | 57.2012 | 0.178 |
| ResNet101 | 0:34:28 | 11.6479 | 59.5678 | 47.1273 | 0.049 (2) |
| ResNet152 | 0:48:51 | 11.7348 | 60.4651 | 47.2528 | 0.076 |
| ResNet50V2 | 0:18:31 | 12.3769 | 64.2470 | 51.1966 | 0.083 |
| ResNet101V2 | 0:31:55 | 12.1501 | 63.4072 | 49.7640 | 0.086 |
| ResNet152V2 | 0:46:54 | 12.0281 | 61.4381 | 48.0813 | 0.087 |
| InceptionV3 | 0:51:37 | 13.3665 | 64.9479 | 52.1857 | 0.157 |
| InceptionResNetV2 | 1:57:42 | 13.5490 | 64.0620 | 51.9634 | 0.248 |
| MobileNet | 0:12:27 | 13.1265 | 65.2462 | 52.4753 | 0.100 |
| MobileNetV2 | 0:14:00 | 13.0155 | 60.0301 | 47.8787 | 0.053 |
| DenseNet121 | 0:20:09 | 11.7167 | 59.3511 | 47.1034 | 0.029 (1) |
| DenseNet169 | 0:25:51 | 11.7929 | 61.0411 | 47.9452 | 0.051 (3) |
| DenseNet201 | 0:32:50 | 11.6047 | 60.6139 | 47.7109 | 0.054 |
| NASNetMobile | 0:41:08 | 24.6483 | 109.9440 | 93.7760 | 0.780 |
| EfficientNetB0 | 0:20:37 | 12.9179 | 64.7033 | 52.1055 | 0.103 |
| EfficientNetB1 | 0:30:28 | 13.3862 | 68.6528 | 55.0783 | 0.159 |
| EfficientNetB2 | 0:31:25 | 13.4363 | 68.1339 | 54.9089 | 0.159 |
| EfficientNetB3 | 0:38:59 | 13.2016 | 67.0390 | 53.1422 | 0.150 |
| EfficientNetB4 | 0:49:39 | 13.1075 | 65.5639 | 52.3979 | 0.153 |
| EfficientNetB5 | 1:09:29 | 13.2787 | 67.5948 | 54.1713 | 0.203 |
| EfficientNetB6 | 1:26:14 | 12.4606 | 63.7319 | 50.5952 | 0.174 |
| EfficientNetB7 | 1:55:15 | 12.8867 | 63.4471 | 51.0061 | 0.224 |
| DNN | 0:00:44 | 22.8024 | 116.0370 | 90.6988 | 0.698 |
Hyperparameter settings for deep learning models.
ResNet50V2 (best model, dataset 1)

| Hyperparameter | Literature value | Search range in this study |
|---|---|---|
| Batch normalization-epsilon | 1.001 | [1.001 |
| Batch size | 64, 256 | [8, 16, 32, 64] |
| Epochs | 40, 90, 300 | [10, 20, 30, 40, 50, 60, 70, 80, 90, 100] |
| ADAM-learning rate | 0.1 | [0.001, 0.005, 0.01, 0.05, 0.1] |
| Dropout rate | 0.5 | 0.00–0.99 |

MobileNet (best model, dataset 2)

| Hyperparameter | Literature value | Search range in this study |
|---|---|---|
| Batch size | 64, 256 | [8, 16, 32, 64] |
| Epochs | 40, 90, 300 | [10, 20, 30, 40, 50, 60, 70, 80, 90, 100] |
| ADAM-learning rate | 0.1 | [0.001, 0.005, 0.01, 0.05, 0.1] |
| Dropout rate | 0.5 | 0.00–0.99 |

DenseNet121 (best model, dataset 3)

| Hyperparameter | Literature value | Search range in this study |
|---|---|---|
| Growth rate | 32 | 12–48 |
| Batch normalization-epsilon | 1.001 | [1.001 |
| Batch size | 64, 256 | [8, 16, 32, 64] |
| Epochs | 40, 90, 300 | [10, 20, 30, 40, 50, 60, 70, 80, 90, 100] |
| Reduction | 0.5 | 0.1–1.0 |
| ADAM-learning rate | 0.1 | [0.001, 0.005, 0.01, 0.05, 0.1] |
| Dropout rate | 0.2 | 0.00–0.99 |
Performance of the best and optimized CNN models.
| Dataset | Model | MAPE (%) | RMSE (kgf/cm2) | MAE (kgf/cm2) |
|---|---|---|---|---|
| 1 | ResNet50V2 | 13.8000 | 73.7818 | 56.4419 |
| 1 | JS-ResNet50V2 | 13.1327 | 68.5794 | 52.4591 |
| 2 | MobileNet | 17.0406 | 91.6945 | 71.1198 |
| 2 | JS-MobileNet | 17.0671 | 90.4711 | 70.0870 |
| 3 | DenseNet121 | 11.7167 | 59.3511 | 47.1034 |
| 3 | JS-DenseNet121 | 11.5443 | 58.4346 | 45.8917 |
Optimal hyperparameters of the best CNN models.
JS-ResNet50V2 (dataset 1)

| Hyperparameter | Optimal value |
|---|---|
| Batch normalization-epsilon | 0.0005 |
| Batch size | 64 |
| Epochs | 100 |
| ADAM-learning rate | 0.001 |
| Dropout rate | 0.26 |

JS-MobileNet (dataset 2)

| Hyperparameter | Optimal value |
|---|---|
| Batch size | 16 |
| Epochs | 70 |
| ADAM-learning rate | 0.001 |
| Dropout rate | 0.65 |

JS-DenseNet121 (dataset 3)

| Hyperparameter | Optimal value |
|---|---|
| Growth rate | 38 |
| Batch normalization-epsilon | 0.00005 |
| Batch size | 64 |
| Epochs | 90 |
| Reduction | 0.7 |
| ADAM-learning rate | 0.001 |
| Dropout rate | 0.33 |
Sensitivity analysis of input features.
| No. | X1 | X2 | X3 | X4 | X5 | X6 | X7 | X8 | X9 | X10 | X11 | X12 | X13 | X14 | X15 | X16 | X17 | X18 | MAPE (%) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.94 |
| 2 | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.91 |
| 3 | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.95 |
| 4 | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.66 |
| 5 | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.99 |
| 6 | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.01 |
| 7 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.32 |
| 8 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.05 |
| 9 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.57 |
| 10 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.17 |
| 11 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.63 |
| 12 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.69 |
| 13 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | ✓ | 11.29 |
| 14 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | ✓ | 11.59 |
| 15 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | ✓ | ✓ | ✓ | 11.26 |
| 16 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | — | — | — | 11.98 |
| 17 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 11.72 |
Note. X1 = design strength of concrete, X2 = target strength of concrete, X3 = slump test, X4 = chloride ion content, X5 = temperature, X6 = water-binder ratio, X7 = water content of concrete, X8 = cementitious material consumption, X9 = cement ratio, X10 = amount of cement, X11 = amount of slag powder, X12 = amount of fly ash, X13 = amount of fine aggregate, X14 = amount of coarse aggregate, X15 = sand ratio, X16 = location (north), X17 = location (middle), and X18 = location (south).
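The leave-one-feature-out procedure behind this table can be sketched generically. Here `train_eval` is a hypothetical callback that retrains the model on the kept features and returns the test MAPE; the paper's actual training pipeline is not reproduced.

```python
def sensitivity_by_omission(train_eval, feature_names):
    """Retrain and score once per omitted feature.

    train_eval(kept) is assumed to return the test MAPE for a model trained
    on the feature subset `kept`; the result maps each omitted feature to
    that MAPE, as in rows 1-15 of the table above.
    """
    results = {}
    for f in feature_names:
        kept = [g for g in feature_names if g != f]
        results[f] = train_eval(kept)
    return results
```

Comparing each entry with the all-features baseline (row 17) shows how much predictive value a feature carries: omitting it with little change in MAPE suggests redundancy.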
Correlation between the feature and compressive strength of ready-mixed concrete.
| Feature | Correlation coefficient between feature and Y |
|---|---|
| X1 | 0.75 |
| X2 | 0.82 |
| X3 | 0.23 |
| X4 | 0.05 |
| X5 | −0.15 |
| X6 | −0.74 |
| X7 | −0.15 |
| X8 | 0.73 |
| X9 | 0.14 |
| X10 | 0.46 |
| X11 | 0.05 |
| X12 | 0.02 |
| X13 | −0.44 |
| X14 | −0.06 |
| X15 | −0.25 |
| X16 | 0.24 |
| X17 | 0.06 |
| X18 | −0.29 |
Note. X1 = design strength of concrete, X2 = target strength of concrete, X3 = slump test, X4 = chloride ion content, X5 = temperature, X6 = water-binder ratio, X7 = water content of concrete, X8 = cementitious material consumption, X9 = cement ratio, X10 = amount of cement, X11 = amount of slag powder, X12 = amount of fly ash, X13 = amount of fine aggregate, X14 = amount of coarse aggregate, X15 = sand ratio, X16 = location (north), X17 = location (middle), X18 = location (south), and Y = compressive strength of ready-mixed concrete.
Results of the order importance analysis of the image-like dataset.
| Image pixel orientation | Type of pixel order | MAPE (%) | RMSE (kgf/cm2) | MAE (kgf/cm2) |
|---|---|---|---|---|
| Original order | Random arrangement | 11.5443 | 58.4346 | 45.8917 |
| Correlated order | Descending by correlated values | 12.0831 | 61.3435 | 48.1922 |
| Correlated order | Descending by absolute correlated values | 12.5888 | 64.5037 | 50.7435 |