Jessica Fernandes Lopes, Leniza Ludwig, Douglas Fernandes Barbin, Maria Victória Eiras Grossmann, Sylvio Barbon.
Abstract
Imaging sensors are widely employed in the food processing industry for quality control. Flour from malting barley varieties is a valuable ingredient in the food industry, but its use is restricted by quality aspects such as color variations and the presence of husk fragments. Naked varieties, on the other hand, offer superior quality, with better visual appearance and nutritional composition for human consumption. Computer Vision Systems (CVS) can provide automatic and precise classification of samples, but identification of grain and flour characteristics requires more specialized methods. In this paper, we propose a CVS combined with the Spatial Pyramid Partition ensemble (SPPe) technique to distinguish between naked and malting types of twenty-two flour varieties using image features and machine learning. SPPe leverages the analysis of patterns from different spatial regions, providing more reliable classification. Support Vector Machine (SVM), k-Nearest Neighbors (k-NN), J48 decision tree, and Random Forest (RF) were compared for sample classification. The machine learning algorithms embedded in the CVS were induced from 55 image features. Accuracy ranged from 75.00% (k-NN) to 100.00% (J48), showing that sample assessment by CVS with SPPe was highly accurate and represents a potential technique for automatic barley flour classification.
Keywords: computer intelligence; food quality; image processing; machine learning
Year: 2019 PMID: 31277468 PMCID: PMC6650935 DOI: 10.3390/s19132953
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
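The core of the SPPe technique described in the abstract is partitioning each image into progressively finer spatial regions and extracting features per region. The following is a minimal illustrative sketch, not the authors' code (the study used R): it builds a two-level pyramid (whole image, then a 2×2 grid) and concatenates a simple per-region feature. All function names here are hypothetical.

```python
# Hedged sketch of a spatial pyramid partition: level l splits the image
# into a 2^l x 2^l grid, and per-region features are concatenated. In the
# ensemble (SPPe) variant, region features can instead feed separate
# classifiers whose outputs are combined.

def pyramid_regions(h, w, levels):
    """List (top, left, height, width) for every region of each pyramid level."""
    regions = []
    for level in range(levels):
        n = 2 ** level  # n x n grid at this level
        rh, rw = h // n, w // n
        for i in range(n):
            for j in range(n):
                regions.append((i * rh, j * rw, rh, rw))
    return regions

def region_mean(img, top, left, rh, rw):
    """Mean pixel value of one region; img is a list of lists (grayscale)."""
    vals = [img[r][c] for r in range(top, top + rh)
            for c in range(left, left + rw)]
    return sum(vals) / len(vals)

def sppe_feature_vector(img, levels=2):
    """Concatenate one simple per-region feature (the mean) across all levels."""
    h, w = len(img), len(img[0])
    return [region_mean(img, *reg) for reg in pyramid_regions(h, w, levels)]
```

With `levels=2` this yields 1 + 4 = 5 regions, so any per-region feature set is repeated five times in the concatenated vector.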
Figure 1. General overview highlighting the differences among the traditional, Spatial Pyramid Partition (SPP), and Spatial Pyramid Partition ensemble (SPPe) approaches to feature vector composition.
Barley cultivars employed in the experiments. B, malting barley; N, naked barley.
| Sample ID | Cultivar | Type |
|---|---|---|
| B01 | BRS Aliensa | Malting |
| B02 | BRS Itanema | Malting |
| B03 | BRS Brau | Malting |
| B04 | MN 6021 | Malting |
| B05 | BRS Sampa | Malting |
| B06 | BRS Korbel | Malting |
| B07 | MN 6021 | Malting |
| B08 | BRS Elis | Malting |
| B09 | BRS Korbel | Malting |
| B10 | BRS Elis | Malting |
| B11 | BRS Mandurí | Malting |
| B12 | BRS Brau | Malting |
| B13 | BRS Cauê | Malting |
| B14 | BRS Cauê | Malting |
| N01 | 149852 | Naked |
| N02 | 149853 | Naked |
| N03 | 149857 | Naked |
| N04 | 149846 | Naked |
| N05 | 149858 | Naked |
| N06 | 149841 | Naked |
| N07 | 149855 | Naked |
| N08 | 149859 | Naked |
Figure 2. General overview of the proposed approach.
Figure 3. Samples of barley flour from malting (a–n) and naked (o–v) types.
Figure 4. Spatial Pyramid Partition ensemble (SPPe) for obtaining image samples.
List of all image features used in the proposed SPPe approach for barley flour classification.
| No. | Type | Name | Description |
|---|---|---|---|
| 1 | Color | meanH | Mean value of the H channel |
| 2 | Color | stdH | Standard deviation of the H channel |
| 3 | Color | meanS | Mean value of the S channel |
| 4 | Color | stdS | Standard deviation of the S channel |
| 5 | Color | meanV | Mean value of the V channel |
| 6 | Color | stdV | Standard deviation of the V channel |
| 7 | Color | stdHistH | Standard deviation of H channel histogram |
| 8 | Color | kurtHistH | Kurtosis of H channel histogram |
| 9 | Color | skewHistH | Skewness of H channel histogram |
| 10 | Color | stdHistS | Standard deviation of S channel histogram |
| 11 | Color | kurtHistS | Kurtosis of S channel histogram |
| 12 | Color | skewHistS | Skewness of S channel histogram |
| 13 | Color | stdHistV | Standard deviation of V channel histogram |
| 14 | Color | kurtHistV | Kurtosis of V channel histogram |
| 15 | Color | skewHistV | Skewness of V channel histogram |
| 16 | Color | meanL | Mean value of the L channel |
| 17 | Color | stdL | Standard deviation of the L channel |
| 18 | Color | meanA | Mean value of the A channel |
| 19 | Color | stdA | Standard deviation of the A channel |
| 20 | Color | meanB | Mean value of the B channel |
| 21 | Color | stdB | Standard deviation of the B channel |
| 22 | Color | stdHistL | Standard deviation of L channel histogram |
| 23 | Color | kurtHistL | Kurtosis of L channel histogram |
| 24 | Color | skewHistL | Skewness of L channel histogram |
| 25 | Color | stdHistA | Standard deviation of A channel histogram |
| 26 | Color | kurtHistA | Kurtosis of A channel histogram |
| 27 | Color | skewHistA | Skewness of A channel histogram |
| 28 | Color | stdHistB | Standard deviation of B channel histogram |
| 29 | Color | kurtHistB | Kurtosis of B channel histogram |
| 30 | Color | skewHistB | Skewness of B channel histogram |
| 31 | Intensity | meanInten | Mean value of intensity image |
| 32 | Intensity | stdInten | Standard deviation of intensity image |
| 33 | Intensity | entropyInten | Entropy of intensity image |
| 34 | Intensity | stdHistInten | Standard deviation of intensity image histogram |
| 35 | Intensity | kurtHistInten | Kurtosis of intensity image histogram |
| 36 | Intensity | skewHistInten | Skewness of intensity image histogram |
| 37–46 | Texture | LBP | Vector of rotationally invariant Local Binary Pattern (LBP) features |
| 47 | Texture | entCoMatrix | Entropy of grey-level co-occurrence matrix |
| 48 | Texture | ineCoMatrix | Inertia of grey-level co-occurrence matrix |
| 49 | Texture | eneCoMatrix | Energy of grey-level co-occurrence matrix |
| 50 | Texture | corCoMatrix | Correlation of grey-level co-occurrence matrix |
| 51 | Texture | homCoMatrix | Homogeneity of grey-level co-occurrence matrix |
| 52 | Texture | eneFFT | FFT Energy |
| 53 | Texture | entFFT | FFT Entropy |
| 54 | Texture | ineFFT | FFT Inertia |
| 55 | Texture | homFFT | FFT Homogeneity |
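Several of the tabulated features are simple channel statistics (e.g., meanH, stdH) and distribution-shape statistics of a channel histogram (e.g., skewHistH, kurtHistH). The sketch below illustrates these computations in plain Python; it is an assumption-laden illustration, not the paper's implementation, and the function names are hypothetical. Channel values are assumed to be integers in [0, 255].

```python
# Illustrative computation of channel mean/std and histogram skewness/kurtosis.
# "values" is a flat list of pixel values from one color channel.

def mean_std(values):
    """Population mean and standard deviation."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return m, var ** 0.5

def histogram(values, bins=256):
    """Counts per intensity level, as used for the *Hist* features."""
    counts = [0] * bins
    for v in values:
        counts[int(v)] += 1
    return counts

def skewness(xs):
    """Third standardized moment (0 for a symmetric distribution)."""
    m, s = mean_std(xs)
    return 0.0 if s == 0 else sum(((x - m) / s) ** 3 for x in xs) / len(xs)

def kurtosis(xs):
    """Fourth standardized moment (not excess kurtosis)."""
    m, s = mean_std(xs)
    return 0.0 if s == 0 else sum(((x - m) / s) ** 4 for x in xs) / len(xs)
```

For a histogram feature such as skewHistH, the skewness is applied to the histogram's count vector rather than to the raw pixel values.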
Machine learning algorithms used in the experiments and corresponding R packages.
| Algorithm | Description | Parameters |
|---|---|---|
| k-Nearest Neighbors (k-NN) | A non-parametric lazy learning algorithm; the training data are not used for any generalization | Euclidean distance; k = 5 |
| Decision Tree (J48) | A decision tree widely applied to represent a series of rules that lead to a class or value | C = 0.25; threshold = 0.25; with pruning |
| Random Forest (RF) | A combination of decision tree models that provides more accurate predictions | ntree = 100; mtry = 7 |
| Support Vector Machine (SVM) | A statistical learning algorithm used for supervised ML and food quality solutions | kernel = polynomial |
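As a concrete illustration of one configuration from the table, the following is a minimal pure-Python k-NN classifier using Euclidean distance and k = 5. This is a sketch for clarity only; the study itself used R packages, and nothing here is the authors' code.

```python
# Minimal k-NN with Euclidean distance and majority voting, matching the
# "Euclidean distance; k = 5" configuration listed in the table.

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def knn_predict(train_X, train_y, query, k=5):
    """Predict the majority label among the k nearest training samples."""
    ranked = sorted(zip(train_X, train_y),
                    key=lambda pair: euclidean(pair[0], query))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)
```

In the paper's setting, `train_X` would hold the 55-dimensional image feature vectors and `train_y` the flour type ("malting" or "naked") of each sample.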
Performance measures in the comparison of the methods and algorithms (RF, k-NN, J48, and SVM) over the cross-validation (CV) and prediction datasets. Accuracy, precision, and recall are given in %; times are means (± standard deviation) in seconds.

| Algorithm | Metric | Traditional (CV) | SPP (CV) | SPPe (CV) | Traditional (Pred.) | SPP (Pred.) | SPPe (Pred.) |
|---|---|---|---|---|---|---|---|
| RF | Accuracy | 90.00 | 91.00 | 100.00 | 90.00 | 95.00 | 95.00 |
| | Precision | 71.88 | 71.88 | 100.00 | 86.67 | 96.88 | 96.88 |
| | Recall | 68.93 | 69.43 | 100.00 | 86.67 | 90.00 | 90.00 |
| | Time (s) | 65.35 ( | 281.63 (±1.09) | 217.11 (±0.40) | 62.53 (±0.12) | 268.71 (±0.39) | 207.07 (±0.34) |
| k-NN | Accuracy | 77.56 | 70.56 | 95.56 | 80.00 | 60.00 | 75.00 |
| | Precision | 60.79 | 57.25 | 95.85 | 74.51 | 52.75 | 65.63 |
| | Recall | 58.79 | 53.88 | 94.81 | 66.67 | 53.33 | 63.33 |
| | Time (s) | 64.50 (±0.10) | 279.34 (±0.94) | 209.49 (±0.36) | 62.44 (±0.15) | 268.51 (±0.36) | 206.11 (±0.29) |
| J48 | Accuracy | 89.00 | 88.00 | 100.00 | 85.00 | 85.00 | 100.00 |
| | Precision | 71.88 | 71.88 | 100.00 | 79.77 | 91.67 | 100.00 |
| | Recall | 68.43 | 67.93 | 100.00 | 83.33 | 70.00 | 100.00 |
| | Time (s) | 70.14 (±0.26) | 353.37 (±2.31) | 210.79 (±0.37) | 62.61 (±0.10) | 270.71 (±0.38) | 206.32 (±0.33) |
| SVM | Accuracy | 93.00 | 92.00 | 98.89 | 80.00 | 95.00 | 95.00 |
| | Precision | 70.42 | 72.50 | 99.11 | 89.47 | 96.88 | 96.88 |
| | Recall | 70.00 | 70.36 | 98.57 | 60.00 | 90.00 | 90.00 |
| | Time (s) | 64.62 (±0.15) | 280.57 (±0.95) | 213.40 (±0.43) | 62.83 (±0.12) | 268.66 (±0.37) | 206.75 (±0.37) |
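For reference, the metrics reported in the performance table can be computed from true and predicted labels as below. This is a generic sketch for a binary task (malting vs. naked), not the study's evaluation code; the exact averaging scheme the authors used is not stated here.

```python
# Accuracy, precision, and recall from paired label lists, binary case.

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true label."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision(y_true, y_pred, positive):
    """TP / (TP + FP): how many predicted positives are correct."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    return 0.0 if tp + fp == 0 else tp / (tp + fp)

def recall(y_true, y_pred, positive):
    """TP / (TP + FN): how many true positives are recovered."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return 0.0 if tp + fn == 0 else tp / (tp + fn)
```

Multiplying these fractions by 100 gives the percentage values used in the table.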
Figure 5. RF importance of image features.
Figure 6. Accuracy heat map of J48, k-NN, RF, and SVM over the prediction dataset comparing the traditional CVS, SPP, and SPPe techniques with repetitions A0, A1, A2, A3, and A4.
Figure 7. Samples of cultivar N07, which had the lowest barley flour classification accuracy.