| Literature DB >> 35937334 |
Xuyao Liu1, Yaowen Hu1, Guoxiong Zhou1, Weiwei Cai1, Mingfang He1, Jialei Zhan1, Yahui Hu2, Liujun Li3.
Abstract
Affected by various environmental factors, citrus frequently suffers from diseases during growth, which poses a major obstacle to agricultural development. This paper proposes a new method for identifying and classifying citrus diseases. First, we design an image enhancement method based on the MSRCR algorithm and a homomorphic filtering algorithm optimized by a Laplacian operator (HFLF-MS) to highlight the disease characteristics of citrus. Second, we design a new neural network, DS-MENet, based on the DenseNet-121 backbone. In DS-MENet, the regular convolutions in each Dense Block are replaced with depthwise separable convolutions, which reduces the number of network parameters. The ReMish activation function is used to alleviate the dying-neuron problem caused by the ReLU function and to improve the robustness of the model. To further strengthen attention to citrus disease information and the ability to extract feature information, a multi-channel fusion backbone enhancement method (MCF) is designed to process the Dense Blocks. Experiments use 10-fold cross-validation. The average classification accuracy of DS-MENet on the dataset after adding noise reaches 95.02%, which shows that the method performs well and is feasible for classifying citrus diseases in real-world settings.
Keywords: DS-MENet; ReMish; citrus disease detection; depthwise separable convolution; image enhancement; multi-channel fusion backbone enhancement method
Year: 2022 PMID: 35937334 PMCID: PMC9355402 DOI: 10.3389/fpls.2022.884464
Source DB: PubMed Journal: Front Plant Sci ISSN: 1664-462X Impact factor: 6.627
Related research work: methods, advantages, and drawbacks.
| References | Method | Advantage | Drawback |
| | A machine learning method based on color co-occurrence (CMM) | Excellent classification accuracy under controlled lighting conditions in an indoor laboratory | Sensitive to the background environment of citrus; classification in natural environments may be limited |
| | Identification method based on fluorescence imaging spectroscopy (FIS) and machine learning | High accuracy for similar citrus diseases (canker and scab) | High requirements on image quality |
| | Classification of citrus diseases based on support vector machines and VIS-NIR (visible/near-infrared) spectroscopy | The classification algorithms are common and effective | The experimental data has strong uncertainty and is greatly affected by the environment |
| | Identification of citrus HLB disease based on visible-spectrum image processing and cost-support vector classification (C-SVC) | Low computational complexity and high efficiency | Classification accuracy needs improvement |
| | Optimized AlexNet citrus disease classification model | Simple network structure, few parameters, and relatively good performance | Challenging to apply to multi-class citrus disease classification |
| | Intelligent diagnosis system for citrus diseases based on mobile service computing | The WeChat applet is feature-rich, convenient, and practical | Disease identification accuracy needs improvement |
| | Citrus disease classification model based on transfer learning and optimized stochastic gradient descent with momentum (SGDM) | SGDM can speed up convergence | Limited for classifying similar diseases |
| | A patch-based framework for citrus classification | Fast and accurate classification with sparse data | Accuracy on similar diseases needs improvement |
| | Classification method for citrus diseases based on transfer learning and feature fusion | Image preprocessing with hybrid contrast stretching can improve image quality | Lacks performance comparisons with more state-of-the-art classification models |
| | Automatic classification of citrus diseases based on optimized weighted segmentation and feature selection | The optimized weighted segmentation algorithm can segment lesions, which benefits later feature extraction | Less suitable for citrus diseases without lesions, such as citrus Huanglongbing |
| | Citrus disease classification model based on a two-stage deep CNN | Feature sharing between the two stages reduces model training overhead | Model robustness needs further improvement |
| | Lightweight citrus pest identification model based on Weakly DenseNet | Feature reuse and data augmentation algorithms can reduce similarity between images | The proportion of the target object in the image affects the output result |
FIGURE 1. Working principle diagram of the system.
Number and proportion of citrus diseases.
| Disease type | Original number | Expanded number | Percentage (%) |
| Healthy | 564 | 1,692 | 18.3 |
| Canker | 634 | 1,902 | 20.5 |
| Scab | 592 | 1,776 | 19.2 |
| Black spot | 743 | 2,229 | 24.1 |
| Anthracnose | 553 | 1,659 | 17.9 |
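The expanded counts in the table above are three times the originals, and each percentage is that class's share of the expanded total. A quick arithmetic check in Python:

```python
# Class counts from the table: (original, expanded) per disease type.
counts = {
    "Healthy": (564, 1692),
    "Canker": (634, 1902),
    "Scab": (592, 1776),
    "Black spot": (743, 2229),
    "Anthracnose": (553, 1659),
}

total_expanded = sum(e for _, e in counts.values())  # 9,258 images in total

for name, (orig, expanded) in counts.items():
    assert expanded == 3 * orig  # each class was expanded threefold
    pct = round(100 * expanded / total_expanded, 1)
    print(f"{name}: {pct}%")     # matches the table's Percentage column
```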
FIGURE 2. Collection of citrus disease images.
Variable names and their meanings.
| Variable | Meaning |
| | color channel |
| | luminance component |
| | reflection component |
| | input image |
| | output of the ith channel of SSR |
| | Gaussian surround function |
| | Gaussian surround scale |
| | result of MSR enhancement of the ith color channel |
| | Gaussian surround function at the nth scale |
| ω | weight coefficient for the nth scale |
| | total number of scales, N = 3 |
| | result of MSRCR enhancement of the ith color channel |
| | color restoration factor of the ith color channel |
| α | nonlinear intensity control |
| β | gain constant |
| | parameter quantity |
| | length of the convolution kernel |
| | width of the convolution kernel |
| | number of channels of the input feature map |
| | number of channels of the output feature map |
| | parameter quantity of DW convolution |
| | parameter quantity of PW convolution |
| | input feature maps of the ith group |
| | the ith group of feature maps |
| | average pooling function |
| | excitation function |
| | multiplication function |
FIGURE 3. HFLF-MS working principle diagram.
FIGURE 4. Laplacian filter template.
FIGURE 5. Enhancement of citrus disease images.
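For reference, the standard MSRCR formulation that the variable table describes can be written as follows (this is the textbook Retinex form, not necessarily the paper's exact equations):

```latex
% Single-scale Retinex (SSR) for color channel i:
R_{\mathrm{SSR}_i}(x,y) = \log I_i(x,y) - \log\!\left[F(x,y) * I_i(x,y)\right]
% Gaussian surround function with surround scale c (K normalizes the integral to 1):
F(x,y) = K\, e^{-(x^2+y^2)/c^2}
% Multi-scale Retinex (MSR): weighted sum over N = 3 scales:
R_{\mathrm{MSR}_i}(x,y) = \sum_{n=1}^{N} \omega_n\, R_{\mathrm{SSR}_{i,n}}(x,y)
% Color restoration factor and final MSRCR output:
C_i(x,y) = \beta \log\!\left[\frac{\alpha\, I_i(x,y)}{\sum_{j} I_j(x,y)}\right],\qquad
R_{\mathrm{MSRCR}_i}(x,y) = C_i(x,y)\, R_{\mathrm{MSR}_i}(x,y)
```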
FIGURE 6. DS-MENet network structure.
FIGURE 7. Output parameters of each layer of DS-MENet.
FIGURE 8. Working principle of depthwise separable convolution.
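The parameter savings can be made concrete. For a k × k kernel, M input channels, and N output channels, a regular convolution needs k·k·M·N weights, while a depthwise separable convolution needs k·k·M (depthwise) plus M·N (pointwise). A small sketch, with bias terms omitted and illustrative channel counts:

```python
def conv_params(k, m, n):
    """Weights in a regular k x k convolution: k * k * m * n."""
    return k * k * m * n

def ds_conv_params(k, m, n):
    """Depthwise (k * k * m) plus pointwise 1x1 (m * n) weights."""
    return k * k * m + m * n

# Example: 3x3 kernel, 128 input channels, 128 output channels.
regular = conv_params(3, 128, 128)       # 147,456 weights
separable = ds_conv_params(3, 128, 128)  # 1,152 + 16,384 = 17,536 weights
print(separable / regular)               # ~0.119, i.e. roughly 1/N + 1/k^2
```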
FIGURE 9. Working principle diagram of the MCF algorithm.
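The variable table lists an average pooling function, an excitation function, and a channel-wise multiplication for MCF. The exact architecture is given in the paper; that combination of operations follows the familiar squeeze-and-excitation pattern, sketched here in NumPy with hypothetical weight shapes and an assumed reduction ratio:

```python
import numpy as np

def channel_attention(x, w1, w2):
    """Hypothetical SE-style channel attention over a (C, H, W) feature map:
    squeeze by global average pooling, excite with two small dense layers,
    then rescale each channel of the input."""
    squeezed = x.mean(axis=(1, 2))                # (C,) global average pooling
    hidden = np.maximum(0.0, w1 @ squeezed)       # ReLU excitation, (C // r,)
    scale = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # sigmoid gate in (0, 1), (C,)
    return x * scale[:, None, None]               # channel-wise multiplication

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))
w1 = rng.standard_normal((2, 8))  # reduction ratio r = 4 (assumed)
w2 = rng.standard_normal((8, 2))
y = channel_attention(x, w1, w2)
assert y.shape == x.shape  # attention only rescales channels
```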
FIGURE 10. ReMish.
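This excerpt does not give ReMish's closed form, only that it alleviates ReLU's dying-neuron problem. For comparison, ReLU and Mish (x · tanh(softplus(x)), a common smooth alternative that keeps a nonzero gradient for negative inputs) can be written as:

```python
import numpy as np

def relu(x):
    """ReLU zeroes all negative inputs, so their gradient is exactly 0
    (the 'dying neuron' problem)."""
    return np.maximum(0.0, x)

def mish(x):
    """Mish = x * tanh(softplus(x)): smooth everywhere and nonzero for
    x < 0, so small negative activations still propagate a gradient."""
    return x * np.tanh(np.log1p(np.exp(x)))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))  # negatives clipped to 0
print(mish(x))  # negatives mapped to small nonzero values
```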
FIGURE 11. Ten-fold cross-validation.
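In the 10-fold cross-validation used in the experiments, the dataset is split into 10 parts, each serving once as the test set while the remaining 9 train the model. A minimal index-splitting sketch (shuffling and the actual training loop omitted):

```python
def k_fold_indices(n_samples, k=10):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    indices = list(range(n_samples))
    fold_size, remainder = divmod(n_samples, k)
    start = 0
    for fold in range(k):
        # Spread the remainder over the first few folds.
        stop = start + fold_size + (1 if fold < remainder else 0)
        test = indices[start:stop]
        train = indices[:start] + indices[stop:]
        yield train, test
        start = stop

folds = list(k_fold_indices(9258, k=10))  # 9,258 images in the expanded set
assert len(folds) == 10
assert sum(len(test) for _, test in folds) == 9258  # every sample tested once
```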
Hyperparameter settings.
| Hyperparameters | Value |
| Learning rate | 0.001 |
| Epoch | 20 |
| Momentum | 0.9 |
| Batch size | 32 |
| Optimizer | Adam |
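With the table's settings (learning rate 0.001, momentum/β₁ = 0.9), a single Adam update step looks like the following sketch; β₂ and ε use Adam's common defaults, which this excerpt does not state:

```python
import math

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar weight w at step t (1-indexed)."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# On the first step the bias-corrected update is ~lr * sign(grad).
w, m, v = adam_step(w=1.0, grad=0.5, m=0.0, v=0.0, t=1)
print(w)  # ≈ 0.999
```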
Performance of DS-MENet and other network models before and after adding noise to the dataset.
| Methods | Accuracy (test set without noise) (%) | Accuracy (test set with added noise) (%) |
| AlexNet | 84.37 | 79.76 |
| ResNet50 | 86.67 | 82.48 |
| InceptionV4 | 88.45 | 86.31 |
| ResNeXt50 | 89.26 | 88.56 |
| EfficientNet | 92.56 | 91.94 |
| EfficientNetV2 | 95.39 | 94.83 |
| MobileNetV3 | 88.64 | 86.93 |
| DenseNet121 | 90.06 | 89.74 |
| DS-MENet | 95.25 | 95.02 |
FIGURE 12. Classification confusion matrices of citrus diseases for different models (panels (A–I) correspond to AlexNet, ResNet50, InceptionV4, ResNeXt50, EfficientNet, EfficientNetV2, MobileNetV3, DenseNet121, and DS-MENet).
Evaluation of classification performance of DS-MENet and other networks.
| Methods | Evaluation indicators | Healthy | Canker | Scab | Black spot | Anthracnose |
| AlexNet | Precision (%) | 82.63 | 75.66 | 81.55 | 83.64 | 73.62 |
| | Recall (%) | 92.90 | 75.26 | 76.97 | 80.27 | 73.17 |
| | F1-score (%) | 87.46 | 75.46 | 79.19 | 81.92 | 73.39 |
| ResNet50 | Precision (%) | 88.14 | 81.98 | 79.01 | 83.49 | 74.17 |
| | Recall (%) | 92.31 | 74.21 | 80.34 | 86.10 | 79.27 |
| | F1-score (%) | 90.18 | 77.90 | 79.67 | 84.77 | 76.64 |
| InceptionV4 | Precision (%) | 90.56 | 84.95 | 87.21 | 88.58 | 79.04 |
| | Recall (%) | 96.45 | 83.16 | 84.27 | 87.00 | 80.49 |
| | F1-score (%) | 93.41 | 84.05 | 85.71 | 87.78 | 79.76 |
| ResNeXt50 | Precision (%) | 95.35 | 86.84 | 85.87 | 88.74 | 85.90 |
| | Recall (%) | 97.04 | 86.84 | 88.76 | 88.34 | 81.71 |
| | F1-score (%) | 96.19 | 86.84 | 87.29 | 88.54 | 83.75 |
| EfficientNet | Precision (%) | 97.06 | 89.58 | 92.00 | 92.38 | 88.41 |
| | Recall (%) | 97.63 | 90.53 | 90.45 | 92.38 | 88.41 |
| | F1-score (%) | 97.34 | 90.05 | 91.22 | 92.38 | 88.41 |
| EfficientNetV2 | Precision (%) | 96.51 | 90.10 | 93.45 | 92.86 | 88.69 |
| | Recall (%) | 98.22 | 91.05 | 88.20 | 93.27 | 90.85 |
| | F1-score (%) | 97.36 | 90.57 | 90.75 | 93.06 | 89.76 |
| MobileNetV3 | Precision (%) | 94.25 | 88.40 | 85.71 | 86.55 | 80.12 |
| | Recall (%) | 97.04 | 84.21 | 84.27 | 86.55 | 83.54 |
| | F1-score (%) | 95.62 | 86.25 | 84.98 | 86.55 | 81.79 |
| DenseNet121 | Precision (%) | 94.32 | 88.60 | 92.07 | 88.99 | 84.76 |
| | Recall (%) | 98.22 | 90.00 | 84.83 | 90.58 | 84.76 |
| | F1-score (%) | 96.23 | 89.29 | 88.30 | 89.78 | 84.76 |
| DS-MENet | Precision (%) | 98.25 | 93.30 | 95.29 | 94.71 | 93.21 |
| | Recall (%) | 99.41 | 95.26 | 91.01 | 96.41 | 92.07 |
| | F1-score (%) | 98.83 | 94.27 | 93.10 | 95.55 | 92.64 |
FIGURE 13. ROC curve.
Classification accuracy before and after image enhancement and dataset expansion.
| Image enhancement methods | Original dataset (%) | Extended dataset (%) |
| No image enhancement | 83.17 | 89.62 |
| HFLF-MS | 85.94 | 95.25 |
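HFLF-MS builds on homomorphic filtering: the image is taken to the log domain (separating illumination from reflectance), high frequencies are emphasized in the Fourier domain, and the result is exponentiated back. A generic sketch of that pipeline; the Gaussian emphasis filter and its constants here are illustrative, not the paper's exact HFLF-MS parameters:

```python
import numpy as np

def homomorphic_filter(img, d0=30.0, gamma_l=0.5, gamma_h=2.0):
    """Generic homomorphic filter on a float grayscale image in [0, 1]."""
    rows, cols = img.shape
    log_img = np.log1p(img)  # log domain: I = L * R -> log L + log R
    spectrum = np.fft.fftshift(np.fft.fft2(log_img))
    # Gaussian high-frequency-emphasis filter, centered on the DC term:
    # attenuates low frequencies (illumination), boosts high ones (detail).
    u = np.arange(rows) - rows / 2
    v = np.arange(cols) - cols / 2
    d2 = u[:, None] ** 2 + v[None, :] ** 2
    h = (gamma_h - gamma_l) * (1 - np.exp(-d2 / (2 * d0 ** 2))) + gamma_l
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * h)).real
    return np.expm1(filtered)  # back from the log domain

img = np.random.default_rng(1).random((64, 64))
out = homomorphic_filter(img)
assert out.shape == img.shape
```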
FIGURE 14. Performance comparison of ReLU and ReMish.
Verifying the effectiveness of the improved activation function ReMish.
| Activation function type | Accuracy (%) | Loss |
| ReLU | 93.42 | 0.06 |
| ReMish | 95.02 | 0.03 |
Verifying the effectiveness of depthwise separable convolutions.
| Network | ResNet50 | DenseNet121 | DenseNet121-DS | ResNet50-DS | DS-MENet |
| Training time | 1 h 37 m 34 s | 2 h 20 m 56 s | 2 h 4 m 12 s | 1 h 22 m 24 s | 2 h 18 m 29 s |
| Accuracy (%) | 82.48 | 89.74 | 88.86 | 80.65 | 95.02 |
Verifying the effectiveness of the MCF backbone enhancement method.
| Network | ResNet50 | DenseNet121 | DenseNet121-MCF | ResNet50-MCF | DS-MENet |
| Training time | 1 h 37 m 34 s | 2 h 20 m 56 s | 2 h 43 m 15 s | 1 h 56 m 44 s | 2 h 18 m 29 s |
| Accuracy (%) | 82.48 | 89.74 | 94.31 | 87.55 | 95.02 |
Ablation experiment results.
| Module | DenseNet121 Accuracy (%) | DenseNet121 Training time | InceptionV4 Accuracy (%) | InceptionV4 Training time | ResNeXt Accuracy (%) | ResNeXt Training time |
| Baseline | 89.74 | 2 h 20 m 56 s | 86.31 | 1 h 58 m 31 s | 88.56 | 2 h 15 m 27 s |
| DS | 88.86 | 2 h 4 m 12 s | 84.16 | 1 h 40 m 12 s | 87.12 | 1 h 54 m 42 s |
| MCF | 94.31 | 2 h 43 m 15 s | 89.97 | 2 h 28 m 43 s | 92.05 | 2 h 48 m 29 s |
| ReMish | 90.33 | 2 h 10 m 09 s | 87.81 | 1 h 42 m 51 s | 89.66 | 2 h 03 m 04 s |
| DS+ReMish | 89.18 | 1 h 51 m 31 s | 86.44 | 1 h 23 m 03 s | 88.45 | 1 h 44 m 35 s |
| DS+MCF | 93.42 | 2 h 27 m 18 s | 88.14 | 2 h 04 m 15 s | 91.17 | 2 h 28 m 22 s |
| MCF+ReMish | 95.97 | 2 h 33 m 02 s | 91.21 | 2 h 13 m 55 s | 93.18 | 2 h 31 m 05 s |
| DS+ReMish+MCF | 95.02 | 2 h 18 m 29 s | 90.03 | 1 h 56 m 28 s | 91.83 | 2 h 16 m 08 s |
Public dataset details.
| Dataset | Category | Total | Available |
| PlantVillage | 38 | 55,400 | |
| Stanford cars | 196 | 16,185 | |
| ImageNetDogs | 120 | 20,580 | |
Average classification accuracy of DS-MENet on public datasets.
| Dataset | Accuracy (%) |
| PlantVillage | 96.16 |
| Stanford cars | 95.41 |
| ImageNetDogs | 94.32 |