| Literature DB >> 35813419 |
Jing Qin
Abstract
With the development of wireless network technology, the transformation of educational concepts, the upgrading of users' educational needs, and changing lifestyles, online education has advanced rapidly. However, because online education in China has grown so quickly, many regulatory systems have not kept pace with its development, resulting in poor user experience and low satisfaction with online education. Establishing a user satisfaction model draws attention to research on online education service quality, helps enterprises recognize the specific impact of each factor in their services, accelerates service-quality improvement, and supports the formulation of industry norms and the strengthening of enterprise competitiveness, all of which help students acquire knowledge more easily. In the era of big data, traditional satisfaction evaluation methods have many drawbacks, so machine learning methods are increasingly applied to satisfaction evaluation models. This paper takes machine learning algorithms as the core of its research: it applies cost-sensitive ideas to improve the decision tree so that different types of classification errors incur different costs, and uses the random forest principle to ensemble the generated decision trees, thereby improving the accuracy and stability of the model; the validity of the model is verified by experiments. This work offers a useful reference for follow-up in-depth research on online education satisfaction evaluation technology.
Year: 2022 PMID: 35813419 PMCID: PMC9259360 DOI: 10.1155/2022/7958932
Source DB: PubMed Journal: Comput Math Methods Med ISSN: 1748-670X Impact factor: 2.809
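The abstract's approach can be sketched as a cost-sensitive decision-tree ensemble. The paper's actual data, features, and cost matrix are not given here, so the minimal scikit-learn sketch below substitutes synthetic imbalanced data and an assumed 5:1 cost ratio, encoded via `class_weight`, inside a random forest; it illustrates the idea, not the authors' exact implementation.

```python
# Sketch, not the paper's implementation: cost-sensitive trees ensembled
# via random forest, with an ASSUMED cost matrix (class 1 errors cost 5x).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score

# Synthetic imbalanced data standing in for the satisfaction survey data.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight makes each tree's split criterion penalize errors on the
# rare class more heavily, implementing the cost-sensitive idea.
clf = RandomForestClassifier(n_estimators=100,
                             class_weight={0: 1, 1: 5},
                             random_state=0)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)

print("accuracy:", accuracy_score(y_te, pred))
print("recall:", recall_score(y_te, pred))
```

Raising the weight on the minority class trades some overall accuracy for higher recall on it, which is exactly the pattern visible in the performance table below.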
Figure 1. Machine learning workflow.
Figure 2. Online education user satisfaction evaluation index model.
Confusion matrix.

| | True class 0 | True class 1 |
|---|---|---|
| Predicted class 0 | TN | FN |
| Predicted class 1 | FP | TP |
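The two metrics reported in the performance comparison follow directly from the four confusion-matrix cells. The counts below are illustrative placeholders, not values from the paper:

```python
# Illustrative counts (assumed, not from the paper) for the four cells above.
TN, FN, FP, TP = 50, 10, 5, 35

accuracy = (TP + TN) / (TP + TN + FP + FN)  # fraction of all predictions correct
recall = TP / (TP + FN)                     # fraction of true class-1 samples caught

print(accuracy)          # 0.85
print(round(recall, 4))  # 0.7778
```

On imbalanced data, accuracy alone can look excellent while recall on the rare class is poor, which is why the table below reports both.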
Figure 3. Decision tree model.
Figure 4. Ensemble learning model.
Figure 5. Random forest algorithm model.
Figure 6. Sampling.
Algorithm performance comparison.
| Run | C4.5 Accuracy | C4.5 Recall | Cost-sensitive decision tree Accuracy | Cost-sensitive decision tree Recall | Random forest Accuracy | Random forest Recall |
|---|---|---|---|---|---|---|
| 1 | 0.9637 | 0.3118 | 0.8110 | 0.6058 | 0.9011 | 0.7158 |
| 2 | 0.9642 | 0.3574 | 0.7522 | 0.7623 | 0.8322 | 0.7521 |
| 3 | 0.9634 | 0.3234 | 0.7942 | 0.7221 | 0.8042 | 0.7408 |
| 4 | 0.9637 | 0.3834 | 0.7866 | 0.7208 | 0.7966 | 0.7565 |
| 5 | 0.9642 | 0.3534 | 0.8366 | 0.6765 | 0.8409 | 0.7508 |
| 6 | 0.9634 | 0.3334 | 0.8166 | 0.7008 | 0.8242 | 0.7367 |
| 7 | 0.9637 | 0.3634 | 0.7966 | 0.6901 | 0.8342 | 0.7408 |
| 8 | 0.9642 | 0.3834 | 0.7966 | 0.7208 | 0.8366 | 0.7208 |
| 9 | 0.9634 | 0.3334 | 0.8266 | 0.6208 | 0.8766 | 0.7308 |
| 10 | 0.9615 | 0.3234 | 0.8166 | 0.7108 | 0.8205 | 0.7367 |
| Average value | 0.9635 | 0.3466 | 0.8225 | 0.6931 | 0.8205 | 0.7367 |
| Standard deviation | 0.000789 | 0.02406 | 0.04314 | 0.04547 | 0.01653 | 0.0134 |
Figure 7. Recall comparison.
Figure 8. Accuracy comparison.
Figure 9. Comparison with other algorithms.