Mohammad Rezapour, Morteza Khavanin Zadeh, Mohammad Mehdi Sepehri.
Abstract
Arteriovenous fistula (AVF) is an important vascular access for hemodialysis (HD) treatment but has an early failure rate of 20-60%. Detecting associations between patient parameters and early AVF failure is important for reducing its prevalence and the associated costs, and predicting the incidence of this complication in new patients is a useful control procedure. The ultimate goal is patient safety and prevention of early AVF failure. The study population came from Hasheminejad Kidney Center (HKC) in Tehran, one of Iran's largest renal hospitals. We analyzed data from 193 HD patients using supervised data-mining techniques. The study included 137 male (70.98%) and 56 female (29.02%) patients, with a mean age of 53.87 ± 17.47 years. Twenty-eight patients were smokers, and there were 87 diabetic and 106 nondiabetic patients. A significant relationship was found between diabetes mellitus, smoking, and hypertension and early AVF failure. We found that these risk factors play a more important role in the outcome of vascular surgery than other parameters such as age. We then predicted this complication for future AVF surgeries and evaluated the designed prediction methods, obtaining accuracy rates of 61.66%-75.13%.
Year: 2013 PMID: 23861725 PMCID: PMC3686075 DOI: 10.1155/2013/830745
Source DB: PubMed Journal: Comput Math Methods Med ISSN: 1748-670X Impact factor: 2.238
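As a quick sanity check on the cohort figures in the abstract, the reported percentages can be recomputed from the raw counts (a minimal Python sketch; the variable names are ours):

```python
# Cohort counts reported in the abstract.
male, female = 137, 56
diabetic, nondiabetic = 87, 106

total = male + female
assert total == diabetic + nondiabetic == 193  # both splits cover the full cohort

male_pct = round(100 * male / total, 2)      # 70.98
female_pct = round(100 * female / total, 2)  # 29.02
print(male_pct, female_pct)
```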
Rules extracted with the JRIP learner ("yes" denotes early AVF failure).
| JRIP rules |
|---|
| (DiabetesM = yes) → Failure = no (87.0/37.0) |
| Number of rules: 2 |
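In Weka's JRIP output the trailing pair `(covered/misclassified)` gives each rule's coverage on the training data; assuming that convention for the rule above, its accuracy on the cases it covers can be read off directly (illustrative Python, not the authors' code):

```python
# For "(DiabetesM = yes) -> Failure = no (87.0/37.0)":
# 87 training examples matched the antecedent, 37 of them were misclassified.
covered, misclassified = 87.0, 37.0

rule_accuracy = (covered - misclassified) / covered
print(f"rule accuracy on covered cases: {rule_accuracy:.2%}")  # 57.47%
```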
Figure 1. Effects of diabetes and smoking are determined from the interpreted rules of this tree.
Rules extracted by running the "rule learner."
| Rule model |
|---|
| If DiabetesM = no and sex = female, then yes (7/19) |
| If DiabetesM = no and htn = no, then yes (13/28) |
| If htn = no and age = range2 [49.50–64.50], then no (10/1) |
| If sex = male and Hgb = range2 [8.45–9.95], then yes (7/13) |
| If age = range3 [64.50–∞], then … |
| If Hgb = range3 [9.95–∞], then … |
| If Hgb = range3 [9.95–∞], then … |
| If sex = male and age = range3 [64.50–∞], then … |
| If sex = female and age = range1 [−∞–49.50], then … |
| If Hgb = range1 [−∞–8.45], then … |
| If sex = female and htn = yes, then yes (1/4) |
| If age = range2 [49.50–64.50] and sex = male, then no (3/1) |
| If sex = male, then yes (2/4) |
| Correct: 135 out of 193 training examples. |
Rules extracted from the analysis that includes the side (location) of the AVF.
| Rule model |
|---|
| If location = brachial and DM = no, then yes (30/2) |
| If location = radial and DM = yes, then no (24/45) |
| If DM = yes and Hgb = range2 [8.45–9.95], then yes (5/0) |
| If Hgb = range3 [9.95–∞], then … |
| If age = range1 [−∞–49.50], then … |
| If age = range2 [49.50–64.50] and sex = female, then no (0/3) |
| If Hgb = range3 [9.95–∞], then … |
| If htn = no and Hgb = range1 [−∞–8.45], then … |
| If age = range2 [49.50–64.50] and DM = no, then no (1/5) |
| If htn = yes and sex = female, then yes (4/2) |
| If sex = female and htn = no, then no (1/2) |
| If age = range3 [64.50–∞], then … |
| If htn = yes and age = range3 [64.50–∞], then … |
| If Hgb = range1 [−∞–8.45], then … |
| If Hgb = range2 [8.45–9.95] and htn = yes, then no (1/2) |
| If htn = yes and sex = male, then yes (3/2) |
| If sex = male, then no (5/5) |
| Correct: 143 out of 193 training examples. |
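The two "Correct: … out of 193" lines translate into training accuracies that foreshadow the test-set results reported later (simple arithmetic in Python; the model labels are ours):

```python
# Correctly classified training examples for each rule model.
results = {"without location": 135, "with location": 143}
total = 193

for model, correct in results.items():
    print(f"{model}: {correct}/{total} = {correct / total:.2%}")
```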
Figure 2. Decision tree from system training, after stratified sampling.
Prediction results of method one.
| Attribute | Type | Statistics | Values / range |
|---|---|---|---|
| Id | Integer | Avg = 97 ± 55.714 | [1.0; 193.0] |
| Failure | Nominal | Mode = yes (106), least = no (87) | no (87), yes (106) |
| Prediction (failure) | Nominal | Mode = yes (112), least = no (81) | no (81), yes (112) |
| Confidence (no) | Real | Avg = 0.457 ± 0.380 | [0.0; 1.0] |
| Confidence (yes) | Real | Avg = 0.543 ± 0.380 | [0.0; 1.0] |
Figure 3. System training with the J48 tree, after absolute stratified sampling.
Prediction results of method two.
| Attribute | Type | Statistics | Values / range |
|---|---|---|---|
| Id | Integer | Avg = 97 ± 55.714 | [1.0; 193.0] |
| Failure | Nominal | Mode = yes (106), least = no (87) | no (87), yes (106) |
| Prediction (failure) | Nominal | Mode = yes (105), least = no (88) | no (88), yes (105) |
| Confidence (no) | Real | Avg = 0.408 ± 0.371 | [0.0; 1.0] |
| Confidence (yes) | Real | Avg = 0.592 ± 0.371 | [0.0; 1.0] |
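In both prediction tables the two confidence columns sum to 1, so the predicted label is simply the class whose confidence exceeds 0.5. A minimal sketch of that decision rule, assuming a RapidMiner-style binary confidence output (the function name and threshold parameter are our own):

```python
def predict_label(conf_no: float, conf_yes: float, threshold: float = 0.5) -> str:
    """Pick the class with the higher confidence (binary case)."""
    assert abs(conf_no + conf_yes - 1.0) < 1e-6  # confidences sum to 1
    return "yes" if conf_yes >= threshold else "no"

# Average confidences from the method-two table.
print(predict_label(0.408, 0.592))  # yes
```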
Accuracy of method one.
| Accuracy: 61.66% | True no | True yes | Class precision |
|---|---|---|---|
| Pred. no | 47 | 34 | 58.02% |
| Pred. yes | 40 | 72 | 64.29% |
| Class recall | 54.02% | 67.92% | |
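Every figure in this confusion-matrix table can be reproduced from its four cells (rows are predicted labels, columns the true labels); a short check in Python:

```python
# Method-one confusion matrix: (predicted, true) -> count.
cm = {("no", "no"): 47, ("no", "yes"): 34,
      ("yes", "no"): 40, ("yes", "yes"): 72}
total = sum(cm.values())  # 193

accuracy = (cm["no", "no"] + cm["yes", "yes"]) / total                   # 61.66%
precision_no = cm["no", "no"] / (cm["no", "no"] + cm["no", "yes"])       # 58.02%
precision_yes = cm["yes", "yes"] / (cm["yes", "no"] + cm["yes", "yes"])  # 64.29%
recall_no = cm["no", "no"] / (cm["no", "no"] + cm["yes", "no"])          # 54.02%
recall_yes = cm["yes", "yes"] / (cm["no", "yes"] + cm["yes", "yes"])     # 67.92%
print(f"accuracy: {accuracy:.2%}")
```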
Accuracy of method two.
| Accuracy: 74.61% | True no | True yes | Class precision |
|---|---|---|---|
| Pred. no | 63 | 25 | 71.59% |
| Pred. yes | 24 | 81 | 77.14% |
| Class recall | 72.41% | 76.42% | |
Final prediction results for the analysis including AVF location.
| Sampling method | Training algorithm | Accuracy | Yes/no predictions |
|---|---|---|---|
| Stratified | Neural network | 69.97% (±5.17%) | 74/119 |
| Stratified | Decision tree | 67.91% | 94/99 |
| Stratified | Naïve Bayesian | 69.97% (±8.61%) | 100/93 |
| Absolute stratified | WJ-48 | 75.13% | 96/97 |
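The table distinguishes plain stratified sampling from "absolute" stratified sampling; the plain variant draws from each class in proportion to its size, so the class balance of the sample matches the cohort. A stdlib-only sketch of the proportional variant, applied to a toy cohort with this study's class balance (all names here are our own, not the authors' code):

```python
import random
from collections import defaultdict

def stratified_sample(rows, label_key, frac, seed=0):
    """Draw `frac` of the rows from each class independently,
    preserving the class proportions of the full dataset."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for row in rows:
        by_class[row[label_key]].append(row)
    sample = []
    for members in by_class.values():
        k = round(len(members) * frac)  # per-class quota
        sample.extend(rng.sample(members, k))
    return sample

# Toy cohort mirroring the paper's class balance (106 yes / 87 no).
cohort = [{"failure": "yes"}] * 106 + [{"failure": "no"}] * 87
train = stratified_sample(cohort, "failure", 0.7)
print(len(train))  # 135 = round(0.7*106) + round(0.7*87)
```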