Yoichi Okuda1,2, Tsukasa Saida3, Keigo Morinaga4, Arisa Ohara4, Akihiro Hara1, Shinji Hashimoto1, Shinji Takahashi1, Tomoaki Goya1, Nobuhiro Ohkohchi2.
Abstract
Aim: To compare deep learning with experienced physicians in diagnosing gangrenous cholecystitis from computed tomography images, and to explore the feasibility of diagnostic assistance for acute cholecystitis requiring emergency surgery.
Keywords: Acute cholecystitis; artificial intelligence; computed tomography; convolutional neural network
Year: 2022 PMID: 36187450 PMCID: PMC9487185 DOI: 10.1002/ams2.783
Source DB: PubMed Journal: Acute Med Surg ISSN: 2052-8817
Fig. 1. Flowchart for the patient selection process.
Characteristics of the patients and imaging data
| | Training data | | | Testing data | | |
|---|---|---|---|---|---|---|
| | Gangrenous | Acute | All | Gangrenous | Acute | All |
| Patients (n) | 18 (M:F = 12:6) | 94 (M:F = 53:41) | 112 (M:F = 65:47) | 7 (M:F = 6:1) | 35 (M:F = 22:13) | 42 (M:F = 28:14) |
| Age | | | | | | |
| Mean ± standard deviation (years) | 76 ± 12 | 75 ± 15 | 75 ± 15 | 71 ± 14 | 72 ± 13 | 73 ± 13 |
| Range (years) | 50–95 | 30–108 | 30–108 | 44–85 | 25–87 | 25–87 |
| Images | | | | | | |
| All (n/slices) | 18/354 | 94/1163 | 112/1517 | 7/11 | 35/57 | 42/68 |
| Noncontrast CT (n/slices) | 18/223 | 92/770 | 110/993 | 6/6 | 34/34 | 40/40 |
| Portal phase contrast CT (n/slices) | 6/101 | 22/219 | 28/320 | 3/3 | 15/15 | 18/18 |
| Arterial phase contrast CT (n/slices) | 2/30 | 10/116 | 12/146 | 1/1 | 5/5 | 6/6 |
| Equilibrium phase contrast CT (n/slices) | 0 | 7/58 | 7/58 | 1/1 | 3/3 | 4/4 |
Abbreviations: F, female; M, male.
Sensitivity, specificity, and AUC of the convolutional neural network
| Interpreter | Sensitivity | 95% CI | Specificity | 95% CI | Accuracy | 95% CI | AUC | 95% CI | P value |
|---|---|---|---|---|---|---|---|---|---|
| CNN | 0.70 | 0.44–0.87 | 0.93 | 0.88–0.96 | 0.89 | 0.81–0.95 | 0.84 | 0.68–1.00 | − |
| Surgeon | 0.55 | 0.30–0.77 | 0.67 | 0.62–0.71 | 0.65 | 0.57–0.72 | 0.63 | 0.44–0.82 | 0.048 |
| Radiologist 1 | 0.45 | 0.23–0.70 | 0.72 | 0.68–0.77 | 0.68 | 0.60–0.76 | 0.62 | 0.45–0.79 | 0.085 |
| Radiologist 2 | 0.82 | 0.55–0.95 | 0.47 | 0.42–0.50 | 0.53 | 0.44–0.57 | 0.62 | 0.43–0.82 | 0.018 |
Abbreviations: AUC, area under the receiver operating characteristic curve; CI, confidence interval; CNN, convolutional neural network.
P values compare each reviewer with the CNN; P < 0.05 was considered statistically significant.
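The sensitivity, specificity, and accuracy reported above follow directly from a 2 × 2 confusion matrix (positive = gangrenous cholecystitis, negative = noncomplicated acute cholecystitis). A minimal sketch of those definitions (not the authors' code; the counts in the example are hypothetical and chosen only for illustration):

```python
def binary_metrics(tp: int, fn: int, tn: int, fp: int) -> dict:
    """Sensitivity, specificity, and accuracy from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),               # true-positive rate
        "specificity": tn / (tn + fp),               # true-negative rate
        "accuracy": (tp + tn) / (tp + fn + tn + fp), # overall fraction correct
    }

# Hypothetical counts, purely for illustration (not taken from the paper).
m = binary_metrics(tp=7, fn=3, tn=53, fp=4)
print({k: round(v, 2) for k, v in m.items()})
```

The AUC, by contrast, is threshold-independent: it is computed from the ranking induced by each interpreter's confidence values rather than from a single confusion matrix.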
Fig. 2. Receiver operating characteristic curves for the performance of the convolutional neural network and the reviewers.
Fig. 3. Gangrenous cholecystitis in a 44‐year‐old man. (A) Test image: the CNN and all reviewers successfully diagnosed the gangrenous cholecystitis (confidence value for gangrenous cholecystitis: CNN = 86%, surgeon = 80%, radiologist 1 = 60%, radiologist 2 = 100%). On the right side, the gallbladder wall (arrow) is indistinct, and the surrounding fatty tissue is highly opacified. (B) Resected specimen: the gallbladder wall is reddish‐brown in color, and the mucosal surface is strongly devitalized, indicating gangrenous cholecystitis. CNN, convolutional neural network.
Fig. 4. Noncomplicated acute cholecystitis in a 25‐year‐old woman. (A) Test image: only the CNN successfully diagnosed the noncomplicated acute cholecystitis (confidence value for gangrenous cholecystitis: CNN = 0%, surgeon = 60%, radiologist 1 = 60%, radiologist 2 = 80%). The high‐density gallbladder wall‐lumen sign (arrow) appears to be present, although the wall itself is difficult to assess on the noncontrast CT image. High opacity is seen around the gallbladder neck. (B) Resected specimen: the mucosal surface of the gallbladder wall is smooth and well preserved, with no color abnormalities, indicating noncomplicated acute cholecystitis. CNN, convolutional neural network.
ICC between the CNN and the reviewers
| | Reviewers | ICC |
|---|---|---|
| CNN vs reviewers | Surgeon | 0.04 |
| | Radiologist 1 | 0.05 |
| | Radiologist 2 | 0.17 |
| Between reviewers | Surgeon vs radiologist 1 | 0.74 |
| | Surgeon vs radiologist 2 | 0.43 |
| | Radiologist 1 vs radiologist 2 | 0.33 |
Abbreviations: CNN, convolutional neural network; ICC, intraclass correlation coefficient (a measure of interobserver agreement).
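The ICC values above summarize pairwise agreement on per-case confidence scores. As a rough illustration (not the authors' code; this excerpt does not state which ICC variant was used, so ICC(2,1), i.e. two-way random effects, absolute agreement, single rater, is an assumption), here is a minimal plain-Python sketch:

```python
# Minimal ICC(2,1) sketch (Shrout & Fleiss: two-way random effects,
# absolute agreement, single rater). Purely illustrative; the paper's
# exact ICC variant is an assumption here.

def icc2_1(scores: list[list[float]]) -> float:
    """scores[i][j] = rating of subject (case) i by rater j."""
    n, k = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    # Mean squares for subjects (rows), raters (columns), and residual error
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    sse = sum(
        (scores[i][j] - row_means[i] - col_means[j] + grand) ** 2
        for i in range(n) for j in range(k)
    )
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical scores from two raters: rater 2 is a constant offset of
# rater 1, so ranking agreement is perfect but absolute agreement is
# penalized, pulling the ICC below 1.
print(round(icc2_1([[1, 2], [2, 3], [3, 4]]), 2))  # → 0.67
```

The near-zero CNN-vs-reviewer ICCs alongside moderate reviewer-vs-reviewer ICCs suggest the CNN's confidence values track a different signal than the human readers', even where categorical diagnoses coincide.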