Hidetsugu Asano1, Eiji Hirakawa2,3, Hayato Hayashi4, Keisuke Hamada5,6, Yuto Asayama4, Masaaki Oohashi4, Akira Uchiyama7, Teruo Higashino7.
Abstract
BACKGROUND: Regulation of temperature is clinically important in the care of neonates because it has a significant impact on prognosis. Although probes that make contact with the skin are widely used to monitor temperature and provide spot central and peripheral temperature information, they do not provide details of the temperature distribution around the body. Although it is possible to obtain detailed temperature distributions using multiple probes, this is not clinically practical. Thermographic techniques have been reported for measurement of temperature distribution in infants. However, as these methods require manual selection of the regions of interest (ROIs), they are not suitable for introduction into clinical settings in hospitals. Here, we describe a method for segmentation of thermal images that enables continuous quantitative contactless monitoring of the temperature distribution over the whole body of neonates.
Keywords: Infants; Semantic segmentation; Temperature; Thermography
Year: 2022 PMID: 34979965 PMCID: PMC8721998 DOI: 10.1186/s12880-021-00730-0
Source DB: PubMed Journal: BMC Med Imaging ISSN: 1471-2342 Impact factor: 1.930
Participant characteristics
| Characteristic | Median ± SD or n (%) |
|---|---|
| Gestational week at delivery | 34 ± 2.8 |
| Birth weight (g) | 2053 ± 712 |
| Age (days) | 0 ± 0.8 |
| Sex (male) | 7 (58%) |
Fig. 1Thermographic images. Many variations in thermal images were obtained with different sizes and positions of the infants: blue, 28 °C; red, 40 °C
Fig. 2Examples of thermal images and ground truth. The head is shown in red, the body in yellow, the arms in green, the legs in blue, and the other regions in black
Detailed network configuration of U-Net, U-Net GAN Generator, and U-Net GAN + SA Generator
| Layers | Output size | U-Net | U-Net GAN + SA |
|---|---|---|---|
| Input | 320 × 256 × 1 | ||
| Convolution | 320 × 256 × 16 | 3 × 3, 16 d | 3 × 3, 16 d |
| Downscale | 160 × 128 × 32 | 5 × 5, 32 d, CBR 3 × 3, 32 d, CBR | 1 × 1, 32 d 7 × 7, 32 d, SA 1 × 1, 32 d |
| Downscale | 80 × 64 × 64 | 5 × 5, 64 d, CBR 3 × 3, 64 d, CBR | 1 × 1, 64 d 7 × 7, 64 d, SA 1 × 1, 64 d |
| Downscale | 40 × 32 × 128 | 5 × 5, 128 d, CBR 3 × 3, 128 d, CBR | 1 × 1, 128 d 7 × 7, 128 d, SA 1 × 1, 128 d |
| Downscale | 20 × 16 × 256 | 5 × 5, 256 d, CBR 3 × 3, 256 d, CBR | 1 × 1, 256 d 7 × 7, 256 d, SA 1 × 1, 256 d |
| Downscale | 10 × 8 × 512 | 5 × 5, 512 d, CBR 3 × 3, 512 d, CBR | 1 × 1, 512 d 7 × 7, 512 d, SA 1 × 1, 512 d |
| Upscale | 20 × 16 × 256 | 5 × 5, 256 d, CBR 3 × 3, 256 d, CBR | 1 × 1, 256 d 7 × 7, 256 d, SA 1 × 1, 256 d |
| Upscale | 40 × 32 × 128 | 5 × 5, 128 d, CBR 3 × 3, 128 d, CBR | 1 × 1, 128 d 7 × 7, 128 d, SA 1 × 1, 128 d |
| Upscale | 80 × 64 × 64 | 5 × 5, 64 d, CBR 3 × 3, 64 d, CBR | 1 × 1, 64 d 7 × 7, 64 d, SA 1 × 1, 64 d |
| Upscale | 160 × 128 × 32 | 5 × 5, 32 d, CBR 3 × 3, 32 d, CBR | 1 × 1, 32 d 7 × 7, 32 d, SA 1 × 1, 32 d |
| Upscale | 320 × 256 × 16 | 5 × 5, 16 d, CBR 3 × 3, 16 d, CBR | 1 × 1, 16 d 7 × 7, 16 d, SA 1 × 1, 16 d |
| Convolution | 320 × 256 × 1 | 3 × 3, 1 d | 3 × 3, 1 d |
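The stage sizes in the table above follow a simple pattern: each Downscale halves both spatial dimensions and doubles the channel count, and each Upscale mirrors it. A quick sanity check of the table's output sizes, in plain Python (an illustrative sketch, not the authors' code):

```python
def downscale(h, w, c):
    """Halve spatial dims, double channels (as in the Downscale rows)."""
    return h // 2, w // 2, c * 2

def upscale(h, w, c):
    """Double spatial dims, halve channels (as in the Upscale rows)."""
    return h * 2, w * 2, c // 2

# After the first 3x3 convolution the feature map is 320 x 256 x 16.
shape = (320, 256, 16)
for _ in range(5):          # five Downscale stages
    shape = downscale(*shape)
print(shape)                # (10, 8, 512) -- the bottleneck row in the table

for _ in range(5):          # five Upscale stages mirror the encoder
    shape = upscale(*shape)
print(shape)                # (320, 256, 16), before the final 3x3, 1-d conv
```

This matches every Output size row in the generator table, which is a useful check when transcribing such architectures.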
Fig. 3Network diagram of U-Net GAN
Detailed network configuration of U-Net GAN discriminator and U-Net GAN + SA discriminator
| Layers | Output size | U-Net GAN | U-Net GAN + SA |
|---|---|---|---|
| Input | 320 × 256 × 1 | ||
| Convolution | 320 × 256 × 8 | 3 × 3, 8 d | 3 × 3, 8 d |
| Downscale | 160 × 128 × 16 | 5 × 5, 16 d, CBR 3 × 3, 16 d, CBR | 1 × 1, 16 d 7 × 7, 16 d, SA 1 × 1, 16 d |
| Downscale | 80 × 64 × 32 | 5 × 5, 32 d, CBR 3 × 3, 32 d, CBR | 1 × 1, 32 d 7 × 7, 32 d, SA 1 × 1, 32 d |
| Downscale | 40 × 32 × 64 | 5 × 5, 64 d, CBR 3 × 3, 64 d, CBR | 1 × 1, 64 d 7 × 7, 64 d, SA 1 × 1, 64 d |
| Encoder out | 5 | ReLU Average Pooling Linear, 5 d | ReLU Average Pooling Linear, 5 d |
| Upscale | 80 × 64 × 32 | 5 × 5, 32 d, CBR 3 × 3, 32 d, CBR | 1 × 1, 32 d 7 × 7, 32 d, SA 1 × 1, 32 d |
| Upscale | 160 × 128 × 16 | 5 × 5, 16 d, CBR 3 × 3, 16 d, CBR | 1 × 1, 16 d 7 × 7, 16 d, SA 1 × 1, 16 d |
| Upscale | 320 × 256 × 8 | 5 × 5, 8 d, CBR 3 × 3, 8 d, CBR | 1 × 1, 8 d 7 × 7, 8 d, SA 1 × 1, 8 d |
| Convolution | 320 × 256 × 2 | 3 × 3, 2 d | 3 × 3, 2 d |
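The SA entries in both tables sit between 1 × 1 convolutions, which is the usual pattern for a self-attention block on a feature map. The paper does not spell out the block's internals, so the following is a minimal NumPy sketch of scaled dot-product self-attention over flattened spatial positions; the random projections stand in for the learned 1 × 1 weights (an assumption for illustration):

```python
import numpy as np

def self_attention(x, seed=0):
    """Minimal scaled dot-product self-attention.

    x is (N, C): N flattened spatial positions with C channels each.
    wq/wk/wv are random stand-ins for learned projections (assumption).
    """
    n, c = x.shape
    rng = np.random.default_rng(seed)
    wq, wk, wv = (rng.standard_normal((c, c)) / np.sqrt(c) for _ in range(3))
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = (q @ k.T) / np.sqrt(c)              # (N, N) position affinities
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over positions
    return attn @ v                              # each position attends to all

x = np.random.default_rng(1).standard_normal((64, 32))  # e.g. an 8x8 map, 32 channels
print(self_attention(x).shape)  # (64, 32)
```

Because every position attends to every other, such a block can propagate context across the whole image, which is plausibly why the SA variants segment small, thin regions (arms, legs) better in the results below.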
Parameters used for training
| Parameter | U-Net | U-Net GAN | U-Net GAN + SA |
|---|---|---|---|
| Learning rate | 0.01 | 0.01 (generator) 1e−4 (discriminator) | 0.01 (generator) 1e−4 (discriminator) |
| Batch size | 75 | 30 | 12 |
| Epoch | 200 | 100 | 100 |
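The hyper-parameters above can be captured in a small config structure; note the asymmetric learning rates in the GAN variants (discriminator 100× slower than the generator, one common way to keep the adversarial game balanced) and the shrinking batch sizes as the models grow. A sketch, restating only values from the table:

```python
# Training hyper-parameters transcribed from the table above.
train_cfg = {
    "U-Net":          {"lr": 0.01, "batch_size": 75, "epochs": 200},
    "U-Net GAN":      {"lr_g": 0.01, "lr_d": 1e-4, "batch_size": 30, "epochs": 100},
    "U-Net GAN + SA": {"lr_g": 0.01, "lr_d": 1e-4, "batch_size": 12, "epochs": 100},
}

# Generator-to-discriminator learning-rate ratio for the GAN variants:
ratio = train_cfg["U-Net GAN"]["lr_g"] / train_cfg["U-Net GAN"]["lr_d"]
print(round(ratio))  # 100
```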
Segmentation performance using U-Net with and without normalized convolution, FReLU, and group normalization
| Normalized convolution | FReLU | Group normalization | Accuracy (%) | SD (%) | mIoU (%) | SD (%) |
|---|---|---|---|---|---|---|
| | | | 91.3 | 0.04 | 57.8 | 0.15 |
| ✓ | | | 91.1 | 0.05 | 60.9 | 0.15 |
| | ✓ | | 91.9 | 0.04 | 60.9 | 0.16 |
| | | ✓ | 92.2 | 0.04 | 62.2 | 0.14 |
| ✓ | ✓ | | 91.4 | 0.05 | 60.7 | 0.15 |
| ✓ | | ✓ | 92.4 | 0.04 | 63.8 | 0.13 |
| | ✓ | ✓ | 92.9 | 0.04 | 64.5 | 0.15 |
| ✓ | ✓ | ✓ | 92.4 | 0.04 | 62.9 | 0.15 |
Segmentation performance of U-Net, U-Net GAN, and U-Net GAN + SA
| Network | Accuracy (%) | SD (%) | mIoU (%) | SD (%) |
|---|---|---|---|---|
| U-Net | 92.9 | 0.04 | 64.5 | 0.15 |
| U-Net GAN | 93.3 | 0.03 | 66.9 | 0.13 |
| U-Net GAN + SA | 93.5 | 0.03 | 70.4 | 0.13 |
Fig. 4Confusion matrices of U-Net, U-Net GAN, and U-Net GAN + SA
Significant differences between the proposed methods
*p < 0.01, **p < 0.05
Hausdorff distance for each region
| Network | Head mean | Head SD | Body mean | Body SD | Arm mean | Arm SD | Leg mean | Leg SD | Other mean | Other SD | All (w/o other) mean | All (w/o other) SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| U-Net | 34.6 | 25.3 | 38.1 | 29.5 | 59.2 | 37.7 | 43.4 | 40.0 | 26.7 | 9.3 | 43.9 | 34.8 |
| Normalized convolution | 33.0 | 23.6 | 31.5 | 26.2 | 55.7 | 35.6 | 46.5 | 50.7 | 26.5 | 8.9 | 41.5 | 36.4 |
| FReLU | 31.2 | 21.3 | 31.2 | 22.6 | 58.4 | 38.0 | 42.8 | 40.0 | 25.5 | 9.1 | 40.8 | 33.3 |
| Group normalization | 30.3 | 18.2 | 30.1 | 21.9 | 63.4 | 40.0 | 50.7 | 48.6 | 25.8 | 9.5 | 43.4 | 36.9 |
| Normalized convolution FReLU | 27.8 | 18.9 | 31.1 | 25.6 | 57.2 | 37.1 | 47.9 | 47.5 | 25.5 | 9.2 | 40.8 | 35.7 |
| Normalized convolution group normalization | 30.4 | 21.3 | 30.0 | 20.6 | 64.9 | 34.1 | 52.9 | 50.0 | 26.7 | 9.9 | 44.3 | 36.3 |
| FReLU group normalization | 27.5 | 17.8 | 25.2 | 20.2 | 48.7 | 32.8 | 38.6 | 38.3 | 25.4 | 8.8 | 34.9 | 29.8 |
| ALL | 26.8 | 17.2 | 28.5 | 22.4 | 53.9 | 36.9 | 48.5 | 49.9 | 25.3 | 9.9 | 39.1 | 35.5 |
| U-Net GAN | 27.4 | 19.7 | 26.7 | 22.7 | 49.3 | 34.1 | 39.0 | 42.7 | 24.5 | 9.1 | 35.5 | 32.1 |
| U-Net GAN + SA | 27.1 | 17.7 | 26.7 | 22.7 | 46.3 | 32.6 | 41.4 | 42.1 | 23.7 | 9.4 | 35.2 | 31.1 |
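The Hausdorff distance used above measures the worst-case boundary disagreement between a predicted region and the ground truth: the largest distance from any point in one set to its nearest point in the other, symmetrized. A minimal pure-Python sketch (a naive O(n·m) implementation for illustration; real pipelines typically use an optimized routine such as SciPy's):

```python
import math

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two 2-D point sets,
    e.g. boundary pixels of a predicted and a ground-truth region."""
    def directed(p, q):
        # Worst nearest-neighbour distance from set p into set q.
        return max(min(math.dist(x, y) for y in q) for x in p)
    return max(directed(a, b), directed(b, a))

# Toy example: two small contours offset by 3 pixels.
pred  = [(0, 0), (0, 1), (0, 2)]
truth = [(3, 0), (3, 1), (3, 2)]
print(hausdorff(pred, truth))  # 3.0
```

Because it is a worst-case metric, a single stray predicted pixel far from the true region inflates it, which is why the arm and leg rows (thin, easily fragmented regions) show the largest means and SDs in the table.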
IoU for each region
| Network | Head IoU (%) | Head SD (%) | Body IoU (%) | Body SD (%) | Arm IoU (%) | Arm SD (%) | Leg IoU (%) | Leg SD (%) | Other IoU (%) | Other SD (%) | All IoU (%) | All SD (%) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| U-Net | 50.8 | 0.16 | 52.1 | 0.16 | 41.6 | 0.17 | 53.5 | 0.23 | 91.1 | 0.03 | 57.8 | 0.15 |
| Normalized convolution | 57.5 | 0.14 | 48.5 | 0.15 | 47.4 | 0.15 | 59.8 | 0.23 | 91.5 | 0.03 | 60.9 | 0.14 |
| FReLU | 54.8 | 0.17 | 56.6 | 0.16 | 44.3 | 0.18 | 57.1 | 0.26 | 91.8 | 0.03 | 60.9 | 0.16 |
| Group normalization | 56.4 | 0.16 | 60.1 | 0.15 | 43.1 | 0.16 | 58.6 | 0.24 | 93.0 | 0.03 | 62.2 | 0.15 |
| Normalized convolution FReLU | 55.1 | 0.14 | 57.3 | 0.14 | 41.4 | 0.13 | 57.6 | 0.21 | 92.2 | 0.03 | 60.7 | 0.13 |
| Normalized convolution group normalization | 58.0 | 0.15 | 61.2 | 0.16 | 47.3 | 0.17 | 60.2 | 0.24 | 92.4 | 0.03 | 63.8 | 0.15 |
| FReLU group normalization | 59.2 | 0.16 | 62.3 | 0.15 | 47.7 | 0.17 | 61.4 | 0.23 | 92.0 | 0.03 | 64.5 | 0.15 |
| ALL | 58.2 | 0.15 | 59.3 | 0.15 | 47.9 | 0.16 | 58.0 | 0.25 | 91.3 | 0.03 | 62.9 | 0.15 |
| U-Net GAN | 61.5 | 0.14 | 64.3 | 0.14 | 49.1 | 0.15 | 66.4 | 0.2 | 93.4 | 0.03 | 66.9 | 0.13 |
| U-Net GAN + SA | 64.8 | 0.14 | 67.9 | 0.14 | 57.9 | 0.14 | 67.6 | 0.2 | 93.6 | 0.02 | 70.4 | 0.13 |
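The All column is consistent with the mIoU being the unweighted mean of the five per-region IoUs. A quick check using the U-Net GAN + SA row from the table above:

```python
def miou(ious):
    """Mean IoU: unweighted average of per-class IoU values."""
    ious = list(ious)
    return sum(ious) / len(ious)

# Per-region IoUs (%) for U-Net GAN + SA, from the table above.
regions = {"head": 64.8, "body": 67.9, "arm": 57.9, "leg": 67.6, "other": 93.6}
print(round(miou(regions.values()), 1))  # 70.4
```

Note that the large "other" (background) IoU pulls the mean up for every network, so the per-region columns are more informative than the All column when comparing how well the small limb regions are segmented.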
Fig. 5Examples of the differences in segmentation results between U-Net, U-Net GAN, and U-Net GAN + SA. a Input. b Ground truth. c U-Net. d U-Net GAN. e U-Net GAN + SA