Feasibility of new fat suppression for breast MRI using pix2pix.

Mio Mori, Tomoyuki Fujioka, Leona Katsuta, Yuka Kikuchi, Goshi Oda, Tsuyoshi Nakagawa, Yoshio Kitazume, Kazunori Kubota, Ukihide Tateishi.

Abstract

PURPOSE: To generate and evaluate fat-saturated T1-weighted (FST1W) image synthesis of breast magnetic resonance imaging (MRI) using pix2pix.
MATERIALS AND METHODS: We collected pairs of noncontrast-enhanced T1-weighted and FST1W breast MRI images for training data (2112 pairs from 15 patients), validation data (428 pairs from three patients), and test data (90 pairs from 30 patients). From the original images, 90 synthetic images were generated with pix2pix at 50, 100, and 200 epochs. Two breast radiologists evaluated the synthetic images (from 1 = excellent to 5 = very poor) for quality of fat suppression, anatomic structures, artifacts, etc. The average score was analyzed for each epoch and breast density.
RESULTS: The synthetic images were scored from 2.95 to 3.60; the best score was for artifact reduction at 100 epochs. The average overall quality scores for fat suppression were 3.63 at 50 epochs, 3.24 at 100 epochs, and 3.12 at 200 epochs. In the analysis by breast density, each score was significantly better for nondense breasts than for dense breasts; the average score was 2.88-3.18 for nondense breasts and 3.03-3.42 for dense breasts (P = 0.000-0.042).
CONCLUSION: Pix2pix has the potential to generate synthetic FST1W images for breast MRI.
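The study's training code is not published. As a rough, hypothetical illustration of what pix2pix optimizes when mapping T1-weighted inputs to synthetic FST1W images, the generator loss combines an adversarial term with a lambda-weighted L1 term against the real target; the arrays, the scalar discriminator output, and the weight below are toy stand-ins, not values from the paper (lambda = 100 follows the original pix2pix formulation).

```python
import numpy as np

# Toy stand-ins for the real FST1W target and a generator's
# synthetic FST1W output (values are illustrative, not MRI data).
rng = np.random.default_rng(0)
real_fst1w = rng.random((64, 64))
fake_fst1w = real_fst1w + 0.1 * rng.standard_normal((64, 64))

# Discriminator's estimated probability that the synthetic image is real.
# A real pix2pix PatchGAN discriminator outputs a grid, not a scalar.
d_fake = 0.4

# pix2pix generator objective: adversarial term plus lambda * L1 term.
lambda_l1 = 100.0                              # weight from the pix2pix paper
adv_loss = -np.log(d_fake)                     # non-saturating GAN loss
l1_loss = np.mean(np.abs(fake_fst1w - real_fst1w))
g_loss = adv_loss + lambda_l1 * l1_loss
```

The L1 term keeps the synthetic image close to the paired ground truth pixel-wise, while the adversarial term pushes it toward the distribution of real FST1W images; the epoch counts compared in the study (50, 100, 200) control how long this objective is minimized.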

Keywords:  Breast imaging; Deep learning; Generative adversarial networks; Magnetic resonance imaging; Pix2pix

Year:  2020        PMID: 32613357     DOI: 10.1007/s11604-020-01012-5

Source DB:  PubMed          Journal:  Jpn J Radiol        ISSN: 1867-1071            Impact factor:   2.374


  6 in total

1.  The Utility of Deep Learning in Breast Ultrasonic Imaging: A Review.

Authors:  Tomoyuki Fujioka; Mio Mori; Kazunori Kubota; Jun Oyama; Emi Yamaga; Yuka Yashima; Leona Katsuta; Kyoko Nomura; Miyako Nara; Goshi Oda; Tsuyoshi Nakagawa; Yoshio Kitazume; Ukihide Tateishi
Journal:  Diagnostics (Basel)       Date:  2020-12-06

2.  Progressive Transmission of Medical Images via a Bank of Generative Adversarial Networks.

Authors:  Ching-Chun Chang; Xu Wang; Ji-Hwei Horng; Isao Echizen
Journal:  J Healthc Eng       Date:  2021-04-28       Impact factor: 2.682

3.  Deep Learning Using Multiple Degrees of Maximum-Intensity Projection for PET/CT Image Classification in Breast Cancer.

Authors:  Kanae Takahashi; Tomoyuki Fujioka; Jun Oyama; Mio Mori; Emi Yamaga; Yuka Yashima; Tomoki Imokawa; Atsushi Hayashi; Yu Kujiraoka; Junichi Tsuchiya; Goshi Oda; Tsuyoshi Nakagawa; Ukihide Tateishi
Journal:  Tomography       Date:  2022-01-05

4.  High-throughput widefield fluorescence imaging of 3D samples using deep learning for 2D projection image restoration.

Authors:  Edvin Forsgren; Christoffer Edlund; Miniver Oliver; Kalpana Barnes; Rickard Sjögren; Timothy R Jackson
Journal:  PLoS One       Date:  2022-05-19       Impact factor: 3.240

5.  Motion correction in MR image for analysis of VSRAD using generative adversarial network.

Authors:  Nobukiyo Yoshida; Hajime Kageyama; Hiroyuki Akai; Koichiro Yasaka; Haruto Sugawara; Yukinori Okada; Akira Kunimatsu
Journal:  PLoS One       Date:  2022-09-14       Impact factor: 3.752

6.  Investigating the Image Quality and Utility of Synthetic MRI in the Breast.

Authors:  Tomoyuki Fujioka; Mio Mori; Jun Oyama; Kazunori Kubota; Emi Yamaga; Yuka Yashima; Leona Katsuta; Kyoko Nomura; Miyako Nara; Goshi Oda; Tsuyoshi Nakagawa; Ukihide Tateishi
Journal:  Magn Reson Med Sci       Date:  2021-02-02       Impact factor: 2.471

