Attention-Aware Discrimination for MR-to-CT Image Translation Using Cycle-Consistent Generative Adversarial Networks.
Kearney, Vasant; Ziemer, Benjamin P; Perry, Alan; Wang, Tianqi; Chan, Jason W; Ma, Lijun; Morin, Olivier; Yom, Sue S; Solberg, Timothy D.
Affiliation
  • Kearney V; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
  • Ziemer BP; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
  • Perry A; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
  • Wang T; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
  • Chan JW; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
  • Ma L; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
  • Morin O; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
  • Yom SS; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
  • Solberg TD; Department of Radiation Oncology, University of California, 1600 Divisadero St, San Francisco, CA 94115.
Radiol Artif Intell ; 2(2): e190027, 2020 Mar.
Article in English | MEDLINE | ID: mdl-33937817
ABSTRACT

PURPOSE:

To suggest an attention-aware, cycle-consistent generative adversarial network (A-CycleGAN) enhanced with variational autoencoding (VAE) as a superior alternative to current state-of-the-art MR-to-CT image translation methods.

MATERIALS AND METHODS:

An attention-gating mechanism is incorporated into the discriminator network to encourage a more parsimonious use of network parameters, while VAE enhancement enables deeper discriminator architectures without inhibiting model convergence. Data from 60 patients with head, neck, and brain cancer were used to train and validate A-CycleGAN; data from 30 additional patients formed the holdout test set, on which final results were reported using mean absolute error (MAE), structural similarity index (SSIM), and peak signal-to-noise ratio (PSNR).
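The abstract does not give the discriminator's exact architecture, but the attention-gating idea it describes can be illustrated with a minimal sketch: a learned projection of the feature map is squashed to [0, 1] and used to re-weight the features, so the discriminator concentrates its parameters on salient regions. The function name `attention_gate` and the single-vector gate weight `w_gate` are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_gate(features, w_gate):
    """Soft attention gate (hypothetical sketch):
    project each pixel's feature vector to a scalar logit,
    squash it to [0, 1], and re-weight the features."""
    # features: (H, W, C); w_gate: (C,)
    logits = features @ w_gate              # (H, W) attention logits
    alpha = sigmoid(logits)                 # attention coefficients in (0, 1)
    return features * alpha[..., None]      # gated features, same shape

# Toy usage on a random feature map
rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 8, 4))
w = rng.normal(size=(4,))
gated = attention_gate(feats, w)
```

In a full model the gate would itself be learned (e.g., a 1x1 convolution) and applied at one or more discriminator layers; the key property shown here is that gating only attenuates features, never amplifies them.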

RESULTS:

A-CycleGAN achieved superior results compared with a U-Net, a generative adversarial network (GAN), and a cycle-consistent GAN (CycleGAN). The A-CycleGAN means, 95% confidence intervals (CIs), and two-sided Wilcoxon signed-rank test P values were: MAE, 19.61 (95% CI: 18.83, 20.39; P = .0104); SSIM, 0.778 (95% CI: 0.758, 0.798; P = .0495); and PSNR, 62.35 (95% CI: 61.80, 62.90; P = .0571).
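The MAE and PSNR figures above follow standard definitions, which can be sketched as below. The `data_range` value is an assumption for the toy example; the paper's actual evaluation protocol (e.g., the Hounsfield-unit range used for PSNR) is not stated in the abstract:

```python
import numpy as np

def mae(ct_true, ct_pred):
    """Mean absolute error between reference and synthesized CT."""
    return np.mean(np.abs(ct_true - ct_pred))

def psnr(ct_true, ct_pred, data_range):
    """Peak signal-to-noise ratio in dB for a given dynamic range."""
    mse = np.mean((ct_true - ct_pred) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

# Toy usage with synthetic images (not the study's data)
ref = np.zeros((4, 4))
pred = np.full((4, 4), 2.0)
print(mae(ref, pred))                  # 2.0
print(psnr(ref, pred, data_range=255.0))
```

Higher PSNR and SSIM, and lower MAE, indicate closer agreement between synthesized and reference CT images, which is the sense in which A-CycleGAN outperforms the baselines.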

CONCLUSION:

A-CycleGANs were a superior alternative to state-of-the-art MR-to-CT image translation methods. © RSNA, 2020.

Full text: 1 | Collections: 01-international | Database: MEDLINE | Study type: Prognostic_studies | Language: English | Journal: Radiol Artif Intell | Publication year: 2020 | Document type: Article