Corpus ID: 239016303

MRI Recovery with A Self-calibrated Denoiser

@inproceedings{Liu2021MRIRW,
  title={MRI Recovery with A Self-calibrated Denoiser},
  author={Sizhuo Liu and Philip Schniter and Rizwan Ahmad},
  year={2021}
}
Plug-and-play (PnP) methods that employ application-specific denoisers have been proposed to solve inverse problems, including MRI reconstruction. However, training application-specific denoisers is not feasible for many applications due to the lack of training data. In this work, we propose a PnP-inspired recovery method that does not require data beyond the single, incomplete set of measurements. The proposed method, called recovery with a self-calibrated denoiser (ReSiDe), trains the denoiser…
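
To make the PnP framing concrete, below is a minimal, hypothetical sketch of a generic proximal-gradient PnP iteration for undersampled single-coil MRI in plain NumPy. It is not the authors' ReSiDe implementation: the function names, the splitting scheme, the box-filter stand-in denoiser, and the toy example are illustrative assumptions; ReSiDe would instead plug in its self-calibrated denoiser, which this sketch does not attempt to reproduce.

import numpy as np

def pnp_mri_recover(y, mask, denoise, n_iters=50, step=1.0):
    """Generic proximal-gradient PnP loop for single-coil MRI:
    alternate a gradient step on ||mask*FFT(x) - y||^2 with a denoising step."""
    x = np.fft.ifft2(mask * y, norm="ortho")            # zero-filled initial estimate
    for _ in range(n_iters):
        residual = mask * np.fft.fft2(x, norm="ortho") - mask * y
        x = x - step * np.fft.ifft2(mask * residual, norm="ortho")  # data-consistency step
        x = denoise(x)                                   # PnP: denoiser plays the role of the prior
    return x

def box_denoiser(x, k=3):
    """Toy stand-in denoiser: k-by-k local averaging of the complex image."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = xp[i:i + k, j:j + k].mean()
    return out

# Toy example: recover a smooth image from 50% random k-space samples.
rng = np.random.default_rng(0)
truth = np.outer(np.hanning(64), np.hanning(64)).astype(complex)
mask = (rng.random(truth.shape) < 0.5).astype(float)
y = mask * np.fft.fft2(truth, norm="ortho")
x_hat = pnp_mri_recover(y, mask, box_denoiser)

Because the FFTs use norm="ortho", the sampled-Fourier operator is nonexpansive and a unit step size is a safe default; any stronger denoiser can be passed in place of box_denoiser.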


References

Showing 1-10 of 18 references
Plug-and-Play Methods for Magnetic Resonance Imaging: Using Denoisers for Image Recovery
TLDR: This article describes the use of plug-and-play (PnP) algorithms for MRI image recovery and explains how the result of the PnP method can be interpreted as a solution to an equilibrium equation, allowing convergence analysis from this perspective.
DAGAN: Deep De-Aliasing Generative Adversarial Networks for Fast Compressed Sensing MRI Reconstruction
TLDR: This paper provides a deep-learning-based strategy for CS-MRI reconstruction, bridging a substantial gap between conventional non-learning methods, which work only on data from a single image, and approaches that exploit prior knowledge from large training data sets.
Plug-and-Play priors for model based reconstruction
TLDR: This paper demonstrates with some simple examples how Plug-and-Play priors can be used to mix and match a wide variety of existing denoising models with a tomographic forward model, thus greatly expanding the range of possible problem solutions.
Unsupervised MRI Reconstruction with Generative Adversarial Networks
TLDR: This work presents a deep learning framework for MRI reconstruction without any fully-sampled data using generative adversarial networks and recovers more anatomical structure compared to conventional methods.
Noise2Void - Learning Denoising From Single Noisy Images
TLDR: Noise2Void is introduced, a training scheme that allows training directly on the body of data to be denoised and can therefore be applied when other methods cannot, and compares favorably to training-free denoising methods.
Noise2Self: Blind Denoising by Self-Supervision
TLDR: A general framework for denoising high-dimensional measurements is proposed that requires no prior on the signal, no estimate of the noise, and no clean training data, and that allows calibrating $\mathcal{J}$-invariant versions of any parameterised denoising algorithm, from the single hyperparameter of a median filter to the millions of weights of a deep neural network.
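
The J-invariant calibration idea in this summary can be illustrated with a short, hypothetical sketch (a simplification, not the paper's reference code): a median filter is made J-invariant by replacing a held-out grid of pixels with averages of their neighbors before filtering, and the filter size is then chosen to minimize the self-supervised loss evaluated only on those held-out pixels.

import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def j_invariant_loss(noisy, size, grid=4):
    """Noise2Self-style self-supervised MSE over a partition of pixel grids."""
    # Average of the 8 neighbors, excluding the pixel itself, used as interpolation.
    interp = (uniform_filter(noisy, size=3) * 9.0 - noisy) / 8.0
    loss, count = 0.0, 0
    for di in range(grid):
        for dj in range(grid):
            mask = np.zeros(noisy.shape, dtype=bool)
            mask[di::grid, dj::grid] = True
            x = np.where(mask, interp, noisy)        # hide the held-out pixels
            denoised = median_filter(x, size=size)
            loss += np.sum((denoised[mask] - noisy[mask]) ** 2)
            count += mask.sum()
    return loss / count

# Calibrate the single hyperparameter (filter size) from the noisy image alone.
rng = np.random.default_rng(0)
clean = np.outer(np.hanning(128), np.hanning(128))
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
best_size = min([3, 5, 7, 9], key=lambda s: j_invariant_loss(noisy, s))
denoised = median_filter(noisy, size=best_size)
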
BM3D Frames and Variational Image Deblurring
TLDR: Simulation experiments show that the decoupled algorithm derived from the GNE formulation delivers the best numerical and visual results and outperforms the state of the art in the field, confirming the potential of BM3D frames as an advanced image-modeling tool.
Deep Image Prior
TLDR: It is shown that a randomly-initialized neural network can be used as a handcrafted prior with excellent results in standard inverse problems such as denoising, super-resolution, and inpainting.
Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising
TLDR: This paper investigates the construction of feed-forward denoising convolutional neural networks (DnCNNs) that bring advances in very deep architectures, learning algorithms, and regularization methods to image denoising, and uses residual learning and batch normalization to speed up training and boost denoising performance.
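
As a rough, hypothetical illustration of the residual-learning and batch-normalization recipe described above (a scaled-down sketch, not the official DnCNN code), the small PyTorch model below predicts the noise with a convolutional body and subtracts that prediction from the input.

import torch
import torch.nn as nn

class TinyDnCNN(nn.Module):
    """Scaled-down DnCNN-style denoiser: Conv+ReLU, several Conv+BN+ReLU blocks,
    a final Conv, and a residual connection so the network learns the noise."""
    def __init__(self, channels=1, features=64, depth=7):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1, bias=False),
                       nn.BatchNorm2d(features),     # batch normalization stabilizes and speeds up training
                       nn.ReLU(inplace=True)]
        layers.append(nn.Conv2d(features, channels, 3, padding=1))
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        return x - self.body(x)                      # residual learning: the body outputs the noise estimate

noisy = torch.randn(2, 1, 64, 64)                    # dummy batch of noisy grayscale patches
denoised = TinyDnCNN()(noisy)
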
Primal-Dual Plug-and-Play Image Restoration (S. Ono, IEEE Signal Processing Letters, 2017)
TLDR: This approach leverages the nature of primal-dual splitting to yield a very flexible plug-and-play image restoration method that is much more efficient than ADMMPnP when the latter requires an inner loop, while retaining the same efficiency when the ADMMPnP subproblem can be solved efficiently.