# iUNets: Fully invertible U-Nets with Learnable Up- and Downsampling

    @article{Etmann2020iUNetsFI,
      title={iUNets: Fully invertible U-Nets with Learnable Up- and Downsampling},
      author={Christian Etmann and Rihuan Ke and Carola-Bibiane Sch{\"o}nlieb},
      journal={ArXiv},
      year={2020},
      volume={abs/2005.05220}
    }

U-Nets have been established as a standard architecture for image-to-image learning problems such as segmentation and inverse problems in imaging. For large-scale data, as it appears, for example, in 3D medical imaging, the U-Net however has prohibitive memory requirements. Here, we present a new fully invertible U-Net-based architecture called the iUNet, which employs novel learnable and invertible up- and downsampling operations, thereby making the use of memory-efficient backpropagation…
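The core idea in the abstract can be illustrated with a minimal NumPy sketch: an invertible downsampling that reshuffles space into channels and then applies a learnable orthogonal channel mix, parameterized as the matrix exponential of a skew-symmetric matrix (so the inverse is just the transpose). All function names here are illustrative, not the authors' implementation; the truncated-Taylor `expm` is only adequate for small matrices.

```python
import numpy as np

def expm(M, terms=30):
    # Truncated Taylor series for the matrix exponential; fine for small M.
    out, term = np.eye(M.shape[0]), np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out += term
    return out

def orthogonal_from(W):
    # exp of a skew-symmetric matrix is orthogonal, so Q @ Q.T == I.
    return expm(W - W.T)

def downsample(x, Q):
    # x: (C, H, W) with even H, W. Space-to-channel reshuffle, then an
    # invertible (orthogonal) channel mix -- a sketch of learnable
    # invertible downsampling, not the iUNet authors' exact code.
    C, H, W_ = x.shape
    y = x.reshape(C, H // 2, 2, W_ // 2, 2).transpose(0, 2, 4, 1, 3)
    y = y.reshape(4 * C, H // 2, W_ // 2)
    return np.einsum('ij,jhw->ihw', Q, y)

def upsample(y, Q):
    # Exact inverse: undo the channel mix with Q.T, then channel-to-space.
    C4, h, w = y.shape
    x = np.einsum('ij,jhw->ihw', Q.T, y)
    C = C4 // 4
    x = x.reshape(C, 2, 2, h, w).transpose(0, 3, 1, 4, 2)
    return x.reshape(C, 2 * h, 2 * w)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4, 4))
Q = orthogonal_from(rng.normal(size=(4, 4)) * 0.1)
x_rec = upsample(downsample(x, Q), Q)
print(np.allclose(x_rec, x))
```

Because the channel mix is orthogonal, no activations need to be stored for this operation during backpropagation: the input can always be recomputed from the output.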

## 12 Citations

### Beyond NaN: Resiliency of Optimization Layers in The Face of Infeasibility

- Computer Science
- 2022

This work identifies a weakness in a set-up where inputs to the optimization layer lead to undefined output of the neural network, and proposes a defense for the failure cases by controlling the condition number of the input matrix.
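The defense described above amounts to checking the condition number of the input matrix before the optimization layer consumes it. A hedged sketch of that idea (the threshold, fallback, and function name `safe_solve` are illustrative, not from the cited paper):

```python
import numpy as np

def safe_solve(A, b, max_cond=1e8):
    # Hypothetical guard: detect an ill-conditioned input and regularize it
    # instead of letting the layer emit NaN/Inf downstream.
    if np.linalg.cond(A) > max_cond:
        # Tikhonov-style fallback: nudge the spectrum away from zero.
        A = A + 1e-6 * np.eye(A.shape[0])
    return np.linalg.solve(A, b)

A = np.array([[1.0, 1.0], [1.0, 1.0 + 1e-12]])  # nearly singular
b = np.array([1.0, 1.0])
x = safe_solve(A, b)
print(np.all(np.isfinite(x)))
```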

### Semi-invertible Convolutional Neural Network for Overall Survival Prediction in Head and Neck Cancer

- Computer Science
- ICC 2022 - IEEE International Conference on Communications
- 2022

The model exploits the 3D features of computed tomography scans to enrich the dataset used in the learning phase, thereby improving prediction accuracy; a first architecture is designed that combines a CNN classifier with a fully convolutional network pre-processor.

### Reversible Vision Transformers

- Computer Science
- 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2022

Reversible Vision Transformers achieve a reduced memory footprint of up to 15.5× at identical model complexity, parameters and accuracy, demonstrating the promise of reversible vision transformers as an efficient backbone for resource-limited training regimes.

### Continuous Generative Neural Networks

- Mathematics, Computer Science
- ArXiv
- 2022

This work presents conditions on the convolutional filters and on the nonlinearity that guarantee that a CGNN is injective, and allows for deriving Lipschitz stability estimates for (possibly nonlinear) infinite-dimensional inverse problems with unknowns belonging to the manifold generated by a CGNN.

### Restorable Image Operators with Quasi-Invertible Networks

- Computer Science
- AAAI
- 2022

A quasi-invertible model that learns common image processing operators in a restorable fashion is proposed that can generate visually pleasing results with the original content embedded and can be easily applied to practical applications such as restorable human face retouching and highlight preserved exposure adjustment.

### Fully hyperbolic convolutional neural networks

- Computer Science
- Research in the Mathematical Sciences
- 2022

This work introduces a fully conservative hyperbolic network for problems with high-dimensional input and output, together with a coarsening operation that enables completely reversible CNNs by using a learnable discrete wavelet transform and its inverse to both coarsen and interpolate the network state and change the number of channels.
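The wavelet-based coarsening mentioned above can be sketched with the fixed (non-learnable) orthonormal Haar transform: it halves the spatial length while doubling the number of bands, and loses no information. This is only an illustration of the principle, not the paper's learned transform.

```python
import numpy as np

def haar_forward(x):
    # Orthonormal 1D Haar analysis on adjacent pairs: invertible
    # coarsening that halves length and doubles the number of bands.
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation (coarse) band
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail band
    return a, d

def haar_inverse(a, d):
    # Exact synthesis: the transform is orthonormal, so nothing is lost.
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

x = np.arange(8.0)
a, d = haar_forward(x)
print(np.allclose(haar_inverse(a, d), x))
```

Replacing the fixed Haar filters with learned orthogonal filters keeps this exact invertibility while letting the network choose how to coarsen.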

### Invertible Learned Primal-Dual

- Mathematics
- 2021

We propose invertible Learned Primal-Dual as a method for tomographic image reconstruction. This is a learned iterative method based on the Learned Primal-Dual neural network architecture, which…

### DDUNet: Dense Dense U-Net with Applications in Image Denoising

- Computer Science
- 2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW)
- 2021

This work proposes a novel cascading U-Nets architecture with multi-scale dense processing, named Dense Dense U-Net (DDUNet), which is good at edge recovery and structure preservation in real noisy image denoising and develops a series of related important techniques to improve model performance with fewer parameters.

### Understanding and mitigating exploding inverses in invertible neural networks

- Mathematics, Computer Science
- AISTATS
- 2021

This work shows that commonly-used INN architectures suffer from exploding inverses and are thus prone to becoming numerically non-invertible, and proposes a flexible and efficient regularizer for tasks where local invertibility is sufficient.

### Structure-preserving deep learning

- Computer Science
- European Journal of Applied Mathematics
- 2021

A number of directions in deep learning are reviewed: some deep neural networks can be understood as discretisations of dynamical systems, neural networks can be designed to have desirable properties such as invertibility or group equivariance, and new algorithmic frameworks based on conformal Hamiltonian systems and Riemannian manifolds have been proposed to solve the optimisation problems.

## References

SHOWING 1-10 OF 31 REFERENCES

### A Partially Reversible U-Net for Memory-Efficient Volumetric Image Segmentation

- Computer Science
- MICCAI
- 2019

A partially reversible U-Net architecture that reduces memory consumption substantially and alleviates the biggest memory bottleneck and enables very deep (theoretically infinitely deep) 3D architectures is proposed.
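The memory saving in reversible architectures like this one comes from additive coupling blocks, whose inputs can be recomputed exactly from their outputs, so intermediate activations need not be stored for backpropagation. A minimal sketch with stand-in nonlinearities (`f` and `g` here are arbitrary placeholders, not the paper's sub-networks):

```python
import numpy as np

# Additive coupling block: y1 = x1 + f(x2), y2 = x2 + g(y1).
f = lambda z: np.tanh(z)
g = lambda z: 0.5 * z**2

def forward(x1, x2):
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def inverse(y1, y2):
    # Recompute the inputs from the outputs -- this is why activations
    # need not be stored during the forward pass.
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

x1, x2 = np.array([1.0, -2.0]), np.array([0.5, 3.0])
print(np.allclose(inverse(*forward(x1, x2)), (x1, x2)))
```

Note that `f` and `g` can be arbitrary (even non-invertible) functions; the coupling structure alone guarantees invertibility of the block.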

### Computing the Fréchet Derivative of the Matrix Exponential, with an Application to Condition Number Estimation

- Computer Science, Mathematics
- SIAM J. Matrix Anal. Appl.
- 2008

It is shown that the implementation of the scaling and squaring method can be extended to compute both $e^A$ and the Fréchet derivative at $A$ in the direction $E$, denoted by $L(A,E)$, at a cost about three times that for computing $e^A$ alone.
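The quantity $L(A,E)$ above can be checked numerically via the classic block-matrix identity: the top-right block of $\exp\!\big(\begin{smallmatrix}A & E\\ 0 & A\end{smallmatrix}\big)$ equals $L(A,E)$. The sketch below uses a truncated-Taylor `expm` and a finite-difference comparison; it illustrates the identity, not Al-Mohy and Higham's scaling-and-squaring algorithm itself.

```python
import numpy as np

def expm(M, terms=40):
    # Truncated Taylor series; adequate for the small, moderate-norm
    # matrices used in this check.
    out, term = np.eye(M.shape[0]), np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out += term
    return out

def expm_frechet(A, E):
    # L(A, E) is the top-right block of exp([[A, E], [0, A]]).
    n = A.shape[0]
    Z = np.zeros((n, n))
    big = np.block([[A, E], [Z, A]])
    return expm(big)[:n, n:]

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) * 0.3
E = rng.normal(size=(3, 3)) * 0.3
# Finite-difference check: (exp(A + tE) - exp(A)) / t ≈ L(A, E).
t = 1e-6
fd = (expm(A + t * E) - expm(A)) / t
print(np.allclose(expm_frechet(A, E), fd, atol=1e-4))
```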

### Auto-Encoding Variational Bayes

- Computer Science
- ICLR
- 2014

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.

### Invert to Learn to Invert

- Computer Science
- NeurIPS
- 2019

This work proposes an iterative inverse model with constant memory that relies on invertible networks to avoid storing intermediate activations, and allows us to train models with 400 layers on 3D volumes in an MRI image reconstruction task.

### MemCNN: A Python/PyTorch package for creating memory-efficient invertible neural networks

- Computer Science
- J. Open Source Softw.
- 2019

### Guided Image Generation with Conditional Invertible Neural Networks

- Computer Science
- ArXiv
- 2019

This work introduces a new architecture called conditional invertible neural network (cINN), which combines the purely generative INN model with an unconstrained feed-forward network, which efficiently preprocesses the conditioning input into useful features.

### A Closer Look at Double Backpropagation

- Computer Science, Mathematics
- ArXiv
- 2019

A description of the discontinuous loss surface of ReLU networks both in the inputs and the parameters is provided and it is demonstrated why the discontinuities do not pose a big problem in reality.

### Residual Flows for Invertible Generative Modeling

- Mathematics
- NeurIPS
- 2019

The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks on joint generative and discriminative modeling.
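A residual block $y = x + g(x)$ is invertible whenever $g$ is a contraction (Lipschitz constant below 1, which residual flows enforce via spectral normalization), and the inverse can be found by fixed-point iteration. A minimal sketch with an arbitrary contractive stand-in for $g$, not a trained network:

```python
import numpy as np

W = np.array([[0.3, -0.2], [0.1, 0.25]])  # spectral norm well below 1
g = lambda x: np.tanh(W @ x)  # Lipschitz constant < 1, so a contraction

def residual_forward(x):
    return x + g(x)

def residual_inverse(y, iters=50):
    # Banach fixed-point iteration x <- y - g(x); converges geometrically
    # because g is contractive.
    x = y.copy()
    for _ in range(iters):
        x = y - g(x)
    return x

x = np.array([0.7, -1.2])
y = residual_forward(x)
print(np.allclose(residual_inverse(y), x))
```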
