Description
Unfolded proximal neural networks (PNNs) form a family of methods that combine deep learning and proximal optimization. A PNN is a task-specific neural network built by unrolling a proximal algorithm for a fixed number of iterations, in which the linear operators can be learned through a prior training procedure. PNNs have been shown to be more robust than traditional deep learning approaches while achieving performance that is at least as good, in particular in computational imaging. However, training PNNs still depends on the efficiency of the available training algorithms. In this work, we propose a lifted training formulation based on Bregman distances for unfolded PNNs. Leveraging deterministic and stochastic block-coordinate forward-backward methods, we design computational strategies beyond traditional back-propagation for solving the learning problem efficiently. We assess the behaviour of the proposed training approach through numerical simulations on image denoising problems, where the structure of the denoising PNN is based on dual proximal gradient iterations.
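To fix ideas, a lifted formulation of the kind described above replaces the exact layer-by-layer recursion with per-layer auxiliary variables whose consistency is penalized through Bregman distances. In generic notation (the symbols below are illustrative, not the notation of this work), for a PNN whose layers compute z_{k+1} = prox_ψ(W_k z_k), such a learning problem might read

\[
\min_{\{W_k\},\,\{z_k\}} \; \mathcal{L}(z_K,\bar{x}) \;+\; \rho \sum_{k=0}^{K-1} B_{\Psi}\bigl(z_{k+1},\, W_k z_k\bigr),
\]

where \(\mathcal{L}\) is a data-fitting loss against the ground truth \(\bar{x}\) and \(B_{\Psi}\) is a Bregman-type penalty built from the convex potential defining the proximal activation. Each block of variables (a weight matrix \(W_k\) or a state \(z_k\)) can then be updated by a forward-backward step while the other blocks are held fixed, either deterministically or over randomly sampled blocks.

For the denoising architecture itself, the following minimal PyTorch sketch unrolls dual proximal gradient (dual forward-backward) iterations for the model min_x ½‖x − y‖² + λ‖Lx‖₁, with one learned analysis operator L per layer. The class name, layer count, and the parameters λ and τ are hypothetical illustrative choices, not the configuration used in the work:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class UnfoldedDualFBDenoiser(nn.Module):
    """Sketch of a denoising PNN: K unrolled dual forward-backward steps
    u <- proj_{||.||_inf <= lam}(u - tau * L(L^T u - y)), followed by the
    primal recovery x = y - L^T u."""

    def __init__(self, n_layers=10, n_features=32, lam=0.1, tau=0.5):
        super().__init__()
        # one learned linear (analysis) operator per layer, untied weights
        self.layers = nn.ModuleList(
            [nn.Conv2d(1, n_features, 3, padding=1, bias=False)
             for _ in range(n_layers)]
        )
        self.lam = lam  # regularization weight lambda
        self.tau = tau  # step size; in practice tied to ||L||^2

    def _adjoint(self, conv, u):
        # adjoint L^T of the convolution L, reusing the same weights
        return F.conv_transpose2d(u, conv.weight, padding=1)

    def forward(self, y):
        # dual variable, initialized at zero (shape inferred from one conv)
        u = torch.zeros_like(self.layers[0](y))
        for conv in self.layers:
            # gradient step on the smooth dual term 0.5*||L^T u - y||^2
            grad = conv(self._adjoint(conv, u) - y)
            # prox of the conjugate of lam*||.||_1, i.e. projection
            # onto the l_inf ball of radius lam
            u = (u - self.tau * grad).clamp(-self.lam, self.lam)
        # primal recovery from the last dual iterate
        return y - self._adjoint(self.layers[-1], u)

y = torch.randn(1, 1, 64, 64)        # a noisy test image
x_hat = UnfoldedDualFBDenoiser()(y)  # denoised estimate, same shape as y
```

In a lifted training of such a network, the per-layer dual variables would play the role of the auxiliary states above, with the convolution weights and the states forming the blocks of the block-coordinate forward-backward updates.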