Sparse Overcomplete Denoising: Aggregation Versus Global Optimization
Research output: Contribution to journal › Article › Scientific › peer-review
Number of pages: 5
Journal: IEEE Signal Processing Letters
Publication status: Published - 1 Oct 2017
Publication type: A1 Journal article-refereed
Denoising is often addressed via sparse coding with respect to an overcomplete dictionary. There are two main approaches when the dictionary is composed of translates of an orthonormal basis. The first, traditionally employed by techniques such as wavelet cycle spinning, separately seeks sparsity with respect to each translate of the orthonormal basis, solving multiple partial optimizations and obtaining a collection of sparse approximations of the noise-free image, which are aggregated to obtain a final estimate. The second approach, recently employed by convolutional sparse representations, instead seeks sparsity over the entire dictionary via a global optimization. It is tempting to view the former approach as providing a suboptimal solution of the latter. In this letter, we analyze whether global sparsity is a desirable property, and under what conditions the global optimization provides a better solution to the denoising problem. In particular, our experimental analysis shows that the two approaches attain comparable performance in the case of natural images, and that the global optimization outperforms the simpler aggregation of partial estimates only when the image admits an extremely sparse representation. We explain this phenomenon by separately studying the bias and variance of these solutions, and by noting that the variance of the global solution increases very rapidly as the original signal becomes less and less sparse.
- Convolutional sparse coding, denoising, overcomplete representations, sparse representations