Retrieval Compensated Group Structured Sparsity for Image Super-Resolution

Abstract:

Sparse representation-based image super-resolution is a well-studied topic; however, a general sparse framework that exploits both internal and external dependencies remains unexplored. In this paper, we propose a group-structured sparse representation approach that makes full use of both internal and external dependencies to facilitate image super-resolution. Compensated correlated external information is introduced through a two-stage retrieval-and-refinement process. First, in the global stage, content-based features are exploited to select correlated external images. Then, in the local stage, patch similarity, measured by a combination of content and high-frequency patch features, is used to refine the selected external data. To better learn priors from the compensated external data according to the distribution of the internal data, and to further combine their complementary advantages, nonlocal redundancy is incorporated into the sparse representation model to form a group sparsity framework based on an adaptive structured dictionary. The proposed adaptive structured dictionary consists of two parts: one trained on internal data and the other trained on compensated external data, both organized in a cluster-based form. To provide the desired over-completeness, when sparsely coding a given low-resolution (LR) patch, the structured dictionary is generated dynamically by combining several internal and external orthogonal subdictionaries nearest to the patch, instead of selecting only the single nearest one as in previous methods. Extensive experiments on image super-resolution validate the effectiveness and state-of-the-art performance of the proposed method. Additional experiments on contaminated and uncorrelated external data further demonstrate its superior robustness.
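To make the adaptive-dictionary step concrete, the following is a minimal sketch, not the authors' implementation, of how such a dictionary might be assembled for one LR patch: the k nearest internal and k nearest external cluster subdictionaries are concatenated and the patch is sparse-coded over the result (here with plain orthogonal matching pursuit). The class and parameter names (ClusterDict, k_int, k_ext) are illustrative assumptions, and the group-sparsity coding over nonlocal similar patches described in the abstract is omitted for brevity.

```python
# Illustrative sketch only: dynamic assembly of an adaptive structured
# dictionary from internal and external cluster subdictionaries.
import numpy as np
from sklearn.linear_model import orthogonal_mp


class ClusterDict:
    """Cluster-based dictionary: one orthogonal subdictionary per cluster."""

    def __init__(self, centroids, sub_dicts):
        self.centroids = np.asarray(centroids)  # (n_clusters, patch_dim)
        self.sub_dicts = sub_dicts              # list of (patch_dim, n_atoms) arrays

    def nearest(self, patch, k):
        # Rank clusters by Euclidean distance between the patch and each centroid.
        dists = np.linalg.norm(self.centroids - patch, axis=1)
        return [self.sub_dicts[i] for i in np.argsort(dists)[:k]]


def adaptive_dictionary(patch, internal, external, k_int=2, k_ext=2):
    """Concatenate the k nearest internal and external subdictionaries,
    yielding an over-complete dictionary tailored to this patch."""
    parts = internal.nearest(patch, k_int) + external.nearest(patch, k_ext)
    return np.hstack(parts)


def code_patch(patch, internal, external, n_nonzero=8):
    """Sparse-code one LR patch over its dynamically assembled dictionary."""
    D = adaptive_dictionary(patch, internal, external)
    alpha = orthogonal_mp(D, patch, n_nonzero_coefs=n_nonzero)
    return D, alpha
```

In this sketch, combining several nearby subdictionaries (rather than only the single nearest cluster) is what restores over-completeness while keeping each subdictionary compact and orthogonal.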

 

