Optimization-inspired Cumulative Transmission Network for image compressive sensing
Research output: Contribution to journal › Journal article › Research › peer-review
Standard
Optimization-inspired Cumulative Transmission Network for image compressive sensing. / Zhang, Tianfang; Li, Lei; Peng, Zhenming.
In: Knowledge-Based Systems, Vol. 279, 110963, 2023.
RIS
TY - JOUR
T1 - Optimization-inspired Cumulative Transmission Network for image compressive sensing
AU - Zhang, Tianfang
AU - Li, Lei
AU - Peng, Zhenming
N1 - Publisher Copyright: © 2023 Elsevier B.V.
PY - 2023
Y1 - 2023
AB - Compressive Sensing (CS) techniques enable accurate signal reconstruction from few measurements. Deep Unfolding Networks (DUNs) have recently been shown to increase the efficiency of CS by emulating iterative CS optimization procedures with neural networks. However, most of these DUNs suffer from redundant update procedures or complex matrix operations, which can impair their reconstruction performance. Here we propose the optimization-inspired Cumulative Transmission Network (CT-Net), a DUN approach for natural image CS. We formulate an optimization procedure introducing an auxiliary variable, similar to Half Quadratic Splitting (HQS). Unfolding this procedure defines the basic structure of our neural architecture, which is then further refined. A CT-Net is composed of Reconstruction Fidelity Modules (RFMs) for minimizing the reconstruction error and Constraint Gradient Approximation (CGA) modules for approximating (the gradient of) sparsity constraints, instead of relying on analytic solutions such as soft-thresholding. Furthermore, a lightweight Cumulative Transmission (CT) between the CGAs in each reconstruction stage is proposed to facilitate better feature representation. Experiments on several widely used natural image benchmarks illustrate the effectiveness of CT-Net, with significant performance improvements and fewer network parameters compared to existing state-of-the-art methods. The experiments also demonstrate the scene and noise robustness of the proposed method.
KW - Compressive sensing
KW - Deep unfolding
KW - Image reconstruction
KW - Neural networks
KW - Optimization
UR - http://www.scopus.com/inward/record.url?scp=85171132561&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2023.110963
DO - 10.1016/j.knosys.2023.110963
M3 - Journal article
AN - SCOPUS:85171132561
VL - 279
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
SN - 0950-7051
M1 - 110963
ER -