
Tensor train rank minimization with nonlocal self-similarity for tensor completion

* Corresponding authors: Ting-Zhu Huang and Xi-Le Zhao
Abstract
The tensor train (TT) rank has received increasing attention in tensor completion due to its ability to capture the global correlation of high-order tensors ($ \rm{order} >3 $). For third-order visual data, direct TT rank minimization does not exploit the potential of the TT rank for high-order tensors. TT rank minimization combined with ket augmentation (KA), which transforms a lower-order tensor (e.g., visual data) into a higher-order tensor, suffers from serious block artifacts. To tackle this issue, we suggest TT rank minimization with nonlocal self-similarity (NSS) for tensor completion, which simultaneously explores the spatial, temporal/spectral, and nonlocal redundancy in visual data. More precisely, TT rank minimization is performed on higher-order tensors, called groups, formed by stacking similar cubes, which naturally and fully takes advantage of the ability of the TT rank to model high-order tensors. Moreover, a perturbation analysis for the TT low-rankness of each group is established. We develop an alternating direction method of multipliers (ADMM) tailored to the specific structure of the proposed model. Extensive experiments demonstrate that the proposed method is superior to several existing state-of-the-art methods in terms of both qualitative and quantitative measures.

    Mathematics Subject Classification: Primary: 94A08; Secondary: 68U10.
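To make the grouping idea concrete, the following is a minimal sketch (not the authors' implementation) of how nonlocal self-similarity could be exploited: cubes similar to a key cube are collected by exhaustive matching and stacked into a higher-order group tensor, on which TT rank minimization is then performed. The function name, cube size, stride, and Euclidean matching criterion are assumptions made for illustration.

```python
import numpy as np

def group_similar_cubes(X, key_idx, cube=8, stride=4, num_similar=32):
    """Illustrative nonlocal grouping: stack the cubes most similar to the key
    cube (Euclidean distance) into one higher-order 'group' tensor.
    X is an H x W x D array (e.g., a color image or a video); all parameter
    values here are assumptions for this sketch, not the paper's settings."""
    H, W, D = X.shape
    ki, kj = key_idx
    key = X[ki:ki + cube, kj:kj + cube, :]

    candidates = []
    for i in range(0, H - cube + 1, stride):
        for j in range(0, W - cube + 1, stride):
            cand = X[i:i + cube, j:j + cube, :]
            candidates.append((np.linalg.norm(cand - key), i, j))

    candidates.sort(key=lambda t: t[0])                 # most similar first
    group = np.stack([X[i:i + cube, j:j + cube, :]
                      for _, i, j in candidates[:num_similar]], axis=-1)
    return group  # shape: cube x cube x D x num_similar (a 4th-order tensor)
```

Completed groups would then be folded back and aggregated (e.g., by averaging overlapping cubes) to form the recovered tensor, as in the framework of Figure 2.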

  • Figure 1.  Comparison of TT low-rankness of higher-order tensors generated by KA and NSS. (a-1, 2, 3) the augmented tensor, the original data, and an example of grouped tensors. (b-1) to (b-8) the distribution of singular values of the mode-1 to mode-8 canonical matricizations of the augmented tensor (a-1) and the average ratio of singular values larger than $ 1\% $ of the corresponding largest one is $ 25.2\% $. (c-1, 2, 3) the distribution of singular values of the mode-1, mode-2, and mode-3 canonical matricizations of the grouped tensor (a-3) and the average ratio of singular values larger than $ 1\% $ of the corresponding largest one is $ 7.1\% $. (d-1, 2, 3, 4) the observed data, the recovered results by SiLRTC-TT, TMac-TT, and the proposed method
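As a rough companion to the singular value analysis in Figure 1, the snippet below computes, for each mode-$ k $ canonical matricization of a tensor (first $ k $ modes as rows, remaining modes as columns), the fraction of singular values exceeding $ 1\% $ of the largest one; smaller fractions indicate stronger TT low-rankness. This is a sketch based on the standard definition of canonical matricization, not the code used to produce the figure.

```python
import numpy as np

def tt_singular_value_ratios(T, threshold=0.01):
    """For each mode-k canonical matricization T_[k] (modes 1..k as rows,
    modes k+1..N as columns), return the fraction of singular values larger
    than `threshold` times the largest one.  Averaging these fractions gives
    the kind of ratio reported in Figure 1 (b) and (c)."""
    dims = T.shape
    ratios = []
    for k in range(1, len(dims)):                     # one separation per TT rank
        mat = T.reshape(int(np.prod(dims[:k])), -1)   # canonical matricization
        s = np.linalg.svd(mat, compute_uv=False)      # singular values, descending
        ratios.append(float(np.mean(s > threshold * s[0])))
    return ratios
```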

    Figure 2.  Flowchart of the proposed completion framework
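At a high level, the flowchart amounts to completing each group by minimizing a weighted sum of nuclear norms of its canonical matricizations subject to the observed entries, solved with ADMM as stated in the abstract. The sketch below shows a generic ADMM loop of this type (in the spirit of SiLRTC-TT-style splittings); the weights, penalty parameter, iteration count, and update order are illustrative assumptions rather than the authors' exact algorithm.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def admm_tt_completion(T_obs, mask, alphas=None, beta=0.1, iters=100):
    """Generic ADMM sketch for  min_X  sum_k alpha_k ||X_[k]||_*  subject to
    X agreeing with T_obs on observed entries (mask == True), where X_[k] is
    the mode-k canonical matricization of X."""
    dims, N = T_obs.shape, T_obs.ndim
    alphas = np.ones(N - 1) / (N - 1) if alphas is None else alphas
    X = T_obs.copy()
    M = [X.reshape(int(np.prod(dims[:k])), -1).copy() for k in range(1, N)]
    Y = [np.zeros_like(Mk) for Mk in M]

    for _ in range(iters):
        # 1) auxiliary variables: SVT on each canonical matricization
        for idx, k in enumerate(range(1, N)):
            Xk = X.reshape(int(np.prod(dims[:k])), -1)
            M[idx] = svt(Xk + Y[idx] / beta, alphas[idx] / beta)
        # 2) tensor update: average the folded auxiliaries, re-impose the data
        X = np.mean([(M[idx] - Y[idx] / beta).reshape(dims)
                     for idx in range(N - 1)], axis=0)
        X[mask] = T_obs[mask]
        # 3) dual ascent
        for idx, k in enumerate(range(1, N)):
            Y[idx] += beta * (X.reshape(int(np.prod(dims[:k])), -1) - M[idx])
    return X
```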

    Figure 3.  Original images

Figure 4.  The results of testing color images with $ SR = 0.2 $ recovered by different methods. The first three rows and the last three rows correspond to random sampling and tube sampling, respectively. From left to right: (a) the observed image, the results by (b) HaLRTC, (c) tSVD, (d) SiLRTC-TT, (e) TMac-TT, (f) NL-TT, and (g) the original image

    Figure 5.  The PSNR and SSIM values of the reconstructed color image results for random missing entries by different methods
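The PSNR values reported in Figures 5, 8, and 11 and in the tables follow the standard definition sketched below (assuming intensities scaled to $ [0, 255] $); the SSIM values are typically computed with an existing implementation such as skimage.metrics.structural_similarity.

```python
import numpy as np

def psnr(reference, estimate, peak=255.0):
    """Peak signal-to-noise ratio (dB) between the ground truth and a
    reconstruction, assuming intensities on a 0..peak scale."""
    mse = np.mean((np.asarray(reference, float) - np.asarray(estimate, float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```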

    Figure 6.  The results of testing color images with structural missing entries recovered by different methods. From left to right: (a) the observed image, the results by (b) HaLRTC, (c) tSVD, (d) SiLRTC-TT, (e) TMac-TT, (f) NL-TT, and (g) the original image

    Figure 7.  The results of one band of testing MSIs with $ SR = 0.1 $ recovered by different methods. From left to right: (a) the observed image, the results by (b) HaLRTC, (c) tSVD, (d) SiLRTC-TT, (e) TMac-TT, (f) NL-TT, and (g) the original image

    Figure 8.  The PSNR and SSIM values of all bands of the reconstructed MSIs with $ SR = 0.2 $ recovered by different methods

    Figure 9.  Comparison of the PSNR values by different methods on the dataset CAVE with $ SR = 0.1 $

Figure 10.  The results of two frames of testing color videos recovered by different methods. The first and second rows show the results for the color video bus; the third and fourth rows show the results for the color video mobile. From left to right: (a) the observed image, the results by (b) HaLRTC, (c) tSVD, (d) SiLRTC-TT, (e) TMac-TT, (f) NL-TT, and (g) the original image

    Figure 11.  The PSNR and SSIM values of all frames of color videos recovered by different methods

Figure 12.  The PSNR and SSIM curves as functions of the number of similar cubes $ h $. (a) change in the PSNR value, (b) change in the SSIM value

Figure 13.  The PSNR and SSIM curves as functions of the cube size $ s $. (a) change in the PSNR value, (b) change in the SSIM value

Figure 14.  The relative error and objective function value curves versus the iteration number for grouped tensors. (a) the grouped tensors, (b) change in the relative error value, (c) change in the objective function value
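For reference, the relative error plotted in Figure 14(b) presumably follows the standard successive-iterate criterion below; the exact definition used in the experiments is an assumption here.

$ \mathrm{RelErr}^{(t)} = \dfrac{\| \mathcal{X}^{(t+1)} - \mathcal{X}^{(t)} \|_F}{\| \mathcal{X}^{(t)} \|_F} $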

    Figure 15.  The results of testing color images with random sampling ($ SR = 0.3 $) recovered by different methods. From left to right: (a) the observed image, the results by (b) LDMM, (c) WNLL, (d) MLR, (e) NL-TT, and (f) the original image

Table 1.  The PSNR, SSIM values, and running time (in seconds) obtained by HaLRTC, tSVD, SiLRTC-TT, TMac-TT, and NL-TT for color image data with different sampling rates (SRs). The first block of three images and the second block of three images correspond to random sampling and tube sampling, respectively

    Image SR 0.1 0.2 0.3 0.4
    Method PSNR SSIM Time PSNR SSIM Time PSNR SSIM Time PSNR SSIM Time
    lena HaLRTC 19.29 0.4151 5.52 23.10 0.6047 4.65 25.68 0.7311 8.65 28.00 0.8205 7.83
    tSVD 19.55 0.3500 87.65 23.33 0.5572 88.40 26.08 0.7033 88.80 28.60 0.8066 87.16
    SiLRTC-TT 21.67 0.5954 39.17 24.80 0.7366 27.36 27.01 0.8226 20.32 28.90 0.8782 16.11
    TMac-TT 24.25 0.6829 67.99 27.22 0.8097 65.13 28.87 0.8584 27.99 30.22 0.8902 20.38
    NL-TT 26.52 0.8124 192.95 30.09 0.8970 121.74 32.02 0.9309 90.28 33.87 0.9528 76.41
    airplane HaLRTC 19.80 0.4621 6.80 23.18 0.6437 4.69 25.62 0.7614 3.94 27.97 0.8399 7.36
    tSVD 19.87 0.4196 84.57 23.30 0.6139 87.18 25.86 0.7387 87.63 28.25 0.8258 87.57
    SiLRTC-TT 20.81 0.6072 32.20 23.42 0.7361 25.32 25.62 0.8213 20.04 27.55 0.8768 17.06
    TMac-TT 22.46 0.6766 7.62 25.81 0.8105 60.23 27.67 0.8622 25.36 28.97 0.8915 16.93
    NL-TT 24.33 0.7840 299.77 28.33 0.8929 109.30 30.29 0.9268 85.13 31.99 0.9489 107.19
    monarch HaLRTC 17.12 0.4381 8.20 19.59 0.6069 4.25 21.89 0.7404 3.66 24.20 0.8271 3.33
    tSVD 17.14 0.3372 91.07 19.98 0.5462 87.39 22.60 0.6980 88.03 25.23 0.8023 89.33
    SiLRTC-TT 17.95 0.5784 38.01 20.32 0.7196 30.94 22.38 0.8100 26.44 24.39 0.8702 22.52
    TMac-TT 19.21 0.6621 185.13 22.45 0.7912 104.59 24.86 0.8505 76.80 27.24 0.9046 81.50
    NL-TT 22.22 0.8307 587.32 25.42 0.9140 379.66 27.95 0.9496 306.90 30.74 0.9729 246.85
    lena HaLRTC 17.54 0.2942 6.37 20.97 0.4651 7.06 23.59 0.6144 5.38 25.88 0.7272 4.01
    tSVD 17.88 0.2570 88.13 20.85 0.4186 99.70 23.29 0.5676 93.58 25.50 0.6857 90.83
    SiLRTC-TT 20.90 0.5462 47.73 23.61 0.6830 33.84 25.69 0.7732 25.55 27.35 0.8353 19.25
    TMac-TT 21.62 0.5629 15.30 24.60 0.7193 41.10 26.22 0.7764 17.78 27.55 0.8392 51.87
    NL-TT 23.94 0.7351 223.34 27.45 0.8459 172.99 29.33 0.8928 93.35 31.38 0.9259 72.19
    airplane HaLRTC 17.81 0.3050 5.93 20.77 0.4847 6.32 23.15 0.6214 4.71 25.29 0.7289 4.14
    tSVD 17.97 0.2900 85.67 20.66 0.4588 85.71 22.97 0.5926 84.86 25.06 0.7029 89.59
    SiLRTC-TT 20.20 0.5570 39.31 22.49 0.6809 28.91 24.33 0.7661 23.86 26.09 0.8298 20.87
    TMac-TT 21.06 0.6169 35.50 23.15 0.7114 21.76 24.41 0.7729 19.68 26.17 0.8416 45.12
    NL-TT 22.45 0.7255 175.78 25.25 0.8210 103.25 27.29 0.8749 81.90 29.24 0.9149 67.92
    monarch HaLRTC 16.04 0.3424 6.13 18.28 0.5031 6.17 20.12 0.6363 4.62 21.93 0.7401 4.03
    tSVD 16.33 0.2786 84.71 18.21 0.4312 86.92 19.90 0.5620 115.32 21.65 0.6791 88.34
    SiLRTC-TT 17.46 0.5472 110.83 19.48 0.6695 34.32 21.19 0.7606 41.57 22.83 0.8290 25.77
    TMac-TT 15.12 0.3466 142.62 18.66 0.6710 144.85 21.74 0.7739 73.79 23.49 0.8282 32.73
    NL-TT 18.07 0.6564 147.54 22.33 0.8462 142.52 24.53 0.9086 108.21 26.25 0.9391 89.87

    Table 2.  The PSNR, SSIM values, and running time (in seconds) obtained by HaLRTC, tSVD, SiLRTC-TT, TMac-TT and NL-TT for color image data with structural missing entries

    Method HaLRTC tSVD SiLRTC-TT TMac-TT NL-TT
    Image PSNR SSIM Time PSNR SSIM Time PSNR SSIM Time PSNR SSIM Time PSNR SSIM Time
    house 36.44 0.9707 6.98 36.14 0.9681 86.86 38.52 0.9793 14.05 38.03 0.9740 11.93 45.34 0.9906 26.46
    facade 12.95 0.5681 0.12 12.95 0.5681 83.16 28.14 0.9062 28.01 27.50 0.8947 9.24 29.60 0.9357 416.18
    sailboat 26.49 0.8700 4.63 26.69 0.8696 86.10 26.53 0.8838 39.29 26.40 0.8995 8.44 27.86 0.9370 171.55
    barbara 32.44 0.9580 4.83 32.44 0.9579 86.28 33.99 0.9681 20.24 33.29 0.9654 6.87 37.56 0.9867 40.36
    peppers 31.64 0.9595 2.22 31.53 0.9551 85.85 32.59 0.9676 24.55 32.77 0.9651 9.27 36.33 0.9862 46.86
    Average 27.99 0.8653 3.76 27.95 0.8638 85.65 31.95 0.9410 25.23 31.60 0.9397 9.15 35.34 0.9672 140.28

    Table 3.  The average PSNR, SSIM values, and running time (in seconds) obtained by HaLRTC, tSVD, SiLRTC-TT, TMac-TT and NL-TT for MSIs with different SRs

    Image SR 0.05 0.1 0.2
    Method PSNR SSIM Time PSNR SSIM Time PSNR SSIM Time
    toy HaLRTC 20.14 0.6519 8.42 23.99 0.7790 6.69 28.91 0.8994 7.27
    tSVD 25.89 0.7680 264.95 30.34 0.8844 271.40 36.57 0.9602 300.93
    SiLRTC-TT 22.36 0.7138 379.68 25.81 0.8392 188.36 30.44 0.9433 513.80
    TMac-TT 27.28 0.8329 232.52 32.37 0.9317 143.12 35.74 0.9669 50.91
    NL-TT 29.58 0.9243 1414.34 34.44 0.9730 803.33 38.72 0.9899 557.72
    feathers HaLRTC 20.66 0.6422 9.96 24.26 0.7720 14.14 28.81 0.8876 11.07
    tSVD 25.15 0.6886 274.69 29.29 0.8266 340.60 34.82 0.9265 264.27
    SiLRTC-TT 22.86 0.7196 247.87 26.32 0.8417 232.09 31.11 0.9411 374.05
    TMac-TT 27.29 0.7611 58.00 32.12 0.9190 216.36 36.63 0.9631 62.90
    NL-TT 29.61 0.9102 1080.65 34.76 0.9699 860.40 39.56 0.9879 607.91
    superballs HaLRTC 23.28 0.7661 20.48 28.63 0.8621 9.71 34.10 0.9426 11.93
    tSVD 28.24 0.7636 267.41 32.39 0.8663 270.40 38.20 0.9564 270.92
    SiLRTC-TT 26.27 0.8290 289.90 29.79 0.9087 157.46 34.03 0.9651 379.94
    TMac-TT 29.97 0.8343 60.71 33.90 0.9346 63.73 40.19 0.9803 109.14
    NL-TT 32.93 0.9507 1150.35 37.25 0.9812 666.11 42.67 0.9939 409.49

    Table 4.  The PSNR and SSIM values obtained by LDMM, WNLL, MLR, and NL-TT for color image data with different SRs

    Image SR 0.1 0.2 0.3 0.4
    Method PSNR SSIM PSNR SSIM PSNR SSIM PSNR SSIM
    lena LDMM 22.20 0.6202 26.52 0.7944 27.74 0.8457 29.95 0.8879
    WNLL 26.24 0.8043 28.08 0.8534 29.29 0.8855 30.35 0.9071
    MLR 26.47 0.8052 29.03 0.8747 30.74 0.9119 32.14 0.9333
    NL-TT 26.52 0.8124 30.09 0.8970 32.02 0.9309 33.87 0.9528
    airplane LDMM 20.12 0.6230 24.03 0.7940 25.88 0.8532 28.59 0.8988
    WNLL 23.75 0.7675 25.76 0.8264 27.00 0.8629 28.03 0.8891
    MLR 24.14 0.7750 26.78 0.8595 28.62 0.9015 29.92 0.9260
    NL-TT 24.33 0.7840 28.33 0.8929 30.29 0.9268 31.99 0.9489
    monarch LDMM 18.61 0.6196 19.01 0.6463 22.34 0.8515 25.55 0.9176
    WNLL 20.54 0.7584 22.61 0.8262 23.78 0.8619 24.93 0.8916
    MLR 20.95 0.8030 23.73 0.8884 25.76 0.9291 27.48 0.9525
    NL-TT 22.22 0.8307 25.42 0.9140 27.95 0.9496 30.74 0.9729
