Parameter learning and fractional differential operators: Applications in regularized image denoising and decomposition problems

  • Corresponding author: Nico Weber
  • In this paper, we focus on learning optimal parameters for PDE-based image denoising and decomposition models. First, we learn the regularization parameter and the differential operator for gray-scale image denoising, using the fractional Laplacian in combination with a bilevel optimization problem. In our setting the fractional Laplacian allows the use of the Fourier transform, which enables the optimization of the denoising operator. We prove stable and explainable results, which is an advantage in comparison to machine learning approaches. The numerical experiments agree with our theoretical model settings and show a reduction in computing time compared to the Rudin-Osher-Fatemi model. Second, we introduce a new regularized image decomposition model based on the fractional Laplacian and the Riesz potential. We provide an explicit formula for the unique solution, and the numerical experiments illustrate its efficiency.
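
    To make the role of the Fourier transform concrete, the following minimal sketch (Python/NumPy, not taken from the paper) solves a quadratic fractional-Laplacian denoising model of the kind the abstract alludes to, namely the minimization of $ \frac{\alpha}{2}\|u-u_d\|^2 + \frac{1}{2}\|(-\Delta)^{s/2}u\|^2 $ on a periodic grid. The energy form, the function name and the parameter values are assumptions for illustration only; the authors' exact model and discretization may differ.

```python
import numpy as np

def fractional_laplacian_denoise(u_d, s, alpha):
    """Sketch: minimize (alpha/2)*||u - u_d||^2 + (1/2)*||(-Delta)^{s/2} u||^2
    on a periodic grid (s > 0).  The optimality condition
        alpha*(u - u_d) + (-Delta)^s u = 0
    diagonalizes under the 2-D FFT, so every Fourier coefficient of u_d is
    damped by the factor alpha / (alpha + |xi|^{2s})."""
    n1, n2 = u_d.shape
    xi1 = 2.0 * np.pi * np.fft.fftfreq(n1)      # angular frequencies, first axis
    xi2 = 2.0 * np.pi * np.fft.fftfreq(n2)      # angular frequencies, second axis
    K1, K2 = np.meshgrid(xi1, xi2, indexing="ij")
    symbol = (K1**2 + K2**2) ** s               # Fourier symbol |xi|^{2s} of (-Delta)^s
    u_hat = np.fft.fft2(u_d)
    u_hat *= alpha / (alpha + symbol)           # frequency-wise shrinkage
    return np.real(np.fft.ifft2(u_hat))

# Hypothetical usage with fixed parameters (in the paper, s and alpha are learned):
# u = fractional_laplacian_denoise(u_noisy, s=0.6, alpha=50.0)
```

    Since the solve consists of one FFT, a frequency-wise scaling and one inverse FFT, each evaluation costs $ O(n\log n) $, which is consistent with the runtime advantage over the ROF model reported in Figure 13.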

    Mathematics Subject Classification: Primary: 49J20, 49K20, 49N45; Secondary: 65N12, 65N35.

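    The parameter learning described in the abstract is a bilevel problem: the lower level is the denoising solve for fixed $ (s, \alpha) $, and the upper level adjusts $ (s, \alpha) $ so that the denoised image comes close to a noise-free reference (the behaviour examined in Figures 5, 10 and 11). The sketch below replaces the paper's minimization of the functional $ j_n $ (with the penalty $ \varphi_1 $) by a naive grid search; it only illustrates the structure of the upper-level objective, not the authors' algorithm, and all names are placeholders.

```python
import numpy as np
from itertools import product

def learn_parameters(u_noisy, u_clean, denoise, s_grid, alpha_grid):
    """Illustrative upper level of a bilevel problem: choose (s, alpha) so that
    the denoised image is closest (in L2) to a noise-free reference image.
    'denoise' is any lower-level solver, e.g. the spectral sketch above."""
    best_loss, best_params = np.inf, None
    for s, alpha in product(s_grid, alpha_grid):
        u = denoise(u_noisy, s, alpha)
        loss = 0.5 * np.sum((u - u_clean) ** 2)   # upper-level data misfit
        if loss < best_loss:
            best_loss, best_params = loss, (s, alpha)
    return best_params, best_loss

# Hypothetical usage with the spectral solver sketched above:
# (s_opt, alpha_opt), _ = learn_parameters(u_noisy, u_clean, fractional_laplacian_denoise,
#                                          s_grid=np.linspace(0.1, 0.9, 9),
#                                          alpha_grid=np.logspace(0, 3, 10))
```

    Such an objective rewards exactly the trend reported in Figures 10 and 11: more noise favours stronger smoothing (a larger $ s $) and weaker fidelity to the noisy data (a smaller $ \alpha $).
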
  • Figure 1.  Decomposition of the original image (left) into the structural component $ u $ (middle) and the textural component $ v $ (right)

    Figure 2.  Fractional Laplacian (1) of image $ u_d $. A higher exponent $ s $ in (1) results in stronger smoothing of high oscillations

    Figure 3.  Riesz potential (2) of image $ u_d $. A lower exponent $ s $ in (2) smooths the image $ u_d $

    Figure 4.  The denoising via solving (3) strongly depends on the choice of the parameters $ s $ and $ \alpha $

    Figure 5.  Energy functional $ j_n $ with function $ \varphi_1 $ and different choices for $ s $ and $ \alpha $

    Figure 6.  Results for the image "boat". As clearly seen, the model removes the fog-like appearance caused by the normally distributed noise

    Figure 7.  In the image "peppers", the noise is significantly reduced while the edges are maintained

    Figure 8.  With a higher resolution ($ n = 1566 $) than in the previous sample images ($ n = 512 $), the noise can be suppressed better visually. The denoised image is visually almost identical to the image in [1]. The small deviation of the optimal parameters is due to the influence of the function $ \varphi_1(s,\alpha) $

    Figure 9.  On the synthetic image we obtain good denoising performance, which is reflected in the high SSIM value. The denoised image is visually almost identical to the image in [1]

    Figure 10.  We study the influence of the noise on the parameter $ s $. The numerical results agree with the theoretical consideration that more noise implies stronger smoothing of the image, i.e., a higher value of the parameter $ s $

    Figure 11.  For the parameter $ \alpha $, the numerical results also agree with theoretical considerations. Stronger noise means that the denoised image lies farther away from the noisy image, i.e., a lower value of the parameter $ \alpha $

    Figure 12.  In comparison to the fractional Laplacian model, the ROF model produces smoother edges. Moreover, the ROF model achieves a better SSIM value

    Figure 13.  Runtime comparison between the fractional Laplacian model and the ROF model. Both models empirically show a linear runtime, but the fractional Laplacian model reduces the computing time by a factor of 16-22. We used MATLAB R2015a with an i3-3240 CPU and 8 GB RAM

    Figure 14.  In terms of PSNR (peak signal-to-noise ratio), the ROF model outperforms the fractional Laplacian model, but the discrepancy depends on the choice of the image. We observe a decreasing PSNR for a larger variance, i.e., stronger noise. A higher PSNR value implies better image quality

    Figure 15.  In terms of SSIM (structural similarity index measure), the ROF model outperforms the fractional Laplacian model; again, the performance gap depends strongly on the image. We observe a decreasing SSIM for stronger noise. A higher SSIM value implies better image quality

    Figure 16.  Detail of the image "Baboon". The results of the fractional Laplacian and ROF models are similar.

    Figure 17.  Image components with fixed $ s_1 = 0.2, s_2 = -1, \beta = 1 $ and different choices of the parameter $ \alpha $ for the noise-free image. The choice of the parameter $ \alpha $ has a strong impact on the decomposition (a schematic spectral solver for this type of model is sketched after the figure captions)

    Figure 18.  Decomposition of the image "kentaur" with optimal parameters $ \bar{s_1} = 0.172,\; \bar{\alpha} = 9999.7,\; \bar{s_2} = -0.926 $ and $ \bar{\beta} = 10410 $. We see a significant improvement of the SSIM value and the desired image decomposition

    Figure 19.  The component $ u $ of (26) is obtained with the optimal parameters $ \bar{s_1} = 0.351,\; \bar{\alpha} = 9999.4,\; \bar{s_2} = -0.918 $ and $ \bar{\beta} = 9997.2 $. The fractional image decomposition model achieves a significantly better result than the fractional image denoising model, with an improvement of the SSIM value by 0.4. The result is less blurred and has better contrast

    Figure 20.  Detail of the image "pepper" ($ \sigma = 0.1 $). The model (29) cannot sufficiently distinguish between the noise and the component $ v $

    Figure 21.  Detail of the image "Baboon" ($ \sigma = 0.1 $)
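
    For the decomposition experiments in Figures 17-19, the following sketch illustrates one plausible quadratic structure-texture model in the spirit of the abstract: the structural part $ u $ is penalized by a fractional Laplacian of positive order $ s_1 $, the textural part $ v $ by a negative-order (Riesz-potential-type) term of order $ s_2 $, with weights $ \alpha $ on the fidelity and $ \beta $ on the texture penalty. The concrete energy below is an assumption and not necessarily the paper's model (26); it is only meant to show why such models admit an explicit solution formula, since the optimality system decouples into a $ 2\times 2 $ linear system per Fourier frequency.

```python
import numpy as np

def fractional_decompose(u_d, s1, s2, alpha, beta):
    """Sketch of an assumed quadratic decomposition model (s1 > 0 > s2):
        min_{u,v}  (alpha/2)*||u + v - u_d||^2
                 + (1/2)*||(-Delta)^{s1/2} u||^2
                 + (beta/2)*||(-Delta)^{s2/2} v||^2.
    With a = |xi|^{2 s1} and b = beta*|xi|^{2 s2}, the optimality system per
    Fourier frequency reads
        (alpha + a) u^ + alpha v^       = alpha u_d^,
         alpha u^      + (alpha + b) v^ = alpha u_d^,
    and is solved below by Cramer's rule."""
    n1, n2 = u_d.shape
    xi1 = 2.0 * np.pi * np.fft.fftfreq(n1)
    xi2 = 2.0 * np.pi * np.fft.fftfreq(n2)
    K1, K2 = np.meshgrid(xi1, xi2, indexing="ij")
    mag2 = K1**2 + K2**2
    mag2[0, 0] = 1.0                 # avoid 0**s2 for s2 < 0; the zero mode is set explicitly below
    a = mag2 ** s1                   # symbol of (-Delta)^{s1}
    b = beta * (mag2 ** s2)          # weighted symbol of (-Delta)^{s2}
    f_hat = np.fft.fft2(u_d)
    det = (alpha + a) * (alpha + b) - alpha**2
    u_hat = alpha * b * f_hat / det  # structural (low-frequency) part
    v_hat = alpha * a * f_hat / det  # textural (high-frequency) part
    u_hat[0, 0], v_hat[0, 0] = f_hat[0, 0], 0.0   # the image mean goes to the structural part
    return np.real(np.fft.ifft2(u_hat)), np.real(np.fft.ifft2(v_hat))

# Hypothetical usage (parameter magnitudes in the spirit of Figure 18):
# u, v = fractional_decompose(u_d, s1=0.2, s2=-0.9, alpha=1.0e4, beta=1.0e4)
```

    In this sketch, low frequencies are assigned to $ u $ and high frequencies to $ v $, which matches the structural/textural split illustrated in Figure 1.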

  • [1] H. Antil and S. Bartels, Spectral approximation of fractional PDEs in image processing and phase field modeling, Comput. Methods Appl. Math., 17 (2017), 661-678. doi: 10.1515/cmam-2017-0039.
    [2] H. Antil, Z. W. Di and R. Khatri, Bilevel optimization, deep learning and fractional Laplacian regularization with applications in tomography, Inverse Problems, 36 (2020), 064001. doi: 10.1088/1361-6420/ab80d7.
    [3] H. Antil, E. Otárola and A. J. Salgado, Optimization with respect to order in a fractional diffusion model: Analysis, approximation and algorithmic aspects, J. Sci. Comput., 77 (2018), 204-224. doi: 10.1007/s10915-018-0703-0.
    [4] H. Antil and C. N. Rautenberg, Sobolev spaces with non-Muckenhoupt weights, fractional elliptic operators, and applications, SIAM J. Math. Anal., 51 (2019), 2479-2503. doi: 10.1137/18M1224970.
    [5] J.-F. Aujol, G. Gilboa, T. Chan and S. Osher, Structure-texture image decomposition: Modeling, algorithms, and parameter selection, International Journal of Computer Vision, 67 (2006), 111-136. doi: 10.1007/s11263-006-4331-z.
    [6] J. Batson and L. Royer, Noise2Self: Blind denoising by self-supervision, in Proceedings of the 36th International Conference on Machine Learning (eds. K. Chaudhuri and R. Salakhutdinov), vol. 97 of Proceedings of Machine Learning Research, PMLR, (2019), 524-533.
    [7] S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, Cambridge, 2004. doi: 10.1017/CBO9780511804441.
    [8] F. Boyer and P. Fabrie, Mathematical Tools for the Study of the Incompressible Navier-Stokes Equations and Related Models, vol. 183 of Applied Mathematical Sciences, Springer, New York, 2013. doi: 10.1007/978-1-4614-5975-0.
    [9] A. Bueno-Orovio, D. Kay and K. Burrage, Fourier spectral methods for fractional-in-space reaction-diffusion equations, BIT, 54 (2014), 937-954. doi: 10.1007/s10543-014-0484-2.
    [10] Y. Gousseau and J.-M. Morel, Are natural images of bounded variation?, SIAM J. Math. Anal., 33 (2001), 634-648. doi: 10.1137/S0036141000371150.
    [11] K. Kunisch and T. Pock, A bilevel optimization approach for parameter learning in variational models, SIAM Journal on Imaging Sciences, 6 (2013), 938-983. doi: 10.1137/120882706.
    [12] P. Liu and C.-B. Schönlieb, Learning optimal orders of the underlying Euclidean norm in total variation image denoising, arXiv preprint, arXiv:1903.11953.
    [13] Q. Liu, Z. Zhang and Z. Guo, On a fractional reaction-diffusion system applied to image decomposition and restoration, Comput. Math. Appl., 78 (2019), 1739-1751. doi: 10.1016/j.camwa.2019.05.030.
    [14] Y. Nesterov, Introductory Lectures on Convex Optimization, Springer US, 2004. doi: 10.1007/978-1-4419-8853-9.
    [15] S. Osher, A. Solé and L. Vese, Image decomposition and restoration using total variation minimization and the $H^{-1}$ norm, Multiscale Model. Simul., 1 (2003), 349-370. doi: 10.1137/S1540345902416247.
    [16] G. Peyré, The numerical tours of signal processing, Computing in Science & Engineering, 13 (2011), 94-97. doi: 10.1109/MCSE.2011.71.
    [17] L. I. Rudin, S. Osher and E. Fatemi, Nonlinear total variation based noise removal algorithms, Phys. D, 60 (1992), 259-268. doi: 10.1016/0167-2789(92)90242-F.
    [18] J. Saranen and G. Vainikko, Periodic Integral and Pseudodifferential Equations with Numerical Approximation, Springer Monographs in Mathematics, Springer-Verlag, Berlin, 2002. doi: 10.1007/978-3-662-04796-5.
    [19] J. Sprekels and E. Valdinoci, A new type of identification problems: Optimizing the fractional order in a nonlocal evolution equation, SIAM J. Control Optim., 55 (2017), 70-93. doi: 10.1137/16M105575X.
    [20] D. Ulyanov, A. Vedaldi and V. Lempitsky, Deep image prior, International Journal of Computer Vision, 128 (2020), 1867-1888. doi: 10.1007/s11263-020-01303-4.