January 2017, 11(1): 1-23. doi: 10.3934/ipi.2017001

Augmented Lagrangian method for an Euler's elastica based segmentation model that promotes convex contours

Egil Bae¹, Xue-Cheng Tai², Wei Zhu³*

1. Norwegian Defence Research Establishment, P.O. Box 25, NO-2027 Kjeller, Norway

2. Department of Mathematics, University of Bergen, P.O. Box 7803, NO-5020 Bergen, Norway

3. Department of Mathematics, University of Alabama, Box 870350, Tuscaloosa, AL 35487, USA

* Corresponding author: Wei Zhu

Received: September 2015. Revised: April 2016. Published: January 2017.

Fund Project: The first author is supported by the Norwegian Research Council eVita project 214889.

In this paper, we propose an image segmentation model in which an $L^1$ variant of Euler's elastica energy is used as boundary regularization. An interesting feature of this model is its preference for convex segmentation contours. However, because of the high order and non-differentiability of Euler's elastica energy, minimizing the associated functional is nontrivial. As in recent work on the ordinary $L^2$ Euler's elastica model in imaging, we propose an augmented Lagrangian method to tackle the minimization problem. Specifically, we design a novel augmented Lagrangian functional that treats the mean curvature term differently from previous works. The new treatment reduces the number of Lagrange multipliers and, more importantly, represents the curvature more effectively and faithfully. Numerical experiments validate the efficiency of the proposed augmented Lagrangian method and also demonstrate new features of this particular segmentation model, such as its shape-driven and data-driven properties.
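For orientation, the boundary regularizer the abstract refers to has the following general form, sketched here in the notation of the figure captions below ($a$, $b$ are the length and curvature weights, $\kappa$ the curvature of the segmentation contour $\Gamma$); the data term is specified in the paper itself:

$E(\Gamma) \;=\; \int_{\Gamma}\bigl(a+b\,|\kappa|\bigr)\,ds.$

The preference for convex contours comes from a classical fact about plane curves: for any simple closed curve $\Gamma$,

$\int_{\Gamma}|\kappa|\,ds \;\geq\; 2\pi,$

with equality exactly when $\Gamma$ is convex. The $L^1$ curvature term therefore charges a fixed cost $2\pi b$ to every convex contour and a strictly larger one to any non-convex contour.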

Citation: Egil Bae, Xue-Cheng Tai, Wei Zhu. Augmented Lagrangian method for an Euler's elastica based segmentation model that promotes convex contours. Inverse Problems & Imaging, 2017, 11 (1) : 1-23. doi: 10.3934/ipi.2017001
References:
[1] L. Ambrosio and S. Masnou, A direct variational approach to a problem arising in image reconstruction, Interfaces Free Bound., 5 (2003), 63-81.

[2] L. Ambrosio and S. Masnou, On a variational problem arising in image reconstruction, in Free Boundary Problems (Trento, 2002), Internat. Ser. Numer. Math., 147, Birkhäuser, Basel, (2004), 17-26.

[3] L. Ambrosio and V. M. Tortorelli, Approximation of functionals depending on jumps by elliptic functionals via Gamma convergence, Comm. Pure Appl. Math., 43 (1990), 999-1036. doi: 10.1002/cpa.3160430805.

[4] E. Bae, J. Shi and X.-C. Tai, Graph cuts for curvature based image denoising, IEEE Transactions on Image Processing, 20 (2011), 1199-1210. doi: 10.1109/TIP.2010.2090533.

[5] K. Bredies, T. Pock and B. Wirth, A convex, lower semicontinuous approximation of Euler's elastica energy, SIAM J. Math. Anal., 47 (2015), 566-613. doi: 10.1137/130939493.

[6] C. Brito-Loeza and K. Chen, Multigrid algorithm for high order denoising, SIAM J. Imaging Sciences, 3 (2010), 363-389. doi: 10.1137/080737903.

[7] V. Caselles, R. Kimmel and G. Sapiro, Geodesic active contours, Int. J. Comput. Vision, 22 (1997), 61-79. doi: 10.1109/ICCV.1995.466871.

[8] T. Chan and L. A. Vese, Active contours without edges, IEEE Trans. Image Process., 10 (2001), 266-277. doi: 10.1109/83.902291.

[9] T. Chan and S. Esedoglu, Aspects of total variation regularized L1 function approximation, SIAM J. Appl. Math., 65 (2005), 1817-1837. doi: 10.1137/040604297.

[10] T. Chan, S. Esedoglu and M. Nikolova, Algorithms for finding global minimizers of denoising and segmentation models, SIAM J. Appl. Math., 66 (2006), 1632-1648. doi: 10.1137/040615286.

[11] T. Chan, S. H. Kang and J. H. Shen, Euler's elastica and curvature based inpaintings, SIAM J. Appl. Math., 63 (2002), 564-592.

[12] T. Chan and W. Zhu, Level set based shape prior segmentation, in Proc. IEEE Conference on Computer Vision and Pattern Recognition, (2005), 1164-1170. doi: 10.1109/CVPR.2005.212.

[13] Y. Chen, H. Tagare, S. Thiruvenkadam, F. Huang, D. Wilson, K. Gopinath, R. Briggs and E. Geiser, Using prior shapes in geometric active contours in a variational framework, Int. J. Comput. Vision, 50 (2002), 315-328.

[14] D. Cremers, F. Tischhäuser, J. Weickert and C. Schnörr, Diffusion snakes: Introducing statistical shape knowledge into the Mumford-Shah functional, Int. J. Comput. Vision, 50 (2002), 295-313.

[15] E. De Giorgi, Some remarks on Γ-convergence and least squares methods, in Composite Media and Homogenization Theory (eds. G. Dal Maso and G. F. Dell'Antonio), Birkhäuser, Boston, (1991), 135-142.

[16] M. P. do Carmo, Differential Geometry of Curves and Surfaces, Prentice-Hall, Inc., 1976.

[17] Y. Duan, Y. Wang, X.-C. Tai and J. Hahn, A fast augmented Lagrangian method for Euler's elastica model, in SSVM 2011, LNCS, 6667 (2012), 144-156.

[18] N. El-Zehiry and L. Grady, Fast global optimization of curvature, in Proc. IEEE Conference on Computer Vision and Pattern Recognition, (2010), 3257-3264. doi: 10.1109/CVPR.2010.5540057.

[19] M. Elsey and S. Esedoglu, Analogue of the total variation denoising model in the context of geometry processing, SIAM J. Multiscale Modeling and Simulation, 7 (2009), 1549-1573. doi: 10.1137/080736612.

[20] S. Esedoglu and R. March, Segmentation with depth but without detecting junctions, J. Math. Imaging Vision, 18 (2003), 7-15. doi: 10.1023/A:1021837026373.

[21] M. Hintermüller, C. N. Rautenberg and J. Hahn, Functional-analytic and numerical issues in splitting methods for total variation-based image reconstruction, Inverse Problems, 30 (2014), 055014.

[22] M. Kass, A. Witkin and D. Terzopoulos, Snakes: Active contour models, Int. J. Comput. Vision, 1 (1988), 321-331. doi: 10.1007/BF00133570.

[23] M. Leventon, W. Grimson and O. Faugeras, Statistical shape influence in geodesic active contours, in Proc. IEEE Conference on Computer Vision and Pattern Recognition, (2000), 316-323.

[24] P. L. Lions and B. Mercier, Splitting algorithms for the sum of two nonlinear operators, SIAM J. Numer. Anal., 16 (1979), 964-979. doi: 10.1137/0716071.

[25] R. March and M. Dozio, A variational method for the recovery of smooth boundaries, Image and Vision Computing, 15 (1997), 705-712. doi: 10.1016/S0262-8856(97)00002-4.

[26] S. Masnou, Disocclusion: A variational approach using level lines, IEEE Trans. Image Process., 11 (2002), 68-76. doi: 10.1109/83.982815.

[27] S. Masnou and J. M. Morel, Level lines based disocclusion, in Proc. IEEE Int. Conf. on Image Processing, Chicago, IL, (1998), 259-263. doi: 10.1109/ICIP.1998.999016.

[28] D. Mumford and J. Shah, Optimal approximation by piecewise smooth functions and associated variational problems, Comm. Pure Appl. Math., 42 (1989), 577-685. doi: 10.1002/cpa.3160420503.

[29] M. Myllykoski, R. Glowinski, T. Kärkkäinen and T. Rossi, A new augmented Lagrangian approach for L1-mean curvature image denoising, SIAM J. Imaging Sci., 8 (2015), 95-125. doi: 10.1137/140962164.

[30] M. Nitzberg, D. Mumford and T. Shiota, Filtering, Segmentation, and Depth, Lecture Notes in Computer Science, 662, Springer-Verlag, Berlin, 1993.

[31] S. Osher and J. A. Sethian, Fronts propagating with curvature-dependent speed: Algorithms based on Hamilton-Jacobi formulations, J. Comput. Phys., 79 (1988), 12-49. doi: 10.1016/0021-9991(88)90002-2.

[32] S. Osher and R. Fedkiw, Level Set Methods and Dynamic Implicit Surfaces, Springer-Verlag, New York, 2003.

[33] D. Peng, B. Merriman, S. Osher, H. K. Zhao and M. Kang, A PDE-based fast local level set method, J. Comput. Phys., 155 (1999), 410-438. doi: 10.1006/jcph.1999.6345.

[34] R. T. Rockafellar, Augmented Lagrangians and applications of the proximal point algorithm in convex programming, Mathematics of Operations Research, 1 (1976), 97-116. doi: 10.1287/moor.1.2.97.

[35] L. Rudin, S. Osher and E. Fatemi, Nonlinear total variation based noise removal algorithms, Physica D, 60 (1992), 259-268. doi: 10.1016/0167-2789(92)90242-F.

[36] T. Schoenemann, F. Kahl and D. Cremers, Curvature regularity for region-based image segmentation and inpainting: A linear programming relaxation, in Proc. IEEE International Conference on Computer Vision (ICCV), 2009. doi: 10.1109/ICCV.2009.5459209.

[37] T. Schoenemann, F. Kahl, S. Masnou and D. Cremers, A linear framework for region-based image segmentation and inpainting involving curvature penalization, Int. J. Comput. Vision, 99 (2012), 53-68. doi: 10.1007/s11263-012-0518-7.

[38] E. Strekalovskiy and D. Cremers, Generalized ordering constraints for multilabel optimization, in Proc. IEEE International Conference on Computer Vision (ICCV 2011), (2011), 2619-2626. doi: 10.1109/ICCV.2011.6126551.

[39] X.-C. Tai, J. Hahn and G. J. Chung, A fast algorithm for Euler's elastica model using augmented Lagrangian method, SIAM J. Imaging Sciences, 4 (2011), 313-344. doi: 10.1137/100803730.

[40] C. Wu and X.-C. Tai, Augmented Lagrangian method, dual methods, and split Bregman iteration for ROF, vectorial TV, and high order models, SIAM J. Imaging Sciences, 3 (2010), 300-339. doi: 10.1137/090767558.

[41] F. Yang, K. Chen and B. Yu, Homotopy method for a mean curvature-based denoising model, Appl. Numer. Math., 62 (2012), 185-200. doi: 10.1016/j.apnum.2011.12.001.

[42] W. Zhu and T. Chan, A variational model for capturing illusory contours using curvature, J. Math. Imaging Vision, 27 (2007), 29-40. doi: 10.1007/s10851-006-9695-8.

[43] W. Zhu and T. Chan, Image denoising using mean curvature of image surface, SIAM J. Imaging Sciences, 5 (2012), 1-32. doi: 10.1137/110822268.

[44] W. Zhu, X.-C. Tai and T. Chan, Image segmentation using Euler's elastica as the regularization, J. Sci. Comput., 57 (2013), 414-438. doi: 10.1007/s10915-013-9710-3.

[45] W. Zhu, X.-C. Tai and T. Chan, Augmented Lagrangian method for a mean curvature based image denoising model, Inverse Probl. Imag., 7 (2013), 1409-1432. doi: 10.3934/ipi.2013.7.1409.

Figure 1.  Two vectors $\nu$, $\mathbf{a}$ and a possible unit vector $\mathbf{b}$ that maximizes the function $\psi(\mathbf{b})$.
Figure 2.  The original image (first row) and the segmentation results (second row) for different values of the curvature parameter $b$. Two features can be observed: 1) data-driven property: as $b$ increases, objects of relatively small size are omitted from the final segmentation; 2) shape-driven property: for the same $b$, among objects of equal area, the one without a convex shape (the staircase-like shape) is excluded from the final segmentation. In this experiment, the other parameters are $a=10^{-4}$, $r_{1}=50$, $r_{2}=20$, $r_{3}=5$.
Figure 3.  The original image and the segmentation result. The result demonstrates that the invisible part of the hat can be restored by the proposed segmentation model with a relatively large curvature parameter $b$. In this experiment, the parameters are $a=10^{-3}$, $b=80$, $r_{1}=60$, $r_{2}=40$, $r_{3}=10$.
Figure 4.  The original image and the segmentation result. Only the "cap" of the mushroom is captured, while the "stem" is omitted so that the proposed model, with a relatively large curvature parameter $b$, forms a convex segmentation contour. In this experiment, the parameters are $a=10^{-4}$, $b=18$, $r_{1}=20$, $r_{2}=10$, $r_{3}=5$.
Figure 5.  The effect of the curvature parameter $b$ on the final segmentation. The smaller $b$ is chosen, the more pronounced the indentation of the hat remains. In these two experiments, all other parameters are the same as in Fig. 3.
Figure 6.  The effect of the curvature parameter $b$ on the final segmentation. The smaller $b$ is chosen, the more of the stem is retained in the final segmentation. In these two experiments, all other parameters are the same as in Fig. 4.
Figure 7.  Plots of the relative residuals (Eq. 43), the relative errors in the Lagrange multipliers (Eq. 44), and the relative error in $u^{k}$ (Eq. 45) for the two examples "hat" (left column) and "mushroom" (right column). These plots demonstrate the convergence of the iteration.
Table 1.  Augmented Lagrangian method for the $ECV$-$L^{1}$ segmentation model
1. Initialization: $\phi^{0}$, $q^{0}$, $\mathbf{p}^{0}$, $\mathbf{n}^{0}$, and $\mathbf{\lambda}_{1}^{0}$, $\lambda_{2}^{0}$, $\mathbf{\lambda}_{3}^{0}$.
      For $k \geq 1$, do the following steps (Steps 2-4):
2. Compute an approximate minimizer $(\phi^{k},q^{k},\mathbf{p}^{k},\mathbf{n}^{k})$ of the augmented Lagrangian functional with the Lagrange multipliers $\mathbf{\lambda}_{1}^{k-1}$, $\lambda_{2}^{k-1}$, $\mathbf{\lambda}_{3}^{k-1}$ held fixed:
$(32)\;\;\;\;\;\;\;\;\;\;(\phi^{k},q^{k},\mathbf{p}^{k},\mathbf{n}^{k}) \approx \mbox{argmin } \mathcal{L}(\phi,q,\mathbf{p},\mathbf{n}, \mathbf{\lambda}_{1}^{k-1}, \lambda_{2}^{k-1}, \mathbf{\lambda}_{3}^{k-1}).$
3. Update the Lagrange multipliers:
                  $\mathbf{\lambda}_{1}^{k} = \mathbf{\lambda}_{1}^{k-1}+r_{1}(\mathbf{p}^{k}-\nabla \phi^{k})$
                  $\lambda_{2}^{k} = \lambda_{2}^{k-1}+r_{2}(q^{k}-\nabla\cdot \mathbf{n}^{k})$
                  $\mathbf{\lambda}_{3}^{k} = \mathbf{\lambda}_{3}^{k-1}+r_{3}(|\mathbf{p}^{k}|\mathbf{n}^{k}-\mathbf{p}^{k}).$
4. Measure the relative residuals and stop the iteration if they are smaller than a threshold $\epsilon_{r}$.
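To make the loop in Table 1 concrete, the following Python sketch implements the outer iteration on a 2D rectangular grid with simple forward/backward differences. The routine name `minimize_subproblems`, the difference operators, and the mean-absolute-residual stopping test are illustrative assumptions standing in for Step 2 (the sweep of Table 2) and the relative residuals of Eq. (43); this is not the authors' implementation. The default penalty values $r_{1}=50$, $r_{2}=20$, $r_{3}=5$ are taken from the caption of Fig. 2.

```python
import numpy as np


def grad(phi):
    """Forward-difference gradient of a 2D array (illustrative discretization)."""
    gx = np.roll(phi, -1, axis=0) - phi
    gy = np.roll(phi, -1, axis=1) - phi
    return np.stack([gx, gy])


def div(n):
    """Backward-difference divergence of a 2-vector field (matching grad above)."""
    dx = n[0] - np.roll(n[0], 1, axis=0)
    dy = n[1] - np.roll(n[1], 1, axis=1)
    return dx + dy


def alm_segmentation(minimize_subproblems, phi, q, p, n,
                     r1=50.0, r2=20.0, r3=5.0, max_iter=1600, eps_r=1e-4):
    """Outer augmented Lagrangian loop of Table 1 (sketch, not the authors' code).

    `minimize_subproblems` stands for the alternating sweep of Table 2: it returns
    updated (phi, q, p, n) for fixed multipliers (lam1, lam2, lam3).
    """
    lam1 = np.zeros_like(p)   # multiplier for the constraint p = grad(phi)
    lam2 = np.zeros_like(q)   # multiplier for the constraint q = div(n)
    lam3 = np.zeros_like(p)   # multiplier for the constraint |p| n = p

    for k in range(max_iter):
        # Step 2: approximately minimize L(phi, q, p, n; lam1, lam2, lam3).
        phi, q, p, n = minimize_subproblems(phi, q, p, n, lam1, lam2, lam3, r1, r2, r3)

        # Constraint residuals, reused for Step 3 (multiplier updates) and Step 4.
        res1 = p - grad(phi)
        res2 = q - div(n)
        res3 = np.linalg.norm(p, axis=0) * n - p

        # Step 3: update the Lagrange multipliers.
        lam1 = lam1 + r1 * res1
        lam2 = lam2 + r2 * res2
        lam3 = lam3 + r3 * res3

        # Step 4: stop when the (mean absolute) residuals fall below eps_r.
        if max(np.abs(res1).mean(), np.abs(res2).mean(), np.abs(res3).mean()) < eps_r:
            break

    return phi, q, p, n
```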
Table 2.  Alternating minimization method for solving the subproblems.
1. Initialization: $\widetilde{\phi}^{0}=\phi^{k-1}$, $\widetilde{q}^{0}=q^{k-1}$, $\widetilde{\mathbf{p}}^{0}=\mathbf{p}^{k-1}$, and $\widetilde{\mathbf{n}}^{0}=\mathbf{n}^{k-1}$.
2. For fixed Lagrange multipliers $\mathbf{\lambda}_{1}=\mathbf{\lambda}_{1}^{k-1}$, $\lambda_{2}=\lambda_{2}^{k-1}$, and $\mathbf{\lambda}_{3}=\mathbf{\lambda}_{3}^{k-1}$, solve the following subproblems:
$(33)\;\;\;\;\;\;\;\;\;\;\;\;\widetilde{\phi}^{1} = \mbox{argmin } \mathcal{L}(\phi,\widetilde{q}^{0},\widetilde{\mathbf{p}}^{0},\widetilde{\mathbf{n}}^{0}, \mathbf{\lambda}_{1}, \lambda_{2}, \mathbf{\lambda}_{3}) $
$(34)\;\;\;\;\;\;\;\;\;\;\;\;\widetilde{q}^{1} = \mbox{argmin } \mathcal{L}(\widetilde{\phi}^{1},q,\widetilde{\mathbf{p}}^{0},\widetilde{\mathbf{n}}^{0}, \mathbf{\lambda}_{1}, \lambda_{2}, \mathbf{\lambda}_{3}) $
$(35)\;\;\;\;\;\;\;\;\;\;\;\;\widetilde{\mathbf{p}}^{1} = \mbox{argmin } \mathcal{L}(\widetilde{\phi}^{1},\widetilde{q}^{1},\mathbf{p},\widetilde{\mathbf{n}}^{0}, \mathbf{\lambda}_{1}, \lambda_{2}, \mathbf{\lambda}_{3}) $
$(36)\;\;\;\;\;\;\;\;\;\;\;\;\widetilde{\mathbf{n}}^{1} = \mbox{argmin } \mathcal{L}(\widetilde{\phi}^{1},\widetilde{q}^{1},\widetilde{\mathbf{p}}^{1},\mathbf{n}, \mathbf{\lambda}_{1}, \lambda_{2}, \mathbf{\lambda}_{3}) $
3. $(\phi^{k},q^{k},\mathbf{p}^{k},\mathbf{n}^{k})=(\widetilde{\phi}^{1},\widetilde{q}^{1},\widetilde{\mathbf{p}}^{1}, \widetilde{\mathbf{n}}^{1})$.
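What makes the sweep (33)-(36) practical is that, after the splitting, several of the subproblems involve an $L^1$ term coupled only with quadratic penalty terms. As a generic illustration (a standard fact, not the exact form of the authors' subproblems), such a scalar problem is solved in closed form by soft-thresholding:

$\operatorname{argmin}_{q\in\mathbb{R}} \Bigl\{\, w\,|q| + \tfrac{r}{2}\,(q-t)^{2} \Bigr\} \;=\; \operatorname{sign}(t)\,\max\Bigl(|t|-\tfrac{w}{r},\,0\Bigr), \qquad w\geq 0,\; r>0,$

applied at each pixel, with $t$ collecting the quantities that are held fixed during that subproblem (the other variables and the multipliers).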
Table 3.  Image sizes, numbers of iterations, and computation times for the two real images.
Image | Size of image | Number of iterations | Time (sec)
Fig. 3 | $321\times 481$ | 1600 | 472.3
Fig. 4 | $481\times 321$ | 1600 | 480.8