A trust region method for nonsmooth convex optimization

Abstract
We propose an iterative method that solves a nonsmooth convex optimization problem by converting the original objective function to a once continuously differentiable function by way of Moreau-Yosida regularization. The proposed method makes use of approximate function and gradient values of the Moreau-Yosida regularization instead of the corresponding exact values. Under this setting, Fukushima and Qi (1996) and Rauf and Fukushima (2000) proposed a proximal Newton method and a proximal BFGS method, respectively, for nonsmooth convex optimization. While these methods employ a line search strategy to achieve global convergence, the method proposed in this paper uses a trust region strategy. We establish global and superlinear convergence of the method under appropriate assumptions.

Mathematics Subject Classification: 90C25, 90C30.
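
For reference, the Moreau-Yosida regularization of a proper, lower semicontinuous convex function $f$ with parameter $\lambda > 0$ is the standard construction recalled below; the symbols $F_\lambda$ and $p_\lambda$ are generic notation and are not necessarily the notation used in the paper.

\begin{align}
F_\lambda(x) &= \min_{y}\Big\{ f(y) + \frac{1}{2\lambda}\,\|y - x\|^2 \Big\}, \\
p_\lambda(x) &= \operatorname*{arg\,min}_{y}\Big\{ f(y) + \frac{1}{2\lambda}\,\|y - x\|^2 \Big\}, \\
\nabla F_\lambda(x) &= \frac{x - p_\lambda(x)}{\lambda}.
\end{align}

Here $F_\lambda$ is convex and once continuously differentiable even when $f$ is not differentiable, and $x$ minimizes $f$ if and only if it minimizes $F_\lambda$. This is why the regularized problem can be treated with Newton-, BFGS-, or trust-region-type methods, and why only approximate values of $F_\lambda$ and $\nabla F_\lambda$, obtained from an inexact proximal point $p_\lambda(x)$, are assumed to be available.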

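As a rough illustration of the trust region strategy mentioned in the abstract, the sketch below runs a plain trust-region loop on the Moreau-Yosida envelope of f(x) = ||x - c||_1, whose proximal point is available in closed form via soft-thresholding. This is not the algorithm of the paper: the exact prox, the identity model Hessian, and all parameter values are simplifying assumptions made only for the demonstration.

```python
import numpy as np

lam = 1.0                        # regularization parameter lambda > 0 (assumed)
c = np.array([3.0, -2.0, 0.5])   # known minimizer of f, chosen for the demo


def prox(x):
    """Proximal point of f(y) = ||y - c||_1 at x: soft-thresholding toward c."""
    d = x - c
    return c + np.sign(d) * np.maximum(np.abs(d) - lam, 0.0)


def envelope_and_grad(x):
    """Moreau-Yosida envelope F_lam(x) and its gradient (x - p)/lam."""
    p = prox(x)
    F = np.sum(np.abs(p - c)) + np.dot(x - p, x - p) / (2.0 * lam)
    return F, (x - p) / lam


def trust_region_minimize(x, radius=1.0, tol=1e-8, max_iter=200):
    F, g = envelope_and_grad(x)
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Quadratic model m(s) = F + g.s + 0.5*||s||^2 (model Hessian B = I);
        # its minimizer over the ball ||s|| <= radius lies along -g.
        step_len = min(radius, gnorm)
        s = -(step_len / gnorm) * g
        predicted = -(np.dot(g, s) + 0.5 * np.dot(s, s))   # m(0) - m(s) > 0
        F_new, g_new = envelope_and_grad(x + s)
        rho = (F - F_new) / predicted                      # actual vs. predicted reduction
        if rho > 0.75 and step_len >= radius:
            radius *= 2.0          # good agreement at the boundary: expand
        elif rho < 0.25:
            radius *= 0.25         # poor agreement: shrink the region
        if rho > 0.1:              # accept the trial point
            x, F, g = x + s, F_new, g_new
    return x


print(trust_region_minimize(np.zeros(3)))   # approaches c = [3, -2, 0.5]
```

With the identity model Hessian the trust-region subproblem reduces to a scaled steepest-descent step; the paper's contribution lies in handling inexact envelope values and establishing global and superlinear convergence, which this toy loop does not attempt.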