A derivative-free method for solving large-scale nonlinear systems of equations
Gaohang Yu
In this paper, a fully derivative-free method for solving large-scale nonlinear systems of equations is presented. It systematically uses the well-known Polak-Ribière-Polyak (PRP) conjugate gradient direction as a search direction and employs a backtracking process to obtain a suitable stepsize. Assuming that the nonlinear mapping is Lipschitz continuous, some global convergence results are established. A modification of this method, which allows sufficiently nonmonotone behavior of the objective function, is also presented. Numerical comparisons on a set of large-scale test problems from the CUTE library show that the proposed methods are encouraging.
keywords: conjugate gradient method; nonlinear systems; derivative-free method
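The two ingredients the abstract names — a PRP-type direction assembled from residual values only, and a backtracking stepsize rule — can be sketched as follows. This is an illustrative assumption-laden sketch, not the paper's exact algorithm: the test mapping `F_toy`, the acceptance condition, and the restart safeguard are all choices made here for demonstration.

```python
import numpy as np

def F_toy(x):
    # hypothetical smooth monotone test mapping with its root at the origin
    return 0.5 * x + 0.25 * np.sin(x)

def df_prp(F, x0, tol=1e-8, max_iter=1000, rho=0.5, sigma=1e-4):
    """Derivative-free PRP-type sketch for F(x) = 0 (illustrative, not the paper's method)."""
    x = np.asarray(x0, dtype=float).copy()
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        nrm = np.linalg.norm(Fx)
        if nrm < tol:
            break
        # backtracking: shrink t until the residual norm decreases sufficiently
        t = 1.0
        while np.linalg.norm(F(x + t * d)) > (1.0 - sigma * t) * nrm and t > 1e-12:
            t *= rho
        if t <= 1e-12:
            d = -Fx          # safeguard: restart along the negative residual
            continue
        x_new = x + t * d
        F_new = F(x_new)
        # PRP-type parameter built from residual values alone -- no Jacobian needed
        beta = F_new @ (F_new - Fx) / (Fx @ Fx)
        d = -F_new + beta * d
        x, Fx = x_new, F_new
    return x, np.linalg.norm(Fx)
```

Note that the only information the iteration consumes is residual evaluations, which is what makes the scheme derivative-free and cheap per iteration on large problems.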
Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
Gaohang Yu, Lutai Guan, Guoyin Li
In this paper, we propose two modified PRP conjugate gradient methods. An interesting feature of these new methods is that they possess the sufficient descent property without assuming any line search condition and reduce to the standard PRP method when an exact line search is used. Under some reasonable conditions, global convergence is established for these methods. Preliminary numerical results show that these methods are efficient.
keywords: global convergence; conjugate gradient method; unconstrained optimization; nonconvex minimization
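One well-known way to obtain the sufficient descent property $g_k^\top d_k = -\|g_k\|^2$ independently of the line search is a three-term PRP direction; the sketch below uses that construction on a small quadratic purely for illustration, and may differ from the paper's own modified formulas.

```python
import numpy as np

# hypothetical test problem: a small convex quadratic
A = np.diag([1.0, 2.0, 3.0])
b = np.ones(3)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

def three_term_prp(f, grad, x0, tol=1e-8, max_iter=2000, rho=0.5, c1=1e-4):
    """Three-term PRP sketch: descent holds by construction, any Armijo step suffices."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        # sufficient descent g.d = -||g||^2 holds regardless of the line search
        assert abs(g @ d + gnorm2) <= 1e-8 * (1.0 + gnorm2)
        t = 1.0
        while f(x + t * d) > f(x) + c1 * t * (g @ d):   # Armijo backtracking
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        beta = (g_new @ y) / gnorm2     # standard PRP parameter
        theta = (g_new @ d) / gnorm2    # extra term that restores exact descent
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x
```

Expanding $g_{k}^\top d_{k}$ shows the $\beta$ and $\theta$ terms cancel exactly, so the direction is a descent direction for any stepsize rule; with an exact line search $g_k^\top d_{k-1} = 0$, so $\theta_k = 0$ and the standard PRP method is recovered, matching the reduction property described in the abstract.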
Least absolute deviations learning of multiple tasks
Wei Xue, Wensheng Zhang, Gaohang Yu

In this paper, we propose a new multitask feature selection model based on least absolute deviations. However, due to the inherent nonsmoothness of the $l_1$ norm, optimizing this model is challenging. To tackle this problem efficiently, we introduce an alternating iterative optimization algorithm. Moreover, under some mild conditions, its global convergence can be established. Experimental results and a comparison with the state-of-the-art algorithm SLEP show the efficiency and effectiveness of the proposed approach in solving multitask learning problems.

keywords: multitask learning; feature selection; least absolute deviations; alternating direction method; l1 regularization
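The alternating-direction idea behind such models can be illustrated on the single-task least-absolute-deviations core, $\min_x \|Ax - b\|_1$, which the multitask model builds on by coupling tasks with an $l_1$ regularizer. The sketch below is a standard ADMM splitting for that core problem, not the paper's full algorithm.

```python
import numpy as np

def soft_threshold(v, kappa):
    # proximal operator of kappa * ||.||_1 (the z-subproblem's closed form)
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def lad_admm(A, b, rho=1.0, iters=1000):
    """ADMM sketch for min_x ||Ax - b||_1 via the split z = Ax - b."""
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(m), np.zeros(m)
    AtA = A.T @ A                       # assumes A has full column rank
    for _ in range(iters):
        # x-step: least-squares subproblem
        x = np.linalg.solve(AtA, A.T @ (b + z - u))
        Ax = A @ x
        # z-step: soft thresholding; u-step: dual ascent on the consensus gap
        z = soft_threshold(Ax - b + u, 1.0 / rho)
        u = u + Ax - b - z
    return x
```

Because the loss is an absolute-deviation sum rather than a squared one, the solution is robust to a few grossly corrupted rows of $b$, which is the main motivation for the least-absolute-deviations formulation.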
Multivariate spectral gradient projection method for nonlinear monotone equations with convex constraints
Gaohang Yu, Shanzhou Niu, Jianhua Ma
In this paper, we present a multivariate spectral gradient projection method for nonlinear monotone equations with convex constraints, which can be viewed as an extension of the multivariate spectral gradient method for unconstrained optimization. The proposed method requires neither the computation of derivatives nor the solution of any linear systems. Under some suitable conditions, we establish its global convergence. Preliminary numerical results show that the proposed method is efficient and promising.
keywords: nonlinear systems of equations; projection method; global convergence; monotone equations; multivariate spectral gradient method
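A sketch of the ingredients the abstract combines: componentwise (multivariate) spectral steplengths, a derivative-free line search, and a hyperplane-projection update in the Solodov-Svaiter style. The test mapping, the nonnegative-orthant constraint set, and all safeguards below are illustrative assumptions rather than the paper's exact choices.

```python
import numpy as np

def msgp(F, x0, proj, tol=1e-7, max_iter=500, sigma=1e-4, rho=0.5):
    """Multivariate spectral gradient projection sketch for F(x)=0 over a convex set C."""
    x = proj(np.asarray(x0, dtype=float))
    Fx = F(x)
    lam = np.ones_like(x)                # componentwise spectral steplengths
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        d = -lam * Fx                    # diagonal scaling of the negative residual
        # derivative-free backtracking line search for the trial point z
        t = 1.0
        while True:
            z = x + t * d
            Fz = F(z)
            if -(Fz @ d) >= sigma * t * (d @ d) or t < 1e-12:
                break
            t *= rho
        # project x onto the hyperplane through z separating it from the
        # solution set, then back onto the feasible set C
        alpha = (Fz @ (x - z)) / (Fz @ Fz)
        x_new = proj(x - alpha * Fz)
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        # safeguarded componentwise Barzilai-Borwein ratios
        lam = np.clip(np.where(np.abs(y) > 1e-12, s / y, 1.0), 1e-5, 1e5)
        x, Fx = x_new, F_new
    return x
```

Monotonicity of $F$ is what makes the hyperplane step valid: for any solution $x^*$, $F(z)^\top (x^* - z) \le 0$, so projecting onto the halfspace $\{v : F(z)^\top (v - z) \le 0\}$ never moves the iterate away from the solution set.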