
ISSN: 1547-5816 | eISSN: 1553-166X
Journal of Industrial & Management Optimization
April 2005, Volume 1, Issue 2
2005, 1(2): 149-152
doi: 10.3934/jimo.2005.1.149
Abstract:
Professor Jiye Han was born in Tianjin, China, in 1935. He is currently serving as a research fellow and PhD supervisor at the Institute of Applied Mathematics, Chinese Academy of Sciences (CAS). He is also a guest professor of Applied Mathematics at Tsinghua University and Dalian University of Technology. Professor Han is the associate editor-in-chief of Acta Mathematicae Applicatae Sinica (Chinese edition) and an executive editor of Acta Mathematicae Applicatae Sinica (English edition), Acta Mathematicae Sinica (Chinese edition), Operations Research Transactions, and the Chinese Journal of Operations Research and Management Science. In addition, Professor Han is the President of the Mathematical Programming Sub-society of the Operations Research Society of China.
2005, 1(2): 153-170
doi: 10.3934/jimo.2005.1.153
Abstract:
We propose a smoothing Newton algorithm for solving mathematical programs with complementarity constraints (MPCCs). Under some reasonable conditions, the proposed algorithm is shown to be globally convergent and to generate a $B$-stationary point of the MPCC. Preliminary numerical results on some MacMPEC problems are reported.
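As an illustration of the smoothing idea only (not the paper's algorithm for full MPCCs), the sketch below applies a smoothing-Newton iteration to a scalar complementarity problem, replacing min(a, b) by the Chen-Harker-Kanzow-Smale function and driving the smoothing parameter to zero. The test problem and the update rule for $\mu$ are illustrative assumptions.

```python
import numpy as np

def chks(a, b, mu):
    """Chen-Harker-Kanzow-Smale smoothing of min(a, b)."""
    return 0.5 * (a + b - np.sqrt((a - b) ** 2 + 4 * mu ** 2))

def smoothing_newton_ncp(F, dF, x, mu=1.0, sigma=0.1, tol=1e-10):
    """Solve the scalar NCP 0 <= x, F(x) >= 0, x*F(x) = 0 by Newton
    steps on phi_mu(x, F(x)) = 0 while shrinking mu toward zero."""
    for _ in range(100):
        a, b = x, F(x)
        phi = chks(a, b, mu)
        if abs(phi) < tol and mu < tol:
            break
        r = np.sqrt((a - b) ** 2 + 4 * mu ** 2)
        da = 0.5 * (1 - (a - b) / r)    # d(phi)/da
        db = 0.5 * (1 + (a - b) / r)    # d(phi)/db
        dphi = da + db * dF(x)          # chain rule through F
        x = x - phi / dphi              # Newton step at fixed mu
        mu *= sigma                     # shrink the smoothing parameter
    return x

# toy problem: F(x) = x - 1, unique solution x* = 1
print(smoothing_newton_ncp(lambda x: x - 1.0, lambda x: 1.0, x=5.0))
```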
2005, 1(2): 171-180
doi: 10.3934/jimo.2005.1.171
Abstract:
We propose an iterative method that solves a nonsmooth convex optimization problem by converting the original objective function to a once continuously differentiable function by way of Moreau-Yosida regularization. The proposed method makes use of approximate function and gradient values of the Moreau-Yosida regularization instead of the corresponding exact values. Under this setting, Fukushima and Qi (1996) and Rauf and Fukushima (2000) proposed a proximal Newton method and a proximal BFGS method, respectively, for nonsmooth convex optimization. While these methods employ a line search strategy to achieve global convergence, the method proposed in this paper uses a trust region strategy. We establish global and superlinear convergence of the method under appropriate assumptions.
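The central object here is the Moreau-Yosida regularization $F_\lambda(x) = \min_z f(z) + \frac{1}{2\lambda}\|z-x\|^2$, which is once continuously differentiable with $\nabla F_\lambda(x) = (x - \operatorname{prox}_{\lambda f}(x))/\lambda$. The sketch below evaluates it exactly for the simple choice $f(z) = |z|$, whose proximal mapping is soft-thresholding; the paper works with approximate values inside a trust-region method, which this sketch does not reproduce.

```python
import numpy as np

def prox_abs(x, lam):
    """Proximal operator of f(z) = |z| (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_yosida(x, lam):
    """Value and gradient of F_lam(x) = min_z |z| + (z - x)**2 / (2*lam).
    F_lam is once continuously differentiable even though |.| is not."""
    p = prox_abs(x, lam)
    val = np.abs(p) + (p - x) ** 2 / (2.0 * lam)
    grad = (x - p) / lam            # gradient of the regularization
    return val, grad

for x in (-2.0, 0.3, 2.0):
    print(x, moreau_yosida(x, lam=1.0))
```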
2005, 1(2): 181-192
doi: 10.3934/jimo.2005.1.181
Abstract:
The gradient method is one of the simplest methods in nonlinear optimization. In this paper, we give a brief review of monotone gradient methods and study their numerical properties by introducing a new technique of long-term observation. We find that one monotone gradient algorithm recently proposed by Yuan shares with the Barzilai-Borwein (BB) method the property that the gradient components with respect to the eigenvectors of the function's Hessian decrease together. This might partly explain why this algorithm of Yuan's is comparable to the BB method in practice. Some examples are also provided showing that the alternate minimization algorithm and the other algorithm by Yuan may fall into cycles. Some more efficient gradient algorithms are provided. In particular, one of them is monotone and performs better than the BB method in the quadratic case.
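For readers unfamiliar with the Barzilai-Borwein step, the sketch below runs BB1 on a quadratic with diagonal Hessian, so the coordinates of the gradient are exactly its components along the Hessian's eigenvectors, the quantities watched in a long-term observation. The test matrix and the exact first step are illustrative assumptions; Yuan's monotone algorithm itself is not reproduced.

```python
import numpy as np

# f(x) = 0.5 * x^T A x with a diagonal A, so each coordinate of the
# gradient is its component along an eigenvector of the Hessian.
A = np.diag([1.0, 10.0, 100.0])
x = np.array([1.0, 1.0, 1.0])

g = A @ x
x_prev, g_prev = x.copy(), g.copy()
x = x - (g @ g) / (g @ (A @ g)) * g        # exact (Cauchy) first step

for k in range(30):
    g = A @ x
    s, y = x - x_prev, g - g_prev
    if s @ y == 0:                         # converged to machine precision
        break
    alpha = (s @ s) / (s @ y)              # BB1 step length
    x_prev, g_prev = x.copy(), g.copy()
    x = x - alpha * g
    print(k, np.abs(g))                    # |g_i| along each eigenvector
```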
2005, 1(2): 193-200
doi: 10.3934/jimo.2005.1.193
Abstract:
In this paper, we consider a new kind of Fletcher-Reeves (FR) conjugate gradient method with errors, which is widely applied in neural network training. Its iterate formula is $x_{k+1}=x_{k}+\alpha_{k}(s_{k}+\omega_{k})$, where the main direction $s_{k}$ is obtained by the FR conjugate gradient method and $\omega_{k}$ is the accumulated error. The global convergence of the method is proved under mild assumptions.
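A minimal sketch of the perturbed iteration $x_{k+1} = x_k + \alpha_k(s_k + \omega_k)$ follows. The Armijo backtracking step size, the restart safeguard, and the Gaussian model for the error $\omega_k$ are illustrative assumptions; the paper's own step-size conditions are not reproduced.

```python
import numpy as np

def fr_cg_with_errors(f, grad, x, noise=1e-3, iters=100, seed=0):
    """Fletcher-Reeves CG where every step is perturbed:
    x_{k+1} = x_k + alpha_k * (s_k + w_k)."""
    rng = np.random.default_rng(seed)
    g = grad(x)
    s = -g
    for _ in range(iters):
        if g @ s >= 0:                 # safeguard: restart on non-descent
            s = -g
        alpha = 1.0
        while f(x + alpha * s) > f(x) + 1e-4 * alpha * (g @ s):
            alpha *= 0.5               # Armijo backtracking (assumption)
        w = noise * rng.standard_normal(x.shape)   # error term w_k
        x = x + alpha * (s + w)
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)           # FR coefficient
        s = -g_new + beta * s
        g = g_new
    return x

f = lambda x: 0.5 * x @ x + 0.25 * np.sum(x ** 4)   # strictly convex test
grad = lambda x: x + x ** 3
print(fr_cg_with_errors(f, grad, np.array([2.0, -1.5])))
```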
2005, 1(2): 201-209
doi: 10.3934/jimo.2005.1.201
Abstract:
Motivated by the observation that some reformulation-based extragradient methods for general monotone variational inequalities in a real Hilbert space may fail to generate a solution of the original problem, we propose an iterative method with line searches and prove its convergence for general pseudomonotone (monotone) variational inequality problems.
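The classical extragradient (Korpelevich) iteration underlying such methods is sketched below on a monotone affine VI over a box. A fixed step requiring the Lipschitz constant is assumed here, whereas the paper replaces it with line searches to cover general pseudomonotone operators.

```python
import numpy as np

def extragradient(F, project, x, tau, iters=300):
    """Korpelevich extragradient iteration for VI(F, C):
        y_k     = P_C(x_k - tau * F(x_k))    (predictor)
        x_{k+1} = P_C(x_k - tau * F(y_k))    (corrector)
    A fixed step tau < 1/L (L the Lipschitz constant of F) is assumed."""
    for _ in range(iters):
        y = project(x - tau * F(x))
        x = project(x - tau * F(y))
    return x

# monotone affine VI over the box [0, 2]^2; solution is (0.2, 0.6)
M = np.array([[2.0, 1.0], [-1.0, 2.0]])
q = np.array([-1.0, -1.0])
F = lambda x: M @ x + q
project = lambda z: np.clip(z, 0.0, 2.0)
print(extragradient(F, project, np.zeros(2), tau=0.3))
```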
2005, 1(2): 211-217
doi: 10.3934/jimo.2005.1.211
Abstract:
Projection-type methods are an important class of methods for solving variational inequalities (VI). This paper presents a new treatment of a classical projection algorithm with variable steps, first proposed by Auslender in the 1970s and later developed by Fukushima in the 1980s. The main purpose of this work is to weaken the assumptions while preserving the convergence of the original method.
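For background, a sketch of the classical projection iteration with a prescribed sequence of variable steps follows. Convergence of this simple version relies on strong monotonicity, precisely the kind of assumption the paper works to weaken, so it should be read as context rather than the paper's method; the operator and step sequence are invented for illustration.

```python
import numpy as np

def projection_method(F, project, x, steps, tol=1e-10):
    """Classical projection iteration x_{k+1} = P_C(x_k - a_k * F(x_k))
    with a prescribed sequence of variable steps a_k."""
    for a in steps:
        x_new = project(x - a * F(x))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# strongly monotone operator on the nonnegative orthant
F = lambda x: 3.0 * x + np.array([-2.0, 1.0])
project = lambda z: np.maximum(z, 0.0)
steps = [0.2 / (1 + 0.01 * k) for k in range(500)]   # slowly varying a_k
print(projection_method(F, project, np.zeros(2)))    # -> approx [2/3, 0]
```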
2005, 1(2): 219-233
doi: 10.3934/jimo.2005.1.219
Abstract:
We propose a new smoothing technique based on discretization to solve semi-infinite variational inequalities. The proposed algorithm is tested on both linear and nonlinear problems and shown to be efficient.
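The two ingredients, discretizing the infinite index set and smoothing the resulting nonsmooth max, can be illustrated on a simple semi-infinite constraint. The log-sum-exp smoothing and the uniform grid below are illustrative assumptions; the paper's actual algorithm for semi-infinite variational inequalities is not reproduced.

```python
import numpy as np

def smooth_max_on_grid(g, x, grid, mu):
    """Smooth approximation of sup_{t in T} g(x, t): discretize T to a
    finite grid, then replace the max over grid points by log-sum-exp,
    which is differentiable in x."""
    vals = np.array([g(x, t) for t in grid])
    m = vals.max()                      # shift for numerical stability
    return m + mu * np.log(np.mean(np.exp((vals - m) / mu)))

# semi-infinite constraint g(x, t) = x*t - t**2 <= 0 for all t in [0, 1]
g = lambda x, t: x * t - t ** 2
grid = np.linspace(0.0, 1.0, 201)
for mu in (1e-1, 1e-2, 1e-3):
    print(mu, smooth_max_on_grid(g, 0.5, grid, mu))  # -> max_t = 0.0625
```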
2005, 1(2): 235-250
doi: 10.3934/jimo.2005.1.235
Abstract:
This paper develops a smoothing algorithm to solve a system of constrained equations. Compared with traditional methods, the new method does not need the continuous differentiability assumption for the corresponding merit function. By using a perturbation technique and a suitable strategy for choosing the search direction, the new method not only keeps the smoothing variable strictly positive at any non-stationary point of the corresponding optimization problem, but also enjoys global convergence and locally superlinear convergence. The former property is the key requirement for smoothing methods. Some numerical examples arising from semi-infinite programming (SIP) show that the new algorithm is promising.
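The sketch below shows the role of a strictly positive smoothing variable on a toy nonsmooth scalar equation (unconstrained, unlike the paper's setting): $|x| - 1 = 0$ is replaced by the smooth surrogate $\sqrt{x^2 + \mu^2} - 1 = 0$, and $\mu$ is kept positive while being driven to zero. The surrogate and the update rule for $\mu$ are illustrative assumptions.

```python
import numpy as np

def smoothing_solve(x=3.0, mu=1.0, beta=0.2, tol=1e-12):
    """Smoothing Newton sketch for the nonsmooth equation |x| - 1 = 0
    via the smooth surrogate f_mu(x) = sqrt(x**2 + mu**2) - 1.  The
    smoothing variable mu stays strictly positive at every iteration
    and only shrinks toward zero as the iteration proceeds."""
    for _ in range(100):
        r = np.sqrt(x ** 2 + mu ** 2)
        f, df = r - 1.0, x / r
        if abs(abs(x) - 1.0) < tol:     # true nonsmooth residual
            break
        x = x - f / df                  # Newton step on the surrogate
        mu = max(beta * mu, tol)        # shrink mu but keep it positive
    return x

print(smoothing_solve())   # -> 1.0
```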
2005, 1(2): 251-273
doi: 10.3934/jimo.2005.1.251
Abstract:
In this paper, we propose a network model called the simple manufacturing network. The model combines the ordinary multicommodity network with the manufacturing network flow model and can be used to characterize complicated manufacturing scenarios. By formulating the model as a minimum cost flow problem with several additional bounded variables, we present a modified network simplex method that exploits the special structure of the model and performs the computation directly on the network. A numerical example illustrates the method.
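Once such a model is reduced to a minimum cost flow problem, off-the-shelf network solvers apply. The sketch below sets up a toy manufacturing-style network in NetworkX; the node names, capacities, and costs are invented for illustration, and the paper's modified network simplex method for the extra bounded variables is not reproduced.

```python
import networkx as nx

# Raw material at s is processed at plants p1/p2 and shipped to demand
# node d.  Negative demand denotes supply; "weight" is the unit cost.
G = nx.DiGraph()
G.add_node("s", demand=-10)     # supply of 10 units
G.add_node("d", demand=10)      # demand of 10 units
G.add_edge("s", "p1", capacity=8, weight=2)   # route via plant 1
G.add_edge("s", "p2", capacity=8, weight=3)   # route via plant 2
G.add_edge("p1", "d", capacity=8, weight=1)
G.add_edge("p2", "d", capacity=8, weight=1)

flow = nx.min_cost_flow(G)      # solves the minimum cost flow problem
print(flow, nx.cost_of_flow(G, flow))
```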