Description
168 p.
Notes
Source: Dissertation Abstracts International, Volume: 72-12, Section: B, page: 7455

Adviser: Laurent O. Jay 

Thesis (Ph.D.)--The University of Iowa, 2011.

Finding a local minimizer in unconstrained nonlinear optimization and finding a fixed point of a gradient system of ordinary differential equations (ODEs) are two closely related problems. Quasi-Newton algorithms are widely used in unconstrained nonlinear optimization, while Runge-Kutta methods are widely used for the numerical integration of ODEs. In this thesis, hybrid algorithms combining low-order implicit Runge-Kutta methods for gradient systems with quasi-Newton type updates of the Jacobian matrix, such as the BFGS update, are considered. These hybrid algorithms numerically approximate the gradient flow, but the exact Jacobian matrix is not used to solve the nonlinear system at each step. Instead, a quasi-Newton matrix is used to approximate the Jacobian matrix, and matrix-vector multiplications are performed in a limited memory setting to reduce storage, computations, and the need to calculate Jacobian information.
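The idea described above can be illustrated with a minimal sketch (not the dissertation's actual algorithm): an implicit Euler step on the gradient flow x' = -∇f(x), where the Hessian needed for the Newton solve is replaced by a BFGS approximation B. The function name, step size h, and update details below are assumptions for illustration only.

```python
import numpy as np

def grad_flow_bfgs(f, grad, x0, h=0.5, max_iter=200, tol=1e-8):
    """Illustrative sketch: implicit Euler on x' = -grad f(x), with the
    true Hessian replaced by a BFGS approximation B. Each step solves
    the simplified Newton system (I + h*B) dx = -h*grad(x)."""
    x = x0.astype(float)
    n = x.size
    B = np.eye(n)                       # quasi-Newton Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # one simplified Newton step on the implicit Euler equations,
        # using B in place of the exact Jacobian of the gradient
        dx = np.linalg.solve(np.eye(n) + h * B, -h * g)
        x_new = x + dx
        g_new = grad(x_new)
        # standard BFGS update of B from the step / gradient-difference pair
        s, y = dx, g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition guard
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy
        x, g = x_new, g_new
    return x

# usage: minimize a convex quadratic f(x) = 0.5 x^T A x (minimizer at 0)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
xmin = grad_flow_bfgs(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                      np.array([4.0, -3.0]))
```

A limited-memory variant, as the abstract indicates, would store only recent (s, y) pairs and apply B through matrix-vector products rather than forming it explicitly.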

For hybrid algorithms based on Runge-Kutta methods of order at least two, a curve search is implemented instead of the standard line search used in quasi-Newton algorithms. Stepsize control techniques are also applied to control the stepsize associated with the underlying Runge-Kutta method.
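The contrast with a standard line search can be sketched as follows (a hypothetical backtracking scheme, not the thesis's procedure): instead of trying points x + αp along a straight line, the search backtracks along the curve h ↦ Φ(x, h) traced by the Runge-Kutta step map as the stepsize h shrinks.

```python
def curve_search(f, step_map, x, h0=1.0, shrink=0.5, max_tries=30):
    """Illustrative curve search: backtrack along the curve of
    Runge-Kutta steps h -> step_map(x, h), halving h until the
    objective f decreases (a simple decrease condition)."""
    fx = f(x)
    h = h0
    for _ in range(max_tries):
        x_new = step_map(x, h)      # one RK step of size h from x
        if f(x_new) < fx:           # accept the first improving stepsize
            return x_new, h
        h *= shrink
    return x, 0.0                   # no acceptable stepsize found

# usage with a toy explicit-Euler step map on f(x) = x^2, so x' = -2x
f = lambda x: float(x ** 2)
euler_step = lambda x, h: x - h * 2.0 * x
x_new, h_used = curve_search(f, euler_step, 4.0)
```

In the thesis's setting the step map would come from an implicit Runge-Kutta method of order at least two, so the search curve bends toward the gradient-flow trajectory rather than following a fixed direction.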

These hybrid algorithms are tested on a variety of test problems, and their performance is compared with that of the limited memory BFGS algorithm.

School code: 0096 
Host Item 
Dissertation Abstracts International 72-12B

Subject
Applied Mathematics


0364

Alt Author 
The University of Iowa. Applied Mathematical & Computational Sciences

