Author Mohr, Darin Griffin
Title Hybrid Runge-Kutta and quasi-Newton methods for unconstrained nonlinear optimization
ISBN 9781124873749
Description 168 p
Notes Source: Dissertation Abstracts International, Volume: 72-12, Section: B, page: 7455
Adviser: Laurent O. Jay
Thesis (Ph.D.)--The University of Iowa, 2011
Finding a local minimizer in unconstrained nonlinear optimization and finding a fixed point of a gradient system of ordinary differential equations (ODEs) are two closely related problems. Quasi-Newton algorithms are widely used in unconstrained nonlinear optimization, while Runge-Kutta methods are widely used for the numerical integration of ODEs. In this thesis, hybrid algorithms combining low-order implicit Runge-Kutta methods for gradient systems with quasi-Newton-type updates of the Jacobian matrix, such as the BFGS update, are considered. These hybrid algorithms numerically approximate the gradient flow, but the exact Jacobian matrix is not used to solve the nonlinear system at each step. Instead, a quasi-Newton matrix approximates the Jacobian matrix, and matrix-vector multiplications are performed in a limited-memory setting to reduce storage, computation, and the need to calculate Jacobian information.
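To illustrate the idea summarized above, the following is a minimal sketch, not the thesis' actual implementation, of one implicit Euler step (the simplest implicit Runge-Kutta method) applied to the gradient flow y' = -grad f(y), with a BFGS matrix standing in for the exact Hessian in the Newton solve. The function names, the toy objective, and the dense storage of B are illustrative assumptions; the thesis instead performs limited-memory matrix-vector products.

```python
import numpy as np

def hybrid_implicit_euler_step(grad, y, h, B, newton_iters=5, tol=1e-10):
    """One implicit Euler step on the gradient flow y' = -grad f(y).

    The stage equation z = y - h*grad(z) is solved by simplified Newton
    iterations in which the exact Hessian of f is replaced by the
    quasi-Newton approximation B (dense here for brevity).
    """
    A = np.eye(y.size) + h * B          # approximate Jacobian of the stage residual
    z = y.copy()
    for _ in range(newton_iters):
        r = z - y + h * grad(z)         # residual of the implicit Euler equation
        if np.linalg.norm(r) < tol:
            break
        z = z - np.linalg.solve(A, r)
    return z

def bfgs_update(B, s, g_new, g_old):
    """Standard BFGS update of the Hessian approximation B from the step s
    and the gradient difference g_new - g_old."""
    yv = g_new - g_old
    sy = yv @ s
    if sy > 1e-12:                      # curvature condition preserves positive definiteness
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(yv, yv) / sy
    return B

# Toy usage on a simple convex function (illustrative only, not from the thesis):
grad = lambda y: y + 0.4 * y**3         # gradient of 0.5*||y||^2 + 0.1*sum(y^4)
y = np.array([1.5, -1.0, 0.5])
B = np.eye(y.size)
for _ in range(50):
    g_old = grad(y)
    y_new = hybrid_implicit_euler_step(grad, y, h=0.5, B=B)
    B = bfgs_update(B, y_new - y, grad(y_new), g_old)
    y = y_new
print("approximate minimizer:", y)      # drifts toward the minimizer at the origin
```

With B replaced by the exact Hessian this would be an ordinary simplified Newton solve of the implicit Euler equations; using the BFGS approximation instead avoids Hessian evaluations, which is the trade-off the abstract describes.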
For hybrid algorithms based on Runge-Kutta methods of order at least two, a curve search is implemented in place of the standard line search used in quasi-Newton algorithms. Stepsize control techniques are also employed to control the stepsize of the underlying Runge-Kutta method.
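The curve-search idea can be pictured with the following hypothetical backtracking sketch, which is not the thesis' exact procedure: trial points are generated by re-evaluating the Runge-Kutta step with successively smaller stepsizes h, so the search moves along the curve traced by the method rather than along a straight line. The name curve_search, the Armijo-style constant c1, and the acceptance test are assumptions for illustration; rk_step can be a hybrid step like the one sketched above.

```python
import numpy as np

def curve_search(f, grad, y, B, rk_step, h0=1.0, shrink=0.5, c1=1e-4, max_tries=30):
    """Backtracking curve search (hypothetical sketch).

    Trial points lie on the curve h -> rk_step(grad, y, h, B) traced by the
    underlying Runge-Kutta step as its stepsize h varies. Because the curve
    leaves y in the direction -grad f(y), the initial slope of f along the
    curve is -||grad f(y)||^2, which is used in the Armijo-like test below.
    """
    f0 = f(y)
    g0 = grad(y)
    slope = -(g0 @ g0)                  # derivative of f along the curve at h = 0
    h = h0
    for _ in range(max_tries):
        y_trial = rk_step(grad, y, h, B)
        if f(y_trial) <= f0 + c1 * h * slope:   # sufficient decrease along the curve
            return y_trial, h
        h *= shrink                              # shrink the Runge-Kutta stepsize and retry
    return y, 0.0                                # no acceptable stepsize found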
These hybrid algorithms are tested on a variety of test problems, and their performance is compared with that of the limited-memory BFGS algorithm.
School code: 0096
Host Item Dissertation Abstracts International 72-12B
Subject Applied Mathematics
0364
Alt Author The University of Iowa. Applied Mathematical & Computational Sciences