• To introduce the field of unconstrained optimisation.
• To outline basic line search methods for one-variable functions.
• To outline basic direct and gradient-based search methods for multi-variable functions.
• Fundamental principles of optimisation
• Line searches for functions with one variable
  - Grid, Fibonacci, Golden Section & Quadratic
• Direct search methods for functions of more than one variable
  - Univariate, DSC algorithm, Gram-Schmidt Orthogonalisation & Simplex method
• Gradient methods for functions of more than one variable
  - Steepest Descent, Conjugate Gradient algorithm (Fletcher & Reeves)
• Hessian-based gradient methods for functions of more than one variable
  - Newton's method, quasi-Newton methods, DFP & BFGS algorithms
• The Sum of Squares Problem
  - Gauss-Newton and Levenberg-Marquardt algorithms
• An introduction to constrained optimisation