Title: A study of iterative methods in optimization
Type: Thesis
Date: 2021-02-04
URI: http://hdl.handle.net/123456789/1114
Extent: x, 140 p. : ill.
Subjects: Mathematical optimization; algorithms

Abstract:
Optimization techniques are called into play every day in decision-making processes involving resource allocation in almost every discipline. However, the development and implementation of algorithms for solving optimization problems are fraught with many difficulties and frustrations. The aim of this thesis has been to examine and identify some of the problems associated with the development and implementation of a class of optimization algorithms, and to develop a means of improving upon them. The study made use of functions, such as Rosenbrock's function, that are known to be good tests of the robustness of these techniques. The study covered a number of algorithms in both unconstrained and constrained optimization.

Some of the problems encountered arose in the implementation of the Modified Newton's method. It was discovered that if at any iterate $X_k$ the Hessian matrix $H(X_k)$ is not positive definite, then $-H(X_k)^{-1}\nabla f(X_k)$ need not be a descent direction. In this case, we look for a new iterate at which the Hessian matrix will be positive definite. Some of the suggestions proposed in the literature did not always lead to a descent direction. A new iterate can be found from $X_{k+1} = X_k - \lambda_k V_k$, where $V_k$ is an eigenvector corresponding to a negative eigenvalue of $H(X_k)$. However, if this fails, an alternative is to make use of the Hessian at the previous iterate, $H(X_{k-1})$, to compute the next iterate $X_{k+1}$, which will hopefully give a positive definite Hessian. If this also fails, then setting $H(X_k)^{-1} = I_n$ reduces the Modified Newton's method to the Steepest Descent method, where $X_{k+1} = X_k - \alpha_k \nabla f(X_k)$.

The study also revealed that determining the various critical points of a given function may require more than one technique. All the computations in this work were carried out in GNU Octave, a freely available numerical software package.
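To make the descent-direction safeguard concrete, the following is a minimal sketch in Octave (the software used in the study). It is an illustrative assumption, not code from the thesis; the function name newton_direction and all implementation details are my own. It attempts the ordinary Newton step and, when the Hessian is not positive definite, falls back to a negative-curvature direction given by an eigenvector of $H(X_k)$, as the abstract describes; the abstract's further fallbacks (the previous Hessian, then the identity) are noted in a comment.

% Hypothetical sketch (not the thesis code): choosing a safeguarded
% search direction for the Modified Newton method.
function d = newton_direction (g, H)
  [R, p] = chol (H);           % Cholesky succeeds only if H is positive definite
  if (p == 0)
    d = -(R \ (R' \ g));       % ordinary Newton step  -H^{-1} * grad f
  else
    [V, D] = eig (H);          % H indefinite: take a negative-curvature direction
    [~, j] = min (diag (D));   % eigenvector of the most negative eigenvalue
    d = V(:, j);
    if (g' * d > 0)            % orient it so that g' * d < 0 (descent)
      d = -d;
    endif
    % If this fails in practice, the abstract suggests reusing the previous
    % Hessian H(X_{k-1}), and finally setting H^{-1} = I_n, which reduces
    % the step to steepest descent: d = -g.
  endif
endfunction

The Cholesky factorization is used as the positive-definiteness test because it fails cheaply on indefinite matrices, whereas a full eigendecomposition is only computed in the fallback branch.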
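A short driver, again only an assumed sketch under the same caveats, applies the iteration with a simple backtracking (Armijo) line search to Rosenbrock's function, whose minimizer is (1, 1); the starting point (-1.2, 1) is the classical test value, not one quoted in the abstract.

% Hypothetical test on Rosenbrock's function f(x, y) = 100*(y - x^2)^2 + (1 - x)^2.
f = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
g = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
           200*(x(2) - x(1)^2)];
H = @(x) [1200*x(1)^2 - 400*x(2) + 2, -400*x(1);
          -400*x(1),                   200];

x = [-1.2; 1];                     % classical starting point
for k = 1:200
  gk = g (x);
  if (norm (gk) < 1e-8)            % stop near a stationary point
    break;
  endif
  d = newton_direction (gk, H (x));
  a = 1;                           % backtracking (Armijo) line search
  while (f (x + a*d) > f (x) + 1e-4 * a * (gk' * d))
    a = a / 2;
  endwhile
  x = x + a*d;
endfor
disp (x')                          % should be close to [1 1]

With these assumed settings the iterates typically approach the neighbourhood of (1, 1), illustrating the abstract's point that Rosenbrock's narrow curved valley is a demanding robustness test for such methods.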