HIGH-DIMENSIONAL OPTIMIZATION OF NON-SMOOTH CONVEX FUNCTIONS: ALGORITHMS AND THEORY
DOI: https://doi.org/10.5281/zenodo.15913964

Keywords: convex optimization, descent method, Newton's method, constraint optimization, projection map.

Abstract
In this study, we address the optimization of convex functions in N-dimensional spaces, a problem with widespread applications across many fields. Convex functions have useful structural properties: when a strictly convex function attains a finite minimum, it does so at a unique point x*, and when the function is also differentiable, the gradient vanishes at x*.
To tackle this optimization problem, we explore the steepest-descent method and Newton's method, two well-established techniques in the field. The core challenge lies in minimizing a non-linear convex function f subject to constraints of the form gᵢ(x) ≤ 0, where i ranges from 1 to n.
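The two unconstrained methods named above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the quadratic test function, step size, and tolerances are assumptions chosen so that both methods converge to the same minimizer x*, where the gradient vanishes.

```python
import numpy as np

def steepest_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10000):
    """Minimize a differentiable convex f by stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient vanishes at the minimizer x*
            break
        x = x - lr * g
    return x

def newton(grad, hess, x0, tol=1e-10, max_iter=100):
    """Newton's method: solve H(x) d = -grad f(x) for the step d."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)
    return x

# Assumed example: f(x) = x^T A x / 2 - b^T x with A positive definite,
# so f is strictly convex with unique minimizer x* = A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
hess = lambda x: A

x_sd = steepest_descent(grad, np.zeros(2))
x_nt = newton(grad, hess, np.zeros(2))
```

On a strictly convex quadratic, Newton's method reaches x* in a single step, while steepest descent converges linearly at a rate governed by the conditioning of A.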
We also consider the problem from the perspective of minimizing f over a closed convex subset C. To achieve this, we introduce the projection map T, which sends each point x of the N-dimensional space to the nearest point of C in the Euclidean norm, so that ‖x − Tx‖ = min_{y∈C} ‖x − y‖.
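The projection map T has a closed form for simple sets, which makes a projected-gradient scheme easy to sketch. The following is an illustrative assumption, not the paper's method: C is taken to be the closed unit Euclidean ball, and f(x) = ‖x − c‖²/2, whose constrained minimizer is exactly T(c).

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Projection T onto the closed ball C = {y : ||y|| <= radius}:
    Tx is the nearest point of C, i.e. ||x - Tx|| = min_{y in C} ||x - y||."""
    n = np.linalg.norm(x)
    return x if n <= radius else (radius / n) * x

def projected_gradient(grad, project, x0, lr=0.1, tol=1e-8, max_iter=10000):
    """Minimize a convex f over C: gradient step, then project back onto C."""
    x = project(np.asarray(x0, dtype=float))
    for _ in range(max_iter):
        x_new = project(x - lr * grad(x))
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

# Assumed example: minimize ||x - c||^2 / 2 over the unit ball.
# Since c lies outside the ball, the minimizer is its projection T(c) = c/||c||.
c = np.array([3.0, 4.0])
grad = lambda x: x - c
x_star = projected_gradient(grad, project_ball, np.zeros(2))
```

Because the projection onto a closed convex set is nonexpansive, composing it with a gradient step preserves the convergence of the unconstrained method.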