Journal of Statistical and Mathematical Sciences

Open Access | Peer-Reviewed

HIGH-DIMENSIONAL OPTIMIZATION OF NON-SMOOTH CONVEX FUNCTIONS: ALGORITHMS AND THEORY

Abstract

<p>In this study, we address the optimization of convex functions in N-dimensional spaces, a problem with widespread applications across many fields. Convex functions exhibit useful characteristics: when a convex function attains a finite minimum, the minimum value is unique, the minimizer x* is unique when the function is strictly convex, and the gradient vanishes at x* when the function is differentiable.<br>To tackle this optimization problem, we explore the steepest-descent method and Newton's method, two well-established techniques in the field. The core challenge lies in minimizing a non-linear convex function subject to constraints of the form g_i(x) ≤ 0, where i ranges from 1 to n.<br>We also consider the problem of minimizing f over a closed convex subset C. To achieve this, we introduce the projection map T, which sends each point x of the N-dimensional space to the point Tx of C nearest to it in the Euclidean norm, as expressed by the equation ‖x − Tx‖ = min_{y∈C} ‖x − y‖.</p>
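The abstract itself does not include the algorithms it discusses; as a minimal illustrative sketch, the steepest-descent iteration can be combined with a projection map T onto a closed convex set. Here the set is assumed to be the unit Euclidean ball and the objective is a simple quadratic, both hypothetical choices made only for the example:

```python
import numpy as np

def project_ball(x, radius=1.0):
    # Projection T onto the closed Euclidean ball of the given radius:
    # T(x) is the point of the ball minimizing ||x - y||.
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def projected_steepest_descent(grad, x0, project, step=0.1, iters=500):
    # Steepest descent over a closed convex set: take a step against
    # the gradient, then map the iterate back into the set with T.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = project(x - step * grad(x))
    return x

# Example: minimize f(x) = ||x - c||^2 over the unit ball,
# with the unconstrained minimizer c lying outside the ball.
c = np.array([2.0, 0.0])
grad = lambda x: 2.0 * (x - c)
x_star = projected_steepest_descent(grad, np.zeros(2), project_ball)
# x_star approaches (1, 0), the projection of c onto the unit ball.
```

Because the projection is nonexpansive, the constrained iteration inherits the convergence behavior of the unconstrained gradient step for a suitable step size; the minimizer here is the point of the ball closest to c.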
