Stochastic proximal first-order algorithm

Contact: Saeed Masiha

In this project, we aim to develop a batch-free stochastic proximal first-order algorithm that reaches an \epsilon-global optimum with O(1/\epsilon) gradient oracle complexity when the objective function is smooth and satisfies the 2-PL condition (or quadratic growth).
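For orientation only, the sketch below shows the kind of batch-free stochastic proximal step such an algorithm builds on: one sampled gradient per iteration followed by a proximal step. The problem instance (least squares plus an L1 penalty), the constant step size eta, and all function names are illustrative assumptions for this example, not the method to be developed in the project. (The 2-PL condition is presumably the Polyak-Lojasiewicz inequality with exponent 2, i.e. (1/2)||\nabla f(x)||^2 >= \mu (f(x) - f^*).)

```python
# Minimal illustrative sketch (NOT the project's target algorithm):
# batch-free stochastic proximal gradient descent for
#   min_x (1/2n) ||Ax - b||^2 + lam * ||x||_1.
# Each iteration uses a single sampled data point (batch size 1).
import numpy as np


def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)


def prox_sgd(A, b, lam=0.1, eta=0.01, iters=10_000, seed=0):
    """Batch-free prox-SGD with a constant step size (illustrative choice)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)                          # sample one data point
        g = (A[i] @ x - b[i]) * A[i]                 # stochastic gradient of the smooth part
        x = soft_threshold(x - eta * g, eta * lam)   # proximal (soft-threshold) step
    return x


if __name__ == "__main__":
    # Small synthetic instance to check that the iterates behave sensibly.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 20))
    x_true = np.zeros(20)
    x_true[:3] = [1.0, -2.0, 0.5]
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = prox_sgd(A, b)
    print("recovered support:", np.nonzero(np.abs(x_hat) > 0.1)[0])
```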

Requirements: Knowledge of the theory of (convex) optimization, Python programming skills (preferred), and experience with ML libraries such as NumPy and PyTorch. If interested, please send your CV and a transcript of your grades to mohammadsaeed.masiha@epfl.ch