An Accelerated Gradient Method for Convex Smooth Simple Bilevel Optimization

Jincheng Cao, Ruichen Jiang, Erfan Yazdandoost Hamedani, Aryan Mokhtari

Research output: Contribution to journal › Conference article › peer-review


Abstract

In this paper, we focus on simple bilevel optimization problems, where we minimize a convex smooth objective function over the optimal solution set of another convex smooth constrained optimization problem. We present a novel bilevel optimization method that locally approximates the solution set of the lower-level problem using a cutting plane approach and employs an accelerated gradient-based update to reduce the upper-level objective function over the approximated solution set. We measure the performance of our method in terms of suboptimality and infeasibility errors and provide non-asymptotic convergence guarantees for both error criteria. Specifically, when the feasible set is compact, we show that our method requires at most O(max{1/√ϵ_f, 1/ϵ_g}) iterations to find a solution that is ϵ_f-suboptimal and ϵ_g-infeasible. Moreover, under the additional assumption that the lower-level objective satisfies the r-th Hölderian error bound, we show that our method achieves an iteration complexity of Õ(max{ϵ_f^(−(2r−1)/(2r)), ϵ_g^(−(2r−1)/(2r))}), which matches the optimal complexity of single-level convex constrained optimization when r = 1.
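
For reference, the problem class described in the abstract is the standard simple bilevel formulation. The notation below (f for the upper-level objective, g for the lower-level objective, Z for a closed convex constraint set) is assumed standard usage rather than quoted from this page:

    \min_{x \in \mathbb{R}^d} \; f(x) \quad \text{subject to} \quad x \in \operatorname*{argmin}_{z \in Z} \; g(z)

To make the two ingredients named in the abstract concrete, here is a minimal Python sketch of a cutting-plane plus accelerated-gradient scheme. It is not the authors' exact method: it assumes Z = R^d, known smoothness constants L_f and L_g, and it generates the cutting plane from an auxiliary plain gradient-descent sequence on g, whereas the paper couples its sequences differently. All function and variable names are hypothetical.

    import numpy as np

    def project_halfspace(x, a, b):
        """Closed-form projection of x onto the halfspace {z : a^T z <= b}."""
        viol = a @ x - b
        sq = a @ a
        if viol <= 0.0 or sq == 0.0:
            return x  # already feasible, or the plane is degenerate
        return x - (viol / sq) * a

    def bilevel_agm_sketch(grad_f, grad_g, x0, L_f, L_g, iters=300):
        """Illustrative sketch, not the paper's exact update rule.
        z tracks the lower-level problem by gradient descent on g; the
        linearization of g at z defines a halfspace (cutting plane) that
        outer-approximates the lower-level solution set; x then takes a
        Nesterov-accelerated gradient step on f projected onto it."""
        x = x_prev = x0.copy()
        z = x0.copy()
        t_prev = 1.0
        for _ in range(iters):
            z = z - grad_g(z) / L_g   # lower-level descent on g
            a = grad_g(z)             # cutting plane: <a, x - z> <= 0
            b = a @ z
            t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2))
            y = x + ((t_prev - 1.0) / t) * (x - x_prev)  # momentum step
            x_prev, t_prev = x, t
            x = project_halfspace(y - grad_f(y) / L_f, a, b)
        return x

    # Toy usage (hypothetical): minimize ||x - c||^2 over argmin ||A x||^2,
    # whose solution set is {x : x_1 = 0}; the bilevel solution is (0, 3).
    A = np.array([[1.0, 0.0]])
    c = np.array([2.0, 3.0])
    x_star = bilevel_agm_sketch(
        grad_f=lambda x: 2.0 * (x - c),
        grad_g=lambda x: 2.0 * (A.T @ (A @ x)),
        x0=np.ones(2), L_f=2.0, L_g=4.0)  # L_g deliberately overestimated
    print(x_star)  # approximately [0. 3.]

Because g is convex, its linearization at z lower-bounds g everywhere, so the halfspace {x : g(z) + ⟨∇g(z), x − z⟩ ≤ g(z)} contains the whole sublevel set {x : g(x) ≤ g(z)} and hence the lower-level solution set; as z approaches a lower-level minimizer, the cutting plane tightens around that set, which is the intuition behind the construction.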

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
Volume: 37
State: Published - 2024
Event: 38th Conference on Neural Information Processing Systems, NeurIPS 2024 - Vancouver, Canada
Duration: Dec 9, 2024 to Dec 15, 2024

ASJC Scopus subject areas

  • Signal Processing
  • Information Systems
  • Computer Networks and Communications

