TY - JOUR
T1 - An Accelerated Gradient Method for Convex Smooth Simple Bilevel Optimization
AU - Cao, Jincheng
AU - Jiang, Ruichen
AU - Hamedani, Erfan Yazdandoost
AU - Mokhtari, Aryan
N1 - Publisher Copyright:
© 2024 Neural Information Processing Systems Foundation. All rights reserved.
PY - 2024
Y1 - 2024
N2 - In this paper, we focus on simple bilevel optimization problems, where we minimize a convex smooth objective function over the optimal solution set of another convex smooth constrained optimization problem. We present a novel bilevel optimization method that locally approximates the solution set of the lower-level problem using a cutting plane approach and employs an accelerated gradient-based update to reduce the upper-level objective function over the approximated solution set. We measure the performance of our method in terms of suboptimality and infeasibility errors and provide non-asymptotic convergence guarantees for both error criteria. Specifically, when the feasible set is compact, we show that our method requires at most O(max{1/√ϵ_f, 1/ϵ_g}) iterations to find a solution that is ϵ_f-suboptimal and ϵ_g-infeasible. Moreover, under the additional assumption that the lower-level objective satisfies the r-th Hölderian error bound, we show that our method achieves an iteration complexity of Õ(max{ϵ_f^(-(2r-1)/(2r)), ϵ_g^(-(2r-1)/(2r))}), which matches the optimal complexity of single-level convex constrained optimization when r = 1.
AB - In this paper, we focus on simple bilevel optimization problems, where we minimize a convex smooth objective function over the optimal solution set of another convex smooth constrained optimization problem. We present a novel bilevel optimization method that locally approximates the solution set of the lower-level problem using a cutting plane approach and employs an accelerated gradient-based update to reduce the upper-level objective function over the approximated solution set. We measure the performance of our method in terms of suboptimality and infeasibility errors and provide non-asymptotic convergence guarantees for both error criteria. Specifically, when the feasible set is compact, we show that our method requires at most O(max{1/√ϵ_f, 1/ϵ_g}) iterations to find a solution that is ϵ_f-suboptimal and ϵ_g-infeasible. Moreover, under the additional assumption that the lower-level objective satisfies the r-th Hölderian error bound, we show that our method achieves an iteration complexity of Õ(max{ϵ_f^(-(2r-1)/(2r)), ϵ_g^(-(2r-1)/(2r))}), which matches the optimal complexity of single-level convex constrained optimization when r = 1.
UR - https://www.scopus.com/pages/publications/105000510610
UR - https://www.scopus.com/inward/citedby.url?scp=105000510610&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:105000510610
SN - 1049-5258
VL - 37
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 38th Conference on Neural Information Processing Systems, NeurIPS 2024
Y2 - 9 December 2024 through 15 December 2024
ER -