Primal-dual incremental gradient method for nonsmooth and convex optimization problems

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, we consider a nonsmooth convex finite-sum problem with a conic constraint. To overcome the challenge of projecting onto the constraint set and computing the full (sub)gradient, we introduce a primal-dual incremental gradient scheme in which only one component function and two constraints are used in each primal-dual sub-iteration, processed in a cyclic order. We demonstrate an asymptotic sublinear rate of convergence in terms of suboptimality and infeasibility, which improves over state-of-the-art incremental gradient schemes in this setting. Numerical results suggest that the proposed scheme compares well with competitive methods.
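The abstract's idea of cyclically updating with one component function and two sampled constraints per sub-iteration can be illustrated with a minimal sketch. This is not the paper's algorithm: the instance (an ℓ1-type finite sum with linear inequality constraints as a stand-in for the conic constraint), the step-size rule, and the constraint-sampling pattern are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance: minimize sum_i |a_i @ x - b_i|
# subject to C @ x <= d (linear inequalities as a simple
# stand-in for the paper's conic constraint).
n, N, m = 5, 8, 4
A = rng.standard_normal((N, n))
b = rng.standard_normal(N)
C = rng.standard_normal((m, n))
d = rng.standard_normal(m) + 1.0

x = np.zeros(n)   # primal iterate
y = np.zeros(m)   # dual multipliers, kept nonnegative

for k in range(2000):                 # outer cycles
    eta = 1.0 / np.sqrt(k + 1)        # diminishing step size (assumed rule)
    for i in range(N):                # cyclic pass over component functions
        # subgradient of the single component |a_i @ x - b_i|
        g_f = np.sign(A[i] @ x - b[i]) * A[i]
        # sample two constraints for this sub-iteration (cyclic choice)
        j1, j2 = (2 * i) % m, (2 * i + 1) % m
        g_c = y[j1] * C[j1] + y[j2] * C[j2]
        # primal subgradient step on the partial Lagrangian
        x = x - eta * (g_f + g_c)
        # dual ascent on the two sampled constraints, projected onto R_+
        y[j1] = max(0.0, y[j1] + eta * (C[j1] @ x - d[j1]))
        y[j2] = max(0.0, y[j2] + eta * (C[j2] @ x - d[j2]))

obj = np.abs(A @ x - b).sum()                    # suboptimality proxy
infeas = np.maximum(C @ x - d, 0.0).max()        # infeasibility measure
```

Each sub-iteration touches only one summand and two constraint rows, so no full (sub)gradient or projection onto the whole constraint set is ever computed, matching the motivation stated in the abstract.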

Original language: English (US)
Pages (from-to): 2541-2554
Number of pages: 14
Journal: Optimization Letters
Volume: 15
Issue number: 8
DOIs: Yes
State: Published - Nov 2021

Keywords

  • Convex optimization
  • Incremental gradient
  • Nonsmooth optimization
  • Primal-dual method

ASJC Scopus subject areas

  • Control and Optimization
