Efficient active learning of sparse halfspaces

Research output: Contribution to journal › Conference article › peer-review


Abstract

We study the problem of efficient PAC active learning of homogeneous linear classifiers (halfspaces) in ℝ^d, where the goal is to learn a halfspace with low error using as few label queries as possible. Under the extra assumption that there is a t-sparse halfspace that performs well on the data (t ≪ d), we would like our active learning algorithm to be attribute efficient, i.e. to have label requirements sublinear in d. In this paper, we provide a computationally efficient algorithm that achieves this goal. Under certain distributional assumptions on the data, our algorithm achieves a label complexity of O(t · polylog(d, 1/ε)). In contrast, existing algorithms in this setting are either computationally inefficient, or subject to label requirements polynomial in d or 1/ε.
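To make the problem setting concrete, the following is a minimal illustrative sketch of pool-based active learning of a t-sparse halfspace, where labels are revealed only for the points the learner queries. It is not the paper's algorithm; the margin-based query rule, hard-thresholding update, and all names and parameter values (d, t, batch, rounds, lr) are assumptions chosen for illustration.

```python
# Illustrative sketch only (NOT the paper's algorithm): active learning of a
# t-sparse halfspace in R^d with explicit label queries.  Sparsity of the
# iterate is maintained by hard thresholding; all parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

d, t, n_pool = 500, 5, 5000          # ambient dimension, sparsity, unlabeled pool size
w_star = np.zeros(d)                 # hidden t-sparse target halfspace
w_star[rng.choice(d, size=t, replace=False)] = rng.standard_normal(t)
w_star /= np.linalg.norm(w_star)

X = rng.standard_normal((n_pool, d))     # unlabeled pool
true_labels = np.sign(X @ w_star)        # label oracle, consulted only on queries


def hard_threshold(w, k):
    """Keep the k largest-magnitude coordinates of w, zero out the rest."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-k:]
    out[idx] = w[idx]
    return out


# Active loop: query the pool points the current hypothesis is least certain
# about (smallest |<w, x>|), then take a sparsity-preserving update.
w = np.zeros(d)
queried = np.zeros(n_pool, dtype=bool)
labels_used = 0
batch, rounds, lr = 20, 25, 0.5

for _ in range(rounds):
    margins = np.abs(X @ w) if np.any(w) else rng.random(n_pool)
    margins[queried] = np.inf                # never re-query a point
    ask = np.argsort(margins)[:batch]        # most uncertain points
    queried[ask] = True
    labels_used += batch

    y = true_labels[ask]                     # label oracle call
    miss = y * (X[ask] @ w) < 1              # hinge-style margin violations
    if np.any(miss):
        grad = -(y[miss, None] * X[ask][miss]).mean(axis=0)
        w = hard_threshold(w - lr * grad, t)  # keep the iterate t-sparse

err = np.mean(np.sign(X @ w) != true_labels)
print(f"labels queried: {labels_used}, pool error of learned halfspace: {err:.3f}")
```

The point of the sketch is the label-query bookkeeping: only `labels_used` labels are ever revealed, and the learner's goal (as in the abstract) is to drive the error down while keeping this count sublinear in the ambient dimension d.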

Original language: English (US)
Pages (from-to): 1856-1880
Number of pages: 25
Journal: Proceedings of Machine Learning Research
Volume: 75
State: Published - 2018
Externally published: Yes
Event: 31st Annual Conference on Learning Theory, COLT 2018 - Stockholm, Sweden
Duration: Jul 6 2018 - Jul 9 2018

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
