Efficient active learning of sparse halfspaces

Abstract
We study the problem of efficient PAC active learning of homogeneous linear classifiers (halfspaces) in R^d, where the goal is to learn a halfspace with low error using as few label queries as possible. Under the additional assumption that there is a t-sparse halfspace that performs well on the data (t ≪ d), we would like the active learning algorithm to be attribute efficient, i.e., to have a label requirement sublinear in d. In this paper, we provide a computationally efficient algorithm that achieves this goal. Under certain distributional assumptions on the data, our algorithm achieves a label complexity of O(t · polylog(d, 1/ε)). In contrast, existing algorithms in this setting are either computationally inefficient or subject to label requirements polynomial in d or 1/ε.
| Original language | English (US) |
|---|---|
| Pages (from-to) | 1856-1880 |
| Number of pages | 25 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 75 |
| State | Published - 2018 |
| Externally published | Yes |
| Event | 31st Annual Conference on Learning Theory, COLT 2018, Stockholm, Sweden, Jul 6–9, 2018 |
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability