Learning using anti-training with sacrificial data

Michael L. Valenzuela, Jerzy W. Rozenblit

Research output: Contribution to journal › Article › peer-review


Abstract

Traditionally, the machine-learning community has viewed the No Free Lunch (NFL) theorems for search and optimization as a limitation. We review, analyze, and unify the NFL theorems with the perspectives of "blind" search and meta-learning to arrive at necessary conditions for improving black-box optimization. We survey the meta-learning literature to determine when and how meta-learning can benefit machine learning. We then generalize meta-learning in the context of the NFL theorems to obtain a novel technique called anti-training with sacrificial data (ATSD). Our technique applies at the meta level to produce domain-specific algorithms. We also show how to generate sacrificial data. An extensive case study, along with simulated annealing results, is presented to demonstrate the efficacy of the ATSD method.
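
The abstract only names the ATSD technique, so the sketch below is a hedged illustration of the underlying idea rather than the paper's formulation. It assumes a meta-level search that tunes the hyperparameters of a base optimizer (simulated annealing, which the case study uses) so that it performs well on a hypothetical set of representative problems and deliberately poorly on a hypothetical set of sacrificial problems; the names representative, sacrificial, the penalty weight alpha, and the grid of candidate hyperparameters are all assumptions introduced for illustration.

# Illustrative ATSD-style sketch (assumed names and objective, not the paper's notation).
import math
import random


def simulated_annealing(objective, x0, temp, cooling, steps, rng):
    # Basic simulated annealing on a 1-D objective; returns the best value found.
    x, fx = x0, objective(x0)
    best = fx
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 1.0)
        fcand = objective(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if fcand < fx or rng.random() < math.exp((fx - fcand) / max(temp, 1e-12)):
            x, fx = cand, fcand
            best = min(best, fx)
        temp *= cooling
    return best


def meta_objective(params, representative, sacrificial, alpha, rng):
    # ATSD-flavored meta objective: reward good performance on representative
    # problems and penalize good performance on sacrificial problems.
    temp, cooling = params
    perf_rep = sum(simulated_annealing(f, 0.0, temp, cooling, 200, rng)
                   for f in representative) / len(representative)
    perf_sac = sum(simulated_annealing(f, 0.0, temp, cooling, 200, rng)
                   for f in sacrificial) / len(sacrificial)
    # Lower is better on representative problems; the sign flip on perf_sac
    # prefers hyperparameters that do badly on the sacrificial data.
    return perf_rep - alpha * perf_sac


if __name__ == "__main__":
    rng = random.Random(0)
    # Hypothetical problem classes: the "real" domain is smooth quadratics,
    # while the sacrificial data are unrepresentative, highly oscillatory functions.
    representative = [lambda x, c=c: (x - c) ** 2 for c in (1.0, 2.0, 3.0)]
    sacrificial = [lambda x, k=k: math.sin(k * x) for k in (13.0, 29.0)]

    # Naive grid search at the meta level over simulated-annealing hyperparameters.
    candidates = [(t, c) for t in (0.5, 1.0, 5.0) for c in (0.90, 0.99)]
    best = min(candidates,
               key=lambda p: meta_objective(p, representative, sacrificial, 0.1, rng))
    print("selected SA hyperparameters (temperature, cooling):", best)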

Original language: English (US)
Journal: Journal of Machine Learning Research
Volume: 17
State: Published - Apr 1 2016

Keywords

  • Anti-training
  • Machine learning
  • Meta optimization
  • No Free Lunch
  • Optimization
  • Sacrificial data

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability
