A Mixture of Delta-Rules Approximation to Bayesian Inference in Change-Point Problems

Robert C. Wilson, Matthew R. Nassar, Joshua I. Gold

Research output: Contribution to journal › Article › peer-review

69 Scopus citations

Abstract

Error-driven learning rules have received considerable attention because of their close relationships to both optimal theory and neurobiological mechanisms. However, basic forms of these rules are effective under only a restricted set of conditions in which the environment is stable. Recent studies have defined optimal solutions to learning problems in more general, potentially unstable, environments, but the relevance of these complex mathematical solutions to how the brain solves these problems remains unclear. Here, we show that one such Bayesian solution can be approximated by a computationally straightforward mixture of simple error-driven 'Delta' rules. This simpler model can make effective inferences in a dynamic environment and matches human performance on a predictive-inference task using a mixture of a small number of Delta rules. This model represents an important conceptual advance in our understanding of how the brain can use relatively simple computations to make nearly optimal inferences in a dynamic world.
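
For readers who want a concrete picture of the approach, the sketch below illustrates the general idea in Python: a small set of delta rules, each updating its own estimate with a fixed learning rate, is combined through mixing weights that favor whichever rule is currently predicting well. This is only a minimal illustration under assumed settings; the learning rates, noise level, hazard rate, and the likelihood-based reweighting used here are hypothetical simplifications, not the weighting scheme derived in the paper.

```python
import numpy as np

# Hypothetical settings for illustration; the paper derives its node
# learning rates and mixing weights from an approximate Bayesian posterior,
# whereas these values are hand-picked.
LEARNING_RATES = np.array([0.8, 0.3, 0.05])   # fast, medium, slow delta rules
OBS_NOISE_SD = 1.0                             # assumed observation noise
HAZARD = 0.05                                  # assumed change-point hazard rate


def mixture_of_delta_rules(observations):
    """Track a piecewise-constant signal with a small mixture of delta rules."""
    n_nodes = len(LEARNING_RATES)
    estimates = np.zeros(n_nodes)              # each node's running estimate
    weights = np.ones(n_nodes) / n_nodes       # mixing weights over nodes
    predictions = []

    for x in observations:
        # Combined prediction: weighted average of the nodes' estimates.
        predictions.append(weights @ estimates)

        # Each node applies a simple delta rule with its own learning rate.
        errors = x - estimates
        estimates += LEARNING_RATES * errors

        # Reweight nodes by how well they predicted the new observation
        # (Gaussian likelihood); the hazard term keeps fast-learning nodes
        # available so the mixture can adapt quickly after a change point.
        lik = np.exp(-0.5 * (errors / OBS_NOISE_SD) ** 2)
        weights = (1 - HAZARD) * weights * lik + HAZARD / n_nodes
        weights /= weights.sum()

    return np.array(predictions)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Piecewise-constant means with abrupt change points plus noise.
    means = np.concatenate([np.full(50, 0.0), np.full(50, 5.0), np.full(50, -2.0)])
    obs = means + rng.normal(0, OBS_NOISE_SD, size=means.size)
    preds = mixture_of_delta_rules(obs)
    print("mean absolute error:", np.abs(preds - means).mean())
```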

Original language: English (US)
Article number: e1003150
Journal: PLoS Computational Biology
Volume: 9
Issue number: 7
DOIs
State: Published - Jul 2013
Externally published: Yes

ASJC Scopus subject areas

  • Ecology, Evolution, Behavior and Systematics
  • Modeling and Simulation
  • Ecology
  • Molecular Biology
  • Genetics
  • Cellular and Molecular Neuroscience
  • Computational Theory and Mathematics
