TY  - CPAPER

T1 - An infinite hidden Markov model with similarity-biased transitions

AU - Dawson, Colin Reimer

AU - Huang, Chaofan

AU - Morrison, Clayton T.

N1 - Publisher Copyright:
© 2017 by the author(s).

PY - 2017

Y1 - 2017

AB - We describe a generalization of the Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) which is able to encode prior information that state transitions are more likely between "nearby" states. This is accomplished by defining a similarity function on the state space and scaling transition probabilities by pair-wise similarities, thereby inducing correlations among the transition distributions. We present an augmented data representation of the model as a Markov Jump Process in which: (1) some jump attempts fail, and (2) the probability of success is proportional to the similarity between the source and destination states. This augmentation restores conditional conjugacy and admits a simple Gibbs sampler. We evaluate the model and inference method on a speaker diarization task and a "harmonic parsing" task using four-part chorale data, as well as on several synthetic datasets, achieving favorable comparisons to existing models.

UR - http://www.scopus.com/inward/record.url?scp=85048392821&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85048392821&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85048392821

T3 - 34th International Conference on Machine Learning, ICML 2017

SP - 1560

EP - 1576

BT - 34th International Conference on Machine Learning, ICML 2017

PB - International Machine Learning Society (IMLS)

T2 - 34th International Conference on Machine Learning, ICML 2017

Y2 - 6 August 2017 through 11 August 2017

ER -