Full Bayesian Inference for Hidden Markov Models
Hidden Markov Models ("HMMs") are a widely used class of models for sequentially ordered data. They posit a "hidden" state which evolves according to a Markov process (typically, a discrete-state discrete-time Markov chain); the observed data are drawn independently (but not identically) from a distribution which depends only on this hidden state. This combination of Markovian dynamics with a possibly complex set of conditional distributions for the observations provides a modeling "sweet spot" -- a class of models which are extremely flexible but still computationally and statistically tractable.
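To make this structure concrete, the short base-R sketch below simulates data from a toy two-state Gaussian HMM; the specific transition matrix, emission parameters, and series length are illustrative choices only, not values taken from any existing package.

```r
## Minimal simulation of the structure described above (base R only):
## a hidden two-state Markov chain with state-dependent Gaussian emissions.
set.seed(1)

n_obs <- 200
Gamma <- matrix(c(0.95, 0.05,    # row i gives Pr(next state | current state = i)
                  0.10, 0.90),
                nrow = 2, byrow = TRUE)
mu    <- c(-1, 2)                # state-dependent emission means
sigma <- c(0.5, 1.0)             # state-dependent emission standard deviations

states <- integer(n_obs)
obs    <- numeric(n_obs)

states[1] <- 1                   # arbitrary initial state, for illustration
obs[1]    <- rnorm(1, mu[states[1]], sigma[states[1]])

for (t in 2:n_obs) {
  ## The hidden state evolves as a discrete-time Markov chain ...
  states[t] <- sample(1:2, size = 1, prob = Gamma[states[t - 1], ])
  ## ... and each observation depends only on the current hidden state.
  obs[t] <- rnorm(1, mu[states[t]], sigma[states[t]])
}
```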
Owing to this "sweet spot," a large number of R packages for fitting HMMs have been developed by the community [1], almost all of which take a classical (frequentist) approach to parameter estimation. In this student-proposed project, we will develop a package that allows (full) Bayesian inference for a wide range of Hidden Markov Models. This project builds on the 2017 GSoC project "Bayesian Hierarchical Hidden Markov Models Applied to Financial Time Series" [2] and a related conference publication [3].
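As a rough illustration of the computation any Bayesian treatment must carry out, the sketch below evaluates an HMM's marginal log-likelihood (hidden states summed out) via the standard forward algorithm on the log scale. It reuses the `obs`, `Gamma`, `mu`, and `sigma` objects from the simulation sketch above, assumes a uniform initial state distribution, and does not reflect the interface of the proposed package.

```r
## Forward algorithm on the log scale: the likelihood with the hidden states
## marginalized out, which is the quantity a Bayesian sampler would target
## (together with priors on Gamma, mu, and sigma).
## `delta` is the initial state distribution (uniform here, by assumption).
hmm_loglik <- function(obs, Gamma, mu, sigma,
                       delta = rep(1 / length(mu), length(mu))) {
  log_alpha <- log(delta) + dnorm(obs[1], mu, sigma, log = TRUE)
  for (t in 2:length(obs)) {
    m <- max(log_alpha)
    ## log-sum-exp recursion: sum over the previous hidden state, then
    ## multiply in the emission density of the current observation.
    log_alpha <- m + log(colSums(exp(log_alpha - m) * Gamma)) +
      dnorm(obs[t], mu, sigma, log = TRUE)
  }
  m <- max(log_alpha)
  m + log(sum(exp(log_alpha - m)))
}

hmm_loglik(obs, Gamma, mu, sigma)   # marginal log-likelihood of the simulated series
```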
See the associated student application for details.
Michael Weylandt ([email protected])
Department of Statistics, Rice University
Since this is a student-proposed project, there are no formal tests. Acceptance of the project will be based solely on the student's application. Students interested in pursuing a similar project are encouraged to submit applications as well.
[1] For example, the HMM, depmixS4, and moveHMM packages.
[2] https://luisdamiano.github.io/gsoc17/
[3] L. Damiano, B. Peterson, and M. Weylandt. "A Tutorial on Hidden Markov Models Using Stan." StanCon 2018 (Asilomar, CA). Available at https://github.com/stan-dev/stancon_talks/tree/master/2018/Contributed-Talks/04_damiano