

Analyticity, Convergence, and Convergence Rate of Recursive Maximum-Likelihood Estimation in Hidden Markov Models

by Vladislav B. Tadic

This paper considers the asymptotic properties of the recursive maximum-likelihood estimator for hidden Markov models. It focuses on the analytic properties of the asymptotic log-likelihood and on the point convergence and convergence rate of the recursive maximum-likelihood estimator. Using the principle of analytic continuation, the analyticity of the asymptotic log-likelihood is shown for analytically parameterized hidden Markov models. Relying on this fact and on results from differential geometry (the Lojasiewicz inequality), the almost sure point convergence of the recursive maximum-likelihood algorithm is demonstrated, and relatively tight bounds on the convergence rate are derived. In contrast to existing results on the asymptotic behavior of maximum-likelihood estimation in hidden Markov models, the results of this paper are obtained without assuming that the log-likelihood function has an isolated maximum at which the Hessian is strictly negative definite.
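To illustrate the kind of algorithm the paper analyzes, here is a minimal sketch of recursive (online) maximum-likelihood estimation for a toy two-state hidden Markov model with Bernoulli emissions. The model, the parameter names, the known transition matrix, and the constant step size are all illustrative assumptions introduced here, not taken from the paper (which treats the general algorithm and its convergence properties). The update ascends the incremental log-likelihood at each observation, propagating a tangent filter (the derivative of the prediction filter with respect to the parameter) alongside the usual filtering recursion.

```python
import random

# Illustrative sketch (not the paper's notation): a two-state HMM with
# Bernoulli emissions. The transition matrix P and the state-0 emission
# probability Q0 are assumed known; only the state-1 emission
# probability `theta` is estimated online.

P = [[0.9, 0.1], [0.1, 0.9]]   # P[j][i] = prob of moving from state j to i
Q0 = 0.2                        # state-0 emission probability (assumed known)

def emission(theta, y):
    """Per-state likelihoods g_theta(y | x) and their theta-derivatives."""
    g0 = Q0 if y == 1 else 1.0 - Q0
    g1 = theta if y == 1 else 1.0 - theta
    dg1 = 1.0 if y == 1 else -1.0   # d g1 / d theta
    return [g0, g1], [0.0, dg1]

def recursive_mle(ys, theta=0.5, step=0.02):
    """One pass over the observations, updating theta by stochastic
    gradient ascent on the incremental log-likelihood log c_t, with a
    tangent filter w = d(filter)/d(theta) carrying the dependence of
    the filter on past parameter values."""
    pi = [0.5, 0.5]      # filtered state distribution
    w = [0.0, 0.0]       # tangent filter: d pi / d theta
    for y in ys:
        # prediction step for the filter and its derivative
        p = [P[0][i] * pi[0] + P[1][i] * pi[1] for i in range(2)]
        dp = [P[0][i] * w[0] + P[1][i] * w[1] for i in range(2)]
        g, dg = emission(theta, y)
        u = [p[i] * g[i] for i in range(2)]
        du = [dp[i] * g[i] + p[i] * dg[i] for i in range(2)]
        c, dc = sum(u), sum(du)
        score = dc / c                               # d log c_t / d theta
        pi = [u[i] / c for i in range(2)]            # filter update
        w = [(du[i] - pi[i] * dc) / c for i in range(2)]
        theta = min(0.99, max(0.01, theta + step * score))
    return theta

# Simulate a chain with true state-1 emission probability 0.8 and run
# the recursion; the estimate should settle near the true value.
random.seed(0)
x, ys = 0, []
for _ in range(20000):
    x = x if random.random() < 0.9 else 1 - x
    ys.append(1 if random.random() < (0.8 if x == 1 else Q0) else 0)
theta_hat = recursive_mle(ys)
```

The constant step size used here is for simplicity only; the paper's convergence and convergence-rate results concern decreasing step sizes, under which the iterates converge almost surely rather than merely fluctuating around a stationary point.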

Keywords: Analyticity, convergence rate, hidden Markov models, Lojasiewicz inequality, maximum-likelihood estimation, point convergence, recursive identification.

Full text of the paper (pdf), published in IEEE Transactions on Information Theory, Vol. 56, No. 12, December 2010.