An upper bound for the Kullback-Leibler divergence for left-to-right transient Hidden Markov Models
Abstract
This paper reports an upper bound for the Kullback-Leibler divergence (UBKLD) in the context of comparing left-to-right hidden Markov models (HMMs) with final non-emitting states. We start by defining the general notion of transient hidden Markov models and by relating this definition to the well-known transient/recurrent characterization of the models' underlying Markov chains. Additionally, an alternative derivation of the upper bound for the Kullback-Leibler divergence (KLD) for Gaussian mixture models (GMMs) is presented, which offers a new interpretation of the nature of this bound that is particularly relevant for generalizing it to HMMs. This upper bound formulation extends naturally to HMMs with non-emitting states, where under some additional assumptions the UBKLD is proved to be well defined for a general family of transient models. In particular …
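The abstract does not reproduce the bound itself. As a point of reference, the standard matched-pair upper bound for GMMs, on which this line of work builds, states that for two mixtures f = Σᵢ πᵢ fᵢ and g = Σᵢ ωᵢ gᵢ with equally many, pairwise-matched Gaussian components, KL(f‖g) ≤ KL(π‖ω) + Σᵢ πᵢ KL(fᵢ‖gᵢ). The sketch below is a minimal illustration of that bound, not the paper's exact formulation; the function names and the fixed component matching are assumptions for this example.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N(mu0, cov0) || N(mu1, cov1))."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(cov1_inv @ cov0)          # tr(cov1^{-1} cov0)
        + diff @ cov1_inv @ diff           # Mahalanobis term
        - d                                # dimension offset
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    )

def ub_kld_gmm(pi, means_f, covs_f, omega, means_g, covs_g):
    """Matched-pair upper bound on KL(f || g) for two GMMs with the
    same number of components, matched index-by-index:
        KL(f||g) <= KL(pi||omega) + sum_i pi_i * KL(f_i||g_i)."""
    kl_weights = np.sum(pi * np.log(pi / omega))
    kl_components = sum(
        w * kl_gaussian(mf, cf, mg, cg)
        for w, mf, cf, mg, cg in zip(pi, means_f, covs_f, means_g, covs_g)
    )
    return kl_weights + kl_components

if __name__ == "__main__":
    # Two toy 2-component GMMs in R^2.
    pi = np.array([0.6, 0.4])
    omega = np.array([0.5, 0.5])
    means_f = [np.zeros(2), np.ones(2)]
    covs_f = [np.eye(2), np.eye(2)]
    means_g = [np.full(2, 0.1), np.full(2, 0.9)]
    covs_g = [np.eye(2), np.eye(2)]
    print(ub_kld_gmm(pi, means_f, covs_f, omega, means_g, covs_g))
```

The bound is tight when the matched components coincide, and unlike the exact GMM divergence it has a closed form; extending a bound of this type to HMMs with non-emitting states is the subject of the paper.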
- Authors: Jorge Silva, Shrikanth Narayanan
- Journal: IEEE Transactions on Information Theory