Hidden Markov Models (HMM)

The Hidden Markov Model (HMM) is a statistical framework in which an unobserved Markov process generates a sequence of indirect observations. Its defining characteristics are:

  • Latent or Hidden Process (X)
    This unobserved process satisfies the Markov property: the current state depends only on the immediately preceding state.
  • Observable Process (Y)
    Alongside it runs an observable process (Y), whose outcomes are generated by the hidden states of X. Only Y's outcomes can actually be seen.
  • Conditional Dependence and Independence
    A pivotal assumption in HMMs is that each observation at time t, namely Y(t), depends only on the hidden state at the same moment, X(t). Given the current hidden state, earlier states and observations have no influence on the current observation.

    Using an HMM therefore requires estimating its parameters: the probabilities of transitioning between hidden states and the emission probabilities that connect those states to observable outcomes.
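The assumptions above can be sketched in code. This is a minimal illustration, not a library API: the state names, observation symbols, and probability values are all made up for the example.

```python
# A minimal sketch of an HMM's parameters and of the factorization implied
# by the Markov and conditional-independence assumptions.
# State/observation names and all probability values are illustrative.
initial = {"s1": 0.6, "s2": 0.4}          # P(X1)
transition = {                            # P(X(t) | X(t-1)) — Markov property
    "s1": {"s1": 0.7, "s2": 0.3},
    "s2": {"s1": 0.4, "s2": 0.6},
}
emission = {                              # P(Y(t) | X(t))
    "s1": {"a": 0.9, "b": 0.1},
    "s2": {"a": 0.2, "b": 0.8},
}

def joint_probability(hidden, observed):
    """P(X = hidden, Y = observed) = P(X1) * prod P(Xt | Xt-1) * prod P(Yt | Xt)."""
    p = initial[hidden[0]]
    for prev, cur in zip(hidden, hidden[1:]):
        p *= transition[prev][cur]   # current state depends only on its predecessor
    for x, y in zip(hidden, observed):
        p *= emission[x][y]          # each observation depends only on its own state
    return p
```

The two loops mirror the two assumptions: one factor per state transition, one per observation.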

Hidden Markov Models, thus, operate with internal states that elude direct observation. These hidden states are what the model seeks to uncover, predict, or categorize.

These hidden states represent the system's underlying conditions.

For instance, consider these states as representing daily weather conditions, like "sunny" or "rainy," which we can't directly observe.

Observations, by contrast, are data points we can actually see or measure, and they are statistically linked to the hidden states we want to uncover.

Taking the weather analogy further, observations could be activities of individuals observed outdoors, such as "walking briskly," "shopping," "using an umbrella," and so on, each hinting at the underlying weather condition.

The transition probabilities in HMMs articulate how hidden states evolve over time.

For example, they give the probability that a sunny day is followed by another sunny day, or by a rainy day.

The emission probabilities form the crucial link between hidden states and observations, indicating the likelihood of witnessing specific activities under certain weather conditions.
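Continuing the weather analogy, the two sets of probabilities can be written out as simple tables. The numeric values below are purely illustrative assumptions, not taken from the text:

```python
# Hidden states and observable activities from the weather analogy.
# All probability values are illustrative assumptions.
states = ["sunny", "rainy"]
activities = ["walking", "shopping", "umbrella"]

# Transition probabilities: P(tomorrow's weather | today's weather).
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

# Emission probabilities: P(observed activity | weather).
emission = {
    "sunny": {"walking": 0.6, "shopping": 0.3, "umbrella": 0.1},
    "rainy": {"walking": 0.1, "shopping": 0.3, "umbrella": 0.6},
}
```

Each row is a probability distribution, so it must sum to 1: whatever today's weather is, tomorrow is some weather, and some activity is observed.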

At its core, an HMM aims to use observations to deduce the hidden states, turning observable actions into insights about unseen conditions.

If, for example, someone is observed walking leisurely, and it's known that this behavior typically occurs in pleasant weather, one might infer that the weather is likely sunny.
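That inference is just Bayes' rule applied to the emission probabilities. A sketch with made-up numbers (the prior and the probabilities of walking are assumptions for illustration):

```python
# Infer the weather from a single observed activity via Bayes' rule.
# Prior and emission values are illustrative assumptions.
prior = {"sunny": 0.5, "rainy": 0.5}        # P(weather) before observing anything
p_walking = {"sunny": 0.6, "rainy": 0.1}    # P(walking | weather)

# Posterior P(weather | walking) is proportional to P(walking | weather) * P(weather).
unnorm = {w: p_walking[w] * prior[w] for w in prior}
total = sum(unnorm.values())
posterior = {w: unnorm[w] / total for w in unnorm}

print(posterior["sunny"])  # ≈ 0.857: observing walking makes "sunny" far likelier
```

Because walking is six times more likely in sunny weather than in rainy weather, the posterior shifts strongly toward "sunny" even from a 50/50 prior.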

Example

A classic example of an HMM is the "urn problem" in its discrete form.

Picture a room filled with urns (x1, x2, x3), each containing balls marked with specific labels (y1, y2, y3).

[Figure: example of an HMM — urns x1, x2, x3 and balls y1, y2, y3]

Imagine a genie randomly selecting an urn and drawing a ball, which is then placed on a conveyor belt. Here, an observer can only see the balls but not from which urn they were drawn.

The genie's urn selection is based solely on the previously chosen urn, forming a Markov process with set transition probabilities (t12, t21, t23).

While the observer can see the sequence of balls, the urn selection remains hidden. Although the urns' compositions are known, pinpointing the exact origin of each observed ball (y1, y2, y3) remains uncertain. However, the observer can estimate the probability of each ball originating from a specific urn (x1, x2, x3).
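The observer's estimate can be computed with the forward (filtering) recursion: after each ball, weight every urn by how likely it was to emit that ball, normalize, then propagate through the transition probabilities. Since the text gives no numeric values, the transition and urn-composition probabilities below are illustrative assumptions:

```python
# Forward (filtering) recursion for the urn problem: after each observed
# ball, compute P(current urn | balls seen so far).
# All numeric probabilities here are illustrative assumptions.
urns = ["x1", "x2", "x3"]

# P(next urn | current urn): the genie's Markov selection rule.
transition = {
    "x1": {"x1": 0.3, "x2": 0.5, "x3": 0.2},
    "x2": {"x1": 0.4, "x2": 0.2, "x3": 0.4},
    "x3": {"x1": 0.1, "x2": 0.6, "x3": 0.3},
}

# P(ball label | urn): the known composition of each urn.
emission = {
    "x1": {"y1": 0.6, "y2": 0.3, "y3": 0.1},
    "x2": {"y1": 0.2, "y2": 0.5, "y3": 0.3},
    "x3": {"y1": 0.1, "y2": 0.2, "y3": 0.7},
}

initial = {"x1": 1 / 3, "x2": 1 / 3, "x3": 1 / 3}  # uniform starting belief

def filter_urns(balls):
    """Return P(current urn | balls observed so far) after each draw."""
    belief = dict(initial)
    history = []
    for ball in balls:
        # Update: weight each urn by how likely it was to emit this ball.
        unnorm = {u: belief[u] * emission[u][ball] for u in urns}
        total = sum(unnorm.values())
        belief = {u: unnorm[u] / total for u in urns}
        history.append(belief)
        # Predict: move the belief forward through the transition probabilities.
        belief = {u: sum(belief[v] * transition[v][u] for v in urns)
                  for u in urns}
    return history

beliefs = filter_urns(["y1", "y3", "y2"])
```

With these numbers, a first ball labeled y1 immediately makes urn x1 the most probable origin, since x1 contains far more y1 balls than the others.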
