Markov model

What is a Markov model?

A Markov model is a stochastic model of a randomly changing system that possesses the Markov property. This means that, at any given time, the next state depends only on the current state and is independent of anything in the past. Two commonly applied types of Markov model are used when the system being represented is autonomous -- that is, when the system isn't influenced by an external agent. These are as follows:

  1. Markov chains. These are the simplest type of Markov model and are used to represent systems where all states are observable. Markov chains show all possible states and, between states, the transition rate -- the probability of moving from one state to another per unit of time. Applications of this type of model include prediction of market crashes, speech recognition and search engine algorithms. (A short code sketch of a simple chain appears after this list.)
  2. Hidden Markov models. These are used to represent systems with some unobservable states. In addition to showing states and transition rates, hidden Markov models also represent observations and observation likelihoods for each state. Hidden Markov models are used for a range of applications, including thermodynamics, finance and pattern recognition.
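
As a concrete illustration of a Markov chain, the short sketch below simulates a two-state chain in which the next state is drawn using only the current state. The "sunny"/"rainy" states and their transition probabilities are hypothetical values chosen purely for illustration, not taken from the definitions above. A hidden Markov model would add, for each state, a set of possible observations and their likelihoods.

```python
import random

# Hypothetical two-state chain: each inner dict gives the probability of
# moving from the current state to each possible next state (rows sum to 1).
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(current):
    """Draw the next state using only the current state (the Markov property)."""
    next_states = list(transition[current])
    weights = list(transition[current].values())
    return random.choices(next_states, weights=weights)[0]

state = "sunny"
history = [state]
for _ in range(10):
    state = step(state)
    history.append(state)

print(" -> ".join(history))
```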

Another two commonly applied types of Markov model are used when the system being represented is controlled -- that is, when the system is influenced by a decision-making agent. These are as follows:

  1. Markov decision processes. These are used to model decision-making in discrete, stochastic, sequential environments. In these processes, an agent makes decisions based on reliable information about the current state. These models are applied to problems in artificial intelligence (AI), economics and behavioral sciences. (A minimal worked sketch appears after this list.)
  2. Partially observable Markov decision processes. These are used in cases like Markov decision processes but with the assumption that the agent doesn't always have reliable information. Applications of these models include robotics, where the robot's exact location isn't always known. Another application is machine maintenance, where reliable information on the condition of machine parts can't be obtained because shutting down the machine to inspect them is too costly.
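
To make the decision-process idea concrete, the sketch below encodes a tiny, entirely hypothetical Markov decision process -- its states, actions, rewards and probabilities are invented for illustration -- and estimates the value of each state with a few rounds of value iteration, one standard way of solving such models.

```python
# Hypothetical two-state, two-action MDP: transitions[s][a] is a list of
# (probability, next_state, reward) triples describing what can happen
# when action a is taken in state s.
transitions = {
    "low": {
        "wait":   [(1.0, "low", 0.0)],
        "invest": [(0.7, "high", 1.0), (0.3, "low", -1.0)],
    },
    "high": {
        "wait":   [(0.9, "high", 2.0), (0.1, "low", 0.0)],
        "invest": [(1.0, "high", 1.5)],
    },
}

gamma = 0.9                          # discount factor on future reward
values = {s: 0.0 for s in transitions}

# Value iteration: repeatedly back up the best expected discounted return.
for _ in range(50):
    values = {
        s: max(
            sum(p * (r + gamma * values[s2]) for p, s2, r in outcomes)
            for outcomes in actions.values()
        )
        for s, actions in transitions.items()
    }

print(values)   # estimated long-run value of starting in each state
```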

How is Markov analysis applied?

Markov analysis is a probabilistic technique that uses Markov models to predict the future behavior of some variable based on the current state. Markov analysis is used in many domains, including the following:

  • Markov chains are used for several business applications, including predicting customer brand switching for marketing, predicting how long people will remain in their jobs for human resources, predicting time to failure of a machine in manufacturing, and forecasting the future price of a stock in finance.
  • Markov analysis is also used in natural language processing (NLP) and in machine learning. For NLP, a Markov chain can be used to generate a sequence of words that forms a complete sentence, or a hidden Markov model can be used for named-entity recognition and tagging parts of speech. For machine learning, Markov decision processes are used to model states, actions and rewards in reinforcement learning. (A toy text-generation sketch appears after this list.)
  • A recent example of the use of Markov analysis in healthcare was in Kuwait. A continuous-time Markov chain model was used to determine the optimal timing and duration of a full COVID-19 lockdown in the country, minimizing both new infections and hospitalizations. The model suggested that a 90-day lockdown beginning 10 days before the epidemic peak was optimal.
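
As a rough illustration of the NLP use mentioned above, the sketch below builds a first-order, word-level Markov chain from a toy corpus and samples a short word sequence. The corpus and starting word are invented purely for illustration.

```python
import random
from collections import defaultdict

# Toy corpus, invented for illustration.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Record which words follow each word; repeats act as weights when sampling.
followers = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    followers[current_word].append(next_word)

word = "the"
sentence = [word]
for _ in range(8):
    if word not in followers:       # no observed successor: stop early
        break
    word = random.choice(followers[word])
    sentence.append(word)

print(" ".join(sentence))
```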

How are Markov models represented?

The simplest Markov model is a Markov chain, which can be expressed in equations, as a transition matrix or as a graph. A transition matrix is used to indicate the probability of moving from each state to each other state. Generally, the current states are listed in rows, and the next states are represented as columns. Each cell then contains the probability of moving from the current state to the next state. For any given row, all the cell values must then add up to one.
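
The row constraint is easy to check in code. The sketch below builds a small transition matrix with hypothetical values (not taken from the article's examples), verifies that every row sums to one and then advances a distribution over states by one step.

```python
import numpy as np

# Rows = current state, columns = next state (hypothetical 3-state chain).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.0, 0.4, 0.6],
])

# Every row of a valid transition matrix must sum to one.
assert np.allclose(P.sum(axis=1), 1.0)

# Propagate a distribution over states one step: next = current @ P.
current = np.array([1.0, 0.0, 0.0])   # start in state 0 with certainty
print(current @ P)                    # distribution after one transition
```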

A graph consists of circles, each of which represents a state, and directional arrows to indicate possible transitions between states. The directional arrows are labeled with the transition probability. The transition probabilities on the directional arrows coming out of any given circle must add up to one.

Other Markov models are based on the chain representations but with added information, such as observations and observation likelihoods.

The transition matrix below represents shifting gears in a car with a manual transmission. Six states are possible, and a transition from any given state to any other state depends only on the current state -- that is, where the car goes from second gear isn't influenced by where it was before second gear. Such a transition matrix might be built from empirical observations that show, for example, that the most probable transitions from first gear are to second or neutral.

[Image: This transition matrix represents shifting gears in a car with a manual transmission and the six possible states.]

The image below represents the toss of a coin. Two states are possible: heads and tails. The transition from heads to heads or heads to tails is equally probable (.5) and is independent of all preceding coin tosses.

[Image: Coin-toss Markov chain. The circles represent the two possible states -- heads or tails -- and the arrows show the possible states the system could transition to in the next step. The number .5 represents the probability of that transition occurring.]
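
In matrix form, the coin-toss chain above is a 2x2 transition matrix in which every entry is .5. The brief sketch below writes it down and shows that the next-state distribution is the same regardless of the current state, which is another way of saying each toss is independent of the preceding ones.

```python
import numpy as np

# Coin-toss chain: rows and columns are ordered [heads, tails].
# Every transition probability is .5, as in the figure above.
P = np.array([
    [0.5, 0.5],
    [0.5, 0.5],
])

assert np.allclose(P.sum(axis=1), 1.0)  # each row sums to one

# Because both rows are identical, the distribution after one step is the
# same whether the chain is currently in heads or in tails.
print(np.array([1.0, 0.0]) @ P)   # starting from heads -> [0.5 0.5]
print(np.array([0.0, 1.0]) @ P)   # starting from tails -> [0.5 0.5]
```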

History of the Markov chain

Markov chains are named after their creator, Andrey Andreyevich Markov, a Russian mathematician who founded a new branch of probability theory around stochastic processes in the early 1900s. Markov was greatly influenced by his teacher and mentor, Pafnuty Chebyshev, whose work also broke new ground in probability theory.
