
Questions tagged [markov-process]

A stochastic process with the property that the future is conditionally independent of the past, given the present.

0 votes · 0 answers · 10 views

HMMs "difficulty" compared to a Markov model

Given an HMM, it is easy to compute the best approximating $n$-gram model over the observations. For example, for $N=1$, we have $p(w_i|w_{i-1}) = \sum_{s_i,s_{i-1}}p(w_i,s_i|w_{i-1},s_{i-1})=\sum_{...
asked by user1767774
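
A minimal sketch of such a marginalization for the question above, assuming a stationary HMM; the parameter names `A`, `B`, `pi` and the toy values are illustrative, not taken from the question:

```python
import numpy as np

# Hypothetical HMM parameters (illustrative, not from the question):
# A[i, j] = P(s_t = j | s_{t-1} = i)   -- hidden-state transitions
# B[i, w] = P(w_t = w | s_t = i)       -- emissions
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],
              [0.1, 0.9]])

# Stationary distribution of A (left eigenvector for eigenvalue 1),
# assuming the chain is observed at stationarity.
evals, evecs = np.linalg.eig(A.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

def bigram_approx(A, B, pi):
    """P(w_t = b | w_{t-1} = a) obtained by marginalizing out the hidden states."""
    # joint P(w_{t-1} = a, w_t = b) = sum_{i,j} pi[i] B[i,a] A[i,j] B[j,b]
    joint = np.einsum('i,ia,ij,jb->ab', pi, B, A, B)
    return joint / joint.sum(axis=1, keepdims=True)

print(bigram_approx(A, B, pi))
```
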
0 votes · 0 answers · 19 views

Estimating Markov transition matrix using total elevator "ups" and "downs" by floor

I have data on elevator presses and I am hoping to use them to estimate a Markov transition matrix, so I can ultimately estimate how frequently people go to different floors. For each floor from 1-4, ...
asked by Jon Spring
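
The excerpt is cut off, so here is only the baseline estimator it would build on: with full origin-to-destination counts, the maximum-likelihood transition matrix is the row-normalized count matrix. A minimal sketch with made-up counts for floors 1-4 (mapping aggregate up/down totals onto such counts is the harder part the question is about):

```python
import numpy as np

# Hypothetical origin -> destination press counts for floors 1-4 (illustrative).
counts = np.array([
    [ 0, 12,  5,  3],   # from floor 1
    [ 8,  0,  9,  2],   # from floor 2
    [ 4,  7,  0,  6],   # from floor 3
    [ 2,  3,  5,  0],   # from floor 4
], dtype=float)

# Maximum-likelihood estimate of the transition matrix: normalize each row to sum to 1.
row_totals = counts.sum(axis=1, keepdims=True)
P_hat = np.divide(counts, row_totals, out=np.zeros_like(counts), where=row_totals > 0)
print(P_hat)
```
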
1 vote · 0 answers · 12 views

Estimating Markov Chain Probabilities with Limited Data

Suppose I have some data on transitions between states of a Discrete Time Markov Chain. Let's say that transitions between some states are observed more frequently than others. For example, in a 3 ...
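
For the sparse-count situation this question describes, one common remedy is a Dirichlet (add-α) prior on each row of the transition matrix. A minimal sketch with made-up counts for a 3-state chain; the value of α is an assumption:

```python
import numpy as np

# Hypothetical transition counts; some transitions are rarely or never observed.
counts = np.array([
    [40,  2,  0],
    [ 1,  5,  1],
    [ 0,  0,  3],
], dtype=float)

alpha = 1.0   # Dirichlet / add-one smoothing parameter (illustrative choice)
P_smoothed = (counts + alpha) / (counts + alpha).sum(axis=1, keepdims=True)
print(P_smoothed)   # every transition now has a nonzero estimated probability
```
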
9 votes · 2 answers · 518 views

Markov Chains with Changing Number of States

I have seen these kinds of Discrete State Markov Chains before (Continuous Time or Discrete Time): Homogeneous (Probability Transition Matrix is constant) Non-Homogeneous (Probability Transition ...
asked by udon762
2 votes · 0 answers · 45 views

How to tune the unadjusted Langevin algorithm?

I want to start investigating the (unadjusted) simulation of the Langevin process $${\rm d}X_t=b(X_t){\rm d}t+\sigma{\rm d}W_t,$$ where $$b:=\frac{\sigma^2}2\nabla\ln p.$$ I don't want to simulate ...
asked by 0xbadf00d
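
A minimal Euler-Maruyama discretization of that SDE (the unadjusted Langevin algorithm), sketched for a standard Gaussian target where $\nabla\ln p(x)=-x$; the target and the step size `h` are illustrative assumptions, and `h` is the main quantity one would tune:

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(x):
    # illustrative target: standard Gaussian, so grad log p(x) = -x
    return -x

sigma = 1.0       # diffusion coefficient from the SDE
h = 0.1           # step size: the main tuning parameter of ULA
n_steps = 10_000

x = np.zeros(2)
samples = np.empty((n_steps, 2))
for t in range(n_steps):
    drift = 0.5 * sigma**2 * grad_log_p(x)     # b(x) = (sigma^2 / 2) * grad log p(x)
    x = x + h * drift + sigma * np.sqrt(h) * rng.standard_normal(2)
    samples[t] = x

print(samples.mean(axis=0), samples.var(axis=0))   # discretization bias shrinks as h -> 0
```
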
0 votes · 0 answers · 31 views

Why is the reverse diffusion process not a Gaussian distribution?

The forward diffusion process, which goes from $x_t$ to $x_{t+1}$, is Gaussian, which is reasonable since we reach the next state by adding random Gaussian noise. However, I do not understand why the ...
asked by levitatmas
0 votes · 0 answers · 50 views

Sum of powers (geometric series) of state transition matrix

I am working on discrete-time Markov chain analysis for a large state transition graph. I want to find the reward/cost to go from the initial state to the terminal/accepting states. I have the state ...
asked by JackDaniels
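
When the transient (non-terminal) block $Q$ of the transition matrix has spectral radius below one, the geometric series $\sum_{k\ge 0}Q^k$ equals $(I-Q)^{-1}$, so the expected cost to reach the terminal states comes from a single linear solve instead of summing powers. A minimal sketch with an illustrative 3-transient-state chain and per-step costs:

```python
import numpy as np

# Q: transition probabilities among transient states only (illustrative values);
# the missing row mass leads to the terminal/accepting states.
Q = np.array([
    [0.5, 0.3, 0.0],
    [0.2, 0.4, 0.3],
    [0.0, 0.1, 0.6],
])
c = np.array([1.0, 2.0, 0.5])    # per-step cost in each transient state (illustrative)

# sum_{k>=0} Q^k = (I - Q)^{-1}, so the expected total cost x solves (I - Q) x = c.
expected_cost = np.linalg.solve(np.eye(len(c)) - Q, c)
print(expected_cost)             # expected cost before absorption, one entry per start state
```
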
0 votes · 0 answers · 4 views

Which variable is best suited for edge weights when computing graph algorithms instead of relative risks?

I am currently trying to develop graph data. Which variable is best suited for edge weights when computing graph algorithms? Relative Risk: Many networks in my field use relative risks ...
asked by user1190107
0 votes · 0 answers · 12 views

Forward-Backward Algorithm for Autoregressive HMMs

I am currently studying HMMs, and covered the Forward-Backward Algorithms as well as the smoothing and filtering process. Recently, we were posed a question on Autoregressive HMMs which I've been ...
asked by Kai
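
A minimal scaled forward pass for an autoregressive HMM, under the assumption (since the excerpt is cut off) that the emission at time $t$ depends on both the hidden state and the previous observation; all parameter values are illustrative:

```python
import numpy as np
from scipy.stats import norm

# Illustrative AR(1)-HMM: y_t | s_t = k, y_{t-1}  ~  Normal(a[k] * y_{t-1} + b[k], sd[k])
A   = np.array([[0.95, 0.05],
                [0.10, 0.90]])      # hidden-state transition matrix
pi0 = np.array([0.5, 0.5])          # initial state distribution
a   = np.array([0.9, -0.3])         # per-state AR coefficients (assumed model)
b   = np.array([0.0,  1.0])
sd  = np.array([0.5,  1.0])

def forward_loglik(y):
    """Scaled forward algorithm; the emission at time t conditions on y[t-1]."""
    # t = 0: no previous observation, so use an unconditional emission (an assumption)
    alpha = pi0 * norm.pdf(y[0], loc=b, scale=sd)
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, len(y)):
        emis = norm.pdf(y[t], loc=a * y[t - 1] + b, scale=sd)   # autoregressive emission
        alpha = (alpha @ A) * emis
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

y = np.array([0.1, 0.3, 0.2, 1.4, 1.1, 0.9])
print(forward_loglik(y))
```
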
0 votes · 1 answer · 83 views

msm package: Multi-state model initial value in 'vmmin' is not finite

I am new to the msm package and Markov models. I have a randomized trial dataset with readings from three time points: baseline, 1 year, and 2 years. I am trying to calculate annual transition ...
asked by spri0330
3 votes · 1 answer · 143 views

Can MCMC sample any probability distribution?

I have three fundamental questions related to MCMC. I would appreciate help with any one of them. The most fundamental question in the MCMC field, for which I can't find a reference, is: Can MCMC generate ...
asked by George Lu
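
In practice MCMC only needs the target density up to a normalizing constant. A minimal random-walk Metropolis sketch for an illustrative unnormalized 1-D bimodal target (the proposal scale and the target itself are assumptions made for the example):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # unnormalized log-density of an illustrative bimodal target
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

step = 1.0               # proposal standard deviation (a tuning choice)
x = 0.0
samples = np.empty(50_000)
for i in range(len(samples)):
    prop = x + step * rng.standard_normal()
    # accept with probability min(1, target(prop) / target(x))
    if np.log(rng.random()) < log_target(prop) - log_target(x):
        x = prop
    samples[i] = x

print(samples.mean(), samples.std())   # the chain visits both modes of the target
```
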
2 votes · 1 answer · 44 views

Show that the total variation distance of the Metropolis kernel to its proposal kernel is equal to the rejection probability

Furthermore, let $(E,\mathcal E,\lambda)$ be a $\sigma$-finite measure space; $Q$ be a Markov kernel on $(E,\mathcal E)$ with density $q$ with respect to $\lambda$; $\mu$ be a probability measure on $...
asked by 0xbadf00d
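
A sketch of the identity the title states, in the standard construction (the notation here is the usual textbook one and may differ from the question's): the Metropolis kernel built from the proposal $Q$ with acceptance probability $\alpha$ is
$$P(x,A)=\int_A\alpha(x,y)\,Q(x,\mathrm dy)+r(x)\,\delta_x(A),\qquad r(x)=\int\bigl(1-\alpha(x,y)\bigr)\,Q(x,\mathrm dy),$$
where $r(x)$ is the rejection probability. If $Q(x,\{x\})=0$, the signed measure $P(x,\cdot)-Q(x,\cdot)$ has positive part $r(x)\,\delta_x$ and negative part $(1-\alpha(x,\cdot))\,Q(x,\cdot)$, each of total mass $r(x)$, hence $\sup_{A\in\mathcal E}\lvert P(x,A)-Q(x,A)\rvert=r(x)$.
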
0 votes · 0 answers · 16 views

How can we compare the "performance" of different Markov chain Monte Carlo algorithms?

How can we judge the performance of a Markov chain Monte Carlo (MCMC) algorithm? I guess we could consider one of the following: The variance of $X_t$ for a given $t\in I$; The asymptotic variance of $(...
asked by 0xbadf00d
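
One common yardstick is the effective sample size implied by the chain's autocorrelations (an assumption about what "performance" should mean here). A minimal sketch that truncates the autocorrelation sum at the first non-positive lag:

```python
import numpy as np

def effective_sample_size(chain):
    """Rough ESS estimate: n / (1 + 2 * sum of positive-lag autocorrelations)."""
    x = np.asarray(chain, dtype=float)
    x = x - x.mean()
    n = len(x)
    acov = np.correlate(x, x, mode='full')[n - 1:] / n   # autocovariances, lags 0..n-1
    rho = acov / acov[0]
    tau = 1.0
    for k in range(1, n):
        if rho[k] <= 0:          # truncate at the first non-positive autocorrelation
            break
        tau += 2.0 * rho[k]
    return n / tau

# usage: contrast an i.i.d. sequence with a strongly autocorrelated chain
rng = np.random.default_rng(0)
iid = rng.standard_normal(5000)
ar = np.empty(5000)
ar[0] = 0.0
for t in range(1, len(ar)):
    ar[t] = 0.9 * ar[t - 1] + rng.standard_normal()
print(effective_sample_size(iid), effective_sample_size(ar))
```
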
0 votes · 0 answers · 9 views

What is the effect of sampling rate on parameter estimation when fitting a Markov state model to time series data?

Let us say that I have some time series data, which can be described by a Markov state model, and the time series has been sampled every $\Delta t$ time units. The sampling rate ($1/\Delta t$) must ...
asked by ace_101
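
One way to see the dependence on $\Delta t$: for a continuous-time chain with generator $Q$, the transition matrix observed at sampling interval $\Delta t$ is $T(\Delta t)=e^{Q\Delta t}$, so coarse sampling washes out fast transitions. A minimal sketch with an illustrative 3-state rate matrix:

```python
import numpy as np
from scipy.linalg import expm

# Illustrative generator (rate) matrix of a 3-state continuous-time Markov process;
# each row sums to zero.
Q = np.array([
    [-1.0,  0.7,  0.3],
    [ 0.4, -0.9,  0.5],
    [ 0.1,  0.6, -0.7],
])

for dt in (0.1, 1.0, 10.0):
    T = expm(Q * dt)      # transition matrix at sampling interval dt
    print(f"dt = {dt}:", np.round(T, 3))
# As dt grows, every row of T(dt) approaches the stationary distribution, so the
# rate parameters become hard to recover from transitions sampled that coarsely.
```
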
0 votes · 0 answers · 11 views

How to test Markovian property in a financial time series?

I want to build a Markov Chain model for a financial time series to determine transition probabilities from one state to another. The underlying assumption is that the series obeys the Markov property. ...
asked by Sane
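
A standard check is a likelihood-ratio test of a first-order chain against a second-order chain fitted to the discretized series. A minimal sketch; the 3-state discretization and the simulated data stand in for binned returns and are purely illustrative:

```python
import numpy as np
from scipy.stats import chi2
from collections import defaultdict

rng = np.random.default_rng(0)
x = rng.integers(0, 3, size=2000)    # stand-in for a discretized (binned) return series

def loglik_markov(x, order):
    """ML log-likelihood of an order-`order` Markov chain, plus a rough parameter count."""
    counts = defaultdict(lambda: defaultdict(int))
    for t in range(order, len(x)):
        counts[tuple(x[t - order:t])][x[t]] += 1
    loglik, n_params = 0.0, 0
    for context, nxt in counts.items():
        total = sum(nxt.values())
        n_params += len(nxt) - 1
        for c in nxt.values():
            loglik += c * np.log(c / total)
    return loglik, n_params

# Drop the first point for the order-1 fit so both models score the same transitions.
ll1, k1 = loglik_markov(x[1:], 1)
ll2, k2 = loglik_markov(x, 2)
lr = 2 * (ll2 - ll1)                 # likelihood-ratio statistic
p_value = chi2.sf(lr, df=k2 - k1)
print(lr, p_value)   # a large p-value gives no evidence against the first-order (Markov) model
```
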
