I am currently simulating a peer-to-peer overlay for video streaming over HTTP. When a peer sends an HTTP message to another peer, I calculate the delivery delay for that message.
Currently I am calculating delay as: delay = transmission delay + propagation delay
where transmission delay = message size / transmission rate
and propagation delay = distance / propagation speed
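For concreteness, this is a sketch of my current calculation; all the concrete values (message size, link rate, distance) are placeholders I made up for illustration:

```python
# Sketch of the current per-message delay model (placeholder values).
MESSAGE_SIZE_BITS = 8 * 1_000_000    # 1 MB HTTP message
TRANSMISSION_RATE_BPS = 10_000_000   # 10 Mbit/s link
DISTANCE_M = 1_000_000               # 1000 km between the two peers
PROPAGATION_SPEED_MPS = 2e8          # roughly 2/3 c in fibre/copper

transmission_delay = MESSAGE_SIZE_BITS / TRANSMISSION_RATE_BPS  # seconds
propagation_delay = DISTANCE_M / PROPAGATION_SPEED_MPS          # seconds
delay = transmission_delay + propagation_delay
print(delay)  # 0.805
```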
Now, apart from transmission and propagation delay, I would also like to consider packet jitter such that: total delay = transmission delay + propagation delay + jitter
I am stuck on finding a realistic way to generate jitter values. Ideally I would like to derive the jitter from the transmission and propagation delays I already compute.
I have found that packet jitter can be approximated with a log-normal distribution, but I am not sure which mean and standard deviation to use. I have also found measurement data suggesting that the interquartile range of jitter values typically falls between 0 and 20 ms.
Therefore I was thinking of using a log-normal distribution with a mean of 1 and a standard deviation of 1 to generate values in the 0-20 ms range. However, I invented those parameter values myself, so I doubt this produces realistic jitter. Does anyone know of a better way?
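One idea I have tried, sketched below, is to work backwards from the quartiles instead of guessing the parameters: for a log-normal distribution, ln(X) is normal(mu, sigma), so the p-th quantile is exp(mu + sigma * z_p), and two target quartiles pin down mu and sigma exactly. The quartile values here are my own assumptions (the lower one cannot literally be 0, since ln(0) is undefined, so I picked 1 ms):

```python
import math
import numpy as np

# Assumed target quartiles for jitter, in milliseconds. These are
# placeholders loosely based on the 0-20 ms interquartile range I
# found in measurement data; the lower quartile must be > 0.
Q25_MS = 1.0
Q75_MS = 20.0

# For log-normal X, ln(X) ~ Normal(mu, sigma), so the p-th quantile
# of X is exp(mu + sigma * z_p). With z_0.75 = -z_0.25 ~= 0.6745,
# solving the two quartile equations gives:
Z75 = 0.6744897501960817
sigma = math.log(Q75_MS / Q25_MS) / (2 * Z75)
mu = (math.log(Q25_MS) + math.log(Q75_MS)) / 2

rng = np.random.default_rng(42)
jitter_ms = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

# Sanity check: empirical quartiles should sit near the targets.
print(np.percentile(jitter_ms, [25, 75]))
```

The jitter is always positive (log-normal support), and half of the sampled values land inside the chosen interquartile range, with a heavy upper tail beyond 20 ms, which seems plausible for occasional delay spikes. I would still like to know whether there is a more principled way to pick the quartiles themselves.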