nanog mailing list archives

Re: Calculating Jitter


From: Eric Frazier <eric () dmcontact com>
Date: Fri, 10 Jun 2005 10:07:06 -0700


At 09:56 AM 6/10/2005, Fred Baker wrote:

You saw Marshall's comment. If you're interested in a moving average, he's pretty close.
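(Marshall's formula isn't quoted here; for reference, the common moving-average jitter estimator, in the style of RFC 3550, looks roughly like the Python sketch below. The sample deltas are made up, and this is the generic approach, not necessarily Marshall's exact formula.)

    # Moving-average jitter estimate in the RFC 3550 style: the running
    # estimate moves 1/16 of the way toward each new inter-arrival
    # deviation.  Illustrative sketch only.
    def update_jitter(jitter, d):
        """jitter: current estimate; d: latest transit-time difference."""
        return jitter + (abs(d) - jitter) / 16.0

    jitter = 0.0
    transit_deltas = [2.1, -1.4, 0.3, 5.0, -2.2]   # made-up samples, in ms
    for d in transit_deltas:
        jitter = update_jitter(jitter, d)
    print("running jitter estimate: %.3f ms" % jitter)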

If I understood your question, though, you simply wanted to quantify the jitter in a set of samples. I should think there are two obvious definitions there.

A statistician would look, I should think, at the variance of the set. Reaching for my CRC book of standard math formulae and tables, I find it defines the variance as the square of the standard deviation of the set, which is to say
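presumably the standard definition: for N delay samples x_1 ... x_N with mean xbar,

    variance = sigma^2 = (1/N) * SUM over i of (x_i - xbar)^2

that is, the mean squared deviation from the mean; the standard deviation sigma is its square root.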

That is one thing I have never understood: if you can pretty much just look at a standard dev, see that it is high, and, yeah, that means your numbers are flopping all over the place, then what good is the square of it? Does it just make graphing better in some way?
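A quick Python sketch with made-up delay samples, just to show the two side by side (the variance really is nothing more than the standard deviation squared):

    # Made-up one-way delay samples (ms).  The variance is literally the
    # standard deviation squared, so both describe the same spread.
    import statistics

    delays = [40.1, 42.3, 39.8, 55.0, 41.2, 60.4]   # hypothetical samples

    stdev = statistics.pstdev(delays)        # population standard deviation
    variance = statistics.pvariance(delays)  # population variance

    print("stdev    = %.3f ms" % stdev)
    print("variance = %.3f ms^2 (stdev squared: %.3f)" % (variance, stdev ** 2))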

Thanks,

Eric
