"The terms "jitter" and "delay variance" are often used synonymously. The best definition of the jitter is the amount of variation in the end-to-end packet transit time. For applications such as audio and video transmission, it does not matter much if the packets take 20 ms or 30 msec to be delivered, as long as the transit time is constant. Having some packets taking 20 msec and others taking 30 msec will give an uneven quality to sound or image." The most current measuring system is explained in the following paragraph:

A common mechanism for bounding jitter is explained in the following paragraph:

"The jitter can be bounded by computing the expected transit time for each hop along the path. When a packet arrives at a router, the router checks to see how much the packet is behind or ahead of its schedule. This information is stored in the packet and updated at each hop. If the packet is ahead of schedule, the router holds it just long enough to get it back on schedule; if it is behind schedule, the router tries to get it out the door quickly. In fact, the algorithm for determining which of several packets competing for an output line should go next can always choose the packet furthest behind in its schedule. In this way, packets that are ahead of schedule get slowed down and packets that are behind schedule get sped up, in both cases reducing the amount of jitter. This method does not always work, because only a limited number of late packets can be held in the input buffer of the router. After that, packets are lost, which introduces another problem."
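As a sketch of that scheduling rule (the class and parameter names, and the clock values, are assumptions for illustration, not part of the tip), an output port can keep competing packets in a priority queue ordered by how far behind schedule each one is:

    import heapq

    class OutputPort:
        """Always transmits the packet furthest behind its schedule."""

        def __init__(self):
            self._heap = []   # max-heap on lag, via negated key
            self._seq = 0     # tie-breaker so payloads are never compared

        def enqueue(self, expected_ms, actual_ms, payload):
            lag = actual_ms - expected_ms   # positive = behind schedule
            heapq.heappush(self._heap, (-lag, self._seq, payload))
            self._seq += 1

        def dequeue(self):
            # Packets ahead of schedule wait; packets behind schedule
            # catch up, so jitter shrinks at every hop.
            _, _, payload = heapq.heappop(self._heap)
            return payload

    port = OutputPort()
    port.enqueue(expected_ms=100.0, actual_ms=95.0, payload="early packet")
    port.enqueue(expected_ms=100.0, actual_ms=108.0, payload="late packet")
    print(port.dequeue())   # -> "late packet" goes out first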

"The best way is to implement a traffic-smoothing algorithm, which is less rigid. In the other words, by introducing an insignificant jitter, we can prevent packet loss. The magnitude of the insignificant jitter may change from one application to another one. So, the parameters (e.g. amount of input buffer, delay variation, ingress speed, engress speed) must be defined before."

"P.S: The "Token Bucket' algorithm is the closest algorithm to the subject I have explained here."
-Submitted by Steven G. Savas, Principal Staff Engineer at Motorola
