Hi,
This question has been killing me for some time now. It has been asked a few times in the past, but I could not find a definitive answer. Could someone shed some light on it?
How do I convert the jitter value present in an RTCP packet to milliseconds? I think it must be straightforward to calculate. Is there a formula to derive it, provided I know all the details of the codec used?
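From what I have read so far, RFC 3550 expresses the interarrival jitter in the report block in RTP timestamp units, so I am assuming the conversion is just a division by the codec's RTP clock rate. Here is a minimal sketch of what I have in mind (the function name and sample values are my own, and I may well be wrong):

#include <stdio.h>
#include <stdint.h>

/* Assumption: the jitter field from an RTCP receiver report is in
 * RTP timestamp units (per RFC 3550), so dividing by the codec's
 * RTP clock rate and scaling by 1000 should give milliseconds. */
static double jitter_to_ms(uint32_t jitter_ts_units, uint32_t clock_rate_hz)
{
    return (double)jitter_ts_units * 1000.0 / (double)clock_rate_hz;
}

int main(void)
{
    uint32_t jitter = 160;       /* hypothetical value from a report block */
    uint32_t clock_rate = 8000;  /* e.g. G.711 uses an 8 kHz timestamp clock */
    printf("jitter = %.2f ms\n", jitter_to_ms(jitter, clock_rate)); /* 20.00 ms */
    return 0;
}

Is this the right way to do it, or does the conversion depend on more than the clock rate?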
With Regards, Stephen Regan.