Hello, I've been measuring the time interval between two consecutive TCP packets, that is, the gap between the end of the previous packet and the beginning of the next one. It is always 12 ms. That seems like a lot, and I was wondering whether anyone has an explanation for it, or knows how I could minimize it. Thanks, Apu
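For reference, here is a minimal sketch of how I compute that gap. It assumes each captured packet is a (timestamp_seconds, length_bytes) pair, and that the link rate is known so the serialization time of the previous packet can be subtracted; the packet list and the link rate value are just illustrative.

```python
def inter_packet_gaps(packets, link_bps):
    """Return the idle gap (seconds) between the end of each packet
    and the start of the next, given (timestamp_s, length_bytes) pairs
    and a link rate in bits per second."""
    gaps = []
    for (t_prev, len_prev), (t_next, _) in zip(packets, packets[1:]):
        # End of the previous packet = its start time plus serialization delay.
        end_prev = t_prev + len_prev * 8 / link_bps
        gaps.append(t_next - end_prev)
    return gaps

# Example: two packets on a hypothetical 100 Mbit/s link.
packets = [(0.0, 1250), (0.0121, 100)]
print(inter_packet_gaps(packets, 100_000_000))
```

On this made-up trace, the 1250-byte packet takes 0.1 ms to serialize, so the reported gap is 12 ms, the figure I keep seeing.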