
Error in Error Model

The two-state continuous-time error model (ErrorModel/TwoState) produces
wrong results (even once you get it to run at all by adjusting the OTcl
declarations). I checked this as follows:

(1) Measure the throughput of, say, TCP/Reno on a link with no losses.

(2) Create a two-state continuous-time random loss model, and set the good-
and bad-state durations to be very large compared to the RTT (e.g., RTT *
10^4). For simplicity, set the average good- and bad-state durations to be
equal. (A sketch of this setup follows the list.)

(3) Now measure the throughput. It should be essentially half the lossless
throughput: with equal average state durations, the connection spends half
its time in the good state running at full rate and half in the bad state
losing every packet, and since each state lasts ~10^4 RTTs the recovery
cost at state transitions is negligible. However, ns reports a throughput
that is much lower, which I believe is a definite indication of a wrong
implementation.
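
For concreteness, here is roughly the script fragment I mean for steps (1)
and (2). I am writing this from memory, so treat the TwoState constructor
arguments and the lossmodel call as assumptions and check
tcl/lib/ns-errmodel.tcl in your tree (those declarations are exactly what I
had to adjust to get it to run):

    set ns [new Simulator]
    set n0 [$ns node]
    set n1 [$ns node]
    $ns duplex-link $n0 $n1 1Mb 50ms DropTail   ;# RTT ~ 0.1 s

    set rtt 0.1
    set dur [expr $rtt * 1e4]                   ;# state duration >> RTT

    set rvGood [new RandomVariable/Exponential]
    $rvGood set avg_ $dur                       ;# mean good (error-free) period
    set rvBad  [new RandomVariable/Exponential]
    $rvBad set avg_ $dur                        ;# equal mean bad (lossy) period

    # Assumed constructor: two duration RandomVariables plus a unit.
    set em [new ErrorModel/TwoState $rvGood $rvBad time]
    $ns lossmodel $em $n0 $n1                   ;# drop packets on n0->n1

For the baseline in step (1), run the same script with the lossmodel line
removed.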

P.S. The simple random loss model (i.e., the one-state ErrorModel) seems to
be OK; the setup I used for it is sketched below.
P.S.2: I'm using ns-2.1b4a.
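
For reference, this is the kind of one-state setup that behaves correctly
for me (rate_, unit, and ranvar are the standard ErrorModel knobs, but
again, check your version):

    set em1 [new ErrorModel]
    $em1 unit pkt                               ;# per-packet drop decisions
    $em1 set rate_ 0.01                         ;# 1% uniform random loss
    $em1 ranvar [new RandomVariable/Uniform]
    $ns lossmodel $em1 $n0 $n1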

Best regards,

--Hussein.
------------------------------
EE Dept.,
University of Washington, Seattle.