
[ns] problem with errormodel



Hi all,
  I have one TCP connection over a wireless channel, and I want to simulate
changing channel conditions. For now at least, I want to keep the channel
changes predictable. I tried to simulate this in the following way:

1> Associate an ErrorModel object with the mobile nodes. I added a function
to ns-mobilenode.tcl, which I essentially copied from ns-sat.tcl:
%%%
Node/MobileNode instproc interface-errormodel {em} {
    puts "in interface-errormodel"
    $self instvar mac_ ll_
    # splice the error model into the receive path:
    # incoming packets now go MAC -> error model -> LL
    $mac_(0) up-target $em
    $em target $ll_(0)
}

I then call this function from my script after I've created the nodes:
%%%
for {set i 0} {$i < $val(nn)} {incr i} {
    set node_($i) [$ns_ node]

    #bneeraj
    set em_($i) [new ErrorModel]
    $em_($i) unit pkt
    $em_($i) set rate_ 0.05
    $em_($i) ranvar [new RandomVariable/Uniform]
    $node_($i) interface-errormodel $em_($i)
    #eneeraj

    # disable random motion
    $node_($i) random-motion 0
}

2> node_(0) has the app and node_(1) has the sink (the setup is sketched
after the next code block). In my script I have the following lines (say) to
change the value of the error model's 'rate_' parameter:
%%%
$ns_ at $epoch1 "$em_(1) set rate_ 0.9"
$ns_ at $epoch2 "$em_(1) set rate_ 0.2"
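
(The app/sink setup itself is just the usual agent attachment; roughly the
following, where FTP as the application is only an example and not what
matters here:)
%%%
# one TCP connection from node_(0) to node_(1), driven by an FTP source
set tcp [new Agent/TCP]
$ns_ attach-agent $node_(0) $tcp
set sink [new Agent/TCPSink]
$ns_ attach-agent $node_(1) $sink
$ns_ connect $tcp $sink
set ftp [new Application/FTP]
$ftp attach-agent $tcp
$ns_ at 0.5 "$ftp start"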

The problem is that I don't see rate_ changes made in the middle of the
simulation having any effect! If I set rate_ to a very high value at the
beginning of the simulation ($em_(1) set rate_ 0.99), I see a high incidence
of retransmissions, as expected. But if I use the '$ns_ at ...' mechanism to
change the rate at any other time, the rate set at the beginning still seems
to be in effect.
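
(A sanity check I can do on the Tcl side is to schedule a puts that prints
the value the OTcl object reports in the middle of the run; the time here is
arbitrary, and the brackets are escaped so the lookup happens at simulation
time rather than when the script is parsed:)
%%%
# print the bound rate_ variable mid-run
$ns_ at 30.0 "puts \"rate_ at 30s: \[$em_(1) set rate_\]\""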

I also tried adding a command 'newrate' to the command set of the ErrorModel
class to set the rate explicitly. I verified with printf statements that the
'corrupt' function is indeed being called on all my packets and that the
rate_ parameter is changed to the new value, but I still see an incidence of
retransmissions corresponding to the rate set at the beginning of the
simulation.
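
(For reference, the scheduled calls for this variant look like the
following; 'newrate' is the command I added myself, and the times and values
are again just placeholders:)
%%%
$ns_ at $epoch1 "$em_(1) newrate 0.9"
$ns_ at $epoch2 "$em_(1) newrate 0.2"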

Ideally I'd like to associate a time-varying error model with the channel
itself rather than with individual node interfaces, but I couldn't figure
out how to do that. I'd appreciate any help/suggestions with this problem.

Thanks,
Neeraj