Question
Suppose that, when a TCP segment is sent more than once, we take SampleRTT to be the time between the original transmission and the ACK, as in Figure 5.10(a). Show that if a connection with a 1-packet window loses every other packet (i.e., each packet is transmitted twice), then EstimatedRTT increases to infinity. Assume TimeOut = EstimatedRTT; both algorithms presented in the text always set TimeOut even larger. (Hint: EstimatedRTT = EstimatedRTT + β × (SampleRTT − EstimatedRTT).)
[Figure 5.10: Associating the ACK with (a) the original transmission versus (b) the retransmission.]
Explanation / Answer
Let the true RTT (for a successful transmission) be 1.0 time unit. By hypothesis, every packet times out once and its retransmission is acknowledged 1.0 unit later. Since SampleRTT is measured from the original transmission, each SampleRTT measurement is TimeOut + 1 = EstimatedRTT + 1.
We then have

EstimatedRTT = α × EstimatedRTT + (1 − α) × SampleRTT
             = EstimatedRTT + β × (SampleRTT − EstimatedRTT), where β = 1 − α
             = EstimatedRTT + β,

since SampleRTT − EstimatedRTT = 1.
Because each measurement increases EstimatedRTT by β > 0, the Nth EstimatedRTT is at least 1 + N × β (for example, with β = 0.1 every doubly transmitted packet adds 0.1). Hence EstimatedRTT increases to infinity.
Without the assumption TimeOut = EstimatedRTT, we still have TimeOut ≥ EstimatedRTT, hence SampleRTT ≥ EstimatedRTT + 1, and the above argument still applies.
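For intuition, here is a minimal Python sketch of the argument (not part of the textbook solution; the function name, the choice β = 0.1, and the 20-round horizon are illustrative assumptions). Because SampleRTT always exceeds the current EstimatedRTT by exactly 1.0, each update adds β, so the printed values climb linearly and never converge.

```python
# Sketch: EstimatedRTT when every packet is sent twice and SampleRTT is
# measured from the ORIGINAL transmission, as in Figure 5.10(a).

def simulate(rounds: int = 20, beta: float = 0.1, initial_rtt: float = 1.0) -> None:
    """Print EstimatedRTT after each doubly transmitted packet."""
    estimated_rtt = initial_rtt              # start at the true RTT of 1.0 unit
    for n in range(1, rounds + 1):
        timeout = estimated_rtt              # exercise's assumption: TimeOut = EstimatedRTT
        sample_rtt = timeout + 1.0           # original send -> timeout -> retransmit -> ACK 1.0 unit later
        # EWMA update from the hint: EstimatedRTT += beta * (SampleRTT - EstimatedRTT)
        estimated_rtt += beta * (sample_rtt - estimated_rtt)
        print(f"packet {n:2d}: EstimatedRTT = {estimated_rtt:.2f}")

if __name__ == "__main__":
    simulate()
```

Running the sketch shows EstimatedRTT advancing by exactly β per packet (1.10, 1.20, 1.30, ...), matching the derivation above.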