End-to-End Packet Delay and Loss Behavior in the Internet
- State the problem the paper is trying to solve.
The main problem the paper is trying to solve is: "How can one measure and analyze end-to-end packet delay and loss behavior in the Internet?"
- State the main contribution of the paper: solving a new problem, proposing a
new algorithm, or presenting a new evaluation (analysis). If a new problem, why
was the problem important? Is the problem still important today? Will the
problem be important tomorrow? If a new algorithm or new
evaluation (analysis), what are the improvements over previous algorithms or
evaluations? How do they come up with the new algorithm or evaluation?
The main contribution of the paper is the idea of measuring the round-trip delays of small UDP probe packets sent at regular time intervals in order to analyze the Internet's end-to-end packet delay and loss behavior. The author observed interesting phenomena such as compression of the probe packets and rapid fluctuations of queueing delays over small intervals. He showed that the UDP echo mechanism is useful not only for diagnosing network problems but also for analyzing the end-to-end characteristics of Internet connections over different time scales. The problem is important because it marks an early step in analyzing the behavior of load on the Internet; the UDP probe packets play a role similar to that of ICMP echo (ping) probes. An understanding of packet delay and loss behavior can be applied to the proper design of network algorithms such as routing and flow control algorithms, to the dimensioning of buffers and link capacity, and to the choice of parameters in simulation and analytic studies. This understanding is also essential for designing the emerging audio and video applications for the Internet; for this last reason especially, the problem remains relevant today.
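To make the probing technique concrete, the following is a minimal sketch, not the paper's actual NetDyn tool: a sender emits small UDP packets at a fixed interval toward an echo server and records round-trip times, counting unanswered probes as lost. The echo address, interval, probe count, and probe size are illustrative assumptions.

    import select
    import socket
    import struct
    import time

    # Hypothetical echo endpoint and parameters; the paper's setup differed.
    ECHO_ADDR = ("echo.example.org", 7007)
    DELTA = 0.050        # interval between send times of successive probes (s)
    NUM_PROBES = 1000
    PROBE_SIZE = 32      # small probes, as in the paper

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setblocking(False)

    send_times = {}      # sequence number -> send timestamp (source clock)
    rtts = {}            # sequence number -> round-trip time (s)

    def drain(until):
        """Collect any echoes that arrive before time `until`."""
        while True:
            timeout = max(0.0, until - time.monotonic())
            readable, _, _ = select.select([sock], [], [], timeout)
            if not readable:
                return
            data, _ = sock.recvfrom(2048)
            seq = struct.unpack("!I", data[:4])[0]
            rtts[seq] = time.monotonic() - send_times[seq]

    start = time.monotonic()
    for seq in range(NUM_PROBES):
        # Pad the sequence number out to the probe size.
        payload = struct.pack("!I", seq).ljust(PROBE_SIZE, b"\x00")
        send_times[seq] = time.monotonic()
        sock.sendto(payload, ECHO_ADDR)
        drain(start + (seq + 1) * DELTA)  # wait out the inter-probe gap
    drain(time.monotonic() + 2.0)         # grace period for late echoes

    lost = NUM_PROBES - len(rtts)
    print(f"sent={NUM_PROBES} lost={lost} loss rate={lost / NUM_PROBES:.3f}")

Plotting the recorded round-trip times against sequence number is what reveals the phenomena the paper discusses, such as probe compression and rapid queueing-delay fluctuations.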
- Summarize the (at most) 3 key main ideas (each in 1 sentence.)
The first main idea in the analysis of the data presented in the paper is that the transatlantic link between France and the United States, with a bandwidth of 128 kb/s, is the bottleneck link on the path from INRIA to UMd. The second main idea is that the conditional loss probability (clp) is greater than the unconditional loss probability (ulp) for all values of sigma (in ms), the interval between the send times of two successive packets. The third main idea is that the loss gap stays close to 1 even for small values of sigma, which has important consequences for the design of audio and video applications.
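To make the loss metrics in the second and third ideas concrete, here is a small sketch, my own construction rather than the paper's code, that estimates ulp, clp, and a loss-gap measure from a binary loss trace. The trace is invented, and the loss gap is taken here as the mean length of a run of consecutive losses, one common way to define it, which may differ from the paper's exact definition.

    def loss_statistics(lost):
        """lost[i] is True if probe i (or its echo) was lost."""
        n = len(lost)
        ulp = sum(lost) / n  # unconditional: P(packet n lost)

        # Conditional: P(packet n+1 lost | packet n lost).
        after_loss = [lost[i + 1] for i in range(n - 1) if lost[i]]
        clp = sum(after_loss) / len(after_loss) if after_loss else 0.0

        # Loss gap, taken here as the mean number of consecutively lost
        # packets; a gap near 1 means losses are mostly isolated.
        bursts, run = [], 0
        for x in lost:
            if x:
                run += 1
            elif run:
                bursts.append(run)
                run = 0
        if run:
            bursts.append(run)
        gap = sum(bursts) / len(bursts) if bursts else 0.0
        return ulp, clp, gap

    # Invented trace: losses cluster slightly, so clp > ulp, while most
    # loss bursts stay short, keeping the gap close to 1.
    trace = [i in {3, 4, 10, 20, 21, 30} for i in range(40)]
    ulp, clp, gap = loss_statistics(trace)
    print(f"ulp={ulp:.2f} clp={clp:.2f} loss gap={gap:.2f}")
    # -> ulp=0.15 clp=0.33 loss gap=1.50

A gap near 1 is what licenses the paper's point about audio and video design: if losses are mostly isolated, simple recovery schemes that protect against single losses go a long way.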
- Critique the main contribution
The main contribution is the advancement of work in the analysis of the Internet. The findings on bottlenecks, loss probability, and loss gap are important, especially for the growing area of audio and video applications on the Internet. These applications demand high and consistent bandwidth, and understanding how to navigate the load of the Internet is very important when developing them. The methods used by the author, however, seemed to have some room for improvement. Packet delays and losses on the INRIA-UMd connection were obtained using NetDyn, a measurement tool: UDP packets were sent from a source host to an intermediate host and on to a destination host. Unfortunately, the source, intermediate, and destination hosts are geographically distant and their local clocks may not have been synchronized, so the timestamps in the UDP probe packets would have been difficult to interpret. The author solved this by making the source and destination the same host. This was a reasonable compromise, but it would have been better to use two genuinely geographically distinct hosts with both their clocks synchronized to a common atomic clock reference. This would have given more accurate results and made the tests closer to real Internet traffic.
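To illustrate why the unsynchronized clocks matter, here is a tiny numeric sketch with made-up delays and a made-up clock offset: a one-way delay estimated across two unsynchronized clocks absorbs the offset, while a round trip measured entirely on the source clock does not.

    # Made-up numbers for illustration only.
    owd_out = 0.080    # true one-way delay, source -> destination (s)
    owd_back = 0.070   # true one-way delay, destination -> source (s)
    offset = -0.350    # destination clock runs 350 ms behind the source

    # One-way delay estimated from timestamps on two different clocks:
    send_ts = 100.000                      # read on the source clock
    recv_ts = send_ts + owd_out + offset   # read on the destination clock
    print(recv_ts - send_ts)  # -0.270 s: skewed by the unknown offset

    # Round-trip time measured entirely on the source clock:
    print(owd_out + owd_back)  # 0.150 s: the offset cancels out

This is exactly what making the source and destination the same host buys: every timestamp is read from a single clock.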
- Rate the significance of the paper on a scale of 5
(breakthrough), 4 (significant contribution), 3 (modest contribution), 2
(incremental contribution), 1 (no contribution or negative contribution).
Explain your rating in a sentence or two.
I believe this paper makes a modest contribution: it presents a good rudimentary study of load on the Internet using relatively simple and common mechanisms, and it does a good job of incorporating past research. The paper also made three notable observations in its data analysis and explained several practical ways the information could be utilized, such as in audio and video applications.
- Rate how convincing the methodology is: how do the authors justify the solution approach
or evaluation? Do the authors use arguments, analyses, experiments, simulations, or a combination of
them? Do the claims and conclusions follow from the arguments, analyses or experiments? Are the
assumptions realistic (at the time of the research)? Are the assumptions still valid today? Are the
experiments well designed? Are there different experiments that would be more convincing? Are there
other alternatives the authors should have considered? (And, of course, is the paper free of errors?)
The author used an analytic approach to studying the Internet's packet delay and loss behavior. He characterizes previous work as being of three types: analytic, simulation, and experimental, and compares his approach to all three. Previous analytic approaches used queueing network models to analyze packet delay in computer networks; simulation approaches examined the impact of routing and flow control mechanisms on end-to-end delay; and experimental approaches used systematic measurements of packet delay and loss on the ARPANET. A nice property of the paper is that its analytic observations agree with the results obtained by the simulation and experimental approaches. The paper is quite convincing in that it used the actual Internet to obtain its results. Unfortunately, it analyzed the Internet of 1993, and the results may be different for the Internet of today. Even so, the paper obtained solid data, and the analyses that followed from that data were convincing in terms of the considerations that should be kept in mind when designing future Internet applications that would be affected by load.
- What is the most important limitation of the approach?
The most important limitation of the approach is that it only characterizes the system as it existed at the time of measurement; the analysis would have to be repeated later for the conclusions to remain relevant. This is underscored by the fact that when the analysis was performed there were approximately 1 million hosts, whereas today the number of hosts (and routers) is much greater.
- What lessons should researchers and builders take away from this work. What (if any)
questions does this work leave open?
Researchers should take away lessons on how to construct an analysis of packet delay and loss behavior using UDP echo probe packets. The work leaves open the question of what this analysis would look like today, as well as the question of what follows from the finding that probe packets are lost essentially at random except when Internet traffic intensity is very high.