Paper review: Quality Adaptation for Congestion Controlled Video Playback over the Internet

Reviewer: Hanlin D. Qian

This paper addresses the problem of combining congestion control with real-time streaming media over the Internet. The authors offer a mechanism based on hierarchical encoding that allows the media server to send a layered encoding of each stream: more layers are transmitted as bandwidth increases, and fewer layers as bandwidth decreases. The mechanism works with any AIMD-based, TCP-friendly congestion control scheme.
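As a rough illustration of the basic idea (my own sketch, not code or values from the paper), the number of layers sent simply tracks the rate reported by the congestion controller; the per-layer rate below is an assumed constant:

```python
def layers_for_bandwidth(available_bw, layer_rate=250_000):
    """Largest number of equal-rate layers whose total consumption fits
    within the currently available bandwidth (bits/s).  `layer_rate` is
    an assumed per-layer rate, not a value taken from the paper."""
    return max(0, int(available_bw // layer_rate))
```

Under this simplification, a rate increase from 400 kb/s to 600 kb/s would let the server add a third layer, and a backoff to 200 kb/s would force it back to zero or one.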

The core of this paper focuses on two issues. On the server side, the sender needs to know how and when to add a layer as bandwidth increases, and how and when to drop a layer as bandwidth decreases. On the client side, the receiver needs to know how much of each layer to buffer in order to provide smooth transitions between adding and dropping layers, so that the number of layers doesn't fluctuate as quickly as the congestion state of the network. Here are some important ideas that the paper proposes:

  1. A new layer should be added only when 1) there is enough bandwidth available to carry all existing layers plus the new layer, and 2) there is sufficient total buffering at the receiver to survive an immediate backoff while continuing to play all existing layers plus the new one. The equations for these two criteria are given in the paper.
  2. A lower layer needs more buffering than a higher layer, so that when a layer is dropped there is still enough buffered data to sustain the consumption rate.
  3. A smoothing technique computes an optimal amount of buffering for each layer. A smoothing factor Kmax determines how many consecutive backoffs this allocation can survive without buffer underflow. The paper provides the equations and an algorithm for making these optimization calculations.
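The two add-layer criteria in point 1 can be sketched loosely as follows. This is my own simplified interpretation, not the paper's exact equations: all layers are assumed to have the same consumption rate, the congestion controller is assumed to halve its rate on backoff and recover with a linear additive increase, and the names (`rate`, `layer_rate`, `buffered`, `slope`) are my own:

```python
def deficit_after_backoff(rate, n, layer_rate, slope=1.0):
    """Data the receiver must drain from its buffer while the halved
    sending rate (rate/2) climbs back, at an assumed linear additive-
    increase slope, to the consumption rate of n layers.  Geometrically
    this is the triangular area between n*layer_rate and the recovery
    line.  `slope` is a placeholder, not a value from the paper."""
    shortfall = n * layer_rate - rate / 2.0
    if shortfall <= 0:
        return 0.0  # even the halved rate sustains n layers
    recovery_time = shortfall / slope
    return 0.5 * shortfall * recovery_time  # triangle area

def can_add_layer(rate, n, layer_rate, buffered):
    """n is the number of layers currently being sent."""
    # Criterion 1: the current rate already sustains n+1 layers.
    if rate <= (n + 1) * layer_rate:
        return False
    # Criterion 2: buffered data is enough to survive an immediate
    # backoff while continuing to play all n+1 layers.
    return buffered >= deficit_after_backoff(rate, n + 1, layer_rate)
```

For instance, with `layer_rate = 1.0`, two layers in flight, and `rate = 4.0`, criterion 1 is met, and whether the third layer is added then depends entirely on how much the receiver has buffered.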

I give this paper a rating of 3 - modest contribution. This paper offers a solid solution to the problem of combining streaming media with congestion control. The authors have some genuinely good ideas, and the equations and algorithm they provide give a good foundation for further research in this area.

There are, of course, problems with this paper. For example, the second criterion the sender uses to decide whether to add a layer assumes the sender knows the receiver's consumption rate and buffering state. Is this information necessarily obtainable? If it is sent over the network, is it guaranteed to reach the sender before it becomes outdated? How often should it be transmitted? The smoothing model is also problematic because Kmax is a constant: Internet conditions change all the time, so what is a good value for Kmax? Moreover, the larger Kmax is, the more buffer space the client needs, and that space is not unlimited. Finally, as always, there is the question of whether the simulations model the "real" Internet accurately enough, and there is really no way to model the Internet with complete fidelity.