Temporal Correlations of Local Network Losses


Abstract

We introduce a continuum model describing data losses in a single node of a packet-switched network (such as the Internet) that preserves the discrete nature of the data loss process. By construction, the model has critical behavior, with a sharp transition from exponentially small to finite losses as the data arrival rate increases. We show that such a model exhibits strong fluctuations in the loss rate at the critical point and non-Markovian power-law correlations in time, despite the Markovian character of the data arrival process. The continuum model allows for rather general incoming data packet distributions and can be naturally generalized to describe the statistics of buffer-server idleness.
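To illustrate the kind of transition the abstract refers to, the following is a minimal sketch (not the authors' continuum model) of a single node with Markovian packet arrivals and a finite buffer, in the spirit of an M/M/1/K queue. The function name `loss_fraction` and parameters such as `buffer_size` are hypothetical choices for this illustration; the observed loss fraction jumps from nearly zero to a finite value as the offered load crosses the critical point.

```python
import random

def loss_fraction(arrival_rate, service_rate=1.0, buffer_size=50,
                  n_packets=200_000, seed=1):
    """Illustrative finite-buffer queue (M/M/1/K-type sketch), not the
    paper's continuum model: Poisson (Markovian) arrivals, exponential
    service, packets arriving to a full buffer are lost."""
    rng = random.Random(seed)
    queue = 0                               # packets stored (incl. in service)
    t_arrival = rng.expovariate(arrival_rate)
    t_departure = float("inf")              # no packet in service yet
    arrived = lost = 0
    while arrived < n_packets:
        if t_arrival < t_departure:
            # Next event: a packet arrives (Markovian arrival process).
            t = t_arrival
            arrived += 1
            if queue >= buffer_size:
                lost += 1                   # buffer full: packet dropped
            else:
                queue += 1
                if queue == 1:              # server was idle, start service
                    t_departure = t + rng.expovariate(service_rate)
            t_arrival = t + rng.expovariate(arrival_rate)
        else:
            # Next event: a packet finishes service and departs.
            t = t_departure
            queue -= 1
            t_departure = (t + rng.expovariate(service_rate)
                           if queue > 0 else float("inf"))
    return lost / arrived

# Loss fraction rises sharply as the load crosses the critical value 1.
for rho in (0.7, 0.9, 1.0, 1.1, 1.3):
    print(f"load {rho:.1f}: loss fraction {loss_fraction(rho):.4f}")
```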
