In this paper, we aim to establish the connection between Age of Information (AoI) in network theory, information uncertainty in information theory, and detection delay in time series analysis. We consider a dynamic system whose state changes at discrete time points; a state change will not be detected until an update generated after the change point is delivered to the destination for the first time. We introduce an information-theoretic metric to measure information freshness at the destination, which we name the generalized Age of Information (GAoI). We show that if the underlying state of the system evolves according to a stationary Markov chain, then under any state-independent online updating policy the GAoI is proportional to the AoI. Moreover, the cumulative GAoI and AoI are proportional to the expected cumulative detection delay of all change points over a period of time. Thus, any (G)AoI-optimal state-independent updating policy equivalently minimizes the corresponding expected change point detection delay, which validates the fundamental role of (G)AoI in real-time status monitoring. We also investigate a Bayesian change point detection scenario where the underlying state evolution is not stationary. Although the AoI is no longer explicitly related to the detection delay, we show that the cumulative GAoI is still an affine function of the expected detection delay, which indicates the versatility of GAoI in capturing information freshness in dynamic systems.
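The AoI–detection-delay relationship described in this abstract can be illustrated with a toy discrete-time simulation. The sketch below is not the paper's model: the per-slot change probability `q` and the Bernoulli delivery probability `p` are illustrative assumptions, and updates are assumed to be delivered instantly. It merely shows that, under a state-independent updating process, the time-average AoI and the average change-point detection delay track each other.

```python
import random

random.seed(1)

T = 100_000   # simulation horizon (slots)
p = 0.2       # assumed per-slot probability that an update is delivered
q = 0.01      # assumed per-slot probability of a state change (change point)

aoi = 0
aoi_sum = 0
pending_changes = []   # slots at which not-yet-detected changes occurred
delay_sum = 0
n_changes = 0

for t in range(1, T + 1):
    if random.random() < q:          # a change point occurs this slot
        pending_changes.append(t)
    if random.random() < p:          # a fresh update reaches the destination
        aoi = 0
        for s in pending_changes:    # every outstanding change is now detected
            delay_sum += t - s
            n_changes += 1
        pending_changes.clear()
    else:
        aoi += 1
    aoi_sum += aoi

print("mean AoI             :", aoi_sum / T)
print("mean detection delay :", delay_sum / max(n_changes, 1))
```

With Bernoulli(p) deliveries, both quantities concentrate around (1 - p)/p, consistent with the proportionality the paper establishes for state-independent policies.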
In this paper, we consider the age of information (AoI) of a discrete-time status updating system, focusing on finding the stationary AoI distribution when the Ber/G/1/1 queue is used. Following standard queueing theory, we show that by introducing a two-dimensional state vector that simultaneously tracks the AoI and the age of the packet in the system, the stationary AoI distribution can be derived by analyzing the steady state of the resulting two-dimensional stochastic process. We give a general formula for the AoI distribution and calculate the explicit expression when the service time is also geometrically distributed. We also compare the discrete and continuous AoI, depicting the mean of the discrete AoI alongside that of the continuous-time AoI for a system with the M/M/1/1 queue. Although the stationary AoI distribution of some continuous-time single-server systems has been determined before, in this paper we show that standard queueing theory is still applicable to analyzing the discrete AoI, and is even more powerful than the methods previously proposed for handling the continuous AoI.
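To make the Ber/G/1/1 setting concrete, here is a minimal simulation sketch of the geometric-service special case (a Ber/Geo/1/1 queue): Bernoulli arrivals, geometric service, a single server, and no buffer, so arrivals during service are blocked. The arrival and service probabilities are illustrative assumptions, and the sketch only estimates the mean of the stationary discrete AoI rather than its full distribution.

```python
import random

random.seed(5)

lam, mu = 0.3, 0.5   # assumed per-slot arrival and service-completion probabilities
T = 200_000

aoi = 1        # AoI at the monitor (in slots)
pkt_age = None # age of the packet in service; None means the server is idle
aoi_sum = 0

for _ in range(T):
    # Service completion attempt for the packet in service, if any.
    if pkt_age is not None and random.random() < mu:
        aoi = pkt_age + 1   # delivered: AoI drops to the packet's system age
        pkt_age = None
    else:
        aoi += 1
        if pkt_age is not None:
            pkt_age += 1
    # Bernoulli arrival; blocked when the server is busy (the /1/1 has no buffer).
    if pkt_age is None and random.random() < lam:
        pkt_age = 0
    aoi_sum += aoi

print("mean discrete AoI:", aoi_sum / T)
```

Extending the state to the pair (AoI, packet age in system), exactly as tracked by `aoi` and `pkt_age` here, is the two-dimensional construction the abstract refers to.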
As 5G and the Internet of Things (IoT) are deeply integrated into vertical industries such as autonomous driving and industrial robotics, timely status updates are crucial for remote monitoring and control. In this regard, Age of Information (AoI) has been proposed to measure the freshness of status updates. However, AoI changes only linearly with time and is agnostic to context. We propose a context-based metric, named Urgency of Information (UoI), to measure the nonlinear time-varying importance and the non-uniform context dependence of status information. This paper first establishes a theoretical framework for UoI characterization and then provides UoI-optimal status updating and user scheduling schemes in both single-terminal and multi-terminal cases. Specifically, an update-index-based scheme is proposed for a single-terminal system, where the terminal updates and transmits whenever its update index exceeds a threshold. For the multi-terminal case, the UoI of the proposed scheduling scheme is proven to be upper-bounded, and a decentralized implementation via Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) is also provided. In simulations, the proposed updating and scheduling schemes notably outperform existing ones, such as round-robin and AoI-optimal schemes, in terms of UoI, error-bound violation, and control system stability.
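The single-terminal threshold idea can be sketched in a few lines. The specific index below, a context weight times the squared age, is a hypothetical stand-in for the paper's update index, and the threshold, context-switching pattern, and instant-delivery assumption are all illustrative; the sketch only shows the shape of a threshold-based updating policy under a nonlinear, context-dependent penalty.

```python
T = 10_000
threshold = 5.0   # assumed update-index threshold
age = 0
uoi_sum = 0.0
n_tx = 0

for t in range(T):
    # Toy context: importance weight alternates every 100 slots.
    weight = 2.0 if (t // 100) % 2 == 0 else 0.5
    index = weight * age ** 2        # hypothetical update index (nonlinear in age)
    if index > threshold:            # threshold-based updating rule
        age = 0                      # assume the update is delivered instantly
        n_tx += 1
    uoi_sum += weight * age ** 2     # accumulate the context-weighted penalty
    age += 1

print(f"average UoI: {uoi_sum / T:.3f}, transmissions: {n_tx}")
```

Note how the policy naturally transmits more often in high-weight phases: the same threshold is crossed at a smaller age when the context makes staleness costlier.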
Age of Information (AoI) has become an important concept in communications, as it allows system designers to measure the freshness of the information available to remote monitoring or control processes. However, its definition tacitly assumes that new information can be used at any time, which is not always the case: the instants at which information is collected and used often depend on a query process. We propose a model that accounts for the discrete-time nature of many monitoring processes, considering a pull-based communication model in which the freshness of information matters only when the receiver generates a query. We then define the Age of Information at Query (QAoI), a more general metric that fits the pull-based scenario, and show how its optimization can lead to very different choices from traditional push-based AoI optimization when transmitting over a Packet Erasure Channel (PEC).
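The distinction between AoI and QAoI is easy to state in code: QAoI samples the age process only at query instants. The sketch below is a minimal illustration, not the paper's model; the erasure probability, the periodic query pattern, and the transmit-every-slot policy are all assumptions, under which the two averages happen to coincide (the optimization gap the paper studies arises once transmissions are costly or constrained).

```python
import random

random.seed(2)

T = 100_000
eps = 0.3   # assumed packet-erasure probability of the PEC
Q = 10      # assumed query period: the receiver reads the state every Q slots

aoi = 0
aoi_sum = 0      # accumulates age every slot (push-based AoI view)
qaoi_sum = 0     # accumulates age only at query instants (QAoI view)
n_queries = 0

for t in range(1, T + 1):
    # A fresh update is sent every slot and survives the PEC w.p. 1 - eps.
    if random.random() >= eps:
        aoi = 0
    aoi += 1                 # age of the freshest received update, end of slot
    aoi_sum += aoi
    if t % Q == 0:           # the receiver only cares at query slots
        qaoi_sum += aoi
        n_queries += 1

print("time-average AoI:", aoi_sum / T)
print("average QAoI    :", qaoi_sum / n_queries)
```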
This letter analyzes a class of information freshness metrics for large IoT systems in which terminals employ slotted ALOHA to access a common channel. Considering a Gilbert–Elliott channel model, information freshness is evaluated through a penalty function that follows a power law of the time elapsed since the last received update, in contrast with the linear growth of the age of information. By means of a signal flow graph analysis of Markov processes, we provide exact closed-form expressions for the average penalty and for the peak penalty violation probability.
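A quick simulation contrasts the linear age with a power-law penalty of the age in a slotted ALOHA system. This sketch simplifies the letter's setting: it uses a plain collision channel rather than a Gilbert–Elliott model, and the number of terminals, transmission probability, and penalty exponent are illustrative assumptions. It estimates by Monte Carlo what the letter derives exactly via signal flow graphs.

```python
import random

random.seed(3)

N = 10        # assumed number of terminals
p = 1 / N     # per-slot transmission probability (a common throughput heuristic)
alpha = 2.0   # assumed power-law penalty exponent: penalty = age ** alpha
T = 200_000

age = 1            # slots since the tagged terminal's last received update
age_sum = 0        # linear AoI accumulator
penalty_sum = 0.0  # power-law penalty accumulator

for _ in range(T):
    tagged_tx = random.random() < p
    others_tx = sum(random.random() < p for _ in range(N - 1))
    if tagged_tx and others_tx == 0:   # success only if no collision
        age = 1
    else:
        age += 1
    age_sum += age
    penalty_sum += age ** alpha

print("average age (linear AoI):", age_sum / T)
print("average power-law penalty:", penalty_sum / T)
```

The heavy weight the power law puts on long update gaps is visible immediately: the average penalty is far larger than the square of the average age.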
It is becoming increasingly clear that an important task for wireless networks is to minimize the age of information (AoI), a measure of the timeliness of information delivery. While mainstream approaches generally rely on real-time observation of user AoI and channel state, little attention has been paid to solving the problem in the complete (or partial) absence of such knowledge. In this article, we present a novel study of the optimal blind radio resource scheduling problem in orthogonal frequency-division multiple access (OFDMA) systems for minimizing the long-term average AoI, and prove that the optimal policy is a composition of time-domain-fair clustered round-robin and frequency-domain-fair intra-cluster sub-carrier assignment. Heuristic solutions, shown by simulation results to be near-optimal, are also proposed to effectively improve performance in the presence of various degrees of extra knowledge, e.g., channel state and AoI.
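The intuition behind round-robin's optimality under blindness can be checked with a toy single-channel experiment: when the scheduler sees neither AoI nor channel state, deterministic fair rotation beats uniformly random (also blind) selection. The sketch below is an editor's illustration with assumed parameters, not the article's OFDMA model; it ignores sub-carriers and channel errors entirely.

```python
import random

random.seed(4)

K = 8        # assumed number of users
T = 80_000   # simulation horizon (slots)

def avg_aoi(policy):
    """Average per-user AoI when `policy(t)` picks the user served in slot t."""
    ages = [0] * K
    total = 0
    for t in range(T):
        ages[policy(t)] = 0      # the served user's update resets its age
        for i in range(K):
            ages[i] += 1         # everyone ages by one slot
            total += ages[i]
    return total / (T * K)

rr = avg_aoi(lambda t: t % K)                 # blind round-robin
rnd = avg_aoi(lambda t: random.randrange(K))  # blind uniform-random scheduling
print(f"round-robin avg AoI: {rr:.2f}, random avg AoI: {rnd:.2f}")
```

Round-robin drives each user's age through a fixed 1..K cycle (average about (K + 1) / 2), while random scheduling leaves geometric gaps whose time-average age is roughly twice as large, matching the fairness intuition in the article.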