Control of Status Updates for Energy Harvesting Devices that Monitor Processes with Alarms


Abstract

In this work, we derive optimal transmission policies for an energy harvesting status update system. The system monitors a stochastic process that can be either in a normal or in an alarm state of operation. We capture the freshness of status updates for each state of the stochastic process by introducing two Age of Information (AoI) variables, thereby extending the definition of AoI to account for the state changes of the monitored process. We formulate the problem as a Markov Decision Process whose transition cost function, reflecting the assumption that the demand for status updates is higher when the stochastic process is in the alarm state, applies linear and non-linear AoI penalties depending on the state of the process. Finally, we numerically evaluate the derived policies and illustrate their effectiveness in reserving energy in anticipation of future alarm states.
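To make the kind of formulation described above more concrete, the following Python sketch runs value iteration on a small, discretized version of such an MDP. The battery capacity, harvesting and process-transition probabilities, AoI truncation, and the exact penalty shapes (linear penalty in the normal state, quadratic in the alarm state) are illustrative assumptions chosen for the sketch; they are not the specific model or parameters derived in this work.

```python
# Hypothetical value-iteration sketch for an energy-harvesting status-update MDP.
# State: (battery b, AoI in normal state d_n, AoI in alarm state d_a, process state s).
# All numeric parameters and penalty shapes below are illustrative assumptions.

import itertools

B_MAX, AOI_MAX = 3, 5        # battery capacity and AoI truncation (assumed)
P_HARVEST = 0.3              # prob. of harvesting one energy unit per slot (assumed)
P_N2A, P_A2N = 0.1, 0.4      # normal->alarm / alarm->normal probabilities (assumed)
GAMMA = 0.95                 # discount factor
NORMAL, ALARM = 0, 1
IDLE, TRANSMIT = 0, 1

def penalty(d_n, d_a, s):
    """Linear AoI penalty in the normal state, quadratic (non-linear) in alarm."""
    return d_n if s == NORMAL else d_a ** 2

def next_state(state, action, harvest, s_next):
    """Next state given the action and the realized harvesting/process randomness."""
    b, d_n, d_a, _ = state
    if action == TRANSMIT and b > 0:
        b, d_n, d_a = b - 1, 1, 1                      # a fresh update resets both AoI variables
    else:
        d_n, d_a = min(d_n + 1, AOI_MAX), min(d_a + 1, AOI_MAX)
    return (min(b + harvest, B_MAX), d_n, d_a, s_next)

def q_value(state, action, V):
    """Expected discounted cost of taking `action` in `state` under value function V."""
    b, d_n, d_a, s = state
    q = penalty(d_n, d_a, s)
    for harvest, p_h in ((0, 1 - P_HARVEST), (1, P_HARVEST)):
        for s_next in (NORMAL, ALARM):
            p_s = (P_N2A if s_next == ALARM else 1 - P_N2A) if s == NORMAL else \
                  (P_A2N if s_next == NORMAL else 1 - P_A2N)
            q += GAMMA * p_h * p_s * V[next_state(state, action, harvest, s_next)]
    return q

states = list(itertools.product(range(B_MAX + 1), range(1, AOI_MAX + 1),
                                range(1, AOI_MAX + 1), (NORMAL, ALARM)))
V = {st: 0.0 for st in states}
for _ in range(300):                                    # synchronous value-iteration sweeps
    V = {st: min(q_value(st, a, V)
                 for a in ((IDLE,) if st[0] == 0 else (IDLE, TRANSMIT)))
         for st in states}

# Greedy policy: transmit only when it lowers the expected discounted cost.
policy = {st: (TRANSMIT if st[0] > 0 and q_value(st, TRANSMIT, V) < q_value(st, IDLE, V)
               else IDLE)
          for st in states}
print(policy[(B_MAX, 1, AOI_MAX, ALARM)])   # e.g., inspect the action in one state
```

In a sketch of this kind, the quadratic alarm-state penalty makes stale updates during alarms much more costly than during normal operation, so the resulting policy tends to hold back transmissions at low battery levels in the normal state, which mirrors the energy-reservation behaviour discussed in the evaluation.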