A Decentralized Optimization Framework for Energy Harvesting Devices


Abstract

Designing decentralized policies for wireless communication networks is a crucial problem, which has only been partially solved in the literature so far. In this paper, we propose the Decentralized Markov Decision Process (Dec-MDP) framework to analyze a wireless sensor network with multiple users that access a common wireless channel. We consider devices with energy harvesting capabilities, which must therefore balance the energy arrivals against the data departures and the probability of colliding with other nodes. Randomly over time, an access point triggers a SYNC slot, in which it recomputes the optimal transmission parameters of the whole network and distributes this information. Every node receives its own policy, which specifies how it should access the channel in the future, and thereafter proceeds in a fully decentralized fashion, without interacting with other entities in the network. We propose a multi-layer Markov model, where an external MDP manages the jumps between SYNC slots and an internal Dec-MDP computes the optimal policy until the next SYNC slot. We numerically show that, because of the harvesting process, a fully orthogonal scheme (e.g., TDMA-like) is suboptimal, and the optimal trade-off lies between an orthogonal and a random access system.
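To make the operating principle concrete, the following is a minimal simulation sketch of the described architecture: an access point occasionally triggers a SYNC slot at which per-node policies are recomputed and distributed, and between SYNC slots each node acts only on its local battery state. All names and parameters (SYNC_PROB, B_MAX, the threshold policy, etc.) are illustrative assumptions; the placeholder policy is a simple heuristic, not the Dec-MDP solution derived in the paper.

```python
# Illustrative sketch only: the policy computation below is a simple
# heuristic standing in for the paper's Dec-MDP solution, and all
# parameter names/values are assumptions made for this example.
import random

B_MAX = 5          # battery capacity (energy quanta) per node -- assumed
N_NODES = 4        # number of energy harvesting nodes -- assumed
SYNC_PROB = 0.1    # per-slot probability that the AP triggers a SYNC slot
HARVEST_PROB = 0.3 # per-slot probability of harvesting one energy quantum

def compute_policies(batteries):
    """Placeholder for the optimization performed at a SYNC slot.

    Returns, for each node, a transmission probability as a function of its
    local battery level. A real solver would optimize the joint throughput /
    collision / energy trade-off; here we use a simple increasing heuristic.
    """
    return [lambda b: min(1.0, b / B_MAX) for _ in batteries]

def run(num_slots=1000, seed=0):
    rng = random.Random(seed)
    batteries = [B_MAX // 2] * N_NODES
    policies = compute_policies(batteries)
    delivered = collisions = 0
    for _ in range(num_slots):
        # Occasionally the access point recomputes and redistributes policies.
        if rng.random() < SYNC_PROB:
            policies = compute_policies(batteries)
        # Between SYNC slots each node acts on local state only (decentralized).
        transmitters = []
        for i in range(N_NODES):
            if rng.random() < HARVEST_PROB:
                batteries[i] = min(B_MAX, batteries[i] + 1)  # energy arrival
            if batteries[i] > 0 and rng.random() < policies[i](batteries[i]):
                batteries[i] -= 1
                transmitters.append(i)
        if len(transmitters) == 1:
            delivered += 1   # successful, collision-free transmission
        elif len(transmitters) > 1:
            collisions += 1  # nodes collided on the shared channel
    return delivered, collisions

if __name__ == "__main__":
    print(run())
```

Varying the transmission-probability rule in compute_policies (from fully orthogonal schedules to purely random access) is one way to reproduce, qualitatively, the trade-off the paper evaluates numerically.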
