
A Heuristic Scheduling Scheme in Multiuser OFDMA Networks

Added by Zheng Sun
Publication date: 2008
Language: English





Conventional heterogeneous-traffic scheduling schemes apply a zero-delay constraint to real-time services, which aims to minimize the average packet delay among real-time users. However, in lightly or moderately loaded networks this strategy is unnecessary and leads to low data throughput for non-real-time users. In this paper, we propose a heuristic scheduling scheme to solve this problem. The scheme measures and assigns scheduling priorities to both real-time and non-real-time users, and schedules the radio resources for the two user classes simultaneously. Simulation results show that the proposed scheme efficiently handles heterogeneous-traffic scheduling with diverse QoS requirements and alleviates the unfairness between real-time and non-real-time services under various traffic loads.
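The abstract does not give the priority formula, but the idea of ranking real-time and non-real-time users on a single scale can be sketched as follows. The urgency metric for real-time users and the proportional-fair metric for non-real-time users are assumptions for illustration, not the paper's actual scheme:

```python
from dataclasses import dataclass

@dataclass
class User:
    uid: int
    real_time: bool
    rate: float            # achievable rate on one resource block (bits/s)
    avg_throughput: float  # smoothed past throughput
    hol_delay: float = 0.0 # head-of-line packet delay (s), real-time only
    deadline: float = 0.1  # delay budget (s), real-time only

def priority(u: User) -> float:
    """Hypothetical unified priority: a real-time user's priority grows as
    its head-of-line delay approaches the budget; a non-real-time user uses
    a proportional-fair metric (rate over average throughput)."""
    if u.real_time:
        urgency = u.hol_delay / max(u.deadline - u.hol_delay, 1e-6)
        return u.rate * urgency
    return u.rate / max(u.avg_throughput, 1e-6)

def schedule(users, n_blocks):
    """Greedily assign n_blocks resource blocks, re-ranking real-time and
    non-real-time users together instead of always serving real-time first."""
    alloc = {u.uid: 0 for u in users}
    for _ in range(n_blocks):
        best = max(users, key=priority)
        alloc[best.uid] += 1
        best.avg_throughput += best.rate  # crude throughput update
        if best.real_time:
            best.hol_delay = 0.0          # assume the HoL packet is served
    return alloc
```

Because both classes compete on one priority scale, a real-time user far from its deadline can yield blocks to non-real-time users, which is how such a scheme avoids the throughput loss of a strict zero-delay policy under light load.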



Related Research

In order to meet the ever-increasing demand for high throughput in WiFi networks, the IEEE 802.11ax (11ax) standard introduces orthogonal frequency division multiple access (OFDMA). In this letter, we address the station-resource unit scheduling problem in downlink OFDMA of 11ax subject to minimum throughput requirements. To deal with the infeasible instances of the constrained problem, we propose a novel scheduling policy based on weighted max-min fairness, which maximizes the minimum fraction between the achievable and minimum required throughputs. Thus, the proposed policy has a well-defined behavior even when the throughput constraints cannot be fulfilled. Numerical results showcase the merits of our approach over the popular proportional fairness and constrained sum-rate maximization strategies.
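The weighted max-min policy above maximizes the minimum ratio between achievable and required throughput. A minimal sketch of that objective, assuming a simple greedy assignment and a known per-station rate for each resource unit (neither is taken from the letter itself):

```python
def maxmin_schedule(req, rates):
    """Greedy sketch of a weighted max-min scheduler: each resource unit
    goes to the station whose fraction (achieved / required throughput)
    is currently lowest.

    req[s]   -- required throughput of station s
    rates[s][r] -- rate station s would get on resource unit r (assumed known)
    Returns the assignment and the per-station throughput fractions."""
    achieved = [0.0] * len(req)
    assign = {}
    for r in range(len(rates[0])):
        s = min(range(len(req)), key=lambda i: achieved[i] / req[i])
        assign[r] = s
        achieved[s] += rates[s][r]
    return assign, [a / q for a, q in zip(achieved, req)]
```

Note that the returned fractions stay meaningful even when they end up below 1.0, i.e. when the minimum-throughput constraints cannot all be met, which is the well-defined infeasible-case behavior the letter emphasizes.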
Support of real-time applications that impose strict requirements on packet loss ratio and latency is an essential feature of next-generation Wi-Fi networks. Initially introduced in the 802.11ax amendment to the Wi-Fi standard, uplink OFDMA seems to be a promising solution for supporting low-latency data transmission from numerous stations to an access point. In this paper, we study how to allocate OFDMA resources in an 802.11ax network and propose an algorithm aimed at providing delays below one millisecond and reliability up to 99.999%, as required by numerous real-time applications. We design a resource allocation algorithm and, with extensive simulations, show that it decreases delays for real-time traffic by orders of magnitude, while the throughput for non-real-time traffic is reduced insignificantly.
Bandwidth requirements of both wireless and wired clients in access networks continue to increase rapidly, primarily due to the growth of video traffic. Application awareness can be utilized in access networks to optimize quality of experience (QoE) of end clients. In this study, we utilize information at the client-side application (e.g., video resolution) to achieve superior resource allocation that improves user QoE. We emphasize optimizing QoE of the system rather than quality of service (QoS), as user satisfaction directly relies on QoE and optimizing QoS does not necessarily optimize QoE, as shown in this study. We propose application-aware resource-allocation schemes on an Ethernet passive optical network (EPON), which supports wireless (utilizing orthogonal frequency division multiple access) and wired clients running video-conference applications. Numerical results show that the application-aware resource-allocation schemes improve QoE for video-conference applications for wired and wireless clients.
The cell association scheme determines which base station (BS) a mobile user (MU) should be associated with, and plays a significant role in determining the average data rate an MU can achieve in heterogeneous networks. However, the explosion of digital devices and the scarcity of spectrum collectively force us to carefully re-design the cell association scheme, which was previously taken for granted. To address this, we develop a new cell association scheme in heterogeneous networks based on joint consideration of the signal-to-interference-plus-noise ratio (SINR) which an MU experiences and the traffic load of candidate BSs. MUs and BSs in each tier are modeled as several independent Poisson point processes (PPPs), and all channels experience independent and identically distributed (i.i.d.) Rayleigh fading. Data rate ratio and traffic load ratio distributions are derived to obtain the tier association probability and the average ergodic MU data rate. Through numerical results, we find that our proposed cell association scheme outperforms the cell range expansion (CRE) association scheme. Moreover, the results indicate that deploying small, high-density BSs will improve spectral efficiency when using our proposed cell association scheme in heterogeneous networks.
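The joint SINR-and-load criterion above can be illustrated with a simple rule that estimates the per-user rate at each BS as its Shannon rate divided by its current load, then picks the best BS. This particular rate estimate is an assumption for illustration, not the paper's derived scheme:

```python
import math

def associate(sinr, load):
    """Hypothetical joint association rule: pick the BS maximizing the
    estimated per-user rate log2(1 + SINR) / (load + 1).

    sinr[b] -- linear SINR the MU sees from BS b
    load[b] -- number of MUs already served by BS b (+1 counts the new MU)
    Returns the index of the chosen BS."""
    def est_rate(b):
        return math.log2(1.0 + sinr[b]) / (load[b] + 1)
    return max(range(len(sinr)), key=est_rate)
```

Unlike a max-SINR rule, this can steer an MU to a lightly loaded small cell even when a congested macro BS offers a stronger signal, which is the qualitative behavior the load-aware scheme targets.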
Recently, many researchers have studied efficient data gathering in wireless sensor networks to minimize the total energy consumption when a fixed number of data packets can be aggregated into one packet. However, minimizing the total energy consumption does not imply that the network lifetime is maximized. In this paper, we study the problem of scheduling data aggregation trees, each working for a different time period, to maximize the network lifetime when a fixed number of data packets can be aggregated into one packet. In addition, we propose a heuristic that balances the lifetime of nodes across data aggregation trees such that the network lifetime is maximized. Simulation results show that the proposed heuristic provides good performance.
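The lifetime-balancing idea above can be sketched greedily: for each round, use the aggregation tree whose bottleneck node (the node closest to exhausting its energy) would be least stressed. The greedy round-by-round allocation is an assumed simplification, not the paper's heuristic:

```python
def schedule_rounds(energy, tree_costs, total_rounds):
    """Greedy sketch of lifetime balancing across aggregation trees.

    energy[n]       -- initial energy of node n
    tree_costs[t][n] -- energy node n spends per round when tree t is used
    Returns how many rounds each tree is scheduled for."""
    used = [0.0] * len(energy)
    rounds_per_tree = [0] * len(tree_costs)
    for _ in range(total_rounds):
        def bottleneck(t):
            # Worst fractional energy depletion if tree t runs one more round
            return max((used[n] + tree_costs[t][n]) / energy[n]
                       for n in range(len(energy)))
        t = min(range(len(tree_costs)), key=bottleneck)
        rounds_per_tree[t] += 1
        for n in range(len(energy)):
            used[n] += tree_costs[t][n]
    return rounds_per_tree
```

Rotating among trees this way spreads the relaying burden so no single node's battery becomes the lifetime bottleneck, which is why alternating trees can outlive any single minimum-energy tree.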