
Energy and Latency of Beamforming Architectures for Initial Access in mmWave Wireless Networks

Added by C. Nicolas Barati
Publication date: 2020
Language: English





Future millimeter-wave (mmWave) systems, 5G cellular or WiFi, must rely on highly directional links to overcome severe pathloss in these frequency bands. Establishing such links requires the mutual discovery of the transmitter and the receiver in the angular domain, potentially leading to a large latency and high energy consumption. In this work, we show that both the discovery latency and the energy consumption can be significantly reduced by using fully digital front-ends. In fact, we establish that by reducing the resolution of the fully digital front-ends we can achieve lower energy consumption than both analog and high-resolution digital beamformers. Since beamforming through an analog front-end allows sampling in only one direction at a time, the mobile device is on for a longer time than with a digital beamformer, which can take spatial samples from all directions in one shot. We show that the energy consumed by the analog front-end can be four to six times that of the digital front-ends, depending on the size of the employed antenna arrays. We recognize, however, that using fully digital beamforming after beam discovery, i.e., for data transmission, is not viable from a power-consumption standpoint. To address this issue, we propose the use of digital beamformers with low-resolution (4-bit) analog-to-digital converters (ADCs). This reduction in resolution brings the power consumption down to the level of analog beamforming for data transmission while retaining the spatial multiplexing capabilities of fully digital beamforming, thus reducing initial discovery latency and improving energy efficiency.
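To make the energy argument concrete, the toy calculation below compares discovery energy under a simple model: an analog front-end must dwell on each direction in turn, while a digital front-end captures all directions in a single measurement slot but pays for one RF chain and I/Q ADC pair per antenna. The power figures, the Walden-style ADC model, and the array size are illustrative assumptions, not numbers from the paper.

```python
# Illustrative sketch (not the paper's model): initial-access energy of an
# analog beamformer, which scans directions one at a time, versus a fully
# digital beamformer, which samples all directions in one slot.
N_ANT = 64          # antenna elements (assumed array size)
T_MEAS = 10e-6      # seconds per measurement slot (assumed)
FS = 400e6          # ADC sampling rate in Hz (assumed)
FOM = 65e-15        # ADC figure of merit, J per conversion step (assumed)
P_RF_CHAIN = 40e-3  # W per RF chain excluding ADCs (assumed)
P_PHASE_SHIFTER = 10e-3  # W per analog phase shifter (assumed)

def adc_power(bits: int) -> float:
    """Walden-style ADC power model: P = FoM * fs * 2^bits."""
    return FOM * FS * (2 ** bits)

def analog_energy(n_dir: int) -> float:
    """One RF chain + per-antenna phase shifters + one high-res I/Q ADC
    pair; the device must dwell on each of n_dir beams in turn."""
    p = P_RF_CHAIN + N_ANT * P_PHASE_SHIFTER + 2 * adc_power(10)
    return p * n_dir * T_MEAS

def digital_energy(bits: int) -> float:
    """One RF chain + I/Q ADC pair per antenna; one slot covers all beams."""
    p = N_ANT * (P_RF_CHAIN + 2 * adc_power(bits))
    return p * T_MEAS

n_dir = N_ANT  # assume one beam direction per antenna element
print(f"analog sweep  : {analog_energy(n_dir) * 1e6:8.1f} uJ")
print(f"digital 10-bit: {digital_energy(10) * 1e6:8.1f} uJ")
print(f"digital 4-bit : {digital_energy(4) * 1e6:8.1f} uJ")
```

With these placeholder numbers the sequential sweep dominates the analog energy budget even though the analog front-end's instantaneous power is lower, which is the qualitative effect the abstract quantifies.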



Related research


Millimeter-wave (mmWave) communications rely on directional transmissions to overcome severe path loss. Nevertheless, the use of narrow beams complicates the initial access procedure and increases latency, since the transmitter and receiver beams must be aligned for proper link establishment. In this paper, we investigate the feasibility of random beamforming for the cell-search phase of initial access. We develop a stochastic geometry framework to analyze the performance in terms of detection failure probability and expected latency of initial access as well as total data transmission. We also compare our scheme with the widely used exhaustive search and iterative search schemes, in both the control plane and the data plane. Our numerical results show that, compared to the other two schemes, random beamforming can substantially reduce the latency of initial access with comparable failure probability in dense networks. We show that the gain of random beamforming is more prominent under light traffic and for low-latency services. Our work demonstrates that developing complex cell-discovery algorithms may be unnecessary in dense mmWave networks and thus sheds new light on mmWave network design.
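As a rough illustration of this trade-off (a toy model, not the paper's stochastic-geometry analysis), the Monte Carlo sketch below lets the base station and user each point a randomly chosen beam per slot and counts slots until both beams cover the dominant path; the codebook sizes, deadline, and trial count are assumed.

```python
# Toy Monte Carlo for random-beamforming cell search: detection succeeds
# in the first slot where both randomly chosen beams cover the angle of
# the dominant path.
import random

N_BS_BEAMS = 16   # assumed BS codebook size
N_UE_BEAMS = 8    # assumed UE codebook size
DEADLINE = 200    # slots allowed before declaring failure (assumed)
TRIALS = 20000

def slots_until_detection() -> int:
    bs_sector = random.randrange(N_BS_BEAMS)  # sector containing the path
    ue_sector = random.randrange(N_UE_BEAMS)
    for slot in range(1, DEADLINE + 1):
        if (random.randrange(N_BS_BEAMS) == bs_sector and
                random.randrange(N_UE_BEAMS) == ue_sector):
            return slot
    return DEADLINE + 1  # failure within the deadline

results = [slots_until_detection() for _ in range(TRIALS)]
fails = sum(r > DEADLINE for r in results)
ok = [r for r in results if r <= DEADLINE]
print(f"failure probability ~ {fails / TRIALS:.3f}")
print(f"mean latency given success ~ {sum(ok) / len(ok):.1f} slots")
print(f"exhaustive search needs up to {N_BS_BEAMS * N_UE_BEAMS} slots")
```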
We design a lightweight beam-searching algorithm for mobile millimeter-wave systems. We construct and maintain a set of path skeletons, i.e., potential paths between a user and the serving base station, to substantially expedite the beam-searching process. To exploit the spatial correlations of the channels, we propose an efficient algorithm that measures the similarity of the skeletons and re-executes the beam-searching procedure only when the old skeleton becomes obsolete. We identify and optimize several tradeoffs between: i) the beam-searching overhead and the instantaneous rate of the users, and ii) the number of users and the update overhead of the path skeletons. Simulation results in an outdoor environment with real building map data show that the proposed method can significantly improve the performance of beam-searching in terms of latency, energy consumption, and achievable throughput.
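The skeleton-reuse idea can be sketched as a similarity test on cached beam sets: re-run the expensive beam search only when the newly probed skeleton has drifted from the cached one. The Jaccard metric and the 0.6 threshold below are illustrative choices, not the paper's exact similarity measure.

```python
# Hedged sketch of skeleton reuse: a "path skeleton" is kept as the set
# of beam indices that worked at the last full search, and the search is
# repeated only when the freshly probed skeleton diverges from it.

def jaccard(a: set, b: set) -> float:
    """Similarity of two beam-index sets, in [0, 1]."""
    return len(a & b) / len(a | b) if (a | b) else 1.0

def needs_research(cached: set, probed: set, threshold: float = 0.6) -> bool:
    """Trigger a full beam search only when the skeletons diverge."""
    return jaccard(cached, probed) < threshold

cached_skeleton = {3, 7, 12}       # beams found at the last full search
probed_skeleton = {3, 7, 12, 14}   # beams seen after the user moved
print(needs_research(cached_skeleton, probed_skeleton))  # False: reuse
print(needs_research(cached_skeleton, {20, 21}))         # True: re-search
```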
This paper presents DeepIA, a deep learning solution for faster and more accurate initial access (IA) in 5G millimeter-wave (mmWave) networks compared to conventional IA. By utilizing a subset of beams in the IA process, DeepIA removes the need for an exhaustive beam search, thereby reducing the beam sweep time in IA. A deep neural network (DNN) is trained to learn the complex mapping from the received signal strengths (RSSs), collected with a reduced number of beams, to the optimal spatial beam of the receiver (among a larger set of beams). At test time, DeepIA measures RSSs from only a small number of beams and runs the DNN to predict the best beam for IA. We show that DeepIA reduces the IA time by sweeping fewer beams and significantly outperforms the conventional IA's beam prediction accuracy in both line-of-sight (LoS) and non-line-of-sight (NLoS) mmWave channel conditions.
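A minimal sketch of this style of RSS-to-beam mapping is given below, assuming PyTorch and placeholder sizes (8 probed beams, a 64-beam codebook, a two-hidden-layer MLP); the paper's actual architecture and training data are not reproduced, and the synthetic tensors stand in for measured RSS traces.

```python
# DeepIA-style mapping sketch: an MLP takes RSS values measured on a
# small subset of beams and classifies the best beam in the full
# codebook, so only N_PROBE beams need to be swept at test time.
import torch
import torch.nn as nn

N_PROBE = 8    # beams actually swept (assumed)
N_BEAMS = 64   # full codebook the best beam is chosen from (assumed)

model = nn.Sequential(
    nn.Linear(N_PROBE, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, N_BEAMS),   # logits over the full beam codebook
)

# One training step on synthetic placeholder data (real RSS labels would
# come from channel measurements or a ray tracer).
rss = torch.randn(32, N_PROBE)            # batch of RSS vectors (dB)
best = torch.randint(0, N_BEAMS, (32,))   # ground-truth best beams
loss = nn.CrossEntropyLoss()(model(rss), best)
loss.backward()

# Test time: sweep N_PROBE beams, then predict instead of sweeping all 64.
pred_beam = model(rss[:1]).argmax(dim=1)
print(int(pred_beam))
```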
Millimeter-wave (mmWave) networks rely on directional transmissions, in both the control plane and the data plane, to overcome severe path loss. Nevertheless, the use of narrow beams complicates the initial cell-search procedure, where we lack sufficient information for beamforming. In this paper, we investigate the feasibility of random beamforming for cell search. We develop a stochastic geometry framework to analyze the performance in terms of failure probability and expected latency of cell search. We then compare our results with the naive, but heavily used, exhaustive search scheme. Numerical results show that, for a given discovery failure probability, random beamforming can substantially reduce the latency of exhaustive search, especially in dense networks. Our work demonstrates that developing complex cell-discovery algorithms may be unnecessary in dense mmWave networks and thus sheds new light on mmWave system design.
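A back-of-the-envelope calculation (my simplification, not the paper's analysis) shows why density favors random beamforming: with K detectable base stations, the per-slot hit probability grows with K, so expected latency shrinks, while an exhaustive sweep still walks the full codebook per candidate cell. The beam counts below are assumed.

```python
# Why density helps random beamforming: more detectable base stations
# raise the per-slot probability of landing on some valid beam pair.
N_BS_BEAMS, N_UE_BEAMS = 16, 8
p_single = 1.0 / (N_BS_BEAMS * N_UE_BEAMS)  # hit prob. with one BS per slot

for k in (1, 4, 16):                        # detectable BSs (density proxy)
    p_k = 1 - (1 - p_single) ** k
    print(f"K={k:2d}: expected random-BF latency ~ {1 / p_k:6.1f} slots")

print(f"exhaustive sweep of one cell: {N_BS_BEAMS * N_UE_BEAMS} slots")
```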
Wireless energy transfer (WET) is a promising solution to enable massive machine-type communications (mMTC) with low-complexity, low-powered wireless devices. Given the energy restrictions of the devices, instantaneous channel state information at the transmitter (CSIT) is not expected to be available in practical WET-enabled mMTC. However, because the terminals commonly appear spatially clustered, some degree of spatial correlation between their channels to the base station (BS) is expected. The paper considers a massive antenna array at the BS for WET that only has access to i) the first- and second-order statistics of the Rician component of the multiple-input multiple-output (MIMO) channel and ii) the line-of-sight MIMO component. The optimal precoding scheme that maximizes the total energy available to the single-antenna devices is derived considering a continuous alphabet for the precoders, permitting any modulated or deterministic waveform. This may lead to some devices in the clusters being assigned a low fraction of the total available power in the cluster, creating a rather uneven situation among them. Consequently, a fairness criterion is introduced, imposing a minimum amount of power allocated to the terminals. A piece-wise linear harvesting circuit is considered at the terminals, with both saturation and a minimum sensitivity, and a constrained version of the precoder is also proposed by solving a non-linear programming problem. A paramount benefit of the constrained precoder is that it encompasses fairness in the power allocation to the different clusters. Moreover, given the polynomial complexity increase of the proposed unconstrained precoder and the observed linear gain of the system's available sum-power with an increasing number of antennas at the uniform linear array (ULA), the use of massive antenna arrays is desirable.
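The piece-wise linear harvesting circuit named in the abstract can be written in a few lines: zero output below the sensitivity level, a linear region above it, and a hard saturation ceiling. The sensitivity, saturation, and efficiency values below are placeholders, not the paper's parameters.

```python
# Piece-wise linear RF energy-harvesting model with minimum sensitivity
# and saturation, as named in the abstract; all constants are assumed.
P_SENS = 1e-6  # minimum input power that activates the rectifier, W (assumed)
P_SAT = 1e-3   # input power at which the circuit saturates, W (assumed)
ETA = 0.3      # conversion efficiency in the linear region (assumed)

def harvested_power(p_in: float) -> float:
    """Map received RF power to harvested DC power."""
    if p_in < P_SENS:
        return 0.0                     # below sensitivity: nothing harvested
    if p_in > P_SAT:
        return ETA * (P_SAT - P_SENS)  # saturation: output is clipped
    return ETA * (p_in - P_SENS)       # linear region

for p in (0.5e-6, 1e-4, 5e-3):
    print(f"in {p:.1e} W -> out {harvested_power(p):.2e} W")
```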
