
How to Mobilize mmWave: A Joint Beam and Channel Tracking Approach

Posted by Jiahui Li
Publication date: 2018
Research field: Information Engineering
Paper language: English





Maintaining reliable millimeter wave (mmWave) connections to many fast-moving mobiles is a key challenge in the theory and practice of 5G systems. In this paper, we develop a new algorithm that can jointly track the beam direction and channel coefficient of mmWave propagation paths using phased antenna arrays. Despite the significant difficulty of this problem, our algorithm can simultaneously achieve fast tracking speed, high tracking accuracy, and low pilot overhead. In static scenarios, this algorithm can converge to the minimum Cramér-Rao lower bound of the beam direction with high probability. Simulations reveal that this algorithm greatly outperforms several existing algorithms. Even at SNRs as low as 5 dB, our algorithm is capable of tracking a mobile moving at an angular velocity of 5.45 degrees per second and achieving over 95% of channel capacity with a 32-antenna phased array, by inserting only 10 pilots per second.
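
Below is a minimal, self-contained sketch of the joint tracking idea described above, written under assumed models rather than the paper's exact recursion: a half-wavelength uniform linear array, a single propagation path drifting at roughly 5 degrees per second with 10 pilots per second, and a normalized stochastic-gradient update of the beam direction and complex path gain driven by one pilot observation per slot. The array size, step size, and probing offset are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
N = 32                               # antennas in the phased array
sigma = 10 ** (-5.0 / 20)            # noise level for a 5 dB pilot SNR

def steer(theta):
    # Steering vector of a half-wavelength ULA for angle theta (radians).
    return np.exp(1j * np.pi * np.arange(N) * np.cos(theta)) / np.sqrt(N)

def d_steer(theta):
    # Derivative of the steering vector with respect to theta.
    return steer(theta) * (-1j * np.pi * np.arange(N) * np.sin(theta))

theta_true, beta_true = np.deg2rad(60.0), 1.0 + 0.3j   # true path (assumed)
theta_hat, beta_hat = np.deg2rad(58.0), 0.8 + 0.0j     # rough initial estimates
mu, delta = 0.5, np.deg2rad(3.0)                       # step size, probing offset

for t in range(100):                 # 10 seconds of pilots at 10 pilots per second
    theta_true += np.deg2rad(5.0) / 10                 # ~5 deg/s angular drift
    w = steer(theta_hat + (delta if t % 2 else -delta))   # probe around the estimate
    noise = sigma * (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
    y = beta_true * (w.conj() @ steer(theta_true)) + noise   # pilot observation
    g = w.conj() @ steer(theta_hat)                    # predicted pilot response
    dg = w.conj() @ d_steer(theta_hat)                 # its sensitivity to theta
    e = y - beta_hat * g                               # residual
    # Normalized stochastic-gradient updates of direction and complex gain.
    theta_hat += mu * np.real(np.conj(e) * beta_hat * dg) / (abs(beta_hat * dg) ** 2 + 1e-9)
    beta_hat += mu * np.conj(g) * e / (abs(g) ** 2 + 1e-9)

print("true angle %.1f deg, tracked %.1f deg, tracked gain %.2f%+.2fj"
      % (np.degrees(theta_true), np.degrees(theta_hat), beta_hat.real, beta_hat.imag))

The two-sided probing, which alternately nudges the pilot beam a few degrees to either side of the current estimate, is what makes the direction error observable from a single measurement per slot; it is an assumption of this sketch, not necessarily the paper's pilot design.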




Read also

The performance of millimeter wave (mmWave) communications critically depends on the accuracy of beamforming at both the base station (BS) and the user terminals (UEs) due to high isotropic path loss and channel attenuation. In high-mobility environments, accurate beam alignment becomes even more challenging, as the angles of the BS and each UE must be tracked reliably and continuously. In this work, focusing on beamforming at the BS, we propose an adaptive method based on Recurrent Neural Networks (RNN) that tracks and predicts the Angle of Departure (AoD) of a given UE. Moreover, we propose a modified frame structure to reduce beam alignment overhead and hence increase the communication rate. Our numerical experiments in a highly non-linear mobility scenario show that our proposed method is able to track the AoD accurately and achieve a higher communication rate than more traditional methods such as the particle filter.
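
As a rough illustration of the RNN-based tracking idea (not the authors' architecture), the following sketch trains a small GRU in PyTorch to predict the next AoD sample from a short window of noisy past estimates; the sinusoidal trajectory model, network size, and training setup are all assumptions made for the example.

import torch
import torch.nn as nn

torch.manual_seed(0)

class AoDTracker(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)          # predict the next AoD sample

    def forward(self, x):                         # x: (batch, time, 1)
        h, _ = self.rnn(x)
        return self.head(h[:, -1])                # use the last hidden state

def make_batch(batch=64, steps=30, noise=0.02):
    # Noisy non-linear AoD trajectories (radians); the last clean sample is the target.
    t = torch.linspace(0, 1, steps + 1)
    phase = 2 * torch.pi * torch.rand(batch, 1)
    traj = 0.5 * torch.sin(2 * torch.pi * t + phase) + 1.0   # AoD around 1 rad
    x = traj[:, :-1].unsqueeze(-1) + noise * torch.randn(batch, steps, 1)
    y = traj[:, -1:].clone()
    return x, y

model = AoDTracker()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(500):
    x, y = make_batch()
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

x, y = make_batch(batch=4)
print("predicted AoD:", model(x).detach().squeeze().tolist())
print("true AoD:     ", y.squeeze().tolist())

A one-step-ahead prediction like this is what would let the BS steer its beam toward where the UE will be, rather than where it was last measured.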
A communication setup is considered where a transmitter wishes to convey a message to a receiver and simultaneously estimate the state of that receiver through a common waveform. The state is estimated at the transmitter by means of generalized feedback, i.e., a strictly causal channel output, and the known waveform. The scenario at hand is motivated by joint radar and communication, which aims to co-design radar sensing and communication over shared spectrum and hardware. For the case of memoryless single-receiver channels with i.i.d. time-varying state sequences, we fully characterize the capacity-distortion tradeoff, defined as the largest achievable rate below which a message can be conveyed reliably while satisfying some distortion constraints on state sensing. We propose a numerical method to compute the optimal input that achieves the capacity-distortion tradeoff. Then, we address memoryless state-dependent broadcast channels (BCs). For physically degraded BCs with i.i.d. time-varying state sequences, we characterize the capacity-distortion tradeoff region as a rather straightforward extension of single-receiver channels. For general BCs, we provide inner and outer bounds on the capacity-distortion region, as well as a sufficient condition under which this capacity-distortion region is equal to the product of the capacity region and the set of achievable distortions. A number of illustrative examples demonstrate that the optimal co-design schemes outperform conventional schemes that split the resources between sensing and communication.
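
To make the capacity-distortion tradeoff concrete, here is a toy numerical sketch under an assumed model that is not the paper's: the input X reaches the receiver through a binary symmetric channel, while the strictly causal feedback reveals the i.i.d. binary state only when X = 1, so sending more ones improves sensing but moves the input away from the rate-maximizing distribution. Sweeping the input distribution then traces out C(D).

import numpy as np

def h2(x):
    # Binary entropy in bits, with 0 log 0 = 0.
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

p, e = 0.1, 0.05                 # communication crossover and sensing error (assumed)
q = np.linspace(0, 1, 1001)      # P(X = 1), the distribution being optimized

rate = h2(q * (1 - p) + (1 - q) * p) - h2(p)   # I(X;Y) for a BSC with Bernoulli(q) input
distortion = q * e + (1 - q) * 0.5             # Hamming distortion of the state estimate

for D in (0.5, 0.3, 0.1, 0.06):
    feasible = distortion <= D
    best = rate[feasible].max() if feasible.any() else float("nan")
    print(f"D <= {D:.2f}:  C(D) = {best:.4f} bits/use")

Loose distortion constraints leave the rate at the unconstrained BSC capacity, while tight ones force the input toward the sensing-friendly symbol and cut the rate, which is exactly the resource split the abstract argues an optimal co-design should beat.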
We consider channel/subspace tracking systems for temporally correlated millimeter wave (e.g., E-band) multiple-input multiple-output (MIMO) channels. Our focus is on the tracking algorithm in the non-line-of-sight (NLoS) environment, where the transmitter and the receiver are equipped with a hybrid analog/digital precoder and combiner, respectively. In the absence of a straightforward time-correlated channel model in the millimeter wave MIMO literature, we present a temporal MIMO channel evolution model for NLoS millimeter wave scenarios. Considering that conventional MIMO channel tracking algorithms in microwave bands are not directly applicable, we propose a new channel tracking technique based on sequentially updating the precoder and combiner. Numerical results demonstrate the superior channel tracking ability of the proposed technique over the independent sounding approach in the presented channel model and the spatial channel model (SCM) adopted in the 3GPP specification.
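
The sequential precoder/combiner update can be illustrated with a much-simplified, fully digital single-stream sketch (the hybrid analog/digital split and the paper's channel evolution model are replaced by assumptions): a first-order Gauss-Markov channel and one alternating power-iteration-style refresh of the combiner and precoder per coherence block, compared against the instantaneous optimal singular value.

import numpy as np

rng = np.random.default_rng(1)
nt, nr, rho = 16, 8, 0.995            # antenna counts and temporal correlation (assumed)

def rand_cn(*shape):
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

H = rand_cn(nr, nt)
f = rand_cn(nt, 1); f /= np.linalg.norm(f)     # precoder
w = rand_cn(nr, 1); w /= np.linalg.norm(w)     # combiner

gains, opt_gains = [], []
for t in range(300):
    # Gauss-Markov evolution of the channel between coherence blocks.
    H = rho * H + np.sqrt(1 - rho ** 2) * rand_cn(nr, nt)
    # Sequential update: refresh the combiner for the current precoder, then
    # refresh the precoder for the new combiner (one power-iteration sweep).
    w = H @ f; w /= np.linalg.norm(w)
    f = H.conj().T @ w; f /= np.linalg.norm(f)
    gains.append(np.abs(w.conj().T @ H @ f).item())
    opt_gains.append(np.linalg.svd(H, compute_uv=False)[0])

print("tracked beamforming gain / optimal (last 100 blocks): %.3f"
      % (np.mean(gains[-100:]) / np.mean(opt_gains[-100:])))

Because the channel changes only slightly between blocks, one sweep per block is enough to stay close to the dominant singular vectors, which is the intuition behind updating the precoder and combiner sequentially instead of re-sounding the channel from scratch.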
We propose a deep-learning approach for the joint MIMO detection and channel decoding problem. Conventional MIMO receivers adopt a model-based approach for MIMO detection and channel decoding in linear or iterative manners. However, due to the complex MIMO signal model, the optimal solution to the joint MIMO detection and channel decoding problem (i.e., the maximum likelihood decoding of the transmitted codewords from the received MIMO signals) is computationally infeasible. As a practical measure, current model-based MIMO receivers all use suboptimal MIMO decoding methods with affordable computational complexities. This work applies the latest advances in deep learning to the design of MIMO receivers. In particular, we leverage deep neural networks (DNN) with supervised training to solve the joint MIMO detection and channel decoding problem. We show that a DNN can be trained to give much better decoding performance than conventional MIMO receivers do. Our simulations show that a DNN implementation consisting of seven hidden layers can outperform conventional model-based linear or iterative receivers. This performance improvement points to a new direction for future MIMO receiver design.
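
A toy version of the supervised-learning idea is sketched below. It handles only the detection half of the joint detection/decoding problem (no channel code), uses a fixed 2x2 channel, BPSK symbols, and a much smaller network than the seven hidden layers cited above; all of these are assumptions made to keep the example short.

import torch
import torch.nn as nn

torch.manual_seed(0)
H = torch.randn(2, 2, dtype=torch.cfloat)        # fixed channel realization (assumed known)
snr_db = 10.0
sigma = 10 ** (-snr_db / 20)

def make_batch(n=256):
    bits = torch.randint(0, 2, (n, 2)).float()
    x = (2 * bits - 1).to(torch.cfloat)                       # BPSK symbols
    noise = sigma * (torch.randn(n, 2) + 1j * torch.randn(n, 2)) / (2 ** 0.5)
    y = x @ H.T + noise
    feats = torch.cat([y.real, y.imag], dim=1)                # real-valued network input
    return feats, bits

detector = nn.Sequential(
    nn.Linear(4, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),                                          # one logit per transmitted bit
)
opt = torch.optim.Adam(detector.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    feats, bits = make_batch()
    opt.zero_grad()
    loss = loss_fn(detector(feats), bits)
    loss.backward()
    opt.step()

feats, bits = make_batch(10_000)
ber = ((detector(feats) > 0).float() != bits).float().mean()
print(f"bit error rate of the learned detector: {ber.item():.4f}")

Training on labeled transmit/receive pairs, as done here, is the same supervised recipe the abstract describes; the paper's receiver additionally folds the channel code into the learning problem.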
Bei Guo, Renwang Li, Meixia Tao, 2021
This paper considers a reconfigurable intelligent surface (RIS)-aided millimeter wave (mmWave) downlink communication system where hybrid analog-digital beamforming is employed at the base station (BS). We formulate a power minimization problem by jointly optimizing the hybrid beamforming at the BS and the response matrix at the RIS, under signal-to-interference-plus-noise ratio (SINR) constraints. The problem is highly challenging due to the non-convex SINR constraints as well as the non-convex unit-modulus constraints on both the phase shifts at the RIS and the analog beamforming at the BS. A penalty-based algorithm in conjunction with the manifold optimization technique is proposed to handle the problem, followed by an individual optimization method with much lower complexity. Simulation results show that the proposed algorithm outperforms the state-of-the-art algorithm. The results also show that the joint optimization of the RIS response matrix and the BS hybrid beamforming is much superior to individual optimization.
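
The core difficulty above is optimizing over unit-modulus variables. The following stripped-down sketch shows only that ingredient under assumed simplifications: a single user, the BS beam absorbed into an effective per-element cascaded channel, and plain projected-gradient ascent of the received power over the RIS phase shifts as a simpler stand-in for the penalty-based manifold optimization; the closed-form phase-aligned solution is used only as a sanity check.

import numpy as np

rng = np.random.default_rng(2)
n_ris = 64                                        # number of RIS elements (assumed)

# Effective per-element cascaded channel a_n = (BS -> RIS element n -> user) gain.
a = (rng.standard_normal(n_ris) + 1j * rng.standard_normal(n_ris)) / np.sqrt(2)

v = np.exp(1j * 2 * np.pi * rng.random(n_ris))    # random unit-modulus phase shifts
mu = 0.1
for _ in range(200):
    s = a @ v                                     # effective scalar channel seen by the user
    v = v + mu * np.conj(a) * s                   # Wirtinger gradient ascent step on |a @ v|^2
    v = v / np.abs(v)                             # project back onto the unit-modulus set

optimal = np.abs(a).sum()                         # phase-aligned upper bound
print("received amplitude: optimized %.3f vs. closed-form optimum %.3f"
      % (np.abs(a @ v), optimal))

The projection onto the unit circle after each gradient step is the elementary version of the constraint handling that manifold optimization performs; the multi-user, hybrid-beamforming problem in the abstract adds SINR coupling that this single-user toy deliberately omits.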