
Vehicular LTE Connectivity Analysis in Urban and Rural Environments using USRP Measurements

 Added by Karthik Vasudeva
 Publication date 2019
Language: English





The intelligent transportation system (ITS) offers a wide range of applications related to traffic management, which often require high data rates and low latency. The ubiquitous coverage and advancements of Long Term Evolution (LTE) technology have made it possible to achieve these requirements and to enable broadband applications for vehicular users. In this paper, we perform field trial measurements in several commercial LTE networks using software defined radios (SDRs) and report our findings. First, we provide a detailed tutorial overview of how to post-process SDR measurements for decoding broadcast channels and reference signal measurements from LTE networks. We then describe the details of our measurement campaigns in urban, suburban, and rural environments. Based on these measurements, we report joint distributions of base station density, cellular coverage, link strength, disconnected vehicle duration, and vehicle velocity in these environments, and compare LTE coverage across settings. Our experimental results quantify the stronger coverage, shorter link distances, and shorter durations of disconnectivity in urban environments compared to suburban and rural settings.
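As a rough illustration of the reference-signal post-processing step, the 3GPP-defined RSRP is the linear average power over the resource elements carrying cell-specific reference signals (CRSs). A minimal sketch, assuming the CRS resource elements have already been extracted from the decoded LTE resource grid and are calibrated to watts (illustrative only, not the paper's exact pipeline):

```python
import numpy as np

def rsrp_dbm(crs_res):
    """RSRP as the linear average power over cell-specific reference
    signal (CRS) resource elements, per the 3GPP definition.
    `crs_res`: complex array of received CRS resource elements;
    absolute calibration to watts is assumed (illustrative only)."""
    p_lin = np.mean(np.abs(crs_res) ** 2)  # average per-RE power (W)
    return 10.0 * np.log10(p_lin) + 30.0   # watts -> dBm

# toy check: unit-magnitude CRS symbols give 1 W average, i.e. 30 dBm
rng = np.random.default_rng(0)
crs = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, 200))
print(round(rsrp_dbm(crs), 6))  # 30.0
```

In a real campaign this value would be logged per cell and timestamp, then combined with GPS traces to form the joint distributions reported in the paper.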



Related research

As the realization of vehicular communication such as vehicle-to-vehicle (V2V) or vehicle-to-infrastructure (V2I) is imperative for autonomous driving, an understanding of realistic vehicle-to-everything (V2X) models is needed. While previous research has mostly targeted vehicular models in which vehicles are randomly distributed and the carrier frequency is not treated as a variable, this paper proposes a more realistic analysis of the V2X model. We use a one-dimensional (1D) Poisson cluster process (PCP) to model a realistic vehicle distribution on perpendicular crossing roads in an urban area and compare the coverage results with previous research that distributed vehicles randomly by a Poisson point process (PPP). Moreover, we incorporate the effect of different carrier frequencies, mmWave and sub-6 GHz, into our analysis by altering the antenna radiation pattern accordingly. Results indicate that while clustering leads to lower outage, using mmWave is even more significant in lowering outage. Moreover, line-of-sight (LoS) interference links are shown to be more dominant in lowering the outage than non-line-of-sight (NLoS) links, even though they are fewer in number. The analytical results give insight into designing and analyzing urban V2X channels, and are verified by three-dimensional (3D) ray-tracing simulation of an actual urban area.
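To make the modeling difference concrete, a 1D PCP can be sampled by first drawing cluster centres from a PPP along the road and then scattering a Poisson number of vehicles around each centre. A minimal sketch; all parameter names and values here are illustrative, not taken from the paper:

```python
import numpy as np

def sample_1d_pcp(road_len_m, parent_rate, mean_cluster_size, cluster_std_m, rng):
    """Draw vehicle positions on a 1D road segment from a Poisson
    cluster process: cluster centres follow a homogeneous PPP with
    intensity `parent_rate` (centres per metre); each centre spawns
    a Poisson-distributed number of vehicles, Gaussian-scattered
    around it with std `cluster_std_m`."""
    n_parents = rng.poisson(parent_rate * road_len_m)
    parents = rng.uniform(0.0, road_len_m, n_parents)
    vehicles = []
    for c in parents:
        n = rng.poisson(mean_cluster_size)
        vehicles.extend(c + rng.normal(0.0, cluster_std_m, n))
    return np.sort(np.clip(vehicles, 0.0, road_len_m))

# 1 km road, ~10 clusters of ~5 vehicles each
pos = sample_1d_pcp(1000.0, 0.01, 5, 10.0, np.random.default_rng(1))
```

Replacing the clustered sampling with a single `rng.uniform` draw of a Poisson-distributed count reproduces the PPP baseline the paper compares against.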
The capability of smarter networked devices to dynamically select appropriate radio connectivity options is especially important in emerging millimeter-wave (mmWave) systems, to mitigate abrupt link blockage in complex environments. To enrich the levels of diversity, mobile mmWave relays can be employed for improved connection reliability; these are considered by 3GPP for on-demand densification on top of the static mmWave infrastructure. However, the performance dynamics of mobile mmWave relaying are not well explored, especially under realistic conditions such as urban vehicular scenarios. In this paper, we develop a mathematical framework for the performance evaluation of mmWave vehicular relaying in a typical street deployment. We analyze and compare alternative connectivity strategies by quantifying the performance gains made available to smart devices in the presence of mmWave relays, and we identify situations where mmWave vehicular relaying is particularly beneficial. Our methodology and results can support further standardization and deployment of mmWave relaying in more intelligent 5G+ all-mmWave cellular networks.
This paper studies the processing principles, implementation challenges, and performance of OFDM-based radars, with particular focus on fourth-generation Long Term Evolution (LTE) and fifth-generation (5G) New Radio (NR) base stations and their utilization for radar/sensing purposes. First, we address the problem stemming from the unused subcarriers within the LTE and NR transmit signal passbands and their impact on frequency-domain radar processing. In particular, we formulate and adopt a computationally efficient interpolation approach to mitigate the effects of such empty subcarriers in the radar processing. We evaluate the target detection and the corresponding range and velocity estimation performance through computer simulations, and show that high-quality target detection as well as high-precision range and velocity estimation can be achieved. 5G NR waveforms in particular, through their large channel bandwidths and configurable subcarrier spacing, are shown to provide very good radar/sensing performance. Then, a fundamental implementation challenge of transmitter-receiver (TX-RX) isolation in OFDM radars is addressed, with specific emphasis on shared-antenna cases, where the TX-RX isolation challenges are largest. It is confirmed that, from the OFDM radar processing perspective, limited TX-RX isolation is primarily a concern in the detection of static targets, while moving targets are inherently more robust to transmitter self-interference. Properly tailored analog/RF and digital self-interference cancellation solutions for OFDM radars are also described, implemented, and shown through RF measurements to be key technical ingredients for practical deployments, particularly for static and slowly moving targets.
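The empty-subcarrier issue can be illustrated with a simplified stand-in: divide the received frequency-domain symbols by the known transmitted ones on the active subcarriers, then interpolate the channel estimate across the unused bins before the IFFT that yields the range profile. This sketch uses plain linear interpolation; the paper's actual interpolation method may differ:

```python
import numpy as np

def interp_channel_estimate(rx_sym, tx_sym, active_mask):
    """Frequency-domain channel estimate for OFDM radar processing:
    element-wise division on the active subcarriers, then linear
    interpolation (real and imaginary parts separately) across the
    empty subcarriers. A simplified illustration only."""
    idx = np.arange(rx_sym.size)
    h_active = rx_sym[active_mask] / tx_sym[active_mask]
    h_re = np.interp(idx, idx[active_mask], h_active.real)
    h_im = np.interp(idx, idx[active_mask], h_active.imag)
    return h_re + 1j * h_im

# flat channel: the interpolated estimate stays ~1 even on empty bins
n = 64
mask = np.ones(n, dtype=bool)
mask[:4] = mask[-4:] = False            # guard subcarriers left unused
tx = np.exp(1j * np.random.default_rng(2).uniform(0.0, 2.0 * np.pi, n))
h = interp_channel_estimate(tx, tx, mask)
print(np.allclose(h, 1.0))  # True
```

Without the interpolation step, the zeroed guard bins act like a rectangular notch in the channel estimate and raise the sidelobe floor of the IFFT-based range profile.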
Radio map prediction (RMP), which aims to estimate the coverage of radio waves, has been widely recognized as an enabling technology for improving radio spectrum efficiency. However, fast and reliable radio map prediction can be very challenging due to the complicated interaction between radio waves and the environment. In this paper, a novel Transformer-based deep learning model termed RadioNet is proposed for radio map prediction in urban scenarios. In addition, a novel Grid Embedding technique is proposed to substitute for the original Position Embedding in the Transformer, to better anchor the relative positions of the radiation source, destination, and environment. The effectiveness of the proposed method is verified on an urban radio wave propagation dataset. Compared with the SOTA model on the RMP task, RadioNet reduces the validation loss by 27.3% and improves prediction reliability from 90.9% to 98.9%. Prediction speed increases by four orders of magnitude compared with ray-tracing-based methods. We believe the proposed method will be beneficial to high-efficiency wireless communication, real-time radio visualization, and even high-speed image rendering.
In this paper, a recently conducted measurement campaign for unmanned aerial vehicle (UAV) channels is introduced. The downlink signals of an in-service Long Term Evolution (LTE) network deployed in a suburban scenario were acquired. Five horizontal and five vertical flight routes were considered. The channel impulse responses (CIRs) are extracted from the received data by exploiting the cell-specific reference signals (CRSs). Based on the CIRs, the parameters of multipath components (MPCs) are estimated using a high-resolution algorithm derived according to the space-alternating generalized expectation-maximization (SAGE) principle. Based on the SAGE results, channel characteristics including path loss, shadow fading, fast fading, delay spread, and Doppler frequency spread are thoroughly investigated for different heights and horizontal distances, yielding a stochastic model.
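The path-loss and shadow-fading extraction described above typically reduces to fitting the log-distance model PL(d) = PL0 + 10·n·log10(d) (reference distance d0 = 1 m) to the measured samples, with the shadow-fading standard deviation taken from the residuals. A minimal least-squares sketch on synthetic data; parameter values are illustrative, not from the campaign:

```python
import numpy as np

def fit_log_distance(d_m, pl_db):
    """Least-squares fit of PL(d) = PL0 + 10*n*log10(d), d0 = 1 m.
    Returns the intercept PL0 (dB), the path-loss exponent n, and
    the shadow-fading std sigma (dB) from the fit residuals."""
    x = 10.0 * np.log10(d_m)
    A = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(A, pl_db, rcond=None)
    sigma = np.std(pl_db - A @ coef)
    return coef[0], coef[1], sigma

# synthetic check: PL0 = 30 dB, n = 2.5, no shadowing
d = np.logspace(1, 3, 50)              # 10 m .. 1 km
pl = 30.0 + 25.0 * np.log10(d)         # 10*n*log10(d) with n = 2.5
pl0, n_exp, sigma = fit_log_distance(d, pl)
```

With real measurements, the fit would be done per height and per environment, and the residual std gives the shadow-fading term of the stochastic model.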
