
On Unified Vehicular Communications and Radar Sensing in Millimeter-Wave and Low Terahertz Bands

Posted by Vitaly Petrov
Publication date: 2019
Research field: Informatics Engineering
Language: English





Future smart vehicles will incorporate high-data-rate communications and high-resolution radar sensing capabilities operating at millimeter-wave and higher frequencies. These two systems are expected to share and reuse many common functionalities, such as steerable millimeter-wave antenna arrays. Motivated by this growing overlap, and driven further by space and cost constraints, the vehicular community is pursuing a vision of unified vehicular communications and radar sensing, which represents a major paradigm shift for next-generation connected and self-driving cars. This article outlines a path to materialize this decisive transformation. We begin by reviewing the latest developments in hybrid vehicular communications and radar systems, and then propose a concept of unified channel access over millimeter-wave and higher frequencies. Our supporting system-level performance characterization relies upon real-life measurements and massive ray-based modeling to confirm the significant improvements brought by our proposal in mitigating interference and deafness effects. Since our results aim to open the door to unified vehicular communications and radar sensing, we conclude by outlining potential research directions in this rapidly developing field.
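To make the idea of sharing one mmWave front end between the two functions concrete, here is a minimal, purely illustrative Python sketch of a frame that time-shares a single array between radar sensing slots and data slots. The frame length, slot duration, and peak rate are assumptions for illustration, not values or mechanisms from the article.

    # Toy time-division view of unified channel access on a shared mmWave array:
    # radar sensing slots and communication slots interleaved within one frame.
    # All numbers are illustrative assumptions, not values from the article.

    def frame_budget(frame_ms, radar_slot_ms, n_radar_slots, peak_rate_gbps):
        """Split one frame between radar sensing and communication slots."""
        radar_time = radar_slot_ms * n_radar_slots
        comm_time = frame_ms - radar_time
        if comm_time < 0:
            raise ValueError("radar slots exceed the frame duration")
        duty_cycle = comm_time / frame_ms              # fraction left for data
        data_rate = duty_cycle * peak_rate_gbps        # average data rate
        radar_updates_per_s = n_radar_slots / (frame_ms / 1000.0)
        return data_rate, radar_updates_per_s

    if __name__ == "__main__":
        for n in (1, 2, 4, 8):
            rate, updates = frame_budget(frame_ms=10.0, radar_slot_ms=0.5,
                                         n_radar_slots=n, peak_rate_gbps=10.0)
            print(f"{n} radar slots/frame -> {rate:.1f} Gbit/s data, "
                  f"{updates:.0f} radar updates/s")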


Read also

Millimeter-wave wireless spectrum deployments will allow vehicular communications to share high-data-rate vehicular sensor data in real time. The highly directional nature of wireless links in millimeter-wave spectral bands will require continuous channel measurements to ensure the transmitter (TX) and receiver (RX) beams are aligned to provide the best channel. Using real-world vehicular mmWave measurement data at 28 GHz, we determine the optimal beam sweeping period, i.e., the frequency of channel measurements, to align the RX beams to the best channel directions for maximizing the vehicle-to-infrastructure (V2I) throughput. We show that in a realistic vehicular traffic environment in Austin, TX, for a vehicle traveling at an average speed of 10.5 mph, a beam sweeping period of 300 ms in future V2I communication standards would maximize the V2I throughput, using a system of four RX phased arrays that scan the channel 360 degrees in azimuth and 30 degrees above and below boresight. We also investigate the impact of the number of active RX chains controlling the steerable phased arrays on V2I throughput. Reducing the number of RX chains controlling the phased arrays helps reduce the cost of the vehicular mmWave hardware, while multiple RX chains, although more expensive, provide more robustness to beam direction changes at the vehicle, allowing near-maximum throughput over a wide range of beam sweep periods. We show that the overhead of utilizing one RX chain instead of four leads to a 10% drop in mean V2I throughput over six non-line-of-sight runs in real traffic conditions, with each run being 10 to 20 seconds long over a distance of 40 to 90 meters.
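As a rough companion to the trade-off described above, the following Python sketch evaluates a toy throughput model in which a short beam sweeping period wastes airtime on measurements while a long one risks serving a stale beam. The overhead, staleness time constant, and peak rate are assumed values for illustration only; the paper's 300 ms result comes from real 28 GHz measurements, not from a model like this.

    # Toy beam-sweep-period trade-off: measurement overhead vs. beam staleness.
    # Parameter values are illustrative assumptions, not measurement data.
    import numpy as np

    def mean_throughput(sweep_period_ms, sweep_overhead_ms=30.0,
                        beam_staleness_ms=1400.0, peak_rate_gbps=2.0):
        """Effective V2I throughput for a given beam sweeping period."""
        overhead = min(sweep_overhead_ms / sweep_period_ms, 1.0)
        # Simple exponential model for the chance the chosen beam is still best.
        alignment = np.exp(-sweep_period_ms / (2.0 * beam_staleness_ms))
        return peak_rate_gbps * (1.0 - overhead) * alignment

    periods = np.arange(50, 1050, 50)          # candidate periods in ms
    rates = [mean_throughput(p) for p in periods]
    best = periods[int(np.argmax(rates))]
    # The optimum below depends entirely on the assumed parameters above.
    print(f"best sweep period under this toy model: {best} ms")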
The capability of smarter networked devices to dynamically select appropriate radio connectivity options is especially important in emerging millimeter-wave (mmWave) systems to mitigate abrupt link blockage in complex environments. To enrich the levels of diversity, mobile mmWave relays can be employed for improved connection reliability. These are considered by 3GPP for on-demand densification on top of the static mmWave infrastructure. However, the performance dynamics of mobile mmWave relaying are not yet well explored, especially under realistic conditions such as urban vehicular scenarios. In this paper, we develop a mathematical framework for the performance evaluation of mmWave vehicular relaying in a typical street deployment. We analyze and compare alternative connectivity strategies by quantifying the performance gains made available to smart devices in the presence of mmWave relays. We identify situations where the use of mmWave vehicular relaying is particularly beneficial. Our methodology and results can support further standardization and deployment of mmWave relaying in more intelligent 5G+ all-mmWave cellular networks.
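A minimal sketch of one way to compare such connectivity strategies: the outage probability of a direct mmWave link versus a link that can fall back to a two-hop vehicular relay, assuming independent blockage events. The blockage probabilities below are assumptions for illustration, not results from the paper's framework.

    # Direct link vs. relay-assisted link under independent blockage events.
    # Blockage probabilities are illustrative assumptions.

    def outage_direct(p_direct_blocked: float) -> float:
        return p_direct_blocked

    def outage_with_relay(p_direct_blocked: float,
                          p_hop1_blocked: float,
                          p_hop2_blocked: float) -> float:
        # Outage only if the direct path AND the two-hop relay path both fail.
        p_relay_blocked = 1.0 - (1.0 - p_hop1_blocked) * (1.0 - p_hop2_blocked)
        return p_direct_blocked * p_relay_blocked

    if __name__ == "__main__":
        p = 0.3   # assumed probability that any single mmWave link is blocked
        print(f"direct only      : {outage_direct(p):.3f}")
        print(f"with mobile relay: {outage_with_relay(p, p, p):.3f}")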
Synergistic design of communications and radar systems with common spectral and hardware resources is heralding a new era of efficiently utilizing a limited radio-frequency spectrum. Such a joint radar-communications (JRC) model has the advantages of low cost, compact size, lower power consumption, spectrum sharing, improved performance, and safety due to enhanced information sharing. Today, millimeter-wave (mmWave) communications have emerged as the preferred technology for short-distance wireless links because they provide transmission bandwidths that are several gigahertz wide. This band is also promising for short-range radar applications, which benefit from the high range resolution arising from large transmit signal bandwidths. Signal processing techniques are critical in the implementation of mmWave JRC systems. Major challenges are joint waveform design and performance criteria that optimally trade off between communications and radar functionalities. Novel multiple-input multiple-output (MIMO) signal processing techniques are required because mmWave JRC systems employ large antenna arrays. There are opportunities to exploit recent advances in cognition, compressed sensing, and machine learning to reduce the required resources and dynamically allocate them with low overhead. This article provides a signal processing perspective on mmWave JRC systems with an emphasis on waveform design.
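One widely discussed JRC waveform idea, offered here only as a generic illustration rather than any of the specific designs surveyed in the article, is to embed communication bits by phase-coding successive FMCW radar chirps. The sketch below generates such a BPSK-coded chirp train with an assumed sampling rate, chirp duration, and bandwidth.

    # BPSK-coded FMCW chirp train: one data bit per chirp, so the per-chirp
    # radar range profile is preserved up to a known phase.
    # Sampling rate, chirp duration, and bandwidth are illustrative assumptions.
    import numpy as np

    fs = 50e6          # complex baseband sampling rate (Hz), assumed
    chirp_T = 20e-6    # chirp duration (s), assumed
    bandwidth = 40e6   # swept bandwidth (Hz), assumed
    bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])    # data to embed, 1 bit/chirp

    n_samp = int(round(fs * chirp_T))
    t = np.arange(n_samp) / fs
    slope = bandwidth / chirp_T
    # Centered chirp sweeping -bandwidth/2 .. +bandwidth/2 at baseband.
    base_chirp = np.exp(1j * np.pi * slope * (t - chirp_T / 2) ** 2)

    symbols = 1.0 - 2.0 * bits                   # bit 0 -> +1, bit 1 -> -1
    tx_waveform = np.concatenate([s * base_chirp for s in symbols])
    print(tx_waveform.shape)                     # len(bits) * n_samp samples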
Qing Xue, Xuming Fang, 2017
For future networks (i.e., the fifth generation (5G) wireless networks and beyond), millimeter-wave (mmWave) communication with large available unlicensed spectrum is a promising technology that enables gigabit multimedia applications. Thanks to the short wavelength of mmWave radio, massive antenna arrays can be packed into the limited dimensions of mmWave transceivers. Therefore, with directional beamforming (BF), both mmWave transmitters (MTXs) and mmWave receivers (MRXs) are capable of supporting multiple beams in 5G networks. However, for the transmission between an MTX and an MRX, most works have considered only a single beam, which means they do not exploit the full potential of mmWave. Furthermore, the connectivity of single-beam transmission can easily be blocked. In this context, we propose a single-user multi-beam concurrent transmission scheme for future mmWave networks with multiple reflected paths. Based on spatial spectrum reuse, the scheme can be described as a multiple-input multiple-output (MIMO) technique in beamspace (i.e., in the beam-number domain). Moreover, this study investigates the challenges and potential solutions for implementing this scheme, including multi-beam selection, cooperative beam tracking, multi-beam power allocation, and synchronization. The theoretical and numerical results show that the proposed beamspace SU-MIMO can substantially improve the achievable rate of the transmission between an MTX and an MRX while maintaining connectivity.
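As an illustration of the multi-beam power allocation challenge mentioned above, the following sketch water-fills a transmit power budget over a few assumed beam gains and compares the resulting beamspace sum rate with a single-beam link. The gains and power budget are illustrative assumptions, not values from the paper.

    # Water-filling power allocation over the effective gains of a few beams
    # (LOS plus reflected paths), compared against using the best single beam.
    # Beam gains and power budget are illustrative assumptions.
    import numpy as np

    def waterfill(gains, total_power):
        """Classic water-filling over parallel channels with unit noise power."""
        gains = np.sort(np.asarray(gains, dtype=float))[::-1]
        for k in range(len(gains), 0, -1):
            mu = (total_power + np.sum(1.0 / gains[:k])) / k   # water level
            p = mu - 1.0 / gains[:k]
            if p[-1] >= 0:                                     # k beams active
                powers = np.zeros(len(gains))
                powers[:k] = p
                return powers, gains
        raise RuntimeError("unreachable for positive gains and power")

    beam_gains = [8.0, 3.0, 1.5, 0.4]      # assumed effective SNR gains
    powers, sorted_gains = waterfill(beam_gains, total_power=1.0)
    multi_beam_rate = np.sum(np.log2(1.0 + powers * sorted_gains))
    single_beam_rate = np.log2(1.0 + 1.0 * max(beam_gains))
    print(f"beamspace SU-MIMO: {multi_beam_rate:.2f} bit/s/Hz, "
          f"single beam: {single_beam_rate:.2f} bit/s/Hz")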
The use of extremely high frequency (EHF) or millimeter-wave (mmWave) band has attracted significant attention for the next generation wireless access networks. As demonstrated by recent measurements, mmWave frequencies render themselves quite sensit ive to blocking caused by obstacles like foliage, humans, vehicles, etc. However, there is a dearth of analytical models for characterizing such blocking and the consequent effect on the signal reliability. In this paper, we propose a novel, general, and tractable model for characterizing the blocking caused by humans (assuming them to be randomly located in the environment) to mmWave propagation as a function of system parameters like transmitter-receiver locations and dimensions, as well as density and dimensions of humans. Moreover, the proposed model is validated using a ray-launcher tool. Utilizing the proposed model, the blockage probability is shown to increase with human density and separation between the transmitter-receiver pair. Furthermore, the developed analysis is shown to demonstrate the existence of a transmitter antenna height that maximizes the received signal strength, which in turn is a function of the transmitter-receiver distance and their dimensions.
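For intuition, here is a rough sketch of a simplified Boolean-type human-blockage model in the same spirit: humans are treated as random cylinders crossing the TX-RX ray. The closed form and all parameter values below are assumptions for illustration rather than the paper's own model, but they reproduce the qualitative trend that blockage probability grows with human density and TX-RX separation.

    # Simplified Boolean-model approximation of human blockage of a mmWave
    # LOS link: blockers are cylinders of radius r_b and height h_b with
    # density `density` per square meter. All values are assumptions.
    import math

    def blockage_probability(d, density, r_b=0.3, h_b=1.7, h_tx=5.0, h_rx=1.4):
        """LOS blockage probability vs. TX-RX ground distance d (meters)."""
        # Fraction of the TX-RX ray that lies below the blockers' height.
        frac = min(max((h_b - h_rx) / (h_tx - h_rx), 0.0), 1.0)
        # Expected number of blockers intersecting that portion of the ray.
        expected_blockers = density * 2.0 * r_b * d * frac
        return 1.0 - math.exp(-expected_blockers)

    for d in (10, 30, 60):
        for lam in (0.05, 0.2):   # humans per square meter
            print(f"d={d:>2} m, density={lam}: "
                  f"P(block) = {blockage_probability(d, lam):.2f}")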