
Towards Global and Limitless Connectivity: The Role of Private NGSO Satellite Constellations for Future Space-Terrestrial Networks

Published by: Andra Voicu
Publication date: 2021
Research field: Information Engineering
Paper language: English





Satellite networks are expected to support global connectivity and services via future integrated 6G space-terrestrial networks (STNs), as well as private non-geostationary satellite orbit (NGSO) constellations. In the past few years, many such private constellations have been launched or planned, e.g. SpaceX's Starlink and OneWeb. In this article we take a closer look at these private constellations and give a comprehensive overview of their features. We then discuss the major technical challenges resulting from their design and briefly review the recent literature addressing these challenges. Studying the emerging private constellations gives us useful insights for engineering future STNs. To this end, we study satellite mobility and evaluate the impact of two handover strategies on the space-to-ground link performance of four real private NGSO constellations. We show that the link capacity, delay, and handover rate vary across the constellations, so the optimal handover strategy depends on the constellation design. Consequently, the communications solutions of future STNs should comply with constellation specifics, and STN standards need to be flexible enough to support satellite operation across the large parameter space observed in the emerging private constellations.
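The handover trade-off the abstract describes can be illustrated with a toy sketch. The two strategies below — always attach to the highest-elevation visible satellite, versus attach to the satellite with the longest remaining visibility — are hypothetical stand-ins; the article's actual strategies and mobility model may differ.

```python
def max_elevation_handover(visible):
    """Attach at each time step to the visible satellite with the highest
    elevation. `visible` is a list of {sat_id: elevation_deg} dicts, one
    per time step."""
    serving, handovers = None, 0
    for sats in visible:
        best = max(sats, key=sats.get)
        if best != serving:
            if serving is not None:
                handovers += 1
            serving = best
    return handovers


def max_service_time_handover(visible):
    """Keep the current satellite while it stays visible; when it drops,
    attach to the candidate that will remain visible the longest."""
    serving, handovers = None, 0
    for t, sats in enumerate(visible):
        if serving in sats:
            continue  # no handover while the serving satellite is visible

        def remaining(sat):
            # Count consecutive future steps in which `sat` stays visible.
            n = 0
            for future in visible[t:]:
                if sat not in future:
                    break
                n += 1
            return n

        best = max(sats, key=remaining)
        if serving is not None:
            handovers += 1
        serving = best
    return handovers
```

On timelines where a lower-elevation satellite stays visible longer, the service-time strategy triggers fewer handovers at the cost of a weaker (lower-elevation) link — mirroring the article's point that the better strategy depends on the constellation design.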




Read also

The recent development of high-altitude platforms (HAPs) has attracted increasing attention since they can serve as a promising communication method to assist satellite-terrestrial networks. In this paper, we consider an integrated three-layer satellite-HAP-terrestrial network where the HAP supports dual-band connectivity. Specifically, the HAP can not only communicate with terrestrial users directly over C-band, but also provide backhaul services to terrestrial user terminals over Ka-band. We formulate a sum-rate maximization problem and then propose a fractional-programming-based algorithm that solves it by optimizing the bandwidth and power allocation iteratively. The closed-form optimal solutions for bandwidth and power allocation in each iteration are also derived. Simulation results show the capacity enhancement brought by the dual-band connectivity of the HAP. The influence of the power of the HAP and of the satellite is also discussed.
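The paper's closed-form per-iteration solutions are not reproduced in the abstract. As a hedged illustration of what a closed-form power-allocation step in such an iterative scheme can look like, the classic water-filling allocation (a textbook result, not necessarily the paper's exact formula) is:

```python
def water_filling(gains, total_power, noise=1.0):
    """Classic water-filling power allocation over parallel channels:
    find the water level mu so that sum(max(mu - noise/g, 0)) equals
    `total_power`, then fill each channel up to that level."""
    inv = sorted(noise / g for g in gains)  # inverse effective gains
    mu = 0.0
    for k in range(len(inv), 0, -1):
        # Water level if exactly the k best channels receive power.
        mu = (total_power + sum(inv[:k])) / k
        if mu > inv[k - 1]:
            break  # all k candidate channels are above water
    return [max(mu - noise / g, 0.0) for g in gains]
```

Stronger channels receive more power, and channels whose inverse gain lies above the water level receive none — the same qualitative behavior one expects from the per-iteration allocations in the paper's alternating optimization.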
Low Earth orbit (LEO) satellite constellations rely on inter-satellite links (ISLs) to provide global connectivity. However, one significant challenge is to establish and maintain inter-plane ISLs, which support communication between different orbital planes, due to the fast movement of the infrastructure and the limited computation and communication capabilities of the satellites. In this paper, we use antenna arrays with either Butler matrix beam-switching networks or digital beam steering to establish the inter-plane ISLs in a LEO satellite constellation. Furthermore, we present a greedy matching algorithm that establishes inter-plane ISLs with the objective of maximizing the sum of rates, achieved by sequentially selecting the pairs, switching or pointing the beams, and finally setting the data rates. Our results show that, by selecting an update period of 30 seconds for the matching, reliable communication can be achieved throughout the constellation, where the impact of interference on the rates is less than 0.7% compared to orthogonal links, even for relatively small antenna arrays. Furthermore, doubling the number of antenna elements increases the rates by around one order of magnitude.
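A greedy matching over candidate satellite pairs can be sketched as follows; the flat rate table and the selection order are simplifying assumptions, not the paper's exact procedure (which also switches or points beams between selections):

```python
def greedy_isl_matching(rates):
    """Greedy one-to-one matching: repeatedly pick the unmatched pair with
    the highest achievable rate. `rates` maps (sat_a, sat_b) -> rate, where
    sat_a and sat_b belong to different orbital planes."""
    matched, links = set(), []
    for (a, b), r in sorted(rates.items(), key=lambda kv: -kv[1]):
        if a not in matched and b not in matched:
            matched |= {a, b}
            links.append(((a, b), r))
    return links
```

Greedy selection is locally optimal but can be globally suboptimal: with pair rates 5, 4, 3, 1 over two planes of two satellites each, greedy attains a sum rate of 6 (5 + 1) while the optimal matching attains 7 (4 + 3) — one reason the choice of matching algorithm and update period matters in practice.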
With the emergence of the Internet of Things (IoT) and the ever-increasing demand from newly connected devices, more effective storage and processing paradigms are needed to cope with the data these devices generate. In this study, we discuss different paradigms for data processing and storage, including Cloud, Fog, and Edge computing models, and their suitability for integration with the IoT. Moreover, a detailed discussion of the low-latency and massive-connectivity requirements of future cellular networks for machine-type communication (MTC) is presented. Furthermore, the need to bring IoT devices to Internet connectivity and a standardized protocol stack to regulate data transmission between these devices is addressed, keeping in view the resource-constrained nature of IoT devices.
Fully autonomous driving systems require fast detection and recognition of sensitive objects in the environment. In this context, intelligent vehicles should share their sensor data with computing platforms and/or other vehicles to detect objects beyond their own sensors' fields of view. However, the resulting huge volumes of data to be exchanged can be challenging for standard communication technologies to handle. In this paper, we evaluate how using a combination of different sensors affects the detection of the environment in which the vehicles move and operate. The final objective is to identify the optimal setup that minimizes the amount of data to be distributed over the channel with negligible degradation in object detection accuracy. To this aim, we extend an existing object detection algorithm so that it can take as input camera images, LiDAR point clouds, or a combination of the two, and compare the accuracy of the different approaches on two realistic datasets. Our results show that, although sensor fusion always achieves more accurate detections, LiDAR-only inputs can obtain similar results for large objects while mitigating the burden on the channel.
Dali Zhu, Haitao Liu, Ting Li (2021)
In remote regions (e.g., mountains and deserts), cellular networks are usually sparsely deployed or unavailable. With the appearance of new applications (e.g., industrial automation and environment monitoring) in remote regions, resource-constrained terminals become unable to meet the latency requirements, while offloading tasks to an urban terrestrial cloud (TC) via a satellite link incurs high delay. To tackle these issues, the Satellite Edge Computing architecture has been proposed, i.e., users can offload computing tasks to visible satellites for execution. However, existing works are usually limited to offloading tasks in pure satellite networks and make offloading decisions based on predefined models of users; besides, the runtime consumption of existing algorithms is rather high. In this paper, we study the task offloading problem in satellite-terrestrial edge computing networks, where tasks can be executed by a satellite or the urban TC. The proposed Deep Reinforcement learning-based Task Offloading (DRTO) algorithm accelerates the learning process by adjusting the number of candidate offloading locations, and the offloading location and bandwidth allocation depend only on the current channel states. Simulation results show that DRTO achieves near-optimal offloading cost with much lower runtime consumption, making it more suitable for satellite-terrestrial networks with fast-fading channels.
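The abstract's DRTO algorithm is learning-based; the underlying trade-off it learns, however, can be shown with a toy delay model. All parameter names and the cost structure below are illustrative assumptions, not the paper's model: the satellite edge avoids the satellite-to-ground backhaul hop but has less compute, while the terrestrial cloud adds backhaul delay but computes faster.

```python
def offload_cost(task_bits, cpu_cycles, uplink_rate,
                 backhaul_rate, sat_cps, tc_cps):
    """Toy offloading decision by total delay (seconds).

    task_bits:     task input size in bits
    cpu_cycles:    CPU cycles the task needs
    uplink_rate:   user-to-satellite link rate (bit/s)
    backhaul_rate: satellite-to-ground backhaul rate (bit/s)
    sat_cps:       satellite edge compute speed (cycles/s)
    tc_cps:        terrestrial cloud compute speed (cycles/s)
    """
    # Execute on the satellite: one uplink hop, slower compute.
    sat_delay = task_bits / uplink_rate + cpu_cycles / sat_cps
    # Execute in the urban TC: uplink + backhaul hops, faster compute.
    tc_delay = (task_bits / uplink_rate + task_bits / backhaul_rate
                + cpu_cycles / tc_cps)
    if sat_delay <= tc_delay:
        return ("satellite", sat_delay)
    return ("cloud", tc_delay)
```

Compute-heavy tasks favor the faster cloud despite the backhaul delay, while lighter tasks finish sooner at the satellite edge — the kind of channel- and task-dependent decision DRTO is trained to make without a predefined user model.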