
Outdoor Position Recovery from Heterogeneous Telco Cellular Data

Published by Weixiong Rao
Publication date: 2021
Research field: Informatics Engineering
Language: English





Recent years have witnessed unprecedented amounts of data generated by telecommunication (Telco) cellular networks. For example, measurement records (MRs) are generated to report the connection states between mobile devices and Telco networks, e.g., received signal strength. MR data have been widely used to localize outdoor mobile devices for human mobility analysis, urban planning, and traffic forecasting. Existing works using first-order sequence models such as the Hidden Markov Model (HMM) attempt to capture spatio-temporal locality in the underlying mobility patterns to lower localization errors. HMM approaches typically assume stable mobility patterns of the underlying mobile devices. Yet real MR datasets exhibit heterogeneous mobility patterns, due both to mixed transportation modes of the underlying mobile devices and to the uneven distribution of the positions associated with MR samples, so existing solutions cannot handle them. To address this, we propose a multi-task learning-based deep neural network (DNN) framework, namely PRNet+, which jointly performs outdoor position recovery and transportation mode detection. To this end, PRNet+ incorporates a feature extraction module that precisely learns local-, short-, and long-term spatio-temporal locality from heterogeneous MR samples. Extensive evaluation on eight datasets collected at three representative areas in Shanghai indicates that PRNet+ greatly outperforms state-of-the-art methods.
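Below is a minimal sketch, in PyTorch, of the multi-task idea described in the abstract: a shared spatio-temporal feature extractor over an MR sequence feeding one head for position regression and one for transportation-mode classification. The layer choices, dimensions, class names, and loss weighting are illustrative assumptions, not the actual PRNet+ architecture.

```python
import torch
import torch.nn as nn

class MultiTaskPositionNet(nn.Module):
    def __init__(self, mr_feat_dim=18, hidden=64, n_modes=4):
        super().__init__()
        # convolution over neighboring MR samples captures local locality
        self.local = nn.Conv1d(mr_feat_dim, hidden, kernel_size=3, padding=1)
        # recurrent layer captures short- and long-term sequence dependencies
        self.seq = nn.LSTM(hidden, hidden, batch_first=True)
        self.pos_head = nn.Linear(hidden, 2)         # longitude/latitude
        self.mode_head = nn.Linear(hidden, n_modes)  # e.g. walk/bike/car/still

    def forward(self, x):                    # x: (batch, seq_len, mr_feat_dim)
        h = torch.relu(self.local(x.transpose(1, 2))).transpose(1, 2)
        h, _ = self.seq(h)
        return self.pos_head(h), self.mode_head(h)

def multitask_loss(pos_pred, pos_true, mode_logits, mode_true, alpha=0.5):
    # joint objective: position regression plus mode classification
    reg = nn.functional.mse_loss(pos_pred, pos_true)
    cls = nn.functional.cross_entropy(
        mode_logits.reshape(-1, mode_logits.size(-1)), mode_true.reshape(-1))
    return reg + alpha * cls
```

Sharing the extractor across the two tasks is what lets the mode-detection signal regularize the position head when mobility patterns are heterogeneous.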




Read also

Recent years have witnessed fast growth in telecommunication (Telco) techniques from 2G to the upcoming 5G. Precise outdoor localization is important for Telco operators to manage, operate, and optimize Telco networks. Differing from GPS, Telco localization is a technique employed by Telco operators to localize outdoor mobile devices using measurement report (MR) data. When given MR samples containing noisy signals (e.g., caused by Telco signal interference and attenuation), Telco localization often suffers from high errors. To this end, the main focus of this paper is how to improve Telco localization accuracy via algorithms that detect and repair outlier positions with high errors. Specifically, we propose a context-aware Telco localization technique, namely RLoc, which consists of three main components: a machine-learning-based localization algorithm, a detection algorithm to find flawed samples, and a repair algorithm to replace outlier localization results with better ones (ideally ground-truth positions). Unlike most existing works that detect and repair every flawed MR sample independently, we take into account the spatio-temporal locality of MR locations and exploit trajectory context to detect and repair flawed positions. Our experiments on real MR datasets from 2G GSM and 4G LTE Telco networks verify that RLoc can greatly improve Telco localization accuracy. For example, on a large 4G MR dataset, RLoc achieves a median error of 32.2 meters, around 17.4% better than the state-of-the-art.
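As a hedged illustration of the detect-and-repair idea (not the actual RLoc algorithms, which are learning-based), the sketch below flags a localized position whose implied speeds to both trajectory neighbors are physically implausible and repairs it by time-weighted interpolation between those neighbors. The speed threshold and function names are assumptions.

```python
import math

def haversine_m(p, q):
    # great-circle distance in meters between (lat, lon) pairs in degrees
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))

def detect_and_repair(track, max_speed_mps=40.0):
    """track: list of (timestamp_s, lat, lon) ordered by time."""
    repaired = list(track)
    for i in range(1, len(track) - 1):
        t0, *p0 = repaired[i - 1]
        t1, *p1 = track[i]
        t2, *p2 = track[i + 1]
        v_in = haversine_m(p0, p1) / max(t1 - t0, 1e-6)
        v_out = haversine_m(p1, p2) / max(t2 - t1, 1e-6)
        if v_in > max_speed_mps and v_out > max_speed_mps:
            # outlier: repair by time-weighted interpolation of its neighbors
            w = (t1 - t0) / max(t2 - t0, 1e-6)
            repaired[i] = (t1,
                           p0[0] + w * (p2[0] - p0[0]),
                           p0[1] + w * (p2[1] - p0[1]))
    return repaired
```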
Operational networks are increasingly using machine learning models for a variety of tasks, including detecting anomalies, inferring application performance, and forecasting demand. Accurate models are important, yet accuracy can degrade over time due to concept drift, whereby either the characteristics of the data change over time (data drift) or the relationship between the features and the target predictor changes over time (model drift). Drift is important to detect because changes in the properties of the underlying data, or in their relationships to the target prediction, can require model retraining, which can be time-consuming and expensive. Concept drift occurs in operational networks for a variety of reasons, ranging from software upgrades to seasonality to changes in user behavior. Yet, despite the prevalence of drift in networks, its extent and effects on prediction accuracy have not been extensively studied. This paper presents an initial exploration of concept drift in a large cellular network in a major metropolitan area of the United States, in the context of demand forecasting. We find that concept drift arises largely due to data drift, and it appears across different key performance indicators (KPIs), models, training set sizes, and time intervals. We identify the sources of concept drift for the particular problem of forecasting downlink volume. Weekly and seasonal patterns introduce both high- and low-frequency model drift, while disasters and upgrades result in sudden drift due to exogenous shocks. Regions with high population density, lower traffic volumes, and higher speeds also tend to correlate with more concept drift. The features that contribute most significantly to concept drift are User Equipment (UE) downlink packets, UE uplink packets, and Real-time Transport Protocol (RTP) total received packets.
Yong Zeng, Xiaoli Xu (2019)
This paper studies the path design problem for a cellular-connected unmanned aerial vehicle (UAV), which aims to minimize its mission completion time while maintaining good connectivity with the cellular network. We first argue that the conventional path design approach of formulating and solving optimization problems faces several practical challenges, and then propose a new reinforcement learning-based UAV path design algorithm that applies the temporal-difference method to directly learn the state-value function of the corresponding Markov Decision Process. The proposed algorithm is further extended with linear function approximation and tile coding to deal with large state spaces. The proposed algorithms only require raw measured or simulation-generated signal strength as input and are suitable for both online and offline implementations. Numerical results show that the proposed path designs can successfully avoid the coverage holes of cellular networks even in complex urban environments.
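The sketch below shows the core temporal-difference idea at a toy scale: tabular TD(0) updates of a state-value estimate V(s) over grid cells, with a per-step reward that penalizes time and entering cells flagged as coverage holes. The grid abstraction, reward values, and exploratory behavior policy are illustrative assumptions; the paper's extension to linear function approximation with tile coding is not reproduced here.

```python
import random

def td0_state_values(signal_ok, start, goal, episodes=2000,
                     alpha=0.1, gamma=0.95):
    """signal_ok: dict mapping grid cell (x, y) -> True if coverage is adequate."""
    V = {s: 0.0 for s in signal_ok}
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(episodes):
        s = start
        for _ in range(200):                             # cap episode length
            nxt = [(s[0] + dx, s[1] + dy) for dx, dy in moves]
            nxt = [n for n in nxt if n in V]
            if not nxt:
                break
            s2 = random.choice(nxt)                      # exploratory behavior policy
            r = -1.0 - (0.0 if signal_ok[s2] else 10.0)  # penalize coverage holes
            if s2 == goal:
                V[s] += alpha * ((r + 100.0) - V[s])     # terminal TD update
                break
            V[s] += alpha * (r + gamma * V[s2] - V[s])   # TD(0) update
            s = s2
    return V
```

A learned V can then guide path selection by steering toward neighboring cells with the highest estimated value.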
With the seamless coverage of wireless cellular networks in modern society, it is interesting to consider the shape of wireless cellular coverage. Is the shape a regular hexagon, an irregular polygon, or another complex geometrical shape? Based on fractal theory, the statistical characteristics of the wireless cellular coverage boundary are determined from measured wireless cellular data collected in Shanghai, China. The measured results indicate that the wireless cellular coverage boundary presents an extremely irregular geometrical shape, also called a statistical fractal shape. Moreover, the statistical fractal characteristics of the wireless cellular coverage boundary have been validated by values of the Hurst parameter estimated at angular scales. These statistical fractal characteristics can be used to evaluate and design the handoff scheme of mobile user terminals in wireless cellular networks.
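One common way to estimate a Hurst parameter from a one-dimensional series, such as a coverage-boundary radius sampled over angle, is rescaled-range (R/S) analysis. The sketch below is a minimal version of that generic estimator, not the paper's exact angular-scale procedure; the window sizes and input series are assumptions.

```python
import numpy as np

def hurst_rs(series, min_window=8):
    """Estimate the Hurst parameter of a 1-D series via R/S analysis."""
    x = np.asarray(series, dtype=float)
    sizes, rs_vals = [], []
    n = min_window
    while n <= len(x) // 2:                    # needs a reasonably long series
        rs = []
        for start in range(0, len(x) - n + 1, n):
            chunk = x[start:start + n]
            z = np.cumsum(chunk - chunk.mean())    # cumulative deviations
            r = z.max() - z.min()                  # range
            s = chunk.std()                        # scale
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(n)
            rs_vals.append(np.mean(rs))
        n *= 2
    # slope of log(R/S) against log(window size) estimates H in (0, 1)
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

# e.g. hurst_rs(boundary_radius_per_degree)
```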
In this paper, we develop a partner selection protocol for enhancing network lifetime in cooperative wireless networks. The case study is cooperative relayed transmission from fixed indoor nodes to a common outdoor access point. A stochastic bivariate model for the spatial distribution of the fading parameters that govern link performance, namely the Rician K-factor and the path loss, is proposed and validated by means of real channel measurements. The partner selection protocol is based on real-time estimation of a function of these fading parameters, i.e., the coding gain. To reduce the complexity of link quality assessment, a Bayesian approach is proposed that uses the site-specific bivariate model as a priori information for the coding gain estimation. This link quality estimator allows network lifetime gains almost as if all K-factor values were known. Furthermore, it suits IEEE 802.15.4-compliant networks as it efficiently exploits the information acquired from the received signal strength indicator. Extensive numerical results highlight the trade-off between complexity, robustness to model mismatches, and network lifetime performance. We show, for instance, that infrequent updates of the site-specific model through K-factor estimation over a subset of links are sufficient to at least double the network lifetime with respect to existing algorithms based on path-loss information only.
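As a minimal, hedged illustration of the Bayesian flavor of this approach (the paper's actual model is bivariate over the K-factor and path loss), the sketch below fuses a site-specific Gaussian prior on a scalar link-quality metric, standing in for the coding gain, with one noisy RSSI-derived observation via a conjugate update. All parameter values are illustrative assumptions.

```python
def posterior_link_quality(prior_mean, prior_var, obs, obs_var):
    """Gaussian prior x Gaussian likelihood -> Gaussian posterior."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# e.g. a site model giving coding gain ~ N(12 dB, var 9) combined with one
# RSSI-based estimate of 15 dB (var 4) yields a posterior mean of about 14.1 dB
```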

