We present a flexible framework for uncertainty principles in spectral graph theory. In this framework, general filter functions modeling the spatial and spectral localization of a graph signal can be incorporated. It merges several existing uncertainty relations on graphs, among them the Landau-Pollak principle describing the joint admissibility region of two projection operators, and uncertainty relations based on spectral and spatial spreads. Using theoretical and computational aspects of the numerical range of matrices, we are able to characterize and illustrate the shapes of the uncertainty curves and to study the space-frequency localization of signals inside the admissibility regions.
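As a rough illustration of the numerical-range computations mentioned above, the following minimal NumPy sketch traces the boundary of the joint numerical range {(<x, M1 x>, <x, M2 x>) : ||x|| = 1} of two symmetric localization operators by maximizing rotated combinations of them; the two projections, their ranks, and the problem size are hypothetical and serve only as an example of the Landau-Pollak-type setting.

```python
import numpy as np

def joint_numerical_range_boundary(M1, M2, n_angles=360):
    """Trace the boundary of the joint numerical range
    {(<x, M1 x>, <x, M2 x>) : ||x|| = 1} of two symmetric matrices.

    For each direction t, a supporting point of this convex region is
    given by the top eigenvector of cos(t)*M1 + sin(t)*M2."""
    pts = []
    for t in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        H = np.cos(t) * M1 + np.sin(t) * M2        # rotated combination
        _, V = np.linalg.eigh(H)                   # symmetric eigensolver
        x = V[:, -1]                               # maximizer of <x, H x>
        pts.append((x @ M1 @ x, x @ M2 @ x))       # pair of localization values
    return np.array(pts)

# Hypothetical Landau-Pollak setting: two projection operators on R^20
rng = np.random.default_rng(0)
Q = np.linalg.qr(rng.standard_normal((20, 20)))[0]
P1 = Q[:, :8] @ Q[:, :8].T                         # projection onto a random 8-dim subspace
P2 = np.diag((np.arange(20) < 10).astype(float))   # projection onto the first 10 coordinates
admissibility_boundary = joint_numerical_range_boundary(P1, P2)
```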
For the interpolation of graph signals with generalized shifts of a graph basis function (GBF), we introduce the concept of positive definite functions on graphs. This concept merges kernel-based interpolation with spectral theory on graphs and can be regarded as a graph analog of radial basis function interpolation in Euclidean spaces or of spherical basis function interpolation on the sphere. We provide several descriptions of positive definite functions on graphs, the most relevant being a Bochner-type characterization in terms of positive Fourier coefficients. These descriptions allow us to design GBFs and to study GBF interpolation in more detail: we characterize the native spaces of the interpolants, provide explicit estimates for the interpolation error, and obtain bounds on the numerical stability. As a final application, we show how GBF interpolation can be used to derive quadrature formulas on graphs.
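A minimal sketch of the Bochner-type construction and the resulting GBF interpolation, assuming the combinatorial graph Laplacian and the (hypothetical) choice of positive Fourier coefficients exp(-t*lambda_k), i.e., a diffusion-type GBF; the sampling nodes and the parameter t are illustrative.

```python
import numpy as np

def gbf_interpolant(A, sample_idx, sample_vals, t=5.0):
    """Interpolate a graph signal by generalized shifts of a graph basis
    function (GBF) whose Fourier coefficients exp(-t*lambda_k) are strictly
    positive, so the associated kernel is positive definite on the graph.

    A           : symmetric adjacency matrix
    sample_idx  : nodes where the signal is known
    sample_vals : known signal values at those nodes"""
    L = np.diag(A.sum(axis=1)) - A                  # combinatorial graph Laplacian
    lam, U = np.linalg.eigh(L)                      # graph Fourier basis
    fhat = np.exp(-t * lam)                         # positive Fourier coefficients
    K = (U * fhat) @ U.T                            # K[i, j] = sum_k fhat_k u_k(i) u_k(j)
    c = np.linalg.solve(K[np.ix_(sample_idx, sample_idx)], sample_vals)
    return K[:, sample_idx] @ c                     # interpolant on all nodes

# Toy example on a path graph with 6 nodes (hypothetical data)
A = np.diag(np.ones(5), 1) + np.diag(np.ones(5), -1)
signal = gbf_interpolant(A, sample_idx=[0, 3, 5], sample_vals=[1.0, 0.0, 2.0])
```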
This paper presents a new approach to the detection of discontinuities in the n-th derivative of observational data. This is achieved by performing two polynomial approximations at each interstitial point. The polynomials are coupled by constraining their coefficients to ensure continuity of the model up to the (n-1)-th derivative, while yielding an estimate for the discontinuity of the n-th derivative. Through the prudent selection of a common coordinate system, the coefficients of the polynomials correspond directly to the derivatives of the approximations at the interstitial points. The approximation residual and extrapolation errors are investigated as measures for detecting discontinuity; this is necessary since discrete observations of continuous systems are discontinuous at every point. It is proven, using matrix algebra, that positive extrema in the combined approximation-extrapolation error correspond exactly to extrema in the difference of the Taylor coefficients. This provides a relative measure for the severity of the discontinuity in the observational data. The matrix algebraic derivations are provided for all aspects of the method presented here, including a solution for the covariance propagation through the computation. The performance of the method is verified with a Monte Carlo simulation using synthetic piecewise polynomial data with known discontinuities. It is also demonstrated that the detected discontinuities are suitable as knots for B-spline modelling of data. For completeness, the results of applying the method to sensor data acquired during the monitoring of heavy machinery are presented.
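To make the coupling of the two polynomial approximations concrete, the sketch below estimates the jump of the n-th derivative at an interstitial point by a single least-squares fit in which the left and right polynomials share the coefficients of orders 0 to n-1 (continuity of the model up to the (n-1)-th derivative) while the n-th-order coefficient may differ; the window width, the degree, and the synthetic data are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np
from math import factorial

def derivative_jump(x, y, i, n=1, half_width=6):
    """Estimate the discontinuity of the n-th derivative at the interstitial
    point between samples i and i+1 via two coupled polynomial fits in a
    common coordinate system centred at that point."""
    x0 = 0.5 * (x[i] + x[i + 1])                       # common coordinate origin
    li, ri = slice(i + 1 - half_width, i + 1), slice(i + 1, i + 1 + half_width)
    xiL, xiR = x[li] - x0, x[ri] - x0
    shared = lambda xi: np.vander(xi, n, increasing=True)             # orders 0..n-1
    DL = np.hstack([shared(xiL), xiL[:, None] ** n, np.zeros((len(xiL), 1))])
    DR = np.hstack([shared(xiR), np.zeros((len(xiR), 1)), xiR[:, None] ** n])
    coef, *_ = np.linalg.lstsq(np.vstack([DL, DR]),
                               np.concatenate([y[li], y[ri]]), rcond=None)
    aL, aR = coef[-2], coef[-1]                        # left/right n-th Taylor coefficients
    return factorial(n) * (aR - aL)                    # jump of the n-th derivative

# Synthetic piecewise-linear data with a slope change straddling samples 19 and 20
x = np.linspace(-1.0, 1.0, 40)
y = np.where(x < 0.0, -x, 2.0 * x)
jump = derivative_jump(x, y, i=19, n=1)                # = 3.0: slope jumps from -1 to 2
```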
The high dynamics and heterogeneous interactions in complex urban systems raise the issue of uncertainty quantification in spatiotemporal human mobility, which supports critical decision-making in risk-aware web applications such as urban event prediction, where fluctuations are of significant interest. Although uncertainty quantifies the potential variation around prediction results, traditional learning schemes lack uncertainty labels, and conventional uncertainty quantification approaches mostly rely on statistical estimation with Bayesian neural networks or ensemble methods. However, these approaches do not capture the spatiotemporal evolution of uncertainty under varying contexts, and they suffer from the poor efficiency of statistical uncertainty estimation, which requires training models multiple times. To provide high-quality uncertainty quantification for spatiotemporal forecasting, we propose an uncertainty learning mechanism that simultaneously estimates internal data quality and quantifies external uncertainty arising from various contextual interactions. To address the lack of uncertainty labels, we propose a hierarchical data turbulence scheme that actively injects controllable uncertainty for guidance, providing insights for both uncertainty quantification and weakly supervised learning. Finally, we re-calibrate and boost the prediction performance by devising a gating-based bridge that adaptively incorporates the learned uncertainty into the predictions. Extensive experiments on three real-world spatiotemporal mobility datasets corroborate the superiority of our proposed model in terms of both forecasting accuracy and uncertainty quantification.
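The gating-based bridge can be pictured with a short PyTorch-style sketch: a sigmoid gate computed from the point forecast and the learned uncertainty decides how strongly an uncertainty-aware correction adjusts the final prediction. Module layout, tensor shapes, and layer sizes are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class GatedUncertaintyBridge(nn.Module):
    """Illustrative gating bridge: blend the raw forecast with an
    uncertainty-aware correction via a learned sigmoid gate."""

    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(2 * dim, dim), nn.Sigmoid())
        self.correction = nn.Linear(2 * dim, dim)

    def forward(self, pred, unc):
        h = torch.cat([pred, unc], dim=-1)      # forecast and learned uncertainty
        g = self.gate(h)                        # per-feature weight in (0, 1)
        return g * pred + (1.0 - g) * self.correction(h)

# Toy usage with hypothetical shapes: a batch of 32 regions, 16 features each
bridge = GatedUncertaintyBridge(dim=16)
pred, unc = torch.randn(32, 16), torch.rand(32, 16)
calibrated = bridge(pred, unc)
```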
The capability to achieve high-precision positioning has been considered one of the most critical requirements for vehicle-to-everything (V2X) services in fifth-generation (5G) cellular networks. Non-line-of-sight (NLOS) connectivity, coverage and reliability requirements, the minimum number of available anchors, and bandwidth limitations are among the main challenges to achieving high accuracy in V2X services. This work provides an overview of potential solutions for providing new radio (NR) V2X users (UEs) with high positioning accuracy in future 3GPP releases. In particular, we propose a novel selective positioning solution that dynamically switches between different positioning technologies to improve the overall positioning accuracy in NR V2X services, taking into account the locations of the V2X UEs and the accuracy of the collected measurements. Furthermore, we use high-fidelity system-level simulations to evaluate the performance gains of fusing positioning measurements from different technologies in NR V2X services. Our numerical results show that the proposed hybrid schemes achieve a positioning error $\boldsymbol{\leq}$ 3 m with $\boldsymbol{\approx}$ 76% availability, compared to $\boldsymbol{\approx}$ 55% availability when traditional positioning methods are used. The numerical results also reveal a potential gain of $\boldsymbol{\approx}$ 56% from leveraging road-side units (RSUs) to improve the tail of the UE positioning error distribution, i.e., the worst-case scenarios, in NR V2X services.
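One way to picture the selective and hybrid schemes is the following minimal sketch: each available technology reports a position fix together with an estimated accuracy, the selective scheme keeps the best single fix when it already meets the target, and otherwise all fixes are fused by inverse-variance weighting. The technologies, variances, and switching threshold are hypothetical examples, not the evaluated system-level configuration.

```python
import numpy as np

def fuse_position_estimates(estimates):
    """Inverse-variance weighted fusion of 2-D position fixes.

    estimates: list of (position, variance) pairs, position = (x, y) in metres,
               variance = expected squared error of that technology's fix."""
    positions = np.array([p for p, _ in estimates], dtype=float)
    weights = np.array([1.0 / v for _, v in estimates])
    return (weights[:, None] * positions).sum(axis=0) / weights.sum()

def select_position(estimates, switch_threshold=9.0):
    """Selective scheme: keep the most accurate single fix if it meets a
    (hypothetical) accuracy target, otherwise fuse all measurements."""
    best_pos, best_var = min(estimates, key=lambda e: e[1])
    if best_var <= switch_threshold:
        return np.asarray(best_pos)
    return fuse_position_estimates(estimates)

# Hypothetical fixes for one V2X UE: (x, y) in metres and estimated variance in m^2
fixes = [((10.2, 4.8), 25.0),    # cellular multilateration, NLOS-degraded
         ((11.0, 5.1), 4.0),     # RSU-assisted ranging, line of sight
         ((10.6, 5.4), 16.0)]    # GNSS in an urban canyon
position = select_position(fixes)
```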
Partition of unity methods (PUMs) on graphs are simple and highly adaptive auxiliary tools for graph signal processing. Based on a greedy-type metric clustering and augmentation scheme, we show how a partition of unity can be generated efficiently on a graph. We investigate how PUMs can be combined with a local graph basis function (GBF) approximation method in order to obtain low-cost global interpolation or classification schemes. From a theoretical point of view, we study the prerequisites of the partition of unity under which global error estimates of the PUM follow from corresponding local ones. Finally, properties of the PUM such as cost-efficiency and approximation accuracy are investigated numerically.
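A compact sketch of such a PUM pipeline, assuming hop-distance clustering, a one-hop augmentation ring, Shepard-type weights normalized to sum to one, and a diffusion-kernel GBF as the local interpolant; the clustering rule, weight functions, and parameters are illustrative stand-ins for the paper's greedy scheme.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def graph_pum_interpolation(A, sample_idx, sample_vals,
                            n_subdomains=4, overlap=1, t=2.0):
    """Greedy metric clustering + local diffusion-kernel (GBF) interpolation,
    blended with Shepard-type weights normalized to a partition of unity."""
    n = A.shape[0]
    hops = shortest_path(A, unweighted=True)            # pairwise hop distances
    given = dict(zip(sample_idx, sample_vals))

    # Greedy farthest-point selection of centres, then nearest-centre clustering.
    centres = [0]
    for _ in range(n_subdomains - 1):
        centres.append(int(np.argmax(hops[:, centres].min(axis=1))))
    labels = np.argmin(hops[:, centres], axis=1)

    result, weight_sum = np.zeros(n), np.zeros(n)
    for j in range(n_subdomains):
        cluster = np.where(labels == j)[0]
        dist = hops[:, cluster].min(axis=1)              # hop distance to cluster j
        sub = np.where(dist <= overlap)[0]               # cluster plus augmentation ring
        known = [i for i in sub if i in given]
        if not known:
            continue                                     # no samples in this subdomain
        # Local GBF interpolation with a diffusion kernel on the subgraph.
        Asub = A[np.ix_(sub, sub)]
        lam, U = np.linalg.eigh(np.diag(Asub.sum(axis=1)) - Asub)
        K = (U * np.exp(-t * lam)) @ U.T
        pos = {node: k for k, node in enumerate(sub)}
        ks = [pos[i] for i in known]
        c = np.linalg.solve(K[np.ix_(ks, ks)], [given[i] for i in known])
        local = K[:, ks] @ c                             # local interpolant on `sub`
        # Shepard-type weight: positive on the subdomain, decaying on the ring.
        w = np.maximum(0.0, 1.0 - dist / (overlap + 1))
        result[sub] += w[sub] * local
        weight_sum += w
    return result / np.maximum(weight_sum, 1e-12)        # normalization -> partition of unity

# Toy usage on a 30-node cycle graph with a few known values (hypothetical data)
m = 30
A = np.zeros((m, m))
idx = np.arange(m)
A[idx, (idx + 1) % m] = A[(idx + 1) % m, idx] = 1.0
signal = graph_pum_interpolation(A, [0, 7, 15, 22], [1.0, 0.0, -1.0, 0.0])
```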