We investigate a novel anytime control algorithm for wireless networked control systems with random packet dropouts. The controller computes sequences of tentative future control commands using time-varying (Markovian) computational resources. The sensor-controller and controller-actuator channel states are spatially and temporally correlated, and are modeled jointly as a multi-state Markov process. To compensate for packet dropouts, a dual-buffer mechanism is proposed. We develop a novel cycle-cost-based approach to derive stability conditions relating the nonlinear plant, the controller, the network, and the computational resources.
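The buffering idea behind dropout compensation can be illustrated with a minimal sketch (the scalar plant, the command-generation law, and all names here are illustrative assumptions, not the paper's model): the controller transmits a sequence of tentative future commands each step, and the actuator buffers the most recently received sequence, consuming one unused command per step whenever a packet is lost.

```python
import random

def run(T=50, seq_len=5, drop_prob=0.3, seed=0):
    """Toy packetized control with an actuator-side buffer.

    Each step the controller transmits a sequence of tentative future
    commands; on a dropout the actuator falls back to the next unused
    command from the last sequence it successfully received.
    """
    rng = random.Random(seed)
    x = 1.0                       # scalar plant state (illustrative)
    buffer, idx = [0.0], 0        # last received command sequence, read index
    for _ in range(T):
        # Controller: plan seq_len steps of a simple decaying feedback law.
        plan = [-0.8 * x * (0.5 ** k) for k in range(seq_len)]
        if rng.random() > drop_prob:           # packet delivered
            buffer, idx = plan, 0
        elif idx + 1 < len(buffer):            # dropout: advance in buffer
            idx += 1
        u = buffer[idx]
        x = 0.9 * x + u                        # toy linear plant update
    return abs(x)
```

Despite a 30% dropout rate, the buffered tentative commands keep contracting the state, which is the intuition the cycle-cost stability analysis formalizes.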
There has been substantial recent progress in understanding toy problems of purely implicit signaling. These are problems where both the source and the channel are implicit: the message is generated endogenously by the system, and the plant itself is used as a channel. In this paper, we explore how implicit and explicit communication can be used synergistically to reduce control costs. The setting is an extension of Witsenhausen's counterexample in which a rate-limited external channel connects the two controllers. Using a semi-deterministic version of the problem, we arrive at a binning-based strategy that can outperform the best previously known strategies by an arbitrarily large factor. We also show that our binning-based strategy attains within a constant factor of the optimal cost for an asymptotically infinite-length version of the problem, uniformly over all problem parameters and all rates on the external channel. For the scalar case, although our results yield approximate optimality for each fixed rate, we are unable to prove approximate optimality uniformly over all rates.
Integrated sensing and communication (ISAC) is a promising technology to improve bandwidth-utilization efficiency via spectrum sharing or hardware sharing between radar and communication systems. Since a common radio resource budget is shared by both functionalities, there exists a tradeoff between sensing and communication performance. However, this tradeoff curve is currently unknown for ISAC systems with human motion recognition tasks based on deep learning. To fill this gap, this paper formulates and solves a multi-objective optimization problem that simultaneously maximizes the recognition accuracy and the communication data rate. The key ingredient of this new formulation is a nonlinear recognition-accuracy model with respect to the wireless resources, derived from power-function regression of the performance of the deep spectrogram network. To avoid costly data-collection procedures, a primitive-based autoregressive hybrid (PBAH) channel model is developed, which facilitates efficient training and testing dataset generation for human motion recognition in a virtual environment. Extensive results demonstrate that the proposed recognition-accuracy and PBAH channel models match the actual experimental data very well. Moreover, it is found that the accuracy-rate region consists of a communication saturation zone, a sensing saturation zone, and a communication-sensing adversarial zone, of which the third zone achieves the desirable balanced performance for ISAC systems.
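The shape of such an accuracy-rate tradeoff can be sketched numerically (all coefficients below are hypothetical placeholders, not the regressed values from the paper): a power-function accuracy model with diminishing returns in the sensing resource share, against a Shannon rate on the remaining communication share.

```python
import math

def accuracy(rho, a=0.95, c=0.12, k=0.8):
    """Hypothetical power-function accuracy model: a larger sensing share
    rho in (0, 1] yields higher recognition accuracy, with diminishing
    returns (power-regression form; coefficients are illustrative)."""
    return max(0.0, a - c * rho ** (-k)) if rho > 0 else 0.0

def rate(rho, bandwidth_hz=10e6, snr_db=10.0):
    """Shannon rate (bit/s) on the communication share (1 - rho)."""
    snr = 10 ** (snr_db / 10)
    return (1 - rho) * bandwidth_hz * math.log2(1 + snr)

def tradeoff(n=9):
    """Sample the accuracy-rate region over resource splits rho."""
    return [(round(r, 2), accuracy(r), rate(r))
            for r in ((i / (n + 1)) for i in range(1, n + 1))]
```

Sweeping `rho` traces the boundary of the region: accuracy saturates at large `rho` (sensing saturation) while rate saturates at small `rho` (communication saturation), with the adversarial zone in between.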
We characterize a practical photon-counting receiver for optical scattering communication with finite sampling rate and electrical noise. At the receiver, the detected signal can be characterized as a series of pulses generated by a photomultiplier tube (PMT) detector and held by pulse-holding circuits, which are then sampled by an analog-to-digital converter (ADC) with finite sampling rate and counted by a rising-edge pulse detector. However, the finite pulse width incurs a dead-time effect that may lead to a sub-Poisson distribution of the recorded pulses. We analyze the first- and second-order moments of the number of recorded pulses under finite sampling rate in two cases: when the sampling period is shorter than or equal to the pulse width, and when it is longer than the pulse width. Moreover, we adopt maximum likelihood (ML) detection. To simplify the analysis, we approximate the number of recorded pulses in each slot by a binomial distribution. A tractable holding-time and decision-threshold selection rule is provided, aiming to maximize the minimal Kullback-Leibler (KL) distance between the two distributions. The proposed sub-Poisson distribution and the binomial approximation are verified by experimental results. The equivalent arrival rate and holding time predicted by the sub-Poisson model, together with the associated binomial distribution accounting for finite sampling rate and electrical noise, are validated by simulation results. The proposed holding-time and decision-threshold selection rule performs close to the optimal one.
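The max-min KL selection rule can be sketched as follows (the binomial parameters below are illustrative assumptions, not measured PMT values): approximate the recorded pulse counts in a signal slot and a noise-only slot by two binomial distributions, then choose the candidate holding time whose pair of distributions maximizes the minimum of the two directed KL distances.

```python
import math

def binom_pmf(n, p):
    """Binomial pmf over counts 0..n."""
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def kl(p, q, eps=1e-12):
    """Directed KL distance D(p || q), smoothed against zero bins."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def pick_holding_time(candidates, n=50):
    """candidates: (tau, p_on, p_off) triples, where p_on / p_off are the
    per-sample probabilities of recording a pulse in a signal slot vs. a
    noise-only slot for holding time tau (illustrative dead-time model).
    Returns the tau maximizing the minimal directed KL distance."""
    best = None
    for tau, p_on, p_off in candidates:
        d = min(kl(binom_pmf(n, p_on), binom_pmf(n, p_off)),
                kl(binom_pmf(n, p_off), binom_pmf(n, p_on)))
        if best is None or d > best[1]:
            best = (tau, d)
    return best[0]
```

The max-min form guards both error directions at once, which is why the rule tracks the optimal threshold closely.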
Wireless connectivity has traditionally been regarded as an opaque data pipe carrying messages, whose context-dependent meaning and effectiveness have been ignored. Nevertheless, in emerging cyber-physical and autonomous networked systems, acquiring, processing, and sending excessive amounts of distributed real-time data, which end up being stale or useless to the end user, will cause communication bottlenecks, increased latency, and safety issues. We envision a communication paradigm shift that makes the Semantics of Information, i.e., the significance and the usefulness of messages with respect to the goal of data exchange, the underpinning of the entire communication process. This entails a goal-oriented unification of information generation, transmission, and usage, taking into account process dynamics, signal sparsity, data correlation, and semantic information attributes. We apply this structurally new, synergistic approach to a communication scenario where the destination is tasked with real-time source reconstruction for the purpose of remote actuation. Capitalizing on semantics-empowered sampling and communication policies, we show significant reduction in both reconstruction error and cost of actuation error, as well as in the number of uninformative samples generated.
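A minimal sketch of a semantics-empowered sampling policy, under assumptions of our own (a random-walk source and a simple change-triggered rule, not the paper's policies): transmit only when the source has drifted meaningfully from the receiver's last estimate, so uninformative samples are never generated in the first place.

```python
import random

def simulate(T=1000, threshold=0.5, seed=1):
    """Toy change-triggered (semantics-aware) sampling of a random-walk
    source. Returns (number of transmitted samples, mean squared
    reconstruction error at the destination)."""
    rng = random.Random(seed)
    x, estimate = 0.0, 0.0
    samples, sq_err = 0, 0.0
    for _ in range(T):
        x += rng.gauss(0, 0.1)               # source evolves
        if abs(x - estimate) > threshold:    # semantic trigger: drift matters
            estimate, samples = x, samples + 1
        sq_err += (x - estimate) ** 2
    return samples, sq_err / T
```

Setting `threshold=0.0` recovers sample-every-step communication; a positive threshold cuts the sample count sharply while keeping the reconstruction error bounded by the trigger level.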
Emerging 6G applications have generated great interest in arrays with a large number of antennas operating in the millimeter-wave and sub-THz bands for joint communication and localization. With such large arrays, the plane-wave approximation is often inaccurate because the system may operate in the near-field propagation region (Fresnel region), where the electromagnetic wavefront is spherical. In this case, the curvature of arrival (CoA) is a measure of the spherical wavefront that can be used to infer the source position using only a single large array. In this paper, we study a near-field tracking problem for inferring the state (i.e., the position and velocity) of a moving source with an ad-hoc observation model that accounts for the phase profile of a large receiving array. For this tracking problem, we derive the posterior Cramér-Rao lower bound (P-CRLB) and show the effects when the source moves inside and outside the Fresnel region. We provide insights on how the loss of positioning information outside the Fresnel region stems from an increase in the ranging error rather than from inaccuracies in angular estimation. Then, we investigate the performance of different Bayesian tracking algorithms in the presence of model mismatches and abrupt trajectory changes. Our results demonstrate the feasibility and high accuracy of most of the tracking approaches without the need for wideband signals or any synchronization scheme.
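The spherical-wavefront effect can be visualized with a small sketch (the uniform linear array geometry, spacing, and wavelength below are illustrative assumptions): compute the exact per-element phase from a near-field point source, subtract the best linear (plane-wave) fit, and use the residual as a proxy for the CoA information that vanishes beyond the Fresnel region.

```python
import math

def phase_profile(src_x, src_y, n=64, spacing=0.005, wavelength=0.01):
    """Exact per-element carrier phase (radians) for a point source at
    (src_x, src_y), seen by a uniform linear array on the x-axis
    centered at the origin (illustrative geometry)."""
    k = 2 * math.pi / wavelength
    xs = [(i - (n - 1) / 2) * spacing for i in range(n)]
    return [k * math.hypot(src_x - x, src_y) for x in xs]

def curvature_residual(src_y, **kw):
    """RMS deviation of the phase profile from its best linear fit,
    i.e. the part a plane-wave model cannot explain (a CoA proxy)."""
    ph = phase_profile(0.0, src_y, **kw)
    n = len(ph)
    xs = list(range(n))
    mx, mp = sum(xs) / n, sum(ph) / n
    slope = (sum((x - mx) * (p - mp) for x, p in zip(xs, ph))
             / sum((x - mx) ** 2 for x in xs))
    res = [p - (mp + slope * (x - mx)) for x, p in zip(xs, ph)]
    return math.sqrt(sum(r * r for r in res) / n)
```

For this 0.32 m aperture at a 1 cm wavelength, the residual is large at half a meter and nearly flat at 50 m, matching the intuition that ranging information from the wavefront curvature is lost outside the Fresnel region.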