
Software-Defined Radio Implementation of Age-of-Information-Oriented Random Access

Added by He Chen
Publication date: 2020
Language: English





More and more emerging Internet of Things (IoT) applications involve status updates, in which various IoT devices monitor certain physical processes and report their latest statuses to the relevant information fusion nodes. A new performance measure, termed the age of information (AoI), has recently been proposed to quantify information freshness in time-critical IoT applications. Due to the large number of devices in future IoT networks, decentralized channel access protocols (e.g., random access) are preferable thanks to their low network overhead. Building on the AoI concept, recent efforts have developed several AoI-oriented ALOHA-like random access protocols for boosting network-wide information freshness. However, all relevant works have focused on theoretical designs and analysis; the development and implementation of a working prototype to evaluate and further improve these random access protocols in practice have been largely overlooked. Motivated by this gap, we build a software-defined radio (SDR) prototype for testing and comparing the performance of recently proposed AoI-oriented random access protocols. To this end, we implement a time-slotted wireless system by devising a simple yet effective over-the-air time synchronization scheme, in which beacons serving as reference timing packets are broadcast by an access point from time to time. For a complete working prototype, we also design the frame structures of the various packets exchanged within the system. Finally, we design a set of experiments, implement them on our prototype, and test the considered algorithms in an office environment.
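To make the beacon-based synchronization concrete, below is a minimal Python sketch of how a node could anchor its local clock to the access point's slot grid from a received beacon. Everything here (the Beacon fields, SLOT_US, the SlotClock class) is an illustrative assumption, not the prototype's actual frame format or code.

from dataclasses import dataclass

SLOT_US = 1_000  # assumed slot length: 1 ms

@dataclass
class Beacon:
    slot_index: int   # AP's slot counter when the beacon was sent
    ap_time_us: int   # AP timestamp carried in the beacon payload

class SlotClock:
    """Maps local time onto the AP's slot grid; valid after one beacon."""
    def __init__(self):
        self.base_slot = None
        self.base_local_us = None

    def on_beacon(self, b: Beacon, local_rx_time_us: int):
        # Re-anchor at every beacon; propagation delay over office
        # distances is far below a sample period and is ignored here.
        self.base_slot = b.slot_index
        self.base_local_us = local_rx_time_us

    def current_slot(self, local_time_us: int) -> int:
        # Whole slots elapsed since the anchoring beacon.
        return self.base_slot + (local_time_us - self.base_local_us) // SLOT_US

    def next_slot_start(self, local_time_us: int) -> int:
        # Local time of the next slot boundary, for timed transmissions.
        elapsed = local_time_us - self.base_local_us
        return self.base_local_us + (elapsed // SLOT_US + 1) * SLOT_US

Re-anchoring at every beacon bounds the accumulated clock drift between an ordinary node and the access point to the drift over a single beacon interval.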



Related research

This paper presents the design and implementation of a signaling splitting scheme for hyper-cellular networks on a software-defined radio platform. The hyper-cellular network is a novel architecture for future mobile communication systems in which signaling and data are decoupled at the air interface to mitigate signaling overhead and allow energy-efficient operation of base stations. On an open-source software-defined radio platform, OpenBTS, we investigate the feasibility of signaling splitting for the GSM protocol and implement a novel system that demonstrates the proposed concept. Standard GSM handsets can camp on the network with the help of the signaling base station, and a data base station is appointed on demand to handle phone calls. Our work initiates a systematic approach to studying the hyper-cellular concept in real wireless environments with both software and hardware implementations.
In this paper, an analysis is carried out of a method to mitigate path loss through dynamic spectrum access (DSA). Path loss is a major component determining the QoS of a wireless link, and its effect is compounded by obstructions between the transmitter and receiver. Future cellular networks (5G) focus on operating in the millimeter-wave (mmW) bands, where path loss can significantly degrade link quality due to higher attenuation. In a scenario where the operating environment changes dynamically, a sudden degradation of operating conditions or the arrival of an obstruction between transmitter and receiver may result in link failure. The method analyzed here dynamically allocates spectrum in a lower frequency band to a link suffering from high path loss. For the analysis, a wireless link was set up using Universal Software Radio Peripherals (USRPs). The received power is observed to increase when the operating frequency is dynamically changed from 1.9 GHz to 830 MHz. Finally, the utility of software-defined radio (SDR) in the RF front end for combating path loss in future cellular networks is studied.
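The band-switching decision itself is simple enough to sketch with the UHD Python API; the threshold, gain, and sample counts below are illustrative assumptions rather than the parameters used in this experiment.

import numpy as np
import uhd

HIGH_BAND_HZ = 1.9e9      # primary operating frequency
LOW_BAND_HZ = 830e6       # fallback band with lower path loss
POWER_FLOOR_DBFS = -55.0  # assumed link-quality threshold

usrp = uhd.usrp.MultiUSRP("")

def mean_power_dbfs(freq_hz, num_samps=100_000, rate=1e6, gain=30):
    # Capture a short burst and estimate the average received power.
    samps = usrp.recv_num_samps(num_samps, freq_hz, rate, [0], gain)
    return 10 * np.log10(np.mean(np.abs(samps) ** 2) + 1e-12)

freq = HIGH_BAND_HZ
if mean_power_dbfs(freq) < POWER_FLOOR_DBFS:
    # Path loss or a new obstruction has degraded the link:
    # dynamically fall back to the lower band.
    freq = LOW_BAND_HZ
print(f"operating at {freq/1e6:.0f} MHz, rx power {mean_power_dbfs(freq):.1f} dBFS")

In a full DSA system this check would run periodically, and both ends of the link would need to agree on the switch, e.g., via a control channel.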
Jiaxin Liang, He Chen, 2020
Time-sensitive wireless networks are an important enabling building block for many emerging industrial Internet of Things (IoT) applications. Quick prototyping and evaluation of time-sensitive wireless technologies are desirable for R&D efforts. Software-defined radio (SDR), by allowing wireless signal processing on a personal computer (PC), has been widely used for such quick prototyping efforts. Unfortunately, because of the uncontrollable delay between the PC and the radio board, SDR is generally deemed unsuitable for time-sensitive wireless applications that demand communication with low and deterministic latency. For a rigorous evaluation of its suitability for industrial IoT applications, this paper conducts a quantitative investigation of the synchronization accuracy and end-to-end latency achievable by an SDR wireless system. To this end, we designed and implemented a time-slotted wireless system on the Universal Software Radio Peripheral (USRP) SDR platform. We developed a time synchronization mechanism to maintain synchrony among nodes in the system. To reduce the delays and delay jitters between the USRP board and its PC, we devised a just-in-time algorithm to ensure that packets sent by the PC to the USRP reach the USRP just before the time slots in which they are to be transmitted. Our experiments demonstrate that 90% (100%) of the time slots of different nodes can be synchronized and aligned to within ±0.5 samples or ±0.05 µs (±1.5 samples or ±0.15 µs), and that the end-to-end packet delivery latency can be as low as 3.75 ms. This means that SDR-based solutions can be applied in a range of IIoT applications that require tight synchrony and moderately low latency, e.g., sensor data collection, automated guided vehicle (AGV) control, and human-machine interaction (HMI).
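The just-in-time idea can be sketched in a few lines of Python; the slot length, guard interval, and radio_send callback (standing in for a timed UHD transmission, e.g., a send stamped with a future time_spec) are assumptions for illustration, not the authors' implementation.

import time

SLOT_S = 0.001    # assumed 1 ms slots
GUARD_S = 0.0003  # assumed lead time: release 300 us before the slot

def jit_send(packets, epoch_s, radio_send):
    """packets: iterable of (slot_index, payload); slot 0 starts at epoch_s."""
    for slot, payload in sorted(packets):
        release = epoch_s + slot * SLOT_S - GUARD_S
        # Coarse sleep first, then a short busy-wait for tighter timing
        # than time.sleep() alone can provide.
        while time.monotonic() < release - 0.0005:
            time.sleep(0.0001)
        while time.monotonic() < release:
            pass
        radio_send(payload, tx_time_s=epoch_s + slot * SLOT_S)

Releasing each packet only a guard interval ahead of its slot keeps the PC-to-USRP queue nearly empty, which is what bounds the delay jitter.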
Jordi Paillisse, 2020
Enterprise networks have, over the years, become more and more complex in trying to keep up with new requirements that challenge traditional solutions. To mention one of many possible examples, technologies such as Virtual LANs (VLANs) struggle to address the scalability and operational requirements introduced by Internet of Things (IoT) use cases. To keep up with these challenges, we have identified four main requirements that are common across modern enterprise networks: (i) scalable mobility, (ii) endpoint segmentation, (iii) simplified administration, and (iv) resource optimization. To address them we designed SDA (Software Defined Access), a solution for modern enterprise networks that leverages Software-Defined Networking (SDN) and other state-of-the-art techniques. In this paper we present the design, implementation, and evaluation of SDA. Specifically, SDA (i) leverages a combination of an overlay approach with an event-driven protocol (LISP) to dynamically adapt to traffic and mobility patterns while preserving resources, and (ii) enforces dynamic endpoint groups for scalable segmentation with low operational burden. We present our experience with deploying SDA in two real-life scenarios: an enterprise campus, and a large warehouse with mobile robots. Our evaluation shows that SDA, compared with traditional enterprise networks, can (i) reduce overall data-plane forwarding state by up to 70% thanks to a reactive protocol using a centralized routing server, and (ii) reduce handover delays by an order of magnitude in scenarios of massive mobility with respect to other approaches. Finally, we discuss lessons learned while deploying and operating SDA, and possible optimizations regarding the use of an event-driven protocol and group-based segmentation.
Timeliness is an emerging requirement for many Internet of Things (IoT) applications. In IoT networks, where a large number of nodes are distributed, severe interference may occur during the transmission phase, causing age of information (AoI) degradation. It is therefore important to study the performance limit of AoI as well as how to achieve that limit. In this paper, we aim to optimize the AoI in random access Poisson networks. By taking into account the spatio-temporal interactions amongst the transmitters, an expression for the peak AoI is derived, based on which explicit expressions for the optimal peak AoI and the corresponding optimal system parameters, including the packet arrival rate and the channel access probability, are further derived. It is shown that for a given packet arrival rate (resp. a given channel access probability), the optimal channel access probability (resp. the optimal packet arrival rate) is equal to one under a small node deployment density, and decreases monotonically as the spatial deployment density increases, owing to the severe interference caused by the spatio-temporal coupling between transmitters. When the packet arrival rate and the channel access probability are jointly tuned, the optimal channel access probability is always set to one. Moreover, with the sole tuning of the channel access probability, the optimal peak AoI performance can be improved with a smaller packet arrival rate only when the node deployment density is high; this is in contrast to the sole tuning of the packet arrival rate, where a higher channel access probability always leads to a better optimal peak AoI regardless of the node deployment density. In all cases of optimal parameter tuning, the optimal peak AoI grows linearly with the node deployment density, as opposed to the exponential growth observed with fixed system parameters.
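As a rough, qualitative illustration of the interplay between the packet arrival rate and the channel access probability, here is a toy Python simulation of the average peak AoI under a plain collision-channel (slotted-ALOHA) model. It deliberately omits this paper's spatial Poisson field and SINR-based interference, so its numbers are not comparable to the analytical results above.

import random

def sim_peak_aoi(n=50, arrival=0.2, access=0.1, slots=200_000):
    age = [0] * n        # current AoI of each source
    fresh = [None] * n   # age of the buffered (latest) packet, if any
    peak_sum, deliveries = 0, 0
    for _ in range(slots):
        for i in range(n):
            age[i] += 1
            if fresh[i] is not None:
                fresh[i] += 1
            if random.random() < arrival:
                fresh[i] = 0          # new update replaces the stale one
        tx = [i for i in range(n)
              if fresh[i] is not None and random.random() < access]
        if len(tx) == 1:              # success only without a collision
            i = tx[0]
            peak_sum += age[i]        # AoI just before the reset = a peak
            deliveries += 1
            age[i], fresh[i] = fresh[i], None
    return peak_sum / max(deliveries, 1)

print(sim_peak_aoi(access=0.02), sim_peak_aoi(access=0.2))

Sweeping access over (0, 1] for a fixed arrival reproduces the familiar ALOHA trade-off: too small an access probability starves the channel, too large a value causes collisions, and the peak AoI is minimized in between.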