
Analysis of Path Loss mitigation through Dynamic Spectrum Access: Software Defined Radio

Published by: Chandan Pradhan
Publication date: 2015
Research field: Informatics Engineering
Paper language: English





In this paper, an analysis is carried out of a method to mitigate path loss through dynamic spectrum access (DSA). Path loss is a major factor in determining the QoS of a wireless link, and its effect is compounded by obstructions between the transmitter and receiver. Future cellular networks (5G) are expected to operate in the millimeter-wave (mmW) bands, where path loss can significantly degrade link quality because of the higher attenuation at these frequencies. In a scenario where the operating environment changes dynamically, a sudden degradation of operating conditions or the arrival of an obstruction between transmitter and receiver may result in link failure. The method analyzed here dynamically allocates spectrum in a lower frequency band to a link suffering from high path loss. For the analysis, a wireless link was set up using Universal Software Radio Peripherals (USRPs), and the received power was observed to increase when the operating frequency was dynamically changed from 1.9 GHz to 830 MHz. Finally, the utility of software defined radio (SDR) in the RF front end for combating path loss in future cellular networks is studied.
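The control logic behind this method is straightforward: monitor the received power on the high-frequency link and, when it drops below an acceptable level, retune the SDR front end to the lower band. The following minimal sketch illustrates that loop using the UHD Python API's MultiUSRP convenience helpers; the power threshold, gain, and sample rate are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch (not the paper's code): dynamic spectrum access by retuning a USRP
# receiver from 1.9 GHz down to 830 MHz when the measured received power drops
# below a threshold. Threshold, gain, and sample rate are illustrative assumptions.
import numpy as np
import uhd

HIGH_BAND_HZ = 1.9e9          # primary operating frequency
LOW_BAND_HZ = 830e6           # fallback band with lower path loss
SAMPLE_RATE = 1e6             # assumed sampling rate
POWER_THRESHOLD_DBFS = -50.0  # assumed link-quality threshold (relative to full scale)

usrp = uhd.usrp.MultiUSRP()

def measure_rx_power_dbfs(center_freq_hz, num_samps=4096):
    """Capture a short burst at the given frequency and return its average power in dBFS."""
    samps = usrp.recv_num_samps(num_samps, center_freq_hz, SAMPLE_RATE, [0], 30)
    power = np.mean(np.abs(samps) ** 2)
    return 10 * np.log10(power + 1e-12)

# Monitor the high-band link; fall back to the lower band on degradation.
current_freq = HIGH_BAND_HZ
rx_power = measure_rx_power_dbfs(current_freq)
if rx_power < POWER_THRESHOLD_DBFS:
    current_freq = LOW_BAND_HZ          # dynamically reallocate spectrum
    rx_power = measure_rx_power_dbfs(current_freq)
print(f"Operating at {current_freq / 1e6:.0f} MHz, received power {rx_power:.1f} dBFS")
```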


Read also

More and more emerging Internet of Things (IoT) applications involve status updates, where various IoT devices monitor certain physical processes and report their latest statuses to the relevant information fusion nodes. A new performance measure, termed the age of information (AoI), has recently been proposed to quantify the information freshness in time-critical IoT applications. Due to a large number of devices in future IoT networks, the decentralized channel access protocols (e.g. random access) are preferable thanks to their low network overhead. Built on the AoI concept, some recent efforts have developed several AoI-oriented ALOHA-like random access protocols for boosting the network-wide information freshness. However, all relevant works focused on theoretical designs and analysis. The development and implementation of a working prototype to evaluate and further improve these random access protocols in practice have been largely overlooked. Motivated as such, we build a software-defined radio (SDR) prototype for testing and comparing the performance of recently proposed AoI-oriented random access protocols. To this end, we implement a time-slotted wireless system by devising a simple yet effective over-the-air time synchronization scheme, in which beacons that serve as reference timing packets are broadcast by an access point from time to time. For a complete working prototype, we also design the frame structures of various packets exchanged within the system. Finally, we design a set of experiments, implement them on our prototype and test the considered algorithms in an office environment.
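For context on what such protocols optimize, the sketch below (an illustrative simulation under assumed parameters, not the authors' prototype) estimates the network-average age of information for N sources sharing a slotted ALOHA channel, where each node transmits a fresh update with probability p per slot and a slot is successful only if exactly one node transmits.

```python
# Minimal AoI simulation sketch under slotted ALOHA (illustrative assumptions only).
import numpy as np

def average_aoi_slotted_aloha(num_nodes=10, tx_prob=0.1, num_slots=100_000, seed=0):
    rng = np.random.default_rng(seed)
    age = np.ones(num_nodes)          # AoI of each source, in slots
    total_age = 0.0
    for _ in range(num_slots):
        transmitters = rng.random(num_nodes) < tx_prob
        if transmitters.sum() == 1:   # successful (collision-free) slot
            age[transmitters] = 0     # the delivered update is fresh
        age += 1                      # everyone's information ages by one slot
        total_age += age.mean()
    return total_age / num_slots

# The throughput-optimal choice p = 1/N is a common baseline for AoI comparisons.
print(average_aoi_slotted_aloha(num_nodes=10, tx_prob=0.1))
```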
Jordi Paillisse, 2020
Enterprise networks have, over the years, become increasingly complex as they try to keep up with new requirements that challenge traditional solutions. Just to mention one out of many possible examples, technologies such as Virtual LANs (VLANs) struggle to address the scalability and operational requirements introduced by Internet of Things (IoT) use cases. We have identified four main requirements that are common across modern enterprise networks: (i) scalable mobility, (ii) endpoint segmentation, (iii) simplified administration, and (iv) resource optimization. To address these challenges we designed SDA (Software Defined Access), a solution for modern enterprise networks that leverages Software-Defined Networking (SDN) and other state-of-the-art techniques. In this paper we present the design, implementation and evaluation of SDA. Specifically, SDA: (i) leverages a combination of an overlay approach with an event-driven protocol (LISP) to dynamically adapt to traffic and mobility patterns while preserving resources, and (ii) enforces dynamic endpoint groups for scalable segmentation with low operational burden. We present our experience with deploying SDA in two real-life scenarios: an enterprise campus, and a large warehouse with mobile robots. Our evaluation shows that SDA, when compared with traditional enterprise networks, can (i) reduce overall data plane forwarding state up to 70% thanks to a reactive protocol using a centralized routing server, and (ii) reduce by an order of magnitude the handover delays in scenarios of massive mobility with respect to other approaches. Finally, we discuss lessons learned while deploying and operating SDA, and possible optimizations regarding the use of an event-driven protocol and group-based segmentation.
This paper presents the design and implementation of a signaling splitting scheme in hyper-cellular network on a software defined radio platform. Hyper-cellular network is a novel architecture of future mobile communication systems in which signaling and data are decoupled at the air interface to mitigate the signaling overhead and allow energy efficient operation of base stations. On an open source software defined radio platform, OpenBTS, we investigate the feasibility of signaling splitting for the GSM protocol and implement a novel system that validates the proposed concept. Standard GSM handsets can camp on the network with the help of the signaling base station, and a data base station will be appointed to handle phone calls on demand. Our work initiates the systematic approach to study the hyper-cellular concept in real wireless environments with both software and hardware implementations.
Xiang Tan, Li Zhou, Haijun Wang, 2021
With the development of 5G and the Internet of Things, large numbers of wireless devices need to share the limited spectrum resources. Dynamic spectrum access (DSA) is a promising paradigm to remedy the problem of inefficient spectrum utilization brought upon by the historical command-and-control approach to spectrum allocation. In this paper, we investigate the distributed DSA problem for multiple users in a typical multi-channel cognitive radio network. The problem is formulated as a decentralized partially observable Markov decision process (Dec-POMDP), and we propose a centralized off-line training and distributed on-line execution framework based on cooperative multi-agent reinforcement learning (MARL). We employ the deep recurrent Q-network (DRQN) to address the partial observability of the state for each cognitive user. The ultimate goal is to learn a cooperative strategy which maximizes the sum throughput of the cognitive radio network in a distributed fashion, without coordination information exchange between cognitive users. Finally, we validate the proposed algorithm in various settings through extensive experiments. From the simulation results, we observe that the proposed algorithm converges quickly and achieves almost the optimal performance.
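A minimal sketch of such a per-user recurrent Q-network is given below, assuming a GRU over the history of local observations and one Q-value per channel plus an idle action; the layer sizes and observation dimension are illustrative assumptions, not the authors' architecture.

```python
# Illustrative DRQN sketch for one cognitive user (assumed dimensions, not the paper's code).
import torch
import torch.nn as nn

class DRQN(nn.Module):
    def __init__(self, obs_dim, num_channels, hidden_dim=64):
        super().__init__()
        self.encoder = nn.Linear(obs_dim, hidden_dim)
        self.rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.q_head = nn.Linear(hidden_dim, num_channels + 1)  # +1 action: do not transmit

    def forward(self, obs_seq, hidden=None):
        # obs_seq: (batch, time, obs_dim) sequence of partial observations
        x = torch.relu(self.encoder(obs_seq))
        x, hidden = self.rnn(x, hidden)
        return self.q_head(x), hidden   # Q-values per step, plus recurrent state

# Distributed execution: each user keeps its own hidden state and acts greedily
# on its local observation, with no information exchange between users.
net = DRQN(obs_dim=8, num_channels=4)
obs = torch.zeros(1, 1, 8)               # one user, one time step
q_values, h = net(obs)
action = q_values[:, -1].argmax(dim=-1)  # pick the channel (or idle) with the largest Q-value
```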
Jiaxin Liang, He Chen, 2020
Time-sensitive wireless networks are an important enabling building block for many emerging industrial Internet of Things (IoT) applications. Quick prototyping and evaluation of time-sensitive wireless technologies are desirable for R&D efforts. Software-defined radio (SDR), by allowing wireless signal processing on a personal computer (PC), has been widely used for such quick prototyping efforts. Unfortunately, because of the uncontrollable delay between the PC and the radio board, SDR is generally deemed not suitable for time-sensitive wireless applications that demand communication with low and deterministic latency. For a rigorous evaluation of its suitability for industrial IoT applications, this paper conducts a quantitative investigation of the synchronization accuracy and end-to-end latency achievable by an SDR wireless system. To this end, we designed and implemented a time-slotted wireless system on the Universal Software Radio Peripheral (USRP) SDR platform. We developed a time synchronization mechanism to maintain synchrony among nodes in the system. To reduce the delays and delay jitters between the USRP board and its PC, we devised a "Just-in-time" algorithm to ensure that packets sent by the PC to the USRP can reach the USRP just before the time slots in which they are to be transmitted. Our experiments demonstrate that 90% (100%) of the time slots of different nodes can be synchronized and aligned to within ±0.5 samples or ±0.05 µs (±1.5 samples or ±0.15 µs), and that the end-to-end packet delivery latency can be down to 3.75 ms. This means that SDR-based solutions can be applied in a range of IIoT applications that require tight synchrony and moderately low latency, e.g., sensor data collection, automated guided vehicle (AGV) control, and Human-Machine-Interaction (HMI).
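The just-in-time idea can be captured in a few lines: the PC holds each packet and releases it to the USRP only a small lead time before its assigned slot, so the packet spends a minimal and predictable amount of time queued on the radio. The sketch below is an assumed illustration of that logic; push_to_usrp stands in for a hypothetical timed-transmit call, and the slot and lead-time values are not taken from the paper.

```python
# Illustrative "just-in-time" release sketch (assumed logic and parameters, not the authors' code).
import time

SLOT_DURATION_S = 0.005      # assumed slot length
LEAD_TIME_S = 0.001          # assumed PC-to-USRP transfer/jitter margin

def send_just_in_time(packet, slot_index, epoch_s, push_to_usrp):
    """Sleep until just before the target slot, then hand the packet to the radio."""
    slot_start = epoch_s + slot_index * SLOT_DURATION_S
    release_time = slot_start - LEAD_TIME_S
    delay = release_time - time.time()
    if delay > 0:
        time.sleep(delay)                # hold the packet on the PC, not in the USRP queue
    push_to_usrp(packet, slot_start)     # hypothetical call that transmits at slot_start

# Example: schedule packet 42 relative to a beacon-derived epoch (names are illustrative).
# send_just_in_time(b"status-update", 42, beacon_epoch, usrp_tx_at)
```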