
Machine Learning based detection of multiple Wi-Fi BSSs for LTE-U CSAT

Published by Adam Dziedzic
Publication date: 2019
Research field: Informatics Engineering
Paper language: English





According to the LTE-U Forum specification, an LTE-U base-station (BS) reduces its duty cycle from 50% to 33% when it senses an increase in the number of co-channel Wi-Fi basic service sets (BSSs) from one to two. Detecting the number of Wi-Fi BSSs operating on the channel in real time, without decoding the Wi-Fi packets, remains a challenge. In this paper, we present a novel machine learning (ML) approach that solves the problem by using energy values observed during the LTE-U OFF duration. Observing the energy values (during the LTE-U BS OFF time) is a much simpler operation than decoding entire Wi-Fi packets. We implement and validate the proposed ML-based approach in real-time experiments and demonstrate that the energy values exhibit two distinct patterns for one versus two Wi-Fi APs. The approach achieves an accuracy close to 100%, outperforming the auto-correlation (AC) and energy detection (ED) approaches.
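The abstract does not specify the classifier or features used, so the following is only a minimal sketch of the idea it describes: summarize the energy values captured during an LTE-U OFF period as a fixed-length feature vector and train a small classifier to separate one from two co-channel Wi-Fi BSSs. The histogram features, the MLPClassifier choice, and the synthetic traces are illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the paper's code): classify the number of co-channel Wi-Fi BSSs
# from energy samples captured during LTE-U OFF periods.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def energy_features(energy_db, bins=32, lo=-100.0, hi=-20.0):
    """Summarize one OFF-period trace of energy values (dBm) as a normalized histogram."""
    hist, _ = np.histogram(energy_db, bins=bins, range=(lo, hi))
    return hist / max(hist.sum(), 1)

# X_raw: per-OFF-period energy traces; y: 1 or 2 active Wi-Fi BSSs.
# Synthetic placeholders stand in for real captures at the LTE-U BS.
rng = np.random.default_rng(0)
X_raw = [rng.normal(-75 + 10 * (k - 1), 5, size=500) for k in (1, 2) for _ in range(200)]
y = np.array([k for k in (1, 2) for _ in range(200)])

X = np.array([energy_features(trace) for trace in X_raw])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```

In a real deployment the traces would come from energy measurements at the LTE-U BS during its OFF time; the synthetic data here merely exercises the pipeline.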




Read also

The application of Machine Learning (ML) techniques to complex engineering problems has proved to be an attractive and efficient solution. ML has been successfully applied to several practical tasks like image recognition, automating industrial operations, etc. The promise of ML techniques in solving non-linear problems motivated this work, which applies known ML techniques and develops new ones for wireless spectrum sharing between Wi-Fi and LTE in the unlicensed spectrum. We focus on the LTE-Unlicensed (LTE-U) specification developed by the LTE-U Forum, which uses a duty-cycle approach for fair coexistence. The specification suggests reducing the duty cycle at the LTE-U base-station (BS) when the number of co-channel Wi-Fi basic service sets (BSSs) increases from one to two or more. However, detecting the number of Wi-Fi BSSs operating on the channel in real time, without decoding the Wi-Fi packets, is a challenging problem. We demonstrate a novel ML-based approach that solves this problem by using energy values observed during the LTE-U OFF duration. Observing only the energy values during the LTE-U BS OFF time is far simpler than decoding entire Wi-Fi packets, which would require a full Wi-Fi receiver at the LTE-U base-station. We implement and validate the proposed ML-based approach in real-time experiments and demonstrate that the energy distributions exhibit distinct patterns for one versus multiple Wi-Fi AP transmissions. The proposed ML-based approach achieves higher accuracy (close to 99% in all cases) than the existing auto-correlation (AC) and energy detection (ED) approaches.
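For reference, the duty-cycle rule restated in this abstract can be written as a one-line policy. The function name and the handling of more than two BSSs are assumptions for illustration, not part of the LTE-U Forum text.

```python
# Hedged sketch of the duty-cycle rule described above: the LTE-U BS lowers its
# ON fraction from 50% to 33% once a second co-channel Wi-Fi BSS is detected.
def csat_duty_cycle(num_wifi_bss: int) -> float:
    """Return the LTE-U ON-time fraction for the detected number of Wi-Fi BSSs."""
    if num_wifi_bss <= 1:
        return 0.50
    return 0.33  # two or more co-channel BSSs detected (assumed treatment of >2)

print(csat_duty_cycle(1), csat_duty_cycle(2))
```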
Driven by growing spectrum shortage, Long-Term Evolution in unlicensed spectrum (LTE-U) has recently been proposed as a new paradigm to deliver better performance and experience for mobile users by extending the LTE protocol to unlicensed spectrum. In this paper, we first present a comprehensive overview of the LTE-U technology and discuss the practical challenges it faces. We summarize the existing LTE-U operation modes and analyze several means for LTE-U coexistence with Wi-Fi medium access control protocols. We further propose a novel hyper access-point (HAP) that integrates the functionalities of an LTE small cell base station and a commercial Wi-Fi AP for deployment by cellular network operators. Our proposed LTE-U access embedding within the Wi-Fi protocol is non-disruptive to unlicensed Wi-Fi nodes and serves as a seamless, novel LTE and Wi-Fi coexistence technology in the unlicensed band. We provide results that demonstrate the performance advantages of this proposal.
We show experimentally that workload-based AP-STA associations can improve system throughput significantly. We present a predictive model that guides optimal resource allocations in dense Wi-Fi networks and achieves 72-77% of the optimal throughput with varying training data set sizes, using a 3-day trace of real cable modem traffic.
Smartphone apps for exposure notification and contact tracing have been shown to be effective in controlling the COVID-19 pandemic. However, Bluetooth Low Energy tokens similar to those broadcast by existing apps can still be picked up far away from the transmitting device. In this paper, we present a new class of methods for detecting whether or not two Wi-Fi-enabled devices are in immediate physical proximity, i.e. 2 or fewer meters apart, as established by the U.S. Centers for Disease Control and Prevention (CDC). Our goal is to enhance the accuracy of smartphone-based exposure notification and contact tracing systems. We present a set of binary machine learning classifiers that take as input pairs of Wi-Fi RSSI fingerprints. We empirically verify that a single classifier cannot generalize well to a range of different environments with vastly different numbers of detectable Wi-Fi Access Points (APs). However, specialized classifiers, tailored to situations where the number of detectable APs falls within a certain range, are able to detect immediate physical proximity significantly more accurately. As such, we design three classifiers for situations with low, medium, and high numbers of detectable APs. These classifiers distinguish between pairs of RSSI fingerprints recorded 2 or fewer meters apart and pairs recorded further apart but still in Bluetooth range. We characterize their balanced accuracy for this task to be between 66.8% and 77.8%.
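A rough sketch of the routing idea described in the abstract above: pick one of three specialized classifiers according to how many APs both fingerprints contain, then classify the pair. The feature summary, AP-count thresholds, and RandomForest models are illustrative assumptions rather than the paper's design.

```python
# Illustrative sketch only: specialized binary classifiers, selected by how many
# Wi-Fi APs both devices can see, decide whether two RSSI fingerprints were
# recorded within ~2 m of each other.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def pair_features(fp_a: dict, fp_b: dict) -> np.ndarray:
    """Fixed-length summary of per-AP RSSI differences between two fingerprints."""
    common = sorted(set(fp_a) & set(fp_b))
    diffs = np.array([abs(fp_a[ap] - fp_b[ap]) for ap in common]) if common else np.zeros(1)
    return np.array([len(common), diffs.mean(), diffs.std(), diffs.max()])

def pick_model(num_common_aps: int, models: dict) -> RandomForestClassifier:
    """Route to the low/medium/high-AP model (thresholds are illustrative)."""
    if num_common_aps < 5:
        return models["low"]
    if num_common_aps < 15:
        return models["medium"]
    return models["high"]

# Train three placeholder models on synthetic feature/label pairs.
rng = np.random.default_rng(1)
models = {}
for band in ("low", "medium", "high"):
    X = rng.normal(size=(200, 4))
    y = rng.integers(0, 2, size=200)  # 1 = within 2 m, 0 = farther apart but in range
    models[band] = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

fp_a = {"ap1": -48, "ap2": -63, "ap3": -71}
fp_b = {"ap1": -50, "ap2": -60, "ap4": -80}
x = pair_features(fp_a, fp_b)
print(pick_model(int(x[0]), models).predict(x.reshape(1, -1))[0])
```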
We propose and experimentally evaluate a novel method that dynamically changes the contention window of access points based on system load to improve performance in a dense Wi-Fi deployment. A key feature is that no MAC protocol changes or client-side modifications are needed to deploy the solution. We show that setting an optimal contention window can lead to throughput and latency improvements of up to 155% and 50%, respectively. Furthermore, we devise an online learning method that efficiently finds the optimal contention window with minimal training data and yields an average throughput improvement of 53-55% during congested periods for a real traffic-volume workload replay in a Wi-Fi test-bed.
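As a hedged illustration of the online-learning idea in the abstract above, an epsilon-greedy search over a handful of candidate contention windows could look like the following. The candidate window values and the synthetic throughput measurement are assumptions, not the authors' method.

```python
# Sketch: epsilon-greedy bandit searching for the contention window with the
# best observed throughput; an illustration of online learning, not the paper's algorithm.
import random

CANDIDATE_CWS = [15, 31, 63, 127, 255, 511]  # typical 802.11-style window sizes (assumed)

def choose_cw(avg_reward: dict, epsilon: float = 0.1) -> int:
    """Explore a random window with probability epsilon, otherwise exploit the best so far."""
    if random.random() < epsilon or not avg_reward:
        return random.choice(CANDIDATE_CWS)
    return max(avg_reward, key=avg_reward.get)

def update(avg_reward: dict, counts: dict, cw: int, throughput: float) -> None:
    """Incrementally update the running mean throughput observed for this window."""
    counts[cw] = counts.get(cw, 0) + 1
    avg_reward[cw] = avg_reward.get(cw, 0.0) + (throughput - avg_reward.get(cw, 0.0)) / counts[cw]

# Example loop; a real test-bed would measure throughput per interval (placeholder here).
avg_reward, counts = {}, {}
for _ in range(100):
    cw = choose_cw(avg_reward)
    throughput = 100.0 / (1 + abs(cw - 63) / 63)  # synthetic stand-in measurement
    update(avg_reward, counts, cw, throughput)
print("best window so far:", max(avg_reward, key=avg_reward.get))
```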
