
Echo State Network based Symbol Detection in Chaotic Baseband Wireless Communication

Published by: Hai-Peng Ren
Publication date: 2021
Research field: Electronic engineering
Paper language: English





In some Internet of Things (IoT) applications, multi-path propagation is a main constraint of the communication channel. Recently, the chaotic baseband wireless communication system (CBWCS) has shown promise for eliminating the inter-symbol interference (ISI) caused by multipath propagation. However, the current technique removes only part of the ISI, because only past decoded bits are available for the suboptimal decoding threshold calculation, while future transmitted bits also contribute to the threshold. The future information bits needed by the optimal decoding threshold are unavailable, which is an obstacle to further improving the bit error rate (BER) performance. Different from the previous method, which used an echo state network (ESN) to predict one future information bit, the method proposed in this paper uses an ESN to predict the optimal threshold directly. The proposed ESN-based threshold prediction simplifies symbol decoding by removing the threshold calculation from the transmitted symbols and channel information, and achieves better BER performance than the previous method. The reason for this superior result is twofold: first, the proposed ESN can exploit the information of more future symbols conveyed by the ESN input to obtain a more accurate threshold; second, the proposed method does not need to estimate the channel information with the Least Squares method, which avoids the extra error caused by inaccurate channel estimation. In this way, the computational complexity is also reduced compared to the previous method. Simulation results, and an experiment based on a wireless open-access research platform under a practical wireless channel, show the effectiveness and superiority of the proposed method.
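To make the idea concrete, below is a minimal sketch, assuming a leaky-tanh reservoir with a ridge-regression readout, of how an ESN can map windows of received samples directly to a decoding threshold. All sizes and hyperparameters (reservoir size, spectral radius, leaking rate, input window) are illustrative assumptions, and the random placeholder data stands in for a real training sequence; this is not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_RES, N_IN = 200, 8      # reservoir size and input window length (assumed)
LEAK, RHO = 0.3, 0.9      # leaking rate and target spectral radius (assumed)

W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
W *= RHO / np.max(np.abs(np.linalg.eigvals(W)))   # rescale spectral radius

def run_reservoir(U):
    """Drive the reservoir with inputs U (T x N_IN); return the state sequence."""
    x = np.zeros(N_RES)
    states = np.empty((len(U), N_RES))
    for t, u in enumerate(U):
        x = (1 - LEAK) * x + LEAK * np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

# Offline training: ridge-regression readout mapping reservoir states to the
# known optimal thresholds of a training sequence (placeholder data below).
U_train = rng.standard_normal((1000, N_IN))
theta_train = rng.standard_normal(1000)
X = run_reservoir(U_train)
W_out = np.linalg.solve(X.T @ X + 1e-4 * np.eye(N_RES), X.T @ theta_train)

# Online decoding: the predicted threshold decides each symbol directly.
U_test = rng.standard_normal((10, N_IN))
theta_hat = run_reservoir(U_test) @ W_out          # predicted thresholds
matched_out = rng.standard_normal(10)              # matched-filter samples
symbols = (matched_out > theta_hat).astype(int)    # threshold decision
```

Once W_out is trained, each symbol decision costs one reservoir update and one inner product, with no channel estimation step, which is consistent with the complexity argument made in the abstract.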


Read also

138 - Hui-Ping Yin, Hai-Peng Ren (2021)
Retrieving the information from a severely distorted received signal is the key challenge of communication signal processing. Chaotic baseband communication promises, in theory, to eliminate the inter-symbol interference (ISI); however, it needs complicated calculation, if it is possible at all. In this paper, a genetic algorithm support vector machine (GA-SVM) based symbol detection method is proposed for the chaotic baseband wireless communication system (CBWCS). Treating the problem from a different viewpoint, the symbol decoding process is converted into a binary classification through the GA-SVM model. A trained GA-SVM model is used to decode the symbols directly at the receiver, so as to improve the bit error rate (BER) performance of the CBWCS and to simplify symbol detection by removing the channel identification and threshold calculation required by traditional methods that decode symbols with a calculated threshold. Simulation results show that the proposed method has better BER performance in both static and time-varying wireless channels. Experimental results, based on the wireless open-access research platform, indicate that the BER of the proposed GA-SVM based symbol detection approach is superior to the other counterparts under a practical wireless multipath channel.
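As an illustration of this classification viewpoint, here is a minimal sketch that treats windows of received samples as features and transmitted bits as labels, and tunes an RBF-kernel SVM's C and gamma with a bare-bones genetic algorithm (selection, crossover, mutation over log-scale genes). Population size, mutation scale, and the placeholder data are all illustrative assumptions, not the paper's GA-SVM configuration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.standard_normal((400, 8))        # placeholder received-sample windows
y = rng.integers(0, 2, 400)              # placeholder transmitted bits

def fitness(ind):
    """Cross-validated accuracy of an SVM with genes (log10 C, log10 gamma)."""
    C, gamma = 10.0 ** ind
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

# Tiny genetic algorithm over the two log-scale hyperparameter genes.
pop = rng.uniform(-2, 2, (12, 2))
for _ in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-6:]]              # selection: keep best 6
    kids = parents[rng.integers(0, 6, (6, 2)), [0, 1]]  # crossover: mix genes
    kids += rng.normal(0, 0.2, kids.shape)              # mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]
detector = SVC(C=10.0 ** best[0], gamma=10.0 ** best[1]).fit(X, y)
bits_hat = detector.predict(X[:10])      # decode symbols directly, no threshold
```

The trained SVC decodes symbols directly, with no channel identification or threshold calculation step, mirroring the simplification described in the abstract.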
Some new properties of chaotic signals have recently been applied in communication systems. However, due to the broadband nature of chaotic signals, it is very difficult for a practical transducer or antenna to convert such a broadband signal into one suitable for a practical band-limited wireless channel. Thus, using chaos properties to improve the performance of a conventional communication system without changing the system configuration becomes a critical issue in communication with chaos. In this paper, a chaotic baseband waveform generated by a chaotic shaping filter is used to show that this difficulty can be overcome. The generated continuous-time chaotic waveform is proven to be topologically conjugate to a symbolic sequence, allowing an arbitrary information sequence to be encoded into the chaotic waveform. A finite impulse response (FIR) filter is used to replace the impulse control in order to encode information into the chaotic signal, simplifying the algorithm for high-speed communication. A wireless communication system is proposed that uses the chaotic signal as the baseband waveform and is compatible with the general wireless communication platform. The matched filter and decoding method, which exploit chaos properties, enhance the communication system performance. The bit error rate (BER) and computational complexity of the proposed wireless communication system are analyzed and compared with conventional wireless systems. The results show that the proposed chaotic baseband waveform has better BER performance in both static and time-varying wireless channels. Experimental results, based on the commonly used wireless open-access research platform, show that the BER of the proposed method is superior to the conventional method under a practical wireless multipath channel.
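The encode-with-an-FIR-filter, decode-with-a-matched-filter pipeline can be sketched as follows. The impulse response used here (a decaying oscillation) is only a stand-in for the chaotic basis pulse that the paper derives from its shaping filter; the oversampling ratio and pulse span are likewise assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
OSR = 16                                  # samples per symbol (assumed)
bits = rng.integers(0, 2, 64)
s = 2.0 * bits - 1                        # map {0,1} -> {-1,+1}

# Stand-in shaping-filter impulse response: a decaying oscillation spanning
# four symbol periods, in place of the paper's chaotic basis pulse.
t = np.arange(4 * OSR) / OSR
h = np.exp(-1.5 * t) * np.cos(2 * np.pi * t)

# Encode: place the symbols OSR samples apart and FIR-filter them.
impulses = np.zeros(len(s) * OSR)
impulses[::OSR] = s
waveform = np.convolve(impulses, h)       # transmitted baseband signal

# Decode: matched filter (time-reversed pulse), sampled at symbol instants.
mf = np.convolve(waveform, h[::-1])
delay = len(h) - 1                        # matched-filter delay in samples
z = mf[delay::OSR][:len(s)]
bits_hat = (z > 0).astype(int)            # zero-threshold decision
print("noiseless errors:", np.sum(bits_hat != bits))
```

Because the pulse spans several symbol periods, the matched-filter samples contain ISI terms; the threshold-based decoding discussed throughout this page is precisely about handling those terms.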
The design of symbol detectors in digital communication systems has traditionally relied on statistical channel models that describe the relation between the transmitted symbols and the observed signal at the receiver. Here we review a data-driven framework for symbol detector design that combines machine learning (ML) and model-based algorithms. In this hybrid approach, well-known channel-model-based algorithms such as the Viterbi method, BCJR detection, and multiple-input multiple-output (MIMO) soft interference cancellation (SIC) are augmented with ML-based algorithms to remove their channel-model dependence, allowing the receiver to learn to implement these algorithms solely from data. The resulting data-driven receivers are most suitable for systems where the underlying channel models are poorly understood, highly complex, or do not capture the underlying physics well. Our approach is unique in that it replaces only the channel-model-based computations with dedicated neural networks that can be trained from a small amount of data, while keeping the general algorithm intact. Our results demonstrate that these techniques can yield near-optimal performance of model-based algorithms without knowing the exact channel input-output statistical relationship and in the presence of channel state information uncertainty.
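A minimal sketch of this hybrid idea, in the spirit of the data-driven Viterbi detection described above: the Viterbi recursion is kept intact, but the branch metrics come from a small neural network trained on pilot data instead of an assumed channel model. The two-tap channel, the four-state trellis, and the use of learned posteriors as metrics (reasonable under uniform symbol priors) are simplifying assumptions for illustration, not the reviewed framework itself.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
h_true = np.array([1.0, 0.5])            # 2-tap ISI channel, unknown to receiver

def transmit(bits, snr_db=14):
    """BPSK through the ISI channel plus Gaussian noise."""
    s = 2.0 * bits - 1
    y = np.convolve(s, h_true)[:len(s)]
    return y + rng.normal(0, 10 ** (-snr_db / 20), len(s))

def states(bits):
    """Trellis state at time k encodes (b_k, b_{k-1}) as 2*b_k + b_{k-1}."""
    prev = np.concatenate([[0], bits[:-1]])
    return 2 * bits + prev

# Learn per-sample state posteriors from a short pilot sequence only.
pilot = rng.integers(0, 2, 2000)
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
net.fit(transmit(pilot).reshape(-1, 1), states(pilot))

def viterbi(y):
    """Standard Viterbi recursion with learned branch metrics."""
    logp = np.log(net.predict_proba(y.reshape(-1, 1)) + 1e-12)
    cost = np.zeros(4)
    back = np.zeros((len(y), 4), dtype=int)
    for k in range(len(y)):
        new = np.full(4, np.inf)
        for st in range(4):               # st = 2*b_k + b_{k-1}
            for b_pp in (0, 1):           # predecessor = 2*b_{k-1} + b_{k-2}
                pred = 2 * (st % 2) + b_pp
                c = cost[pred] - logp[k, st]
                if c < new[st]:
                    new[st], back[k, st] = c, pred
        cost = new
    st = int(np.argmin(cost))
    bits = []
    for k in range(len(y) - 1, -1, -1):   # trace back the best path
        bits.append(st >> 1)
        st = back[k, st]
    return np.array(bits[::-1])

data = rng.integers(0, 2, 1000)
print("BER:", np.mean(viterbi(transmit(data)) != data))
```

Note that only the metric computation is learned; the dynamic-programming structure of the detector is untouched, which is the point the abstract emphasizes.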
We propose a cell planning scheme to maximize the resource efficiency of a wireless communication network while considering quality-of-service requirements imposed by different mobile services. In dense and heterogeneous cellular 5G networks, the available time-frequency resources are orthogonally partitioned among different slices, which are serviced by the cells. The proposed scheme jointly optimizes the resource distribution between network slices, the allocation of cells to operate on different slices, and the allocation of users to cells. Since the original problem formulation is computationally intractable, we propose a convex inner approximation. Simulations show that the proposed approach optimizes the resource efficiency and enables a service-centric network design paradigm.
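The flavor of the slice resource-distribution subproblem can be shown with a toy linear program, assuming fixed per-slice spectral efficiencies and rate demands: choose per-slice bandwidth to meet each demand within a total budget. The actual scheme jointly optimizes slice, cell, and user allocation via a convex inner approximation; none of that structure is captured here, and all numbers are made up for illustration.

```python
import numpy as np
from scipy.optimize import linprog

eff = np.array([4.0, 2.5, 1.0])       # bit/s/Hz per slice (assumed)
demand = np.array([20.0, 10.0, 5.0])  # required Mbit/s per slice (assumed)

# Minimize total allocated bandwidth x1 + x2 + x3 (MHz), subject to
# eff_i * x_i >= demand_i for each slice and sum(x) <= 25 MHz budget.
res = linprog(c=np.ones(3),
              A_ub=np.vstack([-np.diag(eff), np.ones((1, 3))]),
              b_ub=np.concatenate([-demand, [25.0]]),
              bounds=[(0, None)] * 3)
print("per-slice MHz:", res.x, "total:", res.x.sum())
```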
Some new findings for chaos-based wireless communication systems have been identified recently. First, chaos has proven to be the optimal communication waveform, because chaotic signals can achieve the maximum signal-to-noise ratio at the receiver with the simplest matched filter. Second, the information transmitted in chaotic signals is not modified by the multipath wireless channel. Third, chaos properties can be used to relieve the inter-symbol interference (ISI) caused by multipath propagation. Although recent work has reported a method for obtaining the optimal threshold to eliminate ISI in chaos-based wireless communication, its practical implementation remains a challenge: it requires knowing the channel parameters and all symbols, especially the future symbols to be transmitted, in advance, which is nearly impossible in practical communication systems. Owing to recent developments in artificial intelligence (AI), a convolutional neural network (CNN) with a deep learning structure is proposed to predict future symbols from the received signal, so as to further reduce ISI and obtain better bit error rate (BER) performance compared to using the existing sub-optimal threshold. The method predicts the future symbol and obtains a better threshold suited to time-varying channels. Numerical simulation and experimental results validate the theory and the superiority of the proposed method.
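A minimal sketch of the future-symbol prediction step, assuming a window of received samples as input and the next information bit as the training target. The 1-D CNN architecture, window length, and training loop below are illustrative assumptions rather than the paper's network, and the random tensors stand in for real received data and labels.

```python
import torch
import torch.nn as nn

WIN = 32                                   # received samples per window (assumed)

net = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
    nn.Conv1d(16, 16, kernel_size=5, padding=2), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * WIN, 1),                # logit of the future bit
)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Placeholder training data: windows of received samples and the future bit
# that follows each window (known during offline training).
x = torch.randn(256, 1, WIN)
y = torch.randint(0, 2, (256, 1)).float()

for _ in range(100):                       # brief illustrative training loop
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()

future_bit = (net(x[:1]) > 0).int()        # predicted future symbol
```

The predicted future bit would then feed the threshold computation, replacing the unavailable future information that the optimal threshold requires.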