
Drone classification from RF fingerprints using deep residual nets

Added by Sanjoy Basak
Publication date: 2020
Language: English





Detecting UAVs is becoming increasingly important for facilities such as airports and nuclear power plants, where it improves surveillance and security. Exploiting radio frequency (RF) based drone control and communication links enables passive drone detection in a wide range of environments, even without favourable line-of-sight (LOS) conditions. In this paper, we evaluate the RF-based drone classification performance of several state-of-the-art (SoA) models on a new, realistic drone RF dataset. Using a newly proposed residual Convolutional Neural Network (CNN) model, we show that drone RF frequency signatures can be used for effective classification. The robustness of the classifier is evaluated in a multipath environment with varying Doppler frequencies, as may be introduced by a flying drone. We also show that the model generalizes better across different wireless channel and drone speed scenarios. Furthermore, the classification performance of the newly proposed model is evaluated in a simultaneous multi-drone scenario. The classifier achieves close to 99% classification accuracy at a signal-to-noise ratio (SNR) of 0 dB, and at -10 dB SNR it obtains 5% higher classification accuracy than the existing framework.
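For a concrete picture of the approach, the sketch below shows a minimal 1-D residual CNN classifier operating on RF frequency signatures, written in PyTorch. The input length, channel width, and class count are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch of a 1-D residual CNN for drone RF signature classification.
# Input shape (1 x 1024 frequency bins) and the class count are assumptions.
import torch
import torch.nn as nn

class ResBlock1d(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm1d(channels)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm1d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)              # residual (skip) connection

class DroneRFNet(nn.Module):
    def __init__(self, num_classes=10, width=32):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv1d(1, width, kernel_size=7, stride=2, padding=3),
            nn.BatchNorm1d(width), nn.ReLU(), nn.MaxPool1d(2))
        self.blocks = nn.Sequential(ResBlock1d(width), ResBlock1d(width))
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(),
                                  nn.Linear(width, num_classes))

    def forward(self, x):                      # x: (batch, 1, n_bins)
        return self.head(self.blocks(self.stem(x)))

model = DroneRFNet(num_classes=10)
logits = model(torch.randn(4, 1, 1024))        # dummy batch of RF signatures
print(logits.shape)                            # torch.Size([4, 10])
```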



Related research

This paper investigates the problem of detection and classification of unmanned aerial vehicles (UAVs) in the presence of wireless interference signals using a passive radio frequency (RF) surveillance system. The system uses a multistage detector to distinguish signals transmitted by a UAV controller from background noise and interference signals. First, RF signals from any source are detected using a Markov-model-based naive Bayes decision mechanism. When the receiver operates at a signal-to-noise ratio (SNR) of 10 dB, and the threshold that defines the states of the models is set at 3.5 times the standard deviation of the preprocessed noise data, a detection accuracy of 99.8% with a false alarm rate of 2.8% is achieved. Second, signals from Wi-Fi and Bluetooth emitters, if present, are detected based on the bandwidth and modulation features of the detected RF signal. Once the input signal is identified as a UAV controller signal, it is classified using machine learning (ML) techniques. Fifteen statistical features extracted from the energy transients of the UAV controller signals are fed to neighborhood component analysis (NCA), and the three most significant features are selected. The performance of NCA and five different ML classifiers is studied for 15 different types of UAV controllers. A classification accuracy of 98.13% is achieved by the k-nearest neighbor classifier at 25 dB SNR. Classification performance is also investigated at different SNR levels and for a set of 17 UAV controllers, which includes two pairs of controllers of the same model.
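A rough sketch of the final feature-selection and classification stage described above (NCA followed by a k-NN classifier) is given below, using scikit-learn. Note that scikit-learn's NeighborhoodComponentsAnalysis learns a linear projection rather than selecting an explicit feature subset, so this is only an approximation of the paper's pipeline; the synthetic data, 15-feature dimensionality, and k = 5 are illustrative assumptions.

```python
# Hedged sketch: statistical features from RF energy transients are projected
# with neighborhood component analysis (NCA) and classified with k-NN.
import numpy as np
from sklearn.neighbors import NeighborhoodComponentsAnalysis, KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 15))            # 15 statistical features per signal (dummy)
y = rng.integers(0, 15, size=600)         # 15 UAV controller classes (dummy labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = Pipeline([
    ("scale", StandardScaler()),
    ("nca", NeighborhoodComponentsAnalysis(n_components=3, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
])
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))  # near chance on this random data
```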
Hao Zhang, Lu Yuan, Guangyu Wu (2021)
Automatic modulation classification (AMC) is of crucial importance for realizing intelligent wireless communications. Many deep-learning-based models, especially convolutional neural networks (CNNs), have been proposed for AMC. However, their computational cost is very high, which makes them unsuitable for beyond-fifth-generation wireless communication networks with stringent requirements on classification accuracy and computing time. To tackle these challenges, a novel involution-enabled AMC scheme is proposed, built on the bottleneck structure of residual networks. Involution is used instead of convolution to enhance the discrimination capability and expressiveness of the model by incorporating a self-attention mechanism. Simulation results demonstrate that the proposed scheme achieves superior classification performance and faster convergence compared with other benchmark schemes.
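The core building block mentioned above, an involution layer that generates location-specific kernels from the input itself, could look roughly like the PyTorch sketch below. Channel count, kernel size, group count, and reduction ratio are illustrative assumptions, not the authors' configuration.

```python
# Hedged sketch of a 2-D involution layer of the kind substituted for
# convolution inside a residual bottleneck.
import torch
import torch.nn as nn

class Involution2d(nn.Module):
    def __init__(self, channels, kernel_size=3, groups=4, reduction=4):
        super().__init__()
        self.k, self.groups, self.channels = kernel_size, groups, channels
        # Kernel-generating branch: one spatial kernel per location and group.
        self.reduce = nn.Conv2d(channels, channels // reduction, 1)
        self.span = nn.Conv2d(channels // reduction, groups * kernel_size ** 2, 1)
        self.unfold = nn.Unfold(kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        b, c, h, w = x.shape
        # Generate location-specific kernels from the input itself.
        kernels = self.span(torch.relu(self.reduce(x)))          # (b, g*k*k, h, w)
        kernels = kernels.view(b, self.groups, self.k ** 2, h, w).unsqueeze(2)
        # Unfold input patches and weight them with the generated kernels.
        patches = self.unfold(x).view(b, self.groups, c // self.groups,
                                      self.k ** 2, h, w)
        return (kernels * patches).sum(dim=3).view(b, c, h, w)

layer = Involution2d(channels=16)
out = layer(torch.randn(2, 16, 32, 32))
print(out.shape)                                                 # (2, 16, 32, 32)
```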
Ender Ozturk, Fatih Erden, et al. (2020)
This paper investigates the problem of classification of unmanned aerial vehicles (UAVs) from radio frequency (RF) fingerprints in the low signal-to-noise ratio (SNR) regime. We use convolutional neural networks (CNNs) trained with both RF time-series images and the spectrograms of 15 different off-the-shelf drone controller RF signals. When using time-series signal images, the CNN extracts features from the signal transient and envelope. As the SNR decreases, this approach fails dramatically because the information in the transient is lost in the noise and the envelope is heavily distorted. In contrast to the time-series representation of the RF signals, with spectrograms it is possible to focus only on the desired frequency interval, i.e., the 2.4 GHz ISM band, and filter out any signal component outside this band. These advantages provide a notable performance improvement over time-series-based methods. To further increase the classification accuracy of the spectrogram-based CNN, we denoise the spectrogram images by truncating them to a limited spectral density interval. Creating a single model from spectrogram images of noisy signals and tuning the CNN model parameters, we achieve a classification accuracy ranging from 92% to 100% over an SNR range of -10 dB to 30 dB, which, to the best of our knowledge, significantly outperforms existing approaches.
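A hedged sketch of the spectrogram preprocessing described above is given below: compute the spectrogram of an RF capture, keep only the bins falling inside the 2.4 GHz ISM band, and clip the power spectral density to a fixed dB window before feeding the image to a CNN. The sample rate, centre frequency, band edges, and clipping interval are assumptions for illustration, not the paper's settings.

```python
# Hedged sketch: ISM-band spectrogram extraction and PSD truncation (denoising).
import numpy as np
from scipy.signal import spectrogram

fs = 100e6                                        # complex sample rate (assumed)
fc = 2.44e9                                       # capture centre frequency (assumed)
t = np.arange(200_000) / fs
iq = np.exp(2j * np.pi * 5e6 * t) + 0.1 * (np.random.randn(t.size)
                                           + 1j * np.random.randn(t.size))

f, tt, Sxx = spectrogram(iq, fs=fs, nperseg=1024, noverlap=512,
                         return_onesided=False)
f = np.fft.fftshift(f) + fc                       # absolute frequency axis
Sxx = np.fft.fftshift(Sxx, axes=0)

band = (f >= 2.40e9) & (f <= 2.4835e9)            # keep 2.4 GHz ISM band only
Sxx_db = 10 * np.log10(Sxx[band] + 1e-12)
Sxx_db = np.clip(Sxx_db, -110, -60)               # truncate PSD to a fixed interval
print(Sxx_db.shape)                               # (freq_bins_in_band, time_frames)
```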
Across the globe, remote image data is rapidly being collected for the assessment of benthic communities, from shallow waters to the extremely deep waters of continental slopes and the abyssal seas. Exploiting this data is currently limited by the time it takes for experts to identify the organisms found in these images. With this limitation in mind, a large global effort has been made to introduce automation and machine learning algorithms to accelerate both the classification and assessment of marine benthic biota. One major issue lies with organisms that move with swell and currents, such as kelps. This paper presents an automatic hierarchical classification method (local binary classification, as opposed to conventional flat classification) to classify kelps in images collected by autonomous underwater vehicles. The proposed kelp classification approach exploits learned feature representations extracted from deep residual networks. We show that these generic features outperform traditional off-the-shelf CNN features and conventional hand-crafted features. Experiments also demonstrate that the hierarchical classification method outperforms traditional parallel multi-class classification by a significant margin (90.0% vs 57.6% and 77.2% vs 59.0% on the Benthoz15 and Rottnest datasets, respectively). Furthermore, we compare different hierarchical classification approaches and experimentally show that the sibling hierarchical training approach outperforms the inclusive hierarchical approach by a significant margin. We also report an application of the proposed method to study the change in kelp cover over time for annually repeated AUV surveys.
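The local-binary (hierarchical) classification idea could be prototyped roughly as in the sketch below: a root binary classifier separates kelp from everything else on deep residual-network features, and a second classifier is trained only within the kelp subtree. The dummy features, label hierarchy, and choice of logistic regression are illustrative assumptions, not the paper's exact setup.

```python
# Hedged sketch of hierarchical (local binary) classification on deep features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2048))          # e.g. ResNet-50 pooled features (dummy)
y_root = rng.integers(0, 2, size=500)     # 1 = kelp, 0 = everything else
y_leaf = rng.integers(0, 3, size=500)     # kelp sub-labels (dummy)

root = LogisticRegression(max_iter=1000).fit(X, y_root)
kelp_mask = y_root == 1
leaf = LogisticRegression(max_iter=1000).fit(X[kelp_mask], y_leaf[kelp_mask])

def predict_hierarchical(x):
    """Route a sample through the root node, then its kelp sub-classifier."""
    if root.predict(x.reshape(1, -1))[0] == 0:
        return "non-kelp"
    return f"kelp/class_{leaf.predict(x.reshape(1, -1))[0]}"

print(predict_hierarchical(X[0]))
```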
Convolutional Networks (ConvNets) have recently improved image recognition performance thanks to end-to-end learning of deep feed-forward models from raw pixels. Deep learning is a marked departure from the previous state of the art, the Fisher Vector (FV), which relied on gradient-based encoding of local hand-crafted features. In this paper, we discuss a novel connection between these two approaches. First, we show that one can derive gradient representations from ConvNets in a similar fashion to the FV. Second, we show that this gradient representation actually corresponds to a structured matrix that allows for efficient similarity computation. We experimentally study the benefits of transferring this representation over the outputs of ConvNet layers, and find consistent improvements on the Pascal VOC 2007 and 2012 datasets.
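One simple way to realize a gradient representation of this kind is sketched below: take the gradient of the network's top-class log-probability with respect to a layer's weights and use it as an image descriptor. This is a sketch under assumptions, not the paper's exact derivation; the backbone, the layer chosen, and the use of the predicted-class log-probability are arbitrary illustrative choices.

```python
# Hedged sketch: a gradient-based image descriptor extracted from a ConvNet.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=None)      # untrained backbone, for illustration only
model.eval()

def gradient_feature(image):
    """Gradient of the top-class log-probability w.r.t. the final FC weights."""
    model.zero_grad()
    logits = model(image.unsqueeze(0))
    log_prob = F.log_softmax(logits, dim=1).max()
    log_prob.backward()
    return model.fc.weight.grad.flatten().clone()

img = torch.randn(3, 224, 224)             # dummy image
feat = gradient_feature(img)
print(feat.shape)                          # torch.Size([512000]) = 1000 x 512
```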