We address the use of neural networks (NNs) in classifying the environmental parameters of single-qubit dephasing channels. In particular, we investigate the performance of linear perceptrons and of two nonlinear NN architectures. At variance with time-series-based approaches, our goal is to learn a discretized probability distribution over the parameters using tomographic data at just two random instants of time. We consider dephasing channels originating either from classical 1/f^α noise or from the interaction with a bath of quantum oscillators. The parameters to be classified are the color α of the classical noise and the Ohmicity parameter s of the quantum environment, respectively. In both cases, we find that NNs are able to exactly classify the parameters into 16 classes using noiseless data (a linear NN suffices for the color, whereas a single-layer NN is needed for the Ohmicity). In the presence of noisy data (e.g. coming from noisy tomographic measurements), the network is able to classify the color of the 1/f^α noise into 16 classes with about 70% accuracy, whereas classification of the Ohmicity turns out to be challenging. We also consider a more coarse-grained task, and train the network to discriminate between the two macro-classes corresponding to α ≶ 1 and s ≶ 1, obtaining up to 96% and 79% accuracy, respectively, using single-layer NNs.
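The classification scheme described above — mapping a small feature vector to a discretized distribution over 16 parameter classes with a linear model — can be sketched as follows. This is not the authors' code: the synthetic clustered features stand in for the tomographic data at two random times, and all sizes (8 features, 40 samples per class) are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: a linear perceptron with softmax output, trained to
# classify a noise parameter (e.g. the color alpha) into 16 discrete
# classes. Synthetic stand-ins replace the real tomographic features.
rng = np.random.default_rng(0)
n_classes, n_features, n_per_class = 16, 8, 40

# Hypothetical data: each class clusters tightly around its own mean.
means = rng.normal(size=(n_classes, n_features))
X = np.vstack([m + 0.05 * rng.normal(size=(n_per_class, n_features))
               for m in means])
y = np.repeat(np.arange(n_classes), n_per_class)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Train the linear model by gradient descent on the cross-entropy loss.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
onehot = np.eye(n_classes)[y]
for _ in range(500):
    p = softmax(X @ W + b)          # predicted class distribution
    grad = (p - onehot) / len(X)    # gradient of mean cross-entropy
    W -= X.T @ grad
    b -= grad.sum(axis=0)

accuracy = (softmax(X @ W + b).argmax(axis=1) == y).mean()
```

On well-separated noiseless data a linear model of this kind is enough, mirroring the abstract's finding for the color α; the harder Ohmicity task would require adding a nonlinear hidden layer.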
We investigate the dynamics of quantum and classical correlations in a system of two qubits under local colored-noise dephasing channels. The time evolution of a single qubit interacting with its own environment is described by a memory kernel non-Ma
We discuss the problem of estimating a frequency via N-qubit probes undergoing independent dephasing channels that can be continuously monitored via homodyne or photo-detection. We derive the corresponding analytical solutions for the conditional sta
This paper introduces a new online learning framework for multiclass classification called learning with diluted bandit feedback. At every time step, the algorithm predicts a candidate label set instead of a single label for the observed example. It
We provide a unifying view of statistical information measures, multi-way Bayesian hypothesis testing, loss functions for multi-class classification problems, and multi-distribution $f$-divergences, elaborating equivalence results between all of thes
In this paper, we propose online algorithms for multiclass classification using partial labels. We propose two variants of Perceptron called Avg Perceptron and Max Perceptron to deal with the partial labeled data. We also propose Avg Pegasos and Max