Learning user representations based on historical behaviors lies at the core of modern recommender systems. Recent advances in sequential recommenders have convincingly demonstrated high capability in extracting effective user representations from the given behavior sequences. Despite significant progress, we argue that solely modeling the observed behavior sequences may end up with a brittle and unstable system, because the logged user interactions are noisy and sparse. In this paper, we propose to learn accurate and robust user representations, which should be less sensitive to (attacks on) noisy behaviors and rely more on the indispensable ones, by modeling counterfactual data distributions. Specifically, given an observed behavior sequence, the proposed CauseRec framework identifies dispensable and indispensable concepts at both the fine-grained item level and the abstract interest level. CauseRec conditionally samples user concept sequences from the counterfactual data distributions by replacing dispensable and indispensable concepts within the original concept sequence. With user representations obtained from the synthesized user sequences, CauseRec performs contrastive user representation learning by contrasting the counterfactual with the observational. We conduct extensive experiments on real-world public recommendation benchmarks and justify the effectiveness of CauseRec with multi-aspect model analysis. The results demonstrate that the proposed CauseRec outperforms state-of-the-art sequential recommenders by learning accurate and robust user representations.
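The contrastive objective described above can be illustrated with a minimal InfoNCE-style sketch. The encoder outputs, batch shapes, and temperature below are illustrative assumptions, not CauseRec's exact formulation: representations from counterfactual sequences where only dispensable concepts were replaced are treated as positives, and those where indispensable concepts were replaced as negatives.

```python
import numpy as np

def info_nce(anchor, positives, negatives, tau=0.1):
    """Minimal InfoNCE-style contrastive loss.

    anchor:    (d,)   user representation from the observed sequence
    positives: (p, d) representations from counterfactuals that replaced
               only dispensable concepts (should stay close to the anchor)
    negatives: (n, d) representations from counterfactuals that replaced
               indispensable concepts (should be pushed away)
    """
    def cos(a, b):
        a = a / np.linalg.norm(a)
        b = b / np.linalg.norm(b, axis=-1, keepdims=True)
        return b @ a

    pos = np.exp(cos(anchor, positives) / tau)
    neg = np.exp(cos(anchor, negatives) / tau)
    # one loss term per positive, averaged
    return float(np.mean(-np.log(pos / (pos + neg.sum()))))

# toy usage with random vectors standing in for encoder outputs
rng = np.random.default_rng(0)
loss = info_nce(rng.normal(size=8), rng.normal(size=(2, 8)), rng.normal(size=(4, 8)))
print(loss)
```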
Spectroscopy is an indispensable tool for understanding the structures and dynamics of molecular systems. However, computational modelling of spectroscopy is challenging because the computational complexity scales exponentially with system size unless drastic approximations are made. Quantum computers could potentially overcome these classically intractable computational tasks, but existing approaches that use quantum computers to simulate spectroscopy can only handle isolated and static molecules. In this work we develop a workflow that combines multi-scale modeling with a time-dependent variational quantum algorithm to compute the linear spectroscopy of systems interacting with their condensed-phase environment via the relevant time correlation function. We demonstrate the feasibility of our approach by numerically simulating the UV-Vis absorption spectra of organic semiconductors. We show that our dynamical approach captures several spectral features that are otherwise overlooked by static methods. Our method can be directly applied to other linear condensed-phase spectroscopies and could potentially be extended to nonlinear multi-dimensional spectroscopy.
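As context for the "relevant time correlation function" mentioned above, a linear absorption line shape is obtained from a dipole autocorrelation function by Fourier transform. The sketch below uses a synthetic damped-oscillation correlation function purely for illustration; in the described workflow the correlation data would come from the time-dependent variational quantum algorithm.

```python
import numpy as np

# Synthetic dipole autocorrelation function C(t) = <mu(t) mu(0)>:
# a damped oscillation standing in for data produced by the quantum workflow.
dt = 0.05                      # time step (arbitrary units)
t = np.arange(0, 200, dt)
C = np.exp(-t / 20.0) * np.cos(2.0 * t)

# Linear absorption line shape: I(w) ~ Re \int dt e^{i w t} C(t)
freqs = np.fft.rfftfreq(len(t), d=dt) * 2 * np.pi   # angular frequencies
I = np.real(np.fft.rfft(C)) * dt                     # discretized integral

peak = freqs[np.argmax(I)]
print(f"absorption peak near w = {peak:.2f} (expected ~2.0)")
```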
Variational quantum algorithms (VQAs) are widely speculated to deliver quantum advantages for practical problems under the quantum-classical hybrid computational paradigm in the near term. Both theoretical and practical developments of VQAs share many similarities with those of deep learning. For instance, a key component of VQAs is the design of task-dependent parameterized quantum circuits (PQCs), much as designing a good neural architecture is central to deep learning. Partly inspired by the recent success of AutoML and neural architecture search (NAS), quantum architecture search (QAS) is a collection of methods devised to engineer an optimal task-specific PQC. It has been demonstrated that QAS-designed VQAs can outperform expert-crafted VQAs in various scenarios. In this work, we propose to use a neural-network-based predictor as the evaluation policy for QAS. We demonstrate that neural-predictor-guided QAS can discover powerful PQCs, yielding state-of-the-art results for various examples from quantum simulation and quantum machine learning. Notably, neural-predictor-guided QAS provides a better solution than the random-search baseline while using an order of magnitude fewer circuit evaluations. Moreover, both the predictor for QAS and the optimal ansatz found by QAS can be transferred and generalized to address similar problems.
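A minimal sketch of the predictor-as-evaluation-policy idea, under assumed placeholders: each candidate circuit is encoded as a fixed-length feature vector, a small regressor is trained on the few circuits that were actually evaluated, and the remaining pool is ranked by prediction so that only the top candidates receive expensive evaluations. The encoding, objective, and model choice here are illustrative, not the paper's.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical setting: each candidate circuit is already encoded as a
# fixed-length feature vector, and `evaluate` is an expensive simulation.
def evaluate(circuit_features):            # placeholder objective
    return -np.sum((circuit_features - 0.3) ** 2) + 0.01 * rng.normal()

pool = rng.random((500, 16))               # candidate architectures (encoded)
seed_idx = rng.choice(len(pool), 32, replace=False)
X = pool[seed_idx]
y = np.array([evaluate(x) for x in X])     # a small evaluated training set

# Train the neural predictor on the evaluated set, then use it to rank
# the remaining candidates cheaply; only the top few get real evaluations.
predictor = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
scores = predictor.predict(pool)
top = np.argsort(scores)[::-1][:5]
print("predictor-selected candidates:", top)
```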
With continuing improvements in the quality of fabricated quantum devices, it becomes increasingly crucial to analyze noisy quantum processes in greater detail, such as characterizing non-Markovianity in a quantitative manner. In this work, we propose an experimental protocol, termed Spectral Transfer Tensor Maps (SpecTTM), to accurately predict the RHP non-Markovianity measure of any Pauli channel without state-preparation and measurement (SPAM) errors. In fact, for Pauli channels, SpecTTM even allows the reconstruction of a highly precise noise power spectrum for qubits. Finally, we discuss how SpecTTM can be used to approximately characterize the non-Markovianity of non-Pauli channels via Pauli twirling in an optimal basis.
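For background, the transfer-tensor construction that approaches of this kind build on extracts transfer tensors $T_k$ from a sequence of dynamical maps $E_k$ sampled at regular times via $E_n = \sum_{m=1}^{n} T_m E_{n-m}$ (with $E_0 = I$). The sketch below shows only this generic recursion; SpecTTM's spectral variant and its SPAM-free estimation procedure are not reproduced here.

```python
import numpy as np

def transfer_tensors(dyn_maps):
    """Extract transfer tensors T_k from dynamical maps E_k (k = 1..N)
    using T_n = E_n - sum_{m=1}^{n-1} T_m E_{n-m}, with E_0 = I."""
    d = dyn_maps[0].shape[0]
    E = [np.eye(d)] + list(dyn_maps)
    T = []
    for n in range(1, len(E)):
        Tn = E[n] - sum(T[m - 1] @ E[n - m] for m in range(1, n))
        T.append(Tn)
    return T

# toy usage: a Markovian (semigroup) process, where only T_1 is non-trivial
L = np.array([[0.95, 0.05], [0.05, 0.95]])            # single-step map
maps = [np.linalg.matrix_power(L, k) for k in range(1, 6)]
T = transfer_tensors(maps)
print(np.allclose(T[0], L), [round(float(np.linalg.norm(t)), 6) for t in T[1:]])
```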
Quantum architecture search (QAS) is the process of automating the architecture engineering of quantum circuits. A powerful and general QAS platform is highly desirable, as it could significantly accelerate current efforts to identify quantum advantages of error-prone and depth-limited quantum circuits in the NISQ era. Here, we propose a general framework of differentiable quantum architecture search (DQAS), which enables automated design of quantum circuits in an end-to-end differentiable fashion. We present several examples of circuit design problems to demonstrate the power of DQAS. For instance, unitary operations are decomposed into quantum gates, noisy circuits are re-designed to improve accuracy, and circuit layouts for the quantum approximate optimization algorithm are automatically discovered and upgraded for combinatorial optimization problems. These results not only manifest the vast potential of DQAS as an essential tool for NISQ application development, but also present an interesting research topic from a theoretical perspective, as DQAS draws inspiration from the newly emerging interdisciplinary paradigms of differentiable programming, probabilistic programming, and quantum programming.
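To make the differentiable-search idea concrete, here is a minimal toy sketch: a softmax distribution over candidate operations is placed at each circuit slot, and its parameters are updated with a score-function gradient of a stand-in objective. The objective, operation pool, and gradient estimator are illustrative assumptions; the actual DQAS objective and probabilistic model are more involved.

```python
import numpy as np

rng = np.random.default_rng(1)
n_slots, n_ops = 4, 3                 # circuit slots and candidate ops per slot
alpha = np.zeros((n_slots, n_ops))    # differentiable architecture parameters

# Stand-in objective: reward layouts whose chosen op indices match a target.
target = np.array([0, 2, 1, 0])
def objective(ops):
    return float(np.mean(ops == target))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

lr = 0.5
for step in range(500):
    probs = softmax(alpha)
    ops = np.array([rng.choice(n_ops, p=p) for p in probs])  # sample a layout
    r = objective(ops)
    # score-function (REINFORCE-like) gradient of E[r] w.r.t. alpha
    grad = np.zeros_like(alpha)
    grad[np.arange(n_slots), ops] = 1.0
    alpha += lr * r * (grad - probs)

print("learned layout:", softmax(alpha).argmax(axis=1), "target:", target)
```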
We propose a neural-network variational quantum algorithm to simulate the time evolution of quantum many-body systems. Based on a modified restricted Boltzmann machine (RBM) wavefunction ansatz, the proposed algorithm can be efficiently implemented on near-term quantum computers with low measurement cost. Using a qubit recycling strategy, only one ancilla qubit is required to represent all the hidden spins in an RBM architecture. The variational algorithm is extended to open quantum systems by employing a stochastic Schrödinger equation approach. Numerical simulations of spin-lattice models demonstrate that our algorithm captures the dynamics of closed and open quantum many-body systems with high accuracy, without suffering from the vanishing-gradient (barren plateau) issue for the considered system sizes.
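For reference, the standard RBM wavefunction ansatz over visible spins $\mathbf{s}$, with the hidden spins $h_j$ summed out, takes the form $\Psi_{\mathrm{RBM}}(\mathbf{s}) = \sum_{\{h_j=\pm 1\}} \exp\big(\sum_i a_i s_i + \sum_j b_j h_j + \sum_{i,j} W_{ij} s_i h_j\big) = e^{\sum_i a_i s_i} \prod_j 2\cosh\big(b_j + \sum_i W_{ij} s_i\big)$; the paper's modified ansatz, qubit-recycling implementation, and measurement scheme are not reproduced here.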
In this paper, we investigate the problem of out-of-domain visio-linguistic pretraining, where the pretraining data distribution differs from that of the downstream data on which the pretrained model will be fine-tuned. Existing methods for this problem are purely likelihood-based, which leads to spurious correlations and hurts generalization when the model is transferred to out-of-domain downstream tasks. By spurious correlation, we mean that the conditional probability of one token (object or word) given another can be high (due to dataset biases) without a robust (causal) relationship between them. To mitigate such dataset biases, we propose a Deconfounded Visio-Linguistic Bert framework, abbreviated as DeVLBert, which performs intervention-based learning. We borrow the idea of the backdoor adjustment from the field of causality and propose several neural-network-based architectures for Bert-style out-of-domain pretraining. Quantitative results on three downstream tasks, Image Retrieval (IR), Zero-shot IR, and Visual Question Answering, show the effectiveness of DeVLBert in boosting generalization.
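The backdoor adjustment referenced above replaces the observational conditional with an interventional one by marginalizing over a confounder $z$ with its prior rather than its posterior: $P(Y \mid do(X)) = \sum_{z} P(Y \mid X, z)\, P(z)$, in contrast to the likelihood-based $P(Y \mid X) = \sum_{z} P(Y \mid X, z)\, P(z \mid X)$, where dataset biases can leak in through $P(z \mid X)$. How DeVLBert parameterizes the confounder and these terms with neural networks is described in the paper itself.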
Let $G$ be an $n$-vertex graph with $m$ edges. When asked a subset $S$ of vertices, a cut query on $G$ returns the number of edges of $G$ that have exactly one endpoint in $S$. We show that there is a bounded-error quantum algorithm that determines all connected components of $G$ after making $O(\log(n)^6)$ cut queries. In contrast, it follows from results in communication complexity that any randomized algorithm, even just to decide whether the graph is connected, must make at least $\Omega(n/\log(n))$ cut queries. We further show that with $O(\log(n)^8)$ cut queries a quantum algorithm can, with high probability, output a spanning forest for $G$. En route to proving these results, we design quantum algorithms for learning a graph using cut queries. We show that a quantum algorithm can learn a graph with maximum degree $d$ after $O(d \log(n)^2)$ cut queries, and can learn a general graph with $O(\sqrt{m}\, \log(n)^{3/2})$ cut queries. These two upper bounds are tight up to poly-logarithmic factors, and compare to $\Omega(dn)$ and $\Omega(m/\log(n))$ lower bounds on the number of cut queries needed by a randomized algorithm for the same problems, respectively. The key ingredients in our results are the Bernstein-Vazirani algorithm, approximate counting with OR queries, and learning sparse vectors from inner products as in compressed sensing.
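A small classical illustration of why cut queries are informative: for a vertex $v \notin S$, $\mathrm{cut}(S \cup \{v\}) - \mathrm{cut}(S) = \deg(v) - 2\,|N(v) \cap S|$, so inner products between adjacency-matrix rows and indicator vectors can be recovered from a few cut queries. This is the kind of oracle information that the quantum routines above (e.g. via Bernstein-Vazirani) exploit far more efficiently; the sketch below only verifies the classical identity on a random graph.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
A = np.triu(rng.integers(0, 2, (n, n)), 1)   # random simple graph
A = A + A.T

def cut(S):
    """Number of edges with exactly one endpoint in S (a cut query)."""
    mask = np.zeros(n, dtype=bool)
    mask[list(S)] = True
    return int(A[mask][:, ~mask].sum())

# Inner product of adjacency row v with the indicator vector of S,
# recovered from three cut queries (v must lie outside S):
v, S = 0, {2, 3, 5}
deg_v = cut({v})
inner = (deg_v + cut(S) - cut(S | {v})) // 2
print(inner, "==", int(A[v, list(S)].sum()))
```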
Neural-Network Quantum States (NQS) have attracted significant interest as a powerful wave-function ansatz for modeling quantum phenomena. In particular, a variant of NQS based on the restricted Boltzmann machine (RBM) has been adapted to model the ground state of spin lattices and the electronic structure of small molecules on quantum devices. Despite this progress, significant challenges remain for RBM-NQS-based quantum simulations. In this work, we present a state-preparation protocol to generate a specific set of complex-valued RBM-NQS, which we name unitary-coupled RBM-NQS, in quantum circuits. This is a crucial advancement, as all prior works deal exclusively with real-valued RBM-NQS for quantum algorithms. With this new scheme, we achieve (1) modeling complex-valued wave functions, (2) using as few as one ancilla qubit to simulate $M$ hidden spins in an RBM architecture, and (3) avoiding post-selection to improve scalability.