Physicists often claim that there is an effective repulsion between fermions, implied by the Pauli principle, and a corresponding effective attraction between bosons. We examine the origins of such exchange-force ideas, their validity, and the areas where they are highly misleading. We propose that future explanations of quantum statistics should avoid the idea of an effective force completely and replace it with more appropriate physical insights, some of which are suggested here.
Here we examine the possibility of a lower limit to velocity or velocity change, some 20 orders of magnitude smaller than the speed of light, and explore the various observable signatures, including those in cosmic rays and gamma-ray bursts.
We derive a standard Lorentz code (SLC) of motion by exploring rigid double transformations of so-called master space-induced supersymmetry (MS-SUSY), subject to certain rules. The renormalizable and actually finite flat-space field theories with $N_{max}=4$ supersymmetries in four dimensions, if only such symmetries are fundamental to nature, yield a possible extension of the Lorentz code (ELC), at which SLC-violating new physics appears. In the framework of local MS-SUSY, we address inertial effects. We argue that a space-time deformation of MS is the origin of the inertia effects that we observe. We go beyond the hypothesis of locality. This allows us to improve the relevant geometrical structures referred to a noninertial frame in Minkowski space for arbitrary velocities and characteristic acceleration lengths. This framework furnishes a justification for the introduction of the weak principle of equivalence, i.e., the universality of free fall. The implications of the inertia effects in the more general post-Riemannian geometry are briefly discussed.
Entanglement has long stood as one of the characteristic features of quantum mechanics, yet recent developments have emphasized the importance of quantumness beyond entanglement for quantum foundations and technologies. We demonstrate that entanglement cannot entirely capture the worst-case sensitivity in quantum interferometry, when quantum probes are used to estimate the phase imprinted by a Hamiltonian, with fixed energy levels but variable eigenbasis, acting on one arm of an interferometer. This is shown by defining a bipartite entanglement monotone tailored to this interferometric setting and proving that it never exceeds the so-called interferometric power, a quantity which relies on more general quantum correlations beyond entanglement and captures the relevant resource. We then prove that the interferometric power can never increase when local commutativity-preserving operations are applied to qubit probes, an important step to validate such a quantity as a genuine quantum correlations monotone. These findings are accompanied by a room-temperature nuclear magnetic resonance experimental investigation, in which two-qubit states with extremal (maximal and minimal) interferometric power at fixed entanglement are produced and characterized.
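The phase sensitivity discussed above is quantified, for a pure probe state, by the quantum Fisher information $F = 4(\langle H^2\rangle - \langle H\rangle^2)$ of the generating Hamiltonian; the interferometric power involves a further worst-case optimization over Hamiltonians with fixed spectrum. A minimal numpy sketch of the pure-state quantity (function name and conventions are illustrative assumptions, not code from the paper):

```python
import numpy as np

def qfi_pure(state, H):
    """Quantum Fisher information of a pure probe state for a phase
    generated by Hamiltonian H: F = 4 (<H^2> - <H>^2).
    This is the single-Hamiltonian ingredient; the interferometric
    power would minimize such a quantity over eigenbases of H."""
    state = state / np.linalg.norm(state)
    mean_H = np.vdot(state, H @ state).real
    mean_H2 = np.vdot(state, H @ H @ state).real
    return 4.0 * (mean_H2 - mean_H ** 2)

# Example: the superposition state |+> with H = sigma_z / 2
Z = np.diag([1.0, -1.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
F = qfi_pure(plus, Z / 2)  # maximal sensitivity for a single qubit
```

An eigenstate of $H$ acquires only a global phase and yields $F = 0$, which is why states with correlations in a basis unbiased to the Hamiltonian are the useful probes.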
Contrastive learning (CL) is effective in learning data representations without label supervision, where the encoder needs to contrast each positive sample over multiple negative samples via a one-vs-many softmax cross-entropy loss. However, conventional CL is sensitive to how many negative samples are included and how they are selected. Proposed in this paper is a doubly CL strategy that contrasts positive samples and negative ones within themselves separately. We realize this strategy with contrastive attraction and contrastive repulsion (CACR), which makes the query not only exert a greater force to attract more distant positive samples but also a greater force to repel closer negative samples. Theoretical analysis reveals the connection between CACR and CL from the perspectives of both positive attraction and negative repulsion, and shows the benefits in both efficiency and robustness brought by separately contrasting within the sampled positive and negative pairs. Extensive large-scale experiments on standard vision tasks show that CACR not only consistently outperforms existing CL methods on benchmark datasets in representation learning, but also provides interpretable contrastive weights, demonstrating the efficacy of the proposed doubly contrastive strategy.
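The conventional one-vs-many softmax cross-entropy loss that CACR refines can be sketched as follows; this is the standard InfoNCE-style baseline, not the CACR loss itself, and all names and the temperature value are illustrative assumptions:

```python
import numpy as np

def info_nce(query, positive, negatives, temperature=0.1):
    """One-vs-many softmax cross-entropy contrastive loss.

    The query is pulled toward its positive and pushed away from
    every negative jointly, through a single softmax; its sensitivity
    to the number and choice of negatives is the issue CACR targets.
    query, positive: shape (d,); negatives: shape (n, d).
    """
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # logit 0 is the positive pair; the rest are negatives
    logits = np.array([cos(query, positive)] +
                      [cos(query, neg) for neg in negatives]) / temperature
    logits -= logits.max()  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # cross-entropy with the positive as target
```

A well-aligned positive with dissimilar negatives drives the loss toward zero, while a misaligned positive drives it up, which is the attraction/repulsion trade-off the doubly contrastive strategy decouples.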
We introduce a new method for representing the low energy subspace of a bosonic field theory on the qubit space of digital quantum computers. This discretization leads to an exponentially precise description of the subspace of the continuous theory thanks to the Nyquist-Shannon sampling theorem. The method makes the implementation of quantum algorithms for purely bosonic systems as well as fermion-boson interacting systems feasible. We present algorithmic circuits for computing the time evolution of these systems. The complexity of the algorithms scales polynomially with the system size. The algorithm is a natural extension of the existing quantum algorithms for simulating fermion systems in quantum chemistry and condensed matter physics to systems involving bosons and fermion-boson interactions and has a broad variety of potential applications in particle physics, condensed matter, etc. Due to the relatively small amount of additional resources required by the inclusion of bosons in our algorithm, the simulation of electron-phonon and similar systems can be placed in the same near-future reach as the simulation of interacting electron systems. We benchmark our algorithm by implementing it for a $2$-site Holstein polaron problem on an Atos Quantum Learning Machine (QLM) quantum simulator. The polaron quantum simulations are in excellent agreement with the results obtained by exact diagonalization.
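The exact-diagonalization benchmark mentioned above can be reproduced classically for small systems. A hedged numpy sketch of a 2-site, single-electron Holstein model with the per-site phonon Hilbert space truncated at a maximum occupation (the Hamiltonian convention, parameter names, and truncation level are illustrative assumptions, not the paper's code):

```python
import numpy as np

def holstein_2site(t=1.0, omega=1.0, g=0.5, nmax=4):
    """Exact diagonalization of a 2-site Holstein polaron (one electron):
        H = -t (c1^+ c2 + h.c.) + omega * sum_i b_i^+ b_i
            + g * sum_i n_i (b_i^+ + b_i)
    with phonons truncated at nmax quanta per site.
    Returns the sorted eigenvalues."""
    d = nmax + 1
    b = np.diag(np.sqrt(np.arange(1, d)), k=1)   # phonon annihilation
    num = b.T @ b                                # phonon number operator
    x = b + b.T                                  # b^+ + b (displacement)
    Ib, Ie = np.eye(d), np.eye(2)
    hop = np.array([[0.0, -t], [-t, 0.0]])       # electron hopping
    n1, n2 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])

    H = (np.kron(np.kron(hop, Ib), Ib)
         + omega * (np.kron(np.kron(Ie, num), Ib)
                    + np.kron(np.kron(Ie, Ib), num))
         + g * (np.kron(np.kron(n1, x), Ib)
                + np.kron(np.kron(n2, Ib), x)))
    return np.linalg.eigvalsh(H)

# At g = 0 the electron decouples and the ground-state energy is -t;
# turning on g lowers it via polaron formation.
E0 = holstein_2site(g=0.5)[0]
```

Such a calculation provides the classical reference against which small-scale polaron quantum simulations can be checked, in the spirit of the benchmark reported in the abstract.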