We present a MATLAB/Octave toolbox to decompose finite-dimensional representations of compact groups. Surprisingly, little information about the group and the representation is needed to perform this task. We discuss applications to semidefinite programming.
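One way to see why so little information suffices is a classical trick: averaging a random symmetric matrix over the group action yields an equivariant matrix whose eigenspaces are invariant subspaces. A minimal sketch in Python (not the toolbox itself, which is MATLAB/Octave, and not necessarily its actual algorithm), using the permutation representation of S3 on R^3:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def perm_matrix(p):
    """Permutation matrix mapping basis vector e_i to e_{p(i)}."""
    M = np.zeros((len(p), len(p)))
    for i, j in enumerate(p):
        M[j, i] = 1.0
    return M

# The full permutation representation of S3 (6 orthogonal matrices).
group = [perm_matrix(p) for p in itertools.permutations(range(3))]

# Average a random symmetric matrix over the group action; the result
# commutes with every representation matrix.
X = rng.standard_normal((3, 3))
X = X + X.T
avg = sum(g @ X @ g.T for g in group) / len(group)

# Eigenspaces of `avg` are invariant subspaces; eigenvalue multiplicities
# reveal the decomposition (here: trivial component of dim 1, standard
# representation of dim 2).
eigvals, eigvecs = np.linalg.eigh(avg)
```

For a generic random `X`, the averaged matrix has two eigenvalue clusters of sizes 1 and 2, matching the isotypic decomposition of the permutation representation.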
We calculate the ghost two-point function in Coulomb gauge QCD with a simple model vacuum gluon wavefunctional using Monte Carlo integration. This approach extends the previous analytic studies of the ghost propagator with this ansatz, where a ladder-rainbow expansion was unavoidable for calculating the path integral over gluon field configurations. The new approach allows us to study the possible critical behavior of the coupling constant, as well as the Coulomb potential derived from the ghost dressing function. We demonstrate that IR enhancement of the ghost correlator or Coulomb form factor fails to quantitatively reproduce confinement with a Gaussian vacuum wavefunctional.
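The Monte Carlo step can be caricatured in one mode (this toy is an assumption for illustration, not the paper's gluon-field path integral): a Gaussian vacuum wavefunctional makes |psi|^2 a Gaussian probability density, so expectation values become averages over sampled configurations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-mode "vacuum wavefunctional": psi(x) ~ exp(-omega x^2 / 2),
# so |psi(x)|^2 is a normal density with variance 1/(2*omega).
omega = 2.0  # assumed toy frequency, not a physical parameter from the paper
sigma = np.sqrt(1.0 / (2.0 * omega))

# Sample "field configurations" from |psi|^2 and average an observable,
# here <x^2>, instead of expanding the path integral diagrammatically.
samples = rng.normal(0.0, sigma, size=200_000)
estimate = np.mean(samples**2)

exact = 1.0 / (2.0 * omega)  # analytic value for the Gaussian state
```

The same pattern, with lattice gluon configurations drawn from the model wavefunctional, replaces the ladder-rainbow expansion mentioned above.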
Since the very early days of quantum theory there have been numerous attempts to interpret quantum mechanics as a statistical theory. This is equivalent to describing quantum states and ensembles, together with their dynamics, entirely in terms of phase-space distributions. Finite-dimensional systems have historically been an issue. In recent works [Phys. Rev. Lett. 117, 180401 and Phys. Rev. A 96, 022117] we presented a framework for representing any quantum state as a complete continuous Wigner function. Here we extend this work to its partner function -- the Weyl function. In doing so we complete the phase-space formulation of quantum mechanics -- extending the work of Wigner, Weyl, Moyal, and others to any quantum system. This work is structured in three parts. First, we provide a brief modernized discussion of the general framework of phase-space quantum mechanics. We extend previous work and show how this leads to a framework that can describe any system in phase space -- putting it for the first time on a truly equal footing with Schr\"odinger's and Heisenberg's formulations of quantum mechanics. Importantly, we do this in a way that respects the unifying principles of parity and displacement in a natural broadening of previously developed phase-space concepts and methods. Second, we consider how this framework is realized for different quantum systems; in particular we consider the proper construction of Weyl functions for some example finite-dimensional systems. Finally, we relate the Wigner and Weyl distributions to the statistical properties of any quantum system or set of systems.
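For the simplest finite-dimensional system, a qubit, the displaced-parity construction alluded to above gives an explicit Wigner function. A sketch using the standard SU(2) kernel Delta(n) = (I + sqrt(3) n.sigma)/2 (a textbook form, assumed here rather than taken from the papers cited):

```python
import numpy as np

# Pauli matrices.
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def kernel(theta, phi):
    """Displaced-parity kernel Delta(n) = (I + sqrt(3) n.sigma) / 2."""
    n = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])
    return 0.5 * (I2 + np.sqrt(3) * (n[0] * sx + n[1] * sy + n[2] * sz))

def wigner(rho, theta, phi):
    """Qubit Wigner function W(n) = Tr[rho Delta(n)] on the sphere."""
    return np.real(np.trace(rho @ kernel(theta, phi)))

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)  # |0><0|
```

For the state |0>, W peaks at the north pole with value (1 + sqrt(3))/2 and goes negative, (1 - sqrt(3))/2, at the south pole, illustrating how a finite-dimensional state lives on a spherical phase space.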
Although a standard in the natural sciences, reproducibility has been only episodically applied in experimental computer science. Scientific papers often present a large number of tables, plots and pictures that summarize the obtained results, but then only loosely describe the steps taken to derive them. Not only can the methods and the implementation be complex, but their configuration may also require setting many parameters and/or depend on particular system configurations. While many researchers recognize the importance of reproducibility, the challenge of making it happen often outweighs the benefits. Fortunately, a plethora of reproducibility solutions have recently been designed and implemented by the community. In particular, packaging tools (e.g., ReproZip) and virtualization tools (e.g., Docker) are promising solutions towards facilitating reproducibility for both authors and reviewers. To address the incentive problem, we have implemented a new publication model for the Reproducibility Section of Information Systems Journal. In this section, authors submit a reproducibility paper that explains in detail the computational assets from a previously published manuscript in Information Systems.
This paper presents a use case exploring the application of the Archival Resource Key (ARK) persistent identifier for promoting and maintaining ontologies. In particular, we look at improving computation with an in-house ontology server in the context of temporally aligned vocabularies. This effort demonstrates the utility of ARKs in preparing historical ontologies for computational archival science.
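The structure of an ARK makes it amenable to simple machine handling, which is part of its appeal for ontology services. A minimal sketch of parsing the `ark:[/]NAAN/Name` shape (the regex and the example identifier are illustrative assumptions, not part of the use case described above):

```python
import re

# An ARK has the shape "ark:[/]NAAN/Name", where NAAN is the
# Name Assigning Authority Number (a short digit string).
ARK_RE = re.compile(r"ark:/?(?P<naan>\d{5,9})/(?P<name>[^/?#]+)")

def parse_ark(s):
    """Extract (NAAN, Name) from a string containing an ARK, else None."""
    m = ARK_RE.search(s)
    if m is None:
        return None
    return m.group("naan"), m.group("name")
```

Such parsing lets a resolver or ontology server route a persistent identifier to the right authority and record, independent of the hostname it arrived under.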
A purely numerical approach to compact Riemann surfaces starting from plane algebraic curves is presented. The critical points of the algebraic curve are computed via a two-dimensional Newton iteration. The starting values for this iteration are obtained from the resultants with respect to both coordinates of the algebraic curve and a suitable pairing of their zeros. A set of generators of the fundamental group for the complement of these critical points in the complex plane is constructed from circles around these points and connecting lines obtained from a minimal spanning tree. The monodromies are computed by solving the defining equation of the algebraic curve on collocation points along these contours and by analytically continuing the roots. The collocation points are chosen to correspond to Chebyshev collocation points for an ensuing Clenshaw-Curtis integration of the holomorphic differentials, which gives the periods of the Riemann surface with spectral accuracy. At the singularities of the algebraic curve, Puiseux expansions computed by contour integration on the circles around the singularities are used to identify the holomorphic differentials. The Abel map is also computed with the Clenshaw-Curtis algorithm and contour integrals. As an application of the code, solutions to the Kadomtsev-Petviashvili equation are computed on non-hyperelliptic Riemann surfaces.
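The Clenshaw-Curtis step can be illustrated in isolation (a standard textbook implementation in Python, not the paper's actual code): sample the integrand at the Chebyshev points, recover the Chebyshev coefficients with an FFT, and integrate the series term by term, which yields spectral accuracy for analytic integrands.

```python
import numpy as np

def clenshaw_curtis(f, N):
    """Integrate f over [-1, 1] by Clenshaw-Curtis quadrature on N+1 nodes."""
    # Chebyshev (Clenshaw-Curtis) collocation points.
    k = np.arange(N + 1)
    x = np.cos(np.pi * k / N)
    fx = f(x)
    # Chebyshev coefficients via an FFT of the even extension of f(x_k).
    g = np.concatenate([fx, fx[-2:0:-1]])
    c = np.real(np.fft.fft(g)) / N
    a = np.concatenate([[c[0] / 2], c[1:N], [c[N] / 2]])
    # Integrate the series: int_{-1}^{1} T_j dx = 2/(1 - j^2) for even j,
    # and 0 for odd j.
    j = np.arange(0, N + 1, 2)
    return np.sum(a[j] * 2.0 / (1.0 - j**2))
```

For an analytic integrand such as exp(x), the error decays geometrically in N; in the approach above the same nodes double as the collocation points along the monodromy contours, so the root-continuation data is reused for the period integrals at no extra cost.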