In their 2002 work, Mukhin and Varchenko introduced a Wronskian map from the variety of full flags in a finite-dimensional vector space into a product of projective spaces. We establish a precise relationship between this map and the Plücker map. This allows us to recover the result of Varchenko and Wright that the polynomials appearing in the image of the Wronskian map are the initial values of the tau-functions for the Kadomtsev-Petviashvili hierarchy.
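For orientation, recall the classical Wronskian underlying the map (a standard definition given here as background; the paper's precise conventions may differ). For functions $f_1,\dots,f_k$,
\[
\operatorname{Wr}(f_1,\dots,f_k)=\det\begin{pmatrix} f_1 & \cdots & f_k\\ f_1' & \cdots & f_k'\\ \vdots & & \vdots\\ f_1^{(k-1)} & \cdots & f_k^{(k-1)} \end{pmatrix},
\]
which changes only by a nonzero scalar under a change of basis of the span of $f_1,\dots,f_k$, so each step of a full flag determines a well-defined point in a projective space.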
In this paper we review different expansions for neutrino oscillation probabilities in matter in the context of long-baseline neutrino experiments. We examine the accuracy and computational efficiency of various exact and approximate expressions. We find that many of the expressions used in the literature are not precise enough for the next generation of long-baseline experiments, although several do reach the required precision while remaining comparably simple. The results of this paper can serve as guidance to both phenomenologists and experimentalists when implementing the various oscillation expressions into their analysis tools.
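To fix ideas, the expressions under comparison generalize the textbook two-flavor vacuum formula sketched below; this is a minimal illustration of the form such probabilities take in code (plain vacuum oscillation with illustrative parameter values, not one of the matter expansions reviewed here).

    import numpy as np

    def p_2flavor_vacuum(sin2_2theta, dm2_ev2, L_km, E_GeV):
        """Textbook two-flavor vacuum oscillation probability:

        P = sin^2(2*theta) * sin^2(1.27 * dm2 [eV^2] * L [km] / E [GeV])
        """
        return sin2_2theta * np.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

    # Illustrative numbers only (roughly atmospheric parameters, a 1300 km baseline):
    print(p_2flavor_vacuum(sin2_2theta=1.0, dm2_ev2=2.5e-3, L_km=1300.0, E_GeV=2.5))

Matter effects and three-flavor interference replace this single sine-squared term with the considerably longer expressions whose accuracy the paper benchmarks.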
One can hardly believe that there is still something to be said about cubic equations. To dodge this doubt, we will instead try to say something about Sylvester. He doubtless found a way to solve cubic equations; as Rota mentioned, it was the only method in this vein that he could remember. We observe that Sylvester's magnificent approach to reduced cubic equations boils down to an easy identity.
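As background (standard material, not the identity referred to above): a cubic is "reduced", or depressed, once its quadratic term is removed by a shift of the variable. Substituting $x=t-a/3$ into $x^3+ax^2+bx+c$ gives
\[
t^3+pt+q,\qquad p=b-\frac{a^2}{3},\qquad q=\frac{2a^3}{27}-\frac{ab}{3}+c,
\]
so it suffices to solve cubics of the form $t^3+pt+q=0$.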
Recovering the 3D shape of transparent objects from a small number of unconstrained natural images is an ill-posed problem. Complex light paths induced by refraction and reflection have prevented both traditional and deep multiview stereo from solving this challenge. We propose a physically based network to recover the 3D shape of transparent objects from a few images acquired with a mobile phone camera, under a known but arbitrary environment map. Our novel contributions include: a normal representation that enables the network to model complex light transport through local computation, a rendering layer that models refractions and reflections, a cost volume specifically designed for normal refinement of transparent shapes, and a feature mapping based on predicted normals for 3D point cloud reconstruction. We render a synthetic dataset to encourage the model to learn refractive light transport across different views. Our experiments show successful recovery of high-quality 3D geometry for complex transparent shapes using as few as 5-12 natural images. Code and data are publicly released.
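As an illustration of the physics such a rendering layer must encode, here is the standard vector form of Snell's law together with mirror reflection (a generic sketch, not the paper's actual layer; all names are ours).

    import numpy as np

    def refract(I, N, eta):
        """Vector form of Snell's law (as in GLSL's refract).

        I   : unit incident direction, pointing toward the surface
        N   : unit surface normal, pointing against I
        eta : ratio of refractive indices n1 / n2
        Returns the refracted direction, or None on total internal reflection.
        """
        cos_i = -np.dot(N, I)
        k = 1.0 - eta**2 * (1.0 - cos_i**2)
        if k < 0.0:
            return None  # total internal reflection; only the reflected ray survives
        return eta * I + (eta * cos_i - np.sqrt(k)) * N

    def reflect(I, N):
        """Mirror reflection of I about the normal N."""
        return I - 2.0 * np.dot(I, N) * N

    # Air-to-glass example (n_glass ~ 1.5), 45-degree incidence in the xz-plane:
    I = np.array([np.sin(np.pi / 4), 0.0, -np.cos(np.pi / 4)])
    N = np.array([0.0, 0.0, 1.0])
    print(refract(I, N, eta=1.0 / 1.5))

A differentiable rendering layer evaluates such light paths per pixel, which is what lets photometric error propagate gradients back to the predicted normals.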
We perform a flavour $SU(3)$ analysis of the recently discovered $\Omega(2012)$ hyperon. We find that the well-known (four-star) $\Delta(1700)$ resonance with quantum numbers $J^P=3/2^-$ is a good candidate for the decuplet partner of $\Omega(2012)$, provided the branching ratio for the three-body decays of the latter is not too large, $\le 70\%$. This implies that the quantum numbers of $\Omega(2012)$ are $I(J^P)=0(3/2^-)$. Predictions for the properties of the still-missing $\Sigma$ and $\Xi$ decuplet members are made. We also discuss the implications of the $\overline{K}\Xi(1530)$ molecular picture of $\Omega(2012)$. Crucial experimental tests to distinguish the various pictures of $\Omega(2012)$ are suggested.
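As background to such decuplet predictions (textbook $SU(3)$ material, not a result of this paper): to first order in flavour breaking, the masses of a baryon decuplet obey the Gell-Mann-Okubo equal-spacing rule,
\[
M_{\Sigma^*}-M_{\Delta}=M_{\Xi^*}-M_{\Sigma^*}=M_{\Omega}-M_{\Xi^*},
\]
which is what allows the masses of missing members of a decuplet to be estimated from the known ones.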
The recent success of machine learning (ML) has led to explosive growth, both in new systems and algorithms built in industry and academia and in new applications built by an ever-growing community of data science (DS) practitioners. This quickly shifting panorama of technologies and applications is challenging for builders and practitioners alike to follow. In this paper, we set out to capture this panorama through a wide-angle lens, by performing the largest analysis of DS projects to date, focusing on questions that can help determine investments on either side. Specifically, we download and analyze: (a) over 6M Python notebooks publicly available on GitHub, (b) over 2M enterprise DS pipelines developed within COMPANYX, and (c) the source code and metadata of over 900 releases from 12 important DS libraries. The analysis we perform ranges from coarse-grained statistical characterizations to analysis of library imports, pipelines, and comparative studies across datasets and time. We report a large number of measurements for our readers to interpret, and dare to draw a few (actionable, yet subjective) conclusions on (a) what systems builders should focus on to better serve practitioners, and (b) which technologies practitioners should bet on given current trends. We plan to automate this analysis and release associated tools and results periodically.
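To give a flavor of the import analysis involved, the following sketch extracts top-level imports from a notebook's code cells (our illustration only, not the paper's tooling, which surely handles many more edge cases).

    import ast
    import json

    def notebook_imports(path):
        """Collect top-level module names imported by a Jupyter notebook."""
        with open(path, encoding="utf-8") as f:
            nb = json.load(f)
        modules = set()
        for cell in nb.get("cells", []):
            if cell.get("cell_type") != "code":
                continue
            try:
                tree = ast.parse("".join(cell.get("source", [])))
            except SyntaxError:
                continue  # skip cells with magics or broken code
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    modules.update(a.name.split(".")[0] for a in node.names)
                elif isinstance(node, ast.ImportFrom) and node.module:
                    modules.add(node.module.split(".")[0])
        return modules

    # e.g. notebook_imports("analysis.ipynb") -> {"numpy", "pandas", "sklearn"}

Aggregating such per-notebook sets across millions of repositories is what yields the library popularity and co-occurrence trends reported in the paper.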