This paper presents new and efficient algorithms for matching stellar catalogues where the transformation between the coordinate systems of the two catalogues is unknown and may include shearing. Finding a given object, whether a star or an asterism, from the first catalogue in the second is logarithmic in time rather than polynomial, yielding a dramatic speed-up relative to a naive implementation. Both the acceleration of the matching algorithm and the ability to solve for arbitrary affine transformations will not only allow the registration of stellar catalogues and images that are currently impossible to use, but will also find applications in machine vision and other imaging applications.
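A minimal sketch of the kind of logarithmic-time lookup described above, assuming a k-d tree built over affine-invariant descriptors of four-star asterisms; the descriptor choice, the scipy-based tree, and the brute-force enumeration of quads are illustrative assumptions, not the paper's implementation.

```python
import itertools
import numpy as np
from scipy.spatial import cKDTree

def affine_invariant(quad):
    """Affine-invariant descriptor of a 4-star asterism.

    Express the 4th star in the affine frame spanned by the first three:
    d = a + u*(b - a) + v*(c - a).  The pair (u, v) is unchanged by any
    affine transformation applied to all four stars.  (A real matcher
    would also canonicalise the ordering of the four stars.)
    """
    a, b, c, d = (np.asarray(p, dtype=float) for p in quad)
    m = np.column_stack([b - a, c - a])   # 2x2 basis matrix
    u, v = np.linalg.solve(m, d - a)      # affine coordinates of d
    return (u, v)

def build_index(stars, n_brightest=30):
    """Index every 4-star asterism drawn from the brightest stars."""
    quads = list(itertools.combinations(stars[:n_brightest], 4))
    descriptors = np.array([affine_invariant(q) for q in quads])
    return cKDTree(descriptors), quads

def match_asterism(tree, quads, query_quad):
    """Logarithmic-time lookup of the closest stored asterism."""
    dist, idx = tree.query(affine_invariant(query_quad))
    return quads[idx], dist
```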
In this paper we propose a new pattern-matching algorithm, based on a statistical method, for matching CCD images to a stellar catalogue. Constructing star pairs greatly reduces the computational complexity compared with the triangle method. We use a subsample of the brightest objects from the image and the reference catalogue, and find a coordinate transformation between the two based on the statistical information of star pairs. All the objects are then matched using this initial plate solution. The matching process completes in several milliseconds for images taken with the Yunnan Observatory 1-m telescope.
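A toy sketch of how star-pair statistics could be accumulated for the brightest few dozen stars: pairs with matching separations vote for a field-rotation angle, and the most common vote is taken as the estimate. The voting scheme, tolerances, and restriction to rotation are simplifying assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np
from collections import Counter

def pair_features(xy):
    """Separation and position angle for every star pair in an (n, 2) array."""
    feats = []
    n = len(xy)
    for i in range(n):
        for j in range(i + 1, n):
            d = xy[j] - xy[i]
            feats.append((np.hypot(d[0], d[1]), np.arctan2(d[1], d[0])))
    return feats

def estimate_rotation(img_xy, cat_xy, sep_tol=1.0, ang_bin=0.5):
    """Vote on the rotation angle (degrees) using star pairs whose
    separations agree to within sep_tol (same units as the input)."""
    votes = Counter()
    for sep_i, ang_i in pair_features(img_xy):
        for sep_c, ang_c in pair_features(cat_xy):
            if abs(sep_i - sep_c) < sep_tol:
                dang = np.degrees(ang_c - ang_i) % 360.0
                votes[round(dang / ang_bin) * ang_bin] += 1
    # The most common angle difference is the estimated field rotation.
    return votes.most_common(1)[0][0] if votes else None
```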
The minimum degree algorithm is one of the most widely used heuristics for reducing the cost of solving large sparse systems of linear equations. It has been studied for nearly half a century and has a rich history of bridging techniques from data structures, graph algorithms, and scientific computing. In this paper, we present a simple but novel combinatorial algorithm for computing an exact minimum degree elimination ordering in $O(nm)$ time, which improves on the best known time complexity of $O(n^3)$ and offers practical improvements for sparse systems with small values of $m$. Our approach leverages a careful amortized analysis, which also allows us to derive output-sensitive bounds for the running time of $O(\min\{m\sqrt{m^+}, \Delta m^+\} \log n)$, where $m^+$ is the number of unique fill edges and original edges that the algorithm encounters and $\Delta$ is the maximum degree of the input graph. Furthermore, we show there cannot exist an exact minimum degree algorithm that runs in $O(nm^{1-\varepsilon})$ time, for any $\varepsilon > 0$, assuming the strong exponential time hypothesis. This fine-grained reduction goes through the orthogonal vectors problem and uses a new low-degree graph construction called $U$-fillers, which act as pathological inputs and cause any minimum degree algorithm to exhibit nearly worst-case performance. With these two results, we nearly characterize the time complexity of computing an exact minimum degree ordering.
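For reference, a naive implementation of the minimum degree elimination process that the abstract discusses: repeatedly eliminate a vertex of minimum degree and add fill edges among its neighbours. This sketch illustrates the definition only; it does not implement the paper's $O(nm)$ algorithm or its amortized analysis.

```python
def minimum_degree_ordering(adj):
    """Exact minimum degree elimination ordering (naive reference sketch).

    adj: dict mapping each vertex to a set of its neighbours.
    Returns the elimination order; fill edges are added as vertices
    are eliminated, mimicking symbolic Gaussian elimination.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}   # work on a copy
    order = []
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))       # vertex of minimum degree
        nbrs = adj.pop(v)
        for u in nbrs:
            adj[u].discard(v)
        # Fill: the eliminated vertex's neighbours become a clique.
        for u in nbrs:
            for w in nbrs:
                if u != w:
                    adj[u].add(w)
        order.append(v)
    return order

# Example: a 4-cycle a-b-c-d; eliminating any vertex adds one fill edge.
print(minimum_degree_ordering({
    'a': {'b', 'd'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'a', 'c'}}))
```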
Fast access to large catalogs is required for some astronomical applications. Here we introduce the catsHTM tool, consisting of several large catalogs reformatted into an HDF5-based file format, which can be downloaded and used locally. To allow fast access, the catalogs are partitioned into hierarchical triangular meshes and stored in HDF5 files. Several tools are provided to perform efficient cone searches, at resolutions spanning from a few arcseconds to degrees, within a few milliseconds. The first released version includes the following catalogs (in alphabetical order): 2MASS, 2MASS extended sources, AKARI, APASS, Cosmos, DECaLS/DR5, FIRST, GAIA/DR1, GAIA/DR2, GALEX/DR6Plus7, HSC/v2, IPHAS/DR2, NED redshifts, NVSS, Pan-STARRS1/DR1, PTF photometric catalog, ROSAT faint source, SDSS sources, SDSS/DR14 spectroscopy, Spitzer/SAGE, Spitzer/IRAC galactic center, UCAC4, UKIDSS/DR10, VST/ATLAS/DR3, VST/KiDS/DR3, WISE and XMM. We provide Python code for performing cone searches, as well as MATLAB code for performing cone searches, catalog cross-matching, and general searches, and for loading and creating these catalogs.
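A hedged usage sketch of the Python cone-search interface mentioned in the abstract. The entry point, argument units (radians for the cone centre, arcseconds for the radius), catalogue name string, and return values shown here are assumptions based on typical cone-search interfaces and should be checked against the released code.

```python
import numpy as np
from catsHTM import cone_search   # assumed entry point of the released Python code

# Assumed signature: catalogue name, RA/Dec of the cone centre in radians,
# search radius in arcseconds; assumed to return the matched rows plus the
# column names and units of the chosen catalogue.
ra, dec = np.radians(180.0), np.radians(30.0)
rows, col_names, col_units = cone_search('GAIADR2', ra, dec, 10.0)
print(len(rows), 'sources within 10 arcsec')
```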
We present new software to cross-match low-frequency radio catalogues: the Positional Update and Matching Algorithm (PUMA). PUMA combines a positional Bayesian probabilistic approach with spectral matching criteria, allowing for confused sources in the matching process. We go on to create a radio sky model using PUMA based on the Murchison Widefield Array Commissioning Survey, and are able to automatically cross-match 98.5% of sources. Using the characteristics of this sky model, we create simple simulated mock catalogues on which to test PUMA, and find that PUMA can reliably recover the correct spectral indices of sources, as well as ionospheric offsets. Finally, we use this sky model to calibrate and remove foreground sources from simulated interferometric data, generated using OSKAR (the Oxford University visibility generator). We demonstrate a substantial improvement in foreground source removal when using higher-frequency and higher-resolution source positions, even when correcting positions by an average of 0.3 given a synthesized beam-width of 2.3.
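A minimal sketch of the kind of positional Bayesian weighting that such a cross-matcher combines with spectral criteria: a Gaussian positional-offset likelihood is compared against the chance of an unrelated source landing that close. The flat background prior and the specific normalisation are simplifying assumptions, not PUMA's exact implementation.

```python
import numpy as np

def match_probability(sep_arcsec, sigma_arcsec, sky_density_per_arcsec2):
    """Posterior probability that two sources are the same object,
    comparing a Gaussian positional-offset likelihood against a
    uniform background of unrelated (confusing) sources."""
    sigma2 = sigma_arcsec ** 2
    like_match = np.exp(-0.5 * sep_arcsec ** 2 / sigma2) / (2 * np.pi * sigma2)
    like_chance = sky_density_per_arcsec2          # uniform background density
    return like_match / (like_match + like_chance)

# e.g. a 5 arcsec offset with 3 arcsec combined positional uncertainty,
# in a field with one confusing source per 10,000 square arcseconds:
print(match_probability(5.0, 3.0, 1e-4))
```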
We present FermiFAST, a new and efficient algorithm for finding point sources in the photon event data stream from the Fermi Gamma-Ray Space Telescope. The key advantage of FermiFAST is that it constructs a catalogue of potential sources very quickly by arranging the photon data in a hierarchical data structure. Using this structure, FermiFAST rapidly finds the photons that could have originated from a potential gamma-ray source. It calculates a likelihood ratio for the contribution of the potential source using the angular distribution of the photons within the region of interest. Within a few minutes it can find the most significant half of the Fermi Third Point Source Catalogue (3FGL) with nearly 80% purity from the four years of data used to construct that catalogue. If a higher-purity sample is desirable, one can obtain a sample that includes the most significant third of the Fermi 3FGL with only five percent of the sources unassociated with Fermi sources. Outside the Galactic plane, all but eight of the 580 FermiFAST detections are associated with 3FGL sources; of these eight, six yield significant detections of greater than five sigma when a further binned likelihood analysis is performed. This software allows rapid exploration of the Fermi data, simulation of the source detection to calculate the selection function of various sources, and estimation of the errors in the obtained parameters of the detected sources.
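A toy sketch of the two ingredients the abstract describes: a spatial index for fast retrieval of photons near a candidate position, and a likelihood ratio over the photons' angular distribution within the region of interest. The flat-sky geometry, Gaussian PSF, flat background, and grid search over the source fraction are simplifying assumptions, not FermiFAST's actual models.

```python
import numpy as np
from scipy.spatial import cKDTree

def photon_tree(lon_deg, lat_deg):
    """Hierarchical spatial index over the photon list (flat-sky sketch)."""
    return cKDTree(np.column_stack([lon_deg, lat_deg]))

def test_statistic(tree, pos, psf_sigma=0.5, roi_deg=2.0):
    """Simplified likelihood-ratio test for a point source at `pos`:
    each photon in the region of interest is modelled as coming from a
    Gaussian PSF (fraction f) or a flat background (fraction 1 - f),
    and the ratio is maximised over f on a coarse grid."""
    idx = tree.query_ball_point(pos, roi_deg)     # photons within the ROI
    if not idx:
        return 0.0
    r = np.linalg.norm(tree.data[idx] - np.asarray(pos), axis=1)
    area = np.pi * roi_deg ** 2
    psf = np.exp(-0.5 * (r / psf_sigma) ** 2) / (2 * np.pi * psf_sigma ** 2)
    ts = 0.0
    for f in np.linspace(0.0, 1.0, 51):
        ts = max(ts, 2.0 * np.log(f * psf + (1 - f) / area).sum()
                      - 2.0 * np.log(1.0 / area) * len(r))
    return ts
```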