We use the $k$-nearest neighbor probability distribution function ($k$NN-PDF; Banerjee & Abel 2021) to assess convergence in a scale-free $N$-body simulation. Compared to our previous two-point analysis, the $k$NN-PDF allows us to quantify our results in the language of halos and numbers of particles, while also incorporating non-Gaussian information. We find good convergence for 32 particles and greater at densities typical of halos, while 16 particles and fewer appear unconverged. Halving the softening length extends convergence to higher densities, but not to fewer particles. Our analysis is less sensitive to voids; although we probe only a limited range of underdensities, we find evidence for convergence at 16 particles and greater even in sparse voids.
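As a concrete illustration of the statistic, the following is a minimal sketch, assuming numpy and scipy, of how a $k$NN-CDF (the cumulative counterpart of the $k$NN-PDF) can be estimated from a tracer catalogue: build a periodic KD-tree on the tracer positions and record the distance from volume-filling random query points to their $k$-th nearest tracer. The function name knn_cdf, the box size, and the $k$ values below are illustrative placeholders, not the configuration used in this work.

import numpy as np
from scipy.spatial import cKDTree

def knn_cdf(tracers, boxsize, k_values, n_query=10**6, seed=None):
    """Empirical kNN-CDFs measured from random query points (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    queries = rng.uniform(0.0, boxsize, size=(n_query, 3))

    # Periodic KD-tree on the tracer positions (e.g. simulation particles).
    tree = cKDTree(tracers, boxsize=boxsize)

    # Distances to the 1st .. max(k) nearest neighbors of every query point.
    dists, _ = tree.query(queries, k=max(k_values))
    dists = dists.reshape(n_query, -1)  # ensure 2D even if max(k_values) == 1

    cdfs = {}
    for k in k_values:
        r = np.sort(dists[:, k - 1])                 # k-th NN distances, sorted
        cdf = np.arange(1, n_query + 1) / n_query    # empirical CDF evaluated at radii r
        cdfs[k] = (r, cdf)
    return cdfs

# Toy usage on a Poisson "catalogue" in a periodic box of side 100 (arbitrary units):
# particles = np.random.default_rng(0).uniform(0, 100.0, size=(10**5, 3))
# results = knn_cdf(particles, 100.0, k_values=[1, 2, 4, 8], n_query=10**5)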