
Simulating Photometric Images of Moving Targets with Photon-mapping

Posted by Junju Du
Publication date: 2021
Research field: Physics
Paper language: English





We present a novel, easy-to-use method based on the photon-mapping technique to simulate photometric images of moving targets. Realistic images can be created in two passes: photon tracing and image rendering. The nature of the light sources, the tracking mode of the telescope, the point spread function (PSF), and the specifications of the CCD are taken into account in the imaging process. Photometric images in a variety of observation scenarios can be generated flexibly. We compared the simulated images with the observed ones. The residuals between them are negligible, and the correlation coefficients are high, with a median of $0.9379_{-0.0201}^{+0.0125}$ for 1020 pairs of images, indicating high fidelity and similarity. The method is versatile and can be used to plan future photometry of moving targets, interpret existing observations, and provide test images for image processing algorithms.
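The sketch below is not the authors' photon-mapping pipeline; it only illustrates, in a heavily simplified form, the rendering idea and the image-comparison metric: a moving point source is trailed across a simulated CCD frame with a Gaussian PSF (as if the telescope tracked sidereally while the target moved), Poisson and read noise are added, and the Pearson correlation coefficient against a reference frame is computed. All parameters (flux, motion rate, PSF width, noise levels) and function names are invented for the example.

```python
import numpy as np

def render_moving_target(shape=(64, 64), flux=5e4, rate=(0.3, 0.1),
                         exptime=10.0, psf_sigma=1.5, sky=50.0,
                         read_noise=5.0, n_steps=200, seed=0):
    """Trail a moving point source across a CCD frame.

    rate is the target motion in pixels per second; the trail is built by
    summing short PSF footprints along the track, a crude stand-in for the
    photon-tracing pass described in the abstract.
    """
    rng = np.random.default_rng(seed)
    ny, nx = shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    img = np.full(shape, sky, dtype=float)          # uniform sky background
    x0, y0 = nx / 2.0, ny / 2.0                     # target position at mid-exposure
    for t in np.linspace(-exptime / 2, exptime / 2, n_steps):
        xc, yc = x0 + rate[0] * t, y0 + rate[1] * t
        r2 = (xx - xc) ** 2 + (yy - yc) ** 2
        psf = np.exp(-0.5 * r2 / psf_sigma ** 2)
        psf /= psf.sum()
        img += (flux / n_steps) * psf               # distribute flux along the trail
    img = rng.poisson(img).astype(float)            # photon (shot) noise
    img += rng.normal(0.0, read_noise, size=shape)  # CCD read noise
    return img

def correlation_coefficient(sim, obs):
    """Pearson correlation between a simulated and an observed frame."""
    return np.corrcoef(sim.ravel(), obs.ravel())[0, 1]

if __name__ == "__main__":
    observed = render_moving_target(seed=1)    # stand-in for a real frame
    simulated = render_moving_target(seed=2)   # same scene, different noise realisation
    print(f"correlation coefficient: {correlation_coefficient(simulated, observed):.4f}")
```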




Read also

Astronomical images from optical photometric surveys are typically contaminated with transient artifacts such as cosmic rays, satellite trails and scattered light. We have developed and tested an algorithm that removes these artifacts using a deep, artifact-free, static-sky coadd image built up through the median combination of point spread function (PSF) homogenized, overlapping single-epoch images. Transient artifacts are detected and masked in each single-epoch image through comparison with an artifact-free, PSF-matched simulated image that is constructed using the PSF-corrected, model-fitting catalog from the artifact-free coadd image together with the position-variable PSF model of the single-epoch image. This approach works well not only for cleaning single-epoch images with worse seeing than the PSF-homogenized coadd, but also for the traditionally much more challenging problem of cleaning single-epoch images with better seeing. In addition to masking transient artifacts, we have developed an interpolation approach that uses the local PSF and performs well in removing artifacts whose widths are smaller than the PSF full width at half maximum, including cosmic rays, the peaks of saturated stars and bleed trails. We have tested this algorithm on Dark Energy Survey Science Verification data and present performance metrics. More generally, our algorithm can be applied to any survey which images the same part of the sky multiple times.
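A heavily simplified sketch of the masking and interpolation steps, not the survey pipeline itself: the difference between a single-epoch image and a PSF-matched, artifact-free template is thresholded, the mask is grown, and flagged pixels are filled with a PSF-scale smoothed estimate. The threshold, kernel size, function names, and noise model are placeholders; the real pipeline constructs the template from a model-fitting catalog and a position-variable PSF.

```python
import numpy as np
from scipy import ndimage

def mask_transient_artifacts(image, template, sigma, nsig=5.0, grow=1):
    """Flag pixels where the single-epoch image deviates from the template.

    image    : single-epoch frame (2-D array)
    template : artifact-free, PSF-matched simulation of the same frame
    sigma    : per-pixel noise estimate of the difference image
    """
    diff = image - template
    mask = np.abs(diff) > nsig * sigma            # significant outliers
    if grow > 0:                                  # grow the mask to catch artifact wings
        mask = ndimage.binary_dilation(mask, iterations=grow)
    return mask

def interpolate_masked(image, mask, psf_fwhm_pix=3.0):
    """Replace masked pixels with a PSF-scale, Gaussian-weighted estimate.

    A stand-in for the local-PSF interpolation described above; it works for
    artifacts narrower than the PSF FWHM (cosmic rays, bleed trails).
    """
    sigma = psf_fwhm_pix / 2.355
    filled = image.copy()
    filled[mask] = 0.0
    weights = (~mask).astype(float)
    num = ndimage.gaussian_filter(filled, sigma)
    den = ndimage.gaussian_filter(weights, sigma)
    cleaned = image.copy()
    cleaned[mask] = (num / np.maximum(den, 1e-12))[mask]
    return cleaned
```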
We develop a method to infer log-normal random fields from measurement data affected by Gaussian noise. The log-normal model is well suited to describe strictly positive signals with fluctuations whose amplitude varies over several orders of magnitude. We use the formalism of minimum Gibbs free energy to derive an algorithm that uses the signal's correlation structure to regularize the reconstruction. The correlation structure, described by the signal's power spectrum, is thereby reconstructed from the same data set. We show that the minimization of the Gibbs free energy, corresponding to a Gaussian approximation to the posterior marginalized over the power spectrum, is equivalent to the empirical Bayes ansatz, in which the power spectrum is fixed to its maximum a posteriori value. We further introduce a prior for the power spectrum that enforces spectral smoothness. The appropriateness of this prior in different scenarios is discussed and its effects on the reconstruction results are demonstrated. We validate the performance of our reconstruction algorithm in a series of one- and two-dimensional test cases with varying degrees of non-linearity and different noise levels.
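A toy one-dimensional version of the inference problem, not the minimum Gibbs free energy algorithm: data d = exp(s) + n with Gaussian noise, a zero-mean Gaussian prior on s with a fixed (rather than reconstructed) power spectrum, and a MAP estimate of s found by numerical optimisation. The pixel count, spectrum shape, and noise level are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
npix, noise_std = 128, 0.5

# Gaussian prior on s defined by an assumed falling power spectrum P(k) ~ 1/(1+k^2)
k = np.fft.rfftfreq(npix) * npix
power = 50.0 / (1.0 + k**2)

def sample_signal():
    """Draw a signal realisation with (approximately) the prescribed spectrum."""
    modes = (rng.normal(size=k.size) + 1j * rng.normal(size=k.size)) * np.sqrt(power / 2)
    return np.fft.irfft(modes, n=npix) * np.sqrt(npix)

def neg_log_posterior(s, d):
    """-log P(s|d) up to a constant: Gaussian likelihood of d = exp(s) + n
    plus the Gaussian prior applied mode-by-mode in Fourier space."""
    like = 0.5 * np.sum((d - np.exp(s))**2) / noise_std**2
    s_k = np.fft.rfft(s) / np.sqrt(npix)
    prior = 0.5 * np.sum(np.abs(s_k)**2 / power)
    return like + prior

s_true = sample_signal()
data = np.exp(s_true) + rng.normal(0.0, noise_std, size=npix)

res = minimize(neg_log_posterior, x0=np.zeros(npix), args=(data,), method="L-BFGS-B")
print("rms error of MAP reconstruction:", np.sqrt(np.mean((res.x - s_true)**2)))
```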
To analyze dynamic positron emission tomography (PET) images, various generic multivariate data analysis techniques have been considered in the literature, such as principal component analysis (PCA), independent component analysis (ICA), factor analysis and nonnegative matrix factorization (NMF). Nevertheless, these conventional approaches neglect any possible nonlinear variations in the time activity curves describing the kinetic behavior of tissues with specific binding, which limits their ability to recover a reliable, understandable and interpretable description of the data. This paper proposes an alternative analysis paradigm that accounts for spatial fluctuations in the exchange rate of the tracer between a free compartment and a specifically bound ligand compartment. The method relies on the concept of linear unmixing, usually applied in the hyperspectral domain, which combines NMF with a sum-to-one constraint that ensures an exhaustive description of the mixtures. The spatial variability of the signature corresponding to the specific binding tissue is explicitly modeled through a perturbed component. The performance of the method is assessed on both synthetic and real data and is shown to compete favorably with other conventional analysis methods. The proposed method improved both factor estimation and proportion extraction for specific binding. Modeling the variability of the specific binding factor has a strong potential impact for dynamic PET image analysis.
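A minimal sketch of linear unmixing with a sum-to-one constraint on the proportions (the paper's perturbed specific-binding component is not modelled here): standard multiplicative NMF updates followed by a heuristic renormalisation of the abundance columns onto the simplex. Matrix names, sizes, and the synthetic signatures are invented for illustration.

```python
import numpy as np

def unmix_sum_to_one(Y, n_factors=3, n_iter=500, eps=1e-9, seed=0):
    """Nonnegative unmixing Y ~ M @ A with columns of A summing to one.

    Y : (n_times, n_voxels) dynamic PET matrix (one time-activity curve per voxel)
    M : (n_times, n_factors) factor time-activity curves
    A : (n_factors, n_voxels) nonnegative, sum-to-one proportions
    """
    rng = np.random.default_rng(seed)
    n_times, n_voxels = Y.shape
    M = rng.random((n_times, n_factors)) + eps
    A = rng.random((n_factors, n_voxels)) + eps
    A /= A.sum(axis=0, keepdims=True)
    for _ in range(n_iter):
        # multiplicative NMF updates (Lee & Seung), then project A onto the simplex
        A *= (M.T @ Y) / (M.T @ M @ A + eps)
        A /= A.sum(axis=0, keepdims=True)
        M *= (Y @ A.T) / (M @ A @ A.T + eps)
    return M, A

# tiny synthetic example: two tissue signatures mixed across 100 voxels
t = np.linspace(0, 1, 20)
M_true = np.stack([np.exp(-3 * t), 1 - np.exp(-5 * t)], axis=1)
A_true = np.random.default_rng(1).dirichlet([1, 1], size=100).T
Y = M_true @ A_true
M_est, A_est = unmix_sum_to_one(Y, n_factors=2)
print("relative reconstruction error:", np.linalg.norm(Y - M_est @ A_est) / np.linalg.norm(Y))
```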
Torsten Enßlin 2014
Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
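In the simplest linear, Gaussian case that IFT reproduces, the optimal reconstruction is the Wiener filter $m = (S^{-1} + R^{\dagger} N^{-1} R)^{-1} R^{\dagger} N^{-1} d$. The sketch below applies it to a one-dimensional toy signal; the squared-exponential correlation kernel, noise level, and trivial response matrix are assumptions for the example, not part of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
npix, noise_var = 64, 0.1

# prior signal covariance S: a squared-exponential kernel stands in for the
# signal's spatial correlation structure (correlation length = 5 pixels)
x = np.arange(npix)
S = 4.0 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / 5.0) ** 2) + 1e-6 * np.eye(npix)

R = np.eye(npix)                  # response: trivial instrument, one pixel per data point
N = noise_var * np.eye(npix)      # white Gaussian noise covariance

signal = rng.multivariate_normal(np.zeros(npix), S)
data = R @ signal + rng.multivariate_normal(np.zeros(npix), N)

# Wiener filter mean m = D j, with D = (S^-1 + R^T N^-1 R)^-1 and j = R^T N^-1 d
D = np.linalg.inv(np.linalg.inv(S) + R.T @ np.linalg.inv(N) @ R)
m = D @ (R.T @ np.linalg.inv(N) @ data)
print("rms reconstruction error:", np.sqrt(np.mean((m - signal) ** 2)))
```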
F. Feroz, J. Skilling 2013
In performing a Bayesian analysis, two difficult problems often emerge. First, in estimating the parameters of some model for the data, the resulting posterior distribution may be multi-modal or exhibit pronounced (curving) degeneracies. Secondly, in selecting between a set of competing models, calculation of the Bayesian evidence for each model is computationally expensive using existing methods such as thermodynamic integration. Nested Sampling is a Monte Carlo method targeted at the efficient calculation of the evidence, but it also produces posterior inferences as a by-product and therefore provides the means to carry out parameter estimation as well as model selection. The main challenge in implementing Nested Sampling is to sample from a constrained probability distribution. One possible solution to this problem is provided by the Galilean Monte Carlo (GMC) algorithm. We show results of applying Nested Sampling with GMC to some problems which have proven very difficult for standard Markov Chain Monte Carlo (MCMC) and down-hill methods, due to the presence of a large number of local minima and/or pronounced (curving) degeneracies between the parameters. We also discuss the use of Nested Sampling with GMC in Bayesian object detection problems, which are inherently multi-modal and require the evaluation of Bayesian evidence for distinguishing between true and spurious detections.
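A bare-bones nested sampling loop on a toy two-dimensional Gaussian likelihood, shown only to illustrate the evidence accumulation: new live points are drawn by naive rejection from the prior subject to the likelihood constraint, which is exactly the step that Galilean Monte Carlo replaces in realistic problems. All settings (live-point count, iteration count, likelihood width) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_likelihood(theta):
    """Toy likelihood: isotropic Gaussian centred at (0.5, 0.5) in the unit square."""
    return -0.5 * np.sum(((theta - 0.5) / 0.1) ** 2)

def nested_sampling(n_live=200, n_iter=1000):
    """Skilling-style nested sampling with naive rejection from the uniform prior.

    Returns a log-evidence estimate. Rejection sampling from the constrained
    prior becomes hopeless at high compression; GMC-style moves address that.
    """
    live = rng.random((n_live, 2))                       # uniform prior on [0,1]^2
    live_logl = np.array([log_likelihood(t) for t in live])
    log_z, log_x_prev = -np.inf, 0.0                     # evidence, log prior volume
    for i in range(1, n_iter + 1):
        worst = np.argmin(live_logl)
        log_x = -i / n_live                              # expected shrinkage per iteration
        log_w = np.log(np.exp(log_x_prev) - np.exp(log_x))
        log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
        # replace the worst live point with a prior draw above the likelihood floor
        while True:
            candidate = rng.random(2)
            cand_logl = log_likelihood(candidate)
            if cand_logl > live_logl[worst]:
                break
        live[worst], live_logl[worst] = candidate, cand_logl
        log_x_prev = log_x
    # add the contribution of the remaining live points
    log_z = np.logaddexp(log_z, np.log(np.mean(np.exp(live_logl - live_logl.max())))
                         + live_logl.max() + log_x_prev)
    return log_z

print("log-evidence estimate:", nested_sampling())
```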