
Large-scale Structures revealed by Wavelet Decomposition

Submitted by Jesus Pando
Publication date: 1997
Research field: Physics
Paper language: English





We present a detailed review of large-scale structure (LSS) studies using the discrete wavelet transform (DWT). After describing how one constructs a wavelet decomposition, we show how this basis can be used as a complete statistical description of LSS. Among the topics studied are the DWT estimation of the probability distribution function; the reconstruction of the power spectrum; the regularization of complex geometry in observational samples; cluster identification; extraction and identification of coherent structures; and the scale decomposition of non-Gaussianity, such as spectra of skewness and kurtosis and scale-scale correlations. These methods are applied to both observational and simulated samples of the QSO Lyman-alpha forests. It is clearly demonstrated that the statistical measures developed using the DWT are needed to distinguish between competing models of structure formation. The DWT also reveals physical features in these distributions not detected before. We conclude with a look towards the future of the use of the DWT in LSS studies.
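As a concrete illustration of the scale-decomposed non-Gaussianity statistics the review covers, here is a minimal Python sketch (not the authors' code) that decomposes a synthetic one-dimensional density field with PyWavelets and computes one skewness and kurtosis value per wavelet scale. The Gaussian test field, the 'db4' wavelet, and the six-level decomposition are all illustrative assumptions.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(0)
delta = rng.standard_normal(4096)  # stand-in for a 1D density-contrast field

# Decompose into wavelet coefficients; coeffs = [cA6, cD6, cD5, ..., cD1],
# i.e. one coarse approximation plus detail bands from coarse to fine.
coeffs = pywt.wavedec(delta, 'db4', level=6)

# One skewness/kurtosis value per scale: a scale-decomposed non-Gaussianity
# "spectrum". For a Gaussian field both should hover near zero at every level.
for level, detail in zip(range(6, 0, -1), coeffs[1:]):
    print(f"level {level}: skewness={skew(detail):+.3f}, "
          f"kurtosis={kurtosis(detail):+.3f}")
```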


Read also

The proper orthogonal decomposition (POD) is a powerful classical tool in fluid mechanics used, for instance, for model reduction and extraction of coherent flow features. However, its applicability to high-resolution data, as produced by three-dimensional direct numerical simulations, is limited owing to its computational complexity. Here, we propose a wavelet-based adaptive version of the POD (the wPOD) to overcome this limitation. The amount of data to be analyzed is reduced by compressing it using biorthogonal wavelets, yielding a sparse representation while conveniently providing control of the compression error. Numerical analysis shows how the distinct error contributions of wavelet compression and POD truncation can be balanced under certain assumptions, allowing us to efficiently process high-resolution data from three-dimensional simulations of flow problems. Using a synthetic academic test case, we compare our algorithm with the randomized singular value decomposition. Furthermore, we demonstrate the ability of our method by analyzing data of a 2D wake flow and of a 3D flow generated by a flapping insect, both computed with direct numerical simulation.
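The following is a hedged sketch of the wPOD idea under simple assumptions: each snapshot is compressed by hard-thresholding its biorthogonal wavelet coefficients, and the POD is then computed as an SVD of the compressed snapshot matrix. The random snapshots, the 'bior4.4' wavelet, and the 5% retention ratio are illustrative choices, not the paper's.

```python
import numpy as np
import pywt

def wavelet_compress(field, wavelet='bior4.4', keep=0.05):
    """Keep only the largest `keep` fraction of wavelet coefficients."""
    coeffs = pywt.wavedec2(field, wavelet)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr[np.abs(arr) < thresh] = 0.0  # hard thresholding ("shrinkage")
    return pywt.waverec2(
        pywt.array_to_coeffs(arr, slices, output_format='wavedec2'), wavelet)

rng = np.random.default_rng(1)
snapshots = [rng.standard_normal((64, 64)) for _ in range(20)]  # toy "flow" data
X = np.stack([wavelet_compress(s).ravel() for s in snapshots], axis=1)

# POD modes are the left singular vectors of the mean-subtracted snapshot matrix.
X -= X.mean(axis=1, keepdims=True)
modes, sing_vals, _ = np.linalg.svd(X, full_matrices=False)
print("leading POD energy fractions:", (sing_vals**2 / sing_vals.sum()**0)[:0] if False
      else (sing_vals**2 / np.sum(sing_vals**2))[:5])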
L. Van Waerbeke, 2003
We review all the cosmic shear results obtained so far, with a critical discussion of the present strengths and weaknesses. We discuss the future prospects and the role cosmic shear could play in a precision cosmology era.
We propose a large-scale hologram calculation using WAvelet ShrinkAge-Based superpositIon (WASABI), a wavelet transform-based algorithm. An image-type hologram calculated using the WASABI method is printed on a glass substrate with a resolution of $65{,}536 \times 65{,}536$ pixels and a pixel pitch of $1\,\mu\mathrm{m}$. The hologram calculation time amounts to approximately 354 s on a commercial CPU, which is approximately 30 times faster than conventional methods.
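The paper's pipeline is not reproduced here, but the premise behind wavelet shrinkage can be illustrated: a point source's zone-plate contribution to the hologram is sparse in a wavelet basis, so most coefficients can be discarded before superposing many object points. All optical parameters below are made up for this toy.

```python
import numpy as np
import pywt

n, pitch, wavelength, z = 256, 1e-6, 532e-9, 5e-3     # toy optical parameters
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] * pitch
zone_plate = np.cos(np.pi * (x**2 + y**2) / (wavelength * z))

coeff_arr, _ = pywt.coeffs_to_array(pywt.wavedec2(zone_plate, 'db4'))
thresh = np.quantile(np.abs(coeff_arr), 0.99)          # keep the top 1 percent
kept = np.count_nonzero(np.abs(coeff_arr) >= thresh)
print(f"kept {kept} of {coeff_arr.size} coefficients "
      f"({100 * kept / coeff_arr.size:.1f}%) per point source")
```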
J. Pando, P. Lipa, M. Greiner, 1997
It is of fundamental importance to determine if and how hierarchical clustering is involved in large-scale structure formation of the universe. Hierarchical evolution is characterized by rules that specify how dark matter halos are formed by the merging of halos at smaller scales. We show that scale-scale correlations of the matter density field are direct and sensitive measures with which to quantify this merging tree. Such correlations are most conveniently determined from discrete wavelet transforms. Analyzing two samples of Ly-alpha forests in QSO absorption spectra, we find significant scale-scale correlations whose dependence is typical for a branching process. Therefore, models which predict a history-independent evolution are ruled out, and the halos hosting the Ly-alpha clouds must have gone through a history-dependent merging process during their formation.
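A minimal sketch of a scale-scale correlation estimator in this spirit follows; the normalization and the parent-child coefficient alignment used by the paper are simplified here, and a random test field stands in for a Ly-alpha flux sample.

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)
field = rng.standard_normal(8192)              # stand-in for a Ly-alpha flux field

coeffs = pywt.wavedec(field, 'db2', level=8)
details = coeffs[1:][::-1]                     # detail bands reordered fine -> coarse

for j in range(len(details) - 1):
    fine, coarse = details[j], details[j + 1]
    # Approximate dyadic parent mapping: each coarse mode covers two fine modes.
    parent = np.repeat(coarse, 2)[:len(fine)]
    num = np.mean(fine**2 * parent**2)
    den = np.mean(fine**2) * np.mean(parent**2)
    # A ratio near 1 means no scale-scale correlation (as expected for this
    # Gaussian field); a branching merging history would push it above 1.
    print(f"C2(level {j}, level {j + 1}) = {num / den:.3f}")
```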
Tensor decomposition is a well-known tool for multiway data analysis. This work proposes using stochastic gradients for efficient generalized canonical polyadic (GCP) tensor decomposition of large-scale tensors. GCP tensor decomposition is a recently proposed version of tensor decomposition that allows for a variety of loss functions such as Bernoulli loss for binary data or Huber loss for robust estimation. The stochastic gradient is formed from randomly sampled elements of the tensor and is efficient because it can be computed using the sparse matricized-tensor-times-Khatri-Rao product (MTTKRP) tensor kernel. For dense tensors, we simply use uniform sampling. For sparse tensors, we propose two types of stratified sampling that give precedence to sampling nonzeros. Numerical results demonstrate the advantages of the proposed approach and its scalability to large-scale problems.
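Below is a hedged NumPy sketch of such a stochastic gradient step for a dense third-order tensor, using a least-squares elementwise loss (GCP would substitute, e.g., a Bernoulli or Huber loss derivative) and uniform element sampling; the stratified sampling for sparse tensors is omitted. Tensor sizes, rank, batch size, and step size are arbitrary toy values.

```python
import numpy as np

rng = np.random.default_rng(3)
I, J, K, R, batch, lr = 30, 40, 50, 5, 2000, 2e-3

# Hypothetical data: a dense tensor with exact rank-R structure.
At, Bt, Ct = (rng.standard_normal((d, R)) for d in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)

A, B, C = (0.1 * rng.standard_normal((d, R)) for d in (I, J, K))

for step in range(500):
    # Uniformly sample a batch of tensor entries.
    i, j, k = (rng.integers(0, d, batch) for d in (I, J, K))
    m = np.sum(A[i] * B[j] * C[k], axis=1)     # model values at sampled entries
    d_loss = 2.0 * (m - X[i, j, k])            # elementwise least-squares derivative

    # Sampled MTTKRP-style gradients: scatter-add per sampled entry.
    gA = np.zeros_like(A); gB = np.zeros_like(B); gC = np.zeros_like(C)
    np.add.at(gA, i, d_loss[:, None] * (B[j] * C[k]))
    np.add.at(gB, j, d_loss[:, None] * (A[i] * C[k]))
    np.add.at(gC, k, d_loss[:, None] * (A[i] * B[j]))
    A -= lr * gA; B -= lr * gB; C -= lr * gC

    if step % 100 == 0:
        print(f"step {step}: sampled MSE = {np.mean((m - X[i, j, k])**2):.3f}")
```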