
Astrophysical Supercomputing with GPUs: Critical Decisions for Early Adopters

Published by Christopher Fluke
Publication date: 2010
Research field: Physics
Paper language: English

General purpose computing on graphics processing units (GPGPU) is dramatically changing the landscape of high performance computing in astronomy. In this paper, we identify and investigate several key decision areas, with the goal of simplifying the early adoption of GPGPU in astronomy. We consider the merits of OpenCL as an open standard in order to reduce the risks associated with coding in a native, vendor-specific programming environment, and present a GPU programming philosophy based on using brute-force solutions. We assert that effective use of new GPU-based supercomputing facilities will require a change in approach from astronomers. This will likely include improved programming training, an increased need for software development best practice through the use of profiling and related optimisation tools, and a greater reliance on third-party code libraries. As with any new technology, those willing to take the risks, and to make the investment of time and effort to become early adopters of GPGPU in astronomy, stand to reap great benefits.
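To make the brute-force philosophy concrete, the sketch below computes all-pairs angular separations for a source catalogue as dense array operations, the kind of massively parallel O(N^2) workload that maps directly onto GPU hardware. This is a hypothetical illustration, not code from the paper; swapping `numpy` for the NumPy-compatible `cupy` is one common route to running the same code on a GPU.

```python
import numpy as np
# import cupy as np  # one way to run the same code on a CUDA GPU

def all_pairs_separation(ra, dec):
    """Angular separation (radians) between every pair of sources.

    ra, dec : 1-D arrays of coordinates in radians.
    Returns an (N, N) matrix -- no trees, no pruning, just brute force.
    """
    # Convert sky coordinates to unit vectors on the sphere.
    xyz = np.stack([np.cos(dec) * np.cos(ra),
                    np.cos(dec) * np.sin(ra),
                    np.sin(dec)], axis=1)       # shape (N, 3)
    # All pairwise dot products in one dense matrix multiply.
    dots = np.clip(xyz @ xyz.T, -1.0, 1.0)
    return np.arccos(dots)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    ra = rng.uniform(0.0, 2.0 * np.pi, 4096)
    dec = np.arcsin(rng.uniform(-1.0, 1.0, 4096))  # uniform on the sphere
    print(all_pairs_separation(ra, dec).shape)     # (4096, 4096)
```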


Read also

S. Ord (2009)
The MWA is a next-generation radio interferometer under construction in remote Western Australia. The data rate from the correlator makes storing the raw data infeasible, so the data must be processed in real-time. The processing task is of order ~10 TFLOPS. The remote location of the MWA limits the power that can be allocated to computing. We describe the design and implementation of elements of the MWA real-time data processing system which leverage the computing abilities of modern graphics processing units (GPUs). The matrix algebra and texture mapping capabilities of GPUs are well suited to the majority of tasks involved in real-time calibration and imaging. Considerable performance advantages over a conventional CPU-based reference implementation are obtained.
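To illustrate the matrix-algebra point, here is a toy sketch of the cross-multiply-and-accumulate ("X") stage of a software correlator; on a GPU this reduces to a dense complex matrix multiply. The shapes and names are assumptions for illustration, not the MWA implementation.

```python
import numpy as np

def correlate(voltages):
    """Accumulate visibilities from channelised antenna voltages.

    voltages : complex array of shape (n_times, n_antennas)
               for a single frequency channel.
    Returns the (n_antennas, n_antennas) visibility matrix
    V[i, j] = sum over t of v_i(t) * conj(v_j(t)).
    """
    # One Hermitian outer-product accumulation per channel; on a GPU
    # this maps onto a batched complex matrix multiply.
    return voltages.T @ voltages.conj()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_times, n_ant = 10_000, 128
    v = rng.normal(size=(n_times, n_ant)) + 1j * rng.normal(size=(n_times, n_ant))
    print(correlate(v).shape)  # (128, 128)
```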
To date, the most precise estimations of the critical exponent for the Anderson transition have been made using the transfer matrix method. This method involves the simulation of extremely long quasi-one-dimensional systems. The method is inherently serial and is not well suited to modern massively parallel supercomputers. The obvious alternative is to simulate a large ensemble of hypercubic systems and average. While this permits taking full advantage of both OpenMP and MPI on massively parallel supercomputers, a straightforward implementation results in data that does not scale. We show that this problem can be avoided by generating random sets of orthogonal starting vectors with an appropriate stationary probability distribution. We have applied this method to the Anderson transition in the three-dimensional orthogonal universality class and have been able to increase the largest $L \times L$ cross section simulated from $L=24$ (New J. Physics, 16, 015012 (2014)) to $L=64$ here. This permits an estimation of the critical exponent with improved precision and without the necessity of introducing an irrelevant scaling variable. In addition, this approach is better suited to simulations with correlated random potentials, such as is needed in quantum Hall or cold atom systems.
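The key ingredient, a random set of orthogonal starting vectors, can be sketched as follows: QR-decomposing a Gaussian random matrix yields orthonormal columns distributed uniformly over orthogonal frames. This illustrates only the orthogonalisation step; the paper's specific stationary probability distribution is not reproduced here.

```python
import numpy as np

def random_orthonormal_vectors(dim, n_vectors, rng):
    """Return a (dim, n_vectors) matrix with orthonormal columns."""
    g = rng.normal(size=(dim, n_vectors))
    q, r = np.linalg.qr(g)
    # Fixing the signs makes the distribution exactly Haar-uniform.
    return q * np.sign(np.diag(r))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    q = random_orthonormal_vectors(dim=64 * 64, n_vectors=16, rng=rng)
    assert np.allclose(q.T @ q, np.eye(16), atol=1e-10)  # Q^T Q = I
```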
We discuss fits of unconventional dark energy models to the available data from high-redshift supernovae, distant galaxies and baryon oscillations. The models are based either on brane cosmologies or on Liouville strings, in which a relaxation dark energy is provided by a rolling dilaton field (Q-cosmology). Such cosmologies feature the possibility of effective four-dimensional negative-energy dust and/or exotic scaling of dark matter. We find evidence for a negative-energy dust at the current era, as well as for exotic-scaling ($a^{-\delta}$) contributions to the energy density, with $\delta \simeq 4$, which could be due to dark matter coupling with the dilaton in Q-cosmology models. We conclude that Q-cosmology fits the data as well as the $\Lambda$CDM model for a range of parameters that are in general expected from theoretical considerations.
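For concreteness, a flat expansion history with an exotic component scaling as $a^{-\delta} = (1+z)^{\delta}$ can be parameterised as in the sketch below. The density values are placeholders rather than the paper's best-fit parameters, and the negative-energy dust component is omitted for simplicity.

```python
import numpy as np

def hubble_ratio(z, omega_m=0.3, omega_x=0.05, delta=4.0):
    """E(z) = H(z)/H0 for a flat model with an exotic a^(-delta) term.

    omega_x scales as (1+z)^delta; omega_lambda fills in flatness.
    Placeholder densities, not fitted values.
    """
    omega_lambda = 1.0 - omega_m - omega_x
    return np.sqrt(omega_m * (1 + z) ** 3
                   + omega_x * (1 + z) ** delta
                   + omega_lambda)

if __name__ == "__main__":
    print(hubble_ratio(np.linspace(0.0, 2.0, 5)))
```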
The science investigation teams (SITs) for the WFIRST coronagraphic instrument have begun studying the capabilities of the instrument to directly image light reflected from exoplanets at contrasts down to ~10^-9 with respect to the stellar flux. Detection of point sources at these high contrasts requires yield estimates and detailed modeling of the image of the planetary system as it propagates through the telescope optics. While the SITs might generate custom astrophysical scenes, the integrated model, propagated through the internal speckle field, is typically done at JPL. In this white paper, we present a standard file format to ensure a single distribution system between those who produce the raw astrophysical scenes and the JPL modelers who incorporate those scenes into their optical modeling. At its core, our custom file format uses FITS files, and it incorporates standards on packaging astrophysical scenes. This includes spectral and astrometric information for planetary and stellar point sources, zodiacal light and extragalactic sources that may appear as contaminants. Adhering to such a uniform data distribution format is necessary, as it ensures a seamless workflow between the SITs and the modelers at JPL for the goal of understanding the limits of the WFIRST coronagraphic instrument.
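A hedged sketch of what such a FITS-based scene package might look like, using astropy.io.fits; the HDU names, columns and units here are invented for illustration, since the actual format is the one specified by the white paper.

```python
import numpy as np
from astropy.io import fits

n_src, n_wave = 3, 64
wavelengths = np.linspace(0.4, 1.0, n_wave)  # microns (assumed unit)
rng = np.random.default_rng(7)

# One row per point source: astrometry plus a sampled spectrum.
columns = [
    fits.Column(name="RA", format="D", unit="deg",
                array=rng.uniform(0, 360, n_src)),
    fits.Column(name="DEC", format="D", unit="deg",
                array=rng.uniform(-90, 90, n_src)),
    fits.Column(name="FLUX", format=f"{n_wave}D", unit="Jy",
                array=rng.uniform(0, 1e-9, (n_src, n_wave))),
]

primary = fits.PrimaryHDU()
primary.header["ORIGIN"] = "SIT scene (illustrative)"
sources = fits.BinTableHDU.from_columns(columns, name="SOURCES")
waves = fits.ImageHDU(wavelengths, name="WAVELENGTH")
fits.HDUList([primary, sources, waves]).writeto("scene.fits", overwrite=True)
```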
Torsten Enßlin (2014)
Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. In particular, we discuss how IFT alleviates the perception threshold for recovering signals of unknown correlation structure, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas; here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
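The simplest concrete instance of IFT signal recovery is the Wiener filter, the optimal estimator for a linear, Gaussian measurement d = R s + n; it exploits the signal's spatial correlations through the prior covariance S. The toy below is a dense one-dimensional example, not the nonlinear algorithms the paper discusses.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 128  # pixels in the signal domain

# Correlated Gaussian signal prior: covariance falls off with distance.
i = np.arange(n)
S = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 8.0) ** 2) + 1e-6 * np.eye(n)
R = np.eye(n)[::4]            # response: observe every fourth pixel
N = 0.1 * np.eye(R.shape[0])  # noise covariance

# Draw a correlated signal and a noisy data realisation d = R s + n.
s = np.linalg.cholesky(S) @ rng.normal(size=n)
d = R @ s + rng.multivariate_normal(np.zeros(R.shape[0]), N)

# Wiener filter: m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d
N_inv = np.linalg.inv(N)
D_inv = np.linalg.inv(S) + R.T @ N_inv @ R
m = np.linalg.solve(D_inv, R.T @ N_inv @ d)
print(float(np.mean((m - s) ** 2)))  # mean-squared reconstruction error
```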