We present a high-performance, graphics processing unit (GPU)-based framework for the efficient analysis and visualization of (nearly) terabyte (TB)-sized 3-dimensional images. Using a cluster of 96 GPUs, we demonstrate for a 0.5 TB image: (1) volume rendering using an arbitrary transfer function at 7--10 frames per second; (2) computation of basic global image statistics such as the mean intensity and standard deviation in 1.7 s; (3) evaluation of the image histogram in 4 s; and (4) evaluation of the global image median intensity in just 45 s. Our measured results correspond to a raw computational throughput approaching one teravoxel per second, and are 10--100 times faster than the best possible performance with traditional single-node, multi-core CPU implementations. A scalability analysis shows the framework will scale well to images sized 1 TB and beyond. Other parallel data analysis algorithms can be added to the framework with relative ease, and accordingly, we present our framework as a possible solution to the image analysis and visualization requirements of next-generation telescopes, including the forthcoming Square Kilometre Array pathfinder radiotelescopes.
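The global statistics described above lend themselves to a distributed reduction: each node computes partial sums over its sub-volume, and the partials are merged into global results. A minimal single-machine sketch in Python, with NumPy chunks standing in for per-GPU sub-volumes (the chunk count and data below are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def partial_stats(chunk):
    # Per-node pass: accumulate count, sum, and sum of squares in float64.
    return (chunk.size,
            chunk.sum(dtype=np.float64),
            np.square(chunk, dtype=np.float64).sum())

def combine(partials):
    # Reduction step: merge per-node partials into a global mean and
    # (population) standard deviation.
    n = sum(p[0] for p in partials)
    s = sum(p[1] for p in partials)
    ss = sum(p[2] for p in partials)
    mean = s / n
    std = np.sqrt(ss / n - mean * mean)
    return mean, std

rng = np.random.default_rng(0)
volume = rng.normal(10.0, 2.0, size=1_000_000).astype(np.float32)
chunks = np.array_split(volume, 96)  # one chunk per "GPU"
mean, std = combine([partial_stats(c) for c in chunks])
```

Mean and standard deviation need only one pass over the data in this form; the histogram is similarly a per-chunk count followed by a merge, whereas an exact global median requires the iterative selection that accounts for its longer runtime in the figures above.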
Visualisation and analysis of terabyte-scale data cubes, as will be produced with the Australian Square Kilometre Array Pathfinder (ASKAP), will pose challenges for existing astronomy software and the work practices of astronomers. Focusing on the proposed outcomes of WALLABY (Widefield ASKAP L-Band Legacy All-Sky Blind Survey), and using lessons learnt from HIPASS (HI Parkes All Sky Survey), we identify issues that astronomers will face with WALLABY data cubes. We comment on potential research directions and possible solutions to these challenges.
General purpose computing on graphics processing units (GPGPU) is dramatically changing the landscape of high performance computing in astronomy. In this paper, we identify and investigate several key decision areas, with a goal of simplifying the early adoption of GPGPU in astronomy. We consider the merits of OpenCL as an open standard in order to reduce risks associated with coding in a native, vendor-specific programming environment, and present a GPU programming philosophy based on using brute force solutions. We assert that effective use of new GPU-based supercomputing facilities will require a change in approach from astronomers. This will likely include improved programming training, an increased need for software development best practice through the use of profiling and related optimisation tools, and a greater reliance on third-party code libraries. As with any new technology, those willing to take the risks, and make the investment of time and effort to become early adopters of GPGPU in astronomy, stand to reap great benefits.
We investigate the neutral hydrogen (HI) content of sixteen groups for which we have multi-wavelength data including X-ray observations. Wide-field imaging of the groups was obtained with the 20-cm multibeam system on the 64-m Parkes telescope. We have detected ten previously uncatalogued HI sources, one of which has no visible optical counterpart. We examine the HI properties of the groups, compared to their X-ray characteristics, finding that those groups with a higher X-ray temperature and luminosity contain less HI per galaxy. The HI content of a group depends on its morphological make-up, with those groups dominated by early-type galaxies containing the least total HI. We determined the expected HI for the spiral galaxies in the groups, and found that a number of the galaxies were HI deficient. The HI deficient spirals were found both in groups with and without a hot intra-group medium. The HI deficient galaxies were not necessarily found at the centre of the groups; however, we did find that two thirds of HI deficient galaxies lie within about 1 Mpc of the group centre, indicating that the group environment is affecting the gas-loss from these galaxies. We determined the HI mass function for a composite sample of 15 groups, and found that it is significantly flatter than the field HI mass function. We also find a lack of high HI-mass galaxies in groups. One possible cause of this effect is the tidal stripping of HI gas from spiral galaxies as they are pre-processed in groups.
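HI deficiency, as referred to above, is conventionally quantified as the logarithmic shortfall of a galaxy's observed HI mass relative to the value expected for a comparable field galaxy. A minimal sketch of that comparison (the masses and the 0.3 dex threshold below are illustrative conventions, not values from this survey):

```python
import math

def hi_deficiency(m_hi_observed, m_hi_expected):
    # DEF = log10(expected HI mass) - log10(observed HI mass).
    # DEF >= ~0.3 dex (a factor-of-two shortfall) is a common
    # working threshold for calling a galaxy HI deficient.
    return math.log10(m_hi_expected) - math.log10(m_hi_observed)

# A galaxy holding half the HI expected for its type and size
# (solar masses; made-up illustrative values):
deficiency = hi_deficiency(m_hi_observed=5e8, m_hi_expected=1e9)
deficient = deficiency >= 0.3
```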
This review examines progress on the Pop I, fundamental-mode Cepheid distance scale with emphasis on recent developments in geometric and quasi-geometric techniques for Cepheid distance determination. Specifically, I examine the surface brightness method, interferometric pulsation method, and trigonometric measurements. The three techniques are found to be in excellent agreement for distance measures in the Galaxy. The velocity p-factor is of crucial importance in the first two of these methods. A comparison of recent determinations of the p-factor for Cepheids demonstrates that observational measures of p and theoretical predictions agree within their uncertainties for Galactic Cepheids.
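The pulsation-based methods above share a common geometric core: the p-factor converts the observed radial-velocity curve into a linear radius displacement, and matching that displacement to the measured angular-diameter variation yields the distance. A simplified sketch of those two steps (the integration scheme, sign convention, and sample numbers are illustrative, not taken from the review):

```python
AU_KM = 1.495978707e8  # kilometres per astronomical unit

def radius_displacement_au(times_s, v_rad_km_s, p_factor):
    # Linear radius change from the integrated radial-velocity curve,
    # scaled by the projection (p-)factor; trapezoidal integration.
    # Sign convention here: positive radial velocity = contraction.
    dr_km = 0.0
    out = [0.0]
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        dr_km += -p_factor * 0.5 * (v_rad_km_s[i] + v_rad_km_s[i - 1]) * dt
        out.append(dr_km / AU_KM)
    return out

def distance_pc(delta_r_au, delta_theta_arcsec):
    # Matching the linear amplitude 2*dR (diameter change) to the
    # angular-diameter amplitude gives the distance, using the fact
    # that 1 AU subtends 1 arcsecond at 1 parsec.
    return 2.0 * delta_r_au / delta_theta_arcsec
```

The sensitivity of the derived distance to the adopted p-factor is direct and linear, which is why the agreement between observed and theoretical p values noted above matters so much.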
We demonstrate how interactive, three-dimensional (3-d) scientific visualizations can be efficiently interchanged among a variety of media. Through the use of an appropriate interchange format, and a unified interaction interface, we minimize the effort to produce visualizations appropriate for undertaking knowledge discovery at the astronomer's desktop, as part of conference presentations, in digital publications or as Web content. We use examples from cosmological visualization to address some of the issues of interchange, and to describe our approach to adapting S2PLOT desktop visualizations to the Web. Supporting demonstrations are available at http://astronomy.swin.edu.au/s2plot/interchange/
D.W. Longcope & G. Barnes (2008)
Coronal magnetic field may be characterized by how its field lines interconnect regions of opposing photospheric flux -- its connectivity. Connectivity can be quantified as the net flux connecting pairs of opposing regions, once such regions are identified. One existing algorithm will partition a typical active region into a number of unipolar regions ranging from a few dozen to a few hundred, depending on algorithmic parameters. This work explores how the properties of the partitions depend on some algorithmic parameters, and how connectivity depends on the coarseness of partitioning for one particular active region magnetogram. We find that the number of connections among the regions scales with the number of regions, even as the number of possible connections scales with its square. There are several methods of generating a coronal field, even a potential field. The field may be computed inside conducting boundaries or over an infinite half-space. For computation of connectivity, the unipolar regions may be replaced by point sources or the exact magnetogram may be used as a lower boundary condition. Our investigation shows that the connectivities from these various fields differ only slightly -- no more than 15%. The greatest difference is between fields within conducting walls and those in the half-space. Their connectivities grow more different as finer partitioning creates more source regions. This also gives a quantitative means of establishing how far away conducting boundaries must be placed in order not to significantly affect the extrapolation. For identical outer boundaries, the use of point sources instead of the exact magnetogram makes a smaller difference in connectivity: typically 6%, independent of the number of source regions.
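Once field lines have been traced between partitioned regions, the connectivity described above reduces to an accumulation: the net flux linking each positive-negative region pair. A toy sketch of that bookkeeping (the footpoint labels and flux values are invented, and this is not the authors' code):

```python
import numpy as np

def connectivity_matrix(pos_labels, neg_labels, flux, n_pos, n_neg):
    # psi[p, n] = net flux carried by traced field lines that start in
    # positive region p and end in negative region n.
    psi = np.zeros((n_pos, n_neg))
    for p, n, f in zip(pos_labels, neg_labels, flux):
        psi[p, n] += f
    return psi

# Five traced field lines among 3 positive and 2 negative regions
# (illustrative footpoint mapping and fluxes):
psi = connectivity_matrix(pos_labels=[0, 0, 1, 1, 2],
                          neg_labels=[0, 1, 1, 1, 0],
                          flux=[1.0, 0.5, 2.0, 0.5, 1.5],
                          n_pos=3, n_neg=2)
```

Counting the nonzero entries of such a matrix against the n_pos * n_neg possible pairs is one direct way to see the scaling noted above: realized connections grow roughly with the number of regions while possible pairs grow with its square.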