
Testing the effect of resolution on gravitational fragmentation with Lagrangian hydrodynamic schemes

Posted by Takashi Okamoto
Publication date: 2021
Research field: Physics
Paper language: English





To study the resolution required for simulating gravitational fragmentation with the newly developed Lagrangian hydrodynamic schemes, the Meshless Finite Volume (MFV) and Meshless Finite Mass (MFM) methods, we have performed a number of simulations of the Jeans test and compared the results with both the expected analytic solution and results from the more standard Lagrangian approach: Smoothed Particle Hydrodynamics (SPH). We find that the different schemes converge to the analytic solution when the diameter of a fluid element is smaller than a quarter of the Jeans wavelength, $\lambda_\mathrm{J}$. Among the three schemes, SPH/MFV shows the fastest/slowest convergence to the analytic solution. Unlike the well-known behaviour of Eulerian schemes, none of the Lagrangian schemes investigated displays artificial fragmentation when the perturbation wavelength, $\lambda$, is shorter than $\lambda_\mathrm{J}$, even at low numerical resolution. For larger wavelengths ($\lambda > \lambda_\mathrm{J}$) the growth of the perturbation is delayed when it is not well resolved. Furthermore, with poor resolution, the fragmentation seen with the MFV scheme proceeds very differently from the converged solution. All these results suggest that, when unresolved, the ratio of the magnitude of the hydrodynamic force to that of self-gravity at the sub-resolution scale is the largest/smallest in MFV/SPH, the reasons for which we discuss in detail. These tests are repeated to investigate the effect of kernels of higher order than the fiducial cubic spline. Our results indicate that the standard deviation of the kernel is a more appropriate definition of the size of a fluid element than its compact support radius.
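As a rough illustration of the resolution criterion quoted above, the sketch below evaluates the Jeans wavelength $\lambda_\mathrm{J} = c_s \sqrt{\pi / (G \rho)}$ for an isothermal gas and checks whether a fluid element of a given smoothing length satisfies the "diameter smaller than $\lambda_\mathrm{J}/4$" condition. This is not the authors' code: the function names, the ratio between kernel standard deviation and smoothing length, and the example numbers are illustrative assumptions only.

import numpy as np

G = 6.674e-8  # Newton's gravitational constant in cgs units [cm^3 g^-1 s^-2]

def jeans_wavelength(c_s, rho):
    # Jeans wavelength for an isothermal gas: lambda_J = c_s * sqrt(pi / (G * rho))
    return c_s * np.sqrt(np.pi / (G * rho))

def element_diameter(h, sigma_over_h=0.55):
    # Illustrative: measure the fluid-element size by twice the kernel standard
    # deviation (the paper argues the standard deviation is a better size measure
    # than the compact support radius). The ratio 0.55 is a placeholder, not the
    # paper's calibration for any particular kernel.
    return 2.0 * sigma_over_h * h

def resolves_jeans(h, c_s, rho):
    # Convergence criterion reported in the abstract: the fluid-element diameter
    # must be smaller than a quarter of the Jeans wavelength.
    return element_diameter(h) < 0.25 * jeans_wavelength(c_s, rho)

# Example with molecular-cloud-like numbers (purely illustrative):
c_s = 2.0e4    # sound speed [cm s^-1]
rho = 1.0e-20  # gas density [g cm^-3]
h = 1.0e16     # smoothing length [cm]
print("lambda_J [cm] =", jeans_wavelength(c_s, rho))
print("resolved:", resolves_jeans(h, c_s, rho))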


Read also

108 - N. R. Morgan, B. J. Archer 2021
The intent of this paper is to discuss the history and origins of Lagrangian hydrodynamic methods for simulating shock driven flows. The majority of the pioneering research occurred within the Manhattan Project. A range of Lagrangian hydrodynamic schemes were created between 1943 and 1948 by John von Neumann, Rudolf Peierls, Tony Skyrme, and Robert Richtmyer. These schemes varied significantly from each other; however, they all used a staggered-grid and finite difference approximations of the derivatives in the governing equations, where the first scheme was by von Neumann. These ground-breaking schemes were principally published in Los Alamos laboratory reports that were eventually declassified many decades after authorship, which motivates us to document the work and describe the accompanying history in a paper that is accessible to the broader scientific community. Furthermore, we seek to correct historical omissions on the pivotal contributions made by Peierls and Skyrme to creating robust Lagrangian hydrodynamic methods for simulating shock driven flows. Understanding the history of Lagrangian hydrodynamic methods can help explain the origins of many modern schemes and may inspire the pursuit of new schemes.
69 - Hsi-Yu Schive, Ui-Han Zhang, 2011
We present the implementation and performance of a class of directionally unsplit Riemann-solver-based hydrodynamic schemes on Graphic Processing Units (GPU). These schemes, including the MUSCL-Hancock method, a variant of the MUSCL-Hancock method, and the corner-transport-upwind method, are embedded into the adaptive-mesh-refinement (AMR) code GAMER. Furthermore, a hybrid MPI/OpenMP model is investigated, which enables the full exploitation of the computing power in a heterogeneous CPU/GPU cluster and significantly improves the overall performance. Performance benchmarks are conducted on the Dirac GPU cluster at NERSC/LBNL using up to 32 Tesla C2050 GPUs. A single GPU achieves speed-ups of 101(25) and 84(22) for uniform-mesh and AMR simulations, respectively, as compared with the performance using one(four) CPU core(s), and the excellent performance persists in multi-GPU tests. In addition, we make a direct comparison between GAMER and the widely-adopted CPU code Athena (Stone et al. 2008) in adiabatic hydrodynamic tests and demonstrate that, with the same accuracy, GAMER is able to achieve two orders of magnitude performance speed-up.
The RadioAstron satellite in principle admits testing the gravitational redshift effect with an accuracy of better than $10^{-5}$. This would surpass the result of the Gravity Probe A mission by at least an order of magnitude. However, RadioAstron's communications and frequency transfer systems are not adapted for a direct application of the non-relativistic Doppler and troposphere compensation scheme used in the Gravity Probe A experiment. This leads to degradation of the redshift test accuracy to approximately the 0.01 level. We discuss the way to overcome this difficulty and present preliminary results based on data obtained during special observing sessions scheduled for testing the new techniques.
Advanced LIGO and the next generation of ground-based detectors aim to capture many more binary coalescences through improving sensitivity and duty cycle. Earthquakes have always been a limiting factor at low frequency where neither the pendulum suspension nor the active controls provide sufficient isolation to the test mass mirrors. Several control strategies have been proposed to reduce the impact of teleseismic events by switching to a robust configuration with less aggressive feedback. The continental United States has witnessed a huge increase in the number of induced earthquake events primarily associated with hydraulic fracking-related waste water re-injection. Effects from these differ from teleseismic earthquakes primarily because of their depth, which is in turn linked to their triggering mechanism. In this paper, we discuss the impact caused by these low magnitude regional earthquakes and explore ways to minimize the impact of induced seismicity on the detector.
Aims. A transient astrophysical event observed in both gravitational wave (GW) and electromagnetic (EM) channels would yield rich scientific rewards. A first program initiating EM follow-ups to possible transient GW events has been developed and exercised by the LIGO and Virgo community in association with several partners. In this paper, we describe and evaluate the methods used to promptly identify and localize GW event candidates and to request images of targeted sky locations. Methods. During two observing periods (Dec 17 2009 to Jan 8 2010 and Sep 2 to Oct 20 2010), a low-latency analysis pipeline was used to identify GW event candidates and to reconstruct maps of possible sky locations. A catalog of nearby galaxies and Milky Way globular clusters was used to select the most promising sky positions to be imaged, and this directional information was delivered to EM observatories with time lags of about thirty minutes. A Monte Carlo simulation has been used to evaluate the low-latency GW pipeline's ability to reconstruct source positions correctly. Results. For signals near the detection threshold, our low-latency algorithms often localized simulated GW burst signals to tens of square degrees, while neutron star/neutron star inspirals and neutron star/black hole inspirals were localized to a few hundred square degrees. Localization precision improves for moderately stronger signals. The correct sky location of signals well above threshold and originating from nearby galaxies may be observed with ~50% or better probability with a few pointings of wide-field telescopes.
