
How to make a mature accreting magnetar

Publication date: 2017
Field: Physics
Language: English
Authors: A.P. Igoshev





Several candidates for accreting magnetars have been proposed recently by different authors. The existence of such systems contradicts the standard magnetic field decay scenario, in which a large magnetic field of a neutron star decays to $\lesssim$ a few $\times 10^{13}$ G at ages $\gtrsim 1$ Myr. Among other sources, the high-mass X-ray binary 4U0114+65 seems to have a strong magnetic field of around $10^{14}$ G. We develop a new Bayesian estimate for the kinematic age and demonstrate that 4U0114+65 has a kinematic age of 2.4-5 Myr (95% credible interval) since the formation of the neutron star. We discuss which conditions are necessary to explain the potential existence of magnetars in accreting high-mass binaries with ages of a few Myr and larger. Three ingredients are necessary: the Hall attractor to prevent rapid decay of the dipolar field, relatively rapid cooling of the crust in order to avoid Ohmic decay due to phonons, and finally, low values of the impurity parameter $Q$ to obtain a long Ohmic time scale due to impurities. If the age and magnetic field estimates for the proposed accreting magnetars are correct, then these systems set the strongest limit on crust impurity for a selected sample of neutron stars and provide evidence in favour of the Hall attractor.
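As a back-of-the-envelope illustration (not the paper's own calculation), the contrast between the standard decay scenario and the three ingredients above can be seen in a common phenomenological parametrisation of crustal field decay that combines Hall and Ohmic timescales; all timescale values below are assumptions chosen only to contrast the two regimes.

    # Illustrative sketch, not the paper's model: phenomenological crustal field decay
    #   B(t) = B0 exp(-t/tau_Ohm) / [1 + (tau_Ohm/tau_Hall)(1 - exp(-t/tau_Ohm))],
    # as often used in population-synthesis work. Timescales below are assumed values.
    import numpy as np

    def B_of_t(t_Myr, B0=1e14, tau_Ohm=100.0, tau_Hall=100.0):
        """Dipole field in G after t_Myr; timescales in Myr."""
        e = np.exp(-t_Myr / tau_Ohm)
        return B0 * e / (1.0 + (tau_Ohm / tau_Hall) * (1.0 - e))

    t = 5.0  # Myr, comparable to the kinematic age inferred for 4U0114+65
    # Standard scenario: fast Hall drift plus phonon-dominated Ohmic decay.
    print(f"fast decay: B(5 Myr) ~ {B_of_t(t, tau_Ohm=1.0, tau_Hall=1.0):.1e} G")
    # Hall attractor reached and low impurity parameter Q: both timescales long.
    print(f"slow decay: B(5 Myr) ~ {B_of_t(t):.1e} G")

With the short timescales the field falls well below $10^{13}$ G by 5 Myr, whereas with both timescales long a $\sim 10^{14}$ G field survives, which is the qualitative point behind the three ingredients listed above.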



Related research

This is a story about making quantum computers speak, and doing so in a quantum-native, compositional and meaning-aware manner. Recently we did question-answering with an actual quantum computer. We explain what we did, stress that this was all done in terms of pictures, and provide many pointers to the related literature. In fact, besides natural language, many other things can be implemented in a quantum-native, compositional and meaning-aware manner, and we provide the reader with some indications of that broader pictorial landscape, including our account on the notion of compositionality. We also provide some guidance for the actual execution, so that the reader can give it a go as well.
Extremely strong magnetic fields of the order of $10^{15}\,{\rm G}$ are required to explain the properties of magnetars, the most magnetic neutron stars. Such a strong magnetic field is expected to play an important role in the dynamics of core-collapse supernovae and, in the presence of rapid rotation, may power superluminous supernovae and hypernovae associated with long gamma-ray bursts. The origin of these strong magnetic fields remains, however, obscure and most likely requires an amplification over many orders of magnitude in the protoneutron star. One of the most promising agents is the magnetorotational instability (MRI), which can in principle amplify a weak initial magnetic field exponentially fast to a dynamically relevant strength. We describe our current understanding of the MRI in protoneutron stars and show recent results on its dependence on physical conditions specific to protoneutron stars, such as neutrino radiation, strong buoyancy effects and large magnetic Prandtl number.
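As a rough, assumed-numbers illustration of why exponential MRI growth can matter dynamically: reaching magnetar-strength fields from a much weaker seed requires only a modest number of e-foldings when the growth rate is of order the rotation rate.

    # Order-of-magnitude sketch (assumed numbers, not results quoted in the abstract):
    # the MRI grows exponentially at a rate of order the rotation rate Omega,
    # so only ~10 e-foldings are needed to amplify a weak seed field.
    import numpy as np

    B_seed, B_target = 1e10, 1e15        # G (assumed seed and target field strengths)
    P = 3e-3                             # s, protoneutron-star spin period (assumed)
    gamma = 0.5 * 2.0 * np.pi / P        # 1/s, growth rate ~ Omega/2 (order of magnitude)

    n_efold = np.log(B_target / B_seed)
    t_amp = n_efold / gamma
    print(f"{n_efold:.1f} e-foldings, roughly {t_amp * 1e3:.0f} ms of exponential growth")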
New text as data techniques offer a great promise: the ability to inductively discover measures that are useful for testing social science theories of interest from large collections of text. We introduce a conceptual framework for making causal inferences with discovered measures as a treatment or outcome. Our framework enables researchers to discover high-dimensional textual interventions and estimate the ways that observed treatments affect text-based outcomes. We argue that nearly all text-based causal inferences depend upon a latent representation of the text and we provide a framework to learn the latent representation. But estimating this latent representation, we show, creates new risks: we may introduce an identification problem or overfit. To address these risks we describe a split-sample framework and apply it to estimate causal effects from an experiment on immigration attitudes and a study on bureaucratic response. Our work provides a rigorous foundation for text-based causal inferences.
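A minimal sketch of the split-sample idea on synthetic data is given below; the corpus, the treatment indicator and the topic-model measure are placeholders for illustration, not the authors' framework.

    # Split-sample sketch: discover a text-based measure on one half of the data,
    # estimate the treatment effect on the other half, to avoid overfitting the
    # latent representation to the estimation sample. All data are synthetic.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    docs = ["border security jobs economy"] * 50 + ["family asylum refugee children"] * 50
    treated = rng.integers(0, 2, size=len(docs))   # hypothetical treatment indicator

    # 1) Split: fit the measure-discovery model on the discovery half only.
    docs_disc, docs_est, t_disc, t_est = train_test_split(
        docs, treated, test_size=0.5, random_state=0)
    vec = CountVectorizer()
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(vec.fit_transform(docs_disc))

    # 2) Estimate: project the held-out half onto the discovered topics and
    #    compare the text-based outcome between treated and control documents.
    theta_est = lda.transform(vec.transform(docs_est))
    outcome = theta_est[:, 0]                      # topic-0 prevalence as the outcome
    ate = outcome[t_est == 1].mean() - outcome[t_est == 0].mean()
    print(f"difference in topic-0 prevalence (treated - control): {ate:.3f}")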
The Transformer architecture has revolutionized deep learning on sequential data, becoming ubiquitous in state-of-the-art solutions for a wide variety of applications. Yet vanilla Transformers are notoriously resource-expensive, requiring $O(L^2)$ in serial time and memory as functions of input length $L$. Recent works proposed various linear self-attention mechanisms, scaling only as $O(L)$ for serial computation. We perform a thorough analysis of recent Transformer mechanisms with linear self-attention, Performers, in terms of overall computational complexity. We observe a remarkable computational flexibility: forward and backward propagation can be performed with no approximations using sublinear memory as a function of $L$ (in addition to negligible storage for the input sequence), at a cost of greater time complexity in the parallel setting. In the extreme case, a Performer consumes only $O(1)$ memory during training, and still requires $O(L)$ time. This discovered time-memory tradeoff can be used for training or, due to complete backward-compatibility, for fine-tuning on a low-memory device, e.g. a smartphone or an earlier-generation GPU, thus contributing towards decentralized and democratized deep learning.
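A minimal numpy sketch of the underlying idea follows: causal linear attention computed token by token with a constant-size running state. The feature map and dimensions are placeholders, not the Performer's FAVOR+ construction.

    # Causal linear attention with O(1) state: only the running sums
    # S = sum_j phi(k_j) v_j^T and z = sum_j phi(k_j) are carried between steps.
    import numpy as np

    def phi(x):
        return np.maximum(x, 0.0) + 1e-6   # simple positive feature map (assumption)

    rng = np.random.default_rng(0)
    L, d = 16, 8
    Q, K, V = rng.normal(size=(3, L, d))

    S = np.zeros((d, d))    # running sum of phi(k_j) v_j^T
    z = np.zeros(d)         # running normaliser sum of phi(k_j)
    outputs = []
    for i in range(L):      # serial loop: memory carried across steps is independent of L
        S += np.outer(phi(K[i]), V[i])
        z += phi(K[i])
        q = phi(Q[i])
        outputs.append(q @ S / (q @ z))
    Y = np.stack(outputs)
    print(Y.shape)          # (L, d)

Only S and z (of sizes d x d and d) are kept between steps, so the working memory does not grow with L, at the price of a fully serial loop; this is the kind of time-memory tradeoff described above.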
We present here the first convincing observational manifestation of a magnetar-like magnetic field in an accreting neutron star in a binary system: the first pulsating ultra-luminous X-ray source, X-2 in the galaxy M82. Using Chandra X-ray observatory data we show that the source exhibits a bimodal distribution of luminosity with two well-defined peaks separated by a factor of 40. This behaviour can be interpreted as the action of the propeller regime of accretion. The onset of the propeller in a 1.37 s pulsar at a luminosity of ~$10^{40}$ erg/s implies a dipole component of the neutron star magnetic field of ~$10^{14}$ G.
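For intuition, the quoted field estimate can be roughly reproduced from a commonly used propeller-onset relation, $L_{\rm prop} \simeq 4\times10^{37}\, k^{7/2} B_{12}^2 P^{-7/3} M_{1.4}^{-2/3} R_6^5$ erg/s; the coefficient $k$ and the canonical neutron-star mass and radius below are assumptions.

    # Rough inversion of a standard propeller-onset relation for a 1.4 Msun, 10 km
    # neutron star:  L_prop ~ 4e37 * k**3.5 * B_12**2 * P**(-7/3) erg/s.
    L_prop = 1e40          # erg/s, luminosity at which the pulsar enters the propeller
    P = 1.37               # s, spin period of M82 X-2
    k = 0.5                # magnetospheric-radius factor (assumed; k = 1 is the Alfven radius)
    B12 = (L_prop / (4e37 * k**3.5 * P**(-7.0 / 3.0))) ** 0.5
    print(f"implied dipole field: {B12 * 1e12:.1e} G")   # of order 1e14 G

The result is of order $10^{14}$ G, consistent with the dipole field quoted in the abstract.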
