Solar flares are large-scale releases of energy in the solar atmosphere, characterised by rapid changes in the hydrodynamic properties of plasma from the photosphere to the corona. Solar physicists have typically attempted to understand these complex events using a combination of theoretical models and observational data. From a statistical perspective, there are many challenges associated with making accurate and statistically significant comparisons between theory and observations, due primarily to the large number of free parameters in physical models. This class of ill-posed statistical problem is ideally suited to Bayesian methods. In this paper, the solar flare studied by Raftery et al. (2009) is reanalysed using a Bayesian framework. This enables us to study the evolution of the flare's temperature, emission measure and energy loss in a statistically self-consistent manner. Bayesian model selection techniques indicate that no decision can be made as to whether conductive or non-thermal beam heating plays the more important role in heating the flare plasma during the impulsive phase of this event.
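Bayesian model selection of this kind rests on comparing marginal likelihoods (evidences) of competing models. The sketch below is a minimal, hypothetical illustration with grid-approximated evidences for two toy one-parameter models on synthetic data; the data, models, and prior ranges are all assumptions for illustration, not the flare observations or heating models of the paper.

```python
import numpy as np

def log_evidence(t, y, sigma, model, grid):
    """Grid-approximate the log marginal likelihood (evidence) of a
    one-parameter model under a flat prior over `grid`."""
    logls = np.array([-0.5 * np.sum(((y - model(t, a)) / sigma) ** 2)
                      for a in grid])
    m = logls.max()
    # log of the prior-weighted average likelihood (log-mean-exp)
    return m + np.log(np.mean(np.exp(logls - m)))

# Synthetic light curve: a linear rise plus Gaussian noise (toy data only)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
y = 2.0 * t + rng.normal(0.0, 0.1, t.size)

linear = lambda t, a: a * t        # stand-in for one heating model
quadratic = lambda t, a: a * t**2  # stand-in for a competing model

grid = np.linspace(0.5, 4.0, 200)
log_bf = (log_evidence(t, y, 0.1, linear, grid)
          - log_evidence(t, y, 0.1, quadratic, grid))
print(f"log Bayes factor (linear vs quadratic): {log_bf:.1f}")
```

A large positive log Bayes factor favours the first model; values near zero correspond to the indecisive outcome reported in the abstract.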
All three components of the current density are required to compute the heating rate due to free magnetic energy dissipation. Here we present a first test of a new model developed to determine whether the times of increases in the resistive heating rate in active region (AR) photospheres are correlated with the subsequent occurrence of M and X flares in the corona. A data-driven, 3-D, non-force-free magnetohydrodynamic model restricted to the near-photospheric region is used to compute time series of the complete current density and the resistive heating rate per unit volume $(Q(t))$ in each pixel in neutral line regions (NLRs) of 14 ARs. The model is driven by time series of the magnetic field $\mathbf{B}$ measured by the Helioseismic and Magnetic Imager on the Solar Dynamics Observatory (SDO) satellite. Spurious Doppler periods due to SDO orbital motion are filtered out of the time series for $\mathbf{B}$ in every AR pixel. For each AR, the cumulative distribution function (CDF) of the values of the NLR area integral $Q_i(t)$ of $Q(t)$ is found to be a scale-invariant power-law distribution essentially identical to the observed CDF for the total energy released in coronal flares. This suggests that coronal flares and the photospheric $Q_i$ are correlated and powered by the same process. The model predicts spikes in $Q_i$ with values orders of magnitude above background values. These spikes are driven by spikes in the non-force-free component of the current density. The times of these spikes are plausibly correlated with the times of subsequent M or X flares occurring a few hours to a few days later. The spikes occur on granulation scales, and may be signatures of heating in horizontal current sheets. It is also found that the times of relatively large values of the rate of change of the NLR unsigned magnetic flux are plausibly correlated with the times of subsequent M and X flares, and with spikes in $Q_i$.
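The scale-invariance claim amounts to saying that the distribution of heating-rate values has a power-law tail. A minimal sketch of how such a tail can be checked, using synthetic Pareto-distributed values as a stand-in for the $Q_i$ time series (the index, sample size, and cutoff are illustrative assumptions):

```python
import numpy as np

# Hypothetical stand-in for the distribution of NLR-integrated heating
# rates Q_i: draws from a classical Pareto law with index alpha = 1.5.
rng = np.random.default_rng(1)
alpha_true = 1.5
q_min = 1.0
q = (rng.pareto(alpha_true, 10_000) + 1.0) * q_min  # support on [q_min, inf)

# Maximum-likelihood (Hill) estimate of the power-law index above q_min
alpha_hat = q.size / np.sum(np.log(q / q_min))

# Empirical complementary CDF; for a power law it falls as q**(-alpha),
# i.e. a straight line of slope -alpha on log-log axes.
q_sorted = np.sort(q)
ccdf = 1.0 - np.arange(1, q.size + 1) / q.size
print(f"estimated power-law index: {alpha_hat:.2f}")
```

In practice one would plot `q_sorted` against `ccdf` on log-log axes and compare the fitted slope with that of the flare energy CDF.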
With the unprecedented photometric precision of the Kepler spacecraft, significant systematic and stochastic errors on transit signal levels are observable in the Kepler photometric data. These errors, which include discontinuities, outliers, systematic trends and other instrumental signatures, obscure astrophysical signals. The Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline tries to remove these errors while preserving planet transits and other astrophysically interesting signals. The completely new noise and stellar variability regime observed in Kepler data poses a significant problem for standard cotrending methods such as SYSREM and TFA. Variable stars are often of particular astrophysical interest, so the preservation of their signals is of significant importance to the astrophysical community. We present a Bayesian maximum a posteriori (MAP) approach in which a subset of highly correlated and quiet stars is used to generate a cotrending basis vector set, which is in turn used to establish a range of reasonable robust fit parameters. These robust fit parameters are then used to generate a Bayesian prior and a Bayesian posterior probability distribution function (PDF) which, when maximized, yields the best fit that simultaneously removes systematic effects while reducing the signal distortion and noise injection that commonly afflict simple least-squares (LS) fitting. A numerical and empirical approach is taken in which the Bayesian prior PDFs are generated from fits to the light-curve distributions themselves.
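When both the likelihood and the prior are taken as Gaussian, the MAP cotrending fit has a closed form: a least-squares solution shrunk toward the prior mean. The sketch below illustrates this on a synthetic light curve; the basis vectors, prior values, noise levels, and "transit" shape are all assumptions for illustration, not Kepler's actual basis vectors or PDC's full prior construction.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
# Two toy cotrending basis vectors: an oscillation and a drift
basis = np.vstack([np.sin(np.linspace(0, 4 * np.pi, n)),
                   np.linspace(-1.0, 1.0, n)]).T
true_coeffs = np.array([1.0, 0.5])
# A toy astrophysical signal (box-shaped dips) that must be preserved
transit = np.where((np.arange(n) % 100) < 5, -0.3, 0.0)
flux = basis @ true_coeffs + transit + rng.normal(0, 0.05, n)

# Gaussian prior on the coefficients; in PDC-MAP this is built from
# robust fits to an ensemble of quiet neighbouring stars (assumed here)
prior_mean = np.array([0.9, 0.6])
prior_var = 0.05 ** 2
noise_var = 0.05 ** 2

# MAP estimate: ridge-like normal equations shrunk toward prior_mean
A = basis.T @ basis / noise_var + np.eye(2) / prior_var
b = basis.T @ flux / noise_var + prior_mean / prior_var
map_coeffs = np.linalg.solve(A, b)
corrected = flux - basis @ map_coeffs
print("MAP coefficients:", np.round(map_coeffs, 3))
```

Because the transit dips are poorly described by the basis vectors and the prior anchors the coefficients, the systematics are removed while the dips survive in `corrected`.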
A common problem in ultra-high energy cosmic ray physics is the comparison of energy spectra. The question is whether the spectra from two experiments or two regions of the sky agree within their statistical and systematic uncertainties. We develop a method to directly compare energy spectra for ultra-high energy cosmic rays from two different regions of the sky in the same experiment, without reliance on agreement with a theoretical model of the energy spectra. The consistency between the two spectra is expressed in terms of a Bayes factor, defined here as the ratio of the likelihood of the two-parent source hypothesis to the likelihood of the one-parent source hypothesis. Unlike other methods, for example $\chi^2$ tests, the Bayes factor allows for the calculation of the posterior odds ratio and correctly accounts for non-Gaussian uncertainties. The latter is particularly important at the highest energies, where the number of events is very small.
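For Poisson bin counts the two evidences can be written in closed form with conjugate Gamma priors. The sketch below is a simplified illustration of the two-parent versus one-parent comparison, assuming equal exposures and vague Gamma(1, $\beta$) priors on each bin's flux; the paper's actual likelihoods and uncertainty treatment are not reproduced here.

```python
from math import lgamma, log

def log_bayes_factor(n1, n2, beta=0.01):
    """Log Bayes factor for two spectra given as lists of Poisson bin
    counts: two independent parent sources vs one shared source.
    Assumes equal exposures and vague Gamma(1, beta) priors."""
    total = 0.0
    for a, b in zip(n1, n2):
        # Evidence under two independent Poisson rates (one per region):
        log_two = 2 * log(beta) - (a + b + 2) * log(1 + beta)
        # Evidence under a single shared rate observed by both regions:
        log_one = (log(beta) + lgamma(a + b + 1) - lgamma(a + 1)
                   - lgamma(b + 1) - (a + b + 1) * log(2 + beta))
        total += log_two - log_one
    return total

# Similar spectra: the one-source hypothesis wins (negative log BF)
same = log_bayes_factor([50, 30, 10], [48, 32, 11])
# Very different spectra: the two-source hypothesis wins (positive log BF)
diff = log_bayes_factor([50, 30, 10], [5, 60, 40])
print(f"similar spectra: {same:.1f}, different spectra: {diff:.1f}")
```

Multiplying the Bayes factor by the prior odds of the two hypotheses gives the posterior odds ratio mentioned in the abstract.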
We introduce a hybrid approach to solar flare prediction, whereby a supervised regularization method is used to rank feature importance and an unsupervised clustering method is used to make the binary flare/no-flare decision. The approach is validated against NOAA SWPC data.
In this paper, a Bayesian semiparametric copula approach is used to model the underlying multivariate distribution $F_{true}$. First, a Dirichlet process is constructed on the unknown marginal distributions of $F_{true}$. Then a Gaussian copula model is utilized to capture the dependence structure of $F_{true}$. The result is a Bayesian multivariate normality test that combines the relative belief ratio with the energy distance. Several theoretical results for the approach are derived. Finally, through several simulated examples and a real data set, the proposed approach is shown to deliver excellent performance.
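The energy distance at the heart of the test statistic can be computed directly from samples; the Dirichlet-process marginals, Gaussian copula, and relative belief ratio are beyond a few lines. A sketch of the sample energy distance, with illustrative sample sizes and distributions:

```python
import numpy as np

def energy_distance(x, y):
    """Sample energy distance between two multivariate samples
    (rows are observations): 2 E||X-Y|| - E||X-X'|| - E||Y-Y'||."""
    def mean_dist(a, b):
        d = a[:, None, :] - b[None, :, :]
        return np.sqrt((d ** 2).sum(axis=-1)).mean()
    return 2.0 * mean_dist(x, y) - mean_dist(x, x) - mean_dist(y, y)

rng = np.random.default_rng(4)
reference = rng.normal(size=(500, 2))           # draws from the normal null
normal_sample = rng.normal(size=(500, 2))       # data that really are normal
skewed_sample = rng.exponential(size=(500, 2))  # clearly non-normal data

d_null = energy_distance(normal_sample, reference)
d_alt = energy_distance(skewed_sample, reference)
print(f"normal vs normal: {d_null:.3f}, exponential vs normal: {d_alt:.3f}")
```

The statistic is near zero when both samples come from the same distribution and grows as the distributions diverge, which is what makes it a usable discrepancy measure inside a normality test.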