
Automatic Detection of Occulted Hard X-ray Flares Using Deep-Learning Methods

Posted by: Shin-Nosuke Ishikawa
Publication date: 2021
Research field: Physics
Paper language: English





We present a concept for a machine-learning classification of hard X-ray (HXR) emissions from solar flares observed by the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI), identifying flares that are either occulted by the solar limb or located on the solar disk. Although HXR observations of occulted flares are important for particle-acceleration studies, HXR data analyses of past observations were time consuming and required specialized expertise. Machine-learning techniques are promising in this situation, and we constructed a sample model to demonstrate the concept using a deep-learning technique. The input data to the model are HXR spectrograms that are easily produced from RHESSI data. The model can detect occulted flares without the need for image reconstruction or visual inspection by experts. A convolutional neural network was used in this model, treating the input data as images. Our model achieved a classification accuracy better than 90%, and the applicability of the method to event screening and to event alerts for occulted flares was successfully demonstrated.
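As a concrete illustration of the kind of model the abstract describes, here is a minimal sketch of a binary CNN classifier for HXR spectrograms in PyTorch. The input dimensions (energy bins x time bins), layer widths, and training details are assumptions for the example only, not the authors' published architecture.

```python
# Minimal sketch (not the authors' exact model): a small CNN that labels an
# HXR spectrogram as occulted (1) or on-disk (0).
import torch
import torch.nn as nn

class SpectrogramClassifier(nn.Module):
    def __init__(self, n_energy_bins=16, n_time_bins=64):   # assumed input size
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        n_flat = 32 * (n_energy_bins // 4) * (n_time_bins // 4)
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_flat, 64), nn.ReLU(),
            nn.Linear(64, 1),              # single logit for the binary decision
        )

    def forward(self, x):                  # x: (batch, 1, energy, time)
        return self.classifier(self.features(x))

# Training would pair this with nn.BCEWithLogitsLoss on labelled RHESSI
# spectrograms; a sigmoid probability above 0.5 flags a candidate occulted flare.
model = SpectrogramClassifier()
dummy_batch = torch.randn(8, 1, 16, 64)    # 8 synthetic spectrograms
print(model(dummy_batch).shape)            # -> torch.Size([8, 1])
```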




Read also

While the evolution of linear initial conditions present in the early universe into extended halos of dark matter at late times can be computed using cosmological simulations, a theoretical understanding of this complex process remains elusive. Here, we build a deep learning framework to learn this non-linear relationship, and develop techniques to physically interpret the learnt mapping. A three-dimensional convolutional neural network (CNN) is trained to predict the mass of dark matter halos from the initial conditions. We find no change in the predictive accuracy of the model if we retrain the model removing anisotropic information from the inputs. This suggests that the features learnt by the CNN are equivalent to spherical averages over the initial conditions. Our results indicate that interpretable deep learning frameworks can provide a powerful tool for extracting insight into cosmological structure formation.
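A hedged sketch of this kind of framework, under illustrative assumptions (a 32^3 initial-density sub-box as input, small channel counts, and log halo mass as the regression target), might look as follows in PyTorch:

```python
# Illustrative 3D CNN regressing a (log) halo mass from an initial-density cube.
# Input size, layer widths and the training setup are assumptions, not the paper's.
import torch
import torch.nn as nn

class HaloMassCNN(nn.Module):
    def __init__(self, box=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                          # 32 -> 16
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                          # 16 -> 8
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * (box // 4) ** 3, 128), nn.ReLU(),
            nn.Linear(128, 1),                        # predicted log halo mass
        )

    def forward(self, delta):                         # delta: (batch, 1, box, box, box)
        return self.head(self.conv(delta))

# The interpretability test described above amounts to retraining the same
# network on spherically averaged (isotropised) inputs and comparing the errors.
dummy = torch.randn(4, 1, 32, 32, 32)
print(HaloMassCNN()(dummy).shape)                     # -> torch.Size([4, 1])
```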
Astronomers require efficient automated detection and classification pipelines when conducting large-scale surveys of the (optical) sky for variable and transient sources. Such pipelines are fundamentally important, as they permit rapid follow-up and analysis of those detections most likely to be of scientific value. We therefore present a deep learning pipeline based on the convolutional neural network architecture called $\texttt{MeerCRAB}$. It is designed to filter out so-called bogus detections from true astrophysical sources in the transient detection pipeline of the MeerLICHT telescope. Optical candidates are described using a variety of 2D images and numerical features extracted from those images. The relationship between the input images and the target classes is unclear, since the ground truth is poorly defined and often the subject of debate. This makes it difficult to determine which source of information should be used to train a classification algorithm. We therefore used two methods for labelling our data: (i) thresholding and (ii) latent class model approaches. We deployed variants of $\texttt{MeerCRAB}$ that employed different network architectures trained using different combinations of input images and training set choices, based on classification labels provided by volunteers. The deepest network worked best, with an accuracy of 99.5% and a Matthews correlation coefficient (MCC) value of 0.989. The best model was integrated into the MeerLICHT transient vetting pipeline, enabling the accurate and efficient classification of detected transients and allowing researchers to select the most promising candidates for their research goals.
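The label-thresholding and MCC evaluation steps mentioned above can be sketched with scikit-learn; the volunteer-agreement cut of 0.8 and all numbers below are hypothetical, not the MeerCRAB pipeline's actual values.

```python
# Hypothetical example of (i) thresholding volunteer votes into labels and
# evaluating a real/bogus classifier with the Matthews correlation coefficient.
import numpy as np
from sklearn.metrics import accuracy_score, matthews_corrcoef

# Fraction of volunteers who voted "real" for each candidate (made-up data).
volunteer_fraction = np.array([0.95, 0.10, 0.85, 0.40, 0.02, 0.90])
labels = (volunteer_fraction >= 0.8).astype(int)     # thresholding label scheme

# Predictions from some trained real/bogus classifier (made-up data).
predictions = np.array([1, 0, 1, 1, 0, 1])

print("accuracy:", accuracy_score(labels, predictions))
print("MCC:     ", matthews_corrcoef(labels, predictions))
```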
Owing to the remarkable photometric precision of space observatories like Kepler, stellar and planetary systems beyond our own are now being characterized en masse for the first time. These characterizations are pivotal for endeavors such as searching for Earth-like planets and solar twins, understanding the mechanisms that govern stellar evolution, and tracing the dynamics of our Galaxy. The volume of data that is becoming available, however, brings with it the need to process this information accurately and rapidly. While existing methods can constrain fundamental stellar parameters such as ages, masses, and radii from these observations, they require substantial computational efforts to do so. We develop a method based on machine learning for rapidly estimating fundamental parameters of main-sequence solar-like stars from classical and asteroseismic observations. We first demonstrate this method on a hare-and-hound exercise and then apply it to the Sun, 16 Cyg A & B, and 34 planet-hosting candidates that have been observed by the Kepler spacecraft. We find that our estimates and their associated uncertainties are comparable to the results of other methods, but with the additional benefit of being able to explore many more stellar parameters while using much less computation time. We furthermore use this method to present evidence for an empirical diffusion-mass relation. Our method is open source and freely available for the community to use. The source code for all analyses and for all figures appearing in this manuscript can be found electronically at https://github.com/earlbellinger/asteroseismology
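One way such a fast estimator can be built is with a regression model trained on a grid of stellar models; the sketch below uses a random-forest regressor and made-up observables (Teff, [Fe/H], large frequency separation, nu_max) purely for illustration and is not the paper's actual pipeline.

```python
# Illustrative regression from classical + asteroseismic observables to stellar
# parameters. Training data here are random placeholders, not a real model grid.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# features: [Teff (K), [Fe/H] (dex), Delta_nu (uHz), nu_max (uHz)]
X_train = rng.uniform([5000, -0.5, 80, 1500], [6500, 0.5, 180, 3500], size=(1000, 4))
# targets: [age (Gyr), mass (Msun), radius (Rsun)]
y_train = rng.uniform([1.0, 0.8, 0.8], [10.0, 1.3, 1.5], size=(1000, 3))

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

star = np.array([[5772, 0.0, 135.1, 3090]])          # roughly solar inputs
age, mass, radius = model.predict(star)[0]
print(f"age={age:.1f} Gyr  mass={mass:.2f} Msun  radius={radius:.2f} Rsun")
```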
There are several supervised machine learning methods used for the automated morphological classification of galaxies; however, there has not yet been a clear comparison of these different methods using imaging data, or an investigation into maximising their effectiveness. We carry out a comparison between several common machine learning methods for galaxy classification (Convolutional Neural Network (CNN), K-nearest neighbour, Logistic Regression, Support Vector Machine, Random Forest, and Neural Networks) using Dark Energy Survey (DES) data combined with visual classifications from the Galaxy Zoo 1 project (GZ1). Our goal is to determine the optimal machine learning method for galaxy classification with imaging data. We show that the CNN is the most successful of the methods in our study. Using a sample of $\sim$2,800 galaxies with visual classifications from GZ1, we reach an accuracy of $\sim$0.99 for the morphological classification of Ellipticals and Spirals. Further investigation of the galaxies that have differing ML and visual classifications but high predicted probabilities in our CNN usually reveals an incorrect classification provided by GZ1. We further find that galaxies with a low probability of being either spirals or ellipticals are visually Lenticulars (S0), demonstrating that supervised learning is able to rediscover that this class of galaxy is distinct from both Es and Spirals. We confirm that $\sim$2.5% of galaxies are misclassified by GZ1 in our study. After correcting these galaxies' labels, we improve our CNN performance to an average accuracy of over 0.99 (an accuracy of 0.994 is our best result).
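The head-to-head comparison described above can be sketched with scikit-learn; the features and labels below are random placeholders standing in for DES imaging features and GZ1 classifications, so the printed accuracies are meaningless except as a template.

```python
# Template for comparing several classifiers on the same (mock) galaxy features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2800, 20))            # stand-in for per-galaxy image features
y = rng.integers(0, 2, size=2800)          # 0 = elliptical, 1 = spiral (mock labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

classifiers = {
    "K-nearest neighbour": KNeighborsClassifier(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Support Vector Machine": SVC(),
    "Random Forest": RandomForestClassifier(),
    "Neural Network": MLPClassifier(max_iter=500),
}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {clf.score(X_te, y_te):.3f}")
# A CNN would instead be trained directly on the galaxy image cutouts.
```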
Iftach Sadeh (2019)
The next generation of observatories will facilitate the discovery of new types of astrophysical transients. The detection of such phenomena, whose characteristics are presently poorly constrained, will hinge on the ability to perform blind searches. We present a new algorithm for this purpose, based on deep learning. We incorporate two approaches, utilising anomaly detection and classification techniques. The first is model-independent, avoiding the use of background modelling and instrument simulations. The second method enables targeted searches, relying on generic spectral and temporal patterns as input. We compare our methodology with the existing approach to serendipitous detection of gamma-ray transients. The algorithm is shown to be more robust, especially for non-trivial spectral features. We use our framework to derive the detection prospects of low-luminosity gamma-ray bursts with the upcoming Cherenkov Telescope Array. Our method is an unbiased, completely data-driven approach for multiwavelength and multi-messenger transient detection.
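The model-independent (anomaly detection) half of such an approach can be sketched as an autoencoder that is trained on quiescent data and flags events with large reconstruction error; the architecture, input length, and 3-sigma cut below are assumptions, not the paper's actual algorithm.

```python
# Illustrative autoencoder-based anomaly detection on binned light curves.
import torch
import torch.nn as nn

class LightCurveAutoencoder(nn.Module):
    def __init__(self, n_bins=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_bins, 32), nn.ReLU(), nn.Linear(32, 8))
        self.decoder = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, n_bins))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = LightCurveAutoencoder()
light_curves = torch.randn(16, 128)                      # stand-in for background data
reconstruction = model(light_curves)
error = ((reconstruction - light_curves) ** 2).mean(dim=1)
is_candidate = error > error.mean() + 3 * error.std()    # simple anomaly cut
print(is_candidate)
# In practice the model is first trained on background-only data so that
# transient-like light curves reconstruct poorly and stand out in `error`.
```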
