Photometric surveys with the Hubble Space Telescope (HST) allow us to study stellar populations with high resolution and deep coverage, with estimates of the physical parameters of the constituent stars being typically obtained by comparing the survey data with adequate stellar evolutionary models. This is a highly non-trivial task due to effects such as differential extinction, photometric errors, low filter coverage, or uncertainties in the stellar evolution calculations. These introduce degeneracies that are difficult to detect and break. To improve this situation, we introduce a novel deep learning approach, called conditional invertible neural network (cINN), to solve the inverse problem of predicting physical parameters from photometry on an individual-star basis and to obtain the full posterior distributions. We build a carefully curated synthetic training data set derived from the PARSEC stellar evolution models to predict stellar age, initial/current mass, luminosity, effective temperature and surface gravity. We perform tests on synthetic data from the MIST and Dartmouth models, and benchmark our approach on HST data of two well-studied stellar clusters, Westerlund 2 and NGC 6397. For the synthetic data we find overall excellent performance, and note that age is the most difficult parameter to constrain. For the benchmark clusters we retrieve reasonable results and confirm previous findings for Westerlund 2 on cluster age ($1.04_{-0.90}^{+8.48}\,\mathrm{Myr}$), mass segregation, and the stellar initial mass function. For NGC 6397 we recover plausible estimates for masses, luminosities and temperatures; however, discrepancies between stellar evolution models and observations prevent an acceptable recovery of age for old stars.
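Since the approach above hinges on the invertibility of the cINN, a minimal sketch may help: an affine coupling block can be inverted in closed form, which is what lets such a network map between parameters and a latent space exactly. The toy scale/translation networks below are illustrative stand-ins, not the authors' architecture.

```python
import numpy as np

def coupling_forward(x, s_net, t_net):
    # Split the input; transform the second half conditioned on the first.
    x1, x2 = np.split(x, 2)
    return np.concatenate([x1, x2 * np.exp(s_net(x1)) + t_net(x1)])

def coupling_inverse(z, s_net, t_net):
    # Exact inverse: undo the affine transform using the untouched half.
    z1, z2 = np.split(z, 2)
    return np.concatenate([z1, (z2 - t_net(z1)) * np.exp(-s_net(z1))])

rng = np.random.default_rng(0)
W_s, W_t = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
s = lambda h: np.tanh(W_s @ h)   # toy scale network
t = lambda h: W_t @ h            # toy translation network

x = rng.normal(size=4)
z = coupling_forward(x, s, t)
x_rec = coupling_inverse(z, s, t)
assert np.allclose(x, x_rec)     # invertibility holds by construction
```

In a full cINN, stacks of such blocks are additionally conditioned on the observed photometry, so sampling the latent space yields posterior samples of the physical parameters.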
The advent of space-based observatories such as CoRoT and Kepler has enabled the testing of our understanding of stellar evolution on thousands of stars. Evolutionary models typically require five input parameters: the mass, initial helium abundance, initial metallicity, mixing-length (assumed to be constant over time), and the age to which the star must be evolved. Some of these parameters are also very useful in characterizing the associated planets and in studying galactic archaeology. How to obtain these parameters from observations rapidly and accurately, specifically in the context of surveys of thousands of stars, is an outstanding question, one that has eluded straightforward resolution. For a given star, we typically measure the effective temperature and surface metallicity spectroscopically and low-degree oscillation frequencies through space observatories. Here we demonstrate that statistical learning, using artificial neural networks, is successful in determining the evolutionary parameters based on spectroscopic and seismic measurements. Our trained networks show robustness over a broad range of parameter space, and critically, are entirely computationally inexpensive and fully automated. We analyze the observations of a few stars using this method and the results compare well to inferences obtained using other techniques. This method is both computationally cheap and inferentially accurate, paving the way for analyzing the vast quantities of stellar observations from past, current, and future missions.
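As a rough illustration of why inference with a trained network is computationally inexpensive, the sketch below shows that prediction reduces to a couple of matrix multiplications. The observable and parameter choices mirror the abstract; the weights are random placeholders standing in for a network trained on grids of evolutionary models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical observables per star: T_eff, [Fe/H], large separation, nu_max.
# Outputs mirror the five evolutionary inputs named above:
# mass, initial helium, initial metallicity, mixing length, age.
n_in, n_hidden, n_out = 4, 16, 5

# In practice these weights come from training; random values here
# only demonstrate the cost of a forward pass.
W1, b1 = rng.normal(size=(n_hidden, n_in)), np.zeros(n_hidden)
W2, b2 = rng.normal(size=(n_out, n_hidden)), np.zeros(n_out)

def predict(obs):
    h = np.tanh(W1 @ obs + b1)   # hidden layer
    return W2 @ h + b2           # evolutionary-parameter estimates

obs = np.array([5777.0, 0.0, 135.1, 3090.0])  # Sun-like toy inputs
params = predict(obs)
assert params.shape == (n_out,)
```

The forward pass is two small matrix products per star, which is why surveys of thousands of stars can be processed essentially instantaneously once training is done.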
Adversarial examples (AEs) are images that can mislead deep neural network (DNN) classifiers by introducing slight perturbations into original images. This security vulnerability has led to vast research in recent years because it can introduce real-world threats into systems that rely on neural networks. Yet, a deep understanding of the characteristics of adversarial examples has remained elusive. We propose a new way of achieving such understanding through a recent development, namely, invertible neural models with Lipschitz-continuous mapping functions from the input to the output. With the ability to invert any latent representation back to its corresponding input image, we can investigate adversarial examples at a deeper level and disentangle an adversarial example's latent representation. Given this new perspective, we propose a fast latent space adversarial example generation method that could accelerate adversarial training. Moreover, this new perspective could contribute to new ways of adversarial example detection.
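For context on how the perturbations mentioned above are usually generated, here is a minimal gradient-sign (FGSM-style) sketch on a toy logistic classifier. This is the standard input-space method, not the latent-space approach the paper proposes; the classifier and all values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy linear "classifier": sigmoid(w . x). For the logistic loss the
# gradient with respect to x is analytic, so no autodiff is needed.
w = rng.normal(size=8)
x = rng.normal(size=8)
y = 1.0  # true label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss(x, y):
    p = sigmoid(w @ x)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def fgsm(x, y, eps):
    # d(loss)/dx = (p - y) * w for the logistic loss above
    grad = (sigmoid(w @ x) - y) * w
    return x + eps * np.sign(grad)  # slight, sign-aligned perturbation

x_adv = fgsm(x, y, eps=0.1)
assert logistic_loss(x_adv, y) > logistic_loss(x, y)  # loss strictly rises
```

An invertible model adds the missing piece: because the map to the latent space is bijective, one can perturb in latent space instead and invert back to a valid image, which is what enables the faster generation the abstract describes.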
We present the CoRoT light curve of the bright B2.5V star HD 48977 observed during a short run of the mission in 2008, as well as a high-resolution spectrum gathered with the HERMES spectrograph at the Mercator telescope. We use several time series analysis tools to explore the nature of the variations present in the light curve. We perform a detailed analysis of the spectrum of the star to determine its fundamental parameters and its element abundances. We find a large number of high-order g-modes, and one rotationally induced frequency. We find stable low-amplitude frequencies in the p-mode regime as well. We conclude that HD 48977 is a new Slowly Pulsating B star with fundamental parameters found to be Teff = 20000 $\pm$ 1000 K and log(g) = 4.2 $\pm$ 0.1. The element abundances are similar to those found for other B stars in the solar neighbourhood. HD 48977 was observed during a short run of the CoRoT satellite implying that the frequency precision is insufficient to perform asteroseismic modelling of the star. Nevertheless, we show that a longer time series of this star would be promising for such modelling. Our present study contributes to a detailed mapping of the instability strips of B stars in view of the dominance of g-mode pulsations in the star, several of which occur in the gravito-inertial regime.
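The kind of frequency extraction described above can be sketched with a Lomb-Scargle periodogram, a standard tool for (possibly unevenly sampled) light curves; the injected frequency, amplitude, and noise level below are arbitrary toy values, not HD 48977 data.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)

# Toy light curve: one g-mode-like sinusoid plus noise, unevenly sampled.
t = np.sort(rng.uniform(0.0, 30.0, size=600))        # time (days)
f_true = 0.8                                         # cycles per day
flux = 1e-3 * np.sin(2 * np.pi * f_true * t) + 1e-4 * rng.normal(size=t.size)

freqs = np.linspace(0.05, 3.0, 3000)                 # trial frequencies (c/d)
power = lombscargle(t, flux, 2 * np.pi * freqs)      # expects angular frequency
f_peak = freqs[np.argmax(power)]
assert abs(f_peak - f_true) < 0.05                   # recovers the input mode
```

The frequency resolution scales as the inverse of the time baseline, which is exactly why the abstract notes that a short CoRoT run limits the precision needed for asteroseismic modelling.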
Locally resonant elastic metamaterials (LREM) can be designed, by optimizing the geometry of the constituent self-repeating unit cells, to potentially damp out vibration in selected frequency ranges, thus yielding desired bandgaps. However, it remains challenging to quickly arrive at unit cell designs that satisfy any requested bandgap specifications within a given global frequency range. This paper develops a computationally efficient framework for (fast) inverse design of LREM, by integrating a new type of machine learning models called invertible neural networks or INN. An INN can be trained to predict the bandgap bounds as a function of the unit cell design, and interestingly at the same time it learns to predict the unit cell design given a bandgap, when executed in reverse. In our case the unit cells are represented in terms of the widths of the outer matrix and middle soft filler layer of the unit cell. Training data on the frequency response of the unit cell is provided by Bloch dispersion analyses. The trained INN is used to instantaneously retrieve feasible (or near feasible) inverse designs given a specified bandgap constraint, which is then used to initialize a forward constrained optimization (based on sequential quadratic programming) to find the bandgap-satisfying unit cell with minimum mass. Case studies show favorable performance of this approach, in terms of the bandgap characteristics and minimized mass, when compared to the median scenario over ten randomly initialized optimizations for the same specified bandgaps. Further analysis using FEA verifies the bandgap performance of a finite structure comprised of an $8\times 8$ arrangement of the unit cells obtained with INN-accelerated inverse design.
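The final step above, minimizing unit-cell mass subject to a bandgap constraint via sequential quadratic programming, can be sketched as follows. The closed-form `bandgap` surrogate and all numbers are invented stand-ins for the Bloch dispersion analysis, and the starting point `w0` plays the role of the INN's inverse prediction.

```python
import numpy as np
from scipy.optimize import minimize

# Toy surrogate: bandgap bounds as a smooth function of the two layer
# widths (outer matrix, soft filler). Purely illustrative closed form.
def bandgap(w):
    lower = 200.0 + 80.0 * w[0] - 30.0 * w[1]
    upper = lower + 150.0 * w[1] / (1.0 + w[0])
    return lower, upper

def mass(w):
    return 2.7 * w[0] + 1.1 * w[1]   # toy density-weighted mass

target = (300.0, 360.0)  # requested bandgap to be covered (illustrative)
cons = [
    {"type": "ineq", "fun": lambda w: target[0] - bandgap(w)[0]},  # starts low enough
    {"type": "ineq", "fun": lambda w: bandgap(w)[1] - target[1]},  # ends high enough
]
w0 = np.array([1.0, 1.0])  # in practice: the INN's inverse design
res = minimize(mass, w0, method="SLSQP", constraints=cons,
               bounds=[(0.1, 5.0), (0.1, 5.0)])

lo, hi = bandgap(res.x)
assert lo <= target[0] + 1e-3 and hi >= target[1] - 1e-3  # bandgap covers target
```

Seeding `w0` with a near-feasible INN prediction rather than a random point is what makes the SQP refinement fast and reliable in the paper's framework.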
Magnetic activity in stars manifests as dark spots on their surfaces that modulate the brightness observed by telescopes. These light curves contain important information on stellar rotation. However, the accurate estimation of rotation periods is computationally expensive due to scarce ground truth information, noisy data, and large parameter spaces that lead to degenerate solutions. We harness the power of deep learning and successfully apply Convolutional Neural Networks to regress stellar rotation periods from Kepler light curves. Geometry-preserving time-series to image transformations of the light curves serve as inputs to a ResNet-18 based architecture which is trained through transfer learning. The McQuillan catalog of published rotation periods is used as a proxy for ground truth. We benchmark the performance of our method against a random forest regressor, a 1D CNN, and the Auto-Correlation Function (ACF) - the current standard to estimate rotation periods. Despite limiting our input to fewer data points (1k), our model yields more accurate results and runs 350 times faster than ACF runs on the same number of data points and 10,000 times faster than ACF runs on 65k data points. With only minimal feature engineering our approach has impressive accuracy, motivating the application of deep learning to regress stellar parameters on an even larger scale.
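The abstract does not name the geometry-preserving transformation it uses; one common choice for turning a 1D light curve into a CNN-ready image is a Gramian Angular Field, sketched below as an assumed example (the toy signal is arbitrary, not Kepler data).

```python
import numpy as np

def gramian_angular_field(x):
    # Rescale to [-1, 1], map each value to an angle, and build the
    # summation field G[i, j] = cos(phi_i + phi_j).
    x = np.asarray(x, dtype=float)
    x_scaled = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x_scaled, -1.0, 1.0))
    return np.cos(phi[:, None] + phi[None, :])

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 128)
flux = np.sin(2 * np.pi * t / 3.0) + 0.05 * rng.normal(size=t.size)  # toy light curve
img = gramian_angular_field(flux)

assert img.shape == (128, 128)   # a square image a ResNet can ingest
assert np.allclose(img, img.T)   # GAF images are symmetric by construction
```

Because the angular encoding preserves temporal ordering along the image diagonal, periodic structure in the light curve appears as repeating texture that a pretrained ResNet-18 can pick up via transfer learning.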