We analyze a data set comprising 370 GW band structures composed of 61716 quasiparticle (QP) energies of two-dimensional (2D) materials spanning 14 crystal structures and 52 elements. The data results from PAW plane-wave based one-shot G$_0$W$_0$@PBE calculations with full frequency integration. We investigate the distribution of key quantities like the QP self-energy corrections and the renormalization factor $Z$ and explore their dependence on chemical composition and magnetic state. The linear QP approximation is identified as a significant error source, and we propose schemes for controlling and drastically reducing this error at low computational cost. We analyze the reliability of the $1/N_\text{PW}$ basis set extrapolation and find that it is well-founded, with narrow distributions of $r^2$ peaked very close to 1. Finally, we explore the validity of the scissors operator approximation, concluding that it is generally not valid for reasonable error tolerances. Our work represents a step towards the development of automatized workflows for high-throughput G$_0$W$_0$ band structure calculations for solids.
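For reference, a standard form of the linearized QP equation and of the plane-wave basis-set extrapolation alluded to above is given below; the notation ($\varepsilon_{n\mathbf{k}}$ for the PBE eigenvalue, $\Sigma$ for the self-energy, $v^{\mathrm{xc}}$ for the exchange-correlation potential, and the fit parameters $E^{\mathrm{QP}}_{\infty}$ and $a$) is assumed here rather than taken from the abstract itself:
$$E^{\mathrm{QP}}_{n\mathbf{k}} \approx \varepsilon_{n\mathbf{k}} + Z_{n\mathbf{k}}\,\mathrm{Re}\!\left[\Sigma_{n\mathbf{k}}(\varepsilon_{n\mathbf{k}}) - v^{\mathrm{xc}}_{n\mathbf{k}}\right], \qquad Z_{n\mathbf{k}} = \left[1 - \left.\frac{\partial\,\mathrm{Re}\,\Sigma_{n\mathbf{k}}(\omega)}{\partial\omega}\right|_{\omega=\varepsilon_{n\mathbf{k}}}\right]^{-1},$$
$$E^{\mathrm{QP}}(N_{\mathrm{PW}}) \approx E^{\mathrm{QP}}_{\infty} + \frac{a}{N_{\mathrm{PW}}}.$$
The first relation is the linear QP approximation whose error is discussed above; the second is the $1/N_\text{PW}$ extrapolation whose fit quality is quantified by $r^2$.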
Machine learning is increasingly recognized as a promising technology in the biological, biomedical, and behavioral sciences. There can be no argument that this technique is incredibly successful in image recognition, with immediate applications in diagnostics including electrophysiology, radiology, and pathology, where we have access to massive amounts of annotated data. However, machine learning often performs poorly in prognosis, especially when dealing with sparse data. This is a field where classical physics-based simulation seems to remain irreplaceable. In this review, we identify areas in the biomedical sciences where machine learning and multiscale modeling can mutually benefit from one another: Machine learning can integrate physics-based knowledge in the form of governing equations, boundary conditions, or constraints to manage ill-posed problems and robustly handle sparse and noisy data; multiscale modeling can integrate machine learning to create surrogate models, identify system dynamics and parameters, analyze sensitivities, and quantify uncertainty to bridge the scales and understand the emergence of function. With a view towards applications in the life sciences, we discuss the state of the art of combining machine learning and multiscale modeling, identify applications and opportunities, raise open questions, and address potential challenges and limitations. We anticipate that this review will stimulate discussion within the community of computational mechanics and reach out to other disciplines including mathematics, statistics, computer science, artificial intelligence, biomedicine, systems biology, and precision medicine to join forces towards creating robust and efficient models for biological systems.
We investigate the effects of multi-task learning using the recently introduced task of semantic tagging. We employ semantic tagging as an auxiliary task for three different NLP tasks: part-of-speech tagging, Universal Dependency parsing, and Natural Language Inference. We compare full neural network sharing, partial neural network sharing, and what we term the "learning what to share" setting, in which negative transfer between tasks is less likely. Our findings show considerable improvements for all tasks, with the "learning what to share" setting yielding the most consistent gains.
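As an illustration of the sharing schemes contrasted above, the following minimal PyTorch sketch shows hard parameter sharing for two sequence-labeling tasks (a shared encoder with task-specific heads); the architecture, layer sizes, and tag-set sizes are placeholders and are not the paper's actual model.

    # Minimal illustrative sketch of hard parameter sharing (assumed architecture,
    # not the paper's exact model): one shared encoder, one output head per task.
    import torch
    import torch.nn as nn

    class SharedTagger(nn.Module):
        def __init__(self, vocab_size, n_pos_tags, n_sem_tags, emb_dim=100, hidden=200):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Shared BiLSTM encoder: all tasks update these weights ("full sharing").
            self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
            # Task-specific output layers; a "partial sharing" variant would also give
            # each task its own upper encoder layer on top of the shared one.
            self.pos_head = nn.Linear(2 * hidden, n_pos_tags)
            self.sem_head = nn.Linear(2 * hidden, n_sem_tags)

        def forward(self, token_ids, task):
            states, _ = self.encoder(self.embed(token_ids))
            head = self.pos_head if task == "pos" else self.sem_head
            return head(states)  # per-token logits for the requested task

    model = SharedTagger(vocab_size=10000, n_pos_tags=17, n_sem_tags=66)
    logits = model(torch.randint(0, 10000, (2, 12)), task="pos")  # shape (2, 12, 17)

In the "learning what to share" setting, the fixed choice of shared versus task-specific parameters would instead be replaced by learned gating between the two.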
Learning problems form an important category of computational tasks that generalizes many of the computations researchers apply to large real-life data sets. We ask: what concept classes can be learned privately, namely, by an algorithm whose output does not depend too heavily on any one input or specific training example? More precisely, we investigate learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in contexts where aggregate information is released about a database containing sensitive information about individuals. We demonstrate that, ignoring computational constraints, it is possible to privately agnostically learn any concept class using a sample size approximately logarithmic in the cardinality of the concept class. Therefore, almost anything learnable is learnable privately: specifically, if a concept class is learnable by a (non-private) algorithm with polynomial sample complexity and output size, then it can be learned privately using a polynomial number of samples. We also present a computationally efficient private PAC learner for the class of parity functions. Local (or randomized response) algorithms are a practical class of private algorithms that have received extensive investigation. We provide a precise characterization of local private learning algorithms. We show that a concept class is learnable by a local algorithm if and only if it is learnable in the statistical query (SQ) model. Finally, we present a separation between the power of interactive and noninteractive local learning algorithms.
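As a concrete illustration of the local model mentioned above, the following short Python sketch implements classic randomized response on a single private bit, the prototypical local mechanism; it is illustrative only and is not the paper's construction.

    # Minimal sketch: eps-differentially-private randomized response on one bit.
    import math, random

    def randomized_response(bit, eps):
        # Report the true bit with probability e^eps / (e^eps + 1), else flip it.
        p_truth = math.exp(eps) / (math.exp(eps) + 1.0)
        return bit if random.random() < p_truth else 1 - bit

    def debiased_mean(reports, eps):
        # Unbiased estimate of the true mean recovered from the noisy local reports.
        p = math.exp(eps) / (math.exp(eps) + 1.0)
        return (sum(reports) / len(reports) - (1 - p)) / (2 * p - 1)

Each participant randomizes locally before sending anything, which is exactly the setting whose learning power the abstract equates with the statistical query model.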
The LIGO/Virgo gravitational-wave (GW) interferometers have to date detected ten merging black hole (BH) binaries, some with masses considerably larger than had been anticipated. Stellar-mass BH binaries at the high end of the observed mass range (with chirp mass $\mathcal{M} \gtrsim 25\,M_{\odot}$) should be detectable by a space-based GW observatory years before those binaries become visible to ground-based GW detectors. This white paper discusses some of the synergies that result when the same binaries are observed by instruments in space and on the ground. We consider intermediate-mass black hole binaries (with total mass $M \sim 10^2 - 10^4\,M_{\odot}$) as well as stellar-mass black hole binaries. We illustrate how combining space-based and ground-based data sets can break degeneracies and thereby improve our understanding of the binary's physical parameters. While early work focused on how space-based observatories can forecast precisely when some mergers will be observed on the ground, the reverse is also important: ground-based detections will allow us to dig deeper into archived, space-based data to confidently identify black hole inspirals whose signal-to-noise ratios were originally sub-threshold, increasing the number of binaries observed in both bands by a factor of $\sim 4 - 7$.
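For reference, the chirp mass used above has its standard definition in terms of the component masses $m_1$ and $m_2$ (a textbook relation, not spelled out in the abstract):
$$\mathcal{M} = \frac{(m_1 m_2)^{3/5}}{(m_1 + m_2)^{1/5}}.$$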
We discuss the features of instabilities in binary systems, in particular for asymmetric nuclear matter. We show their relevance for the interpretation of results obtained in experiments and in ab initio simulations of the reaction $^{124}$Sn$+^{124}$Sn at 50 AMeV.