Random graph models are frequently used as a controllable and versatile data source for experimental campaigns in various research fields. Generating such datasets at scale is a non-trivial task, as it requires design decisions that typically span multiple areas of expertise. Challenges begin with the identification of relevant domain-specific network features, continue with the question of how to compile such features into a tractable model, and culminate in algorithmic details arising while implementing that model. In the present survey, we explore crucial aspects of random graph models with known scalable generators. We begin by briefly introducing network features considered by such models, and then discuss random graphs alongside their generation algorithms. Our focus lies on modelling techniques and algorithmic primitives that have proven successful in obtaining massive graphs. We consider concepts and graph models for various domains (such as social networks, infrastructure, ecology, and numerical simulations), and discuss generators for different models of computation (including shared-memory parallelism, massively parallel GPUs, and distributed systems).
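To make the notion of an algorithmic primitive for scalable generation concrete, the following is a minimal sketch, not taken from the survey itself, of skip-based edge sampling for the Erdős–Rényi G(n, p) model; the function name and parameters are illustrative.

```python
# Sketch of geometric "skip" sampling for G(n, p): instead of testing all
# n*(n-1)/2 candidate edges, jump directly to the next present edge.
import math
import random

def gnp_edges(n: int, p: float, seed: int = 0):
    """Yield the edges of a G(n, p) graph in expected O(n + m) time (0 < p < 1)."""
    rng = random.Random(seed)
    edges = []
    v, w = 1, -1
    lp = math.log(1.0 - p)
    while v < n:
        r = rng.random()
        w = w + 1 + int(math.log(1.0 - r) / lp)  # geometric skip to the next edge
        while w >= v and v < n:                  # carry over into the next row
            w -= v
            v += 1
        if v < n:
            edges.append((v, w))
    return edges

print(len(gnp_edges(10_000, 0.001)))  # roughly p * n * (n - 1) / 2 edges
```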
In recent years, significant advances have been made in the design and analysis of fully dynamic algorithms. However, these theoretical results have received very little attention from a practical perspective. Few of the algorithms are implemented and tested on real datasets, and their practical potential is far from understood. Here, we present a quick reference guide to recent engineering and theory results in the area of fully dynamic graph algorithms.
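As a point of reference for the kind of interface a fully dynamic graph algorithm must support, here is a minimal sketch, not drawn from the guide itself, of a naive baseline for dynamic connectivity; the class and method names are illustrative, and engineered algorithms aim to beat its linear-time queries with polylogarithmic updates and queries.

```python
# Naive fully dynamic connectivity: O(1) edge updates, O(n + m) BFS per query.
from collections import defaultdict, deque

class NaiveDynamicConnectivity:
    def __init__(self):
        self.adj = defaultdict(set)

    def insert_edge(self, u, v):
        self.adj[u].add(v)
        self.adj[v].add(u)

    def delete_edge(self, u, v):
        self.adj[u].discard(v)
        self.adj[v].discard(u)

    def connected(self, u, v):
        seen, queue = {u}, deque([u])
        while queue:
            x = queue.popleft()
            if x == v:
                return True
            for y in self.adj[x]:
                if y not in seen:
                    seen.add(y)
                    queue.append(y)
        return False

dc = NaiveDynamicConnectivity()
dc.insert_edge(1, 2); dc.insert_edge(2, 3)
print(dc.connected(1, 3))   # True
dc.delete_edge(2, 3)
print(dc.connected(1, 3))   # False
```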
Learning in the presence of outliers is a fundamental problem in statistics. Until recently, all known efficient unsupervised learning algorithms were very sensitive to outliers in high dimensions. In particular, even for the task of robust mean estimation under natural distributional assumptions, no efficient algorithm was known. Recent work in theoretical computer science gave the first efficient robust estimators for a number of fundamental statistical tasks, including mean and covariance estimation. Since then, there has been a flurry of research activity on algorithmic high-dimensional robust estimation in a range of settings. In this survey article, we introduce the core ideas and algorithmic techniques in the emerging area of algorithmic high-dimensional robust statistics with a focus on robust mean estimation. We also provide an overview of the approaches that have led to computationally efficient robust estimators for a range of broader statistical tasks and discuss new directions and opportunities for future work.
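To illustrate the flavor of these algorithms, the following is a minimal sketch, not one of the estimators from the survey, of the iterative filtering idea behind many efficient robust mean estimators: repeatedly find the direction of largest variance and discard the points that deviate most along it. The thresholds and parameter names are illustrative assumptions.

```python
import numpy as np

def filtered_mean(X: np.ndarray, eps: float, rounds: int = 20) -> np.ndarray:
    """X: (n, d) samples, an eps-fraction of which may be adversarially corrupted."""
    X = X.copy()
    for _ in range(rounds):
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        evals, evecs = np.linalg.eigh(cov)
        if evals[-1] <= 1 + 10 * eps:                 # covariance already well-behaved
            break
        v = evecs[:, -1]                              # top eigendirection
        scores = ((X - mu) @ v) ** 2                  # deviation along that direction
        X = X[scores < np.quantile(scores, 1 - eps)]  # drop the worst eps-fraction
    return X.mean(axis=0)

# Usage: identity-covariance Gaussian data with 10% shifted outliers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(900, 50)),
               rng.normal(loc=5.0, size=(100, 50))])
print(np.linalg.norm(filtered_mean(X, eps=0.1)))  # far closer to 0 than X.mean(0)
```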
Opinion dynamics have attracted the interest of researchers from different fields. Local interactions among individuals create interesting dynamics for the system as a whole. Such dynamics are important from a variety of perspectives. Group decision making, successful marketing, and constructing networks (in which consensus can be reached or prevented) are a few examples of existing or potential applications. The advent of the Internet has made opinion fusion faster, more unilateral, and possible on a whole different scale. The spread of fake news, propaganda, and election interference has made it clear that there is an essential need to know more about these dynamics. The emergence of new ideas in the field has accelerated over the last few years. In the first quarter of 2020 alone, at least 50 research papers appeared, either peer-reviewed and published or posted on pre-print outlets such as arXiv. In this paper, we summarize these ground-breaking ideas and their fascinating extensions and introduce newly surfaced concepts.
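As one concrete instance of how local interactions drive the system-level dynamics, here is a minimal sketch of the classical Hegselmann–Krause bounded-confidence model; it is offered only as an illustration, not as a model singled out by this paper, and the parameter names are illustrative.

```python
# Bounded-confidence dynamics: each agent repeatedly averages the opinions
# of all agents whose opinion lies within confidence radius eps of its own.
import numpy as np

def hegselmann_krause(opinions: np.ndarray, eps: float, steps: int = 50) -> np.ndarray:
    x = opinions.astype(float).copy()
    for _ in range(steps):
        new_x = np.empty_like(x)
        for i in range(len(x)):
            neighbors = x[np.abs(x - x[i]) <= eps]   # agents close enough to be heard
            new_x[i] = neighbors.mean()
        x = new_x
    return x

rng = np.random.default_rng(1)
final = hegselmann_krause(rng.uniform(0, 1, size=200), eps=0.15)
print(np.unique(np.round(final, 3)))   # opinions collapse into a few clusters
```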
With the commissioning of the LHC expected in 2009, and LHC upgrades expected in 2012, ATLAS and CMS are planning detector upgrades for their innermost layers, which require radiation-hard technologies. Chemical Vapor Deposition (CVD) diamond has been used extensively in beam conditions monitors as the innermost detectors in the highest-radiation areas of BaBar, Belle, and CDF, and is now planned for all LHC experiments. This material is now being considered as an alternative sensor for use very close to the interaction region of the super-LHC, where the most extreme radiation conditions will exist. Recently the RD42 collaboration constructed, irradiated, and tested polycrystalline and single-crystal CVD diamond sensors to the highest fluences available. We present beam test results of CVD diamond up to fluences of 1.8 x 10^16 protons/cm^2, showing that both polycrystalline and single-crystal CVD diamonds follow a single damage curve, allowing one to extrapolate their performance as a function of dose.
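For context, a standard parameterization of such a damage curve in the diamond-detector literature (stated here as an assumption, not quoted from this abstract) relates the charge-collection mean free path lambda to the fluence phi as 1/lambda(phi) = 1/lambda_0 + k*phi, where lambda_0 is the unirradiated mean free path and k a damage constant; fitting k over the measured fluence range is what enables extrapolating performance as a function of dose.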
With the rapid development of recommender systems, accuracy is no longer the only golden criterion for evaluating whether the recommendation results are satisfying. In recent years, diversity has gained tremendous attention in recommender systems research and has been recognized as an important factor for improving user satisfaction. On the one hand, diversified recommendation helps increase the chance of answering ephemeral user needs. On the other hand, diversifying recommendation results can help the business improve product visibility and explore potential user interests. In this paper, we review the recent advances in diversified recommendation. Specifically, we first review the various definitions of diversity and generate a taxonomy to shed light on how diversity has been modeled or measured in recommender systems. After that, we summarize the major optimization approaches to diversified recommendation from a taxonomic view. Last but not least, we project into the future and point out trending research directions on this topic.
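To give one concrete example of how diversity can be measured, here is a minimal sketch of intra-list diversity, the average pairwise distance between the items in a recommendation list; it is one common measure rather than a definition taken from the survey, and the cosine-distance choice and function name are illustrative.

```python
import numpy as np

def intra_list_diversity(item_vectors: np.ndarray) -> float:
    """item_vectors: (k, d) feature vectors of the k recommended items."""
    norms = np.linalg.norm(item_vectors, axis=1, keepdims=True)
    unit = item_vectors / np.clip(norms, 1e-12, None)
    sims = unit @ unit.T                       # pairwise cosine similarities
    k = len(item_vectors)
    off_diag = sims.sum() - np.trace(sims)     # exclude self-similarity
    return 1.0 - off_diag / (k * (k - 1))      # average pairwise cosine distance

rng = np.random.default_rng(2)
print(intra_list_diversity(rng.normal(size=(10, 32))))
```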