
The Theoretical Agenda in CMB Research

Published by: J. Richard Bond
Publication date: 1996
Research field: Physics
Language: English
Author: J. Richard Bond



The terrain that theorists cover in this CMB golden age is described. We ponder early universe physics in quest of the fluctuation generator. We extoll the virtues of inflation and defects. We transport fields, matter and radiation into the linear (primary anisotropies) and nonlinear (secondary anisotropies) regimes. We validate our linear codes to deliver accurate predictions for experimentalists to shoot at. We struggle at the computing edge to push our nonlinear simulations from only illustrative to fully predictive. We are now phenomenologists, optimizing statistical techniques for extracting truths and their errors from current and future experiments. We begin to clean foregrounds. We join CMB experimental teams. We combine the CMB with large scale structure, galaxy and other cosmological observations in search of current concordance. The brave use all topical data. Others carefully craft their prior probabilities to downweight data sets. We are always unbiased. We declare theories sick, dead, ugly. Sometimes we cure them, resurrect them, rarely beautify them. Our goal is to understand how all cosmic structure we see arose and what the Universe is made of, and to use this to discover the laws of ultrahigh energy physics. Theorists are humble, without hubris.




Read also

We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.
In the 21st Century information environment, adversarial actors use disinformation to manipulate public opinion. The distribution of false, misleading, or inaccurate information with the intent to deceive is an existential threat to the United States: distortion of information erodes trust in the socio-political institutions that are the fundamental fabric of democracy: legitimate news sources, scientists, experts, and even fellow citizens. As a result, it becomes difficult for society to come together within a shared reality, the common ground needed to function effectively as an economy and a nation. Computing and communication technologies have facilitated the exchange of information at unprecedented speeds and scales. This has had countless benefits to society and the economy, but it has also played a fundamental role in the rising volume, variety, and velocity of disinformation. Technological advances have created new opportunities for manipulation, influence, and deceit. They have effectively lowered the barriers to reaching large audiences, diminishing the role of traditional mass media along with the editorial oversight they provided. The digitization of information exchange, however, also makes the practices of disinformation detectable, the networks of influence discernable, and suspicious content characterizable. New tools and approaches must be developed to leverage these affordances to understand and address this growing challenge.
Rob van Glabbeek, 2017
Often fairness assumptions need to be made in order to establish liveness properties of distributed systems, but in many situations these lead to false conclusions. This document presents a research agenda aiming at laying the foundations of a theory of concurrency that is equipped to ensure liveness properties of distributed systems without making fairness assumptions. This theory will encompass process algebra, temporal logic and semantic models, as well as treatments of real-time. The agenda also includes developing a methodology that allows successful application of this theory to the specification, analysis and verification of realistic distributed systems, including routing protocols for wireless networks. Contemporary process algebras and temporal logics fail to make distinctions between systems of which one has a crucial liveness property and the other does not, at least when assuming justness, a strong progress property, but not assuming fairness. Setting up an alternative framework involves giving up on identifying strongly bisimilar systems, inventing new induction principles, developing new axiomatic bases for process algebras and new congruence formats for operational semantics, and creating new treatments of time and probability. Even simple systems like fair schedulers or mutual exclusion protocols cannot be accurately specified in standard process algebras (or Petri nets) in the absence of fairness assumptions. Hence the work involves the study of adequate language or model extensions, and their expressive power.
We propose a funding scheme for theoretical research that does not rely on project proposals, but on recent past scientific productivity. Given a quantitative figure of merit on the latter and the total research budget, we introduce a number of policies to decide the allocation of funds in each grant call. Under some assumptions on scientific productivity, some of such policies are shown to converge, in the limit of many grant calls, to a funding configuration that is close to the maximum total productivity of the whole scientific community. We present numerical simulations showing evidence that these schemes would also perform well in the presence of statistical noise in the scientific productivity and/or its evaluation. Finally, we prove that one of our policies cannot be cheated by individual research units. Our work must be understood as a first step towards a mathematical theory of the research activity.
Participation on social media platforms has many benefits but also poses substantial threats. Users often face an unintended loss of privacy, are bombarded with mis-/disinformation, or are trapped in filter bubbles due to over-personalized content. These threats are further exacerbated by the rise of hidden AI-driven algorithms working behind the scenes to shape users' thoughts, attitudes, and behavior. We investigate how multimedia researchers can help tackle these problems to level the playing field for social media users. We perform a comprehensive survey of algorithmic threats on social media and use it as a lens to set a challenging but important research agenda for effective and real-time user nudging. We further implement a conceptual prototype and evaluate it with experts to supplement our research agenda. This paper calls for solutions that combat the algorithmic threats on social media by utilizing machine learning and multimedia content analysis techniques, but in a transparent manner and for the benefit of the users.