
The LSST Data Mining Research Agenda

Posted by Kirk D. Borne
Publication date: 2008
Research field: Physics
Paper language: English





We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.
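To make the alert-brokering idea concrete, here is a minimal sketch (all class names, fields, and thresholds are invented for illustration; this is not the LSST event pipeline interface) of a broker that classifies incoming event alerts and routes them to per-class subscribers:

```python
# Toy alert-classification broker: incoming event alerts are labeled and
# routed to per-class subscriber callbacks. Names and thresholds are
# illustrative, not part of any real LSST interface.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Alert:
    alert_id: int
    ra: float          # right ascension, degrees
    dec: float         # declination, degrees
    mag_change: float  # brightness change between epochs, magnitudes

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)  # class label -> callbacks

    def subscribe(self, label, callback):
        self.subscribers[label].append(callback)

    def classify(self, alert):
        # Stand-in for a trained classifier: a crude magnitude-change rule.
        if abs(alert.mag_change) > 2.0:
            return "transient-candidate"
        return "variable-candidate"

    def publish(self, alert):
        for callback in self.subscribers[self.classify(alert)]:
            callback(alert)

broker = Broker()
broker.subscribe("transient-candidate", lambda a: print("follow up:", a.alert_id))
broker.publish(Alert(alert_id=1, ra=150.1, dec=-2.3, mag_change=2.7))
```

A production broker would replace the threshold rule with trained classifiers and sustain the throughput of thousands of nightly alerts that the abstract anticipates.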




Read also

61 - J. Richard Bond 1996
The terrain that theorists cover in this CMB golden age is described. We ponder early universe physics in quest of the fluctuation generator. We extoll the virtues of inflation and defects. We transport fields, matter and radiation into the linear (primary anisotropies) and nonlinear (secondary anisotropies) regimes. We validate our linear codes to deliver accurate predictions for experimentalists to shoot at. We struggle at the computing edge to push our nonlinear simulations from only illustrative to fully predictive. We are now phenomenologists, optimizing statistical techniques for extracting truths and their errors from current and future experiments. We begin to clean foregrounds. We join CMB experimental teams. We combine the CMB with large scale structure, galaxy and other cosmological observations in search of current concordance. The brave use all topical data. Others carefully craft their prior probabilities to downweight data sets. We are always unbiased. We declare theories sick, dead, ugly. Sometimes we cure them, resurrect them, rarely beautify them. Our goal is to understand how all cosmic structure we see arose and what the Universe is made of, and to use this to discover the laws of ultrahigh energy physics. Theorists are humble, without hubris.
In the 21st Century information environment, adversarial actors use disinformation to manipulate public opinion. The distribution of false, misleading, or inaccurate information with the intent to deceive is an existential threat to the United States: distortion of information erodes trust in the socio-political institutions that are the fundamental fabric of democracy: legitimate news sources, scientists, experts, and even fellow citizens. As a result, it becomes difficult for society to come together within a shared reality, the common ground needed to function effectively as an economy and a nation. Computing and communication technologies have facilitated the exchange of information at unprecedented speeds and scales. This has had countless benefits to society and the economy, but it has also played a fundamental role in the rising volume, variety, and velocity of disinformation. Technological advances have created new opportunities for manipulation, influence, and deceit. They have effectively lowered the barriers to reaching large audiences, diminishing the role of traditional mass media along with the editorial oversight they provided. The digitization of information exchange, however, also makes the practices of disinformation detectable, the networks of influence discernable, and suspicious content characterizable. New tools and approaches must be developed to leverage these affordances to understand and address this growing challenge.
164 - Rob van Glabbeek 2017
Often fairness assumptions need to be made in order to establish liveness properties of distributed systems, but in many situations these lead to false conclusions. This document presents a research agenda aiming at laying the foundations of a theory of concurrency that is equipped to ensure liveness properties of distributed systems without making fairness assumptions. This theory will encompass process algebra, temporal logic and semantic models, as well as treatments of real-time. The agenda also includes developing a methodology that allows successful application of this theory to the specification, analysis and verification of realistic distributed systems, including routing protocols for wireless networks. Contemporary process algebras and temporal logics fail to make distinctions between systems of which one has a crucial liveness property and the other does not, at least when assuming justness, a strong progress property, but not assuming fairness. Setting up an alternative framework involves giving up on identifying strongly bisimilar systems, inventing new induction principles, developing new axiomatic bases for process algebras and new congruence formats for operational semantics, and creating new treatments of time and probability. Even simple systems like fair schedulers or mutual exclusion protocols cannot be accurately specified in standard process algebras (or Petri nets) in the absence of fairness assumptions. Hence the work involves the study of adequate language or model extensions, and their expressive power.
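As a toy illustration of the role fairness plays in liveness arguments (not drawn from the paper; names and numbers are invented), the simulation below contrasts an adversarial scheduler, which always makes progress yet starves process B, with a probabilistically fair one under which B runs infinitely often:

```python
import random

def run(scheduler, steps=1000):
    # Count how often each of two always-enabled processes is scheduled.
    entered = {"A": 0, "B": 0}
    for _ in range(steps):
        entered[scheduler()] += 1
    return entered

# An adversarial scheduler: the system as a whole never halts, but every
# choice is resolved in favour of A, so B is starved and the liveness
# property "B eventually runs" fails.
print(run(lambda: "A"))                        # {'A': 1000, 'B': 0}

# Under a (probabilistically) fair scheduler, B is chosen infinitely
# often with probability 1, so the liveness property holds.
print(run(lambda: random.choice(["A", "B"])))  # roughly even split
```

Fairness assumptions exist precisely to exclude runs like the first one; the agenda above asks how to secure such liveness properties without assuming them.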
The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field, ground-based survey system that will image the sky in six optical bands from 320 to 1050 nm, uniformly covering approximately $18,000$ deg$^2$ of the sky over 800 times. The LSST is currently under construction on Cerro Pachon in Chile and is expected to enter operations in 2022. Once operational, the LSST will explore a wide range of astrophysical questions, from discovering killer asteroids to examining the nature of Dark Energy. The LSST will generate on average 15 TB of data per night, and will require a comprehensive Data Management system to reduce the raw data to scientifically useful catalogs and images with minimum human intervention. These reductions will result in a real-time alert stream, and eleven data releases over the 10-year duration of LSST operations. To enable this processing, the LSST project is developing a new, general-purpose, high-performance, scalable, well-documented, open-source data processing software stack for O/IR surveys. Prototypes of this stack are already capable of processing data from existing cameras (e.g., SDSS, DECam, MegaCam), and form the basis of the Hyper Suprime-Cam (HSC) Survey data reduction pipeline.
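For a flavor of the kind of step such a stack automates, here is a minimal difference-imaging sketch in Python (purely illustrative; it uses none of the actual LSST stack's APIs): subtract a template exposure from a new exposure and flag significant residual pixels, the sort of detection that seeds a real-time alert stream.

```python
import numpy as np

def difference_alerts(new_image, template, threshold=5.0):
    """Toy difference-imaging step: subtract a template exposure from a
    new exposure and flag pixels deviating by more than `threshold`
    standard deviations of the residual. Real pipelines add PSF matching,
    calibration, and source association; none of that is modelled here."""
    diff = new_image - template
    sigma = diff.std()
    ys, xs = np.where(np.abs(diff) > threshold * sigma)
    return [{"x": int(x), "y": int(y), "flux": float(diff[y, x])}
            for y, x in zip(ys, xs)]

rng = np.random.default_rng(0)
template = rng.normal(100.0, 1.0, size=(64, 64))
new = template + rng.normal(0.0, 1.0, size=(64, 64))
new[32, 32] += 50.0  # inject a synthetic transient
print(difference_alerts(new, template))  # one detection at (32, 32)
```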
Participation on social media platforms has many benefits but also poses substantial threats. Users often face an unintended loss of privacy, are bombarded with mis-/disinformation, or are trapped in filter bubbles due to over-personalized content. These threats are further exacerbated by the rise of hidden AI-driven algorithms working behind the scenes to shape users' thoughts, attitudes, and behavior. We investigate how multimedia researchers can help tackle these problems to level the playing field for social media users. We perform a comprehensive survey of algorithmic threats on social media and use it as a lens to set a challenging but important research agenda for effective and real-time user nudging. We further implement a conceptual prototype and evaluate it with experts to supplement our research agenda. This paper calls for solutions that combat the algorithmic threats on social media by utilizing machine learning and multimedia content analysis techniques, but in a transparent manner and for the benefit of the users.