
XROM and RCOM: Two New OGLE-III Real Time Data Analysis Systems

Posted by Andrzej Udalski
Publication date: 2008
Research field: Physics
Paper language: English
Author: A. Udalski





We describe two new OGLE-III real time data analysis systems: XROM and RCOM. The XROM system has been designed to provide continuous real time photometric monitoring of the optical counterparts of X-ray sources, while the RCOM system provides real time photometry of R Coronae Borealis variable stars located in the OGLE-III fields. Both systems can be used for triggering follow-up observations in crucial phases of variability episodes of the monitored objects.
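To make the triggering idea concrete, the following is a minimal, purely illustrative Python sketch of a real-time photometric alert: it keeps a rolling baseline of quiescent magnitudes and flags any new measurement that deviates by more than a chosen threshold. The 3-sigma threshold, the 20-point baseline window, and the (index, magnitude) input format are assumptions for illustration only; this is not the actual XROM/RCOM pipeline.

```python
# Illustrative sketch only: a toy real-time trigger on an incoming light curve.
# The 3-sigma threshold, 20-point baseline window, and magnitude input format
# are assumptions for illustration; they are not taken from XROM or RCOM.
from collections import deque
import statistics


def make_trigger(baseline_size=20, n_sigma=3.0):
    """Return a callable that flags significant brightness changes."""
    baseline = deque(maxlen=baseline_size)

    def check(mag):
        if len(baseline) < baseline_size:
            baseline.append(mag)
            return False  # still building the quiescent baseline
        mean = statistics.fmean(baseline)
        sigma = statistics.stdev(baseline)
        alert = sigma > 0 and abs(mag - mean) > n_sigma * sigma
        if not alert:
            baseline.append(mag)  # only quiescent points update the baseline
        return alert

    return check


if __name__ == "__main__":
    trigger = make_trigger()
    # Simulated I-band magnitudes: a flat light curve followed by a brightening.
    light_curve = [18.50 + 0.01 * (i % 3) for i in range(25)] + [17.2]
    for i, mag in enumerate(light_curve):
        if trigger(mag):
            print(f"epoch {i}: alert, mag={mag:.2f} -- consider follow-up")
```

In a real monitoring system the alert would of course feed a notification or scheduling service rather than a print statement; the sketch only shows the baseline-plus-threshold logic.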




Read also

L. Wyrzykowski 2014
We present the design and first results of a real-time search for transients within the 650 sq. deg. area around the Magellanic Clouds, conducted as part of the OGLE-IV project and aimed at detecting supernovae, novae and other events. The average sampling of about 4 days from September to May yielded a detection of 238 transients in the 2012/2013 and 2013/2014 seasons. The superb photometric and astrometric quality of the OGLE data allows for numerous applications of the discovered transients. We use this sample to prepare and train a machine-learning-based automated classifier for early light curves, which distinguishes major classes of transients with more than 80% correct answers. Forty-nine spectroscopically classified Type Ia supernovae are used to construct a Hubble diagram with a statistical scatter of about 0.3 mag and to fill the least populated region of the redshift range in the Union sample. We investigate the influence of host-galaxy environments on supernova statistics and find a mean host extinction of A_I = 0.19 +/- 0.10 mag and A_V = 0.39 +/- 0.21 mag based on a subsample of Type Ia supernovae. We show that the positional accuracy of the survey is of the order of 0.5 pixels (0.13 arcsec) and that the OGLE-IV Transient Detection System is capable of detecting transients within the nuclei of galaxies. We present a few interesting cases of nuclear transients of unknown type. All data on the OGLE transients are made publicly available to the astronomical community via the OGLE website.
Exploratory data analysis tools must respond quickly to a user's questions, so that the answer to one question (e.g. a visualized histogram or fit) can influence the next. In some SQL-based query systems used in industry, even very large (petabyte) datasets can be summarized on a human timescale (seconds), employing techniques such as columnar data representation, caching, indexing, and code generation/JIT compilation. This article describes progress toward realizing such a system for High Energy Physics (HEP), focusing on the intermediate problems of optimizing data access and calculations for query-sized payloads, such as a single histogram or group of histograms, rather than large reconstruction or data-skimming jobs. These techniques include direct extraction of ROOT TBranches into Numpy arrays and compilation of Python analysis functions (rather than SQL) to be executed very quickly. We also discuss the problem of caching and actively delivering jobs to worker nodes that have the necessary input data preloaded in cache. All of these pieces of the larger solution are available as standalone GitHub repositories and could be used in current analyses.
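As a concrete sketch of the two techniques this abstract names, the snippet below reads a single TBranch into a NumPy array and runs a JIT-compiled histogramming function over it. It uses uproot and numba as one plausible realization (the paper's own tooling may differ), and the file, tree, and branch names are hypothetical placeholders; the branch is assumed to hold one flat value per event.

```python
# Sketch of the two techniques the abstract mentions, using uproot and numba
# as one plausible realization; the file, tree, and branch names below are
# hypothetical placeholders, and the branch is assumed to be flat (one value
# per event), not jagged.
import numpy as np
import uproot   # reads ROOT TBranches directly into NumPy arrays
import numba    # JIT-compiles the Python analysis function


@numba.njit
def fill_histogram(values, lo, hi, nbins):
    """Histogram a flat array of values; compiled to machine code on first call."""
    counts = np.zeros(nbins, dtype=np.int64)
    width = (hi - lo) / nbins
    for v in values:
        b = int((v - lo) / width)
        if 0 <= b < nbins:
            counts[b] += 1
    return counts


def main():
    # Direct extraction of a TBranch into a NumPy array (no full-event loop).
    tree = uproot.open("events.root")["Events"]
    met_pt = tree["met_pt"].array(library="np")
    counts = fill_histogram(met_pt, 0.0, 200.0, 100)
    print(counts[:10])


if __name__ == "__main__":
    main()
```

The point of the sketch is the division of labor: the columnar read pulls only the requested branch off disk, and the compiled loop does the per-value work at machine speed, which is what makes query-sized payloads answerable interactively.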
We give the first algorithm for testing the feasibility of a system of sporadic real-time tasks on a set of identical processors, solving one major open problem in the area of multiprocessor real-time scheduling [S.K. Baruah and K. Pruhs, Journal of Scheduling, 2009]. We also investigate the related notion of schedulability and a notion that we call online feasibility. Finally, we show that discrete-time schedules are as powerful as continuous-time schedules, which answers another open question in the above-mentioned survey.
While experiments on fusion plasmas produce high-dimensional data time series with ever increasing magnitude and velocity, data analysis has been lagging behind this development. For example, many data analysis tasks are often performed in a manual, ad-hoc manner some time after an experiment. In this article we introduce the DELTA framework, which facilitates near real-time streaming analysis of big and fast fusion data. By streaming measurement data from fusion experiments to a high-performance compute center, DELTA allows demanding data analysis tasks to be performed in between plasma pulses. This article describes the modular and expandable software architecture of DELTA and presents performance benchmarks of its individual components as well as of entire workflows. Our focus is on the streaming analysis of ECEi data measured at KSTAR on NERSC's supercomputers, and we routinely achieve data transfer rates of about 500 megabytes per second. We show that a demanding turbulence analysis workload can be distributed among multiple GPUs and executes in under 5 minutes. We further discuss how DELTA uses modern database systems and container orchestration services to provide web-based real-time data visualization. For the case of ECEi data we demonstrate how data visualizations can be augmented with outputs from machine learning models. By providing session leaders and physics operators with the results of higher-order data analysis through live visualization, they can monitor the evolution of a long-pulse discharge in near real-time and make more informed decisions on how to configure the machine for the next shot.
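The sketch below illustrates the streaming-analysis pattern in a generic way: measurement chunks arrive sequentially and are farmed out to a pool of workers that compute a simple spectral quantity per chunk. It is not the DELTA code base or its API; the chunk shape, the number of workers, and the FFT-based analysis kernel are assumptions chosen only to show the shape of such a workflow.

```python
# Generic illustration of streaming-style analysis: measurement chunks are
# consumed one by one and handed to a pool of workers that compute a simple
# spectral quantity. This is not the DELTA framework or its actual API; the
# chunk shape, worker count, and analysis kernel are assumptions.
import multiprocessing as mp
import numpy as np


def analyze_chunk(chunk):
    """Toy spectral analysis: channel-averaged power spectrum of one time chunk."""
    spectra = np.abs(np.fft.rfft(chunk, axis=1)) ** 2
    return spectra.mean(axis=0)


def main():
    rng = np.random.default_rng(0)
    # Simulated stream: 8 chunks, each 192 channels x 1024 time samples.
    stream = [rng.standard_normal((192, 1024)) for _ in range(8)]
    with mp.Pool(processes=4) as pool:
        # imap preserves chunk order while the pool processes chunks in parallel.
        for i, spectrum in enumerate(pool.imap(analyze_chunk, stream)):
            print(f"chunk {i}: peak power {spectrum.max():.1f}")


if __name__ == "__main__":
    main()
```

In a real deployment the simulated list would be replaced by data arriving over the network and the process pool by GPU workers on an HPC system, but the decoupling of ingestion from analysis is the same.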
P. R. Wozniak 2002
We present the first edition of a catalog of variable stars from OGLE-II Galactic Bulge data covering 3 years: 1997-1999. Typically 200-300 I-band data points are available in 49 fields between -11 and 11 degrees in galactic longitude, totaling roughly 11 square degrees in sky coverage. Photometry was obtained using the Difference Image Analysis (DIA) software and tied to the OGLE database with the DoPhot package. The present version of the catalog comprises 221,801 light curves. In this preliminary work the level of contamination by spurious detections is still about 10%. Parts of the catalog have only crude calibration, insufficient for distance determinations. The next, fully calibrated, edition will include the data collected in year 2000. The data are accessible via FTP. Due to the data volume, we also distribute DAT tapes upon request.