
Web-based tools for the analysis of TAOS data and much more

Added by Davide Ricci
Publication date: 2014
Fields: Physics
Language: English





We suggest a new web-based approach for browsing and visualizing data produced by a network of telescopes, such as those of the ongoing TAOS and the forthcoming TAOS II projects. We propose a modern client-side technology and present two examples based on two software packages developed for different kinds of server-side database approaches. Although our examples are specific to the browsing of TAOS light curves, the software is coded in a way that makes it suitable for use in several types of astronomical projects.
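As an illustration of the kind of server-side component such a viewer relies on, the sketch below exposes light-curve points as JSON that a browser client can fetch and plot. It is only a minimal example under assumed names: the Flask route, the SQLite file, and the table layout (a table lightcurves with columns star_id, jd, flux) are hypothetical and do not correspond to the two packages described in the paper.

```python
# Minimal sketch (not the authors' actual packages): a server-side endpoint
# that returns light-curve points as JSON for a client-side plotting widget.
# The database schema and the route name are assumptions for illustration only.
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)
DB_PATH = "taos_lightcurves.db"  # hypothetical SQLite file

@app.route("/lightcurve/<int:star_id>")
def lightcurve(star_id):
    """Return the time series of one star so a browser client can plot it."""
    con = sqlite3.connect(DB_PATH)
    rows = con.execute(
        "SELECT jd, flux FROM lightcurves WHERE star_id = ? ORDER BY jd",
        (star_id,),
    ).fetchall()
    con.close()
    return jsonify({"star_id": star_id,
                    "jd": [r[0] for r in rows],
                    "flux": [r[1] for r in rows]})

if __name__ == "__main__":
    app.run(debug=True)
```

A client-side viewer would then request this endpoint asynchronously and render the returned arrays with a JavaScript plotting library, keeping the database-specific logic on the server side.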




Read More

A preliminary data analysis of the stellar light curves obtained by the robotic telescopes of the TAOS project is presented. We selected a data run relative to one of the stellar fields observed by three of the four TAOS telescopes, and we investigated the common trend and the correlation between the light curves. We propose two ways to remove these trends and show the preliminary results. A project aimed at flagging interesting behaviors, such as stellar variability, and at setting up an automated follow-up with the San Pedro Martir facilities is underway.
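For readers unfamiliar with common-mode detrending, the sketch below shows one generic way to remove a trend shared by many light curves: normalize each star and divide by the per-epoch ensemble median. This is a standard illustration only and is not claimed to be either of the two methods proposed in the paper; the shared time grid and the synthetic data are assumptions.

```python
# Common-mode detrending sketch, assuming the light curves share one time grid.
import numpy as np

def detrend_common_mode(fluxes):
    """fluxes: 2-D array, shape (n_stars, n_epochs), raw fluxes.

    Returns the light curves normalized by their own medians and divided by
    the ensemble (per-epoch) median trend, plus the trend itself."""
    normed = fluxes / np.median(fluxes, axis=1, keepdims=True)  # per-star normalization
    trend = np.median(normed, axis=0)                           # common trend across stars
    return normed / trend, trend

# Example with synthetic data: 100 stars, 500 epochs, sharing a slow trend.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
common = 1.0 + 0.05 * np.sin(2 * np.pi * t)
fluxes = rng.normal(1.0, 0.01, (100, 500)) * common
clean, trend = detrend_common_mode(fluxes)
print(np.std(fluxes[0] / np.median(fluxes[0])), np.std(clean[0]))  # scatter drops after detrending
```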
R.L. Akeson, X. Chen, D. Ciardi (2013)
We describe the contents and functionality of the NASA Exoplanet Archive, a database and tool set funded by NASA to support astronomers in the exoplanet community. The current content of the database includes interactive tables containing properties of all published exoplanets, Kepler planet candidates, threshold-crossing events, data validation reports and target stellar parameters, light curves from the Kepler and CoRoT missions and from several ground-based surveys, and spectra and radial velocity measurements from the literature. Tools provided to work with these data include a transit ephemeris predictor, both for single planets and for observing locations, light curve viewing and normalization utilities, and a periodogram and phased light curve service. The archive can be accessed at http://exoplanetarchive.ipac.caltech.edu.
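The periodogram and phased light-curve service mentioned above boils down to two steps: find the strongest periodicity and fold the time series on it. The sketch below reproduces those steps with astropy's Lomb-Scargle implementation on synthetic data; it illustrates the concept and is not the archive's actual code.

```python
# Periodogram + phase folding on synthetic data (concept illustration only).
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 30.0, 400))          # irregular observation times (days)
true_period = 2.7
flux = 1.0 + 0.02 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.005, t.size)

# Periodogram: scan frequencies and pick the strongest peak.
frequency, power = LombScargle(t, flux).autopower()
best_period = 1.0 / frequency[np.argmax(power)]

# Phase-fold the light curve on the recovered period.
phase = (t / best_period) % 1.0
order = np.argsort(phase)
print(f"recovered period: {best_period:.3f} d (true {true_period} d)")
# phase[order], flux[order] are ready to plot as a phased light curve.
```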
The next decade will feature a growing number of massive ground-based photometric, spectroscopic, and time-domain surveys, including those produced by DECam, DESI, and LSST. The NOAO Data Lab was launched in 2017 to enable efficient exploration and analysis of large surveys, with particular focus on the petabyte-scale holdings of the NOAO Archive and their associated catalogs. The Data Lab mission and future development align well with two of the NSF's Big Ideas: Harnessing Data for 21st Century Science and Engineering, and, as part of a network, contributing to Windows on the Universe: The Era of Multi-messenger Astrophysics. Along with other Science Platforms, the Data Lab will play a key role in scientific discoveries from surveys in the next decade, and will be crucial to maintaining a level playing field as datasets grow in size and complexity.
We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum modeling uniquely. For these two studies, we employ the program Nested fit to calculate the different probability distributions and other related quantities. Nested fit is a Fortran90/Python code developed over the last years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
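As a concrete illustration of the evidence-based model comparison described above (and separate from the Nested fit program itself), the toy sketch below compares a "main line only" model against a "main line plus satellite" model by integrating the likelihood over a single assumed free parameter, the satellite amplitude, with a uniform prior. The line shapes, priors, and synthetic spectrum are assumptions made for illustration.

```python
# Toy Bayesian evidence comparison on a synthetic spectrum (not Nested_fit).
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-5, 5, 200)
sigma_noise = 0.05

def line(x, center, amp, width=0.5):
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

# Synthetic data: a main line plus a weak satellite line.
data = line(x, 0.0, 1.0) + line(x, 1.5, 0.15) + rng.normal(0, sigma_noise, x.size)

def log_likelihood(sat_amp):
    model = line(x, 0.0, 1.0) + line(x, 1.5, sat_amp)
    return -0.5 * np.sum(((data - model) / sigma_noise) ** 2)

# Model 1: no satellite (amplitude fixed to 0) -> evidence is just its likelihood.
log_Z1 = log_likelihood(0.0)

# Model 2: satellite amplitude free, uniform prior on [0, 0.5];
# evidence = integral of likelihood * prior over the amplitude (trapezoid rule).
amps = np.linspace(0.0, 0.5, 501)
logL = np.array([log_likelihood(a) for a in amps])
prior = 1.0 / 0.5                                    # uniform prior density
weights = np.exp(logL - logL.max()) * prior
log_Z2 = logL.max() + np.log(np.sum(0.5 * (weights[1:] + weights[:-1]) * np.diff(amps)))

print(f"log evidence ratio (satellite vs no satellite): {log_Z2 - log_Z1:.1f}")
```

A large positive log evidence ratio favors the model with the satellite line, which is the kind of hypothesis test the abstract describes.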
Recently, there have been several national calls to emphasize physics practices and skills within laboratory courses. In this paper, we describe the redesign and implementation of a two-course sequence of algebra-based physics laboratories at Michigan State University called Design Analysis Tools and Apprenticeship (DATA) Lab. The large-scale course transformation removes physics-specific content from the overall learning goals of the course and instead uses physics concepts to focus on specific laboratory practices and research skills that students can take into their future careers. Students in DATA Lab engage in the exploration of physical systems to increase their understanding of the experimental process, data analysis, collaboration, and scientific communication. In order to ensure our students are making progress toward the skills outlined in the course learning goals, we designed all of the assessments in the courses to evaluate their progress specific to these laboratory practices. Here, we describe the structures, scaffolds, goals, and assessments of the course.