
Disentangling of spectra - theory and practice

Added by: Petr Hadrava
Publication date: 2009
Field: Physics
Language: English
Authors: P. Hadrava





This document reviews the author's method of Fourier disentangling of spectra of binary and multiple stars, prepared for the summer school organized at the Ondrejov observatory in September 2008. Related methods are also discussed, and practical hints for the use of the author's code KOREL and related auxiliary codes are given, together with examples.
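A minimal numpy illustration (not the KOREL code itself) of the Fourier shift theorem that underlies the disentangling approach: on a logarithmic wavelength grid a Doppler shift becomes a uniform translation of the spectrum, and in Fourier space a translation is just multiplication by a phase factor, so each Fourier mode of the composite spectrum can be treated separately. All numbers below are invented for the demonstration.

```python
import numpy as np

# Toy spectrum on a uniform log-wavelength grid, where a Doppler shift
# corresponds to a uniform translation by an integer number of bins
# (grid size, line position, and shift are hypothetical).
n = 256
x = np.arange(n)
spectrum = 1.0 - 0.5 * np.exp(-0.5 * ((x - 100) / 3.0) ** 2)  # one absorption line

shift = 17  # shift in grid bins (stands in for a radial velocity)

# Direct translation of the sampled spectrum (circular, for the demo)
shifted_direct = np.roll(spectrum, shift)

# The same shift applied in Fourier space: multiply mode q by exp(-2*pi*i*q*shift)
q = np.fft.fftfreq(n, d=1.0)
shifted_fourier = np.fft.ifft(np.fft.fft(spectrum) * np.exp(-2j * np.pi * q * shift)).real

print(np.allclose(shifted_direct, shifted_fourier))  # → True
```

Because the shift acts mode by mode, disentangling can solve for the component spectra as independent linear problems in each Fourier mode, which is what makes the Fourier formulation efficient.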



Related research

Forecasting has always been at the forefront of decision making and planning. The uncertainty that surrounds the future is both exciting and challenging, with individuals and organisations seeking to minimise risks and maximise utilities. The large number of forecasting applications calls for a diverse set of forecasting methods to tackle real-life challenges. This article provides a non-systematic review of the theory and the practice of forecasting. We provide an overview of a wide range of theoretical, state-of-the-art models, methods, principles, and approaches to prepare, produce, organise, and evaluate forecasts. We then demonstrate how such theoretical concepts are applied in a variety of real-life contexts. We do not claim that this review is an exhaustive list of methods and applications. However, we hope that our encyclopedic presentation offers a point of reference for the rich work that has been undertaken over the last decades, with some key insights for the future of forecasting theory and practice. Given its encyclopedic nature, the intended mode of reading is non-linear. We offer cross-references to allow the readers to navigate through the various topics. We complement the theoretical concepts and applications covered with large lists of free or open-source software implementations and publicly available databases.
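One of the classical methods such a review typically covers is simple exponential smoothing, where the forecast is an exponentially weighted average of past observations. A minimal sketch (the data and the smoothing parameter are invented for illustration, not taken from the article):

```python
# Simple exponential smoothing: the level is updated towards each new
# observation; the final level serves as a flat forecast for all horizons.
def ses_forecast(series, alpha=0.3):
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Hypothetical demand history (arbitrary units)
demand = [120, 132, 101, 134, 150, 128, 140]
print(round(ses_forecast(demand), 2))  # → 132.77
```

Smaller values of `alpha` weight the history more evenly; values near 1 make the forecast track the most recent observation.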
There is an ongoing debate in computer science about how algorithms should best be studied. Some scholars have argued that experimental evaluations should be conducted, others emphasize the benefits of formal analysis. We believe that this debate is less a question of either-or, because both views can be integrated into an overarching framework. It is the ambition of this paper to develop such a framework of algorithm engineering with a theoretical foundation in the philosophy of science. We take the empirical nature of algorithm engineering as a starting point. Our theoretical framework builds on three areas discussed in the philosophy of science: ontology, epistemology and methodology. In essence, ontology describes algorithm engineering as being concerned with algorithmic problems, algorithmic tasks, algorithm designs and algorithm implementations. Epistemology describes the body of knowledge of algorithm engineering as a collection of prescriptive and descriptive knowledge, residing in World 3 of Popper's Three Worlds model. Methodology refers to the steps by which we can systematically enhance our knowledge of specific algorithms. In this context, we identify seven validity concerns and discuss how researchers can respond to falsification. Our framework has important implications for researching algorithms in various areas of computer science.
Recent efforts have applied quantum tomography techniques to the calibration and characterization of complex quantum detectors using minimal assumptions. In this work we provide detail and insight concerning the formalism, the experimental and theoretical challenges and the scope of these tomographical tools. Our focus is on the detection of photons with avalanche photodiodes and photon number resolving detectors, and our approach is to fully characterize the quantum operators describing these detectors with a minimal set of well specified assumptions. The formalism is completely general and can be applied to a wide range of detectors.
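The linear-inversion idea behind detector tomography can be sketched for an on/off (click/no-click) detector probed with coherent states: the click probability is linear in the diagonal POVM elements, so probing at several known mean photon numbers gives a linear system solvable by least squares. A toy sketch with noiseless simulated data (the detector model, efficiency, probe amplitudes, and photon-number cutoff are all invented for illustration; real reconstructions add noise handling, regularization, and positivity constraints):

```python
import numpy as np
from math import factorial, exp

# Hypothetical "true" detector: a lossy on/off detector with efficiency
# eta clicks on n photons with probability 1 - (1 - eta)**n.
eta = 0.6
nmax = 6
pi_true = np.array([1.0 - (1.0 - eta) ** n for n in range(nmax + 1)])

# Probe with coherent states of known mean photon number mu:
# p_click(mu) = sum_n Poisson(n; mu) * pi_n  -- linear in the pi_n.
mus = np.linspace(0.2, 5.0, 30)
F = np.array([[exp(-mu) * mu ** n / factorial(n) for n in range(nmax + 1)]
              for mu in mus])
p_click = F @ pi_true  # noiseless simulated click frequencies

# Linear inversion by least squares recovers the POVM diagonal.
pi_est, *_ = np.linalg.lstsq(F, p_click, rcond=None)
```

With noisy experimental frequencies in place of `p_click`, the same system is typically solved as a regularized convex program rather than by plain least squares.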
Nowadays, tiered architectures are widely accepted for constructing large scale information systems. In this context application servers often form the bottleneck for a system's efficiency. An application server exposes an object oriented interface consisting of a set of methods which are accessed by potentially remote clients. The idea of method caching is to store results of read-only method invocations with respect to the application server's interface on the client side. If the client invokes the same method with the same arguments again, the corresponding result can be taken from the cache without contacting the server. It has been shown that this approach can considerably improve a real-world system's efficiency. This paper extends the concept of method caching by addressing the case where clients wrap related method invocations in ACID transactions. Demarcating sequences of method calls in this way is supported by many important application server standards. In this context the paper presents an architecture, a theory and an efficient protocol for maintaining full transactional consistency, and in particular serializability, when using a method cache on the client side. In order to create a protocol for scheduling cached method results, the paper extends a classical transaction formalism. Based on this extension, a recovery protocol and an optimistic serializability protocol are derived. The latter differs from traditional transactional cache protocols in many essential ways. An efficiency experiment validates the approach: using the cache, a system's performance and scalability are considerably improved.
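The basic method-caching idea (before any transactional machinery) can be sketched as a client-side map from method name and arguments to a cached result. This is a deliberately minimal illustration, not the paper's protocol; the class and method names are hypothetical, and a real protocol would invalidate selectively based on a transaction's writes rather than flushing everything.

```python
# Minimal client-side method cache: results of read-only calls are keyed
# by (method name, arguments) and reused without a server round trip.
class MethodCache:
    def __init__(self, server):
        self.server = server
        self.cache = {}  # (method, args) -> result

    def call(self, method, *args):
        key = (method, args)
        if key in self.cache:          # cache hit: no contact with the server
            return self.cache[key]
        result = getattr(self.server, method)(*args)
        self.cache[key] = result
        return result

    def invalidate(self):
        # Crude stand-in for invalidation: a real transactional protocol
        # drops only entries a transaction's writes could affect.
        self.cache.clear()

# Hypothetical server with a read-only method, counting actual invocations
class Server:
    def __init__(self):
        self.calls = 0
    def price_of(self, item):
        self.calls += 1
        return {"widget": 10, "gadget": 25}[item]

srv = Server()
cli = MethodCache(srv)
cli.call("price_of", "widget")
cli.call("price_of", "widget")   # served from the cache
print(srv.calls)  # → 1
```

The paper's contribution is exactly what this sketch omits: keeping such a cache serializable when the calls run inside ACID transactions.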
When studying the evolutionary stages of protostars that form in clusters, the role of any intracluster medium cannot be neglected. High foreground extinction can lead to situations where young stellar objects (YSOs) appear to be in earlier evolutionary stages than they actually are, particularly when using simple criteria like spectral indices. To address this issue, we have assembled detailed SED characterizations of a sample of 56 Spitzer-identified candidate YSOs in the clusters NGC 2264 and IC 348. For these, we use spectra obtained with the Infrared Spectrograph onboard the Spitzer Space Telescope and ancillary multi-wavelength photometry. The primary aim is twofold: 1) to discuss the role of spectral features, particularly those due to ices and silicates, in determining a YSO's evolutionary stage, and 2) to perform comprehensive modeling of spectral energy distributions (SEDs) enhanced by the IRS data. The SEDs consist of ancillary optical-to-submillimeter multi-wavelength data as well as an accurate description of the 9.7 micron silicate feature and of the mid-infrared continuum derived from line-free parts of the IRS spectra. We find that using this approach, we can distinguish genuine protostars in the cluster from T Tauri stars masquerading as protostars due to external foreground extinction. Our results underline the importance of photometric data in the far-infrared/submillimeter wavelength range, at sufficiently high angular resolution, to more accurately classify cluster members. Such observations are becoming possible now with the advent of the Herschel Space Observatory.
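The "simple criteria like spectral indices" mentioned above refer to the slope of the infrared SED. A minimal sketch of that classifier (the photometry values are invented for illustration; the class boundaries follow the conventional Greene et al. 1994 scheme, which the paper argues extinction can fool):

```python
import numpy as np

# Spectral index alpha = d log10(lambda * F_lambda) / d log10(lambda),
# fitted over the near/mid-infrared photometric points.
def spectral_index(wavelengths_um, flux_lambda):
    x = np.log10(np.asarray(wavelengths_um))
    y = np.log10(np.asarray(wavelengths_um) * np.asarray(flux_lambda))
    slope, _intercept = np.polyfit(x, y, 1)
    return slope

def yso_class(alpha):
    # Conventional boundaries (Greene et al. 1994)
    if alpha >= 0.3:
        return "Class I"
    if alpha >= -0.3:
        return "Flat"
    if alpha >= -1.6:
        return "Class II"
    return "Class III"

# Hypothetical photometry of a reddened source (illustrative numbers only)
wl = [2.2, 3.6, 4.5, 8.0, 24.0]   # microns
fl = [1.0, 0.9, 0.8, 0.6, 0.3]    # F_lambda, arbitrary units
alpha = spectral_index(wl, fl)
print(yso_class(alpha))  # → Class I
```

A rising SED (positive alpha) flags a source as a protostar, which is precisely why heavy foreground extinction, which suppresses the short-wavelength flux, can make a T Tauri star mimic an earlier evolutionary stage.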