
PlotXY: a high quality plotting system for the Herschel Interactive Processing Environment (HIPE), and the astronomical community

Added by: Pasquale Panuzzo
Publication date: 2012
Language: English





The Herschel Interactive Processing Environment (HIPE) was developed by the European Space Agency (ESA), in collaboration with NASA and the Herschel Instrument Control Centres, to provide the astronomical community with a complete environment for processing and analyzing the data gathered by the Herschel Space Observatory. One of the most important components of HIPE is its plotting system, named PlotXY, which we present here. With PlotXY it is easy to produce high-quality, publication-ready 2D plots. It provides a long list of features, fully configurable components, and interactive zooming. The entire HIPE code base is written in Java and is open source, released under the GNU Lesser General Public License version 3. A new version of PlotXY is being developed to be independent of the HIPE code base; it is available to the software development community for inclusion in other projects at http://code.google.com/p/jplot2d/.
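The abstract describes PlotXY only at a high level. As a rough point of reference, the minimal sketch below shows, in plain Java2D, the kind of manual line-plot rendering that a library like the one described takes care of automatically. It is not the PlotXY or jplot2d API; every class and variable name here is an illustrative assumption.

// Illustrative sketch only: plain Java2D, NOT the PlotXY/jplot2d API.
// Draws one line series inside a fixed frame, with hard-coded data ranges.
import javax.swing.*;
import java.awt.*;

public class MiniLinePlot extends JPanel {
    private final double[] x;
    private final double[] y;

    MiniLinePlot(double[] x, double[] y) {
        this.x = x;
        this.y = y;
        setPreferredSize(new Dimension(480, 320));
        setBackground(Color.WHITE);
    }

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        Graphics2D g2 = (Graphics2D) g;
        int m = 40;                          // margin around the plot area
        int w = getWidth() - 2 * m, h = getHeight() - 2 * m;
        g2.setColor(Color.BLACK);
        g2.drawRect(m, m, w, h);             // plot frame standing in for axes
        // Map data coordinates (assumed x in [0,1], y in [-1,1]) to pixels.
        g2.setColor(Color.BLUE);
        for (int i = 1; i < x.length; i++) {
            int px0 = m + (int) (x[i - 1] * w);
            int py0 = m + (int) ((1 - (y[i - 1] + 1) / 2) * h);
            int px1 = m + (int) (x[i] * w);
            int py1 = m + (int) ((1 - (y[i] + 1) / 2) * h);
            g2.drawLine(px0, py0, px1, py1);
        }
    }

    public static void main(String[] args) {
        int n = 200;
        double[] xs = new double[n], ys = new double[n];
        for (int i = 0; i < n; i++) {
            xs[i] = i / (double) (n - 1);
            ys[i] = Math.sin(4 * Math.PI * xs[i]);
        }
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Minimal 2D plot sketch");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.add(new MiniLinePlot(xs, ys));
            frame.pack();
            frame.setVisible(true);
        });
    }
}

A plotting system such as the one described in the abstract replaces this manual coordinate mapping with configurable axes, tick labels, legends, and interactive zooming.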





The Herschel Space Observatory is the fourth cornerstone mission in the ESA science programme and performs photometry and spectroscopy in the 55-672 micron range. The development of the Herschel Data Processing System started in 2002 to support the data analysis for Instrument Level Tests. The Herschel Data Processing System was used for the pre-flight characterisation of the instruments and during various ground segment test campaigns. Following the successful launch of Herschel on 14 May 2009, the Herschel Data Processing System demonstrated its maturity when the first PACS preview observation of M51 was processed within 30 minutes of reception of the first science data after launch. The first HIFI observations of DR21 were also successfully reduced to high-quality spectra, followed by SPIRE observations of M66 and M74. A fast turn-around cycle between data retrieval and the production of science-ready products was demonstrated during the Herschel Science Demonstration Phase Initial Results Workshop, held 7 months after launch, which is clear proof that the system has reached a good level of maturity. We will summarise the scope, management, and development methodology of the Herschel Data Processing System, present some key software elements, and give an overview of the current status and future development milestones.
J. Surace, R. Laher, F. Masci (2015)
The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky, primarily at a single wavelength (R-band), at a rate of 1000-3000 square degrees a night. The data are used to detect and study transient and moving objects such as gamma-ray bursts, supernovae, and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system at IPAC handles real-time processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. This was developed under an extremely limited budget in an unusually agile development environment. Here we discuss the mechanics of this system and our overall development approach. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next-generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST), a major NSF facility scheduled to begin operations in the early 2020s.
We present here a provenance management system adapted to the needs of astronomical projects. We collected use cases from various astronomy projects and defined a data model within the ecosystem developed by the IVOA (International Virtual Observatory Alliance). From those use cases, we observed that some projects already have data collections generated and archived, from which the provenance has to be extracted (provenance on top), while other projects are building complex pipelines that automatically capture provenance information during the data processing (capture inside). Different tools and prototypes have been developed and tested to capture, store, access, and visualize the provenance information; together they contribute to shaping a full provenance management system able to handle detailed provenance information.
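As a rough illustration of the "capture inside" pattern mentioned above, a pipeline step can append a small provenance record each time it produces a product. The Java sketch below is a minimal, assumption-laden example; the class and field names are hypothetical and do not follow the IVOA Provenance data model.

// Hypothetical sketch of "capture inside" provenance: each processing step
// appends a record of its inputs, parameters, and output as it executes.
// Names are illustrative only, not the IVOA ProvenanceDM schema.
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class ProvenanceLog {
    public record StepRecord(String activity, List<String> inputIds,
                             Map<String, String> parameters,
                             String outputId, Instant when) {}

    private final List<StepRecord> records = new ArrayList<>();

    // Called by the pipeline every time a step produces a new product.
    public void capture(String activity, List<String> inputIds,
                        Map<String, String> parameters, String outputId) {
        records.add(new StepRecord(activity, inputIds, parameters,
                                   outputId, Instant.now()));
    }

    public List<StepRecord> records() { return List.copyOf(records); }

    public static void main(String[] args) {
        ProvenanceLog log = new ProvenanceLog();
        // A fictitious calibration step consuming one raw frame.
        log.capture("flatFieldCorrection",
                    List.of("raw-000123"),
                    Map.of("flatVersion", "v2"),
                    "calibrated-000123");
        log.records().forEach(System.out::println);
    }
}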
R. Rutledge (1998)
The Astronomer's Telegram (ATEL; http://fire.berkeley.edu:8080/) is a web-based short-notice (<4000 characters) publication system for reporting and commenting on new astronomical observations, offering for the first time in astronomy effectively instantaneous distribution of time-critical information to the entire professional community. It is designed to take advantage of the World Wide Web's simple user interface and the ability of computer programs to provide nearly all the necessary functions. One may post a Telegram, which is instantly (<1 second) available at the web site and distributed by email within 24 hours through the Daily Email Digest, which is tailored to the subject selections of each reader. Optionally, urgent Telegrams may be distributed through Instant Email Notices. While ATEL will be of particular use to observers of transient objects (such as gamma-ray bursts, microlenses, supernovae, novae, or X-ray transients) or in fields which are rapidly evolving observationally, there are no restrictions on subject matter.
There are many applications where users seek to explore the impact of the settings of several categorical variables on one dependent numerical variable. For example, a computer systems analyst might want to study how the type of file system or storage device affects system performance. A usual choice is the method of Parallel Sets, designed to visualize multivariate categorical variables. However, we found that the magnitude of the parameter impacts on the numerical variable cannot be easily observed there. We also attempted a dimension reduction approach based on Multiple Correspondence Analysis, but found that the SVD-generated 2D layout resulted in a loss of information. We hence propose a novel approach, the Interactive Configuration Explorer (ICE), which directly addresses the need of analysts to learn how the dependent numerical variable is affected by the parameter settings given multiple optimization objectives. No information is lost, as ICE shows the complete distribution and statistics of the dependent variable in context with each categorical variable. Analysts can interactively filter the variables to optimize for certain goals, such as achieving a system with maximum performance, low variance, etc. Our system was developed in tight collaboration with a group of systems performance researchers, and its final effectiveness was evaluated with expert interviews, a comparative user study, and two case studies.
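For a concrete sense of the per-category summaries ICE displays alongside full distributions, the sketch below groups a dependent numerical variable by one categorical variable and reports the per-category mean and variance. It is a minimal, hypothetical Java example; the data and names are invented for illustration and are not taken from the paper.

// Hypothetical sketch: per-category statistics of a dependent numerical
// variable (e.g. throughput) grouped by a categorical variable (e.g. file system).
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class CategoryStats {
    record Sample(String category, double value) {}

    public static void main(String[] args) {
        // Invented (category, measurement) pairs for illustration.
        List<Sample> samples = List.of(
            new Sample("ext4", 410.0), new Sample("ext4", 395.5),
            new Sample("xfs", 433.2),  new Sample("xfs", 421.7),
            new Sample("btrfs", 388.9));

        Map<String, List<Double>> byCategory = new LinkedHashMap<>();
        for (Sample s : samples) {
            byCategory.computeIfAbsent(s.category(), k -> new ArrayList<>()).add(s.value());
        }

        byCategory.forEach((cat, values) -> {
            double mean = values.stream().mapToDouble(Double::doubleValue).average().orElse(0);
            double variance = values.stream()
                    .mapToDouble(v -> (v - mean) * (v - mean)).sum()
                    / Math.max(1, values.size() - 1);
            System.out.printf("%-6s n=%d mean=%.1f var=%.1f%n", cat, values.size(), mean, variance);
        });
    }
}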