
Software Holography: Interferometric Data Analysis for the Challenges of Next Generation Observatories

Posted by Miguel F. Morales
Publication date: 2008
Research field: Physics
Paper language: English





Next generation radio observatories such as the MWA, LWA, LOFAR, CARMA and SKA provide a number of challenges for interferometric data analysis. These challenges include heterogeneous arrays, direction-dependent instrumental gain, and refractive and scintillating atmospheric conditions. From the analysis perspective, this means that calibration solutions cannot be described using a single complex gain per antenna. In this paper we use the optimal map-making formalism developed for CMB analyses to extend traditional interferometric radio analysis techniques, removing the assumption of a single complex gain per antenna and allowing more complete descriptions of the instrumental and atmospheric conditions. Due to the similarity with holographic mapping of radio antenna surfaces, we call this extended analysis approach software holography. The resulting analysis algorithms are computationally efficient, unbiased, and optimally sensitive. We show how software holography can be used to solve some of the challenges of next generation observations, and how more familiar analysis techniques can be derived as limiting cases.
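The optimal map-making formalism the abstract refers to is, in its standard CMB form, the estimator $\hat{s} = (A^{T} N^{-1} A)^{-1} A^{T} N^{-1} d$ for data $d = A s + n$ with noise covariance $N$. The snippet below is a minimal toy sketch of that estimator with assumed variable names and dimensions (it is not the paper's actual operators); in software holography the response matrix $A$ would carry direction-dependent, per-baseline instrumental and atmospheric terms rather than a single complex gain per antenna.

```python
import numpy as np

# Toy sketch of the optimal map-making estimator:
#   d = A s + n,   s_hat = (A^T N^-1 A)^-1 A^T N^-1 d
# Names and sizes here are assumptions for illustration only.

rng = np.random.default_rng(1)
n_vis, n_pix = 400, 64                      # measurements (visibilities) and sky pixels
A = rng.normal(size=(n_vis, n_pix))         # instrument response matrix (toy, real-valued)
N = np.diag(rng.uniform(0.5, 2.0, n_vis))   # noise covariance (diagonal for simplicity)

s_true = rng.normal(size=n_pix)             # toy sky signal
d = A @ s_true + rng.multivariate_normal(np.zeros(n_vis), N)   # simulated data

N_inv = np.linalg.inv(N)
lhs = A.T @ N_inv @ A                       # normal matrix
rhs = A.T @ N_inv @ d                       # noise-weighted, response-matched data
s_hat = np.linalg.solve(lhs, rhs)           # unbiased, minimum-variance sky estimate
```

Because the estimator weights the data by the inverse noise covariance and inverts the full response matrix, it stays unbiased even when no single per-antenna gain exists; the familiar calibrate-then-image workflow is recovered when $A$ reduces to that special case.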


Read also

The high energy physics community is discussing where investment is needed to prepare software for the HL-LHC and its unprecedented challenges. The ROOT project has been one of the central software players in high energy physics for decades. From its experience and expectations, the ROOT team has distilled a comprehensive set of areas that should see research and development in the context of data analysis software, to make the best use of the HL-LHC's physics potential. This work shows what these areas could be, why the ROOT team believes investing in them is needed, which gains are expected, and where related work is ongoing. It can serve as an indication for future research proposals and collaborations.
Computing has dramatically changed nearly every aspect of our lives, from business and agriculture to communication and entertainment. As a nation, we rely on computing in the design of systems for energy, transportation and defense; and computing fuels scientific discoveries that will improve our fundamental understanding of the world and help develop solutions to major challenges in health and the environment. Computing has changed our world, in part, because our innovations can run on computers whose performance and cost-performance have improved a million-fold over the last few decades. A driving force behind this has been a repeated doubling of the transistors per chip, dubbed Moore's Law. A concomitant enabler has been Dennard scaling, which has permitted these performance doublings at roughly constant power, but, as we will see, both trends face challenges. Consider for a moment the impact of these two trends over the past 30 years. A 1980s supercomputer (e.g. a Cray 2) was rated at nearly 2 Gflops and consumed nearly 200 kW of power. At the time, it was used for high performance and national-scale applications ranging from weather forecasting to nuclear weapons research. A computer of similar performance now fits in our pocket and consumes less than 10 watts. What would be the implications of a similar computing/power reduction over the next 30 years - that is, taking a petaflop-scale machine (e.g. the Cray XK7, which requires about 500 kW for 1 Pflop ($10^{15}$ operations/sec) performance) and repeating that process? What is possible with such a computer in your pocket? How would it change the landscape of high capacity computing? In the remainder of this paper, we articulate some opportunities and challenges for dramatic performance improvements of both personal- and national-scale computing, and discuss some out-of-the-box possibilities for achieving computing at this scale.
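As a quick check on the figures quoted above, the flops-per-watt arithmetic fits in a few lines; the snippet below is only that back-of-the-envelope calculation, using the machine numbers cited in the abstract, and the "repeat the gain" extrapolation is illustrative rather than a prediction.

```python
# Back-of-the-envelope flops/W comparison using the figures quoted above.
cray2_flops, cray2_watts = 2e9, 200e3     # Cray 2: ~2 Gflops at ~200 kW (1980s)
xk7_flops, xk7_watts = 1e15, 500e3        # Cray XK7: ~1 Pflop at ~500 kW

cray2_eff = cray2_flops / cray2_watts     # ~1e4 flops/W
xk7_eff = xk7_flops / xk7_watts           # ~2e9 flops/W
gain = xk7_eff / cray2_eff                # ~2e5x efficiency improvement over ~30 years

# Repeating a comparable gain would put roughly petaflop performance in a ~10 W budget.
pocket_watts = 10
pocket_flops = xk7_eff * gain * pocket_watts
print(f"efficiency gain: {gain:.1e}x; hypothetical pocket machine: {pocket_flops:.1e} flops")
```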
Data Challenge 1 (DC1) is the first synthetic dataset produced by the Rubin Observatory Legacy Survey of Space and Time (LSST) Dark Energy Science Collaboration (DESC). DC1 is designed to develop and validate data reduction and analysis and to study the impact of systematic effects that will affect the LSST dataset. DC1 is comprised of $r$-band observations of 40 deg$^{2}$ to 10-year LSST depth. We present each stage of the simulation and analysis process: a) generation, by synthesizing sources from cosmological N-body simulations in individual sensor-visit images with different observing conditions; b) reduction using a development version of the LSST Science Pipelines; and c) matching to the input cosmological catalog for validation and testing. We verify that testable LSST requirements pass within the fidelity of DC1. We establish a selection procedure that produces a sufficiently clean extragalactic sample for clustering analyses, and we discuss residual sample contamination, including contributions from inefficiency in star-galaxy separation and imperfect deblending. We compute the galaxy power spectrum on the simulated field and conclude that: i) survey properties have an impact of 50% of the statistical uncertainty for the scales and models used in DC1; ii) a selection to eliminate artifacts in the catalogs is necessary to avoid biases in the measured clustering; iii) the presence of bright objects has a significant impact (2- to 6-$\sigma$) in the estimated power spectra at small scales ($\ell > 1200$), highlighting the impact of blending in studies at small angular scales in LSST.
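For context on the clustering measurement in step (c), a galaxy power spectrum on a small masked footprint can be sketched with healpy as below. This is an assumed, simplified illustration (hypothetical map variables, a crude $f_{\rm sky}$ correction, no shot-noise, contamination, or deblending corrections), not the DESC DC1 pipeline itself.

```python
import numpy as np
import healpy as hp

# Simplified sketch of a masked-footprint galaxy power spectrum (not the DC1 pipeline):
# build an overdensity map from pixelized galaxy counts, then estimate C_ell.
nside = 512
npix = hp.nside2npix(nside)

counts = np.zeros(npix)   # galaxy counts per HEALPix pixel (fill from a validated catalog)
mask = np.zeros(npix)     # 1 inside the survey footprint (~40 deg^2 for DC1), 0 outside
# ... populate `counts` and `mask` here ...

in_footprint = mask > 0
mean_counts = counts[in_footprint].mean() if in_footprint.any() else 1.0
delta = np.zeros(npix)
delta[in_footprint] = counts[in_footprint] / mean_counts - 1.0   # overdensity, zero off-footprint

cl = hp.anafast(delta, lmax=1500)     # raw (pseudo) power spectrum of the masked field
fsky = in_footprint.mean()            # fraction of sky observed
cl_corrected = cl / max(fsky, 1e-12)  # crude correction; a real analysis uses a pseudo-C_ell estimator
```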
High precision astrometry provides the foundation to resolve many fundamental problems in astrophysics. The application of astrometric studies spans a wide range of fields, and has undergone enormous growth in recent years. This is a consequence of the increasing measurement precision and wide applicability, which is due in turn to the development of new techniques. Forthcoming next generation observatories have the potential to further increase the astrometric precision, provided there is a matching improvement in the methods to correct for systematic errors. The EVN and other observatories are providing demonstrations of these methods and are acting as pathfinders for next-generation telescopes such as the SKA and ngVLA. We will review the prospects for the coming facilities and examples of the current state of the art for astrometry.
By all measures, wireless networking has seen explosive growth over the past decade. Fourth Generation Long Term Evolution (4G LTE) cellular technology has increased the bandwidth available for smartphones, in essence delivering broadband speeds to mobile devices. The most recent 5G technology is further enhancing the transmission speeds and cell capacity, as well as reducing latency, through the use of different radio technologies, and is expected to provide Internet connections that are an order of magnitude faster than 4G LTE. Technology continues to advance rapidly, however, and the next generation, 6G, is already being envisioned. 6G will make possible a wide range of powerful new applications including holographic telepresence, telehealth, remote education, ubiquitous robotics and autonomous vehicles, smart cities and communities (IoT), and advanced manufacturing (Industry 4.0, sometimes referred to as the Fourth Industrial Revolution), to name but a few. The advances we will see begin at the hardware level and extend all the way to the top of the software stack. Artificial Intelligence (AI) will also start playing a greater role in the development and management of wireless networking infrastructure by becoming embedded in applications throughout all levels of the network. The resulting benefits to society will be enormous. At the same time that these exciting new wireless capabilities are rapidly appearing on the horizon, a broad range of research challenges looms ahead. These stem from the ever-increasing complexity of the hardware and software systems, along with the need to provide infrastructure that is robust and secure while simultaneously protecting the privacy of users. Here we outline some of those challenges and provide recommendations for the research that needs to be done to address them.