The calculation of the electron g-factor was carried out in 1950 by Karplus and Kroll. Seven years later, Petermann detected and corrected a serious error in the calculation of one Feynman diagram. Hard as it is to believe, neither the original calculation nor the subsequent correction was ever published. The entire prestige of QED and the Standard Model therefore rests on the calculation of a single Feynman diagram (IIc) that has never been published and cannot be independently verified. In this article we begin the search for any published recalculation of Feynman diagram IIc that would allow the theoretical calculation to be validated independently.
This article examines philosophical aspects of, and questions raised by, modern astrophysical research. Beyond cosmology, astrophysics deals in particular with understanding phenomena and processes operating at intermediate cosmic scales, which have so far rarely aroused philosophical interest. Confronted with Ian Hacking's attribution of antirealism on account of its purely observational character, astrophysics has developed a characteristic methodology for coping with the impossibility of direct interaction with most of its objects of research. In its attempt to understand the causal history of singular phenomena it resembles the historical sciences, while its search for general causal relations among classes of processes or objects can rely on the cosmic laboratory: the multitude of different phenomena and environments naturally provided by the universe. Furthermore, the epistemology of astrophysics relies heavily on models and simulations and on the complex treatment of large amounts of data.
Scientific research has always been a transnational (global) activity. In this respect it crosses several borders: national, cultural, and ideological. Even in times when physical borders separated the scientific community, scientists kept their minds open to ideas created beyond the walls and tried to communicate despite all the obstacles. An example of such activity in the field of physics is the journey through Western Europe undertaken in 1838 by a group of three scientists: Andreas Ettingshausen (professor at the University of Vienna), August Kunzek (professor at the University of Lviv), and P. Marian Koller (director of the observatory in Kremsmünster, Upper Austria). 155 years later, a lively scientific exchange began between physicists from Austria and Ukraine, in particular between the Institute for Condensed Matter Physics of the National Academy of Sciences of Ukraine in Lviv and the Institute for Theoretical Physics of Johannes Kepler University Linz. This became possible thanks to programs financed by national institutions, but it had its scientific background in historic scientific networks knotted much earlier, when Lviv was an international center of mathematics and the School of Statistical Thought arose in Vienna. Owing to this new collaboration, after the breakup of the Soviet Union Ukraine became the first country to join the Middle European Cooperation in Statistical Physics (MECO), founded in the early 1970s with the aim of bridging the gap between scientists from the Eastern and Western parts of Europe separated by the Iron Curtain.
The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question of whether society should support such costly basic-research programs. I address this question here by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.