We discuss the reception of Copernican astronomy by the Provençal humanists of the sixteenth and seventeenth centuries, beginning with Michel de Montaigne, who was the first to recognize the potential scientific and philosophical revolution represented by heliocentrism. We then describe how, after Kepler's Astronomia Nova of 1609 and the first telescopic observations by Galileo, it was in the south of France that the New Astronomy found its main promoters in the humanists and enlightened amateurs Nicolas-Claude Fabri de Peiresc and Pierre Gassendi. The professional astronomer Jean-Dominique Cassini, also from Provence, would later elevate the field to new heights in Paris.
A number of philosophers and scientists have discussed the possibility of inseparability between the subject (i.e., the observer) and the object (i.e., the observed universe). In particular, it has recently been proposed that this inseparability may be obtained through the discrete physical universe being filled with the observer's continuous consciousness through quantum evolution with time going backwards. The proposal of a universe view with interwoven matter and mind through cyclical time bears a resemblance to Immanuel Kant's discussion of the Copernican Revolution in philosophy, where the priority shifted from the object to the subject.
Typicality arguments attempt to use the Copernican Principle to draw conclusions about the cosmos and presently unknown conscious beings within it. The most notorious is the Doomsday Argument, which purports to constrain humanity's future from its current lifespan alone. These arguments rest on a likelihood calculation that penalizes models in proportion to the number of distinguishable observers. I argue that such reasoning leads to solipsism, the belief that one is the only being in the world, and is therefore unacceptable. Using variants of the Sleeping Beauty thought experiment as a guide, I present a framework for evaluating observations in a large cosmos: Fine Graining with Auxiliary Indexicals (FGAI). FGAI requires the construction of specific models of physical outcomes and observations. Valid typicality arguments then emerge from the combinatorial properties of third-person physical microhypotheses. Indexical (observer-relative) facts do not directly constrain physical theories. Instead, they serve to weight different provisional evaluations of credence. These weights define a probabilistic reference class of locations. As indexical knowledge changes, the weights shift. I show that the self-applied Doomsday Argument fails in FGAI, even though it can work for an external observer. I also discuss how FGAI could handle observations in large universes with Boltzmann brains.
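To make the likelihood penalty concrete, here is a minimal worked version of the generic Doomsday calculation (an illustration of the standard argument, not a formula taken from the abstract above). Assume one's birth rank $r$ is uniformly distributed among the $N$ humans who will ever live, so that
$$ P(r \mid N) = \frac{1}{N}, \qquad 1 \le r \le N . $$
For two hypotheses $N_1 < N_2$ that are both compatible with the observed $r$, the Bayes factor is $P(r \mid N_1)/P(r \mid N_2) = N_2/N_1$: the hypothesis with more observers is penalized in direct proportion to their number, which is exactly the structure of reasoning the abstract argues leads to unacceptable conclusions.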
The French Revolution brought principles of liberty, equality, and brotherhood to bear on the day-to-day challenges of governing what was then the largest country in Europe. Its experiments provided a model for future revolutions and democracies across the globe, but this first modern revolution had no model to follow. Using reconstructed transcripts of debates held in the Revolution's first parliament, we present a quantitative analysis of how this system managed innovation. We use information theory to track the creation, transmission, and destruction of patterns of word-use across over 40,000 speeches and more than one thousand speakers. The parliament as a whole was biased toward the adoption of new patterns, but speakers' individual qualities could break these overall trends. Speakers on the left innovated at higher rates, while speakers on the right acted, often successfully, to preserve prior patterns. Key players such as Robespierre (on the left) and Abbé Maury (on the right) played information-processing roles emblematic of their politics. Newly created organizational functions, such as the Assembly's President and committee chairs, had significant effects on debate outcomes, and a distinct transition appears midway through the parliament, when committees, external to the debate process, gained new powers to propose and dispose to the body as a whole. Taken together, these quantitative results align with existing qualitative interpretations but also reveal crucial information-processing dynamics that have hitherto been overlooked. Great orators had the public's attention, but deputies (mostly on the political left) who mastered the committee system gained new powers to shape revolutionary legislation.
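As an illustration of the kind of information-theoretic quantity involved (a sketch of one standard choice, not necessarily the exact estimator used in this study): if speech $j$ is summarized by a probability distribution $s^{(j)}$ over word-use patterns, its novelty relative to a window of the $w$ preceding speeches can be measured by the average Kullback-Leibler divergence
$$ \mathcal{N}(j) = \frac{1}{w} \sum_{d=1}^{w} D_{\mathrm{KL}}\!\left(s^{(j)} \,\middle\|\, s^{(j-d)}\right), $$
with an analogous forward-looking average measuring how long a pattern persists. High novelty followed by persistence marks the successful introduction of a new pattern; high novelty without persistence marks an innovation the body discarded.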
We pursue a program to confront observations with arbitrarily inhomogeneous cosmologies beyond the FLRW metric. The main idea is to test the Copernican principle rather than assuming it a priori. We consider the $\Lambda$CDM model endowed with a spherical $\Lambda$LTB inhomogeneity around us; that is, we assume isotropy and test the hypothesis of homogeneity. We confront the $\Lambda$LTB model with the latest available data from CMB, BAO, type Ia supernovae, local $H_0$, cosmic chronometers, Compton $y$-distortion and the kinetic Sunyaev-Zeldovich effect. We find that these data can tightly constrain this extra inhomogeneity, almost to the cosmic-variance level: on scales $\gtrsim 100$ Mpc, structures can have a small non-Copernican effective contrast of just $\delta_L \sim 0.01$. Furthermore, the constraints on the standard $\Lambda$CDM parameters are not weakened after marginalizing over the parameters that model the local structure, to which we assign ignorance priors. In other words, dropping the FLRW metric assumption does not imply worse constraints on the cosmological parameters. This positive result confirms that present and future data can be meaningfully analyzed within the framework of inhomogeneous cosmology.
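For context, "marginalizing over the parameters that model the local structure" means reporting, for the standard parameters $\theta$, the posterior obtained by integrating out the inhomogeneity parameters $\phi$ under their ignorance priors (shown here schematically, not as the specific pipeline of the analysis):
$$ P(\theta \mid D) \;=\; \int P(\theta, \phi \mid D)\, d\phi \;\propto\; \int \mathcal{L}(D \mid \theta, \phi)\, \pi(\theta)\, \pi(\phi)\, d\phi . $$
The quoted result is that the width of this marginal posterior in $\theta$ is essentially unchanged when the $\Lambda$LTB degrees of freedom $\phi$ are added.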
This paper presents several measurements of total production cross sections and total inelastic cross sections for the following reactions: $\pi^{+}$+C, $\pi^{+}$+Al, $K^{+}$+C and $K^{+}$+Al at 60 GeV/c, and $\pi^{+}$+C and $\pi^{+}$+Al at 31 GeV/c. The measurements were made using the NA61/SHINE spectrometer at the CERN SPS. Comparisons with previous measurements are given, and good agreement is seen. These interaction cross-section measurements are a key ingredient for predicting the neutrino flux arising from reinteractions of secondary hadrons in current and future accelerator-based long-baseline neutrino experiments.
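As background on how such numbers are typically extracted (a generic beam-attenuation relation, given here for illustration rather than as the NA61/SHINE analysis itself): for a thin target of thickness $L$, atomic mass $A$ and density $\rho$, a measured interaction probability $P_{\mathrm{int}}$ translates into a cross section via
$$ \sigma \;=\; \frac{A}{\rho\, L\, N_A}\, \ln\!\left(\frac{1}{1 - P_{\mathrm{int}}}\right), $$
where $N_A$ is Avogadro's number; target-removed data are commonly used to subtract interactions occurring outside the target, and the production cross section follows from the inelastic one by removing the quasi-elastic contribution.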