The tolerancing process links the virtual and the real worlds. From the former, tolerances define a variational geometrical language (geometric parameters). From the latter come the values limiting those parameters. The tolerancing process begins in this duality. Because high-precision assemblies cannot be analyzed under the assumption that form errors are negligible, we apply this process to assemblies with form errors through a new method that parameterizes forms and solves their assemblies. The assembly process is computed by a method that solves the 3D assembly of pairs of surfaces having form errors using a static equilibrium. We have built a geometrical model based on the modal shapes of the ideal surface. We compute, fully deterministically, the contact points between such a pair of shapes according to a given assembly process. The solution gives an accurate evaluation of the assembly performance. We then compare the results with and without taking the form errors into account. When analyzing a batch of assemblies, the problem is to compute the nonconformity rate of a pilot production with respect to the functional requirements. We input probable surface errors (position, orientation, and form) into our calculation and evaluate the quality of the results against the functional requirements. The pilot production can then be validated or rejected.
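As a schematic illustration of the modal parameterization (our notation here, not necessarily the paper's), the form deviation $\delta$ of a real surface can be expanded on the modal shapes $Q_i$ of the ideal surface:
\[
\delta(u,v) \;\simeq\; \sum_{i=1}^{n} \lambda_i \, Q_i(u,v),
\]
where the $Q_i$ may be taken, for instance, as the natural vibration modes of the ideal surface, and the modal coefficients $\lambda_i$ parameterize the form error; truncating the sum at $n$ modes fixes the dimension of the variational geometric language.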
Causal inference with observational data can be performed under an assumption of no unobserved confounders (the unconfoundedness assumption). There is, however, seldom clear subject-matter or empirical evidence for such an assumption. We therefore develop uncertainty intervals for average causal effects based on outcome regression estimators and doubly robust estimators, which provide inference taking into account both sampling variability and uncertainty due to unobserved confounders. In contrast with sampling variability, uncertainty due to unobserved confounding does not decrease with increasing sample size. The intervals introduced are obtained by deriving the bias of the estimators due to unobserved confounders. We are thus also able to contrast the size of the bias due to violation of the unconfoundedness assumption with the bias due to misspecification of the models used to explain potential outcomes. This is illustrated through numerical experiments, where bias due to moderate unobserved confounding dominates misspecification bias in typical situations in terms of sample size and modeling assumptions. We also study the empirical coverage of the uncertainty intervals introduced and apply the results to a study of the effect of regular food intake on health. An R package implementing the proposed inference is available.
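Schematically (a generic construction in our own notation, not necessarily the exact intervals of the paper), if $b(\rho)$ denotes the derived bias of an estimator $\hat\theta$ under a sensitivity parameter $\rho$ assumed bounded by $|\rho|\le\bar\rho$, an uncertainty interval widens a conventional confidence interval by the bias range:
\[
\Big[\, \hat\theta - \max_{|\rho|\le\bar\rho} b(\rho) - z_{1-\alpha/2}\,\widehat{\mathrm{se}}(\hat\theta),\;\;
\hat\theta - \min_{|\rho|\le\bar\rho} b(\rho) + z_{1-\alpha/2}\,\widehat{\mathrm{se}}(\hat\theta) \,\Big].
\]
Because the bias term does not shrink with sample size while $\widehat{\mathrm{se}}$ does, such an interval converges to the bias range rather than to a point, which is exactly the behavior described above.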
We present a renormalization approach to solving the Sznajd opinion formation model on complex networks. For the case of two opinions, we present an expression for the probability of reaching consensus on a given opinion as a function of the initial fraction of agents holding that opinion. The calculations reproduce the sharp transition of the model on a fixed network, as well as the recently observed smooth function for the model when simulated on growing complex networks.
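For comparison with such analytical expressions, the consensus (exit) probability can also be estimated by direct simulation. Below is a minimal sketch of a two-opinion Sznajd update on a networkx graph; the pairing rule used here is one common network variant of the model, and the paper's exact update and network model may differ:

```python
import random
import networkx as nx  # used only to supply the network topology

def sznajd_run(G, p_plus, seed=0, max_steps=1_000_000):
    """One realization of the two-opinion Sznajd model on graph G.

    A randomly chosen pair of agreeing neighbors convinces all of
    its neighbors. Returns +1 or -1 on consensus, None on timeout.
    """
    rng = random.Random(seed)
    opinion = {v: 1 if rng.random() < p_plus else -1 for v in G}
    nodes = list(G)
    for _ in range(max_steps):
        i = rng.choice(nodes)
        if not G[i]:                              # isolated node: skip
            continue
        j = rng.choice(list(G[i]))
        if opinion[i] == opinion[j]:              # agreeing pair ...
            for k in set(G[i]) | set(G[j]):       # ... convinces its neighborhood
                opinion[k] = opinion[i]
        values = set(opinion.values())
        if len(values) == 1:                      # consensus reached
            return values.pop()
    return None

# Exit probability on a growing (Barabasi-Albert) network, estimated
# over independent runs for an initial fraction p_plus of +1 opinions.
G = nx.barabasi_albert_graph(300, m=3, seed=1)
results = [sznajd_run(G, p_plus=0.6, seed=s) for s in range(200)]
print(sum(r == 1 for r in results) / len(results))
```

Sweeping `p_plus` over $[0,1]$ and plotting the estimated consensus probability reproduces the sharp (fixed network) versus smooth (growing network) behavior discussed above.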
We consider the realization of a quantum computer in a chain of nuclear spins coupled by an Ising interaction. Quantum algorithms can be performed with the help of appropriate radio-frequency pulses. In addition to the standard nearest-neighbor Ising coupling, we also allow for a second-neighbor coupling. It is shown how to apply the $2\pi k$ method in this more general setting, where the additional coupling can save a few pulses. We illustrate our results with two numerical simulations: the Shor prime factorization of the number 4 and the teleportation of a qubit along a chain of 3 qubits. In both cases, the optimal Rabi frequency (to suppress non-resonant effects) depends primarily on the strength of the second-neighbor interaction.
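For orientation (a schematic Hamiltonian in our own notation; the signs and conventions are assumptions), the register is a chain of spins with Larmor frequencies $\omega_k$ and nearest- and second-neighbor Ising couplings $J$ and $J'$:
\[
H \;=\; -\hbar \sum_k \omega_k\, I_k^z \;-\; 2\hbar J \sum_k I_k^z I_{k+1}^z \;-\; 2\hbar J' \sum_k I_k^z I_{k+2}^z .
\]
In the standard $2\pi k$ method, the Rabi frequency $\Omega$ of a resonant $\pi$ pulse is chosen so that an unwanted near-resonant transition, detuned by $\Delta$ (a detuning set here by the couplings $J$ and $J'$), completes an integer number of full rotations during the pulse:
\[
\sqrt{\Delta^2+\Omega^2}\;\frac{\pi}{\Omega} \;=\; 2\pi k
\quad\Longrightarrow\quad
\Omega \;=\; \frac{\Delta}{\sqrt{4k^2-1}}, \qquad k = 1, 2, \dots,
\]
which is consistent with the optimal Rabi frequency depending primarily on the strength of the second-neighbor interaction.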
We investigate the effect of higher-order radiative corrections in unpolarized electron-proton elastic scattering and compare with the lowest-order calculations that are usually applied to experimental data. Particular attention is devoted to the $\epsilon$ dependence of the radiative corrections, which is directly related to the electric proton form factor. We consider in particular the effects of the interference terms for soft and hard photon emission. Both the quadratic amplitude describing collinear emission along the scattered electron and its interference with the amplitudes for emission from the initial electron and from the proton are important in leading and next-to-leading approximation, and they may compensate each other under particular kinematical conditions.
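For reference, at Born level the reduced elastic cross section is linear in the virtual-photon polarization $\epsilon$ (standard Rosenbluth notation):
\[
\sigma_{\mathrm{red}}(\epsilon,Q^2) \;=\; \tau\, G_M^2(Q^2) + \epsilon\, G_E^2(Q^2),
\qquad
\epsilon = \left[1 + 2(1+\tau)\tan^2\frac{\theta_e}{2}\right]^{-1},
\quad
\tau = \frac{Q^2}{4M^2},
\]
so any $\epsilon$-dependent radiative correction distorts the slope from which $G_E$ is extracted; this is why the $\epsilon$ dependence of the corrections bears directly on the electric form factor.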
In a recent paper (Chabrier et al. 2019), we derived a new equation of state (EOS) for dense hydrogen/helium mixtures, covering the temperature-density domain from solar-type stars to brown dwarfs and gaseous planets. That EOS is based on the so-called additive volume law and thus does not take into account the interactions between the hydrogen and helium species. In the present paper, we go beyond those calculations by taking into account H/He interactions, derived from quantum molecular dynamics simulations. These interactions, which eventually lead to H/He phase separation, become important at low temperature and high density, in the domain of brown dwarfs and giant planets. The tables of this new EOS are made publicly available.
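For context, the additive volume law underlying the earlier EOS combines the pure-species tables at fixed pressure and temperature, with hydrogen and helium mass fractions $X$ and $Y$ (a standard statement of the rule; the mixing-entropy treatment in the published tables may include further terms):
\[
\frac{1}{\rho(P,T)} \;=\; \frac{X}{\rho_{\mathrm{H}}(P,T)} + \frac{Y}{\rho_{\mathrm{He}}(P,T)},
\qquad
S(P,T) \;=\; X\, S_{\mathrm{H}}(P,T) + Y\, S_{\mathrm{He}}(P,T) + S_{\mathrm{mix}}.
\]
By construction this ansatz carries no H/He interaction term, which is precisely what the quantum-molecular-dynamics-based corrections of the present work supply.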