
What to do if N is two?

 Added by Pascal Fries
 Publication date 2021
Research language: English





The field of in-vivo neurophysiology currently uses statistical standards that are based on tradition rather than formal analysis. Typically, data from two (or a few) animals are pooled for one statistical test, or a significant test in a first animal is replicated in one (or a few) further animals. The use of more than one animal is widely believed to allow an inference on the population. Here, we explain that a useful inference on the population would require larger numbers and a different statistical approach. The field should consider performing studies at that standard, potentially through coordinated multi-center efforts, for selected questions of exceptional importance. Yet, for many questions, this is ethically and/or economically not justifiable. We explain why, in those studies with two (or a few) animals, any useful inference is limited to the sample of investigated animals, irrespective of whether it is based on a few animals, two animals or a single animal.
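The statistical point can be illustrated with a small simulation. The following Python sketch is not taken from the paper; the variance parameters (trial-to-trial and animal-to-animal spread) are hypothetical assumptions. It shows that pooling all trials from two animals into a single t-test can yield a false-positive rate far above the nominal level when the population-average effect is zero but individual animals differ, so the pooled test only supports a claim about those two animals, not the population.

```python
# Minimal sketch (not from the paper): pooling trials from N=2 animals
# versus population-level inference. All parameters below are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, n_animals, n_trials = 2000, 2, 100
between_sd, within_sd = 1.0, 1.0   # animal-to-animal and trial-to-trial spread

false_positives = 0
for _ in range(n_sims):
    # True population-average effect is 0; each animal has its own idiosyncratic effect.
    animal_effects = rng.normal(0.0, between_sd, size=n_animals)
    trials = np.concatenate(
        [rng.normal(mu, within_sd, size=n_trials) for mu in animal_effects]
    )
    # Pooling all trials into one t-test treats trials as independent draws
    # from the population, which they are not (they cluster within animals).
    t, p = stats.ttest_1samp(trials, 0.0)
    false_positives += (p < 0.05)

print(f"Nominal alpha 0.05, observed false-positive rate: {false_positives / n_sims:.2f}")
```

With between-animal variability present, the observed rate greatly exceeds 0.05; a population-level claim would instead require treating the animal (not the trial) as the unit of replication, with correspondingly larger numbers of animals.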



Related research

124 - C. H. Bryan Liu 2019
Experimentation and Measurement (E&M) capabilities allow organizations to accurately assess the impact of new propositions and to experiment with many variants of existing products. However, until now, the question of measuring the measurer, or valuing the contribution of an E&M capability to organizational success has not been addressed. We tackle this problem by analyzing how, by decreasing estimation uncertainty, E&M platforms allow for better prioritization. We quantify this benefit in terms of expected relative improvement in the performance of all new propositions and provide guidance for how much an E&M capability is worth and when organizations should invest in one.
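The benefit of lower estimation uncertainty for prioritization can be sketched with a toy simulation. The code below is not the paper's model; the value distribution, noise levels, and top-k selection rule are illustrative assumptions, meant only to show that ranking propositions by a less noisy estimate yields a higher realized value for the selected subset.

```python
# Toy sketch (illustrative assumptions only): prioritize candidate propositions
# by a noisy estimate of their true impact and compare the realized value.
import numpy as np

rng = np.random.default_rng(1)
n_sims, n_candidates, k = 5000, 20, 5   # pick the top-5 of 20 propositions

def realized_value(noise_sd: float) -> float:
    totals = 0.0
    for _ in range(n_sims):
        true_value = rng.normal(0.0, 1.0, size=n_candidates)   # true impact per proposition
        estimate = true_value + rng.normal(0.0, noise_sd, size=n_candidates)
        chosen = np.argsort(estimate)[-k:]                      # prioritize by estimated impact
        totals += true_value[chosen].sum()
    return totals / n_sims

low_noise = realized_value(noise_sd=0.2)    # strong E&M capability (small estimation error)
high_noise = realized_value(noise_sd=2.0)   # weak measurement (large estimation error)
print(f"mean realized value, low noise: {low_noise:.2f}, high noise: {high_noise:.2f}")
print(f"relative improvement from better measurement: {(low_noise - high_noise) / abs(high_noise):.1%}")
```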
In the system of a gravitating Q-ball, there is inevitably a maximum charge $Q_{\rm max}$, while in flat spacetime there is no upper bound on $Q$ in typical models such as the Affleck-Dine model. Theoretically the charge $Q$ is a free parameter, and phenomenologically it could increase by charge accumulation. We address the question of what happens to Q-balls if $Q$ is close to $Q_{\rm max}$. First, without specifying a model, we show analytically that inflation cannot take place in the core of a Q-ball, contrary to the claim of previous work. Next, for the Affleck-Dine model, we analyze perturbations of equilibrium solutions with $Q \approx Q_{\rm max}$ by numerical analysis of the dynamical field equations. We find that the extremal solution with $Q = Q_{\rm max}$ and the unstable solutions around it are critical solutions, which mark the threshold of black-hole formation.
We highlight that the anomalous orbits of Trans-Neptunian Objects (TNOs) and an excess in microlensing events in the 5-year OGLE dataset can be simultaneously explained by a new population of astrophysical bodies with mass several times that of Earth ($M_\oplus$). We take these objects to be primordial black holes (PBHs) and point out that the orbits of TNOs would be altered if one of these PBHs were captured by the Solar System, in line with the Planet 9 hypothesis. Capture of a free-floating planet is a leading explanation for the origin of Planet 9, and we show that the probability of capturing a PBH instead is comparable. The observational constraints on a PBH in the outer Solar System differ significantly from the case of a new ninth planet. This scenario could be confirmed through annihilation signals from the dark matter microhalo around the PBH.
121 - Jiri J. Mares 2016
Temperature, the central concept of thermal physics, is one of the most frequently employed physical quantities in common practice. Even though the operative methods of temperature measurement are described in detail in various practical instructions and textbooks, a rigorous treatment of this concept is almost lacking in the current literature. As a result, the answer to the seemingly simple question of what temperature is turns out to be by no means trivial and unambiguous. There is especially an appreciable gap between temperature as introduced in the framework of statistical theory and the only experimentally observable quantity related to this concept, the phenomenological temperature. The logical and epistemological analysis of the present concept of phenomenological temperature is the kernel of this contribution.
127 - Carl H. Gibson 2012
Turbulence is defined as an eddy-like state of fluid motion where the inertial-vortex forces of the eddies are larger than any other forces that tend to damp the eddies out. By this definition, turbulence always cascades from small scales (where the vorticity is created) to larger scales (where other forces dominate and the turbulence fossilizes). Fossil turbulence is any perturbation in a hydrophysical field produced by turbulence that persists after the fluid is no longer turbulent at the scale of the perturbation. Fossil turbulence patterns and fossil turbulence waves preserve and propagate information about previous turbulence to larger and smaller length scales. Big bang fossil turbulence patterns are identified in anisotropies of temperature detected by space telescopes in the cosmic microwave background. Direct numerical simulations of stratified shear flows and wakes show that turbulence and fossil turbulence interactions are recognizable and persistent.
