Application designers often face the question of whether to store large objects in a filesystem or in a database. Often this decision is made for the sake of application design simplicity; sometimes performance measurements are also used. This paper looks at the question of fragmentation, one of the operational issues that can affect the performance and/or manageability of the system as deployed over the long term. As expected from the common wisdom, objects smaller than 256KB are best stored in a database, while objects larger than 1MB are best stored in the filesystem. Between 256KB and 1MB, the read:write ratio and the rate of object overwrite or replacement are important factors. We used the notion of storage age, the number of object overwrites, as a way of normalizing wall-clock time. Storage age allows our results, or similar results, to be applied across a range of read:write ratios and object replacement rates.
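The thresholds above translate into a simple decision rule. The sketch below encodes it in Python; the 256KB and 1MB cut-offs come from the abstract, while the function names, the read-fraction tie-breaker for the middle range, and the storage_age helper are illustrative assumptions rather than an implementation from the paper.

```python
# Minimal sketch of the size-based rule of thumb above. The 256KB and 1MB
# thresholds come from the abstract; the function names, the read-fraction
# tie-breaker, and the storage_age helper are illustrative assumptions.

KB = 1024
MB = 1024 * KB

def preferred_store(size_bytes: int, read_fraction: float = 0.9) -> str:
    """Suggest where to keep a large object based on its size."""
    if size_bytes < 256 * KB:
        return "database"        # small objects fare better in the database
    if size_bytes > 1 * MB:
        return "filesystem"      # large objects fare better in the filesystem
    # In between, the read:write ratio and overwrite rate matter; a read-heavy,
    # rarely overwritten workload is treated here as favoring the filesystem.
    return "filesystem" if read_fraction >= 0.5 else "database"

def storage_age(total_overwrites: int, live_objects: int) -> float:
    """'Storage age': the average number of overwrites per live object,
    used instead of wall-clock time to compare workloads."""
    return total_overwrites / max(live_objects, 1)

print(preferred_store(100 * KB))   # -> database
print(preferred_store(4 * MB))     # -> filesystem
print(storage_age(5000, 1000))     # -> 5.0
```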
When an object impacts the free surface of a liquid, it ejects a splash curtain upwards and creates an air cavity below the free surface. As the object descends into the liquid, the air cavity eventually closes under the action of hydrostatic pressure.
We review some aspects, especially those we can tackle analytically, of a minimal model of a closed economy analogous to the kinetic theory model of ideal gases, in which the agents exchange wealth amongst themselves such that the total wealth is conserved.
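A model of this kind is straightforward to simulate. The sketch below assumes the usual ideal-gas-style exchange rule, in which a randomly chosen pair of agents pools its wealth and repartitions it at random, so the total is conserved by construction; the specific rule and all parameter values are illustrative assumptions.

```python
# Minimal sketch of a conserved-wealth kinetic exchange model of the kind
# described above. The exchange rule (random repartition of a random pair's
# combined wealth) and the parameter values are assumptions for illustration.
import random

def simulate(n_agents=1000, n_exchanges=100_000, seed=0):
    rng = random.Random(seed)
    wealth = [1.0] * n_agents            # everyone starts with unit wealth
    for _ in range(n_exchanges):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        pool = wealth[i] + wealth[j]     # a pairwise exchange conserves the pool
        eps = rng.random()
        wealth[i], wealth[j] = eps * pool, (1 - eps) * pool
    return wealth

w = simulate()
print(f"total wealth: {sum(w):.6f}")     # stays equal to n_agents (conserved)
```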
In cases where both components of a binary system show oscillations, asteroseismology has been proposed as a method to identify the system. For KIC 2568888, observed with $Kepler$, we detect oscillation modes for two red giants in a single power density spectrum.
Instrumental variables (IVs) are extensively used to estimate treatment effects when the treatment and outcome are confounded by unmeasured confounders; however, weak IVs are often encountered in empirical studies and may cause problems.
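For a single instrument Z, treatment D, and outcome Y, the basic IV (Wald) estimator is the ratio Cov(Z, Y) / Cov(Z, D). The sketch below computes it on simulated data with an unmeasured confounder, under assumed data-generating values; the weak-IV problem shows up as a small denominator Cov(Z, D), which makes the estimate unstable.

```python
# Minimal sketch of the single-instrument IV (Wald) estimator,
# beta_IV = Cov(Z, Y) / Cov(Z, D), on simulated data with an unmeasured
# confounder U. The data-generating values are illustrative assumptions.
import random

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def iv_estimate(n=10_000, beta=2.0, z_strength=1.0, seed=0):
    rng = random.Random(seed)
    zs, ds, ys = [], [], []
    for _ in range(n):
        z = rng.gauss(0, 1)                       # instrument
        u = rng.gauss(0, 1)                       # unmeasured confounder
        d = z_strength * z + u + rng.gauss(0, 1)  # treatment, confounded by u
        y = beta * d + u + rng.gauss(0, 1)        # outcome, confounded by u
        zs.append(z); ds.append(d); ys.append(y)
    return cov(zs, ys) / cov(zs, ds)

print(iv_estimate(z_strength=1.0))   # close to the true effect beta = 2
print(iv_estimate(z_strength=0.05))  # weak instrument: noisy, unreliable
```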
With the advent of increasingly elaborate experimental techniques in physics, chemistry and materials sciences, measured data are becoming bigger and more complex. The observables are typically a function of several stimuli, resulting in multidimensional data sets.