Time-resolved radiography can be used to obtain absolute shock Hugoniot states by simultaneously measuring at least two mechanical parameters of the shock, and this technique is particularly suitable for one-dimensional converging shocks where a single experiment probes a range of pressures as the converging shock strengthens. However, at sufficiently high pressures, the shocked material becomes hot enough that the x-ray opacity falls significantly. If the system includes a Lagrangian marker, such that the mass within the marker is known, this additional information can be used to constrain the opacity as well as the Hugoniot state. In the limit that the opacity changes only on shock heating, and not significantly on subsequent isentropic compression, the opacity of shocked material can be determined uniquely. More generally, it is necessary to assume the form of the variation of opacity with isentropic compression, or to introduce multiple marker layers. Alternatively, assuming either the equation of state or the opacity, the presence of a marker layer in such experiments enables the non-assumed property to be deduced more accurately than from the radiographic density reconstruction alone. An example analysis is shown for measurements of a converging shock wave in polystyrene, at the National Ignition Facility.
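For context, the "two mechanical parameters" determine the Hugoniot state through the standard Rankine-Hugoniot jump conditions; these are textbook relations, not specific to this work:

```latex
\begin{align}
  \rho_0 u_s &= \rho\,(u_s - u_p) && \text{(mass conservation)}\\
  p - p_0 &= \rho_0\, u_s\, u_p && \text{(momentum conservation)}\\
  e - e_0 &= \tfrac{1}{2}\,(p + p_0)\,(v_0 - v),\quad v \equiv 1/\rho && \text{(energy conservation)}
\end{align}
```

Here $u_s$ and $u_p$ are the shock and particle speeds and the subscript 0 denotes the unshocked state; measuring any two of $(u_s, u_p, \rho, p)$ fixes the remaining state variables on the Hugoniot.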
In this study we show that standard, well-known file compression programs (zlib, bzip2, etc.) are able to forecast real-world time series data well. The strength of our approach is its ability to use a set of data compression algorithms and automatically select the best one for the data at hand.
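A minimal sketch of the underlying idea (the function names and toy series below are illustrative, not from the paper): treat compressed length as a proxy for probability, and predict the next symbol as the one whose appending makes the sequence compress best.

```python
import zlib

def compressed_len(data: bytes) -> int:
    """Length of the zlib-compressed data at maximum compression level."""
    return len(zlib.compress(data, 9))

def predict_next(history: str, alphabet: str) -> str:
    """Pick the symbol whose continuation compresses best.

    A continuation that fits the patterns already in `history` extends
    existing matches and so compresses at least as well as one that
    breaks them.
    """
    base = history.encode()
    return min(alphabet, key=lambda s: compressed_len(base + s.encode()))

# Usage: a periodic binary series ending in "1" is continued with "0".
series = "01" * 50
prediction = predict_next(series, "01")
```

Real forecasters built on this idea use several compressors and finer probability estimates, but the core mechanism is exactly this comparison of compressed lengths.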
Convergent Cross-Mapping (CCM) has shown high potential for performing causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps.
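As a concrete illustration of this test system, here is a minimal sketch (parameter values and function names are ours, not the authors' code) of bidirectionally coupled logistic maps together with a simplified CCM skill estimate:

```python
import numpy as np

def coupled_logistic(n, r_x=3.8, r_y=3.5, b_xy=0.02, b_yx=0.1, seed=0):
    """Bidirectionally coupled logistic maps; b_yx sets how strongly X drives Y."""
    rng = np.random.default_rng(seed)
    x, y = np.empty(n), np.empty(n)
    x[0], y[0] = rng.uniform(0.2, 0.8, size=2)
    for t in range(n - 1):
        x[t + 1] = x[t] * (r_x - r_x * x[t] - b_xy * y[t])
        y[t + 1] = y[t] * (r_y - r_y * y[t] - b_yx * x[t])
    return x, y

def ccm_skill(source, target, E=2, tau=1):
    """Cross-map `source` from the delay embedding of `target`.

    High correlation between cross-mapped and true values of `source`
    is taken as evidence that `source` causally drives `target`.
    """
    idx = np.arange((E - 1) * tau, len(target))
    M = np.column_stack([target[idx - k * tau] for k in range(E)])
    pred = np.empty(len(idx))
    for i in range(len(idx)):
        d = np.linalg.norm(M - M[i], axis=1)
        d[i] = np.inf                      # exclude the query point itself
        nn = np.argsort(d)[:E + 1]         # E+1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn[0]], 1e-12))
        pred[i] = (w / w.sum()) @ source[idx[nn]]
    return np.corrcoef(pred, source[idx])[0, 1]
```

Sweeping `b_yx` and adding observational noise to `x` and `y` before calling `ccm_skill` reproduces the kind of robustness study described above.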
We present version 3.2 of the LanHEP software package. New features include UFO output, colour-sextet particles, and new substitution techniques that allow the user to define new routines.
Suppose there is a large file which should be transmitted (or stored) and there are several (say, m) admissible data compressors. It seems natural to try all the compressors and then choose the best, i.e., the one that gives the shortest compressed file.
Time series data compression is emerging as an important problem with the growth in IoT devices and sensors. Due to the presence of noise in these datasets, lossy compression can often provide significant compression gains without impacting performance.
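One simple way such lossy gains arise is quantise-then-compress: round each sample to a user-chosen error tolerance, delta-code, and hand the result to a lossless compressor. A minimal sketch (this particular scheme is ours, not a description of any specific codec):

```python
import zlib

import numpy as np

def lossy_compress(x: np.ndarray, tol: float) -> bytes:
    """Quantise to within +/- tol, delta-code, then zlib-compress.

    Rounding x to the nearest multiple of 2*tol bounds the absolute
    reconstruction error by tol; the deltas of slowly varying signals
    are small integers that zlib compresses well.
    """
    q = np.round(x / (2 * tol)).astype(np.int64)
    deltas = np.diff(q, prepend=0)          # first delta carries q[0]
    return zlib.compress(deltas.tobytes(), 9)

def lossy_decompress(blob: bytes, tol: float) -> np.ndarray:
    """Invert lossy_compress up to the +/- tol quantisation error."""
    deltas = np.frombuffer(zlib.decompress(blob), dtype=np.int64)
    return np.cumsum(deltas) * (2 * tol)
```

On a noisy sine wave this typically shrinks the data well below the raw 8 bytes per float64 sample while keeping every sample within the stated tolerance.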