Numerical simulation models used in hydraulic engineering take a wide array of data into account to produce predictions: rainfall contribution to the drainage basin (characterized by soil nature, infiltration capacity and moisture), current water height in the river, topography, nature and geometry of the river bed, etc. These data are affected by uncertainties stemming from imperfect knowledge of the field, measurement errors on the physical parameters that calibrate the governing equations, approximations in those equations themselves, etc. These uncertainties can lead the model to overestimate or underestimate the flow and height of the river. Moreover, complex assimilation models often require numerous evaluations of physical solvers to quantify these uncertainties, limiting their use in some real-time operational applications. In this study, we explore the possibility of building a predictor for river height at an observation point from drainage-basin time-series data. An array of data-driven techniques is assessed for this task, including statistical models, machine learning techniques and deep neural network approaches, evaluated on several metrics to give an overview of the possibilities offered by hydraulic time series. An important finding is that, for the same hydraulic quantity, the best predictors vary depending on whether the data come from a physical model or from real observations.
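As an illustration of the simplest end of the assessed spectrum, the sketch below fits a purely statistical baseline (an autoregressive least-squares predictor on lagged values) to a synthetic river-height series; the signal, lag count, and train/test split are illustrative assumptions, not the study's actual data or setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "river height" series: a seasonal signal plus noise
# (a stand-in for real gauge observations).
t = np.arange(500)
height = 2.0 + np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(500)

def lag_matrix(series, n_lags):
    """Design matrix whose row for time t holds the n_lags previous values."""
    X = np.array([series[i - n_lags:i] for i in range(n_lags, len(series))])
    y = series[n_lags:]
    return X, y

X, y = lag_matrix(height, n_lags=5)
split = 400
X_tr, y_tr, X_te, y_te = X[:split], y[:split], X[split:], y[split:]

# Ordinary least squares (with intercept) as the simplest statistical baseline.
coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(X_tr)), X_tr], y_tr, rcond=None)
pred = np.c_[np.ones(len(X_te)), X_te] @ coef

rmse = float(np.sqrt(np.mean((pred - y_te) ** 2)))
```

The same lag-matrix construction feeds the more elaborate models (gradient boosting, recurrent networks) mentioned in the abstract; only the regressor changes.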
4D acoustic imaging via an array of 32 sources / 32 receivers is used to monitor a hydraulic fracture propagating in a 250~mm cubic specimen under a true-triaxial state of stress. We present a method based on the arrivals of diffracted waves to reconstruct the fracture geometry (and the fluid front when distinct from the fracture front). Using Bayesian model selection, we rank different possible fracture geometries (radial, elliptical, tilted or not) and estimate the model error. The imaging is repeated every 4 seconds and provides a quantitative measurement of the growth of these low-velocity fractures. We test the proposed method on two experiments performed in two different rocks (marble and gabbro) under experimental conditions characteristic of the fluid lag-viscosity dominated (marble) and toughness dominated (gabbro) hydraulic fracture propagation regimes. In both experiments, about 150 to 200 source-receiver combinations exhibit clear diffracted-wave arrivals. The results of the inversion indicate a radial geometry that evolves slightly into an ellipse towards the end of the experiment, when the fractures feel the specimen boundaries. The estimated modelling error for all models is of the order of the wave-arrival picking error. Posterior estimates indicate an uncertainty of the order of a millimeter on the fracture front location for a given acquisition sequence. The fracture evolution reconstructed from diffracted waves is shown to be consistent with the analysis of $90^{\circ}$-incidence transmitted waves across the growing fracture.
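The geometry ranking can be illustrated with a deliberately simplified stand-in: instead of full Bayesian evidence, the sketch below compares a radial and an elliptical front model on synthetic front picks using the BIC (a large-sample approximation of the evidence). The radius, noise level, and linearized ellipse parametrization are all assumptions for illustration, not the paper's actual inversion.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic fracture-front picks: a circular front of radius 40 mm
# observed with ~1 mm picking noise (all values are illustrative).
n = 60
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
r_true = 40.0
x = r_true * np.cos(theta) + rng.normal(0.0, 1.0, n)
y = r_true * np.sin(theta) + rng.normal(0.0, 1.0, n)

def bic(rss, n_obs, k):
    """Bayesian information criterion for Gaussian residuals (lower is better)."""
    return n_obs * np.log(rss / n_obs) + k * np.log(n_obs)

z = np.hypot(x, y) ** 2  # squared radial distance of each pick

# Radial model: constant squared radius (1 parameter).
rss_radial = float(np.sum((z - z.mean()) ** 2))

# Elliptical model: adds cos/sin(2*theta) terms capturing elongation
# and orientation (3 parameters, a linearized parametrization).
A = np.c_[np.ones(n), np.cos(2 * theta), np.sin(2 * theta)]
coef, *_ = np.linalg.lstsq(A, z, rcond=None)
rss_ellipse = float(np.sum((z - A @ coef) ** 2))

bic_radial, bic_ellipse = bic(rss_radial, n, 1), bic(rss_ellipse, n, 3)
print("radial preferred:", bic_radial < bic_ellipse)
```

Because the models are nested, the ellipse always fits at least as well; the BIC penalty is what lets the simpler radial geometry win when the data do not support elongation.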
Current system thermal-hydraulic codes have limited credibility in simulating real plant conditions, especially when the geometry and boundary conditions are extrapolated beyond the range of test facilities. This paper proposes a data-driven approach, Feature Similarity Measurement (FSM), to establish a technical basis for overcoming these difficulties by exploring local patterns using machine learning. The underlying local patterns in multiscale data are represented by a set of physical features that embody the information from the physical system of interest, empirical correlations, and the effect of mesh size. After performing a limited number of high-fidelity numerical simulations and a sufficient number of fast-running coarse-mesh simulations, an error database is built, and deep learning is applied to construct and explore the relationship between the local physical features and the simulation errors. Case studies based on mixed convection have been designed to demonstrate the capability of data-driven models to bridge global-scale gaps.
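The core idea, a learned mapping from local physical features to coarse-mesh simulation error, can be sketched with a much simpler stand-in than the paper's deep networks: a nearest-neighbour lookup in feature space against a synthetic error database. The feature dimensions, error model, and neighbour count below are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy error database: each row is a vector of local physical features,
# paired with the coarse-mesh simulation error observed there
# (all values synthetic placeholders, not real simulation data).
features = rng.uniform(0.0, 1.0, (500, 4))
errors = 0.3 * features[:, 0] - 0.1 * features[:, 2] \
         + 0.01 * rng.standard_normal(500)

def predict_error(query, k=10):
    """Feature-similarity stand-in for the learned feature-to-error mapping:
    average the errors of the k most similar database entries."""
    dist = np.linalg.norm(features - query, axis=1)
    return float(errors[np.argsort(dist)[:k]].mean())

# Predicted error for a query cell sitting in the middle of feature space.
est = predict_error(np.array([0.5, 0.5, 0.5, 0.5]))
```

In the paper this lookup is replaced by a trained deep network, but the workflow is the same: build the error database offline, then correct fast coarse-mesh runs at prediction time.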
Forecasting the movements of stock prices is one of the most challenging problems in financial market analysis. In this paper, we use Machine Learning (ML) algorithms to predict future price movements from limit order book data. Two different sets of features are combined and evaluated: handcrafted features based on the raw order book data and features extracted by ML algorithms, resulting in feature vectors of widely varying dimensionality. Three classifiers are evaluated using combinations of these feature sets on two different evaluation setups and three prediction scenarios. Even though the large scale and high-frequency nature of the limit order book pose several challenges, the scope of the conducted experiments and the significance of the experimental results indicate that Machine Learning is well suited to this task, paving the way for future research in this field.
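A minimal sketch of the handcrafted-feature side, assuming a toy one-level book layout `[bid_price, bid_size, ask_price, ask_size]` (an invented layout for illustration; real LOB feature sets span many price levels and time-windowed statistics):

```python
import numpy as np

# Hypothetical one-level limit order book snapshots:
# columns are [bid_price, bid_size, ask_price, ask_size].
book = np.array([
    [99.5, 200.0, 100.5, 150.0],
    [99.6, 180.0, 100.4, 220.0],
    [99.4, 260.0, 100.6,  90.0],
])

def handcrafted_features(b):
    """Classic handcrafted LOB features: mid-price, spread, volume imbalance."""
    bid_p, bid_v, ask_p, ask_v = b.T
    mid = (bid_p + ask_p) / 2.0
    spread = ask_p - bid_p
    imbalance = (bid_v - ask_v) / (bid_v + ask_v)
    return np.c_[mid, spread, imbalance]

F = handcrafted_features(book)  # one 3-feature vector per snapshot
```

Vectors like `F` are what the classifiers consume, either alone or concatenated with the higher-dimensional ML-extracted features mentioned in the abstract.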
Technologies such as aerial photogrammetry allow the production of 3D topographic data covering complex environments such as urban areas. It is therefore possible to create High Resolution (HR) Digital Elevation Models (DEM) incorporating the thin above-ground elements that influence overland flow paths. Even though this category of big data has a high level of accuracy, there are still measurement errors and assumptions underlying DEM elaboration. Moreover, operators seek to optimize the spatial discretization resolution in order to improve the computation time of flood models. Measurement errors, errors in DEM generation, and operator choices when including these data in a 2D hydraulic model may significantly influence the variability of flood modelling results. The purpose of this study is to investigate the uncertainties related to (i) the intrinsic error of high-resolution topographic data, and (ii) the modeller's choices when including topographic data in hydraulic codes. The aim is to perform a Global Sensitivity Analysis (GSA): a Monte-Carlo uncertainty propagation to quantify the impact of the uncertainties, followed by a computation of Sobol indices to rank the influence of the identified parameters on result variability. A process coupling an environment for parametric computation (Prométhée) with a code solving the 2D shallow water equations (FullSWOF 2D) has been developed (P-FS tool). The study has been performed over the lower part of the Var river valley using the estimated hydrograph of the 1994 flood event. HR topographic data have been made available for the study area, which covers 17.5 km², by the Nice municipality. Three uncertain parameters were studied: the measurement error (var. E), the level of detail of the above-ground elements represented in the DEM (buildings, sidewalks, etc.) (var. S), and the spatial discretization resolution (grid cell size for a regular mesh) (var. R). Parameter var. E follows a probability density function, whereas parameters var. S and var. R are discrete operator choices. Combining these parameters, a database of 2,000 simulations has been produced using the P-FS tool implemented on a high-performance computing structure. In our study case, the output of interest is the maximal
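The GSA chain described above (Monte-Carlo propagation, then Sobol indices) can be sketched on a stand-in model; the linear "flood proxy" with three Gaussian inputs below is an illustrative assumption, not the P-FS tool's actual response surface.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20000  # Monte-Carlo sample size

def flood_proxy(E, S, R):
    """Toy stand-in for the hydraulic code: linear response to the three inputs."""
    return E + 0.5 * S + 0.1 * R

# Two independent input samples (pick-freeze / Saltelli scheme).
A = rng.standard_normal((N, 3))
B = rng.standard_normal((N, 3))

yA = flood_proxy(*A.T)
var = yA.var()

def first_order(i):
    """First-order Sobol index of input i via the pick-freeze estimator:
    Cov(f(A), f(B with column i frozen from A)) / Var(f)."""
    AB = B.copy()
    AB[:, i] = A[:, i]  # freeze column i from sample A
    yAB = flood_proxy(*AB.T)
    return float(np.mean(yA * yAB) - yA.mean() * yAB.mean()) / var

indices = [first_order(i) for i in range(3)]  # one index per input
```

For this linear model the exact indices are 1/1.26 ≈ 0.79, 0.25/1.26 ≈ 0.20 and 0.01/1.26 ≈ 0.01, so the estimator recovers the ranking E > S > R, which is exactly the kind of parameter ranking the Sobol computation delivers in the study.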
A mathematical model is developed that captures the transport of liquid water in hardened concrete, as well as the chemical reactions that occur between the imbibed water and the residual calcium silicate compounds residing in the porous concrete matrix. The main hypothesis of the model is that the reaction product -- calcium silicate hydrate gel -- clogs the pores within the concrete, thereby hindering water transport. Numerical simulations are employed to determine the sensitivity of the model solution to changes in various physical parameters, and to compare with experimental results available in the literature.
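A minimal numerical sketch of the clogging hypothesis: an explicit 1-D finite-difference scheme in which a gel fraction grows in wetted cells and throttles the local diffusivity. All coefficients are illustrative, not the paper's calibrated parameters.

```python
import numpy as np

# Explicit 1-D sketch: water saturation s diffuses in from a wetted boundary,
# while reaction product g (gel) grows wherever water is present and reduces
# the local diffusivity (pore clogging). Illustrative, uncalibrated values.
nx, nt = 50, 2000
dx, dt = 1.0 / nx, 1e-5          # dt * D0 / dx**2 = 0.025, explicitly stable
D0, k_react = 1.0, 5.0

s = np.zeros(nx)   # water saturation
g = np.zeros(nx)   # gel (reaction product) fraction
s[0] = 1.0         # boundary held saturated (imbibition face)

for _ in range(nt):
    D = D0 * (1.0 - g)                    # clogging reduces diffusivity
    flux = D[:-1] * np.diff(s) / dx       # Fickian flux between cells
    s[1:-1] += dt / dx * np.diff(flux)    # conservative update of interior cells
    s[0] = 1.0                            # re-impose the wetted boundary
    g += dt * k_react * s * (1.0 - g)     # first-order clogging reaction
```

Comparing runs with `k_react = 0` against `k_react > 0` reproduces the qualitative effect studied in the paper: the reaction slows the advance of the wetting front over time.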