
Global Sensitivity Analysis for Bottleneck Assignment Problems

Published by Elad Michael
Publication date: 2021
Paper language: English





In assignment problems, decision makers are often interested in not only the optimal assignment, but also the sensitivity of the optimal assignment to perturbations in the assignment weights. Typically, only perturbations to individual assignment weights are considered. We present a novel extension of the traditional sensitivity analysis by allowing for simultaneous variations in all assignment weights. Focusing on the bottleneck assignment problem, we provide two different methods of quantifying the sensitivity of the optimal assignment, and present algorithms for each. Numerical examples as well as a discussion of the complexity for all algorithms are provided.
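To make the setting concrete, here is a minimal sketch of the bottleneck assignment problem itself (not of the paper's sensitivity-analysis algorithms): given a weight matrix, find the assignment that minimises the largest weight used. The brute-force search over permutations below is illustrative only and suitable for tiny instances.

```python
# Bottleneck assignment: minimise the maximum weight in the assignment.
# Brute force over all permutations -- exponential, for illustration only.
from itertools import permutations

def bottleneck_assignment(W):
    n = len(W)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        # The bottleneck value of this assignment is its largest weight.
        cost = max(W[i][perm[i]] for i in range(n))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_perm, best_cost

# Hypothetical 3x3 weight matrix: W[i][j] is the cost of assigning agent i to task j.
W = [[4, 1, 3],
     [2, 0, 5],
     [3, 2, 2]]
perm, cost = bottleneck_assignment(W)  # perm maps agent i -> task perm[i]
```

Sensitivity analysis, in the sense of the abstract, then asks how much the weights in `W` can be perturbed (here, all of them simultaneously) before a different permutation attains the minimal bottleneck value.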




Read also

Growing model complexities in load modeling have created high dimensionality in parameter estimation, thereby substantially increasing the associated computational costs. In this paper, a tensor-based method is proposed for identifying composite load modeling (CLM) parameters and for conducting a global sensitivity analysis. Tensor format and Fokker-Planck equations are used to estimate the power output response of CLM in the context of simultaneously varying parameters under their full parameter distribution ranges. The proposed tensor structure is shown to be effective for tackling high-dimensional parameter estimation and for improving computational performance in load modeling through global sensitivity analysis.
Sensitivity indices are commonly used to quantify the relative influence of any specific group of input variables on the output of a computer code. In this paper, we focus both on computer codes the output of which is a cumulative distribution function and on stochastic computer codes. We propose a way to perform a global sensitivity analysis for these kinds of computer codes. In the first setting, we define two indices: the first one is based on Wasserstein Fréchet means while the second one is based on the Hoeffding decomposition of the indicators of Wasserstein balls. Further, when dealing with the stochastic computer codes, we define an ideal version of the stochastic computer code that fits into the frame of the first setting. Finally, we deduce a procedure to realize a second level global sensitivity analysis, namely when one is interested in the sensitivity related to the input distributions rather than in the sensitivity related to the inputs themselves. Several numerical studies are proposed as illustrations in the different settings.
The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol indices. Metamodel-based techniques have been developed in order to replace the CPU-time-expensive computer code with an inexpensive mathematical function, which predicts the computer code output. The common metamodel-based sensitivity analysis methods are well-suited for computer codes with scalar outputs. However, in the environmental domain, as in many areas of application, the numerical model outputs are often spatial maps, which may also vary with time. In this paper, we introduce an innovative method to obtain a spatial map of Sobol indices with a minimal number of numerical model computations. It is based upon the functional decomposition of the spatial output onto a wavelet basis and the metamodeling of the wavelet coefficients by the Gaussian process. An analytical example is presented to clarify the various steps of our methodology. This technique is then applied to a real hydrogeological case: for each model input variable, a spatial map of Sobol indices is thus obtained.
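For readers unfamiliar with variance-based importance measures, the sketch below estimates first-order Sobol indices with a plain pick-freeze Monte Carlo estimator on a toy linear model. This illustrates only the general idea of Sobol indices, not the wavelet/Gaussian-process metamodeling approach described above; the model `f` and sample size are arbitrary choices for the example.

```python
# First-order Sobol indices via the pick-freeze estimator.
# Toy model f(x1, x2) = x1 + 2*x2 with x1, x2 ~ U(0, 1):
# Var(Y) = 1/12 + 4/12, so analytically S1 = 0.2 and S2 = 0.8.
import random

random.seed(0)

def f(x1, x2):
    return x1 + 2.0 * x2

N = 200_000
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]

yA = [f(*a) for a in A]
mean = sum(yA) / N
var = sum((y - mean) ** 2 for y in yA) / N

def first_order(i):
    # "Freeze" input i at its value from sample A; redraw the other from B.
    mixed = [f(a[0], b[1]) if i == 0 else f(b[0], a[1])
             for a, b in zip(A, B)]
    # Cov(Y, Y_i) / Var(Y) estimates Var(E[Y | X_i]) / Var(Y) = S_i.
    cov = sum((ya - mean) * (ym - mean) for ya, ym in zip(yA, mixed)) / N
    return cov / var

S1, S2 = first_order(0), first_order(1)
```

Metamodel-based methods such as the one above exist precisely because each index evaluation here costs many runs of `f`, which is prohibitive when `f` is an expensive simulator.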
Global sensitivity analysis (GSA) of numerical simulators aims at studying the global impact of the input uncertainties on the output. To perform the GSA, statistical tools based on inputs/output dependence measures are commonly used. We focus here on dependence measures based on reproducing kernel Hilbert spaces: the Hilbert-Schmidt Independence Criterion denoted HSIC. Sometimes, the probability distributions modeling the uncertainty of inputs may be themselves uncertain and it is important to quantify the global impact of this uncertainty on GSA results. We call it here the second-level global sensitivity analysis (GSA2). However, GSA2, when performed with a double Monte Carlo loop, requires a large number of model evaluations which is intractable with CPU-time-expensive simulators. To cope with this limitation, we propose a new statistical methodology based on a single Monte Carlo loop with a limited calculation budget. Firstly, we build a unique sample of inputs from a well chosen probability distribution and the associated code outputs are computed. From this inputs/output sample, we perform GSA for various assumed probability distributions of inputs by using weighted HSIC measure estimators. Statistical properties of these weighted estimators are demonstrated. Finally, we define 2nd-level HSIC-based measures between the probability distributions of inputs and GSA results, which constitute GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, thereby comparing several technical options. Finally, an application to a test case simulating a severe accidental scenario on a nuclear reactor is provided.
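As a rough illustration of the HSIC dependence measure mentioned above (the standard biased estimator with Gaussian kernels, not the weighted estimators or GSA2 indices of this paper), the sketch below compares HSIC for a dependent and an independent pair of samples; the kernel bandwidth and sample size are arbitrary.

```python
# Biased empirical HSIC estimator: trace(K H L H) / (n-1)^2,
# where K, L are Gaussian Gram matrices and H is the centering matrix.
import math
import random

random.seed(1)

def gaussian_gram(xs, sigma=1.0):
    return [[math.exp(-((a - b) ** 2) / (2.0 * sigma ** 2)) for b in xs]
            for a in xs]

def hsic(xs, ys, sigma=1.0):
    n = len(xs)
    K, L = gaussian_gram(xs, sigma), gaussian_gram(ys, sigma)
    # Centering matrix H = I - (1/n) * ones.
    H = [[(1.0 if i == j else 0.0) - 1.0 / n for j in range(n)]
         for i in range(n)]

    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    M = matmul(matmul(K, H), matmul(L, H))
    return sum(M[i][i] for i in range(n)) / (n - 1) ** 2

n = 100
x = [random.gauss(0, 1) for _ in range(n)]
y_dep = [xi + 0.1 * random.gauss(0, 1) for xi in x]   # strongly dependent on x
y_ind = [random.gauss(0, 1) for _ in range(n)]        # independent of x

hsic_dep = hsic(x, y_dep)
hsic_ind = hsic(x, y_ind)
```

In a GSA context, each input variable would play the role of `x` and the simulator output the role of `y`, with larger HSIC values flagging more influential inputs.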
We consider the revenue management problem of finding profit-maximising prices for delivery time slots in the context of attended home delivery. This multi-stage optimal control problem admits a dynamic programming formulation that is intractable for realistic problem sizes due to the so-called curse of dimensionality. Therefore, we study three approximate dynamic programming algorithms both from a control-theoretical perspective and in a parametric numerical case study. Our numerical analysis is based on real-world data, from which we generate multiple scenarios to stress-test the robustness of the pricing policies to errors in model parameter estimates. Our theoretical analysis and numerical benchmark tests show that one of these algorithms, namely gradient-bounded dynamic programming, dominates the others with respect to computation time and profit-generation capabilities of the delivery slot pricing policies that it generates. Finally, we show that uncertainty in the estimates of the model parameters further increases the profit-generation dominance of this approach.