General Feasibility Bounds for Sample Average Approximation via Vapnik-Chervonenkis Dimension


Abstract

We investigate the feasibility of sample average approximation (SAA) for general stochastic optimization problems, including two-stage stochastic programming without the relatively complete recourse assumption. Instead of analyzing problems with specific structures, we utilize results on the Vapnik-Chervonenkis (VC) dimension and Probably Approximately Correct (PAC) learning to provide a general framework that offers explicit feasibility bounds for SAA solutions under minimal structural or distributional assumptions. We show that, as long as the hypothesis class formed by the feasible region has a finite VC dimension, the infeasibility of SAA solutions decreases exponentially in the sample size, with computable rates and explicitly identifiable accompanying constants. We demonstrate that our bounds apply more generally than, and remain competitive with, existing results.
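For context, exponential feasibility guarantees of this kind are typically driven by the classical VC uniform convergence inequality. One standard form (constants vary across references, and the bounds derived in the paper need not use these particular constants) is the following sketch: for a class \(\mathcal{C}\) of measurable sets with VC dimension \(d\), an i.i.d. sample of size \(n\), and any \(\varepsilon > 0\),
\[
\Pr\Big( \sup_{C \in \mathcal{C}} \big| \hat{P}_n(C) - P(C) \big| > \varepsilon \Big) \;\le\; 4\, \Pi_{\mathcal{C}}(2n)\, e^{-n\varepsilon^2/8},
\qquad
\Pi_{\mathcal{C}}(m) \;\le\; \Big(\tfrac{em}{d}\Big)^{d} \quad \text{for } m \ge d,
\]
where \(\hat{P}_n\) is the empirical measure and \(\Pi_{\mathcal{C}}\) is the growth function, bounded via Sauer's lemma. Combining the two displays gives a polynomial-in-\(n\) factor multiplied by an exponentially decaying term, which is the mechanism behind the exponential rates with explicit constants described above.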
